
A novel way to massively improve the SQ of computer audio streaming


Message added by The Computer Audiophile

Important and useful information about this thread


Most important: please realize this thread is about bleeding edge experimentation and discovery. No one has The Answer™. If you are not into tweaking, just know that you can have a musically satisfying system without doing any of the nutty things we do here.


When I first received my dX-USB HD (USB-to-SPDIF converter), SOtM had told me I should be able to power it fine with 7V/1A (my LPS-1). When I got it and attempted to power it with my LPS-1, it would turn on but it wouldn't function, so I think they are now quoting a minimum of 1.5A for the sMS-200 based on my experience (this is provided you aren't connecting any bus-powered hard drives to the back of the sMS-200, which would result in much more than a 1.5A draw).

Just to be clear, I have used the sMS-200 (not Ultra) with an LPS-1, and it works just fine. Of course, this is without any bus powered drives.

 

May's comment was about the forthcoming sMS-200 Ultra.

My point was that with the Ultra, one would need to factor in the cost of a different LPS than the LPS-1, and if one wants an ultracap PS, one would have to step up to a Vinnie Rossi Mini Pure 4 EVR.

 

It's also unknown whether the tX-USB Ultra will need more than 1.1A, but if your experience with the dX-USB HD is any indication, then it sounds like it will.

 

So the costs are racking up fast here!
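To put rough numbers on the power-supply mismatch, here is a quick sanity check. This is only a sketch: the 1.1A limit is the LPS-1's rated output, and the draw figures are nothing more than the estimates floated in this thread.

# Supply-headroom sanity check (Python). Draw figures are thread
# estimates, not measurements.
LPS1_LIMIT_A = 1.1                    # LPS-1 rated output current

estimated_draw_a = {
    "sMS-200 (stock)": 1.0,           # works fine on an LPS-1, per above
    "dX-USB HD Ultra": 1.5,           # SOtM's newly quoted minimum
    "tX-USB Ultra (?)": 1.5,          # unknown; assumed similar for now
}

for device, amps in estimated_draw_a.items():
    verdict = "LPS-1 OK" if amps <= LPS1_LIMIT_A else "needs a bigger PSU"
    print(f"{device}: {amps:.1f}A -> {verdict}")

Anything over 1.1A pushes you out of LPS-1 territory, which is exactly the cost problem.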

Your argument makes sense to me that the best clock should be at the end. I am looking for ways to simplify my USB chain. Maybe the tX-USB Ultra is the way to go; do you have any idea when it will be out?

 

This has always made sense to me, and in my case, that last clock is the W4S RUR.

 

So the questions that remain at large:

  1. @romaz is finding that it is not just the last clock before the DAC that should be "good," but the last n clocks, where 1 < n < ∞! In his case, he's tried n=3 (switch, sMS-200, USB-SPDIF converter), and is going for n=9 with his 5-clock MOBO mods. He's a wild man!
  2. Will the tX-USB Ultra outperform the RUR, all else constant?
  3. Does the Intona after a good clock like the sMS-200 negate the value of that clock?

Also, the new mystery product that Alex from UpTone Audio will announce in a week or so is interesting as well.

 

Indeed! How will this mystery device compare with the Intona+<RUR, tX-USB Ultra>? So many open questions!

My goal is to replace 8 clocks but that is because it makes the most financial sense to do so. It is always the first clock of four that is the most expensive and so if I am going to buy two clock boards, why not use all four clocks per board? I am hoping that somewhere between 4 and 8 clocks, I will stop hearing any further improvement because I don't relish the idea of buying a 3rd clock board.

You've got a PM.

Bad clock > Better clock > Best clock

<snip>

 

CONCLUSIONS: I had previously postulated that a bad clock that follows a good clock has the potential to negate the good clock. These findings would support that theory. Without a doubt, it's best to put your best clock at the end and to avoid all bad clocks whenever possible. Once again, I am surprised that further reclocking (with good clocks) results in better and better SQ, but my guess is that with each reclocking there has to be some diminishing return. I am more optimistic now than before that replacing all pertinent clocks in my upcoming build will result in further improvement in SQ.

 

Hi Roy,

 

Very interesting experiments! I really appreciate your empirical approach to the hypothesis of end-to-end clock quality.

 

I have some thoughts and a request for a couple of additional experiments, if you'll indulge me!

 

Thoughts:

 

  • In terms of clock quality, I see your baseline configuration of:
    • Mac Mini > Trend Net switch with sCLK (LPS-1) > sMS-200 Ultra (LPS-1) > dX-USB HD Ultra (SR7) > DAC
    • In terms of clocks: Bad > Best > Best > Best > World class (DAVE)

  • Your config 1, in terms of clocks, was: Bad > Better > Best > Best > Best > World class
    • It is still amazing to me that the addition of a better clock upstream of 3 existing "best" clocks still resulted in an SQ boost!
    • I assume this is why you tried it, but the Paul Pang switch in config 1 is a "proxy" for your new system build with better clocks. So this was a good PoC (proof of concept). Well done!

  • I do wonder how much of the sensitivity to superior clocks is dependent on the inherent quality of the DAC's clocks. Your DAVE is obviously in a rarefied category, as it should be, at the price. What I wonder is if this phenomenon would hold true with more modest DACs.

  • Finally, I wonder whether the upgraded clock and LPS-1 on your Trendnet switch might render the "direct connection" mod moot? I'll elaborate in the experiments I propose for you below.

Suggested (requested) Experiments:

 

  1. Do you have a lower-end DAC on hand to test your clock sensitivity experiment? It would be very interesting to know whether your clock sensitivity still holds even with DACs in the sub-1k or sub-2k range. My gut says it should.
     
     
  2. With the superior clocking in place on your switch, go back to a switched connection instead of a direct one.
    • Compare your baseline: Router > Mac Mini > Trend Net switch with sCLK (LPS-1) > sMS-200 Ultra (LPS-1) > dX-USB HD Ultra (SR7) > DAC with:
    • configuration 4:
      • Mac Mini > Trend Net switch with sCLK (LPS-1) > sMS-200 Ultra (LPS-1) > dX-USB HD Ultra (SR7) > DAC
      • Router > Trend Net switch with sCLK (LPS-1)

Hope this makes sense.

I agree, Rajiv. I am surprised as well and this is the best way I know to explain it:

 

The best signal you can have is the original unfettered, unadulterated signal, but as this signal travels through the signal path, it undergoes repeated processing and reprocessing, and with each processing step, that signal must be regenerated and reclocked. When the signal from your ISP enters your internet modem, it is processed. It is processed again when the data is converted into an Ethernet stream, again with every switch or FMC it encounters, again when that stream reaches your server's LAN port, when it hits your system bus, when it is rendered by your CPU, and so on. With every regeneration and reclocking of the signal, there is potential for the signal to be harmed through the introduction of jitter from poor clocking and of substrate noise, which is likely to be additive as the signal moves through the chain.

 

While this is pure conjecture, my experience would suggest that placement of a clean and accurate clock in the signal path has the potential to clean up and even repair some of the harm that has been caused, but if the harm already done is significant, a single reclocking may only be able to improve it so much. A good analogy might be running a fairly clean car through a car wash versus running an off-road vehicle with caked-on mud and tar through that same car wash. It may take several washes before the off-road vehicle gets thoroughly clean, and even with multiple washings, it may not be possible to clean it completely. This is why I have suggested that it is probably best to avoid bad things in the signal path early on rather than having to add heroic (and expensive) fixes at the end.
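To make the car-wash analogy concrete, here is a toy calculation (every number invented for illustration) of how each additional "wash" helps less and less:

# Toy model of repeated reclocking: each good reclocking removes a fixed
# fraction of the accumulated "dirt" (signal harm above some irreducible
# floor), so successive passes show diminishing returns.
def residual_harm(passes, initial=100.0, floor=2.0, removal=0.6):
    harm = initial - floor
    for _ in range(passes):
        harm *= 1.0 - removal         # each pass removes 60% of what's left
    return floor + harm

for n in range(5):
    print(f"after {n} reclockings: {residual_harm(n):6.1f} (arbitrary units)")

Running this gives 100.0, 41.2, 17.7, 8.3, 4.5: big gains at first, diminishing returns after, and never below the floor.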

 

Yes, until this phenomenon is better understood, we'll have to rely on our own empirical experiences. My analogy here was more like the princess and the pea. The bad clock is the pea, and the best clocks are soft mattresses. Each extra layer of good clocking - another soft mattress - alleviates the discomfort, but best would be to remove the pea.

 

My guess is that if your system (which would include not just your DAC but also your amp, speakers and cables) is resolving enough to reveal the benefits of this direct connection, it is probably resolving enough to reveal the benefits of better clocking. While I suspect there is more to this direct connection than just avoiding the bad clocking introduced by your router and any bad switches after the router, I suspect that has to be at least part of the reason. Of course, as this direct connection brings about even greater resolution and transparency to your system, I suspect the impact of removing bad upstream clocks or introducing reparative downstream clocks should become all the more apparent. Once I find the time, I will borrow other DACs from friends and see what kind of difference I hear, but I would be surprised if I heard no difference at all.

 

I suspect you are right.

 

 

I have already done this. Using my Trend Net switch in the "direct path" as my reference, I then placed this switch just after my router and plugged both my Mac Mini and my sMS-200 into it. While it did result in some improvement compared with plugging both the Mac Mini and sMS-200 straight into the router, the improvement was quite small and definitely only a fraction of what I got with the switch in the direct path. Based on this small amount of SQ improvement, I would not find the switch upgrade to be worthwhile.

Ok, thanks - so the direct connection is another layer of improvement, not just related to the clocking. Glad you already tried this.


<snip>

 

SOtM has promised to send me their phase noise and stability measurements for their new superclock, but regardless, my ears have told me all that I need to know -- this is one incredible clock that could possibly bring about a new revolution in music servers.

 

Hi Roy,

 

Did SOtM ever send you any measurements or metrics to characterize what is so special about their sCLK-EX?

I agree, the Iso Regen could potentially change the music server landscape once again and I think we are all eager to see how well it performs. In my mind, it may come down to how good its clock is but I'm sure there are other factors to consider.

 

It has been eye-opening to realize just how much clocking and reclocking occurs in a typical audiophile setup. Anything that passes through a buffer requires a reclock. Any transfer from one bus to another (i.e., M.2 PCIe to SATA, or system bus to USB) requires a reclock. Any CPU processing, rendering, conversion, upsampling, etc. involves clocking. Anything that enters and leaves RAM is clocked. I think the ideal system would have equivalently good clocks from beginning to end, and this is what I am aiming for with my upcoming build, but again, it's simply not possible to replace any subclocks (DPLLs) that may be in the path. If it isn't possible to remove the bad clocks in your chain, my listening experiences suggest that if you have one exceptional clock to use, you should use it as close to your DAC as possible. This boils down to the concept of signal integrity and wanting to provide your DAC the highest SI possible.
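Here is a toy numerical sketch of that "best clock last" heuristic. The leak factor and jitter figures are invented, not measurements of any real device:

import math

# Each reclocker is modeled as contributing its own intrinsic jitter plus
# a leaked fraction of whatever jitter arrives at its input (combined
# root-sum-square).
def chain_jitter(stages_ps, leak=0.3):
    j = 0.0
    for own in stages_ps:
        j = math.hypot(own, leak * j)
    return j

bad, better, best = 50.0, 10.0, 1.0   # illustrative jitter in picoseconds

print(chain_jitter([best, better, bad]))  # best clock first -> ~50 ps
print(chain_jitter([bad, better, best]))  # best clock last  -> ~5.5 ps

In this made-up model the bad clock dominates whenever it sits last, and putting the best clock last improves the result roughly ninefold.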

 

As to upstream signal degradation resulting in irrecoverable damage, I believe this is probably true. At some point, the damage becomes so ingrained into the signal that it becomes part of the signal, and no "signal decrapifier" device made will be able to distinguish what is artifact and what is original. Here is what Paul Hynes once told me regarding the importance of applying clean power to every component in your system:

 

"All circuits require a reference to operate and react with each other. This is typically via a ground (0V) system. Some circuit sections also require a voltage reference above (or below) ground to provide precise operation and this voltage reference is usually connected to ground for its own reference. If the power supply is not clean and free from noise and transient disturbances, it will pollute the ground reference and anything connected to it creating uncertainty of reference voltage. This noise and transient disturbance can be passed on from stage to stage once embedded in the signal and it is quite capable of causing timing errors in data streams."

 

Roy,

 

In your relentless pursuit of the best clocks, do you wonder if you might still have leakage loops remaining?

 

This is where the Intona or ISO-Regen would normally add isolation, but at the cost of a bad clock (potentially) relatively close to the DAC.

 

I wonder if another option would be an EMO EN-70HD between your Mini and your modified Trendnet. This would give you another layer of galvanic isolation, but still keep 3 "best" clocks between it and the DAC to clean up any poor clocking.

 

Might be worth a try?

I don't have an EMO but I do have an SOtM iSO-CAT6 which is also a passive isolation transformer that operates at gigabit bandwidth:

 

[attached: two photos of the iSO-CAT6]

 

Around the time that I was sensing this irritating HF noise (which later turned out to be coming from the PCIe SSD drive), I inserted this device into the "direct" connection path, and if it resulted in any improvement, it was very minor and could have been imagined. Even after I switched back to El Capitan on an SD card, I kept it in the chain, but more recently I swapped it out for the Paul Pang switch, which resulted in a much more obvious improvement. In my system, the jury is still out on the value of this device, but at least I know it doesn't negatively impact SQ either. For now, I have placed it back in its original position between my router and Mac Mini (more for peace of mind than because of any perceived improvement). What I am finding more effective is SOtM's CAT6 cable, with its smoother presentation. It is better than my BJC CAT6A, but we are talking fairly minor differences.

 

Ok - yup, that covers that angle.

Yes, you want to uninstall any Intel graphics driver. Once done, this will force the display into 800x600 VGA mode and turn off the internal GPU, increasing SQ.

 

I assume you are running headless with RDP or VNC or something.

 

Nice tweak. I'll try this on my W10/AO/MinimServer bridged machine, and report back. It's using an NVidia graphics adapter.

Have any of you guys tried running your Windows 10 PC displays at 800 x 600 resolution? The display picture is smaller, but oh my does it sound good. I normally set my display at 1280 x 1000 (or something like this) and wow does it look good, but it doesn't sound as good as at the lower resolution.

 

Yes, you want to uninstall any Intel graphics driver. Once done, this will force the display into 800x600 VGA mode and turn off the internal GPU, increasing SQ.

 

I just have a basic monitor on my W10 Ent/AO/Bridged machine, running at 1280x1024 native resolution. The graphics adapter is an nVidia with 1GB video RAM. I tried this:

  • Removed the nVidia apps and driver
  • The system reverted to the Microsoft Basic Display driver
  • Dropped the resolution down to 800x600
  • Also tried an experiment running the AO shell replacement directly to MinimServer

 

That last step dropped my process count from 50 to 41.

 

In terms of SQ - I really can't say I heard a difference I could be confident in. It was late, and I was tired, so I might try again, but indications are that the difference is small.
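If anyone wants to check their own process count before and after tweaks like these, here is a quick sketch. It assumes the third-party psutil package is installed (pip install psutil):

# Count and list running processes, to compare before/after a tweak.
import psutil

names = sorted((p.info["name"] or "?") for p in psutil.process_iter(["name"]))
print(f"{len(names)} processes running")
for name in names:
    print(" ", name)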


 

I just have a basic monitor on my W10 Ent/AO/Bridged machine, running at 1280x1024 native resolution. The graphics adapter is an nVidia with 1GB video RAM. I tried this:

<snip>

In terms of SQ - I really can't say I heard a difference I could be confident in.

 

Perhaps remove the Nvidia card and try again?

 

No can do - I'm just using a vanilla Dell XPS 8700 desktop, and the only video ports (VGA and DVI) are on the Nvidia card. Without it, I'd be headless.

 

Larry,

 

Just wanted you to know I followed up on this. It turns out I was wrong about the integrated video ports on my box. I was looking for "outies" like a VGA port, but it turns out the only integrated ports were "innies" - HDMI and DisplayPort. Since my display only had VGA and DVI inputs, it was off to Amazon for a DVI-to-DisplayPort adapter, which just arrived.

 

So, after a couple of missteps - success! I removed the nVidia adapter, and am now driving the display with the Intel integrated video. The misstep was realizing that Roon wouldn't start when using the Microsoft Basic Display driver. I had to get the Intel driver, and then all was good.

 

While I was in there - now I sound like the orthopedic surgeon who operated on my knee - I also discovered my box had an active CD/DVD drive, which I also disconnected.

 

No SQ evaluations yet, will listen tonight.

Process Lasso surprisingly has been very effective even with the simpler dual-core CPU in my Mac Mini. While continually resident in memory, it has a very small footprint. I use mine in game mode, which maximizes the responsiveness of your CPU and results in an enhanced sense of immediacy to playback. Without any harshness, the image is more vivid. Not only is timing better, but the timbre of instruments is truer. It was Process Lasso that really pushed me to explore minimizing latencies elsewhere in my system. Other than game mode, I run it at defaults. If I had a quad-core CPU, I would probably experiment further, but even without tweaking it, I am quite impressed.

So Roy, here's what I do with Process Lasso:

  • Set it to Gaming mode
  • Set Bitsum high performance power scheme
  • Set javaw.exe, roon.exe, and raatserver.exe to High priority
  • Set the above 3 process to "classify as a game"

 

I really haven't done any critical comparisons with these settings, so I cannot honestly say it improves SQ.
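For reference, here is a rough, script-only approximation of just the priority part of those settings. It does not replicate Gaming mode or the Bitsum power scheme, it is Windows-only, and it assumes psutil is installed; treat it as a sketch, not a Process Lasso substitute:

# Set the three playback-related processes to High priority (Windows only;
# psutil.HIGH_PRIORITY_CLASS exists only on Windows). May need an elevated
# prompt to touch processes owned by other users.
import psutil

TARGETS = {"javaw.exe", "roon.exe", "raatserver.exe"}

for p in psutil.process_iter(["name"]):
    if (p.info["name"] or "").lower() in TARGETS:
        p.nice(psutil.HIGH_PRIORITY_CLASS)
        print(f"set {p.info['name']} (pid {p.pid}) to High priority")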

Rajiv,

 

Good to know you are making progress. I am surprised to hear you need a video driver for Roon. Perhaps you should install Roon Server and control the box from a wireless laptop or tablet?

 

Yes, it's the Roon control application that needs OpenGL. I guess I could just install Roon Server, but the problem was easy enough to solve with the Intel driver.

 

AO's ST can be set to start HQPlayer and Roon at boot. Nevertheless, don't forget to install TightVNC server and client so you can remote into Windows as well.

 

Yes, I'm aware I can use Service Tool to replace Roon as the default shell - and I do plan to play with that. My earlier experience with shell replacement was with MinimServer, and that didn't seem to work too well for me. Oh, now I remember: I had to turn off autologon.

Rajiv,

 

Each of the steps you've highlighted above is essentially the same thing. Gaming mode = Bitsum Highest Performance Mode. There's no harm in individually classifying each program as a game, but when you select Bitsum Highest Performance Mode, you are essentially classifying everything as a game. At least this is how Jeremy Collake described it to me, since I was curious about this as well.

 

LOL! That's funny.

Larry,

Just wanted you to know I followed up on this. It turns out I was wrong about the integrated video ports on my box.

<snip>

No SQ evaluations yet, will listen tonight.

Well, this is the kind of thing I can't A/B, since it took a while to pull out the video card etc.

 

But I do feel there was an uptick in SQ. Small, but beneficial. I did try different resolutions on my monitor, now using the integrated Intel video: 1280x1024 (native) vs 800x600, and really couldn't hear a difference.


@romaz

 

Fascinating, as always, but just so I'm not losing context - you think these things matter so much even when we're talking about a music server connected to an endpoint like sMS-200, rather than a music player directly feeding a DAC?

 

Just wondering if some of your past experiences - like 4GB vs 8GB - were in the context of a music player, whereas the music server is "once removed" from this, no?

 

Or is that what you want to revisit with your new build?

 

I think I lost the thread in these arcane minutiae of PC architecture. :D

21 hours ago, romaz said:

 

Sorry, I believe I didn't read your post properly and so I failed to answer your question.  With the standard way of connecting an mR or sMS-200 to a router first, I don't think these things matter as much.  With the direct connection, while transparency to the recording has increased, so has transparency to the qualities (and deficiencies) of the music server itself.  As long as I continue to hear improvements, I will probably continue to push along.

 

Think of it this way. Say you had a direct connection:

  • Music Server > USB > DAC

Let's say you improve the signal integrity by some factor S (using better clocks), and reduce noise by some factor N (HW/SW optimization, zero-leakage-loop PSUs, etc.).

 

Now consider the streaming case:

  • Music Server > Switch > sMS-200 > dX-USB HD > DAC

where your 3 intervening devices (switch, sMS-200, and dX-USB HD) have the best clocks (sCLK-EX) and the best PSUs.

 

The central question then becomes - do those improvement factors of S and N that you make on the music server still have the same impact? Or are they attenuated by a factor a (0 < a < 1) due to the 3 other intervening devices that are reducing noise and improving signal integrity?

 

Qualitatively, it sounds like you expect that a > 0, else this exercise would be pointless. I think I expect that a < 1, and I'm eager to find out vicariously through your experiments whether it is closer to 0.8 or 0.2. :D
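As a toy version of that arithmetic (all numbers invented):

# What fraction 'a' of a server-side improvement survives the intervening
# reclocking devices? a=1 means it fully survives; a=0 means none of it does.
def effective(improvement, a):
    return 1.0 + a * (improvement - 1.0)

S, N = 3.0, 2.0   # say the server got 3x better SI and 2x lower noise

for a in (0.2, 0.5, 0.8):
    print(f"a={a}: effective S = {effective(S, a):.1f}x, "
          f"effective N = {effective(N, a):.1f}x")

With a=0.2, a 3x server-side improvement only "reads" as about 1.4x at the DAC; with a=0.8, it reads as 2.6x.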

 

 

2 weeks later...

Hi @romaz

 

Now that the tX-USBultra is released, the question will quickly turn to appropriate power supplies. May has already said that the power draw will exceed the capacity of the LPS-1, and will likely be around 1.5 amps.

 

Since yours is one of the only sCLK-enabled "Ultra" chains in the wild :D - do you have the means to measure current draw? I was just wondering what the current draw is for:

  • your modded dX-USB HD "Ultra", and
  • your modded sMS-200?

I suspect this is hard without some instrumentation.
26 minutes ago, romaz said:

 

I will measure it this evening and will report back, but I suspect my dX-USB HD Ultra just barely draws more than 1.1A (less than 1.5A for sure). Since the clock on my modded sMS-200 Ultra is powered by the dX-USB HD Ultra, my sMS-200 actually draws less current than a stock sMS-200 and is easily powered by an LPS-1.

 

The current draw of the tX-USB Ultra will depend on how many devices you connect to it. As a USB 2.0 hub, it is supposed to offer up to 500mA per USB port based on the USB 2.0 specification, and as this hub has 2 ports, it needs to offer at least 1A, but that doesn't mean that will be the draw. If you connect only 1 component (such as your DAC) and it doesn't draw from the 5V VBUS, the draw could be well below 1A, and it could potentially be powered by an LPS-1 (at least this is my hunch). I think SOtM is just covering their bases.

 

Thanks, Roy, I suspect you are right. Your measurement data will be very useful.

 

I bet I am not alone in thinking the tX-USBultra would be far more financially attractive IF it could still be powered by an LPS-1 rather than a much more expensive option like your Paul Hynes or the VR Mini.
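For what it's worth, the back-of-the-envelope VBUS budget looks like this. The 500mA-per-port ceiling is from the USB 2.0 spec; the board-overhead figure is a pure guess, as is whether a given DAC draws bus power at all:

# USB 2.0 VBUS budget sketch for the tX-USBultra question above.
USB2_PORT_MAX_MA = 500           # per-port ceiling from the USB 2.0 spec
BOARD_OVERHEAD_MA = 400          # hypothetical draw of the hub/clock itself
LPS1_LIMIT_MA = 1100

def total_draw_ma(device_draws_ma):
    return BOARD_OVERHEAD_MA + sum(min(d, USB2_PORT_MAX_MA) for d in device_draws_ma)

print(total_draw_ma([]))          # self-powered DAC, no VBUS draw -> 400mA, LPS-1 fine
print(total_draw_ma([500, 500]))  # two full-draw devices -> 1400mA, over the LPS-1's 1.1A

So the LPS-1 question really does hinge on whether anything hangs off the VBUS.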

