KenRW

Article: SOtM sNH-10G Network Switch Review


If you follow the various threads on this forum about clocking, re-clocking, phase noise, etc. I think you'll see that a) John Swenson appears to be building a measuring device capable of showing what he and the folks at Uptone are addressing with their switch and why, and b) this goes beyond the current ways one understands "noise" in a system. The way I look at it, streaming-based digital audio has ushered in a new era of research - subjective and objective - that cannot really be equated to older analog ways of looking at noise.


Older assumptions about measurability based on analog systems is what I mean. I know you are measuring in the digital realm, but as thyname just said, those measurement tools are not addressing the actual problems introduced by streaming audio.


Single-blind evaluation of the TP-Link switch I used to have vs. the SOtM switch? Ha. You obviously don't know the difference in sound, or you would not offer to pay me for my time like that. Same goes for generic CAT6A cable vs. the SOtM LAN cables...

 

Anyway, I am going to move on from trying to prove to you or anyone that what I hear is there. Doubt all you want, please. Doubt is good. Just don't let it kill your curiosity!


1) I still don't understand the experiment you are asking us to do. Where in the chain am I meant to pull the ethernet cable? I have ethernet cables connecting a lot of different components right now. In other words, which device are you imagining is doing the buffering? The server? The DAC? That matters a lot, as there are many steps in my digital playback chain. You would need to tell me where this is meant to be happening before I consider the logic of what you are saying.

 

2) Any device that receives, transmits, and/or buffers a digital signal re-clocks that signal before it sends it along. So the phase noise, leakage current, and ground-plane issues of that device all get introduced at that juncture. There is also a growing body of inquiry on these forums suggesting that phase noise, leakage currents, and ground-plane issues still make it through from previous reclocking devices. It is my understanding that no current device can eliminate everything from upstream.

 

So you want each juncture from modem to DAC to have the most accurate and quietly powered clock possible, as well as the best possible noise isolation and/or filtering. So if you pull an ethernet cable from some random place in the chain -- assuming some component there is buffering the signal -- and there is no change, doesn't that simply mean that the component doing the buffering is no worse than anything that came before it? Which is good, right? If you pulled the cable and the sound got worse, you would want to look at the device doing the buffering with an eye to upgrading it. If the sound gets better, then you may want to look at your upstream re-clocking components.

 

To me, the modem itself is the tricky part - the actual point of entry of the signal from outside the home. Someone needs to make a purpose-built modem along the lines of what SOtM and Uptone are doing with switches, so the first reclocking into the home is the best it can possibly be. Then you can add and remove anything downstream of that to see its effects. Anyway...

 

2 hours ago, The Computer Audiophile said:

His statement said all digital signals, so I wanted to make sure to correct that.

 

I've talked to the device designers about reclocking and they say their devices don't reclock. Perhaps a discussion for another topic.

Sorry - I meant within the context of ethernet streaming.

2 hours ago, Advieira said:

Best configuration ever.

 

 

[image: bestconfig.jpg]

This is what SOtM now suggests in response to this thread.

 

21 hours ago, PeterSt said:

 

You do understand digital, do you?

 

How would that signal be affected ?

He's saying that to prove the switch and cable have no effect on the sound, just pull the cable and hear that the buffered signal sounds the same. My point is that that doesn't prove what he's saying it does. It's logic, digital or analog. The signal hanging out in the buffer has already passed through the switch and the cable so of course it sounds the same. For the skeptics among you, there just needs to be a loaner program or some other way to test. Play track. Then insert switch and play track again. Then post about it. (Which is, by the way, what the OP did and reported back on - as frustratingly unspecific / fanboyish as those reports may have been...)


Off topic, but it would be great if the super wealthy among you took it upon yourselves to stock and maintain a kind of lending library with all manner of these devices in all their iterations and upgrades, etc. Then the components could make the rounds and a broad enough sample of listening data could be compiled. I must say, all this back and forth between people who have heard a certain component and write about their impressions, and those who have not heard the component but dismiss those impressions as wishful thinking, is VERY tiring. It infected the old cable discussions (still does) and now it's all over this forum. The way I look at it is this: for every wishfully thinking fanboy who spent too much money on a dubious product and is hoping beyond hope it sounds better than not having it in his system, there is a curmudgeonly objectivist troll who refuses to face the possibility that there may still be undiscovered truths in audio. They cancel each other out. Now let the actual listening impressions and measurements begin.

1 minute ago, thyname said:

 

You are wasting your time. These pseudoscience guys never try anything. They already KNOW the outcome.

Which is of course the absolute inverse of the scientific method.


At least that's the general idea.  There has been some discussion on the forums about upstream clocking effects passing through into the DAC, but I don't have a sophisticated enough understanding to evaluate that, and it remains to be demonstrated that this can actually occur.

 

Here we can agree except that it has been demonstrated on a listening level. This is, after all, another form of demonstration. I think you are asking for mechanical demonstration, which I believe we will be able to do soon.

 

I am well aware of the functioning of async USB, thank you. And I think you are actually getting to the heart of my logic bump here. The moment the cable/switch/upstream-signal-source-whatever-it-may-be is disconnected, there is no more signal coming into the DAC, right? So everything that is "stored" momentarily in the DAC as it's about to be reclocked has already come through the network to the DAC. Otherwise, where did it come from? We are talking exclusively about streaming audio, not local playback, right?
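The buffering point above can be sketched as a toy model (illustrative only; the names and the packet count are invented, and nothing here reflects any actual DAC's receiver): once the link drops, every sample the device plays back is, by construction, one that already crossed the network.

```python
from collections import deque

# Toy model of the argument: packets enter the DAC's buffer only via the
# network, so anything later drained from the buffer already crossed it.
buffer = deque()

def receive(packet_id):
    # The only way into the buffer is through the upstream chain.
    buffer.append({"id": packet_id, "traversed_network": True})

for i in range(8):      # fill the buffer while the cable is connected
    receive(i)

# "Pull the cable": no more receive() calls; playback drains the buffer.
played = []
while buffer:
    played.append(buffer.popleft())

# Every sample played after disconnection still came through the network.
assert all(p["traversed_network"] for p in played)
print(len(played))  # 8
```

The point of the toy is purely logical: draining the buffer after disconnection tells you nothing about packets that never traversed the upstream chain, because there are none.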

 

So... if all the packets inside the DAC's receiver chip have already passed through the network to get there - no matter when or where they were buffered before - and IF there is any kind of phase noise influence (positive or negative) and/or electrical noise that carries over from the upstream network - then that would ALREADY be there in the signal now being taken in by the DAC. Then, yes, those can be mitigated by the DAC's re-clocking process, its own power supply and electrical noise isolation capabilities, etc. (But as a user of a regen product, it seems your ears do agree there is something an upstream product can do that the async USB process alone is unable to do.)

 

Still, my point is simple: how do the packets arrive at the DAC unless through the network upstream of it? That's what I mean when I say the signal has already passed through all those devices and been influenced by them. And, as I think is being discussed above, this includes whatever Tidal does to get the signal out, whatever the ISP does to get the signal to you, your modem, and all the attendant power supplies for these things, etc. The only logical way to test the effect of a switch, a cable, a power supply, a circuit in your home, or the day of the week as it pertains to the noise on your mains line, is A vs. B: with vs. without the switch, the cable, a PSU; the same components plugged into one circuit in your home vs. another; the same components on a Sunday vs. on a Friday; etc. THEN, if we hear a difference, we know the actual perceptible effect this element is having. THEN we can all work together to come up with a way to measure what we are hearing. In other words, that A/B result is empirical data that warrants explanation, not dismissal.
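For an A-vs-B result like the one described above to count as empirical data, chance guessing has to be ruled out. A minimal sketch of scoring a blind trial using only Python's standard library (the trial counts are invustrative inventions, not real data):

```python
from math import comb

def binomial_p_value(correct, trials, p=0.5):
    """One-sided probability of at least `correct` hits by pure guessing."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(correct, trials + 1))

# Illustrative numbers only: a listener picks "switch in" vs. "switch out"
# correctly in 14 of 16 blind trials.
print(f"{binomial_p_value(14, 16):.4f}")  # 0.0021 -- unlikely to be guessing
```

A result near 50% correct, by contrast, is exactly what guessing produces, which is why "with vs. without" comparisons need enough repeated, blinded trials before either side can claim anything.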

24 minutes ago, plissken said:

 

 

Like that matters. I posted years ago a screenshot of all the process caching that goes on with even an optimized PC. A few more processes aren't going to matter.

 

If you couldn't hear the 'noise' the 120+ processes are generating you aren't going to hear a few more.

 

This ignorance is beyond the pale.

But what experiments have you done to say you can't hear that noise? There is a growing body of listening impressions in that big topic reporting different sound with processors throttled in different ways, lower-latency software, RAM-booted software, etc. Isn't all of this presumably reducing the audible effects of those processes? There is ignorance from inexperience, which is forgiven with enlightenment, and then there is willful ignorance despite enlightenment, which is unforgivable.

23 minutes ago, EdmontonCanuck said:

The thing that seems all "flat earth" to me is that some people believe that "noise" gets stored with a digital stream when it's buffered and is transmitted along the chain to other digital devices to eventually manifest itself when it undergoes analog conversion.

You got the reference backwards. Flat earth is the old dogma - which in this case is that bits are bits. Round earth is the new discovery of ways in which phase noise from prior clocking may indeed show up later. Not to mention electrical noise traveling with the signal (not stored in it) from the server, switch, endpoint, whatever does the buffering, through all their power supplies, etc. All of it apparently brought to the USB receiver chip to deal with, unless it's mitigated along the way. Obviously a great clock at the USB chip with great isolation characteristics would do a lot to mitigate all this. It seems to me that master clock connections on USB receiver chips are something we should see more widely adopted (I know they already exist), now that all these other kinds of master clocking have cropped up and there seems to be some consensus about their benefits.

12 minutes ago, Superdad said:

 

Seems a good place to reprint a post of @JohnSwenson's from October 2017:

 

The hypothesis goes thusly:

ALL crystal oscillators exhibit frequency change with power supply voltage change. This is known and well measured. A cyclical change in voltage causes a cyclical change in frequency, which shows up in phase noise plots. For example, if you apply a 100Hz signal to the power supply of the oscillator, you will see a 100Hz spur in the phase noise plot.

 

A circuit that has a digital stream running through it will generate noise on the power and ground planes of the PCB, just from the transistors processing that stream turning on and off. This effect is very well known and measured. Combine this with the previous paragraph and you have jitter on the incoming data stream producing varying noise on the PG planes, which modulates the clock and increases its jitter.

 

The above has been measured.
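The supply-ripple mechanism described above (voltage ripple pushing the oscillator frequency, producing a spur offset by the ripple frequency) can be illustrated with a toy simulation. The sample rate, carrier frequency, and deviation below are invented for visibility; this is not a measurement of any real oscillator:

```python
import numpy as np

fs = 100_000        # sample rate of the simulation, Hz
f_carrier = 10_000  # oscillator frequency, Hz
f_ripple = 100      # supply ripple frequency, Hz
deviation = 5.0     # peak frequency pushing from the ripple, Hz (exaggerated)

t = np.arange(0, 1.0, 1 / fs)

# Ripple on the supply modulates the oscillator's instantaneous frequency:
# f(t) = f_carrier + deviation * cos(2*pi*f_ripple*t)
phase = (2 * np.pi * f_carrier * t
         + (deviation / f_ripple) * np.sin(2 * np.pi * f_ripple * t))
clock = np.sin(phase)

spectrum = np.abs(np.fft.rfft(clock)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Spurs (sidebands) appear at f_carrier +/- f_ripple, as the post describes.
for f in (f_carrier - f_ripple, f_carrier, f_carrier + f_ripple):
    bin_ = np.argmin(np.abs(freqs - f))
    print(f"{f} Hz: {20 * np.log10(spectrum[bin_]):.1f} dB")
```

Running this prints clear spectral lines at 9,900 Hz and 10,100 Hz flanking the carrier; shrinking `deviation` lowers the spurs but never removes them, which is the point of the hypothesis.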

 

But shouldn't ground plane isolation and reclockers fix this? At first glance you would think so, but look carefully at what is happening. What is a reclocker? A flip flop. The incoming data with a particular phase noise profile goes through transistors inside the flip flop. Those transistors switching create noise on its internal PG traces, wires in the package and traces on the board. This noise is directly related to the phase noise profile of the incoming data. This PG noise changes the thresholds of the transistors that are clocking the data out thus overlaying the phase noise profile of the local clock with that of the clock used to generate the stream that is being reclocked. This process is hard to see, so I am working on a test setup that generates a "marker" in the phase noise of the incoming clock so it becomes easy to see this phase noise overlaying process.

 

This process has always been there but has been masked by the phase noise of the local clock itself. Now that we are using much lower phase noise local clocks, this overlaying is a significantly larger percentage of the total phase noise from the local clock.

 

Digital isolators used in ground plane isolation schemes don't help this. Jitter on the input to the isolator still shows up on the output, with added jitter from the isolators. This combination of original phase noise and that added by the isolator is what goes into the reclocking flip flop, increasing the jitter in the local clock. Some great strides have been made in the digital isolator space, significantly decreasing the added phase noise, which overall helps, but now the phase noise from the input is a larger percentage, so changes to it are more obvious.

 

The result is that even digital isolators and reclocking don't completely block the phase noise contribution of the incoming data stream. It can help, but it doesn't get rid of it.

 

For USB (and Ethernet) it gets more complicated since the data is not a continuous stream; it comes in packets, thus this PG noise comes in bursts. This makes analysis in real systems much more difficult since most of the time it is not there. Thus any effects on an audio stream come and go. Just looking at a scope is not going to show anything, since any distortion caused by this only happens when the data over the bus actually comes in. To look at anything with a scope will take synchronizing to the packet arrivals. Things like FFTs get problematic as well since what you are trying to measure is not constant. It will probably take something like wavelet analysis to see what is really happening.
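The point about needing to synchronize to packet arrivals can also be illustrated numerically (a toy model with invented parameters, not a measurement): averaging captures that are locked to the burst pulls it out of the noise, while averaging at random offsets smears it flat.

```python
import numpy as np

rng = np.random.default_rng(0)

trace_len = 200   # samples per scope "capture"
n_traces = 400    # captures to average

# A small glitch that exists only while a packet is being received,
# buried well below the random noise floor of a single capture.
burst = np.zeros(trace_len)
burst[50:70] = 0.2

def capture(synced_to_packet):
    """One capture: unit-variance noise plus the burst at a fixed or random offset."""
    trace = rng.normal(0.0, 1.0, trace_len)
    shift = 0 if synced_to_packet else rng.integers(0, trace_len)
    return trace + np.roll(burst, shift)

synced = np.mean([capture(True) for _ in range(n_traces)], axis=0)
unsynced = np.mean([capture(False) for _ in range(n_traces)], axis=0)

# Synchronized averaging recovers the burst; unsynchronized averaging smears it.
print(round(synced[50:70].mean(), 2), round(unsynced[50:70].mean(), 2))
```

The synchronized average converges on the 0.2 burst while the unsynchronized one flattens toward zero, which is the intuition behind "just looking at a scope is not going to show anything."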

 

The next step in my ongoing saga is to actually measure these effects on a DAC output. Again I have to build my own test equipment. The primary tool is going to be an ADC with a clock with lower phase noise than the changes which occur from the above. AND it needs to be 24 bits or so resolution. You just can't go out and buy these, they don't exist. So I build it myself.

 

I have done the design and have the boards and parts, but haven't had time to get them assembled yet. Then there is a ton of software to make this all work. Fortunately a large part already exists, designed to work with other systems but I can re-purpose it for this.

 

So it's not going to be right away, but hopefully not too off in the future I should be able to get to actually testing the end to end path of clock interactions all the way to DAC output.

 

John S.

===========

 

FYI, his elaborate clock test setup--we call it the Golden Gate Bridge--has been through a couple of iterations, and John has been working ever more actively on it--even this very day. But mostly in-between product development work.

Thank you.

17 hours ago, plissken said:

 

 

Sorry, but your response needs to be picked apart. I want to point out this tidbit: 'had a prototype of one the streamers'. I see a whole slew of issues in that statement WRT all the possibilities, namely design errors, that could have defeated reasonable measures a DAC designer took to insulate their product from noise.

 

With so much of your effort invested in highly buffered I/O systems, and knowing how they work, did it never occur to you to try it before?

 

My Emotiva DC-1 seems impervious to the systems I have had it connected to when it comes to testing the upstream noise possibility.

 

So the noise of your Emotiva is greater than anything that's come before it? Interesting.

19 hours ago, plissken said:

Who here would like to perform a blind test where I ship them a PC with Tidal and Prime 95 on it? They can connect it to their setup, and I can remotely manage the PC, turning Prime 95 on and off, to see if you can track it.

 

Chris?

I've tried to explain to you many times already that your "experiment" proves nothing. The logic of it is entirely faulty. There is, in fact, no logic to it. But you don't seem willing to engage on the level of logic. You are dead wrong about what is being heard under the two conditions you are comparing. So do I want to indulge some glitch in your cognitive reasoning by attaching a device of your making to my private home network? Excuse me if I decline.



