jabbr Posted October 24, 2020

4 hours ago, Superdad said:
> What seems to be missed in most of these discussions is that what is going on for these packet data interfaces is that jitter/phase-noise (from ALL the chips, etc.) is a cause, not an effect. It transfers and along the way is creating ground-plane noise/bounce, which in turn creates threshold jitter for subsequent circuits--ultimately through to the DAC. Please reread our paper (and the technically knowledgeable can ignore our attempts to simplify/explain the more basic concepts) and look for the core of what we are explaining: https://cdn.shopify.com/s/files/1/0660/6121/files/UpTone-J.Swenson_EtherREGEN_white_paper.pdf?v=1583429386

This is pure fantasy in modern Ethernet, i.e. anything built to specifications from the last 20 years. The stressed eye pattern, or stressed receiver test, was designed 20 years ago specifically to prevent this exact problem. Very nice story you continue to tell, but it simply doesn't happen on a competent network.

For further reading:

Custom room treatments for headphone users.
jabbr Posted October 24, 2020

18 minutes ago, R1200CL said:
> @jabbr https://en.wikipedia.org/wiki/Carl_Bergstrom That man's work concerns the flow of information through biological and social networks. We're talking about another network 🤓

The only jitter of consequence is in the unsupported stories, and in the lack of research into potential problems that were considered and solved in the last century. The arguments are very shaky and unsupported -- yes, that kind of jitter?

From the linked "whitepaper":
> Q. What about fiber-optic interfaces? Don't these block everything?
> A. In the case of a pure optical input (zero metal connection), this does block leakage current, but it does not block phase-noise effects. The optical connection is like any other isolator: jitter on the input is transmitted down the fiber and shows up at the receiver. If the receiver reclocks the data with a local clock, you still have the effects of the ground-plane noise from the data causing threshold changes on the reclocking circuit, thus overlaying on top of the local clock.

This is exactly what the stressed receiver test looks for. The transmitter "injects" jitter into the signal, and the receiver needs to be immune to a reasonable degree. You can't have significant ground-plane bounce and keep a tight eye pattern. The ground plane literally is the bottom of the eye.
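The jitter-budget idea behind stressed receiver testing can be made concrete with a small sketch. This is an illustrative model only: the unit interval corresponds to a 1.25 Gbaud gigabit-Ethernet serial lane, but the helper names (`total_jitter_ps`, `eye_opening_ps`) and the example numbers are my own, not values copied from any compliance specification.

```python
# Illustrative jitter-budget sketch (not actual IEEE 802.3 compliance math).
# A receiver passes a stressed test only if it still recovers data after the
# transmitter's injected deterministic jitter and the link's random jitter
# have eaten into the unit interval (the width of the eye).

UI_PS = 800.0  # one unit interval at 1.25 Gbaud (gigabit Ethernet serial lane), ps

def total_jitter_ps(rj_rms_ps: float, dj_pp_ps: float, q: float = 14.069) -> float:
    """Total jitter at a 1e-12 bit-error ratio: deterministic jitter
    (peak-to-peak) plus random jitter scaled by the two-sided Q factor."""
    return dj_pp_ps + q * rj_rms_ps

def eye_opening_ps(rj_rms_ps: float, dj_pp_ps: float) -> float:
    """Horizontal eye opening left after jitter closes in from both edges."""
    return UI_PS - total_jitter_ps(rj_rms_ps, dj_pp_ps)
```

With, say, 2 ps RMS random jitter and 200 ps of injected deterministic jitter, the eye shrinks from 800 ps to roughly 572 ps, and the receiver must still hit the target error ratio under those deliberately degraded conditions.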
jabbr Posted October 25, 2020

3 hours ago, ASRMichael said:
> Are you saying the only clock that matters is the DAC?

The global enterprise network market approaches $100 billion yearly, and all of these devices have femtosecond-range clocks. Good clocks in network devices are a given. PCs themselves are getting there -- the slow rollout of PCIe 4.0 has a lot to do with the difficulty of reliably meeting the timing standards (tight eye patterns). It's not that network clocks don't matter; it's that network devices are already designed to high standards. When you talk about the horsepower of a Corvette, you aren't measuring it against an actual pack of horses -- we are beyond that.
jabbr Posted October 25, 2020

27 minutes ago, PeterSt said:
> 2CV

...
jabbr Posted October 25, 2020

1 hour ago, PeterSt said:
> Any logic (that I can think of) adds jitter just because it doesn't respond equally from one sample to the other (or from one bit to the other, depending on what it is about). Your shift-buffer (thinking of your first DSD design) adds jitter. A fan-out buffer adds jitter.

Do you mean pin-to-pin skew or random jitter? There are circuits that clean jitter. Modern network chips obviously need to perform clock data recovery and clean incoming jitter -- that is what the stressed receiver test is testing. This article demonstrates the actual function of actual chips: https://www.analog.com/en/analog-dialogue/articles/dual-loop-clock-generator.html

But this has nothing to do with ADC/DAC clocks; it concerns the network clocks.
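A clock-cleaning PLL of the sort described in that article behaves, to first order, as a low-pass filter on the incoming clock's phase noise. Here is a minimal sketch of that jitter-transfer behavior, assuming a simple first-order loop; the function name and the bandwidth numbers are illustrative, not taken from any actual Analog Devices part.

```python
import math

def pll_jitter_transfer(f_hz: float, loop_bw_hz: float) -> float:
    """Magnitude of a first-order PLL's jitter transfer function.

    Input phase noise at offset frequencies below the loop bandwidth
    passes through nearly unattenuated; above the bandwidth it rolls
    off at 20 dB/decade. This is how a narrow-loop clock cleaner
    strips high-frequency jitter off a recovered clock."""
    return 1.0 / math.sqrt(1.0 + (f_hz / loop_bw_hz) ** 2)

# With a 1 kHz loop bandwidth, a 1 MHz jitter component on the incoming
# clock is attenuated by a factor of ~1000 (60 dB).
```

The design trade-off is that a narrower loop bandwidth rejects more of the incoming jitter but forces the local oscillator to be good on its own, since inside the bandwidth the output simply tracks the input.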
jabbr Posted October 25, 2020

FIFO buffers are the standard technique for crossing a clock domain. Whenever an incoming network signal that is clocked by an external device (the transmitter) is received, the receiver, which operates on its own clock, has to reclock the bits internally: the external (recovered) clock is used to clock the bits into a FIFO buffer, and the internal clock is used to clock the bits out of the buffer. This is such a standard operation that companies like Xilinx provide drag-and-drop FIFO blocks when programming an FPGA.

Of course the internal clock will have its own phase noise profile -- every clock does. The question is whether the phase noise profile of the external clock "infects" the phase noise profile of the internal clock. The design needs to ensure that it doesn't. One technique is the use of PLL-containing circuits for clock data recovery, followed by FIFO buffering. Both of these techniques are known to work and have been extensively measured to work. Copper Ethernet networks have not been so extensively tested with regard to common-mode noise rejection.
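The mechanism can be sketched in a few lines. This is a toy behavioral model, not FPGA code: the class and method names are mine, and a real dual-clock FIFO also needs gray-coded pointers and synchronizer flops, which are omitted here.

```python
from collections import deque

class AsyncFifo:
    """Toy model of a dual-clock (clock-domain-crossing) FIFO.

    write() represents an edge of the recovered external clock;
    read() represents an edge of the local clock. Because the output
    is retimed entirely by the local clock, the data leaving the FIFO
    carries only the local clock's phase noise, not the transmitter's.
    """

    def __init__(self, depth: int = 16):
        self.buf = deque(maxlen=depth)

    def write(self, bit: int) -> None:
        # Clocked into the buffer by the external (recovered) clock.
        self.buf.append(bit)

    def read(self):
        # Clocked out of the buffer by the local clock; None on underrun.
        return self.buf.popleft() if self.buf else None
```

Bits go in on one clock and come out, in order, on the other; an underrun or overrun just means the two clock rates were not matched closely enough for the chosen depth, which is a buffering problem, not a jitter-transfer problem.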
jabbr Posted October 25, 2020

9 minutes ago, PeterSt said:
> It is not, because a. it was you stating that only clocks can "define" jitter (while there is more doing that); b. the OP clearly suggests that something must be overlooked; c. I have said several times that it is NOT upstream jitter that influences the result for jitter in the D/A process (repeat: NOT); d. while the A/D process surely is influenced by the upstream (or even elsewhere) devices (my claim) ... OR ... a DAC sounds the same whatever is going on upstream. But sadly (for all who don't like what I am saying) this is not true. So what is it that we will be discussing next? ... that any DAC indeed sounds the same no matter what digital cable is used, no matter what playback software is used, no matter what PC is used?

Great! If we can agree that, for a variety of reasons, upstream network jitter *does not* leak into the DAC, then what are the possibilities? I know this is an open-ended question, but I have suggested that common-mode noise transmission is a good candidate as a factor. The evidence that various cabling, including various shielding patterns, affects SQ does support the idea that common-mode noise might be a factor. Why not differential-mode noise? Well, the signals are all differential, so ... And apropos this thread: buffers don't magically remove common-mode noise ... sorry.
jabbr Posted October 25, 2020

1 minute ago, PeterSt said:
> I hope you are not saying Sorry to me, because I claim the same. Now on to my mentioned "backdoor" entries ...

No -- "apropos this thread" was my way of indicating that I am replying to the OP/thread.
jabbr Posted October 30, 2020

On 10/26/2020 at 1:58 AM, KeenObserver said:
> Interesting! If two diametrically opposed interpretations of the laws of physics are put forth, can they both be right? Do the products lead to the sometimes false interpretation of the laws of physics, or do the sometimes false interpretations of physics lead to the product?

Laws of Physics? Which?