Superdad Posted May 11, 2018

Uh, I think the obvious is being overlooked here: forget the DAC for a moment. If the source is a computer (or some form thereof) playing back files (ripped from CD, downloaded, streamed, whatever), then that data has to exit the computer in some format. While I know there are some music servers with S/PDIF output--and some sound cards with the same--a computer is a nasty environment within which to place audio clocks and S/PDIF transmitting parts.

So the vast majority of "computer audiophiles" are sending the data from their computers towards their DACs either via USB or Ethernet. And at some point--generally in the DAC--that data will go through a PHY chip/processor and a protocol engine and be turned into I2S/DSD for the actual DAC chips/ladders, whatever. Thus use of S/PDIF or AES/EBU inputs on a DAC generally requires the use of a DDC, most of which take the form of a USB>S/PDIF converter.

While of course quality of implementation is the key, think about the general data flow. That is, which is the better path?

a) Computer USB > USB input of DDC > S/PDIF transmitter (embedding clock in data) > S/PDIF cable > DAC's S/PDIF receiver chip > reclocking > I2S to the DAC section.

OR

b) Computer USB > USB input of DAC > I2S to the DAC section.

And an Ethernet-input DAC is about equivalent: Ethernet PHY/processor > I2S.

Any DAC manufacturer who still says their S/PDIF input is better than their own USB input is simply admitting that they can't design a decent USB or Ethernet input board (or they want to sell their own outboard DDC).

Just my $0.02

UpTone Audio LLC
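The two paths above can be written out as simple stage lists to make the count explicit. This is only an illustration: the stage names are taken from the comparison in the post, while the clock-embed/recover flags are my own reading of where a clock is embedded or recovered along the way.

```python
# Each entry is (stage, True if that stage embeds or recovers a clock).
# Stage names follow the post's comparison; the flags are an interpretation
# added purely for illustration.
path_a = [
    ("computer USB output",                         False),
    ("USB input of DDC",                            False),
    ("S/PDIF transmitter (clock embedded in data)", True),
    ("S/PDIF cable",                                False),
    ("DAC's S/PDIF receiver (clock recovery)",      True),
    ("reclocking",                                  True),
    ("I2S to the DAC section",                      False),
]
path_b = [
    ("computer USB output",      False),
    ("USB input of DAC",         False),
    ("I2S to the DAC section",   False),
]

extra_stages = len(path_a) - len(path_b)          # path (a) adds 4 stages
extra_clock_handoffs = (sum(c for _, c in path_a)
                        - sum(c for _, c in path_b))  # and 3 clock handoffs
```

Counting it this way makes the post's argument concrete: every extra clock handoff in path (a) is a place where jitter can be added, and path (b) simply has none of them between the USB input and the I2S lines.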
Superdad Posted May 13, 2018

2 hours ago, Ralf11 said: "back to TOSLINK and SPDIF - how bad are they?"

What's the source? TOSLINK or S/PDIF from a computer motherboard? In that case, rather bad. (Could dig up measurements on the web if inclined.)

If you are talking about a converter (USB>S/PDIF or Ethernet>S/PDIF), then we come back to my point before: why insert more steps of conversion and clocking just to eventually get to the I2S/DSD signal for the DAC chips? Build a decent USB or Ethernet input board into the DAC and be done with it.

Beyond that, there are other new interface/data transport means that can be designed (we are working on a couple), but the path to wide adoption will be VERY slow. Standards are a good thing, but the S/PDIF standard--originally not even meant to be used externally--really ought to just fade away in the computer audio arena.
Superdad Posted May 14, 2018

4 hours ago, mansr said: "Really? What was it meant for then, and why has it been used on Philips CD players since the mid 80s?"

Sorry, I was thinking about I2S at the time. Doesn't change my opinion of S/PDIF though. A lot of effort is required at both the transmit and receive ends to make it perform well, and then there are still issues with the cable, etc. No denying that, unless generated right in the computer, S/PDIF adds another unnecessary conversion to the chain.
Superdad Posted May 14, 2018

1 hour ago, Summit said: "And still you use I2S (which was not made to be used externally), so it's possible (I guess) to improve the design like has been done with LVDS I2S."

Well, I2S internally--right from the Ethernet or USB input board--with master clocking done right. I2S (via LVDS over HDMI) is rarely done in ideal fashion, as the source ends up as the master clock (very few DACs feed their master clock out to slave the I2S source). I personally use external I2S at present (a modified Singxer SU-1 feeding I2S/DSD over an LVDS/HDMI cable to a Holo Spring L3) because the Crystek CCHD-575 clocks in the Singxer are a lot better than the clocks in the Spring (and/or the USB input of the SU-1 is better).

There is nothing preventing an enterprising manufacturer/designer from developing a more ideal--and yet still operating-system "sound card driver" compatible--two-piece solution with.... (oops. shhh... )
Superdad Posted May 15, 2018

3 hours ago, mansr said: "If you ignore drift, you'll need to either drop or insert samples whenever the clocks slip by more than a sample period. The S/PDIF spec requires a frequency accuracy of 1000 ppm for the sender. Suppose your local clock is running at a perfect 48 kHz while the sender is at the upper end of the permitted range, that is 48048 Hz. Every second, you'll be receiving 48 samples more than you know what to do with. You have no choice but to discard them, and this causes distortion."

Or you can do what @JohnSwenson did for the unique S/PDIF input of the Bottlehead DAC, which was to use an FPGA instead of a traditional S/PDIF receiver with its jitter-prone PLL. He did a little cleanup of the S/PDIF signal and then sent it into the FPGA for decoding. But the special part was that he used a digitally controlled low-phase-noise clock, with performance close to some of the best fixed-frequency clocks. The FPGA told the variable clock to speed up or slow down so it stayed synchronized to the average data rate of the source. It was a REALLY good S/PDIF input!
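The drift arithmetic in the quoted post, plus a toy version of the feedback idea described for the Bottlehead input, can be sketched in a few lines of Python. To be clear: the loop gains and the off-nominal source rate below are invented purely so the toy loop converges; they are not values from the actual FPGA design.

```python
# Worst-case sender drift allowed by the S/PDIF spec's 1000 ppm tolerance,
# using the numbers from the quoted post.
nominal_rate = 48_000                             # Hz, receiver's local clock
ppm_tolerance = 1_000                             # sender may be off by this much
surplus_per_second = nominal_rate * ppm_tolerance / 1_000_000
sender_rate = nominal_rate + surplus_per_second   # 48048.0 Hz
# A fixed 48 kHz receiver must discard those 48 extra samples every second.

# Toy model of the feedback idea: instead of dropping samples, a digitally
# controlled oscillator (DCO) is nudged until it tracks the source's average
# rate, using the receive-buffer fill level as the error signal.
dco_rate = 48_000.0
source_rate = 48_030.0        # some off-nominal source (made-up value)
buffer_fill = 0.0             # samples accumulated in the receive buffer
KP, KI = 0.1, 0.001           # made-up loop gains, chosen only for stability
for _ in range(5000):
    error = source_rate - dco_rate        # rate mismatch this tick
    buffer_fill += error                  # mismatch piles up in the buffer
    dco_rate += KP * error + KI * buffer_fill   # steer the DCO toward the source
# After settling, dco_rate tracks the source rate and the buffer stops growing:
# no samples ever need to be dropped or inserted.
```

The first half is just the 1000 ppm arithmetic from the quote; the second half shows why tracking the *average* data rate with a slowly steered low-phase-noise oscillator avoids the drop/insert problem entirely.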
Superdad Posted May 15, 2018

5 hours ago, mansr said: "Yes, this [using an FPGA and variable-rate clock] is a well-known and obvious approach. The point is that you must do something to synchronise with the source clock."

If that is "well known and obvious," can you point us to any other commercial DACs that implement this technique? It actually was not that easy to develop.
Superdad Posted May 15, 2018

31 minutes ago, mansr said: "The concept can still be obvious."

Yes, well it has been obvious for decades that electric vehicles with regenerative braking would be an advancement, and yet look around. Ideas are cheap and easy; proper design execution takes more work.
Superdad Posted May 15, 2018

36 minutes ago, Summit said: "SPDIF doesn't suck!"

I certainly never said it did. I don't see the world in black-and-white. My original point was simply that, unless one is producing a really good, well-clocked S/PDIF signal right in the source computer (eschewing USB altogether), the choice to use S/PDIF is just a choice to do USB>I2S>S/PDIF conversion externally--and to then require another S/PDIF>I2S conversion in the DAC. Between the elimination of stages and the ability to slave an internal USB>I2S board to the DAC's master clock, there would seem to be very few decent arguments in favor of using an external DDC. The only valid argument I have ever heard was that the DAC designer's USB (or Ethernet) input did not have enough effort put into it.

Cheers,
--AJC
Superdad Posted May 17, 2018

32 minutes ago, Ralf11 said: "what about a Jewish motherboard?"

You mean one that turns itself off at sunset on Friday and won't turn on until Saturday night?
Superdad Posted May 18, 2018

And again: what's better about an external USB>S/PDIF converter versus a decent DAC-internal USB>I2S board? I really prefer to have all the clocking as close to the DAC chips as possible (an oversimplification of how it all works--but I'm boarding a plane now).
Superdad Posted May 19, 2018

14 hours ago, Fitzcaraldo215 said: "What's better than asynch USB straight to the galvanically isolated DAC without any extra doodads or gizmos involved?"

Remember, DACs which have digital isolators (what you refer to as galvanic isolation) ALWAYS have them on the I2S lines AFTER the USB input PHY/processor. (So far the ONLY exception to that is the new Auralic Vega G2, which uses the same Silanna isolator chip as our ISO REGEN, right at the input.)

Quote: "Anticipating your likely answer, my Regen is out of the signal path gathering dust. Do you want to buy it back?"

No, but I'll sell you the much more advanced ISO REGEN. Same 30-day money-back guarantee as you had with the original when you got it three years ago.
Superdad Posted May 19, 2018

3 hours ago, jabbr said: "Please, @PeterSt isolates the Phasure NOS1a/G3 at the USB input."

Ah, forgetful me. Indeed the current Phasure includes a board (again with the Silanna chip) to pass USB through input isolation. Does it go directly USB>I2S, or does the galvanically isolated signal come back out to then get fed into the main USB input of the DAC? I don't recall.