
Why does SPDIF basically suck?



5 hours ago, mansr said:

If you ignore drift, you'll need to either drop or insert samples whenever the clocks slip by more than a sample period. The S/PDIF spec requires a frequency accuracy of 1000 ppm for the sender. Suppose your local clock is running at a perfect 48 kHz while the sender is at the upper end of the permitted range, that is 48048 Hz. Every second, you'll be receiving 48 samples more than you know what to do with. You have no choice but to discard them, and this causes distortion. Similarly, if the sender is slow, you'll have to somehow pull 48 samples per second out of thin air, again distorting the signal.
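The arithmetic in the quoted example can be sketched in a few lines. This is purely illustrative (the function name is mine, not from the S/PDIF spec):

```python
# Surplus samples per second when sender and receiver clocks differ.
# The S/PDIF spec's 1000 ppm tolerance means a nominal 48 kHz sender
# may actually run anywhere from 47952 to 48048 Hz.

def surplus_samples_per_second(sender_hz: float, receiver_hz: float) -> float:
    """Samples per second the receiver must drop (positive) or invent (negative)."""
    return sender_hz - receiver_hz

nominal = 48_000.0
worst_fast = nominal * (1 + 1000e-6)  # 48048 Hz, sender at +1000 ppm
print(surplus_samples_per_second(worst_fast, nominal))  # 48.0 extra samples/s
```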

Thanks for the explanation, Mans. I think my gross misunderstanding was assuming that a clock is a clock is a clock, i.e. so long as the receiving clock is accurate, you need no reference to the source clock for timing. Each sample, with its corresponding bit depth, would be clocked out in the order that it arrived (FIFO). Certainly, in the example where the sender is slow, I now get that you can't clock out samples that have not yet arrived.

Sound Minds Mind Sound

 

 

12 hours ago, mansr said:

One reason might be that to have the DAC be the master clock, you'd need to deal with an unknown amount of signal skew. A 1-metre cable would have a minimum of 10 ns round-trip delay, probably twice as much when LVDS interfaces and other logic is included. That's too much to be fed directly into most DAC chips, and you'd need a resynchronisation stage to ensure the phase requirements are met. The source would also need to support an external master clock, which not all do.
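As a rough sanity check of the 10 ns round-trip figure quoted above, assuming a typical coax velocity factor of about 0.66 (my assumption, not stated in the post):

```python
# Propagation delay through 1 metre of cable, assuming a velocity
# factor of ~0.66 (typical for coax; actual cables vary).
C = 299_792_458.0        # speed of light in vacuum, m/s
velocity_factor = 0.66   # assumed typical value
cable_m = 1.0

one_way_ns = cable_m / (C * velocity_factor) * 1e9
round_trip_ns = 2 * one_way_ns
print(round(round_trip_ns, 1))  # roughly 10 ns, matching the figure above
```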

Many years ago I had a CD playback system by Audio Alchemy which consisted of a CD transport, a jitter "filter" and a DAC, and the last two could be connected using a proprietary I2S cable which I never tried due to ignorance and lack of funds; I used S/PDIF coax.

The jitter interface is mentioned here:

https://www.stereophile.com/features/368/index.html

"Science draws the wave, poetry fills it with water" Teixeira de Pascoaes

 

HQPlayer Desktop / Mac mini → Intona 7054 → RME ADI-2 DAC FS (DSD256)

10 hours ago, mansr said:

Absent a feedback channel for flow control, the playback rate must be slaved to the sender in order to avoid clock drift.

But there are at least two ways in which the conversion clock can run independently of the playback rate (long-term clock rate) of the sender. One is to have a long buffer and delay playback by long enough to render any clock drift moot; I think Chord did this around 2007, and PS Audio later on. The price is a 2-4 second delay, which never bothered me.

The other is ASRC, where the sending clock rate is encoded in the data and the discrepancy between the sending and conversion clocks manifests as a minute change in pitch (if my brain's wired up correctly this morning).

You are not a sound quality measurement device

7 hours ago, Superdad said:

Or you can do what @JohnSwenson did for the unique S/PDIF input of the Bottlehead DAC, which was to use an FPGA instead of a traditional S/PDIF receiver with its jitter-prone PLL. He did a little cleanup of the S/PDIF signal, then sent it into an FPGA for the decoding. But the special part was that he used a digitally controlled low-phase-noise clock, with performance close to some of the best fixed-frequency clocks. The FPGA told the variable clock to speed up or slow down so it was synchronized to the average data rate of the source. It was a REALLY good S/PDIF input!

Yes, this is a well-known and obvious approach. The point is that you must do something to synchronise with the source clock.
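The feedback idea in the quoted description can be sketched as a simple control loop. This is only an illustration of the general technique, not John Swenson's actual implementation; the names, target fill, and gain are all my assumptions:

```python
# Sketch of a buffer-fill feedback loop: a controller nudges a digitally
# tunable clock up or down so its rate converges on the source's average
# data rate. A bare proportional controller stands in for the real design.

def control_step(buffer_fill: int, target_fill: int,
                 clock_ppm_offset: float, gain: float = 0.01) -> float:
    """Return a new clock offset in ppm. A buffer above target means the
    source is running faster than we are, so speed the conversion clock
    up; below target, slow it down."""
    error = buffer_fill - target_fill
    return clock_ppm_offset + gain * error

# If the buffer sits 50 samples above target, the clock is nudged faster:
offset = control_step(1074, 1024, 0.0)
print(offset)  # +0.5 ppm
```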

13 minutes ago, adamdea said:

But there are at least two ways in which the conversion clock can run independently of the playback rate (long-term clock rate) of the sender. One is to have a long buffer and delay playback by long enough to render any clock drift moot; I think Chord did this around 2007, and PS Audio later on. The price is a 2-4 second delay, which never bothered me.

We already discussed this a few pages ago. With a 4-second pre-buffer and a worst-case (1000 ppm slow) sender, your buffer will run dry in about an hour. Enough for a regular album, but too short for longer works, of which there are many, both classical (think opera) and modern (double albums).
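The "runs dry in about an hour" figure follows directly from the numbers: at 1000 ppm slow, the buffer drains at one millisecond of audio per second of playback. A quick sketch (function name illustrative):

```python
# How long a pre-filled buffer lasts against a slow sender. The buffer
# drains at (error in ppm * 1e-6) seconds of audio per second of playback.

def buffer_lifetime_seconds(prebuffer_s: float, clock_error_ppm: float) -> float:
    """Seconds of playback before the pre-buffer runs dry."""
    return prebuffer_s / (clock_error_ppm * 1e-6)

print(buffer_lifetime_seconds(4, 1000) / 60)    # ~66.7 min: "about an hour"
print(buffer_lifetime_seconds(4, 136) / 3600)   # ~8.2 h at 136 ppm
```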

 

13 minutes ago, adamdea said:

The other is ASRC where the sending clock rate is encoded in the data and the discrepancy between the sending and conversion clocks manifests as a minute change in pitch (if my brain's wired up correctly this morning).

One application of ASRC is to replace a variable-rate clock. Instead of adjusting your clock to match the sender, you measure the average rate of the incoming data (relative to the local clock) and resample the signal to make it match. The advantage of this method is that you can use a fixed-rate clock, and these generally have better stability than variable ones. Your clock is also not limited to the usual audio sampling rates, so you can choose whatever makes your DAC stage perform optimally. I believe Benchmark DACs use an approach similar to this.
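A minimal sketch of that ASRC idea, with linear interpolation standing in for a proper polyphase filter. All names are illustrative and this is not Benchmark's (or anyone's) actual implementation:

```python
# ASRC sketch: measure the average incoming rate against the fixed local
# clock, then resample by that ratio so output runs at the fixed rate.

def estimate_ratio(samples_received: int, local_clock_seconds: float,
                   nominal_rate: float) -> float:
    """Average incoming rate relative to the nominal rate, as measured
    by the local clock."""
    return (samples_received / local_clock_seconds) / nominal_rate

def resample_linear(block, ratio):
    """Resample a block by `ratio` using linear interpolation
    (a real ASRC would use a high-quality polyphase filter)."""
    out, pos = [], 0.0
    while pos < len(block) - 1:
        i = int(pos)
        frac = pos - i
        out.append(block[i] * (1 - frac) + block[i + 1] * frac)
        pos += ratio
    return out

# A sender 1000 ppm fast delivers 48048 samples in one local second:
ratio = estimate_ratio(48048, 1.0, 48000)
print(ratio)  # 1.001
```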

11 minutes ago, adamdea said:

but buffer size is not necessarily equal to one sample.

Of course it isn't. That would never work. The buffer size is, however, irrelevant. No matter what it is, if you don't synchronise playback to the incoming data, any buffer will eventually become inadequate. A properly designed DAC should be able to run indefinitely without glitches.

50 minutes ago, mansr said:

Of course it isn't. That would never work. The buffer size is, however, irrelevant. No matter what it is, if you don't synchronise playback to the incoming data, any buffer will eventually become inadequate. A properly designed DAC should be able to run indefinitely without glitches.

I see your comment about the 4-second buffer and running indefinitely, etc. But in my experience (with the PS Audio DAC) it works perfectly well, probably because most sending devices are much closer than 1000 ppm (I think one of my Squeezebox Touches runs at 44094 Hz, the other at 44100 Hz, as does my CDP). You can get through the Ring cycle with that.

Even listening to opera, there are always track breaks; no one wants a recording of the interval between acts. Even Rheingold is only 2.5 hours or so without an interval. A few things (maybe Einstein on the Beach?) go on for longer than that without an interval.

In practical terms there is no real issue apart from a delay at the beginning. And of course you can switch over to a PLL or re-buffer if there is a problem. Hence the Lavry-style solution.


6 minutes ago, adamdea said:

I see your comment about the 4-second buffer and running indefinitely, etc. But in my experience (with the PS Audio DAC) it works perfectly well, probably because most sending devices are much closer than 1000 ppm (I think one of my Squeezebox Touches runs at 44094 Hz, the other at 44100 Hz, as does my CDP). You can get through the Ring cycle with that.

That puts your Squeezebox at an error of 136 ppm, making it a bit of an outlier. Most devices seem to manage an accuracy of 50 ppm or better. So yes, in practice, you can get away for longer with a smaller buffer. Nevertheless, the spec is what it is, and it is IMO inexcusable to not comply with it.
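The 136 ppm figure is just the relative frequency error, which works out like so (function name mine):

```python
# Relative clock error in parts per million.

def clock_error_ppm(measured_hz: float, nominal_hz: float) -> float:
    """Deviation of a measured clock from nominal, in ppm."""
    return (measured_hz - nominal_hz) / nominal_hz * 1e6

# The Squeezebox above: 44094 Hz against a nominal 44100 Hz.
print(round(clock_error_ppm(44094, 44100)))  # -136 ppm
```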

 

6 minutes ago, adamdea said:

Even listening to opera, there are always track breaks; no one wants a recording of the interval between acts. Even Rheingold is only 2.5 hours or so without an interval. A few things (maybe Einstein on the Beach?) go on for longer than that without an interval.

How will the playback software know when to stop and restart the DAC so as to let it re-buffer?

 

6 minutes ago, adamdea said:

In practical terms there is no real issue apart from a delay at the beginning. And of course you can switch over to a PLL or re-buffer if there is a problem.

That really complicates things.

6 minutes ago, mansr said:

That puts your Squeezebox at an error of 136 ppm, making it a bit of an outlier.

Yes, that was the one I had an "improved" aftermarket clock fitted to (!)

Anyway, the major advantage of trying out a really long buffer is to discover that it doesn't make any difference, and to stop angsting about jitter.


5 hours ago, mansr said:

Yes, this [using an FPGA and variable-rate clock] is a well-known and obvious approach. The point is that you must do something to synchronise with the source clock.

 

If that is “well known and obvious,” can you point us to any other commercial DACs that implement this technique?  It actually was not that easy to develop.

17 minutes ago, Superdad said:

If that is “well known and obvious,” can you point us to any other commercial DACs that implement this technique?

I don't know about DACs specifically, but the idea of adjusting a variable-rate clock to match an external signal is hardly novel. It's such an obvious thing to do that I really can't recall when I first encountered it. If you don't see it that way, I can only say I'm sorry.

 

17 minutes ago, Superdad said:

It actually was not that easy to develop.

The concept can still be obvious.

19 hours ago, Audiophile Neuroscience said:

Depends what you mean by "most USB". Comparing apples with apples, for me, well-implemented USB sounds better than well-implemented coax S/PDIF or AES/EBU or glass ST fibre (the bayonet connection).

 

I mean that all the DACs I have owned have sounded better with I2S or coax. All of the DACs I have been able to test with different digital inputs have sounded better than or equal to USB.

 

Apples-to-apples comparisons can be made with music players like the Aurender W20/N10, Auralic Aries, etc. You can also compare whether the sound gets better or worse by using something like a mR/uR direct to the DAC or with a good DDC with BNC/S/PDIF/AES (apples to oranges).


Kids: Just use an external clock like dCS does! :)

 

NUC10i7 + Roon ROCK > dCS Rossini APEX DAC + dCS Rossini Master Clock 

SME 20/3 + SME V + Dynavector XV-1s or ANUK IO Gold > vdH The Grail or Kondo KSL-SFz + ANK L3 Phono 

Audio Note Kondo Ongaku > Avantgarde Duo Mezzo

Signal cables: Kondo Silver, Crystal Cable phono

Power cables: Kondo, Shunyata, van den Hul

system pics

20 hours ago, Superdad said:

 

Well I2S internally--right from the Ethernet or USB input board--with master clocking done right.

I2S (via LVDS over HDMI) is rarely done in ideal fashion as the source ends up as the master clock (very few DACs feed master clock out to slave the I2S source).

 

I personally presently use I2S external (modified Singxer SU-1 feeding I2S/DSD over LVDS/HDMI cable to Holo Spring L3) because the Crystek CCHD-575 clocks in the Singxer are a lot better than the clocks in the Spring (and/or the USB input of the SU-1 is better).

 

There is nothing preventing an enterprising manufacturer/designer from developing a more ideal--and yet still operating system "sound card driver" compatible--two piece solution with....  (oops.  :ph34r:  shhh... B|).

 

I don't know how your reply relates to my response to your declaration that S/PDIF wasn't even meant to be used externally, when in fact that was I2S, and that hasn't stopped you from using it, and shouldn't.

 

I wonder which digital codec, commonly used today, wasn't originally meant to be used for audio? :D

8 minutes ago, Summit said:

And look at which type of cable they are using to the external clock :P

BNC... Why?


15 minutes ago, Summit said:

I don't know how your reply relates to my response to your declaration that S/PDIF wasn't even meant to be used externally, when in fact that was I2S, and that hasn't stopped you from using it, and shouldn't.

He already said he misspoke, misremembered, or mis-something-else.

2 minutes ago, Summit said:

They use a coax cable with BNC connectors. S/PDIF uses the same 75-ohm coax cable, and this is a thread about "Why does SPDIF basically suck".

If S/PDIF sucks, it's not because of the cable or connector.

6 minutes ago, mansr said:

He already said he misspoke, misremembered, or mis-something-else.

 

Yes, but it was his intention to put something down solely because it was "originally not even meant to be used externally".

36 minutes ago, Summit said:

SPDIF doesn’t suck!

 

I certainly never said it did.  I don't see the world in black-and-white. 9_9

 

My original point was simply that, unless one is producing a really good, well-clocked S/PDIF signal right in the source computer (eschewing USB altogether), the choice to use S/PDIF is just a choice to do USB>I2S>S/PDIF conversion externally--and to then require another S/PDIF>I2S conversion in the DAC.

 

Between elimination of stages and the ability to slave an internal USB>I2S board to the DAC master clock, there would seem to be very few decent arguments in favor of using an external DDC.  The only valid argument I have ever heard was that the DAC designer's USB (or Ethernet) input did not have enough effort put into it.

 

Cheers,

--AJC

