
Discussing USB implementations. Why should the quality of the interface and cable matter?



Preface: I am a graduated computer engineer and work in the software business. I am also a relative newbie with respect to the audiophile scene. That being said, my father is an avid audiophile and I have had the pleasure of auditioning many setups with him. Recently, I built him a music server and we've auditioned the Meitner, Invicta, and Weiss DAC202 on his B&W 800 Diamond speakers.

 

Being a computer engineer, I am relatively well versed in the technical workings and interactions of computer interfaces. That being said, I do not have any professional experience with the USB interface. Most of my hardware engineering experience is limited to FPGA development, microcontroller programming, and communication with relatively simple interfaces such as serial interfaces.

 

I am not going to discount the possibility that the quality of the USB interface on the PC or that the quality of the USB cable matters. It very well might. I'm here to find out why.

 

To me, used in the intended asynchronous fashion, USB should be able to deliver perfect 1:1 data that is completely devoid of data degradation or timing issues. Take a USB drive, for example. When you transfer data to and from it, nothing happens to it. Error detection and proper back-and-forth communication between the drive and the USB port's controller ensure that the data arrives perfectly. If there are any flaws in the data, they are detected thanks to the per-packet CRC and promptly fixed by retransmission. Undetected errors are extremely rare, to the point where I have never encountered a corrupt file sent over USB in my lifetime. They're possible, but the CRC used is quite robust and makes undetected errors extremely unlikely.
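As a toy sketch of that detect-and-retransmit behavior (this is not a real USB stack; `crc_ok` merely stands in for the hardware CRC check, and the corruption model is invented for illustration):

```python
import random

def crc_ok(packet: bytes) -> bool:
    """Stand-in for the per-packet CRC16 check real USB hardware performs."""
    return not packet.endswith(b"!")        # '!' marks a corrupted toy packet

def send_with_retry(packet: bytes, channel, max_retries: int = 3) -> bytes:
    """Toy model of a USB bulk transfer: the receiver NAKs bad packets
    and the sender retransmits until the CRC checks out."""
    for _ in range(max_retries):
        received = channel(packet)          # the channel may corrupt the packet
        if crc_ok(received):
            return received                 # ACK: data accepted
        # NAK: CRC error detected, fall through and retransmit
    raise IOError("transfer failed after retries")

random.seed(0)                              # make the demo deterministic
noisy = lambda p: p + b"!" if random.random() < 0.3 else p
data = send_with_retry(b"PCM frame 0001", noisy)  # arrives intact despite the noise
```

The point of the sketch is simply that the receiving end either gets a bit-perfect packet or knows it didn't, and asks again.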

 

That being said, I realize that the USB interfaces in DACs may not use USB in the ideal fashion. They say they use asynchronous USB, but I'm not sure how it's implemented. Data is buffered into the DAC's USB interface at a rate determined by the DAC's asynchronous USB controller. How big is the buffer? How likely are data errors? I would have to assume that the PCM audio data arrives in "chunks", each chunk representing a bunch of PCM data with attached error-detection bits. If a chunk is determined to be erroneous, the DAC would have to request a new chunk from the PC. If it's not retrieved correctly in time, you'd basically have a missing chunk of data, resulting in audio dropouts. After the data makes it through the buffer, it's essentially reclocked by the DAC's master clock and sent to the DAC chip.
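To make that buffer-and-reclock picture concrete, here's a minimal sketch (class and numbers are my own invention, not any real DAC firmware): USB packets arrive in bursts on the producer side, while the consumer side drains exactly one sample per tick of the DAC's own clock, and an empty buffer means a dropout:

```python
from collections import deque

class DacBuffer:
    """Toy model of the buffer an async-USB DAC keeps between the bus and its master clock."""
    def __init__(self, capacity: int):
        self.buf = deque(maxlen=capacity)
        self.dropouts = 0

    def usb_packet_arrives(self, samples):
        """Bursty producer side: the PC pushes a chunk whenever the DAC asks for one."""
        free = self.buf.maxlen - len(self.buf)
        self.buf.extend(samples[:free])     # a real DAC would throttle requests instead

    def master_clock_tick(self):
        """Steady consumer side: one sample per tick of the DAC's own low-jitter clock."""
        if self.buf:
            return self.buf.popleft()
        self.dropouts += 1                  # buffer underrun -> audible dropout
        return 0                            # play silence

buf = DacBuffer(capacity=8)
buf.usb_packet_arrives([1, 2, 3, 4])
out = [buf.master_clock_tick() for _ in range(6)]  # last two ticks underrun
```

Notice that playback timing depends only on when `master_clock_tick` fires, never on when the USB packets happened to arrive.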

 

I've described what I suspect happens in asynchronous USB implementations. If designed in a proper fashion, I see absolutely no reason why the quality of the cable or USB port would affect the sound. At least, there's no reason it should affect the bits and how they're processed by the DAC in any way.

 

That being said, there are a few unknowns to me: can the power lines (VBUS and ground) in the USB connection somehow "dirty" the DAC's power? I don't know; I'm not terribly comfortable with electrical engineering subject matter.

 

If someone could enlighten me, I'd be forever grateful.

 


Welcome.

 

This question, or one almost like it, must come up here once a week. Sometimes the ensuing discussion lasts longer than a week, so there are often dueling debates on the subject.

 

But, since you framed the question so eloquently, I thought a sincere reply would be good for later reference by others. So, thanks for the nice lead-in!

 

First, this really is primarily an electrical engineering or physics problem, not so much a mathematical or data communications problem. As your lead-in so nicely explained, in theory what you described should be a perfect solution.

 

But here are some of the issues.

 

Unlike with a printer or USB drive, the digital bit stream always has to be converted to the analog domain for transmission to your ears through the air. Whether the D/A conversion is performed in a separate box that's labeled a DAC, or it's done by a conversion section of a receiver, or by the conversion portion of some "active" loudspeakers, analog electronics are involved at some point.

 

If you dig deep into those converters, you also find analog electronics used in the conversion process itself. Oscillators used as conversion clocks are all analog, even if their output sometimes resembles a square wave. The voltage regulators used by the conversion chips are analog. When you come down to it, even the digital devices themselves are analog in nature; it's just that the modulation used is not as dependent on signal levels as with regular analog systems because the modulation has such poor resolution - all zeros and not zeros. The trade-off there is the bandwidth used, of course. But that's drifting off the subject.

 

Anyway, the analog electronics are sensitive to perturbations. These can and do affect the goodness of the analog signal, both in amplitude and time. These imperfections can and do cause a wide range of distortions. That is what we're talking about here after all.

 

One of the blessings of digital electronics is that they work pretty well with sloppy signals. Close to zero is good enough. Not so here. One of the curses of digital electronics is that because they can use sloppy signals by making up for the sloppiness through computations, they often do. Good for the digital part of the system, not so good for the analog.

 

The whole bit with cables and design is to minimize the transmission of noise and to minimize the susceptibility of the analog parts of the system to its effects. There are lots of possible paths to accomplish this. The important point I'd like to convey is that these details are not so trivial to solve in the real world with imperfect components and such messy things as thermal noise, the charge of electrons, EM fields, and so on.

 

I've gone on too long, but let me offer a really poor but maybe a tiny bit relevant analogy that you might be familiar with.

 

Why can't you connect an Ethernet cable between New York and San Francisco and get perfect error free data transmission at the full data rate specified for the terminal equipment?

 

I suggest that the underlying problems to that are similar to the problem you outlined.

 


Hi,

 

Async USB is indeed the way of the future when it comes to external DAC units for home music playback. My Eastern Electric MiniMax DAC Plus comes with a built-in M2Tech OEM interface, which supports 24/192 just fine and which makes expensive digital cables and transports a thing of the past. I have my DAC hooked up to my Windows 7 netbook. The cost of my USB cable combined with my USB extension cable is below that of a round at the local pub.

 

The main problem with the older tech is called jitter, and it's caused by the clock of an external DAC unit being different from the clock on the other end of its digital cable. With async USB, just like you already pointed out yourself, data is error-corrected in a bi-directional type of communication, and clock differences lose their audible effects through a buffering mechanism that compensates for the skew. On top of that, the M2Tech's circuitry is galvanically separated from the rest of my DAC, so the noise in the USB signal has no effect on my sound quality either.
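To give a feel for the magnitude of jitter's effect, here's a quick back-of-the-envelope calculation (my own illustration, not from any product's specs): the slew rate of a full-scale sine peaks at 2*pi*f, so a clock error of dt smears a sample by at most 2*pi*f*dt relative to full scale.

```python
import math

def jitter_error_db(freq_hz: float, jitter_s: float) -> float:
    """Worst-case error from sampling a full-scale sine with timing error jitter_s.
    Peak slew rate of sin(2*pi*f*t) is 2*pi*f, so a clock error of dt produces
    an amplitude error of at most 2*pi*f*dt (relative to full scale)."""
    rel_error = 2 * math.pi * freq_hz * jitter_s
    return 20 * math.log10(rel_error)

# 1 ns of clock jitter on a 10 kHz full-scale tone:
print(round(jitter_error_db(10_000, 1e-9), 1))  # -84.0 (dB below full scale)
```

That is why the DAC's own oscillator, not the cable's arrival timing, needs to set the conversion instants.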

 

That said, a NOS DAC can typically suffer from a bit too much distortion, whereas a delta-sigma DAC can have, for example, too much crosstalk between its left and right channels, which messes up the stereo image, and so on. Needless to say, each of these problems interacts with the jitter in its own unique way. Also, obviously, the jitter caused by the older digital cable tech is not the only form of jitter affecting the DAC unit's analog output quality. However, if the older cable tech's jitter can be worked around, I believe it's safe to assume the battle is half won. Hey, at least it saves money. Lots and lots of money!

 

Another good thing about properly implemented async USB is that you can have digital volume control (I use the one that comes with the freeware foobar2000) without it messing with the jitter-removal mechanism inside the DAC a lot more than is truly necessary. Hence, I can live without an analog volume control (and I guess that also explains why my DAC doesn't have one). The term "bit perfect" is severely overrated. I can hear the audience breathe in a 24/96 digitized copy of the vinyl Harry Belafonte Live At Carnegie Hall, even with the digital volume set as low as -50 dB. And, BTW, I really do need that much attenuation because my DAC runs straight into my Emotiva XPA-2 power amp, which doesn't have any volume control but does have a boatload of gain, and then into my pair of black Canton Vento 890.2 DC speakers. To be completely off topic... every time I throw beans at my system, the grin on my face just grows bigger! :-D
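The arithmetic behind that claim is easy to check: each bit of resolution is worth about 6.02 dB, so digital attenuation effectively discards atten/6.02 bits of the source. A quick sketch (my own calculation):

```python
import math

def bits_after_attenuation(source_bits: int, atten_db: float) -> float:
    """One bit of resolution corresponds to 20*log10(2) ~= 6.02 dB, so digital
    attenuation by atten_db discards atten_db / 6.02 bits of the source."""
    return source_bits - atten_db / (20 * math.log10(2))

print(round(bits_after_attenuation(24, 50), 1))  # 15.7 bits left at -50 dB
```

So a 24-bit path attenuated by 50 dB still carries roughly 15.7 effective bits, which is close to CD resolution and consistent with still hearing low-level detail.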

 


That's a fine introduction, cg. And how refreshing to find someone with adjacent expertise seeking to broaden their knowledge instead of merely flaunting their 'credentials' and boldly proclaiming impossibilities from their vantage point of omniscience. If only other forums were this civilised: it's a jungle out there.

 

It requires a modicum of readjustment from a purely IT-oriented mindset to see what's important here: if you shout at a printer, it behaves the same; if you shout at a microphone, it doesn't. By design, audio devices amplify voltage fluctuations and are peculiarly sensitive in the time domain.

 

Solely considering data in this scenario is like exploring Flatland: it can look perfect, but it's only a slice of an object with more than one dimension.

 


Hi TheExodu5,

 

I agree with your observations. In my experience, when Asynchronous USB is properly done, the quality of the cable or the USB port doesn't affect sound quality. This is true to a certain degree - if the USB port is not capable of maintaining enough bandwidth, eventually there will be dropouts. If the USB power is extremely noisy, the DAC may not be able to suppress the noise.

 

Assuming that the USB port / cable are performing within the USB specification, there should be no difference - a good asynchronous DAC should be immune to computer noise and jitter.

 

Asynchronous USB can be implemented in many ways. The essence of it is that the DAC should store a safe amount of audio data in a buffer, away from the computer. The DAC firmware must ensure that the buffer never gets empty during playback. This way the DAC chip always plays data from the buffer. The precision of the playback timing is determined by the DAC's oscillators, and it is not degraded in any way by the PC hard drive, the PC clocks, or by hiccups in the USB interface.
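That keep-the-buffer-safe logic can be sketched as a tiny feedback rule (names and numbers are entirely hypothetical, not any real firmware): whenever the host asks how much data to send next, the DAC compares its buffer fill level to a target and nudges the request size, while playback always runs off the DAC's own fixed clock.

```python
def samples_to_request(fill: int, target: int, nominal: int) -> int:
    """Toy version of the async-USB feedback decision: if the buffer is running
    low, ask the host for a slightly larger chunk; if it is filling up, ask for
    a smaller one. The host adapts; the DAC keeps playing on its own clock."""
    if fill < target:
        return nominal + 1   # host clock is effectively slow: request extra
    if fill > target:
        return nominal - 1   # host clock is fast: request fewer
    return nominal

# Buffer running low: the DAC asks for one extra sample per chunk
print(samples_to_request(fill=100, target=256, nominal=48))  # 49
```

Real implementations smooth this over many frames, but the principle is the same: the DAC, not the PC, dictates the average data rate.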

 

Regarding the unknowns of the power lines (VBUS and ground) - a radical solution is to use galvanic isolation between the USB circuits and the DAC. Isolation prevents ground loops and stops computer-originated high-frequency noise.

 

I need to make a disclosure: I am a software developer like you, a long time audiophile like your father, and I develop USB to I2S interfaces and DACs.

 

Happy Holidays,

 

George

 

 


George, do you know if these asynchronous audio interfaces use any EDAC, as is typical in normal data transmissions (e.g. external hard drive interfaces)? The reason I ask is that there are those who have posited that server-side noise can result in data corruption (1s being received as 0s and vice versa). Thanks.

 


Definitely - EDAC, or error detection and correction, must be deployed at some communication level. Every communication channel can cause errors. In the approach that I've described, the USB link must be fast enough to supply corrected data while maintaining a safe buffer level.
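For the curious, USB data packets carry a 16-bit CRC for exactly this purpose. Here's a small Python implementation of CRC-16/USB (reflected polynomial 0x8005, i.e. 0xA001 bit-reversed, with init and final XOR of 0xFFFF); it's a sketch for illustration, not taken from any DAC's firmware:

```python
def crc16_usb(data: bytes) -> int:
    """CRC-16/USB: the checksum USB appends to data packets.
    Reflected polynomial 0x8005 (0xA001 bit-reversed), init 0xFFFF, final XOR 0xFFFF."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc ^ 0xFFFF

print(hex(crc16_usb(b"123456789")))  # 0xb4c8, the published check value
```

Any single corrupted bit changes the CRC, so the receiver can reject the packet and have it retransmitted, which is why random noise on the wire ends up causing retries rather than silently flipped bits.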

 


Thanks for the input guys.

 

I think I will put this subject to rest. My suspicions seem to have been mostly confirmed (and by someone with experience in developing such systems; thanks exa).

 

I will assume that a USB connection should be error- and jitter-free on the condition that the connection has enough bandwidth to keep the buffer supplied, and that the communication implementation includes proper error detection and correction. I will also assume that the quality of the source USB port power (VBUS and ground) should be largely irrelevant as long as the DAC unit properly isolates the receiving USB chip so that the power does not "dirty" the rest of the unit.

 
