
HDMI vs USB for audio?


Regnad


Technically, HDMI does not use a separate clock for audio - it shares the clock with video. Whether this makes an audible difference is debatable; I don't think it does, and I certainly haven't noticed any difference in my (admittedly basic and uncontrolled) tests. Under most circumstances the choice of interface makes little to no difference to sound quality anyway, unless one is poorly designed by the manufacturer - I have not seen that problem personally, nor should it be an issue with these interfaces. Null tests will demonstrate this; I'm currently trying to find a link that isn't a $20 AES article.
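
If anyone wants to run a null test themselves, here is a minimal sketch (my own illustration, not from any article I've linked): capture the same material through both interfaces, time-align the two recordings, and subtract them. If the residual sits far below the noise floor, the interfaces are effectively identical for playback. File names and the crude alignment method are just placeholders.

```python
# Rough null-test sketch (illustrative only): compare the same material
# captured via HDMI and via USB. Assumes two mono WAV files at the same
# sample rate; file names are placeholders.
import numpy as np
import soundfile as sf

hdmi, fs1 = sf.read("capture_hdmi.wav")
usb, fs2 = sf.read("capture_usb.wav")
assert fs1 == fs2, "Captures must share a sample rate"

# Time-align by cross-correlation (crude, but fine for a sanity check).
n = min(len(hdmi), len(usb))
hdmi, usb = hdmi[:n], usb[:n]
lag = np.argmax(np.correlate(hdmi, usb, mode="full")) - (n - 1)
if lag > 0:
    hdmi, usb = hdmi[lag:], usb[:n - lag]
elif lag < 0:
    usb, hdmi = usb[-lag:], hdmi[:n + lag]

# Null: subtract and report the residual relative to the signal.
residual = hdmi - usb

def dbfs(x):
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)) + 1e-20)

print(f"Signal RMS:   {dbfs(hdmi):6.1f} dBFS")
print(f"Residual RMS: {dbfs(residual):6.1f} dBFS")
```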

 

The problems with HDMI, in my opinion, are on the video side and stem from the spec not being as advanced as I'd like - even with 2.0 (it needs to allow a wider color gamut). Some interfaces have advantages over others, but one can rarely take advantage of them outside a studio, or unless you are dealing with ground loops or noise.

 

Maybe someone can comment on processing power - does one interface need more than the other? I do not know, but it would be interesting to find out.

 

You will get varying answers here - especially from high-end cable devotees - but it is a fact that each interface has more than enough bandwidth for the signal, and neither should be any more likely to give you problems with ground loops or external noise, so you should hear no difference.
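
To put a number on the bandwidth point, here is a quick back-of-envelope calculation (my own figures, not from anything linked in this thread): even eight channels of 24-bit/192 kHz PCM is a small fraction of what USB 2.0 can carry.

```python
# Rough bandwidth check (illustrative): multichannel PCM vs. link capacity.
channels, bit_depth, sample_rate = 8, 24, 192_000
audio_mbps = channels * bit_depth * sample_rate / 1e6   # ~36.9 Mbit/s of raw PCM

usb2_mbps = 480   # USB 2.0 high-speed signalling rate (before protocol overhead)

print(f"8ch 24-bit/192 kHz PCM: {audio_mbps:.1f} Mbit/s")
print(f"USB 2.0 raw rate:       {usb2_mbps} Mbit/s "
      f"(~{usb2_mbps / audio_mbps:.0f}x the audio payload)")
# HDMI carries its audio packets in the video blanking intervals; the spec
# allows up to 8 channels of 24-bit/192 kHz PCM, so capacity is not the
# limiting factor there either.
```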

 

If your concerns are over the clocking, I'll refer you to this article:

Does Your Studio Need A Digital Master Clock?

 

To quote: "Overall, it should be clear from these tests that employing an external master clock cannot and will not improve the sound quality of a digital audio system. It might change it, and subjectively that change might be preferred, but it won’t change things for the better in any technical sense."

 

 

I just found these:

http://www.computeraudiophile.com/f8-general-forum/hdmi-audio-future-14759/

 

http://www.hydrogenaudio.org/forums/index.php?showtopic=72894

 

http://www.aes-media.org/sections/uk/Conf2011/Presentation_PDFs/14%20-%20john%20dawson%20-%20Audio%20Transport%20over%20HDMI%20-%20AES%202011.pdf

Link to comment
Thanks Alex. So many ways of setting things up!

 

Fortunately there is constant work being done on the HDMI standard that should improve it in the future. It may also make sense for some to look into HDBaseT, or at the active HDMI cables on Monoprice and elsewhere. These allow for longer runs than optical or digital coax; in fact I think HDBaseT has a 100-meter maximum and can run over the Cat5 many of us already have installed in the walls. I have not tested this personally, and it is relatively new to the market and expensive, but I have no reason to believe it will negatively impact your sound or image quality.

Link to comment
Unless you have an HDMI receiver that is designed to remove jitter (Pioneer, and a few other high-end AVRs), HDMI has terrible audio performance.

 

pioneer 51fd jitter measurements. | AVForums

 

That is older information now, but unless the device specifically advertises that it eliminates HDMI jitter, it will have very poor performance compared to a good USB or S/PDIF interface.

 

It is true that HDMI often measures as a higher-jitter interface than USB; the question is whether that is audible - and in my experience it is not, nor have I seen DBTs saying otherwise (for HDMI vs. USB or vs. other interfaces). Additionally, AES and SMPTE approve of the interface, and that's good enough for me. As stated above, the problems with HDMI, in my opinion, are not related to audio.

 

Let's not start a new jitter argument thread, we already have too many, and you are welcome to pm me.

Link to comment
Personally, I think a lot of the claims about the audibility of jitter are overblown. But when you have jitter measured in thousands of picoseconds, it certainly will be audible.
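
For a sense of scale, here is a rough back-of-envelope calculation (my own, using the usual small-angle approximation for sinusoidal jitter, not a measurement of any particular device): sinusoidal jitter of peak amplitude dt on a tone at frequency f produces sidebands at roughly 20*log10(pi * f * dt) relative to the tone.

```python
# Rough sideband estimate for sinusoidal jitter (illustrative, small-angle
# approximation): sideband level relative to a full-scale tone at frequency f
# with peak timing error dt is approximately 20*log10(pi * f * dt).
import math

def sideband_db(f_hz: float, jitter_s: float) -> float:
    return 20 * math.log10(math.pi * f_hz * jitter_s)

for jitter_ps in (100, 1_000, 5_000):               # 0.1, 1 and 5 ns peak jitter
    level = sideband_db(10_000, jitter_ps * 1e-12)   # near-worst case: 10 kHz tone
    print(f"{jitter_ps:>5} ps jitter on a 10 kHz tone -> sidebands at about {level:.0f} dB")
```

So even several nanoseconds of jitter puts the artifacts below roughly -75 dB on a 10 kHz tone; whether that is audible is exactly the point of disagreement here.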

 

My opinion is the exact opposite of that - there are no problems with video over HDMI. It's perfect every time.

 

The issue is that HDMI is very susceptible to audio jitter unless the receiver is specifically designed to eliminate it (which will be a major selling point on the hardware).
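
For context on why the receiver's design matters so much: HDMI does not transmit an audio clock at all. The source sends two integers, N and CTS, and the sink regenerates the audio clock from the TMDS/video clock, so any jitter on that clock is the sink's problem to clean up. A small sketch of the arithmetic (the mechanism is from the HDMI spec; the example figures are just the commonly quoted values for 1080p60 and 48 kHz):

```python
# Sketch of HDMI Audio Clock Regeneration (ACR): the sink rebuilds the audio
# clock from the video (TMDS) clock using two integers, N and CTS, sent by
# the source.  Relationship from the spec: 128 * fs = f_tmds * N / CTS.
f_tmds = 148_500_000      # 1080p60 TMDS clock, Hz
fs = 48_000               # audio sample rate, Hz
N = 6_144                 # commonly recommended N for 48 kHz

cts = f_tmds * N // (128 * fs)           # value the source would send
fs_regen = f_tmds * N / (128 * cts)      # clock the sink regenerates

print(f"CTS = {cts}, regenerated fs = {fs_regen:.1f} Hz")
# Any jitter on the TMDS clock (and any rounding in CTS) lands on the
# regenerated audio clock unless the sink reclocks or buffers it away.
```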

 

Another big issue with HDMI is its susceptibility to ground loops. Something like an AVR is often connected to many devices at once, rather than simply being used as an audio connection (multiple playback devices connected to the AVR, which is in turn connected to a television or projector, or multiple outputs used at once).

 

I guess I should have explained my issues with HDMI video more in depth. The issue is not that the image isn't being sent perfectly; it absolutely is (within certain constraints). The issue, in my opinion, is that it isn't as advanced a connection as it should be. The 2.0 spec should have allowed higher image bit depths (preferably 12-bit), a wider color gamut, and better chroma subsampling (4:4:4 would be wonderful), all at 4K 60 Hz. Media that could take advantage of this is slowly becoming available.
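
As a rough illustration of where the spec runs out (my own back-of-envelope numbers, assuming the commonly quoted 594 MHz pixel clock for 4K60 timing and HDMI 2.0's 18 Gbit/s link with 8b/10b coding, i.e. about 14.4 Gbit/s of usable payload):

```python
# Back-of-envelope check of what fits in HDMI 2.0 at 4K 60 Hz (illustrative).
# Assumptions: 594 MHz pixel clock (4K60 timing incl. blanking) and an
# 18 Gbit/s link carrying ~14.4 Gbit/s of payload after 8b/10b coding.
PIXEL_CLOCK = 594e6          # pixels per second, including blanking
PAYLOAD = 18e9 * 8 / 10      # usable bits per second

formats = {
    "8-bit 4:4:4":  3 * 8,             # bits per pixel
    "12-bit 4:4:4": 3 * 12,
    "12-bit 4:2:0": 12 + 2 * 12 / 4,   # luma every pixel, chroma shared 4:1
}

for name, bpp in formats.items():
    gbps = PIXEL_CLOCK * bpp / 1e9
    verdict = "fits" if gbps <= PAYLOAD / 1e9 else "does NOT fit"
    print(f"4K60 {name}: {gbps:.1f} Gbit/s -> {verdict}")
```

In other words, 4K60 at 8-bit 4:4:4 only just squeezes in, which is presumably why 2.0 pushes the higher bit depths down to 4:2:0.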

 

I do not know why they made such a marginal upgrade to the spec, and papers from SMPTE (an organization of which I am a member) do not give any good reasons. But this is better left for AVS Forum talk.

 

It is also true that HDMI is more susceptible to ground loops, though I've never experienced this in any of my setups. If you are not experiencing a ground loop problem, this can be ignored.

 

Clearly we disagree on the audibility of jitter. I will make no further comments on this and I suggest the OP look in other threads and/or do his or her own research on the topic.

Link to comment
