
Trying to make sense of this article on HDMI...and getting my head around streaming 'jitter'



Cnet recently published an article on HDMI. You can read it here: http://news.cnet.com/why-all-hdmi-cables-are-the-same/8301-17938_105-20056502-1.html.

 

To sum up what it says: "The second part of TMDS (the DS part) is the HDMI cable itself. Each HDMI cable is actually multiple, small copper wires. Two versions of the data are sent over different wires. One of these is out of phase with the "real" signal. The TV receives all the data, puts the out-of-phase signal back in phase, then compares it to the "real" signal. Any noise picked up along the way will now be out of phase, and as such it is effectively negated and ignored.

If you're an audio person, this is similar to how balanced (XLR) cables work."
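The cancellation mechanism described above can be sketched numerically. This is a toy illustration of the differential-signalling principle with made-up numbers, not a model of TMDS itself:

```python
# Toy model of differential (balanced) transmission: the same signal is
# sent normally on one wire and inverted on the other. Noise picked up
# identically on both wires is common-mode and cancels at the receiver.

signal = [0.5, -1.0, 0.25, 0.0, 1.0]     # the "real" signal
inverted = [-s for s in signal]          # the out-of-phase copy
noise = [0.3, -0.1, 0.2, 0.05, -0.3]     # noise hitting both wires equally

wire_a = [s + n for s, n in zip(signal, noise)]
wire_b = [s + n for s, n in zip(inverted, noise)]

# Receiver: (A - B) / 2 restores the signal; the common-mode noise is gone.
recovered = [(a - b) / 2 for a, b in zip(wire_a, wire_b)]
print(recovered)  # matches the original signal (up to float rounding)
```

Note that this only removes noise that hits both conductors equally; it says nothing about timing (jitter), which is the subject of the rest of this thread.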

 

It then goes on to say, audio or video, you either get bit perfect or you don't. If there are errors in the stream, they get ignored. If there are bits missing, the whole picture and sound stops--it doesn't proceed with the missing parts. I just thought this was interesting in that most people decry HDMI as the worst way to go, particularly on this site. So if the mechanism is similar to how XLR works, I guess I am left wondering what exactly the problems with this are. It would be great if someone could explain this in such a way that a mere mortal such as myself might understand.

This subject touches on something else that occurred to me lately, streaming music to an Oppo. Is there such a thing as jitter over Ethernet? I know it sounds sort of silly saying that. But really, isn't the advantage of streaming that you needn't worry about the way the bits are getting to your transport? Or do the bits go through the same thing they go through on their merry way through S/PDIF, jittering and stuttering their way toward analog conversion? Or is jitter only a factor once the information has been streamed to the transport and goes on its way to the receiver or DAC?

 

Just reading the little bits in the above article, it would seem to me, just being a layperson who loves listening actively to music, that HDMI would be one of the preferred ways of moving bits. The reason for the supplemental question on Ethernet comes because if Ethernet is inherently a cleaner way to move bits of audio, and HDMI is somewhat akin to the way balanced connectors work, then I must be missing why this isn't the audiophile dream. I'd like to understand why it isn't, if that is indeed true. Anyone who has an opinion, feel free to chime in.

It just made me think of that recent thread on AVRs and such, where the use of HDMI came up a few times. A few people even hinted at people's ears being a bit suspect because they used HDMI. (So of course they couldn't hear a difference between transports, because that person used HDMI for all their connections... something like that. I use HDMI quite a bit myself and can hear the difference between different transports I've used on my modest system, so I sort of had to laugh at that; but if people think that, I'd really like to know more about why they feel so strongly about this.) I've heard the video argument, but have not heard a difference feeding the video signal out a different cable from the audio. It did not make the music magically brighter. It made it sound... exactly the same. Great.

 

 

 

 

Macbook Pro 2010->DLNA/UPNP fed by Drobo->Oppo BDP-93->Yamaha RXV2065 ->Panasonic GT25 -> 5.0 system Bowers & Wilkins 683 towers, 685 surrounds, HTM61 center ->Mostly SPDIF, or Analog out. Some HDMI depending on source
Selling Art Is Tying Your Ego To A Leash And Walking It Like A DoG


Most HDMI audio implementations use the original HDMI audio spec, where the audio clock is derived from the video clock with a PLL, in a way somewhat similar to S/PDIF. This is sensitive to the amount and properties of jitter in the received data. For example, Sony and Pioneer each have a proprietary way of doing asynchronous audio transfer between two supported devices of the same brand.

 

On Ethernet, when the TCP protocol is used for streaming, the transfer can be technically asynchronous, in the sense that the receiver runs the clock and the sender is subject to flow control. So it is technically closer to, for example, asynchronous USB.
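A toy model of that receiver-paced behaviour, with the TCP receive window played by a small bounded queue (an illustration of flow control in general, not of any real streaming stack):

```python
# The sender can only push as fast as the receiver drains the buffer:
# when the "window" (bounded queue) is full, put() blocks. The receiver's
# own pace therefore clocks the whole transfer.
import queue
import threading

window = queue.Queue(maxsize=4)   # stands in for the TCP receive window
received = []

def sender():
    for sample in range(16):
        window.put(sample)        # blocks while the window is full
    window.put(None)              # end-of-stream marker

def receiver():
    while True:
        item = window.get()       # consumption happens at the receiver's pace
        if item is None:
            break
        received.append(item)

t = threading.Thread(target=sender)
t.start()
receiver()
t.join()
print(len(received))  # 16: everything arrives, paced by the receiver
```

The point is that the sender never dictates timing; it stalls whenever the receiver falls behind, which is why the receiver's local clock can be the only clock that matters.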

 

A second level of audible differences is unrelated to the async/sync transfer mechanism and more related to isolation of noise coming from the sender device over the cable. Due to the HDMI spec, the available audio data bandwidth depends on the video resolution.

 

It is actually somewhat challenging to keep a clock in the MHz range clean to an SNR equivalent of audio-band SNR (> 100 dB). Any noise on the clock signal causes variation in the time at which a logic-level transition is interpreted.
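The mechanism can be put into numbers: amplitude noise at the logic threshold moves the apparent edge time by roughly the noise voltage divided by the edge's slew rate. The figures below are purely illustrative, not from any specific HDMI part:

```python
# Noise voltage on a clock edge translates into timing error:
#     delta_t ~ v_noise / slew_rate
slew_rate = 1.0e9        # edge speed: 1 V per ns, expressed in V/s
noise_voltage = 5.0e-3   # 5 mV of noise riding on the clock line

timing_error = noise_voltage / slew_rate
print(f"{timing_error * 1e12:.1f} ps of timing error")  # 5.0 ps
```

Even a few millivolts of noise on a realistic edge produces picoseconds of jitter, which is why the > 100 dB requirement is hard to meet.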

 

 

Signalyst - Developer of HQPlayer

Pulse & Fidelity - Software Defined Amplifiers


Like everything else it comes down to the way it is implemented.

FWIW, when playing HD video clips from a USB stick into my WD TV Live, then into my DAC, I found that HDMI into the 40" Samsung LED-backlit TV, then Toslink from the TV into the DAC, sounded better than direct Toslink from the WD TV Live (the DAC has 2 optical inputs), despite the WD TV Live having the benefit of being supplied by a very good linear PSU.

I also hear clear differences between different Transports, although I haven't tried this via HDMI.

Unlike many here, I am not a fan of USB audio, whether via the Benchmark DAC or other DACs I have heard.

USB power is often too damn noisy !!!

The link attached (supplied by another CA member) gives some interesting insights into why, but being of commercial origin, may be subject to a little exaggeration? My own USB power supply is considerably lower noise etc. than this commercial product, but it may give improved results with some PCs and laptops.

SandyK

 

http://www.aqvox.de/usb-power_en.html

 

 

How a Digital Audio file sounds, or a Digital Video file looks, is governed to a large extent by the Power Supply area. All that Identical Checksums gives is the possibility of REGENERATING the file to close to that of the original file.

PROFILE UPDATED 13-11-2020


I am not certain I understand why the audio clock is dependent on the video clock, or what that means when no video signal is present. Or how that might interact with video which has no resolution; I guess I am to assume that said device implements its own resolution? Thanks to both of you. I'll try to digest this some more. So can I assume that, in the absence of an external DAC with asynchronous transfer, Ethernet is going to be closer to giving me a great signal? Just curious, as the Oppo has pretty much sold me on the benefits of streaming to a great device... the HDMI/S/PDIF thing not so much...

 

Macbook Pro 2010->DLNA/UPNP fed by Drobo->Oppo BDP-93->Yamaha RXV2065 ->Panasonic GT25 -> 5.0 system Bowers & Wilkins 683 towers, 685 surrounds, HTM61 center ->Mostly SPDIF, or Analog out. Some HDMI depending on source
Selling Art Is Tying Your Ego To A Leash And Walking It Like A DoG


HDMI audio timing is determined by the source component; there's no feedback from the receiver (sink) to correct for data timing errors. So even if you have things balanced from a noise perspective, the jitter problem is very real and difficult to control.

This is classic adaptive control, compared to asynchronous operation, where the receiver controls the timing. That is a LOT better, to the point where you can say jitter doesn't matter any more.

 

When (not if) timing errors occur, they lead to a gobful of jitter, and you can hear it. I set up the Bill Evans Trio at 24/192 to run through the Vaio's HDMI to a Sony AVR for this post. Cymbals were splashy, and the overall impact of the music was certainly dull compared to an asynchronous DAC.

 

The Yamaha RX-V2065 has a very good pedigree behind it; HDMI audio won't bring out its best, due to the transmission. However, you could get some great, pleasing results for a small outlay by feeding an asynchronous SPDIF converter, such as a V-DAC, M2Tech or Halide Bridge, from the Mac into the RX-V2065's coax inputs.

AS Profile Equipment List        Say NO to MQA


I am not certain I understand why the audio clock is dependent on the video clock

 

It's a design decision to make it easier to keep video and audio in sync. Although in any sane system, like normal computer playback, it is done the other way around: the next video frame is displayed after a certain number of audio samples has been played back (sync-on-audio).
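A minimal sketch of sync-on-audio, assuming illustrative rates (48 kHz audio, 24 fps video): the audio sample counter is the master clock, and a frame is released only when playback has reached its due time.

```python
# Sync-on-audio: video is slaved to the audio sample counter.
sample_rate = 48_000   # Hz, the master timebase
frame_rate = 24        # frames per second

samples_per_frame = sample_rate / frame_rate   # 2000 samples per frame

def frame_due(samples_played, frames_shown):
    """The next frame is shown once enough audio has been played."""
    return samples_played >= (frames_shown + 1) * samples_per_frame

print(frame_due(1999, 0))  # False: frame 1 not due yet
print(frame_due(2000, 0))  # True: 2000 samples played, show frame 1
```

HDMI inverts this relationship: the video (pixel) clock is the master and the audio clock is derived from it, which is what the rest of this post is about.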

 

what that means when no video signal is present. Or how that might interact with video which has no resolution--I guess I am to assume that said device implements its own resolution?

 

There's no audio without video on HDMI (it's just DVI on steroids); with audio-only devices, or when playing only audio, a black video image is usually transmitted at some suitable resolution.

 

 

Signalyst - Developer of HQPlayer

Pulse & Fidelity - Software Defined Amplifiers


Hey Miska - can you point out the right section of the CEA 861 standards to document this?

 

I was thinking that the EDID for an audio-only connection would not publish a video capability, and thus only data island transmissions (the part that carries audio data) would occur. DI transmissions, carrying PCM or DSD data, are packetized similarly to IP packets and reassembled at the receiving end. I.e. audio jitter, as we experience with S/PDIF, would not be a factor if this is so, again like IP transmissions. Data transmission jitter is not going to affect the audio data stream unless it is so bad as to cause dropped packets.

 

I'm not saying my understanding is correct, just that is where I am currently at.

 

Anyone who considers protocol unimportant has never dealt with a cat DAC.

Robert A. Heinlein



Hey Miska - can you point out the right section of the CEA 861 standards to document this?

 

I'm reading HDMI specification v1.3a.

 

Section 5.2.3.2 describes Data Island placement.

Section 5.3.3 describes the Audio Clock Regeneration Packet, and the following sections the audio packets.

Section 7.2 discusses audio and clocking.

"Audio data being carried across the HDMI link, which is driven by a TMDS clock running at a rate corresponding to the video pixel rate, does not retain the original audio sample clock. The task of recreating this clock at the Sink is called Audio Clock Regeneration."

 

"The Source shall determine the fractional relationship between the TMDS clock and an audio reference clock (128x audio sample rate [fs]) and shall pass the numerator and denominator of that fraction to the Sink across the HDMI link. The Sink may then recreate the audio clock from the TMDS clock by using a clock divider and a clock multiplier."

 

Section 7.2.3 lists recommended dividers and multipliers for certain sampling rates for certain TMDS clock frequencies.
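The regeneration arithmetic from Section 7.2 is easy to check numerically. The sketch below uses N = 6144 and CTS = 74250, which to my understanding is the recommended pair for 48 kHz audio on a 74.25 MHz TMDS clock; treat the exact values as an assumption and check them against the spec's tables.

```python
# HDMI Audio Clock Regeneration: the sink rebuilds the audio master clock
# from the TMDS (pixel) clock and the N/CTS values sent by the source:
#     128 * fs = f_tmds * N / CTS
f_tmds = 74_250_000   # TMDS clock in Hz (e.g. 720p/1080i video)
N = 6144
CTS = 74250

audio_master_clock = f_tmds * N / CTS   # equals 128 * fs
fs = audio_master_clock / 128
print(fs)  # 48000.0
```

The arithmetic is exact, but note that any jitter on `f_tmds` is carried straight into the regenerated audio clock; the division and multiplication do nothing to clean it up.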

 

A nice part of HDMI is that it can technically support up to 12.288 MHz DSD...

 

Section 7.11 documents the Audio Rate Control feature for running an asynchronous audio stream, but currently I'm not aware of any sink that supports it in standard-compliant form; only those proprietary Sony and Pioneer implementations.

 

Most source device manuals also clearly state that for audio-only use, the device transmits blank video data.

 

P.S. In some tens of recent jitter measurements for different AVRs with HDMI that I've seen, the average jitter figure has been around 1.5 ns.

 

 

Signalyst - Developer of HQPlayer

Pulse & Fidelity - Software Defined Amplifiers


1. The thing Cnet was talking about only works as a proprietary solution (i.e. audio clocking over HDMI) for acting like balanced outputs.

2. 1.5 ns is a lot of jitter?!?!

3. And basically, while such an implementation is possible, it's only available between devices from the same manufacturer.

 

I really appreciate everyone's response here. Not sure that it made that much more sense to me afterwards, but I am sort of getting why ethernet and such is not asynchronous transfer...

Thanks for the advice on the Halide Bridge, etc. I am hoping to invest in such a solution, but I just bought the entirety of my system over the last year and I am unfortunately BROKE. So it will go on a wish list in the meantime. This is the first time I have actually heard some great, understandable reasons why I should not go with HDMI all the time. Thanks!!!!

 

 

 

Macbook Pro 2010->DLNA/UPNP fed by Drobo->Oppo BDP-93->Yamaha RXV2065 ->Panasonic GT25 -> 5.0 system Bowers & Wilkins 683 towers, 685 surrounds, HTM61 center ->Mostly SPDIF, or Analog out. Some HDMI depending on source
Selling Art Is Tying Your Ego To A Leash And Walking It Like A DoG


Thanks for the references. Yes, I agree with you on the jitter measurements, similar to what I am seeing here.

 

Have you seen any testing where a direct relationship to the audio bit rate was measured?

 

Yours,

-Paul

 

 

Anyone who considers protocol unimportant has never dealt with a cat DAC.

Robert A. Heinlein


I have no idea if that's a lot of jitter or not...I was asking! haha! You're talking to a guy who loves music and navigated this field with my ears...it wasn't until I came upon this site (after purchasing my equipment) that I realized there was much going on I didn't understand.

 

Macbook Pro 2010->DLNA/UPNP fed by Drobo->Oppo BDP-93->Yamaha RXV2065 ->Panasonic GT25 -> 5.0 system Bowers & Wilkins 683 towers, 685 surrounds, HTM61 center ->Mostly SPDIF, or Analog out. Some HDMI depending on source
Selling Art Is Tying Your Ego To A Leash And Walking It Like A DoG


It isn't clear where in the transmission chain the 1.5 ns jitter measurement was made. Was it at the TMDS frame level? At the audio data level?

 

If it was measured at the audio data level, just to put things in perspective: if we consider a 192 kHz sample rate, the stated jitter value is less than 0.03% of the sample period. It would be even less important at lower sample rates. I guess it would have no effect for all practical purposes.
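The arithmetic behind that percentage, using the 1.5 ns figure quoted earlier in the thread:

```python
# 1.5 ns of jitter compared against the 192 kHz sample period.
jitter = 1.5e-9          # seconds
sample_rate = 192_000    # Hz

sample_period = 1 / sample_rate          # ~5.2 microseconds
fraction = jitter / sample_period
print(f"{fraction:.2%} of the sample period")  # 0.03%
```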

 

On the other hand, any digital transmission system is prone to jitter, and a properly designed system should be able to deal with the expected amount of jitter.

 

From what was said before about the HDMI 1.3a specification, I understand that audio data is sent in packets along with information about the audio sample rate. Therefore, audio data should be buffered by the receiver, the proper clock rate determined from the recovered video clock by a multiplication and division operation, and the audio samples then sent to the DAC at the proper sample rate. If the reclocking circuitry is properly designed, the output jitter should be minimal.

 

This would be entirely similar to streaming from a file on a hard disk. In that case the data is buffered and output at the proper sample rate as well.

 

Cheers

 

jrlx


Was it at the TMDS frame level? at the audio data level?

 

From the D/A-converted analog audio signal.

 

jitter value is less than 0.3% of the bit period. It would be even less important at lower sample rates

 

That would mean 0.3% distortion. For 24-bit data, the DAC's clock input shouldn't deviate by more than a -144 dB fraction of the timing interval. For 192/24 data that is about 0.3 ps.
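Reproducing the arithmetic behind that 0.3 ps figure (a 2^-24, i.e. -144 dB, fraction of the 192 kHz sample interval):

```python
# A 24-bit (-144 dB) fraction of the 192 kHz sample interval.
sample_period = 1 / 192_000          # ~5.2 microseconds
allowed_deviation = sample_period / 2**24

print(f"{allowed_deviation * 1e12:.2f} ps")  # 0.31 ps
```

Compare that sub-picosecond budget with the ~1.5 ns average measured on AVRs over HDMI: the measured jitter is roughly four orders of magnitude larger.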

 

If the reclocking circuitry is properly designed the output jitter should be minimal.

 

This is exactly the same situation as for S/PDIF.

 

 

Signalyst - Developer of HQPlayer

Pulse & Fidelity - Software Defined Amplifiers


But didn't jrlx just negate what was said by the others here? To my eyes and mind, what I got from the original article was that if the audio arrives at the transport and somehow gets 'messed up', missing bits, etc., HDMI is an all-or-nothing kind of thing: either it works or it doesn't. That's sort of where my confusion comes in with respect to jitter and such. The article makes it sound like you are either getting bit-perfect or not. It also seems to say that if there are more than a few bytes of a file missing or corrupted somehow midstream, the signal will just stop. Or did I miss something? Is this throwing out of corrupted information the PLL implementation that was spoken of earlier, between proprietary systems?

I hope no one thinks I'm beating this horse to death; I am really trying to understand this, particularly in light of all the comments I've heard suggesting HDMI to be the bane of audiophile ears. And of course I am not saying there's not better sound to be had, I am just curious about the implementation of this HDMI thing. Is there something proprietary to Oppo that makes their implementation particularly good? I definitely hear an improvement (even over HDMI) from the Oppo over the Mini over the same protocol. So jrlx just threw a monkey wrench in what I thought I was understanding... It's ok jrlx, I am a bit slow... Now Miska chimes in and I'm way confused! :-)

 

Macbook Pro 2010->DLNA/UPNP fed by Drobo->Oppo BDP-93->Yamaha RXV2065 ->Panasonic GT25 -> 5.0 system Bowers & Wilkins 683 towers, 685 surrounds, HTM61 center ->Mostly SPDIF, or Analog out. Some HDMI depending on source
Selling Art Is Tying Your Ego To A Leash And Walking It Like A DoG


To my eyes and mind what I got from the original article was that if the audio arrives at the transport and somehow gets 'messed' up, missing bits, ect. it seems that hdmi is an all or nothing kind of thing. Either works or doesn't.

 

For PCM D/A conversion, you can think of a 2D coordinate system, or a grid. Vertical-scale values are determined by the sample values with, for example, 16- or 24-bit accuracy. The horizontal-scale stepping interval is assumed to be static for the sampling rate (a 1/fs interval).

 

Now, if a Y-axis value has an error from the real value, there's a corresponding distortion in the waveform. If the X-axis stepping (the distance between the vertical grid lines) has an error from the assumed static and perfect interval, there's a corresponding distortion in the waveform. In order to properly re-create the waveform (assuming the original sampling side was perfect), both the Y- and X-axis values need to have equally low error. Linearity error in the D/A converter shows up in the Y-axis conversion, and timing error (jitter) in the D/A converter shows up in the X-axis conversion. The two axes are technically related through the cosine and sine functions for the continuous waveform function at the output.

 

The above description applies directly to a traditional multibit ladder DAC. The process is more complicated with a modern sigma-delta converter.

 

Unfortunately, many still ignore the timing requirements (X-axis accuracy) completely while maintaining certain claims about "bit-perfect" sample values (Y-axis). Both of these aspects are actually equally important. In many HDMI cases, X-axis accuracy results in the analog end result being somewhere between "12- and 14-bit accuracy". HDMI leaves it to the receiver (sink) to regenerate an accurate clock for the X-axis, the basis for which is the pixel clock. So yes, the receiver can either successfully receive the data or not, but that is unrelated to what kind of outcome there is after D/A conversion (if there is any in the first place), which falls outside of the HDMI spec.
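One way to connect X-axis (timing) error to the "12- and 14-bit" figure is the standard rule of thumb for the jitter-limited SNR of a full-scale sine, SNR = -20·log10(2π·f·tj). This is a back-of-the-envelope check, not a computation from the post above; the 10 kHz tone and 1.5 ns rms jitter are assumed values:

```python
# Jitter-limited SNR for a full-scale sine wave:
#     SNR_dB = -20 * log10(2 * pi * f * t_jitter)
import math

f = 10_000      # test-tone frequency, Hz
tj = 1.5e-9     # rms jitter, seconds (the AVR average cited in this thread)

snr_db = -20 * math.log10(2 * math.pi * f * tj)
enob = (snr_db - 1.76) / 6.02     # effective number of bits

print(f"SNR ~ {snr_db:.1f} dB, ~{enob:.1f} effective bits")
```

That lands around 80 dB, i.e. roughly 13 effective bits, consistent with the 12-to-14-bit range mentioned above.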

 

 

Signalyst - Developer of HQPlayer

Pulse & Fidelity - Software Defined Amplifiers


I have heard of some newer DACs that could be arriving with I2S-over-HDMI support. Unfortunately, Miska, that went right over my head; I'm still digesting it with my remaining math brain cells (there are exactly 3 left, and two have ADD). Although I guess I am getting that the current implementation colors the sound with some kind of erroneous results. "12-14 bit accuracy" out of 16 or 24 bits? So does that mean I am effectively halving the bit depth by using HDMI, or did that fly right by my understanding?

 

Macbook Pro 2010->DLNA/UPNP fed by Drobo->Oppo BDP-93->Yamaha RXV2065 ->Panasonic GT25 -> 5.0 system Bowers & Wilkins 683 towers, 685 surrounds, HTM61 center ->Mostly SPDIF, or Analog out. Some HDMI depending on source
Selling Art Is Tying Your Ego To A Leash And Walking It Like A DoG


"12-14 bit accuracy" out of 16 or 24 bits? So does that mean I am effectively halving the bit depth by using HDMI

 

Yes, sort of. Naturally things are more complicated, as any distortion or error usually has some particular properties. But put simply, yes.

 

It is actually really, really hard to get the full theoretical potential of 192/24 out of any kind of interface or hardware. If you can get 20 bits' worth of true accuracy, you can already be quite happy!
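To put a number on how hard "20 bits of true accuracy" is, the jitter-limited-SNR rule of thumb (SNR = -20·log10(2π·f·tj)) can be inverted to give a clock jitter budget; the 20 kHz worst-case tone is an assumed value:

```python
# Jitter budget for ~20-bit accuracy on a 20 kHz full-scale tone.
import math

target_snr_db = 20 * 6.02 + 1.76    # ~122 dB, the 20-bit level
f = 20_000                          # Hz, top of the audio band

tj_max = 1 / (2 * math.pi * f * 10 ** (target_snr_db / 20))
print(f"clock jitter must stay below ~{tj_max * 1e12:.1f} ps rms")
```

The budget works out to only a handful of picoseconds rms, which is why 20 bits of real-world accuracy is already an achievement.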

 

S/PDIF has been subject to a lot of careful engineering over the years and is still technically far from perfect. HDMI is much younger, and the engineering effort has gone much more into adding new features than into making the current ones really good.

 

The best standard interface options for audio between devices we currently have are Thunderbolt, Ethernet, USB and FireWire. With these, the quality of the reproduction of incoming data rests almost solely on the "sink" side.

 

 

Signalyst - Developer of HQPlayer

Pulse & Fidelity - Software Defined Amplifiers


We see between 1 and 2 nanoseconds of jitter on the data transmission, but nothing like that at all on the D/A conversion, which is isolated from the data transmission, buffered and so forth.

 

The D/A jitter is much more a function of the quality of the audio equipment than the HDMI data transmission, isn't it?

 

-Paul

 

 

Anyone who considers protocol unimportant has never dealt with a cat DAC.

Robert A. Heinlein


I'm talking about analog measurements of jitter with "J-test signal" sent over HDMI (vs the same signal sent over S/PDIF or played from network/USB-stick). In these figures the average for AVRs seems to be around 1.5 ns for HDMI and typically significantly lower for S/PDIF.

 

I blame the audio clock regeneration PLLs in HDMI receiver chips like the commonly used SiI9135. Probably not the place where the most R&D effort was put when developing those chips, unlike in S/PDIF receiver chips.

 

The D/A jitter is much more a function of the quality of the audio equipment than the HDMI data transmission, isn't it?

 

Same as for S/PDIF: yes and no. If the incoming TMDS clock jitter is too high, it is really hard to get anything proper out of the PLL.

 

 

Signalyst - Developer of HQPlayer

Pulse & Fidelity - Software Defined Amplifiers


Audio was added to the DVI spec to create HDMI. It is embedded into the video stream. The thought was that there was unlimited bandwidth, so there should be no problem making audio work (the same argument as for audio over Ethernet). The audio is packetized and inserted into the horizontal refresh periods, which are dependent on the video frame rate. That rate can be from 24 to 60 frames per second, times the number of vertical lines, so there are a lot of slots for the audio to be stuck into. However, they have no relation to the audio sample rate clocks except by chance. And it's not asynchronous, since there is no feedback path to manage buffers and control the rate at which audio is fed to the "sink". Effectively it's very similar to adaptive USB audio; no better, if as good. Further, the HDCP spec prevents pulling digital audio out of the system at a sample rate higher than 48 kHz. This would technically be true for any BD player as well. I would look elsewhere for a high-resolution audio source.
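Rough arithmetic behind "a lot of slots": comparing the audio samples that must be carried per frame with the number of horizontal line periods available. The 1080p60 line count and rates below are illustrative, not taken from the spec's tables:

```python
# Audio payload vs. available blanking slots, 48 kHz audio over 1080p60.
sample_rate = 48_000     # Hz
frame_rate = 60          # frames per second
lines_per_frame = 1125   # total lines in a 1080p frame, incl. blanking

samples_per_frame = sample_rate / frame_rate             # 800 samples per frame
line_periods_per_second = frame_rate * lines_per_frame   # 67500 slots/s

print(samples_per_frame, line_periods_per_second)
# Fewer than one audio sample per line period: bandwidth is ample, but
# every slot is timed by the video clock, not the audio clock.
```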

 

The I2S-over-HDMI spec (which has nothing to do with HDMI itself) originated with PS Audio and takes advantage of the high bandwidth of the differential signalling developed for HDMI hardware. It's a good idea, but it does not currently support DAC-driven clocking; the master clock originates at the source only. That may change in the future. While PS Audio will share the spec, it's not really a standard at this time. And do not interconnect with a TV, or something may break.

 

Demian Martin

auraliti http://www.auraliti.com

Constellation Audio http://www.constellationaudio.com

NuForce http://www.nuforce.com

Monster Cable http://www.monstercable.com


"HDCP spec prevents pulling digital audio out of the system at a higher sample rate than 48 kHz."

 

Demian

Perhaps I have misunderstood what you are saying here, but products such as http://www.jaycar.com.au/productView.asp?ID=AC1625&keywords=ac%2D1625&form=KEYWORD are reportedly capable of extracting the original audio resolution. I asked Jaycar to check this with the manufacturer.

 

SandyK

 

 

 

How a Digital Audio file sounds, or a Digital Video file looks, is governed to a large extent by the Power Supply area. All that Identical Checksums gives is the possibility of REGENERATING the file to close to that of the original file.

PROFILE UPDATED 13-11-2020


are two separate organizations that control HDMI. Per the licensing agreement, to get the security codes (HDCP) for the chip you need to follow the rules, including no hi-res audio out in digital form. However, it seems the enforcement is quite lax. Some products have been kicked back at the ATC for things like this, but not every HDMI product has been through the full licensing process. It's not difficult to build, and the control of the chips isn't very stringent (unlike Apple authentication chips), so the barriers are low; but you won't get the same from a major vendor like Sony or even Oppo.

 

Demian Martin

auraliti http://www.auraliti.com

Constellation Audio http://www.constellationaudio.com

NuForce http://www.nuforce.com

Monster Cable http://www.monstercable.com


As far as I'm concerned, any audio system where video frames are leading can only produce a poor result. Whether they try to sustain the original audio rate, resample it, anything - it doesn't matter; they will lose samples at some stage (or inject them, for that matter).

 

It is similar the other way around, where in the normal situation audio is leading and video frames are skipped or injected, which merely comes down to missing a frame or grabbing the same frame twice (fairly complicated in itself because of interlacing / pull-down and such) -> stutter.

 

This is all a huge subject, and it comes down to the pixel clock never being able to be the same as the audio clock, because they are just different: the pixel clock is 100% dependent on the refresh rate of the display device, its resolution and some more things, all not really under your control. There are frame rates of 25 Hz, ~24 Hz and 30 Hz, and when the audio is derived from that, the quality will already differ per video (movie) material. Also notice the roughness of the video frame rate vs. the fineness of the audio sample rate, and you can expect problems everywhere.

 

Systems based on the audio clock being the master can be fine-tuned fairly well (also think of ReClock), and video stutter can be avoided to a large degree or otherwise made quite unnoticeable. Sound cards which allow adjusting the audio sample rate will really make your day, and technical stutter can be minimized to the occasional (once in a few minutes) video frame catching up with the audio sync (an RME FireFace can be fine-tuned with 1 Hz resolution).
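The "once in a few minutes" figure can be estimated from the clock mismatch: with audio as master, the video timeline drifts until the error amounts to a whole frame, which is then repeated or dropped. The 100 ppm mismatch below is an assumed, illustrative value:

```python
# Time between frame repeats/drops for a given audio/video clock mismatch.
frame_rate = 24.0        # nominal frames per second
rate_error_ppm = 100     # assumed clock mismatch, parts per million

frame_period = 1 / frame_rate                 # ~41.7 ms
drift_per_second = rate_error_ppm * 1e-6      # seconds of drift per second
seconds_per_slip = frame_period / drift_per_second

print(f"one frame repeated or dropped every {seconds_per_slip / 60:.1f} minutes")
```

Tightening the mismatch (e.g. by trimming the sound card's sample rate) stretches the interval between slips proportionally, which is the point of the fine-tuning described above.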

 

I hope this is not too vague for this nutshell, but again, the subject is really big.

 

Peter

 

Lush^3-e      Lush^2      Blaxius^2.5      Ethernet^3     HDMI^2     XLR^2

XXHighEnd (developer)

Phasure NOS1 24/768 Async USB DAC (manufacturer)

Phasure Mach III Audio PC with Linear PSU (manufacturer)

Orelino & Orelo MKII Speakers (designer/supplier)


Whether they try to sustain the original audio rate, resample it, anything - it doesn't matter; they will lose samples at some stage (or inject them, for that matter).

 

Peter, in the case of HDMI it works because the audio clock is derived in a direct relationship from the video pixel clock:

pclk / D * N = aclk

 

where D and N are integers, sent from source to sink as information. The HDMI specification lists recommended values of these for different standard values of pclk and aclk.

 

And no, I'm not saying this is a good approach from an audio clock jitter perspective, but it works from an A/V sync point of view.

 

Signalyst - Developer of HQPlayer

Pulse & Fidelity - Software Defined Amplifiers

