ggking7 Posted May 10, 2010
I've been working a lot lately on optimizing my Linux system for low-latency playback. It's really improving the sound, so I think jitter must be decreasing as latency decreases. Does anyone know of a way to measure jitter?
wgscott Posted May 10, 2010
I'd love to know too. Meanwhile, can you tell us what you have done?
The Computer Audiophile Posted May 10, 2010
This is a path you do not want to go down. Even the best engineers in the world admit measuring jitter is very hard and sometimes impossible. Plus, there are numerous jitter measurements for all the different kinds of jitter. The most popular tool is made by Audio Precision and costs tens of thousands of dollars.
Shadorne Posted May 10, 2010
This is what you need (around 20K US). I would not recommend getting one unless you know what you are doing (requires training). Latency is not at all the same as jitter.
ggking7 Posted May 10, 2010
As for what I've done, I've been following the instructions here:
http://irc.esben-stien.name/mediawiki/index.php/Setting_Up_Real_Time_Operation_on_GNU/Linux_Systems
http://proaudio.tuxfamily.org/wiki/index.php?title=Howto_RT_Kernel
It really sounds great, much better than when I started. Gotta love improving the sound quality of the source component.
OK, so much for measuring jitter. Nobody said latency was the same as jitter. However, decreasing latency should not increase sound quality, but it's having that effect. This is an explanation of one of the latency-reducing tools, called rtirq:
http://www.linuxfordevices.com/c/a/Linux-For-Devices-Articles/The-Linux-real-time-interrupt-patch/
It says: "With a measured worst case latency of five microseconds and with a typical jitter below one microsecond at an interrupt period of up to 100 kHz an rtirq-enhanced linux kernel may be usable for a broad range of hard real time control loop applications."
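For anyone curious what figures like the quoted "worst case latency of five microseconds" actually describe, the standard way to measure this on an RT-patched kernel is the cyclictest utility from the rt-tests package: it schedules periodic wake-ups at high priority and records how late each one fires. The sketch below is only a rough Python illustration of the same idea - the 1 ms period and sample count are arbitrary, and the interpreter's own overhead will dominate the numbers - but it shows what is being measured: the lateness of each scheduled wake-up and how much that lateness varies (the jitter).

```python
import statistics
import time

PERIOD_NS = 1_000_000   # hypothetical 1 ms period, roughly one USB frame
SAMPLES = 5_000

lateness_ns = []
next_wake = time.monotonic_ns() + PERIOD_NS
for _ in range(SAMPLES):
    delay_ns = next_wake - time.monotonic_ns()
    if delay_ns > 0:
        time.sleep(delay_ns / 1e9)          # sleep until the scheduled wake-up
    lateness_ns.append(time.monotonic_ns() - next_wake)
    next_wake += PERIOD_NS

print(f"worst lateness : {max(lateness_ns) / 1000:.1f} us")
print(f"mean lateness  : {statistics.mean(lateness_ns) / 1000:.1f} us")
print(f"jitter (stdev) : {statistics.stdev(lateness_ns) / 1000:.1f} us")
```

Run this under a stock kernel and then under an RT kernel with the process given real-time priority (e.g. via chrt), and the spread between worst and mean lateness is what rtirq-style tuning is trying to shrink.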
Shadorne Posted May 10, 2010
Jitter is usually measured in picoseconds. Anything close to 1 microsecond of jitter would very likely sound awful, even if it were entirely random (the least intrusive form of distortion). A microsecond is a MILLION times longer than a picosecond. Yikes.
Wavelength Posted May 10, 2010
ggKing,
Latency has nothing to do with jitter. Actually, in most cases improving the sound of computer audio merely takes increasing the processing power of your system, which it looks like you are doing. Remember, not everything has to do with jitter.
Thanks
Gordon
ggking7 Posted May 11, 2010
Very interesting to read that, Gordon. Actually, I remember one of your previous posts in which you said you measured higher jitter in slow computers. I've been making a lot of tweaks to my system for low latency and in the process the sound has really improved. Maybe the changes I'm making are making my system more efficient and maximizing its processing power, thereby reducing jitter? When you say processing power, do you mean CPU power, or speed in all aspects of the computer? Can you explain the relationship between jitter and processing power? I'm very curious to learn more about why I'm hearing what I'm hearing.
bordin Posted May 11, 2010
Latency in audio production systems by Matt Ottewill:
- Latency, what is it good for?
- What is latency?
- What causes latency?
- PCs and DAWs
- DSP delays
(link)
Some more articles:
- Optimising The Latency Of Your PC Audio Interface (2005)
- Low Latency Background - Buffer and Latency Jitter (2000)
- Low Latency Background: Monitoring, ZLM and ASIO
- M-Audio Transit latency, affects sound quality?
etc.
Miska Posted May 11, 2010
When using USB devices, especially adaptive mode ones, reducing operating system latency, for example by using an RT-patched Linux kernel, improves USB packet timing. This ensures a steadier packet flow and reduced jitter. When used with busmaster DMA interfaces (PCI/PCIe) it doesn't have any impact on jitter, since the interface card itself is reading data out of the RAM based on its own clock.
Increasing CPU processing power on a traditional operating system tends to have a similar effect, since other tasks executed by the operating system take less time on a faster CPU and thereby timing and latency get improved. However, this is a sort of sledgehammer method of improving things...
Latency-optimized Linux is a very good platform for any audio work, outperforming Mac OS X and Windows.
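Miska's description of adaptive mode can be made concrete with a toy model: the device continuously slews its local sample clock toward the observed packet rate, so whatever timing noise the host adds to the packet stream leaks, attenuated by the recovery loop, into the clock that actually drives the conversion. This is only a schematic sketch with invented numbers and a first-order loop, not a model of any particular device or of the USB specification's actual feedback mechanism.

```python
import random

NOMINAL_US = 1000.0    # USB full-speed frames are nominally 1 ms apart
HOST_NOISE_US = 50.0   # hypothetical host-side packet-timing noise (illustrative)
ALPHA = 0.01           # loop gain of the imaginary clock-recovery filter
PACKETS = 20_000

recovered_period_us = NOMINAL_US
errors_us = []
for _ in range(PACKETS):
    observed_us = NOMINAL_US + random.gauss(0.0, HOST_NOISE_US)  # noisy arrival interval
    # an adaptive-mode device slews its local clock toward the observed packet rate
    recovered_period_us += ALPHA * (observed_us - recovered_period_us)
    errors_us.append(abs(recovered_period_us - NOMINAL_US))

print(f"worst recovered-clock error: {max(errors_us):.3f} us")
print(f"mean recovered-clock error : {sum(errors_us) / len(errors_us):.3f} us")
```

Tightening the host's packet timing (a smaller HOST_NOISE_US, which is roughly what a low-latency kernel buys you) shrinks the residual error; a busmaster PCI/PCIe card or an asynchronous USB device sidesteps the problem by clocking itself.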
Shadorne Posted May 11, 2010
Jitter does not exist in the digital world - jitter starts once you have a clock. Of course, external factors (power supplies, processor activity, etc.) can all influence a clock. However, an external device with re-clocking or asynchronous clocking can eliminate these issues altogether (although it will not cure dropouts from CPU or network bottlenecks).
Miska Posted May 11, 2010
"Jitter does not exist in the digital world - jitter starts once you have a clock."
In adaptive mode USB audio devices, the clock is derived from the USB packet rate produced by the computer.
ggking7 Posted May 12, 2010
I'm using an asynchronous USB DAC and the tweaks I'm making (which are aimed at low latency) are improving the sound. Can anyone explain why this is happening?
Wavelength Posted May 12, 2010
ggk,
At some point... and I am already there... we have to assume that there is more to this than just bits. I have tested 4 applications that are bit true. I optically coupled my DAC and placed it 30 m away from my computer. In each case I controlled the application with my iPad.
OK, why do they sound different? Beats me... I can only tell you that applications that require less processing time and CPU overhead seem to sound better. Now I am going to concentrate on bigger things.
Thanks
Gordon
Shadorne Posted May 12, 2010
"In adaptive mode USB audio devices, the clock is derived from the USB packet rate produced by the computer."
Well, there is the problem! It is not a digital "bit" issue after all but an analog one - the timing of data packets produced by the computer may be modulated by the other tasks of the CPU. A simple buffer arrangement and re-clocking, or asynchronous clocking, might eliminate this form of jitter.
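The buffer-plus-reclocking idea is equally easy to see in a toy model: incoming packets land in a FIFO, a fixed local clock drains it, and the output timing no longer depends on when the packets arrived, only on whether the FIFO ever runs dry or overflows. Again this is a schematic sketch with invented numbers (packet size, pre-fill depth, noise level), not any real device's firmware.

```python
import random

FRAMES_PER_PACKET = 48   # e.g. 1 ms of 48 kHz audio per packet (illustrative)
PACKET_PERIOD_MS = 1.0
HOST_NOISE_MS = 0.3      # hypothetical packet-timing noise around the schedule
N_PACKETS = 20_000
PREFILL_PACKETS = 4      # output starts once about 4 packets' worth has been sent

frame_period_ms = PACKET_PERIOD_MS / FRAMES_PER_PACKET

# Packets sit on a fixed schedule; host-side noise only shifts each one around its slot.
arrivals_ms = sorted(k * PACKET_PERIOD_MS + random.gauss(0.0, HOST_NOISE_MS)
                     for k in range(N_PACKETS))

fifo = 0
next_packet = 0
min_fill, max_fill = None, 0
start_ms = PREFILL_PACKETS * PACKET_PERIOD_MS

# The local output clock ticks perfectly regularly: exactly one frame per tick,
# regardless of when the packets actually arrived.
for tick in range((N_PACKETS - PREFILL_PACKETS) * FRAMES_PER_PACKET):
    now_ms = start_ms + tick * frame_period_ms
    while next_packet < N_PACKETS and arrivals_ms[next_packet] <= now_ms:
        fifo += FRAMES_PER_PACKET
        next_packet += 1
    fifo -= 1
    if fifo < 0:
        print(f"underrun at tick {tick}")   # the real failure mode of a buffered design
        break
    min_fill = fifo if min_fill is None else min(min_fill, fifo)
    max_fill = max(max_fill, fifo)
else:
    print(f"no underruns; FIFO fill stayed between {min_fill} and {max_fill} frames")
```

Output timing here is regular by construction; host-side timing noise only moves the buffer fill around, and nothing changes at the output unless the buffer actually underruns. Whether a given DAC achieves this isolation in hardware is a separate question.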
Shadorne Posted May 12, 2010
If you have confirmed that the bits reaching the DAC are bit transparent (precisely the same) then the sound difference is most likely:
1) your mind playing tricks,
2) the DAC not providing repeatable performance,
3) the different jitter over the Toslink not being rejected equally by the DAC (perhaps caused by the various different applications running on the computer and modulating the clocking out and analog signal of the optical data), or
4) the mains power supply affecting everything including the clock in your DAC (no repeatable performance).
Simply put - we don't have to throw away all the science (a bit is a bit) when unexpected results are observed that contravene physical and mathematical laws.
Mr.C Posted May 13, 2010
I'm pretty sure that Mr. Rankin was using an optical USB repeater with his async USB DAC rather than 30 feet of Toslink. We don't have to throw science out the window, but we do need to allow for observations that contravene the dominant scientific paradigm. If we did not follow up on these unexplainable circumstances, then science would be stagnant.
ggking7 Posted May 14, 2010
So Gordon says CPU-efficient players sound best and fast CPUs also "sound" best. I've found that latency-decreasing tweaks also improve the sound. I want to know why this is happening in order to maximize it, so it's really the "how" that's important, not the "why". I'm starting to think a stripped-down, souped-up, tweaked-out system would sound best. I need an SSD....
Shadorne Posted May 14, 2010
"I want to know why this is happening in order to maximize it, so it's really the "how" that's important, not the "why"."
How or why? Provided you have bit-transparent digital audio going to the same reliable, repeatable DAC device, then clearly the issue must be jitter related. Something is affecting the clock that clocks the data out of your DAC. There is nothing more to it - a bit is a bit, and a good DAC should convert the bitstream in the same way each time PROVIDED the clock timing is reliable/consistent to an extremely high degree of accuracy - a few picoseconds (intrinsic clock accuracy).
In practice, a device that maintains accuracy to a few picoseconds while being driven is very difficult to build, and it is inevitable that there is some modulation of the clock by the bit stream and/or the output stages drawing on the power supply, or some other kind of crosstalk.
IMHO - if you are fiddling with CPU latency and hearing significant audible improvements, then the setup you are using is intrinsically poor (in regards to jitter immunity) to begin with. A good design should be robust. Mathematically, random jitter is several orders of magnitude less audible and intrusive - the problem with most devices is when something modulates the clock in a non-random manner.
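The random-versus-non-random point can be illustrated numerically: sample a pure tone with a slightly perturbed clock and look at the spectrum. Random perturbation raises a broad noise floor, while a periodic perturbation of comparable size (the kind produced when something correlated, such as a power supply or processor activity, modulates the clock) concentrates the same energy into discrete sidebands, which is what tends to be audible. The sketch below assumes numpy is available and uses a grossly exaggerated 2 ns of jitter and arbitrary tone and modulation frequencies so the effect shows up in a short FFT.

```python
import numpy as np

FS = 48_000          # sample rate, Hz
N = 1 << 16          # FFT length
F_TONE = 10_000.0    # test tone, Hz
F_MOD = 1_000.0      # frequency of the periodic clock disturbance, Hz
JITTER_S = 2e-9      # exaggerated 2 ns of timing error, for visibility

n = np.arange(N)
ideal_t = n / FS
window = np.hanning(N)

def spectrum_db(timestamps):
    """Spectrum of a pure tone sampled at the given (possibly jittered) instants."""
    x = np.sin(2 * np.pi * F_TONE * timestamps) * window
    mag = np.abs(np.fft.rfft(x))
    return 20 * np.log10(mag / mag.max() + 1e-15)

freqs = np.fft.rfftfreq(N, 1 / FS)
away_from_tone = np.abs(freqs - F_TONE) > 200   # ignore the tone's own leakage skirt

random_jitter = np.random.normal(0.0, JITTER_S, N)
periodic_jitter = JITTER_S * np.sin(2 * np.pi * F_MOD * ideal_t)

for name, jit in [("random", random_jitter), ("periodic", periodic_jitter)]:
    db = spectrum_db(ideal_t + jit)
    print(f"{name:8s} jitter: worst non-tone component {db[away_from_tone].max():6.1f} dB")
```

With these settings the periodic case shows distinct spurs at F_TONE ± F_MOD standing tens of dB above the noise floor left by the random case, which is the sense in which non-random jitter is the more intrusive kind.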
ggking7 Posted May 15, 2010
I'll try an ASUS Xonar Essence STX soon. I'd like to find out if the tweaks I'm making make less of a difference with a PCIe card compared to a USB DAC. I'm really interested to read any more comments or ideas on this.