
Understanding Sample Rate


Recommended Posts

But wait, the rabbit hole is deeper than I thought. JJ pointed out that "the formula" is somewhat simplified, and that it would be more accurate to use a prolate spheroidal sequence or perhaps a Gaussian sequence. But he also pointed out that it's only a decimal point difference at best, and we're already so many orders of magnitude past significance that it's not worth the effort.

 

Some people might seize on the point that the resolution is apparently lower at low frequencies. "But", they might say, "surely we then can't accurately tell when a low frequency note begins or ends." This ignores the fact that the start or stop of a note is a transient, and most people concerned about time resolution are concerned with the timing of transients, which are generally considered to be mostly high frequency - the "edges" of signals.

 

The short answer, then:

For any sensible bit depth, the time resolution far exceeds the audible detection limits.
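
As a quick illustration of that claim, here is a minimal sketch (Python/NumPy, with parameters I've assumed: a 1 kHz tone at 44.1 kHz/16-bit, shifted by 1 µs, roughly 1/23 of a sample period) showing that a shift far smaller than the sample period still changes the quantised samples, so the timing information survives sampling:

```python
import numpy as np

fs = 44100        # sample rate (Hz)
bits = 16         # bit depth
f = 1000.0        # test tone frequency (Hz)
shift = 1e-6      # 1 microsecond, ~1/23 of the 22.7 us sample period

t = np.arange(4096) / fs

def quantize(x, bits):
    """Round to the nearest step of a uniform quantizer spanning -1..+1."""
    step = 2.0 / (2 ** bits)
    return np.round(x / step) * step

a = quantize(np.sin(2 * np.pi * f * t), bits)
b = quantize(np.sin(2 * np.pi * f * (t - shift)), bits)

# The shifted capture differs from the unshifted one by far more than one
# quantization step, so the 1 us offset is clearly encoded in the samples.
print("quantization step:", 2.0 / 2 ** bits)
print("max sample difference:", np.max(np.abs(a - b)))
print("samples that differ:", int(np.count_nonzero(a != b)), "of", len(a))
```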

"People hear what they see." - Doris Day

The forum would be a much better place if everyone were less convinced of how right they were.

Link to comment
6 hours ago, jasenj1 said:

Of interest for understanding sampling: Understanding Waveforms

 

 

Here is the crux of the biscuit:

 

"the low pass [filter] before the sampling stage has to be extremely sharp to avoid cutting any audible frequencies below 20kHz but still not allow frequencies above the Nyquist to leak forward into the sampling process. This is a difficult filter to build..."

 

I suspect that is what beer was hearing.
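
For a sense of what "extremely sharp" means in practice, here is a minimal sketch (Python/SciPy, with parameters I've assumed rather than taken from the quote), modelling the low-pass as the digital decimation filter of an oversampling ADC running at 8 x 44.1 kHz: pass everything below 20 kHz, attenuate everything above 22.05 kHz so nothing aliases when decimating to 44.1 kHz.

```python
import numpy as np
from scipy import signal

fs_adc = 8 * 44100          # assumed oversampled ADC rate
passband_end = 20000.0      # keep audible content
stopband_start = 22050.0    # Nyquist of the final 44.1 kHz stream
atten_db = 100.0            # assumed stopband attenuation target

# Kaiser-window estimate of the FIR length needed for this transition band.
width = (stopband_start - passband_end) / (fs_adc / 2)
numtaps, beta = signal.kaiserord(atten_db, width)
taps = signal.firwin(numtaps, (passband_end + stopband_start) / 2,
                     window=('kaiser', beta), fs=fs_adc)

w, h = signal.freqz(taps, worN=1 << 16, fs=fs_adc)
db = 20 * np.log10(np.maximum(np.abs(h), 1e-12))
print(f"{numtaps} taps at {fs_adc} Hz for a ~2 kHz-wide transition band")
print("gain at 20.00 kHz: %7.2f dB" % db[np.argmin(abs(w - 20000))])
print("gain at 22.05 kHz: %7.2f dB" % db[np.argmin(abs(w - 22050))])
```

On the order of a thousand taps for a 2 kHz transition band gives a feel for why the quote calls this a difficult filter, and doing the same job with an analog filter in front of the converter is harder still.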

Link to comment
12 hours ago, audiventory said:


I suppose you mean the analog filter of the DAC.

An absolutely brickwall filter over infinite time restores 100%.

But we have no such filter, and the time over which a spectral harmonic exists is limited.

 

I meant the digital anti-aliasing filter.

 

9 hours ago, psjug said:

Sorry if you already know this stuff...

 

You could start with linear phase vs minimum phase.  Here is a concise summary of the difference, showing the different impulse response and differences in phase plots: https://mrapodizer.wordpress.com/2011/08/16/technical-analysis-of-the-meridian-apodizing-filter/

 

Archimago's blog entry on ringing shows how the difference in these filters can show up when playing real music.

http://archimago.blogspot.com/2018/01/audiophile-myth-260-detestable-digital.html
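
As a rough companion to those links, here is a minimal sketch (Python/SciPy, using an example filter of my own rather than the Meridian one) contrasting where the ringing sits: the linear-phase prototype rings both before and after its peak, while its minimum-phase version pushes essentially all of the ringing after the peak. Note that SciPy's homomorphic conversion returns a half-length filter whose magnitude approximates the square root of the original, which is fine for looking at where the ringing lies.

```python
import numpy as np
from scipy import signal

fs = 44100
lin = signal.firwin(255, 20000, window=('kaiser', 9.0), fs=fs)   # linear phase
minp = signal.minimum_phase(lin, method='homomorphic')           # minimum phase

def ringing_split(h):
    """Fraction of impulse-response energy before and after the main peak."""
    peak = int(np.argmax(np.abs(h)))
    total = np.sum(h ** 2)
    return np.sum(h[:peak] ** 2) / total, np.sum(h[peak + 1:] ** 2) / total

print("linear phase  (pre, post):", ringing_split(lin))
print("minimum phase (pre, post):", ringing_split(minp))
```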

 

The first link I haven't read before, so I'll dive into that. Thanks.

 

I'll re-read @Archimago 's blog page to see if I can get anything else from it that I didn't get the first time I read it last month.  I've also tried to resurrect his post here on CA about 'linear phase vs minimum phase blind test' and see if we can't get more chatter going on there.  If I remember correctly, Archimago is of the opinion that ringing introduced by filters is of no consequence since it only happens in the transition band.  I'm not sure I'm convinced of this just yet. I can clearly pick out a subdued attack with linear phase vs minimum on the same DAC, just different iZotope settings.  It wouldn't be an issue if I didn't listen to prog and metal, where the drums are doing a lot of work.  But I find that the kick drum attack gets blurred in the mix more with linear phase.  

 

Maybe I'm confusing ringing with phase delay/time smear?  But I thought they were one and the same.  If so, I currently disagree with Archimago.

Link to comment

OK, I found this on his blog, which is also helpful: http://archimago.blogspot.ca/2017/12/howto-musings-playing-with-digital_23.html

 

This helped me understand (slightly better) the effect filters have on phase.  

 

Let me ask you guys.  Is it possible I prefer the minimum phase reconstruction filter because its own phase distortion during playback is negating the phase distortion introduced during analogue to digital capture/encoding?  If the phase distortion has a negative slope at encoding and I reconstruct it using a positive slope phase distortion, I theoretically put the phase right back to where it was in the analogue domain, right?

 

Surely, the above hypothesis will be debunked as I can't be the first to propose this!

Link to comment
9 minutes ago, buonassi said:

Let me ask you guys.  Is it possible I prefer the minimum phase reconstruction filter because its own phase distortion during playback is negating the phase distortion introduced during analogue to digital capture/encoding? 

That's unlikely. Audio equipment tends to use either linear phase (no distortion) or minimum phase (delayed high frequencies). What you suggest would mean some step in the processing used a filter skewed towards maximum phase. I've never heard of anything doing this, and I can see no reason why anyone would want it.
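
For what it's worth, here is a minimal sketch (Python/SciPy, purely illustrative, not any real product's filter) of what a filter "skewed towards maximum phase" would look like: time-reversing a minimum-phase FIR gives a maximum-phase FIR whose group delay is the mirror image, so it delays the lows relative to the highs instead of the other way around.

```python
import numpy as np
from scipy import signal

fs = 44100
lin = signal.firwin(255, 20000, window=('kaiser', 9.0), fs=fs)
minp = signal.minimum_phase(lin, method='homomorphic')
maxp = minp[::-1]                       # time reversal -> maximum phase

w, gd_min = signal.group_delay((minp, [1.0]), w=2048, fs=fs)
_, gd_max = signal.group_delay((maxp, [1.0]), w=2048, fs=fs)

band = w < 20000                        # look only at the passband
print("minimum phase delay: %.1f .. %.1f samples"
      % (gd_min[band].min(), gd_min[band].max()))
print("maximum phase delay: %.1f .. %.1f samples"
      % (gd_max[band].min(), gd_max[band].max()))
# gd_max(f) = (len(minp) - 1) - gd_min(f): where minimum phase delays the
# highs the most, maximum phase delays them the least.
```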

Link to comment
5 hours ago, Ralf11 said:

 

 

Here is the crux of the biscuit:

 

"the low pass [filter] before the sampling stage has to be extremely sharp to avoid cutting any audible frequencies below 20kHz but still not allow frequencies above the Nyquist to leak forward into the sampling process. This is a difficult filter to build..."

 

I suspect that is what beer was hearing.

 

Or, as is what's actually done, record at 96-192 kHz: then there won't be any signal that needs filtering, or a gentler filter can be employed.
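
A minimal sketch (Python/SciPy, numbers assumed by me) of how much the requirement relaxes when the transition band can stretch from 20 kHz all the way up to the higher Nyquist frequency:

```python
from scipy import signal

atten_db = 100.0                       # assumed stopband attenuation target
for fs in (44100, 96000, 192000):
    nyquist = fs / 2
    width = (nyquist - 20000) / nyquist          # normalized transition width
    numtaps, _ = signal.kaiserord(atten_db, width)
    print(f"fs = {fs:6d} Hz: transition 20 kHz..{nyquist / 1000:.2f} kHz "
          f"-> ~{numtaps} taps for {atten_db:.0f} dB")
```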

Custom room treatments for headphone users.

Link to comment

So I've been playing with SoX a bit tonight.  I came upon a very low-level detail in a track I'm familiar with, but had never heard before.  I was purposely subjecting myself to linear phase, but with SoX instead of iZotope this time.

 

Anyway, I went back and tried to hear that little nuanced detail using minimum phase, and could not pick it out for the life of me.  Back to linear phase, and it was there again.  In its own little pocket deep in the back of stage right.  Amazing.  

 

The soundstage has more depth with linear phase, not as 2D, with more separation between instruments.  But you lose some of the slam for some reason.  I still don't know why things sound less impactful and smoothed over.  But I've decided to give linear phase a real chance now that it's proven it can retrieve detail.  

 

Then there's intermediate phase.... 

Link to comment
2 hours ago, mansr said:

That's unlikely. Audio equipment tends to use either linear phase (no distortion) or minimum phase (delayed high frequencies). What you suggest would mean some step in the processing used a filter skewed towards maximum phase. I've never heard of anything doing this, and I can see no reason why anyone would want it.

Wait a minute... I have this backward?  Minimum phase delays the highs relative to the lows?  So the lower frequencies arrive slightly before the highs?  I realize this happens on such a short timescale that it's barely (if at all) perceptible.  But I thought minimum phase did the opposite: allowed the highs to arrive first, followed by the lows, giving it the attack I think I'm hearing.

 

I can see where your statement is correct however.  Looking at @Archimago's phase graphs, the highs take on a positive phase shift, which would mean moving the sine ahead in time.  Is that a correct interpretation?

Link to comment
5 hours ago, buonassi said:

'linear phase vs minimum phase blind test'

 

I'd recommend being very careful with blind tests. They demand a serious approach to methodological design, condition control, and the number of trials. Read the details here: https://samplerateconverter.com/educational/hifi-blind-test

 

At the end of the page, under [References], you can see an example of a proper blind test protocol: "One of careful test examples".

 

5 hours ago, buonassi said:

Maybe I'm confusing ringing with phase delay/time smear?

 

Ringing is the response at the output to a delta impulse at the input. Read more: https://samplerateconverter.com/content/what-ringing-audio

 

"Time smear" is not technical term, but it like to ringing, me seems.

 

Phase is the time position of a sine (harmonic). Phase is measured in degrees or radians.

It is a relative value. Depending on the frequency, it can be converted to a time in seconds.

 

A phase shift between input and output is similar to a time delay, but phase is a relative shift in degrees or radians.

 

A linear phase filter has a phase shift that depends linearly on frequency.

For a linear phase filter, the phase shift between input and output is different for different frequencies, but the time delay is the same for all frequencies.

 

A minimum phase filter has a phase shift that depends non-linearly on frequency.

And the time delay varies for different frequencies.
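
A minimal sketch (Python/SciPy, example filters of my own rather than any particular DAC's) showing the same point numerically: the linear-phase FIR delays every passband frequency by the same number of samples, while the minimum-phase version delays different frequencies by different amounts.

```python
import numpy as np
from scipy import signal

fs = 44100
lin = signal.firwin(255, 20000, window=('kaiser', 9.0), fs=fs)   # linear phase
minp = signal.minimum_phase(lin, method='homomorphic')           # minimum phase

w, gd_lin = signal.group_delay((lin, [1.0]), w=2048, fs=fs)
_, gd_min = signal.group_delay((minp, [1.0]), w=2048, fs=fs)

band = w < 20000                       # passband only
print("linear phase : %.1f .. %.1f samples" % (gd_lin[band].min(), gd_lin[band].max()))
print("minimum phase: %.1f .. %.1f samples" % (gd_min[band].min(), gd_min[band].max()))
# Expected: the linear-phase delay is the constant (255 - 1) / 2 = 127 samples;
# the minimum-phase delay is small at low frequencies and grows towards 20 kHz.
```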

AuI ConverteR 48x44 - HD audio converter/optimizer for DAC of high resolution files

ISO, DSF, DFF (1-bit/D64/128/256/512/1024), wav, flac, aiff, alac,  safe CD ripper to PCM/DSF,

Seamless Album Conversion, AIFF, WAV, FLAC, DSF metadata editor, Mac & Windows
Offline conversion save energy and nature

Link to comment
3 hours ago, audiventory said:

 

I'd recommend being very careful with blind tests. They demand a serious approach to methodological design, condition control, and the number of trials. Read the details here: https://samplerateconverter.com/educational/hifi-blind-test

 

At the end of the page, under [References], you can see an example of a proper blind test protocol: "One of careful test examples".

 

 

Ringing is the response at the output to a delta impulse at the input. Read more: https://samplerateconverter.com/content/what-ringing-audio

 

"Time smear" is not technical term, but it like to ringing, me seems.

 

Phase is the time position of a sine (harmonic). Phase is measured in degrees or radians.

It is a relative value. Depending on the frequency, it can be converted to a time in seconds.

 

A phase shift between input and output is similar to a time delay, but phase is a relative shift in degrees or radians.

 

A linear phase filter has a phase shift that depends linearly on frequency.

For a linear phase filter, the phase shift between input and output is different for different frequencies, but the time delay is the same for all frequencies.

 

A minimum phase filter has a phase shift that depends non-linearly on frequency.

And the time delay varies for different frequencies.

But orthodox digital reconstruction filters do nothing to phase below the transition band. So all of this only happens between 20 and 22 kHz for a sensible filter. Obviously, ones which are "better in the time domain" will tend to mess things up way down into the audible spectrum, but that's the price you pay for a visually impressive impulse response.

You are not a sound quality measurement device

Link to comment
29 minutes ago, adamdea said:

But orthodox digital reconstruction filters do nothing to phase below the transition band. So all of this only happens between 20 and 22 kHz for a sensible filter. Obviously, ones which are "better in the time domain" will tend to mess things up way down into the audible spectrum, but that's the price you pay for a visually impressive impulse response.

 

Indeed. Why go to heroic efforts to make an invalid artificial signal look good when it just messes up the filter performance for actual music?

"People hear what they see." - Doris Day

The forum would be a much better place if everyone were less convinced of how right they were.

Link to comment
8 hours ago, buonassi said:

So I've been playing with SoX a bit tonight.  I came upon a very low-level detail in a track I'm familiar with, but had never heard before.  I was purposely subjecting myself to linear phase, but with SoX instead of iZotope this time.

 

Anyway, I went back and tried to hear that little nuanced detail using minimum phase, and could not pick it out for the life of me.  Back to linear phase, and it was there again.  In its own little pocket deep in the back of stage right.  Amazing.  

 

The soundstage has more depth with linear phase, not as 2D, with more separation between instruments.  But you lose some of the slam for some reason.  I still don't know why things sound less impactful and smoothed over.  But I've decided to give linear phase a real chance now that it's proven it can retrieve detail.  

 

Then there's intermediate phase.... 

I've never tried to tell the difference like you are doing, but I'm pretty sure I would have a hard time telling the difference.  Maybe with headphones, but I never listen with headphones.  Loudspeakers mess with phase too.  Then there are room reflections, and probably other things to consider.  Since I'm an engineer, the fact that linear phase reconstruction filters are better at reconstructing the waveform appeals to me, so I'd probably choose linear phase if I make a choice of filter.  But really I don't care much about it.

Link to comment
2 hours ago, adamdea said:

but that's the price you pay for a visually impressive impulse response.

 

The impulse response looks that terrible only for a delta impulse. Music has only a few places where the ringing is visible.

Ringing may be reduced with a wider transition band at a higher sample rate. But a higher sample rate has a wider band and can cause more audible intermodulation products.

Reducing the pre-ringing increases the post-ringing energy by the amount taken from the pre-ringing.

 

The design of resampling filters is a compromise.
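
A minimal sketch (Python/SciPy, parameters assumed) of the first trade-off above, expressing the impulse-response length, and hence roughly how long the ringing lasts, as a function of the transition band that a given sample rate allows:

```python
from scipy import signal

def impulse_length_ms(fs, stopband_start, atten_db=100.0):
    """Length (ms) of a Kaiser low-pass with a 20 kHz..stopband_start transition."""
    width = (stopband_start - 20000) / (fs / 2)
    numtaps, _ = signal.kaiserord(atten_db, width)
    return 1000.0 * numtaps / fs

print("44.1 kHz, 20k..22.05k transition: %.2f ms" % impulse_length_ms(44100, 22050))
print("96 kHz,   20k..40k    transition: %.2f ms" % impulse_length_ms(96000, 40000))
```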

AuI ConverteR 48x44 - HD audio converter/optimizer for DAC of high resolution files

ISO, DSF, DFF (1-bit/D64/128/256/512/1024), wav, flac, aiff, alac,  safe CD ripper to PCM/DSF,

Seamless Album Conversion, AIFF, WAV, FLAC, DSF metadata editor, Mac & Windows
Offline conversion save energy and nature

Link to comment
1 hour ago, Don Hills said:

 

Indeed. Why go to heroic efforts to make an invalid artificial signal look good when it just messes up the filter performance for actual music?

It seems to me that all the dicking around with filters (including no filter) chiefly serves to demonstrate how little any of it matters. (In much the same way that the existence of DSD64 proves that nothing over 20 kHz matters.)

You can try a maximum phase filter. It sounds fine, although obviously you wouldn't use one except out of curiosity. Go figure.

You are not a sound quality measurement device

Link to comment
4 minutes ago, adamdea said:

What happens to all that quantisation noise in the sub 20kHz range? Why is this acceptable?

Now I’m following your logic even less ...

 

Aside from reality, how would sub-20 kHz noise prove or disprove anything about a >20 kHz signal?

Custom room treatments for headphone users.

Link to comment
11 minutes ago, jabbr said:

Now I’m following your logic even less ...

 

Aside from reality, how would sub-20 kHz noise prove or disprove anything about a >20 kHz signal?

I'm talking about the quantisation noise that has to be shaped by the noise shaper. Sorry if that was not obvious. You have 1 bit of resolution spread over 64fs; that's still only about 4 bits in the audio band, isn't it? Where do we put the other 12 bits of noise to get 16-bit performance in the audio band?
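
A minimal sketch (Python/NumPy, a crude first-order error-feedback modulator, far simpler than a real DSD64 modulator) of where the rest of that quantisation noise ends up: the noise shaper pushes most of it above the audio band.

```python
import numpy as np

fs = 64 * 44100                                  # DSD64 rate
n = np.arange(2 ** 16)
x = 0.5 * np.sin(2 * np.pi * 1000 * n / fs)      # -6 dBFS 1 kHz tone

# First-order error-feedback sigma-delta: quantize to +/-1, feed the error back.
y = np.empty_like(x)
err = 0.0
for i in range(len(x)):
    v = x[i] + err                  # add the previous quantisation error
    y[i] = 1.0 if v >= 0 else -1.0  # 1-bit quantizer
    err = v - y[i]                  # error to be shaped into the next sample

# Split the quantisation noise power (y - x) into below and above 20 kHz.
spec = np.abs(np.fft.rfft((y - x) * np.hanning(len(x)))) ** 2
freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
in_band = spec[freqs <= 20000].sum()
out_band = spec[freqs > 20000].sum()
print("noise power above 20 kHz vs below: %.1f dB" % (10 * np.log10(out_band / in_band)))
```

Real DSD modulators use much higher-order shaping, which moves far more of the noise out of band than this toy first-order loop does.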

You are not a sound quality measurement device

Link to comment
