
Misleading Measurements


Recommended Posts

6 hours ago, Bill Brown said:

My impression (I don't have sources; perhaps it was Earl Geddes) is that air is largely linear except at very high pressures (not at the levels our speakers can produce).  The discussions I have seen concern compression drivers, but even there (huge pressures) I don't believe the air is nonlinear.

 

Bill


With very large-amplitude sound waves (extremely loud), the air becomes nonlinear. Even the air temperature can change locally due to the high compression at the peaks of the waveform. Localized compression and temperature change result in faster sound-wave propagation at the peaks than at the troughs, where the air gets rarefied and cooler. The result is a distorted sine wave that might look more like a sawtooth than a sine.
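To make that mechanism concrete, here is a rough numerical sketch (a lossless, pre-shock approximation; the 1 kHz tone, 2 m distance, and roughly 160 dB SPL level are hypothetical numbers chosen so the effect is visible):

```python
import numpy as np

# Each point of the waveform propagates at c0 + beta*u, so peaks (u > 0) arrive
# earlier and troughs (u < 0) later, bending a sine toward a sawtooth.
c0, beta = 343.0, 1.2          # speed of sound in air (m/s); beta = (gamma + 1) / 2
f0, u0 = 1000.0, 5.0           # 1 kHz tone; 5 m/s particle velocity (~160 dB SPL)
x = 2.0                        # propagation distance (m), still short of shock formation

t_emit = np.linspace(0.0, 2.0 / f0, 2000)       # two periods at the source
u = u0 * np.sin(2.0 * np.pi * f0 * t_emit)      # clean sine leaving the source
t_arrive = t_emit + x / (c0 + beta * u)         # arrival time of each waveform point

# Resample onto a uniform time grid to see the steepened waveform at distance x
t_uniform = np.linspace(t_arrive.min(), t_arrive.max(), 2000)
u_steepened = np.interp(t_uniform, t_arrive, u)
```

At normal listening levels the particle velocity is on the order of millimetres per second, so the steepening term is negligible over room-sized distances, which is consistent with Bill's point above.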

Link to comment
9 hours ago, bluesman said:

It’s transferred from the moving molecules of the fundamental waves to the molecules pushed and pulled into the IM products.  The overall energy is the same amount that was transferred to the air molecules by the string or reed or speaker cone, but now it’s distributed among the expanded spectrum.  So the energy in each fundamental has been reduced by the amount transferred to the intermodulation products.

 

Yes, but we are still arguing about where that transfer of energy from the fundamental to the intermodulation products occurred. Did it occur in the air, or did it occur in some object that is not the air?

 

Note that the air immediately adjacent to the instrument is still air; any IM here comes from the parts of the instrument body that the air contacts.

Link to comment
43 minutes ago, sandyk said:

 

What has any of that got to do with normal listening to music in a typical entertainment or optimised Audio room?

 As for temperature changes due to Output levels in any normal listening environment . . . . . . .

 

Why so closed-minded, Alex? In a less typical entertainment environment, it could make for a very interesting speaker system, as you can see below:

 

Link to comment
14 minutes ago, pkane2001 said:

Why so closed-minded, Alex? In a less typical entertainment environment, it could make for a very interesting speaker system, as you can see below:

Perhaps in academic circles; however, it has little if any relevance to the topic of Misleading Measurements.

Perhaps you should start another thread in the General forum about your S/W, just as John Dyson has done with his?

 


Link to comment
Just now, sandyk said:

Perhaps in academic circles; however, it has little if any relevance to the topic of Misleading Measurements.

Perhaps you should start another thread in the General forum about your S/W, just as John Dyson has done with his?

 

Perhaps, perhaps, but I'm not the one who keeps asking if air is non-linear in this thread.

So, perhaps you should ask them to start another thread? 

 

Link to comment
2 minutes ago, pkane2001 said:

 

Perhaps, perhaps, but I'm not the one who keeps asking if air is non-linear in this thread.

So, perhaps you should ask them to start another thread? 

 

 

Do you want this to become another FrankenGeorge-type thread? 😉

 


Link to comment
11 hours ago, John Dyson said:

Such tools are wonderful for developers and those trying to understand subtle behaviors in their software & HW. 

I agree - but only if you know what the tool is doing and how.  Introducing unspecified nonlinearities into a signal path to add distortion products seems to be of little or no value to engineers, designers, et al.  It may be entertaining and even educational for audiophiles to see if they can hear differences.  But having no idea what was done to the signal seems to me to make it impossible to relate to HW or SW design.

Link to comment
1 hour ago, bluesman said:

I agree - but only if you know what the tool is doing and how.  Introducing unspecified nonlinearities into a signal path to add distortion products seems to be of little or no value to engineers, designers, et al.  It may be entertaining and even educational for audiophiles to see if they can hear differences.  But having no idea what was done to the signal seems to me to make it impossible to relate to HW or SW design.

 

Don't know if I mentioned it already (five times, maybe?), but the software lets you configure your own non-linearity. You specify it by entering the amplitudes of the desired harmonics. Take a look at any FFT plot of a real device, of, say, a 1 kHz sine wave, find the amplitudes of the harmonics, and enter them in. The exact non-linearity will be generated to produce that set of harmonics, in effect simulating the original device. You can then feed a sine wave, music, or a two- or three-tone IMD test signal through the non-linearity and confirm that it behaves like the original device. I've done this many times as part of testing with various DACs and preamps.
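For anyone wondering how a transfer function can be built from a list of harmonic levels, here is one standard way to do it - a minimal sketch, not necessarily DISTORT's actual implementation. It relies on the identity T_n(cos wt) = cos(n wt): a weighted sum of Chebyshev polynomials reproduces a requested set of harmonic amplitudes for a full-scale sine.

```python
import numpy as np

def make_nonlinearity(harmonic_dB):
    """Memoryless transfer function that, for a full-scale sine input, produces
    harmonics at the requested levels (dB relative to the fundamental).
    harmonic_dB is a dict like {2: -60.0, 3: -80.0} (hypothetical levels)."""
    coefs = np.zeros(max(harmonic_dB) + 1)
    coefs[1] = 1.0                                   # unity-gain fundamental
    for n, level in harmonic_dB.items():
        coefs[n] = 10.0 ** (level / 20.0)            # dB -> linear amplitude
    return np.polynomial.chebyshev.Chebyshev(coefs)  # f(x) = sum of a_n * T_n(x)

# Sanity check with a 1 kHz full-scale sine
fs = 48000
t = np.arange(fs) / fs
y = make_nonlinearity({2: -60.0, 3: -80.0})(np.cos(2 * np.pi * 1000 * t))
spec = 20 * np.log10(np.abs(np.fft.rfft(y)) / (len(y) / 2) + 1e-12)
print(spec[1000], spec[2000], spec[3000])            # ~0 dB, ~-60 dB, ~-80 dB
```

Feeding two or three tones through the same function then produces the corresponding IMD products, which is the point about HD and IMD arising from one and the same non-linearity.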

 


 

DISTORT also has a number of different simulation non-linearities built-in, like a triode-based preamp or various compressor types. 

 

Here are examples of some built-in transfer functions:

[four attached plots of built-in transfer functions]

 

 

I can add the ability to enter your own mathematical expression for the non-linearity; will that work for you? And how would you find the expression for the non-linearity of a specific device to enter into it?

 

Link to comment
29 minutes ago, pkane2001 said:

The exact non-linearity will be generated to produce that set of harmonics, in effect simulating the original device.

And that’s where we diverge in opinion. Simply recreating a set of harmonics does not confirm that the transfer function you inserted is in any way a simulation of any specific function of the original device.  It also can’t confirm that the observed harmonic distortion is the sole effect on the signal, or that it has the same effect(s) at the same point(s) in the signal path as the cause or causes in the original device.

 

You also don’t know if your intervention causes the same set of changes in SQ.  All you can say with certainty is that your intervention caused the addition of a given set of harmonics that are also generated at one or more points in the original device.
 

You can add the flavor of almonds to food with almond extract or with cyanide.  The taste is the same, but...........
 

 

Link to comment
20 minutes ago, bluesman said:

And that’s where we diverge in opinion. Simply recreating a set of harmonics does not confirm that the transfer function you inserted is in any way a simulation of any specific function of the original device.  It also can’t confirm that the observed harmonic distortion is the sole effect on the signal, or that it has the same effect(s) at the same point(s) in the signal path as the cause or causes in the original device.

 

You also don’t know if your intervention causes the same set of changes in SQ.  All you can say with certainty is that your intervention caused the addition of a given set of harmonics that are also generated at one or more points in the original device.
 

You can add the flavor of almonds to food with almond extract or with cyanide.  The taste is the same, but...........
 

 

 

Maybe yours is an opinion; mine is the result of actual testing and validation. As I said, I tested this on multiple devices. If the harmonic and IMD content match what is measured using the physical device, then it's a pretty good simulation, IMO (OK, so that last part is an opinion, but one based on measurements and experiment).

 

Link to comment
26 minutes ago, pkane2001 said:

 

Maybe yours is an opinion; mine is the result of actual testing and validation. As I said, I tested this on multiple devices. If the harmonic and IMD content match what is measured using the physical device, then it's a pretty good simulation, IMO (OK, so that last part is an opinion, but one based on measurements and experiment).

There are many pieces of audio gear that sound grossly different despite identical distortion measurements.  Could those measurements possibly be misleading? Or maybe there’s a lot more to that “wire with gain” than how much IMD and THD it generates.

 

Have you considered the possibility that your interventions have more effects on the signal than those you’re focused on and measuring?  Is it at all possible that distortion measurements alone could be misleading you?  Might different nonlinearities lead to different effects in addition to the same harmonic distortions?

 

You obviously don’t consider the efforts and results I posted earlier in this thread to be measurement, experimentation, testing or validation.  I’ll just have to pull myself together once I finish grieving over your disapproval.

 

PS: nice catch on your “IMO”. You were on the verge of contradicting yourself.
 

 

Link to comment
39 minutes ago, bluesman said:

There are many pieces of audio gear that sound grossly different despite identical distortion measurements.  Could those measurements possibly be misleading? Or maybe there’s a lot more to that “wire with gain” than how much IMD and THD it generates.

 

Have you considered the possibility that your interventions have more effects on the signal than those you’re focused on and measuring?  Is it at all possible that distortion measurements alone could be misleading you?  

 

You obviously don’t consider the efforts and results I posted earlier in this thread to be measurement, experimentation, testing or validation.  I’ll just have to pull myself together once I finish grieving over your disapproval.

 

PS: nice catch on your “IMO”. You were on the verge of contradicting yourself.

 

It would seem that you are consistently misinterpreting what I say or write based on your perception of "my agenda"; at least, that's what it sounds like to me. You keep bringing up things I've not said or claimed and attributing them to me. I really don't get it. Would it not be better to try to have a discussion with me, the real person, rather than with some idealized "objectivist" you seem to be arguing with? I'd certainly appreciate it.

 

Where have I claimed that distortion is the only measurement that affects audibility? Point me to that message, please. Or that THD or IMD are the only measurements that are important? Heck, I have spent many months building many different types of distortion simulations into DISTORT (non-linearity & compression, multiple kinds of jitter, different dither types, noise shaping, amplifier crossover distortion, oversampling and filters, negative and positive feedback, various types of noise, etc.), and I have plans to add a number of others. I wouldn't waste the time on these if I thought they were all unimportant.

 

But, let's ignore that for now. This discussion is not really about DISTORT, or at least it shouldn't be.

 

What efforts and results of yours did I ignore? Can you please review these for me (or point to the relevant posts), as I might have missed them? What was it that you tested, and what was the finding? I've seen a lot of opinions and arguments, but somehow missed any facts that contradict anything I've said here.

 

Link to comment
36 minutes ago, pkane2001 said:

This discussion is not really about DISTORT, or at least it shouldn't be.

And it isn't.  Go back to posts 580 and 620 for a start.  You (and others - you're not alone) keep pushing the fact that all intermodulation is distortion and is generated by nonlinearity.  You're the one who introduced your app in support of this belief - I couldn't care less about it.  I'm suggesting and supporting the belief that not all intermodulation is distortion caused by nonlinearity.  Musical notes intermodulate naturally in the air because their compression and rarefaction waves collide with the molecules in the air around them, pushing those that are randomly minding their own business into sum and difference waves that are heard and recorded as part of the program material.

 

Those intermodulations (which are NOT distortion in the true sense, as they arise from and are part of the source signal) are then recreated again from the reproduced program material being thrust into the air by speakers.  This is in addition to and apart from any intermodulation distortion caused by nonlinearities in the signal chain.  Your repeated assertion that your app proves that all IM and harmonic distortion stems from nonlinearities is simply not correct, in my opinion.  I support my opinion by having recorded pure tones with their natural intermodulation products, and showing that those intermodulation products persist in the recorded waveform even after filtering out the fundamentals that generated them.

 

Another contributor believes that these natural IM products are coming from resonances in the instruments themselves.  I cited well done research by others showing that there are no resonances in a solid body guitar anywhere near the sum and difference tones in my demo, which I believe also refutes this belief.  Similar research shows a lack of resonance anywhere near the intermodulation products found in the playing of flutes, oboes, and many other instruments.

 

Because of my belief, I'm suggesting that distortion measurements alone are misleading.  I believe that it's possible to separate the natural IM in the program from the IM created by playback of the program (entirely apart from any IMD introduced by the system).  I don't know how yet - it may be phase differences, amplitude differences, or perhaps use of real-time FFT to differentiate natural IM in the source program from the products of intermodulation between the recorded natural IM and that generated by its playback. 

 

I think this added layer of intermodulation is at least part of that famous veil that we all want removed from our music on playback.  But "IMO", believing that all IM is the product of nonlinearities and is distortion is counterproductive. And if I'm correct, inducing distortion as you advocate for evaluation, testing and development would be the wrong way to approach the problem.

Link to comment
59 minutes ago, bluesman said:

There are many pieces of audio gear that sound grossly different despite identical distortion measurements.  Could those measurements possibly be misleading? Or maybe there’s a lot more to that “wire with gain” than how much IMD and THD it generates.

 

Have you considered the possibility that your interventions have more effects on the signal than those you’re focused on and measuring?  Is it at all possible that distortion measurements alone could be misleading you?  Might different nonlinearities lead to different effects in addition to the same harmonic distortions?

 

You obviously don’t consider the efforts and results I posted earlier in this thread to be measurement, experimentation, testing or validation.  I’ll just have to pull myself together once I finish grieving over your disapproval.

 

PS: nice catch on your “IMO”. You were on the verge of contradicting yourself.
 

 

Electronic equipment and processors are not designed with musical instruments as the sole test sources.  Most often, the sources will be test recordings of all kinds, including recordings of instruments and the recorded or direct output of signal generators.  Mr @pkane2001's piece of software is effectively the same as the oh-so-typical piece of test equipment.

 

Even though a lot of audiophiles are not full EE/DSP/audio engineers or technologists, don't underestimate their competency and what they know.  It is actually *better* to have the correct, already-designed tool than to cobble together a piece of software.

 

Maybe many non-programmers don't really know how tedious it is to write even trivially useful, file-compatible audio software -- just try to figure out how to reliably read/write .wav files.   Remember, there are at least three common data variants, many sample rates, and all kinds of metadata. Then there are semi-compliant/semi-nonstandard .wav variants.  Then, does the tool have to be compatible with RF64?  (Probably not, but it still ends up being an issue in application software.)  Okay, go find a software library that does that work, but then watch the licensing...  SW licensing is yet another issue to worry about (not so much for test software, but for redistributed test software, it is).
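As a small illustration of just the "read the file reliably" step, here is a sketch that reports which variant a .wav actually is before loading it (this leans on the third-party soundfile library, one option among several; the file name is made up):

```python
import soundfile as sf

def describe_wav(path):
    """Report container format, data subtype (PCM_16 / PCM_24 / FLOAT, ...),
    sample rate and channel count, then load the audio as float64 regardless
    of how it was stored on disk."""
    info = sf.info(path)
    print(f"{path}: {info.format}/{info.subtype}, "
          f"{info.samplerate} Hz, {info.channels} ch, {info.frames} frames")
    data, rate = sf.read(path, dtype="float64")   # normalizes the common data variants
    return data, rate

# e.g. data, rate = describe_wav("two_tone_test.wav")   # hypothetical file
```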

 

When a test tool generates audio files with the appropriate/desired/needed data, that is potentially a nice shortcut for the developer or the moderately sophisticated (or more) audiophile.

 

Sure, I have a multi-threading framework in which I can write a processing program in minutes, with all of the .wav file work already done, but it still takes a few hours to create a single-purpose test tool.   If @pkane2001 has already done the work to create, in effect, the equivalent of multiple single-purpose test tools -- that saves the user/developer time.  (Properly controlled/standardized 'noise' isn't even trivially simple to do.)

 

An already-made test tool is really helpful.

Whether the tool is used for 'listening' or for SW test purposes, it all serves a similar purpose.  (Even if I could play an instrument, I doubt that I'd get the sax or clarinet out and use it as a software test input.)   Even when controlling an audio test with real instruments, one would have to be VERY careful.

 

John

 

Link to comment
10 minutes ago, bluesman said:

And it isn't.  Go back to posts 580 and 620 for a start.  You (and others - you're not alone) keep pushing the fact that all intermodulation is distortion and is generated by nonlinearity.  You're the one who introduced your app in support of this belief - I couldn't care less about it.  I'm suggesting and supporting the belief that not all intermodulation is distortion caused by nonlinearity.  Musical notes intermodulate naturally in the air because their compression and rarefaction waves collide with the molecules in the air around them, pushing those that are randomly minding their own business into sum and difference waves that are heard and recorded as part of the program material.

 

Those intermodulations (which are NOT distortion in the true sense, as they arise from and are part of the source signal) are then recreated again from the reproduced program material being thrust into the air by speakers.  This is in addition to and apart from any intermodulation distortion caused by nonlinearities in the signal chain.  Your repeated assertion that your app proves that all IM and harmonic distortion stems from nonlinearities is simply not correct, in my opinion.  I support my opinion by having recorded pure tones with their natural intermodulation products, and showing that those intermodulation products persist in the recorded waveform even after filtering out the fundamentals that generated them.

 

Another contributor believes that these natural IM products are coming from resonances in the instruments themselves.  I cited well done research by others showing that there are no resonances in a solid body guitar anywhere near the sum and difference tones in my demo, which I believe also refutes this belief.  Similar research shows a lack of resonance anywhere near the intermodulation products found in the playing of flutes, oboes, and many other instruments.

 

Because of my belief, I'm suggesting that distortion measurements alone are misleading.  I believe that it's possible to separate the natural IM in the program from the IM created by playback of the program (entirely apart from any IMD introduced by the system).  I don't know how yet - it may be phase differences, amplitude differences, or perhaps use of real-time FFT to differentiate natural IM in the source program from the products of intermodulation between the recorded natural IM and that generated by its playback. 

 

I think this added layer of intermodulation is at least part of that famous veil that we all want removed from our music on playback.  But "IMO", believing that all IM is the product of nonlinearities and is distortion is counterproductive. And if I'm correct, inducing distortion as you advocate for evaluation, testing and development would be the wrong way to approach the problem.

 


We can get into defining what constitutes a 'nonlinearity', but usually that means something like a non-fixed gain at a single frequency based on a parameter (changing gain at a given frequency vs. a parameter).  The controlling value for that 'gain' can be the instantaneous signal 'voltage' or 'pressure'.  A waveform can be distorted by a simple filter, for example, but that filter is NOT nonlinear, because no new sine-wave frequencies are generated (a simple filter doesn't vary the gain at a specific frequency vs. time).

 

Normally, new frequency components cannot be generated from a signal without nonlinearity.  Not all 'nonlinearities' are 'fixed' bends in gain curves.   Any time that you multiply a sine wave by a fixed value (other than zero), you'll get a sine wave.  The gain curve can be bent in insidious (parametric) ways.   There is circuitry whose behavior is fully dependent on these parametric effects.

 

You are probably thinking about a fixed nonlinear gain curve vs. time, but a lot of audio doesn't work that way -- distortion happens because of a changing gain curve. That is just another kind of nonlinearity, but a parametric one rather than a 'dc' (constant) kind of nonlinearity.
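A quick sketch of that distinction, with nothing assumed about any particular circuit: a fixed gain leaves a sine as a single spectral line, while a time-varying ("parametric") gain puts sidebands around it even though no static transfer curve has been bent.

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 1000 * t)               # clean 1 kHz tone

y_fixed = 0.5 * x                              # constant gain: still a single 1 kHz line
gain = 1.0 + 0.1 * np.sin(2 * np.pi * 5 * t)   # gain wobbling at 5 Hz ("parametric")
y_param = gain * x                             # time-varying gain: new components appear

spec = np.abs(np.fft.rfft(y_param)) / (len(t) / 2)
print(spec[995], spec[1000], spec[1005])       # sidebands at 995 and 1005 Hz, ~0.05 each
```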

 

Distortion is all 'relative' also -- the argument can end up being mired in sophistry.   However, traditionally, if you get new frequencies in a spectrum after sending a signal through a device, there be 'nonlinearities' in there.

 

I can agree with one sentiment -- not all nonlinearities are distortion.  It is all about the goal of the circuit/instrument/etc.   The output of an RF mixer that results from nonlinearities isn't deemed to be 'distortion', even though there are associated distortion components involved.

 

John

 

Link to comment
17 minutes ago, bluesman said:

And it isn't.  Go back to posts 580 and 620 for a start.  You (and others - you're not alone) keep pushing the fact that all intermodulation is distortion and is generated by nonlinearity.  You're the one who introduced your app in support of this belief - I couldn't care less about it.  I'm suggesting and supporting the belief that not all intermodulation is distortion caused by nonlinearity.  Musical notes intermodulate naturally in the air because their compression and rarefaction waves collide with the molecules in the air around them, pushing those that are randomly minding their own business into sum and difference waves that are heard and recorded as part of the program material.

 

Those intermodulations (which are NOT distortion in the true sense, as they arise from and are part of the source signal) are then recreated again from the reproduced program material being thrust into the air by speakers.  This is in addition to and apart from any intermodulation distortion caused by nonlinearities in the signal chain.  Your repeated assertion that your app proves that all IM and harmonic distortion stems from nonlinearities is simply not correct, in my opinion.  I support my opinion by having recorded pure tones with their natural intermodulation products, and showing that those intermodulation products persist in the recorded waveform even after filtering out the fundamentals that generated them.

 

Another contributor believes that these natural IM products are coming from resonances in the instruments themselves.  I cited well done research by others showing that there are no resonances in a solid body guitar anywhere near the sum and difference tones in my demo, which I believe also refutes this belief.  Similar research shows a lack of resonance anywhere near the intermodulation products found in the playing of flutes, oboes, and many other instruments.

 

Because of my belief, I'm suggesting that distortion measurements alone are misleading.  I believe that it's possible to separate the natural IM in the program from the IM created by playback of the program (entirely apart from any IMD introduced by the system).  I don't know how yet - it may be phase differences, amplitude differences, or perhaps use of real-time FFT to differentiate natural IM in the source program from the products of intermodulation between the recorded natural IM and that generated by its playback. 

 

I think this added layer of intermodulation is at least part of that famous veil that we all want removed from our music on playback.  But "IMO", believing that all IM is the product of nonlinearities and is distortion is counterproductive. And if I'm correct, inducing distortion as you advocate for evaluation, testing and development would be the wrong way to approach the problem.

 

I think there's probably a terminology issue here, so perhaps we should first define our terms. I'm using the following definitions as they are normally used in audio:

 

Distortion: any frequency content that was not present in the original signal

Intermodulation Distortion: amplitude modulation of one signal by another caused by a non-linearity in the signal chain

 

From Wikipedia

Quote

Intermodulation (IM) or intermodulation distortion (IMD) is the amplitude modulation of signals containing two or more different frequencies, caused by nonlinearities or time variance in a system. The intermodulation between frequency components will form additional components at frequencies that are not just at harmonic frequencies (integer multiples) of either, like harmonic distortion, but also at the sum and difference frequencies of the original frequencies and at sums and differences of multiples of those frequencies. Intermodulation is caused by non-linear behaviour of the signal processing (physical equipment or even algorithms) being used. 

 

I highlighted the last sentence to show that it's not just my opinion when I state this.

 

Now,

  1. If you are saying that IMD is coming from something other than the non-linear behavior of some device or medium, then you are using a different definition, according to the above
     
  2. If you are saying that IMD is not a distortion, then you are also contradicting the definition, since by definition, IMD introduces new frequencies that were not present in the original signal and therefore it is a form of distortion

Now, the two posts you mention:

 

580: It all seems correct to me; what's the issue here?

620: Let me understand: you found something that you're calling IMD in the capture of a guitar tone. Could it be simple amplitude modulation? Not all amplitude modulation is IMD, but all IMD is amplitude modulation.

 

Quote

Your repeated assertion that your app proves that all IM and harmonic distortion stems from nonlinearities is simply not correct, in my opinion.  

 

You are again misquoting what I said. I said, and repeated, and will repeat one more time: IMD and HD are caused by the same non-linearity. What this means is that, given a non-linear audio device, HD will be the result when testing with a single tone, and IMD will be the result of multiple tones passing through the same non-linearity. THAT IS THE EXTENT OF MY CLAIM. Anything else you misread or misinterpreted. 
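To put that claim in concrete terms, here is a minimal sketch (the second-order non-linearity and the 220/246 Hz tones are arbitrary illustrations, not measurements of any device):

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs
nl = lambda s: s + 0.2 * s**2          # one and the same (arbitrary) non-linearity

tone_a = np.sin(2 * np.pi * 220 * t)   # roughly the two guitar notes discussed earlier
tone_b = np.sin(2 * np.pi * 246 * t)

def level(sig, f_hz):
    """Amplitude of the spectral component at f_hz (1 Hz bins at this length)."""
    return np.abs(np.fft.rfft(sig))[int(f_hz)] / (len(sig) / 2)

# Two tones summed linearly: they beat, but there is no line at 26 or 466 Hz
print(level(tone_a + tone_b, 26), level(tone_a + tone_b, 466))          # ~0, ~0
# One tone through the non-linearity: harmonic distortion only (440 Hz)
print(level(nl(tone_a), 440), level(nl(tone_a), 26))                    # ~0.1, ~0
# Both tones through the same non-linearity: harmonics plus IMD at 26 and 466 Hz
print(level(nl(tone_a + tone_b), 26), level(nl(tone_a + tone_b), 466))  # ~0.2, ~0.2
```

The 26 Hz and 466 Hz components appear only after the combined signal passes through something non-linear; the linear sum merely beats.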

 

Quote

And if I'm correct, inducing distortion as you advocate for evaluation, testing and development would be the wrong way to approach the problem.

 

Where did I advocate using distortion for evaluation, testing or development? DISTORT is an audibility-testing tool that simulates various distortions. It is meant to let me or others evaluate how various distortions sound, using a mathematical model of the distortion applied to any recorded piece of music or test signal. Nothing more.

 

Quote

I think this added layer of intermodulation is at least part of that famous veil that we all want removed from our music on playback.

 

At least that's something you could easily test for yourself with DISTORT. But you refuse, so that's on you. You have nothing to back this up, except for an opinion.

 

Quote

I believe that it's possible to separate the natural IM in the program from the IM created by playback of the program (entirely apart from any IMD introduced by the system).  I don't know how yet - it may be phase differences, amplitude differences, or perhaps use of real-time FFT to differentiate natural IM in the source program from the products of intermodulation between the recorded natural IM and that generated by its playback. 

 

Just check out my other software, DeltaWave (I hate to bring this up, as it will likely start another tangential argument). It does exactly what you want: separate all the distortions caused by the reproduction chain from those present in the source material. Yes, including IMD, AM, FM, phase, amplitude, or any other kind of distortion.
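For anyone curious what a null comparison looks like in principle, here is a bare-bones sketch of the idea (not DeltaWave's actual algorithm, which also handles clock drift, phase and level differences far more carefully):

```python
import numpy as np

def null_residual(reference, capture):
    """Crude null test: align the capture to the reference by cross-correlation,
    match gain by least squares, subtract, and report the residual level in dB
    relative to the reference. O(n^2) correlation, so keep the clips short."""
    n = min(len(reference), len(capture))
    ref, cap = np.asarray(reference[:n], float), np.asarray(capture[:n], float)
    lag = np.argmax(np.correlate(cap, ref, mode="full")) - (n - 1)
    if lag > 0:                       # capture is delayed relative to the reference
        ref, cap = ref[:n - lag], cap[lag:]
    elif lag < 0:                     # capture leads the reference
        ref, cap = ref[-lag:], cap[:n + lag]
    g = np.dot(cap, ref) / np.dot(cap, cap)      # least-squares gain match
    resid = ref - g * cap
    return 20 * np.log10(np.sqrt(np.mean(resid**2)) / np.sqrt(np.mean(ref**2)) + 1e-12)

# e.g. print(null_residual(original_samples, dac_capture_samples))   # hypothetical arrays
```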

 

Quote

But "IMO", believing that all IM is the product of nonlinearities and is distortion is counterproductive.

 

By definition, as stated above, this is incorrect.

 

Link to comment
5 minutes ago, pkane2001 said:

Distortion: any frequency content that was not present in the original signal

The natural intermodulation products in the source program ARE in the original signal.  So they're not distortion.

 

7 minutes ago, pkane2001 said:

If you are saying that IMD is not a distortion, then you are also contradicting the definition

IMD is intermodulation distortion, which is the addition of IM products that are not in the source material.  Intermodulation is the natural interaction of multiple tones and is in the source.  All IMD results from intermodulation.  But not all intermodulation is distortion.

 

10 minutes ago, pkane2001 said:

Let me understand: you found something that you're calling IMD in the capture of a guitar tone. Could it be simple amplitude modulation? Not all amplitude modulation is IMD, but all IMD is amplitude modulation.

The only IM in a single note played on a guitar occurs among the fundamental and its harmonics - this is what gives each instrument its own characteristic sound.  What I presented is a capture of the intermodulation products of two guitar tones at ~220 and ~246 Hz. 

 

All IMD is AM.  All IM without a D is also AM.  It seems that we differ largely over your belief that there is no such thing as normal, natural acoustic intermodulation among the notes in a musical performance. You seem to think that any and all intermodulation is distortion, and I strongly believe that you're wrong in this.

 

2 hours ago, pkane2001 said:

This discussion is not really about DISTORT, or at least it shouldn't be.

Then stop mentioning it.  You brought it up again twice in post #676.

 

40 minutes ago, pkane2001 said:

Just check out my other software, DeltaWave...It does exactly what you want: separate all the distortions caused by the reproduction chain from those present in the source material.

You're probably going to respond that I don't understand what you're saying - but I don't have a clue what you mean by distortion "present in the source material".  What kind of distortion do you think there is in a live performance by a symphony orchestra, an acoustic jazz trio, a string quartet or a piano concerto?  And what do you think is causing it?  Do you hear it at a live performance?

 

If the intermodulation products among instruments and voices are distortion, there can be no such thing as an undistorted musical performance.  If that's true, why are we wasting our time trying to reproduce one?

Link to comment

Before the harmonics can start beating against each other, there must be harmonics. I didn't know where these come from. For anyone else in the same boat, I offer this: https://music.stackexchange.com/questions/77472/what-causes-overtones-at-harmonic-frequencies-in-an-instrument

 

Now to hunt down a little more on intermodulation, harmonics beating against each other, or whatever anyone wants to call it.


Link to comment

So I'm completely new to this, and what I'm reading on pp 40-41 of the following source sounds like what @bluesman is describing in instrumental tuning (and performance?), but not like the sum and difference frequencies of intermodulation distortion. So can folks who know tell me how clueless I am? 🙂

 

http://kellerphysics.com/acoustics/Lapp.pdf


Link to comment
