opus101 Posted November 27, 2020
Just now, pkane2001 said: "But again, you've ignored the fact that I specifically described the context of how I was using these words in the original post."
I haven't ignored anything here, ISTM. But point to the particular thing you're claiming I've ignored; maybe I did miss something. After all, I'm not infallible.
pkane2001 (Author) Posted November 27, 2020
2 minutes ago, opus101 said: "I haven't ignored anything ISTM here. But point to the particular thing you're claiming I've ignored, maybe I did miss something. After all, I'm not infallible."
Not a chance. This has already gone on way too long for no good reason. I'm not going to start from the beginning.
-Paul
DeltaWave, DISTORT, Earful, PKHarmonic, new: Multitone Analyzer
opus101 Posted November 27, 2020
Just now, pkane2001 said: "This has already gone way too long for no good reason."
That's your perception, which you're entitled to. My own is radically different - it's been most enlightening.
fas42 Posted November 27, 2020
16 minutes ago, pkane2001 said: "I didn't say we shouldn't study perception -- I just said it's a much more complicated field because of all the different components that go into forming a perception."
Yes, we need to study the perception of the reproduction event. This can be broken down in a highly analytical way - which is equivalent to the mob trying to work out what an elephant, in the dark, actually is 😁 - or we can step back, "throw a bit of light on the scene", and see if we agree that indeed it looks like we have an African animal in front of us. The latter I believe is the 'right way' of understanding - a Gestalt result is possible, IME.
pkane2001 (Author) Posted November 27, 2020
3 minutes ago, opus101 said: "That's your perception, which you're entitled to. My own is radically different - it's been most enlightening."
I'm happy for you, oh Enlightened One! It wasn't all wasted time for me either: I did manage to put away all the leftover food from the Thanksgiving dinner during this time, and believe me, it was a lot!
-Paul
opus101 Posted November 27, 2020
Then there was at least one 'good reason'!
Summit Posted November 27, 2020
7 hours ago, pkane2001 said: "I'll let you argue your case at Merriam-Webster. If they agree to change their definition, I'll agree with their decision :) But again, you've ignored the fact that I specifically described the context of how I was using these words in the original post. You, instead, decided to use another context outside my message and have been arguing about this for over an hour. Sorry I wasn't much clearer. Next time I'll include links to dictionary definitions of all the words I use."
I don't think so. It is clear what you meant to say, and in what context. You wrote not about sound per se, but about audio reproduction, and you even stressed that by declaring it was for a different location and time. Audio reproduction implies listening.
"Audio is an activity that aims to reproduce a physical phenomenon — sound, at a different location and time. This part of the activity has nothing to do with the senses and can be studied and measured using existing instruments." (boldface added by me)
Summit Posted November 27, 2020
9 hours ago, pkane2001 said: "I thought I was very clear to make that exact distinction, between the physical and the perceived."
I may have missed it. What was the exact distinction between the physical and the perceived that you made?
pkane2001 (Author) Posted November 27, 2020
1 hour ago, Summit said: "I may have missed it. What was the exact distinction between the physical and the perceived that you made?"
One is easy to measure; the other is not.
-Paul
Jud Posted November 27, 2020
On 11/26/2020 at 1:07 PM, jabbr said: "Of course not as important as distinguishing the differences in SQ supplied by different power supplies, this line of research: http://europepmc.org/article/PMC/5084724 Komisaruk demonstrates that imagined sensory stimulation lights up exactly the same areas in the brain on fMRI as actual sensory stimulation."
Perhaps lends new meaning to the old carol "Do you hear what I hear?"
One never knows, do one? - Fats Waller
The fairest thing we can experience is the mysterious. It is the fundamental emotion which stands at the cradle of true art and true science. - Einstein
Computer, Audirvana -> optical Ethernet to Fitlet3 -> Fibbr Alpha Optical USB -> iFi NEO iDSD DAC -> Apollon Audio 1ET400A Mini (Purifi based) -> Vandersteen 3A Signature.
Jud Posted November 27, 2020
8 hours ago, pkane2001 said: "One is easy to measure; the other is not."
Though the example of soundstage that is often given seems to me a bad one, since playing around with phase can move perceived location and expand or contract the soundstage pretty precisely.
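Jud's point - that manipulating interchannel timing shifts perceived location - can be illustrated with a minimal NumPy sketch. This is not anyone's production code, just a toy example: delaying one channel of a stereo pair by a fraction of a millisecond (well under the ~1 ms maximum natural interaural time difference) pulls the phantom image toward the earlier channel. The function name and parameter values are my own for illustration.

```python
import numpy as np

def delay_pan(mono, fs, itd_ms):
    """Build a stereo pair from a mono signal by delaying one channel.

    A small interchannel delay shifts the phantom image toward the
    earlier (undelayed) channel. Positive itd_ms delays the right
    channel, moving the image left; negative delays the left channel.
    """
    shift = int(round(abs(itd_ms) * 1e-3 * fs))      # delay in whole samples
    delayed = np.concatenate([np.zeros(shift), mono])  # lagging channel
    padded = np.concatenate([mono, np.zeros(shift)])   # leading channel
    if itd_ms >= 0:
        left, right = padded, delayed   # right lags: image moves left
    else:
        left, right = delayed, padded   # left lags: image moves right
    return np.stack([left, right], axis=1)

fs = 48000
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 440 * t)   # 1 s, 440 Hz test tone
stereo = delay_pan(tone, fs, 0.5)          # 0.5 ms lag on the right channel
```

Writing `stereo` out to a WAV file and listening on headphones makes the image shift easy to hear, which is the "pretty precise" control over perceived location Jud describes.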
Kal Rubinson Posted November 27, 2020
23 minutes ago, Jud said: "Perhaps lends new meaning to the old carol 'Do you hear what I hear?'"
Or, perhaps, do you hear where I hear?
Kal Rubinson
Senior Contributing Editor, Stereophile
fas42 Posted November 27, 2020
52 minutes ago, Jud said: "Though the example of soundstage that is often given seems to me a bad one, since playing around with phase can move perceived location and expand or contract soundstage pretty precisely."
Playing with phase has zero to do with the soundstage that matters - it's as crude as Viewmaster games. Soundstage, when well retrieved, allows one to hear the precise nature of every sound element in the mix, no matter how complex the construction of the scene - the precise location in terms of lateral and depth positioning, and every aspect of the acoustic associated with that sound element, which can be completely independent of the acoustics of all the other sound elements. In minimalist recordings of course there is a single acoustic - but that's by design.
The other characteristic is that the soundstages completely change when the music changes. Put on a compilation of memory hits, say, and a new vista opens when going to the next track - it's like going to a different theatre and stage production for each song of a musical.
pkane2001 (Author) Posted November 27, 2020
1 hour ago, Jud said: "Though the example of soundstage that is often given seems to me a bad one, since playing around with phase can move perceived location and expand or contract soundstage pretty precisely."
Is soundstage a physical property of sound, or something derived by the brain, i.e., perceived?
-Paul
firedog Posted November 27, 2020
5 minutes ago, pkane2001 said: "Is soundstage a physical property of sound or something derived by the brain, i.e., perceived?"
It's a perception derived from certain physical characteristics of sound in a particular place/space. Humans have their own ways of perceiving sound. Some of it is hardwired, and some of it is a learned response/interpretation. It's likely that other animals perceive the same sound differently than us. I'm also not sure that individual humans are consistent over time in how they perceive the same sound/soundstage. Nor am I sure that humans raised in very different cultures/environments would perceive them the same.
Main listening (small home office): Main setup: Surge protector +>Isol-8 Mini sub Axis Power Strip/Isolation>QuietPC Low Noise Server>Roon (Audiolense DRC)>Stack Audio Link II>Kii Control>Kii Three (on their own electric circuit) >GIK Room Treatments. Secondary Path: Server with Audiolense RC>RPi4 or analog>Cayin iDAC6 MKII (tube mode) (XLR)>Kii Three BXT Bedroom: SBTouch to Cambridge Soundworks Desktop Setup. Living Room/Kitchen: Ropieee (RPi3b+ with touchscreen) + Schiit Modi3E to a pair of Morel Hogtalare.
All absolute statements about audio are false
pkane2001 (Author) Posted November 27, 2020
33 minutes ago, fas42 said: "Playing with phase has zero to do with the soundstage that matters - it's as crude as Viewmaster games. Soundstage when well retrieved allows one to hear the precise nature of every sound element in the mix, no matter how complex the construction of the scene - the precise location in terms of lateral and depth positioning, and every aspect of the acoustic associated with that sound element; which can be completely independent of the acoustics of all the other sound elements. In minimalist recordings of course there is a single acoustic - but that's by design. The other characteristic is that the soundstages completely change - when the music changes. Put on a compilation of memory hits, say, and a new vista opens, when going to the next track - it's like going to a different theatre and stage production, for each song of a musical."
Nice story, Frank, except it goes against everything known about spatial hearing. Position in space is determined primarily by timing (phase) and level differences between the two ears, as well as reverb that helps with depth perception.
-Paul
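The timing cue Paul mentions has a standard textbook approximation: the Woodworth spherical-head model, which gives the interaural time difference (ITD) as a function of source azimuth. A small sketch, assuming a typical head radius of 8.75 cm and a speed of sound of 343 m/s (both assumed values, not from the thread):

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Approximate interaural time difference (seconds) for a distant
    source, using the classic Woodworth spherical-head model:
        ITD = (r / c) * (theta + sin(theta))
    r = head radius, c = speed of sound, theta = azimuth in radians.
    0 deg is straight ahead; 90 deg is directly to one side.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + math.sin(theta))

# ITD grows from zero straight ahead to roughly 0.66 ms at 90 degrees.
for az in (0, 30, 60, 90):
    print(f"{az:3d} deg -> {woodworth_itd(az) * 1e6:6.1f} us")
```

The whole usable range of the cue is under a millisecond, which is why timing differences between the ears matter even though they are far too small to notice as timing per se.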
fas42 Posted November 27, 2020
10 minutes ago, pkane2001 said: "Nice story, Frank, except it goes against everything known about spatial hearing. Position in space is determined primarily by timing (phase) and level differences between the two ears, as well as reverb that helps with depth perception."
One can do Viewmaster experiments with hearing - but it doesn't give the bigger picture 😉. Phase is indeed important with competent playback; how it manifests is that the phantom image of a true mono recording "follows you" as you move laterally in front of the speakers, as one behaviour. But directly playing with phase, while recording, as a means of creating space is purely a gimmick. The reverb of the spaces used in the recording is captured, and when adequately reproduced allows the mind to decode the staging of the recording - this is the 'transformation' that occurs when SQ reaches the requisite level ...
semente Posted November 27, 2020
7 minutes ago, fas42 said: "The reverb of the spaces used in the recording is captured, and when adequately reproduced allows the mind to decode the staging of the recording - this is the 'transformation' that occurs when SQ reaches the requisite level ..."
Unless you're listening to (most but not all) classical music, all reverb is an add-on effect, as is phantom image location. In sum, the whole soundstage is fabricated, unreal and uncaptured, like a photomontage.
"Science draws the wave, poetry fills it with water" - Teixeira de Pascoaes
HQPlayer Desktop / Mac mini → Intona 7054 → RME ADI-2 DAC FS (DSD256)
pkane2001 (Author) Posted November 27, 2020
11 minutes ago, fas42 said: "One can do Viewmaster experiments, with hearing - but it doesn't give the bigger picture, 😉. Phase is indeed important with competent playback; how it manifests is that the phantom image with a true mono recording 'follows you' as you move laterally in front of the speakers - as one behaviour. But direct playing with phase, while recording, as a means of creating space is purely a gimmick. The reverb of the spaces used in the recording is captured, and when adequately reproduced allows the mind to decode the staging of the recording - this is the 'transformation' that occurs when SQ reaches the requisite level ..."
I suspect you're talking about absolute phase, and that indeed is not involved in spatial hearing. Phase is.
-Paul
fas42 Posted November 27, 2020
36 minutes ago, semente said: "Unless you're listening to (most but not all) classical music then all reverb is an add-on effect as is phantom image location. In sum, the whole soundstage is fabricated, unreal and uncaptured, like a photomontage."
Yes, the soundstage may be fabricated. No, it doesn't sound unreal - each element has its own integrity, and can exist in its own right. A good analogy is the docos which analyse how a famous album was put together - someone who was close to the creative process plays a multi-track of it, and pulls all but one of the sliders down, so that you only hear the contribution of one musician. This doesn't sound "weird", because it's just one sound. It's certainly captured, because the microphone was on - in a fully synthesized piece, the reverb is added via software, say ... but IME this takes nothing away from the impact of the piece.
Nothing like using a visual image where there is an intentional, massive clash of the parts 😁 ... for some strange reason, musicians typically don't want there to be obvious clashes - that's why recording engineers are employed - and I guess the latter won't keep getting paid if the result comes across as a mess ... 😉.
fas42 Posted November 27, 2020
44 minutes ago, pkane2001 said: "I suspect you're talking about absolute phase, and that indeed is not involved in spatial hearing. Phase is."
The phase being constant along the signal path is not what I'm talking about - inverting the phase doesn't bother me; if I play something in Audacity, duplicate it and invert the duplicate, and switch between the original and the inverted version, then they "sound the same" - to me. Note, it has no bass, so that will make a difference. Inverting one channel versus the other makes the listening "weird" - it feels like it's pulling my head to one side; an uncomfortable feeling.
"Spatial hearing" needs a good definition to argue around - one can "hear the space" in good playback; sub-par rendering throws up cardboard cutouts, which have to work hard to hold one's interest ... it doesn't have to be that way.
pkane2001 (Author) Posted November 28, 2020
10 minutes ago, fas42 said: "The phase being constant along the signal path is not what I'm talking about - inverting of phase doesn't bother me; if I play something in Audacity, and do a duplicate and invert that, and switch between the original, and the inverted then they 'sound the same' - to me. Note, it has no bass, so that will make a difference. Inverting one channel, versus the other, makes the listening 'weird' - it feels like it's pulling my head to one side; an uncomfortable feeling. 'Spatial hearing' needs a good definition to argue around - one can 'hear the space' in good playback; sub-par rendering throws up cardboard cutouts, which have to work hard to hold one's interest ... it doesn't have to be that way."
Phase inversion or absolute phase is also not what I'm talking about. But signal phase differences between the two ears carry a lot of information about position/soundstage, as do level differences. I'll post the title of a good book on spatial hearing I read recently. Maybe it'll help with the definition.
-Paul
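There is a simple reason the interaural phase cue Paul describes only works at low frequencies: for a pure tone, the interaural phase difference is 360 × f × ITD degrees, and once that exceeds half a cycle the brain can no longer tell which ear leads. Duplex theory says level differences take over above that point. A back-of-the-envelope sketch (the 0.66 ms worst-case ITD is an assumed round number, and the resulting ~758 Hz crossover is a simplification of the usual "ITD below ~1.5 kHz" rule of thumb):

```python
MAX_ITD_S = 0.00066  # assumed worst-case interaural time difference (source at 90 deg)

def interaural_phase_deg(freq_hz, itd_s):
    """Interaural phase difference for a pure tone: IPD = 360 * f * ITD."""
    return 360.0 * freq_hz * itd_s

def itd_cue_unambiguous(freq_hz, itd_s=MAX_ITD_S):
    """The phase cue wraps past half a cycle once f > 1 / (2 * ITD);
    above that, which ear leads becomes ambiguous."""
    return interaural_phase_deg(freq_hz, itd_s) <= 180.0

# With a 0.66 ms worst-case ITD, the ambiguity sets in near 758 Hz:
crossover_hz = 1.0 / (2.0 * MAX_ITD_S)
```

So a 500 Hz tone carries an unambiguous phase cue while a 1.5 kHz tone does not, which is consistent with the distinction Paul is drawing between phase and level differences.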
fas42 Posted November 28, 2020
Just found this piece: https://www.audiology.org/news/spatial-hearing
"The authors report that the effects of reverberation on sound localization are 'surprisingly mild.' This is particularly interesting because a single sound source in a hard-walled room will eventually be perceived at the listener's ears from every possible direction. However, the 'precedence effect' (also known as the 'law of the first wavefront') allows listeners to generally identify the direction of the original sound source. Indeed, reverb can be useful in locating sound sources, such as when echoes provide information about the structure of the listening environment. For example, when blind people use a cane to send out tapping sounds and then listen to the responding sounds and echoes."
So, there are two things going on - first, the position or direction of the source of the sound can be worked out; second, the environment that the sound is occurring in can be deduced by the listening brain. This happens automatically all the time in the real world - and also occurs with playback, when the information being presented is clear enough.
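The precedence effect quoted above is usually summarized by rough delay windows between a leading sound and its reflection. The boundary values below are illustrative figures commonly cited in the spatial-hearing literature, not exact thresholds - they shift with signal type (clicks fuse over a much shorter window than speech or music):

```python
def localization_regime(lag_ms):
    """Rough perceptual regime for a lead/lag sound pair, using
    illustrative boundaries (assumptions, not fixed constants):
      below ~1 ms : summing localization - image sits between sources
      ~1 to 30 ms : precedence effect - one fused image at the lead source
      above ~30 ms: the lag is heard as a separate echo
    """
    if lag_ms < 1.0:
        return "summing localization"
    elif lag_ms <= 30.0:
        return "precedence (first wavefront wins)"
    else:
        return "separate echo"
```

This is why a room full of reflections arriving from every direction still yields one clearly located source: the first wavefront dominates, and the later arrivals are folded into the sense of the space rather than heard as competing positions.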
pkane2001 (Author) Posted November 28, 2020
46 minutes ago, fas42 said: "Just found this piece, https://www.audiology.org/news/spatial-hearing ... So, there are two things going on - first, the position or direction of the source of the sound can be worked out; second, the environment that the sound is occurring in can be deduced by the listening brain. This happens automatically all the time in the real world - and also occurs with playback, when the information being presented is clear enough."
Here are a couple of books that I like on this subject. The first is interesting because it discusses other effects, such as reverb and comb filtering, sound decay, pitch changes, etc., in addition to the standard ITD/ILD models, and also has a chapter on how sound is generated by various musical instruments:
https://www.researchgate.net/publication/320537658_Acoustics_and_psychoacoustics_Fifth_edition
This is a more in-depth treatment, written more like a textbook:
https://www.researchgate.net/publication/303018400_The_Auditory_System_and_Human_Sound_Localization_Behavior
-Paul
jabbr Posted November 28, 2020
6 hours ago, Kal Rubinson said: "Or, perhaps, do you hear where I hear?"
Have you ever been up close at a live concert where the singer walks back and forth on the stage with a microphone? I've heard the music coming from the singer's mouth, until I do a double-take and realize that the microphone feeds an on-stage amp with a fixed speaker.
Custom room treatments for headphone users.