
Break-in time Parasound P5 HALO Preamp


Jamesroy

Recommended Posts

ok, we now have a bifurcated thread (just like Tim Berners-Lee intended when he decided to destroy primate communications)

 

as to burn-in -- it is possible that some subtle changes occur that can be perceived in high-zoot audio gear on music but that cannot be heard in voice comm (and would not be something one would test for anyway in mil or industrial applications)

 

 

although there is a much greater likelihood of perceptual changes with experience as the "burn time" continues, not to mention the very high probability of mere confirmation bias

 

as usual, the burden of proof is on the proponents, though anyone is free to toss their dollars at any legal product without a rational basis

Link to comment
George - your position is that all RCA interconnects sound the same - do I have that right?

(mine is that even if they don't - none sound better than the others, just different)

and I think w-doc's position is that some RCA interconnects sound much better than others(?) - doubtless ones that are much more expensive

NOTE: I am posting the above purely on a subjective or perceptual basis - not ascribing any differences to any particular mechanistic explanation

Do you not see that any phenomenon that is real MUST have some mechanism that explains it? Yet cables have only three attributes: resistance, inductance, and capacitance. The effect of the latter two is dictated by the frequency the cable is carrying: the higher the frequency, the more the cable will attenuate the signal. In audio, we are nominally talking about 20 kHz as the highest frequency we care about. If one looks up the capacitance and inductance of the coax types commonly used in audio interconnects and calculates the cable's impedance from those specs, one finds that at 20 kHz, over the 1 to 6 ft lengths usually used for home audio, the attenuation of a 20 kHz signal is a small fraction of a dB compared to 1 kHz. At frequencies lower than 20 kHz, the attenuation is even lower, even more inaudible, and ultimately unmeasurable as well. At anything below about 15 kHz, coax is essentially lossless. Now I have no doubt that the fancy and expensive cables these "boutique" cable makers sell have much lower inductance and capacitance than normally spec'd coax like RG-59U, but when the effects of the cheap stuff are inaudible to any human, how could more expensive cables be any better?
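To put rough numbers on that claim, here is a quick back-of-the-envelope sketch. The per-foot capacitance and inductance, the 600-ohm source impedance, and the 47k-ohm load are assumed typical figures for RG-59-style coax and line-level gear (not measurements of any particular cable), and the lumped RLC divider is a deliberate simplification:

```python
# Hypothetical illustration: attenuation at audio frequencies for a short
# interconnect modeled as a lumped source-R / series-L / shunt-C / load-R divider.
# All component values are assumptions, not measurements of any real product.

import math

def cable_attenuation_db(freq_hz, length_ft,
                         c_per_ft=67e-12,   # ~67 pF/ft, assumed typical for RG-59-style coax
                         l_per_ft=0.1e-6,   # ~0.1 uH/ft, assumed typical
                         r_source=600.0,    # assumed line-level output impedance (ohms)
                         r_load=47e3):      # assumed preamp input impedance (ohms)
    """Attenuation (dB) of the lumped divider, referenced to its own DC response."""
    w = 2 * math.pi * freq_hz
    c = c_per_ft * length_ft
    l = l_per_ft * length_ft
    z_cap = complex(0.0, -1.0 / (w * c))           # cable capacitance (shunt)
    z_shunt = (z_cap * r_load) / (z_cap + r_load)  # capacitance in parallel with the load
    z_series = complex(r_source, w * l)            # source resistance plus cable inductance
    gain = abs(z_shunt / (z_series + z_shunt))
    gain_dc = r_load / (r_source + r_load)         # resistive divider alone, as the reference
    return -20.0 * math.log10(gain / gain_dc)

for f in (1e3, 20e3):
    print(f"{f / 1e3:5.0f} kHz: {cable_attenuation_db(f, length_ft=6):.4f} dB")
```

Run as-is, it prints essentially 0.0000 dB at 1 kHz and on the order of a thousandth of a dB at 20 kHz for a 6 ft run -- the "small fraction of a dB" described above.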

 

I'll grant you that the boutique cables will most likely measure much better at 100 MHz and higher than will run-of-the-mill cables, but that has nothing to do with audio.

George

Link to comment

George - I'm a scientist. Not all mechanisms are known, or my buddies and I would have nothing to do.

 

Also, if you are aware of tests showing humans cannot distinguish among impulses with components > 20 kHz, then let me know (just as an outside possibility)

 

It is not hard to do a blind A/B/X on cables... I want w-doc to show me those tests...
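For what it's worth, scoring such a test is not hard either. A minimal sketch, assuming a made-up 16-trial session and a simple one-sided binomial criterion (not any particular published protocol):

```python
# Score a blind A/B/X run: count correct identifications and compute the chance
# of doing at least that well by pure guessing (p = 0.5 per trial).
# The 16-trial session below is an arbitrary illustration, not a standard.

from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Probability of getting at least `correct` right out of `trials` by chance alone."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(f"14/16 correct: p = {abx_p_value(14, 16):.4f}")  # ~0.002, hard to explain as guessing
print(f" 9/16 correct: p = {abx_p_value(9, 16):.4f}")   # ~0.40, consistent with guessing
```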

Link to comment
Explain to me please what is going on circuit-wise during this "break-in" phase. What is changing, and what is the mechanism facilitating this change?

I majored in Electronics Engineering and minored in Physics. I have worked for a number of military contractors including Lockheed Missiles and Space Co., GTE Sylvania, Motorola, etc., and I have worked for the Mil-Spec departments of semiconductor companies such as National, Zicor, and Oki. In all that time, I have never seen nor heard of a concept for electronics called "break-in". The military and aerospace industries do "burn-in", but that's a static procedure designed to test reliability and weed out infant mortality; products are not expected to change their specifications over that time unless a failure occurs.

Why would audio products get "better" with age while other electronics products do not? If you could answer that for me, I'd really be grateful, because nobody else has ever been able to do so.
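To illustrate what that burn-in procedure screens for, here is a rough sketch with made-up Weibull reliability numbers (the scale, shape, and burn-in length are assumptions, not data from any of those employers or products):

```python
# Rough illustration of infant mortality: with a decreasing hazard rate
# (Weibull shape < 1), weak units fail early, so a population that survives
# the burn-in window fails at a much lower rate afterwards.

import random

random.seed(0)
SCALE_H, SHAPE = 50_000.0, 0.5     # assumed Weibull scale (hours) and shape < 1
BURN_IN_H = 168.0                  # one week of powered burn-in (assumption)

lifetimes = [random.weibullvariate(SCALE_H, SHAPE) for _ in range(100_000)]

failed_in_burn_in = sum(t <= BURN_IN_H for t in lifetimes)
survivors = [t for t in lifetimes if t > BURN_IN_H]
failed_next_week = sum(t <= 2 * BURN_IN_H for t in survivors)

print(f"failed during the burn-in week:    {failed_in_burn_in / len(lifetimes):.1%}")
print(f"survivors failing the week after:  {failed_next_week / len(survivors):.1%}")
# Burn-in changes which units ship, not how a good unit performs or measures.
```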

I can't answer your question; I'm just going on posts in this forum about whether there is a break-in time or not.

Link to comment
I can't answer your question; I'm just going on posts in this forum about whether there is a break-in time or not.

 

Well, Jamesroy, it was sort of a rhetorical question. You see, nobody can answer the question because nobody knows. There is no known reason for the phenomenon called "break-in" of electronic components or wires. Some audio engineers have used sophisticated measurement equipment to find what the difference is between new, out-of-the-box components and those that are thoroughly "broken-in". No difference has been found. This has led many to attribute the phenomenon to users getting used to the sound of their new components; i.e., it's the listener who is "breaking in", not the equipment!

Now, I'm of the (informed) opinion that it's another of those audiophile myths, like cable lifts designed to keep speaker cables up off the floor, Tice clocks, myrtle wood blocks placed on top of components to improve the sound, audio interconnects having a sound of their own, and, most ridiculous of all, the notion that audio cables (wires) need to "burn in".

The exception is transducers (speakers, headphones, phono cartridges, microphones, etc.); many of those DO go through a break-in period, but that's merely a mechanical process whereby suspensions, cones, and other moving parts get more supple over time. IOW, it's real, with actual physics behind it.
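To put an illustrative number on that mechanical effect: a driver's free-air resonance is fs = 1/(2π√(Mms·Cms)), so as the suspension compliance Cms rises with use, fs drops. The moving mass, starting compliance, and 15% compliance change below are assumed, generic mid-woofer figures, not measurements of any real driver:

```python
# Illustrative only: assumed Thiele-Small values for a generic mid-woofer,
# showing how a suppler suspension (higher compliance) lowers the resonance.

import math

def resonant_freq_hz(moving_mass_kg: float, compliance_m_per_n: float) -> float:
    """Free-air resonance fs = 1 / (2*pi*sqrt(Mms * Cms))."""
    return 1.0 / (2 * math.pi * math.sqrt(moving_mass_kg * compliance_m_per_n))

MMS = 0.020                      # 20 g moving mass (assumed)
CMS_NEW = 0.80e-3                # 0.80 mm/N compliance when new (assumed)
CMS_BROKEN_IN = CMS_NEW * 1.15   # suspension ~15% more compliant after use (assumed)

print(f"fs new:        {resonant_freq_hz(MMS, CMS_NEW):.1f} Hz")        # ~39.8 Hz
print(f"fs broken-in:  {resonant_freq_hz(MMS, CMS_BROKEN_IN):.1f} Hz")  # ~37.1 Hz
```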

George

Link to comment
Well, Jamesroy, it was sort of a rhetorical question. You see, nobody can answer the question because nobody knows. There is no known reason for the phenomenon called "break-in" of electronic components or wires. Some audio engineers have used sophisticated measurement equipment to find what the difference is between new, out-of-the-box components and those that are thoroughly "broken-in". No difference has been found. This has led many to attribute the phenomenon to users getting used to the sound of their new components; i.e., it's the listener who is "breaking in", not the equipment!

Now, I'm of the (informed) opinion that it's another of those audiophile myths, like cable lifts designed to keep speaker cables up off the floor, Tice clocks, myrtle wood blocks placed on top of components to improve the sound, audio interconnects having a sound of their own, and, most ridiculous of all, the notion that audio cables (wires) need to "burn in".

The exception is transducers (speakers, headphones, phono cartridges, microphones, etc.); many of those DO go through a break-in period, but that's merely a mechanical process whereby suspensions, cones, and other moving parts get more supple over time. IOW, it's real, with actual physics behind it.

Best answer yet, so I'll stop asking. Thanks. Have a safe Christmas.

Link to comment
