
Lies about vinyl vs digital


Recommended Posts

3 minutes ago, esldude said:

With swiping you put a finger on the first letter and slide it to touch all the rest of the letters in the word.  Lightly lift and do the same for the next word.  It sounds much more awkward than it is.  With a little practice, just as whole words become a single coordinated action of your fingers when typing, a swiping pattern becomes one pattern for one finger.  Soon, without much thinking about the swiping itself, you can translate your thoughts via swiping much as you do with typing. 

And just how do you differentiate between, say, soon and son?

2 minutes ago, The_K-Man said:

 

I'm still thumbing, dude!  I guess I'm just not confident or happy typing on glass.  I've been doing it for 7 years now, and it still feels alien to me.

 

I want that IBM desktop keyboard or Smith Corona back  😀

Swiping works pretty well.  It is possible on glass, something you couldn't do with actual keys.  Again, there is also dictation. 

And always keep in mind: cognitive biases, like seeing optical illusions, are a sign of a normally functioning brain. We all have them; it's nothing to be ashamed of, but it is something that affects our objective evaluation of reality. 

7 minutes ago, esldude said:

Your concept of time is off.  You posted the above 3 minutes after I did.  Giving you a minute to see it and then start typing, you had to type the above in 2 minutes or so, not nearly 7 minutes.  

 

https://9to5mac.com/2018/05/10/how-to-use-swipe-or-glide-typing-on-iphone/

Look at the animation at the bottom of the page. 

 

With swiping you put a finger on the first letter and slide it to touch all the rest of the letters in the word.  Lightly lift and do the same for the next word.  It sounds much more awkward than it is.  With a little practice, just as whole words become a single coordinated action of your fingers when typing, a swiping pattern becomes one pattern for one finger.  Soon, without much thinking about the swiping itself, you can translate your thoughts via swiping much as you do with typing. 

 

I looked under Add Keyboard; all I see are keyboards in about 50 languages.  I guess 'GBoard' or whatever is something one needs to download?

 

 

1 minute ago, mansr said:

And just how do you differentiate between, say, soon and son?

Say soon and son.  I just did that by swiping. You learn to delay ever so slightly on a letter to double it up.
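Purely to illustrate the idea (a toy sketch of my own, not how Gboard or any real swipe keyboard is actually implemented; the word list and helper names are invented for the example), the disambiguation comes down to matching the traced letters against a dictionary, which is why "soon" needs that deliberate pause on the 'o' while "son" does not:

```python
# Toy sketch only -- not how Gboard or any real swipe keyboard works internally.
# The letters passed through in one continuous gesture are matched against a
# word list; a deliberate pause on a key is treated here as a doubled letter.
WORDS = ["son", "soon", "song", "sun"]   # stand-in dictionary for the example

def candidates(trace):
    """Return words whose letters appear, in order, within the traced letters."""
    def is_subsequence(word, trace):
        it = iter(trace)
        return all(ch in it for ch in word)   # 'in' consumes the iterator
    return [w for w in WORDS if is_subsequence(w, trace)]

print(candidates("son"))    # ['son']          -- one pass over 'o'
print(candidates("soon"))   # ['son', 'soon']  -- pausing doubles the 'o'
```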


1 minute ago, The_K-Man said:

 

I looked under Add Keyboard; all I see are keyboards in about 50 languages.  I guess 'GBoard' or whatever is something one needs to download?

 

 

Yes. It's actually from Google. It isn't the only one, either. I'm on Android, or I'd know off the top of my head. I swiped this too. 



No need for that; just learn to say 'period', 'comma', etc., and then know how to pronounce whatever you want to say, like 'soon' and 'son'. This was just done via dictation.


39 minutes ago, The_K-Man said:

 

Both were real good ol' fashioned keyboards, is the point.  Humans were not meant to type with just two thumbs.  

 

It took me five minutes to type the above, of which three were spent backing up and correcting shit.  Heaven help me if I had to type a TERM PAPER on this device - it would take me a whole month! lol

It was grade 9, in Fernie, BC, and either the province or the school board decided we should take a typing class. 

 

Take a bunch of 13- and 14-year-old boys, put them in front of a mechanical contraption, and make it a contest called “type words per minute”.  But wait!  There's more!  

 

We have to deduct mistakes from your WPM!  My dad was a mining engineer, and he’s looking at my report card.

 

“Typing?  What the hell is typing?”

 

I explain we’re learning how to type.  He says “Well, that’s great.  You’ll make a wonderful secretary some day.”

 

That was in 1971.  I got my first computer, a Wang laptop (luggable) with a PC emulator card in 1984, and a real computer, a Compaq 386SE with a math coprocessor in 1986.

 

To this day I’m a wicked touch typist on anything with Qwerty.  Who knew that typing would ever be useful? 

 


Many of you know that I consider the problem of DolbyA decoding to sometimes be one of the quality differences between vinyl and CD.  I just got an answer (after 3+ years and writing my own -- VERY GOOD -- DolbyA decoder) from someone who actually does do 'CD greatest hits' type releases.

First -- the problem with CD releases is real, and it happens a lot.

Next -- it is not a planned thing, and apparently it is not an intentional 'shortcut' per se. 

Next -- the problem depends on the record label and their record-keeping/library integrity.

 

Paraphrasing what was 'splained to me -- the paper records and tape labels are sometimes not in very good shape.  Some tapes are even missing calibration tones, and the various identifiers on the tapes are sometimes missing.  Since an EQed, DolbyA-encoded tape doesn't sound horrible, the wrong choice of decoding is very often made -- and (my own aside: decoding is an extra step that is easy to avoid).

 

So, from the horse's mouth -- the leaked DolbyA does happen *often*, but it is apparently unintentional.

 

Finally -- I found the right place (a secret place) to ask these questions.  I wasn't even told that I am an idiot or that there is something wrong with me because of my suspicions :-).

 

John

9 minutes ago, SJK said:

It was grade 9, in Fernie, BC, and either the province or the school board decided we should take a typing class. 

 

Take a bunch of 13- and 14-year-old boys, put them in front of a mechanical contraption, and make it a contest called “type words per minute”.  But wait!  There's more!  

 

We have to deduct mistakes from your WPM!  My dad was a mining engineer, and he’s looking at my report card.

 

“Typing?  What the hell is typing?”

 

I explain we're learning how to type.  He says “Well, that's great.  You'll make a wonderful secretary some day.”

 

That was in 1971.  I got my first computer, a Wang laptop (luggable) with a PC emulator card in 1984, and a real computer, a Compaq 386SE with a math coprocessor in 1986.

 

To this day I'm a wicked touch typist on anything with Qwerty.  Who knew that typing would ever be useful? 

 

I've a similar story.  My Mom said I was taking typing.  I had no choice.  And boy am I glad she did. 

 

Now her insistence that I learn to be a keypunch operator didn't turn out quite as well.  She said they always need keypunch operators.  Not a lot of money, but a good part-time job while in college.  She had me take it at a local community school before 11th grade.  I was the fastest in the class too: 13,000 strokes per hour, clean.  At that time a whole page of the local paper had such jobs.  By my second year in college those were pretty much not around anymore.  I did use it, as college computer classes still programmed mainframes that way.  I earned a few bucks putting the programs of others onto keypunch cards.  So her suggestion did earn me more than the class cost, I suppose. 


59 minutes ago, STC said:

 

Thanks for this. Isn't this “constant time delay”, where the phase is identical, also known as “pure delay”?  I've just been looking for the proper definition of pure delay. 

 

Thanks again. 

'Linear phase' is a commonly used term of art.  It isn't meant as obscure sophistry -- technically, when someone says 'linear phase', they most often mean something like 'constant delay vs. frequency'.  In principle, 'linear phase' could refer to something where the delay isn't constant vs. frequency but actually increases or decreases; 99.99% of the time, though, you'll find that 'linear phase' means the constant-delay-vs.-frequency thing.
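As a quick numerical illustration of the 'constant delay vs. frequency' meaning (a minimal sketch of my own, assuming NumPy and SciPy are available; it is not taken from any decoder code discussed here), a symmetric FIR filter is linear phase and its group delay comes out flat at (N - 1) / 2 samples:

```python
import numpy as np
from scipy import signal

N = 101                                    # odd length, symmetric taps -> linear phase
taps = signal.firwin(N, cutoff=0.25)       # simple low-pass FIR
w, gd = signal.group_delay((taps, [1.0]), w=1024)

# Every passband frequency is delayed by the same (N - 1) / 2 = 50 samples.
passband = w < 0.2 * np.pi
print("expected delay:", (N - 1) / 2)
print("measured delay across passband:", gd[passband].min(), "to", gd[passband].max())
```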

5 minutes ago, John Dyson said:

Many of you know that I consider the problem of DolbyA decoding to sometimes be one of the quality differences between vinyl and CD.  I just got an answer (after 3+ years and writing my own -- VERY GOOD -- DolbyA decoder) from someone who actually does do 'CD greatest hits' type releases.

First -- the problem with CD releases is real, and it happens a lot.

Next -- it is not a planned thing, and apparently it is not an intentional 'shortcut' per se. 

Next -- the problem depends on the record label and their record-keeping/library integrity.

 

Paraphrasing what was 'splained to me -- the paper records and tape labels are sometimes not in very good shape.  Some tapes are even missing calibration tones, and the various identifiers on the tapes are sometimes missing.  Since an EQed, DolbyA-encoded tape doesn't sound horrible, the wrong choice of decoding is very often made -- and (my own aside: decoding is an extra step that is easy to avoid).

 

So, from the horse's mouth -- the leaked DolbyA does happen *often*, but it is apparently unintentional.

 

Finally -- I found the right place (a secret place) to ask these questions.  I wasn't even told that I am an idiot or that there is something wrong with me because of my suspicions :-).

 

John

You mentioned there are obvious artifacts of Dolby A encoding with a 35 kHz marker?

 

Knowing that, and not having any reliable labelling, wouldn't it be a straightforward process to identify what has and has not been encoded?  Or does nobody care?

 

Perhaps you could modify your program so that if there is no Dolby A encoding it does nothing, or no more than a gentle levelling tweak.  Then, if there is, the Dolby A processing is “reversed”.  

 

Everyone will want to use it.  :)

 

 

5 hours ago, Paul R said:

Let's say, then, that I strongly believe all, or at least the vast majority of, the music on iTunes has a high-res original, be it 24/48, 24/96, or above. 

 

5 hours ago, Paul R said:

I also strongly believe, but cannot provide absolute proof, that all new music is recorded at higher than CD resolution, usually much higher, and so providing Apple with high-resolution originals is neither difficult nor any extra expense. 

 

Sure, I think the labels have lots of hi-res digital masters.

 

So wouldn't they just be able to provide Apple with these if Apple asked? The exact same applies to Spotify, who have been requesting that labels supply FLAC or WAVE for years...

 

So both Apple and Spotify could already have a huge lossless library. We (publicly) just have no idea on the number. Could be less than 1 million tracks, could be over 20 million tracks. No way to know.

 

But yes, on the labels' side, it's obviously safe to guess that number is higher than what Spotify or Apple have.

 

[Screenshot attachment: Screen Shot 2019-05-21 at 9.49.30 pm]

 

9 minutes ago, SJK said:

My understanding is that the music labels are extremely reluctant to release anything to anyone that is true high resolution, whether that’s considered to be 24/96 or likely 24/192 or better.

 

It's really hard to know how things may change if the Amazon rumour turns out to be true... if there is a knock-on effect on the rest of The Big Four (Spotify, Apple, Google, Amazon)... for example, would the labels have a change of mind if The Big Four told them they wanted access to hi-res and would charge the customer more?

 

In my opinion, any past thinking or theories go out the window if one of The Big Four makes the first move... we'll have to wait and see over the next 12 months whether this Amazon thing plays out or the rumour was just fake news.

 

30 minutes ago, John Dyson said:

'Linear phase' is a commonly used term of art.  It isn't meant as obscure sophistry -- technically, when someone says 'linear phase', they most often mean something like 'constant delay vs. frequency'.  In principle, 'linear phase' could refer to something where the delay isn't constant vs. frequency but actually increases or decreases; 99.99% of the time, though, you'll find that 'linear phase' means the constant-delay-vs.-frequency thing.

 

I apologise if I am hijacking the thread.  I am more concerned with a proper definition of “pure delay”, which IIRC is only possible in the digital domain, where the delay does not alter the response of the original signal, as opposed to delays caused by acoustic transmission over a medium. 
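A minimal sketch of that "pure delay" idea in the digital domain (my own illustration, using NumPy): shifting a signal by a whole number of samples reproduces it exactly, only later in time.

```python
import numpy as np

x = np.array([0.0, 1.0, 0.5, -0.25, 0.0])    # arbitrary test signal
delay = 3                                    # pure delay of 3 samples
y = np.concatenate([np.zeros(delay), x])     # delayed copy, nothing else changed

print(np.allclose(y[delay:], x))             # True: the waveform itself is unaltered
```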

 

I am supposed to remember this, but I guess age is catching up. 

 

Thanks. 

6 minutes ago, Em2016 said:

 

It's really hard to know how things may change if the Amazon rumour turns out to be true... if there is a knock-on effect on the rest of The Big Four (Spotify, Apple, Google, Amazon)... for example, would the labels have a change of mind if The Big Four told them they wanted access to hi-res and would charge the customer more?

 

 

Frankly, I don't think there's a market or a business model to support that.  Your average bear with a smartphone and earbuds has proven that to be true.

 

And the only thing they give a damn about is bandwidth, not quality.  SiriusXM has shown that they can sell “music, kinda sorta, through a soup can - just listen really hard” and people will buy it.

57 minutes ago, esldude said:

I've a similar story.  My Mom said I was taking typing.  I had no choice.  And boy am I glad she did. 

 

Now her insistence that I learn to be a keypunch operator didn't turn out quite as well.  She said they always need keypunch operators.  Not a lot of money, but a good part-time job while in college.  She had me take it at a local community school before 11th grade.  I was the fastest in the class too: 13,000 strokes per hour, clean.  At that time a whole page of the local paper had such jobs.  By my second year in college those were pretty much not around anymore.  I did use it, as college computer classes still programmed mainframes that way.  I earned a few bucks putting the programs of others onto keypunch cards.  So her suggestion did earn me more than the class cost, I suppose. 

When I was 13, typing lessons were obligatory at my school. Electric typewriters. It was rather dull, but it was only 45 minutes once a week, so not that bad either. In the years since, being able to touch type has no doubt saved me far more time than I spent learning it.

41 minutes ago, SJK said:

You mentioned there are obvious artifacts of Dolby A encoding with a 35 kHz marker?

 

Knowing that, and not having any reliable labelling, wouldn't it be a straightforward process to identify what has and has not been encoded?  Or does nobody care?

 

Perhaps you could modify your program so that if there is no Dolby A encoding it does nothing, or no more than a gentle levelling tweak.  Then, if there is, the Dolby A processing is “reversed”.  

 

Everyone will want to use it.  :)

 

 

DolbyA encoding has a few fairly negative effects (other than the intended encoding): mixed-in IMD, and also these 'splats' that extend from the audio band up to/just beyond 35 kHz.  When there is a transient in the audio, there is an impulse that produces distortion up to approximately the 35 kHz point.  These are all difficult-to-measure technical issues, and the 35 kHz splat issue is probably impossible to detect on CDs, with their 22 kHz frequency-response limit.

 

I think this is why DolbyA has leaked so much -- it is sometimes harder than h*ll to reliably determine whether something is or is not DolbyA encoded.  On some albums, maybe one or two songs have obvious defects, and the engineers who produce these releases don't always have time to listen very carefully to every song in the material.  If one actually knows the music, then it is *much* easier to detect DolbyA encoding, but when not really knowing the music -- especially pop music -- it is easy to mistake DolbyA effects for artistic intent.

 

Don't get me wrong, the DolbyA problems are 'not good', and might often be characterized as 'bad', but I know of no single, reasonable algorithm to determine whether a recording is DolbyA encoded.  This problem of 'detectability' being based upon opinion after listening has caused me unlimited trouble with skeptics.  The damage done by not decoding DolbyA material is qualitative, and non-decoded material is NOT as nice sounding as properly handled material, but it has been *good enough* for people to keep on purchasing it.

 

The most horrible, worst, frustrating, deadly, crazy-hard, nightmarish problem with DolbyA decoding is the calibration setting.  The DolbyA gain curve is very specific and requires a fairly accurate setting for each recording (usually on an album basis).  The calibration drives me totally crazy, wasting lots of time and causing me very, very many embarrassments.

 

It would be nice to be able to 1) accurately detect DolbyA encoding and 2) automatically determine the calibration level.

I do not know how to do either -- and I know DolbyA as well as anyone does.

 

The defects associated with incorrect calibration are very dependent on song dynamics, frequency balance, and probably the phase of the moon.  Some material can withstand a 0.5 dB error and have almost no audible effects.  Other material needs 0.1 dB accuracy or better...  The side effects can be gating, a kind of distorted sound (from the gain in each band not matching at the correct time/level), and many other problems.
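To put those tolerances in perspective (a back-of-the-envelope sketch of my own, not part of the decoder), converting the dB figures to linear gain shows how small a 0.1 dB mismatch between bands really is:

```python
def db_to_ratio(db):
    """Convert a level difference in dB to a linear amplitude ratio."""
    return 10 ** (db / 20.0)

for err in (0.5, 0.1):
    r = db_to_ratio(err)
    print(f"{err:.1f} dB calibration error -> gain ratio {r:.4f} "
          f"(about {100 * (r - 1):.1f}% level mismatch)")
```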

 

The decoder will always be a specialty product (or is the word 'hobby'?), and any hope for a market will be on the pro side, as either a convenience feature for CD production or a quality feature for historical archives.  Some very intense consumers might want it -- but I give it for free to those people, because I cannot accept an official responsibility for support (the free version has a timeout license -- I'm probably always willing to update it).  The support issues would be NIGHTMARISH.

 

My main thrust is NOT to 'push' my decoder (even though it IS wonderful) -- I have some 'The Cars' recordings, decoded by my decoder, that do very well WRT the 192k/24-bit professionally mastered premium download version, but -- being realistic -- WHO CARES?

 

My main thrust is to help solve the quality problem...  Really, my secret desire is to stir things up among the distributors & labels so that they produce a better product.

 

Thanks for being kind and open minded!!!  However, I gotta be realistic.

 

John

 

7 minutes ago, mansr said:

When I was 13, typing lessons were obligatory at my school. Electric typewriters. It was rather dull, but it was only 45 minutes once a week, so not that bad either. In the years since, being able to touch type has no doubt saved me far more time than I spent learning it.

Off topic (mostly), but you are lucky that your schooling required any kind of literacy. Some schools in the US don't even teach cursive writing anymore.  Scary stuff.

 

 

3 minutes ago, John Dyson said:

Off topic (mostly), but you are lucky that your schooling required any kind of literacy. Some schools in the US don't even teach cursive writing anymore.  Scary stuff.

 

 

You are quite correct: no cursive in many schools now.  That doesn't bother me as much as it once did.  I have written cursive so rarely in recent years that it seems odd to do it.  


6 minutes ago, John Dyson said:

DolbyA encoding has a few fairly negative effects (other than the intended encoding): mixed-in IMD, and also these 'splats' that extend from the audio band up to/just beyond 35 kHz.  When there is a transient in the audio, there is an impulse that produces distortion up to approximately the 35 kHz point.  These are all difficult-to-measure technical issues, and the 35 kHz splat issue is probably impossible to detect on CDs, with their 22 kHz frequency-response limit.

 

I think this is why DolbyA has leaked so much -- it is sometimes harder than h*ll to reliably determine whether something is or is not DolbyA encoded.  On some albums, maybe one or two songs have obvious defects, and the engineers who produce these releases don't always have time to listen very carefully to every song in the material.  If one actually knows the music, then it is *much* easier to detect DolbyA encoding, but when not really knowing the music -- especially pop music -- it is easy to mistake DolbyA effects for artistic intent.

 

Don't get me wrong, the DolbyA problems are 'not good', and might often be characterized as 'bad', but I know of no single, reasonable algorithm to determine whether a recording is DolbyA encoded.  This problem of 'detectability' being based upon opinion after listening has caused me unlimited trouble with skeptics.  The damage done by not decoding DolbyA material is qualitative, and non-decoded material is NOT as nice sounding as properly handled material, but it has been *good enough* for people to keep on purchasing it.

 

The most horrible, worst, frustrating, deadly, crazy-hard, nightmarish problem with DolbyA decoding is the calibration setting.  The DolbyA gain curve is very specific and requires a fairly accurate setting for each recording (usually on an album basis).  The calibration drives me totally crazy, wasting lots of time and causing me very, very many embarrassments.

 

It would be nice to be able to 1) accurately detect DolbyA encoding and 2) automatically determine the calibration level.

I do not know how to do either -- and I know DolbyA as well as anyone does.

 

The defects associated with incorrect calibration are very dependent on song dynamics, frequency balance, and probably the phase of the moon.  Some material can withstand a 0.5 dB error and have almost no audible effects.  Other material needs 0.1 dB accuracy or better...  The side effects can be gating, a kind of distorted sound (from the gain in each band not matching at the correct time/level), and many other problems.

 

The decoder will always be a specialty product (or is the word 'hobby'?), and any hope for a market will be on the pro side, as either a convenience feature for CD production or a quality feature for historical archives.  Some very intense consumers might want it -- but I give it for free to those people, because I cannot accept an official responsibility for support (the free version has a timeout license -- I'm probably always willing to update it).  The support issues would be NIGHTMARISH.

 

My main thrust is NOT to 'push' my decoder (even though it IS wonderful) -- I have some 'The Cars' recordings, decoded by my decoder, that do very well WRT the 192k/24-bit professionally mastered premium download version, but -- being realistic -- WHO CARES?

 

My main thrust is to help solve the quality problem...  Really, my secret desire is to stir things up among the distributors & labels so that they produce a better product.

 

Thanks for being kind and open minded!!!  However, I gotta be realistic.

 

John

 

John,

 

I pretend to keep up with a topic that you are obviously very familiar with and deeply engaged in.

 

If it’s any consolation, I and many others here find your posts to be deeply interesting.

 

Keep on fighting the good fight, we’re with you all the way.

On 5/20/2019 at 11:06 PM, Paul R said:

 

It's also that the available metadata for a WAV file can be specific to BWF and include stuff that other formats do not recognize. The BWF format also has stricter rules that have to be followed.

 

I think that unless you clearly hear an advantage to WAV files, they are kinda clumsy to use. JRMC, for example, stores a lot of metadata externally to WAV files. If you lose the JRMC database, rebuilding it can be a bit of a pain. Rebuilding with FLAC, AIFF, or other more common formats is usually much less troublesome. 

 

 

 

You can use ID3 tags for WAV (at least in Windows).  Never had a problem, although I prefer FLAC.
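For anyone who wants to do the same from a script rather than from Windows, here is a minimal sketch using the mutagen library (the filename and tag text are placeholders; this assumes a reasonably recent mutagen release with its WAV/ID3 support):

```python
from mutagen.wave import WAVE
from mutagen.id3 import TIT2, TPE1

audio = WAVE("example.wav")        # placeholder path
if audio.tags is None:
    audio.add_tags()               # create an empty ID3 tag block

audio.tags.add(TIT2(encoding=3, text="Track title"))   # title, UTF-8
audio.tags.add(TPE1(encoding=3, text="Artist name"))   # artist, UTF-8
audio.save()
```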

mQa is dead!

