
AudioQuest adds MQA Support to Dragonflies via firmware


Recommended Posts

I understand your reluctance to support AQ. They are mostly "snake oil" salesmen. However the DragonFly is a good product. I have a V.1.2 and it has been a superb performer. They were all designed by Gordon Rankin, a well known and respected digital audio designer.

In this instance I don't care if it's good or not. The only reason I'd get one is to tap the I2S lines going to the DAC chip.

Interesting discussion on Audiostream.com today.

 

Apparently, there are 1) MQA "renderers", which require the player to do the first "unfolding", whereupon the renderer will do the final processing and 2) "decoders", which will play MQA files without any additional software required.

 

The DragonFly will apparently be a "renderer", not a "decoder" - which means the only software it will work properly with for MQA playback at this time is Tidal.

 

Not something I understood at all until today - maybe others were already aware.

 

I guess it wasn't possible to squeeze the MQA decoder onto the puny microcontroller in those DACs.

So out of this discussion comes an obvious question for AQ: is "full decode" on the current-gen DF roadmap via a firmware update? The answer to this question could/should be the determining factor for a consumer purchase decision.

 

I'm looking at the decoder from the Bluesound firmware, and there is no way it can fit on the Dragonfly microcontroller.

What is being said by some is that the 88.2 or 96 kHz limit of the Dragonfly is the USB input, not the DAC chip itself inside the Dragonfly (after the input).

Apparently the Dragonfly works with MQA playback software that does the first unfold (say 24/48 to 24/96, or 24/44.1 to 24/88.2). This takes some load off the Dragonfly, which "only" has to apply the MQA deblurring that's been encoded into the Dragonfly firmware. The implication was made that the actual DAC inside the Dragonfly CAN play back 176.4 and 192, and that since the MQA system often has those higher rates folded into a file that appears to the Dragonfly's inputs as a 96 or 88.2 file, the Dragonfly could unfold them to 176.4 or 192.
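The rate arithmetic in that claim can be sketched directly. This is a toy illustration of the doubling described above, not AudioQuest's code; the function names are mine.

```python
# Each MQA "unfold" doubles the sample rate of the stream.
# First (software) unfold: the core decode done by the player.
def first_unfold_rate(base_rate_hz):
    return base_rate_hz * 2

assert first_unfold_rate(48000) == 96000    # 24/48  -> 24/96
assert first_unfold_rate(44100) == 88200    # 24/44.1 -> 24/88.2

# A hardware renderer could then take the core stream a further 2x up,
# which is the step the post speculates the Dragonfly might perform.
def render_rate(core_rate_hz):
    return core_rate_hz * 2

assert render_rate(96000) == 192000         # 96   -> 192
assert render_rate(88200) == 176400         # 88.2 -> 176.4
```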

 

If the microcontroller is too slow to simply pass 192 kHz straight through to the DAC chip, how could it possibly be fast enough to upsample, by any algorithm, anything to that rate?

 

The ESS DAC chip itself obviously handles higher rates. The trouble is getting the data to it.
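As a back-of-the-envelope check on the "getting the data to it" point: assuming the Dragonfly is a full-speed (12 Mbit/s) USB device, as the thread's discussion of its input limits suggests, the raw PCM payload rates work out as follows. This is my own arithmetic and ignores USB framing overhead.

```python
# Raw payload rate for 2-channel 24-bit PCM at various sample rates,
# compared against the 12 Mbit/s full-speed USB ceiling.
def pcm_bitrate(rate_hz, bits=24, channels=2):
    return rate_hz * bits * channels

assert pcm_bitrate(96000) == 4_608_000    # ~4.6 Mbit/s: comfortable
assert pcm_bitrate(192000) == 9_216_000   # ~9.2 Mbit/s: tight against
                                          # FS USB once overhead is added
```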

Not that I'm enamored of MQA's terms, but these conversations get really confusing. I propose that we begin to use the terms "MQA Core", "MQA Renderer", and "full decoding" instead of confusing terms like "first unfolding", "second unfolding", "de-blurring", "software decoding", etc.

 

I'd be delighted if Bob Stuart stopped inventing new names for things on a weekly basis.

The DAC's designer has gone on record that the 96 kHz limitation is due more to compatibility than hardware constraints. It's a USB DAC thing, to do with drivers etc.

 

The Dragonfly has enough power inside, and a capable enough internal DAC, to do MQA unfolding to at least 192kHz, given that it has been fed an MQA signal that has already been software decoded to the MQA Core.

It's an 80 MHz CPU with 32 KB RAM. You won't be doing a lot of processing with that.

Both the red and black are that spec?

 

The Explorer 2 I believe runs at 500 MHz with 16 cores. No wonder the Dragonfly cannot offer full decode. But I'm wondering if it can really offer any further unfolding at all. That's a pretty significant-looking performance difference.

 

The spec sheet for the Dragonfly says both have a PIC32MX microcontroller. It doesn't say exactly which model, but they top out at 80 MHz according to the manufacturer's (Microchip) datasheets. The XMOS indeed runs at several hundred MHz and has 8 cores to boot, along with a more advanced DSP instruction set. CPU-wise it should be more than enough to do the MQA process. RAM size will be the limiting factor there.
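For a sense of scale, the per-sample cycle budget implied by an 80 MHz part is easy to work out. This is my own rough arithmetic based on the clock figure above, not a measurement of the actual firmware.

```python
# Cycles available per audio sample (counting each channel's sample
# separately), before any USB handling or housekeeping is subtracted.
def cycles_per_sample(cpu_hz, sample_rate_hz, channels=2):
    return cpu_hz / (sample_rate_hz * channels)

# An 80 MHz PIC32MX handling 96 kHz stereo:
assert round(cycles_per_sample(80_000_000, 96_000)) == 417

# The same part trying to emit 192 kHz stereo:
assert round(cycles_per_sample(80_000_000, 192_000)) == 208
```

A few hundred cycles per sample is a very thin budget once a multi-tap FIR filter and USB servicing both have to fit inside it, which is the crux of the scepticism in this thread.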

  • 4 months later...
1 minute ago, miguelito said:

Rendering will have to be to a higher bitrate than 96 I would think. This is in principle possible as the DAC chip itself can go to 384 - AFAIK it is the controller that is the source of the limit. My understanding is that the rendering involves setting upsampling parameters (extracted from the unfolded MQA PCM stream) for the DAC to use.

Judging by the specs of the microcontroller in the DF, I estimate that it is too slow to do even the rendering part. Therefore I suspect all it does is set the upsampling filter coefficients in the ESS DAC chip according to the parameters indicated by the input stream.

11 minutes ago, abrxx said:

If the Dragonfly isn't doing rendering with its firmware update, what exactly IS rendering meant to be? I thought the Dragonfly was the only MQA renderer on the market. Are you saying that proper MQA unfolding needs to do more than just upsampling (like applying a FIR filter, as per the patents)?

Rendering is upsampling with specific filter coefficients. If the DAC chip allows programming arbitrary filters, as the ESS ones do, it can be made to perform the task. The microcontroller parses the metadata encoded in the incoming PCM stream and sets the DAC parameters accordingly. At least that's the only way I can see it working given the constraints.

28 minutes ago, abrxx said:

So can the Dragonfly do MQA rendering or not? :) Or are you saying it can approximate it, but perhaps other devices can do a better job?

MQA "rendering" is nothing more than upsampling using a specific (FIR) filter. DAC chips upsample their input to the operating rate of the sigma-delta modulator. A DAC chip with programmable filter coefficients, such as the ESS chip in the DF, can thus be used as an MQA renderer together with a piece of code (small enough to run on a microcontroller) that extracts the filter parameters from the incoming PCM data.
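A minimal sketch of what "upsampling using a specific FIR filter" looks like in practice: a toy 2x polyphase interpolator. The taps here are arbitrary placeholders, not MQA's actual coefficients, and this is illustrative Python rather than anything resembling the microcontroller or DAC-chip implementation.

```python
# 2x FIR upsampler: equivalent to zero-stuffing the input and filtering,
# implemented in polyphase form (even taps produce one output phase,
# odd taps the other).
def upsample_2x(samples, taps):
    phase0, phase1 = taps[0::2], taps[1::2]
    hist = [0.0] * max(len(phase0), len(phase1))
    out = []
    for x in samples:
        hist = [x] + hist[:-1]              # shift newest sample in
        out.append(sum(c * s for c, s in zip(phase0, hist)))
        out.append(sum(c * s for c, s in zip(phase1, hist)))
    return out

# Trivial check: a pass-through even phase and zero odd phase
# doubles the length, interleaving zeros.
assert upsample_2x([1.0, 2.0], [1.0, 0.0]) == [1.0, 0.0, 2.0, 0.0]
```

A DAC chip with programmable coefficients performs exactly this structure in hardware, which is why the surrounding posts argue the microcontroller only needs to supply the taps, not run the filter.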

2 hours ago, abrxx said:

Thanks for confirming, I was just unsure what you meant by saying that the micro-controller is too slow for even rendering. But since the DAC (in this case) can basically do the job of the MQA rendering, it's a moot point.

 

So where does the DAC specific/file specific management happen? This is also part of MQA rendering according to Bob. Is that also done just by selecting DAC filter parameters?

The decoded (but not rendered) PCM data encodes in the LSB, among other things, the original sample rate and the filter number to use in upsampling. Although the filter number is 5 bits wide, only a few of the 32 possible choices are (currently) defined. The renderer uses this number to look up the actual filter coefficients in a table. It is thus possible for filters to be tweaked for specific devices while adhering to the same general characteristics. I have only been able to obtain the exact coefficients from one device, so I don't know how or even if they might vary between DACs.
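The lookup described here can be sketched as follows. The post only establishes that the LSB data carries the original rate and a 5-bit filter number indexing a coefficient table with most of its 32 slots undefined; the bit layout, table contents, and names below are entirely hypothetical placeholders.

```python
# Hypothetical renderer-side filter lookup. Real coefficient values
# are not public; these taps are invented for illustration only.
HYPOTHETICAL_FILTER_TABLE = {
    0: [0.0625, 0.25, 0.375, 0.25, 0.0625],  # placeholder taps
    1: [0.5, 0.5],                           # placeholder taps
}

def select_filter(filter_number):
    # The field is 5 bits wide, so 0..31 are representable...
    if not 0 <= filter_number < 32:
        raise ValueError("filter number is a 5-bit field")
    # ...but only a few entries are (currently) defined.
    taps = HYPOTHETICAL_FILTER_TABLE.get(filter_number)
    if taps is None:
        raise KeyError("filter number not (currently) defined")
    return taps

assert select_filter(1) == [0.5, 0.5]
```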

18 hours ago, Wavelength said:

Plissken,

 

Really!!! If you don't know something then why are you confusing people. A song does not carry metadata over USB.

 

 

Sorry no we are not going to do that. There are many more applications coming out with MQA support for DragonFly.

 

mansr,

 

The same, really if you don't know don't say anything. Assumption on how something works is the biggest problem with computer audio. I get more of the stupidest questions asked on emails from stuff read here and on other forums.

 

 

Totally incorrect!

 

 

Also totally incorrect!!!

 

~~~~~~~

 

Look everyone, you can guesstimate all you want about what is or what isn't done here. OR!!! you could sit down and listen and judge for yourself.

 

No more misinformation from people who don't know what's going on here. I have 33 emails in my inbox about people asking questions from users and links to your posts.

 

Be sane, have fun!

Gordon

Wow, such hostility. Everything I've said about the MQA rendering process is true. I learned it by studying the actual code. I obviously don't know exactly how it is implemented on the Dragonfly, but here's what I do know:

  • The Dragonfly has a PIC32MX microcontroller based on a MIPS CPU. Anyone can open the case and see this.
  • MQA rendering in real time needs 100 MHz of CPU time on a more efficient ARM system.

Based on this, I find it unlikely that the DF microcontroller is actually performing the calculations. The PIC32MX simply doesn't have enough CPU power.

2 hours ago, Wavelength said:

MQA libraries in all the applications that support it have a database of known MQA devices. They match the interface to each device so that it is correct and aligns everything going to that particular DAC.

Correct and align? What is that supposed to mean?

2 hours ago, Wavelength said:

I would check with other companies on the pass through option. But I think you would want this off for any MQA capable DAC. Checking this will basically bypass the MQA content.

Passthrough in Tidal means it sends the MQA-encoded stream as-is to the DAC. This is what you want with a DAC that supports full decoding, such as the Explorer 2 or Mytek Brooklyn. For a DAC with partial (e.g. Dragonfly) or no MQA support, this option should be off in order to enable software decoding.
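The passthrough rule stated here reduces to a one-line decision. The category strings are illustrative labels for the device classes discussed in this thread, not Tidal's actual API or settings names.

```python
# Should Tidal's MQA passthrough be enabled for a given DAC?
# "full"     - full decoder (e.g. Explorer 2, Mytek Brooklyn): pass through.
# "renderer" - partial support (e.g. Dragonfly): let Tidal do the core decode.
# "none"     - no MQA support: let Tidal do the core decode.
def tidal_passthrough_setting(dac_mqa_support):
    return dac_mqa_support == "full"

assert tidal_passthrough_setting("full") is True
assert tidal_passthrough_setting("renderer") is False
assert tidal_passthrough_setting("none") is False
```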

6 minutes ago, Wavelength said:

mansr,

 

The reason I get frustrated with you is simple. You're making assumptions about products that you either don't have or don't know enough about. This in the end will mislead users and cause false claims.

 

For instance, your claim about a 100 MHz ARM processor is very vague. There are a boatload of different ARM processors and they vary vastly in performance and audio capabilities. I can think of M series, A series, 9 series processors, some with great capabilities and others without. Some have HS USB, others only FS USB. Some have capable I2S, others have really poor implementations. The poor ones require significantly more MIPS than a 100 MHz version would have and therefore would perform worse than any Microchip MX processor would.

 

Take, for instance, most XMOS single-core parts (what marketing calls 8-core, really 8 threads) that are the basis for a number of products in the industry. Six of those cores are used for USB & I2S, and almost 90% of the MIPS of those processors are used up, so they are not really good candidates for MQA.

 

In the DragonFly line with the Microchip MX32 processor we wanted to make a product that was really low power and would work with all platforms. Something both the XMOS and ARM processors cannot do. When we started to work with MQA, we talked to the engineers at Microchip and they sent us DSP algorithms written in assembler. The reason is that these processors have specific DSP functions which standard programming under C/C++ would not have access to. The engineers at MQA took that source code and optimized it for this implementation.

 

MIPS don't equal MIPS when you are talking about processors. You have to look at the entire system as a whole.

 

Heck, take an iMX7 ARM processor from NXP/Freescale and compare it to, say, an iMX6UL. The iMX6UL will beat the pants off the 7 because of the IP it has for audio. Just like the MX270 from Microchip will beat the pants off the MX795. You can't just make blanket statements about performance and suggest you know what's going on here.

 

That leads to misinformation and everybody that reads your posts will get confused.

 

Thanks,

Gordon

I measured the CPU cycles required to run the rendering code on a Cortex-A7 ARM device. It needed about 100 million cycles per second of audio. This CPU is more efficient per cycle than the MIPS M4K core in the PIC32. It also has bigger caches and better memory bandwidth. I read somewhere that the Dragonfly uses a PIC32MX270 which runs at 50 MHz. It would take one hell of an optimisation to run the upsampling algorithm on that and still have time for handling the usual tasks (USB communication etc). Assembly optimisation is something I have a great deal of experience with, for what it's worth.
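The mismatch in those numbers is stark even without benchmarking anything further: simple division using only the figures quoted in this post.

```python
# Figures from the post above: ~100 million Cortex-A7 cycles per second
# of rendered audio, versus a PIC32MX270 clocked at 50 MHz.
MEASURED_A7_CYCLES_PER_SEC_AUDIO = 100_000_000
PIC32_HZ = 50_000_000

# Cycle-for-cycle (and the A7 is the *more* efficient core per cycle),
# the rendering work alone demands twice the PIC32's entire clock budget.
load = MEASURED_A7_CYCLES_PER_SEC_AUDIO / PIC32_HZ
assert load == 2.0
```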

6 minutes ago, Wavelength said:

mansr,

 

So you are admitting you don't have a DragonFly, correct? Then why are you even commenting here?

Maybe I was thinking of getting one. After witnessing your behaviour here, I certainly won't be, that's for sure.

6 minutes ago, Wavelength said:

Actually the A7 has the same problems as the A5 does. With the MX DSP functions you can do a multiply and add which is a requirement for filtering in 1T state. On the A5/A7 processor that takes a ton more!

 

Thanks, but no thanks,

Gordon

Clearly you do not have even basic knowledge of ARM. Really, if you don't know, don't say anything. Better to remain silent and be suspected of ignorance than to speak and remove all doubt.

12 minutes ago, crenca said:

Ah, don't get too upset at him - he is hamstrung by the boxed in world of IP/MQA no doubt.  He can only answer with generalities, and failing that accusations.

It's the baseless accusations and general rudeness I find unbecoming.

12 minutes ago, crenca said:

Did you notice how he complained of your "generalities" which lead to "confusion" and then says that "/A7 processor that takes a ton more!"?  What exactly is a "ton" in relation to MIPS and these questions? :)

Well, he explicitly mentioned multiply-and-add instructions. ARM happens to be much better equipped than MIPS in that regard.

12 minutes ago, crenca said:

Hope I am wrong, and some substantial answers to the specifics of AudioQuest's particular MQA implementation will be forthcoming...

I won't be holding my breath.

