
8 minutes ago, ElviaCaprice said:

Dumb is all I can add to this suggestion.  It may be added to the mated device for those looking.

If Chord went this route, direct on the DAC, I'd dump them in a second.

Of course you would...you have too much time invested in your current solution, and don't want to pony up for something you insist you don't need.  Eventually all competitive DACs will have an Ethernet port to support the "next generation" of streamer children....we can discuss again in 5 years.

Then again, if Chord can remain a "champion" with their FPGA, they may never need to be competitive, but I just don't see that happening over time.

A Qutest with an Ethernet port may very well be the end-game solution that could win all the money!  Or better yet, with an optional fiber Ethernet module...I hope one day they will at least produce a more conventional-looking product.

Link to comment
3 hours ago, jcn3 said:

To accomplish what I think you mean, in addition to the Ethernet port they'd need to add a board with a CPU, memory, and the software to allow it to function as a renderer.

 

Basically, you'd end up with an Auralic Aries, an Aurender, or a Lumin device.  Of course it would be nice if those types of devices had as good a DAC as the Chord products.

 

Agree...a Chord network player....wishful thinking!

2 minutes ago, ElviaCaprice said:

 

Sorry, I don't see the advantage at all in going this route, regardless of my system.  Besides, Chord has already devised wireless capability into the mated companion devices for these DACs, from my understanding.  Not something I'm interested in.  See the Poly and 2Go.

 

Yes, I have seen the Poly...they are almost there...I believe they will get there eventually if they want to play to the future de facto standard of "network players".

If they could save some cost with a conservative case and build a single-box network player at a competitive price, they could be a winner with the next and future generations.

7 minutes ago, ElviaCaprice said:

 

What?  Sorry beerandmusic, but I say you're full of it and don't know what the hey you're talking about.

The fact that they even made the Qutest (without batteries) shows they are expanding beyond the mobile market...I see a "network player" some time in their future...maybe not soon, but eventually.  The entire industry will be moving in that direction...like it or not.

6 hours ago, barrows said:

There is no "subtle data integrity loss" which could result in a sonic change: you either have continuous music, indicating no data problem, or you have "tics" or dropouts, nothing in between.

 

This is very interesting to me.  I wasn't aware of that.  Are you suggesting that if I am playing native DSD256 and don't hear any "tics" or dropouts, I can assume that every bit of information was processed correctly?

 

I always thought that bits could be lost and we would miss some actual sonics, details, depth, or other "non-perfect" playback.  I didn't know that if bits were lost due to clocking, noise, or what have you, you would always hear a tic or something....I have never heard any "tics" or dropouts, so I guess that is a good thing.

 

Is what you are saying "known as fact" that all engineers agree with, or is that just your belief?

 

12 minutes ago, barrows said:

Yes, if you do not hear any little "tics" or dropouts, there is no lost data.  Almost all computer audio products in common use by computer audiophiles will not ever have any data loss.  The same cannot be said for, say, Bluetooth (most of us who have ever used Bluetooth music transfer have heard "tics" and dropouts, right?).  The same (no data loss) is true for Ethernet and USB.

 

Sound quality differences are not about data loss or data integrity problems; all sound quality differences are related to noise issues affecting digital audio clocks, DAC chips, and the analog sections of DACs.

 

damn that noise (grin)!

 

kind of funny when you think about it...noise ruins music.

 

I always thought before that the noise could affect the accuracy of the data before it reached the flip-flops...I figured the data would be accurate after the flip-flops, but was concerned that the data could be corrupted somewhere between the receiver and the actual DAC circuitry.

 

I knew the circuitry up to the receiver had to have been perfected long ago, otherwise we would have some serious banking problems.  But I was not sure what happened between the receiver and the time the data reached the D-A circuitry.  I would "hope" that any DAC engineer could ensure the data integrity after the receiver, but nothing would surprise me.  My logic tells me that we should be able to have perfection all the way to the output of the DAC chip itself, and that only the output section would really make a difference in SQ.  But I am a 1's and 0's kind of guy, and don't know the first thing about analog.

 

 

2 minutes ago, barrows said:

Any data problem (dropped samples) will result in an audible "tic" or dropout.  There is no subtle degradation of sonics caused by data integrity problems, as there is no error correction: any missing sample will simply be missing, and result in a "tic" or dropout.

The very earliest USB audio interfaces had these problems, and dropouts were common, but these problems were solved long ago.
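The claim that a missing sample produces an abrupt discontinuity, not a subtle loss of detail, can be sketched with a toy numpy example (my own illustration with a made-up 440 Hz test tone; no real DAC or transport is modeled here):

```python
import numpy as np

fs = 44100                                  # sample rate, Hz
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)          # one second of a 440 Hz sine

# Simulate a data-integrity failure: one sample near a waveform peak is
# lost (zeroed).  There is no error correction on the audio stream, so
# the gap stays in the data.
damaged = tone.copy()
damaged[25] = 0.0

# The largest sample-to-sample step is the discontinuity a DAC would
# reproduce: small for the intact tone, a near full-scale jump (an
# audible "tic") for the damaged one.
step_clean = np.max(np.abs(np.diff(tone)))
step_damaged = np.max(np.abs(np.diff(damaged)))
print(f"max step, clean:   {step_clean:.4f}")    # ~0.06
print(f"max step, damaged: {step_damaged:.4f}")  # ~1.0
```

Either a sample arrives bit-perfect or it does not; there is no mechanism in this picture by which a lost bit shaves off a little "depth" or "air" instead of producing a click.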

 

That said, above you stated that the noise causes issues with the clocking....

So, hypothetically speaking, if there was NO noise, then there would be no issues with clocking?

The only reason we need better clocks is to deal better with noise?  There aren't any issues with cheaper clocks if there isn't any noise?  And the power problems are all related to noise as well?  A cheap power supply would be fine if we had no noise?  The only reason for an LPS over a cheap power supply is that it is less susceptible to voltage variances due to noise?

Are all of these true statements?

 

If so, you would think that more effort would be put into eliminating noise?  And noise really isn't anything more than variance in voltage.  I would think we should be able to produce "perfect voltage" by now....

 

I guess I should stick to 1's and 0's.


^^^ The more I think about this, the more it drives me crazy to think we haven't achieved "sound perfection" up to the output stage....

 

I know we can get perfect data up to the DAC receiver, otherwise our banking system would fail.

Timing I can see as a potential issue, but it should be easily overcome with any buffer.
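The buffering idea can be sketched as a toy FIFO simulation (my own sketch; the burst pattern and the 512-sample pre-fill are made-up numbers): the read side consumes exactly one sample per clock tick and never sees the source's arrival jitter, provided the buffer never underruns.

```python
import random
from collections import deque

random.seed(0)

# FIFO between a jittery source and a steady read clock.  Pre-fill the
# buffer before playback starts, as asynchronous interfaces do.  (A real
# implementation would also bound the buffer depth to avoid overruns.)
fifo = deque([0.0] * 512)
underruns = 0

for tick in range(10_000):
    # Bursty delivery: usually 1 sample per tick, sometimes 0, sometimes
    # 3 -- averaging exactly one, so the source keeps up overall, just
    # not on any particular tick.
    arrivals = random.choice([0, 0, 1, 1, 1, 1, 3, 1])
    fifo.extend([0.0] * arrivals)

    if fifo:
        fifo.popleft()      # the DAC side reads one sample per tick
    else:
        underruns += 1      # this is where an audible dropout would occur

print(f"underruns: {underruns}")
```

With enough depth, arrival jitter is invisible downstream; output timing then depends only on the local read clock, which is why the discussion tends to shift from the transport to that clock's quality.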

 

We should be able to accomplish perfection with a USB interface, regardless of any noise on the line; again, otherwise the banking system would have failed.

 

With a correct buffer, and the PC USB ground isolated, the only real PS we need to be concerned with is the PS used for the DAC.

 

We should be able to have a near-perfect USB DAC up to the DAC's output stage with just a simple, good-quality PS.

 

hmmmm.....

 

 

 


^^^^ Ok, let's leave the ANALOG out of the equation for a bit longer.

 

Assuming a good DAC PS, a good buffer, and an isolated ground from the PC, the signal should be perfect and the same for all DACs up to the DAC chip....so all DACs that use the same DAC chip could potentially sound the same, up to the DAC chip.

 

Besides the analog output stage, the difference could/should be based only on the DAC chip or FPGA (which, when you think about it, is "house sound").  I am content in not having "house sound", even though the FPGA algorithm may sound "subjectively better" than an off-the-shelf DAC chip.  The ESS9038 is such high resolution that I would probably be more comfortable with it (for myself anyway, even though I like the house sound of the McIntosh amps)....and I am not willing to pay for some designer's concept and applied algorithm of what sounds good...plus an off-the-shelf ESS9038 should be notably cheaper....

 

That leaves just the analog output stage....

10 minutes ago, barrows said:

No.  The better the masterclock (specifically, the lower the phase noise of the clock) at the DAC chip, the more accurate the conversion.  For any given masterclock, any noise present will degrade its performance; so will vibration.

So best performance will be realized with the best clock, powered by the best power supply, with zero noise present, no vibration, and perfect coupling to the DAC chip's clock input pin.  Everything matters, and of course there is no such thing as perfection.  All clocks have some phase noise, all power supplies have some noise, every circuit layout has some parasitic problems, and some vibration is always present.  Best performance is realized by reducing all of these factors.
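To put rough numbers on "the lower the phase noise, the more accurate the conversion": a standard back-of-envelope result says clock jitter alone caps the SNR of sampling a full-scale sine at roughly -20*log10(2*pi*f*t_j).  A quick sketch (illustrative figures only, not measurements of any Chord product):

```python
import math

def jitter_limited_snr_db(f_hz: float, t_jitter_s: float) -> float:
    """Upper bound on SNR set by rms clock jitter alone, for a
    full-scale sine at f_hz.  Textbook approximation: the amplitude
    error is the signal's slew rate times the timing error."""
    return -20 * math.log10(2 * math.pi * f_hz * t_jitter_s)

# A 20 kHz tone (worst case in the audio band) at various jitter levels:
for t_j in (1e-9, 100e-12, 10e-12, 1e-12):
    snr = jitter_limited_snr_db(20_000, t_j)
    print(f"{t_j * 1e12:6.0f} ps rms jitter -> {snr:5.1f} dB max SNR")
```

So for jitter not to limit a 120 dB-class converter, the effective jitter at the conversion point has to be down in the low picoseconds, which is why clock quality, supply noise, and layout all end up mattering.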

 

Ok, I don't disagree with any of what you are saying....I am just living in a hypothetical world at the moment, coupled with factual 1's and 0's....again assuming NO NOISE.

19 minutes ago, barrows said:

The point is, reducing noise is exactly what high-end audio is about; it is not easy or affordable to do so.  Same thing with clocks: even a $1500 NDK DuCoLon ovenized clock still has significant phase noise.  Everything matters; the more attention that is paid to these details, the better performance gets, but it costs more to do so.

Besides these factors, you have the design of the digital filter(s), the DAC conversion itself, which is never perfect, and the analog output section, which is also far from perfect.

 

Again, I agree here...but at some point there are "diminishing returns", and what is "better" becomes subjective....

3 hours ago, barrows said:

I never suggested the isolation is 100% perfect, and I have mentioned some capacitive coupling is likely.  That is no reason not to use an isolated USB interface, though.

 

Nothing like "Intona".  An isolated USB receiver inside a DAC isolates the USB receiver circuitry and the incoming ground of the source from the masterclock(s) and the rest of the DAC circuitry.  It also allows one to re-clock the data lines directly from the masterclock, right before they go into the DAC chip, for the lowest possible jitter at the DAC chip (where it matters).

 

I am seriously considering the Marantz ND8006 (mainly for its Alexa capability), and it does suggest it has "some USB isolation circuitry":

 

Optical and coaxial digital inputs as well as a USB-B port enable you to stream music directly from a PC or Mac, or connect other digital sources.  The USB-B port works in asynchronous mode to support not only 384kHz/32-bit high-resolution audio but also the DSD 2.8MHz, 5.6MHz and even 11.2MHz formats for maximized performance and the most direct way to enjoy excellent quality.  To safeguard quality when connected to a computer, Marantz built extended isolation around the USB-B input to eliminate the chance of high-frequency noise generated by the computer entering the ND8006.

  • 3 months later...
3 hours ago, flummoxe said:

The below is from the What Hi-Fi site and is tantalising!!

 

It may have only just launched its Qutest DAC, but Chord plans to go big this Munich by unveiling a number of new products - with each one covering a distinct area of the brand’s portfolio, and one of which is (according to Chord) set to be a landmark product in the company's history. The curtains will be opened on the first day of the show.

 

I hope it has a network port, SPDIF, USB, an SD card slot, and comes in under $2K...if not, it will just be another yawner.

16 minutes ago, OldBigEars said:

 

Yep, I noticed that.  But if you watch the Hans Beekhuyzen review of the Hugo2, he found it's not quite as good as the Mytek Brooklyn Plus, and many would say that the Liberty's SQ is every bit as good as the Brooklyn Plus.

 

It seems we've got opposing opinions from pro reviewers...

 

+1

Too much marketing bias out there....they need DBT (double-blind testing) to determine if ANYTHING is truly better...I don't trust any professional reviewers....and consumers follow their advice...this industry needs a better way to weed through all the competition so we can make our own shortlists.

On 4/28/2018 at 12:36 PM, jos said:

That's true, but I prefer Darko's opinion (younger ears, more authority).  It's also a question of taste, but in my opinion the Qutest is the better DAC for the money.  Try to listen to both.  I do have the Qutest (with ultraRendu) and it was a major step forward.

 

I would think Darko would be more likely to be biased by vendor relations than Hans....but I do trust Darko more than most magazine reviewers.

 

  • 2 months later...
39 minutes ago, pl_svn said:

I have been on that thread since the beginning and never read what you are stating

I even asked already

 

oh, and...it looks like you lazily missed this other thread where he answers directly: https://www.head-fi.org/threads/watts-up.800264/

Can you summarize what you are saying?

 

I was kind of curious myself when I read that Chord can play native DSD yet converts it to PCM.

How can it play native DSD512 when PCM can't even go that high?  Doesn't sound very native to me??

Update...I just read the past messages, but I'm still confused....I thought native DSD means that it plays without conversion, at the same rate as the original file...e.g. a native DSD256 file in NOS mode on a DSD DAC will play as DSD256, without conversion.

Link to comment
1 hour ago, ecwl said:

There are two concepts of "native DSD".  One is that your DAC actually plays back the DSD signal as recorded.  That's what most audiophiles think of, and so manufacturers always like to claim that's what's happening whenever possible (although Chord never did).  The other has to do with how the DSD signal gets digitally, via USB, to a USB DAC.  There are two ways to do that: "native DSD", which truly sends the original DSD signal directly to the DAC (but a lot of computer/DAC USB interfaces can't handle it), or "DSD over PCM", which repackages the DSD signal without decimation into a PCM format that the DAC can convert back to straight DSD.  Why even make the distinction if either way the DAC is getting the original DSD signal?  Because converting the native DSD signal into a re-packaged DSD-over-PCM signal actually requires a decent amount of CPU power, so if your streamer can't keep up, you may hear dropouts while playing DSD.  Newer Chord DACs support both native DSD and DSD-over-PCM playback.
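The "DSD over PCM" repackaging described above can be sketched in a few lines.  This follows the published DoP v1.0 convention (16 raw DSD bits per 24-bit PCM frame, topped with an alternating 0x05/0xFA marker byte); the data is dummy, and real implementations also handle stereo channels and buffer alignment:

```python
MARKERS = (0x05, 0xFA)  # DoP v1.0 alternating marker bytes

def pack_dop(dsd_bytes: bytes) -> list[int]:
    """Pack a mono DSD bitstream into 24-bit DoP 'PCM' frames:
    marker byte in the top 8 bits, 16 DSD bits below it."""
    frames = []
    for i in range(0, len(dsd_bytes) - 1, 2):
        marker = MARKERS[(i // 2) % 2]
        frames.append((marker << 16) | (dsd_bytes[i] << 8) | dsd_bytes[i + 1])
    return frames

def unpack_dop(frames: list[int]) -> bytes:
    """Strip the markers and recover the original DSD bytes."""
    out = bytearray()
    for frame in frames:
        out += bytes([(frame >> 8) & 0xFF, frame & 0xFF])
    return bytes(out)

dsd = bytes(range(16))                 # 128 dummy DSD bits
frames = pack_dop(dsd)
assert unpack_dop(frames) == dsd       # round trip is bit-perfect
print(f"{len(dsd)} DSD bytes -> {len(frames)} frames; no decimation, no loss")
```

The alternating marker is how a DoP-aware DAC tells these frames apart from real PCM, and the bit-perfect round trip is why "DSD over PCM" is framing, not a lossy conversion.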

 

As for DAC design philosophies, I would say there are two "extremes".  One is that whatever the original recorded signal was, that should be the playback.  So if you recorded in PCM, you need an NOS R2R DAC to play it back, and if you recorded in DSD, you need a DSD DAC to play it back.  The other philosophy would be Chord/Rob Watts, where the recorded signal is just a sampling of the original analog waveform.  The only way to reconstruct the original analog waveform digitally is to take the sampled digital signal, do a lot of math on it (large tap lengths), and then reproduce the analog waveform via very high frequency (104MHz) modulation with multiple elements.  This theoretically gets the most accurate reproduction of the analog waveform in the time domain and the frequency domain.  We can go very deep into this discussion by reading what Rob Watts has said, scattered throughout the many forums at Head-Fi.
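The reconstruction idea behind that second philosophy can be illustrated with textbook Whittaker-Shannon (sinc) interpolation.  This is my own brute-force sketch of the general principle, not Chord's WTA filter; real designs use windowed, finite-tap approximations of this ideal:

```python
import numpy as np

fs = 1000.0                                   # sample rate, Hz
n = np.arange(64)
samples = np.sin(2 * np.pi * 50 * n / fs)     # a 50 Hz sine, well below fs/2

def sinc_reconstruct(t: float, samples: np.ndarray, fs: float) -> float:
    """Evaluate the band-limited waveform implied by `samples` at an
    arbitrary time t (seconds), by summing one sinc per sample --
    the ideal infinite-tap interpolation filter."""
    k = np.arange(len(samples))
    return float(np.sum(samples * np.sinc(fs * t - k)))

# Reconstruct the waveform halfway BETWEEN two sample instants and
# compare with the true underlying sine value there.
t = 20.5 / fs
approx = sinc_reconstruct(t, samples, fs)
exact = float(np.sin(2 * np.pi * 50 * t))
print(f"reconstructed: {approx:.4f}   true sine: {exact:.4f}")
```

With only 64 samples the truncated sinc tails leave a small residual error, which is the intuition behind "more taps gets you closer to the original waveform".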

 

I think instead of insisting one philosophy is better in a simple sentence (like "any DAC that can't play back the original DSD signal exactly as digitally recorded is inferior for DSD playback"), my take is that people pay for their DACs with their own money, so if they like the sound of their DAC, then that's the right DAC for them.

 

And in defense of some on the forum, having read every single thread Rob Watts has written in the past 3 years, I acknowledge it is difficult to expect others to read everything he wrote, as it would probably take weeks to do and months to mentally process.  People come to forums hoping for quick answers, not to take a university-level course and do homework.  I think cordial discourse is what should be expected; I'm sure we all get enough crap from colleagues at work that we can at least be polite to each other on the forum for our favorite hobby.

 

Thanks for your knowledge and for clarifying.  Very interesting, and I will re-read it for better understanding when I have more time....(currently moving furniture around, reorganizing our house)....anyway, even more interesting to me is your tagline....I noted you have very high-end equipment and also use the ultraRendu....I will have to give it more consideration (smile)....do you use one of those DIY cables for powering the Rendu that everyone is talking about here lately?

 

Also curious why you went with bookshelves and subs rather than full-range towers, given your high budget?  (I wouldn't guess many would buy a DAVE and not have full-range towers unless space was an issue?)

  • 1 month later...
30 minutes ago, Miska said:

 

Ground-plane noise at the USB interface side or the DAC side?  Since the two ground planes are not connected at all.

 

I'd rather stick to something objective, where you can measure the difference at the DAC output.

 

So far, with the USB isolators/regens I've measured on non-isolated devices, about 50% seem to get better and about 50% get worse.

 

When you say non-isolated device, do you mean DACs that don't have galvanic isolation?

 

What effect would a USB toy have with a DAC that did have galvanic isolation?

39 minutes ago, barrows said:

But the ground plane internally in the DAC (in the case of those with isolated USB inputs) is not shared from the USB receiver to the "sensitive inner circuits"; indeed, this is one of the reasons to use an isolated USB interface.

 

Instead, I would suggest that even in the case of isolated USB inputs the isolation is never "perfect", that there is some coupling of noise across the isolation barrier, and that there is still some advantage (although less than with a non-isolated USB interface) to providing a cleaner USB signal.

 

Even using a USB toy, there is still coupling and the isolation is still not perfect, correct?  Even if you daisy-chained 10 USB toys together?  I think you need to buy a DAC from a manufacturer you trust to have designed the USB isolation so noise doesn't interfere with their own design....

4 hours ago, barrows said:

This is an area where I find a belt-and-suspenders approach works better.  While I like the idea of the DAC having complete immunity to source quality, I have never found that to be the case in real life.

I use (mostly) DACs with isolated USB inputs (isolated and re-clocked just before conversion), but they still sound better with a quality USB source component.  That said, I prefer not to use add-on USB devices myself, just a high-quality USB source I know has a clean USB feed.

 

Won't all Ethernet inputs to DACs be re-clocked?

2 hours ago, Miska said:

 

Yes, DACs and ADCs without isolation between USB and the converter.

 

 

If the isolation is correctly done and works as intended, nothing... But sometimes even in such cases USB toys cause problems, like data corruption, devices falling off the bus, etc.

 

That said....it does sound like I would prefer to have a modern DAC that isolates USB noise rather than a USB toy or fancy cables.  It makes sense that this should be in the DAC design.  All newer DACs should state whether they have USB isolation or not, and that should be the most important checkpoint for any DAC on anyone's shortlist.

 

So the question is.....do Chord's more recent DACs (including the Qutest) have USB isolation?

 

There, I will bring it back on topic (wink).

