Thread: HDMI? Digital all the way? Convenient, yes, but let's understand how this all works... To those dreamers who still think digital is silent...

Posts: 31
Page: prev 1 2 3 4 next

Post by The Seventh Taylor November 4, 2008 (11 of 31)
Correct. Sir George and Giles Martin began by converting all the multi-track master tapes to 24-bit/96 kHz PCM on hard disk and have been mixing from there.

Post by raffells November 5, 2008 (12 of 31)
rammiepie said:

LOVE was indeed a remaster. Where else can one hear any 5.1 Beatles, unless you count the various DTS and Dolby Digital DVDs? It was full-blown 24/96, and on the Meridian 860/861 combo it sounds better than any previous vinyl incarnation. I was NOT comparing the old quad vinyl remasters (atrocious as they are/were) with present technology. The one-cable solution is indeed a godsend, but in America I don't recall any HDMI comparisons (Widescreen Review had an insert listing the various manufacturers and their own over-hype, but NO comparisons). I would appreciate it if you could enlighten SA-CD.net readers as to the outcome of the British magazines' findings regarding which HDMI v.1.3a cable came out victorious, regardless of price. Thank you.

Sorry, the review I was referring to covered different HDMI cables.
Yep, HDMI is a neat solution. (Can't believe magazines publish over-hype, LOL.)
One thing Julien didn't comment on was the fact that you might sync-reclock both ends of the chain with superclocks. Expensive, and adding that extra wire may cause its own syncing problems. Not precisely sure of this area (pun intended).
Personally, I believe the noise-reduction techniques that are available nowadays, but not always used, are more than adequate.
I suppose what has happened to the original Beatles recordings (tapes), with the total changes, added orchestration and god knows what, could be regarded as remastering. Surely the most extreme category. I'm gonna play it again now.
It is definitely one for surround listening rather than stereo. The downmix into stereo sounds odd at times. Still doesn't sound anything like the Beatles in the Cavern. BTW, I think the request in the "added height" thread would be highly inappropriate for that. Quite the opposite.
The replacement Cavern does sound better than the original though.
Doesn't smell as bad either...
Mmmmmm, how about added realism to the musical experience?
I can smell that Disney Hall in LA quite clearly! Jaysus, did that guitarist ever have a bath? Wow. I'm sure someone would then want the smells segregated front, back and sides soon after it was introduced. LOL

Post by pelley November 5, 2008 (13 of 31)
Julien said:

I don't think the "HDMI digital all the way, no loss of signal" guys really understand how it works. By the way, there can be big differences in signal quality between HDMI cables too. Anyone who has made one knows how digital signals deteriorate in real time as they pass through a cable.

HDMI sends ones and zeroes across the wire. It's just data. Unless the cable or the components are broken, the data arrives perfectly intact at the other end-- the exact same ones and zeroes that were sent. It's no different than your computer reading ones and zeroes from your hard drive across a USB or SATA cable, or reading digital photos from your camera, etc. If these digital cables were really so fragile, we'd be getting constant errors in our computer data and lots of our digital photos would be randomly garbled. We all know this not to be the case. With today's technology, even cheap cables can reliably send ones and zeroes with perfect accuracy from point A to point B.
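Chris's hard-drive analogy is easy to try for yourself. The following Python sketch (my own illustration, not anything HDMI-specific) round-trips a megabyte of random data through the filesystem, i.e. across the disk controller, cable, and drive electronics, and verifies that what comes back is bit-identical by comparing cryptographic hashes:

```python
import hashlib
import os
import tempfile

# 1 MiB of random data standing in for "audio" bits.
payload = os.urandom(1 << 20)

# Round-trip it through the storage chain: controller, cable, drive.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    path = f.name
with open(path, "rb") as f:
    received = f.read()
os.unlink(path)

# If even a single bit had flipped in transit, the digests would differ.
sent_digest = hashlib.sha256(payload).hexdigest()
recv_digest = hashlib.sha256(received).hexdigest()
print("bit-perfect:", sent_digest == recv_digest)  # bit-perfect: True
```

Run this as many times as you like; on working hardware the digests always match, which is the sense in which digital transfer is "just data."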

Now your point about whether it's better to perform the D/A conversion in the source or target device is interesting. For a moment, let's forget about the convenience of one cable vs. six. And for the sake of example, let's also assume that both the SACD player and receiver have virtually the same or even identical D/A implementations. So now the theoretical comparison comes down to whether more noise is introduced along analog interconnect cables, or inside the receiver itself. We are all aware of the variables that can affect analog signal loss with interconnects. Doesn't it seem likely that a well-designed piece of circuitry inside the receiver could be cleaner than a mess of snaking cables between boxes? Couldn't the noisy digital circuitry be isolated from the rest somehow? I admit, my knowledge of analog circuits is limited here.

Chris

Post by amatala November 5, 2008 (14 of 31)
pelley said:

HDMI sends ones and zeroes across the wire. It's just data. Unless the cable or the components are broken, the data arrives perfectly intact at the other end-- the exact same ones and zeroes that were sent. It's no different than your computer reading ones and zeroes from your hard drive across a USB or SATA cable, or reading digital photos from your camera, etc. If these digital cables were really so fragile, we'd be getting constant errors in our computer data and lots of our digital photos would be randomly garbled. We all know this not to be the case. With today's technology, even cheap cables can reliably send ones and zeroes with perfect accuracy from point A to point B.

Unfortunately, this is not how it happens in the real world, because of jitter, which can make digital cables sound very, very different - and cheap digital cables sound very bad...

Post by Tobias November 6, 2008 (15 of 31)
amatala said:

Unfortunately, this is not how it happens in the real world, because of jitter, which can make digital cables sound very, very different - and cheap digital cables sound very bad...

Jitter only matters at the A/D and the D/A stage. Jitter introduced during the transport of digital data can be trivially removed by using a buffer at the receiver. The only drawback of this is increased latency.

Furthermore jitter cannot be generated by a passive component like a cable. A passive component just passes on what it receives on one end to the other. It cannot accelerate one bit and delay another.
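Tobias's buffering argument can be sketched numerically: samples arrive with jittered timestamps, go into a FIFO, and are clocked out again by the receiver's own stable oscillator. A toy Python simulation (all numbers illustrative, not taken from any real receiver):

```python
import random
from collections import deque

FS = 48_000                  # sample rate, Hz
PERIOD = 1.0 / FS            # ideal sample spacing, s
JITTER = 200e-9              # +/-200 ns of random transport jitter

# Samples arrive at the receiver with jittered timestamps...
arrivals = [n * PERIOD + random.uniform(-JITTER, JITTER) for n in range(1000)]

# ...and are pushed into a FIFO buffer as they come in.
fifo = deque(arrivals)

# The receiver clocks samples out on its own stable oscillator,
# after a fixed latency that absorbs the worst-case arrival jitter.
latency = 10 * PERIOD
outputs = []
n = 0
while fifo:
    fifo.popleft()                    # take the next buffered sample
    outputs.append(latency + n * PERIOD)  # emit on the local clock
    n += 1

# Output spacing is exactly one sample period: transport jitter is
# removed entirely; the only cost is the added latency.
gaps = [b - a for a, b in zip(outputs, outputs[1:])]
print(max(gaps) - min(gaps) < 1e-12)  # True
```

The output timing depends only on the local clock, which is exactly why jitter at the A/D and D/A conversion clocks matters while jitter accumulated in transport need not.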

Post by raffells November 6, 2008 (16 of 31)
Tobias said:

Jitter only matters at the A/D and the D/A stage. Jitter introduced during the transport of digital data can be trivially removed by using a buffer at the receiver. The only drawback of this is increased latency.

Furthermore jitter cannot be generated by a passive component like a cable. A passive component just passes on what it receives on one end to the other. It cannot accelerate one bit and delay another.

Ask yourself the question: why do they shield digital cables?
The whole environment is bombarded by electronic noise, and the D-to-A conversion is not immune to this.
I'm not convinced, as the original post states, that it is totally understood yet, but the effect of noise within digital circuitry, especially on top of poor jitter handling in conversion, does make a difference.
Adding the buffer, which means even more electronic switching, could be considered adding to the latency problem.
Personal experience of using opamps in the vicinity of digital circuitry also tells me to keep them far enough away, and definitely not on the same power-supply rails.
PS It gets even more complicated when there are various different types of jitter.
http://en.wikipedia.org/wiki/Jitter
http://www2.electronicproducts.com/Understanding_clock_jitter-article-tech-update-jul2007-html.aspx

Post by tailspn November 6, 2008 (17 of 31)

Very worthwhile article.

Thanks!
Tom

Post by raffells November 6, 2008 (18 of 31)
tailspn said:

Very worthwhile article.

Thanks!
Tom

Hi Tom,
It's interesting to note what you see when you look at digital sonics and approach it backwards, after first learning the basics in analogue.
Then the huge numerical differences involved in accuracy, i.e. for D-to-A and A-to-D, plus the jitter problems and error correction, seem in summation almost frightening.
I remember the answer to a question about this accuracy being something like,
"Well, all or most of the digits are there somewhere at some point in time."
Seemed funny at the time but highly relevant later on.
We had got past discussing why the error light was almost permanently on, especially when we realized a hairdryer was being used in the same building.
If someone has that 20-year-old machine with the once-mandatory flashing light, perhaps you can try it out with a hairdryer.
Haven't got one myself nowadays.
Dave

Post by Paul Clark November 6, 2008 (19 of 31)
Having once been involved with spec'ing Cat-6 network/media cable (near-gigahertz data rates over distances of many meters) years ago, I became familiar with the Belden company's patented "bonded twisted pair" technology. Belden, in my opinion, is the industry leader in cable design for everything from DC to optics. No hype or hocus-pocus theories, just solid engineering design. I use their inexpensive but solidly engineered speaker cables exclusively.

Knowing the above I am inclined (biased) to look for HDMI cable made from Belden cable stock:

http://www.bluejeanscable.com/articles/hdmi-cable-information.htm
http://www.bluejeanscable.com/articles/belden-hdmi-design-notes.htm
http://www.bluejeanscable.com/articles/belden-hdmi-info.htm?hdmiinfo


Detection threshold for distortions due to jitter on digital audio
http://www.jstage.jst.go.jp/article/ast/26/1/50/_pdf

CONCLUSION

"In order to determine the maximum acceptable size of jitter on music signals, detection thresholds for artificial random jitter were measured in a 2 alternative forced choice procedure. Audio professionals and semi-professionals participated in the experiments. They were allowed to use their own listening environments and their favorite sound materials. The results indicate that the threshold for random jitter on program materials is several hundreds ns for well-trained listeners under their preferable listening conditions. THE THRESHOLD VALUES SEEM TO BE SUFFICIENTLY LARGER THAN THE JITTER ACTUALLY OBSERVED IN VARIOUS CONSUMER PRODUCTS." [Emphasis mine]

In other words employ a quality HDMI cable and forget about jitter.
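To put the paper's several-hundred-ns figure in perspective, here is a back-of-the-envelope bound (my own arithmetic, not from the paper): a timing error dt when sampling a full-scale sine of frequency f produces an amplitude error of at most slope * dt = 2*pi*f*A*dt. In Python:

```python
import math

def worst_case_jitter_error_db(freq_hz: float, jitter_s: float) -> float:
    """Worst-case amplitude error, in dB relative to full scale (A = 1),
    from a timing error `jitter_s` sampling a full-scale sine at `freq_hz`.
    Bound: error <= max slope * dt = 2*pi*f*A*dt."""
    err = 2 * math.pi * freq_hz * jitter_s
    return 20 * math.log10(err)

# 250 ns of jitter on a full-scale 20 kHz sine: the pathological worst case.
print(round(worst_case_jitter_error_db(20_000, 250e-9), 1))  # -30.1 dBFS
# The same jitter on a 1 kHz sine, more typical of where musical energy sits.
print(round(worst_case_jitter_error_db(1_000, 250e-9), 1))   # -56.1 dBFS
```

Even the worst case assumes full-scale energy at 20 kHz, which real program material essentially never has, which is consistent with the paper's finding that listeners only detect jitter in the hundreds-of-nanoseconds range.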

Post by stvnharr November 7, 2008 (20 of 31)
Paul Clark said:

Detection threshold for distortions due to jitter on digital audio
http://www.jstage.jst.go.jp/article/ast/26/1/50/_pdf

CONCLUSION

"In order to determine the maximum acceptable size of jitter on music signals, detection thresholds for artificial random jitter were measured in a 2 alternative forced choice procedure. Audio professionals and semi-professionals participated in the experiments. They were allowed to use their own listening environments and their favorite sound materials. The results indicate that the threshold for random jitter on program materials is several hundreds ns for well-trained listeners under their preferable listening conditions. THE THRESHOLD VALUES SEEM TO BE SUFFICIENTLY LARGER THAN THE JITTER ACTUALLY OBSERVED IN VARIOUS CONSUMER PRODUCTS." [Emphasis mine]

In other words employ a quality HDMI cable and forget about jitter.

This article confirms the "several hundred ns of jitter does not matter in CD players" argument that has been a pretty standard design criterion for CD players for years. The study only tested 16-bit PCM recordings.
There is no mention of recordings with higher resolution than 16 bits, of HDMI, or of low jitter in a low-noise power environment.


Closed