  #33  
Old 05-01-2019, 06:42 PM
audio bill
 

Quote:
Originally Posted by Faintandfuzzy
I am fascinated as to how the cable can transmit software code bit-perfect regardless of the cable used... yet for some reason with audio, there is a huge failure rate on data. How does the cable know whether data bits or audio bits are being transmitted? Sorry for a bit of snark, but when I was at an audio show, those claiming to hear the difference suddenly lost the ability under testing conditions. Why are software transmissions perfect, yet the data from a FLAC suddenly falls apart?
This has been explained many times, but I'll try again... First of all, digital data is not transmitted over a cable in digital form (the literal ones and zeros many people imagine), but as an analog signal that represents the digital data. That signal is not a perfect square wave: in reality it exhibits overshoot, settling-time problems, jitter, and similar distortions, all of which affect the timing of the signal's transitions (the moments when it changes between the levels representing zero and one).
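To see why this imperfect analog waveform still delivers computer data bit-perfectly, here is a minimal NumPy sketch (mine, not from the original post; the pulse shape and noise level are illustrative assumptions, not a model of any real cable): even with rounded edges and added noise, a receiver that simply thresholds the level mid-bit recovers every bit exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 64)          # the "digital data" to send

# Model each bit period as a rounded, noisy pulse rather than an ideal step:
# a crude stand-in for the overshoot/settling effects on a real cable.
samples_per_bit = 16
t = np.arange(samples_per_bit) / samples_per_bit
edge = 1 - np.exp(-t / 0.15)           # exponential rise instead of a vertical edge
waveform = np.concatenate([b * edge + (1 - b) * (1 - edge) for b in bits])
waveform += rng.normal(0, 0.05, waveform.size)   # additive line noise

# A data receiver only has to threshold the level mid-bit; timing errors of a
# fraction of a bit period do not change the 0-vs-1 decision at all.
centers = np.arange(len(bits)) * samples_per_bit + samples_per_bit // 2
recovered = (waveform[centers] > 0.5).astype(int)
print(bool((recovered == bits).all()))   # True: bit-perfect despite the messy waveform
```

The point of the sketch: as long as the level margins are generous, edge distortion and noise are invisible to data recovery, which is why file transfers are reliable over ordinary cables.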

When transmitting computer data, only the recovered digital values matter; with audio, the timing of the data matters as well. In S/PDIF the clock is embedded in the timing of those signal transitions, so the timing-related distortions mentioned above can directly corrupt clock recovery at the receiver. A cable carrying audio over S/PDIF can therefore yield, after D-to-A conversion, an analog signal altered from the original, because errors in the recovered clock translate directly into timing errors in the reconstructed waveform.
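The effect of clock timing errors on the converted analog signal can be sketched numerically. This is an illustrative NumPy simulation I am adding, not a measurement: the 2 ns RMS jitter figure and the 1 kHz test tone are assumptions. Sampling a sine at slightly jittered instants instead of ideal ones produces an error signal whose level, for these assumed numbers, comes out somewhere near -98 dB relative to the tone.

```python
import numpy as np

fs = 44_100.0                     # CD sample rate
f = 1_000.0                       # 1 kHz test tone (assumed for illustration)
n = np.arange(8192)

ideal = np.sin(2 * np.pi * f * n / fs)

# Hypothetical clock jitter: each conversion instant shifted by ~2 ns RMS
# (an illustrative figure, not a measurement of any real cable or DAC).
rng = np.random.default_rng(1)
jitter_s = rng.normal(0.0, 2e-9, n.size)
jittered = np.sin(2 * np.pi * f * (n / fs + jitter_s))

# Error level relative to the signal, in dB
err = jittered - ideal
err_db = 20 * np.log10(np.std(err) / np.std(ideal))
print(err_db)
```

Whether an error at that level is audible, and whether a given DAC's clock actually tracks the incoming S/PDIF timing rather than reclocking it, are separate questions; the sketch only shows the mechanism by which timing distortion becomes amplitude error after conversion.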

That's the best explanation I can offer of why computer data transmission and digital audio transmission behave differently. Hope it helps!