‘Contrary to popular belief, transmission of digital audio and video is by no means perfect. The very fact that extensive error correction is employed is a testimony in itself.’
During transmission, the digital signal suffers data corruption or loss for a variety of reasons, including bandwidth limitations, jitter, signal attenuation, crosstalk and external EMI (electromagnetic interference). To counteract these problems, digital processing systems employ error correction and/or error reduction techniques.
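As a rough illustration of the principle, the sketch below implements a Hamming(7,4) code in Python: three parity bits protect four data bits, so any single bit flipped in transit can be located and corrected. This is a minimal, assumed example; real digital A/V links use far stronger codes (such as Reed-Solomon or BCH), but the underlying idea is the same.

# Minimal sketch of single-bit error correction using a Hamming(7,4) code.
# Illustrative only; real digital A/V links use far stronger codes,
# but the principle is identical.

def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4        # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4        # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4        # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3   # syndrome = 1-indexed error position, 0 = no error
    if pos:
        c = c[:]                 # work on a copy, not the caller's list
        c[pos - 1] ^= 1          # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
sent = hamming74_encode(word)
received = sent[:]
received[5] ^= 1                 # simulate a single bit flipped by noise
assert hamming74_decode(received) == word   # the error is corrected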
‘Real-world’ digital signal
A common misconception is that, because a digital signal consists of 1s and 0s, a device will either work perfectly or not work at all. An ideal theoretical square wave has instantaneous transitions between the high (logic 1) and low (logic 0) levels. In practice this is never achieved, because of the physical limitations of the system.
‘It is physically impossible to transmit an undistorted, ideal square wave through any real transmission channel or medium.’
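The reason is bandwidth: an ideal square wave is the sum of infinitely many odd harmonics, and every real channel cuts off at some finite frequency. The short Python sketch below, a minimal illustration in arbitrary units, rebuilds a square wave from only its first few harmonics and shows how the rising edge slows down as the channel passes fewer of them.

# A minimal sketch of why a finite-bandwidth channel must round off a
# square wave: the ideal wave needs infinitely many odd harmonics,
#   sq(t) = (4/pi) * sum over odd k of sin(2*pi*k*f*t) / k,
# and a real channel can only pass the first few of them.

import math

def bandlimited_square(t, f, harmonics):
    """Square wave rebuilt from only its first `harmonics` odd harmonics."""
    return (4 / math.pi) * sum(
        math.sin(2 * math.pi * k * f * t) / k
        for k in range(1, 2 * harmonics, 2)
    )

f = 1.0                          # fundamental frequency, arbitrary units
for n in (1, 3, 15):             # channel passes 1, 3 or 15 odd harmonics
    # sample the waveform just after the rising edge at t = 0
    edge = [bandlimited_square(t / 1000, f, n) for t in range(0, 50)]
    # the fewer harmonics survive, the slower the edge climbs toward 1.0
    print(f"{n:2d} harmonics: level 10 samples after the edge = {edge[10]:+.3f}")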
The limitations of digital signal processors and cables create timing errors known as jitter, which degrade portions of the signal into noise and distortion. Cables tend to round off the square edges of the signal, making the transitions harder for the receiving processor to resolve and thus increasing jitter. This rounding effect varies greatly among cables, and truly superior HDMI cables can make great improvements in imaging and sound quality.
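The link between edge rounding and jitter can be made concrete. The sketch below, which uses assumed illustrative numbers rather than measured HDMI figures, models a linear rising edge crossed by a noisy decision threshold: the slower the edge, the further a given amount of voltage noise shifts the crossing point in time.

# A minimal sketch of how a rounded edge turns amplitude noise into timing
# jitter. The receiver decides 0/1 when the voltage crosses a threshold;
# the slower the edge, the more a little voltage noise shifts that
# crossing in time (jitter_rms ~ noise_rms / slew_rate).

import random

def crossing_time_spread(rise_time, noise_rms, threshold=0.5, trials=10_000):
    """RMS spread of threshold-crossing times for a linear 0-to-1 edge."""
    times = []
    for _ in range(trials):
        noise = random.gauss(0.0, noise_rms)
        # linear edge: v(t) = t / rise_time, so it crosses the (noisy)
        # threshold at t = (threshold - noise) * rise_time
        times.append((threshold - noise) * rise_time)
    mean = sum(times) / trials
    return (sum((t - mean) ** 2 for t in times) / trials) ** 0.5

for rise_ns in (0.1, 0.5, 2.0):          # sharp edge vs. heavily rounded edge
    jitter = crossing_time_spread(rise_time=rise_ns, noise_rms=0.02)
    print(f"rise time {rise_ns:>4} ns -> jitter ~ {jitter * 1000:6.1f} ps RMS")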
Another important point to remember is that the digital signal degrades further as the length of the HDMI cable increases, because attenuation and bandwidth limitations grow with cable length.
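As a back-of-the-envelope illustration (the 0.5 dB per metre loss figure and the receiver's minimum level below are assumptions, not specifications for any particular cable), the sketch shows how a per-metre loss compounds with length until the received level becomes marginal.

# A minimal sketch of attenuation vs. cable length. Cable loss in dB grows
# roughly linearly with length, so the received amplitude falls
# exponentially. The 0.5 dB/m figure is an assumed example, not a spec
# for any real HDMI cable.

LOSS_DB_PER_M = 0.5          # assumed high-frequency loss of the cable
MIN_LEVEL = 0.4              # assumed fraction of full swing the receiver needs

def received_level(length_m, launch=1.0):
    """Relative signal amplitude after length_m metres of cable."""
    return launch * 10 ** (-LOSS_DB_PER_M * length_m / 20)

for length in (1, 5, 10, 15, 20):
    level = received_level(length)
    verdict = "ok" if level >= MIN_LEVEL else "marginal/failing"
    print(f"{length:2d} m: level {level:.2f} of launch amplitude -> {verdict}")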