... CD quality it is not ...


A couple of years ago you mentioned that you had listened to an iPod and said that it sounded like crap ("burnt brown dreck," or some such thing). I vividly recall you saying, "I know distortion when I hear it."

At the time I knew exactly what you were talking about because, when I analyzed the various lossy codecs at [---] for automating QA, I saw copious instances of clipping caused by discarded phase information. (MPEG considers phase to be "non-critical" for human perception of music, so lossy encoding schemes discard it.)

Ever since your iPod distortion comment I have been meaning to send you some graphics demonstrating that you're right about the distortion, and that those slogan-chanters who call MP3s "CD quality" are full of shit.

First, here is a proof of concept: an idealized, streamlined mathematical demonstration of how the loss of phase information in lossy encoding can result in clipping.

The following graph shows a composite wave in black. It is the sum of two component sinusoids: the first harmonic in red and the third harmonic in blue. Imagine that the area between the two horizontal grey lines represents the PCM range. Notice that the composite waveform stays well within this range.



The next graph shows the same two harmonics with the identical magnitudes as above. The only difference is that the phase of the third harmonic is shifted by 60 degrees. Notice that the composite waveform not only looks totally different, but a significant part of it now exceeds the PCM range and would thus be clipped in an actual digital audio playback system.



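If you want to check the arithmetic yourself, here is a quick numerical sketch of the same idea. The amplitudes (1 for the first harmonic, 1/3 for the third) are illustrative picks, not the exact values from the graphs, but the effect is the same: shift the third harmonic by 60 degrees and the peak jumps past full scale.

```python
import numpy as np

t = np.linspace(0, 2 * np.pi, 100_000)
A1, A3 = 1.0, 1.0 / 3.0  # illustrative amplitudes, not taken from the graphs

# Both harmonics in phase: the composite stays inside a +/-1.0 range.
original = A1 * np.sin(t) + A3 * np.sin(3 * t)

# Same magnitudes, but the third harmonic shifted by 60 degrees.
shifted = A1 * np.sin(t) + A3 * np.sin(3 * t + np.pi / 3)

print(f"peak |original| = {np.max(np.abs(original)):.4f}")  # ~0.943
print(f"peak |shifted|  = {np.max(np.abs(shifted)):.4f}")   # ~1.155
```

The in-phase composite peaks at about 0.94, comfortably inside a ±1.0 range; the 60-degree-shifted version peaks at about 1.15, and everything past 1.0 would be clipped.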
Now the skeptic might say, "Well, that's just a contrived graph; this probably never happens with actual music files." So here is a real-world example of this phenomenon. The encoding used here is AAC at 128 kbps, the default encoding quality in iTunes (supposedly "CD quality").


This image shows the signal of a brief section of a song from a CD (i.e., this is the actual PCM stream that is on the CD).




This image shows the same section from the PCM stream decoded from the AAC-encoded version of the song (the actual AAC sold on the iTunes Store). As you can see, there is a rather pronounced difference between the original and the AAC signals, and the left channel of the AAC signal has been clipped.



This image is the AAC in red overlaid on the original. It's plainly visible that the AAC is nothing like the original; anyone who knows music and instruments could not fail to hear the difference.




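If you want to reproduce this comparison yourself, here is a rough sketch of how to do it programmatically. The file names are hypothetical stand-ins: rip the CD section to a WAV and decode the purchased AAC to a WAV (ffmpeg will do), trimmed and aligned to the same sample range. It reads the files with the soundfile library, but any WAV reader works.

```python
import numpy as np
import soundfile as sf  # pip install soundfile

# Hypothetical file names: the CD rip and the decoded AAC, already
# trimmed and aligned to the same section and sample rate.
orig, rate = sf.read("original.wav")  # float samples in [-1.0, 1.0]
aac, _ = sf.read("aac_decoded.wav")

n = min(len(orig), len(aac))
orig, aac = orig[:n], aac[:n]

# Count samples pinned at (or numerically at) full scale, per channel.
clipped = np.sum(np.abs(aac) >= 0.999, axis=0)
print("clipped samples per channel:", clipped)

# RMS of the residual: how far the decoded AAC strays from the CD PCM.
residual = aac - orig
print(f"residual RMS: {np.sqrt(np.mean(residual ** 2)):.6f}")
```

A clipped channel shows up as a run of samples pinned at full scale, which is exactly what the left channel above exhibits.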
Putting aside the clipping distortion and phase incoherence inherent in lossy encoding, any pinhead can run a spectral analysis of the encoded files and see that all frequencies above 15 or 16 kHz have been removed. Anyone who can listen to a signal devoid of frequencies above 15 kHz and say that it sounds as good as a CD either has a really bad playback system or compromised hearing. (Or, more likely, they're just repeating what they read somewhere or heard someone say. This is why humans are saddled with bad information.)
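Here is a minimal sketch of that spectral check, again with a hypothetical file name. It folds the clip to mono, takes the magnitude spectrum, and reports how much of the total energy survives above 16 kHz, which, if the above is right, should come out vanishingly small.

```python
import numpy as np
import soundfile as sf  # pip install soundfile

samples, rate = sf.read("aac_decoded.wav")
if samples.ndim > 1:
    samples = samples.mean(axis=1)  # fold stereo to mono for the spectrum

# Magnitude spectrum over the whole clip.
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)

# Fraction of total energy at or above 16 kHz.
energy = spectrum ** 2
high = np.sum(energy[freqs >= 16_000])
print(f"energy >= 16 kHz as a fraction of total: {high / np.sum(energy):.2e}")
```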