Signal-to-noise ratio compares a signal's strength to whatever background noise accompanies it, and it is a measurement that means less to the audio community today than it did in the past.
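As a quick illustration (the numbers here are mine, not the article's), the ratio is normally expressed in decibels: ten times the base-10 logarithm of the signal power divided by the noise power.

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio in decibels, from average signal and noise power."""
    return 10 * math.log10(signal_power / noise_power)

# A signal one million times stronger than the background noise:
print(snr_db(1_000_000, 1))  # 60.0 dB -- roughly the well-kept record described below
```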

Once upon a time, all music recording, storage, and reproduction happened in the analog domain, so every generation of transcription added its own noise on top of the noise inherent in whatever medium the sound was recorded in. For example, a "live" sound recorded onto a mass-market vinyl record went through at least three transfer steps between the microphone and the record-store shelf, and keeping it to three was possible only for high-end productions. Generally, a recording was made to tape, the tape was used to cut a master disk (high-end recordings often went "direct-to-disk," dropping a generation at the expense of editing ability), and the master disk was used to create a metal "mother," which was then used to stamp the vinyl records. (Another approach to improving sound quality was to cut the master disk at half speed from the master tape, which allowed better signal transfer.)

Either way, every step added noise: material-transfer flaws between the cutting needle and the master disk, between the master disk and the metal mother, and between the mother and the lump of vinyl squeezed into the finished record, plus signal losses in the electrical chain from the microphone to the tape recorder and from the tape playback to the vinyl cutting head.
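To see why a chain of transfers hurts, here is a rough sketch (the per-stage figures are hypothetical, chosen only for illustration): if each stage contributes its own independent noise, the noise powers add, and the finished record can never be quieter than its worst link.

```python
import math

def combined_snr_db(stage_snrs_db):
    """Overall SNR when each stage adds independent noise at its own SNR.

    Relative to a unit signal, each stage contributes a noise power of
    10**(-SNR/10); those noise powers simply add up across the chain.
    """
    total_noise = sum(10 ** (-snr / 10) for snr in stage_snrs_db)
    return -10 * math.log10(total_noise)

# Hypothetical per-stage figures: tape, disk cutting, mother, pressing.
print(round(combined_snr_db([65, 70, 72, 68]), 1))  # ~62.0 dB, worse than the worst stage
```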

A record at best would have a signal-to-noise ratio of 60 to 70 dB (decibels), and that was with fanatical attention to dust and to general record and stylus (called a needle by the great unwashed) cleanliness, as the surface noise of the stylus rubbing in the groove alone added at least 10 dB of hiss. Add to that the fact that it was impossible to recover every nuance of the sound from the physical groove with an imperfectly fitting stylus, one that couldn't be made to fit too tightly for fear of sticking and damaging the record, and you have an inherently noisy medium.

Every aspect of a turntable, from motor vibration to tonearm resonances to the angle and attitude of the needle at the record surface, contributed to the sound quality. Audiophiles (myself included) would spend hours packing the turntable base with clay (to damp vibrations), installing higher-quality wiring (to improve signal transfer), better needles (to fit the groove as closely as possible without damaging it), and better tonearms (less resonance, better tracking), then adjusting the result minutely with test gear until everything was as good as we could possibly get it. After all of that we'd gain about 3 dB. (Don't laugh: that doubled amplifier efficiency and cut the perceived noise floor in half, since one must double the power to gain 3 dB of sound output.)
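That last bit of arithmetic is easy to verify: decibels measure power ratios on a base-10 log scale, so doubling the power works out to just about 3 dB.

```python
import math

# Gain in dB for a power ratio: 10 * log10(P_out / P_in).
print(round(10 * math.log10(2), 2))  # 3.01 dB -- doubling the power buys about 3 dB
```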

Tape was worse. A cassette would have an S/N ratio of 40 to 50 dB, even with Dolby noise processing. There were losses in recording, because the magnetic field couldn't be controlled tightly enough; losses in the tape's magnetic material, which couldn't hold an exact copy of what it was presented with; and losses from misalignment between the tape and the recording head (add more noise if you used a previously recorded tape, since it could never be completely erased). Now double all of those errors on playback. Reel-to-reel tape was wider, and therefore better at capturing a signal, and it traveled faster over the heads, for better signal transfer, but at best it only approached the quality of vinyl.

Now that there is digital recording and playback, with potential S/N approaching 100 dB, the glory days of system "tweaking" are over. There are still improvements to be made in a system, but not nearly as much art is involved, and the improvements are incremental compared to what had to be done to get great sound out of a needle and a vinyl disk.
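The article doesn't say where that roughly 100 dB figure comes from; assuming ordinary 16-bit PCM (the CD format), the theoretical ceiling set by quantization noise is about 6 dB per bit, which lands close to it.

```python
def quantization_snr_db(bits: int) -> float:
    """Theoretical SNR of ideal PCM quantization for a full-scale sine wave."""
    return 6.02 * bits + 1.76

print(round(quantization_snr_db(16), 1))  # 98.1 dB for 16-bit CD audio
print(round(quantization_snr_db(24), 1))  # 146.2 dB for 24-bit recordings
```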