Guitar amplifiers are nothing like audiophile amplifiers. The main difference is that audiophile, "hi-fi" amplifiers are designed to minimize distortion. Often a figure is quoted to indicate just how little distortion a hi-fi amplifier will impart to the signal (e.g. "less than 0.05% Total Harmonic Distortion" or some such).
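
For the curious, that THD figure is just the ratio of the energy the amplifier adds at the harmonics to the energy of the original tone. A back-of-the-envelope sketch in Python (the harmonic amplitudes below are made-up numbers, purely for illustration):

    import math

    # Hypothetical measurement: amplitude of the fundamental tone and of the
    # harmonics the amplifier added on top of it (made-up numbers).
    fundamental = 1.0
    harmonics = [0.0004, 0.0002, 0.0001]   # 2nd, 3rd, 4th harmonic amplitudes

    # Total Harmonic Distortion: RMS of the added harmonics over the fundamental.
    thd = math.sqrt(sum(h * h for h in harmonics)) / fundamental
    print(f"THD = {thd * 100:.3f}%")       # about 0.046% -- hi-fi territory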

Guitar amplifiers, on the other hand, are most certainly not designed to minimize distortion. They are often designed to provide extremely high levels of a very particular kind of distortion, particularly in the amplifiers favored by heavy metal guitarists. Marshall, Mesa Boogie, and Peavey (in particular the Peavey 5150 head) all make amplifiers capable of rendering the input signal unrecognizable through sheer distortion.

All amplifiers work in more or less the same way, guitar amps and hi-fi amps alike. You can imagine a typical input signal as being something like a sine wave expressed as a voltage level. (Or the superposition of many sine waves for a more complex signal, but that's not really important.) An amplifier's job (normally) is to simply multiply the voltage level by some constant number, and produce (and maintain, by supplying the necessary current) that multiplied voltage across the terminals of a speaker. So what is that constant? For hi-fi amplifiers, it is a "reasonable" number, which is to say the voltage required across the speaker terminals is normally never so high that the amplifier cannot deliver it.
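
In code terms, an ideal amplifier is nothing but a multiplication. A minimal sketch in Python with numpy (the gain and signal levels here are arbitrary, just for illustration):

    import numpy as np

    fs = 44100                                          # sample rate, Hz
    t = np.arange(fs) / fs                              # one second of time
    guitar_signal = 0.1 * np.sin(2 * np.pi * 440 * t)   # a quiet 440 Hz "note"

    gain = 20.0                                 # a "reasonable" hi-fi-style gain
    speaker_voltage = gain * guitar_signal      # the ideal amp: just scale the signal
    print(speaker_voltage.max())                # peaks at 2.0 V, well within reach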

Not so in a guitar amplifier: that constant (the "gain factor") is often ludicrously huge. So huge that the amplifier has absolutely no hope whatsoever of maintaining, or ever reaching, anything remotely approaching the voltage levels implied by the gain factor. Why is this? Because the amplifier has a power supply with a fixed voltage, which acts as a cap on the output voltage. So what happens to a signal going through a guitar amplifier with a ludicrously high gain factor? The guitar amp does the best it can, amplifying very, very small signals more or less accurately (often consisting mostly of amplified noise :) while spitting out either plus or minus the maximum voltage it can manage, that is, the voltage of the power supply.

So what happens is a signal comes into the amplifier looking more or less like a sine wave, but it comes out looking more or less like a square wave. The signal is "clipped": the tops and bottoms of the sine wave are cut off flat. That is the characteristic sound of a heavy-metal electric guitar. When operated in this manner, so that the signal is clipped, the amp is said to be "saturated".
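
To see the clipping in code: it's the same multiplication as before, except the output can never go past the supply rails, so anything that "wants" to swing beyond them gets flattened. A sketch (the gain and rail voltage are made-up numbers):

    import numpy as np

    fs = 44100
    t = np.arange(fs) / fs
    guitar_signal = 0.1 * np.sin(2 * np.pi * 440 * t)

    gain = 1000.0       # a ludicrously huge gain factor
    supply = 12.0       # the power-supply "rails" cap the output voltage

    ideal_output = gain * guitar_signal                       # would peak at 100 V...
    clipped_output = np.clip(ideal_output, -supply, supply)   # ...but gets cut off flat
    # clipped_output now looks far more like a square wave than a sine wave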

Now there's the raging debate among both audiophiles and guitarists as to which produces the better sound, tube amps or transistor amps. For guitar amps, there is (at least) one discernible distinction. Tube amplifiers respond more slowly, and are incapable of producing the very sharp corners on the clipped "square wave" output signal. Transistor amps tend to respond very quickly at the output stages and produce crisp, sharp corners where the clipping occurs. This causes the two types of amp to sound a bit different. Transistor amps tend to sound a bit more "screechy" at high gain levels, though manufacturers put various filters on the output stages to mitigate or eliminate this screechiness. Some people say the tube amp sounds "warmer", but that is a meaningless word to apply to a sound. If a tube amp sounds "warmer", does a transistor amp sound "cooler"? Ridiculous.
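
One way to picture the difference is a toy model: treat the transistor amp as a hard clip (sharp corners right at the rails) and the tube amp as a gentler, saturating curve like tanh (rounded corners). Real amps are far more complicated than either one-liner, so take this as a sketch only:

    import numpy as np

    def transistor_style(signal, gain, rail):
        # Hard clip: the output slams straight into the rails, leaving sharp corners.
        return np.clip(gain * signal, -rail, rail)

    def tube_style(signal, gain, rail):
        # Soft clip: tanh bends gradually toward the rails, rounding the corners off.
        return rail * np.tanh(gain * signal / rail)

    t = np.arange(44100) / 44100
    x = 0.1 * np.sin(2 * np.pi * 440 * t)
    hard = transistor_style(x, 1000.0, 12.0)   # crisp, "screechy" square-ish wave
    soft = tube_style(x, 1000.0, 12.0)         # similar loudness, rounder shoulders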

It's also been said that tube amps and transistor amps emphasize different harmonics, one favoring the even ones and the other the odd (I can't recall which is which, or why this would be so, or even whether it really is so... but it has been said).
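
If you want to poke at that claim yourself, the harmonic content is easy to inspect with an FFT. One thing that is definitely true: a perfectly symmetric clip (top and bottom of the wave treated the same) produces only odd harmonics, while any asymmetry adds even ones as well, which may be part of what the tube-vs-transistor claim is getting at. A sketch:

    import numpy as np

    fs = 44100
    t = np.arange(fs) / fs                       # one second -> 1 Hz frequency bins
    x = 0.1 * np.sin(2 * np.pi * 440 * t)        # 440 Hz test tone

    clipped = np.clip(1000.0 * x, -12.0, 12.0)   # symmetric hard clip (square-ish wave)

    spectrum = np.abs(np.fft.rfft(clipped)) / len(clipped)
    for harmonic in range(1, 6):
        freq = 440 * harmonic                    # harmonics land exactly on FFT bins here
        print(f"{harmonic}x ({freq} Hz): {spectrum[freq]:.3f}")
    # The odd harmonics (1x, 3x, 5x) dominate and the even ones are essentially zero,
    # because this clipping treats the top and bottom of the wave symmetrically.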

Other differences between guitar amps and hi-fi amps have to do with the speakers. Hi-fi speakers are normally designed with the goal of accurate sound reproduction. Guitar amp speakers are decidedly not good at reproducing recorded sound. (Just try hooking up a portable CD player to a guitar amp. Twiddle the knobs as you might, you will not get a good sound out of it, even operating the amp at the lowest gain settings, well below saturation.)

As for the differences between various audiophile amplifiers, these tend to be much more subtle than the differences between guitar amplifiers. Anyone can easily hear the difference between two guitar amplifiers, unless one is made with the express purpose of copying the sound of the other, and even then it's only difficult if it's a very good copy. Not so with audiophile equipment, as anyone who has gone shopping for a stereo can attest. Sure, there are differences, but nothing like the differences between guitar amps. The differences tend to be in the degree of accuracy with which the input sound is reproduced, especially at the extremes of frequency and amplitude. This is of course because audiophile amps generally all have the same goal, to reproduce a sound accurately, so naturally they tend to sound the same. Guitar amps have only the goal of sounding "cool" (or is it "warm"?), so they don't converge on a single sound nearly so much.