What exactly is the gain on an amplifier?

quote:
Ok, so let's take for example the Sundown SAE-1200D: 1200 W @ 1 ohm and an input sensitivity ranging from .153 V to 5.6 V. For everything in this post let's assume no clipping of the audio signal from the source unit, and say the source unit is putting out 3 V from the pre-outs.

What is going to happen if you turn the gain knob "up" (as in toward the .2 V end) after you have matched the gain of the amp to the output voltage of your source? I assume it will not put out more power. Will it begin to send the amplifier into clipping?



 

OK, I wrote this response recently to someone else asking a similar question, but it seemed to fit here, so I'm going to paste what I wrote, then add a few notes to it.

Yes, in short, the gain (more properly called "input sensitivity") on an amp should be matched to the line voltage of the head unit.
If the input stage (gain) on the amp is set to be too sensitive (too high), the input signal from the radio will over-drive the input stage of the amplifier, and the sine wave signal will be clipped, or cut off, at the peaks and troughs of the wave. That sends a flat-topped, near-DC signal to the speaker, which heats the voice coil(s) much faster and builds heat across the coils, possibly causing damage from that heat, or from loss of linearity of coil travel within the motor structure.

If the input stage is not sensitive enough, you lose signal resolution: you're amplifying a weak signal, which means more distortion and a higher noise floor (more noise per amount of actual clean signal), i.e. a lower S/N ratio.
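If you want to see the "gain set too high" case in numbers, here's a quick Python sketch. Every value in it is made up purely for illustration (a hypothetical amp with 30 V rails and a 4 ohm load); it just clips a sine wave the way an over-driven input stage would and shows how the flattened tops raise the average (heating) power even though the peak voltage can't go any higher:

```python
import math

def rms(samples):
    """Root-mean-square value of a list of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One cycle of a 1 V-peak sine wave (the "source" signal), 1000 points.
n = 1000
sine = [math.sin(2 * math.pi * i / n) for i in range(n)]

rail = 30.0  # hypothetical amp: rails allow 30 V peak at the output

# Gain set correctly: the input just reaches the rails, no clipping.
clean = [rail * s for s in sine]

# Gain set 2x too high: the waveform tries to swing 60 V but the rails
# cut it off at +/-30 V, producing the flat-topped (clipped) wave.
clipped = [max(-rail, min(rail, 2 * rail * s)) for s in sine]

# Same peak voltage, but the clipped wave delivers more average power
# into the load, and that extra average power is what cooks voice coils.
load = 4.0  # ohms, assumed
print(f"clean:   {rms(clean):5.1f} Vrms -> {rms(clean) ** 2 / load:6.1f} W into {load} ohm")
print(f"clipped: {rms(clipped):5.1f} Vrms -> {rms(clipped) ** 2 / load:6.1f} W into {load} ohm")
```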

This next part is hard for me to explain without a dry-erase board.
The reason the gain on an amp seems to act like a power or volume control is that you're scaling your sine wave signal up and down within the dynamic range the amplifier can accept, i.e. within its predefined input range limits.
It's similar to the way TV commercials are always a lot louder than the television show you're watching. The FCC passed rules against this practice decades ago, but advertisers found a way around them by boosting the recording level of the commercial so everything is compressed into the top of the reference range they're allowed.
If you can imagine a graph with an X and Y axis, the X axis is time and Y is amplitude (volume, or dynamic range). We'll scale the Y axis in this example from 0 to 10, 10 being the highest volume possible for the input signal.

With a normal TV show, the center for the volume is about a 5. This allows someone who whispers to be around a 3, while a gunshot would be maybe an 8. What the TV commercial does is cram everything into the 9-10 range.
This leaves no dynamic variance for quiet-to-loud shifts, but it makes everything super loud relative to how your TV volume is set for the regular programming. Tricky sneaks, huh?

I guess, in short, you could say setting the gain too low is like whispering so quietly that it's hard to be understood.
Setting it properly is like speaking in a clear, well-enunciated voice.
Setting the gain too high is like standing next to someone's ear and screaming at the top of your lungs.

In the first, the person can't hear you clearly over the ambient noise of the environment.
In the second, you can be heard and understood clearly.
In the third, oh, you can surely be heard, but it's painful, and clarity is out the window because the listener now has damaged ears and a persistent ringing.

"What's the deal with line voltage? Why is more better?"

Simply put, the higher the line voltage, the lower the noise floor and the more resolution your source signal will have. Think of your source signal like a JPG image, and your amplifier like Photoshop. The more you take that JPG (the source signal) and blow it up in Photoshop (your amplifier), the faster you see how high the resolution of the original JPG (signal) really was.
If your original signal is 500 mV, you only have so much signal resolution to work with when you amplify it. If the source signal is an 8 volt signal (hello, Eclipse), you have much better signal resolution to work with when it's amplified 500 times.
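To put rough numbers on that, here's a small Python sketch. The 60 Vrms full-output figure is just an assumed target for a big sub amp, not a spec from anywhere; the point is simply that a weaker pre-out needs a lot more gain to reach the same output, and everything riding on the cable gets that same gain:

```python
import math

def gain_db(v_out, v_in):
    """Voltage gain needed to get v_out from v_in, expressed in dB."""
    return 20 * math.log10(v_out / v_in)

v_out = 60.0  # assumed full-output voltage of the amp, Vrms

for v_src in (0.5, 2.0, 4.0, 8.0):  # source (pre-out) voltage in volts
    ratio = v_out / v_src
    print(f"{v_src:4.1f} V source -> needs {ratio:6.1f}x "
          f"({gain_db(v_out, v_src):4.1f} dB) of gain to reach full output")
```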

 


 

The voltage gain of an amplifier is the ratio of the voltage produced at the output to the voltage applied at the input. This is expressed in dB. An average voltage gain for a high-power sub amplifier is somewhere in the vicinity of 30 dB, or roughly 32x (five voltage doublings). This would mean that if the amplifier were presented with 2 V on the input, it would present about 64 V on the output.
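The dB arithmetic in that paragraph, worked out in a few lines of Python (the 30 dB and 2 V figures are just the example numbers from above):

```python
import math

gain_db = 30.0
ratio = 10 ** (gain_db / 20)   # 30 dB of voltage gain ~= 31.6x
v_in = 2.0                     # volts at the input
v_out = v_in * ratio
print(f"{gain_db:.0f} dB = {ratio:.1f}x voltage gain; {v_in} V in -> about {v_out:.0f} V out")
# five 6 dB doublings would be 2**5 = 32x, which is where the "64 V" figure comes from
```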

You need to remember that audio amplifiers are voltage sources: they attempt to maintain a fixed voltage level into different impedances. Try not to forget this; it is important. This is why amplifiers have different power ratings at different impedances: the rails will always try to swing the same voltage for the same input, and the output stage will then source the differing amounts of current required to maintain that constant voltage into the varying impedances.
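A quick sketch of what "voltage source" means for the power ratings: pick an output voltage the rails can swing (the 20 Vrms here is just an arbitrary example), and the power into each load follows from P = V^2 / Z, assuming the amp can keep sourcing the current:

```python
v_rms = 20.0               # assumed rail-limited output voltage, Vrms
for z in (8, 4, 2, 1):     # load impedance in ohms
    print(f"{z} ohm load: {v_rms ** 2 / z:6.0f} W (P = V^2 / Z)")
# a real amp eventually runs out of current and supply, so the power
# stops doubling at some point, but this is the basic relationship
```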

Ok, so we understand that amplifiers are a voltage source. Now let's look at an amplifier without a gain knob, like many older amps and most home audio amps. They have a fixed voltage gain (this is published when manufacturers actually give usable specs), and if we know what the voltage gain is, we know exactly what input drive will produce full output power into any impedance. We know this because if the amplifier is rated at, say, 100 watts at 8 ohms, it has roughly 28 VRMS of output capability before the rails are exhausted and clipping sets in (the square root of W*Z). And if the amplifier is rated with 30 dB of voltage gain, we know that roughly 900 mV on the input will produce that ~28 V on the output, which is maximum power (0.9 V x ~31.6 ≈ 28 V).
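Here's that same worked example in Python, using the 100 W / 8 ohm / 30 dB numbers from the paragraph above:

```python
import math

rated_watts = 100.0   # rated power, watts
impedance = 8.0       # rated load, ohms
gain_db = 30.0        # fixed voltage gain of the amp

v_out_max = math.sqrt(rated_watts * impedance)   # ~28.3 Vrms before clipping
gain_ratio = 10 ** (gain_db / 20)                # ~31.6x voltage gain
v_in_full = v_out_max / gain_ratio               # input drive for full output

print(f"full output = {v_out_max:.1f} Vrms, reached with about {v_in_full * 1000:.0f} mV at the input")
```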

Now let's look at the amplifier with a gain knob. It has adjustable voltage gain. They usually give you around 10-15 dB of adjustment in the overall voltage gain of the amplifier. Why isn't it expressed as dB on the knob? Because most people wouldn't understand how dB relates to voltage (a 6 dB increase doubles the voltage). So instead, they do the math for you and write a voltage level that corresponds to the gain amount (like 200 mV to 8 V) and put that on the amp so you can set the voltage gain of the amp.
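A sketch of what that knob label is really doing: each sensitivity voltage is "the input that should just reach full output," which implies a particular voltage gain. The 28.3 Vrms full-output figure is carried over from the 100 W / 8 ohm example above, and the knob positions are from the hypothetical 200 mV to 8 V label:

```python
import math

v_out_full = 28.3   # full-output voltage from the 100 W / 8 ohm example, Vrms

def implied_gain_db(v_sens):
    """Voltage gain (dB) implied by a sensitivity setting: that input just reaches full output."""
    return 20 * math.log10(v_out_full / v_sens)

for v_sens in (0.2, 1.0, 4.0, 8.0):   # positions on a "200 mV to 8 V" gain knob
    print(f"sensitivity {v_sens:4.1f} V -> {implied_gain_db(v_sens):5.1f} dB of voltage gain")
```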

Why do they do this? #1, so you can match the outputs of multiple channels/amps; #2, to increase the S/N ratio; #3, as a marketing gimmick...

#1 is easy and needs no explanation.

#2 is also easy, but I will explain. The more voltage gain an amp has, the more it amplifies everything. This includes alternator whine, white noise (hiss), and everything else. The less voltage gain the amp has, the less it amplifies everything, including all of those noises. So the lower you can set the voltage gain while still being able to get the power you need out of the amp (you have sufficient voltage drive from your source), the less overall noise the system will have. This is also where matching the H/U output to the amplifier gain comes into effect.
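Rough numbers for #2, all assumed purely for illustration: say 1 mV of alternator whine gets induced on the RCA line. Whatever gain the amp is set to applies to that whine too, so a hotter pre-out and a lower gain setting leave the noise that much further below full output:

```python
import math

v_out_full = 60.0    # assumed full output of the amp, Vrms
v_noise_in = 0.001   # assume 1 mV of alternator whine / hiss on the RCA line

for v_src in (0.5, 2.0, 4.0):           # head-unit pre-out voltage, volts
    gain = v_out_full / v_src           # gain needed to hit full output from this source
    v_noise_out = v_noise_in * gain     # the same gain is applied to the noise
    snr_db = 20 * math.log10(v_out_full / v_noise_out)
    print(f"{v_src:3.1f} V pre-out: gain {gain:5.1f}x, noise at the speaker "
          f"{v_noise_out * 1000:5.1f} mV, S/N {snr_db:4.1f} dB")
```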

#3 Would most people on here buy an amp with a gain knob or without one, if everything else was exactly the same? Think about it.

The most expensive amplifiers in the world do NOT have gain knobs. Look at any Krell, Mark Levinson, Pass Labs, etc., and you will see this.

Speakers don't care about clipping. If you are listening to any Jimi, Metallica, or G-n-R, you are listening to clipping, and a lot of it, and I bet your speakers are doing just fine. Too much power is what kills speakers. It kills them thermally, or it kills them mechanically, depending on what the enclosure alignment permits. That's it. The end.

