Another one followed me home. Can we keep it?
Back in the olden days it was a car stereo competition thing.
They used to have competition classes based on the advertised wattage of the amplifiers. So of course the manufacturers built what were informally known as "cheater amps" - amps with loosely regulated power sections that responded with significant power increases if the rails were given any additional voltage to work with.
As an example, I have an old 2-channel Pioneer Premier amp in the closet that carried an advertised rating of 50 Watts per channel at 12 Volts with 0.08% THD across the whole 20 Hz-20 kHz spectrum. Since the power section of these amps is loosely regulated, it responds well to increases in supply voltage - my Premier amp was certified at 238 Watts per channel at 14.4 V.
Having competed in the car stereo competitions in the "olden days" (mid eighties to early nineties), I ran a number of the old "cheater amps" and don't remember a single one that ran nearly 5 times the rated output (into the rated 4 ohm per channel load) from a mere 2.4 V increase. Yes, they were underrated, and of course we ran the huge power cables, but I remember the huge gains in power output (4 to 5 times the rated output) coming from running them into a much lower impedance than the traditional 4 ohm per channel load.
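The back-of-the-envelope math supports that point: for an unregulated supply whose rails track the battery voltage, output power scales roughly with the square of the supply-voltage ratio, and halving the load impedance roughly doubles power. A quick sketch (the 50 W and 238 W figures are from the posts above; the ideal scaling assumptions are mine, and real amps sag below these numbers):

```python
# Rough estimate of power scaling in a loosely regulated
# ("cheater") amp. Assumes the rail voltage tracks the supply
# voltage linearly and the amp is otherwise ideal - a sketch,
# not a real amplifier model.

def scaled_power(rated_watts, v_rated, v_new):
    """Power scales with the square of the supply-voltage ratio."""
    return rated_watts * (v_new / v_rated) ** 2

# Rated 50 W/ch at 12 V: what does 14.4 V alone buy you?
p_14v4 = scaled_power(50, 12.0, 14.4)
print(round(p_14v4, 1))  # 72.0 W - nowhere near 238 W

# For an ideal unregulated supply, each halving of the load
# (4 ohm -> 2 ohm -> 1 ohm) roughly doubles the power. That
# impedance drop, not the extra 2.4 V, is where 4-5x gains
# came from.
p_1ohm = p_14v4 * 4
print(round(p_1ohm))  # 288 W ideal, in the ballpark of 238 W
```

The gap between the ideal 288 W and the certified 238 W is about what you'd expect from supply sag and losses in a real amp driven that hard.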