There is no less glamorous topic in the pro-audio world than that of AC, unless perhaps it’s what type of oil to use in the trucks. But, without clean and dependable electricity, you will spend needless hours chasing all kinds of problems.

It seems simple, right? In the US, you plug into a wall outlet, and you have clean 60Hz 120VAC. In theory anyway. How much do you think you need? One power amp rated at 2000W is going to draw more than 16 2/3 amps from the wall. To figure out about how much current a device is going to draw from the wall, take the rated power and divide by 120. Remember that power is equal to voltage X current. In this case, 120V (the AC line voltage) times 16.6 amps is 1992 watts. It will actually draw more than this by some amount, because nothing is 100% efficient. If the power amp puts out 2000 watts, it will draw a bit more than 2000W from the wall.
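The divide-by-120 rule of thumb above can be written as a one-liner. This is just a sketch of the article's arithmetic; the function name is mine, not anything standard.

```python
def current_draw(watts, volts=120.0):
    """Approximate line current in amps: I = P / V.

    Ignores amplifier efficiency, so the real draw will be somewhat higher,
    as the article notes.
    """
    return watts / volts

# A 2000 W power amp on a US 120 V line:
print(current_draw(2000))  # 16.666... amps, i.e. 16 2/3 A
```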

As you can see, this one amp has the potential to draw more current than a 15A circuit can provide. On the bright side, the amount of time that the amp actually hits this level of current draw is relatively brief, so the circuit breaker may not open (unless it's worn out by repeatedly 'breaking'). And that's just one mid-sized amp. As you can see, a system capable of putting out 10,000W is likewise capable of drawing about 83 amps. The same formula works for anything that plugs into the wall (it's a part of Ohm's Law). A 1000W PAR64 can is going to draw 8.3 amps. Ten of them are going to draw 83 amps, and are much more likely to be at full power for far longer than the sound system would be.
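Applying the same formula to the lighting rig makes it easy to count how many branch circuits a load actually needs. A rough sketch, assuming standard 20 A circuits (the helper names are mine):

```python
import math

def current_draw(watts, volts=120.0):
    """Approximate line current in amps: I = P / V."""
    return watts / volts

def circuits_needed(total_watts, breaker_amps=20, volts=120.0):
    """Minimum number of breaker_amps branch circuits for a given total load."""
    return math.ceil(current_draw(total_watts, volts) / breaker_amps)

# Ten 1000 W PAR64 cans:
print(current_draw(10 * 1000))   # about 83.3 amps total
print(circuits_needed(10 * 1000))  # at least 5 separate 20 A circuits
```

In practice you would leave headroom rather than load each circuit to its rated limit, so treat this as a floor, not a plan.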

Another consideration is cable type and length. All cables have resistance, measured in ohms per foot. Cable diameter is measured in gauge. The larger the gauge number, the smaller the diameter, and hence the more resistance per foot. The reason this is important goes back to Ohm's Law again. Think of the power cable to your amp as being electrically in series with your amp, with the wall socket as the voltage source.

Let's make it easy and say that your power cable is 50 feet long, and has a resistance of 0.1 ohm per foot. When the amp is not drawing much current, say 500 milliamps (or one half of one amp), the power cable itself will have a "voltage drop" of 0.5 x 5 (one half amp times the total resistance of the cable, which is ohms per foot times the length) = 2.5 volts. That may or may not be enough to be significant. Now, let's say that you turn it up so the amp is drawing 500 watts. Now, we have about 4.17 amps X 5 = 20.8 volts. Whoa! That means your power amp is only going to get less than 100V AC, is probably going to distort, have no headroom, and probably overheat. Since the formula for how much voltage the power cable is going to "absorb" is the same no matter what, and you are going to need to draw that amount of current, there are two ways to get more voltage to the amp – use a shorter power cable, or cable with less resistance per foot. To get less resistance per foot, you have to use cable with larger-diameter conductors.
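The voltage-drop arithmetic above is just Ohm's Law applied to the cable itself. A minimal sketch, using the article's deliberately exaggerated 0.1 ohm-per-foot figure (real power cable is far better than that; the function name is mine):

```python
def voltage_drop(amps, ohms_per_foot, feet):
    """Voltage lost across the power cable itself: V = I * R,
    where R is the cable's total resistance (ohms/foot times length)."""
    return amps * ohms_per_foot * feet

# 50 ft of cable at 0.1 ohm/ft, amp nearly idle at half an amp:
print(voltage_drop(0.5, 0.1, 50))              # 2.5 volts lost

# Same cable with the amp drawing 500 W (500 / 120 = ~4.17 A):
print(round(voltage_drop(500 / 120, 0.1, 50), 1))  # about 20.8 volts lost
```

Halving the cable length or halving the ohms-per-foot figure halves the drop, which is exactly the "shorter cable or fatter cable" choice the paragraph ends on.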

More on this later. It’s an important topic that most people either don’t understand, or they just plain ignore in the hope that the problems will go away.