Re: Electric current for same output: 110V vs 240V



On Fri, 23 Jan 2009 11:41:50 GMT, "Lawrence Logic"
<mr-NotThisBit-logic@xxxxxxxxx> wrote:


"JP" <JP@xxxxxxxxxxxxxxxxxxxxxxxxx> wrote in message
news:plehn4l4ccf0t06kueubaeaqgqqk1qe2dc@xxxxxxxxxx

Here's my take on this...
Okay...for an amp that is drawing 175 watts plugged into 240 volts,
the amperage drawn through the power cord should be 0.42148 amps

For the same amp drawing the same watts plugged into 120 volts, the
amps drawn should be 0.842967 amps

This has little to do with the 50 watts output except for providing
power for the amp circuits required to generate this wattage.

http://www.csgnetwork.com/ohmslaw2.html



Thanks for your response, although you may need to recalibrate your
calculator. If P = I*V, then I = P/V (dividing both sides of the equation
by V to leave I on its own as the value that we're trying to determine). If
P = 175 and V = 240, then I would be about 0.73 (175/240). For 110V, the
current drawn would be roughly 1.6 amps (175/110).
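
To put figures against that, here's a quick sketch in Python (it just re-runs the
I = P/V sum with the 175 W figure from above; the two voltages are the mains
standards being compared):

    # Rough current draw for a fixed power consumption at two mains voltages.
    # Uses the 175 W figure quoted above; purely illustrative.
    power_w = 175.0

    for mains_v in (240.0, 110.0):
        current_a = power_w / mains_v   # I = P / V
        print(f"{mains_v:.0f} V -> {current_a:.2f} A")

    # prints: 240 V -> 0.73 A
    #         110 V -> 1.59 A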

I understand that the power consumed doesn't necessarily bear any
relationship to the power shoved out through the front. It does raise an
interesting point though. Is there any rating of effective power for the
output of an amplifier?

A good example of what I'm talking about was an occasion when my old Peavey
Classic 30 was turned up to about 3 (out of 12) and its output via its
single 12" speaker totally swamped a Yamaha G100 transistor amplifier turned
up full and delivered via 4x12" speakers. The numbers meant nothing because
a valve amp will generally be much louder than its solid-state equivalent.
Even the difference between class A, class B and class AB valve amps is
pretty much immeasurable.
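
For what it's worth, the gap in rated power between those two amps is smaller
than it looks once you convert it to decibels. A rough sketch in Python
(assuming the nominal ratings of about 30 W for the Classic 30 and 100 W for
the G100, and ignoring speaker sensitivity and cabinet differences, which
matter at least as much):

    import math

    # Level difference from electrical power alone: 10 * log10(P2 / P1).
    # Assumes nominal ratings of ~30 W (Classic 30) and ~100 W (G100).
    p_valve_w = 30.0
    p_solid_state_w = 100.0

    gap_db = 10.0 * math.log10(p_solid_state_w / p_valve_w)
    print(f"Rated power difference: {gap_db:.1f} dB")   # about 5.2 dB

About 5 dB on paper, which is audible but nothing like the difference I heard,
so the wattage figure clearly isn't telling the whole story.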

It would be nice to have some sort of absolute rating that really gave an
indication of how loud an amplifier would actually be.
The dB rating...
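
The nearest thing to an absolute figure is probably the amp's power combined
with the speaker's sensitivity rating (dB SPL at 1 W, 1 m): estimated SPL at
1 m is roughly sensitivity + 10 * log10(power). A quick sketch, using a
hypothetical 97 dB/W/m speaker purely as an assumption:

    import math

    # Estimated on-axis SPL at 1 m: sensitivity + 10 * log10(power).
    # The 97 dB/W/m figure is hypothetical, just to show the arithmetic.
    sensitivity_db = 97.0
    amp_power_w = 30.0

    est_spl_db = sensitivity_db + 10.0 * math.log10(amp_power_w)
    print(f"Roughly {est_spl_db:.0f} dB SPL at 1 m")   # about 112 dB

Even that says nothing about how the amp behaves when pushed, which may be a
large part of why a cranked valve amp sounds louder than the numbers suggest.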