Originally Posted by AnOldUR
Shows that I only know enough about electric to get in trouble.
I thought that the 4000W wouldn't change if you went from 230V to 240V.
I thought that the amps drawn would go from 17.4 to 16.7.
So, you're saying that the amperage is constant and the wattage changes?
No, the amps are not constant either. The only thing constant about it is that it is a resistor of a certain size. The "rating" they give you always specifies a wattage and a voltage, which means that you will get 'that' wattage if (and only if) you run it at 'that' voltage.
Run it at any other voltage and the current drawn and power produced start to change.
In this case, advertised as 4000W/230V, the thing is a 13.2 Ohm resistor. The math for this is based on two basic electrical equations.
P = I * I * R
V = I * R, which is the same as I = V/R
So, plug that back into the Power equation as
P = (V/R) * (V/R) * R = (V*V*R)/(R*R) = (V*V)/R
and solve for R
R = V*V/P
R = 230*230/4000 = 13.225, call it 13.2
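As a quick sanity check, here is that nameplate math in a few lines of Python (values straight from the post above):

```python
# Nameplate rating of the heating element.
rated_power = 4000.0   # watts, at the rated voltage
rated_voltage = 230.0  # volts

# R = V*V/P, from the derivation above.
resistance = rated_voltage**2 / rated_power
print(resistance)  # 13.225 ohms -- the ~13.2 quoted above
```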
If you put a 13.2 Ohm resistor on 120V (as an extreme example):
V = I * R
120 = I * 13.2
I = 120 / 13.2 = 9.1A (roughly)
And Power is P = I * I * R
P = 9.1 * 9.1 * 13.2 = 1093Watt (about 1089W if you carry the unrounded 13.225 Ohms through) -- only about a quarter of the rated output, because power goes with the square of the voltage.
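Carrying that same fixed-resistor math to any supply voltage -- including the 230V-to-240V case the original question asked about -- can be sketched in Python like this (the function name is just for illustration):

```python
def element_draw(rated_power, rated_voltage, supply_voltage):
    """Current (A) and power (W) that a fixed resistor rated
    rated_power at rated_voltage actually draws at supply_voltage."""
    r = rated_voltage**2 / rated_power  # R = V*V/P
    i = supply_voltage / r              # I = V/R
    p = supply_voltage**2 / r           # P = (V*V)/R
    return i, p

# A 4000W/230V element on various supplies:
for v in (230.0, 240.0, 120.0):
    i, p = element_draw(4000.0, 230.0, v)
    print(f"{v:.0f}V: {i:.2f}A, {p:.0f}W")
# 230V: 17.39A, 4000W
# 240V: 18.15A, 4355W
# 120V: 9.07A, 1089W
```

So on 240V the element draws more current, not less: the resistor is the only constant, and both current and power rise with the voltage.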