I was reading the general overview of an electric brew build by coderage (very nice article, by the way), and had a quick question.
I don't understand why, if you have a 5500W element that is made for 240V and you put it on a 120V line, the maximum wattage you can attain is 1375W. He says that if you take a 240V element and run it on a 120V line, you have to divide the usable watts by 4. Why would you divide by 4 instead of 2?
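Here is the arithmetic as I understand it so far, as a quick sketch assuming the element behaves as a fixed resistor (its resistance is set by the rating and doesn't change with the supply voltage):

```python
# Sketch of the divide-by-4 math, assuming the element is a fixed resistor.
RATED_POWER_W = 5500.0   # element's rated power
RATED_VOLTAGE_V = 240.0  # element's rated voltage
LINE_VOLTAGE_V = 120.0   # voltage it would actually run on

# The rating fixes the resistance: R = V^2 / P
resistance_ohms = RATED_VOLTAGE_V ** 2 / RATED_POWER_W  # ~10.47 ohms

# Power at the lower voltage: P = V^2 / R
actual_power_w = LINE_VOLTAGE_V ** 2 / resistance_ohms  # ~1375 W

print(f"R = {resistance_ohms:.2f} ohms, P at 120V = {actual_power_w:.0f} W")
# Halving V also halves the current (I = V / R), so P = V * I drops by 4x,
# which matches the 5500 / 4 = 1375 figure from the article.
```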
Another question I had: if we have a 2000W 120V element and ran it on a 15A breaker, I assume it would trip the breaker, no? Because 2000W / 120V = 16.7A. I guess what I am really asking is: does the load (i.e., the element) determine how many amps are drawn, or does the breaker? In the latter case, the breaker would only allow 15A to flow through the circuit, and the element would only operate at 1800W.
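A quick sketch of the numbers I'm working from, assuming Ohm's law holds (so the element's resistance fixes the current draw, and the breaker just trips rather than limiting current):

```python
# Checking my breaker math: the load sets the amps, the breaker only trips.
ELEMENT_POWER_W = 2000.0   # element's rated power at its rated voltage
LINE_VOLTAGE_V = 120.0     # rated and actual line voltage here
BREAKER_RATING_A = 15.0    # breaker size on the circuit

# Current drawn by the element: I = P / V
current_a = ELEMENT_POWER_W / LINE_VOLTAGE_V  # ~16.7 A

print(f"Element draws {current_a:.1f} A on a {BREAKER_RATING_A:.0f} A breaker")
if current_a > BREAKER_RATING_A:
    # If I understand right, the breaker doesn't throttle the circuit to
    # 15 A; the element pulls ~16.7 A and the breaker eventually trips.
    print("Draw exceeds the breaker rating -> expect it to trip.")
```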