Ztrain1985
Electrical
- Dec 18, 2007
- 34
Maybe it's just me, but I'm having some issues wrapping my head around this.
I was told that in a resistive heater, if you decrease the voltage, you increase the current draw. However, I can't figure out how.
Say we have a 240 V heater drawing 10 A, meaning it's a 2400 W unit.
If we halve the voltage, P = (V/2)^2 / R. R should be constant, so using V = IR: 240 = 10 * R, giving R = 24 ohms. Since voltage is squared, halving it cuts the power output to 1/4 of its original value: 120^2 / 24 = 600 W. Using the power equation again, 600 = 120 * I, meaning the current is 5 A.
How exactly is the current supposed to increase with a voltage drop? Unless the power doesn't change when the voltage does (or the resistance somehow changes), what am I missing?
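A quick sanity check of that arithmetic (a minimal sketch, assuming the element behaves as a fixed 24-ohm resistance and ignoring any temperature dependence of the resistance):

# Fixed-resistance heater at full vs. half voltage, pure Ohm's-law arithmetic.
def heater_stats(voltage, resistance):
    current = voltage / resistance        # I = V / R
    power = voltage ** 2 / resistance     # P = V^2 / R
    return current, power

R = 240 / 10   # 24 ohms, from the 240 V / 10 A rating

for v in (240, 120):
    i, p = heater_stats(v, R)
    print(f"{v} V -> {i:.1f} A, {p:.0f} W")

# Output:
# 240 V -> 10.0 A, 2400 W
# 120 V -> 5.0 A, 600 W

With a constant resistance, both the current and the power fall when the voltage drops, which is exactly the result worked out above.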