tdbeng
Electrical
- May 8, 2002
Hi All,
I'm using an LM317 (TO-220 package) to regulate a 7.5VDC supply from 12VDC. It's the basic LM317 circuit: 0.1uF cap on the input, 1.0uF cap on the output, a protection diode from output back to input, and two resistors on the adjust pin so Vout = 1.25*(1 + R2/R1) + Iadj*R2. I'm using 100 ohms for R1 and 500 ohms for R2. The 7.5VDC supplies power to an Ethernet hub circuit, and the max current draw is 1A. The 12VDC input comes from an unregulated power supply.
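For what it's worth, here's a quick sanity check of those resistor values in Python. The 50uA adjust-pin current is the typical datasheet figure, so treat it as an assumption:

# Sanity check of the LM317 output-voltage formula with my values.
V_REF = 1.25    # LM317 internal reference voltage (V)
I_ADJ = 50e-6   # typical adjust-pin current (A), datasheet value
R1 = 100.0      # ohms
R2 = 500.0      # ohms

v_out = V_REF * (1 + R2 / R1) + I_ADJ * R2
print(f"Vout = {v_out:.3f} V")  # 7.525 V -- the Iadj term only adds ~25 mV

So the divider itself is fine; the target voltage checks out.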
The problem is that the regulator dissipates too much heat for my application, even with a large TO-220 heat sink.
What's the best way to reduce the heat in the regulator? Is it to reduce the input voltage (since the dissipation is proportional to the voltage drop across the regulator times the load current)? I thought about putting a zener in series with the input, but is this bad circuit design practice?
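Here's my back-of-envelope math on the dissipation. The thermal resistance numbers are rough assumptions for a TO-220 on a modest heat sink (Rjc ~5 C/W per the LM317 datasheet; the case-to-sink and sink-to-ambient figures are guesses, not measurements):

# Worst-case power dissipated in a linear regulator:
# everything between Vin and Vout is dropped across the pass element.
V_IN = 12.0    # unregulated input (V); unloaded it may sit higher
V_OUT = 7.5
I_LOAD = 1.0   # max load current (A)

p_diss = (V_IN - V_OUT) * I_LOAD
print(f"Pd = {p_diss:.1f} W")  # 4.5 W

# Rough junction temperature estimate (assumed thermal resistances):
R_JC = 5.0     # junction-to-case, C/W (typical TO-220 LM317)
R_CS = 1.0     # case-to-sink with insulator and grease, C/W (assumption)
R_SA = 10.0    # sink-to-ambient for a modest heat sink, C/W (assumption)
T_AMB = 25.0   # ambient, C

t_junction = T_AMB + p_diss * (R_JC + R_CS + R_SA)
print(f"Tj ~ {t_junction:.0f} C")  # ~97 C -- hot, though under the 125 C limit

So by my numbers the regulator has to burn off 4.5W no matter what, which matches what I'm seeing on the heat sink.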
Thanks in advance!