To power an LED, the current flowing through it must be limited. For simple LEDs, like those used for signalling, the cheapest and easiest way is a resistor in series with the LED. The question that usually remains is: which value should the resistor have?

First of all, you need to know at which *voltage* you want or need to run your LEDs. Most electronic circuits use some kind of standard voltage, like 3.3 V, 5 V or 12 V. Just as important is the forward voltage (U_f) of the LED (or LEDs) you want to power. The forward voltage is the minimum voltage the LED needs to light up at all; it is also the voltage that drops across the LED. This voltage can be measured e.g. with the AVR Transistortester, a multimeter (the result depends on the test voltage the multimeter uses) or a power supply with a constant-current function. Another way to find the forward voltage is to check the datasheet. Or you can simply use a default value, with the color of the LED as the indicator.

The measured forward voltage depends significantly on the current flowing during the test. I found that a constant-current power source (bench power supply) with the current set to about 10-20 mA gives me the best results, nearly matching the theoretical calculations. All other devices gave me slightly to significantly lower readings (between 0.15 and 0.5 V less).

The second important value is the maximum *current* (or forward current) the LED in question can handle without being damaged. Usually this is up to about 20 mA (0.02 A). But in reality, most of the time, you don't want to run the LED at its current limit. In most situations the light is still more than bright enough at 5 mA (or even less) to do its job perfectly fine. If you're not sure, test it on a breadboard or with some alligator clips and different resistor values.

To calculate the *resistance* value of the resistor, you need the exact voltage across the resistor. Normally this would be the voltage from your power supply. But since the LED is also part of the circuit, it reduces that voltage. Therefore, the forward voltage of the LED must first be subtracted from the supply voltage. The resulting voltage is then divided by the desired current in amperes.

R_{LED}=\frac{U_R}{I_{LED}}=\frac{U_{supply}-U_f}{I_{LED}}
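The calculation can be sketched in Python; the function name and example values are illustrative (a red LED with U_f ≈ 1.95 V, run at 5 mA):

```python
def led_resistor(supply_v, forward_v, current_a):
    """Series resistance so the LED sees the desired current: R = (U_supply - U_f) / I."""
    if forward_v >= supply_v:
        raise ValueError("supply voltage must exceed the LED forward voltage")
    return (supply_v - forward_v) / current_a

# Red LED (U_f ≈ 1.95 V) at 5 mA on a 5 V rail:
print(led_resistor(5.0, 1.95, 0.005))   # ≈ 610 ohms
```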

Now the same, using some example values:

R_{5\,V}=\frac{5\,V - 1.95\,V}{0.005\,A}=\frac{3.05\,V}{0.005\,A}=610\,\Omega

R_{24\,V}=\frac{24\,V - 1.95\,V}{0.005\,A}=\frac{22.05\,V}{0.005\,A}=4410\,\Omega=4.41\,k\Omega

Usually the next higher available value is used: in the first example perhaps a 680 Ohm resistor, in the second one 4700 Ohm.
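Picking the next higher standard value can also be automated. This sketch assumes the E12 (10 % tolerance) series, the most commonly stocked one; the E24 series would give finer steps:

```python
import math

# E12 series mantissas (10 % tolerance resistors)
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_e12(value_ohm):
    """Smallest E12 resistor value that is >= the calculated value."""
    decade = 10 ** math.floor(math.log10(value_ohm))
    for m in E12 + [10.0]:           # 10.0 rolls over into the next decade
        candidate = m * decade
        if candidate >= value_ohm:
            return candidate

print(next_e12(610))    # 680.0
print(next_e12(4410))   # 4700.0
```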

But what about the power? Will the usual 1/4 Watt resistor handle it? How much current will actually pass through the 680 Ohm resistor?

I_{5\,V}=\frac{U}{R}=\frac{3.05\,V}{680\,\Omega}=0.0045\,A=4.5\,mA

P_{5\,V}=U \cdot I=3.05\,V \cdot 0.0045\,A=0.0137\,W=13.7\,mW

I_{24\,V}=\frac{U}{R}=\frac{22.05\,V}{4700\,\Omega}=0.0047\,A=4.7\,mA

P_{24\,V}=U \cdot I=22.05\,V \cdot 0.0047\,A=0.1036\,W=103.6\,mW

As you can see, a standard 1/4 W resistor can easily handle that, even at 24 V.
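The power check above can be bundled into one helper (a sketch; the function name is illustrative and 0.25 W is the assumed resistor rating):

```python
def check_dissipation(supply_v, forward_v, resistance_ohm, rating_w=0.25):
    """Actual current through the chosen resistor and the power it dissipates."""
    current = (supply_v - forward_v) / resistance_ohm   # I = U_R / R
    power = (supply_v - forward_v) * current            # P = U_R * I
    return current, power, power <= rating_w

i, p, ok = check_dissipation(24.0, 1.95, 4700)
print(f"{i*1000:.1f} mA, {p*1000:.1f} mW, within rating: {ok}")
```

Note that multiplying with the unrounded current gives about 103.4 mW; the 103.6 mW above comes from rounding the current to 4.7 mA first.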
