LEDs are current-driven devices. That is, you should not apply a voltage directly to them without some form of current limiting, or you are likely to damage them. The current limiting can come from a constant current source or a simple resistor. We shall explore the latter method here.
The light output of an LED depends on the amount of current flowing through it: the more current, the more light (until you push past its rated limits!).
A Single LED
The supply voltage must be higher than the LED's "Forward Voltage" (Vf). A typical red LED has a Vf of around 1.7 Volts.
Now, say you want to drive a single LED at 20mA. This is a fairly common (continuous) rating for LEDs. Let's also say you have a 5 Volt power supply.
[[Image:Single_led_resistor.gif]]
The maths for working out the resistor value and power rating is quite simple and involves [[Ohms Law]].
We have a 5V supply, but the LED is only 1.7V, so we'll be dropping 3.3V (5 - 1.7) across the series resistor.
Using [[Ohms Law]], that becomes: 3.3V / 0.02A (20mA), which equals 165 Ohms.
The nearest common resistor values to this are 150 Ohms and 180 Ohms. Let's go for 180 Ohms to be safe.
Working backwards: 3.3V / 180 Ohms equals 0.0183A (18.3mA), which is a little under our target current but still close enough.
OK, we know that a 180 Ohm resistor will work in this application, but what power (wattage) should it be?
Simply multiply the voltage across the resistor (3.3V) by the current through it (0.0183A) to get 0.06W (60mW).
Most common resistors these days are 125mW (1/8W), 250mW (1/4W) or 600mW. Any of these will do, since they all have a higher power rating than the minimum 60mW required.
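As a sanity check, the whole calculation can be scripted. Here is a minimal Python sketch of the single-LED example above (the 5V, 1.7V, 20mA and 180 Ohm figures are the ones already used; nothing new is assumed):

<syntaxhighlight lang="python">
# Series resistor calculation for a single LED, using the example figures.
supply_v = 5.0      # supply voltage (Volts)
led_vf = 1.7        # typical red LED forward voltage (Volts)
target_i = 0.020    # target LED current (Amps, i.e. 20mA)

resistor_v = supply_v - led_vf      # 3.3V dropped across the resistor
ideal_r = resistor_v / target_i     # Ohms Law: R = V / I -> 165 Ohms
chosen_r = 180.0                    # nearest common value, rounded up to be safe

actual_i = resistor_v / chosen_r    # 0.0183A (18.3mA) with the chosen resistor
resistor_p = resistor_v * actual_i  # P = V * I -> about 60mW

print(f"Ideal resistor: {ideal_r:.0f} Ohms")
print(f"Actual current: {actual_i * 1000:.1f} mA")
print(f"Resistor power: {resistor_p * 1000:.1f} mW")
</syntaxhighlight>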
Multiple Series LEDs
Calculating the resistor parameters for multiple LEDs in series is also pretty painless.
The supply voltage must be higher than the sum of all of the LEDs' "Forward Voltage" (Vf) values.
In the diagram below we have three LEDs, so if we assume the red colour (1.7V each), that's about 5.1V (3 * 1.7V) total for the LEDs.
It is important to note that the resistor current is still the same as the LED current as they are all in series.
For this example, we'll use a 12V power supply. That means the resistor will have 6.9V (12 - 5.1) across it.
Using Ohms Law again, that becomes: 6.9V / 0.02A (our 20mA target), which equals 345 Ohms.
The nearest common resistor values to this are 330 Ohms and 390 Ohms. Let's go for 390 Ohms to be safe.
Working backwards: 6.9V / 390 Ohms equals 0.0177A (17.7mA), which is a little under our target current.
Again, we multiply the voltage across the resistor (6.9V) by the current through it (0.0177A) and get about 120mW.
That's too high to use a 125mW (1/8W) resistor, so a 250mW (1/4W) or 600mW type would be chosen.
[[Image:Multi_led_resistor.gif]]
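The multi-LED case can be scripted the same way. Here is a minimal Python sketch of the example above (12V supply, three red LEDs at 1.7V each, 20mA target and the 390 Ohm chosen value are all taken from the text):

<syntaxhighlight lang="python">
# Series resistor for a string of LEDs, using the example figures.
supply_v = 12.0
led_vf = 1.7                        # per-LED forward voltage (typical red)
num_leds = 3
target_i = 0.020                    # 20mA target

total_vf = num_leds * led_vf        # 5.1V across the LED string
resistor_v = supply_v - total_vf    # 6.9V across the resistor
ideal_r = resistor_v / target_i     # 345 Ohms
chosen_r = 390.0                    # nearest common value, rounded up to be safe

actual_i = resistor_v / chosen_r    # about 17.7mA through LEDs and resistor alike
resistor_p = resistor_v * actual_i  # about 122mW, so use 1/4W or bigger

print(f"{ideal_r:.0f} Ohms ideal, {actual_i * 1000:.1f} mA actual, "
      f"{resistor_p * 1000:.0f} mW in the resistor")
</syntaxhighlight>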
Headroom
The series resistor in an LED circuit needs to drop the difference between the supply voltage and the total LED voltage (Vf). The more voltage across the resistor, though, the more power (heat) it will dissipate.
Whilst we don't want the resistor to get too hot, good circuit design requires that the resistor drops at least a volt or two across it. The supply voltage may vary due to tolerances and load variations, while the LED Vf may vary due to temperature, age and batch variations.
These variations must remain small compared to the voltage dropped across the resistor, otherwise the actual LED current will vary wildly. If the LED current drops too low, this may result in greatly reduced light output. If it is too high, the LED will be driven beyond its ratings with damage following soon after.
For example, imagine our power supply is 12V and we have 4 LEDs in series, each with a Vf of 2.9V (11.6V total Vf). If we assume 20mA LED current, using Ohms law the series resistor would work out to be 20 Ohms. The equation would be (12 - (4 * 2.9)) / 0.02 or simply 0.4 (Volts) / 0.02 (Amps).
Now imagine that the 12V supply is actually 12.4V (only about 3% high). The LED current would shoot up to 40mA - that's twice the 20mA we wanted! OK, you may be thinking "I'll just measure the exact supply voltage and calculate from that". That's all well and good, but remember that the LED Vf also varies.
Also as LEDs get warmer, the Vf can drop slightly. The Vf of 2.9V in the example may drop just 100mV (to 2.8V, a 3.5% difference) and have the same disastrous effect on the LED current. The increased current will cause the LED to run warmer still, with a further reduction in Vf. As you can see, this can lead to "thermal runaway" with the LEDs dying in a short time.
Now you can see why running a string of series LEDs with a total Vf very close to the supply voltage is a bad thing. A better design would use only 3 LEDs in series (8.7V total Vf) and have the resistor drop the remaining 3.3 Volts. Then any variations would be much smaller, leading to a much more stable LED current.
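You can put numbers on that sensitivity with a short Python sketch. It compares the 4-LED design (20 Ohms, 0.4V headroom) against the 3-LED alternative, assuming a 165 Ohm resistor for the latter (3.3V / 20mA, as in the single-LED example):

<syntaxhighlight lang="python">
# How much does the LED current move when the supply drifts from 12V to 12.4V?
def led_current(supply_v, total_vf, resistor):
    # Ohms Law on the resistor; the LEDs carry the same series current.
    return (supply_v - total_vf) / resistor

designs = [
    ("4 LEDs, 20 Ohms (0.4V headroom)", 11.6, 20.0),
    ("3 LEDs, 165 Ohms (3.3V headroom)", 8.7, 165.0),
]

for name, total_vf, resistor in designs:
    for supply_v in (12.0, 12.4):
        i_ma = led_current(supply_v, total_vf, resistor) * 1000
        print(f"{name} at {supply_v}V: {i_ma:.1f} mA")
# The low-headroom design jumps from 20mA to 40mA;
# the 3.3V-headroom design only creeps up to about 22.4mA.
</syntaxhighlight>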
There are LED array designs on the web that use a 1 Ohm series resistor for each row of LEDs. The resistor in this case is nothing more than a fuse that may (hopefully) blow when the LED current rises too high. These poor designs seem to rely on the long (and relatively thin) cables feeding the arrays having enough resistance to help prevent excessive current and the resultant damage. Someone using better power supply cables may think that they are improving their system, but soon find themselves with a dead LED array!
LED Specs
In the above examples, we used a Vf parameter of 1.7 Volts. In reality, this will vary (1V - 4V typically) with the colour and type of LED. You can often find the Vf parameter specified in the manufacturer's data sheet.
To measure it for yourself, hook up an LED with a 1K (1000 Ohm) series resistor and a 12V supply. This will limit the circuit current to a safe maximum of 12mA (12V / 1000 Ohms).
Now, simply measure the voltage across the LED itself to find its typical Vf parameter.
I have simplified this somewhat, as the Vf will also vary a little depending on how much current the LED is drawing. It will be close enough for most applications.
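As a bonus, the same setup also tells you what current the LED was running at during the test. A minimal Python sketch (the 1.85V reading is a made-up example value, not a spec):

<syntaxhighlight lang="python">
# Recover the test current from the Vf measurement setup above.
supply_v = 12.0
series_r = 1000.0        # the 1K test resistor
measured_vf = 1.85       # hypothetical multimeter reading across the LED

test_i = (supply_v - measured_vf) / series_r   # Ohms Law on the resistor
print(f"LED was running at about {test_i * 1000:.1f} mA")   # ~10.2mA
</syntaxhighlight>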
Calculators
There are various on-line calculators for working out LED series resistors.
Please make sure you read the instructions carefully. Most seem to use mA (milliamps) for the LED current, not Amps.
Don't forget, understanding Ohms Law is still essential for many other applications.
Summary
You can use the information above to work out series resistor values. Once you choose a resistor, measuring the actual current through the circuit is a good idea, especially if you are running the LED close to its maximum continuous rating. Use the milliamps setting on your multimeter and connect it in series, breaking into the circuit at any point.
Categories: LED String pages