Tuesday, January 26, 2010

Why do I need a resistor with an LED?

The short answer: to limit the current in the LED to a safe value.
The long answer: LEDs are semiconductors, diodes in particular. The current flowing in an LED is an exponential function of the voltage across it. The important part for you is that a small change in voltage can produce a huge change in current. That is the most important concept in this article. Resistors aren’t like that: current and voltage in a resistor are linearly related, so a change in voltage produces a proportional change in current. Current versus voltage is a straight line for a resistor, but not at all for an LED.
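
To make that curve concrete, here is a minimal sketch in Python using the ideal (Shockley) diode equation. The two parameters are invented for illustration, chosen so the numbers roughly echo the measurements later in this article; they are not from any datasheet.

import math

# Ideal (Shockley) diode equation with illustrative, made-up parameters.
# I_S and nVt are fitted for this example, not taken from any datasheet.
I_S = 2.3e-12   # saturation current, in amps (assumed)
nVt = 0.1443    # ideality factor times thermal voltage, in volts (assumed)

def led_current(volts):
    # Current grows exponentially with the voltage across the junction.
    return I_S * (math.exp(volts / nVt) - 1)

for v in (3.2, 3.3, 3.4):
    print(f"{v:.1f} V -> {led_current(v) * 1000:.1f} mA")

With these parameters the output is roughly 10, 20, and 39 mA: each 0.1V step doubles the current. A resistor fed the same steps would change its current by only a few percent.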

Because of this, you can’t say that LEDs have “resistance.” Resistance is defined as the constant ratio of voltage to current in a resistive circuit element. Even worse, there’s no way to know the exact relationship between current and voltage for any given LED across all possible voltages other than direct measurement. The exact relationship varies among colors, among sizes, and even among batches from the same manufacturer. When you buy an LED, it should come with a rating that looks like this: 3.3V @ 20 mA typical. That gives you one point along the operating curve, and usually it’s a safe operating point. You may also get a maximum rating, expressed as either a voltage or a current. For example, a lot of people report buying “5V blue LEDs.” In most cases these are not actually rated to operate continuously at 5V.

The other thing I’d like you to take away from this article is that it’s more useful to talk about driving an LED with a current of a particular size than with a voltage. If you know the voltage across an LED, you cannot determine the current flowing in it unless you are operating at the exact point along the curve described in the specs. Worse, being “off by a little” in the forward voltage can have a drastic effect on the current. So the approach I prefer is to select a current-limiting resistor that achieves a target current in the LED; the calculation is sketched below.
Most 3mm and 5mm LEDs will operate close to their peak brightness at a drive current of 20 mA. This is a conservative current: it doesn’t exceed most ratings (your specs may vary, or you may not have any specs; in that case 20 mA is a good default guess). In most cases, driving the LED at a higher current will not produce substantially more light. Instead, the junction (the working part of the LED) has to dissipate the excess power as heat. Heating the junction shortens its useful life and can reduce the LED’s output substantially. Heating it enough will cause catastrophic failure (producing a dark emitting diode).
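
Here is what that calculation looks like, as a sketch. The 5V supply and the 3.3V @ 20 mA rating are example values standing in for your actual supply and your LED’s spec.

# Current-limiting resistor: apply Ohm's law to the voltage left over
# after subtracting the LED's forward drop from the supply voltage.
def resistor_for_led(v_supply, v_forward, i_target):
    return (v_supply - v_forward) / i_target

r = resistor_for_led(v_supply=5.0, v_forward=3.3, i_target=0.020)
print(f"{r:.0f} ohms")   # 85 ohms for these example values

85 ohms isn’t a standard value, so round up to 91 or 100 ohms. Rounding up gives slightly less than the target current, which is the safe direction to err. The resistor also swamps the LED’s steep curve: with it in place, a small error in your forward-voltage guess no longer produces a huge change in current.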

To illustrate these ideas, I conducted an experiment. I built a simple circuit consisting of a variable voltage supply driving an LED directly, varied the voltage across the LED, and measured the current that flowed. I had a 3000 mcd blue LED and a 5000 mcd white LED available to test, both 5mm. The results are in the graph below. The result is the most important thing in this article, and it’s worth repeating: a small change in voltage can produce a huge change in current. Note especially the portion of the curve between 3.2V and 3.4V. The current changes by a factor of 4 even though the voltage varies by only 0.2V. While the specifics will be different for every LED, they will all have this sort of relationship. Overdriving an LED even a little will degrade it substantially: both LEDs in the test were destroyed by the higher drive currents. They still lit up afterward, but at a fraction of their original brightness.

[Graph: measured current versus forward voltage for the 5mm blue and white LEDs]