Thanks to the energy savings, long service life, durability, and design flexibility they offer, light-emitting diodes (LEDs) are rapidly replacing incandescent and fluorescent technologies in both indoor and outdoor luminaires. But choosing the right LED is only part of the design equation. For your solid-state lighting design to realize its full efficiency, durability, and longevity, you'll need to choose a power supply whose characteristics closely match the requirements of your application and of the LEDs you're using. This brief tutorial offers some helpful pointers on what to consider during the selection process.
Start with the basics
LEDs begin to produce light once their supply voltage equals or exceeds the diode's forward voltage drop (typically in the region of 2-3V). The current required for full brightness varies from device to device but is typically 350mA for a 1W LED (usually the smallest size used in lighting applications). Unlike incandescent bulbs, however, LEDs are non-linear devices: once the supply voltage exceeds the diode's forward voltage, the current through the junction increases exponentially with the voltage across it, so even a small rise in voltage produces a large jump in current. Without some form of current regulation, the LED chip will become an expensive, one-shot solid-state flash bulb.
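The exponential relationship can be sketched with the Shockley diode equation. The saturation current, ideality factor, and forward voltages below are illustrative assumptions, not data for any specific LED:

```python
import math

def diode_current(v_f, i_s=1e-12, n=2.0, v_t=0.02585):
    """Shockley diode equation: I = Is * (exp(Vf / (n * Vt)) - 1).

    i_s (saturation current), n (ideality factor), and v_t (thermal
    voltage at roughly 25 degC) are illustrative values only.
    """
    return i_s * (math.exp(v_f / (n * v_t)) - 1)

# A small increase in forward voltage produces a large jump in current:
ratio = diode_current(3.2) / diode_current(3.0)  # only 0.2 V higher
print(f"0.2 V more gives about {ratio:.0f}x the current")
```

With these example parameters, a 0.2V increase multiplies the current by roughly fifty, which is why an unregulated voltage source destroys the device so quickly.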
To prevent this inconvenient behavior, the power source must provide a suitable voltage at the appropriate current. The simplest way to achieve this is to select a power supply with an output voltage above the forward voltage of the chosen LED and to limit the current to the maximum specified by the LED manufacturer using a series current-limit resistor. The downside of this approach is that one of the main benefits of LED lighting, its high efficiency, is compromised by the power dissipated in the current-limiting resistor.
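A quick calculation shows how much efficiency the resistor costs. This is a minimal sketch using Ohm's law; the 12V rail, 3.1V forward voltage, and 350mA figures are assumed example values, not taken from any datasheet:

```python
def ballast_resistor(v_supply, v_forward, i_led):
    """Size a series current-limit resistor for an LED.

    R = (Vs - Vf) / I across the resistor (Ohm's law).
    Returns (resistance_ohms, resistor_watts, overall_efficiency).
    """
    v_drop = v_supply - v_forward          # voltage across the resistor
    r = v_drop / i_led                     # resistance needed
    p_resistor = v_drop * i_led            # power wasted as heat
    p_led = v_forward * i_led              # power delivered to the LED
    efficiency = p_led / (p_led + p_resistor)
    return r, p_resistor, efficiency

# Assumed example: 12 V supply, 3.1 V forward voltage, 350 mA drive
r, p_r, eff = ballast_resistor(12.0, 3.1, 0.350)
print(f"R = {r:.1f} ohm, wasted {p_r:.2f} W, efficiency {eff:.0%}")
```

With these example numbers the resistor burns about 3W while the LED itself receives barely 1W, so the overall efficiency of the circuit falls below 30 percent.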
A further problem with this approach is that the LED's junction temperature affects its forward voltage. As a power supply's output voltage is fixed, this