Wouldn't it be nice to measure the temperature of the internal led semiconductor junction, or die, since this is the temperature that really matters? To be precise, the maximum junction temperature for the Cree XRE and MCE is 150 DegC.

Actually, it should be relatively easy to measure junction temperature, given the published forward voltage temperature coefficient of -4.0 mV/DegC. In other words, for a constant led current, the forward voltage drops by 4.0 mV for every degree of temperature rise. Therefore, by measuring the fall in led voltage from the instant the led is turned on, we can easily calculate the rise in junction temperature.

Firstly, a quick check to see whether the temperature coefficient is large enough to produce an easily measurable change in voltage. Assume the led starts out at an ambient of 30 DegC and finishes with a junction temperature of 130 DegC, so the junction temperature rise for this example is 100 DegC. The fall in forward voltage is therefore 100 x 4 mV = 400 mV, which is more than enough to measure accurately.
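For anyone who likes to play with the numbers, that sanity check is a one-liner. This little sketch just assumes the -4.0 mV/DegC tempco quoted above; it isn't anything official from Cree:

```python
# Forward-voltage tempco quoted in the datasheet discussion above:
# -4.0 mV per DegC of junction temperature rise (assumed value).
TEMPCO_V_PER_DEGC = -0.004

def delta_vf(temp_rise_degc):
    """Change in forward voltage (volts) for a given junction temp rise."""
    return TEMPCO_V_PER_DEGC * temp_rise_degc

# 100 DegC rise (30 DegC ambient up to a 130 DegC junction)
print(delta_vf(100))  # about -0.4 V, i.e. a 400 mV fall
```

A 400 mV swing is easy meat for even a cheap digital voltmeter, which is the whole point of the check.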

A minor practical problem is that it will be difficult to measure the forward voltage at exactly the instant the led is switched on, because the thermal time constant of the junction itself is quite short (probably well under 1 second). By the time we take the voltage measurement with our digital voltmeter, the led die temperature will already have risen above ambient. There are several ways around this problem. Ideally, use a computer-based data acquisition system that can take thousands of voltage readings every second. I have that available, but just at the moment don't own any leds!

However, if you don't have a fast data acquisition card, the initial 'fast' rise in junction temperature can be estimated from the published thermal resistance from junction to package heatsink solder point, which is 8 DegC/W for an XRE, or 3 DegC/W for an MCE. Taking the MCE as an example: at its maximum current of 700 mA, the power dissipation is typically 0.7 x 3.4 x 4 = 9.5 Watts, so the fast initial junction temperature rise is 9.5 x 3 = 28 DegC. For the XRE, dissipating about 3.7 Watts, the result is 3.7 x 8 = 29.6 DegC, which for practical purposes is the same. Fortunately this does not need to be known with great accuracy, as the overall temperature rise of around 100 DegC or more is considerably greater and will dominate the result.
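The same estimate in code form, in case anyone wants to plug in their own led. The per-die voltage and thermal resistances are the figures quoted above, so treat them as assumptions, not gospel:

```python
# Estimate the 'fast' initial junction temperature rise from the
# published junction-to-solder-point thermal resistance.
def initial_rise(power_w, rth_degc_per_w):
    """Initial junction temp rise (DegC) = power x thermal resistance."""
    return power_w * rth_degc_per_w

# MCE: 700 mA through 4 dies at ~3.4 V each, Rth = 3 DegC/W (assumed figures)
mce_power = 0.7 * 3.4 * 4          # about 9.5 W
print(initial_rise(mce_power, 3.0))  # about 28.6 DegC

# XRE: about 3.7 W dissipation, Rth = 8 DegC/W (assumed figures)
print(initial_rise(3.7, 8.0))        # 29.6 DegC
```

As noted above, a few degrees of error here hardly matters, because the overall rise of ~100 DegC dominates.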

In summary, a simple procedure for measuring junction temperature rise is as follows. Firstly, you preferably need a precision constant-current lab power supply, though the better led drivers, e.g. Taskled, are probably good enough. Connect the voltmeter across the led, switch on the current, note the initial led voltage as quickly as possible, and then record the voltage versus time as the housing heats up. Let V0 be the initial measured voltage, and V be the voltage at time t. The junction temperature rise at time t is given by:

Temp Rise (DegC) = (V0 - V)/0.004 + 28

I'm ordering a couple of MCEs today from Cutter, and will certainly perform this measurement when I get them. Anyone out there care to try?? This method beats the heck out of guesswork, and lets you know whether your favourite housing is keeping the led(s) within thermal ratings, or not. To look at it another way, you can ascertain fairly well how hot your housing needs to be before the led(s) reach maximum junction temperature. It may be that most of us are in fact nowhere near the max led temperature, or maybe we are, but at present we are guessing, and I don't guess when I can measure ....

Anyone else out there find this kind of stuff interesting?

Colin