Thursday, December 25, 2008

energy

I once heard that the standby/clock display on a microwave oven uses more energy than the oven uses to cook food. I find that unlikely, so let's do a quick calculation. The microcontroller used to drive the clock or whatever's on display shouldn't draw more than a few mA, 10 at most, or else it's a piece of shit. Most LEDs (I'll assume it was an LED display, because segment LCDs consume next to nothing) are rated for 20 mA average current, though in many cases they're used with less. High-efficiency LEDs can glow sufficiently brightly with as little as 2 mA. Let's assume 20 mA per LED (worst-case scenario), and a display made up of 4 standard 7-segment digits and 2 dots. Now, not all LEDs are simultaneously lit, because the clock doesn't show 88:88. To find out how many LEDs are lit on average, I built the following table:

Digit            Number of segments
0                6
1                2
2                5
3                5
Partial total    18   (digits 0-3)
4                4
5                5
Partial total    27   (digits 0-5)
6                6
7                3
8                7
9                6
Total            49

Note, totally off-topic, that 4, 5 and 6 are the only digits whose 7-segment representation has the same number of segments as the digit value.
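Just to double-check the table, here's a quick Python sanity check (the SEGMENTS dictionary is simply my own encoding of the table above, nothing more):

# Sanity check of the table: lit-segment count for each digit on a standard 7-segment display.
SEGMENTS = {0: 6, 1: 2, 2: 5, 3: 5, 4: 4, 5: 5, 6: 6, 7: 3, 8: 7, 9: 6}

print(sum(SEGMENTS[d] for d in range(4)))   # 18, partial total for digits 0-3
print(sum(SEGMENTS[d] for d in range(6)))   # 27, partial total for digits 0-5
print(sum(SEGMENTS.values()))               # 49, total for all ten digits
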
Given a 24-hour clock (the worst-case scenario), the sum over all 24 hours is: 6*10 for the leading zero plus 49 for the second digits of hours 00-09, plus 2*10 for the leading one plus 49 for the second digits of hours 10-19, plus 5*4 for the leading two plus 18 for the second digits of hours 20-23. That's 216, so dividing by 24 we get 9 LEDs lit on average to display the hours.
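If you don't trust the shortcut, a brute-force enumeration of all 24 hours in Python (same SEGMENTS table as before) gives the same figure:

# Brute-force check: average lit segments for the two hour digits, 00 through 23.
SEGMENTS = {0: 6, 1: 2, 2: 5, 3: 5, 4: 4, 5: 5, 6: 6, 7: 3, 8: 7, 9: 6}

total = sum(SEGMENTS[h // 10] + SEGMENTS[h % 10] for h in range(24))
print(total, total / 24)   # 216 and 9.0
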
The minutes are just as easy to calculate: there's an average of 27/6 + 49/10 (for the first and second digits respectively), that is 9.4 LEDs on average. The two dots average 1 if they blink, 2 if they don't (worst case, which is what we choose).
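Same brute-force check for the minutes, plus the running total of lit LEDs:

# Average lit segments for the two minute digits (00-59), plus the two dots.
SEGMENTS = {0: 6, 1: 2, 2: 5, 3: 5, 4: 4, 5: 5, 6: 6, 7: 3, 8: 7, 9: 6}

minutes_avg = sum(SEGMENTS[m // 10] + SEGMENTS[m % 10] for m in range(60)) / 60
print(minutes_avg)             # 9.4
print(9.0 + minutes_avg + 2)   # 20.4 LEDs lit on average, dots always on
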
In total there are 20.4 LEDs lit on average, which means an average current of about 410 mA, holy shit that's a lot. In reality the current is significantly lower, but this is a worst-case calculation. So the display draws at most about 420 mA, microcontroller included. Suppose it's supplied with 5 V, that's 2.1 watts. Assuming a supply efficiency of around 50%, the standby-mode clock eats about 4 W from the wall, averaged over a whole day, neglecting the small intervals the oven is used for cooking.
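Plugging in the worst-case assumptions from above (20 mA per lit LED, 10 mA for the microcontroller, a 5 V rail, roughly 50% supply efficiency), the power works out like this:

# Worst-case standby power from the assumptions above.
avg_leds = 20.4                             # average LEDs lit on the 4-digit clock
current_a = (avg_leds * 20 + 10) / 1000     # 20 mA per LED + 10 mA micro -> ~0.418 A
power_w = 5 * current_a                     # ~2.1 W on the 5 V rail
wall_w = power_w / 0.5                      # ~4.2 W from the mains at 50% efficiency
print(current_a, power_w, wall_w)
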
Now for the cooking. My oven has settings for 800 W, 640 W and 320 W or something like this. I don't know if that's the useful power or the total power drawn; for the worst case we're discussing I'll assume it's the total power. I always use it on 800 W for about 1-2 minutes, so let's say the oven is used for 10 minutes a day (an average family might use it for 20-30 minutes a day). The display eats up 4 W * 24 hours * 60 min = 5760 watt-minutes. The cooking eats up 800 W * 10 min = 8000 watt-minutes, which is more. There, the display does NOT consume more energy than the oven proper. If I only used the oven for 5 minutes a day, and the display were a really lousy one like the one described above, then yes, the display might have consumed about as much energy as the cooking, or even a bit more. Note that my oven displays a single 0 when not in use, so its display uses less than a third of the energy calculated above. It still eats up quite a lot of energy, which sucks, but still significantly less than the oven proper, so there goes the propaganda down the drain.
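The final comparison in watt-minutes per day, using the rounded 4 W figure; the single-0 line reuses the same worst-case assumptions:

# Daily energy in watt-minutes: standby display vs. actual cooking.
display_wmin = 4 * 24 * 60          # 5760 Wmin for the worst-case full clock
cooking_wmin = 800 * 10             # 8000 Wmin for 10 minutes a day at 800 W
print(display_wmin, cooking_wmin)   # cooking still wins

# A display that shows only a single '0' (6 segments) when idle:
single_zero_w = 5 * (6 * 20 + 10) / 1000 / 0.5   # ~1.3 W from the wall
print(single_zero_w * 24 * 60)                   # ~1870 Wmin a day
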

1 comment:

etor said...

I wish to sincerely congratulate you for the elegant justness of this post.