## Monday, December 29, 2008

### energy, of the black kind

As noted before on this non-blog, many self-titled environmentalists have no idea what they're talking about. For instance, this blog came up with the idea that a "black Google" would save 750 MWh/year (that's about 86 kW averaged over the year) because black pixels supposedly take less power to display than white pixels. The guy apparently failed to note that only obsolete CRT monitors work that way, initially stating a 3000 MWh/year figure (it's still visible in the URL, and he admits it). Apparently, when he found out that LCDs actually use more energy to display black than white (a comment on his blog agrees), he changed the figure to 750, a commendable action. Even so, his calculation is a rough approximation at best. So it's easy to campaign for stuff without having the slightest idea what's going on.

As CRT displays are slowly but surely being replaced with LCDs, his black-background idea is actually going to result in increased energy consumption. It's true that OLED displays might eventually vindicate the idea, but the point is he did no serious study before launching an idea that thousands of sheeple picked up and praised, leading to sites such as this one, which was brought to my attention today. Its about page (where I learned about the naive but honest guy's blog) is full of smugness about how much it protects the environment, while its front page boasts how much energy it has saved. If we take the approximately 1 MWh stated there and multiply it by 0.6/15 (the measured increase in LCD power consumption versus the claimed CRT decrease), we get about 40 kWh of wasted energy. At least it's not very much.
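A quick sketch of the arithmetic above. The 750 MWh/year claim and the 0.6 W / 15 W figures are taken from the posts being discussed, not measured here:

```python
HOURS_PER_YEAR = 365 * 24  # 8760

# The "black Google" author's revised claim, expressed as average power.
claimed_saving_mwh_per_year = 750
average_power_kw = claimed_saving_mwh_per_year * 1000 / HOURS_PER_YEAR
print(f"{average_power_kw:.1f} kW")  # -> 85.6 kW, i.e. "about 86 kW"

# Waste estimate for the black-background site: take its ~1 MWh "saved"
# figure and scale by the ratio of the measured LCD power *increase*
# (0.6 W) to the CRT power *decrease* (15 W) the claim assumed.
lcd_increase_w = 0.6   # assumed figure from the post
crt_decrease_w = 15    # assumed figure from the post
wasted_kwh = 1 * 1000 * lcd_increase_w / crt_decrease_w
print(f"{wasted_kwh:.0f} kWh wasted")  # -> 40 kWh wasted
```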

## Thursday, December 25, 2008

### energy

I once heard that the standby/clock display on a microwave oven uses more energy than the oven uses to cook food. I find that unlikely, so let's do a quick calculation. The microcontroller driving the clock or whatever's on display shouldn't draw more than a few mA, 10 at most, or else it's a piece of shit. Most LEDs (I'll assume it was an LED display, because segment LCDs consume next to nothing) are rated for 20 mA average current, though in many cases they're run at less. High-efficiency LEDs can glow brightly enough on as little as 2 mA. Let's assume 20 mA per LED (the worst case), and a display made up of 4 standard 7-segment digits and 2 dots. Now, not all LEDs are simultaneously lit, because the clock doesn't show 88:88. To find out how many LEDs are lit on average, I built the following table:

| Digit | Segments lit |
|-------|--------------|
| 0 | 6 |
| 1 | 2 |
| 2 | 5 |
| 3 | 5 |
| *Partial total (0–3)* | *18* |
| 4 | 4 |
| 5 | 5 |
| *Partial total (0–5)* | *27* |
| 6 | 6 |
| 7 | 3 |
| 8 | 7 |
| 9 | 6 |
| **Total (0–9)** | **49** |

Note, totally off-topic, that 4, 5 and 6 are the only digits whose 7-segment representation has the same number of segments as the digit value.

Given a 24-hour clock (the worst-case scenario), the sum over all 24 hours is: 6*10 for the leading zero plus 49 for the second digits of hours 00-09, plus 2*10 for the leading one plus 49 for hours 10-19, plus 5*4 for the leading two plus 18 (the segments of digits 0-3) for hours 20-23. That's 216, so dividing by 24 we get 9 LEDs lit on average to display the hours.

The minutes are just as easy to calculate: the average is 27/6 + 49/10 (for the first and second digits respectively), that is 9.4 LEDs on average. The two separator dots average 1 LED if they blink, 2 if they don't (the worst case, which is what we choose).
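The averages above can be checked by brute force over every time the clock can show, using the segment counts from the table:

```python
# Segments lit for digits 0-9 on a 7-segment display (from the table above).
SEG = [6, 2, 5, 5, 4, 5, 6, 3, 7, 6]

# Hours on a 24-hour clock with a leading zero: 00..23.
hour_segments = sum(SEG[h // 10] + SEG[h % 10] for h in range(24))
avg_hours = hour_segments / 24
print(hour_segments, avg_hours)  # -> 216 9.0

# Minutes 00..59: first digit averages over 0-5, second over 0-9.
avg_minutes = sum(SEG[:6]) / 6 + sum(SEG) / 10
print(avg_minutes)  # -> 9.4

# Plus two always-on separator dots (worst case, non-blinking).
avg_total = avg_hours + avg_minutes + 2
print(avg_total)  # -> 20.4
```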

In total there are 20.4 LEDs lit on average, which at 20 mA each is an average current of about 410 mA, holy shit that's fucking much. In reality the current is significantly lower, but this is a worst-case calculation. So the display draws at most about 420 mA, microcontroller included. Supplied at 5 V, that's 2.1 watts. Assuming a supply efficiency of around 50%, the standby-mode clock eats about 4 W, averaged over a whole day and neglecting the short intervals the oven spends cooking.

Now for the cooking. My oven has settings for 800W, 640W and 320W or something like that. I don't know whether that's the useful power or the total power drawn; for the worst case we're discussing, I'll assume it's the total. I always use it at 800W for about 1-2 minutes, so let's say the oven runs for 10 minutes a day (an average family might use it for 20-30 minutes a day). The display eats up 4W * 24h * 60min/h = 5760 watt-minutes. The cooking eats up 800W * 10min = 8000 watt-minutes, which is more. There, the display does NOT consume more energy than the oven proper. If I only used the oven for 5 minutes a day, and the display were a really lousy one like the one described above, then yes, the display might consume about as much energy as the cooking. Note that my oven displays a single 0 when not in use, so its display draws less than a third of the worst-case current. It still eats up quite a lot of energy, which sucks, but still significantly less than the oven proper, so there goes the propaganda down the drain.
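The whole worst-case chain, from average current to the daily energy comparison, in one place (all figures are the assumptions stated above; the text rounds 4.18 W down to 4 W):

```python
# Worst-case standby power of the clock display.
avg_leds_lit = 20.4
led_current_a = 0.020   # 20 mA per lit segment, worst case
mcu_current_a = 0.010   # microcontroller current budget
supply_voltage = 5.0
supply_efficiency = 0.5

display_current = avg_leds_lit * led_current_a + mcu_current_a  # ~0.418 A
display_power_w = display_current * supply_voltage / supply_efficiency
print(f"{display_power_w:.2f} W")  # -> 4.18 W

# Daily energy in watt-minutes: standby clock vs. 10 minutes of cooking.
standby_wmin = 4 * 24 * 60   # 4 W around the clock = 5760
cooking_wmin = 800 * 10      # 800 W for 10 minutes = 8000
print(standby_wmin, cooking_wmin)  # -> 5760 8000, cooking still wins
```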

## Thursday, December 18, 2008

### honk

I was relaxing in the bar, calmly enjoying my beer. Outside, the traffic was horrible. The intersection was almost blocked and the drivers were angry. A car blew its horn. I thought I saw the lights in the bar blink. It was obviously just a brain glitch. Surely the lights can't dim when a car honks outside, can they? Then a great thought came to my mind: what if they actually could? I mean, when a lightning bolt hits a power line or transformer station, the lights most certainly blink. They dim a little. The same thing happens when someone in the building is welding: the welder draws a lot of current, so the line voltage drops significantly. The question is, when a tram or trolleybus or even a train goes past the building, does it create enough of a disturbance for the lights to noticeably blink? In most cases, I guess it doesn't. But do the lights blink unnoticeably? Possibly. And it's not necessarily an electrical disturbance that can make the lights blink. Air currents, for instance, modify the light output of all devices, be they incandescent lamps, fluorescents, LEDs, whatever, simply because light output depends on temperature, however slightly. Me shouting at the light bulb can potentially alter its output. By how much, that's another discussion. It's certainly not noticeable, but is it measurable? If it's not directly measurable, is it at least statistically measurable? By how many photons per second does a light bulb's output vary when a car honks outside? I don't know. Fewer than air currents cause? Than electrical disturbances from people turning their TV on or off in the next building? I don't know. It's an interesting thought. My guess is about 5.

## Wednesday, December 17, 2008

### elevator

It's said that people who live on the first and second floors die older, because they rarely take the elevator and thus get more healthy exercise. On two occasions today I shared the elevator with people who ascended just one floor. A few days ago the elevator was nonfunctional, stuck at the second floor. I can't be totally sure, but my guess is that someone walked into the building, took the elevator to the second floor, got out and, instead of closing the door in a civilized fashion, kicked the fuck out of it. The door bounced back just as someone else was calling the elevator and the interlock was closing. It didn't have enough time to close, so the door got stuck open. Then again, the interlock might have just been defective, not allowing the external door to open, and the person getting out might have manually opened it, stepped out in a civilized fashion and left it locked (it can only be overridden from the inside). If the first scenario actually happened (I don't claim it did), I can only wonder how someone could be so lazy as to take the elevator to the second floor, and at the same time blast the shit out of the door. Maybe they were in a hurry.

## Thursday, December 11, 2008

### netbooks

Laptops are now called notebooks, because they get too hot to hold in one's lap. From the article: "A study by State University of New York researchers found that heat generated from laptops can raise the temperature of the scrotum, potentially putting sperm count at risk." Small laptops are now called netbooks (the Firefox 3 spellchecker underlines the word in red) because, well, I don't know why. Maybe it's because they have a network interface, as opposed to normal laptops, which somehow don't. There is some truth to that: my laptop's internal wireless card failed a few months after purchase. Fuck the warranty, I wouldn't have been able to use my computer for a month or so while they repaired it. I tapped the card firmly and it started working again; it worked for another week and then finally kicked the bucket. Given the low price of PCMCIA or USB wireless cards, fuck it. Maybe I'll reflow the solder some time (fuck BGAs and fuck RoHS (I wonder what they use to replace the Pb in CRT glass - you know, the Pb that shields you from the X-rays generated in the CRT)).

Anyway, thank Microsoft for indirectly limiting netbook performance.
