Saturday, November 5, 2011

heating

I hear this annoying argument all the time: I live in a cold climate so my {home server, tube amp, idle set-top box, microwave oven display} doesn't waste electricity, it just helps keep my house warm.
I understand that resistive space heaters are popular in the U.S., but elsewhere many people prefer heat pumps. The heat output of a resistive heater, and of a microprocessor or any other electronic device, is exactly equal to the electrical energy drawn from the grid. The heat output of a heat pump is higher than the electrical energy input, the difference coming from cooling the outside air. It's just like air conditioning in the summer, but in reverse. So there: your inefficient technology isn't just helping keep your home warm, it's also wasting electricity that could otherwise heat your home more efficiently.

Saturday, October 1, 2011

oops

I found this nice, classic "beware of macros" bug:
#define MAX(a, b)    ((a) > (b) ? (a) : (b))
[ ... ]
int x = MAX(foo, bar++); // incremented either once or twice

Oops.

Saturday, September 24, 2011

supercrapinal

Scientists studying neutrino oscillations have recently managed to screw up their measurements and come up with a speed for the particles that's higher than the speed of light. For those of you not familiar with particle physics, think of neutrinos as very light electrons without the electrical charge. Having no electrical charge, they seldom interact with matter and can pass through thousands of miles of rock undisturbed. So they generate the neutrinos at CERN and measure the time it takes for them to arrive in Italy. Of course, according to all we know so far and to countless experiments, no particle travels faster than light.
This paper, whose authors could easily fill up a large bus or maybe a small train, details their method. The paper is properly written and simply presents the method and results; it makes no extraordinary claims, but rather states that the authors have not yet identified an error that would explain the result and are still searching for one. Of course the media have picked up the story. The respectable outlets maintain the cautious tone, but several are trying to turn this into a sensationalist news story: "Einstein's Relativity could be wrong". No. The chances of Relativity being wrong are extremely slim. The chances that an undetected error screwed up the measurements are overwhelming.
The time measurement and calculation error from known sources is about 10 nanoseconds, and the particles have supposedly arrived 60 ns too early. Given the complicated system used to do the measurements, whose systematic (fixed) errors that had to be calibrated out are much larger than that, it's very likely that this result is due to an equipment problem or a mistake. For example, does the circuitry always have the same delay? Does that 8 km-long optical fiber always have the same delay? Are those delays during the experiment equal to the ones measured while not doing the experiment?
60 ns is a pretty long time, given that one light-nanosecond is about 30 cm. But it could easily arise from faulty equipment or bad measurement techniques. I'm sure that in the following weeks or months these errors will be identified. Until then, supernova measurements indicate that neutrinos arrive at the same time as light, not earlier. If they were really going faster, the neutrino signal from a supernova that's thousands of light-years away would precede the optical telescope detection by days or weeks, which we're not seeing. So neutrinos don't travel faster than light. One might think that there's a small region, near the area where the high-energy reaction that creates the neutrinos takes place, where some extraordinary phenomenon causes the effect - once the neutrinos leave that area they behave normally. This is also highly unlikely. It's true that Relativity has to be adjusted at small scales and high energies, but those scales are very far off, and a space-time distortion large enough to cause 60 ns or 18 meters of error is surely going to be noticed and cause some very weird shit, so to speak.
So there. Calm down and wait for that defective satellite to fall on your heads tomorrow.

Tuesday, August 16, 2011

web

The discussion on this article on Slashdot caught my attention. The article is complaining about the direction that Firefox development is taking, and many comments go along those lines. I won't discuss those comments and I won't discuss said development policies further than saying that one has every right to argue against those decisions in a civilized manner.
What pisses me off is people arguing in large numbers for web developers to drop Firefox support because of the new rapid-release cycle. I've noticed a pattern of people complaining that dealing with bug reports and generally supporting their web applications running on a large number of Firefox versions is painful and will cause them to drop Firefox support altogether.
Let me first say that this is a precise illustration of some of the many reasons for which I hate web developers, web programmers, Web 2.0 proponents and all the other web-*. With a few exceptions of course.
Furthermore, let me say that if your so-called application wasn't a monstrous, attack-prone, inefficient, poorly-documented, hastily-written-on-the-back-of-a-napkin-over-dunch, non-standards-compliant, unclear-specification-bending, latest-buzzstuff-incorporating, user-agent-string-dependent piece of spaghetti-code seasoned to taste with data, metadata and metacode, then maybe it would display OK and function OK in all reasonably recent versions of all mainstream web-browsers. I've worked (not a pleasure believe me!) with what I'm told are crappy CMS platforms and even those function reasonably well even in IE6, let alone Firefox, Chrome, Opera and that thing that Macs are using.
Now, even if your application isn't a monstrous, inefficient, undocumented, poorly-designed, badly written piece of html-css-js-xml-php-sql-jpeg-mashed-potatoes, that's still no excuse for being plain lazy with customer support.
But why so much venom? Why do I care? Because I'm forced to work with products that exhibit the above-mentioned characteristics on a regular basis, and that's costing me time and energy. I don't generally hate people who write bad applications, and I can always write applications that run on my PC if I don't like the ones that are around - at least theoretically I have that freedom, but I can't do that with someone else's web applications. It's when I'm forced to use some kid's late-night, inebriated brain-vomit that he calls a web platform, that I get pissed off, and I want to use this opportunity to say: I sincerely hate you. At the same time, I fully understand that somebody has to collect my garbage and somebody has to unclog my sewer, so you've got all my respect and thank God you're not programming critical infrastructure, I hope.
Getting back to the matter, I remember when I was a kid I came across a code sample from Microsoft on how to "sign"-or-something your ActiveX control that you wrote in Visual Basic so that Internet Explorer wouldn't warn the user that an ActiveX control is going to execute from the web page they're visiting, with the default IE settings at least. Yeah. ActiveX controls were IE's non-standards-compliant way of providing a 'rich web experience' or whatever it was called back then, and many businesses developed 'web applications' around said technology. Sadly ActiveX controls are not web applications, they're normal applications that contain normal machine code that's directly executable by the processor of the client computer and has access to all normal operating system functions, i.e. it is not sandboxed within the browser. Needless to say, that code can in principle do anything its writer wants with your machine. So yeah, go ahead and not support Firefox, and while you're at it complain that it's fat and slow, but remember that it's Mozilla who played an important part in promoting compliance to open web standards and distancing the web from the state described above. At least give them some credit for that, or just go ahead and implement your web application in platform-independent, not-quite-ActiveX Flash or Java or something and shut up.
Oh, and a late Happy 20th Birthday, Web, and a Happy 30th Birthday, PC. May you survive all this crap for decades to come and continue to bring us entertainment and prosperity.

Saturday, July 9, 2011

solitaire

The wonderfully idiotic game of Klondike solitaire is ubiquitous; there are countless embarrassing pictures of people playing it instead of doing their job. It seems, however, that some people's job is actually to play Solitaire. Take this article for instance, which I found while terribly bored and wondering whether anyone had come up with a mathematical description of Solitaire and what the odds of winning are. It seems that no, there isn't a complete analysis yet - one of the embarrassments of modern mathematics :)
Before discussing the article any further, it must be stated that, while to some it might look funny, a waste of time, or downright idiotic to study Solitaire, it's actually quite serious. While the game itself might be considered silly by some, if studying it can bring advancement in "artificial intelligence" then so be it. Because I'm anti-mainstream I like to call it "automated problem-solving" instead, because there's nothing intelligent about it or about many other game "AIs", but I digress. Studying Solitaire is as valid a scientific endeavour as studying chess, or how to slice a pizza so it splits into equal parts - though this last problem is considerably easier and has recently been solved (searching for the article is left as an exercise to the reader).
Getting back to the Solitaire article, they supposedly worked with a famous mathematician who "carefully played and recorded 2000 games, achieving a win rate of 36.6%". Their software supposedly obtained win rates of up to 70.2%.
So there. There is actually someone whose job is (among many other, more productive things, I'm sure) to play Solitaire. 2000 games at an average of 20 minutes per game, as stated in the article, equals exactly 666.6... hours, oddly enough. At an 8-hour work day, that's roughly 80 days of playing Solitaire, or about 4 months. Cool thing indeed.

Monday, June 20, 2011

internet

I'm on a long scientific visit at a scientific institute, I'm accommodated at a student guesthouse that's on the institute's scientific network, and I'm not allowed to use the Internet for anything other than scientific purposes. I'm also not allowed to transfer more than 420MB/day or thereabouts. Their leaflet says they'd let me, but they can't, because their traffic is metered and they're paying by the amount of data transferred. I'm not a furious downloader, especially when I know I'm on a limited network. I just want to relax, read Slashdot and stream a news channel at a few hundred kb/s. But I can't, because the institute's traffic is metered.

P.S. Oh shit this post is not scientific enough I'll be punished! [hides]

Wednesday, June 15, 2011

unfortunate

There's a growing trend around here for newspapers to offer various book collections, probably to try to counteract their plummeting sales or something. Under slogans having to do with investing in one's culture and stuff, they're offering a new book each week for a relatively small price, and you're encouraged to buy them all to own the complete collection. While not bad in itself, this affair gets funny when the competition gets tough and the companies start getting really creative with their advertising. For instance, one collection is currently advertised along these lines: [display frog] For those who read, this is a future prince. For the others, it's just a frog. Yeah, pretty original, until they get to this week's book: The Marquis de Sade. WTF.
On a totally different thread, I've been wanting for a very long time to comment on Dr. Michio Kaku's pet show Sci-Fi Science, which I also find very unfortunate. His take on the possible real-world implementation of various sci-fi memes is sometimes plausible, occasionally interesting, and frequently outrageous, and the show itself is worth watching if you're otherwise terribly bored. I personally liked him much more when he was making brief appearances in various science shows than as the star of his own project. On to the point:
1. About one in three 'problems' has negative matter as its solution. Enough with the negative matter that falls upwards: it's purely theoretical, and while there are some clues as to its possible existence, for now it's pure speculation.
2. Let's make a sci-fi-style force field/shield, only it's not a force field, it's a matter+laser shield. Reasonable, until the enemy spaceship attacks you with lasers, which your carbon-nanotube-or-whatever-future-material shield supposedly can't withstand. His idea - coat it with "photochromic materials", which change color in response to light exposure, just like those adaptive sunglasses. Those are supposed to absorb the laser. Why he would use photochromic materials instead of a mirror, or nothing at all, is beyond me. The energy absorbed by the shield with or without said materials is basically the same. It either melts or it doesn't. Maybe it absorbs more, so that the laser doesn't pass through the shield and on to the ship? Who knows. That doesn't solve the shield-melting problem though. It just looks like a half-baked hack with a lot of buzztech.
3. He suggests that teleportation might work by digitising a human, memories included, sending the data along a series of laser beams (x-ray laser beams IIRC, for the added coolness and information density) and reconstructing the human at the destination. He openly admits to this resulting in two identical humans, memories and personality included, but states that we've got enough time to work the kinks out of this problem. I find this "brute force" approach questionable and I've got a big problem with it. Would you like to be killed at point A after being copied to point B? There's got to be a more elegant solution. Even if it were possible to clone me with little error and come up with an identical self-aware being, the two of us would be disconnected. The new mind would be a different mind. Sure, to my friends I might look the same and they might never notice, but I sincerely doubt that what I call "me" would be the same. Kill the first body and the second would surely function OK as a new person with my exact same traits, but I strongly doubt that I'll awake inside it as if nothing happened.
And the list goes on.
There are of course the cool solutions such as the self-reconfigurable robot, but many of them are, as I said, unfortunate.
Some might label me pathetic for picking on a low-budget pop-sci entertainment show, but there's a deeper thought here. Some time ago being a geek was something to be proud of, as geeks could display high levels of skill and competence and, in a larger sense, a greater understanding of the world. Now there's a new breed of geek, the sci-fi/fantasy game geek, who also takes great pride in their passion but often lacks those high levels of practical skill and only displays a great understanding of some fantasy world. It can of course be argued that this projects positively into the real world, but generally the fiction geek combines all the disadvantages of being a geek with none of the advantages. Anyone who has watched Sci-Fi Science would agree with me.
After all, some people watch a movie or read a book and enjoy it as such, some find parallels with human society, some dream of being princesses and frog-princes, and some dream of being Jedi wielding extendable porous ceramic lightsabres with nuclear batteries, adamantium cooling fans and superconducting plasma confinement magnets. Or something like that. There's nothing wrong with either case, and of course there's always money to be made.