Saturday, November 5, 2011

heating

I hear this annoying argument all the time: I live in a cold climate so my {home server, tube amp, idle set-top box, microwave oven display} doesn't waste electricity, it just helps keep my house warm.
I understand that resistive space heaters are popular in the U.S., but in the rest of the world some people prefer heat pumps. The heat output of a resistive space heater, a microprocessor, or any other electronic device is exactly equal to the electrical energy it draws from the grid. The heat output of a heat pump is higher than its electrical energy input; the difference comes from cooling the outside air, just like air conditioning in the summer but in reverse. So there: your inefficient technology is not only helping keep your home warm, it's also wasting electricity that could otherwise heat it more efficiently.

Saturday, October 1, 2011

oops

I found this nice, classic "beware of macros" bug:
#define MAX(a, b)    ((a) > (b) ? (a) : (b))
[ ... ]
int x = MAX(foo, bar++); // incremented either once or twice

Oops.

Saturday, September 24, 2011

supercrapinal

Scientists studying neutrino oscillations have recently managed to screw up their measurements and come up with a speed for the particles that's higher than the speed of light. For those of you not familiar with particle physics, think of neutrinos as very light electrons without the electrical charge. Having no electrical charge, they seldom interact with matter and can pass through thousands of miles of rock undisturbed. So they generate the neutrinos at CERN and measure the time it takes for them to arrive in Italy. Of course, according to everything we know so far and to countless experiments, no particle travels faster than light.
This paper, whose authors could easily fill a large bus or maybe a small train, details their method. It is properly written and simply presents the method and results; it makes no extraordinary claims, but rather states that the authors have yet to identify further sources of error and are searching for them. Of course the media have picked up the story. The respectable outlets maintain the cautious tone, but several are trying to turn this into a sensationalist news story: "Einstein's Relativity could be wrong". No. The chances of Relativity being wrong are extremely slim; the chances that an undetected error screwed up the measurements are overwhelming.
The time measurement and calculation error from known sources is about 10 nanoseconds, and the particles supposedly arrived 60 ns too early. Given the complicated measurement system, whose systematic (fixed) errors that had to be calibrated out are much larger than that, it's very likely that this result is due to an equipment problem or a mistake. For example, does the circuitry always have the same delay? Does that 8 km-long optical fiber always have the same delay? Are those delays during the experiment equal to the ones measured while not doing the experiment?
60 ns is a pretty long time, given that one light-nanosecond is about 30 cm. But it could easily arise from faulty equipment or bad measurement technique. I'm sure that in the following weeks or months these errors will be identified. Until then, supernova measurements indicate that neutrinos arrive at the same time as light, not earlier: if they really traveled faster, the neutrino signal from a supernova thousands of light-years away would precede the optical telescope detection by days or weeks, which we're not seeing. So neutrinos don't travel faster than light. One might think that there's a small region, near the area where the high-energy reaction that creates the neutrinos takes place, where some extraordinary phenomenon causes the effect, and that once the particles leave that area they behave normally. This is also highly unlikely. It's true that Relativity has to be adjusted at small scales and high energies, but those scales are very far off, and a space-time distortion large enough to cause 60 ns, or 18 meters, of error would surely be noticed and cause some very weird shit, so to speak.
So there. Calm down and wait for that defective satellite to fall on your heads tomorrow.

Tuesday, August 16, 2011

web

The discussion on this article on Slashdot caught my attention. The article is complaining about the direction that Firefox development is taking, and many comments go along those lines. I won't discuss those comments and I won't discuss said development policies further than saying that one has every right to argue against those decisions in a civilized manner.
What pisses me off is people arguing in large numbers for web developers to drop Firefox support because of the new rapid-release cycle. I've noticed a pattern of people complaining that dealing with bug reports and generally supporting their web applications running on a large number of Firefox versions is painful and will cause them to drop Firefox support altogether.
Let me first say that this is a precise illustration of some of the many reasons for which I hate web developers, web programmers, Web 2.0 proponents and all the other web-*. With a few exceptions of course.
Furthermore, let me say that if your so-called application wasn't a monstrous, attack-prone, inefficient, poorly-documented, hastily-written-on-the-back-of-a-napkin-over-dunch, non-standards-compliant, unclear-specification-bending, latest-buzzstuff-incorporating, user-agent-string-dependent piece of spaghetti-code seasoned to taste with data, metadata and metacode, then maybe it would display OK and function OK in all reasonably recent versions of all mainstream web-browsers. I've worked (not a pleasure believe me!) with what I'm told are crappy CMS platforms and even those function reasonably well even in IE6, let alone Firefox, Chrome, Opera and that thing that Macs are using.
Now, even if your application isn't a monstrous, inefficient, undocumented, poorly-designed, badly written piece of html-css-js-xml-php-sql-jpeg-mashed-potatoes, that's still no excuse for being plain lazy with customer support.
But why so much venom? Why do I care? Because I'm forced to work with products that exhibit the above-mentioned characteristics on a regular basis, and that's costing me time and energy. I don't generally hate people who write bad applications, and I can always write applications that run on my PC if I don't like the ones that are around - at least theoretically I have that freedom, but I can't do that with someone else's web applications. It's when I'm forced to use some kid's late-night, inebriated brain-vomit that he calls a web platform, that I get pissed off, and I want to use this opportunity to say: I sincerely hate you. At the same time, I fully understand that somebody has to collect my garbage and somebody has to unclog my sewer, so you've got all my respect and thank God you're not programming critical infrastructure, I hope.
Getting back to the matter, I remember when I was a kid I came across a code sample from Microsoft on how to "sign"-or-something your ActiveX control that you wrote in Visual Basic so that Internet Explorer wouldn't warn the user that an ActiveX control is going to execute from the web page they're visiting, with the default IE settings at least. Yeah. ActiveX controls were IE's non-standards-compliant way of providing a 'rich web experience' or whatever it was called back then, and many businesses developed 'web applications' around said technology. Sadly, ActiveX controls are not web applications; they're normal applications that contain normal machine code that's directly executable by the processor of the client computer and has access to all normal operating system functions, i.e. it is not sandboxed within the browser. Needless to say, that code can in principle do anything its writer wants with your machine. So yeah, go ahead and drop Firefox support, and while you're at it complain that it's fat and slow, but remember that it was Mozilla who played an important part in promoting compliance with open web standards and distancing the web from the state described above. At least give them some credit for that, or just go ahead and implement your web application in platform-independent, not-quite-ActiveX Flash or Java or something and shut up.
Oh, and a late Happy 20th Birthday, Web, and a Happy 30th Birthday, PC. May you survive all this crap for decades to come and continue to bring us entertainment and prosperity.

Saturday, July 9, 2011

solitaire

The wonderfully idiotic game of Klondike solitaire is ubiquitous; there are countless embarrassing pictures of people playing it instead of doing their job. It seems, however, that some people's job is actually to play Solitaire. Take this article for instance, which I found while terribly bored and wondering whether anyone has come up with a mathematical description of Solitaire and what the odds of winning are. It seems that no, there isn't a complete analysis yet - one of the embarrassments of modern mathematics :)
Before discussing the article any further it must be stated that, while to some it might look funny, a waste of time, or downright idiotic to do a study on Solitaire, it's actually quite serious. While the game itself might be considered silly by some, if studying it can bring advancement in "artificial intelligence" then so be it. Because I'm anti-mainstream I like to actually call it "automated problem-solving", because there's nothing intelligent about it or many other game "AIs", but I digress. Studying Solitaire is as valid a scientific endeavour as studying chess or how to slice a pizza so it splits into equal parts, though this last problem is considerably easier and has recently been solved (searching for the article is left as an exercise to the reader).
Getting back to the Solitaire article, they supposedly worked with a famous mathematician who "carefully played and recorded 2000 games, achieving a win rate of 36.6%". Their software supposedly obtained win rates of up to 70.2%.
So there. There is actually someone whose job is (among many other, more productive things, I'm sure) to play Solitaire. 2000 games at an average of 20 minutes per game, as stated in the article, equals exactly 666.6... hours, oddly enough. Considering an 8-hour work day, that would equal roughly 80 days of playing Solitaire, or about 4 months. Cool thing indeed.

Monday, June 20, 2011

internet

I'm on a long scientific visit at a scientific institute, accommodated at a student guesthouse that's on the institute's scientific network, and I'm not allowed to use the Internet for anything other than scientific purposes. I'm also not allowed to transfer more than 420 MB/day or thereabouts. Their leaflet says they'd let me, but they can't, because their traffic is metered and they're paying by the amount of data transferred. I'm not a furious downloader, especially when I know I'm on a limited network. I just want to relax, read Slashdot and stream a news channel at a few hundred kb/s. But I can't, because the institute's traffic is metered.

P.S. Oh shit this post is not scientific enough I'll be punished! [hides]

Wednesday, June 15, 2011

unfortunate

There's a growing trend around here for newspapers to offer various book collections, probably to try to counteract their plummeting sales or something. Under slogans having to do with investing in one's culture and stuff, they're offering a new book each week for a relatively small price, and you're encouraged to buy them all to own the complete collection. While not bad in itself, this affair gets funny when the competition gets tough and the companies start getting really creative with their advertising. For instance, one collection is currently advertised along these lines: [display frog] For those who read, this is a future prince. For the others, it's just a frog. Yeah, pretty original, until they get to this week's book: The Marquis de Sade. WTF.
On a totally different thread, I've been wanting for a very long time to comment on Dr. Michio Kaku's pet show Sci-Fi Science, which I also find very unfortunate. His take on the possible real-world implementation of various sci-fi memes is sometimes plausible, occasionally interesting, and often outrageous, and the show itself is worth watching if you're otherwise terribly bored. I personally liked him much more when he was making brief appearances in various science shows than as the star of his own project. On to the point:
1. About one in three 'problems' has negative matter as a solution. Enough with the negative matter that falls upwards: it's purely theoretical, and while there are some clues as to its possible existence, for now it's pure speculation.
2. Let's make a sci-fi-style force field/shield, only it's not a force field, it's a matter+laser shield. Reasonable, until the enemy spaceship attacks you with lasers, which your carbon-nanotube-or-whatever-future-material shield supposedly can't withstand. His idea: coat it with "photochromic materials", which change color in response to light exposure, just like those adaptive sunglasses. Those are supposed to absorb the laser. Why on Earth he would use photochromic materials instead of a mirror, or nothing at all, is beyond me. The energy absorbed by the shield with or without said materials is basically the same. It either melts or it doesn't. Maybe it absorbs more so that the laser doesn't pass through the shield and on to the ship? Who knows. That doesn't solve the shield-melting problem though. It just looks like a half-baked hack with a lot of buzztech.
3. He suggests that teleportation might work by digitising a human, memories included, sending the data along a series of laser beams (x-ray laser beams IIRC, for the added coolness and information density) and reconstructing the human at the destination. He openly admits to this resulting in two identical humans, memories and personality included, but states that we've got enough time to work the kinks out of this problem. I find this "brute force" approach questionable and I've got a big problem with it. Would you like to be killed at point A after being copied to point B? There's got to be a more elegant solution. Even if it were possible to clone me with little error and come up with an identical self-aware being, the two of us would be disconnected. The new mind would be a different mind. Sure, to my friends I might look the same and they might never notice, but I sincerely doubt that what I call "me" would be the same. Kill the first body and the second would surely function OK as a new person with my exact same traits, but I strongly doubt that I'll awake inside it as if nothing happened.
And the list goes on.
There are of course the cool solutions such as the self-reconfigurable robot, but many of them are, as I said, unfortunate.
Some might label me pathetic for picking on a low-budget pop-sci entertainment show, but there's a deeper thought here. Some time ago, being a geek was something to be proud of, as geeks could display high levels of skill and competence and, in a larger sense, a greater understanding of the world. Now there's a new breed of geek, the sci-fi/fantasy game geek, who also takes great pride in their passion but often lacks the practical skills and only displays a great understanding of some fantasy world. It can of course be argued that this projects positively into the real world, but generally the fiction geek combines all the disadvantages of being a geek with none of the advantages. Everyone who watched Sci-Fi Science would agree with me.
After all, some people watch a movie or read a book and enjoy it as such, some find parallels with human society, some dream of being princesses and frog-princes, and some dream of being Jedi wielding extendable porous ceramic lightsabres with nuclear batteries, adamantium cooling fans and superconducting plasma confinement magnets. Or something like that. There's nothing wrong with either case, and of course there's always money to be made.

Sunday, April 17, 2011

aliens

Just about every popsci show I've seen in the last few years favors the idea of aliens being everywhere. I'm sick and tired of hearing "reputable scientists" utter the stereotypical sentence: "The Universe is probably teeming with life". Just because a lot of extrasolar planets have been found lately, and alcohol has been found in deep space (along with other organic chemicals such as amino acids), and bacteria can live everywhere on Earth, doesn't mean that life just appears everywhere there's water and heat. And then becomes intelligent. A few decades ago you didn't see this almost-consensus that aliens must exist with high probability. Now everyone's being disgustingly optimistic about "not being alone in the Universe".
There are a lot of problems with the whole extraterrestrial life issue.
First, you can't do statistics on a sample of one. Therefore, you can't scientifically predict anything about life elsewhere. If we were to find life on Mars or Jupiter's moons or wherever, we'd have a sample of 2, which would still be insufficient, but a lot better than 1. To be pedantic, we don't even have a clear, universal definition of life, or a clear method to identify it; we can only speculate that if it existed, it'd be similar to what we see on Earth because the physics and chemistry are the same everywhere.
Second, we are very limited in our capabilities to detect extraterrestrial life. Radio searches have so far yielded nothing, and I personally doubt they ever will, because radio "leakage" from aliens would be too weak to be detectable. The whole "our TV shows have already reached dozens of stars" is bullshit; the signals are too weak to be detectable. A 100-kilowatt radio source on Earth would shine just about 10^-24 watts of power on a generous alien antenna (one square kilometer) placed not very far away (10 light years), for the same reason that distant stars appear so dim to us. If that antenna were connected to an incredibly sensitive receiver, say cooled to a tenth of a degree above absolute zero, the electronic noise due to thermal motion would still be 1000 times stronger. They could probably barely receive Morse code, which takes a lot less bandwidth than human speech. Ramp up the distance to 100 light years and the signal gets another 100 times weaker. So in order for the aliens to hear us, either they'd have to have incredibly advanced receiver technology, or our beam would have to be focused and directed towards them. We're faced with the same problem when trying to detect alien radio signals. We could probably detect a signal if it was being intentionally broadcast, but we haven't yet. This raises another problem: how many times, and for how long, did we transmit such signals into space? Not that many, not that long. Maybe the aliens are doing likewise.
What other means of detection could we use? Take artificial lighting for instance. City lights on Earth can be seen from space, and they have a very distinct spectral signature. Aliens would possibly also use artificial lighting at night, but light has the same problem as radio signals. Maybe if we had a telescope powerful enough to see planets around other stars, we'd be able to detect these hypothetical lights. At least we'd know what to look for, given that aliens would probably see in the same frequency range as we do, because their eyes or whatever they have would adapt to the spectrum of their star, which would be similar to ours. If we watched the planet from the right angle, we could possibly even see these lights turning on in the evening and off in the morning. The total power coming out of our lights is much bigger than the power from our radio transmitters, so things might be the same on their planet. Detecting light is also easier than decoding radio signals.
Another thing that comes to mind is nuclear explosions. Any sufficiently developed aliens would set them off, even if just for testing. They would be strong enough to be detectable, but did we ever see a sudden flash of light near a star? Did we ever look for one? I have no idea.
Finally, contrary to what some may think, there's not a shred of physical evidence that aliens have ever visited us. That might be due to the difficulty of interstellar travel, but nonetheless it hasn't happened.
So yeah, even if aliens exist, they can't get to us and we can't get to them. Relativity, which is pretty much proven to model this world correctly, while not forbidding faster-than-light travel, predicts that such travel would result in grandfather paradoxes and such. We also can't talk to them, because information, however we send it, radio, lasers, X-rays, whatever, only travels so fast. Maybe in a "parallel reality" we could, but not in this one. So there. We are alone.

Wednesday, March 30, 2011

power

Are you deeply concerned with pollution, global warming, nuke accidents and all the shit that comes with modern, power-hungry technology? Are you thinking green, feeling green, living green? Caring about the environment? Celebrating Earth Hour by switching off the lights and burning stuff? Or are you just concerned with your electricity bill, netbook battery, or living in Japan where the power grid's flaky after the quake?
Well, then use a goddamn low-power web browser while you surf for pr0n!