Last night I had an awesome, disturbing dream. It began with me waking up and realizing I was dead. I was some kind of ghost that people couldn't see, but those close to me could hear and maybe touch me. I, on the other hand, experienced the world as usual -- no walking through walls, no flying, no mind-reading -- it felt the same. Once I realized I could talk to people, I asked them what was going on, and everybody seemed to agree that I was an invisible ghost now. I assured them I had seen neither hell nor heaven, and asked them what they had done with my body.
"Oh, we just called the disposal team and had it incinerated; we thought that's what you would've liked us to do."
"Oh, yes, certainly, good job."
(That's got to be the greatest piece of dialogue I've ever experienced.)
Thoughts of unfinished business and unfulfilled promises raced through my mind, but they were quickly quenched by the realization that I could ask people to do said business for me. Still, the fear remained that I would soon be unable to communicate with those I'd left behind.
After a while I fell asleep, then woke up again, and glitches started happening, like objects disappearing. I thought maybe I was decaying. It was very short-lived. Things started looking and feeling normal again. Actually it felt so normal that, with no dead body available, I was starting to doubt that I was a ghost. So I challenged the people to shine a laser at a wall. I put my hand in the laser beam and, sure enough, the spot on the wall disappeared. But the people told me that no, the spot was right there on the wall, unobstructed. I then told them to point a digital camera at the wall. Surely a system so complicated would separate my possible illusion from theirs, and most likely record the spot on the wall. But it didn't happen. It was just too complicated to dream. I woke up.
I've been trying to decipher this dream to no avail. Maybe I've been reborn or something. Maybe I've just been feeling weird. The Universe has been screwing with me lately, and it's been wonderful.
So remember, don't drink and sleep, and when in doubt, fucking lasers yeah.
Monday, January 23, 2012
Saturday, November 5, 2011
heating
I hear this annoying argument all the time: I live in a cold climate so my {home server, tube amp, idle set-top box, microwave oven display} doesn't waste electricity, it just helps keep my house warm.
I understand that resistive space heaters are popular in the U.S., but in the rest of the world some people like heat pumps better. The heat output from resistive space heaters, microprocessors and all other electronics is equal to the electrical energy drawn from the grid. The heat output from a heat pump-based device is higher than the electrical energy input, the difference coming from cooling the outside air. Just like air conditioning in the summer, but in reverse. So there, your inefficient technology is not only helping keep your home warm, it's also wasting electricity that could otherwise be used to heat it more efficiently.
Saturday, October 1, 2011
oops
I found this nice, classic "beware of macros" bug:
#define MAX(a, b) ((a) > (b) ? (a) : (b))
[ ... ]
int x = MAX(foo, bar++); // incremented either once or twice
Oops.
Saturday, September 24, 2011
supercrapinal
Scientists studying neutrino oscillations have recently managed to screw up their measurements and come up with a speed for the particles that's higher than the speed of light. For those of you not familiar with particle physics, think of neutrinos as very light electrons without the electrical charge. Having no electrical charge, they seldom interact with matter and can pass through thousands of miles of rock undisturbed. So they generate the neutrinos at CERN and measure the time it takes for them to arrive in Italy. Of course, according to all we know so far and to countless experiments, no particle travels faster than light.
This paper, whose authors could easily fill up a large bus or maybe a small train, details their method. The paper is properly written and simply presents their method and results; it makes no extraordinary claims, but rather states that the authors have yet to identify any further sources of error and are searching for them. Of course the media have picked up the story. The respectable outlets maintain the cautious tone, but several are trying to make this into a sensationalist story: "Einstein's Relativity could be wrong". No. The chances of Relativity being wrong are extremely slim. The chances that an undetected error screwed up the measurements are overwhelming.
The time measurement and calculation error from known sources is about 10 nanoseconds, and the particles have supposedly arrived 60 ns too early. Given the complicated system used to do the measurements, whose systematic (fixed) errors that had to be calibrated out are much higher than that, it's very likely that this result is due to an equipment problem or a mistake. For example, does the circuitry always have the same delay? Does that 8 km-long optical fiber always have the same delay? Are those delays during the experiment equal to the ones measured while not doing the experiment?
60 ns is a pretty long time, given that one light-nanosecond is about 30 cm. But it could easily arise from faulty equipment or bad measurement techniques. I'm sure that in the following weeks or months these errors will be identified. Until then, supernova measurements indicate that neutrinos arrive at the same time as light, not earlier. If they were really going faster, then the signal measured from a supernova that's thousands of light-years away would precede the optical telescope detection by days or weeks, which we're not seeing. So neutrinos don't travel faster than light. One might think that there's a small region, near the area where the high-energy reaction that creates the neutrinos takes place, where some extraordinary phenomenon might be happening and causing the effect - once they leave that area they behave normally. This is also highly unlikely. It's true that Relativity has to be adjusted at small scales and high energies, but those scales are very far off, and a space-time distortion large enough to cause 60 ns or 18 meters of error is surely going to be noticed and cause some very weird shit, so to speak.
So there. Calm down and wait for that defective satellite to fall on your heads tomorrow.
Tuesday, August 16, 2011
web
The discussion on this article on Slashdot caught my attention. The article is complaining about the direction that Firefox development is taking, and many comments go along those lines. I won't discuss those comments and I won't discuss said development policies further than saying that one has every right to argue against those decisions in a civilized manner.
What pisses me off is people arguing in large numbers for web developers to drop Firefox support because of the new rapid-release cycle. I've noticed a pattern of people complaining that dealing with bug reports and generally supporting their web applications running on a large number of Firefox versions is painful and will cause them to drop Firefox support altogether.
Let me first say that this is a precise illustration of some of the many reasons for which I hate web developers, web programmers, Web 2.0 proponents and all the other web-*. With a few exceptions of course.
Furthermore, let me say that if your so-called application wasn't a monstrous, attack-prone, inefficient, poorly-documented, hastily-written-on-the-back-of-a-napkin-over-dunch, non-standards-compliant, unclear-specification-bending, latest-buzzstuff-incorporating, user-agent-string-dependent piece of spaghetti-code seasoned to taste with data, metadata and metacode, then maybe it would display OK and function OK in all reasonably recent versions of all mainstream web-browsers. I've worked (not a pleasure believe me!) with what I'm told are crappy CMS platforms and even those function reasonably well even in IE6, let alone Firefox, Chrome, Opera and that thing that Macs are using.
Now, even if your application isn't a monstrous, inefficient, undocumented, poorly-designed, badly written piece of html-css-js-xml-php-sql-jpeg-mashed-potatoes, that's still no excuse for being plain lazy with customer support.
But why so much venom? Why do I care? Because I'm forced to work with products that exhibit the above-mentioned characteristics on a regular basis, and that's costing me time and energy. I don't generally hate people who write bad applications, and I can always write applications that run on my PC if I don't like the ones that are around - at least theoretically I have that freedom - but I can't do that with someone else's web applications. It's when I'm forced to use some kid's late-night, inebriated brain-vomit that he calls a web platform that I get pissed off, and I want to use this opportunity to say: I sincerely hate you. At the same time, I fully understand that somebody has to collect my garbage and somebody has to unclog my sewer, so you've got all my respect, and thank God you're not programming critical infrastructure, I hope.
Getting back to the matter, I remember when I was a kid I came across a code sample from Microsoft on how to "sign"-or-something your ActiveX control that you wrote in Visual Basic, so that Internet Explorer wouldn't warn the user that an ActiveX control is going to execute from the web page they're visiting - with the default IE settings at least. Yeah. ActiveX controls were IE's non-standards-compliant way of providing a 'rich web experience' or whatever it was called back then, and many businesses developed 'web applications' around said technology. Sadly, ActiveX controls are not web applications; they're normal applications that contain normal machine code that's directly executable by the processor of the client computer and has access to all normal operating system functions, i.e. they are not sandboxed within the browser. Needless to say, that code can in principle do anything its writer wants with your machine. So yeah, go ahead and don't support Firefox, and while you're at it complain that it's fat and slow, but remember that it's Mozilla who played an important part in promoting compliance with open web standards and distancing the web from the state described above. At least give them some credit for that, or just go ahead and implement your web application in platform-independent, not-quite-ActiveX Flash or Java or something and shut up.
Oh, and a late Happy 20th Birthday, Web, and a Happy 30th Birthday, PC. May you survive all this crap for decades to come and continue to bring us entertainment and prosperity.
Saturday, July 9, 2011
solitaire
The wonderfully idiotic game of Klondike solitaire is ubiquitous; there are countless embarrassing pictures of people playing it instead of doing their jobs. It seems, however, that some people's job is actually to play Solitaire. Take this article for instance, which I found while terribly bored and wondering whether anyone had come up with a mathematical description of Solitaire, and what the odds of winning are. It seems that no, there isn't a complete analysis yet - one of the embarrassments of modern mathematics :)
Before discussing the article any further it must be stated that, while to some it might look funny, a waste of time, or downright idiotic to do a study on Solitaire, it's actually quite serious. While the game itself might be considered silly by some, if studying it can bring advancement in "artificial intelligence" then so be it. Because I'm anti-mainstream I like to actually call it "automated problem-solving", because there's nothing intelligent about it or many other game "AIs", but I digress. Studying Solitaire is as valid a scientific endeavour as studying chess or how to slice a pizza so it splits into equal parts, though this last problem is considerably easier and has been recently solved (searching for the article is left as an exercise to the reader).
Getting back to the Solitaire article, they supposedly worked with a famous mathematician who "carefully played and recorded 2000 games, achieving a win rate of 36.6%". Their software supposedly obtained win rates of up to 70.2%.
So there. There is actually someone whose job is (among many other more productive things, I'm sure) to play Solitaire. 2000 games at an average of 20 minutes per game, as stated in the article, equals exactly 666.6... hours, oddly enough. Considering an 8-hour work day, that would equal roughly 80 days of playing Solitaire, or about 4 months. Cool thing indeed.
Monday, June 20, 2011
internet
I'm on a long scientific visit at a scientific institute, accommodated at a student guesthouse that's on the institute's scientific network, and I'm not allowed to use the Internet for anything other than scientific purposes. I'm also not allowed to transfer more than 420MB/day or thereabouts. Their leaflet says they'd let me, but they can't, because their traffic is metered and they're paying by the amount of data transferred. I'm not a furious downloader, especially when I know I'm on a limited network. I just want to relax, read Slashdot and stream a news channel at a few hundred kb/s. But I can't, because the institute's traffic is metered.
P.S. Oh shit this post is not scientific enough I'll be punished! [hides]