Wednesday, October 10, 2007

anti-patterns

Time to relax (slow down brain clock while browsing Wikipedia articles on simple ubiquitous stuff just to see if something trivially interesting pops up).
Of course, this also leads to reading articles of a more specialized nature.
One of these would be the article on anti-patterns. I find the 'Reinventing the square wheel' list entry the funniest :D
Note the smileycon used as a sentence-terminating punctuation mark.

Anti-patterns are called so because in many (most?) cases they're bad, m'kay?
Not always.

1. Come From and Go To
(listed as 'Spaghetti Code')

goto is useful, and it is indispensable when programming in assembly language. Few processors support 'if', 'for' or 'while' natively; most emulate them with jumps (gotos).
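For illustration, here's a three-iteration loop written the structured way and then hand-lowered to roughly the shape the compiler emits (not actual compiler output, just the idea):

#include <cstdio>

int main() {
    // The structured version...
    for (int i = 0; i < 3; ++i)
        std::printf("for:  %d\n", i);

    // ...and roughly what it lowers to: a test and a couple of jumps.
    int i = 0;
loop_top:
    if (i >= 3)
        goto loop_end;
    std::printf("goto: %d\n", i);
    ++i;
    goto loop_top;
loop_end:
    return 0;
}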

2. Busy Waiting
(spinning in a loop waiting for something to happen/finish instead of doing something else and waiting for a signal)

They taught this with great passion and emphasis in Operating Systems class: no busy waiting; use wait functions that yield control to another thread, and halt the processor if it has nothing to do. Which is obviously reasonable. You don't want a task to spin doing nothing; you want it to sleep and let another one run. If no task has anything useful to do, halt the processor to save power. That can't be anything but good, right?
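For reference, the kind of thing they wanted us to write (a minimal pthreads sketch, all names made up): the consumer sleeps on a condition variable, and the scheduler is free to run somebody else until the producer signals.

#include <pthread.h>
#include <cstdio>

// Shared state lives behind a mutex; the flag says whether there is work.
static pthread_mutex_t mtx  = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  cond = PTHREAD_COND_INITIALIZER;
static bool data_ready = false;

static void* consumer(void*) {
    pthread_mutex_lock(&mtx);
    while (!data_ready)                 // loop guards against spurious wakeups
        pthread_cond_wait(&cond, &mtx); // sleeps here, no CPU burned
    std::printf("consumer: got the signal, doing the work\n");
    pthread_mutex_unlock(&mtx);
    return 0;
}

static void* producer(void*) {
    pthread_mutex_lock(&mtx);
    data_ready = true;
    pthread_mutex_unlock(&mtx);
    pthread_cond_signal(&cond);         // wake exactly one sleeper
    return 0;
}

int main() {
    pthread_t c, p;
    pthread_create(&c, 0, consumer, 0);
    pthread_create(&p, 0, producer, 0);
    pthread_join(c, 0);
    pthread_join(p, 0);
    return 0;
}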

Wrong. Turning the processor on to do useful stuff and then off while waiting for more stuff to do trashes the supply voltage by drawing large current spikes. In an improperly designed (or cheap) system, this can induce audible noise in the speakers, audible noise from power-supply coils (hear them when scrolling text?) and electrical noise leading to poor measurement precision in embedded data-acquisition systems. That's why all my microcontrollers spin and never sleep. Many don't use interrupts at all and poll all peripherals except the timing-critical ones. Waste of power? At some tens of milliwatts, drawn from the mains and not from batteries, I couldn't be made to care. I own and control all tasks, I know how long they (should) take, I schedule them, I own all time and silicon. Mwahaha. I spend 4 hours debugging. I rule :)
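For flavor, the shape of the main loop I mean. The peripheral checks are invented stand-ins for whatever status-register reads the real part has; here they're stubbed out so the sketch compiles anywhere.

// Hypothetical bare-metal superloop: poll everything, never sleep.
static bool uart_byte_available()  { return false; }  // stand-in for a status-register read
static bool adc_conversion_done()  { return false; }
static bool timer_tick_elapsed()   { return false; }
static void handle_uart_byte()     {}
static void handle_adc_sample()    {}
static void run_periodic_tasks()   {}

int main() {
    for (;;) {                      // spin forever, no idle or sleep mode
        if (uart_byte_available())
            handle_uart_byte();
        if (adc_conversion_done())
            handle_adc_sample();
        if (timer_tick_elapsed())   // the closest thing to "scheduling" here
            run_periodic_tasks();
        // No wait-for-interrupt instruction: current draw stays flat,
        // so the supply rail stays quiet.
    }
}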

Also, spinlocks and atomic operations are good, m'kay. And well-documented.
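Case in point, a minimal test-and-set spinlock sketched on GCC's __sync builtins (the wrapper names are mine):

#include <cstdio>

// lock_word is 0 when the lock is free, 1 when somebody holds it.
static int lock_word = 0;

static void spin_lock(int *l) {
    // Atomically write 1 and get the previous value back; keep trying
    // until the previous value was 0, i.e. we are the one who took it.
    while (__sync_lock_test_and_set(l, 1))
        ;  // spin
}

static void spin_unlock(int *l) {
    __sync_lock_release(l);  // writes 0 back with release semantics
}

int main() {
    spin_lock(&lock_word);
    std::printf("inside the critical section\n");
    spin_unlock(&lock_word);
    return 0;
}

Note the irony: the lock itself busy-waits, which is exactly why it's only good for short critical sections.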

3. Raising, catching and ignoring exceptions
(and all sorts of other problems)

Blah! What's wrong with just checking return values?
Why do we need exceptions? Certainly not because of abstraction and object-orientation.
If I ask you to please go take a shit, you can do it and reply with "Done." (return 0, or the more oxymoronic ERR_OK), or you can not do it and reply with "All portable toilets in range have their doors locked." (return ERR_NO_TOILET), or with "No." (return ERR_GENERIC). I can then choose to check or ignore your return value, or just check whether it's null. You can even return a pointer to an error object (assuming the method halts on the first error it encounters; use a list otherwise), though you would need some sort of RTTI to implement various error types, just as you traditionally do with typed exceptions. That can get messy, so you could use a single object type for all errors and fit all the necessary info in there, or make it polymorphic, or something.

Anyway, what's so wrong and unobjectful about switching on an error_type field in your returned error object, versus writing the same number of catches for each possible exception object type you might catch? Don't want to check the function (excuse me, method) return error type after each call? goto a label placed after the group of calls you would otherwise wrap in a try, and do the checking there. That's what the compiler assembles anyway.

But no. Passing objects by return is not enough. We also need to throw and catch them through some abstract, exceptional aether. There. Fuck exceptions**.
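To be concrete, here's the above sketched in C++, with everything invented except the shape of it: one error object type, an error_type field, a switch instead of a block of catches, and a goto placed after the group of calls.

#include <cstdio>

enum error_type { ERR_OK = 0, ERR_NO_TOILET, ERR_GENERIC };

struct error {
    error_type  type;
    const char* message;  // one object type, all the necessary info inside
};

static error take_a_shit(bool toilet_available) {
    error e;
    if (!toilet_available) {
        e.type = ERR_NO_TOILET;
        e.message = "All portable toilets in range have their doors locked.";
    } else {
        e.type = ERR_OK;
        e.message = "Done.";
    }
    return e;
}

int main() {
    // goto a label after the group of calls you would otherwise wrap in a try:
    error e = take_a_shit(true);
    if (e.type != ERR_OK) goto handle_error;

    e = take_a_shit(false);
    if (e.type != ERR_OK) goto handle_error;

    std::printf("all calls succeeded\n");
    return 0;

handle_error:
    // The same amount of work as a block of catches, minus the aether.
    switch (e.type) {
    case ERR_NO_TOILET: std::printf("error: %s\n", e.message);   break;
    case ERR_GENERIC:   std::printf("error: no reason given\n"); break;
    default:            std::printf("error: unknown\n");         break;
    }
    return 1;
}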

4. Magic numbers

Examples:
0xdeadbeef, 0xbaadf00d, and combinations thereof. Rotten.
0xcafebabe. Sexy.
0xfee15bad. It does. Fuck 1337*.
MZ. Sounds tough.

*). I remember when I was young and dreaming of optimal ways to represent text on 7-segment displays.

Bad examples:
for i from 13 to 69 do stuff with i involving 833 and 647.

My examples:
clock_divisor = 47; // because I say so, dammit! MY code, my clock! See (2).
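And the usual cure, sketched on the bad example above. The constant names are pure invention, which is exactly the point: the reader finally learns what 13, 69, 833 and 647 were supposed to mean.

#include <cstdio>

// Hypothetical names for the formerly magic numbers.
const int FIRST_RECORD_ID = 13;
const int LAST_RECORD_ID  = 69;
const int CHECKSUM_SEED   = 833;
const int PROTOCOL_MAGIC  = 647;

// Stand-in for "do stuff with i involving 833 and 647".
static void process(int record, int seed, int magic) {
    std::printf("record %d, seed %d, magic %d\n", record, seed, magic);
}

int main() {
    for (int i = FIRST_RECORD_ID; i <= LAST_RECORD_ID; ++i)
        process(i, CHECKSUM_SEED, PROTOCOL_MAGIC);
    return 0;
}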

And the winner is:
123. Loop-switch sequence

for i from 1 to 3
    if i is 1 do this
    if i is 2 do that
    if i is 3 do some other stuff
end for (fuck autounindentation)

Funny :D
Seen that in programming books. Bad, m'kay?
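Spelled out in C++ (with invented do-this/do-that functions), the anti-pattern next to the obvious fix:

#include <cstdio>

static void do_this()             { std::printf("this\n"); }
static void do_that()             { std::printf("that\n"); }
static void do_some_other_stuff() { std::printf("some other stuff\n"); }

int main() {
    // The anti-pattern: a loop pretending three fixed steps are data-driven.
    for (int i = 1; i <= 3; ++i) {
        switch (i) {
        case 1: do_this();             break;
        case 2: do_that();             break;
        case 3: do_some_other_stuff(); break;
        }
    }

    // The fix: the steps are known when the code is written, so just write them.
    do_this();
    do_that();
    do_some_other_stuff();
    return 0;
}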

**). Regarding abstracted and formalized exception handling, another reason why I tend to look down on object-orientation fanaticism is that everything is defined backwards. If I have a computer and it contains a 3D-accelerated graphics card, that's a subassembly. The GPU per se is a subassembly of that card. But no: in OOP, if I derive a class from another and add functionality, that's a subclass, and its parent is of course a superclass.*** And if I choose to draw that hierarchy in UML, all the arrows have funny heads and generally point backwards. Of course, the inheritors sit under (sub-) their parent, which is above (super-). So 'subclass' is not about its ability to substitute for its superclass. It's about counterintuitive back-arrows. As you might have guessed, you actually cannot, in general, substitute objects of a derived class in place of objects of the base class (even though inheritance is supposed to model the "sub is a super" relationship), and that's called the circle-ellipse or square-rectangle problem (with square pronounced skwaah****, like Cartman: "So I kicked 'm skwaah- in the nuts!"). Final solution suggested in the linked article: change the paradigm. A sketch of the square-rectangle trap follows the footnotes below.

***). Too bad you don't have #defines in Java. Then you could #define ultra super*****
****). skwåh?
*****). Credits: Wacky.
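Since footnote ** brings it up, here's the square-rectangle trap in miniature (class names and the resize() helper are invented for illustration):

#include <cassert>

class Rectangle {
public:
    Rectangle() : width(0), height(0) {}
    virtual ~Rectangle() {}
    virtual void set_width(int w)  { width = w; }
    virtual void set_height(int h) { height = h; }
    int area() const { return width * height; }
protected:
    int width, height;
};

// "A square is a rectangle", says geometry, so we subclass...
class Square : public Rectangle {
public:
    // ...but to stay square, each setter has to touch both sides.
    virtual void set_width(int w)  { width = height = w; }
    virtual void set_height(int h) { width = height = h; }
};

// Written against Rectangle, and perfectly reasonable for one.
static void resize(Rectangle& r) {
    r.set_width(5);
    r.set_height(4);
    assert(r.area() == 20);  // fails when r is actually a Square (area is 16)
}

int main() {
    Rectangle rect;
    resize(rect);   // fine

    Square sq;
    resize(sq);     // the "sub" cannot stand in for the "super": assert fires
    return 0;
}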
