The Bedroom Programmer: Cycle over?

“Procrastination is the art of keeping up with yesterday.”
– Don Marquis

It seems only reasonable that the first (and possibly only, I’m really not doing well with this blog lark recently) post of 2014 consists of some editorial that leads to my latest set of excuses as to why my iPhone game is still not completed. And you know what they say about excuses: your friends don’t need them and nobody else cares. This way, though, we can all look back and laugh later in the year when my new set of plans predictably turn out to be massively overoptimistic due to a combination of work, children and a desire to be in the garden burning sausages and burgers rather than doing level design. But depressing realism aside, let’s kick off with what everybody has been waiting for: 2014’s first piece of cobra-art!

Peppa, George and a blue moon. My latest schedule.

Look! Pigs flying! Blue moon! Daughter-friendly artwork! Honestly, it’s three levels of win and you know it. Plus it’s the best indication of the conditions required for my game to be released.

Having successfully sneaked into a job in the games industry in the early 90s, my first three published games were pretty much entirely programmed by me. Largely, I did the design work, too (although goodness only knows how obscurely rubbish they’d have been without the input of those considerably more experienced than me). Needless to say, though, I didn’t draw the graphics. They were Amiga games and it was amazing to be in a position to be the whole programming team. There’s an incredible satisfaction from knowing that you did it. It was yours. You were not “programmer 5” on some large team, three other people hadn’t told you what to do, you’d architected the system, solved all the problems and delivered the tens of thousands of lines of lovingly crafted 68000 assembly language code that made it happen. Then, it is actually shipped: it’s a real box, on real shelves and you get both the horror and joy of reading reviews in magazines (that’s right, I’m that old, it was magazines back then). It’s a wow factor and express learning experience that largely faded out as the 90s dragged on. As cost and complexity of development grew, self-publishing became almost impossible without a massive marketing budget and bar the odd surprise, the “indie game developer” faded into mythology: exaggerated drunken stories told in the bar at conferences like GDC and E3.

Then something amazing happened. The iPhone appeared.

Suddenly, one programmer could repeat what happened in the 80s: write a game. From scratch. And publish it. And that’s without the need to sell one’s soul, bend over backwards, yield to other opinions or have to listen to ill-considered focus groups1, wear animal noses2 or pack products into four magical marketing boxes3. And that enabled innovation on truly magnificent levels. Whole new gaming genres, wonderfully endearing ideas and genuinely different things. Pretty things. Things that the massive inertia of the larger companies generally prevents by beating them to death with procedures (or “risk management” as I’ve heard it called) — I suspect most bright-eyed designers who’ve entered a large gaming company have watched sadly and with growing despair as each and every idea they have is gradually diluted through every step of the green-light committees until, mysteriously, the cup of joy is empty.

The nimble adapt to the changing world

This last five years or so has been an extraordinary repeat of something few thought would ever happen again — the bedroom programmer — and, whatever you think of Apple, we largely have them to thank for it happening so soon and with such success. Sure, someone else would have managed it eventually, but Apple made it easy. Made it enjoyable. Made it so that one person could do it. Without a team. Without a huge budget. Hell, I’d kiss each and every one of them for what they did, with tongues. Their efforts created an unexpected easy publisher-free route to market and shone a deserved light on the wonderful achievements of small independent developers, on all platforms, that hadn’t really been seen since the 80s. This opportunity has been a mini gaming cambrian explosion: in come minds untarnished by what the big corporates believe is “the right way” and out of those minds come strange new wonders.

The world has clearly changed dramatically. We carry our gaming, music, video, networking and telecommunications device around with us and it is just that: a single device, not an assortment of devices as it was just a few years ago. It’s transformations of this magnitude that make Nintendo’s Wii-U and 3DS feel like totally baffling products… I can’t help but wonder if they’d be better off as a software company delivering their unique gameplay experiences to platforms whose owners have a better grasp of the changing world than they do. It would seem that the whole traditional console business is at risk because today’s connected gaming habits have simply overtaken it: adding multiple screens and attempting to spread beyond a device that plays games and does it well seems like a foolish step into a future that won’t take kindly to second best.

Microsoft seem oddly detached from this change too: their obsession with heavyweight, high-level languages and frameworks that are unfriendly to performance-per-watt makes no sense for mobile devices until battery technology makes a good few leaps forwards. Perhaps this is why C++ is now “in” again at Castle Microsoft. Then there’s the device-in-the-living-room XBone:

  • The name. “Ecks-bee-one” sounds like XBOX 1, not XBOX 3, which technically it is. It is an odd confusion for them to inflict on themselves and their users to have a new device that is pronounced the same way as the original
  • The Kinect. As cool as it is, and it is cool, the forced purchase and the not-quite-always-on camera feels a little creepy. The same sort of creepy loved by “Smart” TV manufacturers (Samsung and LG, I’m looking at you, because you’re probably looking at me, somehow)
  • What is it? The big media box in the living room was a battle that was fought, lost and won by nobody. The living room’s focus has shifted from the ‘ultimate set top box’ to everyone’s individual screens and whilst a games box is easy to understand, an über do-it-all media box is less understandable to today’s living room inhabitants

Still, everyone’s an expert, and as fun as armchair generalling is, it’s always done with less than half the story and considerably less than half the expertise. Plus I’m digressing, but I’m mad at Microsoft at the moment for their outrageous Visual Studio pricing strategy: my reward for buying version after version of Visual Studio (since version 4) is, it appears, nothing at all (cue stupid sexy Flanders animation). I want C++, not C# and a whole pile of other junk, and I certainly don’t feel that over £500 is a reasonable price for a compiler. But as Joel Spolsky so elegantly explained a decade ago, pricing is both variable and a black art and I’d assert that independent small developers sit at the worst possible place on the pricing curve because they have the least amount of leverage. Developers-developers-developers my arse.

No, I am not missing the irony of not getting to the point

But windows of opportunity like this don’t remain open forever. Autumn comes. Then it closes gradually until eventually it’s shut. Curtains drawn, it’s all over until the next time that disruptive new technologies, products and processes enable the individual to leave the incumbents dazed, confused and stationary. And that window is closing now. Taking a new game from thin air to on-the-market as an individual is getting harder every day and soon it’ll be pretty much impossible for all but the exceptionally gifted or lucky few. The big boys are drowning out the small guys with massive marketing budgets and, more often than not, quantity over quality. Models like Freemium are sucking the will to live out of developers who find it almost impossible to get right and bloody expensive to support. Besides which, it’s too easy to fall onto the cynical dark side and end up spending more effort on figuring out how to scam your users than delivering any actual gameplay.

Conspiring alongside business models and crowded markets are the platforms themselves. They’re rapidly growing in complexity, they’re fragmenting, the developer tools are overwhelming to the beginner, the hardware requires considerably more knowledge to get anything useful out of it and the whole process of bringing a product to market is tougher. It’s hard to spot a needle in a haystack and the world’s various app-stores are increasingly massive haystacks. Effective social media use goes a long way, but it’s a crowded market and incredibly hard work: unless you’re coming to it with something unbelievably special it’s going to be a real challenge to gain marketing traction without a budget or a publisher. To put it another way, the 80s are rolling into the 90s again but two decades further down the line – it’s a cycle, and it’s currently cycling.

No, there’s not always tomorrow

So if you have dreams of being an indie developer and publishing a mobile game of your own, now is the time to do it. Before it is too late. And it soon will be for all but the very fortunate few. Find the time. Make it happen. You’ve probably got, oh, a year, two years tops, but set your expectations low: you’re more likely to find a pot of gold in your back yard than you are to make a mint from releasing a game as an independent developer. In case you can’t read between the lines: do it for the fun and have fun doing it or you’re almost certainly wasting your time. I, like most with such indie ideas, am a steaming hypocrite because I’ve been idly working on my iPhone game for years finding one reason or another why I can’t spare ten minutes here and there to make some progress. The one area I have made progress with is my procrastination skills: they’re second to none. However, the acute awareness that the window of opportunity is slimmer than the population of flatland is beginning to weigh on my mind. There’s a dream here, and I don’t want it to be merely a dream.

As the author of this tweet knows very well, it is at least a trillion times easier to say this than it is to do. I STILL can’t work out if the reply is sarcasm of such subtlety that it is beyond my comprehension skills.

Remember: if you’re not seeking fulfilment of your own dreams, you’re probably being paid to help someone else achieve theirs (if you’re really lucky, as I have been on three separate occasions, your employer’s dreams happen to align with yours, but let’s face it, that’s rare). Life, despite what the hopeful believe, is a one-shot wonder. Once you’re done, your atoms return to the stars and all you have at the end are your collective memories of what you did. It’s an old saying, but it’s true: nobody ever lay on their deathbed and said “I wish I’d spent more time at work helping someone else get rich by doing stuff I hate”. What they regret is not spending enough time with their friends, families or pursuing those wonderful ideas and curiosities that poke at one’s mind — that feeling that one simply didn’t stop and smell the roses often enough. And life is full of roses, unless you deliberately close your eyes to them.

My desire to publish an iPad/iPhone game or two is not because I think I’ll be driving over the horizon in my Aston Martin with an evil laugh echoing from between sacks of money, but because I just want to know that I’ve done it. It’s one item on a checklist that’s longer than Mr Tickle’s arms. That’s right, both of them. Besides which, I’m planning on releasing the first game for free (as in “real free”, not “ad-supported privacy sucking ‘free'”), so if it breaks even it will be because I have twisted the fundamental laws of the universe itself. At this stage, this is purely an exercise in laying solid foundations: just so I know I can still do this stuff and because it builds the technology I require to… go further.

Oh, there will be other windows, but they’ll be different, and maybe I’ll not fancy being part of them or maybe they’ll simply be beyond where my skills lie. Who knows what the next opportunity will be: perhaps it’ll be related to 3D printers, or maybe Google Glass and the Oculus Rift VR headset offer tantalising glimpses into an incredibly exciting future of human-computer-interaction that, finally, doesn’t involve keyboards, mice or any other ridiculously computer friendly junk. Maybe the new era of playing nice with the intelligent part of the equation, the human, is closer than we might have dreamed. But it’s hard to imagine how I’d fit into all this because nobody knows how it’ll pan out. Glass and VR headsets are all fine and dandy, but the killer-device, the real disruptive market-changer is, in my opinion, neither: let’s face it, nobody really wants to wear that stuff. Slap screens on contact lenses and then we’re talking: it’ll transform augmented reality, reduce the size of VR setups to just a pile of sensors and have the added advantage of stopping Glass type product users from walking around looking like robots with “rob me” stickers on their backs. Besides which, with a cycle interval of a couple of decades or so, my time is running out; perhaps allowing this one to slam on my fingers would be a thing of great regret unless by some stroke of good fortune bottom-up, biologically inspired simulations roll up as the only way of managing the future’s insane software complexity. (As an “interesting” footnote, I find it interesting that in this article, which discusses this generation’s “Biggest Disruption”, nobody mentions that it will be the approach to software design and implementation.)

And now for the implausible new plan!

I’ve been sitting on game designs now since before the baby Cobra arrived (the first one) and the game idea that I really like right now is the one that requires the least amount of artwork. It’s the one that relies on emergence more than the others, it’s the one where endearing gameplay falls out of a simple underlying model. It’s kinda groovy in a “not been done before” sort of way, I think.

However, the road to failure is littered with half-finished things and before giraffes grace your device’s screen (yes, giraffes! It’ll be so cool!) there’s something I need to finish. Game number 1, therefore, is now engine-complete. Here’s the Table O’ Progress:

Estimated total project duration: 4 months with a month’s contingency
Start date: March 2010
Total progress so far: Twelve weeks
Total progress since last update: Nine weeks! (75% complete!)
Estimated completion date: Maybe even this year!

That’s right, folks, how about that? Four years down the line, I’m finally code complete. I’ve got no known bugs (but then again, my QA department is my eldest daughter and she enjoys the attract screen much more than the game itself). I’m down to some graphical stuff, some audio and level design. It’ll be the latter of those that drags on, but (and you’d better screenshot this because I’ll be back in a few weeks to “edit history” in my favour) my current aim is to publish on the 14th July 2014. And it’s going to be free, mostly because that’s probably what it’s worth, but I’ll let self-entitled Internet reviewers be the judge of that… which gives me ten or so weeks to fit a couple of weeks of work and some level design, visuals and audio in.

If you fancy being a beta tester and you have any iOS Retina device, drop me a line. Given my previous success in predicting progress, it could well be the only chance to play my game. Ever.


1 “Yes, we’ve selected people from the whole demographic that we believe will play your game”. Sitting behind the one-way mirror watching their baffling selection of L.A. inhabitants look at the computers muttering things like “what’s this?” whilst holding a mouse would have suggested otherwise.

2 This actually happened. Apparently it would “get us in the mood” to be creative. It didn’t.

3 And so did this. Apparently, everything fitted into these four magical boxes, including our product. Worst. Trip. Ever. I tried to drink enough to forget, but memories of it still burn my eyes.


Book Club episode II: Wool and the art of software documentation

“Software and cathedrals are much the same – first we build them, then we pray.”
– Sam Redwine, 4th International Software Process Workshop, 1988

Occasionally, but just occasionally, you end up reading a book that is so good that you can’t stop talking about it to all of your friends. It weaves a story of such magnificence that you’re on the verge of stopping perfect strangers in the street in order to check if they’ve had the pleasure yet. Wool, by Hugh Howey, is one of these books.

Honestly, my pictures just get worse and worse. Here’s a snake who looks good out of one eye, but pretty skeletal out of the other. I wonder why. But “I’ve said too much already”, etc.

I finally caved during the summer of 2012 to bring a welcome halt to the non-stop nagging I’d been receiving from a friend. I’ll just say this: of all the books I’ve ever been talked into reading, Wool is – by a Norfolk country mile – the best call and I’ve been talked into some crackers. Almost every nook and cranny of the Internet has something about Wool these days, but in the highly unlikely event that you’ve not heard of it or have not had the pleasure yet, you should do so. You’ll not regret it. But let me jump ahead of myself briefly: my friend introduced it as “It’s set in future. People living underground. Lots of strict rules. One in, one out. But I’ve said too much already.” And I tell you, if there’s one phrase that you hear a lot from people trying to talk you into reading Wool, it’s “I’ve said too much already”. Of course, once you’ve read it, you’ll understand just how right this is: don’t do too much research because you seriously don’t want this story spoiled. Resist the temptation. Don’t let your curiosity tarnish something really, really special: incredible depth that just keeps on giving, page after page.

Lots of books, including bloody amazing ones like last time’s book club star, the Pillars of the Earth, reveal their setting’s overall scope at the outset. Very early on, sometimes even from the back of the book alone, you know the entire theatre in which the story will take place. The plot, of course, winds, twists and turns but massive surprise revelations that transform the story’s universe don’t happen. Wool, though, well, words are, as you’ve probably noticed, failing me. It is a book of nested Doctor Who TARDISes that you simply don’t see coming even when you think you do. Take the first half of the opening sentence: “The children were playing while Holston climbed to his death”. This practically grabs you by the balls and begs you to turn the page. Why did he die? Was it an accident? Is he about to be murdered? Who did it? Does he know it is coming? Tell me! TELL ME! Wool positively infects you with curiosity. The words vanish silently into pages as they magically turn in front of you and the story expands in ways you could not have imagined.

Wool: achieving unputdownable status without the need to resort to new branches of physics. Picture credit, The Great Architect, b3ta

Every time I thought I had figured out what was going on, I was wrong, and some other delightfully unexpected part to the story unfolded. As each delicious piece exposes itself you’re so involved that you barely get a moment to marvel at just how neatly it all fits together. I don’t know how much of the story Hugh had figured out when he set off with that initial small seed of Wool but if it falls anywhere near ‘not that much of it’ then he has achieved what story writers across the globe have consistently failed to do: layer a forest onto a seed without breaking either. Hugh, in the unlikely event that you ever read my “review” I will happily buy a hat just to take it off to you and send you a picture to prove it. Hell, you can even choose the hat.

If you are a fan of science fiction or simply love stories of great depth that explode gorgeously in unpredictable ways then your life is not complete unless you have read Wool: but, again, heed this one teensy smidge of advice – it is very important you know nothing more than the back cover and frankly, not even that (Random House’s 2013 print says too much, in my opinion, so if you buy it, treat the back cover as though it was the final page – a secret from yourself). You’ll feel richer having read it. If nothing else it will reinforce any suspicions that you may have that IT departments rule the world.

But I’ve said too much already.

Code beautiful: segue warning

Since I’m trying to win this year’s “shit segue” award, I figured I’d try and link this into software development somehow. With my recent post on dishwashers “seamlessly” transitioning into a rant on user-interfaces, I thought I had the award secured – albeit by the skin of my teeth – but this one is a sure winner and shows my awesome potential as a DJ. Wool is a special breed of book: it reads by magic. Your eyes look at pages full of words, but that’s not what you see. What you see is pictures, people, emotions. You’re there. The words somehow lift off the paper and end up forming video in your mind. This makes it an effortless exercise in enjoyment: you’re free to sit back and watch a movie, effectively, whilst holding an actual book with pages that simply reads itself. Few authors can do this and Hugh, to the incredible joy of his bank manager, I’m sure, is clearly one of them.

I knew Wool was special when I read it, but I knew it was really special when one day, Mrs Cobras said “can we go upstairs early and read?”. Way-hey! I thought, clothes falling from my body as I rushed upstairs to the bedroom. Imagine the disappointment when it turns out that this actually meant “can we go upstairs early and read?”. Thus, should I ever meet Hugh Howey, as well as taking my hat off to him, I will buy him a beer, but I specifically won’t enjoy doing it. I just feel I ought to. In a Ned Flanders “good host” sort of way.

Given that this was the first book of the many that I’ve tried to palm off on her that she’s actually read, this represents a major gold star for Wool. She picked every single moment that she could to read a page or two and was utterly hooked. Between that and another friend who sent me three texts (and two e-mails) to say how wonderful Wool was – to make sure I knew that the recommendation was appreciated – it’s clear that like rocking horse shit, clean modern trains in the UK, unicorn horns and good customer service from British Telecom, Wool is not something that you find lying around often.

But Hugh isn’t paying me to kiss his arse, so how does this link into software?

Love is shown in the details

The difference between good and great is the detail. Of course, that can pretty much be said about any place on the scale: the difference between outstandingly shite and vaguely bearable is also in the attention to detail shown. Take these two recent examples from Cambridge railway station:

This oozes lack of quality

I don’t know where to begin. I could make a gallery out of these things. Often, trains consist of “8 coa” and are going to “King” and have first class carriages in “carriage 1 &”. For it to have been this way for over a year is amazing, presumably because the job went to the lowest bidder, who maximised their profit margins by doing a half-arsed job. I’m sure the irony of this is lost on Greater Anglia because they couldn’t care less1.

The lack of detail and attention to quality is of course amazing at first glance until you realise that this is the British Railway system. I am sure that the likes of ATOC, the official turd polishers of the train operating companies, would explain how I am totally wrong in the conclusions that I am drawing here, but humour me anyway. Firstly, these screens are simply not designed to prioritise critical information. If you are in a hurry, often you have to wait for bloody ages to see the piece of information that you require. Secondly, they are poorly laid out: too much detail where there should not be detail and not enough detail where there needs to be detail. Finally, and most unforgivably, the people who wrote the software did such a fucking piss-poor job that it does not even match the width of the screen. Everything that is important about the special notice is not shown. This is the case with all the data on this screen. And it has been this way for over a year, thus showing the spectacular lack of pride that everyone involved has in the “service” that they provide. Mere words could not convey how much joy the virtual elimination of commuting has delivered to my life: I die a little bit inside each day I have to give those bum-stands any of my money2.

One of the many things that really, really steams my mirrors is a lack of attention to detail. It shows a messy, untidy mind and just stinks of lack of quality. It also has the mild smell of being done by someone who simply doesn’t care and has no pride in the final delivery. Having worked in software for the best part of a quarter of a century I have had enough opportunities to see such rubbish to write a book on the subject of poorly written code (to be fair, if I step off my moral high-ground briefly, I perpetrated some of that code in my early career). However, each and every page written would raise my blood pressure and I would be dead before I had drafted the first chapter. Instead, I shall bitch and moan in this brief blog post and then go straight to the wine rack and calm myself with a nice castle nine popes3.

Perhaps I just have impossibly high standards or maybe I have been burnt too often to keep making the same mistakes, but I feel that software is not just for Christmas, it is for life. We live in a disposable society and whilst there are signs of that changing, software seems to have taken very well to “knocking it out and throwing it away” as an actual, genuine process: particularly in the games industry. Sometimes the blame can be apportioned to the twin evils of software development, Mr Schedule and Mrs Specification. Other times, inappropriate process has to step up to the plate: either too much of it, the wrong type of it or–more often than should be the case–absolutely no process at all. Sometimes there is even a soupçon of misused agile around to provide a complete loss of big picture (but by jingo aren’t we cracking through a checklist!). The mistake that many programmers make is to believe that the pie of blame has been fully eaten at this point whereas, in fact, a good half of it still remains and that half is theirs to savour and enjoy.

The myth of self-documenting code

“You know you’re brilliant, but maybe you’d like to understand what you did 2 weeks from now.”
– Linus Torvalds, Linux coding style documentation, 1995

Self-documenting code is something that most programmers claim to believe in but few actually do. Many programmers cannot understand their own code just days after they wrote it but still talk about the importance of self-documenting code. Others simply believe that they are doing it and use that as a justification for writing no comments. I believe this: software is a story and you had better tell a good one that you and others can follow or you might as well have eaten a large bowl of alphabetti spaghetti and shat the results into your computer.

If you can’t tell a good story using an appropriate combination of neat code and relevant commentary then the chances are you have written bad code on three fronts: 1) you don’t care enough to make it nice, 2) you have not thought it through or designed it properly, and 3) there’s probably no good story to tell. And that’s selfish. This means that one day, and possibly one day soon, some other programmer will come across this code, realise that it is awful and be forced to rewrite it, and on too many occasions in the first decade of the new millennium, that other programmer was me. If the people paying the programmer are fortunate enough, that person won’t replace one chunk of bloody awful code with another bloody awful chunk of code.

Self-documenting code without suitable commentary is largely a myth. Raw source-code doesn’t contain editorial, thoughts, considerations: it doesn’t explain the why. What was the programmer thinking? When was this done? Why is the algorithm done in this way and not that way? Was it because of a standard library problem? What were the ideas at the time for improving it? Why was this corner cut? This stuff is important and no programmer should have to spend days in source-control trawling the history to figure out basic context. It infuriates me that programmers don’t feel the need to tell a story that they and others can reconstitute with ease down the line: one where the purpose of the code leaps out at the reader along with all the interesting footnotes and commentary that provide context and encourage a relationship between the writer and reader. Remember: one day you could be that reader.
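
To make that concrete, here’s a minimal, entirely hypothetical C++ sketch (the function, the tie-breaking requirement and the bug number are all invented for illustration) of the difference between code that only shows the what and code that also records the why:

    #include <algorithm>
    #include <vector>

    // Sort the high-score table, highest score first.
    //
    // Why stable_sort rather than sort? Players who tie on points must keep their
    // original, chronological order or the table appears to reshuffle itself between
    // sessions (hypothetical bug #214). Revisit if the table ever grows beyond a few
    // hundred entries, as stable_sort may allocate.
    void sortHighScores(std::vector<int>& scores)
    {
        std::stable_sort(scores.begin(), scores.end(),
                         [](int a, int b) { return a > b; });
    }

Strip the comment away and the code still compiles and still works; the reader is just left to rediscover the tie-breaking requirement the hard way.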

Software life-spans demand exposure of intention

The year 2038 problem
32 bit Unix systems, and they are plentiful across the globe in everything from power stations through ticket machines, trains and appliances, represent the date as “seconds since 00:00, January 1st, 1970” in a signed 32 bit integer. This means that the largest value it can hold is 2^31 - 1, or 2,147,483,647 seconds, which translates to 03:14:07AM, Tuesday 19th January 2038. One second after that, it will wrap around to a large negative number: a date back in December 1901. Any code that uses “greater than” or “less than” on a time value at that point will stop producing correct results. And if you’re in any doubt that today’s computers will be in operation in the year 2038, think very carefully when you’re on a train built in 1975 or powered by a power station built in the 50s, 60s or 70s.
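
For the curious, here’s a small sketch of the failure in C++ (hedged: it forces the width to 32 bits so the wrap is visible whatever your platform’s real time_t happens to be):

    #include <cstdint>
    #include <cstdio>

    int main()
    {
        // Pretend we are a system with a signed 32-bit time_t (seconds since 1970).
        std::int32_t t = 2147483647;                              // 03:14:07, 19th January 2038
        std::uint32_t next = static_cast<std::uint32_t>(t) + 1u;  // one second later...
        std::int32_t wrapped = static_cast<std::int32_t>(next);   // ...wraps to -2147483648,
                                                                  // which reads as December 1901
        std::printf("last good second: %d\n", t);
        std::printf("one second later: %d\n", wrapped);

        // Any "is this time later than that one?" check now gives the wrong answer:
        std::printf("later? %s\n", (wrapped > t) ? "yes" : "no (oops)");
        return 0;
    }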

Code without comments shows function, but not intention. That’s something you’ll appreciate if you’ve ever waded through code that has fantastic layout, amazing, readable variable names, cracking function names but is still frustratingly challenging to figure out enough of the “what” and “why” just so you can make one, small modification that would’ve taken five minutes if the original author had just had the decency to slap a couple of lines of text in describing the code’s intention. Of course, one could always write a specification or some separate documentation, but out of la-la land and back in reality we all know that isn’t going to fly. Separate documentation is rarely finished, rarely updated and usually lost. It’s a waste of time for everyone and good programmers rarely have the time, patience or ability to create wonderful documents. Documentation should go where it is useful to those who need it, which means in and amongst the code itself. It’s easy to update, it tells the right story and oozes context where context is needed.

Software’s life-span has an annoying habit of being considerably longer than anticipated which is why there was a Y2K problem in the first place and why there will be a year 2038 problem twenty-five years from now. And that’s forgetting the short-term pain caused by warming the cockles of managers by ticking boxes early in a project with a “we’ll fix it later” attitude that helps explain why the last 10% of so many projects takes 90% of the time: it is in fact 90% of the work and everything up until then has been a barely functioning skeleton of badly documented, poorly structured temporary code interleaved with special cases that just meets the requirements for leaping from one milestone to the next. One could call this “house of cards” development and it leads to more wasted time (and therefore money) rewriting unintelligible code than anyone can possibly imagine.

Everyone has their excuse for banging out disposable hieroglyphics instead of readable software, but my favourite three are “it’s self-evident what’s going on, it’s self-documenting” (hahahahaha!), “I’m in a hurry, I’ll come back and comment it later” (you won’t) or “yeah, but then I’ll have to update the comments every time I update the code” (demonstrating a phenomenal lack of understanding of the very basics of creating narrative). No matter how you wrap ‘em up, though, they are excuses; either ways of disguising bad code through obfuscation of shiteness or rendering good code bloody useless by requiring readers to engage in hours of close examination to find out what’s happening. If it is a beautiful algorithm written with pride then it deserves to be presented as such for all to appreciate for years to come.

Bad programming is just natural selection at work

“Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.”
– Brian Kernighan, explaining elegantly that readability and structure trump being a smart-arse

Of course, the light at the end of the tunnel isn’t always the train coming the other way. Even in the games industry, the onlineification of software is increasing life-spans of code to the point that good, careful design is beginning to become an essential part of the arsenal of weaponry required to stop programmers killing each other as well as ensuring that one doesn’t attempt the futile task of building fifty-storey tower-blocks on foundations designed for a garden shed. Long-term survival is increasingly dependent on stamping out a fire-and-forget attitude that came from slapping it into a box and shipping the damn thing, which up until recently was pretty much where each game’s software story ended (“What? No class-A bugs you say? Ok – everyone step away from their computers. We’re shipping.”).

-~-

Code should be beautiful. Your source should be a work of art. The story you are telling should leap from the page out to all the programmers who see it. Maintenance work should be trivial to do: you know the old saying, write your code as though the person who will maintain it is a dangerous psychopath with a desk drawer full of loaded guns.

Make your code read like Wool does. Like a good book, the story should lift effortlessly into the eyes and minds of the readers, because it shows that you care. It shows you’re a good programmer who believes that the quality of what’s under the hood is directly related to the quality of the deliverable.


1 Please take note, those of you who get this wrong (I’m looking at you, large portions of the USA), it’s couldn’t care less, not could care less. The meaning is completely different and when you use the latter you’re not saying what you think that you’re saying.
2 But I’m not bitter. No, wait, that’s not right, I am bitter.
3 Don’t worry, I know.


Conversational Geography

“Civility is only a passenger – not a driver – on the information superhighway.”
– Don Rittner

So, this is interesting. Jeff Atwood, one of the chaps who was instrumental in building what has become, without a shadow of a doubt, the most vital tool any developer can ever have – Stack Overflow – is playing around with discussion systems and forums now. He retired from the whole Stack Exchange thing and is tootling around seeing if his forum and discussion system, Discourse, can change the way we discuss things on the Internet.

The comments to Jeff’s blog post making the announcement are the usual mix that one would expect: congratulations, initial complaints, suggestions and the obvious arguments as to whether a flat or threaded model is appropriate. It always baffles me that it is seen as such a black and white issue. Threaded. Non-threaded. Indented. Non-indented. It’s like arguing whether aspirin or paracetamol is most appropriate for a man facing the guillotine: people becoming incredibly animated discussing their favourite way of shoving square pegs into round holes.

When several people hold conversations, they do not do it on a single page in a neat linear sequence. Conversations drift in and out of each other, some merge, some split, others idle and fade away and some just observe and comment on existing conversations (sneakily or publicly). People overhear things happening near to them and ooze or spring from one conversational group to another. It’s a thing that has geography. Yes, you heard me, geography. Conversations are a fundamentally spatial affair: people clump and group together in order to communicate effectively. If they’re bored or wish to start a new conversation, it actually splits off, like a new child conversation seeking independence from its parents as it drifts off. Maybe, a couple of glasses of wine later, it’ll want to move back in with Mummy and Daddy but it started near to an existing conversation and is, in some weird way, related.

Let’s illustrate the awesomeness of how the Internet brings people together from all over the globe into conversational groups with this poorly drawn little diagram:

Snakes apart

Four snakes, so, so far apart. Yet thanks to the Internet, they can argue with each other endlessly about stunningly pointless subjects. Later, snake 2 will make some suggestion about snake 3’s relationship with his mother using a series of grammatically questionable, badly spelt sentences. Fun times!

Snake 1 is talking to Snake 2 about whether GIMP’s user interface is an example of precisely what’s wrong with design by ineffectual committee. It’s going to be an interesting conversation, as you can imagine. Snake 3 and snake 4 are discussing whether EMACS is in fact the world’s greatest editor or whether today’s nice, tidy user interface driven editor applications mean that finally one need not have to remember how to wrap seventeen fingers up like two drunken octopuses making love to perform something that should be trivially easy. Later, they’ll shoot the breeze about whether an editor really needs a built-in, fully functioning text adventure game1. Yes, you guessed it, this is the social occasion from hell. But wait! Snakes 1, 2, 3 and 4 are FUCKING MILES APART! How can they hear each other or have a conversation? Well, this is the Internet, so it doesn’t matter. Instead, they get to choose between two failed models both indexed in a variety of confusing, inconsistent manners:

  1. Flat: Conversations that are related to each other may never know that they are related to each other because they’re held in private, sound-proofed rooms. Additionally, it’s a nightmare to follow what’s going on when several people are talking and that makes it unfriendly for new participants and generally nastier for existing ones unless they are moderated to within an inch of their lives. CosmoQuest’s forums (previously Bad Astronomy and Universe Today’s discussion forums) are one of the few examples of exceptionally well-moderated forums that transcend many of the underlying software’s faults simply by the power of effective people. This shows that making flat discussions functionally work takes human effort of gargantuan proportions; it clearly can be done, but few have the time or patience to do it.
  2. Threaded: Oh my. Ok, well, there’s a lot of scrolling going on here, but you’re still suffering from disconnected roots although at least now it is possible to see who’s replying to who… after a fashion, anyway, because if any particular conversation gets long, it gets increasingly hard to follow what’s happening. SlashDot and b3ta’s (PS: b3ta may not be suitable for work, depending on where you work) discussion forums are both excellent examples of getting this as right as one can. (There’s a little sketch of the shape of both models just below this list.)
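
Just to pin down the two shapes, here’s a tiny, hypothetical C++ sketch of the data each model actually keeps (no real forum software is being quoted here):

    #include <string>
    #include <vector>

    // Flat: one long scroll of posts in arrival order. Who was replying to whom is
    // simply never recorded, which is why a busy flat thread becomes unreadable.
    struct FlatPost { std::string author; std::string text; };
    using FlatThread = std::vector<FlatPost>;

    // Threaded: each post carries its own replies, so the reply structure survives...
    // at the price of ever-deeper nesting and an awful lot of scrolling.
    struct ThreadedPost
    {
        std::string author;
        std::string text;
        std::vector<ThreadedPost> replies;  // fine since C++17: vector of an incomplete type
    };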

Of course, it would be remiss of me not to mention John Gabriel’s Greater Internet Fuckwad Theory at this point. This is a simple formula that says: Normal Person + Anonymity + Audience = Fuckwad. There is an almost unbelievable lack of social grace that is encouraged by the “you can’t see me, you don’t know me” factor. It’s what allows you to be drunken you, inhibition-free, permanently. Without any fear of being personally attributed or punched in the face, and with a platform hugely disproportionate to that which you’d achieve in real life, you’re free to shout, taunt, poke and disrupt everything around you. It’s the belief that the usual, civilised rules of discussion simply don’t and shouldn’t apply to the Internet, and solving this could transform the way we communicate online.

I’d humbly suggest that it’s not unreasonable to say that part of what’s wrong with Internet discussion forums is that they’re simply not how we talk in Real Life™. They’re not how we socialise. So why do these things form the fundamental foundations of how we socialise on the Internet?

Rolling what can’t be polished in glitter

Real conversations do not start with a big signpost saying “this is what this conversation is about and will remain about” and are not indexed and discovered as such in real life. However, the vast majority of Internet discussion forums (b3ta’s uniquely effective keep-it-fresh-by-discarding-the-old approach notwithstanding) rely on discussion titles for discovery by interested users. It is clear that with this system, remaining “on topic” is critical for maintaining integrity of the primary index. The CosmoQuest chaps achieve this with brutal execution of moderation powers, but conversations rarely stay on target in reality.

Modern forum software lacks a concept of what a conversation is now about, what it was about and what it is tending towards unless someone updates the topic title or splits things out into new threads (which, of course, then lose the context of how they appeared). The ebbs and flows of what was and wasn’t interesting fade into a long, endless stream of text. And that gets tiresome quickly. At a party, if you join a conversation, one person may give you a heads up without messing up the chat in progress. You’ll get the relevant parts, not the complete potted history. Which is nice. Or, you’ll just slide into the “now” without needing to know how everyone arrived there. Which is also nice, as nobody wants a fifty-slide presentation of all the conversational steps that led to the current heated discussion on how Microsoft got so far detached from their customers’ needs, hopes and desires.

“What’s that ♩ coming over the ♫ hill, is it my daughter, ♬ my daauuuugghhter!” Yes, and she wants a DISCUSSION! About GIRAFFES. And Astronauts. On the MOON. And Terry’s Chocolate Oranges. Brace yourself!

Next time you’re at the pub, a bar or a party, just pause for a minute, lean quietly on a wall and watch. Try not to look too creepy or too much like a stalker, so, gents, don’t focus your attention on boobs and bottoms. Watch conversation groups form and disband. Watch the chit chat, watch the listeners, the partakers. Watch those that lead the conversation and see how it all varies wonderfully over time. This works because each participant has a position. A real one. A physical presence. This, coupled with the strength and type of personality they project, gives them their own unique soap-box to stand on or next to. Participants move around, gesticulate or background themselves and somehow this all works. Noise reduction is automatic. The filters are both implicit and effective. Trolls are managed by default, without the need for bans, naughty steps and other such top-down tools. Detail is where it needs to be, not where it’s forced to be.

The leap forwards

Now, don’t get me wrong: I’m not suggesting a Second Life approach to forums and socialising here. I don’t expect avatars to be walking around and striking up conversations as the user interface devices and software design philosophies for something like that are miles off. However, just because the participants are not walking around doesn’t mean that the conversations cannot have positions. As they split up, new circles of chat bubble off. The further they detach from their parent, the further they go. It is clear how many people are active, and ghostly conversations of times gone by gradually drift into the basement where they are archived for the curious to stroll around and dust off some time in the future. The presence of geography means that it is beautifully self-managing: the undesirables get pushed away from the centre and leave no trace of their conversational invasion. When trolling leaves no mark and where uncivilised behaviour has its volume turned down trivially by those engaged in the conversation, it rapidly becomes less attractive. No trophy, no point. A well-implemented, spatial discussion forum could well encourage a more civilised behaviour at its foundations. For free. Well, as civilised as real-life conversations get.
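
For what it’s worth, here’s a tiny and entirely hypothetical C++ sketch of the core idea (none of this is Discourse or anyone’s real forum): give every participant a position and let how loudly you hear them fall away with distance, so that drifting away from a bore is the only moderation anyone needs.

    #include <cmath>
    #include <cstdio>
    #include <string>

    struct Position { float x, y; };
    struct Participant { std::string name; Position pos; };

    // Perceived volume: 1.0 when standing together, fading towards zero as the
    // distance grows. The falloff constant is pure guesswork.
    float perceivedVolume(const Participant& listener, const Participant& speaker)
    {
        const float falloff = 5.0f;
        const float dx = listener.pos.x - speaker.pos.x;
        const float dy = listener.pos.y - speaker.pos.y;
        return 1.0f / (1.0f + std::sqrt(dx * dx + dy * dy) / falloff);
    }

    int main()
    {
        Participant me    {"me",    {0.0f, 0.0f}};
        Participant pal   {"pal",   {1.0f, 0.0f}};   // standing right next to me
        Participant troll {"troll", {40.0f, 0.0f}};  // drifted, or been nudged, to the edge

        std::printf("pal:   %.2f\n", perceivedVolume(me, pal));    // ~0.83: clearly audible
        std::printf("troll: %.2f\n", perceivedVolume(me, troll));  // ~0.11: background hum
        return 0;
    }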

These are exciting times with exciting possibilities but it remains both odd and mildly unsatisfying that such key and critical aspects of the Internet as conversations and discussions are still executed largely as interactive magazines. It’s like having the most incredible toolset known to mankind, putting a rug over it and using it as a chair. The Web is quite possibly the most solitary social experience created by mankind2. I feel that it is sad that two decades on it’s still the best that we’ve got.

There’s clearly a leap to be made here.

One day, the internet will be a place that you go, not something you observe. Kiss goodbye to the Web and its clunky books-you-can-poke paradigms: it’s all going the way of the dinosaurs, it’s just that nobody has figured out how or with what yet. But they will. HTML 5, WebGL (now that Microsoft have finally joined the fun) will guide us through the dark towards technologies that will bring the global network to life. We’re fundamentally spatial organisms. We put things in places, we organise things in places and we interact socially in that way too. Organising discussion forums geographically and allowing different conversations to feed and nourish each other may well be the killer Internet discussion application. It could transform every nook and cranny of the global network in ways that we cannot yet imagine. With this in mind (and more specifically, with a particular solution in mind), despite my massive respect for a man who has contributed so much to transforming almost every aspect of my daily development life, I can’t see Discourse changing the world. It’s an iterative evolution, a rounder wheel if you wish, but it’s no flying car, teleporter or space elevator.

However, I admire the attempt and the underlying reasoning for doing so: Internet discussions suck.


1 I’m not kidding. If you have a Unix box or EMACS configured on a PC, open a command prompt and launch it like this:
emacs -batch -l dunnet

2 Actually, there is one other: peak-time commuter train carriages. They’re full of miserable people, each of whom acts as though they are in an impenetrable, sealed metal box. If it wasn’t for my various train friends, that soul-destroying experience would have been even worse.


Square wheels are better than no wheels at all, right?

“The idea that Bill Gates has appeared like a knight in shining armour to lead all customers out of a mire of technological chaos neatly ignores the fact that it was he, by peddling second rate technology, who led them into it in the first place, and continues to do so today.”
– Douglas Adams, ‘Biting back at Microsoft’, June 2001

Gary the sea snail "rushing" forwards

Plankton has stolen the plans for an innovative new user interface paradigm, but he need not hurry; Gary the sea-snail has represented the wheels of progress since around 1976

Since the dawn of the microcomputer back in the 70s, a time haircuts will never forget, software companies have toiled endlessly to make computers easier to use. They’ve often worked through the night for years at a time with the sole aim of improving the experience for you, the user. 

Um. Yes. In your dreams

Today’s computer interfaces are a massive compromise. They were born in a time when processing power was measured in snails, when it was necessary to curtail the user experience to get anything useful out of them at all. Interfaces and software had to be computer friendly, not human friendly. As processing power rose, it became possible to shift the balance a tad in favour of the user with windows, icons, menus and the most incredible paradigm shift of all: the mouse and mouse pointer. Almost overnight, the incredible power of the computer fell into the grasp of the average person.

The mouse and window stuff hit the consumer back in the mid 80s. Since then the average home computer’s power has grown by, wait for it, five orders of magnitude. Yes, that’s right number fans, by more than 100,000 times. My current work machine, a rather sexy MacBook Pro, is the processing power equivalent of about 100,000 original Macintoshes. That amount of Macintoshage wouldn’t actually fit in my house stacked to the ceiling even if I used the garage, loft and broke through into the neighbour’s house too.

With all this power, perhaps you might be Roger Mooring your eyebrows right now as you realise that modern UIs are effectively just an iterative rehash of the 80s experience: before the Internet (to the masses, anyway), before the mobile phone and long before The Simpsons, Ben and Holly’s Little Kingdom and Spongebob Squarepants. Microsoft have given us the ribbon – an ‘innovation’ barely better than that infernal paper clip designed to disguise their own inability to manage layers of complexity – which, along with all the other window decorations present these days, mean you barely get to Dear Sir on an average laptop before needing to scroll the screen. Apple have added easy to easy and users have discovered that the result is, at best, hard but usually just annoying. Take an example that annoyed me yesterday: “Your Mac can’t eject the Volume ‘Time Machine’ because someone is using it.” But who? I don’t know. But the computer knows. And it should tell me. And it should sort it out. In fact, scrub that, it should just sort it out without telling me. Crap like this should be on a need-to-know basis and frankly, I don’t and shouldn’t need to know. This is not me being greedy, this is me expecting a basic level of service from the system that I am using; but, then again, I travel on First Capital Connect’s trains so perhaps I should lower my expectations accordingly.

Most unfortunate

Well, this is where my expectations are set, train-wise. How about yours?

Many people hoped that when Microsoft’s global operating system monopoly started to crumble, first in the mobile space and then on the desktop, that we’d see the chance for innovation to make some real progress through the cracks and opportunities that appeared… but it would appear that everyone is still decorating the same tree. Let’s face it, ‘touch’ is mostly just same old usual stuff but with the finger replacing the mouse pointer and, at best, this is rolling the un-polishable in glitter, as they say.

The further we go, the more we stay put

Change is fast
The Internet was a clumsy, hard-to-interact-with academic network with systems like gopher, telnet, finger (yes, “finger”. Look it up. But carefully. Just imagine the kind of party where that was dreamt up.) and FTP used to navigate data that was unbelievably hard to find. The World Wide Web transformed that overnight. Believe me: when the next innovation arrives that finally fits the computer view into human terms, it’ll change the way we interact with the Internet at such speed that if you blink, you’ll miss it. And that will be Web ‘2.0’, not what’s shovelled down your throat with that moniker at the moment.

Modern UIs are still more computer friendly than user friendly. The restrictions that forced that are long since gone. We simply don’t work the way computers do. If real life was organised the way your computer is, you’d be exposed to the entire complexity of life everywhere you went. Your brain would explode. Computers have had the power to work the way we do, to adapt themselves to us and our way of life, for quite some time. Whilst we enjoy the trappings and pleasures of a free-market capitalistic economy, one of the more ironic prices we pay is companies’ risk-averse approach of trying to play the same game – but a little better – as everyone else rather than moving the goalposts far, far away and changing the fundamental nature of winning and losing.

An old joke: “e-mail me two of those please, I need a couple of copies.” It’s side-splitting IT humour with an uncomfortable underlying foundation: our minds work spatially. We’re used to things having a place. Here. There. Somewhere. It’s how we interact with the real world around us, it’s something we learn from birth. It’s something that we’ve been doing since we were merely little fishies taking our first steps onto dry land from the chilly primeval oceans. The fact that computers place a complex network of tripwires around this knowledge is something that history will not remember kindly. Take even the most trivial of examples: I want two copies of a file. Two identical copies. I want to put them in the same folder and for them to have the same name, but I cannot. Why? Because they share the same filename and on virtually 100% of all computers, files are indexed by names. The reason for this is because the underlying database of files, the file-system, is architecturally pretty much the same as the very first ones were back in the 60s and 70s. Yes – you heard me, the reason you can’t do the most basic of human things on a modern computer comes from decisions made before man had landed on the moon and before I was born. And I’m old. Em oh oh en, that spells old1.
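
To put that in code: a directory is, in caricature, just a map keyed on the filename, so a second file with the same name has nowhere to go. A minimal hypothetical sketch (not any real file-system’s implementation):

    #include <cstdio>
    #include <map>
    #include <string>

    int main()
    {
        // A directory, in caricature: filename -> contents.
        std::map<std::string, std::string> directory;

        directory.insert({"holiday.jpg", "the original"});
        auto second = directory.insert({"holiday.jpg", "the identical copy I also wanted"});

        // The second insert is rejected: the name *is* the index, so two identical
        // names cannot coexist, however sensible that would seem to a human.
        std::printf("second copy stored? %s\n", second.second ? "yes" : "no");
        return 0;
    }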

Mummified metaphors

Clapped out old metaphors such as “folders” are there to try and add some kind of human-level spatial organisation to something that was designed to be kind to low-powered computers with little or no memory. When one’s computer’s main memory doesn’t even have enough space to store this blog post, one must make compromises. For those restrictions to still be the anchors around the legs of progress in the twenty first century is regrettable and a massive collective embarrassment for everyone involved. Things should have a place. A real place. And we don’t mean the tired old “draw a desktop” type thing, we’re talking about making your computer a physical space or allowing it to seamlessly occupy your physical space with you. Open the door and the whole world is available to you: un-real estate as far as the eyes can see that works the same way that you do, but out of the box.

The most intelligent part of a computer is the person sitting in front of it. It is a sad, sad state of affairs that it remains the single most underused component. 

But it won’t be this way forever.

Soon, the only way to harness the incredible computing power we have will be to change the way we design and implement software at a fundamental level. Or, to answer this chap’s tweet, no, the reason why low-resolution TV looks better than hi-resolution gaming is not because we’re optimising the wrong things, we’re doing the wrong things and doing them wrong. Needless to say, I’ve soapboxed this before. Several times.

Say goodbye to keyboards, mice and conventional looking computing devices because soon, computers will be both places you go and companions that go there with you. Computing will be something you experience. Software will adapt itself to you rather than you needing to adapt yourself to it. Or, to put it another way, the entire concept of a “computer” will fade into your everyday life for all but those that create and develop for them.

This isn’t an “if”, it’s a “when”. 

Oh, and, if you’ve not already guessed, I know people who are working on it.


1 A packet of ready salted crisps to the first person to get this reference. Edit: actually, no, I tried googling it, it’s too easy and my daughter ate the crisps anyway.


Opinion! Quick, someone tell the Internet!

The problem with walled gardens is forgetting who’s supposed to be on each side of the wall. In the past half year or so, though, it looks as if Apple have managed to manoeuvre the wall so that it now sits between themselves and their customers, a rather odd arrangement last seen in the latter half of the 90s. I am, of course, like everyone else on the Internet, jumping on the iOS 6 and iPhone 5 bandwagon – except long after the wheels have fallen off.


What kind of bandwagon is this? Where are the trumpets? Drums? I asked Siri and he replied ‘you’ve had enough, put the wine glass down’. Fair enough.

Their new (well, it was new, last year, when I originally drafted this post) phone looks like an incremental increase. It looks almost devoid of the incredible innovation that we’ve come to expect from Apple. Well, I have one and I can tell you: it has an almost magical build quality and feel to it. It’s a lovely device. It’s hard to explain, but somehow, in an Appley way, all the little things add up and I couldn’t see buyer’s regret with a set of super-strong, industrial-strength binoculars. Plus, I come from a couple of generations ago (device and reality) and as a result I’ve still not got bored of asking Siri questions like “who lives in a pineapple under the sea”, “why is my train late again” and “why is the bus timetable a work of fiction”. Regardless, though, its uniqueness and pleasure feel like they come from a large population of small tweaks and twiddles rather than any massive leaps over the competition.

The four riders of the appleocalypse

Four things seem to have changed about Apple recently that are, to me, a happy user and developer, causing a light smidgen of “armchair generalness” to seep to the surface, and because this is the Internet, it would be wrong of me to mind my own bloody business like a normal person. Instead, I’m going to get all four things off my chest because it’ll make me feel better – and, given that I’m sitting on a train writing this, I could do with feeling better (the station’s dispatchers were playing “passenger inconvenience bingo”. The winning move, incidentally, was to get all the rush-hour passengers safely to platform 5 a few minutes before departure and then announce a platform change that involved everyone having to move a long way, fast. They did this, with a straight face, in between two announcements asking everyone to be careful on slippery platforms due to today’s “inclement weather”. I take my hat off. It was a spectacular play.)

Firstly, it’s the new focus on litigation rather than innovation. Now, all but the blind and deliberately closed-minded know that Samsung shamelessly ripped off the iPhone when they kicked off their large-screen smartphone career. In some screenshots the devices and software looked the same. Well, they did with my glasses, anyway. But regardless, Samsung were followers and Apple leaders. If you’re copying then, by definition, you are behind, as you need to know what to copy before doing it. To see Apple investing so much time, effort and money in waging war-by-lawyers against Samsung instead of making new leaps into the lead turns smiley faces into frowns. Ironically, I suspect history will remember this so-called “protection of innovation” as achieving the complete opposite and allowing the competition an unprecedented period of catching up.


This message means “the entire operation bailed”. What does this failure message really mean? Why is it there? What happened? Click for embiggening.

Secondly, it’s the ease of use stuff. Once upon a time, Apple were masters of disguising the complex with the simple. No matter what magic went on underneath the hood, Apple managed to stick a non-patronising user interface on top of it that allowed one to get on with the job rather than being baffled by unnecessary complexity. Now, though, this appears to be slipping away. Things like horizontal scroll bars that appear and cover up the bottom item in a Finder window when you’re trying to select it just feel like lazy, poorly considered design. Buttons that you press that look like they do nothing (no visual indication at all) and then suddenly, a few seconds later, things happen. Or buttons that just do nothing (iTunes is particularly guilty of this). This all breaks even Apple’s own guidelines, so it does seem a touch… off. Then there’s “Open With…” lists that repeat the same item several times until you fix it with some cryptic Unix command-line-fu. Oh, and I must mention icons vanishing from docks, and dialogs like “there’s a newer version of this iCloud document on the computer ‘Sack of Snakes’, which do you want to keep?”: tens of billions of operations per second and you can’t tell me? Why not lay the two next to each other and let me drag differences across and then just sort it out? Why do I, suddenly, have to be clairvoyant in order to make a simple decision? Why, Apple, why? Or how about “The volume ‘Time Machine’ can’t be ejected because it is being used.” But by who, Apple, by who? This is a pointless dialog that has no value because I cannot do anything about it. It’s a little like a dialog saying “I’m going to punch you in the face” with a single “OK” button. A while back, this would have been unacceptable, but then again, a while back the share price was almost double what it is today.

Free vanishing invisible adapter

Click to read the ‘evidence’. The French site said the same. Genuine mistake? Meh, probably. The airbrushing from history wasn’t particularly Apple-like, though.

Thirdly, it’s the apparent drift away from “the customer rules kiss-kiss-kiss, love Apple xx”. See that cute little fella to the right? That’s a Lightning to 30-pin dock converter. Like so many others in the UK (and France and probably elsewhere), I was promised one of these when I ordered my iPhone. Apparently, this was a mistake. The Apple thing to do would have been to admit the mistake but honour what was promised, given the marginal costs have surely got to be offset by the goodwill. Apple enough would have been to apologise to those affected and offer the thing for a reduced price of, say, a tenner. Instead, the slow dixonsification1 of their retail operations appears to be gathering momentum and the whole thing was expertly airbrushed out of history. But what really gets my goat, milks it without my permission and runs it over with a combine harvester is that they never even bothered to notify their early adopters, who searched their boxes for this mythical adapter. And no, that wasn’t a euphemism. Apple Stores feel like a First Class shopping experience, so long as you don’t go on a Saturday when they’re packed with teenagers using Facebook for free (they usually forget to log out, so feel free to add something fun to their timeline). We all like a little luxury occasionally. It makes us feel special. Take that away and it’s just the usual miserable experience that First Capital Connect, Stagecoach, British Telecom, British Gas, O2, Tescos and so many others are shovelling down our throats on a daily basis.

Fourthly, and perhaps most worryingly for long-term splendour, it’s the Google battle. To the casual observer this looks like an ill-considered playground fight where the users are stuck in the middle getting punched from both sides. With iOS 6, though, most of the user damage is coming from Apple’s side. Never before had a component of iOS become an international laughing stock – until Maps. My house and street aren’t even on it, and neither of them is exactly new. Hell, ten months down the line and they’re still not there: we continue to have to warn visitors not to use Apple Maps if they’re looking for our house. Maps wasn’t ready. It still isn’t ready. It seems like taking someone’s roast dinner and fine wine away from them and replacing it with a slice of bread and some water with the promise that you’ll slap some jam on the bread at some point in the future. Butter too, if you’re lucky. Would Steve have allowed it out the door? After all, he was an enthusiastic Google battler, having felt cheated over Android. I don’t know. And nor do you. But between Maps and the loss of the YouTube app it is hard to see how the Google extermination program is going to have a happy ending. What next, Microsoft’s Bing as the default search engine?

Five sides to every story

Maybe it is true that Googlemonster made unacceptable demands relating to turn-by-turn navigation. Maybe it is true that Apple’s future plans require them to define an API Google doesn’t and won’t have. Maybe this is a firm foundation upon which amazing things will be built; after all, geo-anything is a fundamental cornerstone of our future mobile device usage. Maybe Tim Cook’s apology was from the heart, not just a gimmick or an exercise in stopping more horses getting away. Who knows. But one thing is for sure: right now, it’s the customer – the very component of the equation Apple built its deserved position from – that is being slapped in the face. When Apple writers like Daniel Eran Dilger2 feel they need to somehow justify the dog’s dinner that so many of us outside the Bay Area are experiencing with pathetic arguments that wouldn’t work for the prosecution in a kangaroo court, you know we’re on the verge of taking the wrong turning. I see eyes. And spiders. And something blinked.

Of course it makes complete sense to control such a key component of your operating system as opposed to leaving it in the hands of a direct competitor but usually, it makes sense to have the steering wheel fitted before making sudden changes of direction: it’s this that makes it all seem so darn odd to those of us whose lives got three rungs harder as a result of this decision.

It looks as if there is a gradual ooze towards “the usual bloody awful way business is done and delivered” that until now Apple has refreshingly avoided. Maybe it was too good to last. Maybe Apple’s success was tied to Steve in ways that cannot be replicated or maybe, just maybe, Jonathan Ive, Tim Cook and their team of groovy people are only now getting their feet under the table without him and are on the verge of revealing a hat so full of rabbits that the whole world will be surprised and delighted again. Maybe it’s also darn unfair to pick on them after a ten-year stretch of transforming everyone’s lives, as, let’s face it, everyone deserves a break occasionally whilst they regroup, plot and plan.

I hope it’s the latter. I love OSX. I love its UNIX underpinnings. I adore development in Cocoa and Cocoa Touch. It makes me happy. I get on with my work whilst a wonderfully thought-through set of APIs takes my development pain away. Oh, and Xcode. Wow. It would be superior to Visual Studio even if it cost the same. It doesn’t. It’s free. Microsoft’s apparent fuck-you developer strategy of charging a king’s ransom for their IDE coupled with their cute deprecation policy is tiring, stressful and turns the wonder of software implementation into a miserable exercise in dodging bullets on a daily basis.

-~-

Let’s face it, we’re on the other side of closed doors and passing judgement on what’s going on behind them with only the Internet and personal opinions to guide us: and if there was ever a combination of sources that was the opposite of accurate commentary, it’s the Internet combined with personal opinions. And let’s face it even more, I’m a small person commenting on a big thing that I don’t fully understand (but, if it’s not for ill-informed opinions and porn, what is the Internet for?). Regardless, though, to some on the outside, Apple look increasingly mean-spirited and as though they’ve lost their way, and I think a lot of people miss the magical customer-centric wonder that Steve inspired. That feeling of really needing something that you didn’t even know existed the previous day. The joy of the oooooos and ahhhhhhs as the way we go about our daily lives was transformed in front of our eyes is something that I don’t want to lose.

Even the share price, which looked like it would climb forever, reflects the falling confidence that investors have in the current direction’s ability to continue to print money. When your key USP of delighting the customer is eroded by what looks like your own bitterness and confusion over the vision then the writing may not yet be on the wall, but you can rest assured that there’s a queue of people eagerly holding pens.

Perfection is all in the detail. Until recently, it was the detail that Apple almost always got bang on the money. They stood the height of a stack of giraffes above everyone else. But now, with lots of little details going wrong from the software to the hardware, their magic looks increasingly like a trick that everybody can do…

… let’s hope not. I’m having fun. Even with the frustrations. I’m not ready for it to stop. With Apple’s 2013 WWDC just around the corner, I’m going to be glued to the set, popcorn in hand, prepared for wabbit season.


1 Dixonsification, the failed, doomed policy that all of the high street is following on its way to nowhere. It was with enormous shock to the entire world that the former CEO of Dixons, John Browett, was appointed to manage Apple’s retail operations, albeit briefly. If ever there was an “un-apple” approach to retail, it was that of Dixons. Even he admitted afterwards he wasn’t the optimal choice. The very fact that he was offered the job indicates changes took place that were, well, just not Apple.

2 Sorry Daniel, I love your writing and it gives me much to think about, but let’s face it: if Apple bagged up kittens and threw them off bridges to drown, you’d somehow explain how this was a magnificent step forwards before explaining how Microsoft’s Zune is ultimately the real killer of kittens.


This nudist play can wreck a nice beach

“Any artificial intelligence smart enough to pass a Turing test is smart enough to know to fail it.”1
– Ian McDonald (2006) River of Gods

Here’s a picture illustrating a simple problem that humans find utterly trivial to solve but computers and robots find challenging:

Farm cow feeding robot

I’m not sure that the cow is all that impressed, but it keeps her fed. Mooooooo!

And I’d describe the job to you like this: sit down with a big book and a six-pack of beer. The cows will eat the hay that they can reach. Every half hour or so, see if the cows can still reach the hay; if not, pick up the broom and brush it closer where it needs to be brushed. Should you run out of hay, there’s more round the back. In a nutshell: make sure the cows can reach the food.

Now try and make a robot do that. With AI. It’s tougher than you think. The robot’s AI won’t be smart enough to look around, figure out what needs to be done and do just that; instead, it’ll need to follow some set of rules that ensures it doesn’t let the cows starve. Inevitably, this makes for a big, expensive robot that runs into all sorts of problems performing its job. Many of the inefficiencies it exhibits are because it can’t see:

  • It gets lost so it needs to recalibrate its position
  • It can’t rest peacefully if nothing needs to be done
  • It can’t do the job just where it is needed and not where it is not

Even so, it represents excellent value because it frees a human from regularly interrupting their normal day in order to go and check on the cows. With the robot generally doing a reasonable job, they’re freed up to do longer, more complex jobs. The robot, of course, can’t do any other jobs, but it does this one acceptably so long as there are no surprises. Besides which, other than the six-pack and the book, there’s little job satisfaction in brushing hay towards cows all day surrounded by the sounds and smells of what comes out of them as a result of hay going in.
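To make that concrete, here’s a deliberately naive sketch (Python; every number and rule is invented) of the kind of blind, fixed schedule such a robot ends up running: it can’t ask “can the cows actually reach the hay right now?”, so it sweeps every section on a timer, needed or not.

```python
import time

FEED_LANE_SECTIONS = 40        # invented: the feed lane split into sections
SWEEP_INTERVAL_S = 30 * 60     # every half hour, whether it's needed or not

def recalibrate_position():
    """Blind robots drift, so periodically re-find a known reference point."""
    pass  # placeholder: drive to a marker, reset the odometry

def sweep_section(section):
    """Push hay back towards the fence along one section."""
    pass  # placeholder: lower the brush, drive the length of the section

def run_forever():
    while True:
        recalibrate_position()          # it gets lost, so it must do this
        for section in range(FEED_LANE_SECTIONS):
            sweep_section(section)      # every section, even the empty ones,
                                        # because it can't see which cows can
                                        # no longer reach their hay
        time.sleep(SWEEP_INTERVAL_S)    # then it does it all again
```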

Traffic lights are another wonderful example of a system that is non-trivial to get to work right, so, needless to say, they don’t. They also can’t see or think and they also just follow a pre-programmed set of rules in order to make the lights change. Sometimes they have ground sensors or other detectors to help them avoid making really insane decisions, but ultimately, they’re dumb and inefficient in comparison to a human doing the same job. This morning, for example, I sat at a set of red lights at a crossroads for nearly a minute wondering if they were broken whilst absolutely no vehicles at all came the other way. If a human had been there controlling those lights, I’d not have even had to stop. But, sticking a human at every set of lights is prohibitively expensive so we “make do” with software and automation that’s simply never going to be able to match a true thinking machine.
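For the curious, this is roughly the shape of that “pre-programmed set of rules” (a Python sketch with invented timings, not any real controller): even with a ground sensor bolted on, it’s a timer with a couple of special cases, which is exactly how you end up parked at a red with nothing coming the other way.

```python
import itertools
import time

# An invented fixed phase plan: the controller just cycles through timings.
PHASES = [("north-south", 45), ("all-red", 3), ("east-west", 45), ("all-red", 3)]
CROSS = {"north-south": "east-west", "east-west": "north-south"}

def cars_waiting(approach):
    """Ground-loop sensor for one approach -- stubbed out here."""
    return False

def run_junction():
    for approach, seconds in itertools.cycle(PHASES):
        for _ in range(seconds):
            # The only 'intelligence': cut a green short if its own approach
            # is empty and someone is sitting on the loop across the way.
            if approach in CROSS and not cars_waiting(approach) \
                    and cars_waiting(CROSS[approach]):
                break
            time.sleep(1)
        # Note what's missing: no looking, no judging, no "nobody is coming
        # at all, so just wave this one car straight through".
```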

In Soviet Russia, computer operates you

I had the extraordinary pleasure of working with and learning from Steve Grand during the second half of the 90s. He was fond of saying that Deep Blue, IBM’s chess-playing computer, didn’t know how to play chess. It merely had a lot of data and processing power available to it. Throw it in a lake and see how smart it is then. Humans who can play chess can also play monopoly. Or invent a drinking game, play it, and then (sometimes) successfully navigate a complex 3D space in order to get home whilst intoxicated. We’re simply amazing: we’re general purpose problem solving machines. As time progresses, as we’re rewarded and punished for the things that we do, we learn how to apply one piece of knowledge to many situations: and it is this incredible ability to generalise, plan, consider and apply that sets us apart from machines. If you think about it for just a while, you’ll realise how far we have to go from where we are at this time to having a computer in a traffic light that could match human performance: a machine that could reach very high efficiency regardless of what surprising, unplanned and unknown circumstances arise.

Machine learning and Strong AI (Artificial General Intelligence) are really big right now. Lots and lots of money is going into them both directly and indirectly, and quite rightly so: it has long been understood that solving this problem would transform our lives overnight, forever, in ways we can’t even begin to imagine. Frankly, even the ones we can imagine are comprehensive game-changers. Combine this stuff with 3D printing, mobile super-high-bandwidth access to the global network (which will KILL WiFi hot-spots, you mark my words) and our lives are going to be simply unrecognisable in fifty years.

The future? Yes, more of the same – but cooler!

Humans are really poor at predicting the future long term, which is why we think things like the Rabbit phone, WAP or QR codes are worth investing in. We tend to extrapolate2 based on where we are and what we know now: so the future will have faster cars, more spaceships, better computers, better medicine, etc., rather than being able to consider whole new technologies. Even science fiction authors, whose careers are built on imagining the amazing, often get it wildly wrong. We can’t predict what we don’t know, and our imaginations draw from our experience and knowledge. Those that can foresee disruptive technologies are either involved in making them or rely on the fact that if you take enough rough stabs in the dark, eventually you’ll hit something. I’m reading a book right now where, in the opening few pages, a future spacecraft carrying hundreds of people from London to the West Coast of the USA flew there using imperial measurement units, had everybody on board smoking, was entirely male-focussed and featured several passengers rushing to use the – wait for it – video-phone booths in order to book a flying car at their destination. Wow. This was written in the mid-70s, when man had already walked on the moon, but the author figured that 100 years down the line, we’d have more of the same rather than anything new. The author missed out the global network (the Internet), portable computing devices, civilisation finally cottoning on that smoking isn’t nice and a general shift away from furlongs, feet and miles towards a measurement system that actually makes sense… there’s more, but if I keep typing, my tea will get cold.

The daughter, educating Mr Robot

The daughter, showing the robot how it’s done. A victory dance followed.

And that’s where we are with machine learning. Trying to predict how intelligent, smart, learning machines would improve what we’ve got rather than considering how they’ll create new ways of living. You use a computer right now, but you do so with a keyboard because speech recognition is still really, really poor even though technically it’s getting good. It is poor because the speech that is recognised isn’t understood by the recogniser. Thus it isn’t able to correct itself (oh, I thought you said “this nudist play can wreck a nice beach”, but given the context, you probably said “this new display can recognise speech”). It’s also not able to isolate one conversation in a sea of others: we are experts at eliminating background noise to focus our attention on the person speaking to us; computers, well, not so good. And as a result of these shortfalls, speech recognition is a pretty useless day-to-day input device…

… but then you look at Google Glass (creepy, dudes, creepy) and realise that if this stuff was sorted out — if strong AI was cracked — you’d have your own smart, context-sensitive personal assistant with access to all of the world’s collective knowledge with you at all times. You’d never forget another face or name. You’d never get lost. Everything you needed to know at any moment would be right there, in front of your eyes, without you needing to think or do anything. You and your computer will talk to each other, listen to each other, share your experiences and understand each other. There won’t be keyboards in the future. Films, TV and books may tell you otherwise, but they’re wrong.

And that’s all cool, alarming loss of privacy aside.
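Before the footnotes, a toy illustration of the “correct itself from context” point above (Python; the context words are invented): given two transcripts that the acoustics alone can’t separate, even a crude check of which words fit what’s being talked about picks the right one, and it’s exactly that layer of understanding that most recognisers lack.

```python
# Two transcripts the acoustics alone can't separate:
candidates = [
    "this nudist play can wreck a nice beach",
    "this new display can recognise speech",
]

# A crude stand-in for context/understanding: words we've recently been
# talking about (a real system would use a proper language model).
context = {"display", "recognise", "speech", "demo", "screen"}

def context_score(sentence):
    return sum(word in context for word in sentence.split())

best = max(candidates, key=context_score)
print(best)   # -> "this new display can recognise speech"
```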


1 Did that give you an uncomfortable shiver? Did it?
2 This is a cracking article. Other than the massive scare story of “everyone out of a job by 2045”, it’s a superb example of imagining the world we live in today but with smarter robots. It misses out all of the steps that go from now to then that will change the way we work and what we do. This isn’t the first time that this sort of amazingly ridiculous prediction has been made – the closed-minded pessimist brigade have been suggesting such civilisation catastrophes are “just around the corner” ever since the start of the industrial revolution. Anyway, long story short, this is bollocks.


Would you like to play a nice game of chess?

Twenty Megaton Giraffe

“Colonel Giraffy’s Weapons of Mass Distraction” – guest art by Max, who drew this from a briefing of “giraffe nuclear missile” in a matter of seconds. Jealous, I am!

If you’re my age then, as well as lurking in your secret lair plotting your takeover of the universe, you’ll almost certainly remember War Games. War Games is an utterly fantastic movie of its time and it is still, all these years later, a wonderful demonstration of how time changes technology but neither the mistakes we make nor the risks we take. The plot is blissfully simple: during a military drill, it turns out that human beings, when it really comes down to it, find it really, really hard to press a button that kills tens of millions of men, women, children and kittens. This is a good thing. However, when it comes to launching a retaliatory nuclear strike, it’s a bad thing – if you’re into that whole “destroy the human race” thing. It is, after all, meant to be mutually assured destruction. Thus, the whole lot was replaced with computers. Wham, bang, job’s a good ‘un: should the Soviets do something foolish, the president can press a button (or indeed, not) and a whole stack of machines will launch the missiles without any of those pesky meat bags thinking twice about whether they wish to be directly responsible for mass destruction beyond anyone’s imagination… and some of us grew up with films like Threads to aid our nuclear nightmares.

This reliance on machines turns out, of course, to be a mistake: a hacker kid with a computer and a modem sneaks into a back-door of the AI-stuffed war simulation computer and ends up nearly starting World War 3. He thought that he was just playing games. Brushing aside some minor details, the premise was alarmingly plausible: the equipment was bang on the money and the process was correct. I know this because I’ve done it. I copied War Games, but without the global thermonuclear war bit. In the mid-80s, before tone-dialing modems and digital exchanges, I had a computer and modem dial free 0800 numbers overnight, from 100000 upwards, looking for computers. Each time it found one, it made a note, disconnected and proceeded. “Security by obscurity” was alive and well: companies figured that if nobody knew the number, they needed no security. Well, they were wrong. As a naive teenager who was lucky that there was no practical way of tracing calls back then, I poked around in a great number of open-door policy network services booking this, that and whatever. You’d be shocked at how easy it was. You’d be even more shocked if you realised that nearly three decades later security (and by extension you, your privacy and your data) is still treated with very little care and attention by people who should know better.

It’s worse now: your life is digital

Generally, security on the Internet is shite. It’s this way for so many reasons and it’s simply not fair to blame just the companies operating their services with such an amazing lack of care. Let’s blame them, the web developers and the users themselves – the “trio of extraordinary incompetence and ignorance”, we shall call them. The latter have crap password policies that allow the errors the former two make to be magnified 1,000-fold. Users also click on links that they shouldn’t and have a habit of believing that if it’s too good to be true, then it’s probably true! The developers and companies, though, over-engineer everything across the board, making such an incredible range of poorly configured interlinked computing devices that it’s possible for a single student to build an entire map of the IPv4 internet space simply by uploading a custom piece of software that he wrote to a broad range of internet-connected devices. We’re all lucky that this software wasn’t malicious. Next time, it might be. One of the many devices that contributed to this incredible paper may have been in your house. Hell, it could have been your ‘fridge, your thermostat, your lights or your television.

Wait, did I say ‘fridge? Why yes, yes I did.

Let’s think for a while about the devices that you have that contain complete, functioning computers in them along with standard off-the-shelf operating systems (probably Linux) that may be a) poorly configured and b) connected to the internet and are thus c) massive time-bombs of security catastrophes waiting to happen.

Your television. It’s probably a smart-TV these days. The “Smart”, as we all know, is the exact opposite, but perhaps you connect it to the Internet for BBC iPlayer or Netflix or whatever grows your beans. If it has a camera, like the Skype-supporting Samsungs, then remote attackers could possibly switch the camera on without you knowing and film you. Nice, eh? I hope you don’t do anything naughty in front of your TV as you could find yourself a revenue generator for unscrupulous porn producers in no time.

But your TV is the tip of the iceberg. Lights, fridges, central heating, alarms, cameras, ovens that can be pre-warmed remotely, garage doors that can be opened remotely — if it’s electrical, some asshat is trying to figure out how to make it “smart” and connect it to the Internet. It’s the “in thing” in our modern connected world. It has the right marketing buzzwords, because, well, if it can’t tweet then it’s just not cool, right? Your refrigerator could very well soon be ordering food for you when you run low and if that doesn’t fill you with terror then read on, I’ve got more.

But what’s the worst that can happen?

So let’s play a brief game of “imagine the worst that could happen”. How often do you have to update your computer and its software to deal with security issues? Weekly? Every bloody day for Adobe’s “software”?1 Now think about all of these Internet-connected appliances. Are they updated regularly? Are they updated at all? Are their passwords secure? Were they even changed from the defaults? Well, the IPv4 map paper answers this question remarkably clearly: no, most are not. Most routers and “Internet appliances” are a fucking outrageous swiss cheese of security holes. And this means that in the future, a hacker would be able to secure control of your house. Did you miss that? Let me say it again, it’s important: in the future a hacker might be able to gain control of your house. Where you and your loved ones live. They could let themselves in. At night. Or disconnect your electricity, steal your car, take your stuff or simply gather the data that they need to electronically empty your bank accounts. And that’s just the effect on one person. What if it was a co-ordinated attack on thousands? Or millions of people? What about the power substations? Distribution plants? What if all of those were attacked at once? Screw bombs and guns: digital warfare and terrorism are the way forwards and have the added advantage of leaving infrastructure in one piece but simply disabled.

You could easily sleep-walk into letting each one of these badly configured appliances become an accidental repository of your passwords and personal data if you allow them to access Twitter, Facebook, Paypal, Amazon, etc. instead of using a (relatively) secure computer. Furthermore, by doing so, you are relying on them to incorporate, maintain and employ first-rate encryption and security practices to protect that data, much of which can be used to spend your money. Frankly, having seen what’s under the hood of some of these devices software-wise I wouldn’t trust them with my dirty underpants, let alone any passwords, username information, browsing history or payment capability.

Moral of the story so far: We do not take security seriously and we should.

Am I scaring you sufficiently yet? I hope so, because this is unbelievably important stuff and as a population we do not take it seriously enough. We need to hold everyone involved to a higher standard and stop tolerating piss-poor security. You read about passwords being stolen daily. This won’t stop until we make it stop.

You should neither tolerate lax computer security nor be part of the problem. Your passwords should be carefully chosen. They should be different. Don’t be lazy and re-use the same password for multiple services. You should never use insecure or untrusted WiFi. Never. And if you have any of your services from a company that stores your password in unencrypted form (companies saving money by adopting a “sort-of-fix-the-half-arsed-job-if-anyone-notices” policy is costing us, the customers, dearly) then you should vote with your feet before someone takes your stuff. Oh, and report them to the plain text offenders people. And tweet at them. Big companies still don’t understand the unparalleled power that an individual consumer now has thanks to social media, so use it. The rules of thumb are quite simple. For any company that has personally identifiable information about you:

  • If they can tell you your password or email you your password then they operate unacceptably crap security practices and should absolutely not be trusted for anything
  • If they provide a customer login on their web sites that is not fully encrypted using HTTPS and an appropriately up-to-date, verifiable certificate then they should not be trusted either

If they make excuses for either of the above then they are within a hair’s breadth of being criminally negligent with your data, because it shows they have no comprehension of why these practices point to a fundamental fault in their thinking that will lead to problems.
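For anyone wondering why “they can tell you your password” is such a giveaway: a service doing this properly never stores the password at all, only a salted, deliberately slow hash of it, so there is nothing to email back. A minimal sketch of the idea (Python; the parameters are illustrative, not a recommendation):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000          # illustrative; pick a cost that hurts attackers

def store_password(password):
    """Return (salt, hash) -- this is all a well-behaved service keeps."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def check_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = store_password("correct horse battery staple")
print(check_password("correct horse battery staple", salt, digest))   # True
print(check_password("password123", salt, digest))                    # False
# Neither function can hand the original password back. If a company can
# email it to you, they kept it in a recoverable form -- which is the problem.
```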

And finally, a cautionary tale

And as for you, the reader, a cautionary tale. Whilst on the misery-inducing train to London a few weeks ago I overheard some bloke talking about how his Yahoo mail account was hacked2. He was having the conversation sufficiently loudly to ensure that everyone in the entire carriage heard, so I’m sure he won’t mind if I explain how he, his wife and his two teenage kids all had their accounts hacked too. They all used the same computer, of course, which was probably riddled with malware and other such things, but their solution to their email accounts being hacked? To simply email their friends and say “ignore the spam you’re receiving from us, we’ve been hacked”. Did he do anything about it? No, of course not. It’s like saying “ignore the punches in the face I’m giving you, I love you really”. He’s exposing his friends and family to risk because he’s ignorant of the issue and he’s applying foundation to cover up his case of airborne Ebola. Selfish. And that’s being generous.

However, I mentioned that this was a cautionary tale. Well, that’s because I know his full name, his email address and password. He mentioned the latter to the bloke he was talking to on the blower, and it would have taken the most determined “la-la-la, I can’t hear you” deployment to miss the rest. I’d bet a grand, right now, in cash, that he uses that password for everything. Everything. But it doesn’t matter anyway, because that’s his one and only email address, so using lost-password recovery to find the rest is a trivial exercise.

I have the keys to his digital life.

And he’s lucky I choose not to use them.

Tomorrow, it could be his physical life too.

Please be careful when attaching devices to the Internet. Think twice. Be careful. Be smarter than your appliances.

It’s your stuff. Your life. Your loved ones. And they’re precious.


1 … but you’ve got to admire their latest automated remote wallet-emptying scheme that manages to involve the word “cloud” as a method of charging you more for less. Honestly, it’s genius. Now, those irritating users who don’t/can’t afford to upgrade each and every time a new version is released are captured forever. Now, instead of paying a metric fortune to use Photoshop, you’ll pay an imperial fortune, in monthly instalments… forever! Muuuhahahahahaha, etc.

2 It wasn’t “hacked”, it was hijacked. There’s a subtle but important difference. He made (at least) two mistakes: 1) he logged into Yahoo mail, which does not have two-factor authentication and then *didn’t log out* and 2) continued to browse whilst still logged in. Eventually, some unscrupulous advert loaded with dodgy Javascript did the magic, harvested the cookie and that was that. Take home lesson here: do not stay logged into Facebook, Twitter, LinkedIn, Yahoo Mail or anything else whilst idly browsing the web. Either that or have a separate, different browser for general web surfing. A little care goes a long way to securing your life.

Edit 6th June 2013: Well, there you go. “Smart” Televisions: the security swiss cheese that keeps on giving!


I know what I did last summer

You, like most people on this planet, probably have a dishwasher. Even if you don’t, you almost certainly know how to use one. I, on the other hand, had neither owned nor used a dishwasher until last summer when, thanks to an outstandingly generous gift from friends, we took delivery of a beautiful newish appliance and the Cobra family finally joined the 21st century.

Getting the dishwasher from its old location to its current one was not an unchallenging operation. Firstly, it required loading into the car. This kerfuffle of industrial-grade tetrising completed pretty much event-free until it was observed that water was coming out of various holes in the car. There are many holes in the car, only a few of which, such as the doors, are meant to be there. The swiss cheese that is the car containment vessel was demonstrating that dishwashers contain magic compartments, all of which are full of water. Additionally, the so-called “empty pipes” (honestly, we actually emptied them) were not empty at all. With the dishwasher safely absorbed into the car and water dripping from every crevice, I expressed my many thanks and, with a bladder soon to be alarmingly full of tea, departed. Driving an antique car with tyres that could probably do with an air top-up, complete with a vast appliance in the boot, is like driving a boulder on springs. With the typical British summer raining surprise torrents onto the road, I boated myself back home, both surprised and relieved to arrive alive and well. Mrs Cobras and I managed to wrestle the appliance into the garage and went off to enjoy a friend’s first birthday (or, to be more accurate, our friend’s son’s first birthday, we don’t have many one-year-old friends). I ate much cake. I like cake.

I appreciate that you’re probably gripped by this story. I’d like to say it gets more interesting, but it doesn’t — although there is a cool flood moment a little later on that’s worth hanging around for — so if you’re still with me, I can only apologise.

Dishwasher is Good and Evil

Every appliance has a dark side. This one’s dark side allowed it to trample the competition and win this year’s “DOCUMENTATION WITH THE MOST ASSUMPTIONS EVER ABOUT PRIOR KNOWLEDGE” award.

Sunday night, friends came over for an innocent roast dinner. Of course, the bloke visitor was roped into helping me lug the dishwasher up to the kitchen on the first floor. We got to the top of the stairs with this device, which weighs the same as a box of anvils, before realising that the stair-gate I’d lovingly installed to prevent baby Cobra from escaping narrowed the available space to slightly less than the width of a standard appliance. With the dishwasher balanced on the top stair and held in thin air by various arms, tools were passed up and down and the stair-gate was disassembled. Incidentally, this is a cracking demonstration of why “planning ahead” has so much value in life.

Finally, the dishwasher slid nicely and snugly into its little hole.

And there it stayed until Monday night when I had a few moments to double check that everything was correctly connected before doing something stupid, like switching it on. I routed the waste pipe exactly as the documentation said and attached it to a nice handy little attachment under the sink. Wey hey! All done. Water on, no explosions, time to add all the bits and pieces.

Now I recognise that this is all obvious to you, but it wasn’t to me. I see the television adverts that look like flying jellyfish scenes from Avatar about just how amazing these “all-in-one” tablets are. Various friends have described those adverts as “lies”, “bollocks”, “criminally wide of the mark” and, my favourite, “designed to snare the lazy and bone idle with promises that can’t be kept”. I have been told that you get the best results — by far — if you use separate softening salt, rinse aid and detergent. So, off to the supermarket I went to buy these three things. The supermarket shelves, I observed, are specifically designed to hide these items, so I can only assume margins are quite low on them or they can’t be blended easily with horse. I therefore felt a degree of pride that I was able to find all three and successfully get home with them without being upsold all sorts of crap (I did buy two Crunchies, though).

Right! Time to put all those things into the various slots and holes. Like sex, I figured it would be obvious. Blah blah, in, out, put this here, that there, rotate 90 degrees, etc. After all, I had the manual, so it should be a doddle, right?

Let me just say. Dishwasher instruction manuals are designed for people with prior knowledge of dishwashers. It is assumed that nobody left on the planet is unfamiliar with the basics. It’s like when a manual says “plug it in”. They don’t feel the need to show you a picture of a plug, a socket and show you how to orientate the plug and slide it in, they just assume you know that shit, right? It’s fundamental knowledge. Well, I felt like a caveman trying to fly a 747. Put the salt in, it says. It doesn’t say how much. It doesn’t explain that the hole it goes in should be full of water. Armed with the entire Internet’s knowledge on the subject and the manual, it took me twenty whole fucking minutes to suss this out. Even then, I was worried that I was pouring salt into the flux capacitor socket and destabilising the plasma injectors.

Loading the rinse aid was just as hard. With no indication of how much, I just did it the same way as the salt. Add a bit. Close door. Switch on. Is the low-level light off? What? No? Ok, open door, add more, repeat. Finally, salt and rinse aid added. Put the detergent into the two fucking odd-looking holes. Close door. Detergent pours out of one of them. Is that right? Oh, yes, it turns out it is. Nice of them to say that in the “documentation”.

Ok, all done. Test dishes loaded. Start.

Humm. Whirr. Buzz. Well, it was making all the right noises. I heard water going in, stuff going on, all looked good!

Until twenty minutes later. FLASHING LIGHTS. THEY’RE ALL FLASHING. NOOOOOOOO! Check the manual. Oh, what? That’s a code that isn’t covered in the troubleshooting? What have I done? Have I broken it? Opened door. A couple of centimetres of water. Ah. Ok, drainage problem. Take drainage pipe off (with a saucepan underneath, just in case). Hissing noise. Uh huh. Check attachment. Oops! Little black stopper! Remove stopper. Try again. Same problem. Remove pipe again. Hissing noise. So I poked the attachment: ah yes, it wasn’t actually cut and was therefore going nowhere. Tested this by switching Mr Machine on and off quickly with the pipe poking into a saucepan. Dumped the drainage tube into option 2, the stand-pipe, and Bob’s my uncle, the dishwasher continued! Another DIY victory, safely chalked up.

The next morning, Mrs Cobras opened the door to reveal… WASHED DISHES. Washed automagically by a wondrous machine of splendour!

What’s the worst that can happen?

I’m not a plumber. I’m a software architect. A technology strategist. A programmer (but absolutely not a ‘coder’, one of the wankiest undervaluations of a software developer’s skills and role ever coined). I design and implement applications, simulations and algorithms. I talk about them, I sit in meetings, I write documents. But me, with pipes? Not a good idea. Still, the stand-pipe option was trivial with a capital T, so even I could figure that out. Well, allow me to let you in on a little secret: don’t. Just don’t. Get a plumber. A few weeks later, Mrs Cobras and I were sitting comfortably in the lounge, drinking wine and watching the telly whilst the dishwasher whirred away cleaning our dishes. We’d have gone to bed, but there was a disagreement over whether we should open another bottle of wine, so eventually we played it safe and did so. Had we not had more wine1, the following… learning experience… would have contained 500% more learning than we needed for the lesson to be absorbed.

Water.

Lots of water.

The critical piece, when missing, causes floods

Why, oh WHY didn’t I call a plumber? The “critical piece” is a clip that stops Mr Pipe slithering like Mr Snake when water flows through him.

Sounding… more watery than one would expect. In fact, it sounded like water was pouring out of something. Eyes met over a glass of wine, looked at the kitchen, and off we went. The dishwasher waste pipe had popped out of the stand-pipe and was pouring water into the cupboard under the sink. This in turn was leaking on four sides onto the floor. The ten minutes that followed are a bit of a blur, but several rolls of kitchen towels were involved as well as the discovery that the kitchen installers had simply brushed their building detritus underneath the kitchen cupboards because they were lazy fuckwits doing a typically half-arsed job. Clearly my moral high-ground position here is shaky, but they were supposed to be professionals, whereas my plumbing qualifications… still, I digress.

It turned out, incidentally, that there is a funny little bit of plastic (a “Flibop”, as I call them) that sits on the pipe and stops it from snaking its way backwards when water pours through it. See Figure A, “why, why, WHY didn’t I call a plumber” for details of the missing bit.

Obvious thing isn’t obvious

You may find this amusing, and if you do, I hate you. But irrational hate aside, obvious things are only obvious if you know a little about the subject. And I didn’t. So I screwed it up. And believe me, I read the documentation provided with the dishwasher with great care. Problem was, it was shite, and made massive leaps of faith as to how much the reader already knew.

However, I have been told that life in the kitchen is unacceptable without a dishwasher. Never having had one, I failed to see how this could be the case. After all, I can cook an entire roast and bake a cake without needing a dishwasher, so surely, they can’t be that good?

Well, yeah, they are.

Mr Dishwasher, I love you.

Mr Dishwasher manufacturer, though: you, like so much of the world, have no comprehension of how to write documentation that makes sense. You make the same bloody mistakes every time:

  1. One manual covers ten variations of the product. You just put footnotes everywhere saying “not available on all variants” without specifying which.
  2. Illustrations cover one variant. This leaves the user baffled and confused unless they have prior knowledge, plus, the onus is on the user to unwind the complexity, not the documentation: this is the wrong way around.
  3. Like crap cookbooks, you do not specify amounts anywhere. “For a while”, “a bit”, “add salt” don’t help me understand how much or how long.
  4. Badly translated English and poor perspective on illustrations, coupled with a baffling insistence on saving on localisation costs by labelling everything on pictures with A, B, C, D, E… and then providing a faulty key in the main text. What? J? There’s no J.

I hate you, but not your product.

Lessons in software

Well, it’s time for some tenuous, shaky link that attaches this to the field of software development. You can treat the above as the longest introduction to a software rant ever or this paragraph as the worst segue ever, whatever wobbles your jelly. User interfaces. Yours, and your documentation, probably suck. Frankly, these days, it is all about degrees of suckage. Even Apple seem to be joining so many levels of simplicity together that it becomes complex2. If you want your user experience to suck less than everyone else’s does, get people who’ve never used – or seen – your software before, and get them on the case. And listen to them. They’re not stupid if they can’t use it; they just don’t have the benefit of prior knowledge. I built a 4-bit microprocessor out of NAND gates when I was a kid, but I couldn’t figure out for the life of me how to work or plumb a dishwasher. So, in handy tear-out bullet-point form, here are the relevant lessons:

The incredibly un-user friendly 'Tuesdicoccus Daylorii'

This germ is responsible for terrifying symptoms: it’s the Tuesdicoccus Daylorii – but even he is more user friendly (and certainly easier to install) than most software these days.

  • You, yes YOU, cannot evaluate your own software’s user interface. You are the worst judge of your own UI.
  • No, I mean it. You can’t. Get someone who’s never used it before. Plonk them in front of it. Ask them to use it. Give them no clues other than a “back of box” paragraph.
  • Seriously, I wasn’t kidding. If you’ve used it, you already take it for granted. You’ll iterate it based on stuff you already know about using it.
  • User interfaces are easy to get ‘inside out’ – make sure they go in the direction the users expect, from the focus of attention inwards.

(And your ‘free’ bonus point: don’t put a shrunken-down PC user interface experience onto a touch-based device. Please. It’ll suck more than a room full of vacuum cleaners. It’s like controlling a mouse pointer with a console controller: it’s just wrong, and you know it. They’re fondleslabs, as The Register calls them. They beg to be fondled, so design them so that they are.)

I’m sure you get the point. UIs and documentation are, in my humble opinion, one of the few areas in life where gradual iteration doesn’t tend towards excellence – it tends towards layers of steaming dung lovingly built from good intentions.

This is even more important today than it has ever been. With “freemium” and “ad-supported” being the flavour of the day, users (or “products” as they are now known) invest precisely nothing to get started. If your software isn’t self-documenting, bloody obvious and utterly unchallenging to use then they’ll move on. Stuff has to be a joy to use, beg to be interacted with and have no barriers to entry at all. None.

I’ve awarded medals for this sort of rubbish before, so today’s goes to Hotpoint and their Aquarius dishwasher. Speech! Speech!

Anyway, it’s time for a relaxing, stress relieving tea. I shall have it in a mug lovingly cleaned by the new inhabitant in the kitchen. Muuuhahahahahaha!


1 Remember folks: Wine. Saves lives. And floods. Drink some today.
2 I was over-preparing some notes for a talk a while back and I observed that iCloud was saying “hey, there’s a different version of this document on the computer ‘Sack of Snakes’. Which one do you want?”, to which I thought, “well, I don’t know. You’re the data-processing device capable of billions of operations per second, why don’t you tell me? Show me the differences, side by side, let me slide things around nicely on my iPad until I’ve got it sorted, and then give me a big, magic button labelled ‘yey, and make it so’.” It is amazing how small changes and tweaks, when added together, result in something so unfriendly.


Overcomplicated shark is overcomplicated

“This is my 22nd year doing this, and every single console transition we’ve seen an increase in development costs. Over long periods of time it gets smoothed out, but I would say this is not a transition where that’s going to be an exception.”
– Activision CEO Bobby Kotick, commenting on the expected rise in costs of development for the upcoming new generation of consoles

Sorry dude, you’re doing it wrong. But don’t feel too bad, you’re certainly not alone. Like so many, you’re looking at methods of managing complexity instead of investing in figuring out how to eliminate it in the first place.

Over complicated shark has lasers. Yes, lasers.

Overcomplicated shark is overcomplicated. He’s OK for getting rid of the odd minor eel infestation, but when it comes to an entire ocean, his single missile/single laser design just doesn’t scale. Cloning him isn’t the answer, he just isn’t an appropriate solution any longer. Software complexity kills software.

When my daughter hides her breakfast Cheerios between the sofa cushions, that isn’t solving the problem, that’s hiding it. When mummy and daddy sit down and find the experience crunchier than it should be then the game is over. Right now, every generation of consoles does indeed increase the cost, and therefore risk, of development. Who benefits from this? Bueller? Bueller? That’s right, groovers, nobody. Innovation along with new ideas go straight out of the window in favour of the least possible risk option. Consumers end up with more of the same, but, ironically, less because as fidelity of experience increases, so does the ability to spot how utterly shite computer AI and world dynamics really are. Besides which, the fire exit might look pretty, but as was the case in that Simpsons episode, more often than not, it’s just painted on. Our always-on (unless BT supplies your broadband, in my experience), online, socially connected world fills modern entertainment experiences with vast numbers of the most unpredictable elements possible – humans – and tries to contain them in a highly specified, top-down construction that can’t possibly keep the trousers of plausibility up forever: and it won’t, despite everyone’s best efforts.

But let’s scale new heights of wrongness with this year’s winner (and by a country mile): one of the commenters on the story I got Bobby Kotick’s sound-bite from. He said… drumroll…

“I believe Microsoft and Sony are irresponsible in bringing technology into an industry where it is almost impossible to recoup development costs. Development should be simple – tools should be fast and they are not. Tools are very much LAGGING behind technology…”

Do you hear that noise? No, and I don’t either. And that’s because it is the world’s smallest violin playing just for the author of this comment. I have not laughed so much since I tried gas-and-air when Mrs Cobras was about to give birth to Baby Cobra (just to test it for safety, obviously). If ever there was a clearly illustrated detachment of cause and effect, this is it. The manufacturers? Irresponsible? They’ve given you hardware of incredible, awesome power and all the tools you need to do stuff with it. The rest is your problem, not theirs. My heart bleeds for your complete, comprehensive failure to figure out how to manage software complexity properly. Developers are reapplying the same tried and tested methods, tools and processes that worked well when life was simpler and had fewer bits, bytes, bobs and threads, and acting all confused-like when they don’t scale. I add vanilla essence to my cake mix with a teaspoon but I wouldn’t fill a swimming pool with one (I’d use a dessert spoon, obviously). How does this chap get to work? On a plank carried on the backs of a million ants?

Let go, Luke, use the force…

If complexity is a problem, and it quite clearly is, then instead of working out how to manage that complexity, work out how to do it with less in the first place. Something with less “agile” in it, where the computer, which is, after all, capable of billions of operations per second, actually does some of the work for us. And this, I humbly offer, is where bottom-up, nature-inspired, model-based solutions have so much value: develop seeds, get the jungle for free. It’s the Lego of software design — create life-support systems for small, simple things and allow an incredible web of beautiful complexity to emerge. As a programmer, this is gorgeous: I need not know how to solve the problem in order to solve it; I merely create an environment where those problems can solve themselves.
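None of this is magic, and it’s easy to taste the flavour of it. Here’s the classic toy (a one-dimensional cellular automaton in Python, offered as an illustration of the principle, not anyone’s game engine): one seeded cell, one trivially simple local rule, and out pours structure that nobody designed.

```python
# Rule 30: every cell obeys one tiny rule about itself and its two neighbours,
# yet from a single seeded cell a famously complex pattern unfolds.
RULE = 30
WIDTH, STEPS = 64, 32

row = [0] * WIDTH
row[WIDTH // 2] = 1                       # the seed

for _ in range(STEPS):
    print("".join("#" if cell else " " for cell in row))
    row = [
        (RULE >> (4 * row[(i - 1) % WIDTH] + 2 * row[i] + row[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```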

The crying shame, of course, is that somewhere in their hearts programmers know this. They embraced dynamics models to make their cars, bikes, tanks and helicopters behave better, yet the extra step of architecting entire engines this way eludes them. Ultimately, it’s scary to not have control: bottom-up development philosophies trade predictability and understanding for simpler, more reliable – but mysterious – solutions. Don’t underestimate the requirement for having control: it’s the foundation of traditional design, scheduling and programming that puts the pipe and slippers onto developers world-wide. It is frightening to answer any question with “emergence” or “I don’t know how it will work, but it will”, and designers generally hate it: you can’t script such a beast, you merely guide it. Management and investors don’t like being bathed in variables, and even where risks are taken, those risks need to be mitigated better than “trust me, this’ll work”.

It’s a pity that few are ready to take the leap of faith that true bottom-up emergent simulation engines require because believability and suspension of disbelief come from the grey areas. The details. And it’s details that are expensive to produce: the production of millions of possibilities and variations where most people only see a small fraction. To create so much software and expose the average user to so little of it might be considered wasteful. If nature had built our brains that way, they’d be the size of a planet and wouldn’t work. Creating software that adapts itself to the needs, desires and demands of the person using it would seem on the surface to be a far more efficient method of development.

Not just hot air, just mostly hot air

It's more impressive than it looks. It's constructed, on demand, with nothing but a massive cash-in on nature's 3.7-billion-year R&D program

We all know that armchair generalling is a hobby most enjoyed by those holding forth on subjects in which they have no expertise, but I've seen this software development philosophy work three times in the last twenty years, two of which were with architectures I built myself. It is possible to spawn incredibly intricate environments from simple things, and the picture to the right, circa 2004, is (sadly) all that I can provide as evidence of something that once had so much potential. It is from a product where the entire environment and all its content are grown from scratch. From seeds. Every day, a different world, and not just painted scenery: these things are real, they grew. You could poke them. Touch them. Interact with them. Influence them. With an accurate enough underlying model and a large enough population of building blocks (and surprisingly few rules for each of the thousands of entities), everything that you believe is possible is possible. A world from nothing, in seconds. Oh, and it evolved, too (genetics, it turns out, need not be restricted to living things).

But — and as usual, this isn't the nice 'but' with two 't's — this was an era before easy self-publishing, when Apps were called "Applications" and it required an enormous amount of money and resource to bring a product to market. Of course, that's not to say we had the right product anyway, because if there was one thing that we were world-class at back then in the mid-naughties it was disagreeing with each other. World class. It's fun to imagine alternative realities, so long as we first brush aside the obvious fact that if ifs and buts were sweets and nuts it would be Christmas every day. Perhaps if we'd had the wonders that social media brings to marketing and community building, as well as a route to getting products built with this technology directly to devices and consumers as an iterative affair, then things would be different. But maybe not. Sometimes there is a time for these things and maybe then wasn't it.

However. Maybe now is. Exciting things are growing…

… figuratively and literally.

Isn’t that nice?


The Incredible Adventures of Rattles McCoil

SpongeBob with a snake!

Oh! ♫ Who lives in a ♪ pineapple under the ♪ sea, SpongeBob ♬ SquarePants…

Imagine this scenario. You are sitting in front of the TV watching the Spongebob Squarepants movie1 and you're a little peckish. In fact, what you could really, really do with for that authentic movie experience is some popcorn. No problem! In your cupboard, you have microwave popcorn. So off you trot to the kitchen (pressing pause because it's a particularly funny scene where Mr Krabs is about to say 'with cheese, Mr Squidward, with cheese'), slap it in the microwave and hang around for a couple of minutes until it's done. Nice job! So for maximum efficiency, you now put the popcorn in a bowl and leave it in the kitchen. Then, whenever you want a bit of popcorn, you walk into the kitchen, get one piece, go back to the movie, eat it, rinse, repeat. What? Inefficient, you say? You'd have taken the bowl with you? Of course you would. Which is why this post on Stack Overflow seemed so staggering. A colleague suggested it might be a setup for the Daily WTF, which I think we all hope it is, but we're tainted, having actually seen production code in actual products that's as bad, if not worse, than this, and this, at least, is from a beginner. Furthermore, the guy in question has asked several such questions so… well, who knows. In a nutshell, the chap is writing an adventure game. He wants to show a map, just in simple text, with an X marking the location of the player. The map is a 13 x 10 grid and has, therefore, 130 locations. Here's an example of a line:

[X][ ][ ][ ][ ][ ][ ][ ][T][ ][ ][ ][ ]

I thought that the Ts were trees. Which would have been nice, because Ts do actually look like trees, but it turns out that they represent towns. His solution, and this is almost unbelievable, is to repeat the code that shows the map one hundred and thirty times. Then, using "if my location is 3, 4" type programming constructs, he chooses which one to show.

Thanks to wonderful resources such as Stack Overflow, instead of allowing his natural programming spidey sense to develop, he simply posted an example of one instance of the map display with this as the money shot:

“I have the map typed out 130 times in 130 different if statements, each time the X is somewhere else. I would think there would be a better way to do this, but I have no idea.”

No idea? What, none? You didn't have even the smallest alarm bell dinging at, say, 50 copy-pastes into this massive affair? You completed the entire exercise of creating 130 repeats of the same block of code with a minor, predictable difference without thinking "perhaps I could do this with a loop, or something"? You had to ask on an Internet resource with absolutely no clues at all? Did the brain even warm up? Or did sheer laziness prevent any thinking whatsoever? For crying out loud, this is worse than cutting your lawn with nail clippers and then, having finished, observing to your neighbour that it is "amazing that there isn't a better way of doing this".
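For the record, and purely as a sketch (the original post doesn't include his map data, so everything below simply reproduces the example line above, with the town position hard-coded for illustration), the entire one-hundred-and-thirty-if-statement edifice collapses into something like this:

#include <iostream>

int main()
{
    const int width  = 13;
    const int height = 10;

    // The player's current location; in a real game this would come from the
    // game state rather than being fixed here.
    const int playerX = 0;
    const int playerY = 0;

    for (int y = 0; y < height; ++y)
    {
        for (int x = 0; x < width; ++x)
        {
            char cell = ' ';
            if (x == playerX && y == playerY)
                cell = 'X';
            else if (x == 8 && y == 0)  // the town from the example line above
                cell = 'T';
            std::cout << '[' << cell << ']';
        }
        std::cout << '\n';
    }
    return 0;
}

Change playerX and playerY and exactly the same dozen lines produce any of the 130 variations. That, rather than typing, is what the computer is for.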

It's wonderful that people are learning to program. It's a valuable, interesting skill that — in theory — teaches all sorts of problem-solving skills, but for those to develop, they need to be practised. This means attempting to figure it out for yourself. When you finally reach a dead end, show your working when asking for help (he was asked "what have you tried?", to which he replied "I've not really tried anything yet", emphasis mine). It's OK to ask for help, but the assumption is that you've at least made an effort that can be quantified without requiring an electron microscope to find it. As an added bonus, people can advise you on your problem solving and how you might save some time later. It's win, win and win. You don't win friends with bone idleness, just as you don't with salad. Laziness should neither be encouraged nor tolerated: I genuinely hate to appear negative towards someone who is 'learning' programming, but this is programming by proxy, or, more specifically, programming by piecing together other people's solutions to your problems.

A while back, I said I was going to do a C++ tutorial via the medium of writing a 1980s-style text adventure game. Needless to say, I completely failed to get around to it. This isn't from lack of desire, it's from lack of time (plus only one person was pestering me and not nearly often enough – you know who you are, Montaigne). However, since I've had a draft of this hanging around on my hard drive for the best part of a year, I figured it was worth brushing it down, beating it into shape and slapping it onto the blog to make it look like I'm doing regular updates.

So let’s crack on.

Obviously, we won't be worrying our fragile little minds with minor details like a schedule or a proper design document because we're making this up as we go along. Furthermore, this is an exercise in general amusement that may or may not last longer than a few parts, depending on whether it turns out to be as fun as I imagine it to be (particularly since most of that imagining takes place after a gin and tonic and a few glasses of wine). Still, we can't simply write code blind, so we're going to need something to start from: a plot! Here it is. Have some tissues handy, it's a real tear-jerker:

“Rattles McCoil is a rattlesnake. He lives in the jungle with his wife-snake and three children-snakes. Every day he goes out to hunt for food to feed his family: shrews, mice, that sort of thing. Unfortunately, one day, he is slithering around the jungle when suddenly… an evil exotic pet dealer leaps out from behind a tree and captures him! Oh no! Shoved into a dark, frightening bag, he rattles away but no-one can hear him. The next thing he knows he's bumping along on the back of a jeep, away from his home, friends and family. Then, just when it looks like all hope is lost, the jeep hits a large pot-hole and comes screeching to a halt with a broken wheel. Rattles notices that his bag has come untied with all the bumping and, quickly, he slithers like he's never slithered before, being extra-specially careful not to rattle as he escapes. But Rattles is a long, long way from home. Can you guide him back? CAN YOU? Of course you can.”

You’ve got to admit, it’s genius. We’ll also need a rough map to work from – something that describes the lie of the land and the general location of everything that we’ll use to build our adventure game. Here’s what I’ve got so far:

So many wonderful places to explore but, much more importantly, it features an OXBOW LAKE! Yay for my 'B' in O Level Geography! I knew it would come in handy eventually.

Eh? Eh? Honestly, I’ll bet you’re gagging to play this game now. Who wouldn’t want to traverse the Marsh of Crocodoom or the Plains of Waiting Disaster? And those Shrubs of Terror are particularly inviting. Of course the map features a plateau and a volcano. All good adventure games have volcanoes, because they’re cool and they feature deadly lava.

Finally, we need at least some form of specification even if it is “specification-light” to guide us in the early development. This need not be in any way comprehensive. Which is good, because this is possibly the least comprehensive design document ever produced by mankind2. Here’s the product summary:

“A text adventure game as they used to be done. Simple verb-noun commands such as 'slither north', 'eat shrew' and 'rattle'. The adventure will consist of a hundred or so “rooms” and thirty or forty objects that can be used in a variety of different ways. An effectiveness score is calculated based on how efficiently the player's journey follows the optimal path. Various puzzles need to be solved; some solutions require the player to find and assemble things whereas others are traditional mazes. A very simple fighting system provides a little more excitement than a simple 'fight giant spider' and comes with various augments such as armour, specialised weapons (fangs, poison) and food-based bonuses. A sense of urgency will be provided by other game characters that autonomously attempt to hunt Rattles wherever he is. Scattered around the gaming environment will be a selection of treasure chests, keys and other relics that Rattles can take home to complete the adventure with additional style and satisfaction!”
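Just to give a flavour of what "simple verb-noun commands" might look like in code, here's a rough sketch (and only a sketch: it isn't the parser we'll eventually build, and the 'quit' command exists purely so the example can end) that splits a line of input at the first space:

#include <iostream>
#include <string>

int main()
{
    std::string input;
    std::cout << "> ";
    while (std::getline(std::cin, input))
    {
        // Everything before the first space is the verb; the rest is the noun.
        const std::size_t space = input.find(' ');
        const std::string verb = input.substr(0, space);
        const std::string noun =
            (space == std::string::npos) ? std::string() : input.substr(space + 1);

        if (verb == "quit")
            break;

        std::cout << "Verb: '" << verb << "', noun: '" << noun << "'\n> ";
    }
    return 0;
}

Type "slither north" and you get the verb "slither" and the noun "north"; type "rattle" on its own and the noun is simply empty. Most of the command handling in a game like this can hang off that one simple split.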

It’s also kind of nice to have some idea as to what we might do with it in the future after the first release, in the unlikely event that there is a first release:

“Enhance the game with improved puzzles and basic graphics to illustrate the various scenes. Allow for the possibility of multi-player, where more than one player can take on roles inside the adventure at once. Look towards creating an iOS version so that it can be made available on the Apple App Store.” Who knows quite where we might take this as the days, months, years and generations roll forwards. If nothing else, it lets us avoid the accidental closing of doors by designing for a future regardless of whether we make it there or not.

Armed with our plot, the map and the "design", we're about ready to begin laying down some code, which is a shame, because we're flat out of time on this episode of "Stringing it out for years". But next time, next time we'll code like there is no tomorrow. Bring a toothbrush. And wine. Here's a "hello world" program to get you going because, well, it wouldn't be quite fair to have a programming tutorial that had no software in it at all, now, would it?

#include <iostream>

int main(int argc, const char* argv[])
{
    std::cout << "Hello, giraffe and snake lovers!\n";
    return (0);
}

In the meanwhile, if you're planning on following this tutorial (and you'd better be really, really patient to do so) then you might want to get yourself sorted with a C++ compiler and see if you can get the above to compile and run as a command line program. If you can, you're ready to start putting together the adventure of a lifetime, saving Rattles McCoil from doooOOOooOOOoom!

  • Windows users, try Visual Studio Express.
  • Mac users, install Xcode from the App store.
  • Linux users, well, you don't need my help, you probably already compile half your applications 😉

Anyhoo, tune in spring 2014 for the next gripping part of this series, perhaps earlier if anyone pesters me about it sufficiently.

Oh, and go and watch the Spongebob Squarepants movie. With or without popcorn.

-
1 It's a fantastic movie. Even if you're not a Spongebob fan (which you should be), it's a really fun watch.
2 This is not true. I have personally worked on major titles that have had no specification at all and two of them were Amiga titles that I was the only programmer on. Sometimes, someone knows the design so well in their head that it “just happens”. On the other hand, other titles have had virtually no design and have ended up a complete catastrophe of misplaced dreams, hopes and ideas. Like porridge, it’s too easy for it to be “too hot” or “too cold”. Just right, on the other hand, is tough. Getting it wrong incurs the wrath of bears, so be careful out there.
