If you ask a present-day programmer why graphical user interfaces look the way they do, with desktops and icons and windows and menus (what they call the WIMP paradigm), they're likely to explain to you how people can more easily learn to use software if it all looks and works the same, how they can explore the interfaces and apply their experience from one application to the next. How everyone is familiar with that type of interface by now, even if only indirectly, and how it's what people have come to expect.
And it's all utterly wrong.
This is somewhat off-topic here, pertaining as it does to software in general, not just games; though in my defense, the article that prompted it, called How Technocratic Hyper-Rationalism Has Birthed Right-Wing Extremism, does turn out to be about games in the end. But games are software, and software development has been going through a massive crisis lately. Two, actually: one of burgeoning complexity, and one of relevance. And this ties into a bigger trend -- pointed out by the aforementioned article -- of people focusing more and more on the shiny toys while forgetting the who, the what and the why.
I ranted against techno-utopianism before: the childish belief that more shiny toys will somehow cure all the world's ills by their mere presence, when it's not the toys you have, but how you use them. (Look at the hubbub surrounding clean energy and self-driving cars when the Paris Metro has been automated and nuclear-powered for decades -- and yes, nuclear is cleaner than coal.) Or that computer algorithms are somehow objective and unbiased, a notion recent case studies have thoroughly dismantled, but one technocrats love, for obvious reasons: it justifies the status quo in which they rule the world.
All too often, what I want is a small library to help with a specific task, but instead you're offering me giant frameworks caught in a mesh of dependencies that would dwarf my application and make it extremely difficult to distribute. I want to write apps for anyone to just download and run, but instead you're forcing me to think about ecosystems. I need to use my computer, and you're talking about leveraging the synergy of the cloud. I ask for a hammer, you offer a hydraulic press factory.
I learned a lesson these past few days: Starting something on the Internet has become frighteningly easy.
You can create a Google Code project, or a Google group, or a Yahoo group in seconds. Or if you have your own hosting, you can set up any number of Web applications in minutes. WordPress, MediaWiki, you name it: with practice, you can do it essentially without thinking. All the infrastructure required to launch an open source project (or the next great Web empire) can be set up literally on a whim... And therein lies the danger.
I like to brag about my work, for the simple reason that I like, well, my work. But it usually results in people taking me for a much better programmer than I am. Then again, that might have something to do with the decent number of completed personal projects on my website: working software, complete tutorials, playable games... Although they are small stuff (after all, there are people out there who write operating systems for fun), they obviously count as accomplishments.
Now, if even such limited success seems to elude you, you may think people like me have some secret ingredient. But all I have is a number of common-sense rules derived from (sometimes bitter) experience.
I've been a professional web developer for over 13 years now. Technology has progressed, fashions have come and gone, expectations have changed accordingly. Time and again, however, I see projects running into the same carbon-copy problems. I've ranted about this before, but every time I find more variations on the same theme. This time I found them all in a single project.
The most common mistake I see is customers asking for features because they've heard a website is supposed to have them. Don't. Your website isn't there for the sake of it. Keep your objectives in mind.
(Originally written on 30 June 2007. Revised on 25 October 2008.)
This article is dedicated to all the programmers who are still not using a version control system. I know you are out there; I was one of you mere weeks ago. But that has changed, and believe me, it's a world of difference.
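If you're one of those programmers, the barrier to entry is lower than it looks. A minimal sketch of getting started, using Git as one example of a version control system (the directory name and file are illustrative stand-ins):

```shell
# Put a new (or existing) project directory under version control with Git.
mkdir -p my-project && cd my-project
git init                                  # turn the directory into a repository
git config user.name  "Your Name"         # identity attached to commits
git config user.email "you@example.com"
echo 'hello' > notes.txt                  # stand-in for your real project files
git add .                                 # stage everything for the first snapshot
git commit -m "Initial import"            # record it; history starts here
```

From that point on, every change is one `git add` and `git commit` away from being recorded, and any earlier state of the project can be recovered.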
This is in response to a lambda-the-ultimate post on literate programming, in which they wonder why it never caught on. Now, there are several possible answers to that question, and they give one or two good ones. But I think they are ultimately missing the point.
See, the idea of literate programming was born out of the well-meaning but ultimately misguided notion that programming is hard because of the notation. It's the same thinking that gave us pseudocode and UML. Did I mention 4GLs? No, I didn't, because that particular idea died, thankfully, before I could be exposed to it (being young does have advantages, you know). The others survived long enough for me to learn about them and to use them with abandon for a while. Then I asked myself a simple question. Why am I writing all my programs twice?