These days, humanity is celebrating 50 years since the first Moon landing. Half a century since creatures of this Earth set foot on another celestial body. Inevitably, people in my line of work remember and celebrate one particular hero of that story: Margaret Hamilton, who led the software team and coined the term "software engineer" to describe what she did... because she was literally the first one.
And you know what? I think most modern programmers feel woefully inadequate compared to her.
You've probably heard the joke. I tweeted it myself just recently. It goes like this: the Apollo Guidance Computer had 4K of RAM and a CPU running at 2 MHz, and they went to the Moon with it. The smartphone in your back pocket is millions of times more powerful and crashes trying to render a large web page.
Of course, it's nowhere near that simple. The circumstances were different, and highly specific. Moreover, we are doing hugely important work with computers today that wouldn't have been possible 25 years ago, much less 50. Such as putting together the data from countless telescopes the world over to image a giant black hole in a distant galaxy (an effort also led by a woman, by the way).
However, the sheer magnitude of what NASA achieved in 1969, and the absurd discrepancy between that achievement and the humble computer that played a part in it, can't help but amaze us. We stand in awe at the dizzying perspective. And there are a few lessons to learn from it.
We like to think that spaceflight is complicated, but the AGC held all of 72K worth of software! For a point of comparison in the modern world, you'd have to look at cheap microcontrollers used in hobbyist electronics. Heck, we've sent space probes all the way out of the Solar System, not to mention to Mars and the rings of Jupiter, and they didn't use much more than that. Turns out it's a lot more complicated to get pictures from the end of the Universe. The Hubble Space Telescope had the equivalent of a 486 PC from around 1994. And that was after an upgrade.
You know what's even more complicated? Rendering a modern web page in a browser.
How in the world did we misplace our priorities so badly?
Moreover, the AGC software was written in assembler. They probably had line editors and teletypes for development. (In retrospect, Forth would have been ideal, but it wasn't invented until 1970.) Nowadays we have OOP, garbage collection, IDEs, debuggers, build systems, containers... and our browsers still crash with alarming regularity. Often because someone forgot to check for a null pointer after opening a dialog window.
Luckily, for scientific work we can make do with PDFs uploaded to an FTP server.
Do you see it now? Grossly underfunded for decades, misunderstood and mistrusted, science has continued to make amazing breakthroughs on a shoestring. And most of it, as a friend pointed out, is due to better lab equipment (leaving aside human factors). IT, especially software, is contributing less and less to the progress of humankind, at least in proportion to everything else. And when it does, it's often with the same tools our grandfathers used. Such as Fortran.
So what exactly are we doing with all the crap we invented in the meantime?
Bigger software? The Unix kernel was still under ten thousand lines of code in the mid-1970s. My first PHP app was bigger than that. And it wasn't anywhere near as important.
More correct software? Mom's phone gets massive software updates every few days, and still works like shit. The AGC was literally hardwired. By hand. With needles. There was no updating it once installed. Somehow, it worked well enough anyway.
Making our own work easier? Terrifying stories of crunch in the industry make the headlines with regularity, even as awareness and opposition have grown by leaps and bounds of late. Turns out, each new tool is yet another complication to account for: exactly the opposite of what we were promised. Funny how that goes.
We programmers can't seem to help ourselves much anymore, let alone other people. That vague feeling of uselessness that's been nagging at you for a while? Now you know where it comes from.
The real question of course is what to do about it. And there's no one simple answer. I do, however, have a humble suggestion: be humble!
No, you're not going to create HAL 9000. It wouldn't help send people to the moons of Jupiter anyway.
No, you're not going to usher in a technological singularity. Consider yourself lucky. It would be an unmitigated nightmare.
No, you're not going to cure cancer, make people immortal or the like.
You can, however, help a lot of people do a lot of things, if only you listen to them and genuinely try to help rather than show off. You'll be surprised how few of your fancy toys it takes to build what they need.