All Software Rots
Software is an interesting creature. It’s easy to imagine that a piece of software is, at some point, completed, and goes on to live and run for eternity. The group that wrote it retires, happy and comfortable, on the strength of that software’s success.
Except, this is not reality. Over time, software degrades, runs more slowly, or doesn’t run at all. Putting software out there and expecting it to remain useful means we’ll have to keep working on it. That’s something I’ve only recently come to appreciate. What causes this?
A technology stack involves a lot of moving parts. Hardware, firmware, drivers, operating systems, libraries, frameworks, browsers, APIs, and applications are all in the mix. There are other things I’m not even aware of, I’m sure. And these stacks support the existence, operation, and functionality of an application.
Each of these components has a purpose, design limitations, bugs, a shelf life, and a life cycle. Security vulnerabilities, performance issues, and annoying bugs are likely present too, perhaps waiting to be discovered decades down the line. Software written to run on systems in the 1960s has a different stack than software written in the 90s, which differs from software written today, which will differ from software written a few years from now.
When one piece of the stack goes extinct, the change ripples upward. Have you tried using games or other applications from a decade ago? It’s not easy. Large programs with many dependencies are most at risk: more pieces can rot out from under them, ruining the beautiful cathedral that many human-hours built.
Services are no exception to this rot. Take Heroku, for example. Developers can deploy their web applications without worrying about the nitty-gritty of configuring and managing a web server. The tradeoff is buying into Heroku’s platform, and into Heroku as a company, both of which are also vulnerable to rot.
Note: I use the free tier of Heroku, so I can’t really complain. It just happens to be the Platform-as-a-Service I’m most familiar with.
Even if you don’t make any changes to the functionality of a web application, you’ll have to work to maintain it. It won’t run for 10 years on its own. These platforms will upgrade their software, practices, and capabilities. Tech stacks will fall as others rise.
I use the word ‘rot’, which has certain connotations. But it’s similar to the rotting of a house. Termites just do their thing, and it results in the foundation of a house changing in ways that are structurally problematic. It’s not a moral judgment; it’s neither good nor bad. Similarly, a tech stack rotting is neither good nor bad. Developers, software, and hardware just do their thing, and it results in the foundation of an application changing in ways that are structurally problematic.
Dynamic sites are at more risk of rot than static sites. The dynamic part means there’s server-side code running in production and a database behind it. Security holes, bugs, and cosmic rays have more chances to make an impact. There are more moving parts that need tending.
Sites serving up static assets like HTML, CSS, and JavaScript have a better chance of surviving into the future. There are fewer moving parts. Static site generators are quite popular today, for many reasons, including their speed, cost, and ease of use. They’re interesting to me because the output has the potential for a long lifespan.
The generator itself can be subject to rot, but that doesn’t impact the site you’ve already created. You can swap out the toolchain, and, as long as it produces the same site, the change is transparent.
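To make that concrete, here’s a minimal sketch of what a static site generator boils down to. This isn’t any particular tool’s code; the posts and site directory names and the bare-bones template are assumptions for illustration. The point is that the output is plain HTML that any web server can host, no matter what happens to the script that produced it.

```python
# Minimal "static site generator" sketch: turn a folder of text posts into
# standalone HTML files. The output depends on nothing but a browser, so the
# generator itself can rot or be replaced without touching the published site.
from pathlib import Path
import html

TEMPLATE = """<!doctype html>
<html><head><meta charset="utf-8"><title>{title}</title></head>
<body><h1>{title}</h1><pre>{body}</pre></body></html>"""

def build(src_dir: str = "posts", out_dir: str = "site") -> None:
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for post in Path(src_dir).glob("*.txt"):
        page = TEMPLATE.format(
            title=html.escape(post.stem.replace("-", " ")),
            body=html.escape(post.read_text(encoding="utf-8")),
        )
        # Keep output filenames stable so URLs never change, even if this
        # script is later swapped for a different toolchain.
        (out / f"{post.stem}.html").write_text(page, encoding="utf-8")

if __name__ == "__main__":
    build()
```

The resulting site folder could be served from GitHub Pages, from a one-off local server, or from any web host at all, and it keeps working even if the generator that produced it is abandoned.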
I’ve currently got The Mechanism Collection hosted as a static site on GitHub Pages, for free. I get to piggyback on GitHub’s infrastructure, but I could take the static pages and host them elsewhere if needed. Perhaps the Pages infrastructure will rot in the future.
Static sites are something I’ll explore more going forward, especially for hosting writing, like this blog. I don’t author my content in WordPress anymore. The comments here don’t get used much, because the conversation happens on social media, where I share the links. There are third-party services for comments, although I’d be wary of using one. The blog itself could be static, and visitors likely wouldn’t notice a downside. The speed benefit would be noticed and appreciated, though. Keeping the URLs of posts the same would make the change even more transparent.
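As one illustration of keeping URLs stable: if the static site were built with a generator like Jekyll (an assumption on my part; any generator with configurable permalinks would do), WordPress-style post URLs can be reproduced with a single configuration line, so existing links keep resolving after the switch.

```yaml
# _config.yml (Jekyll): reproduce WordPress-style /year/month/day/title/ URLs
permalink: /:year/:month/:day/:title/
```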
Sentient machines are a lofty goal, but what happens when there’s a zero-day that turns that ‘being’ into an intelligent botnet? Rot is job security for software engineers. We have the opportunity to create lasting works of code, but we’ve got some work ahead of us.