In 2019, KTH Royal Institute of Technology in Sweden estimated that the internet uses over 10% of the world’s electricity. This figure is still growing, too: enforced isolation during the COVID-19 pandemic saw internet usage shoot up.
Naturally, there has been a lot of research on this topic, and I plan to explore some of it on this blog in future posts. However, most of it has concentrated on the hardware - the cables, routers, computers, and data centres which comprise the physical parts of the system. These are the bits that consume all this energy, after all, and if their consumption can be improved it seems reasonable to assume that the overall energy consumption of the internet as a system will decrease.
Unfortunately, behind this obvious observation lie some things which get a lot less airtime:
- All this hardware is only needed because of our insatiable appetite for software. Without software there would be no need for any of that energy-guzzling hardware. This demand for software (and the data it uses to do its job) shows no signs of decreasing, or even levelling off. Everyone wants, and uses, more.
- The businesses and individuals who use all these systems (on the whole) do not care about the resource usage of the systems they use, just the price. “The Cloud” is seen as a panacea. The price to use software is rushing toward zero (in monetary terms), with an increasing number of huge global systems offered as apparently free to use, but supported by advertising and data gathering.
- The software industry (on the whole) does not care about the resource usage of the systems it builds and uses, just the price. The price of computing resources is kept artificially low by exploiting economic and international loopholes, and borrowing from the future.
- There is an overwhelming lack of information. Even those people who do care have no effective way of finding out about the resource usage and environmental impact of the services and products they use.
Before you shout at me or click away in disgust, note that the points above are generalisations. I’m sure you can find plenty of examples where they are not true. For example, developers of software for mobile and battery-operated systems often obsess over power usage. That’s not my point though. Such applications don’t even move the needle on global power consumption. The big users are the massive data centres and corporate IT systems and all the infrastructure needed to feed them.
Individual people working with these systems sometimes care a lot about these things but, despite the legendary flexibility of software, they mostly can’t do anything about it because they don’t have the information to decide what to keep or what to change. Providing such information should be the job of the global community of academic and industrial research, but this is where this community has let these individuals down.
To understand why there is such an information gap, we need to dig into how software development actually works.
As a simplification, all commercial software development is driven by the same small set of factors, all of which ultimately boil down to money:
- Cost to make. This mostly consists of the salaries of the people involved, so more people for more time equals more cost.
- Cost to operate. This includes the salaries of people to support and maintain it and any associated business processes, as well as the equipment and energy costs to keep it running.
- Income. The amount of money generated (or saved) by the software system.
If the income is greater than the cost to make plus the cost to operate, over whatever period of time is being considered, then the project is considered a success. That’s business 101.
The trouble comes when you take a deeper look at the numbers. In the great majority of software projects the cost to make is orders of magnitude larger than the cost to operate, and that number is made up almost entirely of the cost of the people. If a team of 20 people with an average salary of $50,000 takes one year, that’s a million dollars right away. And many software projects cost a lot more than that. In the short term the electricity and equipment to run these systems is trivial in comparison.
Worse than that, many software projects have an income that is also much bigger than the running costs. If that million dollar software project earns a million a year, the cost of the resources to run it is pretty much pocket change. From a financial perspective, effort is always better spent on increasing the (large) earnings by a few percent than on trimming a similar few percent from the (small) costs.
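To make that asymmetry concrete, here is a minimal sketch using hypothetical figures loosely based on the example above (the 3% improvement and the $50,000 running cost are my assumptions, not numbers from any real project):

```python
# All figures are hypothetical, loosely based on the example in the text.
cost_to_make = 1_000_000    # 20 people x $50,000 average salary for one year
cost_to_operate = 50_000    # assumed annual running cost (hosting, energy, support)
annual_income = 1_000_000   # assumed annual revenue

# Apply the same 3% improvement to each side of the ledger.
gain_from_more_income = annual_income * 3 // 100      # grow the (large) income
gain_from_lower_costs = cost_to_operate * 3 // 100    # trim the (small) costs

print(gain_from_more_income)  # 30000
print(gain_from_lower_costs)  # 1500
```

With these assumed numbers, the same percentage of effort is worth twenty times more when pointed at income than at running costs, which is exactly why the resource bill never makes it onto the priority list.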
So in most commercial software development there is simply no incentive to reduce resource usage. Anything that takes any work (and therefore cost) will never happen.
But it is even more complicated than that, and this is where the reality of modern software development begins to leave the ivory towers of academia behind.
In the model of the world used in most of the academic papers I have read, software is written from scratch, starting with a blank page, and usually consists of implementations of a few relatively simplistic algorithms. These are evaluated, and conclusions are drawn about which are the most efficient.
Unfortunately, this model is a long way from the reality faced by actual software developers:
- Hardly anyone starts with a blank page. Software is almost always based on other software. These starting points include use-cases provided by a manufacturer, tutorial examples from the internet (or, occasionally, books), existing projects which have something in common, and so on.
- Software is not built from scratch. Programming languages come complete with language features and libraries to use for a wide range of purposes, and third party suppliers provide an almost infinite supply. In this sense, developing a modern software solution has a large aspect of shopping, of selecting libraries, components and frameworks to use in building the desired end result.
- There are hundreds or even thousands of alternative implementations of many components. People make software for fun and give it away for free. Lots of people, and lots of projects. These free components can be found in almost all commercial software. This is great, because it reduces software costs, but difficult for financially focussed businesses to deal with. What’s the point of a competitive tender process when all the alternatives are free and nobody cares whether you use their product?
- Alternative implementations vary enormously. I did some rough experiments (which I will write about in more detail in a future post) and found that, in the category of component I was looking at, some would potentially use over a thousand times more energy to do the same job. Yet none of this was apparent from any of the documentation provided with these components.
The result of these observations is that there is a potentially huge benefit in addressing the resource usage of this profusion of freely available components, enabling a more informed comparison, and educating developers that their choices matter in more than just the financial bottom line.
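As a purely illustrative sketch of what such a comparison might look like today (the component category from my experiments is not shown here), a developer can benchmark two interchangeable implementations of the same job and use CPU time as a crude proxy for energy. This is a strong assumption: a proper assessment would measure power draw directly, which is exactly the information that is currently missing.

```python
import timeit

# Two interchangeable implementations of the same job: building one
# string from many parts. CPU time is used below as a crude proxy for
# energy consumption, which is a simplifying assumption.

def concat_naive(parts):
    out = ""
    for p in parts:
        out += p           # repeated string concatenation
    return out

def concat_join(parts):
    return "".join(parts)  # single pass over the parts

parts = ["x"] * 10_000

# Both implementations must do the same job before comparing them.
assert concat_naive(parts) == concat_join(parts)

t_naive = timeit.timeit(lambda: concat_naive(parts), number=200)
t_join = timeit.timeit(lambda: concat_join(parts), number=200)
print(f"naive/join time ratio: {t_naive / t_join:.1f}")
```

Even this toy comparison takes deliberate effort to set up, and says nothing about memory, network, or idle power. Making this kind of information routinely available alongside component documentation is the gap the observations above point at.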