How do you make a data center more efficient? When we asked Paul Calleja, director of research computing services at the University of Cambridge, we got a surprising answer:

“It’s all about the software,” he said. “The gains we get from hardware, in terms of output per megawatt, will be dwarfed by the gains we get from software.”

That might come as a shock to data center designers. When they consider the environmental footprint of a facility, they start with the energy used at the site.

They focus on the waste heat produced, and the cooling systems which remove it. In the IT equipment itself, they look for servers that perform more calculations per Watt, or storage that has a lower idle power consumption.

They sometimes (but too rarely) look at the embodied energy in the data center - the emissions created by the concrete it is built from, and the raw materials in the electrical components.

But the thing that is almost always ignored is the factor that caused the creation of the building in the first place, and drives every single action performed by the hardware.

This feature appeared in the latest issue of DCD Magazine.

All data center emissions are caused by software. Without software, there would be no data centers. If we want more efficient data centers, why do we start with the hardware when, as Calleja says, it is all about the software?

Why don’t we have green software?

“Going back to fundamentals, software doesn't emit any carbon by itself,” says entrepreneur and academic David Mytton.

“It's not like it's combusting or generating anything physical. The question is, what is the impact of the infrastructure the software is running on? And the first step is to try and improve how the infrastructure is behaving in terms of electricity, and the energy that's going into the data center.”

Regulators and corporate green leaders can specify the amount of power that a building can use, and the type of materials it is made of, and data center builders can demonstrate that they are working towards the best practice in these areas.

But beyond that, it’s down to software - and, as Mytton says, “there's been less focus on the characteristics and behavior of the software itself.”

“Operational engineers and hardware engineers have really been doing all the heavy lifting up until now,” says Anne Currie, an entrepreneur and developer who, together with Sarah Hsu and Sara Bergman, is writing a book, Building Green Software, for the publisher O’Reilly. “Software engineers have all been ‘Lalala, that's their problem’.”

Efficiency is not a hardware problem, she says: “The steps that we have left to take in data centers are software related. And a lot of the steps that we have to take in people's homes are software related as well.”

To be fair, the sector has already cut data center emissions effectively. In the early years of this century, researchers noted that data center energy use in the US was growing rapidly. In 2011 it was predicted to continue growing out of control, but in fact it stayed at around 2010 levels.

This was because software became more efficient. Virtualization made it possible to consolidate applications within a data center, and cloud services offered these benefits automatically. Virtual servers, in centralized cloud data centers, started to replace standalone servers.

Virtualization software genuinely reduced the need to build new data centers, by utilizing the hardware better.

“The advantage of the cloud over traditional data centers is the use of software to get much much higher server density within data centers,” says Currie.

Use fewer cycles

That is the infrastructure. But when you are making new applications, how do you make them more efficient?

Green software should be engineered to be carbon-efficient. As Currie, Hsu, and Bergman put it: “Green software is designed to require less power and hardware per unit of work.”

Programmers think there is a simple answer to this, but Currie says it is almost always wrong.

“My co-authors and I speak at conferences, and every time we speak, someone gets up at the end and says ‘Should I just be rewriting my applications in C?’”

Everyone knows that C is efficient because it handles the server hardware more directly than a higher-level language like Java or Python. So programmers expect Currie and her co-authors to tell them to go use it. But it’s not that simple.

“It's hard to say no, that’s not the answer, because if you rewrite your applications in C, you might well get 100-fold improvements in efficiency,” she says. “But it will kill your business.”

She explains: “In the old days, I used to write big servers and networking systems in C, where performance was really critical. This was in the ‘90s. All the machines were absolutely terrible. The Internet was terrible. Everything was 1,000 times worse than it is today. You had to work that way, or it wouldn't work at all.”

There have been improvements since then, but “that 1,000-fold increase in the quality of machines has not been used to deliver machine productivity. We've used it to deliver developer productivity.”

Higher-level languages make programs easier to construct, but there are layers of interpretation between the program and the hardware. So less efficient software has soaked up the added hardware power. As the industry adage puts it: “What Intel giveth, Microsoft taketh away.”

It is inefficient, but it’s been necessary, says Currie. Writing in higher-level languages is easier, and that is what has enabled the speed and volume of today’s software development. We couldn’t go back to lower-level languages, even if we tried.

“We just don't have the number of engineers,” she says. “If we were writing everything in C it wouldn't touch the sides. It just takes ages to write everything in C or Rust. It’s slow. Most businesses would be killed by doing it this way - and I don't want everybody's business to be killed.”

Code better where it counts

Improving code is not straightforward, says Mytton: “It's always more complicated. We'll leave the simple answers to politicians, the real answers really come down to what you are trying to improve.”

Physics tells us that energy is power multiplied by time, so if you want to reduce the carbon caused by your energy use, you can reduce the power used, or improve the characteristics of the power by moving to clean energy.

“That reduces one part of your equation,” says Mytton. “But the time variable is often what software engineers and programmers will think about. If I reduce the time, by making my code faster, then the amount of energy consumed will be reduced.”

Of course, this assumes there are no other variables - but there usually are more variables, he says: “To give two examples, memory and CPU are two separate resources. Optimizing those two can be difficult, as you get different trade-offs between them.

“A second example is where you could reduce the time by running the code across 10,000 different servers, but you may have increased the power used. You need to figure out what you're trying to optimize for.”
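As a back-of-envelope illustration of that power-times-time arithmetic, the trade-offs Mytton describes can be sketched in a few lines. All the figures below are invented for the example, not measurements:

```python
# Emissions from a job ~ power draw (kW) x runtime (h) x grid carbon
# intensity (gCO2e/kWh). All numbers here are illustrative examples.

def job_emissions(power_kw: float, runtime_h: float, intensity_g_per_kwh: float) -> float:
    """Estimated emissions in grams of CO2-equivalent."""
    energy_kwh = power_kw * runtime_h
    return energy_kwh * intensity_g_per_kwh

# Faster code on the same server: halving runtime halves energy.
baseline = job_emissions(power_kw=0.4, runtime_h=2.0, intensity_g_per_kwh=300)
optimized = job_emissions(power_kw=0.4, runtime_h=1.0, intensity_g_per_kwh=300)

# Same runtime, but run when (or where) the grid is cleaner.
shifted = job_emissions(power_kw=0.4, runtime_h=2.0, intensity_g_per_kwh=100)

# Spreading work across 10,000 servers may cut time but raise total
# power - it's the product, the energy, that has to be compared.
print(baseline, optimized, shifted)
```

Halving either the power or the time halves the estimate; which lever is cheapest to pull is, as Mytton says, context-specific.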

For most of us, Currie says it’s about optimizing software “where it matters. Where it matters is things that are going to be used by loads and loads and loads of people, because it takes ages to do this stuff.”

As the book puts it: “Don’t waste your time optimizing software that hardly anyone is running. Before you begin, consider how much hardware (servers or devices) and energy (data and CPU) in aggregate an application is likely to cause to be used everywhere it is run. For now, target only what’s operating at scale.”

They go on: “The best application of your effort is always context-specific, and when it comes to going green, pain does not equal gain.”

The things that need to be optimized are generally the shared tools that underlie everything we do. So most IT departments should be operating as enlightened consumers, demanding that these tools are efficient.

“For most people in an enterprise, it is not about your code, it’s about your platform,” says Currie. “This is really about managing your supply chain. It's about putting pressure on suppliers. Every platform that you're running on needs to be green.

“Take the standard libraries that come with the very common, popular languages. Are those standard libraries optimized? Those standard libraries are used all the time, so it's really worth getting the people who are writing those languages to tune those libraries, rather than just tuning your code that runs on top of those libraries.”

Customer pressure will make platforms “actively great,” she says. “If the platform is rewritten in Rust, measures and checks itself, and has made itself as efficient as possible, that's much, much more effective than you doing it just for your own stuff.”

Mytton says: “I think the goal of sustainable computing is to make it so that consumers don't have to think about this at all. They can just continue to use all the services that they like, and they are automatically sustainable and have minimal or no impact on the environment.”

And that is already happening. “If you're using AWS today, you're benefiting from all of their renewable energy purchases, and improvements in their data center efficiency, and you've not had to do anything as a customer. And I think that should be the goal.

“Software developers should hope for something similar,” he continues. “They may have to make a few more decisions and build a few more things into their code. But the underlying infrastructure should hopefully do a lot of the work for them and they’ve just got to make a few decisions.”

The start of the movement

The green software movement began as an informal effort, but in the last couple of years it has raised its profile. Asim Hussain, then a cloud advocate at Microsoft, formed a focus group that was launched as the Green Software Foundation at Microsoft’s Build conference in 2021.

“As sustainable software engineers, we believe that everyone has a part to play in the climate solution,” says Hussain. “Sustainable software engineering is inclusive. Whatever sector, industry, role, technology – there is always something you can do to have an impact.”

Hussain is now Intel’s director of green software, and part-time chair of the Foundation, which operates under the Linux Foundation. It has backing from organizations including Accenture, Avanade, GitHub, UBS, Intel, and Microsoft, and even apparently got the blessing of former Microsoft CEO Bill Gates.

“The idea was to answer the question: is this a software problem or is it a hardware problem? And the answer is it's both,” says Currie. “But while data centers were addressing it, the software industry really wasn't - because it just wasn't something that was occurring to them.”

The Foundation wants to be a grassroots organization, rather than trying to get top-down regulations: “We did talk about whether we should be lobbying governments to put rules in place, but it's not really our skill set. At the moment we are completely focused on just pushing people to be aware, and to measure, rather than getting the law involved.”

The Foundation has produced a report on the state of green software, and the three O’Reilly authors are all members.

Making measurements

“A lot of the focus of the Green Software Foundation has been about measurements,” says Currie, “because if you can measure then you can put pressure on your providers and your suppliers.”

The idea is to create a measure that will be called Software Carbon Intensity (SCI), which measures how much energy is used (or how much GHG is produced) for a given amount of work.

But it’s difficult. “Watts per byte is a key measurement criterion in the networking industry, but it isn't in the software industry,” says Currie. “Because in networking, watts per byte is quite clear, but what is the unit of work when it comes to software?”

The basis of the SCI is a “relatively simple equation, which looks at things like the energy consumed by the software, the embodied emissions of that software, and where that software is actually running,” says Mytton.

The unit of work is a bit less clear: “The functional unit can be something like a user call or an API call, or running a machine learning job.”
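The Foundation’s SCI specification expresses this as a rate: SCI = ((E × I) + M) per R, where E is energy consumed, I is the grid’s carbon intensity, M is embodied emissions allocated to the software, and R is the functional unit. A minimal sketch, with the input figures invented for illustration:

```python
# Sketch of the SCI rate: SCI = ((E * I) + M) per R.
# E in kWh, I in gCO2e/kWh, M in gCO2e, R in functional units
# (e.g. API calls served). All figures below are illustrative.

def sci(energy_kwh: float, intensity: float, embodied_g: float, units_of_work: float) -> float:
    """Carbon intensity score in gCO2e per functional unit."""
    return (energy_kwh * intensity + embodied_g) / units_of_work

# A service that used 50 kWh at 300 gCO2e/kWh, with 5,000 g of
# embodied emissions allocated, while serving one million API calls:
score = sci(energy_kwh=50, intensity=300, embodied_g=5000, units_of_work=1_000_000)
print(score)  # ~0.02 gCO2e per call
```

Because R sits in the denominator, serving more work from the same energy and hardware lowers the score - the carbon-efficiency per unit of work that the book’s authors define.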

Combining these components gives a score, to help understand the overall carbon intensity of software, says Mytton, who is not directly involved in the SCI work: “I believe the goal of that is to be able to look at improvements over time. So you can see how changes to your software architecture or the components reduce or potentially increase your score. And the long-term goal is to be able to compare different pieces of software - so you can make choices based on the carbon intensity.”

The goal of comparing software with SCI is still a way off, as different vendors define their system boundaries differently - and the SCI measure is still emerging.

The Foundation explains it "is not a total carbon footprint; it's a rate of carbon emissions for software, such as per minute or per user device, that can serve as an important benchmark to compare the carbon intensity of a software system as it is updated over time; or between similar types of software systems, such as messaging apps or video conferencing solutions."

Importantly, for SCI to work, software must be aware of where it is running, what electricity it is causing to be consumed, and the local carbon intensity of that electricity.

The good news is that modern processors from suppliers like Intel and AMD now routinely include tools to report on their energy consumption, but these are still evolving.

Mytton again: “Intel’s tool only works on Intel CPUs, and it's different if you want to get data from AMD. And so far as I'm aware, Arm chips don't have anything available. Given that more systems are moving to Arm [for reasons of energy efficiency] on mobile, some laptops, and in the data center as well, that's a problem.”
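On Linux, the Intel counters Mytton mentions are typically exposed through the kernel’s powercap interface. A minimal sketch, assuming an Intel CPU, read permission on the sysfs file, and that the package-0 path below exists on the machine:

```python
import os
import time

# RAPL package-0 energy counter: cumulative microjoules (Intel only;
# the path can vary between machines and kernel versions).
RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

def uj_delta_to_watts(delta_uj: int, seconds: float) -> float:
    """Convert a microjoule counter delta into average watts."""
    return delta_uj / 1_000_000 / seconds

if os.path.exists(RAPL):
    with open(RAPL) as f:
        before = int(f.read())
    time.sleep(1.0)
    with open(RAPL) as f:
        after = int(f.read())
    # Real code must handle counter wraparound via max_energy_range_uj.
    print(f"~{uj_delta_to_watts(after - before, 1.0):.1f} W package draw")
```

As Mytton notes, this only covers one vendor: AMD exposes different counters, and Arm systems may offer nothing comparable.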

These measurements are going to be important, because organizations may need to balance efficiency improvements against performance. However, Greg Rivera, VP of product at software intelligence firm Cast, says there won’t be many such cases.

“Research from the Green Software Foundation is finding that making your software greener, typically makes it perform better, and cost less, and it makes it more resilient,” he says.

However, coders making systems work well might sometimes hit on methods that trade performance against efficiency - and that can affect efforts to give users the best experience.

“It can increase the amount of energy used, if you deploy your code very close to your user on a less efficient set of equipment, versus putting it in the middle of nowhere on the very highest efficiency equipment in the highest efficiency data center,” says Mytton.

The cloud might reduce the power, but increase the delay: “You need to figure out these trade-offs: you could increase CPU processing, because you've got a faster processor that can reduce the time. And memory has an energy cost as well.”

There’s a whole piece of work to do there on the software efficiency of mobile applications, he says. Phones are built to run efficiently, because customers need them to keep operating for a long while between charges, but the platforms don’t always divide work between device and cloud in the same way.

“There is almost a philosophical difference between Android and iOS,” says Mytton. “Android typically wants to do more in the cloud and offloads a lot of things to it, whereas iOS devices are doing more and more on device. That's partly to do with how Google and Apple think about their infrastructure and where their core competencies lie - and there's a privacy element behind that.”

Just run it in the right place

There’s another very significant added complexity. The same software can have a different carbon footprint at a different time or place, because it is being run on more efficient hardware, or on a server powered by low-carbon electricity.

This observation leads to another major plank of green software: software can reduce its power consumption, and the emissions it causes, by changing where and when it operates.

“Green software also attempts to shift its operations, and therefore its power draw, to times and places where the available electricity is from low carbon sources like wind, solar, geothermal, hydro, or nuclear,” say the book’s authors.

“Alternatively, it aims to do less at times when the available grid electricity is carbon intensive. For example, it might reduce its quality of service in the middle of a windless night when the only available power is being generated from coal. This is called carbon awareness.”
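A toy version of that carbon-aware time shifting: given an hourly intensity forecast (a hypothetical stand-in here - a real scheduler would pull this from a grid-data API), run a deferrable job in the first sufficiently clean hour, or fall back to the cleanest one available:

```python
def pick_start_hour(forecast: list[float], threshold: float) -> int:
    """Return the first hour whose forecast carbon intensity
    (gCO2e/kWh) is below the threshold; otherwise fall back to
    the cleanest available hour."""
    for hour, intensity in enumerate(forecast):
        if intensity < threshold:
            return hour
    return min(range(len(forecast)), key=lambda h: forecast[h])

# Hypothetical forecast: a windless evening, then wind overnight.
forecast = [420, 410, 380, 250, 120, 90, 110, 300]
print(pick_start_hour(forecast, threshold=150))  # hour 4
```

A production version would also weigh the delay against a deadline - exactly the quality-of-service trade-off the authors describe.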


“I'm a huge fan of this,” says Currie. “Google have been working on this for years, and they have done some really interesting stuff. They're really, really looking at time shifting. There are jobs that really aren't all that time-sensitive; things like training a machine learning system. They can be delayed a couple of hours to when the wind’s blowing, or when the sun's about to come out.”

The practical example Google talks about is the encoding of YouTube videos. “When you upload a video, you want it to be available relatively soon,” says Mytton. “But it doesn't matter whether it's half an hour or an hour after you upload it. Processing and encoding is a one-time job, and they are able to move it to a region with more clean energy.”

Google can do this because it owns a lot of data centers, in a lot of regions, and has the data. At present, that data is not yet fully available to customers. “It's only very recently that all three of the big cloud providers have started allowing customers to get carbon intensity information from the workloads that they're running,” says Mytton.

Once users have that information, they could re-architect their distributed applications on the fly, sending workloads where the carbon intensity is lowest. But they need accurate and comparable data, and early implementations from the major players tend to be dashboards designed to show their own service in the best light.

“The data just hasn't been available and generally still isn't available for the most part,” says Mytton. “This is a particular challenge - and the Green Software Foundation has a working group that is looking into access to data and making that open source.”

If that’s solved, then users can start to do environmental load shifting for real: “By that, I mean things like moving workloads to regions where there is more clean energy available at a particular time, or choosing how you're going to schedule jobs based on the availability of clean energy.”

And beyond that, load shifting could also address other metrics such as water consumption, and the embodied energy in the manufacturing of the hardware.

Of course, moving workloads to follow clean energy supplies would affect the footprint of data centers, because it could leave machines idle - and that would make their embodied carbon more significant in the overall footprint of the system.

“If the servers are not being used all the time, then they need to last longer,” says Currie. “So if we solve our problem, it's gonna give you [hardware] guys a problem. Delaying work is not a no-brainer, it means we have to balance these things.”

In the past, idle servers were a heretical suggestion, as the wisdom of a Moore’s Law world was that assets needed to be used continuously and replaced quickly with more performant versions.

That world is over now. New generations of hardware won’t have the same performance boosts, and older generations will be kept in use for longer - especially as organizations move towards reducing embodied emissions and a circular economy.

Can we make it happen?

Green software creators are serious about changing things, but they will have to work against the industry’s instinct that more is always better.

Big data center operators believe they can carry on with untrammelled growth, if they are carbon neutral.

Those managing software - even underlying infrastructure software that is heavily used all the time - very often don’t rate efficiency highly enough.

Cloud vendors “want to make sure that the usage of their resources has a minimal or zero environmental impact, which is why they're putting all their efforts into trying to be carbon neutral and get to net zero,” says Mytton. “They tell you all of the good things they're doing around buying renewable energy and all those kinds of things, rather than necessarily focusing on how to use fewer resources - because those are things that they're charging you for.”

Mytton notes the Green Software Foundation has a lot of backing from Microsoft, but praises its work: “The GSF has done a good job at being independent from Microsoft cloud products or anything like that, although a lot of Microsoft people are involved. But we're seeing a lot of competition between the cloud providers now about who can be the greenest.

“Google has been leading on this for quite some time - but I think Microsoft is doing a very good job as well. They're just looking at different things and they're on different timelines,” he says, noting that Amazon Web Services is lagging on transparency and environmentalism.

The Green Software movement’s answer to these questions boils down to a simple guideline.

“Turn off stuff that isn't being used, be lean,” says Currie. “Don't do things just in case, turn off stuff while it's not being used, and turn off test systems overnight.”

“Are you storing more data than you need?” she asks. “Get rid of it all, or move it to long-term storage like tape.

“The biggest win is always: do less.”