The cloud and good climate – How cloud computing can help protect our climate
In the column “Stropek as a Service”, SaaS expert Rainer Stropek talks about exciting aspects of the implementation, monetization and use of software as a service offerings. This column focuses on how cloud computing could actually help reduce global warming.
We can assume that human-caused global warming will reach 1.5 °C above pre-industrial levels by around the year 2040. This forecast assumes, however, that drastic measures to reduce CO2 emissions are taken immediately and that emissions fall to zero by 2055. Political declarations of intent abound, but deeds rarely follow words; if anything, reactions are hesitant. One can therefore assume that global warming will reach a higher level in the coming decades, and that the consequences will be correspondingly serious.
These are far from rosy prospects for the medium- to long-term future. As a software developer, I wonder how I can exploit my area of expertise to help improve the situation. One motivation to specialize in cloud computing has been to increase the environmental sustainability of IT from the outset by making more efficient use of data center resources. Today, in the face of the climate crisis, this issue is more relevant than ever. So, today I want to dedicate my SaaS column to this issue. This article contains many references to Microsoft’s Azure data centers. This is not to say that the other, large cloud providers such as Amazon or Google are worse than Microsoft in terms of sustainable cloud computing; sometimes the opposite is the case. But I work primarily with Microsoft and therefore know this company much better than its market rivals. That’s the reason for the many examples from the Microsoft environment that follow.
Efficiency as the key to climate protection
The big cloud providers like Microsoft run data centers on a scale that hardly any other organization can match. Efficiency gains are therefore clearly reflected in the results of the respective cloud companies. For a medium-sized company with its own computer room, a few percent savings in server energy efficiency is of little significance and hardly worth the effort. But if you operate servers by the hundreds of thousands, even small improvements leave a clear mark. Corporations like Microsoft are investing huge sums of money and effort in improving the efficiency of their cloud data centers, specifically in the following areas:
- Operational efficiency
- Hardware efficiency
- Infrastructure efficiency
- Exploiting renewable energies
In the cloud, resources can be allocated and released again relatively quickly. The entire operational model is designed for elastically scaling systems. As a result, overprovisioning can be avoided. A server that is – at least temporarily – not needed and therefore shut down contributes more to energy savings than the most energy-efficient server running without performing any meaningful task, producing only hot air.
In addition, server farms in the cloud are designed from the ground up for multi-tenancy. As a result, multiple cloud applications can be operated cleanly in a common server infrastructure. This is in stark contrast to what I repeatedly see in medium-sized data centers. There, each application, no matter how small, is assigned its own VM, if not its own server, in order to avoid mutual interference. Thanks to multi-tenancy, the density of applications per server (hosting density) can be significantly increased, saving on servers and the associated hardware, cooling, etc.
As the load on a modern server increases, its energy consumption grows more slowly than the computing power it delivers. A Microsoft study has shown, for example, that increasing a server’s load from 10% to 40%, i.e. quadrupling the computational load, increases the required energy only by a factor of 1.7. Measures that increase server utilization in the cloud on a large scale thus help to reduce the total energy required.
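The arithmetic behind this claim is worth spelling out. A minimal sketch, using only the figures quoted above, shows how much less energy each unit of work costs on the better-utilized server:

```python
# Illustrative arithmetic from the figures above: quadrupling a server's
# load (10% -> 40% utilization) raises its energy draw only 1.7-fold,
# so the energy needed per unit of work drops sharply.

def energy_per_unit_of_work(load_factor: float, energy_factor: float) -> float:
    """Relative energy cost of one unit of work after scaling the load."""
    return energy_factor / load_factor

baseline = energy_per_unit_of_work(1.0, 1.0)      # server at 10% load
consolidated = energy_per_unit_of_work(4.0, 1.7)  # same server at 40% load

savings = 1 - consolidated / baseline
print(f"Energy per unit of work: {consolidated:.3f}x baseline")
print(f"Savings from consolidation: {savings:.0%}")  # roughly 57% less
```

In other words, a consolidated server does the same work for well under half the energy per request, which is exactly why hosting density matters.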
Large cloud providers such as Microsoft can use their purchasing power to influence the design of server hardware to an extent that is impossible for smaller businesses. This leads to servers that are optimized for the needs of the cloud environment and minimize power consumption. I also find it a very positive development that the open source concept is being carried over from the software world into data center hardware. Microsoft, for example, publishes many of its hardware innovations as open source designs.
A data center’s energy consumption is determined not only by the efficiency of its servers. Also relevant is the ratio of the total energy delivered to a data center to the energy actually consumed by its IT equipment. This ratio is referred to as the PUE (Power Usage Effectiveness) factor. A value of 1 would mean that all the energy reaches the computing hardware and none is lost to cooling, lighting, etc. PUE values greater than 2 are typical for smaller data centers, while modern cloud data centers reach values of 1.1 or less. Microsoft, for example, already achieves an average PUE of 1.125 across its new Azure data centers.
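The calculation itself is trivial: total facility energy divided by the energy that reaches the IT equipment. A minimal sketch with made-up consumption figures (only the ratios, not the absolute numbers, match the values quoted above):

```python
# PUE = total facility energy / IT equipment energy.
# The kWh figures below are invented for illustration.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: 1.0 means every kWh reaches the IT gear."""
    return total_facility_kwh / it_equipment_kwh

# A typical smaller data center: as much energy goes into cooling,
# lighting, and conversion losses as into the servers themselves.
print(pue(2_000_000, 1_000_000))   # 2.0

# A modern cloud data center, matching the Azure average quoted above.
print(pue(1_125_000, 1_000_000))   # 1.125
```

Put differently: at a PUE of 2, every kilowatt-hour of computing drags a second one along for overhead; at 1.125, the overhead shrinks to an eighth.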
Large cloud providers can exploit economies of scale to improve their PUE factor. Their size allows them to fund research and development in energy efficiency that would be unthinkable for smaller organizations. Here are two examples:
- In-rack fuel cells powered by natural gas or biogas that supply servers directly, increasing reliability and energy efficiency.
- Batteries and generators in every major data center that serve as intelligent buffers for the power grid.
Exploiting renewable energies
Most of the major cloud providers operate carbon neutrally (see e.g. Microsoft’s blog). At a time when one constantly reads of missed climate goals, it is refreshing to hear that Microsoft, for example, reached its 2020 target of 60% renewable energy a year early, in 2019, and is now aiming for 70% by 2023. Google is at the forefront in this regard and already operates entirely on renewable energy. Considering that data centers are projected to account for 4.5% of global electricity consumption by 2025, CO2-neutral operations and a high share of green energy are important factors.
The responsibility of development teams
If all existing IT were moved into the cloud, this alone would benefit the climate, depending on the status quo. But the change would make an even greater difference if development teams also modernized their software and made better use of the cloud’s capabilities. Here are some examples:
- Instead of dedicating an individual, heavyweight Windows VM to even the simplest Web API, a slim, Linux-based Docker container can dramatically increase hosting density.
- Serverless services allow cloud providers to automatically scale services whose load is infrequent or fluctuates heavily, and thus avoid overcapacity. This not only reduces energy consumption; it also lowers cloud costs.
- All major cloud providers offer cost and utilization data via Web APIs. These data can be fed into cost-controlling systems and thereby made transparent. The automated billing and telemetry systems offered by cloud providers thus create incentives for companies to increase server efficiency.
- Costs in the cloud are variable. Unnecessary services always cost money and can be turned off without much effort. Even services that are not serverless can often be scaled down by very simple means (e.g., on a schedule at night or on weekends). The cloud’s pricing model thus creates a built-in incentive for efficiency.
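The time-controlled scaling mentioned in the last point can be as simple as a scheduled job that derives a desired instance count from the clock. The sketch below is illustrative only; `set_instance_count` and the service name are hypothetical placeholders for whatever scaling API or CLI your provider actually offers:

```python
# Sketch of schedule-based scaling: run periodically (e.g. via cron or
# a timer trigger), compute the desired capacity from the clock, and
# hand it to the provider's scaling mechanism.

from datetime import datetime

def desired_instances(now: datetime, peak: int = 8, off_peak: int = 2) -> int:
    """Scale down at night and on weekends, when traffic is low."""
    if now.weekday() >= 5:              # Saturday or Sunday
        return off_peak
    if now.hour < 7 or now.hour >= 20:  # outside business hours
        return off_peak
    return peak

def set_instance_count(service: str, count: int) -> None:
    # Hypothetical placeholder: replace with your provider's actual
    # scaling call (management SDK, CLI, autoscale rule, ...).
    print(f"Scaling {service} to {count} instance(s)")

set_instance_count("my-web-api", desired_instances(datetime.now()))
```

Even this crude schedule frees three quarters of the capacity for roughly two thirds of the week, which in a pay-per-use model shows up directly on the bill as well as in the energy balance.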
It is a real shame that in many of the companies I talk to about cloud computing, the discussion is dominated solely by questions of where data is stored and processed. The topic is rarely considered holistically, and aspects such as climate protection through increased efficiency are seldom seriously discussed. Servers in the company’s own computer room are cooled by outdated air conditioning units, without anyone worrying about sensible workload consolidation. The next time a change is due, every company should consider whether modernizing its software and switching to the cloud would make a meaningful contribution to climate protection.
References
Allen, M.R., O.P. Dube, W. Solecki, F. Aragón-Durand, W. Cramer, S. Humphreys, M. Kainuma, J. Kala, N. Mahowald, Y. Mulugetta, R. Perez, M. Wairiu, and K. Zickfeld, 2018: Framing and Context. In: Global Warming of 1.5°C. An IPCC Special Report on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty [Masson-Delmotte, V., P. Zhai, H.-O. Pörtner, D. Roberts, J. Skea, P.R. Shukla, A. Pirani, W. Moufouma-Okia, C. Péan, R. Pidcock, S. Connors, J.B.R. Matthews, Y. Chen, X. Zhou, M.I. Gomis, E. Lonnoy, T. Maycock, M. Tignor, and T. Waterfield (eds.)]. In Press.
Shehabi, A., S.J. Smith, D.A. Sartor, R.E. Brown, M. Herrlin, J.G. Koomey, E.R. Masanet, N. Horner, I.L. Azevedo, and W. Lintner, 2016: United States Data Center Energy Usage Report. Berkeley, CA: Lawrence Berkeley National Laboratory. LBNL-1005775.