Lightning and Clouds

Powering the Cloud

The rapid growth of energy-hungry data centres, accelerated by the AI juggernaut, is a cause for concern in the corridors of power.

The very term “The Cloud” encourages us to imagine some amorphous blob, floating out there in the ether. And that’s if we think of it at all. For most, it is quite literally out of sight and out of mind, attributes that are either a key selling point or a key pain point, depending on your perspective.

For those focused on data security and sovereignty, the “Where” that matters is usually “Where is it?”, i.e. where is the sensitive data stored, and who might potentially have access to it?

For others, it is increasingly “Where will all that power come from?”.

Big…and Getting Bigger

Estimates of the total power used by data centres worldwide are contested, with models relying on multiple assumptions and extrapolations, but they typically come out at 1-2% of the world’s total electricity consumption.

This Goldman Sachs article suggests that the vast computational power needed to feed the AI revolution will drive data centre power demand up by 160% this decade, equating to data centres consuming 3-4% of global power by 2030. To support that projection, the article opens with the eye-opening claim, originating from the International Energy Agency, that, on average:

A ChatGPT query needs nearly 10 times as much electricity to process as a Google search.

I would never have considered using AI to help write this article, of course, but that number might have given me second thoughts if I had!
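
Those figures invite a quick back-of-envelope check. The numbers in the sketch below are my own rough assumptions rather than anything taken from the Goldman Sachs or IEA material, but they show how a 1-2% starting share combined with 160% demand growth lands in much the same territory as that 3-4% projection.

```python
# Back-of-envelope check using my own rough assumptions (not figures from the
# article): if data centres sit at roughly 1-2% of global electricity use today,
# and their demand grows by about 160% this decade, where does their share land?

baseline_shares = (0.01, 0.02)  # assumed current share of global electricity (1% and 2%)
growth = 1.60                   # assumed 160% growth in data centre demand by 2030

for share in baseline_shares:
    projected = share * (1 + growth)
    print(f"{share:.0%} today -> roughly {projected:.1%} by 2030")

# Prints roughly 2.6% and 5.2%. That straddles the 3-4% projection, which also
# has to allow for overall electricity demand growing over the same period.
```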

The growth in demand for cloud-based data processing was well underway before the AI boom, but for some years that growth was being roughly offset by efficiency gains, with industry-wide energy consumption having more or less plateaued. The explosion in AI appears to be the key culprit in breaking this balancing act, with Goldman Sachs predicting that:

This increased demand will help drive the kind of electricity growth that hasn’t been seen in a generation.

They estimate that US utilities will need to invest around $50 billion in new generation capacity to support data centres alone. That’s big, but it pales in comparison to the prognosis for Europe, which is in a worse starting position thanks to the ravages of the GFC, COVID, and the Ukraine war, leading to a prediction that:

Europe needs $1 trillion-plus to prepare its power grid for AI.

And with the world already fighting what appears to be a losing battle against the emissions fuelling global climate change, there is more unwelcome news:

Along the way, the carbon dioxide emissions of data centres may more than double between 2022 and 2030.

The Empire Fights Back

Many major providers of cloud-based services, as well as the data centres that host them, are very aware that this huge growth in energy usage is not just bad for the planet, but bad for their corporate reputations. It’s hardly surprising that they are mobilising both their technical and PR resources to respond.

In fact, one of the catalysts for this article was a business partner spruiking the line that “Sage Intacct is built on AWS and uses 95% renewable energy!”.

We couldn’t verify that exact claim, partly because the energy profile of data centres, and the source of that energy, varies considerably from region to region. But Amazon and AWS are certainly on the front foot when it comes to promoting their march to renewables. For example, their AWS Energy Transition page includes the statement that:

Amazon is the world’s largest corporate purchaser of renewable energy and is on a path to powering our operations with 100% renewable energy by 2025—five years ahead of our original target of 2030.

As well as turning to renewable energy, the industry is continuously looking to improve the energy efficiency of its servers and of the cooling systems that can account for as much as 40% of a data centre’s total energy use. Presumably, all these efficiency measures are driven by concern for the bottom line, as well as for the planet and for corporate reputations.

Ironically, advances in AI may well prove to be part of the solution as well as the problem, as companies turn to AI to assist with the optimisation of their energy use.

Power to the People?

This Australian Energy Council article, "Data centres: A 24hr power source?", quotes ADP Consulting as stating that:

  • A large data centre can consume energy equivalent to that used by 50,000 homes.
  • Cooling the rows and racks of powerful computers also requires up to 19 million litres of water a day.
  • That's a similar quantity to the water consumed by a small city of 50,000 people.
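
That last equivalence is easy to gut-check. The sketch below uses only the numbers quoted above, plus my own assumption about typical per-capita urban water use.

```python
# Rough gut-check of the water comparison (figures as quoted above; the
# per-capita benchmark is my own assumption, not from the article).

litres_per_day = 19_000_000  # quoted upper-end cooling water use for a large data centre
people = 50_000              # population of the "small city" in the comparison

per_person = litres_per_day / people
print(f"{per_person:.0f} litres per person per day")  # 380 litres

# A few hundred litres per person per day is a reasonable ballpark for total
# urban water use (homes plus businesses), so the comparison looks plausible.
```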

The article then goes on to ask:

What if all that excess heat could be seen not just as a problem, but also as a resource?

The article quotes global engineering company Danfoss as suggesting that excess heat, including that from data centres, is the world’s largest untapped energy source.

Harvesting excess heat to generate electricity is hardly a new idea – think geothermal power stations or organic waste management facilities. But the heat can also be put to work directly, and the article mentions a couple of real-world case studies:

  • Technological University Dublin uses the excess heat from an Amazon data centre to heat its student accommodation.
  • In Norway, the world’s first land-based lobster farm uses heat from a data centre to heat the water those lobsters grow in.

Who knows where this will end, given that the world’s appetite for off-site data processing and storage appears to be insatiable, but this much seems clear:

Managing the unintended consequences of these ever-expanding data centres will require a lot of ingenuity and intelligence, be that human, artificial, or both.
