Artificial Intelligence (AI) is everywhere. In just a couple of years, it has become a ubiquitous feature of many workplaces while enhancing the “brain power” of almost every app we use.
As well as being integrated into common applications, a range of free AI tools and platforms is available to assist with other tasks. ChatGPT, Google's Gemini, and Microsoft's Copilot receive the most publicity, but several others are already in common use.
Some AI tools are quite specialised, while others have a broader, more general intent and capability. If you’ve been using a search engine, particularly via a voice-prompted assistant like Siri, you’re already using AI.
Now, of course, the race is on between AI system developers to make faster, smarter, easier-to-use AI platforms and to integrate AI into more products and applications.
However, as the capabilities and prevalence of AI systems and tools expand, so too does their energy consumption, which has already reached mind-boggling proportions.
Are policymakers and regulators paying enough attention to what this rapidly evolving field's energy demands will mean for our energy generation and infrastructure?
AI is an energy-intensive business
It takes a lot of energy to run AI data centres and to train large AI models. If nothing else (and there is a lot else), when you use a lot of computational power in a concentrated manner for long periods, you need to keep everything cool, and cooling itself takes a significant amount of energy.
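The industry puts a number on that overhead with a metric called Power Usage Effectiveness (PUE): total facility energy divided by the energy that actually reaches the computing equipment. Here is a minimal sketch of the calculation; the figures are hypothetical, chosen only to show how the metric works.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total energy a facility draws for every
    unit of energy that actually powers the computing hardware."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical day of operation for a single data hall:
it_load_kwh = 100_000   # servers, storage, and networking
overhead_kwh = 45_000   # chillers, fans, lighting, power-conversion losses

print(f"PUE = {pue(it_load_kwh + overhead_kwh, it_load_kwh):.2f}")  # PUE = 1.45
```

A PUE of 1.45 means 45% extra energy on top of the computing itself; the most efficient hyperscale facilities report figures closer to 1.1.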
The most recent estimates suggest that the data centres that are the backbone of AI operations already consume about 1-2% of the global electricity supply.
According to the International Energy Agency (IEA), data centres worldwide consumed around 415 TWh of electricity in 2024, and AI may account for as much as half of that.
Given the energy required to run cooling systems around the clock, keep servers maintained, and process the sheer volume of data flowing through them, those percentages are growing by the day.
Amazon Web Services (AWS) already has over 100 data centres across 31 regions globally, making it the world’s largest cloud provider. Most of the biggest (by area and capacity) data centres in the world are in the United States, in places like Nevada, Utah, and Arizona, where Apple’s Mesa Data Center is located.
The two largest individual data centres in the world are in China's Inner Mongolia Information Park. China Mobile's Hohhot Data Center has a total computing power of 6.7 EFLOPS (exaFLOPS), or 6.7 × 10^18 floating-point operations per second, deploys around 20,000 AI accelerator cards, and has space for 100,000 servers. The nearby China Telecom Data Center is slightly larger in area, covering more than 10 million square feet, and draws 150 MW of power across six data halls.
We are looking at very rapid growth
The IEA projects that electricity demand from data centres worldwide is set to more than double by 2030 to around 945 terawatt-hours (TWh). That’s more than the entire electricity consumption of Japan.
AI will be the most significant driver of this increase, with electricity demand from AI-optimised data centres projected to more than quadruple by 2030. At that rate, around 4% of global electricity demand will be consumed by these data centres.
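A quick sanity check, using only the IEA numbers quoted above, shows the growth rate that such a doubling implies:

```python
# IEA figures quoted above: ~415 TWh in 2024, projected ~945 TWh in 2030.
start_twh, end_twh = 415, 945
years = 2030 - 2024

# Compound annual growth rate implied by the projection.
cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # roughly 14.7% per year
```

That is nearly 15% compound growth every year for six years, sustained across an entire industry.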
However, this projection assumes a steady increase in AI adoption across various sectors, including healthcare, finance, and transportation. If the integration of AI into everyday applications and more industries spikes, which is more than likely, the current estimates might be blown out well before 2030.
At the same time, as AI models become more complex, the energy needed to train them will also escalate. And keep in mind that training is not a one-off event: models are continually retrained and updated as they try to "learn" every piece of information they might need to do what we ask.
For example, training OpenAI's GPT-3, one of the largest language models, required thousands of petaflop/s-days of computing power. A FLOP is a Floating-Point Operation, the basic unit of computing work, and a petaflop is 10^15 of them per second, so a single petaflop/s-day is 10^15 multiplied by 86,400 (the number of seconds in a day), or 8.64 × 10^19 floating-point operations. But that's just one petaflop/s-day, so multiply it by thousands, then again by the number of different AI models being trained!
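To make those units concrete, here is the same arithmetic as a short sketch; the training figure of roughly 3,640 petaflop/s-days is the commonly cited estimate from OpenAI's own GPT-3 paper.

```python
# One petaflop/s-day: a machine performing 10**15 floating-point
# operations per second, running flat out for a full day.
PETAFLOPS = 10 ** 15      # operations per second
SECONDS_PER_DAY = 86_400

one_pfs_day = PETAFLOPS * SECONDS_PER_DAY
print(f"1 petaflop/s-day = {one_pfs_day:.2e} FLOPs")  # 8.64e+19

# OpenAI's commonly cited estimate for training GPT-3:
gpt3_pfs_days = 3_640
print(f"GPT-3 training   = {gpt3_pfs_days * one_pfs_day:.2e} FLOPs")  # ~3.1e+23
```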
However you slice it, AI had better figure out more efficient algorithms and hardware for itself, or else the energy consumption of AI will be off the charts!
Could there be more sustainable approaches to AI energy consumption?
Given the potential for exponentially increasing energy consumption, the AI industry needs to devote some of its “energy” to exploring sustainable approaches to mitigate the energy consumption and environmental impact of AI. Here are some strategies:
Energy-efficient algorithms: Researchers are developing algorithms that require less computational power for the same results, reducing energy consumption (one such technique, model quantisation, is sketched after this list).
Green data centres: Investing in renewable energy sources for data centres can significantly reduce the carbon footprint associated with AI operations. Companies like Google and Microsoft are already leading the charge by committing to 100% renewable energy for their data centres.
Optimised hardware: Developing specialised hardware, such as AI accelerators and energy-efficient GPUs, can improve performance while minimising energy usage. Innovations in chip design can lead to more efficient processing of AI tasks.
Regulatory frameworks: Policymakers can play a role by establishing guidelines and incentives for energy-efficient AI practices and encouraging research into sustainable technologies.
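To make the first of those strategies concrete: quantisation stores a model's weights as 8-bit integers instead of 32-bit floats, shrinking the data that has to be moved and processed for every query. Below is a minimal sketch using PyTorch's built-in dynamic quantisation; the toy model is purely illustrative.

```python
import os
import torch
import torch.nn as nn

# A toy stand-in model; production AI models have billions of parameters.
model = nn.Sequential(
    nn.Linear(1024, 1024),
    nn.ReLU(),
    nn.Linear(1024, 10),
)

# Dynamic quantisation converts the Linear layers' float32 weights to int8,
# shrinking storage roughly 4x and cutting the arithmetic done per inference.
quantised = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def saved_size_mb(m: nn.Module, path: str) -> float:
    """Serialise a model to disk and report its size in megabytes."""
    torch.save(m.state_dict(), path)
    size = os.path.getsize(path) / 1e6
    os.remove(path)
    return size

print(f"float32 model:    {saved_size_mb(model, 'fp32.pt'):.2f} MB")
print(f"int8 (quantised): {saved_size_mb(quantised, 'int8.pt'):.2f} MB")
```

The same idea scales up: serving a model at lower precision means less data moved through memory and, ultimately, fewer watts per query.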
It’s very much a “watch this space” situation
The conversation around AI’s energy consumption is only in its early stages, but it is a hugely important issue, and we hope (and expect) that there is some urgency to figuring it out.
As AI continues to become an accepted part of our lives, understanding its energy needs and implementing sustainable practices will be crucial.
The fact that the biggest companies in the world, like Amazon and Alphabet (Google’s parent company), are involved should be a positive thing, as they have already pledged their commitment to sustainable practices.
As the IEA states: “a diverse range of energy sources will be tapped to meet data centres’ rising electricity needs”, with renewables and natural gas set to take the lead due to their cost-competitiveness and availability in key markets.
“With the rise of AI, the energy sector is at the forefront of one of the most important technological revolutions of our time,” IEA Executive Director Fatih Birol said.
“AI is a tool, potentially an incredibly powerful one, but it is up to us – our societies, governments and companies – how we use it.”