The effortless convenience of artificial intelligence comes with mounting demands on our energy supplies.
Less than three years ago, AI chatbots were a novelty. Today, they’re used for everything from computer coding to creative inspiration – and the opening paragraph of this story.
But every time you unlock your smartphone to ask ChatGPT a question, you’re not just getting an answer – you’re potentially chewing through enough electricity to charge the whole device.
AI chatbots use much more energy than a typical internet search because they have to perform more complex computational tasks, both when they are being trained and when they are being used.
Those tasks rely on data centres – huge, highly energy-intensive networked computer systems used to train and operate AI models. Data centres require tremendous amounts of round-the-clock energy to power their servers, along with cooling, lighting and other networking and security equipment.
Estimates of just how much energy an individual AI query needs vary widely. In a blog post earlier this month, OpenAI co-founder and chief executive Sam Altman claimed the average ChatGPT query used about 0.34 watt-hours of energy.
Translated into plain English by Google’s AI chatbot, Gemini, that’s the equivalent of turning on a typical electric oven or boiling a kettle for around a second, or of actively browsing on a smartphone for about 10 minutes.
Chatbots perform much more complex tasks than traditional Google searches, which were only ever designed to find and present information that already exists. Chatbots, by contrast, have to “think”, generating original responses in real time.
One analysis by investment bank Goldman Sachs estimated that a typical ChatGPT query required almost 10 times the electricity of a traditional Google search – closer to 3 watt-hours. That’s roughly enough to run a 10-watt LED lightbulb for about 18 minutes, or to charge a smartphone for several minutes.
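All of these comparisons rest on the same simple relationship: energy in watt-hours equals power in watts multiplied by time in hours. A minimal sketch of the arithmetic in Python, assuming typical appliance ratings of 2,000 watts for a kettle and 10 watts for an LED bulb (both assumptions for illustration, not figures from the estimates themselves):

    # Energy (Wh) = power (W) x time (h). Appliance ratings below are assumed, not sourced.
    KETTLE_WATTS = 2000   # typical electric kettle
    LED_WATTS = 10        # typical LED bulb

    def runtime_seconds(energy_wh, power_watts):
        """How many seconds an appliance can run on a given energy budget."""
        return energy_wh / power_watts * 3600

    print(runtime_seconds(0.34, KETTLE_WATTS))  # ~0.6 s of kettle time for Altman's 0.34 Wh
    print(runtime_seconds(3.0, LED_WATTS))      # ~1080 s (18 minutes) of LED light for 3 Wh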
Complex AI queries, especially image and video generation, require huge amounts of energy. Photo: Bloomberg
Queries also get considerably more energy-intensive as their complexity increases – summarising long documents, for example, or computer coding that involves step-by-step reasoning.
According to some estimates, those queries can use anywhere between 10 and 30 watt-hours, or more – equivalent to running a hairdryer or a microwave for 30 seconds or keeping an LED light on for about one hour.
Using AI to generate images can consume even more energy – equivalent to fully charging an iPhone, or boiling water in an electric kettle for around 30 seconds.
Video generation, though, is by far the most energy-intensive use of AI. According to some estimates, the amount of energy required to generate a five-second, 16-frames-per-second AI video could power a kitchen oven for 40 minutes.
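That oven comparison is easy to check with the same formula, assuming a typical 2,000-watt oven element (an assumed rating, not one given in the estimate):

    # 40 minutes of a 2,000 W oven, expressed in kilowatt-hours.
    OVEN_WATTS = 2000                         # assumed typical oven element rating
    energy_kwh = OVEN_WATTS * (40 / 60) / 1000
    print(energy_kwh)                         # ~1.3 kWh for one five-second AI video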
There is also an important distinction between the energy required to train an AI model and the energy used to ask it a question.
The training element is particularly power-intensive, thought to be equivalent to the annual energy consumption of thousands of homes, whereas the processing of individual queries – or “inference” – has a much smaller energy cost.
However, it’s the cumulative effect of all those queries that is causing headaches for power grid operators. Gemini likened this to the construction and operation of a factory.
“After the factory is built, it produces millions of goods every day. Each good produced (an AI query) might not take much energy individually, but the constant, high-volume production requires continuous energy input,” Gemini said.
That cumulative effect is growing. AI increasingly functions as a replacement for internet search – and with more than 5 billion internet users worldwide making an estimated 9 billion searches a day, the energy demands of AI are only likely to go up.
Grid operators are already feeling that demand. ChatGPT launched in November 2022, less than three years ago. In December 2024, Altman claimed it was processing more than one billion queries per day.
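Put those two figures together and the scale of the load becomes clear. A rough sketch, taking Altman’s 0.34 watt-hours per query and the one-billion-queries-a-day claim at face value (the per-household figure is an assumption for illustration):

    QUERIES_PER_DAY = 1_000_000_000
    WH_PER_QUERY = 0.34                         # Altman's per-query estimate
    daily_mwh = QUERIES_PER_DAY * WH_PER_QUERY / 1_000_000
    print(daily_mwh)                            # 340 MWh of electricity per day
    KWH_PER_HOME_PER_DAY = 30                   # assumed typical household consumption
    print(daily_mwh * 1000 / KWH_PER_HOME_PER_DAY)  # ~11,000 homes' worth of daily demand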
On Tuesday, OpenAI chief economist Ronnie Chatterji told The Australian Financial Review that more than 500 million people used ChatGPT every week, while the number of Australians actively using the platform had doubled over the past year.
That has left grid operators around the globe scrambling to factor this huge and sudden increase in demand into their forecasts for future energy use – forecasts that are changing by the month.
Recent draft estimates of data centre power demand from Australia’s electricity grid operator show a massive upward revision on its 2024 projections: the new forecasts are almost double what the operator estimated in 2024, even under its “accelerated” scenario.
Aside from energy, the other hidden input into AI queries is water, which is used to cool the data centres. According to Altman, a typical ChatGPT query uses 0.32 millilitres of water, or about 1/15th of a teaspoon.
Other studies have estimated much higher water inputs, especially when taking into account indirect water use from the energy generated to power the data centres.
One study from the University of California put water use per query at around 10 millilitres – around 30 times higher than Altman’s estimate. That’s roughly equivalent to two teaspoons.
Once again, though, it’s the cumulative effect of billions of queries that puts this in perspective. At 10 millilitres per query, one billion queries a day would use the same amount of water as a small town.
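The small-town comparison follows from the same kind of back-of-envelope sum, assuming household water use of around 200 litres per person per day (an assumed figure, not one from the study):

    QUERIES_PER_DAY = 1_000_000_000
    ML_PER_QUERY = 10                           # University of California estimate
    litres_per_day = QUERIES_PER_DAY * ML_PER_QUERY / 1000
    print(litres_per_day)                       # 10 million litres of water a day
    LITRES_PER_PERSON_PER_DAY = 200             # assumed daily household water use
    print(litres_per_day / LITRES_PER_PERSON_PER_DAY)  # ~50,000 people: a small town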
According to Gemini, it’s a reminder that even our seemingly weightless digital world has a very real physical footprint.
This article (except for the first paragraph) was written by a human.