How much energy does ChatGPT consume? More than you think


Edgar Cervantes / Android Authority

Everything has a cost, and AI is no exception. While ChatGPT and Gemini may be free to use, they require a staggering amount of computing power to run. And if that weren’t enough, big tech companies are currently locked in an arms race to build bigger and better models like GPT-5. Critics argue that this growing demand for powerful, energy-intensive hardware will have a devastating impact on climate change. So how much energy does an AI like ChatGPT actually consume, and what does that electricity usage mean from an environmental perspective? Let’s take a look.

ChatGPT Power Consumption: How Much Electricity Does AI Need?


Calvin Wankhede / Android Authority

OpenAI’s legacy GPT-3 large language model required just under 1,300 megawatt-hours (MWh) of electricity to train, which is equivalent to the annual energy consumption of about 120 American homes. To put that into context, the average American home consumes just over 10,000 kilowatt-hours each year. That’s not all: AI models also need processing power to answer each query, a stage known as inference. And that requires a lot of powerful servers spread across thousands of data centers around the world. At the heart of these servers are typically NVIDIA’s H100 chips, which consume 700 watts each and are installed by the hundreds.
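The household comparison is easy to sanity-check. Here is a quick back-of-the-envelope calculation using the article’s figures; the exact per-home consumption value is an assumption based on "just over 10,000 kWh" per year:

```python
# Back-of-the-envelope check: GPT-3 training energy vs. US household usage.
TRAINING_ENERGY_MWH = 1_300        # reported GPT-3 training energy
HOME_USAGE_KWH_PER_YEAR = 10_715   # assumed average US home consumption (illustrative)

training_energy_kwh = TRAINING_ENERGY_MWH * 1_000
homes_per_year = training_energy_kwh / HOME_USAGE_KWH_PER_YEAR
print(f"GPT-3 training ≈ annual usage of {homes_per_year:.0f} US homes")
```

With those inputs, the result lands right around the article’s "about 120 homes" figure.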

Estimates vary wildly, but most researchers agree that ChatGPT alone requires a few hundred MWh each day. That’s enough electricity to power thousands of American homes for a year, and perhaps even tens of thousands. And since ChatGPT is no longer the only generative AI player in town, it stands to reason that overall demand will only grow from here.
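To see how "a few hundred MWh per day" translates into "tens of thousands of homes," we can run the numbers with an illustrative midpoint; both the daily figure and the per-home figure below are assumptions, not reported values:

```python
# Rough scale check: daily ChatGPT inference energy vs. homes powered per year.
DAILY_USAGE_MWH = 500             # illustrative value within "a few hundred MWh"
HOME_USAGE_MWH_PER_YEAR = 10.7    # assumed average US home consumption

annual_usage_mwh = DAILY_USAGE_MWH * 365
homes_powered = annual_usage_mwh / HOME_USAGE_MWH_PER_YEAR
print(f"≈ {homes_powered:,.0f} US homes powered for a year")
```

At 500 MWh per day, that works out to roughly 17,000 homes, consistent with the "tens of thousands" ballpark.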

AI could use 0.5% of global electricity consumption by 2027.

A paper published in 2023 attempts to estimate how much electricity the generative AI industry will consume in the coming years. Its author, Alex de Vries, estimates that market leader NVIDIA will ship up to 1.5 million AI server units per year by 2027. That would result in AI servers using between 85.4 and 134 terawatt-hours (TWh) of electricity each year, more than the annual energy consumption of smaller countries like the Netherlands, Bangladesh, and Sweden.

While these numbers are certainly alarming, it is worth noting that total global electricity production was nearly 29,000 TWh just a couple of years ago. In other words, AI servers would account for roughly half a percent of global energy consumption by 2027. Is that still a lot? Yes, but it needs to be judged with some context.
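The "half a percent" claim checks out against the article’s own figures, as a quick calculation shows:

```python
# AI servers' projected share of global electricity production by 2027.
AI_WORST_CASE_TWH = 134          # de Vries's upper estimate for 2027
GLOBAL_GENERATION_TWH = 29_000   # approximate recent global electricity production

share_pct = AI_WORST_CASE_TWH / GLOBAL_GENERATION_TWH * 100
print(f"AI servers ≈ {share_pct:.2f}% of global electricity production")
```

That comes out to roughly 0.46%, in line with the "roughly half a percent" stated above.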

The case for AI’s power consumption


AI can consume enough electricity to match the usage of smaller nations, but it’s not the only industry doing so. In fact, the data centers that power the rest of the internet consume far more than those dedicated to AI, and demand on that front had been growing well before launches like ChatGPT. According to the International Energy Agency, the world’s data centers currently consume around 460 TWh each year. That figure has been climbing dramatically since the Great Recession ended in 2009, and AI played no role in it until late 2022.

Even if we take the aforementioned researcher’s worst-case scenario and assume that AI servers will account for 134 TWh of electricity, it would pale in comparison to the overall consumption of the world’s data centers. Netflix alone used enough electricity to power 40,000 American homes in 2019, and that number has certainly risen since then, yet you don’t see anyone clamoring to end internet streaming altogether. Air conditioners, meanwhile, account for a whopping 10% of global electricity consumption, or roughly 20 times the worst-case AI estimate for 2027.
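The air-conditioning comparison can also be verified from the figures already cited; the global production number is the same approximate value used earlier in the article:

```python
# How the worst-case 2027 AI estimate compares with air conditioning.
GLOBAL_GENERATION_TWH = 29_000   # approximate global electricity production
AC_SHARE = 0.10                  # air conditioning ≈ 10% of global consumption
AI_WORST_CASE_TWH = 134          # de Vries's upper 2027 estimate

ac_twh = GLOBAL_GENERATION_TWH * AC_SHARE   # ≈ 2,900 TWh
ratio = ac_twh / AI_WORST_CASE_TWH
print(f"Air conditioning uses ≈ {ratio:.0f}x the worst-case AI estimate")
```

That yields a ratio of a bit over 21, so "20 times more" is a fair round number.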

AI’s electricity consumption pales in comparison to that of global data centers as a whole.

The debate over AI’s electricity consumption can also be compared to the controversy surrounding Bitcoin’s energy usage. Like AI, Bitcoin faced harsh criticism for its high electricity consumption, with many labeling it a serious environmental threat. However, the financial incentives of mining have driven its adoption in regions with cheaper and renewable energy sources. This is only possible thanks to the abundance of electricity in such regions, where it might otherwise be underutilized or even wasted. All of this means we should really be asking about AI’s carbon footprint, rather than focusing solely on raw electricity consumption figures.

The good news is that, just like cryptocurrency mining operations, data centers are often strategically built in regions where electricity is abundant or cheaper to produce. That’s why renting a server in Singapore is significantly cheaper than in Chicago.

Google aims to have all of its data centers running on carbon-free energy 24/7 by 2030. And according to the company’s 2024 environmental report, 64% of its data center electricity consumption already comes from carbon-free energy sources. Microsoft has set a similar goal, and its Azure data centers power ChatGPT.

Increasing efficiency: Could AI’s electricity demand stagnate?


Robert Triggs / Android Authority

As generative AI technology continues to evolve, companies have also been developing smaller, more efficient models. Since the launch of ChatGPT in late 2022, we’ve seen a slew of models that prioritize efficiency without sacrificing performance. Some of these newer AI models can deliver results comparable to their larger predecessors from just a few months ago.

For example, OpenAI’s recently released GPT-4o mini is significantly cheaper to run than the GPT-3.5 Turbo it replaces. The company has not disclosed efficiency figures, but the order-of-magnitude reduction in API pricing points to a large reduction in computing costs (and therefore electricity consumption).

We’ve also seen a push toward on-device processing for tasks like summarization and translation, which can be accomplished with smaller models. While it could be argued that new software suites like Galaxy AI increase a device’s power consumption, that drawback can be offset by the productivity gains they enable. For my part, I would happily trade slightly shorter battery life for the ability to get real-time translation anywhere in the world. The sheer convenience may make the modest increase in power consumption worth it for many others, too.

However, not everyone sees AI as a necessary or beneficial advancement. For some, any additional energy consumption is unnecessary or wasteful, and no amount of efficiency can change that. Only time will tell whether AI is a necessary evil, similar to many other technologies in our lives, or whether it is simply a waste of electricity.


