ChatGPT may not be as power-hungry as once assumed
ChatGPT, OpenAI's chatbot platform, may not be as power-hungry as once assumed. But its appetite depends largely on how ChatGPT is being used, and on the AI model answering the queries, according to new research.
A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited statistic is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search.
Epoch believes that's an overestimate.
Using OpenAI's latest default model for ChatGPT, GPT-4o, as a reference, Epoch found that the average ChatGPT query consumes around 0.3 watt-hours.
"The energy use is really not a big deal compared to using normal appliances, heating or cooling your home, or driving a car," Joshua You, the Epoch data analyst who conducted the analysis, told TechCrunch.
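To put 0.3 watt-hours in everyday terms, here is a quick illustrative comparison. The appliance wattages below are typical round figures chosen for illustration; they are not values from Epoch's analysis.

```python
# How long could 0.3 Wh run a common appliance?
# Appliance wattages are typical illustrative values, not Epoch's figures.

query_wh = 0.3         # Epoch's per-query estimate (watt-hours)
led_bulb_w = 10        # a typical LED bulb (watts)
microwave_w = 1000     # a typical microwave oven (watts)

# time (seconds) = energy (Wh) / power (W) * 3600 s/h
led_seconds = query_wh / led_bulb_w * 3600
microwave_seconds = query_wh / microwave_w * 3600

print(f"LED bulb for ~{led_seconds:.0f} s")         # ~108 s
print(f"microwave for ~{microwave_seconds:.1f} s")  # ~1.1 s
```

In other words, under these assumptions one query is roughly equivalent to lighting an LED bulb for under two minutes.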
As AI companies race to expand their infrastructure footprints, AI's energy usage, and its environmental impact, is the subject of broad debate. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don't deplete natural resources and don't force utilities to rely on non-renewable sources of energy.
You told TechCrunch that his analysis was spurred by what he characterized as outdated previous research. For example, the author of the report that arrived at the 3 watt-hours estimate assumed that OpenAI used older, less efficient chips to run its models.

"I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy that was going to AI today," You said. "Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed too high."
To be sure, Epoch's 0.3 watt-hours figure is an approximation as well; OpenAI hasn't published the details needed to make a precise calculation.
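Because those details aren't public, any such figure rests on assumptions about server power draw and throughput. The sketch below shows the general shape of a back-of-envelope per-query estimate; every number in it is a hypothetical placeholder, not an actual input from Epoch's or OpenAI's data.

```python
# Illustrative back-of-envelope estimate of energy per chatbot query.
# All values are hypothetical assumptions, NOT Epoch AI's actual inputs,
# which have not been published in full.

server_power_w = 10_000        # assumed power draw of one inference server (W)
queries_per_server_per_s = 10  # assumed sustained query throughput
overhead_factor = 1.2          # assumed data center overhead (cooling, etc.)

# energy per query (Wh) = power (W) * overhead * time per query (s) / 3600 s/h
seconds_per_query = 1 / queries_per_server_per_s
energy_wh = server_power_w * overhead_factor * seconds_per_query / 3600

print(f"~{energy_wh:.2f} Wh per query")
```

With these made-up inputs the result happens to land near 0.3 Wh, but changing any assumption shifts the answer, which is exactly why estimates in this space vary so widely.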
The analysis also doesn't consider the additional energy costs incurred by ChatGPT features like image generation or input processing. You acknowledged that "long input" ChatGPT queries, such as queries with long files attached, likely consume more electricity upfront than a typical question.
You said he does expect baseline ChatGPT power consumption to rise, however.
"(The) AI will get more advanced, and training this AI will probably require much more energy," You said. "This future AI may also be used much more intensely, handling many more, and more complex, tasks than how people use ChatGPT today."
While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive massive, power-hungry infrastructure expansion. In the next two years, AI data centers may need close to all of California's 2022 power capacity (68 GW), according to a RAND report. By 2030, training a frontier model could require power output equivalent to that of eight nuclear reactors (8 GW), the report predicted.
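A quick sanity check puts those capacity figures in perspective. The ~1 GW reactor size below is a typical round figure for a large nuclear unit, used only for illustration, not a number from the RAND report.

```python
# Sanity-checking the capacity figures cited from the RAND report.
# Reactor output (~1 GW) is a typical illustrative value for one large unit.

ai_datacenter_gw = 68       # ~California's 2022 power capacity, per the report
frontier_training_gw = 8    # projected frontier-training draw by 2030
typical_reactor_gw = 1.0    # rough output of one large nuclear reactor

reactors_needed = frontier_training_gw / typical_reactor_gw
print(f"~{reactors_needed:.0f} reactors")  # matches the report's 8 reactors

# If 68 GW ran continuously for a full year:
hours_per_year = 24 * 365
annual_twh = ai_datacenter_gw * hours_per_year / 1000
print(f"~{annual_twh:.0f} TWh/year")
```

The annualized figure assumes constant full-capacity operation, so it is an upper bound rather than a forecast of actual consumption.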
ChatGPT alone reaches an enormous, and growing, number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.
OpenAI's attention, along with the rest of the AI industry's, is also shifting to so-called reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. As opposed to models like GPT-4o, which respond to queries nearly instantaneously, reasoning models "think" for seconds to minutes before answering, a process that sucks up more computing, and thus more power.
"Reasoning models will increasingly take on tasks that older models can't, and generate more (data) to do so, and both require more data centers," You said.
OpenAI has begun to release more power-efficient reasoning models like o3-mini. But it seems unlikely, at least at this juncture, that those efficiency gains will offset the increased power demands from reasoning models' "thinking" process and growing AI usage around the world.
You suggested that those worried about their AI energy footprint use apps such as ChatGPT infrequently, or select models that minimize the computing necessary, to the extent that's realistic.
"You could try using smaller AI models like (OpenAI's) GPT-4o-mini," You said.