
ChatGPT may not be as resource-intensive as previously thought. However, this largely depends on how it is used and which AI models are answering the questions.
A new study suggests that ChatGPT, OpenAI's chatbot platform, does not consume as much energy as previously thought. The research, conducted by Epoch AI, a nonprofit organization dedicated to analyzing artificial intelligence, estimates the energy consumption of a typical ChatGPT query. It was previously claimed that each query required approximately 3 watt-hours, a figure Epoch considers exaggerated.
Using OpenAI's latest default model, GPT-4o, Epoch calculated that an average query consumes around 0.3 watt-hours, less than many common household appliances use. Joshua You, a data analyst at Epoch, noted that ChatGPT's energy usage is not significant compared to that of everyday appliances, to heating or cooling a home, or to driving a vehicle.
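For a rough sense of scale, the two per-query estimates can be compared with simple arithmetic. The sketch below assumes a hypothetical 15 queries per day, a usage level chosen for illustration and not taken from the study:

```python
import math

WH_PER_QUERY_EPOCH = 0.3  # Epoch AI's estimate for a GPT-4o query
WH_PER_QUERY_OLD = 3.0    # the earlier, widely cited figure
QUERIES_PER_DAY = 15      # hypothetical usage level (assumption)

def annual_kwh(wh_per_query: float, queries_per_day: int, days: int = 365) -> float:
    """Annual energy use in kilowatt-hours for a given per-query cost."""
    return wh_per_query * queries_per_day * days / 1000

# Under these assumptions, the Epoch estimate works out to roughly
# 1.6 kWh per year, versus about 16 kWh under the older 3 Wh figure.
print(annual_kwh(WH_PER_QUERY_EPOCH, QUERIES_PER_DAY))
print(annual_kwh(WH_PER_QUERY_OLD, QUERIES_PER_DAY))
```

Either way, the totals are small next to typical household loads; a single refrigerator commonly uses hundreds of kilowatt-hours per year.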
Energy consumption and its environmental impact are subjects of debate amid the rapid growth of AI companies' infrastructure. Recently, more than 100 organizations signed an open letter urging the industry and regulators to ensure that new AI data facilities do not deplete natural resources or force utility companies to rely on non-renewable energy sources.
You said his analysis was motivated by previous research he considers outdated. He noted that the original 3 watt-hour calculation assumed OpenAI was using older, less efficient chips. The 0.3 watt-hour figure remains an estimate, as OpenAI has not made public the specifications needed for an exact calculation.
Epoch's research does not include the additional energy costs of ChatGPT features such as image generation or input processing. You acknowledged that queries with lengthy attachments could consume more electricity than standard questions, and he anticipates that ChatGPT's energy consumption will increase as the technology advances and AI applications become more complex. In the coming years, AI data centers are expected to require nearly all of the electrical capacity that California had in 2022, according to a report from Rand Corporation.
The use of reasoning models by OpenAI and others is also on the rise. These models, while more capable, require more power to operate, as their "thinking" time before responding extends from seconds to minutes. Although OpenAI has begun to launch more efficient reasoning models, they are unlikely to offset the increased electrical demand associated with this process.
Finally, You suggested that those concerned about their energy footprint when using AI consider using applications like ChatGPT less frequently, or choosing models that minimize resource consumption.