
Recent research from Epoch AI suggests that ChatGPT consumes approximately 0.3 watt-hours (Wh) of electricity per interaction, roughly ten times less than earlier estimates, which put the figure at around 3 Wh.
This new finding brings ChatGPT’s energy consumption in line with that of Google Search, commonly cited at around 0.0003 kWh (0.3 Wh) per search. The result is significant because users and developers had previously voiced concern about the environmental impact of large language models (LLMs).
In other words, Epoch AI’s findings indicate that a typical LLM interaction consumes far less energy than previously thought, on par with a single Google search.
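As a quick sanity check, the comparison above reduces to simple unit conversion and a ratio. A minimal sketch, using only the figures cited in this article (the variable names are illustrative, not from any source):

```python
# Figures cited in the article.
chatgpt_wh_per_query = 0.3      # Epoch AI's revised estimate, in Wh
old_estimate_wh = 3.0           # earlier, widely cited estimate, in Wh
google_kwh_per_search = 0.0003  # commonly cited Google Search figure, in kWh

# Convert kWh to Wh so both services use the same unit.
google_wh_per_search = round(google_kwh_per_search * 1000, 6)

# How much lower is the revised ChatGPT estimate than the old one?
reduction_factor = round(old_estimate_wh / chatgpt_wh_per_query)

print(f"ChatGPT: {chatgpt_wh_per_query} Wh/query, "
      f"Google Search: {google_wh_per_search} Wh/search")
print(f"Revised estimate is {reduction_factor}x lower than the old one")
```

This makes the article's two claims concrete: 0.0003 kWh is exactly 0.3 Wh, so the per-query figures match, and 3 Wh down to 0.3 Wh is the stated factor of ten.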
Lower energy consumption could also reshape how technologies like ChatGPT are perceived: efficiency matters, but AI can also deliver productivity gains that improve energy management in other areas.
Because energy use in cloud services often goes unnoticed, the underlying power requirements deserve scrutiny. Recognizing the energy cost of these technologies encourages a more responsible approach to AI use and to energy-consumption assessments.
Greater transparency from AI companies such as OpenAI would provide a clearer picture of energy usage, paving the way for informed decisions across the tech industry.