ChatGPT may not be as power-hungry as once assumed

Source: techcrunch.com

A recent study by Epoch AI suggests that ChatGPT's energy consumption is far lower than previously estimated. Earlier claims put a single query at about 3 watt-hours; Epoch's analysis indicates the figure is closer to 0.3 watt-hours, less than many household appliances use. This finding challenges the notion that AI's energy use is exceptionally high compared to everyday activities.
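The gap between the two estimates is easiest to see as annual energy. A minimal back-of-envelope sketch, using the article's 3 Wh and 0.3 Wh per-query figures; the 15-queries-per-day usage pattern is an illustrative assumption, not from the article:

```python
QUERIES_PER_DAY = 15  # assumed usage pattern, for illustration only


def yearly_kwh(wh_per_query: float, queries_per_day: int = QUERIES_PER_DAY) -> float:
    """Annual energy in kilowatt-hours for a given per-query estimate."""
    return wh_per_query * queries_per_day * 365 / 1000


old_estimate = yearly_kwh(3.0)    # earlier ~3 Wh/query claim
epoch_estimate = yearly_kwh(0.3)  # Epoch AI's ~0.3 Wh/query figure

print(f"Old estimate: {old_estimate:.2f} kWh/year")   # ~16.4 kWh/year
print(f"Epoch figure: {epoch_estimate:.2f} kWh/year") # ~1.6 kWh/year
```

Under these assumptions, the revised figure works out to under 2 kWh per year per user, a tenth of the earlier estimate.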

The debate over AI's environmental impact continues as the industry expands. Although current usage is relatively efficient, future advancements and increased demand for more complex reasoning models may lead to higher energy requirements. OpenAI and other companies are investing heavily in new data centers, which could strain energy resources.

To mitigate concerns about energy consumption, users can opt for smaller models and moderate their usage. As AI technology evolves, balancing efficiency gains against growing power demands will be crucial.