
The vast energy consumption of AI

editor

Adding to the feeling that the world is diving headlong into AI without thinking things through is the growing concern about the absolutely vast energy requirements of AI.

It seems it's nigh on impossible at present to accurately calculate how much energy computers running AI (for training or for generating output) actually consume, but one estimate is that it could be equivalent to the energy demands of Sweden or Germany.

A recent report by the International Energy Agency offered similar estimates, suggesting that electricity usage by data centers will increase significantly in the near future thanks to the demands of AI and cryptocurrency.

The agency says current data center energy usage stands at around 460 terawatt hours in 2022 and could increase to between 620 and 1,050 TWh in 2026 — equivalent to the energy demands of Sweden or Germany, respectively.


One of the areas with the fastest-growing demand for energy is the form of machine learning called generative AI, which requires a lot of energy for training and a lot of energy for producing answers to queries.

Training a large language model like OpenAI’s GPT-3, for example, uses nearly 1,300 megawatt-hours (MWh) of electricity, the annual consumption of about 130 US homes.

According to the IEA, a single Google search takes 0.3 watt-hours of electricity, while a ChatGPT request takes 2.9 watt-hours. (An incandescent light bulb draws an average of 60 watt-hours of juice.) If ChatGPT were integrated into the 9 billion searches done each day, the IEA says, the electricity demand would increase by 10 terawatt-hours a year — the amount consumed by about 1.5 million European Union residents.
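The arithmetic in that quote can be sanity-checked with a quick back-of-envelope calculation. This sketch assumes the 9 billion searches per day and per-query figures given above, a 365-day year, and that the relevant increase is the difference between a ChatGPT request and a plain search (all assumptions about how the IEA figure was derived):

```python
# Back-of-envelope check of the quoted IEA figures.
# Assumptions: 9 billion searches/day, 365 days/year, and that the
# increase is the *difference* between a ChatGPT request (2.9 Wh)
# and an ordinary Google search (0.3 Wh).

SEARCHES_PER_DAY = 9e9
GOOGLE_WH = 0.3     # watt-hours per ordinary search
CHATGPT_WH = 2.9    # watt-hours per ChatGPT request

extra_wh_per_day = SEARCHES_PER_DAY * (CHATGPT_WH - GOOGLE_WH)
extra_twh_per_year = extra_wh_per_day * 365 / 1e12  # Wh -> TWh

print(f"{extra_twh_per_year:.1f} TWh/year")  # comes out around 8.5
```

That lands in the same ballpark as the quoted "10 terawatt-hours a year", so the headline figure at least hangs together.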
 
When a writer cannot distinguish between power and energy in the examples they give, it’s probably worth taking their figures with a pinch of salt.

I’d still like to change my energy provider to whoever OpenAI are using, though.
 