Artificial intelligence consumes a surprising amount of energy and water: generating a single email with ChatGPT can use roughly as much water as a small bottle holds. Researchers have examined the resource consumption of AI systems in detail and warn of the consequences.
The hidden price of AI usage
Artificial intelligence has become an indispensable part of our everyday lives. Whether translations, emails or texts – AI systems such as ChatGPT offer a wide range of support. But convenience comes at a price, as a new study shows.
Researchers at the University of California, Riverside, in collaboration with the Washington Post, have investigated the resource consumption of AI applications – with surprising results. According to the study, ChatGPT with GPT-4 uses around 519 milliliters of water to generate a 100-word email.
That’s slightly more than a standard 0.5-liter water bottle. Added to that is 0.14 kilowatt-hours of electricity – enough to run 14 LED lamps for more than an hour.
These numbers may seem small at first glance. But when millions of people use the service, the consumption adds up significantly: if just 10% of working Americans wrote one such email with ChatGPT every week, that would amount to 435 million liters of water per year.
The electricity consumption would be 121,517 megawatt hours (MWh) – as much as all households in Washington DC use in 20 days.
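The scaling arithmetic above can be sketched in a short calculation. The per-email figures come from the study cited in the article; the size of the US workforce (roughly 167 million) is an assumption chosen here for illustration, so the results land near, not exactly on, the quoted totals:

```python
# Rough scaling of the article's per-email figures to a national total.
# Per-email values are from the study cited above; the US workforce
# size (~167 million) is an assumption, not a figure from the article.

WATER_PER_EMAIL_L = 0.519      # liters of water per 100-word email (GPT-4)
ENERGY_PER_EMAIL_KWH = 0.14    # kilowatt-hours of electricity per email
US_WORKERS = 167_000_000       # assumed size of the US workforce
SHARE = 0.10                   # 10% of workers use ChatGPT this way
EMAILS_PER_YEAR = 52           # one email per week

emails = US_WORKERS * SHARE * EMAILS_PER_YEAR
water_liters = emails * WATER_PER_EMAIL_L          # total liters per year
energy_mwh = emails * ENERGY_PER_EMAIL_KWH / 1000  # kWh -> MWh

print(f"Water:  {water_liters / 1e6:.0f} million liters per year")
print(f"Energy: {energy_mwh:,.0f} MWh per year")
```

With these assumptions the electricity total comes out very close to the article's 121,517 MWh, while the water total lands a bit above the quoted 435 million liters – consistent with the study having used a slightly different workforce or per-email estimate.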
Enormous resource requirements for AI training
It is not only day-to-day operation that consumes enormous resources; training AI models does too. According to the study, 700,000 liters of water were needed just to train GPT-3, the predecessor of GPT-4. This figure shows how resource-intensive the development of advanced AI systems is.
The major technology companies are aware of the problem. According to its latest environmental report, Google recorded a 48 percent increase in CO2 emissions, mainly due to AI and data centers. OpenAI, the developer of ChatGPT, emphasizes that it is constantly working on “efficiency improvements”. The high resource consumption is mainly due to the enormous computing power in the data centers.
The servers generate a lot of waste heat, which has to be dissipated with water or power-intensive cooling systems. The ratio between water and power consumption varies depending on the location and available resources.
Challenges for the future
Despite all efforts, the energy hunger of AI systems remains a major challenge. Experts expect the next generation of AI models to require significantly more computing power. Meta CEO Mark Zuckerberg expects that training the successor to Llama 3 will consume ten times more resources.
Alexia is an author at Research Snipers, covering technology news including Google, Apple, Android, Xiaomi, Huawei, Samsung, and more.