AI can do a lot, but it comes at a price. It is no secret that generative AI models, trained on billions or even trillions of parameters, consume enormous amounts of energy. In light of this, many reports have highlighted concerns about the environmental impact of these models, particularly their energy consumption.
In fact, some analyses suggest that the energy consumption of generative AI models like ChatGPT may double by 2026. This has raised concerns among environmentalists and policymakers about the sustainability of AI technologies and their impact on the environment. Let’s try to understand how much power technologies like AI need to operate!
Does AI need a lot of energy?
Analysis by the Electric Power Research Institute (EPRI) and the International Energy Agency (IEA) found astonishing levels of energy consumed by artificial intelligence, cryptocurrencies and data centres. To put this in perspective, let’s compare it with a Google search.
One Google search query requires an average of 0.3 watt-hours of electricity; ChatGPT, however, eats up 2.9 watt-hours per prompt! Millions of requests, spanning text, audio and video, are now made to OpenAI’s GPT models, and all of these prompts consume electricity at an ever-growing rate.
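A quick back-of-the-envelope calculation makes the gap concrete. The sketch below uses the two figures cited above (0.3 Wh per Google search, 2.9 Wh per ChatGPT prompt); the daily prompt volume is a purely hypothetical number for illustration, not a reported statistic.

```python
# Per-query energy comparison using the figures cited in the text.
GOOGLE_WH_PER_QUERY = 0.3   # watt-hours per Google search
CHATGPT_WH_PER_QUERY = 2.9  # watt-hours per ChatGPT prompt

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
print(f"A ChatGPT prompt uses ~{ratio:.1f}x the energy of a Google search")

# Scale a hypothetical 10 million prompts per day to kilowatt-hours.
prompts_per_day = 10_000_000
daily_kwh = prompts_per_day * CHATGPT_WH_PER_QUERY / 1000
print(f"10M prompts/day would draw ~{daily_kwh:,.0f} kWh per day")
```

At these figures, one ChatGPT prompt uses roughly ten times the energy of a search, so even modest query volumes add up quickly.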
AI to consume double energy by 2026!
According to the International Energy Agency (IEA) report, NVIDIA is currently at the forefront of the AI server market, holding a whopping 95% share. Recently, it came to light that NVIDIA’s H100 GPU has a peak power consumption of around 700 W.
Last year, the firm sold about 1.5 million H100 GPU units, which together may consume as much energy in a year as 4 million people use (roughly the population of Georgia). Even more striking, these cutting-edge chips are expected to draw as much energy as a small nation!
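The scale of that claim can be checked with simple arithmetic. The sketch below assumes every one of the 1.5 million GPUs runs at its ~700 W peak around the clock, which overstates real-world utilisation, so treat the result as a rough upper bound rather than a measurement.

```python
# Upper-bound annual energy estimate for the H100 fleet cited in the text.
units = 1_500_000        # H100 GPUs reportedly sold last year
peak_watts = 700         # peak power draw per GPU, in watts
hours_per_year = 24 * 365

# watts * hours = watt-hours; divide by 1e12 to convert to terawatt-hours
annual_twh = units * peak_watts * hours_per_year / 1e12
print(f"Upper bound: ~{annual_twh:.1f} TWh per year")
```

Around 9 TWh per year is indeed in the range of a small country’s annual electricity consumption, which is why the comparison in the report is not as far-fetched as it might sound.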
Looking ahead, the report predicts that by 2026, generative AI models like ChatGPT will consume at least ten times more energy than they did in 2023. And it is not just generative AI: cryptocurrencies consumed around 110 terawatt-hours of energy in 2022, about 0.4% of global annual electricity demand. The IEA report predicts this figure will shoot up by more than 40% to around 160 terawatt-hours by 2026.
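The cryptocurrency figures above are internally consistent, as a short sanity check shows. The numbers below are exactly those cited in the text (110 TWh in 2022, 0.4% of global demand, ~160 TWh projected for 2026).

```python
# Sanity check of the cryptocurrency energy figures cited in the text.
crypto_2022_twh = 110     # 2022 consumption, terawatt-hours
crypto_2026_twh = 160     # projected 2026 consumption
share_of_global = 0.004   # 0.4% of global annual electricity demand

growth = (crypto_2026_twh - crypto_2022_twh) / crypto_2022_twh
print(f"Projected growth 2022 -> 2026: {growth:.0%}")

# The 0.4% share implies a figure for total global demand in 2022.
implied_global_twh = crypto_2022_twh / share_of_global
print(f"Implied 2022 global electricity demand: ~{implied_global_twh:,.0f} TWh")
```

The projected jump works out to roughly 45%, matching the report’s "more than 40%" claim, and the implied global demand of about 27,500 TWh is in line with commonly cited estimates.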
Data centres become energy-hungry
ChatGPT-like AI models and cryptocurrency mining have a big appetite for computing. Put simply, they depend on data centres that draw massive amounts of power to run and cool their servers. A key report, Fueling the Future, indicated that data centres’ electricity consumption will double by 2026, reaching roughly the same level as the energy used by Japan.
Coming back to the Electric Power Research Institute (EPRI) report, it emphasises finding solutions that could make data centres more efficient. Server cooling, for example, accounts for around 40% of a data centre’s energy use, a share that could be reduced. Data centres are also trying to become carbon neutral by replacing non-renewable resources with sustainable options.
Tech giant Microsoft is looking into powering data centres with hydrogen fuel cell power systems instead of diesel generators to reduce carbon emissions.
The takeaway
The growing energy usage of generative AI models such as ChatGPT presents a major hurdle for environmental sustainability. Organisations must tackle this challenge by streamlining their training methods and embracing renewable energy. Such proactive steps can certainly help lower the environmental footprint of AI, cryptocurrencies and data centres alike.