Future AI Data Center Energy Needs

Estimated global data center electricity consumption in 2022 was 240-340 TWh, or around 1-1.3% of global final electricity demand. “It is really hard to quantify how much demand is needed for things like ChatGPT,” David Groarke, managing director at consultant Indigo Advisory Group, said in a recent phone interview. By 2030, AI could account for 3% to 4% of global power demand. Google has said AI currently represents 10% to 15% of its power use, or about 2.3 TWh annually.

Google’s power demand could increase significantly if generative AI were used in every Google search, according to academic research by Alex de Vries, a PhD candidate at the VU Amsterdam School of Business and Economics.

Citing research by semiconductor analysis firm SemiAnalysis, de Vries estimated in a commentary published Oct. 10 in the journal Joule that using generative AI such as ChatGPT in every Google search would require more than 500,000 of Nvidia’s A100 HGX servers, totaling 4.1 million graphics processing units, or GPUs. At a power demand of 6.5 kW per server, that would mean daily electricity consumption of roughly 80 GWh and annual consumption of about 29.2 TWh.

A tenfold increase in queries could push that figure to roughly 290 TWh.
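As a rough sanity check, the arithmetic behind those figures can be reproduced in a few lines. The server count, GPU total, and 6.5 kW per-server draw come from the article; treating the fleet as 8-GPU HGX boxes is an assumption made for the sketch.

```python
# Back-of-envelope check of the de Vries figures cited above.
# Assumption: each A100 HGX server holds 8 GPUs (hence ~512,500 servers for 4.1M GPUs).
SERVERS = 4_100_000 / 8          # ~512,500 servers
POWER_PER_SERVER_KW = 6.5        # per-server draw cited in the Joule commentary

total_power_gw = SERVERS * POWER_PER_SERVER_KW / 1e6   # kW -> GW
daily_gwh = total_power_gw * 24                        # ~80 GWh per day
annual_twh = daily_gwh * 365 / 1000                    # ~29.2 TWh per year
tenfold_twh = annual_twh * 10                          # ~290 TWh if queries grew 10x

print(f"{total_power_gw:.2f} GW -> {daily_gwh:.0f} GWh/day -> {annual_twh:.1f} TWh/yr "
      f"(10x queries: ~{tenfold_twh:.0f} TWh/yr)")
```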

Power demand from operational and currently planned data centers in US power markets is expected to total about 30,694 MW once all the planned facilities come online, according to an analysis of data from 451 Research, part of S&P Global Market Intelligence. Investor-owned utilities are set to supply 20,619 MW of that capacity. However, those projections do not yet assume any radical adjustments driven by AI adoption.
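To compare that capacity figure with the TWh numbers above, a MW value has to be converted to annual energy under an assumed utilization. The sketch below uses illustrative load factors; only the 30,694 MW figure comes from the article, the rest is assumption.

```python
# Hypothetical conversion of planned US data center capacity (MW) to annual energy (TWh).
# The capacity figure is from the article; the load factors are illustrative assumptions,
# not data from 451 Research.
PLANNED_CAPACITY_MW = 30_694
HOURS_PER_YEAR = 8_760

for load_factor in (0.5, 0.7, 0.9):   # assumed fraction of nameplate capacity actually drawn
    annual_twh = PLANNED_CAPACITY_MW * HOURS_PER_YEAR * load_factor / 1e6  # MWh -> TWh
    print(f"load factor {load_factor:.0%}: ~{annual_twh:.0f} TWh/yr")
```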

If significant forecast adjustments need to be made, the earliest indications will likely come from the utilities that serve the big data centers.

As of October 2023, OpenAI was generating revenue at an annualized pace of $1.3 billion, up 30% from the roughly $1 billion annual pace about four months earlier.

3 thoughts on “Future AI Data Center Energy Needs”

  1. The hype can’t keep up with resource usage. The scam is about to collapse.

    Wow, it generates images, writes emails for you, and saves you a couple of trips to openstack for debugging. That’s… really cool, but how is this worth billions and billions of dollars and an incredible consumption of energy? It’s VR, the metaverse, and crypto all over again.

  2. I was a bit surprised to see the PNW dwarfing CA, given how much tech is located in Silicon Valley and the population difference. But I’m guessing the distance between users and data centers is negligible at one or two states away, and data centers located near cheaper electricity make much more sense, given the cheap hydro in that area.

  3. Montana — America’s stronghold against the coming AI apocalypse.

    Obviously, and as said many times before: radiation-hardened orbital data centers, solar-powered and radiatively cooled.
