Did you know that processing a single ChatGPT query requires about 10 times as much electricity as a Google search?
That’s according to Goldman Sachs, which writes in a new report that electricity consumption in the United States is poised for a major surge for the first time in years, due in large part to the rapid buildout of data centers that power AI platforms such as ChatGPT. Goldman projects electricity demand will grow about 2.4 percent annually from 2022 to 2030, with data centers representing the largest growth segment at 0.9 percentage points, more than one-third of total new demand.
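The "more than one-third" figure follows directly from the two numbers Goldman cites. A quick sanity check, using only the figures quoted above:

```python
# Sanity check of the Goldman Sachs figures quoted above,
# using only the numbers given in the article.

total_growth_pp = 2.4   # projected annual demand growth, percentage points
data_center_pp = 0.9    # portion of that growth attributed to data centers

share = data_center_pp / total_growth_pp
print(f"Data centers' share of new demand: {share:.1%}")  # 37.5%

# 37.5% is indeed "more than one-third" of total new demand.
assert share > 1 / 3
```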
Goldman isn’t the only firm that’s forecasting huge changes to the U.S. energy grid. The Electric Power Research Institute, a Washington, D.C.-based nonprofit, estimates that data centers could consume up to 9 percent of U.S. electricity generation by 2030, more than double their current consumption.
To help put things in perspective, ChatG