Amid surging demand for data centers to train and run cutting-edge artificial intelligence (AI) models, there is a consensus that an expensive build-out of America’s electric power system is unavoidable. President Trump declared a national energy emergency on the first day of his administration and has pledged to fast-track new power plants to fuel the data center boom. Meanwhile, electric power rates for homes and businesses are set to rise sharply across the country to pay for tens of billions of dollars of planned grid upgrades. Powering these new data centers is a bipartisan priority for policymakers concerned about U.S. competitiveness in AI, especially after the release of the Chinese DeepSeek-R1 model.
A stunning new report out today from Duke University argues that the existing U.S. electricity system already has the “headroom” to power massive additions of data centers with no new grid or power plant infrastructure. The catch? New data centers would need to build in a limited amount of flexibility in when they consume power, ramping down during the relatively few hours each year when regional power grids experience peak stress. Armed with this capability, the Duke researchers argue, new data centers could connect swiftly to existing regional power grids without compromising grid reliability or waiting up to a decade for expensive new infrastructure to be built.
Read more from the Council on Foreign Relations here.
The views and opinions expressed are those of the author and do not necessarily reflect the official policy or position of C3.