A new whitepaper by the Electric Power Research Institute (EPRI) dives into the massive growth in AI power needs. The 35-page report, “Powering Intelligence: Analyzing Artificial Intelligence and Data Center Energy Consumption,” predicts that the power consumption of U.S. data centers alone could more than double by 2030.
According to the report, AI, particularly generative AI, uses significantly more power per query than traditional searches. The report highlights that a ChatGPT request uses about 2.9 watt-hours, compared to a Google search at just 0.3 watt-hours. And this doesn’t even factor in energy-intensive tasks like image or video generation.
The EPRI report examined use cases like Google search, ChatGPT, and BLOOM. Surprisingly, ChatGPT was the least energy-intensive of the AI queries. However, if Google incorporates similar AI into its searches, the energy use per search could jump to between 6.9 and 8.9 watt-hours.
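To put those per-query figures in perspective, here is a minimal Python sketch, using only the watt-hour numbers cited above, that computes the implied multipliers:

```python
# Per-query energy figures cited in the EPRI report (watt-hours)
GOOGLE_SEARCH_WH = 0.3
CHATGPT_QUERY_WH = 2.9
AI_SEARCH_WH_RANGE = (6.9, 8.9)  # projected AI-enhanced Google search

# A ChatGPT request uses roughly 10x the energy of a standard search
chatgpt_multiplier = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"ChatGPT vs. search: {chatgpt_multiplier:.1f}x")  # ~9.7x

# An AI-enhanced search could use roughly 23x to 30x a standard search
low_x, high_x = (wh / GOOGLE_SEARCH_WH for wh in AI_SEARCH_WH_RANGE)
print(f"AI-enhanced search vs. search: {low_x:.0f}x to {high_x:.0f}x")
```

In other words, folding generative AI into every search would multiply per-query energy use by well over an order of magnitude.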
EPRI created four forecasts for possible data center energy use from 2023 to 2030, each based on different annual growth rates: low (3.7%), moderate (5%), high (10%), and higher (15%). Under the most extreme scenario, data center electricity use could soar to 403.9 TWh/year by 2030, marking a 166% increase from 2023 levels. Even the lowest growth forecast predicts a 29% increase.
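The scenario figures follow from simple compound growth over the seven years from 2023 to 2030. A short Python sketch reproduces them; note the 2023 baseline of roughly 152 TWh is back-calculated here from the report's 403.9 TWh and 166% figures, not stated directly in the text:

```python
# Annual growth rates for EPRI's four forecast scenarios
SCENARIOS = {"low": 0.037, "moderate": 0.05, "high": 0.10, "higher": 0.15}
YEARS = 2030 - 2023  # seven years of compounding

# 2023 baseline inferred from the report: 403.9 TWh represents a 166%
# increase over 2023, so the base is about 403.9 / 2.66 ≈ 152 TWh
BASE_TWH = 403.9 / (1 + 1.66)

for name, rate in SCENARIOS.items():
    total_2030 = BASE_TWH * (1 + rate) ** YEARS
    increase_pct = ((1 + rate) ** YEARS - 1) * 100
    print(f"{name:>8}: {total_2030:6.1f} TWh/year (+{increase_pct:.0f}%)")
```

Running this recovers the headline numbers: the 15% scenario compounds to a 166% increase (about 404 TWh/year), while the 3.7% scenario yields the 29% floor.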
This spike in demand isn’t uniform across the U.S. Fifteen states accounted for 80% of the national data center load in 2023, with Virginia alone making up 25%. If growth continues at the highest rate, data centers could account for 46% of Virginia’s electricity consumption by 2030. States like Oregon, Iowa, Nebraska, North Dakota, and Nevada are also expected to see notable growth, with data centers comprising 20% or more of their total electricity demand.
Different types of data centers drive this growth. Enterprise data centers, operated by individual companies, make up 20-30% of the total load. Co-location centers, where companies rent shared space, and hyperscale centers run by cloud giants like Amazon, Google, and Microsoft account for the remaining 60-70%. Massive hyperscale centers are at the forefront of energy innovation, with new facilities hosting capacities from 100 to 1,000 megawatts.
With the rise in AI applications, companies are in a rush to secure state-of-the-art GPU-equipped servers. Yet, acquiring hardware is only part of the challenge. The power needed for these high-energy systems is another pressing concern. The race to leverage AI isn’t just about hardware and data; it’s also about ensuring adequate data center capacity.
In this scenario, enterprises need to approach data center procurement like hyperscale giants do. Companies such as Amazon, Google, and Microsoft secure long-term capacity to support growth. They engage in multi-year agreements with power providers, facility operators, and manufacturers to guarantee needed resources.
To stay competitive, enterprises may need to reconsider their procurement strategies. Traditionally, companies have followed a “three bids and a buy” process, choosing the lowest-cost provider for each project. However, with constrained data center capacity and critical infrastructure in high demand, this approach may not be sustainable.
Instead, companies might need to develop long-term partnerships with data center and equipment providers, securing a set level of capacity over time to ensure a steady supply. This approach is becoming increasingly common, with some suppliers moving away from traditional RFP processes.
For enterprise IT leaders, this shift will necessitate strategic thinking and long-term planning. This proactive approach won’t be easy and will require collaboration across IT, facilities, and finance teams, along with significant upfront investments in infrastructure. Nonetheless, for companies aiming to excel in an AI-driven world, this strategy may be essential to stay ahead.