As we rush to implement AI in our country, the immense demand on our power grid is a major problem. Our grid in major metropolitan areas is already strained, and the rush to build the data centers required to serve these major loads will only add to the problem. Currently, only three generation options exist: fossil fuels, renewable energy, and nuclear power. The new demand has prompted power companies to invest in major upgrades, construct new generation plants, and build new and upgraded transmission systems feeding local substations. AI data centers may require dedicated substations to serve their cloud-server and cooling loads. Examples of demand loads for each type of facility include:
Average Power Consumption
- Small AI Data Centers: 10–50 MW (megawatts)
- Large AI Data Centers (Hyperscale): 100–500 MW
- Extreme AI Facilities (e.g., for training large models): Can exceed 1 GW (gigawatt), comparable to a nuclear power plant.
Comparisons for Scale
- A 100 MW data center uses enough electricity to power 80,000+ homes in the U.S.
- ChatGPT-level AI models require powerful GPUs, like Nvidia’s H100, which can consume 700 W per chip; a single data center can house tens of thousands of these chips.
- 1 GW data centers (potentially coming soon) could use as much power as entire U.S. states like Vermont or Wyoming.
AI’s Growing Power Demand
- AI workloads (especially training) consume 5–10× more power than traditional cloud computing.
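The chip-level figures above can be turned into a facility-level estimate with simple arithmetic. This is a rough sketch: the GPU count, the non-GPU overhead fraction, and the PUE (power usage effectiveness) used here are illustrative assumptions, not figures from this article.

```python
# Back-of-envelope facility power for a GPU data center, using the
# 700 W per H100 chip figure cited above. Chip count, overhead
# fraction, and PUE below are illustrative assumptions.

def facility_power_mw(num_gpus: int,
                      watts_per_gpu: float = 700.0,
                      overhead_fraction: float = 0.4,
                      pue: float = 1.3) -> float:
    """Return estimated total facility power in megawatts.

    overhead_fraction: extra IT load (CPUs, memory, networking) as a
    fraction of GPU power; pue: total facility power / IT power.
    """
    it_power_w = num_gpus * watts_per_gpu * (1 + overhead_fraction)
    return it_power_w * pue / 1e6

# An assumed 50,000 H100-class GPUs lands in the tens of megawatts
# at the facility meter, consistent with the ranges listed above.
print(f"{facility_power_mw(50_000):.1f} MW")
```

Under these assumptions, 50,000 chips works out to roughly 64 MW, i.e., a large hyperscale load from GPUs alone.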
Late-Breaking News: AI Investments Reshaping the Future
In a dramatic acceleration of artificial intelligence (AI) infrastructure development, major tech players and governments worldwide are pouring unprecedented resources into AI data centers. The race to power next-generation AI models has triggered investments totaling over $1 trillion, with the United States leading the charge.
Massive AI Data Center Investments Underway
One of the most ambitious projects is President Trump’s “Stargate” initiative, a $500 billion joint effort between OpenAI, SoftBank, and Oracle. The project aims to build 20 state-of-the-art AI data centers, each spanning half a million square feet, with the first facilities already under construction in Texas. Microsoft has projects underway in Georgia, Louisiana, and Wisconsin.
Tech giants are also scaling their AI operations:
- Microsoft plans to invest $80 billion in AI-enabled data centers in 2025. The first two data center buildings are under construction in Mount Pleasant, Wisconsin, with additional buildings planned this year in Kenosha.
- Google has increased its capital expenditure to $75 billion, focusing on AI infrastructure.
- France has launched a $109 billion AI investment plan, while Brookfield & Data4 are committing $20 billion to AI infrastructure in Europe.
Why This Matters
As AI advances, the demand for powerful computing infrastructure is skyrocketing. These investments are not just about building data centers—they are laying the foundation for the future of AI-driven economies, automation, and digital transformation.
With AI set to revolutionize industries from healthcare to finance and national security, these developments signal the beginning of a new era of AI dominance. They will also strain our industry: electrical and HVAC contractors perform the bulk of the work on these buildings, and electrical manpower in those areas will be stretched thin.
The next few years will determine which nations and companies emerge as global AI leaders. AI’s future is unfolding faster than ever.
Data Center Power Estimate
A 100,000-square-foot data center’s power consumption depends on its power density, which varies based on workload intensity, cooling infrastructure, and hardware. Here’s a rough estimate:
Power Density Estimates
Low Density (General Cloud Computing):
100–200 W per sq. ft.
→ 10–20 MW total
Medium Density (Enterprise & AI Processing):
200–500 W per sq. ft.
→ 20–50 MW total
High Density (AI/ML Training, HPC, Hyperscale):
500–1,000 W per sq. ft.
→ 50–100 MW total
Extreme Density (Next-Gen AI Centers):
1,000+ W per sq. ft.
→ 100+ MW total
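The density bands above all follow one formula: total load equals power density times floor area. A minimal sketch of that conversion for the 100,000-square-foot building discussed here:

```python
# Convert power density (W per sq. ft.) into total load (MW) for a
# 100,000-square-foot data center: MW = (W/sqft * area) / 1e6.

AREA_SQFT = 100_000

def total_mw(watts_per_sqft: float, area_sqft: int = AREA_SQFT) -> float:
    """Total building load in megawatts at the given power density."""
    return watts_per_sqft * area_sqft / 1e6

for label, low, high in [
    ("Low density", 100, 200),
    ("Medium density", 200, 500),
    ("High density", 500, 1000),
]:
    # Reproduces the MW ranges listed above, e.g. 100-200 W/sqft -> 10-20 MW
    print(f"{label}: {total_mw(low):.0f}-{total_mw(high):.0f} MW")
```

The same function shows why 1,000+ W per square foot pushes a building of this size past 100 MW.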
Real-World Comparisons
- A Google or AWS cloud data center of this size might use 20–40 MW.
- A dedicated AI training data center (running GPUs like Nvidia H100s) could use 50–100 MW or more.
- The highest-end AI facilities are pushing beyond 1,000 W per square foot, requiring on-site power plants or dedicated grid connections.
- By 2030, AI data centers could consume over 10% of the world’s electricity.
Stay tuned as new power generation systems emerge; even dedicated nuclear power may be an option. The demand for electricians is massive. Here in Wisconsin, data center construction is already in progress, with some projects requiring as many as 800 electricians on site.