⚡🤖 Terawatts & Trillions: The Megaplan to Outscale the Power Grid
Inside the quiet power grab to fuel trillion-chip AI data centers

Quick Bytes
🧠 XAI’s 1M-Chip Data Center: Building now—just the start of a path toward 1 trillion chips.
⚡️ Tesla Terawatt Goal: Doubling U.S. electricity just to power AI data centers.
🚀 100x AI Infrastructure Scale-Up: Massive leap in chip, memory, and data center capacity.
☢️ Nuclear Comparison: Would take 1,000 reactors and a river of cooling water to match.
🧊 AI Infra Explosion: From 130kW to 600kW+ racks—data centers are turning into power-hungry beasts.
🌎 New AI Capacity:
📍40MW Southeast US – Ready 4Q 2026, 150kW/rack
📍25MW Midwest US – Liquid cooling, tax incentives, ready 1Q 2027
📍750MW+ East Coast US – Starting 3Q 2025 with liquid cooling
📅 Need data center capacity for AI workloads? Let’s talk. Book a meeting today.
Tesla Terawatt Goal Is Master Plan 4
Tesla Master Plan 3 targeted a global installation of 30 terawatts (TW) of renewable energy capacity, primarily solar and wind. This is a monumental scale-up aimed at meeting all energy needs across electricity, transportation, and heating. The plan also called for 240 terawatt-hours (TWh) of battery storage worldwide to ensure a consistent energy supply and address the intermittency of solar and wind.
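To get a feel for those two targets together, here is a minimal sketch of the storage-to-capacity ratio they imply. The "hours of storage" figure is just the arithmetic of the two numbers above, not a figure from Tesla.

```python
# Master Plan 3 global targets: storage relative to renewable capacity (illustrative).
renewables_tw = 30        # global renewable capacity target (TW)
storage_twh = 240         # global battery storage target (TWh)

hours_of_storage = storage_twh / renewables_tw
print(f"{storage_twh} TWh of batteries behind {renewables_tw} TW of renewables")
print(f"-> roughly {hours_of_storage:.0f} hours of storage at full renewable output")
```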
A Tesla Terawatt plan would be a more focused XAI-and-Tesla Master Plan 4: doubling the electricity generation of the USA for the purpose of powering monster AI data centers. The Terawatt goal is smaller than Master Plan 3 but still huge, and it also involves scaling up AI chip production, AI memory production, and AI data center capacity by about 100 times.
If this were attempted with 1,000 nuclear reactors, it would mean roughly ten times more reactors than the US operates today and about 20 times more than China. Cooling those reactors would need around 7 million tons of water each minute, a literal major river of water.
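As a rough sanity check on those comparisons, here is a minimal back-of-envelope sketch. The per-reactor output, fleet sizes, and per-reactor cooling flow are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope check on the nuclear comparison (illustrative assumptions).
REACTOR_OUTPUT_GW = 1.0          # assume ~1 GW(e) per large reactor
US_REACTORS_TODAY = 94           # approximate current US fleet (assumption)
CHINA_REACTORS_TODAY = 57        # approximate current Chinese fleet (assumption)

reactors_needed = 1_000          # reactors to supply ~1 TW of new capacity
capacity_tw = reactors_needed * REACTOR_OUTPUT_GW / 1_000
print(f"Capacity from {reactors_needed} reactors: ~{capacity_tw:.1f} TW")
print(f"vs. US fleet today:    ~{reactors_needed / US_REACTORS_TODAY:.0f}x")
print(f"vs. China fleet today: ~{reactors_needed / CHINA_REACTORS_TODAY:.0f}x")

# Cooling: assume ~7,000 tonnes/minute of once-through cooling water per reactor
# (an assumption; real flows vary widely with plant design and cooling type).
COOLING_TONNES_PER_MIN = 7_000
total_tonnes_per_min = reactors_needed * COOLING_TONNES_PER_MIN
print(f"Cooling water: ~{total_tonnes_per_min / 1e6:.0f} million tonnes per minute")
```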
A one-billion-chip AI data center would be a stepping stone to one trillion chips.
XAI is completing a one-million-chip AI data center within the next 12 months.
Tesla Master Plan 3
For the United States, the plan includes approximately 3 TW of solar, 1.9 TW of wind, and 6.5 TWh of battery storage, tailored to meet the nation’s energy demands sustainably.
Tesla estimated a global investment of $10 trillion, which is less than the $14 trillion projected for fossil fuel spending over the next two decades, making it economically viable. The plan also argues that the materials required for this transition are available.
Doubling U.S. Electricity, but All for XAI and Tesla Data Centers
The U.S. currently generates approximately 4,243 TWh of electricity annually (based on 2022 data). Doubling this to 8,486 TWh per year would require a significant increase in generation capacity—roughly doubling the existing ~1,200 GW of installed capacity to ~2,400 GW, assuming current utilization rates.
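Here is a minimal sketch of that arithmetic, assuming the ~4,243 TWh and ~1,200 GW figures above and holding the fleet-average utilization constant:

```python
# Doubling US electricity generation at today's average utilization (illustrative).
HOURS_PER_YEAR = 8_760

current_generation_twh = 4_243       # approx. annual US generation (2022)
current_capacity_gw = 1_200          # approx. installed capacity

avg_utilization = current_generation_twh * 1_000 / (current_capacity_gw * HOURS_PER_YEAR)
print(f"Implied fleet-average utilization: {avg_utilization:.0%}")   # ~40%

target_generation_twh = 2 * current_generation_twh                   # 8,486 TWh
required_capacity_gw = target_generation_twh * 1_000 / (avg_utilization * HOURS_PER_YEAR)
print(f"Capacity needed at the same utilization: ~{required_capacity_gw:,.0f} GW")  # ~2,400 GW
```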

Energy Output: Doubling U.S. electricity generation would raise annual output to 8,486 TWh, a substantial jump but far smaller than the global energy transformation in Master Plan 3. For the U.S. alone, the plan’s renewable capacity (3 TW solar + 1.9 TW wind = ~4.9 TW) could generate significantly more than 8,486 TWh annually at reasonable capacity factors (e.g., 25% for solar, 40% for wind), potentially exceeding 10,000 TWh per year and going well beyond mere doubling.
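A worked version of that capacity-factor math, using the illustrative 25% and 40% figures quoted above:

```python
# Annual energy from the Master Plan 3 US renewable build-out (illustrative capacity factors).
HOURS_PER_YEAR = 8_760

solar_tw, solar_cf = 3.0, 0.25    # 3 TW of solar at ~25% capacity factor
wind_tw, wind_cf = 1.9, 0.40      # 1.9 TW of wind at ~40% capacity factor

solar_twh = solar_tw * 1_000 * solar_cf * HOURS_PER_YEAR / 1_000   # GW -> GWh -> TWh
wind_twh = wind_tw * 1_000 * wind_cf * HOURS_PER_YEAR / 1_000

total_twh = solar_twh + wind_twh
print(f"Solar: ~{solar_twh:,.0f} TWh/yr, Wind: ~{wind_twh:,.0f} TWh/yr")
print(f"Total: ~{total_twh:,.0f} TWh/yr vs. a doubling target of 8,486 TWh")  # well above 10,000 TWh
```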
Doubling generation focuses solely on electricity production, likely using a mix of existing sources (fossil fuels, nuclear, renewables).
Doubling output would normally require new power plants and grid upgrades. However, Elon Musk has talked about using millions of Megapacks so that existing power plants could run at maximum output at all times, generating even at night and during other low-demand periods. That would mean burning more coal, natural gas, and other fuels at the existing thermal power plants.
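A rough sketch of how far running the existing fleet flat out could go, assuming the same ~1,200 GW fleet and an illustrative 90% availability (both assumptions, not figures from Tesla or XAI):

```python
# How much more energy the existing fleet could produce if storage let it run near
# maximum output around the clock (illustrative assumptions).
HOURS_PER_YEAR = 8_760

fleet_capacity_gw = 1_200
current_generation_twh = 4_243
assumed_availability = 0.90          # assumed: plants cannot run at 100% forever

flat_out_twh = fleet_capacity_gw * assumed_availability * HOURS_PER_YEAR / 1_000
print(f"Flat-out generation: ~{flat_out_twh:,.0f} TWh/yr")
print(f"That is ~{flat_out_twh / current_generation_twh:.1f}x today's output")
```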
AI INFRA SUMMIT 3 Reflection: Scaling for the Future
Earlier this month, we were thrilled to produce AI INFRA SUMMIT 3 in Mountain View at Microsoft’s Silicon Valley Campus. This amazing event highlighted the extraordinary challenges facing the AI infrastructure industry as it undergoes unprecedented scaling demands. Over the past decade, compute density has skyrocketed from 10kW racks to today's 130kW GB200 systems, with projections reaching 600kW by 2027 and potentially megawatt-per-rack configurations before 2030.
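To make that density jump concrete, here is a small sketch of how many racks a fixed 100 MW of IT load requires at each density point; the 100 MW figure is arbitrary and chosen only for illustration.

```python
# Racks required to host 100 MW of IT load at different rack densities (illustrative).
it_load_kw = 100_000   # 100 MW of IT load, chosen only for illustration

rack_densities_kw = {
    "Decade-ago rack": 10,
    "GB200 rack (today)": 130,
    "Projected 2027 rack": 600,
    "Megawatt-class rack": 1_000,
}

for label, kw_per_rack in rack_densities_kw.items():
    racks = it_load_kw / kw_per_rack
    print(f"{label:20s} {kw_per_rack:>5} kW/rack -> ~{racks:,.0f} racks")
```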
This dramatic scaling necessitates rethinking every aspect of AI infrastructure: cooling innovations have become essential rather than optional, structural requirements have intensified to support racks weighing up to 10,000 pounds, and hardware refresh cycles have compressed from 3-7 years to annual updates.

Ignite’s AI INFRA SUMMIT, a semi-annual event, creates critical conversations across traditionally siloed domains, addressing forward-looking questions: how will today's massive GPU clusters evolve over the next 3-5 years? How will infrastructure adapt to specialized chips beyond NVIDIA GPUs? How can we optimize communication pathways between components as bottlenecks constantly shift?
Beyond technical challenges, the AI INFRA SUMMIT tackles the financial dimension of AI deployment. With AI infrastructure being capital intensive, understanding where money flows and generates returns across the entire AI value chain is crucial. This event series brings together early-stage investors, growth-stage VCs, operators, landowners, developers, and hardware providers to collaborate on projects ranging from 20MW facilities to gigawatt-scale campuses.

The AI INFRA SUMMIT also addresses critical security and governance frameworks. AI infrastructure presents unique security challenges, from protecting high-value GPU clusters worth hundreds of millions to safeguarding sensitive data against exfiltration and poisoning attacks. As AI evolves toward autonomous agents, implementing proper monitoring becomes essential to prevent rogue activity.
Looking ahead, the industry is being reshaped by specialized AI processors, distributed computing architectures, and innovations in energy and cooling technology, including two-phase liquid cooling and immersion solutions for extreme density requirements.
New Capacity Opportunities For 2025 and Beyond
40 MW in The Southeast Region of The US
Build-to-suit or turnkey, ready for service 4Q 2026
150+ kW/rack density
< 1.2 PUE
25 MW in The Midwest Region of The US
Ready for service 1Q 2027
Liquid cooling available
Tax incentives at this site
750+ MW in The Central East Coast Region of The US
50 MW ready for service 3Q 2025
+500 MW ready for service 4Q 2026
+200 MW ready for service 2028
Liquid cooling available
The above sites are great for large-scale deployments, and we also have plenty of inventory for teams that need just a few megawatts or even a few hundred kilowatts of capacity; a quick sizing sketch based on the listed densities follows below. What kind of infrastructure are you looking for? Book time here with our team to learn about what’s available.
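For readers sizing a deployment against listings like those above, here is a minimal sketch of how rack density and PUE translate into rack count and total facility power. The 40 MW / 150 kW-per-rack / 1.2 PUE inputs mirror the Southeast listing; treating the 40 MW as critical IT load is an assumption for illustration.

```python
# Rough sizing of a deployment from the listed specs (illustrative assumptions).
it_load_mw = 40          # assume the listed 40 MW is critical IT load
kw_per_rack = 150        # listed density
pue = 1.2                # listed PUE ceiling (total facility power / IT power)

racks = it_load_mw * 1_000 / kw_per_rack
total_facility_mw = it_load_mw * pue

print(f"Racks at {kw_per_rack} kW each: ~{racks:,.0f}")
print(f"Total facility power at PUE {pue}: ~{total_facility_mw:.0f} MW")
```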

MAY 2, AI INFRA HIGHLIGHTS
Your Feedback Matters
At Infra Insider, we’re here to give you an edge—exclusive insights into the evolving infrastructure world that others miss. While most newsletters skim the surface, we go deeper, spotlighting the trends shaping AI, data centers, and tech capacity. Think behind-the-scenes intel, key players driving change, and actionable updates straight from the field. It’s not just another industry briefing—it’s your front-row seat to the future.
Your feedback is crucial in helping us refine our content and maintain the newsletter's value for you and your fellow readers. We welcome your suggestions on how we can improve our offering: [email protected].
Nina Tusova
Director of Customer Relationship & Success // Team Ignite
Did this newsletter bring the heat or just a little spark? Rate the intel!