
The world’s data centers consumed approximately 415 terawatt-hours (TWh) of electricity in 2024, about 1.5% of global electricity use, and that number is on track to more than double by the end of this decade.
According to the International Energy Agency’s (IEA) April 2025 report, global data center electricity demand is projected to reach 945 TWh by 2030, an amount roughly equal to Japan’s entire annual power consumption today. Artificial intelligence (AI) is the primary driver of this surge, and its rapid growth is raising hard questions about whether existing power infrastructure can support the industry’s expansion plans.
AI Workloads and Data Centers Are Consuming Far More Power Than Traditional Computing
The core problem is that AI doesn’t run like standard software. A traditional server rack consumes between 5 and 15 kilowatts (kW). An AI-optimized rack loaded with high-performance GPUs demands 40 to 60 kW, and some cutting-edge AI training facilities are pushing individual racks past 100 kW. That’s a fundamental shift in how much electricity a single unit of computing infrastructure requires.
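To see how that rack-level shift compounds at facility scale, here is a quick back-of-envelope comparison (illustrative only; the rack count is an assumption, and the per-rack figures are midpoints of the ranges cited above):

```python
# Illustrative: total power draw of a 1,000-rack facility at the rack
# densities cited above. The rack count is a hypothetical assumption.
RACKS = 1_000
TRADITIONAL_KW = 10   # midpoint of the 5-15 kW range per rack
AI_KW = 50            # midpoint of the 40-60 kW range per rack

traditional_mw = RACKS * TRADITIONAL_KW / 1_000  # convert kW to MW
ai_mw = RACKS * AI_KW / 1_000

print(f"traditional: {traditional_mw} MW, AI-optimized: {ai_mw} MW")
# → traditional: 10.0 MW, AI-optimized: 50.0 MW
```

The same floor space, in other words, can demand roughly five times the grid capacity once it is filled with AI-optimized racks.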
The difference shows up at the query level, too. A standard Google search uses around 0.3 watt-hours (0.0003 kWh) of electricity, and a single ChatGPT query uses approximately 0.3 to 0.34 watt-hours. As AI models grow more complex and usage scales up globally, these small per-query numbers add up fast. AI-specific servers in the U.S. alone used an estimated 53 to 76 TWh in 2024, and projections put that figure at 165 to 326 TWh by 2028.
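The scaling effect is easy to check with simple arithmetic (a minimal sketch; the per-query figure is from the article, but the daily query volume is a hypothetical assumption, not a reported number):

```python
# Back-of-envelope: how a tiny per-query energy figure scales with volume.
# WH_PER_QUERY comes from the article; QUERIES_PER_DAY is a hypothetical
# assumption chosen purely for illustration.
WH_PER_QUERY = 0.3                # ~0.3 Wh per AI query
QUERIES_PER_DAY = 1_000_000_000   # assumed 1 billion queries per day

wh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365
twh_per_year = wh_per_year / 1e12  # 1 TWh = 10^12 Wh

print(f"{twh_per_year:.3f} TWh per year")
# → 0.110 TWh per year
```

Even at a billion queries a day, query energy alone is a small slice of the 53 to 76 TWh total, which underscores how much of the load comes from training, model serving overhead, cooling, and everything else a data center runs.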
Training large language models (LLMs) compounds the problem. Training a single large model is estimated to have generated over 550 tons of carbon dioxide, equivalent to the annual carbon footprint of 121 American households. And while training is the most power-intensive phase, inference, the process of actually running AI models in response to user queries, accounts for an estimated 80 to 90% of a model’s total energy use over its lifecycle, making the operational cost of deployed AI just as significant.
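The training-versus-inference split implies that lifecycle energy is dominated by operations, which a short calculation makes concrete (a hedged sketch; the training energy figure is a hypothetical placeholder, and the inference share uses the midpoint of the 80 to 90% range above):

```python
# Illustrative lifecycle energy split between training and inference.
# TRAINING_MWH is a hypothetical placeholder value; INFERENCE_SHARE is
# the midpoint of the 80-90% range cited in the text.
TRAINING_MWH = 1_000      # assumed one-time training energy cost (MWh)
INFERENCE_SHARE = 0.85    # inference as a share of lifecycle energy

# If inference is 85% of lifetime energy, training is the remaining 15%:
total_mwh = TRAINING_MWH / (1 - INFERENCE_SHARE)
inference_mwh = total_mwh * INFERENCE_SHARE

print(f"total: {total_mwh:.0f} MWh, inference: {inference_mwh:.0f} MWh")
# → total: 6667 MWh, inference: 5667 MWh
```

Under these assumptions, every megawatt-hour spent on training implies roughly another five and a half spent serving the model over its lifetime, which is why deployment, not just training, drives the industry's power demand.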
Big Tech’s Carbon Commitments Are Falling Apart
For years, the world’s largest tech companies positioned themselves as corporate leaders on climate. Those commitments are now in direct conflict with their own expansion plans.
Microsoft pledged to be carbon negative by 2030. However, its emissions rose 23.4% between 2020 and 2024, driven by a 30.9% increase in indirect supply chain emissions tied to data center construction and AI workloads, according to its own sustainability report.
Google, which had declared itself carbon neutral since 2007, reported that its total emissions grew nearly 50% over the five years to 2024, pointing to data center electricity consumption as the primary cause.
Amazon, despite being the world’s largest corporate buyer of renewable energy, reported a carbon footprint of nearly 68.25 million metric tons of CO₂ equivalent in 2024, a 6% rise from the previous year, and roughly triple its 2019 figure.
Industry analysts say it is now practically impossible for these companies to hit their 2030 targets, given that emissions have already risen sharply and their AI investments are still in early stages. A 2025 analysis by NewClimate Institute and Carbon Market Watch found that all five major tech companies, Amazon, Apple, Google, Meta, and Microsoft, face a strategy crisis, with their climate goals undermined by AI growth plans and outdated greenhouse gas accounting methods.
Electricity is only part of the environmental picture. Water consumption from data center cooling is a growing concern that receives far less attention: cooling a single large data center can require millions of gallons of water per day.
Regulators Are Responding
The gap between how fast data centers are expanding and how fast regulation is catching up is substantial, and it differs sharply across regions.
The European Union has moved furthest. The EU’s Energy Efficiency Directive (EED) already requires large data centers to report operational metrics like power usage and water consumption. More concretely, the European Commission is set to adopt a Data Centre Energy Efficiency Package in the first quarter of 2026, alongside a Strategic Roadmap on Digitalisation and AI, with the stated goal of achieving carbon-neutral data centers in Europe by 2030.
Critics, however, note that the EED still lacks binding efficiency or renewable energy targets specifically for data centers, which limits its practical effect.
For now, the growth of AI infrastructure continues to outpace both the grid capacity meant to power it and the regulatory frameworks meant to govern it. The IEA has described this as a “wake-up call” for electricity infrastructure in advanced economies, warning that without serious investment in grid capacity, governments will face difficult trade-offs between powering AI expansion and meeting broader decarbonization goals.