- Data centers are on track to consume more electricity than all U.S. heavy industry combined by 2030 — surpassing cement, steel, and chemical manufacturing put together.
- A single AI-focused hyperscale data center can draw as much power as 100,000 homes, dwarfing the energy footprint of conventional cloud storage facilities.
- Roughly two-thirds of data centers built since 2022 sit in water-stressed regions — making the hidden water cost of AI one of the most urgent and least-discussed environmental crises unfolding right now.
- The number of U.S. data centers more than doubled between 2018 and 2021, then doubled again — and the pace is still accelerating.
- There are viable paths to carbon-free, grid-friendly data centers — but right now, the industry isn’t defaulting to them, and that gap is where the real story lies.
Every time you ask an AI chatbot a question, a data center somewhere burns energy to answer it — and the bill the planet is picking up is growing fast.
The scale of that bill is only beginning to come into focus. Sustainability advocates, grid planners, and environmental scientists are raising alarms about an industry that has quietly become one of the most resource-intensive on Earth. GreenPower Monitor, which tracks energy use across enterprise infrastructure, is among the voices in this space helping organizations understand and act on their digital carbon footprint. The urgency isn’t abstract — it’s measured in terawatt-hours, billions of gallons of water, and communities absorbing the costs of infrastructure built for someone else’s profit.
U.S. Data Centers Will Soon Out-Consume All Heavy Industry Combined
According to the International Energy Agency, by 2030, U.S. data centers will consume more electricity than all of the country’s heavy industries combined — more than cement, steel, chemical plants, and car manufacturing facilities put together. Roughly half of that demand is projected to come from AI workloads alone. To put that in human terms, one energy analyst described the current moment as “the largest growth in power demand since the years following World War II.”
How Much Power Does a Data Center Actually Use?
The honest answer depends on what kind of data center we’re talking about — and the difference is staggering.
- A conventional data center handling cloud storage or video streaming draws electricity equivalent to 10,000 to 25,000 households
- A modern AI-focused hyperscale facility can consume power equivalent to 100,000 homes or more
- Meta’s Hyperion data center in Louisiana is expected to draw more than twice the power of the entire city of New Orleans once completed
- Amazon and Meta’s new facilities in Indiana and Louisiana will each require more than two gigawatts of electricity — dozens of times more than standard facilities
- xAI’s data center cluster near Memphis, across three nearby facilities, is projected to require nearly two gigawatts of power — roughly twice the electricity demand of the entire city of Seattle
These aren’t edge cases. They represent the new normal for AI infrastructure, and the grid was not designed with them in mind.
The number of U.S. data centers more than doubled between 2018 and 2021. Fueled by AI investment, that number has already doubled again. In 2023 alone, U.S. data centers consumed 176 terawatt-hours of electricity — roughly equivalent to the entire energy consumption of the nation of Ireland for a full year.
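To make those comparisons concrete, here is a minimal back-of-envelope sketch in Python. The average-household figure (about 10,800 kWh per year, roughly the U.S. average) and the assumption of continuous full-power operation are simplifications, so treat the outputs as order-of-magnitude context rather than precise equivalences.

```python
# Back-of-envelope conversions between facility power draw, annual energy,
# and household equivalents. The household figure is an assumed U.S. average.

AVG_HOUSEHOLD_KWH_PER_YEAR = 10_800   # assumption: approximate U.S. average home
HOURS_PER_YEAR = 8_760

def avg_power_mw(households: int) -> float:
    """Average power draw implied by a given number of household equivalents."""
    kwh_per_year = households * AVG_HOUSEHOLD_KWH_PER_YEAR
    return kwh_per_year / HOURS_PER_YEAR / 1_000   # kW -> MW

def annual_twh(gigawatts: float) -> float:
    """Energy consumed in a year by a facility drawing this much power continuously."""
    return gigawatts * HOURS_PER_YEAR / 1_000       # GWh -> TWh

print(f"{avg_power_mw(100_000):.0f} MW")   # ~123 MW: a "100,000-home" facility
print(f"{annual_twh(2.0):.1f} TWh")        # ~17.5 TWh/year: a 2 GW campus run flat out
```

Under these assumptions, the gap between a "100,000-home" facility and a two-gigawatt campus is more than an order of magnitude, which is exactly the scale jump the grid is now being asked to absorb.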
The 40-Seattle Problem: What Grid Expansion Really Means
When grid planners talk about the infrastructure challenge, they aren’t exaggerating for effect. Accommodating AI data center growth means building new transmission lines, upgrading substations, and in many cases, constructing entirely new power generation capacity. The xAI facilities near Memphis alone could draw roughly twice as much power as the entire city of Seattle. Multiply that across dozens of planned hyperscale campuses and the grid math becomes genuinely alarming.
Why Generative AI Demands Far More Energy Than Standard Computing
Standard computing — storing files, running web servers, processing transactions — has become remarkably efficient over time. Generative AI breaks that efficiency trend. Training large language models requires running billions of mathematical operations simultaneously across thousands of specialized chips, often for weeks at a time. Running those models at scale (called inference) isn’t cheap either. Every query to a tool like ChatGPT or DALL-E triggers a chain of energy-intensive computations that standard software simply doesn’t require. While not every data center workload is AI-driven, generative AI has become the primary engine of surging energy demand.
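As a rough illustration of why the two workloads differ, the sketch below estimates training and inference energy from a handful of assumed inputs. Every number in it (accelerator power draw, cluster size, training duration, energy per query, query volume) is a hypothetical placeholder chosen for illustration, not a figure reported by any vendor.

```python
# Illustrative energy arithmetic for AI training and inference.
# All inputs are assumptions for the sake of the example.

ACCELERATOR_KW = 0.7            # assumed average draw per chip, including overhead
CLUSTER_SIZE = 10_000           # assumed number of accelerators in a training cluster
TRAINING_DAYS = 30              # assumed wall-clock training time
WH_PER_QUERY = 3.0              # assumed energy per generative query
QUERIES_PER_DAY = 100_000_000   # assumed daily query volume at scale

training_mwh = ACCELERATOR_KW * CLUSTER_SIZE * TRAINING_DAYS * 24 / 1_000
inference_mwh_per_day = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000

print(f"One training run: ~{training_mwh:,.0f} MWh")                     # ~5,040 MWh
print(f"Inference at scale: ~{inference_mwh_per_day:,.0f} MWh per day")  # ~300 MWh/day
```

The point of the exercise is not the specific totals but the shape: training is a huge periodic cost, while inference is a smaller per-query cost that compounds with every user, every day.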
176 Terawatt-Hours: Putting 2023 Data Center Consumption in Context
Globally, data center electricity consumption reached 460 terawatt-hours in 2022 — enough to rank data centers as the 11th largest electricity consumer in the world if they were a country, sitting between France and Saudi Arabia. By 2026, that figure is expected to approach 1,050 terawatt-hours, which would vault data centers to fifth place globally, between Russia and Japan. The trajectory isn’t gradual. It’s near-vertical.
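The arithmetic behind that "near-vertical" description is easy to check. A minimal sketch of the implied growth rate, using only the two figures cited above:

```python
# Implied compound annual growth rate from 460 TWh (2022) to ~1,050 TWh (2026).
start_twh, end_twh, years = 460.0, 1_050.0, 4
cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.0%} per year")   # roughly 23% per year
```

Compounding at that rate, the load roughly doubles every three to four years.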
The Fossil Fuel Trap AI Companies Walked Right Into
Here’s where the environmental story gets genuinely complicated. The tech industry has spent years making public commitments to carbon neutrality and renewable energy. But the pace of AI-driven energy demand has outrun the pace of clean energy deployment — and the gap is being filled with fossil fuels.
About half of the electricity currently powering U.S. data centers comes from fossil fuel plants. The pressure on the grid has become so acute that new natural gas plants are being commissioned specifically to meet data center demand — directly contradicting the clean energy timelines many of these same tech companies have publicly committed to.
New Gas Plants Are Being Built Specifically to Power AI
This isn’t a theoretical risk. Utility companies in Virginia, Georgia, and Texas have already announced plans to delay the retirement of existing gas plants — or build new ones — in direct response to data center load growth. The irony is sharp: companies publicly committed to net-zero emissions are indirectly driving new fossil fuel infrastructure investment simply by expanding their AI capacity faster than renewables can scale to meet it.
The economic logic is straightforward even if the environmental logic is not. Renewable energy projects — especially large solar and wind farms — take years to permit, finance, and build. A tech company that wants a hyperscale campus operational within 18 months doesn’t have the luxury of waiting for a new wind farm. So the grid fills the gap with whatever generation capacity is available, and right now, that often means gas.
The result is a structural contradiction at the heart of Big Tech’s sustainability messaging: the more aggressively these companies deploy AI, the harder it becomes to honor their climate commitments on any meaningful timeline.
- Virginia, the world’s largest data center hub, is running out of space — and its grid is already strained
- New data center hubs are expanding rapidly in Phoenix, Atlanta, and Dallas
- Gas turbine manufacturers report order backlogs stretching years into the future, driven partly by data center demand
- Some utilities are now forecasting electricity demand growth they haven’t seen since the post-WWII industrial expansion
Why Natural Gas Turbines Are Already Backlogged for Years
The demand surge hasn’t just strained electricity grids — it’s overwhelmed the supply chains for the equipment needed to build new power plants. Gas turbine manufacturers, which had scaled back production during a period of stagnant demand, are now reporting order backlogs measured in years. This means that even where utilities want to build new generation capacity quickly, the physical equipment to do so isn’t immediately available. The bottleneck compounds the environmental problem: in the interim, older, dirtier generation assets stay online longer than planned.
Water Consumption: The Hidden Environmental Cost Nobody Talks About
Energy headlines dominate the data center conversation, but water tells an equally troubling story. Cooling the massive server stacks inside these facilities requires enormous volumes of water — and unlike electricity consumption, water use is rarely disclosed, rarely regulated, and almost never scrutinized at the scale it deserves. The problem is compounded by the fact that roughly half the electricity powering U.S. data centers comes from fossil fuel plants, which themselves consume significant volumes of water for cooling and steam generation. The water cost is effectively incurred twice: once at the data center and again at the power plant.
Google Used Over 5 Billion Gallons of Water in 2023 Alone
Google’s own environmental reports disclosed that its data centers consumed over 5 billion gallons of water in 2023. Microsoft and Meta have reported similarly staggering figures in their sustainability disclosures — though the consistency and methodology of those disclosures vary considerably from company to company. What makes these numbers particularly striking isn’t just their scale, but where that water is being withdrawn. According to a Bloomberg News analysis, roughly two-thirds of data centers built since 2022 have been located in water-stressed regions — areas already facing supply pressure from agriculture, municipal use, and climate-driven drought.
Why Companies Rarely Disclose How Much Water Their Data Centers Use
The short answer is that they largely don’t have to. Water disclosure requirements for private corporations are inconsistent across U.S. states and almost entirely absent at the federal level. For many tech companies, the reputational risk of publishing detailed water consumption data — especially against a backdrop of drought conditions in the American Southwest — creates a clear incentive for opacity rather than transparency.
The cooling systems inside data centers are the primary culprit. Most large facilities use evaporative cooling towers, which work by evaporating water to dissipate heat. This process is highly effective but consumes water at rates that scale directly with computing load. As AI workloads drive servers to run hotter and harder, cooling demand increases proportionally. A hyperscale AI facility doesn’t just use more electricity than a conventional data center — it also uses dramatically more water.
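For a sense of the volumes involved, here is a minimal sketch assuming an evaporative system that consumes about 1.8 liters of water per kilowatt-hour of IT load, a commonly cited ballpark rather than a measured value for any particular facility; the 300 MW example load is likewise hypothetical.

```python
# Rough estimate of evaporative cooling water use, scaling with IT load.
# The water-per-kWh figure is an assumed industry ballpark, not a measurement.

LITERS_PER_KWH = 1.8        # assumed water consumed per kWh of IT load
GALLONS_PER_LITER = 0.264

def daily_cooling_water_gallons(it_load_mw: float) -> float:
    """Estimated gallons evaporated per day at a given average IT load."""
    kwh_per_day = it_load_mw * 1_000 * 24
    return kwh_per_day * LITERS_PER_KWH * GALLONS_PER_LITER

print(f"~{daily_cooling_water_gallons(300):,.0f} gallons/day")  # ~3.4 million gallons/day
```

Because the estimate scales linearly with load, pushing servers harder for AI workloads pushes water consumption up in lockstep, which is exactly the dynamic described above.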
The geographic dimension makes this worse. Data center developers often choose locations based on land cost, tax incentives, and fiber connectivity — not on local water availability. The result is that some of the most water-intensive industrial facilities on the planet are being built in regions that can least afford to spare the supply. Phoenix, one of the fastest-growing data center markets in the United States, sits in the heart of the Sonoran Desert and draws water from the already-strained Colorado River basin.
- Evaporative cooling towers are the dominant cooling method and consume water proportional to computing load
- AI servers run hotter than standard servers, increasing cooling water demand per unit of computation
- Roughly two-thirds of data centers built since 2022 are located in water-stressed regions
- Phoenix, Atlanta, and Dallas — three of the fastest-growing data center hubs — all face significant water supply challenges
- Fossil fuel power plants supplying data center electricity add additional water withdrawal that is rarely included in tech company sustainability reports
Local Communities Pay the Price While Tech Giants Take the Gains
There’s a pattern playing out in communities across the United States that deserves much more public attention. A tech company identifies a location with cheap land, favorable tax treatment, and available grid capacity. Local officials, eager for economic development, approve the project quickly. The data center gets built. And then the community discovers that the promised jobs are fewer than expected, the tax base is thinner than projected, and the costs — in water, grid strain, air quality, and higher energy bills — are very much real.
“We are experiencing the largest growth in power demand since the years following World War II.” — Energy industry analyst, cited in reporting on AI infrastructure expansion. The communities absorbing that demand growth rarely had a meaningful say in whether it came to their neighborhoods at all.
The economic development argument for data centers has always rested on a few key claims: jobs, tax revenue, and regional investment. In practice, all three are frequently overstated. Modern hyperscale data centers are highly automated. A facility consuming two gigawatts of electricity and spanning hundreds of thousands of square feet might employ only 30 to 50 permanent workers. The construction phase creates temporary jobs, but the operational footprint is remarkably lean relative to the infrastructure impact.
Meanwhile, the strain on local grids is anything but lean. When a single industrial customer begins drawing power equivalent to tens of thousands of homes, utilities must either build new generation and transmission capacity — costs that are ultimately socialized across all ratepayers — or manage increasingly tight supply margins that raise the risk of outages for everyone else.
Tax Incentives That Cost States More Than They Return
States and counties competing for data center investment have engaged in a race to the bottom on tax incentives that would be remarkable in any other industry. Sales tax exemptions on equipment purchases, property tax abatements, and subsidized utility rates are routinely offered to attract facilities that, once built, contribute relatively little to the local tax base on an ongoing basis. In some cases, the revenue a state gives up through these incentives rivals or exceeds what the facility ever returns to the community that hosts it.
Virginia — which hosts the largest concentration of data centers in the world in its Northern Virginia corridor — has grappled publicly with this tension. The state has offered substantial tax exemptions to data center developers for years, and researchers and policy advocates have questioned whether the returns justify the concessions. The data centers consume enormous grid capacity, require costly infrastructure upgrades, and generate relatively few permanent jobs, yet the tax incentives that attracted them reduce the revenue available to fund schools, roads, and public services.
The dynamic isn’t unique to Virginia. Similar patterns have emerged in Texas, Georgia, Ohio, and Oregon — wherever land is cheap and lawmakers are eager to claim a headline about tech investment. The communities that end up hosting these facilities rarely had the full picture when the deals were struck.
- Sales tax exemptions on equipment purchases reduce state revenue from some of the largest capital investments in the country
- Property tax abatements can last decades, limiting the long-term fiscal return to host communities
- Subsidized utility rates for large industrial customers can shift costs onto residential ratepayers
- Permanent employment at hyperscale facilities is often 30 to 50 workers — a fraction of what equivalent industrial investment would generate
Who Actually Bears the Cost of Air Pollution and Higher Energy Bills
The communities located nearest to data centers — and to the fossil fuel plants brought online to power them — absorb air quality impacts that tech company sustainability reports don’t account for. Nitrogen oxide and particulate matter emissions from gas plants aren’t distributed evenly. They concentrate in the neighborhoods closest to generation and transmission infrastructure, which are disproportionately lower-income communities and communities of color. The environmental justice dimension of the AI energy boom is real, present, and almost entirely absent from mainstream coverage of the topic.
Can Data Centers Ever Become Good Grid Citizens?
The honest answer is yes — but not without deliberate structural change. The technical solutions exist. What’s missing is the regulatory pressure, economic incentive, and corporate will to deploy them at scale and speed. There is a version of the data center industry that complements rather than strains the grid, that operates on carbon-free power, and that uses water responsibly. That version is not hypothetical. Parts of it are already operating. The gap between what’s possible and what’s being built by default is where sustainability advocates need to focus.
The core opportunity lies in the fact that data centers — unlike hospitals or residential buildings — have significant flexibility in when they run their most intensive workloads. AI model training, batch processing, software updates, and other non-time-sensitive tasks can, in principle, be scheduled around periods of low grid demand and high renewable generation. This concept, known as demand shifting or demand flexibility, could transform data centers from grid stressors into grid assets if implemented seriously.
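What demand shifting looks like in practice can be sketched in a few lines: rank the hours of a day-ahead forecast by grid carbon intensity and schedule flexible work into the cleanest ones. The forecast values below are invented for illustration; a real deployment would pull them from a utility, grid operator, or carbon-intensity service.

```python
# Minimal demand-shifting sketch: place flexible compute hours into the
# lowest carbon-intensity hours of a hypothetical day-ahead forecast.

# Invented hourly grid carbon intensity forecast, gCO2/kWh, hour 0 through 23
forecast = [320, 310, 300, 290, 280, 300, 350, 420, 460, 400, 310, 220,
            180, 170, 190, 240, 330, 480, 520, 500, 460, 420, 380, 350]

def cleanest_hours(intensities: list[int], hours_needed: int) -> list[int]:
    """Indices of the hours with the lowest forecast carbon intensity."""
    ranked = sorted(range(len(intensities)), key=lambda h: intensities[h])
    return sorted(ranked[:hours_needed])

def avg(hours: list[int]) -> float:
    return sum(forecast[h] for h in hours) / len(hours)

flexible_hours = 6                      # e.g. a batch training or data-processing job
shifted = cleanest_hours(forecast, flexible_hours)
naive = list(range(flexible_hours))     # baseline: run immediately, starting at midnight

print(shifted, f"{avg(shifted):.0f} vs {avg(naive):.0f} gCO2/kWh")
# e.g. [4, 11, 12, 13, 14, 15] 213 vs 300 gCO2/kWh
```

The same idea extends to shifting work away from peak demand hours, which is the behavior grid operators reward through demand response programs.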
Duke University’s Research on Peak Load Reduction
Research from Duke University has examined how large flexible loads — including data centers — could be used to actively support grid stability rather than simply drawing from it. The core finding is that if data centers participate in demand response programs, shifting even a fraction of their flexible workloads away from peak demand periods, the grid-level benefits in terms of reduced peak generation requirements and lower emissions intensity can be substantial. The challenge is that participation in these programs requires operational changes that many data center operators have been slow to prioritize, particularly when their primary metric is computing throughput rather than grid impact.
How Battery Storage and Demand Shifting Could Change Everything
Pairing large-scale battery storage with data center operations opens up possibilities that go well beyond simple demand response. A facility equipped with sufficient battery capacity can charge during periods of excess renewable generation — when wind and solar are producing more than the grid needs — and discharge during peak demand, effectively acting as a grid stabilizer. At scale, this model turns data centers from passive consumers into active participants in the clean energy transition.
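A toy sketch of that charge-and-discharge behavior is below. The load profile, surplus and peak windows, and battery size are all invented for illustration; real dispatch would respond to price and grid signals rather than fixed hours.

```python
# Toy battery dispatch: charge during an assumed midday renewable surplus,
# discharge during an assumed evening peak, and track the facility's grid draw.

FACILITY_LOAD_MW = 300.0
BATTERY_MWH = 400.0             # assumed usable storage
POWER_RATING_MW = 100.0         # assumed max charge/discharge rate
SURPLUS_HOURS = range(10, 16)   # assumed midday solar surplus
PEAK_HOURS = range(18, 22)      # assumed evening grid peak

stored = 0.0
grid_draw = []
for hour in range(24):                       # one-hour steps, so MW equals MWh per step
    draw = FACILITY_LOAD_MW
    if hour in SURPLUS_HOURS and stored < BATTERY_MWH:
        charge = min(POWER_RATING_MW, BATTERY_MWH - stored)
        stored += charge
        draw += charge                       # soak up surplus renewable generation
    elif hour in PEAK_HOURS and stored > 0:
        discharge = min(POWER_RATING_MW, stored)
        stored -= discharge
        draw -= discharge                    # shave the evening peak draw
    grid_draw.append(draw)

print(f"Peak-hour grid draw: {max(grid_draw[h] for h in PEAK_HOURS):.0f} MW")
# 200 MW, versus 300 MW without the battery
```

In this toy run the facility draws extra power when renewables are abundant and 100 MW less during the evening peak, which is the "grid asset" behavior described above.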
The economics are not yet universally favorable, but they are improving rapidly as battery costs continue to fall. Some hyperscale operators are already piloting these configurations, and early results are promising. The barrier isn’t primarily technological — it’s a combination of regulatory frameworks that don’t yet reward this behavior adequately and corporate procurement processes that prioritize upfront cost over lifecycle grid impact. Changing those incentive structures is one of the highest-leverage interventions available to policymakers who want to bend the data center emissions curve downward.
Why Carbon-Free Data Centers Are Possible But Not Yet the Default
The technology to run a data center on carbon-free power exists today. Several major facilities already operate on high proportions of renewable energy, and the engineering challenges — while real — are not insurmountable. What’s missing is the default expectation. Right now, the fastest path to getting a new hyperscale facility online involves connecting to whatever grid capacity is available, and in most U.S. markets, that means a mix dominated by fossil fuels. Until permitting reform, transmission investment, and procurement standards change the default pathway, individual company commitments to clean energy will remain aspirational rather than structural.
The AI Bubble Risk: What Happens If Demand Collapses
There is a scenario that energy analysts and infrastructure investors are quietly beginning to take seriously: what if the AI demand curve doesn’t follow the projections? The buildout of data center capacity is happening at extraordinary speed, driven by competitive pressure and investor euphoria rather than proven long-term demand. If generative AI revenue growth plateaus, if efficiency breakthroughs dramatically reduce the compute required per query, or if public or regulatory backlash slows adoption, the industry could find itself with massive stranded assets — half-built hyperscale campuses, commissioned gas plants with no customers, and grid infrastructure paid for by ratepayers that never delivers its promised return. The environmental damage from the buildout would already be done. The economic case for having rushed it would have collapsed. It is not a fringe concern. It is a real and underpriced risk embedded in the current trajectory of AI infrastructure investment.
Is Quitting the Answer, or Is Smarter Growth?
Stepping away from AI and digital infrastructure entirely isn’t a realistic or even desirable solution — and framing the question that way lets the industry off the hook by making the alternative seem absurd. The actual choice in front of us is between unmanaged growth that externalizes its environmental costs onto communities, water systems, and the climate, and deliberately structured growth that internalizes those costs and takes responsibility for them. One path treats environmental impact as someone else’s problem. The other treats it as a design constraint — non-negotiable from the start.
Sustainability advocates have the most leverage not by opposing data centers categorically, but by demanding specific, measurable standards: mandatory water disclosure, renewable energy procurement that is additional and local rather than offset-based, genuine community benefit agreements, and regulatory frameworks that make demand flexibility the default rather than the exception. The technology to build a data center industry that doesn’t devour the planet exists. What it needs is the organized pressure to make that the only acceptable way to build.
Frequently Asked Questions
Here are the most common questions people ask when they start digging into the environmental footprint of data centers and AI infrastructure.
How much electricity do data centers use compared to other industries?
Global data center electricity consumption reached roughly 460 terawatt-hours in 2022, and the figure is rising sharply. In 2023, U.S. data centers alone consumed 176 terawatt-hours — roughly equivalent to the entire annual energy consumption of Ireland.
By 2030, according to the International Energy Agency, U.S. data centers are projected to consume more electricity than all of the country’s heavy industries combined, including cement, steel, chemical manufacturing, and automotive production. Roughly half of that projected demand will come from AI workloads specifically.
To put individual facility scale in perspective: a conventional cloud storage data center draws electricity equivalent to 10,000 to 25,000 households. A modern AI-focused hyperscale facility can exceed 100,000 household equivalents — and some planned facilities will require more than two gigawatts of continuous power, comparable to the electricity demand of a major American city.
Why do data centers use so much water?
Data centers use water primarily for cooling. The dense server racks inside these facilities generate enormous amounts of heat, and the most cost-effective way to remove that heat at scale is through evaporative cooling towers, which work by evaporating water into the atmosphere. AI-optimized servers run significantly hotter than standard servers, which means AI data centers consume more water per unit of computation than conventional facilities. The problem is compounded by the fact that roughly half of data center electricity comes from fossil fuel power plants, which also consume large volumes of water for cooling and steam generation — a water cost that rarely appears in tech company sustainability reports but is very much part of the total environmental footprint.
Are any major tech companies powering data centers with renewable energy?
Several major tech companies — including Google, Microsoft, and Amazon — have made high-profile commitments to match their data center electricity consumption with renewable energy purchases. Google has targeted 24/7 carbon-free energy matching, meaning clean energy available in the same grid region at the same hour consumption occurs, rather than simply buying annual renewable energy credits that may not reflect real-time grid conditions. Progress has been made, but the pace of AI-driven demand growth has outrun clean energy deployment in many regions, and the gap is currently being filled with fossil fuel generation. The commitments are real; the gap between commitment and current reality is also real, and closing it requires both faster renewable deployment and meaningful demand-side flexibility from the data centers themselves.
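The difference between annual matching and 24/7 hourly matching is easiest to see with a toy example. The hourly profiles below are invented; the point is only that a purchase portfolio can cover 100 percent of consumption on an annual basis while leaving many individual hours running on whatever the local grid supplies.

```python
# Toy comparison of annual renewable matching vs. 24/7 hourly matching.
# Both profiles are invented for illustration (MWh per hour over one day).

consumption = [100] * 24
clean_supply = [0] * 6 + [80, 150, 220, 260, 280, 300,
                          300, 280, 260, 220, 150, 80] + [0] * 6

annual_match = min(1.0, sum(clean_supply) / sum(consumption))
hourly_match = sum(min(c, s) for c, s in zip(consumption, clean_supply)) / sum(consumption)

print(f"Annual matching:      {annual_match:.0%}")   # 100% on paper
print(f"24/7 hourly matching: {hourly_match:.0%}")   # ~48% of consumption covered in real time
```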
What is demand shifting and how does it reduce data center emissions?
Demand shifting is the practice of scheduling flexible, non-time-sensitive computing workloads — like AI model training, batch data processing, and software updates — during periods when the grid has excess clean energy available and overall demand is low. Because renewable sources like wind and solar generate power intermittently, there are predictable windows when clean electricity is abundant and cheap. Data centers that can shift their heavy workloads into those windows consume a higher proportion of clean energy without requiring additional renewable generation capacity to be built. Research from Duke University and others has shown that if large flexible loads like data centers participate in demand response programs at scale, the grid-level emissions benefits can be substantial — effectively making the same computing work cleaner without changing the underlying technology at all.
How does AI growth affect local communities near data centers?
Communities that host data centers often receive less economic benefit than promised while absorbing real and ongoing costs. The permanent employment generated by hyperscale facilities is remarkably low relative to their size and energy footprint — often just 30 to 50 workers for a facility drawing two gigawatts of power. Meanwhile, the grid infrastructure required to serve these facilities is typically paid for through utility rate increases that affect all customers, including residential ratepayers who receive no direct benefit from the data center’s operation.
The air quality dimension is particularly significant and underreported. Fossil fuel power plants brought online or kept running specifically to serve data center demand emit nitrogen oxides and fine particulate matter that concentrate in nearby communities. These communities are disproportionately lower-income and disproportionately communities of color — a pattern that places the data center energy boom squarely within the environmental justice conversation, not just the energy policy one.
Water stress is the third major community-level impact. In regions like Phoenix, which sits in a desert and draws from the overtaxed Colorado River basin, the addition of major industrial water consumers competes directly with residential, agricultural, and ecological water needs. Local communities were rarely given meaningful input into whether these trade-offs were acceptable before the facilities were approved and built — a governance failure that sustainability advocates are increasingly calling out as a structural problem requiring systematic reform, not just better corporate behavior on a case-by-case basis.
If you want to understand and reduce your organization’s digital carbon footprint with real data rather than estimates, GreenPower Monitor provides the infrastructure monitoring and energy intelligence tools that make that possible.


