Green Data Centers: Liquid Cooling, Nuclear Power, and the Race to Make AI Sustainable
Data centers now consume roughly 2% of global electricity, more than many entire countries, and that figure is projected to double by 2030 as AI workloads, cloud computing, and digital services continue their rapid growth. The environmental footprint of the digital economy has become impossible to ignore: data centers' carbon emissions are approaching those of the global aviation industry, and their cooling systems draw hundreds of billions of gallons of water annually from municipal supplies, sometimes in drought-prone regions. In response, the industry is radically transforming how data centers are designed, powered, cooled, and operated, a transformation driven by both environmental conscience and hard economics.
The Energy Problem in Numbers
Global data center electricity consumption reached approximately 460 TWh in 2025 — roughly equivalent to the entire electricity consumption of France. The International Energy Agency (IEA) projects this will reach 1,000 TWh by 2030, driven primarily by AI workloads that are far more compute-intensive than traditional cloud applications. Training a single large AI model consumes as much electricity as 100 US homes use in a year. Running inference (serving AI responses to users) at the scale of ChatGPT or Google Search with AI Overviews draws continuous power measured in megawatts.
The major cloud and tech companies — Google, Microsoft, Amazon, Meta, and Apple — have all pledged carbon neutrality or net-zero emissions, but the explosive growth of AI has made achieving these goals significantly more challenging. Google’s total carbon emissions increased 48% between 2019 and 2025, driven almost entirely by data center expansion for AI. Microsoft’s emissions increased 30% over the same period. These increases occurred despite massive investments in renewable energy and efficiency — the efficiency gains simply can’t keep pace with the growth in compute demand.
Water consumption is a less publicized but equally significant environmental impact. Traditional data center cooling systems use evaporative cooling towers that consume 1.5-2.5 liters of water per kWh of electricity consumed. A typical hyperscale data center consuming 50 MW of electricity uses on the order of 100 million gallons of water per year. In water-stressed regions such as the American Southwest, where major data center clusters exist in Phoenix and Las Vegas, and in smaller communities such as The Dalles, Oregon, this consumption creates tension between the tech industry's water needs and community water supplies.
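The back-of-envelope arithmetic behind figures like these is simple, though the result depends heavily on how many hours evaporative cooling actually runs and how much water is recycled. A minimal sketch, using the 1.5-2.5 L/kWh range from the text (the `cooling_fraction` parameter is an assumption standing in for those operational factors):

```python
# Rough water-use estimate for evaporative data center cooling.
# liters_per_kwh comes from the 1.5-2.5 L/kWh range cited in the text;
# cooling_fraction is an assumed duty factor for evaporative operation.
LITERS_PER_GALLON = 3.785

def annual_water_gallons(it_load_mw: float, liters_per_kwh: float,
                         cooling_fraction: float = 1.0) -> float:
    """Gallons of water per year for a facility drawing it_load_mw continuously."""
    kwh_per_year = it_load_mw * 1000 * 8760 * cooling_fraction
    return kwh_per_year * liters_per_kwh / LITERS_PER_GALLON

# 50 MW facility, low end of the range, evaporative cooling active
# roughly 60% of the time (assumed):
print(f"{annual_water_gallons(50, 1.5, 0.6):,.0f} gallons/year")
```

With those assumptions the estimate lands near the ~100 million gallons per year cited above; running the towers year-round at the high end of the range would more than double it.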
Liquid Cooling: The Industry’s Most Important Technical Shift
The most impactful technology change in data center cooling is the transition from air cooling to liquid cooling, driven by the heat density of modern AI processors. An NVIDIA H100 GPU generates up to 700 watts of heat — roughly the same as a small space heater — and data centers pack thousands of these chips into racks. Traditional air cooling struggles to dissipate this much heat: it requires enormous volumes of cold air, massive HVAC systems, and results in hot spots that limit how densely equipment can be packed.
Direct liquid cooling (DLC) brings coolant into direct or near-direct contact with the heat-generating components. Cold plate cooling attaches liquid-filled cold plates directly to CPU and GPU surfaces, transferring heat to liquid that's circulated to external heat exchangers. This approach is 3-5x more efficient than air cooling because water carries roughly 3,500 times more heat per unit volume than air. Major server vendors (Dell, HPE, Lenovo, Supermicro) now offer liquid-cooled server configurations as standard options, and both NVIDIA and AMD design their latest GPU modules with liquid cooling integrated from the factory.
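That volumetric advantage is easy to see in the basic heat-transfer relation Q = ṁ·cp·ΔT: the coolant flow needed to remove a chip's heat at a given temperature rise. A short sketch using rough room-temperature property values (real cold plates typically circulate water-glycol mixes, so these numbers are illustrative):

```python
# Volumetric coolant flow needed to carry away 700 W (one H100-class GPU)
# at a 10 K coolant temperature rise. Property values are rough
# room-temperature figures for dry air and plain water.

def flow_l_per_s(heat_w: float, rho_kg_m3: float, cp_j_kgk: float,
                 delta_t_k: float = 10.0) -> float:
    """Volumetric flow (liters/second) to absorb heat_w at delta_t_k rise."""
    mass_flow = heat_w / (cp_j_kgk * delta_t_k)   # kg/s, from Q = m*cp*dT
    return mass_flow / rho_kg_m3 * 1000           # convert m^3/s to L/s

air = flow_l_per_s(700, rho_kg_m3=1.2, cp_j_kgk=1005)
water = flow_l_per_s(700, rho_kg_m3=1000, cp_j_kgk=4186)
print(f"air:   {air:.1f} L/s")        # tens of liters of air every second
print(f"water: {water * 60:.2f} L/min")  # about a liter of water per minute
print(f"ratio: {air / water:,.0f}x less volume with water")
```

Moving roughly 58 liters of air per second past every GPU is what forces the enormous fans and HVAC plants of air-cooled halls; a cold plate does the same job with about a liter of water per minute.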
Immersion cooling goes further, submerging entire servers in tanks of non-conductive dielectric fluid (engineered coolants that don’t conduct electricity and won’t damage electronics). The fluid absorbs heat directly from all components simultaneously — not just the main processors but also memory, storage, network cards, and power supplies. Two-phase immersion cooling, where the fluid boils into gas upon absorbing heat and condenses in a recovery system, provides even higher cooling efficiency. Companies like GRC (Green Revolution Cooling), LiquidCool Solutions, and Asetek provide immersion cooling systems that are commercially deployed in data centers handling AI workloads.
The efficiency gains from liquid cooling are substantial. Air-cooled data centers typically operate at Power Usage Effectiveness (PUE) ratings of 1.3-1.6, meaning cooling and other overhead add 30-60% on top of the electricity that reaches the computing equipment. Liquid-cooled facilities achieve PUE ratings of 1.03-1.1 — effectively using almost all electricity for actual computing. For a data center with 100 MW of IT load, reducing PUE from 1.4 to 1.05 saves 35 MW of electricity — enough to power roughly 25,000 homes — and eliminates the water consumption associated with evaporative cooling.
Renewable Energy and Power Purchase Agreements
The major cloud providers are collectively the world's largest corporate buyers of renewable energy, having purchased over 70 GW of renewable electricity capacity through long-term power purchase agreements (PPAs). Google claims to match 100% of its global electricity consumption with renewable energy purchases (on an annual basis); Microsoft has committed to 100% carbon-free electricity by 2030 (on a 24/7 hour-matched basis); Amazon Web Services is the world's largest single corporate purchaser of renewable energy by capacity.
The distinction between annual matching and 24/7 matching is important. Annual matching means a company purchases enough total renewable energy over a year to equal its total electricity consumption — but at any given hour, the company may be consuming grid electricity generated by fossil fuels while its renewable energy is being produced elsewhere or at a different time. A data center running at 3 AM when solar panels aren’t generating and wind isn’t blowing is running on whatever the grid provides, which in many locations is natural gas or coal.
24/7 carbon-free energy (CFE) is a more ambitious standard that requires matching electricity consumption with carbon-free generation in every hour of every day, in the same grid region. Google reports that its best data center locations (Nordic countries with abundant hydropower and wind) achieve 97% hourly CFE matching, while its worst locations achieve only 30-40%. Microsoft’s commitment to 100% 24/7 CFE by 2030 requires not just purchasing renewable energy but investing in grid-scale energy storage (batteries, hydrogen) that can supply carbon-free electricity during periods when wind and solar aren’t generating.
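The gap between the two accounting standards is easiest to see on toy numbers. Hourly matching only credits carbon-free supply up to each hour's demand, so a midday solar surplus cannot offset a 3 AM deficit. A minimal sketch (metric names and the synthetic day of data are illustrative, not any provider's actual methodology):

```python
# Annual vs hourly (24/7) carbon-free energy matching on toy hourly data.
# Hourly matching credits min(demand, cfe_supply) in each hour; surplus
# in one hour cannot cover a deficit in another.

def annual_match(demand, cfe_supply):
    """Annual matching score: total CFE bought vs total consumed, capped at 1."""
    return min(sum(cfe_supply) / sum(demand), 1.0)

def hourly_cfe(demand, cfe_supply):
    """24/7 CFE score: hour-by-hour matched energy over total demand."""
    matched = sum(min(d, s) for d, s in zip(demand, cfe_supply))
    return matched / sum(demand)

# Toy day: flat 10 MW data center demand, solar-heavy supply at midday.
demand = [10] * 24
supply = [0] * 6 + [20] * 12 + [0] * 6   # 240 MWh total, same as demand

print(annual_match(demand, supply))  # 1.0 -> "100% renewable" annually
print(hourly_cfe(demand, supply))    # 0.5 -> only half the energy hour-matched
```

The same purchase portfolio scores 100% under annual accounting but only 50% under 24/7 accounting, which is exactly why the hour-matched commitments require storage or firm carbon-free generation for the overnight hours.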
Nuclear as the AI-Scale Power Solution
The scale of electricity needed for AI data centers has renewed interest in nuclear power as a carbon-free energy source with the 24/7 reliability that wind and solar cannot provide. Microsoft signed a 20-year power purchase agreement with Constellation Energy to restart the Three Mile Island Unit 1 nuclear reactor (which shut down in 2019 for economic rather than safety reasons) specifically to power its AI data centers. Amazon purchased a nuclear-powered data center campus adjacent to the Susquehanna Nuclear Power Plant in Pennsylvania. Google signed agreements with Kairos Power for small modular reactor (SMR) deployment to power data centers.
Small modular reactors — factory-fabricated nuclear reactors with 50-300 MW capacity, compared to 1,000+ MW for conventional reactors — are particularly attractive for data center developers. Their smaller size matches the scale of individual data center campuses, their modular construction reduces capital costs and construction timelines, and their enhanced safety designs (passive cooling that works without power or operator intervention) reduce regulatory complexity. NuScale Power received the first SMR design certification from the US Nuclear Regulatory Commission in 2023, and several other SMR designs are in the approval pipeline.
The nuclear option is not without controversy. Environmental groups are divided: some support nuclear as essential for carbon-free baseload power, others oppose it due to waste storage concerns, proliferation risks, and the slower deployment timeline compared to renewable energy. The economics are uncertain — nuclear power’s historical pattern of cost overruns and construction delays makes private investment risky without government support. But for data center operators willing to commit to 20+ year power agreements, nuclear provides the scale and reliability that renewable energy plus storage cannot yet match at competitive cost.
Waste Heat Recovery
Data centers convert electricity into heat (that’s fundamentally what computing does), and that heat has traditionally been vented to the atmosphere as waste. Waste heat recovery systems capture this thermal energy and put it to productive use — district heating, industrial processes, agricultural heating, and hot water supply. In Nordic countries, where data centers are establishing a significant presence due to cold climates and cheap renewable electricity, waste heat recovery is becoming standard practice.
Meta's data center in Odense, Denmark exports waste heat to the local district heating system, providing warmth to nearby residences and businesses. Microsoft's data center in Finland supplies waste heat to the Espoo district heating network. Amazon's planned data center in Ireland will connect to a district heating system serving 47,000 homes. In each case, the waste heat replaces fossil fuel-fired boilers that would otherwise generate the same thermal energy, providing a double environmental benefit: the heat that was previously wasted now displaces carbon-emitting heating systems.
The economics of waste heat recovery are favorable in locations with district heating infrastructure (primarily Northern Europe) but challenging in locations without it (most of the US and Asia). Building district heating networks specifically to capture data center waste heat is expensive — the infrastructure cost is typically justified only for large, long-term data center installations with reliable heat output. As data centers grow larger and more permanent, the economic case for waste heat recovery strengthens, but it remains a regional rather than global solution.
Location Strategy: Following the Green Electrons
Data center location strategy has shifted from proximity to population centers (to minimize network latency) toward proximity to clean energy sources and favorable cooling conditions. Nordic countries (Sweden, Finland, Norway, Iceland) offer cold climates (reducing cooling costs), abundant hydropower and wind energy, political stability, strong data privacy laws, and district heating infrastructure for waste heat recovery. Ireland has become Europe’s largest data center market partly due to its cool maritime climate and strong renewable energy sector (42% wind power).
In the US, data center development is gravitating toward regions with cheap clean electricity: the Pacific Northwest (hydropower), Texas (wind and solar), and Virginia (the world’s largest data center market, supported by nuclear and expanding renewable capacity). Arizona and Nevada, despite their hot climates, attract data centers with cheap solar electricity and direct evaporative cooling that works efficiently in dry desert air (though water consumption is a growing concern).
Underwater data centers represent the most radical location innovation. Microsoft’s Project Natick demonstrated that sealed server pods deployed on the ocean floor benefit from natural cooling, protected environment (no oxygen means less corrosion), and proximity to subsea fiber optic cables. The concept proved technically viable during a two-year deployment in Scotland’s Orkney Islands, with lower failure rates than equivalent onshore data centers. While not yet commercially deployed at scale, the concept has influenced subsequent designs for sealed, maintenance-free edge data centers.
The Accountability Gap
Despite impressive corporate sustainability commitments, the data center industry faces a growing accountability gap between pledges and verified performance. Carbon accounting methodologies vary across companies, making direct comparisons difficult. Scope 3 emissions (from supply chain, construction, and equipment manufacturing) are often excluded from corporate carbon calculations even though they represent a significant share of total impact. Water consumption is reported inconsistently, with some companies reporting total consumption and others reporting only net consumption (after recycling).
Regulatory requirements for environmental reporting are tightening. The EU’s Corporate Sustainability Reporting Directive (CSRD) requires large companies operating in Europe to report detailed environmental metrics including data center energy consumption, PUE, water usage, and carbon emissions. The SEC’s proposed climate disclosure rules in the US (if finalized) would require similar reporting for publicly traded companies. These mandatory reporting requirements will provide standardized, comparable data that enables genuine accountability rather than selective corporate sustainability storytelling.
The fundamental challenge remains: the growing demand for AI and cloud computing is generating energy and resource requirements that are growing faster than efficiency and renewable energy can offset them. Making data centers greener is essential and the industry is investing heavily in doing so. But making data centers green enough to offset their exponential growth is a problem that technology alone may not solve — it may ultimately require social and economic decisions about how much computing is sustainable.