Analysis

AI datacentres, clean power, and the new energy bottleneck

Apr 26, 2026 · 8 min read · Conservation

The miscalculation that woke up Whitehall

The UK’s bid to be an “AI superpower” just collided with the physics of electricity. Revised government data indicate officials underestimated the climate impact of AI datacentres by more than 100-fold, with emissions from their power use potentially reaching 123 million tonnes of CO₂ over the next decade. That is roughly a third of the UK’s remaining carbon budget for the 2030s if the country stays on its net‑zero trajectory. Compounding the alarm, UK departments are reportedly at odds over how much power these facilities will actually need, creating a planning vacuum at exactly the wrong time.

This is not a uniquely British blind spot. Across advanced economies, digital growth was long assumed to be close to energy‑neutral, thanks to relentless efficiency gains. AI has broken that pattern. Hyperscale training clusters draw hundreds of megawatts apiece; rack power densities that lingered below 10 kW a few years ago now routinely exceed 50–100 kW, with designs targeting 150–200 kW as liquid cooling becomes standard. The upshot: demand is rising faster than grids can accommodate, and the carbon math no longer balances if new load is backstopped by fossil generation.
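
To make that arithmetic concrete, here is a minimal back‑of‑envelope sketch in Python. Every figure in it (rack count, density, PUE) is an assumption chosen for illustration, not data from the article:

```python
# Back-of-envelope: rack density -> facility-level demand.
# Every figure below is an assumption for illustration, not sourced data.

racks = 1_000          # racks in one AI training hall (assumed)
kw_per_rack = 100      # modern liquid-cooled rack density in kW (assumed)
pue = 1.2              # power usage effectiveness: total / IT energy (assumed)

it_load_mw = racks * kw_per_rack / 1_000       # IT load in MW
facility_mw = it_load_mw * pue                 # grid draw incl. cooling, losses
annual_twh = facility_mw * 8_760 / 1_000_000   # if run flat-out all year

print(f"IT load: {it_load_mw:.0f} MW; facility draw: {facility_mw:.0f} MW")
print(f"Annual consumption at full load: {annual_twh:.2f} TWh")
```

At these assumed figures, a single hall draws 120 MW continuously, roughly a terawatt‑hour a year, before a second hall is even built.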

Demand is outrunning the plan

Global data centre electricity use was about 460 TWh in 2022 and could rise to 620–1,050 TWh by 2026, according to the International Energy Agency. AI is a major driver: inference—the everyday running of models, not just training—could by itself consume tens of terawatt‑hours annually by the mid‑2020s. These headline numbers translate into very local problems:

  • Grid constraints: West London’s connection moratorium in recent years signalled how quickly urban networks can run out of headroom. Similar constraints are now appearing around major US data centre hubs.
  • Interconnection backlogs: In the US, more than 2.5 terawatts of generation and storage sit in interconnection queues, with median wait times exceeding five years, per Lawrence Berkeley National Laboratory. Data centres can be built in 18–30 months. Clean projects that would decarbonise their load often cannot.
  • Resource strain: Water‑cooled facilities can consume significant freshwater; studies have estimated that a single large training run can use hundreds of thousands of litres, inflaming local opposition during droughts.

No surprise, then, that communities from Northern Virginia to Arizona are pushing back on new data centre proposals, citing bill impacts, land use, noise, and environmental equity. The social licence to operate is weakening just as developers need it most.

The net‑zero ledger no longer balances by default

For a decade, big tech’s answer to rising electricity use was to sign large corporate power purchase agreements (PPAs) for wind and solar and to tout annual “100% renewable” claims. That helped build gigawatts of clean capacity. But two realities undermine the approach in an AI era:

  • Time matters: Annual matching can still leave data centres running on fossil electricity at night or during winter lulls. Because AI clusters operate 24/7, the marginal emissions of their consumption can be high even if annual megawatt‑hours are “covered” by certificates (the sketch after this list makes the gap concrete).
  • Location matters: New AI demand is often sited where land, fibre, and tax incentives are favourable—not where the grid is clean or robust. If local capacity is met with gas peakers or delayed transmission, emissions rise and targets slip.
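
To see why the hourly distinction matters, here is a minimal sketch with fully synthetic profiles: a flat 100 MW load and a solar‑only PPA sized to match it on an annual basis. Both profiles are assumptions, not real site data:

```python
# Annual matching vs hourly (24/7 CFE) matching on synthetic profiles.
# All profiles are illustrative assumptions, not real site data.

hours = 8_760
load = [100.0] * hours                      # flat 100 MW AI cluster (assumed)

# Solar-only PPA: generates during 8 daylight hours, sized so annual
# MWh equal annual load MWh -- the classic "100% renewable" claim.
solar = [0.0] * hours
for h in range(hours):
    if 8 <= h % 24 < 16:                    # crude daylight window (assumed)
        solar[h] = 100.0 * 24 / 8           # 300 MW for 8 h = 100 MW average

annual_match = sum(solar) / sum(load)       # = 1.0, i.e. "100% matched"

# Hourly CFE score: clean energy actually consumed each hour, capped at load.
hourly_matched = sum(min(load[h], solar[h]) for h in range(hours))
cfe_score = hourly_matched / sum(load)

print(f"Annual matching: {annual_match:.0%}")   # 100%
print(f"Hourly 24/7 CFE: {cfe_score:.0%}")      # ~33%: nights run on the grid
```

The contract is “100% renewable” on paper, yet two thirds of the cluster’s consumption falls in hours the PPA does not cover and is served by whatever the local grid runs at night.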

The UK’s revised 123 Mt CO₂ estimate crystallises the stakes: if uncorrected, digital growth could consume a material slice of the national carbon budget. The same tension is visible in US utility plans that add or retain gas to meet fast‑rising data centre load while renewable and transmission projects queue for years.

Policy friction cuts both ways

The good news is policy can move quickly. A recent US federal court order blocking enforcement of an Interior Department memo that slowed solar and storage permitting was hailed by the industry as an immediate accelerant. Interconnection reform and long‑term transmission planning rules adopted by federal regulators in 2023–2024 are also designed to unclog queues and expand the grid backbone. These are not silver bullets, but they show how targeted regulatory action can free up the clean capacity that AI loads need to stay compatible with climate goals.

Yet the flipside is equally true: fragmented governance—like UK departments disagreeing on demand forecasts—can stall decisions on siting, reinforcements, and market design. The speed mismatch between data centre buildouts and public infrastructure upgrades is now a first‑order climate risk.

Five fixes to keep AI growth compatible with decarbonisation

  1. Forecasting and accountability that match AI’s pace
  • Publish transparent national and regional demand scenarios that explicitly include AI clusters, with independent review and quarterly updates.
  • Require major data centre developers to disclose expected load profiles, cooling water sources, and resilience plans as a condition of planning approval.
  • Align carbon accounting with reality: use location‑ and time‑based emissions factors rather than annual averages. Commitments should target 24/7 carbon‑free energy, not just annual renewable energy certificates (RECs).
  2. Build clean power—and the wires—faster
  • Treat transmission like national critical infrastructure. Fast‑track priority corridors and adopt anticipatory investment so that lines are built ahead of load and renewables, not years after.
  • Streamline permitting for renewables and storage, as the recent US court action effectively did, while embedding biodiversity and community benefits from the outset to sustain support.
  • Reform interconnection around “first‑ready, first‑served” queues, cluster studies, and firm deadlines—mirroring ongoing US reforms—so gigawatts of wind, solar, and storage can connect in time to serve AI demand.
  3. Put data centres where clean power is—or make it so
  • Encourage co‑location in regions with abundant low‑carbon power (Nordic hydro, Scottish wind, US Pacific Northwest hydro), pairing with new transmission and fibre routes.
  • Where siting near load is essential (e.g., latency‑sensitive inference), condition approvals on developers funding local grid reinforcements and procuring firmed clean supply (e.g., solar plus multi‑hour storage or geothermal) tied to their metered demand.
  4. Make AI loads flexible and efficient by design
  • Flex the flexible: Training jobs are often schedulable. Include demand response commitments in permits so clusters reduce load during grid stress and ramp when wind/solar are abundant (see the scheduling sketch after this list).
  • Raise the efficiency bar: Tighten standards for power usage effectiveness (PUE, the ratio of total facility energy to IT equipment energy) and report real‑time performance. Promote liquid cooling that enables higher compute per watt while minimising water use via closed‑loop systems.
  • Optimise the stack: As MIT Technology Review notes, AI only delivers business value atop a strong data fabric. Better data pipelines reduce duplicated computation, failed runs, and data movement—cutting both cost and energy. Efficiency isn’t just a hardware story; it’s an enterprise architecture story.
  5. Align markets and incentives with net‑zero outcomes
  • Move from annual matching to 24/7 carbon‑free energy procurement. Enable granular certificates and long‑duration storage so buyers can actually procure clean supply in every hour.
  • Create tariffs that reward flexibility and penalise high‑emissions peak consumption. Consider “clean capacity” obligations for very large loads, met by contracted storage or firm low‑carbon power.
  • Monetise waste heat. Northern European cities already pipe data centre heat into district networks; replicate this with clear offtake standards and investment support where urban density allows.
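
To ground the “flex the flexible” point above, here is a minimal scheduling sketch. The 24‑hour intensity curve and the best_window helper are illustrative assumptions, not a real grid API; in practice the forecast would come from a system operator or a carbon‑intensity service:

```python
# Carbon-aware scheduling: place a deferrable training job in the
# lowest-carbon contiguous window. Intensities are illustrative only.

def best_window(carbon_gco2_per_kwh: list[float], job_hours: int) -> int:
    """Return the start hour minimising total emissions for the job."""
    best_start, best_total = 0, float("inf")
    for start in range(len(carbon_gco2_per_kwh) - job_hours + 1):
        total = sum(carbon_gco2_per_kwh[start:start + job_hours])
        if total < best_total:
            best_start, best_total = start, total
    return best_start

# Assumed 24-hour grid intensity curve (gCO2/kWh): dirty evening peak,
# cleaner midday when solar and wind are abundant.
intensity = [300, 280, 260, 250, 240, 250, 280, 320, 300, 220, 150, 110,
             100, 105, 130, 180, 260, 380, 420, 400, 360, 340, 320, 310]

start = best_window(intensity, job_hours=6)
print(f"Schedule the 6-hour job at hour {start}: "
      f"avg {sum(intensity[start:start + 6]) / 6:.0f} gCO2/kWh")
```

The same search generalises to multi‑day windows, and checkpointed jobs can additionally pause when intensity spikes mid‑run.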

A UK‑specific playbook

  • One set of numbers, one plan: DESNZ, DSIT, Ofgem, and National Grid should publish a joint AI load and emissions outlook tied to the Carbon Budget pathway, updated at least annually. Treat major AI campuses as “Strategic Energy Sites” with fast‑tracked but conditional approval pathways.
  • Accelerate the Great Grid Upgrade: Pull forward transmission investment to Scotland and the east coast to move offshore wind south; use anticipatory investment powers so connections can be offered in time for 2028–2032 AI capacity.
  • Condition approvals on 24/7 clean supply: For clusters exceeding, say, 50 MW, require developers to secure time‑matched clean PPAs plus storage or demand response equal to a percentage of peak draw (a sizing sketch follows this list). Offer lower network charges for higher flexibility commitments.
  • Water and community safeguards: Mandate non‑potable water use where feasible, publish monthly water intensity metrics, and establish local benefit funds and noise/traffic mitigations to maintain social licence.
  • Data transparency: Require hourly, site‑level carbon intensity reporting under Streamlined Energy and Carbon Reporting, with independent audit, to avoid repeats of the 100× miscalculation.
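
As a rough sketch of how the 24/7 supply condition above could be evaluated: the 30% flexibility share below is assumed (the playbook leaves the percentage open), the load, PPA, and storage figures are synthetic, and check_condition is a hypothetical helper, not an existing tool:

```python
# Sketch of the proposed >50 MW approval condition: a time-matched clean
# PPA plus storage/demand response sized at a share of peak draw. The
# 30% share and all profiles are illustrative assumptions, not policy.

def check_condition(load_mw, ppa_mw, firm_mw, flex_share=0.3):
    """Return (hourly CFE score, firm obligation in MW, pass/fail)."""
    matched = sum(min(l, p) for l, p in zip(load_mw, ppa_mw))
    cfe = matched / sum(load_mw)
    obligation = flex_share * max(load_mw)
    return cfe, obligation, firm_mw >= obligation

# Toy day: 60 MW cluster, solar-heavy PPA, 15 MW contracted battery.
load = [60.0] * 24
ppa = [0, 0, 0, 0, 0, 5, 20, 40, 55, 70, 80, 85,
       85, 80, 70, 55, 40, 20, 5, 0, 0, 0, 0, 0]

cfe, obligation, ok = check_condition(load, ppa, firm_mw=15.0)
print(f"Hourly CFE: {cfe:.0%}; obligation: {obligation:.0f} MW; pass: {ok}")
```

A regulator could run the same check on declared hourly profiles at approval, then again against audited metered data once the site is operating.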

The bottom line

AI is not inherently at odds with net zero, but the status quo is. The UK’s emissions miscount is a warning that planning assumptions built for the cloud era do not fit the AI era. With credible forecasting, faster clean energy build‑out, smarter siting, flexible operations, and carbon‑honest procurement, governments can keep digital growth aligned with climate goals. Without those changes, the new bottleneck will not be model weights or GPUs—it will be megawatts and the carbon budget they quietly consume.