For all the sleek minimalism of the cloud, the truth is it’s anything but light and airy. Every “AI-powered” email, every cat video, every model training run lives somewhere — and that somewhere is a data centre, roaring 24/7 with air conditioning, fans, and endless racks of servers devouring electricity like it’s an all-you-can-eat buffet.
Globally, data centres already account for around 2–3% of global electricity consumption — a footprint roughly comparable to the entire aviation industry’s — and that number is rising fast. The explosion of AI has supercharged the problem. Training a single large language model can emit as much carbon as five cars over their entire lifetimes. Add to that the water needed to cool servers (hundreds of thousands of litres per day in some hyperscale facilities), and you begin to see the darker side of our digital convenience.
The issue isn’t just power; it’s concentration. Centralised data centres cluster in regions where energy is cheap and regulations are lenient. That often means fossil-fuel-heavy grids or delicate ecosystems suddenly saddled with industrial-scale cooling systems. It’s the equivalent of paving paradise to host your Slack backups.
So, what’s the alternative?
The Case for Decentralisation
Imagine if, instead of mega-centres in a handful of locations, we distributed computing power across a network of smaller, localised nodes — each optimised for renewable energy, regional cooling, and load balancing.
This isn’t sci-fi. Edge computing and decentralised architectures already exist; we just haven’t scaled them for sustainability. By processing data closer to where it’s generated — say, a wind farm, a solar-powered office park, or a local telecom exchange — we can reduce both latency and long-distance energy waste. It’s like moving from industrial-era power plants to rooftop solar panels: smaller, smarter, and more resilient.
There’s also a resilience argument. A decentralised web of data nodes would be less vulnerable to outages, cyber-attacks, and geopolitical risk. Plus, it lets us match local compute demand with local renewable supply. Picture a network that automatically shifts heavy AI training loads to regions enjoying peak solar or hydro generation. Clean energy meets clean data.
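The load-shifting idea above can be sketched in a few lines. This is a minimal illustration, not a real scheduler: the region names, carbon-intensity figures, and the `pick_region` helper are all hypothetical, and a production system would pull live grid data rather than hard-coded numbers.

```python
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    carbon_intensity: float  # grams CO2e per kWh — illustrative figures only
    free_capacity: float     # fraction of compute capacity currently idle

def pick_region(regions: list[Region], min_capacity: float = 0.2) -> Region:
    """Route a deferrable workload (e.g. a training run) to the cleanest
    region that still has enough spare capacity to absorb it."""
    candidates = [r for r in regions if r.free_capacity >= min_capacity]
    if not candidates:
        raise RuntimeError("no region has enough spare capacity")
    # Prefer the lowest-carbon grid among viable candidates
    return min(candidates, key=lambda r: r.carbon_intensity)

regions = [
    Region("north-eu-hydro", carbon_intensity=30.0, free_capacity=0.5),
    Region("us-west-solar", carbon_intensity=80.0, free_capacity=0.1),
    Region("central-coal", carbon_intensity=700.0, free_capacity=0.9),
]

print(pick_region(regions).name)  # cleanest grid with spare capacity
```

Here the solar region is skipped despite its clean grid because it lacks headroom, and the coal-heavy region is passed over despite ample capacity — the hydro node wins on both counts. Real deployments would also weigh data-transfer cost and latency, but the core trade-off is this simple.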
A Planet-First Data Future
None of this will happen by accident. It requires rethinking how cloud providers, governments, and enterprises design their digital infrastructure. We’ll need transparent carbon reporting, incentives for distributed compute, and serious investment in edge hardware that’s both secure and efficient.
But the opportunity is huge. The same technologies that made the cloud possible — containerisation, orchestration, and agentic automation — could now make it sustainable. The next frontier isn’t just faster models or bigger GPUs; it’s a planet-aware AI ecosystem that learns to use energy intelligently, not indiscriminately.
The future of computing doesn’t have to be a climate casualty. It could be the world’s smartest clean-energy grid — if we decentralise, localise, and design it that way.