Will the AI boom fuel a global energy crisis?
AI’s thirst for energy is ballooning into a monster of a challenge. And it’s not just about the electricity bills. The environmental fallout is serious, stretching from guzzling precious water resources to creating mountains of electronic waste and, yes, adding to the greenhouse gas emissions we’re all trying to cut. As AI models grow ever more complex and weave themselves into more parts of our lives, a massive question mark hangs in the air: can we power this revolution without costing the Earth?

The numbers don’t lie: AI’s energy demand is escalating fast

The sheer computing power needed for the smartest AI out there is on an almost unbelievable upward curve – some say it’s doubling roughly every few months. This isn’t a gentle slope; it’s a vertical climb that threatens to leave even our most optimistic energy plans in the dust. To give you a sense of scale, AI’s future energy needs could soon gulp down as much electricity as entire countries like Japan or the Netherlands, or large US states like California. When you hear stats like that, you start to see the squeeze AI could put on the power grids we all rely on.

2024 saw a record 4.3% surge in global electricity demand, and AI’s expansion was a big reason why, alongside the boom in electric cars and factories working harder. Wind back to 2022, and data centres, AI, and cryptocurrency mining were already accounting for nearly 2% of all the electricity used worldwide – about 460 terawatt-hours (TWh). Jump to 2024, and data centres on their own used around 415 TWh, roughly 1.5% of the global total, and growing at 12% a year. AI’s direct share of that slice is still relatively small – about 20 TWh, or 0.02% of global energy use – but hold onto your hats, because that number is set to rocket upwards.

The forecasts? Well, they’re pretty eye-opening. By the end of 2025, AI data centres around the world could demand an extra 10 gigawatts (GW) of power – more than the entire power capacity of a state like Utah. Roll on to 2026, and global data centre electricity use could hit 1,000 TWh – similar to what Japan uses today. And by 2027, the global power hunger of AI data centres is tipped to reach 68 GW, almost as much as California’s total power capacity in 2022.

Towards the end of this decade, the figures get even more jaw-dropping. Global data centre electricity consumption is predicted to double to around 945 TWh by 2030, just shy of 3% of all the electricity used on the planet. OPEC reckons data centre electricity use could even triple to 1,500 TWh by then. And Goldman Sachs? They say global power demand from data centres could leap by as much as 165% compared with 2023, with data centres specifically kitted out for AI seeing their demand more than quadruple. There are even suggestions that data centres could account for up to 21% of all global energy demand by 2030 once you count the energy it takes to deliver AI services to us, the users.

When we talk about AI’s energy use, it mainly splits into two big chunks: training the AI, and then actually using it. Training enormous models, like GPT-4, takes a colossal amount of energy. Training GPT-3 alone is estimated to have used 1,287 megawatt-hours (MWh) of electricity, and GPT-4 is thought to have needed a whopping 50 times more. And while training is a power hog, it’s the day-to-day running of these trained models – inference – that can chew through over 80% of AI’s total energy.
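To see what those training and inference figures imply, here’s a minimal back-of-envelope sketch in Python. It leans only on the estimates quoted above – GPT-3’s roughly 1,287 MWh, the “50 times more” figure for GPT-4, and the claim that inference eats over 80% of AI’s total energy – so treat the outputs as illustrative arithmetic, not measured data.

```python
# Back-of-envelope arithmetic using only the estimates quoted in the text.
# Assumptions (estimates, not measured data):
#   - GPT-3 training: ~1,287 MWh
#   - GPT-4 training: roughly 50x GPT-3
#   - Inference (day-to-day use) accounts for over 80% of AI's total energy

GPT3_TRAINING_MWH = 1_287
GPT4_MULTIPLIER = 50
INFERENCE_SHARE = 0.80  # lower bound implied by "over 80%"

# Implied GPT-4 training energy, in MWh and GWh
gpt4_training_mwh = GPT3_TRAINING_MWH * GPT4_MULTIPLIER
print(f"Implied GPT-4 training energy: {gpt4_training_mwh:,} MWh "
      f"(~{gpt4_training_mwh / 1_000:.1f} GWh)")

# If inference is at least 80% of lifetime energy, training is at most 20%,
# so lifetime inference energy is at least 4x the training cost.
inference_to_training_ratio = INFERENCE_SHARE / (1 - INFERENCE_SHARE)
print(f"Lifetime inference energy is at least "
      f"{inference_to_training_ratio:.0f}x the training energy")
```

On those assumptions, GPT-4’s training alone would come to roughly 64 GWh, and a model’s lifetime inference bill would be at least four times whatever its training cost – which is why the scramble to serve generative AI at scale matters so much for the grid.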
It’s reported that asking ChatGPT a single question uses about ten times more energy than a Google search (roughly 2.9 Wh versus 0.3 Wh). With everyone jumping on the generative AI bandwagon, the race is on to build ever more powerful – and therefore more energy-hungry – data centres.

So, can we supply energy for AI – and for ourselves?

This is the million-dollar question, isn’t it? Can our planet’s energy systems cope with this new demand? We’re already juggling a mix of fossil fuels, nuclear power, and renewables. If we’re going to feed AI’s growing appetite sustainably, we need to ramp up and diversify how we generate energy, and fast.

Naturally, renewable energy – solar, wind, hydro, geothermal – is a huge piece of the puzzle. In the US, for instance, renewables are set to grow from 23% of power generation in 2024 to 27% by 2026. The tech giants are making some big promises: Microsoft, for example, plans to buy 10.5 GW of renewable energy between 2026 and 2030 just for its data centres. AI itself could help us use renewable energy more efficiently – perhaps cutting energy use by up to 60% in some areas – by making energy storage smarter and managing power grids better.

But let’s not get carried away. Renewables have their own headaches. The sun doesn’t always shine and the wind doesn’t always blow, which is a real problem for data centres that need power around the clock, every single day. The batteries we have now to smooth out those bumps are often expensive and take up a lot of room. And plugging massive new renewable projects into our existing power grids can be a slow, complicated business.

This is where nuclear power is starting to look more appealing to some, as a steady, low-carbon way to meet AI’s massive energy needs. It delivers that crucial 24/7 power, which is exactly what data centres crave. There’s a lot of buzz around Small Modular Reactors (SMRs) too, because they’re potentially more flexible and come with beefed-up safety features. And it’s not just talk: big names like Microsoft, Amazon, and Google are seriously looking into nuclear options. Matt Garman, who heads up AWS, recently put it