AI’s sustainability challenge is anything but theoretical, says Professor Bjorn Cumps, expert in Financial Services Innovation & FinTech at Vlerick Business School. “We’ve all embraced the idea that artificial intelligence offers smarter solutions, but all too often we remain blind to its hidden cost: a skyrocketing energy footprint. It’s time we start using these tools more consciously,” he says. As AI advances, so does its appetite for energy. According to Cumps, the only way to keep that growth sustainable is through more conscious usage, greener infrastructure, and smarter regulation.
Bjorn Cumps: “Certainly, because many end users consider artificial intelligence and cloud computing to be intangible. To be magic. But it’s not magic; it’s machines. And machines require energy. Every time you ask ChatGPT a question or generate an image, you’re triggering a huge amount of computing power behind the scenes. Especially with generative AI, you are not simply retrieving existing data, as with a Google search – you’re creating something new, which requires millions or even billions of calculations per prompt. Those calculations demand servers, cooling systems, and constant power.”
“The numbers are sobering. Every 100 days, the computing power required for AI doubles. Nvidia’s CEO recently predicted a 100-fold increase in demand over the next few years. And Google, which aimed to become net zero by 2030, saw its emissions increase by 50% since 2019, mainly due to AI. If you zoom out, this is not sustainable – at least, not at the pace we’re moving now.”
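The growth rate Cumps cites compounds quickly. A minimal back-of-envelope sketch of that arithmetic – the 100-day doubling period is the figure from the interview; the rest is simple compounding:

```python
# Back-of-envelope: if AI compute demand doubles every 100 days,
# how much does it grow over one and over three years?

DOUBLING_PERIOD_DAYS = 100  # figure cited in the interview


def growth_factor(days: float) -> float:
    """Compound growth factor after `days`, given one doubling per 100 days."""
    return 2 ** (days / DOUBLING_PERIOD_DAYS)


one_year = growth_factor(365)        # roughly 12.6x in a single year
three_years = growth_factor(3 * 365)  # roughly 1,980x in three years

print(f"1 year: ~{one_year:.1f}x, 3 years: ~{three_years:.0f}x")
```

At that pace even the 100-fold demand increase Nvidia’s CEO predicts would be reached in under two years, which is the sense in which the current trajectory is unsustainable.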
“It’s tricky, because we often compare apples to oranges. If you watch a 4K video, yes, that’s energy intensive. But generating that 4K video with AI in the first place costs far more. Similarly, asking ChatGPT a question consumes 10 to 15 times more energy than a basic Google search. Multiply that by billions of prompts a day, and it adds up very quickly.”
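The multiplication Cumps describes is easy to make concrete. A hedged sketch: the 10–15x ratio comes from the interview, but the per-search energy figure (0.3 Wh) and the daily prompt volume (one billion) are illustrative assumptions, not sourced numbers:

```python
# Illustrative only: per-search energy and prompt volume are assumptions,
# not figures from the interview.
SEARCH_WH = 0.3        # assumed energy per basic Google search, in watt-hours
AI_MULTIPLIER = 12.5   # midpoint of the 10-15x range cited in the interview
PROMPTS_PER_DAY = 1e9  # assumed global daily prompt volume

prompt_wh = SEARCH_WH * AI_MULTIPLIER          # energy per AI prompt, in Wh
daily_gwh = prompt_wh * PROMPTS_PER_DAY / 1e9  # watt-hours -> gigawatt-hours

print(f"~{prompt_wh:.2f} Wh per prompt, ~{daily_gwh:.2f} GWh per day")
```

Under these assumptions, a billion prompts a day already adds up to several gigawatt-hours daily; the point of the sketch is the multiplier, not the exact totals.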
“Let’s put it this way: we should think twice, without feeling guilty. If you’re generating images or videos ‘just for fun’, you’re using vast computing resources for low-impact activities. That’s like leaving all your lights on when you go to bed. We need to apply the same logic to digital tools. We shouldn’t ban usage; we should be conscious of it. So ask yourself questions like: ‘Do I really need to generate this?’, ‘Could I use a smaller model?’, ‘Is this the best use of energy?’”
“True, and that’s where education and design come in. Companies can do a lot more to inform users: pop-ups suggesting smaller models, basic eco-labels, even AI ‘budgets’ that show your consumption. These are forms of digital nudging, and they work. People won’t change unless they understand the impact of their behaviour and unless the tools guide them to better choices.”
“Data centers are on the frontline. They host the models, provide the infrastructure, and determine how green, or not, AI truly is. Their energy sourcing, cooling systems, and transparency practices are crucial. In Belgium, we already see leaders like LCL going beyond compliance and investing heavily in sustainable practices. That sets an example for others. But we need systemic pressure from regulators, from users, and from competition to raise the bar across the board.”
“They can accelerate transparency requirements. Right now, most users have no idea what kind of infrastructure their AI prompt activates, or whether it’s powered by green or grey energy. Europe is leading the way on regulation, but even here, we’re not moving fast enough.”
“Absolutely. In the end, we vote with our clicks. If enough people start choosing services that are more energy-aware, companies will follow. Think of the rise of organic food or electric vehicles: it started small, but demand changed the system. The same could happen here. Avoid overconsumption. Choose the efficient model. Ask for transparency. Don’t use a bazooka to swat a fly.”
“That’s already happening. On one end, you have companies and workers using AI to become vastly more productive. On the other, people whose jobs are being automated but who lack the skills to switch roles. It’s a question of access, but also of education, upskilling and support. AI could widen social inequalities if we don’t take the appropriate action.”
“Yes, we’re seeing a growing awareness in business schools, among students, and even in policy circles. At Vlerick Business School, we now offer an entire Executive MBA track focused on digital sustainability. We’re helping professionals navigate these complex issues – how to use AI to boost productivity without causing environmental damage, and how to make sustainability a core part of digital transformation.”
“Treat AI like electricity: incredibly useful when applied with purpose. Just as we don’t leave the lights or oven on unnecessarily, we shouldn’t overuse AI tools without reason. Use them where they truly add value and be mindful of when they don’t.”
Some revealing figures to help quantify the environmental impact of artificial intelligence (the accompanying infographics are not reproduced here):
- The energy cost of inference – the actual use of AI models to generate responses – which excludes the much greater energy demands of training them.
- The significant water footprint that comes with training large language models (LLMs).
- The staggering scale of increase: each new generation of ChatGPT comes with a rapidly growing environmental cost.