Energy voracity
A wise man once said that with great power comes great responsibility.
To the astonishment of many, AI development and use have skyrocketed in the last year.
The artificial intelligence market is estimated to be worth more than US$180 billion in 2024 and is expected to grow to more than US$800 billion by 2030.
New players, such as OpenAI or Anthropic, are emerging alongside big tech corporations, poised to further accelerate the race for the smartest and fastest AI model, whether by competing or cooperating.
A new narrative was born: we can make the world a better place with artificial intelligence. A narrative supported by thousands of cases where companies have managed to launch breakthrough products and solutions to increase productivity.
But there is another side of the narrative that is going quite unnoticed: AI has a huge environmental impact. The energy appetite of the data centres used to run AI software applications has surpassed that of entire countries.
That’s why the number of data centres has surpassed 7,000 (compared to around 3,500 in 2015). The impact stems not only from the emissions generated by the massive energy consumption of data centres but also from the land required to build them.
Ok, now let’s go a bit deeper into the topic.
To understand the roots of the AI boom we can look at Moore’s Law. In brief: the number of transistors on computer chips doubles approximately every two years.
This has two outcomes:
Computing efficiency has increased exponentially
The price of computer memory and storage has dropped exponentially
Put the two outcomes together: AI is now cheap and fast to deploy thanks to the many small pieces that come together inside big data centres.
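To make the doubling concrete, here’s a minimal Python sketch, assuming an idealised strict two-year doubling starting from the Intel 4004’s roughly 2,300 transistors in 1971 (real progress is noisier and has slowed in recent years):

```python
# A minimal sketch of Moore's Law under an idealised two-year doubling.
# Baseline assumption: the Intel 4004 (1971, ~2,300 transistors).

def transistors(year: int, base_year: int = 1971, base_count: int = 2_300) -> float:
    """Projected transistor count under a strict two-year doubling."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011, 2024):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Run it and the 2024 projection lands in the hundreds of billions, which is the right order of magnitude for today’s largest chips and shows why compute got so cheap so fast.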
We’re moving to the next complexity layer of today’s topic: Graphics Processing Units (GPUs).
Supercharged chips
GPUs are the backbone of the whole AI thing.
They are the tiny bricks stacked together to make sure there is enough computational power for AI to run and answer all the prompts that ChatGPT gets.
Before getting lost in the realm of infinite technicalities, I’ll tell you what we’re looking for here: Power Usage Effectiveness (PUE).
PUE is the key metric developed and used (so far) to monitor the energy efficiency of data centres. It compares the total energy a facility consumes to the amount its computing infrastructure actually uses. However, companies at the forefront of GPU development and innovation, like Nvidia, argue that PUE has become obsolete for evaluating the real impact of data centres in the era of generative AI.

The argument against PUE is that it only accounts for the energy input, without considering how efficiently that energy is used or what the data centre produces with it. In other words: I can only know that I gave the data centre X amount of power, without knowing whether the output was 10 or 100. Hence, problem #1: we need a better way to track the efficiency of these infrastructures (even if a lot of progress on transparency has been made thanks to initiatives such as the Green500 index).
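To make the criticism concrete, here’s a minimal Python sketch of the PUE calculation with made-up numbers; it shows how two facilities can score the same PUE while producing very different amounts of useful output:

```python
# A minimal sketch of Power Usage Effectiveness (PUE).
# All figures below are invented for illustration.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy.

    1.0 is the theoretical ideal (every watt goes to computing);
    real data centres sit somewhere above that.
    """
    return total_facility_kwh / it_equipment_kwh

# Two facilities with identical PUE...
print(pue(total_facility_kwh=1_200, it_equipment_kwh=1_000))    # 1.2
print(pue(total_facility_kwh=12_000, it_equipment_kwh=10_000))  # 1.2

# ...yet PUE says nothing about output: one might serve 10 queries
# per kWh and the other 100, which is exactly the criticism above.
```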
Problem #2: GPUs are listed among the strategic technologies built from critical raw materials, so their production is exposed to the risk of supply chain constraints in the future.
Source: Critical Raw Materials for Strategic Technologies and Sectors in the EU - European Commission
We might expect a more aggressive approach to tackle critical minerals scarcity (which can also mean more intensive mining operations).
These two problems are enough to get a good migraine when thinking about AI and the planet.
What can we do to fix this mess?
Pro tip: listen to this episode from Catalyst w/ Shayle Kann if you’d like to go deeper into this rabbit hole.
Checkmate?
If your answer was “Just stop using AI,” unfortunately, I can’t agree.
We’ve all seen how powerful and capable AI is becoming, so there are zero incentives to prevent its use, especially in the business landscape. But not every road has to lead to our doom.
I’d like to draw two scenarios here:
Shift from a consumer-driven AI to a progress-driven AI
The main use case for AI today is increasing individual productivity by automating tedious tasks. We’ve also seen tons of AI apps launched every week to satisfy a growing audience of enthusiasts.
However, the power of AI also lies in the opportunity to advance technological progress, and that includes leveraging AI to make AI itself less harmful from a climate perspective.
Imagine an AI that optimises itself to reduce its energy consumption or runs only when the energy provided comes from renewable sources.
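As a thought experiment, here’s a minimal Python sketch of that second idea, carbon-aware scheduling. Both the get_grid_carbon_intensity function and the 100 gCO2/kWh threshold are hypothetical stand-ins, not a real API:

```python
# A minimal sketch of carbon-aware scheduling: defer a flexible
# workload until the grid is "clean enough". Hypothetical throughout.

import random
import time

CLEAN_THRESHOLD = 100  # gCO2/kWh, an assumed cutoff for "mostly renewable"

def get_grid_carbon_intensity() -> float:
    # Hypothetical stand-in for a real data source (e.g. a grid
    # operator's API); here we just simulate a reading.
    return random.uniform(50, 400)

def run_when_clean(job, poll_seconds: float = 1.0):
    """Wait until grid carbon intensity drops below the threshold, then run."""
    while get_grid_carbon_intensity() > CLEAN_THRESHOLD:
        time.sleep(poll_seconds)  # grid too dirty: wait and re-check
    return job()

print(run_when_clean(lambda: "training batch done"))
```

Real systems would do something smarter (forecasting, shifting work across regions), but the core idea is this simple: deferrable AI workloads can chase clean electricity instead of running whenever.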
A zero-emissions AI
AI is building a massive case for wider adoption of renewable energy, especially when it comes to powering data centres. As AI consumption increases, the climate impact becomes more evident, and companies are working to build solutions that mitigate the impact of AI’s mass adoption on the planet.
We still can't match 100% of the AI energy demand with clean energy, but there are positive signals in this direction.
Some examples include:
Helion: backed by Sam Altman himself, Helion is working to build the world's first fusion power plant and is gathering a lot of attention from the AI industry.
NexGen Cloud: NexGen Cloud is building more sustainable data centres running 100% on renewable energies.
Lumai: Lumai is developing an energy-efficient AI processor based on 3D optical technology.
Link to the original post: https://tctb.beehiiv.com/p/the-ai-gambit