AI Opinion

Why the future of AI depends on renewable energy

Remy Gieling
February 1, 2026
5 min read
The future of AI depends not only on smart algorithms, but on how we can sustainably feed the gigantic energy appetite of this technology.

The future of artificial intelligence seems limitless, but the energy it takes to build it is anything but. During a panel discussion in San Francisco with Constantijn van Oranje, investors, chip designers and entrepreneurs, among others, one theme became painfully clear: AI scales faster than our energy network can handle.

The energy issue of scalable intelligence

AI models are not only getting bigger, they're also getting smarter. We're moving from generative AI (which produces text and images) to reasoning models β€” systems that make decisions, carry out tasks and ultimately control physical actions. Think of robots, self-driving cars or surgical assistants. But this shift requires three to five times more computing power than the current foundation models.

And computing power means energy. Lots of energy.

One estimate mentioned during the panel: if OpenAI continues on its current growth path, by 2033 the company will require as much energy as the country of India. That's not dystopian exaggeration; it's a real scenario if we keep scaling the way we do today.

Chips, photonics, and memory near the processor

The hardware world is feeling the pressure. As one of the speakers aptly put it: "Most of the energy does not go to the math itself, but to moving data." That is why new architectures are being developed that place memory closer to the computing core, along with optical interconnects based on photonics that make data traffic faster and more energy efficient.

Innovations are also happening on the cooling side: from liquid cooling to new materials that dissipate heat better. The consensus, however, is that these improvements are incremental, and that something fundamentally different is needed to truly bend the curve of energy demand.

The energy bubble: the new ceiling of innovation

Fabrizio del Maffeo, CEO of Axelera AI, painted a different picture: not the AI bubble, but the energy bubble will burst. In Taiwan, for example, the energy supply is reaching its limit, while the demand for data center capacity is growing exponentially. In the US, data centers are now consuming more than 5% of national electricity, a percentage that is rising rapidly.

So AI doesn't just have a compute problem; it has an energy ceiling.

New architectures, old laws

Nevertheless, there is optimism. New model architectures such as Mamba, and quantization techniques (where calculations are performed at lower precision), provide enormous efficiency gains. Where Nvidia's older GPUs worked with 32-bit calculations, modern AI chips already run at 8-bit or lower, with minimal loss of quality.

That means models can become smaller and more energy efficient without their performance declining. In addition, new hardware categories are emerging that are tailored to specific tasks, from edge devices to specialized AI accelerators.
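To make the precision drop concrete, here is a minimal sketch of symmetric int8 quantization. The scheme and numbers are our own illustration, not the method of any specific chip or model mentioned above.

```python
import numpy as np

def quantize_int8(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Map float32 weights to int8 using a single symmetric scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=1_000_000).astype(np.float32)  # 1M fake "weights"

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 stores the same tensor in a quarter of the bytes
print(f"memory: {w.nbytes / 1e6:.0f} MB -> {q.nbytes / 1e6:.0f} MB")
print(f"max absolute error: {np.abs(w - w_hat).max():.4f}")
```

The memory drops by exactly 4x (32 bits to 8 bits per weight), and since most of the energy goes to moving data rather than to the math, fewer bytes moved translates directly into energy saved.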

Solar energy, batteries and direct current

Entrepreneur and investor Sid Sijbrandij, of GitLab and Kilo Code, made an important point: technology alone will not solve the energy issue. The energy infrastructure itself needs to be redesigned.

His vision: the future is solar plus batteries. Fusion reactors are too far off, and nuclear plants take too long to build. Solar energy does offer scalability, provided we take a smart approach.

His plan:

  • Day/night cycle: buffering via batteries.
  • Seasonal cycle: overproduce in the summer, and make that energy available in the winter via storage or conversion (e.g. into fuels).
  • New infrastructure: switching from alternating current (AC) to direct current (DC), running directly from solar panels to batteries and AI accelerators, without conversion losses.
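The day/night point above can be illustrated with a toy simulation: solar panels overproduce during daylight, a battery absorbs the surplus, and the load draws it down at night. All numbers (load, battery capacity, solar curve) are invented for illustration.

```python
HOURS = 24
LOAD_KW = 10.0       # constant data-center load (made-up figure)
BATTERY_KWH = 150.0  # battery capacity (made-up figure)

def solar_output(hour: int) -> float:
    """Crude solar curve: flat production between 06:00 and 18:00, none at night."""
    return 30.0 if 6 <= hour < 18 else 0.0

charge = BATTERY_KWH / 2  # start the day with a half-full battery
for hour in range(HOURS):
    produced = solar_output(hour)
    surplus = produced - LOAD_KW  # positive: charge the battery, negative: drain it
    charge = min(BATTERY_KWH, max(0.0, charge + surplus))
    print(f"{hour:02d}:00  solar={produced:4.0f} kW  battery={charge:6.1f} kWh")
```

With these numbers the battery fills during the day, caps out before sunset, and carries the load through the night without running empty, which is the buffering behavior the bullet list describes.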

According to him, the key does not lie in more energy, but in less waste.

What now?

The race to scale artificial intelligence is a race between software innovation, hardware architecture, and energy efficiency.
The future of AI won't be determined just by who trains the smartest models, but by who can power them most sustainably.

The conclusion from San Francisco: "The biggest breakthrough in AI won't come from Silicon Valley, but from a power plant."

