The enormous energy requirements of AI

and how they can be tamed

Offenburg, April 4, 2024

More energy-efficient use with embedded systems

The enormous energy requirements of artificial intelligence (AI) are increasingly becoming a focus of discussion. No wonder: as early as 2019, a US study warned that training a single neural network can emit as much CO2 as five (conventional) cars over their entire lifetimes. Training the GPT-3 model alone (the model family underlying ChatGPT, which has been publicly available since the end of 2022) consumed about as much electricity as a medium-sized nuclear power plant produces in roughly an hour (1,287 megawatt-hours).

Against this backdrop, the development and use of AI must become more energy efficient, not only to save costs, but also to ease the strain on energy supplies and reduce the resources consumed in generating power. Not least, global warming and geopolitical events are forcing us to use energy sparingly.

The solution, of course, is not to do without AI, but to focus on greater energy efficiency in how it is used. In this way, AI and smart sensors can not only transform industry but also make production itself more energy efficient.

Read the full article on manageIT (in German)