The challenges of powering Artificial Intelligence

Making a smart, connected world possible depends on energy-efficient data centres.

The invention of computers changed the world because of their ability to retain and share information. Until recently, however, they lacked the capability to emulate a human brain and learn autonomously in order to perform tasks or make decisions.

To come close to the processing power of a human brain, an AI system must perform around 40 thousand trillion operations per second (40 petaFLOPS). A typical server farm with this level of AI computing power would consume nearly 6 MW, whereas the human brain requires the calorific equivalent of only 20 W to perform the same tasks. Some of AI’s most advanced learning systems currently consume up to 15 MW – a level that would power a small European town of around 1500 homes for an entire day.
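As a quick sanity check, the efficiency gap implied by these figures can be worked out directly. This is a rough order-of-magnitude sketch; the 40 petaFLOPS, 6 MW and 20 W values are taken from the text above:

```python
# Back-of-envelope efficiency comparison (figures from the text).
ai_flops = 40e15          # 40 petaFLOPS = 40 thousand trillion ops/s
server_farm_watts = 6e6   # ~6 MW for a server farm at that compute level
brain_watts = 20          # calorific equivalent of the human brain

farm_eff = ai_flops / server_farm_watts   # FLOPS per watt
brain_eff = ai_flops / brain_watts

print(f"server farm: {farm_eff:.1e} FLOPS/W")
print(f"human brain: {brain_eff:.1e} FLOPS/W")
print(f"brain advantage: ~{brain_eff / farm_eff:,.0f}x")
```

On these numbers the brain is roughly 300,000 times more power-efficient than the server farm, which is the gap that motivates the efficiency work discussed below.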

AI’s neural networks learn through exposure to differentiation, similar to human learning. Typically, thousands of images are processed through Graphics Processing Units (GPUs) set up in parallel in order for the network to compare and learn as quickly as possible.

AI computing is also dependent on so-called edge devices, including cameras, sensors, data collectors and actuators, to receive input information and output movement or actions in the physical world. Consumer and manufacturing trends such as the Internet of Things (IoT) have also led to the proliferation of AI-enabled devices in homes and factories, thereby also requiring increased data and energy consumption.

Delivering and managing megawatts of power is made harder still by rising energy prices. Additionally, every watt dissipated in the data centre requires more cooling, increasing energy costs further.

Miniaturisation is central to improving processing power, but smaller sizes with increased power density reduce the surface area available for dissipating heat. Thermal management is therefore one of the most significant challenges in designing power for this new generation of AI supercomputers.

Reducing CO2 emissions

Estimates predict that there will be over 50 billion cloud-connected sensors and IoT devices by 2020. The combined effect of these devices, and of the data centres that power Artificial Intelligence, on global power consumption and global warming points to the need for collective action to make power supplies for server racks, edge devices and IoT devices far more energy-efficient.

In addition to investments in renewable energy production and attempts to move away from petrol and diesel vehicles, European countries will need to place significant focus on energy efficiency in their efforts to cut carbon emissions. The European Commission introduced the Code of Conduct for Energy Efficiency in Data Centres in 2008 as a voluntary initiative to help all stakeholders improve energy efficiency, yet data centres in Europe alone are still on course to consume as much as 104 TWh by 2020 – almost double the 56 TWh consumed in 2007.
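The “almost doubled” figure can also be expressed as an annual growth rate. A minimal calculation, using only the 56 TWh (2007) and 104 TWh (2020) values from the text:

```python
# European data-centre consumption growth (values from the text).
twh_2007, twh_2020 = 56.0, 104.0
years = 2020 - 2007

growth = twh_2020 / twh_2007        # ~1.86x overall
cagr = growth ** (1 / years) - 1    # implied compound annual growth

print(f"overall growth: {growth:.2f}x over {years} years")
print(f"implied annual growth: {cagr:.1%}")
```

This works out to roughly 5 per cent compound growth per year, sustained for over a decade despite the voluntary efficiency initiative.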

According to a 2017 study on data centre energy consumption, the Information and Communication Technology (ICT) sector generates up to 2 per cent of the world’s total carbon dioxide emissions – a percentage on par with global emissions from the aviation sector. Data centres make up 14 per cent of this ICT footprint.
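Multiplying the two percentages above gives the data centres’ implied share of global emissions. A trivial check, using only the 2 per cent and 14 per cent figures from the study cited in the text:

```python
# Implied data-centre share of global CO2 emissions (figures from the text).
ict_share_of_global = 0.02   # ICT sector: up to 2% of global CO2
dc_share_of_ict = 0.14       # data centres: 14% of the ICT footprint

dc_share_of_global = ict_share_of_global * dc_share_of_ict
print(f"data centres: ~{dc_share_of_global:.2%} of global CO2 emissions")
```

That is roughly 0.28 per cent of global emissions attributable to data centres alone – small in relative terms, but large in absolute terms and growing.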

However, another report states that ICT-enabled solutions such as energy-efficient technologies could reduce the EU’s total carbon emissions by over 1.5 gigatonnes (Gt) of CO2e (carbon dioxide equivalent) by 2030. This would be a vast saving, almost equivalent to 37 per cent of the EU’s total carbon emissions in 2012.

Analogue vs. digital controllers

AI will no doubt have a significant impact on human society in the future. However, AI’s repetitive algorithms demand fundamental changes to computing architectures and to the processors themselves. As a result, powering these new AI systems will remain a persistent challenge.

Clearly, power solutions must become more sophisticated, and power management products with advanced digital control techniques have now emerged to replace legacy analogue-based solutions.

Digital control has been shown to increase overall system flexibility and adaptability when designing high-end power solutions. A digital approach allows controllers to be customised without costly and time-consuming silicon spins and simplifies designing and building the scalable power solutions required for AI. Even with all of the included functionality and precision delivery of power, digital solutions are now price-competitive with the analogue solutions they will replace.
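To illustrate the kind of control loop involved, the sketch below shows a digital PI (proportional-integral) regulator driving a toy converter model towards a reference voltage. This is a minimal illustration, not any vendor’s implementation: the gains, the 12 V input, the 1 V reference and the first-order plant response are all illustrative assumptions. The point is that in a digital controller these parameters are firmware values that can be retuned without a silicon spin.

```python
def make_pi_controller(kp, ki, dt):
    """Return a stateful PI update function: error -> duty cycle."""
    integral = 0.0
    def update(error):
        nonlocal integral
        integral += error * dt
        return kp * error + ki * integral
    return update

# Illustrative converter values (assumed, not from any real design).
v_in, v_ref, dt = 12.0, 1.0, 1e-5
v_out = 0.0
pi = make_pi_controller(kp=0.05, ki=200.0, dt=dt)

for _ in range(2000):
    duty = min(max(pi(v_ref - v_out), 0.0), 1.0)  # clamp duty to [0, 1]
    v_out += (duty * v_in - v_out) * 0.05         # crude plant response

print(f"settled output: {v_out:.3f} V (target {v_ref} V)")
```

In an analogue design the equivalent compensation network is fixed in components; here, changing `kp` and `ki` retargets the same controller to a different load – the flexibility advantage the text describes.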

Making the power solutions for AI applications of the future as efficient as possible is a relatively easy and attainable way in which the ICT sector can contribute to reducing global carbon emissions.

Clayton Cornell, Technical Editor, Infineon Technologies AG