The Increasing Demands of Powering AI

Author:
Kevin Parmenter, Director, Applications Engineering, TSC America

Date:
04/28/2025


According to Fortune Business Insights, the global artificial intelligence (AI) market, valued at $233.46 billion in 2024, is projected to grow from $294.16 billion in 2025 to $1,771.62 billion by 2032, a CAGR of 29.2%.

As AI systems grow in complexity and scale, so do their energy requirements. Deep learning algorithms, which are at the core of many AI applications, require vast amounts of data processing, often performed on high-performance computing (HPC) systems. Some studies indicate that AI workloads are pushing data centers toward roughly three times their previous power density and power consumption.

Our challenge as power electronics engineers is meeting the increased power delivery and power efficiency requirements of large-scale computing systems. But how do we adapt to this evolution in AI system power requirements? As I’ve said in past columns, it’s an enablement issue: we need to make AI viable for its target applications.

For example, since data center operating costs are dominated by massive energy demands, reducing power consumption translates into significant savings. Using less energy also streamlines operations and requires less cooling infrastructure. Plus, power efficiency has the environmental benefit of reducing the data center’s reliance on fossil fuels.

Further power-efficiency gains are coming from advancements in hardware design, such as energy-efficient processors and accelerators tailored specifically for AI workloads. Companies like NVIDIA and Intel, for instance, are investing in energy-efficient GPUs and neural processing units (NPUs) to improve the efficiency of AI computations. We can also borrow technologies already proven elsewhere in power electronics, such as wide-bandgap (WBG) semiconductor materials and innovative packaging developed for transportation electrification and alternative-energy power conversion.

Also, by innovating in both the hardware and software components of AI applications, developers can significantly reduce their environmental footprint. On the load side, engineers are optimizing software algorithms to reduce energy consumption, while researchers continuously develop new algorithms that require fewer computational resources. Techniques like model compression, quantization, and sparse neural networks enable AI systems to achieve comparable performance with lower energy requirements, as the sketch below illustrates.
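To make one of those techniques concrete, here is a minimal sketch of post-training dynamic quantization using PyTorch. The two-layer model and the layer sizes are illustrative assumptions, not any specific production network; the point is simply that shrinking weights from 32-bit floats to 8-bit integers cuts memory traffic, which is where much of the energy saving comes from.

```python
# A minimal sketch of post-training dynamic quantization with PyTorch.
import io

import torch
import torch.nn as nn

# Stand-in model (an assumption for illustration): two linear layers,
# the kind of workload that dominates transformer inference.
model = nn.Sequential(
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 10),
)
model.eval()

# Convert Linear weights from 32-bit float to 8-bit integers;
# activations are quantized on the fly at inference time.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def state_bytes(m: nn.Module) -> int:
    """Size of a module's serialized state dict, in bytes."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes

print(f"fp32 model: {state_bytes(model):,} bytes")
print(f"int8 model: {state_bytes(quantized):,} bytes")  # roughly 4x smaller

# Inference runs the same way as before quantization.
x = torch.randn(1, 512)
with torch.no_grad():
    y = quantized(x)
print(y.shape)  # torch.Size([1, 10])
```

Dynamic quantization is only one point on the spectrum; static quantization and pruning trade more calibration effort for larger savings, but the size comparison above captures the basic mechanism.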

Furthermore, edge computing presents opportunities to reduce energy consumption. Instead of relying solely on centralized data centers, edge devices like smartphones, IoT sensors, and edge servers can perform AI computations locally, minimizing the need for data transmission. Edge AI enables real-time inference and decision-making, making it ideal for applications requiring low latency and privacy-preserving computation; the sketch below shows the shape of such a local inference loop.
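As a sketch of what local inference looks like in practice, the example below loads an exported model with ONNX Runtime, a common runtime on edge-class hardware. The file name model.onnx and the (1, 512) input shape are assumptions for illustration; substitute your own exported model and tensor shape.

```python
# A minimal sketch of on-device inference with ONNX Runtime.
# "model.onnx" and the (1, 512) input shape are illustrative assumptions.
import numpy as np
import onnxruntime as ort

# CPUExecutionProvider keeps the example portable to edge-class hardware.
session = ort.InferenceSession(
    "model.onnx", providers=["CPUExecutionProvider"]
)

# Query the graph for its input name rather than hard-coding it.
input_name = session.get_inputs()[0].name

# All computation happens locally: no sensor data leaves the device,
# which is the latency and privacy argument for edge AI.
x = np.random.randn(1, 512).astype(np.float32)
outputs = session.run(None, {input_name: x})
print(outputs[0].shape)
```

Pairing this with a quantized model, as in the earlier sketch, compounds the savings: less compute per inference and no round trip to a data center.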

In conclusion, powering AI presents significant challenges in terms of energy consumption and environmental sustainability. However, advancements in hardware design, software optimization, renewable energy adoption, and edge computing offer real solutions. By embracing innovation and collaboration in these areas, we can harness the transformative potential of AI while minimizing its ecological footprint.
