Powering AI

Kevin Parmenter, Director, Applications Engineering, TSC, America

The artificial intelligence (AI) market is projected to grow from $515 billion in 2023 to over $2.0 trillion by 2030, according to a recent report by Fortune Business Insights. Deep learning algorithms, which are at the core of many AI applications, require vast amounts of data processing. This processing is often performed on high-performance computing (HPC) systems or specialized hardware such as graphics processing units (GPUs), and it can consume as much as three times more energy than a conventional data center or server farm.

When the topic of powering the newest generation of AI chipsets arises, something tells me we have been here before. In the realm of power electronics, power delivery and power efficiency have become the largest concerns for large-scale computing systems. Why is this important? Because it is an enablement issue.

Without advances in power efficiency and delivery, AI won't be viable for its targeted applications. Because data centers have massive energy demands, optimizing power usage directly impacts operational costs. Efficient power consumption translates into significant savings for data center operators, and, of course, higher efficiency reduces the cooling needed to remove heat from the rooms housing the servers and processors.
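To make the cost point concrete, here is a rough back-of-the-envelope sketch. Every figure below (IT load, conversion efficiencies, electricity price) is an illustrative assumption for the example, not data from this article:

```python
# Hypothetical figures for illustration only -- not measured data.
it_load_kw = 1000.0            # assumed IT load of a modest data hall
eff_old, eff_new = 0.90, 0.95  # assumed power-chain efficiency before/after upgrade
price_per_kwh = 0.10           # assumed electricity price, $/kWh
hours_per_year = 8760

def annual_energy_cost(load_kw, efficiency):
    # Input power = IT load / efficiency; the losses become heat
    # that the cooling system must also remove.
    return load_kw / efficiency * hours_per_year * price_per_kwh

savings = (annual_energy_cost(it_load_kw, eff_old)
           - annual_energy_cost(it_load_kw, eff_new))
print(f"Annual savings: ${savings:,.0f}")  # on the order of $50k/year
```

Even a five-point efficiency gain on a single megawatt of IT load is worth tens of thousands of dollars per year before counting the reduced cooling load, which is why conversion efficiency is treated as an enablement issue rather than a refinement.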

Technology advancements in hardware design, such as the development of energy-efficient processors and accelerators specifically tailored for AI workloads, offer promising solutions. Companies like NVIDIA and Intel are investing in energy-efficient GPUs and neural processing units (NPUs) to improve the efficiency of AI computations while minimizing power consumption. Other companies are working on the power conversion needs of AI systems and processors at 3 kW to 10 kW or more. Meeting this challenge will require not only innovation but also the repurposing of technology we already have in adjacent applications. We can borrow technologies used for the electrification of transportation, and those that utilize wide-bandgap (WBG) semiconductor materials plus innovative packaging to optimize alternative-energy power conversion.

On the design side, engineers are optimizing software algorithms to reduce energy consumption. And researchers are continually developing algorithms that require fewer computational resources without sacrificing accuracy. Techniques like model compression, quantization, and sparse neural networks enable AI systems to achieve comparable performance with lower energy requirements.
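As a minimal sketch of one such technique, the snippet below shows symmetric post-training quantization of a weight tensor to 8-bit integers, cutting storage (and memory traffic) to a quarter of float32 while keeping the reconstruction error within one quantization step. The function names and tensor shapes are illustrative, not from any particular framework:

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric post-training quantization of a float32 tensor to int8."""
    scale = np.max(np.abs(weights)) / 127.0  # one step of the int8 grid
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 codes back to approximate float32 values."""
    return q.astype(np.float32) * scale

w = np.random.randn(256, 256).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)

print(q.nbytes / w.nbytes)             # 0.25 -- 4x smaller than float32
print(np.max(np.abs(w - w_hat)) <= s)  # True -- error bounded by one step
```

Smaller, lower-precision tensors mean fewer bits moved and cheaper arithmetic, which is where the energy savings of quantized inference come from.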

Also, edge computing presents an opportunity to decentralize AI processing and reduce energy consumption. Instead of relying solely on centralized data centers, edge devices like smartphones, IoT sensors, and edge servers can perform AI computations locally, minimizing the need for data transmission and reducing energy consumption.

In conclusion, powering AI presents significant challenges in terms of energy consumption and environmental sustainability. Increasing energy efficiency in data centers, for instance, has a large part to play in moving toward a carbon-neutral world. Through advancements in hardware design, software optimization, renewable energy adoption, and edge computing, the AI industry can mitigate its environmental impact and pave the way for a sustainable future. By embracing innovation and collaboration, the power electronics industry can harness the transformative potential of AI while minimizing its ecological footprint.