    Infineon Collaborates with d-Matrix to Optimize Performance and Power Efficiency for Interactive AI Inferencing

    05/12/2026

    Infineon Technologies AG announced a collaboration with d-Matrix®, a pioneer in highly interactive, low-latency AI inference compute for data centers. Infineon’s power solutions help d-Matrix’s Corsair™ inference accelerator achieve industry-leading performance, energy efficiency, and system integration on its high-density boards. d-Matrix’s solution leverages the Infineon OptiMOS™ TDM2254xx dual-phase power modules, which enable true vertical power delivery and offer a high current density of 1.0 A/mm².
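    To put the 1.0 A/mm² figure in context, a short back-of-the-envelope sketch: current density times board area gives the deliverable current. The module footprint below is a hypothetical value for illustration, not a TDM2254xx specification.

```python
# Illustrative only: what a 1.0 A/mm^2 current density means in practice.
# The footprint value is hypothetical, not a published TDM2254xx spec.
density_a_per_mm2 = 1.0   # current density stated in the announcement
footprint_mm2 = 100.0     # hypothetical board area occupied by the module

# Deliverable current scales linearly with the area the module occupies.
deliverable_current_a = density_a_per_mm2 * footprint_mm2  # 100.0 A
```

    Higher density means the same current fits in a smaller footprint directly beneath the processor, which is the point of vertical power delivery.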

    AI inference applies a trained machine learning model to new data to generate predictions, classifications, or decisions. The model processes inputs with its learned parameters and produces outputs without further training; at data center scale, this workload demands hardware optimized to balance performance, latency, and power efficiency.
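    As a minimal sketch of the idea (not d-Matrix’s actual software stack), inference with frozen, pre-trained parameters reduces to a forward pass over new input:

```python
import math

# Illustrative toy "model" with frozen (pre-trained) weights.
# Real inference accelerators run billions of such multiply-accumulates per token.
WEIGHTS = [0.8, -0.5, 0.3]   # learned parameters, fixed at inference time
BIAS = -0.1

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def infer(features: list[float]) -> float:
    """Forward pass: weighted sum plus activation. No weight updates occur."""
    z = sum(w * f for w, f in zip(WEIGHTS, features)) + BIAS
    return sigmoid(z)

score = infer([1.0, 0.2, 0.5])                     # new, unseen input
label = "positive" if score > 0.5 else "negative"  # classification decision
```

    Because the weights never change at inference time, the hardware problem shifts from training throughput to serving latency and energy per token.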

    “Infineon has been collaborating with customers specializing in inference processors, such as d-Matrix, from the early days when the industry was mostly focused on training hardware,” said Raj Khattoi, Vice President and General Manager of Consumer, Computing and Communication at Infineon. “These early, strategic engagements have positioned Infineon as a leader in the inference hardware industry, further extending our leadership in powering AI with semiconductor solutions for both inference and training processors.”

    “AI is rapidly moving from back-office experimentation to a real-time interactive experience – and that shift demands a fundamentally different compute architecture. Corsair was purpose-built for this moment: delivering the sub-2 ms token latency that interactive applications require, at multiples better energy efficiency than traditional approaches,” said Sid Sheth, founder and CEO of d-Matrix. “Infineon has been a design partner since the inception of our platform, and their power semiconductors are a meaningful contributor to our ability to deliver what the market demands.”

    Leveraging Infineon's power semiconductors, d-Matrix has optimized its AI inference platforms for power and performance. Typical use cases include response generation using large language models (LLMs), agentic AI applications, and predictive analytics in finance and healthcare.

    As AI workloads continue to scale, the demand for efficient and reliable power solutions in data centers is increasing rapidly. Infineon's broad portfolio of power semiconductors based on silicon (Si), silicon carbide (SiC), and gallium nitride (GaN) has established the company as a trusted partner for leading AI developers across both training and inference markets. By enabling higher power density, improved energy efficiency, and seamless system integration from grid to core, Infineon is helping to shape the future of AI infrastructure.

    Availability

    Learn more about the Infineon OptiMOS TDM2254xx dual-phase power modules here.


    146 Charles Street
    Annapolis, Maryland 21401 USA

    Power Systems Design

    Power Systems Design is a leading global media platform serving the power electronics design engineering community. It delivers in-depth technical content, industry news, and product insights to engineers and decision-makers developing advanced power systems and technologies.

    Published 12× per year across North America and Europe, Power Systems Design is distributed through online and fully digital editions, complemented by eNewsletters, webinars, and multimedia content. The platform covers key areas including power conversion, semiconductors, renewable energy, automotive electrification, AI power systems, and industrial applications—supporting innovation across the global electronics industry.