Welcome to the June issue!
We’re less than a month away from the official start of summer, so it’s time to kick back, relax, and give maximum effort (because we haven’t had summers off since we were kids).
But hey, it’s fine to live (and relax) vicariously through our kids, right?
We’re about halfway through 2025, and thus far, we’ve witnessed the continued evolution of AI, autonomous systems, quantum computing, renewable energy, and several other fields. And so, it’s fitting that June’s issue – Motor Drives, Robotics + Controls – brings together many of today’s hottest topics.
One need only look at this month’s contributed piece from Melexis – “Unlocking Robotic Intelligence” – to see how much June’s topic lands on the bleeding edge. Melexis’s Julien Ghaye makes immediate reference to “cobots”, aka collaborative robots – the robots that will work alongside humans and, at least at first, make up a huge majority of such artificial creations.
As Julien points out, cobots represent 10.5% of all new industrial robot installations (with over 56,000 units deployed in 2023), and the market could exceed US $40 billion by the end of the year.
In many cases, these collaborative robots would represent a huge improvement over traditional industrial bots. Industrial robots are good for large-scale, brute-force applications that don’t require a high degree of sensitivity, but they carry a significant upfront cost and demand extensive safety measures before they can operate anywhere near humans.
Cobots, by contrast, can perform many of the same tasks as industrial robots, but they’re smaller and lighter, and because they incorporate force control and sensors, humans can work alongside them with minimal safety concerns.
Or as noted by Julien, “the integration of innovative sensor technologies, capable of enhancing both internal and environmental perception while driving design scalability, is vital for enabling robots to perform in complex, real-world situations.”
Of course, sensors and force feedback assuage more than just safety concerns. For robots to achieve human-like precision, they have to replicate our natural sensory systems – our bodies automatically detect slight variations in touch, weight, texture, or temperature, and we adjust our grip and posture on the fly based on those factors.
The robots of the future will need to absorb and interpret data from a number of different sensors – far more than is expected of industrial robots.
“Advanced robotic systems rely on inputs from tactile sensors, positional encoders, temperature monitors, current sensors, and other external detectors,” Julien says.
Julien suggests a solution that allows for true robotic tactile sensing, not dissimilar to the tactile receptors in human skin.
That – plus accounting for human unpredictability – could propel robots into tomorrow’s collaborative applications.
Enjoy the June issue!
Best Regards,
Jason Lomberg
North American Editor, PSD