Who could’ve ever guessed that the eternal negligence of motorists crossed with an imperfect – and occasionally over-hyped – technology would cause problems? If you’re surprised that “driver assistance” tech has contributed to a growing number of accidents, then welcome to planet Earth!
And the autonomous growing pains aren’t just an American problem. Germany has already seen its first self-driving fatality, when an electric BMW iX with assisted-steering capability veered into oncoming traffic, killing one person and injuring nine.
The UK has been hashing out the legal onus for self-driving car accidents and just passed legislation assigning blame to manufacturers rather than occupants.
But here’s where it gets tricky – the litigious capital of the world (aka, the US of A) has been wrestling with all sorts of semi-autonomous legal scenarios. If a motorist streams Netflix while a “driver assistance” feature causes a fatal accident (or doesn’t prevent one), who bears responsibility? How many warnings does the system need? What can manufacturers legally call such a feature?
China is dealing with similar issues after the first fatal accident involving the country’s “safest car.”
China’s crashworthiness assessment program (C-NCAP) recently tested Xiaomi’s popular electric SU7. The vehicle not only scored 5/5 but became the highest-rated car in the country, thanks in no small part to its Hesai AT128 LiDAR sensor, the Xiaomi Pilot Max driver-assistance system, and a 95.25% score in the Active Safety category.
But that didn’t stop the SU7 – which was in NOA (Navigation on Autopilot) intelligent driving assistance mode at the time – from being involved in a fatal accident that claimed the lives of three university students.
Apparently, the car approached a closed-off construction zone, after which the system issued a warning and began decelerating. The driver belatedly took over, but the car still ended up crashing into a concrete barrier post, killing all three occupants.
To make matters worse, the doors’ mechanical emergency release handles failed to open, preventing any possibility of escape.
All of this led to China banning the use of the terms "smart driving" and "autonomous driving" when advertising driving assistance features (according to Reuters).
For what it’s worth, Xiaomi appears to advertise its own system as “smart driving,” and that particular language is still on its site as of this writing. It’s one of countless terms used to describe similar technology in China, the U.S., and worldwide.
So short of legitimate, full-scale “autonomous” claims, what semi-autonomous vernacular will survive all the legal wrangling? Or is the latter notion inherently problematic?
We won’t arrive at full autonomy overnight, and these questions need answers, lest the courts dictate the progress of self-driving cars.