How Far Does Tesla Need to Protect Us from Ourselves?

Author:
Jason Lomberg, North American Editor, PSD

Date:
05/27/2021


The hoopla over the latest Tesla autopilot crash took a strange turn (imagine that), again calling into question who’s responsible for what can only be described as user error.

For those late to the party – in April, a Tesla Model S (equipped with autopilot) left the road at high speed and hit a drainage culvert, a raised manhole, and a tree, bursting into flames and killing both occupants.

I say occupants because here’s where it gets weird – initial reports stated that no one was in the driver’s seat.

“[Investigators] are 100-percent certain that no one was in the driver seat driving that vehicle at the time of impact,” Harris County Precinct 4 Constable Mark Herman said. “They are positive.”

According to the National Transportation Safety Board, footage from the Tesla owner’s home security camera shows the owner entering the car’s driver’s seat and the passenger entering the front passenger seat.

And given that all the seatbelts were found unbuckled post-crash, and that the deceased were aged 59 and 69 – i.e., not exactly nimble – it seems extremely unlikely that the driver wiggled out of his restraints in the 550 feet the car traveled before leaving the road.

Of course, the early reports implied that Tesla’s autopilot must’ve malfunctioned in some way – it didn’t sufficiently protect us from ourselves.

But autopilot can only function if both the Traffic-Aware Cruise Control and Autosteer systems are engaged, and Autosteer requires clearly delineated lane lines, which this road didn’t have. Barring a malfunction, autopilot couldn’t have been on.

Autopilot also relies on a weight sensor in the driver’s seat to ensure someone is behind the wheel, so if the driver somehow managed to wiggle out of the seat belt without unbuckling it and then leave the driver’s seat (while the car was in motion), autopilot would’ve shut off.

By the time you read this, the official story might’ve changed again, but that’s kind of a red herring. The implication of the early reports – and countless stories like them – is that Tesla didn’t do enough to protect us from ourselves.

Perhaps Tesla proffered deceptive marketing. Maybe the driver fell victim to a PR campaign which convinced him that “autopilot” is a fully-autonomous, self-driving feature.

On its site, Tesla says that autopilot “enables your car to steer, accelerate and brake automatically within its lane. Current Autopilot features require active driver supervision and do not make the vehicle autonomous.”

It does say that all new Tesla cars include advanced hardware that allows for full self-driving capabilities in the future, but that qualifier – “in the future” – is clearly spelled out. Even a cursory glance at Tesla’s autopilot page couldn’t reasonably give the impression that autopilot is fully autonomous.

Only the name itself is slightly misleading, but I can only assume it was labeled “autopilot” with an eye toward the future (and the dormant hardware mentioned above).

So how far does Tesla need to go to protect us from ourselves? The bulk of the evidence suggests autopilot wasn’t functioning at the time of the accident, and even if it was, Tesla has never intimated that autopilot is the Johnny Cab from Total Recall (or the automated traffic from Minority Report).

Without autopilot in the picture, and as tragic as it was, this was a garden-variety vehicular accident.
