Tesla Assigned Partial Responsibility for Last Year’s Fatal Autopilot Crash

The National Transportation Safety Board concluded that Tesla shared blame with the driver.

While most of the automotive industry has approached autonomous driving slowly and cautiously, Tesla has sped forward — last year, with fatal results. In May 2016, Joshua Brown was killed when his Tesla Model S collided with a semi truck while Autopilot, Tesla's semi-autonomous driving technology, was activated. Brown was assigned most of the blame for using Autopilot inappropriately — drivers are supposed to keep paying attention to the road — but today, the United States National Transportation Safety Board concluded that Tesla needs to shoulder some of the responsibility as well.

Using logs from the vehicle, Tesla found that Brown had had his hands on the steering wheel for only 25 seconds of the 37 minutes he drove that day. At the time, Tesla instructed drivers to keep their hands on the wheel and stay engaged with the road even with Autopilot activated, and the car emitted warning chimes if it detected that the driver was disengaged. Tesla concluded that Brown ignored both the instructions and the warnings, treating Autopilot as a fully autonomous driving system rather than a driver-assist system.

While the NTSB didn't disagree with that conclusion, it did assign Tesla blame for a lack of safeguards. Although Brown used Autopilot outside of its intended use, the NTSB concluded that Tesla should have had hard limits in place to prevent misuse of Autopilot, rather than relying on drivers to use the technology properly. Tesla has implemented such a safeguard since the crash: Autopilot 2.0 will turn off if the driver doesn't keep their hands on the wheel.

Tesla also has a communication problem. We’ve seen the likes of Volvo and Nissan introduce similar semi-autonomous drive technology, but they’ve been much more cautious about what they call it — Volvo with Pilot Assist II and Nissan with ProPILOT Assist. Those names make it clear that the systems are meant to assist the driver, not take over completely — needless to say, Autopilot doesn’t convey the same message. German authorities agreed, instructing Tesla not to use the name ‘Autopilot’ in October of last year.

Fully autonomous vehicles of the future might not actually be sold to consumers (being operated as fleets instead), which would help to simplify the liability question. However, we’re going to be stuck in a transition period for the next couple of decades, and during that time, assigning blame is going to be a central question that needs to be answered before autonomous cars can become widespread. Judging from this week’s news, the companies making the cars and the software won’t be off the hook completely.

Via BBC News
