Opinion

Tesla Crash Shows Man and Machine Must Cooperate


Almost as soon as news broke of a fatal crash involving Tesla's Autopilot last year, fans and detractors of the electric-car manufacturer were clear on the tragedy's causes. Tesla's supporters and investors never doubted that the system improves safety, so the driver must have failed to heed Tesla's warnings and remain attentive. Detractors and short sellers were all but certain that Autopilot had somehow failed to protect the car's driver, allowing him to drive directly into a semi at 74 mph.

After more than a year of debate, a conclusive answer is finally at hand, courtesy of a National Transportation Safety Board investigation whose final results were presented this week. But the board's findings aren't likely to leave either side happy: rather than blaming man or machine alone, it found that both the human driver and the Autopilot system, and specifically the complex relationship between the two, contributed to the deadly event.

At the heart of the matter is a dangerous dynamic: With billions at stake in the frantic race to develop self-driving car technology, there are huge incentives for carmakers to create the impression that vehicles for sale today are "autonomous." But as the NTSB made clear, no vehicle now on the market is capable of safe autonomous driving. When consumers take high-tech hype at face value, a lethal gap between perception and reality can open.


Tesla reaped months of laudatory coverage and billions of dollars in market cap by presenting its Autopilot system as more autonomous than any other advanced driver-assistance system, even as it warned owners that they must remain attentive and in control at all times. Though Autopilot did offer better performance than competing systems, the key to its success was how few limitations Tesla put on its use. Because Autopilot allows owners to drive hands-free anywhere, even on roads where Tesla has warned that such use would not be safe, the company has been able to profit from the perception that its system is more autonomous than others.

But Autopilot was actually designed for use on well-marked, protected highways with no chance of cross-traffic. So when the tractor-trailer turned across Florida's Highway 27 last May and the Tesla slammed directly into it without triggering any safety systems, Autopilot was working exactly as designed. The problem was that it was being used on a road with conditions it wasn't designed to cope with, and the driver had apparently been lulled into complacency. Far from failing, Autopilot was actually so good that it led the driver to believe it was more capable than it really was.

This complex failure, to which both man and machine contributed, sounds an important warning about autonomous-drive technology: until the systems are so good they need no human input, the human driver must remain at the center of "semi-autonomous" drive system design. Engineers must assume that if there's a way for people to misuse these systems, they will. Just as important, companies need to understand that if they over-promote a semi-autonomous drive system's capabilities in hopes of pulling ahead in the race to autonomy, they run the risk of making the technology less safe than an unassisted human driver.

There's a lesson to be learned here from aviation. As computers and sensors improved in the 1980s, aircraft manufacturers began to automate more and more of the controls simply because they could. Only later did the industry realize that adding automation for the sake of automation actually made aircraft less safe, so they re-oriented autopilot development around the principle of "human-centric" automation. Only when automation is deployed in ways that are designed to improve pilot performance does safety actually improve.

If anything, this dynamic will be more pronounced with automobiles, which are used in far greater numbers than planes by people with far less training. But unlike aircraft manufacturers, which join forces to improve safety across the industry, automakers and tech startups are in intense competition for the real or perceived lead in the race to autonomy.

As long as consumers care more about the futuristic cool factor of hands-free operation than using technology to become safer drivers, the potential for a dangerous gap between the perception and reality of autonomous-drive technology remains. And what a shame it would be if this technology, which has the potential to someday save tens of thousands of lives every year, actually made cars less safe in the short term.

© 2017 Bloomberg L.P.
