
Driver’s death highlights the risks of new technology


WHY THIS MATTERS IN BRIEF

With new technologies come new risks, but the rate at which they’re being developed and deployed leaves culture and training struggling to keep up.

 

The first reported death involving Tesla’s Autopilot feature raises troubling questions about how much more progress self-driving technology must make before cars can be operated without constant driver awareness.

Joshua Brown, 40, of Canton, Ohio, died May 7 in Williston, Florida, when the car’s semi-autonomous system, which Tesla markets under the name Autopilot, failed to detect a tractor-trailer turning in front of the luxury electric car.

 


“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied,” Tesla said in a statement on its corporate website. “The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.”

Later this month, the National Highway Traffic Safety Administration, which has launched an investigation into the Florida crash, will issue guidelines intended to set the near-term rules of the road in autonomous vehicle research.

Brown’s death is a sobering reminder that the excitement, enthusiasm and media coverage of autonomous vehicles may be running ahead of the engineering reality.

Still, the horse race goes on.

BMW said Friday it would introduce a fully self-driving car by 2021 after striking a technology partnership with Intel and Mobileye.

Without addressing the Tesla fatality directly, BMW, Intel and Mobileye said in a statement that they “are convinced that automated driving technologies will make travel safer and easier.”

BMW said its goal is to reach a level where drivers can take their “eyes off” the road altogether.

 


As for Tesla’s Florida crash, there’s a lot we don’t know.

For example, did the car alert Brown that he was in a situation where the sensors, micro-cameras and guidance technology could not safely respond?

Even if the system alerted him, was he able to retake control quickly enough to mitigate the crash impact?

Are we experimenting with this technology at the risk of people’s lives?

“The reality is that the public and regulators won’t tolerate that kind of risk-taking even if over 50 years it means more people are saved,” said Bryant Walker Smith, a professor at the University of South Carolina School of Law who has studied the legal and ethical ramifications of this technology for years.

Advocates of autonomous vehicles argue that over time they can deliver a quantum leap in safety.

But regulators must operate in the present.

Last month at a conference in Novi, Michigan, NHTSA chief Mark Rosekind argued that technology like Autopilot should be twice as safe as the manual systems it replaces.

 


The other risk illustrated by the Florida crash is the tension between what automakers call automated driver-assist features, such as emergency braking, lane departure alerts and adaptive cruise control, and full autonomy, which is being touted as allowing all occupants to text, email, watch movies and otherwise disengage from driving.

In regulator-speak, full autonomy is called Level 4. Tesla’s Autopilot is regarded as Level 2 technology, which requires the driver to stay engaged at all times, but many drivers are treating it as if it were Level 3 and over-trusting the system.

“The expectation of Tesla is that the driver is alert and vigilant, ready to take over at a moment’s notice,” said Ryan Eustice, a professor of engineering at the University of Michigan who is part of the Toyota Research Institute focusing on autonomous transportation. “In practice, however, we see that humans quickly become bored and place too much trust in the system. People let down their guard and are not attentive and ready to take over.”

But Smith, the South Carolina law professor, said the marketing pressure to present these vehicles as a magical mode of transportation that allows multitasking is also leading some companies to overpromise the benefits.

“Is an automated system actually safer? The automated system plus an alert human driver is the safest,” Smith said. “The problem is people might be less likely to buy those technologies if they are only marketed as safety. If you market them as relaxation or multitasking, you don’t get the same safety benefit but you get the vehicles used and deployed more rapidly.”

 


The report of Brown’s fatal crash comes at a challenging time for Tesla. It is burning an average of $500 million in cash each quarter. Now it is shifting gears from niche producer of very expensive cars to mass-market manufacturer, with the planned launch next year of its Model 3, which it says will sell for between $30,000 and $40,000 after tax credits.

Flamboyant founder and CEO Elon Musk has been criticized over Tesla’s proposed $2.8 billion acquisition of SolarCity, a solar panel business he also helped launch but which, like Tesla, is losing money.

Earlier this week, an investor group called for Tesla to add two independent directors to its board and to separate the roles of chairman and CEO, in a challenge to Musk’s autonomy.
