
A Tesla mystery: Why didn’t auto-braking stop these crashes?

A crash scene in Cochise County, Ariz., where a Tesla struck a police vehicle and pushed it into an ambulance in July 2020.
(Arizona Department of Public Safety)

On a rainy winter night in December 2019, college student Maria Smith found herself followed by a state trooper, police lights flashing, on Massachusetts’ Route 24. She pulled over to the side of the road. Then: Smack! Something had hit her car from behind, shattering her rear window. “I was scared,” she said.

A Tesla running in Autopilot mode had slammed into the state trooper’s cruiser, knocking it into Smith’s car.

No one was injured, but the crash is now part of a sweeping investigation launched in August by the National Highway Traffic Safety Administration. NHTSA is looking into a dozen similar episodes over three years in which Tesla vehicles traveling at a range of speeds ran into stationary police cars, firetrucks, ambulances and other emergency vehicles, injuring 17 people and killing one.


Announcing the probe, NHTSA noted that all of the Tesla vehicles involved were running on either Autopilot or Traffic Aware Cruise Control, software systems that allow the driver to relinquish control of speed and sometimes steering while — in theory, at least — staying ready to intervene. NHTSA said it would be looking into such factors as how the vehicle makes sure drivers are paying attention and how it detects visual cues of a crash scene such as flashing lights and flares — details that an alert human driver would be unlikely to miss.

But its investigators will also be digging into a question involving a more basic technology: Why isn’t Tesla’s forward collision avoidance system better at preventing crashes like Smith’s — at least when the computer is driving?

Compared with so-called advanced driver assistance systems such as Autopilot, a forward collision avoidance system is relatively crude. It is designed to answer one question — is a frontal impact imminent? — and respond to danger by sounding a warning and, if necessary, triggering a subsystem called automatic emergency braking. Unlike Autopilot, which must be selected manually and is available only under some driving conditions, automatic emergency braking runs by default unless manually turned off.
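
In concept, the core logic is simple enough to sketch in a few lines. What follows is a minimal, hypothetical illustration in Python of the kind of decision such a system makes, built around time-to-collision thresholds; the names and numbers are assumptions for the sake of illustration, not Tesla’s or any automaker’s actual code.

    # Illustrative sketch of forward collision avoidance decision logic.
    # Thresholds and names are assumptions, not any automaker's real values.

    WARNING_TTC_S = 2.5   # assumed time-to-collision threshold for the warning chime
    BRAKING_TTC_S = 1.0   # assumed threshold for automatic emergency braking

    def forward_collision_response(distance_m: float, closing_speed_mps: float) -> str:
        """Return the system's response given distance and closing speed to an obstacle."""
        if closing_speed_mps <= 0:            # not closing on the obstacle
            return "no action"
        ttc = distance_m / closing_speed_mps  # seconds until impact at the current closing speed
        if ttc <= BRAKING_TTC_S:
            return "automatic emergency braking"
        if ttc <= WARNING_TTC_S:
            return "forward collision warning"
        return "no action"

    # Example: an obstacle 20 meters ahead, closing at 15 meters per second
    print(forward_collision_response(20.0, 15.0))  # -> "forward collision warning"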


First developed in the mid-1990s, automatic emergency braking is effective at preventing or reducing the severity of crashes, said David Aylor, manager of active safety testing for the Insurance Institute for Highway Safety. IIHS has found that automatic braking systems can reduce the incidence of front-to-rear crashes by 50%, with better performance at lower speeds and in good visibility conditions.

“We think it’s a great technology and all cars should have it,” Aylor said.

“The benefits are pretty astounding,” said Kelly Funkhouser, head of connected and automated vehicles for Consumer Reports. “It’s the one technology I would never let a family member or friend buy a car without.”

Tesla calls its vehicles “the safest cars in the world,” citing their combination of structural engineering and advanced technology. But when it comes to the forward collision avoidance system, Tesla owners have been reporting problems at a substantially higher rate than owners of similarly equipped cars.


In 2020 and the first three quarters of 2021, NHTSA received 131 complaints about Tesla’s system, compared with 55 for Mercedes-Benz, 28 for Audi and 14 for Volvo. Each of the four automakers made collision avoidance systems standard on all its cars ahead of a voluntary industry deadline late next year.

The rate of complaints about Tesla, relative to the number of its cars sold in the U.S. in 2020, was more than three times that of the other automakers.

Tesla does not have a media relations department, and Chief Executive Elon Musk did not respond to requests for comment.

The pattern of crashes and complaints is raising alarms among safety advocates and experts in automotive technology.

“Teslas are running into stationary objects,” said Alain Kornhauser, who heads the driverless car engineering program at Princeton University. “They shouldn’t be.” If the company’s cars can’t avoid crash scenes marked by flares or traffic cones, he said, “how can you trust anything else they do with Autopilot?”

In IIHS testing, the Tesla Models S, X and 3 earned “superior” ratings for their forward collision avoidance systems. (The Model Y has not yet been tested.) IIHS said 84% of automatic braking systems across all automakers achieved a superior designation.


But those tests are conducted at only 12 mph and 25 mph, according to Aylor. IIHS doesn’t test at high speeds or with Autopilot or similar systems engaged.


In the operating manuals given to Tesla owners, the company states that its automatic emergency braking is designed to work at speeds from 3 mph to 90 mph. That language comes with several disclaimers, including the admonition that automatic braking is “designed to reduce the severity of an impact. It is not designed to avoid a collision.”

A 2020 report by the National Transportation Safety Board summarizing investigations into four Tesla crashes highlighted “the limitations of [forward] collision avoidance systems ... when vehicles ... are traveling at high speed or are faced with vehicle shapes or objects that the system has not been designed to detect. ... The systems are not designed or tested to operate consistently at speeds over 50 mph.”

Best known for its airline disaster probes, the NTSB lacks regulatory authority, but its investigations have highlighted issues around automated vehicle development and performance, including forward collision avoidance.

In a fatal crash in Mountain View, Calif., in 2018, a Tesla running on Autopilot drove head-on into a concrete abutment. NTSB determined Tesla hadn’t designed its system to avoid such road obstructions, according to its report. “Consequently, the forward collision warning system did not provide an alert and the automatic emergency braking did not activate.”

Advanced driver assistance terms

  • Forward Collision Avoidance: A system that identifies an obstruction in front of a car or truck, warns the driver and, if necessary, triggers automatic emergency braking.

  • Automatic Emergency Braking: A system that uses cameras, radar or both to identify obstructions and apply the brakes, in order to avoid or reduce the force of an imminent collision.

  • Autopilot: Tesla’s Level 2 driver-assist system. It combines adaptive cruise control, automatic steering, lane changing and other features. The driver is required to pay full attention when Autopilot is engaged.

  • Traffic Aware Cruise Control: An adaptive cruise control feature in Autopilot, it matches the car’s speed to that of surrounding traffic. It can be turned on even if the rest of Autopilot is off.

  • Full Self-Driving: Tesla’s driverless car development program. Customers pay Tesla $10,000 (or $199 a month by subscription) to test out evolving iterations of the company’s self-driving software on public roads. Autonomous driving experts say FSD does not constitute true self-driving, and the head of the National Transportation Safety Board calls the name “misleading and irresponsible.”


In a high-speed crash in Texas this year, a Tesla in Autopilot mode caused a chain reaction that sent five state troopers to the hospital, according to court documents. A plaintiff’s filing says the car was traveling at 70 mph and “did not apply its ‘Automatic Emergency Braking’ to slow down to avoid or mitigate the accident.” Tesla has not yet filed a response to the lawsuit.

In a 2018 crash on the 405 Freeway in Culver City, a Tesla on Autopilot plowed into the back of a parked firetruck. The NTSB determined the car was traveling at 30.9 mph on impact. Not only did the automatic emergency braking fail to engage, investigators said, the car actually sped up from 21 mph just before the collision.

That is the speed range in which automatic emergency braking is supposed to excel, Aylor said.

Based on such findings, the board has since 2018 recommended that NHTSA develop and apply tests to evaluate the performance of forward collision avoidance systems at a range of speeds, including high speeds. NHTSA has not yet done so.

“NHTSA has taken no action toward gaining a better understanding of how these lifesaving technologies perform in real-world high-speed crash scenarios,” the NTSB said in a report on how its recommendations have been received.

Asked for comment, an NHTSA spokesperson said in a statement that the agency is “continuing to collect data and conduct research that will inform and are necessary precursors to several regulatory actions” on its agenda. The agency declined to make an official available for an interview.



In its investigation of the crashes involving emergency vehicles, NHTSA appears to be intent on understanding the interaction between Autopilot and automatic emergency braking and which one has control of the brakes when an obstacle is detected. In an Aug. 31 letter addressed to Eddie Gates, Tesla’s director of field quality, the agency instructed him to describe Autopilot’s control over functions including braking and acceleration “during routine and crash-imminent operations.”

One possibility, according to Missy Cummings, a former Navy fighter pilot who studies human-machine interaction at Duke University, is that Autopilot is designed to preempt or suppress emergency braking to minimize what’s known as phantom braking.

“I haven’t seen the code to say how Tesla works, but I suspect the AEB is turned off in some situations,” she said. “If it were left on it may detect what are called phantom objects and would be slamming on the brakes.”
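
If that suspicion is right, the interaction might look, in greatly simplified form, something like the sketch below. Every name and threshold here is a hypothetical stand-in; Tesla has not disclosed how Autopilot and automatic emergency braking actually share control of the brakes.

    # Speculative sketch of one way a driver-assist system could suppress
    # emergency braking to reduce phantom-braking events. Hypothetical only;
    # this is not Tesla's disclosed design.

    def should_emergency_brake(obstacle_confidence: float,
                               autopilot_engaged: bool,
                               aeb_enabled: bool = True) -> bool:
        """Decide whether to trigger automatic emergency braking."""
        if not aeb_enabled:
            return False
        # Assumed behavior: a stricter confidence bar while the driver-assist
        # system is steering, so marginal detections do not slam on the brakes.
        threshold = 0.95 if autopilot_engaged else 0.80
        return obstacle_confidence >= threshold

    # The same marginal detection brakes in manual driving but not under the assist system.
    print(should_emergency_brake(0.85, autopilot_engaged=False))  # True
    print(should_emergency_brake(0.85, autopilot_engaged=True))   # False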

Mahmood Hikmet, an autonomous vehicle research engineer in New Zealand, said automatic emergency braking can interfere with testing of truly driverless systems — something Tesla is currently doing at scale with a public beta test of its so-called Full Self-Driving software.

In 2018, a pedestrian crossing the street at night was killed by an Uber self-driving test vehicle, a modified Volvo. The automatic braking system had been turned off, according to Aptiv, which supplied the system’s camera and radar. The vehicle’s safety driver didn’t see the woman in the dark, and the car struck her at highway speed without stopping.

“You might turn off some safety features in order to gauge how the system you’re testing works,” Hikmet said, but “ideally” on closed tracks with skilled safety drivers and clear rules of operation.


Maria Smith, for one, thinks there should be more clarity around how automated driving and safety systems work — and why they sometimes don’t.

“Teslas are such cool cars,” Smith said. “But they’re testing them with the rest of us out here driving our uncool cars that don’t drive themselves.”
