
Watch this giant teddy bear ‘drive’ a Tesla

VIDEO | 01:10
Furry stuffed animals spoof Tesla ‘Full Self-Driving’ feature

In a test drive on a private road, a Tesla with Full Self-Driving engaged slams into a child mannequin. The driver monitoring system, meant to ensure that a human driver is paying attention to avoid such crashes, fails to notice that the “driver” is a giant teddy bear standing in for a person. (Video courtesy of the Dawn Project/AI Addict)


As a child-size mannequin stands in a traffic lane on a rural two-lane road, a Tesla in Full Self-Driving mode barrels toward it. At the wheel: a giant teddy bear. The car’s driver monitoring system doesn’t issue any warnings. The front end whacks the mannequin, sending it flying into the air. And the car drives on, as if nothing happened.

It’s the latest salvo from activist organization the Dawn Project, which publishes videos aimed at showing how badly Tesla’s automated driving technology can behave. Dan O’Dowd, the wealthy, tech-savvy activist who founded and self-funds the Dawn Project, said he wants to ensure that “the safety-critical systems that everyone’s life depends on are fail-safe and can’t be hacked.”

Although O’Dowd’s stated goal is brand-agnostic, his main target since launching the Dawn Project in 2021 has been Tesla and its controversial Autopilot and Full Self-Driving systems. So far, he said, he’s spent $10 million on a campaign to persuade Tesla to fix what he sees as its safety problems, and to push for government regulations if it doesn’t.

VIDEO | 01:06
Human defeats Tesla driver-monitoring system

(Video courtesy of the Dawn Project.)

O’Dowd, 67, is an expert in secure systems. A Caltech-trained engineer, he’s designed military-grade operating systems for intercontinental missiles, jet fighters, bombers and passenger planes, as well as microprocessors for NASA spacecraft. He made his money through Green Hills Software, a company he founded in 1982 that develops fail-safe operating systems and programming tools for commercial customers.

He’s personally offended, he said, that what he calls Tesla’s half-baked automated systems are being allowed on the nation’s highways before serious flaws have been fixed through off-road testing. He notes that since 2021, at least 23 people have died in crashes involving Autopilot or Full Self-Driving systems. That tally comes from the National Highway Traffic Safety Administration, which also reports 840 Autopilot/FSD-related crashes over the same period.


Autopilot is similar to the driver-assistance systems sold by other car manufacturers, offering automated cruise control, lane keeping and lane changes. Full Self-Driving, a far more expensive $15,000 option, is marketed as able to handle city traffic, including traffic signals and turns at intersections. Because Full Self-Driving requires constant human attention, and therefore does not fully drive itself, the California Department of Motor Vehicles says it is investigating whether Tesla is fraudulently marketing the product.

O’Dowd said that after he sent videos to NHTSA, he spoke with several NHTSA officials over Zoom who said they have an investigation underway but couldn’t discuss it until it’s complete.

Asked for comment, NHTSA reiterated the message: “NHTSA’s Autopilot investigation remains open, and the agency generally does not comment on open investigations.”


O’Dowd admits that some skepticism about his tests is to be expected. To counter it, he’s invited NHTSA, the DMV, Elon Musk and Tesla itself to replicate his tests. He’s asked them all to visit the Dawn Project in Santa Barbara to evaluate his team’s methods, and to bring their engineers to see if there’s any funny business going on. So far, no takers.

VIDEO | 00:21
Tesla blows through a stop sign

(Video courtesy of the Dawn Project.)

The ‘teddy bear test’

Critics on social media have accused O’Dowd of applying his expertise to manipulate the car’s software and hardware and fake the defects. In addition to NHTSA, Tesla and the DMV, he has invited Tesla fans to the Dawn Project to assess the methodology.

In June, O’Dowd teamed up with prominent Tesla fanboy Ross Gerber, a Santa Monica investment manager, for a ride together in Gerber’s FSD-equipped Tesla. At one point, FSD ignored a stop sign and Gerber had to slam on the brakes to avoid a collision.

Another fan, John Bernal, a former Tesla employee who still drives a Model 3, conducted the teddy bear test. (Bernal says he was fired after posting critical FSD videos.)

Bernal took his car to a private road with no other traffic around. He set up the child mannequin, then had two giant stuffed animals take turns “driving”: a teddy bear and a pink unicorn. He placed a 30-pound kettlebell on the driver’s seat to trick the car into registering a person’s weight, and attached a steering wheel weight, available for purchase on the internet, that mimics the pressure of the human hand the owner’s manual says must remain on the wheel.


With a Dawn Project driver in the passenger seat, Bernal set the car in motion. Just as in earlier videos, the Tesla creamed the child mannequin. In one case, it slowed down for a few seconds after the collision, then forged ahead. Another video shows a Dawn Project driver defeating the Tesla monitoring system by wearing sunglasses; he works on a laptop for a while, then slumps over and pretends to fall asleep as the car drives on.


Musk and Tesla did not respond to requests for comment on the video. In the past, Musk has tweeted that O’Dowd and his criticism are “crazy.”

Like cars from other manufacturers with automated driving systems, such as Ford’s BlueCruise, GM’s Super Cruise and BMW’s Driving Assistant Plus, later-model Teslas are equipped with what’s called a driver monitoring system. These systems use sensors mounted above the windshield to keep an eye on drivers and make sure they’re paying attention to the road. If a driver isn’t, the system issues a series of warnings; if the warnings are ignored, the car eventually comes to a stop. Bernal said that in another test, the system went 10 minutes before issuing a warning.


Tesla’s monitoring system is simple compared with others on the market. Tesla uses only a camera, while most of the others add infrared sensors that allow for a more sophisticated analysis of subtle behaviors such as gaze and head movement.

Colin Barnden, who follows the driver monitoring market for Semicast Research in London, said he ranks Tesla’s system “as the worst driver monitor system on the market. Last place … with no infrared illumination, it’s as close as possible to useless.”

The U.S. doesn’t set standards for driver monitoring systems, although NHTSA is researching the issue. Barnden notes that Europe already has such standards in place, and that “the U.S. is five years behind Europe.”


“NHTSA’s goal is to save lives,” Barnden said. “If they’re not going to follow the European example and set minimum standards, [loss of life] is what’s going to happen.”

NHTSA is currently investigating Tesla over a number of alleged defects, including sudden acceleration, sudden braking, steering wheel malfunctions, and crashes into police cars, firetrucks and ambulances while on Autopilot.
