Tesla accused of deception in promoting its Autopilot technology

A pair of consumer advocacy groups say Tesla Inc.’s promotional claims about the safety and capability of its Autopilot driver-assistance technologies are deceptive and want the U.S. Federal Trade Commission to investigate.

“Two Americans are dead and one is injured as a result of Tesla deceiving and misleading consumers into believing that the Autopilot feature of its vehicles is safer and more capable than it actually is,” the Center for Auto Safety and Consumer Watchdog wrote in a letter on Wednesday to FTC Chairman Joseph Simons.

The groups say that Tesla’s Autopilot marketing statements as well as those made by Chief Executive Elon Musk have made it “reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of ‘self-driving.’”

Tesla owners’ manuals contain several disclaimers about Autopilot’s limitations, and drivers are reminded each time they activate the system that they are responsible for maintaining control of the car. The company also notes that its Autopilot website says it is unknown when each of the self-driving functions described on the page will become available, because of the software validation and regulatory approvals still needed.

“The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of,” a Tesla spokeswoman said in an email.

An FTC spokesman declined to say whether the agency would investigate, but said it takes all such correspondence seriously.

Tesla markets driver-assist features such as adaptive cruise control and automated steering under the Autopilot name. The company stresses that drivers must keep their hands on the wheel and eyes on the road at all times while using the system.

The advocacy groups say Tesla’s promotions of Autopilot suggest otherwise and are deceptive. Among the examples cited in the letter is Tesla’s Autopilot website, which proclaims that Tesla vehicles have “full self-driving hardware” and features a video that opens with text reading, “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”

A Tesla driver in Utah crashed earlier this month while using Autopilot and looking at her phone. Two fatal crashes have also occurred while drivers were using the system: one in California in March and one in Florida in 2016.

In a report on the Florida crash issued last year, the U.S. National Transportation Safety Board said the system gave drivers too much leeway to activate the automation in conditions where it might be unsafe.