Consumer Reports urges Tesla to disable autopilot feature after deadly crash

People check out the Tesla Model S at the Tesla showroom at the Third Street Promenade in Santa Monica, Calif. (AP Photo/Richard Vogel)

Consumer Reports magazine has urged electric car maker Tesla to disable its Autopilot feature over concerns that the semi-autonomous driving system is not yet sophisticated enough to be used safely by drivers.

The call comes after a crash that killed 40-year-old Joshua Brown in May, when the cameras on his Model S sedan failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn’t automatically brake.

READ MORE: Tesla driver using Model S ‘autopilot’ function dies in crash

In a blog post published Thursday, Consumer Reports said its experts believe that by naming the feature “Autopilot,” Tesla gave consumers the false impression that its vehicles are capable of operating entirely on their own.

“By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security,” said Laura MacCleery, vice-president of consumer policy and mobilization for Consumer Reports.

“We’re deeply concerned that consumers are being sold a pile of promises about unproven technology. ‘Autopilot’ can’t actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time.”

The magazine also slammed Tesla for allowing the Autopilot feature to be installed in all of its vehicles despite the fact that the software is still in beta.

“Consumers should never be guinea pigs for vehicle safety ‘beta’ programs,” MacCleery added.

Tesla’s autopilot system uses cameras, radar and computers to detect objects and automatically brakes if the car is about to hit something. The technology also helps to steer the car and keep it centred in its lane.

READ MORE: Tesla to limit autopilot capabilities to prevent drivers from ‘doing crazy things’

The luxury car maker faced plenty of controversy shortly after rolling out the feature in its vehicles by way of a software update in October 2015. Several videos showing people testing the system without their hands on the wheel went viral – including one titled “Tesla autopilot tried to kill me,” which shows a Tesla Model S P90D abruptly veering into oncoming traffic on a two-lane highway. That video has since racked up over 1.5 million views on YouTube.

Autopilot is disabled by default on all Tesla vehicles. Before the feature can be used, drivers must acknowledge that it’s an “assist feature” that requires both hands on the wheel at all times. Drivers also must be prepared to take over at any time.

According to Tesla, “The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected.”

In a statement issued in June, Tesla downplayed safety concerns, noting that its Autopilot system has been safely used in more than 100 million miles of driving. The company said it informed the National Highway Traffic Safety Administration about the deadly crash on May 16 and sent its own investigator to the crash site on May 18.

WATCH: Former Navy SEAL killed in crash when Tesla Autopilot fails to detect tractor-trailer

“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving,” read the statement.

Tesla has yet to comment on Consumer Reports’ allegations.

But Consumer Reports isn’t the first consumer advocacy organization to express concerns about the feature.

Many safety advocates have questioned whether the company and the U.S. National Highway Traffic Safety Administration (NHTSA) allowed the public access to the system too soon.

“No safety-significant system should ever use consumers as test drivers on the highways,” said Clarence Ditlow, head of the non-profit Center for Auto Safety, adding that the NHTSA lacks the electronic engineers and laboratories needed to keep up with advanced technology such as General Motors air bags or Tesla’s Autopilot.

READ MORE: Experts, safety advocates caution self-driving cars aren’t ready for roads

On Tuesday, the NHTSA posted a nine-page letter addressed to Tesla, requesting information about the crash and how Autopilot works, including how the system detects “compromised or degraded” signals from cameras and other sensors and how those problems are communicated to drivers.

The federal agency declined to comment on specifics of the investigation. It does not currently have the legal authority to prevent automakers from rolling out features that meet basic federal motor vehicle safety standards, but it is in the process of developing standards for self-driving cars.

But the NHTSA investigation could have broad implications for the auto industry and its path toward self-driving cars. If the probe finds defects in Tesla’s system, the agency could seek a recall. Other automakers, including Nissan, have developed or are developing similar systems that may need to be changed as a result of the probe, which could also affect self-driving car regulations expected to be unveiled in the U.S. this summer.

With files from The Associated Press
