Tesla is recalling nearly all vehicles sold in the United States, more than 2 million, to update software and fix a defective system that’s supposed to ensure drivers are paying attention when using Autopilot.
Documents posted Wednesday by U.S. safety regulators say the update will increase warnings and alerts to drivers and even limit the areas where basic versions of Autopilot can operate.
The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that happened while the Autopilot partially automated driving system was in use. Some were deadly.
The agency says its investigation found Autopilot’s method of making sure that drivers are paying attention can be inadequate and can lead to “foreseeable misuse of the system.”
The added controls and alerts will “further encourage the driver to adhere to their continuous driving responsibility,” the documents said.
But safety experts said while the recall is a good step, it still makes the driver responsible and doesn’t fix the underlying problem that Tesla’s automated systems have trouble spotting and stopping for obstacles in their path.
The recall covers models Y, S, 3 and X produced between Oct. 5, 2012, and Dec. 7 of this year. The update was to be sent to certain affected vehicles on Tuesday, with the rest getting it later.
Autopilot includes features called Autosteer and Traffic Aware Cruise Control, with Autosteer intended for use on limited access freeways when it’s not operating with a more sophisticated feature called Autosteer on City Streets.
The software update will limit where Autosteer can be used. “If the driver attempts to engage Autosteer when conditions are not met for engagement, the feature will alert the driver it is unavailable through visual and audible alerts, and Autosteer will not engage,” the recall documents said.
Depending on a Tesla’s hardware, the added controls include “increasing prominence” of visual alerts, simplifying how Autosteer is turned on and off, and additional checks on whether Autosteer is being used outside of controlled access roads and when approaching traffic control devices. A driver could be suspended from using Autosteer if they repeatedly fail “to demonstrate continuous and sustained driving responsibility,” the documents say.
According to recall documents, agency investigators met with Tesla starting in October to explain “tentative conclusions” about fixing the monitoring system. Tesla did not concur with NHTSA’s analysis but agreed to the recall on Dec. 5 in an effort to resolve the investigation.
Auto safety advocates for years have been calling for stronger regulation of the driver monitoring system, which mainly detects whether a driver’s hands are on the steering wheel. They have called for cameras to make sure a driver is paying attention, which are used by other automakers with similar systems.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that doesn’t address the lack of night-vision cameras to watch drivers’ eyes, or the problem of Teslas failing to spot and stop for obstacles in their path.
“The compromise is disappointing because it does not fix the problem that the older cars do not have adequate hardware for driver monitoring,” Koopman said.
Koopman and Michael Brooks, executive director of the nonprofit Center for Auto Safety, contend that crashing into emergency vehicles is a safety defect that isn’t addressed. “It’s not digging at the root of what the investigation is looking at,” Brooks said. “It’s not answering the question of why are Teslas on Autopilot not detecting and responding to emergency activity?”
Koopman said NHTSA apparently decided that the software change was the most it could get from the company, “and the benefits of doing this now outweigh the costs of spending another year wrangling with Tesla.”
In its statement Wednesday, NHTSA said the investigation remains open “as we monitor the efficacy of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety.”
Autopilot can steer, accelerate and brake automatically in its lane, but it is a driver-assist system and cannot drive itself despite its name. Independent tests have found that the monitoring system is easy to fool, so much so that drivers have been caught while driving drunk or even sitting in the back seat.
In its defect report filed with the safety agency, Tesla said Autopilot’s controls “may not be sufficient to prevent driver misuse.”
A message was left early Wednesday seeking further comment from the Austin, Texas, company.
Tesla says on its website that Autopilot and a more sophisticated Full Self Driving system are meant to help drivers who have to be ready to intervene at all times. Full Self Driving is being tested by Tesla owners on public roads.
In a statement posted Monday on X, formerly Twitter, Tesla said safety is stronger when Autopilot is engaged.
NHTSA has dispatched investigators to 35 Tesla crashes since 2016 in which the agency suspects the vehicles were running on an automated system. At least 17 people have been killed.
The investigations are part of a larger probe by the NHTSA into multiple instances of Teslas using Autopilot crashing into emergency vehicles. NHTSA has become more aggressive in pursuing safety problems with Teslas, including a recall of Full Self Driving software.
In May, Transportation Secretary Pete Buttigieg, whose department includes NHTSA, said Tesla shouldn’t be calling the system Autopilot because it can’t drive itself.
The recall should be to delete “Autopilot” and “Full Self Driving” capability from the vehicles. Any such capability that matches what’s being sold and insinuated by the name is decades away and shouldn’t be available to the public until Tesla is willing to indemnify users of the features.
So, in the seven years from 2017 to 2023, NHTSA has investigated only 35 such crashes and 17 such deaths. And, that’s why it needs to engage in a PR recall to add more “warnings.” Sigh. Meanwhile, tonight, heroin addicts will be driving cars, living overnight in their cars, and more than 17 of them will die of overdoses.
Straw man much?
A vehicle system that lulls people into a false sense of security and makes their reaction time worse (because they’re not engaged with the operation of the vehicle) is not a safety system.
Joe B: if you are driving a car and turn over the controls absolutely, then you have to deal with the consequences. I don’t care what your car says it does, you are still accountable for the end result of your trip driving the car. You hold the driver’s license, not Tesla or any other manufacturer. At the end of the day, your car can serve many purposes, but the worst is a weapon which can hurt or kill. Regardless of a car’s amenities, the licensed driver is still responsible for the safety of the vehicle. The blame is on the driver.
The driver is responsible for safely operating the vehicle.
The manufacturer can’t use that as a cop-out for not delivering a safe vehicle, which is exactly what I’d argue Tesla is doing. There’s a reason that reputable carmakers call it “lane assist” and not, I dunno, “full self driving” or “autopilot,” and don’t ship safety controls that can easily be defeated.
Leave that aside. I believe any vehicle system that slows the reaction time of the driver is unsafe. People are just going to react slower to emergency situations if they are not continuously engaged with the operation of the vehicle or they’re distracted. There’s a reason that you can’t fiddle with Bluetooth settings in a lot of cars while you’re driving them. Why is that?
“Autopilot can steer, accelerate and brake automatically in its lane, but is a driver-assist system and cannot drive itself despite its name. Independent tests have found that the monitoring system is easy to fool, so much that drivers have been caught while driving drunk or even sitting in the back seat.”
IMO they are clearly trying to deliver self-driving cars using everyone else on the roads (and pedestrians) as unwilling beta testers, which fits the MO of the CEO.
I love the concept and the idea of self-driving cars. And I know that those developing such vehicles have made a ton of progress. It’s that last bit of progress that is going to take a ton of time, effort, and money to deliver … and given the consequences, I don’t believe it’s appropriate for such cars to be on public roads until that’s solved.