The U.S. government’s highway safety agency says Tesla is telling drivers in public statements that its vehicles can drive themselves, conflicting with owner’s manuals and briefings with the agency that say the electric vehicles need human supervision.
The National Highway Traffic Safety Administration is asking the company to “revisit its communications” to make sure messages are consistent with user instructions.
The request came in a May email to the company from Gregory Magno, a division chief with the agency’s Office of Defects Investigation. It was attached to a letter seeking information on a probe into crashes involving Tesla’s “Full Self-Driving” system in low-visibility conditions. The letter was posted Friday on the agency’s website.
The agency began the investigation in October after getting reports of four crashes involving “Full Self-Driving” when Teslas encountered sun glare, fog and airborne dust. An Arizona pedestrian was killed in one of the crashes.
Critics, including Transportation Secretary Pete Buttigieg, have long accused Tesla of using deceptive names for its partially automated driving systems, including “Full Self-Driving” and “Autopilot,” names that lead some owners to believe the vehicles are fully autonomous.
The letter and email raise further questions about whether Full Self-Driving will be ready for use without human drivers on public roads, as Tesla CEO Elon Musk has predicted. Much of Tesla’s stock valuation hinges on the company deploying a fleet of autonomous robotaxis.
Musk, who has promised autonomous vehicles before, said the company plans to have autonomous Models Y and 3 running without human drivers next year. Robotaxis without steering wheels would be available in 2026 starting in California and Texas, he said.
A message was sent Friday seeking comment from Tesla.
In the email, Magno writes that Tesla briefed the agency in April on an offer of a free trial of “Full Self-Driving” and emphasized that the owner’s manual, user interface and a YouTube video tell humans that they have to remain vigilant and in full control of their vehicles.
But Magno cited seven posts or reposts by Tesla’s account on X, the social media platform owned by Musk, that Magno said indicated that Full Self-Driving is capable of driving itself.
“Tesla’s X account has reposted or endorsed postings that exhibit disengaged driver behavior,” Magno wrote. “We believe that Tesla’s postings conflict with its stated messaging that the driver is to maintain continued control over the dynamic driving task.”
The postings may encourage drivers to view Full Self-Driving, which now carries the word “supervised” next to it in Tesla materials, as a “chauffeur or robotaxi rather than a partial automation/driver assist system that requires persistent attention and intermittent intervention by the driver,” Magno wrote.
On April 11, for instance, Tesla reposted a story about a man who used Full Self-Driving to travel 13 miles from his home to an emergency room during a heart attack just after the free trial began on April 1. A version of Full Self-Driving helped the owner “get to the hospital when he needed immediate medical attention,” the post said.
In addition, Tesla says on its website that use of Full Self-Driving and Autopilot without human supervision depends on “achieving reliability” and regulatory approval, Magno wrote. But the statement is accompanied by a video of a man driving on local roads with his hands on his knees, with a statement that, “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself,” the email said.
In the letter seeking information on driving in low-visibility conditions, Magno wrote that the investigation will focus on the system’s ability to perform in low-visibility conditions caused by “relatively common traffic occurrences.”
Drivers, he wrote, may not be told by the car that it is up to them to decide where Full Self-Driving can safely operate, and may not fully understand the capabilities of the system.
“This investigation will consider the adequacy of feedback or information the system provides to drivers to enable them to make a decision in real time when the capability of the system has been exceeded,” Magno wrote.
The letter asks Tesla to describe all visual or audio warnings that drivers get that the system “is unable to detect and respond to any reduced visibility condition.”
The agency gave Tesla until Dec. 18 to respond to the letter, but the company can ask for an extension.
That means the investigation is unlikely to be finished by the time President-elect Donald Trump takes office in January, and Trump has said he would put Musk in charge of a government efficiency commission to audit agencies and eliminate fraud. Musk spent at least $119 million in a campaign to get Trump elected, and Trump has spoken against government regulations.
Auto safety advocates fear that if Musk gains some control over NHTSA, the Full Self-Driving and other investigations into Tesla could be derailed.
Musk has even floated the idea of helping to develop national safety standards for self-driving vehicles.
“Of course the fox wants to build the henhouse,” said Michael Brooks, executive director of the Center for Auto Safety, a nonprofit watchdog group.
He added that he can’t think of anyone who would agree that a business mogul should have direct involvement in regulations that affect the mogul’s companies.
“That’s a huge problem for democracy, really,” Brooks said.
Another of the many reasons to avoid Musk’s products … he’s far too cavalier with other people’s lives.
Ah yes, the admirable Elon Musk: America’s first proto-oligarch poised to graduate and become an actual oligarch.
I guess you self-elected to conveniently forget about the Soros family, Hastings, Gates, Lawson, Moskovitz, and Bloomberg, to name a few?
Do any of them have businesses that sell cars which they claim to be self-driving but aren’t?
Or are you just looking for another reason to cry victim?
Oh no, who will think of the poor billionaire conservatives…
None of those rich and famous individuals produce and sell products that are capable of killing people.
You are all added to the list of ignorant humans totally discarding Elon Musk and saying – it’s crazy and impossible. Noted. You and NASA are in good company of fools. So it is not today. But FSD will happen. He is just WAY smarter than you. Now, go use your rake while I get my leaf blower out.
A product feature that dulls and slows reaction time and human engagement with the act of driving is dangerous and should have never shipped, much less been allowed on public roads. Beta testing your products on other drivers, people who didn’t sign up for the risk that a Tesla will run amok and crash into them, is flat wrong and shows callous disregard for humanity.
If Tesla doesn’t believe in FSD enough to drop the disclaimers, it’s not ready for prime time or to be called Full Self-Driving. The family of the Arizona pedestrian who was run over because the computer got confused should end up owning the company.
Where in the article does it say that it will “never happen”? Reading comprehension is clearly not your strong suit.
The fact is, automation in virtually everything is proving exponentially safer than human operation. It is not even close anymore.
Based on a very limited amount of data, five times worse at dusk and dawn and twice as bad at making turns.
That’s like claiming you’ve come up with a robot that is the world’s best race car driver and all it can do is drag race, and not even at night. Might be better at one specific subset of a skill, but it’s got a long way to go.
Will it get there on an infinite timeline? Sure. But figure out a way to come up with the data you need to get the robots better that doesn’t involve making the “drivers” of your own cars demonstrably worse from a reaction time standpoint when they are needed.
https://www.newscientist.com/article/2435896-driverless-cars-are-mostly-safer-than-humans-but-worse-at-turns/