830,000 Teslas Are Facing a Potential Recall Due to Numerous Autopilot Crashes

The National Highway Traffic Safety Administration is investigating why several Teslas equipped with the Autopilot feature were involved in crashes with parked first responder vehicles.

A federal government agency has already investigated once before whether Tesla’s Autopilot driver-assist system works as advertised. After 11 incidents in which Tesla cars collided with first-responder vehicles, the National Highway Traffic Safety Administration (NHTSA) opened an inquiry covering 765,000 Tesla cars back in 2021.

According to its notice, the auto regulator said it opened the investigation after several crashes in which Tesla vehicles with Autopilot engaged struck roadside or parked first-responder vehicles, in order to examine how Autopilot behaves around such vehicles.

The National Highway Traffic Safety Administration will now examine data from 830,000 Tesla vehicles and over 200 new reports of collisions involving Tesla vehicles that had the Autopilot feature turned on. The agency also announced that it is expanding its probe to include an engineering examination, which is required before a possible recall of hundreds of thousands of Tesla vehicles equipped with the Autopilot feature.

Tesla Autopilot Engaged by Ian Maddox under CC BY-SA 4.0

According to reports published by the NHTSA’s Office of Defects Investigation, the upgraded investigation will look into “the degree to which autopilot and associated Tesla devices may worsen human factors or behavioral safety hazards by weakening the effectiveness of the driver’s supervision.”

Separately, the auto safety regulator studied 191 crashes that did not involve police cars, fire trucks, or ambulances. NHTSA said it excluded 85 of them because other factors contributed or because the cause could not be determined. In nearly 50 of the remaining cases, drivers were not sufficiently responsive, according to the agency. In about 25 other crashes, the agency said drivers were using Autopilot in conditions that Tesla has said can limit the system’s effectiveness – for instance, in bad weather.

Tesla advertises Autopilot as a feature that allows cars to brake and navigate themselves within their lanes; more advanced assistance functions are sold under the “Full Self-Driving” label. According to the NHTSA’s website, “There is currently no vehicle available for sale that is Full Self-Driving or ‘self-driving.’”


Level 2 Vehicle Autonomy

According to the NHTSA, Tesla describes Autopilot as “an SAE Level 2 driving automation system meant to support and assist the driver.” Many automakers use Level 2 systems in their new vehicles. In fact, the NHTSA asked Tesla and a dozen other automakers for details on how their Level 2 systems work as part of its investigation last autumn.

The currently available information shows that NHTSA’s main interest is learning more about Autopilot’s real-world performance. The agency specifically issued a follow-up request for more information on how Tesla changes Autopilot through over-the-air updates and on the non-disclosure agreements Tesla requires of owners whose vehicles are part of the “beta” release program for its Full Self-Driving (FSD) software. Despite its name, FSD cannot drive the vehicle by itself.

Tesla Model S & X by Steve Jurvetson under CC BY-SA 4.0

The National Highway Traffic Safety Administration presented its case for why Autopilot should be reviewed in a public update on its investigation. The agency said it has investigated 16 crashes so far. It found that Autopilot aborted vehicle control, on average, “less than one second before the first impact,” even though footage of the events showed that a potential collision should have been apparent to the driver up to eight seconds before impact. The auto safety regulator found that most drivers had their hands on the wheel, as Autopilot requires, but that the vehicles did not warn them in time to take evasive action.


Does this mean the Autopilot system is defective?

On Thursday, the NHTSA stated that driver misuse of Autopilot does not mean the technology itself is free of defects. “This is true especially if the driver’s actions are predictable due to the system’s architecture or functionality,” the report said.

During this investigation, the agency looked at over 100 other crashes involving Tesla vehicles using Autopilot that did not involve first responder vehicles. On several occasions, the driver was “inadequately attentive to the requirements of the dynamic driving task,” according to the preliminary investigation. That is why the NHTSA’s inquiry will focus on “the technologies and procedures Tesla employs to assist, monitor, and enforce the driver’s engagement with the dynamic driving job when autopilot is in operation.”

The probe has been expanded to cover 830,000 Tesla cars, spanning all current Tesla models: Model S cars made between 2014 and 2021, Model X cars manufactured between 2015 and 2021, Model 3 vehicles built between 2018 and 2021, and Model Y vehicles built between 2020 and 2021. According to NHTSA data, the Autopilot first responder problem has resulted in 15 injuries and one death.


“I’m glad the NHTSA is intensifying its inquiry. Tesla has always disregarded safety rules and deluded the public about its ‘autopilot’ system, and our roads are becoming more unsafe,” said Sen. Edward Markey of Massachusetts.

Tesla’s CEO, Elon Musk, continues to preach the benefits of Full Self-Driving (FSD). Earlier this month, he announced on Twitter the expansion of the newest beta software to 100,000 cars, claiming the latest upgrade would be able to “manage roads with no map data at all” and that “after a few months, FSD should be able to drive to a GPS point with zero map data.”

Elon Musk, Tesla Factory by Maurizio Pesce under CC BY 2.0

Phantom braking

In a separate probe, the National Highway Traffic Safety Administration has investigated 758 reports of “phantom braking,” in which drivers claimed their Tesla cars abruptly braked while traveling at high speed. Phantom braking involves the company’s automatic emergency braking (AEB) system, and dozens of other owners have reported identical problems. It is a serious issue that the NHTSA is investigating.


Tesla has a poor track record of responding to criticism of its technology, even when it comes from a government agency, and the company has been slow to address the NHTSA’s questions about alleged Autopilot flaws. According to a Reuters report published on Friday, Tesla now has until June 20 to provide documentation to the government detailing hundreds of identified AEB issues.

What if Tesla fails to respond to the NHTSA’s queries by June 20?

It’s difficult to say. Since this is a request, the agency’s next steps are unclear, but it is hoped that something will come of it. One reviewer wrote, “We’ve stopped recommending Tesla vehicles due to the persistent phantom braking issue with our Model Y, and we’ll keep doing so until the issue is resolved.” Ordinarily, the vehicle manufacturer would be asked for comment in a situation like this, but Tesla no longer has a public relations department.
