The National Highway Traffic Safety Administration announced it is expanding a probe into Tesla’s Autopilot function, examining the system for potential defects.
The probe began ten months ago, when the NHTSA opened a generalized investigation of 765,000 Tesla cars after 11 incidents in which Tesla vehicles crashed into first responder vehicles.
On Thursday, the agency announced in a release that it was focusing the probe on Tesla’s driver-assistance system and expanding the scope of the analysis to 830,000 vehicles that use it. The agency also said it has so far identified nearly 200 additional incidents in which Tesla cars using the Autopilot feature collided with something, including six more collisions with first responder vehicles.
It also announced the classification of the probe had been upgraded to an “Engineering Analysis,” a step required before the investigation can progress toward a potential recall of all Teslas that contain the Autopilot feature.
According to the NHTSA, the probe will now “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”
According to the NHTSA, Tesla’s warning system did activate prior to most of the collisions, and in about half of the cases the “Automatic Emergency Braking” system activated as well. The release noted, “On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.”
An additional 85 crashes were disregarded because their nature and cause could not be clearly established. In another 50, drivers were not responsive enough, and in 25 more, it was determined drivers were using Autopilot under conditions, such as bad weather, that would degrade its performance.
Autopilot is marketed as a feature which can automatically brake and steer cars to keep them in their lanes.
NHTSA said it did not matter whether some crashes involving the system were due to driver misuse, so long as that misuse was predictable. “This is particularly the case if the driver behavior in question is foreseeable in light of the system’s design or operation,” it said.
Democratic Senator Edward Markey of Massachusetts was pleased to see the probe expand, saying on Twitter, “Every day that Tesla disregards safety rules and misleads the public about its ‘Autopilot’ system, our roads become more dangerous.”