An editorial image shows the interior of the new Tesla Model 3 with Full Self-Driving enabled.
Getty Images
The National Highway Traffic Safety Administration is investigating Tesla's "Full Self-Driving" system, according to a filing posted on the agency's website Thursday.
The investigation into Tesla's FSD examines possible safety defects that could make the system unsafe for drivers to use in fog, bright sun or other "reduced road visibility conditions."
The investigation began last year and covers 3.2 million Tesla vehicles, spanning the Model S, X, 3, Y and Cybertruck EVs, all of which are available with the company's FSD brand of driver assistance systems, according to documents posted on the agency's website.
The agency wrote that Tesla FSD may not be able to “adequately detect or warn drivers under conditions of reduced visibility, such as glare or airborne obstructions.”
In the crashes investigated by the agency, Tesla’s system “did not detect prevailing road conditions that impair camera visibility or issue warnings if camera performance was degraded until shortly before the crash occurred.”
The investigation was upgraded to an "engineering analysis" following a series of reports of crashes in which FSD had been engaged within 30 seconds of impact, including one in which a Tesla driver using FSD struck and killed a pedestrian.
Tesla did not immediately respond to a request for comment.
