U.S. agency bolsters Tesla Autopilot investigation, a step closer to a potential recall; Tesla cars were involved in 16 crashes with emergency vehicles


The National Highway Traffic Safety Administration (NHTSA) said on June 9 that it is upgrading its investigation of Tesla Model Y, X, S and 3 vehicles, covering approximately 830,000 vehicles equipped with the company’s advanced driver assistance system, Autopilot, a mandatory step before it can demand a recall. NHTSA is focusing on Tesla’s Autopilot feature, which is supposed to help drivers navigate roads using artificial intelligence that detects other vehicles. The company asks drivers to pay attention to the road and keep their hands on the wheel when using Autopilot, although some drivers have used it while drunk or while sitting in the back seat of the car.

The 16 accidents at the root of the investigation took place between January 2018 and January 2022 and left 15 people injured and one dead. In documents posted on its website, NHTSA said crash data indicates the majority of drivers had their hands on the wheel before impact and were complying with the system as it was designed to be used. NHTSA added that the investigation will specifically examine whether the Autopilot feature ends up undermining the effectiveness of driver supervision.

In a separate investigation, NHTSA is looking into complaints that Tesla vehicles brake suddenly at speed, a phenomenon known as “phantom braking”. The agency has received more than 750 complaints relating to this issue, although no accidents or injuries have been reported. The agency has asked Tesla to provide more information about its knowledge of the problem.

NHTSA said it also looked closely at 191 crashes that did not involve first responder vehicles and narrowed the list down to 106 crashes after filtering out those for which the available information did not allow a definitive assessment. “In about half of these 106 crashes, there were indications that the driver was insufficiently responsive to the needs of the dynamic driving task (DDT), as evidenced by drivers either failing to intervene when it was necessary or intervening through ineffective control inputs,” NHTSA said.

In about a quarter of the 106 crashes, the primary crash factor appears to be related to operating the system in an environment in which, according to Tesla’s owner’s manual, system limitations may exist or conditions may interfere with the proper operation of Autopilot components. These conditions include operation on roads other than limited-access highways, or operation in low-traction or low-visibility environments, such as rain, snow, or ice.

Tesla reported that in the fourth quarter of 2021 it recorded one crash for every 4.31 million miles driven in which drivers were using Autopilot technology. For drivers not using Autopilot technology, it recorded one crash for every 1.59 million miles driven. For comparison, the most recent data from NHTSA shows that in the United States there is an automobile accident roughly every 484,000 miles.
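To put those figures side by side, here is a minimal, purely illustrative Python sketch that uses only the numbers quoted above and converts each mileage figure into crashes per million miles; the comparison is a back-of-the-envelope illustration, not an official methodology.

```python
# Rough comparison of the crash-rate figures quoted in the article.
# Mileage figures: Tesla Q4 2021 safety report and NHTSA's national average.

MILES_PER_CRASH = {
    "Tesla, Autopilot engaged": 4_310_000,
    "Tesla, Autopilot not engaged": 1_590_000,
    "U.S. average (NHTSA)": 484_000,
}

baseline = MILES_PER_CRASH["U.S. average (NHTSA)"]

for label, miles in MILES_PER_CRASH.items():
    crashes_per_million_miles = 1_000_000 / miles
    miles_ratio = miles / baseline  # miles per crash relative to the U.S. average
    print(f"{label}: {crashes_per_million_miles:.2f} crashes per million miles "
          f"({miles_ratio:.1f}x the U.S. average miles per crash)")
```

Note that such comparisons are only as good as the reported figures themselves, which are not broken down by road type or driving conditions in the article.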

All of the accidents in question occurred on controlled-access highways. Where video of the incident was available, the approach to the first responder scene would have been visible to the driver for an average of 8 seconds before impact. Additional forensic data available for eleven of the collisions indicates that no driver took evasive action between 2 and 5 seconds before impact, and that the vehicles all reported hands on the steering wheel prior to impact.

However, most drivers appeared to comply with the driver engagement system of the affected vehicle, as evidenced by the detection of hands on the steering wheel and by the fact that nine of the eleven vehicles exhibited no visual alerts or driver engagement chimes until the final minute before the collision (four of them showed no visual alert or chime at all during the last cycle of Autopilot use).

For years, the agency has grappled with Tesla, and more specifically the company’s outspoken founder, Elon Musk. Anonymous officials and former regulators earlier this year described the heated reactions agency workers received from Musk, and noted that regulators have had to learn how to deal with a combative enterprise.

In a letter to Musk last fall, Jennifer Homendy, chair of the National Transportation Safety Board (NTSB), a separate federal agency that investigates crashes and makes recommendations to other agencies such as NHTSA, urged Musk to respond to the safety recommendations it issued for Autopilot in 2017. The NTSB recommended that Tesla put in place safeguards that would not allow drivers to operate the vehicles in a way that is inconsistent with their design.

A recent Twitter conversation between journalist Ryan McCaffrey and Tesla CEO Elon Musk revealed that Tesla is considering making Steam games available in its cars. Steam is the largest digital video game distribution platform. Musk mentioned that the company is working to bring Steam’s extensive library of video games to its vehicles rather than adding specific titles individually.

Steam is an online content distribution, rights management, and communication platform developed by Valve and oriented toward video games. It allows users to buy games and in-game content, update them automatically, and manage the multiplayer side of games, and it offers community tools built around games that use Steam.

The National Highway Traffic Safety Administration (NHTSA) wants to know why Tesla lets its car owners play solitaire while driving. NHTSA has released a statement indicating that it is reviewing this feature. “We are aware of driver concerns and are discussing this feature with the manufacturer,” NHTSA said.

US state laws seem to weigh against the idea of having a game visibly running while a car is in motion, even if the driver is not the one playing. A 2014 review by the Consumer Electronics Association found that video screen restriction laws were in effect in 34 states and the District of Columbia.

Although specific laws vary, most regulations focus on the operation of television screens that are visible to the driver while the car is in motion (California law more broadly restricts the use of any video monitor or screen or any other similar device that displays a video signal).

The NTSB has reviewed NHTSA’s Advance Notice of Proposed Rulemaking (ANPRM) entitled “Framework for Automated Driving System Safety”. In the notice, NHTSA seeks input on developing a framework for automated driving system (ADS) safety. ADS, as defined by SAE International and as used in the ANPRM, refers to Levels 3, 4, and 5 of driving automation.

An ADS is the hardware and software that are collectively capable of performing the entire dynamic driving task on a sustained basis, regardless of whether it is limited to a specific operational design domain. Specifically, the agency is seeking comment on its role in facilitating ADS risk management through guidance, regulation, or both.

NHTSA is also seeking comment on how it should select and design a safety framework structure and appropriate administrative mechanisms to enhance safety, mitigate risk, and enable the development and introduction of innovative safety technologies. This recommendation appears in the list of related safety recommendations attached to the NTSB’s response to the ANPRM.

And you?

What do you think?

What do you think of Tesla’s public comments on crashes involving these vehicles?

What outcome do you foresee for this new investigation?

See also:

Tesla drivers can play video games in moving cars, feature raises new safety issues related to distracted driving

US Government asks Tesla why people can play video games in moving cars, feature under review

Tesla’s Autopilot can be “easily” tricked into working without anyone in the driver’s seat, according to Consumer Reports, but you shouldn’t try to

Tesla must now report Autopilot-related crashes to the government or face fines, federal road safety agency says
