The National Highway Traffic Safety Administration (NHTSA) has announced an investigation into Tesla’s Autopilot driver-assistance system.
The NHTSA said it was acting following 11 Tesla crashes since 2018 involving emergency vehicles.
In some cases, the Tesla vehicles “crashed directly into the vehicles of first responders,” it said.
The investigation will cover roughly 765,000 Tesla cars made since 2014, including the Model Y, Model X, Model S and Model 3.
The agency was primarily concerned with an apparent inability of Tesla vehicles to cope with emergency vehicles – and more specifically, cars stopped in the road.
Among the list of cases was one where a Tesla “ploughed into the rear” of a parked fire engine attending an accident, and another in which a parked police car was struck.
The NHTSA said it was opening its preliminary investigation into “the technologies and methods used to monitor, assist, and enforce the driver’s engagement”, while using Autopilot.
It said that in the 11 crashes it investigated, either Autopilot or a system called Traffic Aware Cruise Control had been active just before each collision.
Tesla’s Autopilot is a driver-assistance system that can steer, accelerate and brake automatically. It is an innovative and promising technology, but it has come under fire for its misleading name: it does not drive the car autonomously, and drivers are still required to maintain control and attention at all times.
The company has promised to release fuller self-driving capabilities, which are now available to some users in a beta version.
The system is not perfect, and its design allows for some misuse. Drivers have been known to use their phones while the car travels unattended, and even to switch seats, leaving no one at the wheel.
In a statement, an NHTSA spokesperson said: “No commercially available motor vehicles today are capable of driving themselves. Every available vehicle requires a human driver to be in control at all times”.
The investigation’s supporting documents do, admittedly, note the challenging circumstances that were involved in many of these collisions.
“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” it reads.
The report comes just days ahead of an event to showcase the cars’ latest software.
Elon Musk, the chief executive of Tesla, had previously announced 19 August as “Tesla AI Day”, which he said would showcase the progress of the firm’s artificial intelligence systems – with a view to recruiting AI experts and others passionate about these technologies.