Tesla crashes highlight the “black box” challenge for investigators


(Bloomberg) – Three people were killed in two crashes involving Tesla Inc. vehicles on Sunday, and police in Indiana and California are still trying to answer a key question: who, or what, was in control at the time of each crash?

Shortly after midnight in Gardena, California, a Tesla Model S ran a red light and struck a Honda Civic, killing both occupants of the Civic. Hours later, a Tesla Model 3 slammed into a fire truck parked in the left lane of an Indiana freeway, killing a woman and injuring her husband.

Police have not yet determined whether the drivers were using Tesla's suite of automated driver-assistance features, known as Autopilot, which has drawn the attention of federal safety agencies, or whether they were driving their vehicles manually. They expect to find out in the coming days.

The distinction will be important in determining the cause of the accidents and in answering questions regulators have raised about the industry's rapid shift toward vehicles that take over some or all of a driver's functions.

Conventional vehicles have for years been equipped with so-called event data recorders, the "black boxes" that log the key information needed to reconstruct a crash sequence, such as braking, airbag deployment and other measurements, and that can be read out with widely available tools.

However, data describing how automated driving technologies performed in the seconds before a crash is accessible only to the vehicle manufacturers. That means investigators who want to know whether an automated system was active during an accident must go to the company that made the car.

"We should never have to rely on the manufacturer to translate this kind of data, because the manufacturer may bear responsibility for product defects and has an inherent conflict of interest," said Sam Abuelsamid, an analyst at Navigant Research in Detroit.

He said a standard method should be in place to determine which control modes are active before and during a crash.

"As more vehicles with these partial automation systems come into use, information about the automation status and the driver's actions should be recorded," he said.

After a fatal Tesla crash in 2016, the National Transportation Safety Board asked the Department of Transportation to define what data should be collected. That work has not yet been completed.

"As more and more manufacturers deploy automated systems in their vehicles, improving system safety will require detailed information about how active safety systems perform during a crash sequence and how drivers respond to them," the NTSB warned in a report on the crash. "Manufacturers, regulators and crash investigators all need specific data in the event of a system malfunction or crash."

Tesla did not reply to emails seeking comment. The company emphasizes that drivers remain ultimately responsible for controlling their vehicles while using Autopilot and must keep their hands on the steering wheel at all times. The company has also vigorously pushed back against criticism that the system is unsafe, often pointing to quarterly data it releases showing that drivers using Autopilot are safer than those driving without the system.

The National Highway Traffic Safety Administration said it was reviewing the California crash. A spokesman for the agency declined to comment on whether Autopilot was suspected of being in use in the crash.

Raul Arbelaez, vice president of the Insurance Institute for Highway Safety's Vehicle Research Center, said the inability to access such data hampers researchers' ability to understand how automated driver assistance performs in the field, especially in less serious accidents, which make up the vast majority of road crashes.

"How do people interact with these technologies? Under what conditions do they typically operate? Do they perform terribly in snow and rain at night, or are they excellent under those conditions?" he said. "I'm sure that data is very useful to automakers in improving their products, but in terms of understanding the performance of the current fleet, we really don't have quick access to those accidents without going to the manufacturers."

That was the case in 2016, when the NTSB and NHTSA investigated the first fatal crash involving Tesla's Autopilot system. In that collision, the driver of a Tesla Model S was killed after the car drove under a semi-trailer crossing a highway, shearing off the Tesla's roof.

Florida crash

The vehicle involved in the Florida crash did not have a traditional event data recorder that could be read with widely available tools, the NTSB said. Instead, the Tesla collected extensive data that the company provided to the NTSB, showing that the driver was using Autopilot at the time of the accident.

And even if the Tesla had had an event data recorder, the 15 data parameters that such devices must capture under a 2006 regulation are "not sufficient to answer even the simplest of questions of who/what was controlling an automated vehicle at the time of a crash," the NTSB wrote in a report detailing its investigation.

Tesla vehicles have had event data recorders since at least 2018, and the company offers tools for the public to download crash data.

The NTSB urged the U.S. Department of Transportation to define which data parameters are needed to understand the automated vehicle control systems involved in an accident, work that, according to an agency spokesman, has yet to be completed.

The NTSB is investigating other crashes involving Tesla vehicles that occurred while Autopilot was in use, including a fatal collision in Mountain View, California, in March 2018. NHTSA has now investigated 13 crashes it believes occurred while Autopilot was engaged, including a December 7 crash in Connecticut in which a Tesla rear-ended a parked police cruiser.

Real-world data

The number of investigations the agency has opened shows its interest in how the new technology is being used in the field, said Frank Borris, former director of the Office of Defects Investigation at NHTSA. According to Borris, the agency's crash investigators are tasked with collecting real-world crash examples to better understand areas of potential road-safety risk, including new technologies.

Borris said he had long worried that Tesla's decision to call its system "Autopilot" might lead drivers to rely on it too heavily in scenarios it was not designed to handle. He also said there is little empirical data documenting the real-world performance of Autopilot and other automated driver-assistance technologies.

In the meantime, investigators' need for such data is likely to become more pressing as automated systems become more widely available.

In April 2019, Tesla Chief Executive Officer Elon Musk said during an autonomy day for investors at the company's headquarters in Palo Alto, California, that Tesla would have its first robotaxis operating with no one in them the following year.

During the October earnings call, Musk said Tesla should be able to release the software that would allow a Tesla to become a robotaxi "by the end of next year."

He added that regulatory acceptance will vary from jurisdiction to jurisdiction, "but that transition, going from a car that is not a robotaxi to one that is, is probably going to be the biggest increase in asset value in history."

– With assistance from Dana Hull.

To contact the reporter on this story: Ryan Beene in Washington at rbeene@bloomberg.net

To contact the editors responsible for this story: Jon Morgan at jmorgan97@bloomberg.net, Elizabeth Wasserman

For more articles like this, please visit us at bloomberg.com

© 2020 Bloomberg L.P.