A Tesla Model X crashed into a barrier on U.S. Highway 101 in Mountain View, Calif. Tesla says the driver, who was killed in the accident, did not have his hands on the steering wheel for six seconds before the crash. Photograph: AP

Self-driving car companies should not be allowed to investigate their own crashes

and Alan Winfield

Following another fatal Tesla crash, accident investigators have announced that they have stopped working with the company. Self-driving cars urgently need ‘ethical black boxes’ so that we can all learn from their mistakes.

Self-driving cars are learning to drive. The algorithms that control them need to be fed vast quantities of real-world data in order to improve. Cities and freeways, particularly in the US, are the laboratories in which they are being trained. Companies like Waymo, Uber and Tesla would argue that this real-world experience is vital for machine learning. Others would say that it creates an experiment in which other road users are unwitting test subjects. When technologies fail and people die, as happened with the Uber crash in Tempe last month, everyone, not just self-driving car companies, needs to learn what happened and why. Social learning must take precedence over machine learning.

For this reason, we should be worried by the news that the National Transportation Safety Board has thrown Tesla out of its investigation into the fatal crash of a Tesla Model X that was in Autopilot mode. The NTSB has announced that Tesla is no longer a party to the investigation because the car company broke the rules on speaking out, in effect prejudicing the conclusions of the inquiry.

The NTSB is a fascinating organisation. Its stock-in-trade is the ruthlessly forensic investigation of aeroplane crashes. Its investigators have the gruesome task of picking through aircraft wreckage to dig out the ‘black boxes’ that could reveal what happened in a flight’s last moments. The organisation binds airlines, pilots and governments together in a process of collective learning. It is interested in understanding rather than blaming, which is why the conclusions of its reports are not admissible in court. Its slogan: ‘Anybody’s accident is everybody’s accident’. The success of this model is clear: 2017 was the first year since the dawn of the jet age in which nobody died in a major airline jet crash. In the same year, more than a million people died on the world’s roads.

The NTSB runs a ‘party system’ in which transport companies and investigators work together to establish the probable cause of a crash. When Joshua Brown died after his Tesla hit a truck while on Autopilot in May 2016, a Tesla engineer joined the NTSB team to help dig out the car’s data. Among other things, the data revealed that the driver had barely touched the steering wheel in the 37 minutes before the crash.

Last month, Walter Huang died when his Tesla Model X hit a barrier on a Silicon Valley freeway and caught fire. The details of the crash remain unclear, which is why a thorough investigation is so important. Tesla’s response focused on the numbers: while the average death rate across all vehicles is one every 86 million miles, the rate for the cars they make is one death every 320 million miles. The statistics clearly didn’t satisfy the victim’s family, who, unlike Joshua Brown’s family, are suing Tesla.

Tesla have said that, for this latest crash, they will cooperate with the NTSB. But this is very different from being a party to the investigation. We should not have to rely on the goodwill of companies choosing to donate their data. Instead, self-driving cars should be equipped with their own ‘black box’ data recorders. These would be standard units that continuously record key data from the car’s sensors and algorithms, data that would allow accident investigators to construct a detailed timeline of an autopilot’s decisions leading up to the accident. The machine-learning systems inside self-driving cars mean that the data would be more complicated than in aircraft black boxes, but this should not be used as an excuse to let carmakers off the hook. In its investigation of the Joshua Brown crash, the NTSB was concerned about the inscrutability of the Tesla’s brain. One NTSB staffer told the board that ‘The data we obtained was sufficient to let us know the [detection of the truck] did not occur, but it was not sufficient to let us know why’.
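To make the proposal concrete, here is a minimal sketch of what such a recorder might look like, assuming nothing about any manufacturer’s actual systems: a fixed-capacity ring buffer that continuously logs timestamped sensor readings alongside the autopilot’s decisions, and can dump the retained window for investigators. All the names and fields here (BlackBoxRecorder, log, dump_timeline, the example sensor keys) are illustrative assumptions, not a real automotive interface.

```python
import json
import time
from collections import deque
from dataclasses import dataclass


@dataclass
class Record:
    """One timestamped snapshot: what the car sensed and what the autopilot decided."""
    timestamp: float
    sensors: dict    # hypothetical example: {"speed_mph": 68.0, "hands_on_wheel": False}
    decision: dict   # hypothetical example: {"steering_deg": -0.5, "mode": "autopilot"}


class BlackBoxRecorder:
    """Illustrative sketch of a continuous crash recorder.

    A fixed-capacity ring buffer: the oldest records are silently discarded
    as new ones arrive, so the unit always holds the most recent window of
    driving data, much as an aircraft flight recorder does.
    """

    def __init__(self, capacity: int = 10_000):
        self._buffer = deque(maxlen=capacity)

    def log(self, sensors: dict, decision: dict) -> None:
        """Called on every control cycle to append the latest snapshot."""
        self._buffer.append(Record(time.time(), sensors, decision))

    def dump_timeline(self, path: str) -> None:
        """Write the retained window as JSON: the timeline investigators would read."""
        with open(path, "w") as f:
            json.dump(
                [{"t": r.timestamp, "sensors": r.sensors, "decision": r.decision}
                 for r in self._buffer],
                f,
                indent=2,
            )


# Example: a recorder that keeps only the last five snapshots.
box = BlackBoxRecorder(capacity=5)
for step in range(8):
    box.log({"speed_mph": 60.0 + step, "hands_on_wheel": False},
            {"steering_deg": 0.0, "mode": "autopilot"})
box.dump_timeline("crash_timeline.json")  # contains only the final 5 records
```

The key design choice in the sketch is that the recorder captures what the algorithms decided as well as what the car did, so that the ‘why’ question the NTSB could not answer in the Brown case would no longer depend on a manufacturer volunteering its data.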

Since the 1960s, black boxes have been compulsory for aircraft. They are the key independent witnesses at the heart of air accident investigations, a technology of trust. Without the same technology, and robust regulation around it, driverless cars are unlikely to win public trust and fulfil the long-term promise of radically improved road safety. Tesla have a troubling history of rejecting industry-standard Event Data Recorders (EDRs) in their cars. The rule is that, if a car has an EDR, the manufacturer must share the data. Until recently, Tesla claimed that their EDR wasn’t a real EDR, so they didn’t have to share crash data. A few days before the most recent crash, the company announced that it would take a radically different approach, letting customers dig into their own cars’ data once they had bought an expensive kit. This attempt to change the conversation follows previous battles in which the company has rejected the authority of regulators. Tesla now complain that they are being unfairly singled out for attention.

The NTSB acquired its role and autonomy despite intense opposition from airlines. It was only once a US Senator was killed in a 1935 plane crash that Congress took seriously the need for an independent accident investigator. Individual companies may think they can and should control the story once crashes happen. Someone needs to persuade them they are misguided. More cautious car companies should worry about the reputational fallout from these recent incidents. If the government doesn’t want to make self-driving car regulation more like airline regulation, perhaps other car companies will recognise the need to protect themselves. After all, anybody’s accident is everybody’s accident.
