Does Tesla’s Autopilot software have something against emergency services? That’s a flippant question, but there’s something underneath it. On Tuesday, a Model S electric vehicle—with Autopilot engaged, according to the driver—crashed into a police car in Laguna Beach, California. The police car was unoccupied at the time, but the Tesla driver sustained minor injuries. Last month, another Model S, also under Autopilot, slammed into the back of a stationary fire truck in South Jordan, Utah, resulting in a broken ankle for the Tesla driver. And in January, a third Autopiloted Model S plowed into the back of another fire engine, this time in Culver City, California.
It’s probably best to avoid the conspiracy theories, though. This isn’t a bug that makes Autopilot’s sensors hostile to flashing lights—it’s inattentive drivers who should have been paying attention to the road. As we learned last year, automatic emergency braking is only trained to work in a relatively narrow set of circumstances, typically when a moving vehicle is directly ahead of the car. So a stationary emergency vehicle on the shoulder of the road, particularly one parked at an angle, might not be classified properly and the function never triggers.
It has been a rough couple of weeks for Autopilot. The suite of advanced driver assistance systems, which includes adaptive cruise control and lane keeping, has also been blamed for destroying a Model 3 in Greece last week. In that case, the facts are even murkier—the car was on an unsupported road trip at the time, and Tesla had warned the owner before he set off.
“While we appreciate [driver] You You Xue’s effort to spread the word about Model 3, he was informed that Tesla does not yet have a presence in Eastern Europe and that there is no connectivity or service available for vehicles there,” a Tesla spokesperson told Ars. “In addition, Model 3 has not yet been approved and homologated for driving outside of the U.S. and Canada. Although we haven’t been able to retrieve any data from the vehicle given that the accident occurred in an unsupported area, Tesla has always been clear that the driver must remain responsible for the car at all times when using Autopilot. We’re sorry to hear that this accident occurred, and we’re glad You You is safe.”
One might feel some sympathy for Tesla, as it’s often the company’s most loyal “superusers” who keep getting it into trouble by pushing the bounds of the system—familiarity breeds contempt, after all. Once again, it’s important to stress that Autopilot is not a self-driving system and was never designed to allow the driver to cede situational awareness to the car. If you drive a Tesla and use Autopilot—or any car with adaptive cruise control—it’s always your job to be paying attention to the road ahead.
Are Tesla’s safety claims backed up by the data?
It’s reasonable to ask why crashes involving Teslas get covered when the overwhelming majority of the 40,000-odd road deaths in the US each year receive no such scrutiny. There are a couple of factors at play. The first is Autopilot itself, which by design allows cars to travel for long intervals without human interaction and with no driver monitoring beyond a torque sensor in the steering wheel. (By contrast, the industry standard for other adaptive cruise control and lane keeping systems is just 15 seconds of hands-free operation before deactivation.) Hence, every time there is a crash involving a Tesla, the first question anyone asks is “was Autopilot driving?”
Then there’s the fact that Tesla itself repeatedly talks up the safety of its cars, thereby inviting media attention. At various times it has claimed its vehicles are four times safer than average—and sometimes that they’re the safest cars on the road. Tesla EVs do indeed score very well in crash testing—even if the Insurance Institute for Highway Safety (IIHS) did not include the Model S among the three safest full-size sedans in 2017. Neither the Model S nor Model X is included in the institute’s list of top safety picks for 2018.
But there’s also reason to be skeptical of the company’s claims. For instance, Tesla repeatedly cites a National Highway Traffic Safety Administration statistic that the introduction of Autosteer to Autopilot reduced crashes by 40 percent. But last month, NHTSA told us that it was a “cursory comparison” and that the agency “did not assess the effectiveness of this technology.”
It’s reasonable to expect that a luxury car like a Tesla would have a higher-than-average safety record, based both on owner demographics and the average age of the vehicles. On the other hand, the Model S did not appear on the IIHS’s list of 11 vehicles that recorded zero occupant deaths between 2012 and 2015, a list that included several other luxury cars and SUVs. And in just the past few weeks, there has been a spate of fatal Tesla crashes, both here in the US and in Norway and Switzerland.
“In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident,” Tesla told us. “Tesla Autopilot does not prevent all accidents—such a standard would be impossible—but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.” (Note that Autopilot is believed to have been a factor in only one of the recent fatal crashes.)
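For what it’s worth, Tesla’s “3.7 times” figure is simply the ratio of the two per-fatality mileage numbers in its statement; a quick back-of-the-envelope check (the variable names are ours, and the underlying figures are Tesla’s own, not independently verified data):

```python
# Figures quoted in Tesla's statement, not independent measurements
us_miles_per_fatality = 86_000_000       # all vehicles, all manufacturers
tesla_miles_per_fatality = 320_000_000   # vehicles with Autopilot hardware

# Tesla's claimed safety multiplier is just the ratio of the two rates
ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(f"{ratio:.1f}")  # prints 3.7
```

Of course, as noted above, a raw miles-per-fatality comparison ignores differences in owner demographics, vehicle age, and the kinds of roads on which Autopilot is typically engaged.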