By all accounts, it was never a question of if, but when, there would be a fatality in a self-driving car. The recent fatal accident involving a 2015 Tesla Model S has the double-whammy effect of casting a shadow on the future of self-driving cars and tarnishing the Tesla image.
The Tesla Model S was in Autopilot mode when the incident occurred. While the Model S is not a driverless car, its Autopilot feature is an assistive technology that allows for partially autonomous driving. Autopilot, still in its beta testing period, is capable of completing tasks like merging onto a highway, steering down it and changing lanes without driver intervention, although drivers are instructed to keep their hands on the wheel while Autopilot is in use.
NHTSA Opens Investigation
The accident raised serious questions about what happened, enough that the National Highway Traffic Safety Administration (NHTSA) is investigating it. NHTSA’s Office of Defects Investigation (ODI) has opened a preliminary evaluation to examine the design and performance of any automated systems in use at the time of the crash.
In a statement, Tesla said “This is the first known fatality in just over 130 million miles where Autopilot was activated… It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.”
Details of the Fatal Accident
The driver of the Tesla Model S, 40-year-old Joshua Brown, died at the scene from injuries sustained in the crash of his vehicle in Williston, Florida, on May 7. Against the backdrop of a hazy sky, Brown failed to detect the white side of a tractor trailer as it crossed the highway perpendicular to the Model S.
Neither did the car’s Autopilot detect the hazard.
A report of the crash published by Consumer Reports on Yahoo said that the top of Brown’s Model S was torn off by the force of the collision. The car continued to travel east on U.S. 27A until it left the roadway and struck a fence on the south shoulder. It smashed through two fences and struck a power pole, rotated counterclockwise, and finally came to rest some 100 feet south of the highway.
Jake Fisher, Consumer Reports’ director of auto testing, said, “This accident calls into question the wisdom of rolling out unproven technology to the public.”
According to various news reports and the Tesla statement, Brown, an Ohio resident, was “a friend to Tesla and the broader EV community.”
Brown Loved His “Tessy”
Among several media accounts of the tragic accident, a story in The New York Times relates how much Brown loved his all-electric car. He was so devoted to it that he nicknamed it “Tessy” and posted numerous videos of it on YouTube.
Ironically, Brown was most interested in seeing how far the Autopilot function would go, testing the limits of the technology. A Wall Street Journal article notes, however, that in one video Brown observed that while his 2015 Model S “sees moving vehicles great…even really slowly moving vehicles…It has a harder time with stopped ones.”
In one video, his most recent and most popular, “Autopilot Saves Model S,” Brown is shown driving on an interstate highway from Cleveland to Canton, Ohio. After a white truck cuts in front of Brown’s Model S, by Brown’s account, the car’s Autopilot caused the car to swerve to the right, thus avoiding a collision.
When Elon Musk, Tesla’s CEO, subsequently drew attention to the video on Twitter, it went viral.
Just a couple weeks later, Brown died in an eerily similar accident involving a white truck.
Implications for Autonomous Cars
What this tragic accident will do to the future of autonomous cars remains to be seen. It will no doubt reinforce the skepticism of those who think self-driving cars are unsafe, while bolstering the conviction of those who believe the technology must be allowed to evolve and will ultimately save countless lives.
As a story in Fortune points out, while the accident was tragic, it will likely have little effect on the forward momentum of self-driving car technology. Tesla stock suffered only a modest, temporary loss, although the fatality did somewhat sully the Model S’s previously unblemished safety record.
What is very likely is that this accident will intensify pressure to regulate automation features. There’s also the issue of liability and the potential for legal fallout, especially if a judge or jury finds Tesla liable.
Interestingly, a study published in October 2015 by the University of Michigan’s Transportation Research Institute found that self-driving cars had a higher crash rate, per million miles traveled, than traditional cars. At the time, however, no self-driving car had been found to be at fault in any of the crashes it was involved in.
Still, consumers are wary. Some don’t trust self-driving cars, although younger drivers are more likely to be accepting of this technology than older drivers. Others worry about who’s liable in the event of a crash. Federal, state and local regulations will come into sharper focus in the near future.
The utopian alternative would be to build cities where there is no need for drivers at all, where self-piloting cars take over the headaches and responsibilities of driving, freeing up space, greatly improving convenience, saving time and more. Short of that, the reality is that while development of fully autonomous cars will continue, so will the troublesome issues surrounding successful implementation.
In our view, changes are coming. Advocates and critics of autonomous cars will pay close attention, as should the rest of the driving public.