The recent apparent failure of Tesla's semi-autonomous driving feature, Autopilot, which resulted in a fatal crash in Florida, has focused public and regulatory attention on the safety and efficacy of technology that is still in beta testing.
Consumer Reports, in a strongly worded report, urged Tesla to do the following:
- Disable Autosteer [part of the Autopilot system functionality] until it can be reprogrammed to require drivers to keep their hands on the steering wheel
- Stop referring to the system as “Autopilot” as it is misleading and potentially dangerous
- Issue clearer guidance to owners on how the system should be used and its limitations
- Test all safety-critical systems fully before public deployment; no more beta releases
Tesla issued the following email reply to Consumer Reports’ request to address the concerns: “Tesla is constantly introducing enhancements, proven over millions of miles of internal testing, to ensure that drivers supported by Autopilot remain safer than those operating without assistance. We will continue to develop, validate and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.”
Agencies Launch Investigations
The National Highway Traffic Safety Administration (NHTSA) last week sent Tesla a letter requesting detailed data on the company’s Autopilot system. The requested information includes any system design changes and updates and detailed logs of when the system prompted drivers to take control of the car’s steering.
According to the Consumer Reports article, the Florida crash also spurred the National Transportation Safety Board (NTSB) to open an investigation into whether the crash reveals systemic issues in the development of autonomous cars, alongside probes into other accidents involving Teslas running Autopilot.
The Wall Street Journal reports that the Securities and Exchange Commission (SEC) is investigating Tesla for possible breach of securities law by failing to inform investors of the May 7 crash in a timely fashion.
Consumer Reports’ Tesla Test Fleet
Consumer Reports says that it has owned three Tesla vehicles, a 2013 Model S 85, a 2014 Model S P85D, and a 2016 Model X 90D. Disturbingly, the publication notes that testers of the Model X in Autopilot mode, during recent evaluation on a long, straight road, experienced more than a three-minute delay before the system recognized that the driver’s hands were off the wheel.
The publication also points out that research shows that drivers are “notoriously bad” at re-engaging complex tasks after their attention has wandered. It cites a 2015 NHTSA study that found test subjects took anywhere from three to 17 seconds to resume control of a car in semi-autonomous mode after being alerted that the car was no longer under control of the computer.
As Consumer Reports cautions, at 65 mph that means covering anywhere from 100 feet to a quarter-mile with effectively no driver control of the vehicle.
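The distance arithmetic behind that warning can be sketched in a few lines. This is a rough illustration under a constant-speed assumption; the function name is ours, and the figures are illustrative rather than taken from the NHTSA study:

```python
def takeover_distance_ft(speed_mph: float, delay_s: float) -> float:
    """Feet traveled during a driver-takeover delay at a constant speed."""
    feet_per_second = speed_mph * 5280 / 3600  # convert mph to ft/s
    return feet_per_second * delay_s

if __name__ == "__main__":
    # 65 mph is about 95 ft/s, so even short delays cover real distance.
    for delay in (3, 17):
        d = takeover_distance_ft(65, delay)
        print(f"{delay:2d} s at 65 mph -> {d:,.0f} ft ({d / 5280:.2f} mi)")
```

Even at the low end of the reported range, the car travels the length of a football field before the driver is back in control.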
Autopilot Not Ready for Prime Time
Drivers under the mistaken impression that they are free to watch videos, check email, read or do anything other than focus fully on driving are also at risk: when the system fails to detect an obstacle it was supposed to see, they may not quickly comprehend what is happening or what they need to do.
This is more than distracted driving; it is not driving at all, followed by the need to react instantly in a potentially life-threatening situation.
What this clearly indicates is that Autopilot isn’t nearly ready to be deployed. This also magnifies several issues:
- There aren’t any roads or highways devoted to autonomous or semi-autonomous driving yet. That means cars driven by drivers will need to share the road with vehicles in semi-autonomous driving mode – a very perilous situation.
- The technology for driverless cars is still in development and needs to be rock-solid. Multiple safeguards need to be installed to help prevent catastrophic failure.
- Consumers will need to be trained how to use the system.
- Mandatory safety standards for driverless cars should be issued by NHTSA, and the agency should insist on independent third-party testing and certification of semi-autonomous and autonomous driving features.
Laura Cleary, Consumer Reports’ vice president of consumer policy and mobilization said in a statement that self-driving systems “could [eventually] make our roads safer,” and continued, “but today, we’re deeply concerned that consumers are being sold a pile of promises about unproven technology.”
Other calls for Tesla to rename the system to something less misleading than Autopilot may fall on deaf ears. The company has given no indication it has any intention of renaming Autopilot. The feature does, however, come with a warning that drivers should still at all times keep their hands on the wheel.
That won’t stop some early adopters from being tempted to push the technology to its limits, as the unfortunate victim of the Florida crash repeatedly did.
Is there a lesson here? Are we as a society so fixated on the latest and best technology that we ignore basic common sense when it comes to using it properly? How many more senseless deaths and serious injuries will it take before appropriate safety and regulatory measures are implemented?
In our view, you don’t need to be a rocket scientist to see the wisdom in this. While we applaud research and development of new technology that makes our lives easier and driving safer and less strenuous, it’s time to face reality: neither we nor the technology are ready for a totally hands-off approach to driving.