Self-driving cars are the way of the future, but they still have a long way to go. Though technologically advanced, these vehicles have design limitations that can be deadly. On September 12, 2017, the National Transportation Safety Board (NTSB) determined that a May 2016 crash involving a Tesla Model S operating on Autopilot was caused primarily by the automated control systems. The driver, Joshua Brown, was alleged to have been inattentive at the wheel, relying too much on the technology, and slammed into a tractor-trailer as the truck's driver was making a left-hand turn.
Cars such as the Model S fall on a self-driving scale of 0 to 5. Level 5 represents the most advanced technology: the vehicle can drive itself all of the time. The Model S falls at Level 2, considered partial automation, meaning the system handles steering, acceleration, and braking, but the driver must still monitor the road and the surrounding environment. Use of Level 2 cars should be limited to interstates that don't have any intersections. Brown was driving on a divided highway near Gainesville, Florida, which, although infrequently, does allow some access to and from side roads. Tesla did not include any safeguards against use on these more dangerous roads, and since Brown's accident, it still has not.
“In this crash, Tesla’s system worked as designed, but it was designed to perform limited tasks in a limited range of environments,” NTSB Chairman Robert Sumwalt explained. “Tesla allowed the driver to use the system outside of the environment for which it was designed.”
So, what safeguards should Tesla, as well as other semi-autonomous car manufacturers, put in place for their vehicles? The NTSB recommends that automakers help fight against distracted driving, which is ultimately what led to Brown's death. To do this, systems that ensure drivers stay attentive should be implemented. Tesla already has a feature that detects hand pressure on the steering wheel, but it needs to go further. In its investigation, the NTSB found that Brown had his hands on the wheel for only 25 seconds of the 37.5 minutes that the car's cruise control and lane-keeping systems were engaged. The car's sensors weren't capable of detecting the truck in its path and needed the driver to intervene.
As technology in vehicles rapidly advances, accidents like the 2016 crash illustrate just one facet of the dilemmas that come with the rise of autonomous driving vehicles. Over-reliance on design (and the failure to recognize its limitations) is one concern, but there are larger ethical questions facing this burgeoning industry.
Rob Wallis is CEO of TRL, a company that provides research, technology, and software for ground transportation, including cars. The company is asking questions that policymakers must think about as more of these cars hit the road. "From a TRL perspective," he told the London Assembly Transport Committee, "the biggest challenge we see is in the ethical decisions that need to be made around when should a vehicle hit a tree and hurt the occupants rather than hit the pedestrian and save the life of the occupants, or other such scenarios."
Additional considerations include driver's licenses. If a car can completely drive itself, does someone still need a driver's license? Should the standard driving test change to incorporate these new technologies? These are all questions that need to be discussed and decided within the next decade.
Hacking is also a concern. Before we can safely deploy autonomous trucks and taxis, guarding against cyberattacks ought to be addressed. Theoretically, digital criminals could target and control self-driving vehicles, wreaking havoc around them. There have been no reports of this happening, but the MIT Technology Review urges us to consider the possibility. The publication likens it to email in the 1990s. At first, email seemed like the answer to unwanted junk mail. Then it went through a period when spam was a huge problem. Decades later, however, email security has come a long way. Learning from this progression, autonomous automakers need to consider and plan for this type of threat.
Autonomous vehicles represent an exciting new frontier for driving, but that progress shouldn't come at the expense of drivers and the others who share the road.