With each passing year, the world inches closer to the inevitable day when automated cars chauffeur us to our destinations. But before that day arrives, a host of safety and regulatory concerns need to be addressed, such as licensing.
The University of Michigan’s Transportation Research Institute this week released a report exploring whether a system akin to existing graduated driver licensing would logically apply to driverless cars. The answer is no … and yes.
The bottom line, according to researchers Michael Sivak and Brandon Schoettle, is that self-driving vehicles, unlike humans, won’t benefit from gradual immersion into the complex realm of open-road driving. A novice motorist will gain applicable experience from, say, exposure to nighttime or harsh-weather driving, whereas autonomous cars will simply follow their programming.
“A self-driving vehicle either has the hardware and software to deal with a particular situation, or it does not,” researchers stated. “If it does not, experience in other situations will not be of benefit.”
Still, many concerns remain, including variations among manufacturers’ computer hardware and software; vehicles’ ability to perform in severe weather; recognition of visual hazards, such as downed power lines or a flooded roadway; and general performance under normal conditions. In short, exactly which situations driverless cars can handle is still an open question.
Following traffic laws presents some peculiar challenges. Programming an autonomous vehicle to obey the law is, on its face, rather simple: Load the car’s computer with the traffic laws of all 50 states and let the car’s GPS position determine which set of laws applies. But what happens when a car follows traffic laws more strictly than the human drivers around it? The report cites an incident in which a Google car entered a roundabout and — likely because of aggressive driving by surrounding cars — determined the safest action was to continue circling.
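The location-based lookup described above can be illustrated with a minimal sketch. Everything here is hypothetical: the report does not specify an implementation, the rule values are invented, and real systems would use precise geofencing rather than the crude state bounding boxes used below.

```python
# Hypothetical per-state rule sets; the values are invented for illustration.
STATE_RULES = {
    "MI": {"max_speed_mph": 70, "right_turn_on_red": True},
    "OH": {"max_speed_mph": 65, "right_turn_on_red": True},
}

# Toy state boundaries as (min_lat, max_lat, min_lon, max_lon) boxes.
# A production system would use exact jurisdictional boundary polygons.
STATE_BOUNDS = {
    "MI": (41.7, 48.3, -90.4, -82.1),
    "OH": (38.4, 41.7, -84.8, -80.5),
}

def rules_for_position(lat, lon):
    """Return (state, rule set) for the jurisdiction containing (lat, lon)."""
    for state, (lat_min, lat_max, lon_min, lon_max) in STATE_BOUNDS.items():
        if lat_min <= lat < lat_max and lon_min <= lon < lon_max:
            return state, STATE_RULES[state]
    raise ValueError("position outside known jurisdictions")

# Ann Arbor, Michigan sits at roughly (42.28, -83.74).
state, rules = rules_for_position(42.28, -83.74)
```

Even this toy version hints at the edge cases the report raises: near a state line, GPS error alone could flip which rule set applies.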
“The fact that human drivers are relatively lax about many relevant laws and regulations creates a real quandary: Should manufacturers be allowed to program a vehicle to willfully break applicable laws?” researchers said.
Likewise, the vehicle could encounter a situation in which a lesser-of-two-evils decision must be made: for example, the split-second choice of whether to crash into a bicyclist or a baby stroller. “It would be desirable if the resolutions of such ethical dilemmas were consistent with societal norms, as is hopefully the case with human drivers,” researchers stated.
In addressing the initial limitations of driverless cars, the report concluded that some form of graduated licensing could be beneficial. A car could be tested and licensed for use only under conditions and circumstances for which it was deemed suited. Then, as manufacturers make improvements — like enabling the car to pass a snow-driving evaluation it previously was unsuited for — a full license could be issued for unrestricted driving.