Should Autonomous Cars Be Driving More Like Us?

By Bridget Clerkin | October 25, 2018
Self-driving vehicles are built to drive better than humans. But, as robots, they lack a key faculty: the ability to interpret the subtleties of body language.

One of the biggest and most consistent arguments for taking on the immense task of filling the world with autonomous cars has been their potential for safety.

Experts have predicted that the robotic vehicles, impervious to distraction, sleepiness, and all other biological issues facing their flesh-and-blood counterparts, could save a minimum of 3,000 lives a year—and as many as 500,000 lives over the course of the next 50 years. And despite a recent dip in the annual traffic fatality toll, human drivers remain far from perfect, causing 37,133 deaths on the road last year alone.

Yet one engineering firm is arguing that learning to drive more like us will be a crucial next step in the evolution of autonomous rides.

The issue arises in one place where human motorists have an inherent leg up on the mechanical chauffeurs: interpreting the subtleties of body language.

Whether drivers realize it or not, they likely make a number of decisions based on these tiny interactions every time they hit the road, from assessing the intentions of pedestrians and bicyclists to determining who should move next through a four-way stop.

The new program, developed by Boston-based group Perceptive Automata, attempts to interpret this distinctly human form of communication for the robo-cars, allowing them to weigh whether a pedestrian is planning to cross the street, and then to move, or stay put, accordingly.

Currently, the cars are programmed to remain overly cautious, directed to sit tight and wait out any scenario in which they detect a human who looks to be crossing the street. But even if a person changes their mind, stops short, and waves the car on, the vehicles remain idle, causing confusion and potential issues with other cars on the road.
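
To make that contrast concrete, the default behavior amounts to a blanket rule along these lines. The sketch below is purely illustrative; the class names, detection threshold, and logic are hypothetical stand-ins, not anyone's production code:

    from dataclasses import dataclass

    @dataclass
    class Pedestrian:
        distance_to_curb_m: float   # meters from the edge of the roadway
        facing_roadway: bool        # oriented toward the street

    def should_yield(pedestrians: list[Pedestrian]) -> bool:
        # Naive rule: stop and wait if anyone near the curb is facing
        # the road, with no attempt to read their actual intent.
        return any(
            p.distance_to_curb_m < 2.0 and p.facing_roadway
            for p in pedestrians
        )

Under a rule like this, a person who stops at the curb and waves the car through still counts as a potential crosser, so the vehicle stays frozen.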

With the new software, the vehicles will instead learn to look for forward momentum: whether a person is walking with the clear intention of crossing, or stops, stutter-steps, or otherwise dawdles over the decision. Engineers hope the tweak will also give further momentum to the adoption of driverless rides in general, allowing them to slip more seamlessly into the stream of traffic.
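
Here is a rough sketch of how such an intent estimate might feed the yield decision, again with every feature, weight, and threshold invented for illustration (Perceptive Automata's actual program relies on deep-learning models rather than hand-tuned rules like these):

    from dataclasses import dataclass

    @dataclass
    class PedestrianTrack:
        speed_toward_curb_mps: float  # positive values mean moving toward the street
        stops_in_last_3s: int         # stutter-steps or full stops recently observed
        waved_vehicle_on: bool        # an explicit "go ahead" gesture was detected

    def crossing_probability(track: PedestrianTrack) -> float:
        # Toy score in [0, 1]; a real system would learn this from data.
        score = 0.5
        score += 0.3 * min(track.speed_toward_curb_mps / 1.4, 1.0)  # 1.4 m/s ~ walking pace
        score -= 0.2 * min(track.stops_in_last_3s, 2)               # hesitation lowers intent
        if track.waved_vehicle_on:
            score = 0.0                                             # clear signal to proceed
        return max(0.0, min(1.0, score))

    def should_proceed(track: PedestrianTrack, threshold: float = 0.6) -> bool:
        # Drive on only when the estimated crossing probability is low enough.
        return crossing_probability(track) < threshold

Note how the wave-on case from the earlier scenario resolves itself here: an explicit gesture zeroes the crossing estimate, so the car can proceed instead of idling.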

All told, the predictive algorithm attempts to give autonomous cars a bit of human intuition. And with the deep-learning algorithms powering the program, it likely won’t be long before the vehicles master the task—and start asking each other, “Why did the human cross the road?”
