Emotional Affairs: How Autonomous Cars Are Reading Our Faces—and How It May Help

By Bridget Clerkin | April 23, 2018
Autonomous cars are being outfitted with technology to better read the emotional state of their passengers and adjust their driving patterns accordingly.

From road rage to joyrides, driving is often an emotional undertaking, with the full gamut of human feelings extending even to those not behind the wheel. (Hello, backseat drivers.)

There’s a biological reason for such a deep connection to the pursuit, and it’s called “novelty seeking.” The mode of transportation may be new, but hitting the road certainly wasn’t new to our ancestors, and as they explored (and eventually populated) the four corners of the globe, evolution favored a reward system that sends small shots of dopamine to the brain to encourage more risky, exploratory behavior.

Psychologically, driving is also extraordinarily engaging (or at least, it should be): the mind must absorb the myriad elements that combine to create any given roadway scenario, and our reactions depend on still more factors, from our current mood to our past experiences to how much (or how little) we think of our fellow motorists.

Still, driving is becoming less human by the hour, as millions of robots train to turn the skill from an art form into a simple act.

Indeed, their complete lack of emotion is why self-driving vehicles have been predicted to all but eliminate roadway deaths, but that’s not to say they can’t learn anything from their carbon-based passengers.

Some researchers have realized that a look can go a long way—and a few have begun training their self-driving systems on how to use that information to help autonomous cars go the extra mile.

Monkey See

By using facial recognition technology, software inside self-driving cars can track a range of 20 different facial expressions.

The ability to read a crowd—and the precious chance it gives one to chart a new course should things not look good—is crucial for anyone providing a service, and the same is true for the new-age chauffeurs.

An autonomous car could be programmed to sense how its passengers are feeling and adjust accordingly, whether by slowing down to accommodate a fearful-looking rider or by changing routes to placate a passenger who seems frustrated with current conditions on the road.
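As a rough sketch of what that “adjust accordingly” logic might look like (the emotion labels, confidence threshold, and action names below are illustrative assumptions, not Renovo’s or Affectiva’s actual interface):

```python
# Illustrative sketch: map a detected passenger emotion to a driving
# response. All labels, thresholds, and actions here are invented for
# demonstration only, not any vendor's real API.

DRIVING_ADJUSTMENTS = {
    "fear": "reduce_speed",       # slow down for a fearful-looking rider
    "anger": "reroute",           # find a calmer route for a frustrated one
    "joy": "maintain_course",     # nothing to change if the ride is pleasant
}

def adjust_driving(emotion: str, confidence: float, threshold: float = 0.8) -> str:
    """Return a driving adjustment when an emotion is read with enough confidence."""
    if confidence < threshold:
        return "maintain_course"  # don't react to a weak or ambiguous reading
    return DRIVING_ADJUSTMENTS.get(emotion, "maintain_course")

print(adjust_driving("fear", 0.93))  # -> reduce_speed
```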

At least, that’s what the brains behind start-up Renovo Auto are hoping.

The Silicon Valley tech company has partnered with artificial intelligence (AI) startup Affectiva to let its cars track, learn from—and eventually respond to—their passengers’ expressions in real time.

The technology works by pinpointing several areas on a human face, including the slope of the eyebrows, the tip of the nose, and the corners of the mouth, and monitoring how they move over the course of a drive. The individual expressions are then strung together until the computer can successfully map the mosaic image back to an emotion.

As it turns out, many humans have a worse poker face than they may think, and the face-reading trick has already led the software to recognize 20 different facial expressions and seven “emotion metrics”: anger, contempt, disgust, fear, joy, sadness, and surprise.
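To make that pipeline concrete, here is a deliberately simplified sketch of the idea: per-frame landmark readings are classified into expressions, and the accumulated sequence is mapped back to an emotion metric. Every field name, threshold, and rule below is an invented stand-in for illustration; Affectiva’s production models are far more sophisticated.

```python
from collections import Counter

# Toy stand-in for the landmark data the article describes: eyebrow slope
# and mouth positions per video frame. Values and field names are invented.
video_frames = [
    {"mouth_corner_y": 0.40, "mouth_center_y": 0.45, "brow_slope": 0.0},
    {"mouth_corner_y": 0.40, "mouth_center_y": 0.45, "brow_slope": 0.1},
    {"mouth_corner_y": 0.50, "mouth_center_y": 0.45, "brow_slope": -0.3},
]

def classify_expression(landmarks: dict) -> str:
    """Label one frame's expression from a few landmark positions.
    (In image coordinates, a smaller y means higher on the face.)"""
    if landmarks["mouth_corner_y"] < landmarks["mouth_center_y"]:
        return "smile"        # mouth corners pulled up
    if landmarks["brow_slope"] < -0.2:
        return "brow_furrow"  # inner brows pulled down
    return "neutral"

def infer_emotion(counts: Counter) -> str:
    """Map the accumulated expression sequence back to an emotion metric
    (the article lists seven: anger, contempt, disgust, fear, joy,
    sadness, and surprise)."""
    if counts["smile"] > counts["brow_furrow"]:
        return "joy"
    if counts["brow_furrow"] > 0:
        return "anger"
    return "neutral"

# String the individual frame expressions together over the drive.
counts = Counter(classify_expression(f) for f in video_frames)
print(counts, "->", infer_emotion(counts))
```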

But the concept offers much more than a vocabulary lesson for self-driving cars, or even the chance to make better rides—it opens up an entirely new channel of human-cyborg relations that can help foster one of the most important emotions of all: trust.

“We spend a lot of time trying to figure out how to sense inanimate objects with LiDAR and cameras, and that’s super important,” Renovo CEO Chris Heiser told The Verge. “But automated mobility has a huge human component. And companies like Affectiva give us a brand new data stream to look at and help every single one of the people in our ecosystem—people who are building self-driving, people who are building teleoperation, people who are building ride-hailing applications—they all want to know how people are feeling and reacting to these automated vehicles.”

Sleeping Aids

GM's Super Cruise technology, seen here in a Cadillac CT6, reads a driver's face to detect when they become drowsy or fall asleep behind the wheel.

Still, it’s not just heightened human emotion the computers will be looking out for.

A lack of any real activity on our faces is another big indicator to the cars that something is amiss, likely because a driver is asleep or otherwise not paying attention to the road.

In that case, especially while humans are still expected to maintain at least some control over the car, it becomes paramount that the vehicle is able to not only identify drowsiness or distraction but intervene, either by flashing warning lights and sounding alerts to snap someone out of it or by letting its autopilot temporarily take control of the wheel.
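One way to picture that escalation is the minimal sketch below; the timing thresholds and action names are assumptions for illustration, not GM’s actual Super Cruise logic.

```python
# Sketch of the escalating intervention the article describes: a visual
# warning first, then an audio alert, then a temporary autopilot takeover.
# Thresholds and action names are illustrative assumptions only.

def intervention(seconds_inattentive: float) -> str:
    """Escalate as a driver stays drowsy or distracted for longer."""
    if seconds_inattentive < 2.0:
        return "none"                  # brief glances away are normal
    if seconds_inattentive < 4.0:
        return "flash_warning_light"   # visual nudge on the dash
    if seconds_inattentive < 6.0:
        return "sound_audio_alert"     # louder prompt to snap them out of it
    return "autopilot_takeover"        # temporarily take the wheel

for t in (1.0, 3.0, 5.0, 8.0):
    print(f"{t:>4}s inattentive -> {intervention(t)}")
```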

The concept isn’t exactly new (General Motors has offered a version of the technology in the Cadillac CT6 since last year), but it’s become an increasingly popular topic of conversation in the auto world as drivers have more and more trouble focusing on the road.

Yet even as cars become savvier drivers, such monitoring will only grow more important, thanks to a long and deadly history of humans struggling to pay attention once a vehicle begins driving itself. (Most recently, such false confidence led to fatal accidents in Arizona and California.) A proper alarm system could act as yet another redundancy in the vehicle, bolstering safety by ensuring that the robot isn’t the only driver aware of what’s happening outside the car.

People Watching

In fact, dealing with outside factors such as pedestrians and bikers will likely become an increasingly important function of facial-recognition technology in autonomous cars as the autos become more widespread. (It could be a huge help in getting the vehicles to understand and communicate with anyone they share the road with, especially at sensitive areas like intersections.)

And if any number of forward-looking car designers get their way, the machine may indeed soon be the only one able to even see what’s going on outside.

As the AI’s learning deepens (and with it, likely, human trust in the cars), windows are expected to give way to ever more light-emitting screens, offering passengers any number of flashy fantasies to immerse themselves in, until the idle reality of sitting inside an intelligent vehicle is the furthest thing from their minds, or their faces.
