
If They Only Had a Brain: New Tech Lets Old Cars Learn How to Drive

By: Bridget Clerkin July 17, 2017
Software developer Drive.ai is testing the theory that self-driving cars can 'learn' how to drive more safely based on experience.

You can’t teach an old dog new tricks, but you can teach old cars one or two. At least, that’s what some of the best minds in Palo Alto think.

Stanford-based start-up Drive.ai is hoping to transfer some of its collective smarts to our cars, helping them not just learn the ways of the road but eventually take the wheel from their flesh-and-blood drivers.

Retrofitted with a combination of hardware and software installed by the company, the vehicles can begin building a base of automotive knowledge from their past experiences—and tweak their future reactions accordingly.

It may sound cutting-edge, but it’s based on technology that’s been in development for millions of years: the human brain.

And with a fresh round of venture capital funding giving Drive.ai a $50 million boost for its efforts, more of our cars may soon become self-taught self-drivers.

Graduating with Honors

Learning to drive is a difficult—and sometimes deadly—process.

In 2015 alone, six teenagers between the ages of 16 and 19 died every day as a result of automotive accidents, according to the Centers for Disease Control. And drivers aged 15 through 19 were responsible for 11% of all damage incurred in car crashes in 2013, despite making up just 7% of the country’s population that year.

Even when compared to older teens, the youngest drivers on the road are twice as likely to crash, other studies have found.

While a number of factors contribute to the statistical disparities, the simplest is the sheer learning curve between new drivers and their more seasoned peers. It’s why every state in the nation has adopted some type of graduated driver’s license (GDL) program, a system that gives new drivers more freedom, and more responsibility, behind the wheel after certain periods of time and practice. If more strictly implemented, GDL programs could prevent more than 500 deaths and 9,500 car accidents annually, according to the Insurance Institute for Highway Safety.

That progressive learning model is reflected in Drive.ai’s suite of self-driving tools, which similarly require the cars to mature into their new skill set.

The circuitry the company adds to a vehicle lets the car assess situations on the road in real time, relying on artificial memories of what has worked in the past rather than a preordained programming pathway, so that it can make its own experience-based decisions about what to do.

The system is part of a cutting-edge and rapidly developing branch of artificial intelligence called “deep learning,” and it could change everything about the way we interact with our autos—and how they interact with each other.

Deep Thoughts

How do you decide what is right, or determine what is good?

To begin with, you have to know your options.

After all, the concept of “good” doesn’t exist without comparison to its partner, “bad,” nor does the idea of “right” make any sense without understanding what’s “wrong.”

In order to truly learn for themselves, our vehicles must be given the same type of choices, and the freedom and ability to discern which route to take. That’s how deep learning works: it presents a plethora of information to a computer and asks it to evaluate its options and pursue the path it deems “correct.”

The concept takes its cues from human evolution, mimicking the ways in which our brains adapt over time, said Drive.ai co-founder and president Carol Reiley.


“It’s much like how a 16-year-old or young driver learns,” she told Fortune. “Instead of hard-coding rules, you’re given a lot of different examples—what is right, wrong, safe, what is a car, what is not a car. It starts to generate its own set of rules on how to navigate the road.”
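The contrast Reiley draws—learning a rule from labeled examples rather than hard-coding it—can be sketched with a toy classifier. This is not Drive.ai’s actual system, just a minimal illustration: the feature values, labels, and perceptron approach below are all invented for the example, but they show how a program can derive its own decision rule from experience instead of being told one.

```python
# Toy sketch: no rule like "anything wider than 1.5 m is a car" is written
# anywhere. The program is shown labeled examples and nudges a learned rule
# toward whatever the evidence supports. (Features and labels are made up.)

# Each example: (width in meters, height in meters) of an object on the road.
examples = [
    ((1.8, 1.5), 1),   # sedan        -> car
    ((2.0, 1.8), 1),   # SUV          -> car
    ((1.9, 1.4), 1),   # coupe        -> car
    ((0.5, 1.7), 0),   # pedestrian   -> not a car
    ((0.6, 1.1), 0),   # bicycle      -> not a car
    ((0.4, 0.5), 0),   # skateboard   -> not a car
]

def train_perceptron(data, epochs=20, lr=0.1):
    """Learn weights from labeled examples -- nothing is hard-coded."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred           # how wrong was the current rule?
            w[0] += lr * err * x1        # nudge the rule toward the evidence
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def classify(w, b, x1, x2):
    """Apply the learned rule to a new object."""
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

w, b = train_perceptron(examples)
```

A real deep-learning system replaces this single learned boundary with millions of parameters and far richer inputs (camera frames, lidar sweeps), but the principle is the same: the rules emerge from the examples.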

And the foundation of those rules comes from experience—the more varied, the better.

To prepare the vehicles for the unexpected nature of the open road, Drive.ai has exposed them to everything from pedestrians cartwheeling across the street to skateboarding dogs, Reiley said. The company hopes such a wide range of experience to extrapolate from will help take an ordinary vehicle up to Level 4 automation—the penultimate tier of self-driving technology.

Companies like Tesla and Google’s Waymo have focused on building these artificial neuropathways directly into their autonomous vehicles, but Drive.ai’s kit allows any car on the road the chance to be retrofitted with them, like the Wizard of Oz dispensing brains to so many scarecrows.

Still, the complicated nature of the technology requires that it be installed by a professional, which limits the company’s focus, for now.

Its first pilot will launch later this year, primarily targeting vehicles used in large business fleets, company officials said.

But there’s a brilliance to such a business model: The more cars that use the program, the more miles logged—and lessons learned. Once let loose on the roads at large, the technology—and the data gleaned from its use—could allow autonomous vehicles to take over at an even more rapid pace than previously thought.

After teaching them to teach themselves, the next trick will be getting the vehicles to share their newly acquired knowledge—both with us, and with each other.
