From our infrastructure to our industry, there’s hardly an aspect of society that won’t be affected by autonomous autos.
In America alone, the average citizen spends 17,600 minutes—or just over 293 hours—of every year in the confines of a car. With machines manning the wheel for all that time, humans can focus on a number of pursuits instead of the road, opening up new possibilities for design of not just the interior of our vehicles but our communities themselves. Major incidents and accidents will likely become a thing of the past, deeply impacting the way we look at insurance and policing the streets, and the concept of car ownership itself may completely disappear.
But that reorganization of the world is just the start.
Powered by a rapidly accelerating technology, whose development shows no sign of slowing, the vehicles are on course to replace many of our current jobs—and soon. Even race car drivers can’t seem to outrun the automotive advancement.
While those oncoming losses are a source of worry for many, agents at the Federal Bureau of Investigation are concerned with the mechanical outsourcing of one job in particular: the getaway driver.
The government agency examined the criminal potential of self-driving cars in a recently released restricted report—and just like the other areas it seems poised to impact, when it comes to breaking the law, the technology could completely change the game.
As they’ve begun mastering our roads, our cars have learned a great many skills, but there’s still a thing or two we can teach them—especially when it comes to thwarting authority.
In its report, compiled by the Directorate of Intelligence, the FBI posited that newly driverless vehicles “will have a high impact on transforming what both law enforcement and its adversaries can operationally do with a car,” according to The Guardian, which obtained the study through an open records request.
That’s because the same efficiency the cars can bring to navigating our roads could be exploited by those who need to get around—and away—in a hurry.
The document’s authors warned of self-driving models being used as getaway cars by savvy lawbreakers able to digitally hotwire the technology to ignore its safety programming. Hijacking the computer system could allow unsavory types to ignore traffic signals and speed far above legal limits—which all other autonomous autos on the road would have to follow.
And while the car is managing a quick escape, ill-intended passengers could make the most of its hands-free capabilities, using the opportunity to “conduct tasks that require use of both hands or taking one’s eyes off the road which would be impossible today,” the report states.
Such actions could include anything from calling in reinforcements to something as serious as shooting at any pursuers.
Still, the advancements cut both ways and could aid law enforcement in combating such activities, according to the report.
Police equipped with self-driving squad cars would have an easier time surveilling suspicious types, the report says, as algorithms programmed into the vehicles would allow them to both track and trail other cars more efficiently. By utilizing connected infrastructure, police vehicles could even deceptively turn the opposite direction at an intersection, while calculating how to later catch back up with a target.
But even with all those advantages, law enforcement officers will likely struggle to stop those with the worst intentions—or the least to lose.
Perhaps most disturbingly, the FBI warned in its report of the possibility for a self-driving car to be “more of a potential lethal weapon than it is today.”
Terrorists could potentially load the vehicles with bombs and remotely drive them to targeted areas, as long as they could break into the car’s computer. Such access would also allow bad actors to lock passengers inside of their vehicles, and send the cars swerving off roads—or intentionally into each other.
Hackers have already proven that it’s possible to seize control of nearly every function of some contemporary connected models.
In a demonstration conducted last year to probe the technology’s defenses, security researchers Charlie Miller and Chris Valasek infiltrated the electronics system of a 2014 Jeep Cherokee and were able to make the vehicle accelerate, turn the steering wheel, and slam on the brakes at high speeds.
While that trick required the duo to plug into the car directly, a similar stunt they conducted in 2015 allowed them to access everything from the Jeep’s radio and air conditioning to its transmission, all from a remote location.
Making the cars especially vulnerable are their GPS systems, Mary Cummings, director of the Humans and Autonomy Laboratory at Duke University, told the San Francisco Chronicle.
The only way to truly outsmart the hackers, she said, would be getting the cars to think for themselves.
“The reality is these systems are incredibly fragile,” Cummings said. “So where we have to go is to make sure that cars can make their own decisions absent any system like GPS. For them to be truly autonomous, they have to be able to make their own decisions like human drivers do.”
Cops & Robbers
Regardless of whether we’re ready or not, self-driving vehicles are coming—fast.
But as they begin to think more like we do, we must prepare for our machines to behave more like us, too, developing the same quirks and unique personality traits—for better or worse.
As autonomous technology continues to mature, we may soon need to be wary not just of any bad humans behind the machines but of any vehicles that have become a little too streetwise.