As we enter the era of autonomous cars, we’re going to start experiencing all manner of “firsts.” California may have just borne witness to one of the strangest new occurrences made possible by self-driving technology.
Earlier this month, two California Highway Patrol officers noticed a Tesla Model S zooming by them—with its “driver” fast asleep behind the wheel.
The car had reportedly been put into its semi-autonomous Autopilot mode by 45-year-old owner Alexander Samek, who was apparently drunk at the time and had passed out at some point during the ride.
While the move is arguably better than driving under the influence, it’s still extraordinarily dangerous—even on a largely empty highway at 3:30 a.m.—given that autonomous systems remain very much in their beta phase and still require an attentive driver.
Faced with this novel situation, the officers had to develop an entirely new approach on the spot.
The usual step of following the vehicle and turning on lights and sirens produced no result, as the vehicle’s occupant was blissfully unaware that he was even traveling down the highway, let alone getting pulled over. (Autonomous cars being developed by Tesla competitor Waymo, however, are being trained in what to do should a pair of red and blue lights appear in the rearview.)
With no rules in the playbook dictating next steps, the quick-thinking officers got creative.
First, they called in for backup to ensure that traffic behind them could be slowed down. Then, the pair drove in front of the Tesla—reportedly traveling close to 70 mph—and gradually started braking, banking on the idea that the Autopilot program would respond to its slower surroundings.
And the gambit paid off: the car eventually stopped, at which point the officers promptly woke Samek—and booked him for driving under the influence, despite the fact that he apparently had little to do with actually piloting the car.
For its part, Tesla has repeatedly insisted that drivers are still on the hook for controlling their cars, even with Autopilot engaged.
But the incident may point to a much bigger issue: a general over-reliance upon “self-driving” systems.
Recent reports have found that most drivers don’t understand the differences between current driver-assistance systems, or what, exactly, they can and can’t do.
And that ignorance has already proven to be deadly, ushering in another first for the new age—albeit a much more unfortunate one: the death of a driver who trusted technology too much.