The Hottest New Autonomous Car Trend Is Truly Visionary

By: Bridget Clerkin February 26, 2018
The navigation cameras on autonomous cars can have difficulty seeing in low-light conditions or pinpointing animals. Some companies have begun adding thermal imaging cameras to increase a car's ability to see what's in front of it.

Autonomous cars aren’t just driving the automotive revolution—they’re ushering in the age of artificial intelligence.

In order for the vehicles to operate like people, they’ll have to be able to think like people, and a number of tech firms are dedicated to making sure that happens, diligently mapping out the circuitry of a truly self-driving brain.

But all that thinking power means nothing if the autos can’t look before they leap.

To solve the sight problem, most manufacturers have turned to LiDAR technology, a laser-guided system that maps a 3D view of the world for the vehicles to use when navigating. (The idea is considered so crucial in Silicon Valley, it’s already been at the heart of a huge lawsuit over trade secret theft and dubbed “the sauce” by ex-Uber CEO Travis Kalanick.)

But what happens to that artificial vision when a self-driving car encounters pitch darkness, heavy fog, or even too much sun?

The laser sensors and barrage of more traditional cameras used to guide the cars may struggle to see their way through such environments—but could a little heat enhance their prescription lenses?

Warming Up

[Image: AdaSky's Viper model thermal imaging camera, which could be installed in self-driving vehicles.]

Thermal cameras are increasingly being seen as a potential autonomous vision aid.

The technology captures a completely different data set from the myriad other gadgets the cars currently employ, adding depth to their view of the world and data to inform their decision making.


And when it comes to autonomous cars, redundancy is key: the concept is baked into the vehicles to ensure the failure of one navigation system won’t derail the entire operation. A little cloud cover might confuse a light-sensitive camera, but if the car can get the guidance it needs from another source, it can keep cruising right along.

A surplus of heat-vision options has also cropped up lately, including new offerings from Israeli firm AdaSky and Oregon-based FLIR Systems.

The new-age eyes are based on old thermal technology and would inlay a heat map on top of an autonomous vehicle’s 3D view of the world. The idea is to help the autos keep on trucking through a number of adverse weather scenarios, which could impact not just visibility conditions but the condition of the road itself. In colder climates, a lack of heat on the asphalt could be a telltale sign of ice on the roadway—and the car could steer around the slick patch accordingly.
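The ice-detection idea above boils down to a simple comparison: a stretch of asphalt that reads at or below freezing, and colder than its surroundings, is a candidate slick spot. A minimal sketch of that logic, with entirely hypothetical function names and thresholds (no vendor actually publishes this as an API):

```python
# Illustrative only: a toy check for possible road ice from a thermal
# camera reading. The threshold values here are assumptions, not specs.

FREEZING_C = 0.0

def possible_ice(surface_temp_c: float, ambient_temp_c: float) -> bool:
    """Flag a road patch as possibly icy when its surface temperature is
    at or below freezing and notably colder than the surrounding air."""
    return (surface_temp_c <= FREEZING_C
            and surface_temp_c < ambient_temp_c - 2.0)

# A patch at -1 C on a 3 C day reads as a potential slick spot.
print(possible_ice(-1.0, 3.0))  # True
print(possible_ice(5.0, 3.0))   # False
```

In a real perception stack this check would run per road patch and feed the planner, which could then route the car around flagged areas, as the article describes.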

But perhaps the camera’s greatest contribution is giving the cars a fuller sense of reality—and all of the creatures populating it.

Playing Frogger

One of the most critical components of self-driving technology is its ability to recognize pedestrians, cyclists, and others with whom it shares the road. (That the cars should react to such human encounters is a given; how they should react is a bit more of an ethical conundrum.)

Manufacturers have actively worked on the issue, programming vehicles to deal with numerous situations that could arise, but one scenario seems especially sticky for the self-driving systems: how to handle animals on the road.

Most autonomous models utilize some form of animal detection software, essentially a tweaked version of pedestrian recognition technology that interprets the particular gait of any life forms crossing the road, helping the computer identify the type of wildlife it’s dealing with.

The idea has seen some success when it comes to large, loping animals such as moose and deer, but it’s far from perfect—especially when dealing with the world’s more eclectic creatures. Volvo’s Australian autonomous test fleet famously encountered issues with the country’s iconic kangaroos, unable to figure out where—or what—the hopping figures were.

Using optical, gait-interpreting cues to determine the type of animal crossing the road also presents the same type of visibility issues autonomous cars deal with when otherwise mapping out the world on a less-than-perfect day.

Thermal cameras could once again provide the solution, aiding in animal identification through the production of heat maps. Most mammals operate at different body temperatures, and the technology could pick up on how much heat the critters are emitting. (Cold-blooded animals, however, would likely fail to register in such situations.)
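The temperature-based identification described above can be sketched as a rough triage: most mammals hold a body temperature around 36-39 °C, birds run hotter, and cold-blooded animals track the ambient temperature and may not stand out at all. The bands and names below are illustrative assumptions, not any real detection system:

```python
# Illustrative only: sorting thermal detections into rough categories
# by apparent body temperature. Temperature bands are approximations.

def classify_heat_signature(temp_c: float) -> str:
    """Very rough triage of a thermal blob by apparent temperature."""
    if temp_c < 20.0:
        # Cold-blooded animals sit near ambient temperature and may
        # fail to register against the background, as the article notes.
        return "no reliable signature"
    if 35.0 <= temp_c <= 40.0:
        return "likely mammal"  # most mammals run roughly 36-39 C
    if 40.0 < temp_c <= 45.0:
        return "likely bird"    # birds typically run hotter than mammals
    return "unclassified warm object"

print(classify_heat_signature(38.0))  # likely mammal
print(classify_heat_signature(42.0))  # likely bird
print(classify_heat_signature(10.0))  # no reliable signature
```

A production system would fuse a signal like this with the gait-based recognition mentioned earlier rather than rely on temperature alone.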

Still, nothing comes without a price—and adding extra-sensory abilities to autonomous systems could be especially costly for manufacturers.

Worth the Weight?

Under the burden of all kinds of cameras, sensors, and other new systems, self-driving cars are already struggling to stay light on their feet.

The excess bulk has sparked concerns in the auto world over how the vehicles will continue to meet increasingly stringent emissions requirements, with some estimating it will be 10% more difficult for the cars to comply with fuel economy and carbon emissions standards compared to human-driven autos.

And with the amount of power they must feast on to function, it’s no wonder the cars have gained so much weight. Autonomous vehicles use as much as 100 laptops’ worth of juice to process the voluminous—and growing—amount of data they’re receiving from all that extra technology.

Thermal cameras would make for an even heavier load, in terms of heft and bandwidth, likely leaving many manufacturers wondering if they can handle the heat.
