
Name That Tech: New-Age Car Talk Too Confusing for Most

By Bridget Clerkin | October 17, 2017
A recent study by MIT found self-driving vehicle terms confused drivers.

Would a cruise control by any other name be as sweet? What about "driver assist"?

Turns out, the turn of phrase may make all the difference.

A recent survey on the naming conventions used for self-driving cars revealed that when it comes to the new-age technology, nobody is quite sure what carmakers are talking about.

Released last month by the Massachusetts Institute of Technology, the study examined how much people knew—or at least what they could infer—about the level of automation in certain vehicles based on the names of their self-driving features alone. And most of the participants were lost in translation.

Armed with the official definitions of each level of autonomy (a scale ranging from 0, fully manual, to 5, fully autonomous), participants were asked to judge which level applied to a number of systems currently on the market or about to be released, including programs like “Active Cruise Control,” “Super Cruise,” “Intelligent Cruise Control,” “Intelligent Assist,” “Intelligent Drive,” “Drive Pilot,” “Driving Assistant Plus,” “Pilot Assist,” “Pilot Plus,” and “Pro Pilot,” among others.

Most successful was the word “Cruise,” which 50% of participants accurately identified as indicating Level 1 autonomy. From there, the results only got worse, with confusion abounding over the vague yet similar terms, especially when participants were asked to classify “Assist.” (Most weren’t sure whether the word meant the system was assisting the driver, or the other way around.)

The study also found that before receiving their reading materials, a vast majority of the 450 participants, 85% of whom had earned at least a bachelor’s degree, reported being “not familiar at all” with the automation scale.

The new-age illiteracy could cause big problems, the survey suggested, creating a technological Tower of Babel that leads consumers to question the systems and makes the technology harder to trust.

The lack of understanding would be an obvious blow for both the federal government and the automotive industry, which are pushing hard for the proliferation of the machines, predicted to herald a new $7 trillion passenger economy. But it could also be dangerous for drivers themselves. Without a clear sense of what a system can and can’t do, those behind the wheel could be more prone to misusing the technology and getting into an accident.

Overestimating the computer’s ability to pilot the car could likewise lead to more incidents on the road. (Indeed, too much trust in an autopilot system was recently deemed at least partly to blame for the 2016 death of a Tesla driver.)

The issue comes to a head with Level 2 automation, the survey found. These systems can take over both steering and speed control at once, but they still require a driver’s attention and participation to be used safely. Even so, most participants felt they could comfortably relax behind the wheel after activating Level 2 systems.

While automakers undoubtedly have a long way to go toward perfecting the technology, they may want to take a page from Shakespeare and ask themselves, “What’s in a name?”
