FairWarning Reports

Lots of Love for Driverless Cars, Except From One Group–Drivers

The driver of this Tesla, Joshua Brown, 40, was killed in Florida in May 2016 while using the car’s “Autopilot” system. (Photo by Florida Highway Patrol)

As tech companies and automakers, cheered on by the federal government, race to test and promote autonomous vehicles, several surveys show that most motorists don’t want to drive, ride in or be on the road anywhere near them.

What’s more, as development efforts pick up steam, the level of public skepticism seems to be mounting.

This week, in the latest federal effort to encourage driverless vehicles, the Department of Transportation announced a largely hands-off strategy for testing them on public roads. The voluntary guidelines came just days after the House approved legislation to allow testing of thousands of the robot cars on public roads over the next several years. Senate action is still pending.

The driving public, however, appears far more cautious, according to a string of studies by organizations including AAA, Kelley Blue Book, J.D. Power, the University of Michigan’s Transportation Research Institute and the Massachusetts Institute of Technology.

In May, an MIT white paper declared that “comfort or trust in full automation appears to be declining” after “numerous strides and setbacks … on the path to highly automated vehicles.”

Over time, MIT reported, the reality of autonomous vehicles has begun sinking in with the development of self-driving Tesla hardware and hands-free highway driving for 2018 Cadillacs – and such cautionary tales as the May 2016 death of a motorist using Tesla’s “Autopilot” system that steered his car into a tractor-trailer truck in Florida.


In the latest MIT AgeLab survey, a mere 13 percent of motorists said they would feel comfortable with features that fully relieved them of all control of their vehicles – down from nearly 25 percent the previous year.

Trust is declining

The findings mirrored those in a larger 2017 study by J.D. Power, which showed declining trust in automated technology among “Gen Z consumers” (born between 1995 and 2004) and “Pre-Boomers” (born before 1946).

“In most cases, as technology concepts get closer to becoming reality, consumer curiosity and acceptance increase,” said Kristin Kolodge, executive director of driver interaction research at J.D. Power, when the study was released in April. “With autonomous vehicles…the level of trust is declining.”

But Kolodge said this could change. “As features like adaptive cruise control, automatic braking and blind-spot warning systems become mainstream, car buyers will gain more confidence in taking their hands off the steering wheel.”

The AAA reports a similar sense of wariness: More than three-quarters of Americans are afraid to ride in a self-driving vehicle, according to a national survey this year.

Last year, Kelley Blue Book reported that 80 percent of people surveyed said they should always have the option to drive their vehicles and two-thirds said they preferred to be in full control of their car at all times.

And in August, the research firm Gartner Inc. released a survey showing that more than half of respondents in the U.S. and Germany would not consider riding in a fully autonomous vehicle.

“Fear of autonomous vehicles getting confused by unexpected situations, safety concerns around equipment and system failures and vehicle and system security are top concerns around using fully autonomous vehicles,” said Gartner research director Mike Ramsey.

One of the few outliers was a survey by the Consumer Technology Association, a leading trade group, which last year released a poll showing that 70 percent of U.S. consumers want to try out an autonomous car. The association represents 2,200 companies in the consumer electronics industry, some with a stake in the development of self-driving cars. Technology association spokesman Tyler Suiters said in an email to FairWarning that his group’s survey asked about “interest in the benefits that self-driving cars will deliver, and interest in test driving and owning” a self-driving car.

A February letter from the association to federal transportation officials attacked proposed federal guidelines aimed at reducing driver distraction from electronic devices. But its leaders also suggested, “The risk of distracted driving may one day entirely be eliminated by increasingly automated vehicles and, ultimately, self-driving cars.”

On Tuesday, in issuing the Trump Administration’s voluntary guidelines for testing driverless cars, Transportation Secretary Elaine L. Chao said the new technology could help reduce the annual toll of about 40,000 U.S. traffic deaths that mainly result from human error.

Lack of “system safeguards”

Also this week, the National Transportation Safety Board released findings on the deadly May 2016 Tesla crash in Florida. NTSB Chairman Robert L. Sumwalt III blamed the “driver’s inattention” and said that “system safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking.”

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” Sumwalt said.

These findings are more critical of Tesla than those of the National Highway Traffic Safety Administration, which in January declared there was no specific safety-related defect in the Tesla technology.

This week a consumer group, Advocates for Highway and Auto Safety, criticized what it called “a hands-off approach to hands-free driving,” saying that the Tesla case underscores the need for a tougher regulatory approach.

By merely issuing voluntary guidelines, federal authorities are dodging their responsibilities, said Cathy Chase, vice president of governmental affairs for the Advocates group. “The Tesla crash, sadly, will likely be just the first example demonstrating why the federal government needs to give AV [autonomous vehicle] manufacturers clear rules and regulations as they roll out this new technology into the marketplace.”


About the author

Paul Feldman is a FairWarning staff writer.

2 comments to “Lots of Love for Driverless Cars, Except From One Group–Drivers”

  1. Spiffy

    “cautionary tales as the May 2016 death of a motorist using Tesla’s “Autopilot” system that steered his car into a tractor-trailer truck in Florida.”

    This comment is why the public is less trusting of AVs these days: media hype. The system didn’t steer the car into a truck; it failed to stop when a truck pulled directly in front of the driver. The driver was going straight and not paying attention. At least you put “Autopilot” in quotes, as it’s not an autopilot any more than an airliner’s “autopilot” is.

    There’s nobody currently driving a retail autopilot system, since none exist. The media is quick to write about all the testing failures. The public would be wary of any product that was getting so much scrutiny.

  2. Matthew Mabey

    If the industry, and its government cheerleaders, want to convince the public of the viability of autonomous cars, they need to point out all the existing autonomous systems that operate in complex, open environments (i.e., something akin to public roadways with debris, construction, animals, vehicles driven by people, and pedestrians of all ages).
    What is that I hear? Crickets. There is no existing autonomous system that provides even a slight approximation of what some are claiming they will be able to achieve with automobiles in only a few short years.
    Instead, what the public does see is phones and computers that exhibit random behavior and frequent failures. We see car entertainment systems that don’t reliably (i.e., 99.999% of the time; 99+% isn’t good enough for autonomous cars) reconnect with the Bluetooth on our phones. We see systems at all scales, and levels of security, that are vulnerable to malicious intrusion and disruption. We see airlines that come to a grinding halt because of computer system failures that take hours, or days, to remedy. We see a steady flood of recalls, of all sorts, regarding every make and model of car available. The public is right to be skeptical.
    The industry may be able to accomplish what they claim they will. But there is little evidence to support their claim at this point in time. That one Tesla incident matches our experience with technology at all levels. The explanations of the Google Car incidents betray a culture of explaining away problems. The old adage is proven true millions of times every day: “To err is human, but it takes a computer to really foul things up.”
