Another year has passed with no increase in driver confidence in autonomous cars. According to the results of a new survey from AAA, 66% of respondents said they were fearful of self-driving vehicles and 25% viewed them with uncertainty, which mirrors last year’s record-high vote of no confidence.
“Consumer skepticism of autonomous vehicles (AV) is not surprising, given the recalls and well-publicized incidents that occurred last year,” said Mark Schieldrop, senior spokesperson for AAA Northeast. “It’s critical that drivers understand both the capabilities and limitations of technology in their cars and how, when and where to use the systems properly.”
Fully self-driving vehicles are not yet available for consumer purchase, but four out of 10 drivers believe they could buy an AV today that will operate itself while they are asleep.
Driver Assistance Programs Popular
Interest is high, though, in advanced driver assistance systems (ADAS), the results showed. Almost two-thirds of U.S. drivers indicated they want autonomous vehicle safety features, such as reverse automatic emergency braking (65%), automatic emergency braking (63%) or lane-keeping assistance (62%) on their next car.
“Consumers have told us again and again that they are interested in and are willing to pay for ADAS that make driving safer,” according to Greg Brannon, director of automotive engineering and industry relations for AAA Inc. “The focus and investment should be in this area, which will build consumer confidence in higher levels of automation.”
ADAS Are Not Foolproof
Many drivers overestimate the capabilities of automated vehicle safety systems, according to AAA data. Most U.S. drivers believe automatic emergency braking will stop a vehicle when another car, child, adult pedestrian or bicyclist is in front of or behind the vehicle. But recent AAA research found that reverse automatic braking systems prevented a collision in only one of 40 test runs involving a car crossing behind a test vehicle that was backing up, and in only 10 out of 20 test runs with a mannequin representing a child standing behind the test vehicle.
“Automakers should focus on designing the systems to perform well in common scenarios that drivers face every day,” Brannon said. “Our test procedures included a van parked next to the test vehicle, and while this is a challenging situation, it is extremely common to be backing out without a clear view of what might be coming from behind. Engineers and designers need to keep these situations in mind and ensure that the systems do what they are advertised to do when it counts the most.”
Self-Driving Vehicles Still Need Work
Reports of crashes, including fatal ones, over the past few years have blunted public interest in self-driving vehicles. A fatal crash involving a Tesla vehicle with its automated driving system activated in April 2023 was the 17th for the automaker since 2021.
In San Francisco, an influx of robotaxis from the self-driving car company Cruise, owned by General Motors, was hailed as transformative in 2022, but the driverless cars began running amok. Then in October 2023, a pedestrian was struck and critically injured by a robotaxi, resulting in GM pulling funding and Cruise’s CEO resigning. The company is now facing federal investigations and possible fines.
Several lawsuits have been filed against Tesla for its Autopilot features, including claims that the company oversold Full Self-Driving capabilities, misleading some to believe the vehicles can drive themselves without active driver supervision. This has also captured the attention of transportation officials, who are taking steps to address the marketing and use of automated driving systems.
In 2021, the National Highway Traffic Safety Administration issued a Standing General Order requiring manufacturers and operators of automated driving systems to report crashes to the agency, helping to provide more insight into their overall safety and facilitate future improvements.
How do you feel about self-driving cars and autonomous vehicle safety features? Tell us in the comments.
11 Thoughts on “Drivers Still Fearful of Self-Driving Vehicles”
Yes, I am afraid of self-driving cars now, but in the end, when they are perfected and everyone has them (not in my lifetime), thousands of lives will be saved every year, and insurance costs will go down (legal firms & insurance companies will probably lobby against this).
Besides the issues with software bugs & rare situations simply not programmed for, we’ll be entering an era where half the cars on the road will still be human driven mixed with self-driven cars. Humans are unpredictable & illogical. Computers use logic.
On more than one occasion I have seen vehicles coming up to stop signs or red lights without slowing down, proceeding right through at high speed. I have avoided accidents only because I could see the car coming early enough to determine that the driver was not slowing down, and I reacted accordingly (gut reaction). Sometimes this reaction has been split-second on my part. Would a self-driving car still have me proceed right into its path since I have the right-of-way?
While vehicles (cars/trucks/fire trucks/police/ambulances/etc.) that require a driver are still on the road, there needs to be some control for the person in the self-driven car. If first responders do not get any experience behind the wheel in their teens & 20s, how can they take a job requiring that skill?
Can the software detect flash floods, a falling tree, oncoming tornadoes or other sudden natural phenomena requiring instantaneous driver intervention?
I have faith in Science, but not blind faith where I give up my steering wheel & brakes too soon. As I age, I could see where self-driven cars will allow me to always own a car for as long as I can function physically & mentally sufficiently enough where I am not a danger to myself or others.
I tested Tesla Full Self-Driving on an hour-long trip and was shocked at how accurate the software and hardware are. I would say it was 98% accurate, even going through a complicated intersection and then through a rotary. The reason I give it 98% is because of two factors that would not be easily programmable. Let me explain: 1) I was driving on a road that was being heavily directed by a cop due to a large crowd leaving a nearby concert. Tesla detected a green light, but at the same time recognized that there was a person in the middle of the road. Tesla would not move, although the cop was directing me to keep moving. Who is at fault here? 2) My driveway has grass on both sides of the entrance. Tesla could not perfectly measure the driveway entrance dimensions, so it ran over the grass. Is that really a life-threatening situation?
I am a software engineer and I know a lot of software has some kind of bug in it – maybe minor, and it may never be seen.
Even if the software is perfect, there is still the hardware: the processor and sensors.
If a sensor detects something, the software will react to that. This may be an extremely rare case that the software team never envisioned. It could cause the software to do something unexpected.
I was leery about putting any computer systems into cars, and really afraid of the possibilities that could occur with self-driving vehicles.
Self-driving cars are improving with time as more and more driving scenarios are added over many years of testing. Nothing is 100% foolproof, but recent research is showing that self-driving cars are much safer than driver-operated vehicles. I am for a slow transition of adding self-driving cars, with continuous improvement and updating, which may very well save many lives in the long run. For myself, I want to have full control of my car in the meantime, but that’s just my preference. As I get older and my skills and reaction speed start to decline, I may feel differently then.
In my opinion, full self-driving cars will never be perfect. The question is, when will they be good enough? Carmakers should be calculating the number of accidents versus the number of driving hours and comparing that to a large random sample of human drivers. When that analysis shows self-driving cars are better than human drivers is what I would call “probably good enough.” My impression from news to date is that humans will not accept anything less than perfect driving. THAT is a long way away. But if the existing technology drives more safely than the average human, it will save lives. Yes, there will be accidents, some of them stupid, when a self-driving car encounters something it is not programmed for, but those will constantly get fewer and fewer as these UNUSUAL circumstances are learned about and programmed against. So the question remains: comparing accidents versus driving hours, are self-driving cars, on average, safer than human drivers?
SELF-DRIVING CARS SHOULD BE ILLEGAL. IF YOU DON’T WANT TO DRIVE, TAKE A BUS OR A TAXI. OTHERWISE, STAY HOME.
I own a 2017 Honda Accord Touring with active cruise control and lane-holding ability.
It also has cameras that help me park in tight spots, where I can back up and go forward very close to the cars I’m trying to park between, but not close enough to hit them. I agree with the above comment that the lane holder and the active cruise control make a long drive more relaxing.
I have driven many miles over the last six years and never seen my car fail to slow down or brake hard depending on what the car in front of me is doing. I go to LA a few times a year, and I always rent a Toyota Corolla with dynamic cruise control, which is even better than my active cruise because mine cuts out under 25 mph, whereas the dynamic system brings you to a complete stop very gently, as good as the best driver, or it will slam on the brakes if need be. This is awesome in LA traffic, and in thousands of miles with the Toyotas they have never failed to stop when need be. The one thing you do have to be aware of is if a car cuts you off.
I have heard that this stopping ability will be mandatory in cars sold a couple of years from now. The vast majority of accidents are rear-end collisions, and this technology will mostly eliminate them and cut your cost of auto insurance in half. It will not make lawyers happy.
The same type of people that fought against seat belts and now vaccines will also fight this technology.
Self-driving vehicles are still many years off, and when they arrive, they will cut down on so many needless deaths on the roadways. However, with that said, we will have a mix of old-fashioned driver-controlled cars and self-driving vehicles, which will be a high-casualty period. The old-fashioned driver-controlled cars will be the most dangerous, since human drivers are so unpredictable. Self-driving vehicles could be perfect, but they can’t do anything when they are T-boned by speeding cars running red lights. So the transition has to be gradual: from the current 100% driver, to gradual driver-assisted features where there still needs to be a driver in the driver’s seat, until older vehicles are phased out and 100% self-driving vehicles become safe. There always has to be a human override feature and a driver’s seat. Sometimes you have to damage your car to get away from a dangerous situation: kidnapping, carjacking, a shoot-out, a building or bridge collapsing, flooding, etc. This includes trucks and buses as well, and maybe even civilian aircraft; military vehicles, not so much.
Remaining antique and older driver-controlled vehicles will be insured at a very high cost, but insurance rates for the rest should drop significantly. This transition to 100% self-driving vehicles won’t happen in my lifetime.
Has anyone looked into all the satellite failure possibilities if driverless cars will rely on them?
Anyway, I look forward to the point where a vehicle can drive by itself on a highway while the driver naps, eats or checks his phone, but with an alert if traffic gets heavy, weather conditions deteriorate or an accident occurs nearby. Or in an unfamiliar city, where it can maneuver through the many streets and turns following the GPS without missing a turn. Current technology will allow that in my lifetime (it already does today, but with too many bugs).
The US is rampant with horrible drivers: not using lights in the dark or during inclement weather, incessant tailgating, disobeying traffic laws. Automated safety devices only make bad drivers worse by having them rely on their “safe” vehicles. Maybe the vehicle is “safe” – but the operator isn’t. We should get rid of no-fault insurance to put the onus back on these terrible drivers. Self-driving vehicles? No thanks. I would never trust them to avoid getting into accidents. The computer fails, and who takes over?
I’ve never understood the want or desire for a self-driving vehicle.
I have a 2023 Hyundai Palisade with self-driving mode. It took a few months before I became comfortable enough to use it. I find using it on divided highways with up to moderate traffic is great and relaxing; it makes longer drives easier and less tiring. I like the function of maintaining a safe distance and alerting me to traffic when passing, as well as the periodic alerts to put hands on the wheel. I always leave a hand very close to the wheel for safety. Also, it makes things like cleaning my glasses safer, I believe.