We may trust our self-driving cars too much, experts warn

Self-driving cars are becoming an ordinary sight on our streets and highways as more and more consumers embrace the technology.

Proponents of autonomous driving argue that the technology will improve safety and efficiency, while making the driving experience that much more enjoyable. 

But now that self-driving cars have entered the mainstream, they are also raising plenty of questions.

Chief among them: When a self-driving car crashes, who’s at fault? 

Tesla, Google and a plethora of other firms have been racing to make self-driving cars a reality. But experts warn that passengers may be too trusting of the technology, which could lead to major safety concerns. It also raises the question of who should be held liable in an accident.

Experts have started to warn that we may be entering the most dangerous phase of autonomous vehicle testing, as the number of self-driving cars on the road grows faster than the technology matures, according to the Guardian.

Autonomous vehicles are still being outfitted with capabilities like lane centering and obstacle detection.

Adding to concerns, passengers in self-driving cars may trust the vehicle to make sophisticated decisions it cannot yet handle, which can result in accidents or close calls.

‘People are already inclined to be distracted. We’re on our phones, eating burgers, driving with our knees,’ Nidhi Kalra, senior information scientist at the Rand Corporation, told the Guardian.

‘Additional autonomy gives people a sense that something else is in control, and we have a tendency to overestimate the technology’s capabilities.’

The result is that drivers are being lulled into a false sense of security, in which they may overestimate what a self-driving car is capable of doing.

That trust may stem from the fact that passengers are looking forward to indulging in relaxing activities while in a self-driving car, such as texting, eating, talking on the phone or even sleeping.

More and more Americans are getting used to the idea of riding in an autonomous vehicle. 

Roughly 63% of Americans say they’re afraid of riding in a self-driving car, according to a study released Wednesday by AAA. That’s a noteworthy drop from 2017, when that number was 78%.  

WHO’S AT FAULT WHEN A SELF-DRIVING CAR CRASHES? 

Auto giant GM was slapped with a lawsuit on Wednesday after one of its self-driving Chevy Bolt cars collided with a motorcyclist on December 7

The suit was filed by Oscar Nilsson, who was on a motorcycle when the car swerved into his lane

Nilsson said he suffered neck and shoulder injuries that ‘require lengthy treatment’

In a statement, GM said safety is the company’s ‘primary focus’

‘In this matter, the SFPD collision report stated that the motorcyclist merged into our lane before it was safe to do so,’ a GM spokesperson said 

A self-driving Chevy Bolt, which is part of GM’s Cruise autonomous program, collided with a motorcyclist last month. Police deemed the motorcyclist to be at fault

The accident happened in heavy traffic in San Francisco

The Chevy Bolt was part of GM’s Cruise autonomous testing program

Nilsson’s motorcycle was traveling at 17 mph while the Bolt was moving at 12 mph

Experts say this is the first known lawsuit dealing with an accident involving an autonomous vehicle

Police later determined that Nilsson was at fault because he attempted to pass the Bolt on its right side

GM’s Cruise program has been involved in 22 different accidents

Experts say consumers may misunderstand which tasks their autonomous vehicle can complete without any human interaction. 

Tesla, Mercedes and BMW have cars with level 2 autonomy, which means the car controls things like accelerating and decelerating, but the driver must always be capable of retaking the wheel. 

Google’s self-driving car unit, Waymo, has been testing models with level 3 autonomy, wherein the vehicle handles safety-critical functions as well as basic operations. At level 4, a car can drive itself without human intervention under certain conditions, and level 5 is considered fully autonomous.

Most autonomous vehicles use level 2 or level 3 autonomy, which means that the car still requires some human intervention. For a car to drive itself entirely, it must reach level 4 or 5

Elon Musk has promised that Tesla’s cars will reach level 4 autonomy in the near future.

While Tesla and other firms have lofty goals for where their self-driving cars may be headed next, they face many roadblocks concerning safety and regulation of the technology. 

GM is being sued by a motorcyclist after one of its self-driving Chevy Bolt cars collided with him on a busy street in San Francisco last month.

Meanwhile, the U.S. National Highway Traffic Safety Administration is investigating a crash in which a semi-autonomous Tesla sedan rear-ended a parked fire truck.

A Tesla sedan rear-ended a fire truck in California on Monday. NHTSA and NTSB are both investigating the crash to learn more details about the driver and the car

The National Transportation Safety Board has also said it plans to examine the crash.

No one was injured in the crash. 

Automakers have started rolling out new safety features to help keep drivers from relying on self-driving cars too much.

GM, for example, has introduced eye-tracking technology to make sure that drivers’ eyes stay focused on the road.

Tesla has created a safeguard that disables the car’s autonomous features completely if the driver ignores warnings to keep their hands on the wheel.