Self-driving cars are on the near horizon. Current estimates suggest that within about a decade, autonomous vehicles will be widely available to the public. Companies like Uber claim they are safe, boasting that their autonomous systems use data-gathering sensors covering 360 degrees around the vehicle, along with detailed maps, to drive safely.
These elaborate safety systems are one reason why, at least for now, the driver is still largely held responsible when a self-driving car is involved in an accident. Autonomous vehicles come equipped with collision avoidance systems that alert the driver to take control in unsafe conditions or when a collision with another vehicle, person, or object is imminent. Human error is assumed to be the major cause of autonomous vehicle accidents, at least in California, according to a 2018 article in Fortune magazine.
Examples of Autonomous Car Accidents
The brief history of self-driving cars already has some blemishes, however. In March 2018, Elaine Herzberg became the first pedestrian killed by a self-driving car when she was struck by an Uber test vehicle while crossing a four-lane road with a bicycle in Tempe, Arizona. The vehicle was operating with a human safety backup driver behind the wheel. It was later determined that the autonomous SUV's monitoring system had a design flaw: it did not account for jaywalking pedestrians.
Tesla may be the most familiar manufacturer of autonomous vehicles. During its pilot program, the carmaker faced a class-action lawsuit in which plaintiffs argued that the cars were sold without the required safety features. In March 2018, Tesla was hit with another lawsuit after a fatal crash involving a Model X; the driver had allegedly fallen asleep and failed to avoid the collision.
The bottom line: while regulations for autonomous vehicles are still being written, the cars are already on the road, and, to some victims' dismay, they are still causing accidents. They essentially turn drivers into passive participants, and while advocates claim they are safer than the typical human driver, accidents are not unusual.
What If You’re Injured in an Accident Involving a Self-Driving Car?
Industry experts argue that the cars are being developed faster than applicable laws can be put in place. This lack of clarity leaves open the question of who is actually liable in a crash.
Generally speaking, if you have been injured by a self-driving car, you may be entitled to compensation and able to sue in court if you can prove the person behind the wheel was negligent. This usually means showing that the human driver failed to heed warning signals from the collision avoidance system. Typical compensation covers injuries, disability, pain and suffering, lost wages, medical bills, and vehicle or property damage.
In cases where the manufacturer might be found negligent, for example because of a faulty monitoring system, that company, rather than the driver, would be the defendant in your case. Large companies have already paid substantial sums to accident victims over glitches in autonomous vehicle technology.
Regardless of your role in the accident, deciphering laws that govern self-driving cars will require the expertise of a car accident lawyer. Since insurers are still developing hard-and-fast rules, you’ll need an attorney with up-to-the-minute knowledge. If you have been in an accident with an autonomous vehicle, don’t panic. Laws are in place to protect you. Seek legal counsel as you would with any car accident.