Self-Driving Vehicle Collisions
Self-driving cars are becoming more prevalent, and in many respects they are already safer than human drivers. But that is not always the case, and autonomous vehicle (AV) technology still has considerable room for improvement. In 2018, an autonomous Uber vehicle struck and killed a pedestrian, according to Vox. That same year, the family of a man killed while driving a Tesla with its Autopilot feature engaged sued the automaker, blaming it for the collision. Victims of self-driving vehicle collisions can sue the automaker, just as they could sue a human driver.
Pedestrians and Cyclists Are at the Most Risk
While autonomous vehicles are already fairly competent at avoiding crashes with other vehicles, they are woefully bad at recognizing cyclists, and pedestrians are difficult for them to recognize as well. AVs often mistake mailboxes, fire hydrants, and bushes for pedestrians, while cyclists are sometimes downright impossible for an AV to detect: a terrifying proposition for those on two wheels who already face grave danger from distracted and impatient human drivers.
Rear-End Collisions With Self-Driving Vehicles
One of the most common types of collisions between human drivers and self-driving vehicles is the rear-end crash. In a single year in California, self-driving vehicles were involved in 28 rear-end crashes in which the human driver rear-ended the autonomous vehicle, according to Wired. Usually, the driver who rear-ends another vehicle is found at fault: tailgating is illegal, and it is all too common these days with aggressive and distracted driving on the rise. However, the rear driver is not always to blame. If an autonomous vehicle stops for no reason in the middle of the road, that vehicle may be found at fault, or at least partially at fault. Erratic or unpredictable driving is known to lead to collisions, and an autonomous vehicle that stops in the middle of the road during rush hour because it mistakes a bush for a pedestrian trying to cross the street is no exception.
Many AV Crashes Are Caused By Impatient Human Drivers
Tailgating and passing in no-pass zones are two common ways in which impatient human drivers cause crashes with autonomous vehicles. Self-driving cars obey the speed limit, stop for pedestrians, and are programmed to put caution and safety above speed, characteristics that run counter to the habits of many human drivers. As such, you could be blamed for causing the crash if it is proven that you were driving carelessly. Working with an experienced attorney helps ensure that this does not happen.
A Campbell Autonomous Vehicle Collision Attorney Can Help Today
If you were involved in a crash with an autonomous vehicle, whether as a pedestrian, vehicle occupant, or other road user, you have the right to file a personal injury lawsuit against the AV company. Reach out to our Campbell car accident attorneys at the Solution Now Law Firm today at 408-256-2871 to schedule a free case evaluation.