Minnesota motorists may have heard that an autonomous Uber vehicle struck and killed a pedestrian in Arizona. One reason for the fatality may be that the vehicle's software was copying human behavior, according to a computer science professor at the Ira A. Fulton Schools of Engineering at Arizona State University.
The fatal accident occurred in Tempe on March 18. A woman was walking her bike across a dark road when the Uber car, which had a human safety driver behind the wheel, hit her. The professor says the program controlling the self-driving car was mimicking the way a human would drive, which may have contributed to the accident. For example, a human driver who cannot see a pedestrian on a dark road will continue driving as though there are no obstacles. The professor argues, however, that a computerized car should sense the darkness and reduce its speed enough to allow for emergency stops.
The Tempe police chief said that the accident was likely unavoidable given the lighting conditions on the road. The professor counters that the whole point of autonomous vehicles is to drive safely in situations where humans cannot. He also warns that if self-driving cars continue to be programmed to drive like humans, they will fail as a viable technology.
Victims of car accidents have the right to file a personal injury lawsuit against the driver who caused the crash. This type of lawsuit is designed to provide compensation for medical expenses, pain and suffering, lost wages, property loss and any other damages attributed to the collision. An attorney can often be helpful in this regard.
Source: Insurance Journal, "Human Influence Makes Autonomous Vehicle Programming Unsafe: Professor", March 29, 2018