Autonomous Vehicles Favor Saving Younger People When They Lose Control, New Study Finds




Machines have no innate morality; any semblance of it must be encoded by their developers. For autonomous vehicles (AVs), moral choices in life-and-death situations are therefore calculated from that encoded information. A global study confirmed this status quo, with results showing that AVs would prefer to protect younger people and might sacrifice the elderly in a collision scenario.

How Autonomous Vehicles Deal with Possible Accidents

In a study conducted by Professor Iyad Rahwan of the Massachusetts Institute of Technology, a platform that presents participants with a range of car-accident scenarios revealed that AVs would choose among possible outcomes depending on the instructions encoded in their systems.

Prof. Rahwan and colleagues determined that self-driving vehicles greatly favor protecting younger lives in their decision-making. The researchers found that if an AV lost its brakes, it would choose to save a child crossing the street over an adult. That preference also extends to older people: faced with a choice between a teenager and an elderly person, the AV would favor the teenager.

They also found that AVs would choose to protect passengers over pedestrians, with almost a 40 percent chance of striking people crossing the street if the vehicle loses control. Lastly, AVs were evenly split between hitting one person crossing the road legally and hitting two people crossing illegally.

“When we compared Germany to the rest of the world or the east to the west, we found very interesting cultural differences. What this really highlights is there are different considerations that we have to take, there are different values that come into conflict and this really challenges our own ethics and it challenges us to figure these things out together – because we don’t all agree,” Prof. Rahwan expressed.

The study findings suggest that the famous Three Laws of Robotics devised by science fiction author Isaac Asimov would not work in reality. The three laws assume that human values can simply be reflected in the machines we create, whereas those values must first be quantified and negotiated to determine which ones matter most.