
How to handle moral judgments for driverless cars?

By Tim Sandle     May 27, 2018 in Technology
Driverless car technology requires a moral dimension. In the event of an impending accident, does the car strike a pedestrian or another car, for example? Researchers have been studying the ethics of autonomous vehicles.
Autonomous vehicles will inevitably encounter situations where a moral assessment is required. Will the people in the autonomous car or those in the vicinity be accepting of the decision made by artificial intelligence technology? New research indicates that many will not agree with the decisions that self-driving vehicles will take.
The research, based on experiments designed to test people's reactions to a driving dilemma that potentially endangers human life, showed that many would rather sacrifice themselves than harm another driver or pedestrian. Behind this general tendency, however, there were variations based on the age of the potential victims and on how many lives might be affected. Moreover, the research suggests that human intuitions are at times at odds with what is generally regarded as ethically acceptable behavior or with consensual political guidelines.
These types of questions are crystallized by real events, such as earlier in 2018 when an autonomous Uber car killed a woman in the street in Arizona. The U.S. National Transportation Safety Board found that the vehicle's computer system had determined it needed to brake to avoid a crash; however, the vehicle's built-in emergency braking system had been disabled while the car was in autonomous mode to ensure a smoother ride.
The MadeInGermany (MIG) vehicle is a driverless taxi that can be booked via iPad. The autonomous car is capable of picking you up at a random location on its own, without the assistance of a driver.
The debate over autonomous vehicle ethics has been explored by Dr. Lasse T. Bergmann of the University of Osnabrück, Germany. He explains why: "The technological advancement and adoption of autonomous vehicles is moving quickly but the social and ethical discussions about their behavior is lagging behind."
And also why the answers are not straightforward: "The behavior that will be considered as right in such situations depends on which factors are considered to be both morally relevant and socially acceptable."
Hence circumstances will arise in which self-driving vehicles need to make decisions that are, by human standards, morally challenging. For instance: a car can swerve to avoid hitting a child who runs into the road, but in doing so could endanger the lives of other drivers, perhaps injuring more people than the single child. How should the autonomous vehicle be programmed to behave in this circumstance?
In Germany, an ethics commission to address these types of questions has been initiated by the German Ministry for Transportation. However, Dr. Bergmann's research suggests that there are sometimes gaps between the direction the commission is taking and the responses of human subjects in simulators, who react intuitively to different driving scenarios. In the scenario with the child, for instance, the commission leans towards protecting the child, whereas in experiments humans tended to take actions that protected more people. These are the types of moral and ethical issues that require resolution as part of the acceptance of self-driving cars in wider society.
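To make the divergence concrete, the conflicting positions can be thought of as two candidate decision rules for the same unavoidable-collision scenario. The following is a minimal sketch, not drawn from the paper or the commission's actual guidelines: the `Option` class, both rules, and the numbers are illustrative assumptions, showing only how a "protect the child" rule and a "minimize total casualties" rule can select opposite actions.

```python
# Hypothetical sketch of two decision policies for an unavoidable-collision
# scenario. Neither rule is taken from the research paper or the German
# commission's guidelines; they merely illustrate how the two positions
# described in the article can disagree on the same facts.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    casualties: int    # expected number of people harmed by this action
    harms_child: bool  # whether a child is among those harmed

def rule_protect_child(options):
    """Prefer options that spare the child; break ties by fewer casualties."""
    # False sorts before True, so child-sparing options always win.
    return min(options, key=lambda o: (o.harms_child, o.casualties))

def rule_minimize_casualties(options):
    """Prefer the option with the fewest expected casualties overall."""
    return min(options, key=lambda o: o.casualties)

# The article's example, with an assumed count of three endangered drivers:
# swerve away from the child but endanger other drivers, or stay on course
# and strike the single child.
options = [
    Option("swerve", casualties=3, harms_child=False),
    Option("stay", casualties=1, harms_child=True),
]

print(rule_protect_child(options).name)        # -> swerve
print(rule_minimize_casualties(options).name)  # -> stay
```

With the same inputs, the child-protecting rule swerves (as the commission leans) while the casualty-minimizing rule stays on course (as the simulator subjects tended to), which is exactly the kind of gap the research highlights.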
The new research has been published in the journal Frontiers in Behavioral Neuroscience. The research paper is headed "Autonomous Vehicles Require Socio-Political Acceptance—An Empirical and Philosophical Perspective on the Problem of Moral Decision Making."