Human decision-making biases in the moral dilemmas of autonomous vehicles.

Published

Journal Article

The development of artificial intelligence has led researchers to study the ethical principles that should guide machine behavior. The challenge in building machine morality based on people's moral decisions, however, is accounting for the biases in human moral decision-making. In seven studies, this paper investigates how people's personal perspectives and decision-making modes affect their decisions in the moral dilemmas faced by autonomous vehicles. Moreover, it determines the variations in people's moral decisions that can be attributed to the situational factors of the dilemmas. The reported studies demonstrate that people's moral decisions, regardless of the presented dilemma, are biased by their decision-making mode and personal perspective. Under intuitive moral decision-making, participants shift more towards a deontological doctrine by sacrificing the passenger instead of the pedestrian. In addition, once a personal perspective is made salient, participants preserve the lives associated with that perspective: participants taking the passenger's perspective shift towards sacrificing the pedestrian, and vice versa. These biases in people's moral decisions underline the social challenge in designing a universal moral code for autonomous vehicles. We discuss the implications of our findings and provide directions for future research.

Cited Authors

  • Frank, D-A; Chrysochou, P; Mitkidis, P; Ariely, D

Published Date

  • September 11, 2019

Published In

  • Scientific Reports

Volume / Issue

  • 9 / 1

Start / End Page

  • 13080

PubMed ID

  • 31511560

Pubmed Central ID

  • 31511560

Electronic International Standard Serial Number (EISSN)

  • 2045-2322

International Standard Serial Number (ISSN)

  • 2045-2322

Digital Object Identifier (DOI)

  • 10.1038/s41598-019-49411-7

Language

  • eng