Saturday, December 26, 2020

People are even more outraged by a self-driving car that deliberately kills a less preferred group (e.g., an elderly person over a child) than by one that indiscriminately kills a more preferred group (e.g., a child)

Deliberately prejudiced self-driving vehicles elicit the most outrage. Julian De Freitas, Mina Cikara. Cognition, Volume 208, March 2021, 104555. https://doi.org/10.1016/j.cognition.2020.104555

Abstract: Should self-driving vehicles be prejudiced, e.g., deliberately harm the elderly over young children? When people make such forced-choices on the vehicle's behalf, they exhibit systematic preferences (e.g., favor young children), yet when their options are unconstrained they favor egalitarianism. So, which of these response patterns should guide AV programming and policy? We argue that this debate is missing the public reaction most likely to threaten the industry's life-saving potential: moral outrage. We find that people are more outraged by AVs that kill discriminately than indiscriminately. Crucially, they are even more outraged by an AV that deliberately kills a less preferred group (e.g., an elderly person over a child) than by one that indiscriminately kills a more preferred group (e.g., a child). Thus, at least insofar as the public is concerned, there may be more reason to depict and program AVs as egalitarian.

Keywords: Moral judgment; Autonomous vehicles; Driverless policy; Moral outrage


