THE FUTURE IS HERE

Machines Making Moral Decisions

Abstract: Humans are increasingly working with AI-powered algorithms, sharing the road with autonomous vehicles, sharing hospital wards with autonomous surgery robots, and making joint decisions with autonomous algorithms. As we adapt to the growing presence of AIs playing significant roles in our organizations and society, it is important to understand how people respond to and think about AI decision-making. In my talk I will discuss two questions regarding human responses to AI decision-making, focusing on the moral domain. The first is: Do people want AIs to make moral decisions? I find that people want moral decision-makers to have the ability not only to think but also to feel, and because robots are perceived as unable to feel, people do not want them to make moral decisions. I also find that when human decision-making is associated with unequal outcomes, such as racial and socioeconomic health disparities, people become more willing to accept AIs as decision-makers. The second is: When AIs must make moral decisions, as in the case of self-driving cars, how do people want them to do so? I show that, in contrast to other published work, people want AIs not to discriminate between humans according to age, gender, or social status.

Bio: Kurt Gray is an Associate Professor of Psychology and Neuroscience at the University of North Carolina. He is also the Director of the Center for the Science of Moral Understanding, which is catalyzing a new field of scientific inquiry focused on identifying data-driven ways of reducing societal intolerance. Gray received his Ph.D. in Social Psychology from Harvard University and is an expert in moral psychology and social cognition, with over 80 scientific publications. He is an award-winning researcher, educator, and author of the mass-market book “The Mind Club: Who Thinks, What Feels, and Why It Matters” (Viking). Gray has been funded by the Charles Koch Foundation, the National Science Foundation, and the John Templeton Foundation. His work has been covered in The New York Times, The Economist, The Wall Street Journal, Science Magazine, Wired, Slate, and The New Yorker.