People May Punish, Not Blame, Robots

Authors :
Lee, Minha
Ruijten, Peter A.M.
Frank, Lily E.
de Kort, Yvonne A.W.
IJsselsteijn, Wijnand A.
Affiliations :
Future Everyday
Human Technology Interaction
Philosophy & Ethics
EAISI Mobility
EAISI Foundational
EAISI Health
Source :
CHI 2021: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems: Making Waves, Combining Strengths, pp. 1-11
Publication Year :
2021
Publisher :
Association for Computing Machinery, Inc., 2021.

Abstract

As robots may take a greater part in our moral decision-making processes, whether people hold them accountable for moral harm becomes critical to explore. Blame and punishment signify moral accountability, often involving emotions. We quantitatively investigated people's willingness to blame or punish an emotional vs. non-emotional robot that admits to its wrongdoing. Studies 1 and 2 (online video interaction) showed that people may punish a robot due to its lack of perceived emotional capacity rather than its perceived agency. Study 3 (in the lab) demonstrated that people were willing neither to blame nor to punish the robot. Punishing non-emotional robots seems more likely than blaming them, yet punishment of robots is more likely to arise online than offline. We reflect on whether and why victimized humans (and those who care for them) may seek retributive justice against robot scapegoats when there are no humans to hold accountable for moral harm.

Details

Language :
English
Database :
OpenAIRE
Journal :
CHI 2021: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems: Making Waves, Combining Strengths, pp. 1-11
Accession number :
edsair.narcis........e26ede205ba33cc4fb9df3df75339f2a