
We need to talk about deception in social robotics!

Authors:
Sharkey, Amanda
Sharkey, Noel
Source:
Ethics & Information Technology; Sep 2021, Vol. 23 Issue 3, p309-316, 8p
Publication Year:
2021

Abstract

Although some authors claim that deception requires intention, we argue that there can be deception in social robotics, whether or not it is intended. By focusing on the deceived rather than the deceiver, we propose that false beliefs can be created in the absence of intention. Supporting evidence is found in both human and animal examples. Instead of assuming that deception is wrong only when carried out to benefit the deceiver, we propose that deception in social robotics is wrong when it leads to harmful impacts on individuals and society. The appearance and behaviour of a robot can lead to an overestimation of its functionality, or to an illusion of sentience or cognition, that can promote misplaced trust and inappropriate uses such as the care and companionship of vulnerable people. We consider the allocation of responsibility for harmful deception. Finally, we suggest that harmful impacts could be prevented by legislation and by the development of an assessment framework for sensitive robot applications.

Details

Language:
English
ISSN:
1388-1957
Volume:
23
Issue:
3
Database:
Complementary Index
Journal:
Ethics & Information Technology
Publication Type:
Academic Journal
Accession Number:
153819654
Full Text:
https://doi.org/10.1007/s10676-020-09573-9