Moral Decision Making in Autonomous Systems: Enforcement, Moral Emotions, Dignity, Trust, and Deception.
- Source: Proceedings of the IEEE; Mar 2012, Vol. 100, Issue 3, p571-589, 19p
- Publication Year: 2012
Abstract
As humans are progressively pushed further downstream in the decision-making processes of autonomous systems, the need arises to ensure that moral standards, however defined, are adhered to by these robotic artifacts. While meaningful inroads have been made in this area regarding the use of ethical lethal military robots, including work by our laboratory, these needs transcend the warfighting domain and are pervasive, extending to eldercare, robot nannies, and other forms of service and entertainment robotic platforms. This paper presents an overview of the spectrum and specter of ethical issues raised by the advent of these systems, as well as the technical results obtained to date by our research group toward managing ethical behavior in autonomous robots in relation to humanity. These include: 1) an ethical governor capable of restricting robotic behavior to predefined social norms; 2) an ethical adaptor that draws upon the moral emotions to allow a system to constructively and proactively modify its behavior based on the consequences of its actions; 3) models of robotic trust in humans and its dual, deception, drawing on psychological models of interdependence theory; and 4) an approach toward the maintenance of dignity in human–robot relationships.
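To make the governor's gating role concrete, below is a minimal sketch of a run-time filter that suppresses proposed actions violating predefined constraints. This is only an illustration of the pattern the abstract describes, not the paper's implementation; the names `EthicalGovernor`, `Constraint`, and `Action` are hypothetical, and the paper's actual governor reasons over lethality-related behaviors and formalized rules of engagement.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

# Hypothetical types for illustration; the gating pattern, not the
# specific representation, is what the abstract describes.

@dataclass
class Action:
    name: str
    target: str

@dataclass
class Constraint:
    """A predefined norm: a predicate the action must NOT satisfy."""
    description: str
    violated_by: Callable[[Action], bool]

class EthicalGovernor:
    """Sits between deliberation and actuation, gating proposed actions
    against a fixed set of constraints. A violating action is suppressed
    (returns None) rather than executed."""

    def __init__(self, constraints: List[Constraint]):
        self.constraints = constraints

    def evaluate(self, action: Action) -> Optional[Action]:
        for c in self.constraints:
            if c.violated_by(action):
                print(f"Suppressed '{action.name}': violates '{c.description}'")
                return None
        return action

# Example: forbid any action directed at a protected target class.
no_harm_to_protected = Constraint(
    description="never engage protected targets",
    violated_by=lambda a: a.target == "protected",
)

governor = EthicalGovernor([no_harm_to_protected])
print(governor.evaluate(Action("engage", "protected")))  # None (suppressed)
print(governor.evaluate(Action("engage", "permitted")))  # passes through
```

The key design point is that the governor is a final, conservative check: it cannot originate behavior, only veto it, which keeps the normative constraints separate from the robot's task-level planning.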
- Subjects: DECISION making; AUTONOMOUS robots; INTELLIGENT agents; ROBOTS; SOCIAL norms
Details
- Language: English
- ISSN: 0018-9219
- Volume: 100
- Issue: 3
- Database: Complementary Index
- Journal: Proceedings of the IEEE
- Publication Type: Academic Journal
- Accession Number: 73740050
- Full Text: https://doi.org/10.1109/JPROC.2011.2173265