Proxy responsibility: addressing responsibility gaps in human-machine decision making on the resort to force.
- Source :
- Australian Journal of International Affairs. Apr 2024, Vol. 78, Issue 2, p191-199. 9p.
- Publication Year :
- 2024
Abstract
- The integration of AI in resort-to-force decision making gives rise to substantial threats and problems. One significant challenge is that incorporating a machine into the decision-making process can result in responsibility gaps for decisions informed or made by the machine. Ethically, a situation in which lethal violence can be employed without a responsible subject to blame for any wrongdoing is unacceptable. But how can responsibility be attributed if a machine is involved in the decision-making process on war and peace? To address this question, I introduce the concept of 'proxy responsibility'. I contend that since we cannot ascribe moral responsibility to the AI artefact itself, we must identify responsibility relations in the structures in which AI decision making operates. A dynamic and contextual concept of responsibility positions AI in the broader decision-making process of the political, military, and economic system, and helps to unfold different responsibility layers among the involved actors. I argue that the more we move in the direction of machine autonomy, the denser the web of proxy responsibility relations in the environment of AI must become to address the aforementioned gaps in responsibility. [ABSTRACT FROM AUTHOR]
- Subjects :
- *DECISION making
- *PEACE negotiations
- *RESPONSIBILITY
- *ARTIFICIAL intelligence

Details
- Language :
- English
- ISSN :
- 10357718
- Volume :
- 78
- Issue :
- 2
- Database :
- Academic Search Index
- Journal :
- Australian Journal of International Affairs
- Publication Type :
- Academic Journal
- Accession number :
- 177593933
- Full Text :
- https://doi.org/10.1080/10357718.2024.2327384