AI Content Moderation, Racism and (de)Coloniality
- Author
Eugenia Siapera
- Subjects
Social Psychology, Social work, Compensation (psychology), Moderation, Racism, Epistemology, Expropriation, Critical reading, Developmental and Educational Psychology, Coloniality of power, Sociology, Moderation system, Social Sciences (miscellaneous)
- Abstract
The article develops a critical approach to AI in content moderation, adopting a decolonial perspective. In particular, the article asks: to what extent does the current AI moderation system of platforms address racist hate speech and discrimination? Based on a critical reading of publicly available materials and publications on AI in content moderation, we argue that racialised people have no significant input in the definitions of racist hate speech or in the decision-making processes around it, and that they are also exploited, as their unpaid labour is used to clean up platforms and to train AI systems. The disregard of the knowledge and experiences of racialised people, and the expropriation of their labour without compensation, reproduce rather than eradicate racism. To make theoretical sense of this, we draw on Anibal Quijano's theory of the coloniality of power and the centrality of race, concluding that, in its current iteration, AI in content moderation is a technology in the service of coloniality. Finally, the article develops a sketch for a decolonial approach to AI in content moderation, which aims to centre the voices of racialised communities and to reorient content moderation towards repairing, educating and sustaining communities.
- Published
- 2021