1. There is an elephant in the room: towards a critique on the use of fairness in biometrics
- Author
- Ana Valdivia, Júlia Corbera Serrajòrdia, and Aneta Swianiewicz
- Subjects
- FOS: Computer and information sciences, Computer Science - Computers and Society, Artificial Intelligence (cs.AI), Computer Science - Artificial Intelligence, Mechanical Engineering, Data_MISCELLANEOUS, Computers and Society (cs.CY), Energy Engineering and Power Technology, Management Science and Operations Research
- Abstract
- In 2019, the UK's Immigration and Asylum Chamber of the Upper Tribunal dismissed an asylum appeal, basing its decision on the output of a biometric system alongside other discrepancies. The asylum seeker's fingerprints were found in a biometric database that contradicted the appellant's account. The Tribunal found this evidence unequivocal and denied the asylum claim. Nowadays, the proliferation of biometric systems is shaping public debates around their political, social and ethical implications. Yet while concerns about the racialised use of this technology for migration control have been on the rise, investment in the biometrics industry and its innovation is increasing considerably. Moreover, fairness has recently been adopted in biometrics to mitigate bias and discrimination in these systems. However, algorithmic fairness cannot distribute justice in scenarios that are broken or whose intended purpose is to discriminate, such as biometrics deployed at the border. In this paper, we offer a critical reading of recent debates about biometric fairness and show their limitations, drawing on research in fairness in machine learning and critical border studies. Building on previous fairness demonstrations, we prove that biometric fairness criteria are mathematically mutually exclusive. The paper then moves on to illustrate empirically that a fair biometric system is not possible by reproducing experiments from previous works. Finally, we discuss the politics of fairness in biometrics by situating the debate at the border. We claim that bias and error rates have different impacts on citizens and asylum seekers. Fairness has overshadowed the elephant in the room of biometrics, focusing on the demographic biases and ethical discourses of algorithms rather than examining how these systems reproduce historical and political injustices.
- Comment
- 14 pages, 3 figures
- Published
- 2023
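
The abstract's claim that biometric fairness criteria are mathematically mutually exclusive echoes well-known impossibility results in fair machine learning (e.g., Chouldechova, 2017; Kleinberg et al., 2017). As an illustrative sketch only, not the paper's own formalisation: assume a binary match/non-match classifier, two demographic groups $a$ and $b$ with match base rates $p_a \neq p_b$, and the usual confusion-matrix rates (FPR, FNR, PPV); the specific criteria compared below (error-rate parity and predictive parity) are assumptions for illustration.

\[
  \mathrm{FPR}_g \;=\; \frac{p_g}{1-p_g}\cdot\frac{1-\mathrm{PPV}_g}{\mathrm{PPV}_g}\cdot\bigl(1-\mathrm{FNR}_g\bigr),
  \qquad g \in \{a, b\}.
\]

If error-rate parity ($\mathrm{FPR}_a=\mathrm{FPR}_b$, $\mathrm{FNR}_a=\mathrm{FNR}_b$) and predictive parity ($\mathrm{PPV}_a=\mathrm{PPV}_b$) all hold with non-degenerate rates ($\mathrm{FPR}_g>0$, $\mathrm{FNR}_g<1$, $\mathrm{PPV}_g<1$), the identity forces $p_a = p_b$. Hence, whenever base rates differ across groups, these fairness criteria cannot all be satisfied simultaneously; the paper's argument about biometric fairness criteria is of this general kind.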