1. Group Fairness in Peer Review
- Authors
Aziz, Haris; Micha, Evi; Shah, Nisarg
- Subjects
Computer Science - Computer Science and Game Theory, Computer Science - Artificial Intelligence, Computer Science - Social and Information Networks, Physics - Physics and Society
- Abstract
Large conferences such as NeurIPS and AAAI serve as crossroads of various AI fields, since they attract submissions from a vast number of communities. However, in some cases, this has resulted in a poor reviewing experience for some communities, whose submissions get assigned to less qualified reviewers outside of their communities. An often-advocated solution is to break up any such large conference into smaller conferences, but this can lead to the isolation of communities and harm interdisciplinary research. We tackle this challenge by introducing a notion of group fairness, called the core, which requires that every possible community (subset of researchers) be treated in a way that prevents them from unilaterally benefiting by withdrawing from a large conference. We study a simple peer review model, prove that it always admits a reviewing assignment in the core, and design an efficient algorithm to find one such assignment. We use real data from the CVPR and ICLR conferences to compare our algorithm to existing reviewing assignment algorithms on a number of metrics.
- Comment
A preliminary version appeared at NeurIPS 2023
- Published
2024
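- Note
The abstract states the core condition only informally. A minimal formal sketch, under assumed notation (the symbols N, A, u_i, S, and the deviation assignment A' are illustrative and not taken from the paper, which may use a weak-improvement variant of blocking), might read as follows:

```latex
% Hedged sketch of the core condition described in the abstract.
% The notation below (N, A, u_i, S, A') is assumed for illustration;
% the paper's own model and blocking notion may differ in detail.
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}

Let $N$ be the set of researchers, each of whom both submits a paper and
serves as a reviewer, and let $u_i(A)$ denote researcher $i$'s value for
the reviewers assigned to her paper under a reviewing assignment $A$.
One way to state the core condition:
\[
  A \text{ is in the core} \iff
  \neg \exists\, S \subseteq N \text{ and assignment } A'
  \text{ of } S\text{'s papers to reviewers in } S
  \text{ with } u_i(A') > u_i(A) \ \forall\, i \in S .
\]

\end{document}
```

Read informally: no community S could withdraw from the conference, review only its own papers, and make every one of its members strictly better off, which matches the abstract's description of the core.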