1. Knowledge Consistency Distillation for Weakly Supervised One Step Person Search
- Author
Li, Zongyi, Shi, Yuxuan, Ling, Hefei, Chen, Jiazhong, Wang, Runsheng, Zhao, Chengxin, Wang, Qian, and Huang, Shijuan
- Abstract
Weakly supervised person search aims to detect and identify a person with only bounding box annotations. Recent approaches have focused on learning person relations in a single model, ignoring the conflicts between the detection and Re-ID heads, along with the influence of background elements, which may lead to noisy pseudo labels and inaccurate Re-ID features. To address this challenge, we introduce a novel framework named Knowledge Consistency Distillation (KCD) for weakly supervised person search, which leverages the capabilities of an advanced unsupervised person re-identification (Re-ID) model to mitigate these conflicts and background influences. We propose hierarchical consistency alignments, including feature-level, cluster-level, and instance-level consistency alignment, to synchronize knowledge from the state-of-the-art unsupervised Re-ID model. Specifically, the feature-level consistency aligns features through both context and relation alignment. The cluster-level consistency aligns the teacher's cluster information by reusing its OIM module. To tackle the inconsistency between student instances and teacher cluster centroids, we incorporate pseudo-label refinement to help the student model comprehend the teacher's knowledge at the cluster level while mitigating the negative effects of noisy labels. Finally, an instance-level consistency loss, weighted by the similarity between each instance and its corresponding cluster, is proposed to align the positive instance correlations. Our approach trains a one-step weakly supervised person search model by exploiting the characteristics of unsupervised person Re-ID. Extensive experiments demonstrate that our method achieves state-of-the-art performance on two widely used person search datasets, CUHK-SYSU and PRW. Our code will be available on GitHub at https://github.com/zongyi1999/KCD.
- Published
- 2024
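The abstract describes the similarity-weighted instance-level consistency loss only in words. The sketch below is one possible reading of that weighting: student instance features are pulled toward the teacher's instance features, with each instance weighted by how close the teacher places it to its assigned cluster centroid. The function name, tensor shapes, and cosine-similarity weighting are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def weighted_instance_consistency_loss(student_feats, teacher_feats,
                                        teacher_centroids, pseudo_labels):
    """Similarity-weighted instance-level consistency loss (illustrative sketch).

    student_feats:     (N, D) Re-ID features from the student person search model
    teacher_feats:     (N, D) features for the same boxes from the unsupervised Re-ID teacher
    teacher_centroids: (K, D) cluster centroids maintained by the teacher
    pseudo_labels:     (N,)   cluster index assigned to each instance
    """
    student_feats = F.normalize(student_feats, dim=1)
    teacher_feats = F.normalize(teacher_feats, dim=1)
    teacher_centroids = F.normalize(teacher_centroids, dim=1)

    # Weight each instance by its cosine similarity to its own cluster centroid
    # under the teacher: instances the teacher is confident about contribute more.
    weights = (teacher_feats * teacher_centroids[pseudo_labels]).sum(dim=1).clamp(min=0)

    # Align student instance features with the teacher's instance features.
    per_instance = 1.0 - (student_feats * teacher_feats).sum(dim=1)
    return (weights * per_instance).sum() / weights.sum().clamp(min=1e-6)
```

The clamp on the weights is a guard against degenerate cases (all-zero similarities); the exact loss form, normalization, and refinement of the pseudo labels used in KCD should be taken from the authors' repository once released.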