
Knowledge Amalgamation for Object Detection with Transformers

Authors :
Zhang, Haofei
Mao, Feng
Xue, Mengqi
Fang, Gongfan
Feng, Zunlei
Song, Jie
Song, Mingli
Publication Year :
2022

Abstract

Knowledge amalgamation (KA) is a novel deep model reusing task that aims to transfer knowledge from several well-trained teachers to a multi-talented and compact student. Currently, most KA approaches are tailored for convolutional neural networks (CNNs). However, transformers, with a completely different architecture, have begun to challenge the dominance of CNNs in many computer vision tasks, and directly applying previous KA methods to transformers leads to severe performance degradation. In this work, we explore a more effective KA scheme for transformer-based object detection models. Specifically, considering the architectural characteristics of transformers, we propose to decompose KA into two aspects: sequence-level amalgamation (SA) and task-level amalgamation (TA). In the sequence-level amalgamation, a hint is generated by concatenating the teacher sequences instead of redundantly aggregating them into a fixed-size one, as previous KA works do. In the task-level amalgamation, the student efficiently learns heterogeneous detection tasks through soft targets. Extensive experiments on PASCAL VOC and COCO show that sequence-level amalgamation significantly boosts the performance of students, whereas previous methods impair them. Moreover, the transformer-based students excel at learning amalgamated knowledge: they master heterogeneous detection tasks rapidly and achieve performance superior or at least comparable to that of the teachers in their respective specializations.

Comment: This work has been submitted to the IEEE for possible publication
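The contrast the abstract draws between sequence-level amalgamation and earlier fixed-size aggregation can be illustrated with a minimal sketch. The function names, toy shapes, and the averaging baseline below are illustrative assumptions, not the authors' actual implementation; the point is only that concatenation preserves every teacher token, while fixed-size aggregation compresses the teachers into one sequence and discards teacher-specific detail.

```python
# Hedged sketch of the sequence-level amalgamation (SA) idea: the hint is
# formed by concatenating the teachers' token sequences along the sequence
# axis, rather than aggregating them into a single fixed-size sequence.
# All names and shapes here are illustrative, not the paper's real code.

def concat_hint(teacher_sequences):
    """Concatenate teacher token sequences into one longer hint sequence."""
    hint = []
    for seq in teacher_sequences:
        hint.extend(seq)  # every teacher token is preserved
    return hint

def fixed_size_hint(teacher_sequences):
    """Previous-style aggregation (assumed here to be token-wise averaging):
    collapses all teachers into one fixed-size sequence, losing detail."""
    length = len(teacher_sequences[0])
    dim = len(teacher_sequences[0][0])
    n = len(teacher_sequences)
    return [
        [sum(seq[t][d] for seq in teacher_sequences) / n for d in range(dim)]
        for t in range(length)
    ]

# Two toy "teacher" sequences of 3 tokens with 2-dim features each.
t1 = [[1.0, 0.0], [2.0, 0.0], [3.0, 0.0]]
t2 = [[0.0, 1.0], [0.0, 2.0], [0.0, 3.0]]

print(len(concat_hint([t1, t2])))      # 6 tokens: all information kept
print(len(fixed_size_hint([t1, t2])))  # 3 tokens: compressed to fixed size
```

In a transformer, the longer concatenated hint is unproblematic because attention handles variable-length sequences, which is one plausible reason the abstract reports that SA helps where CNN-style fixed-size aggregation hurts.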

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2203.03187
Document Type :
Working Paper
Full Text :
https://doi.org/10.1109/TIP.2023.3263105