
Channel-spatial knowledge distillation for efficient semantic segmentation.

Authors :
Karine, Ayoub
Napoléon, Thibault
Jridi, Maher
Source :
Pattern Recognition Letters. Apr 2024, Vol. 180, p48-54. 7p.
Publication Year :
2024

Abstract

In this paper, we propose a new lightweight Channel-Spatial Knowledge Distillation (CSKD) method to handle the task of efficient image semantic segmentation. More precisely, we investigate the KD approach, which trains a compressed neural network called the student under the supervision of a heavy one called the teacher. In this context, we propose to improve the distillation mechanism by capturing the contextual dependencies in the spatial and channel dimensions through a self-attention principle. In addition, to quantify the difference between the teacher and student knowledge, we adopt the Centered Kernel Alignment (CKA) metric, which avoids requiring the student to add extra learning layers to match the teacher's feature size. Experimental results on the Cityscapes, CamVid and Pascal VOC datasets demonstrate that our method achieves outstanding performance. The code is available at https://github.com/ayoubkarine/CSKD.

• Heavy semantic segmentation methods require high computational costs.
• Knowledge distillation is adopted for efficient semantic segmentation.
• Spatial and channel distillations through self-attention between teacher and student networks are proposed.
• The Centered Kernel Alignment metric is used to measure the gap between teacher and student knowledge.
• An ablation study and comparisons with state-of-the-art methods on different image semantic segmentation datasets are presented.

[ABSTRACT FROM AUTHOR]
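Since this record carries only the abstract, the two ingredients it names can be illustrated with a minimal PyTorch sketch. This is our own reading under stated assumptions, not the authors' implementation (which lives in the linked GitHub repository): the functions linear_cka, channel_attention, spatial_attention and cskd_loss are hypothetical names, and the paper's exact attention and loss formulations may differ. The sketch shows why CKA removes the need for extra projection layers: it compares batch-level Gram matrices, so student and teacher features may have different dimensionality.

```python
import torch
import torch.nn.functional as F

def linear_cka(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Linear CKA between feature matrices x: (n, d_s) and y: (n, d_t).

    Computed from n x n Gram matrices, so d_s != d_t is fine: no extra
    learning layer is needed to match feature sizes. Requires n > 1.
    """
    x = x - x.mean(dim=0, keepdim=True)  # center columns
    y = y - y.mean(dim=0, keepdim=True)
    kx, ky = x @ x.T, y @ y.T            # (n, n) Gram matrices
    hsic_xy = (kx * ky).sum()            # tr(Kx Ky)
    denom = ((kx * kx).sum().sqrt() * (ky * ky).sum().sqrt()).clamp(min=1e-8)
    return hsic_xy / denom

def channel_attention(feat: torch.Tensor) -> torch.Tensor:
    """Channel self-attention map: (B, C, H, W) -> (B, C, C)."""
    f = feat.flatten(2)                  # (B, C, HW)
    return F.softmax(f @ f.transpose(1, 2), dim=-1)

def spatial_attention(feat: torch.Tensor) -> torch.Tensor:
    """Spatial self-attention map: (B, C, H, W) -> (B, HW, HW)."""
    f = feat.flatten(2)                  # (B, C, HW)
    return F.softmax(f.transpose(1, 2) @ f, dim=-1)

def cskd_loss(f_s: torch.Tensor, f_t: torch.Tensor) -> torch.Tensor:
    """Distillation loss: 1 - CKA similarity of channel and spatial maps."""
    loss = f_s.new_zeros(())
    for attn in (channel_attention, spatial_attention):
        a_s = attn(f_s).flatten(1)       # (B, C_s*C_s) or (B, HW*HW)
        a_t = attn(f_t).flatten(1)       # teacher side may be wider
        loss = loss + (1.0 - linear_cka(a_s, a_t))
    return loss

# Example: a 64-channel student map distilled against a 256-channel
# teacher map; the channel maps (64x64 vs 256x256) need no projection.
f_s = torch.randn(4, 64, 16, 16)
f_t = torch.randn(4, 256, 16, 16)
print(cskd_loss(f_s, f_t))
```

The Gram-matrix form of linear CKA is used here deliberately: its cost scales with the batch size rather than the (potentially very large) flattened attention-map dimension.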

Subjects

Subjects :
*DISTILLATION
*IMAGE segmentation

Details

Language :
English
ISSN :
0167-8655
Volume :
180
Database :
Academic Search Index
Journal :
Pattern Recognition Letters
Publication Type :
Academic Journal
Accession number :
176296639
Full Text :
https://doi.org/10.1016/j.patrec.2024.02.027