1. Semantic Knowledge Distillation for Onboard Satellite Earth Observation Image Classification
- Authors
Thanh-Dung Le, Vu Nguyen Ha, Ti Ti Nguyen, Geoffrey Eappen, Prabhu Thiruvasagam, Hong-fu Chou, Duc-Dung Tran, Luis M. Garces-Socarras, Jorge L. Gonzalez-Rios, Juan Carlos Merlano-Duncan, and Symeon Chatzinotas
- Subjects
Computer Science - Computer Vision and Pattern Recognition, Computer Science - Machine Learning, Electrical Engineering and Systems Science - Signal Processing
- Abstract
This study presents an innovative dynamic weighting knowledge distillation (KD) framework tailored for efficient Earth observation (EO) image classification (IC) in resource-constrained settings. Using EfficientViT and MobileViT as teacher models, the framework enables lightweight student models, particularly ResNet8 and ResNet16, to surpass 90% in accuracy, precision, and recall while adhering to the stringent confidence thresholds necessary for reliable classification tasks. Unlike conventional KD methods that rely on static weight distribution, our adaptive weighting mechanism responds to each teacher model's confidence, allowing student models to dynamically prioritize more credible sources of knowledge. Remarkably, ResNet8 delivers substantial efficiency gains, achieving a 97.5% reduction in parameters, a 96.7% decrease in FLOPs, an 86.2% cut in power consumption, and a 63.5% increase in inference speed over MobileViT. This significant reduction in complexity and resource demands establishes ResNet8 as an optimal candidate for EO tasks, combining robust performance with deployment feasibility. The confidence-based, adaptable KD approach underscores the potential of dynamic distillation strategies to yield high-performing, resource-efficient models tailored for satellite-based EO applications. The reproducible code is accessible on our GitHub repository.
- Comment
Under revision
- Published
2024
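
Below is a minimal, hypothetical sketch of the confidence-based dynamic weighting the abstract describes, written in a PyTorch setting; the confidence measure (mean max softmax probability per teacher), the temperature `T`, and the mixing factor `alpha` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def dynamic_weighted_kd_loss(student_logits, teacher_logits_list, labels,
                             T=4.0, alpha=0.5):
    """Hypothetical confidence-weighted multi-teacher distillation loss.

    Each teacher's soft targets are weighted by its batch-level confidence
    (mean max softmax probability), so the student leans on the more
    credible teacher -- an assumed stand-in for the paper's adaptive rule.
    """
    # Hard-label supervision on the ground-truth classes.
    ce = F.cross_entropy(student_logits, labels)

    # Per-teacher confidence scores, normalized into dynamic weights.
    confidences = torch.stack([
        t.softmax(dim=-1).max(dim=-1).values.mean() for t in teacher_logits_list
    ])
    weights = confidences / confidences.sum()

    # Temperature-scaled KL divergence to each teacher, combined by weight.
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    kd = sum(
        w * F.kl_div(log_p_student, F.softmax(t / T, dim=-1),
                     reduction="batchmean") * (T * T)
        for w, t in zip(weights, teacher_logits_list)
    )
    return alpha * ce + (1.0 - alpha) * kd
```

In the setting the abstract describes, `teacher_logits_list` would hold the EfficientViT and MobileViT outputs for a batch, and `student_logits` would come from ResNet8 or ResNet16.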