SCT: A Simple Baseline for Parameter-Efficient Fine-Tuning via Salient Channels

Authors :
Zhao, Henry Hengyuan
Wang, Pichao
Zhao, Yuyang
Luo, Hao
Wang, Fan
Shou, Mike Zheng
Publication Year :
2023

Abstract

Pre-trained vision transformers provide strong representations that benefit a wide range of downstream tasks. Recently, many parameter-efficient fine-tuning (PEFT) methods have been proposed, and their experiments demonstrate that tuning only 1% extra parameters can surpass full fine-tuning in low-data-resource scenarios. However, these methods overlook task-specific information when fine-tuning on diverse downstream tasks. In this paper, we propose a simple yet effective method called "Salient Channel Tuning" (SCT) that leverages task-specific information: we forward task images through the model to select salient channels in a feature map, which enables us to tune only 1/8 of the channels, leading to significantly lower parameter costs. Experiments on 19 visual transfer learning downstream tasks demonstrate that SCT outperforms full fine-tuning on 18 of the 19 tasks while adding only 0.11M parameters to ViT-B, 780× fewer than its full fine-tuning counterpart. Furthermore, experiments on domain generalization and few-shot classification demonstrate the effectiveness and generality of our approach. The code is available at https://github.com/showlab/SCT.

Comment: This work has been accepted by IJCV
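The abstract describes the mechanism only at a high level. The snippet below is a minimal PyTorch sketch of that idea, assuming a magnitude-based saliency criterion; the names select_salient_channels and SalientChannelTuner, as well as the additive-delta design, are illustrative assumptions rather than the paper's exact algorithm (see the linked repository for the official implementation).

```python
# Hypothetical sketch: rank feature-map channels by activation on task
# images, then tune only the top 1/8 "salient" channels while the
# backbone stays frozen. Saliency criterion and class design are
# assumptions, not SCT's exact method.
import torch
import torch.nn as nn


def select_salient_channels(feature_extractor: nn.Module,
                            task_images: torch.Tensor,
                            ratio: float = 1 / 8) -> torch.Tensor:
    """Forward task images through the frozen model and keep the top
    `ratio` fraction of channels, ranked by mean absolute activation."""
    feature_extractor.eval()
    with torch.no_grad():
        feats = feature_extractor(task_images)      # (..., channels)
        saliency = feats.abs().mean(dim=tuple(range(feats.dim() - 1)))
        k = max(1, int(saliency.numel() * ratio))
        return saliency.topk(k).indices             # salient channel indices


class SalientChannelTuner(nn.Module):
    """Wraps a frozen linear layer and learns an additive update only
    for the selected output channels; all other weights stay frozen."""

    def __init__(self, frozen_linear: nn.Linear, salient_idx: torch.Tensor):
        super().__init__()
        self.base = frozen_linear
        for p in self.base.parameters():
            p.requires_grad_(False)                 # backbone stays frozen
        self.register_buffer("salient_idx", salient_idx)
        # Trainable delta covering only the salient output channels.
        self.delta = nn.Parameter(
            torch.zeros(salient_idx.numel(), frozen_linear.in_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.base(x)
        update = x @ self.delta.T                   # (..., k) salient update
        return out.index_add(-1, self.salient_idx, update)
```

Under these assumptions, one would wrap a chosen projection in each transformer block, e.g. `blk.mlp.fc2 = SalientChannelTuner(blk.mlp.fc2, idx)`, and train only the `delta` parameters on the downstream task, which touches roughly 1/8 of the layer's output channels.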

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2309.08513
Document Type :
Working Paper