
What Can Attention Module Do in Knowledge Distillation?

Authors:
Xiaolin Li
Bowen Huang
Gang Xu
Zhuohao Chen
Source:
2021 4th International Conference on Robotics, Control and Automation Engineering (RCAE)
Publication Year:
2021
Publisher:
IEEE

Details

Database:
OpenAIRE
Accession number:
edsair.doi...........50839a4273d3f0028624b6a206dd42a1