
Efficient Deweather Mixture-of-Experts with Uncertainty-aware Feature-wise Linear Modulation

Authors:
Zhang, Rongyu
Luo, Yulin
Liu, Jiaming
Yang, Huanrui
Dong, Zhen
Gudovskiy, Denis
Okuno, Tomoyuki
Nakata, Yohei
Keutzer, Kurt
Du, Yuan
Zhang, Shanghang
Publication Year:
2023

Abstract

The Mixture-of-Experts (MoE) approach has demonstrated outstanding scalability in multi-task learning, including low-level upstream tasks such as the concurrent removal of multiple adverse weather effects. However, the conventional MoE architecture with parallel Feed-Forward Network (FFN) experts incurs significant parameter and computational overheads that hinder its efficient deployment. In addition, the naive linear router of MoE is suboptimal at assigning task-specific features to multiple experts, which limits further scalability. In this work, we propose an efficient MoE architecture with weight sharing across the experts. Inspired by the idea of linear feature modulation (FM), our architecture implicitly instantiates multiple experts via learnable activation modulations on a single shared expert block. The proposed Feature Modulated Expert (FME) serves as a building block for the novel Mixture-of-Feature-Modulation-Experts (MoFME) architecture, which can scale up the number of experts with low overhead. We further propose an Uncertainty-aware Router (UaR) to assign task-specific features to different FM modules with well-calibrated weights. This enables MoFME to effectively learn diverse expert functions for multiple tasks. Experiments on the multi-deweather task show that MoFME outperforms the baselines in image restoration quality by 0.1-0.2 dB and achieves performance comparable to the state of the art while saving more than 72% of parameters and 39% of inference time over the conventional MoE counterpart. Experiments on downstream segmentation and classification tasks further demonstrate the generalizability of MoFME to real open-world applications.

Comment: AAAI 2024
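For a concrete picture of the core idea, below is a minimal PyTorch sketch (not the authors' released code) of a feature-modulated expert layer: one shared FFN whose hidden activations are modulated by per-expert FiLM-style parameters, mixed according to router weights. The class and parameter names (`FeatureModulatedExperts`, `gamma`, `beta`) and the parameter shapes are illustrative assumptions, and the plain linear-softmax router stands in for the paper's Uncertainty-aware Router, which would additionally calibrate the routing weights with an uncertainty estimate (e.g., via Monte Carlo dropout).

```python
import torch
import torch.nn as nn

class FeatureModulatedExperts(nn.Module):
    """One shared FFN expert; per-expert FiLM parameters (gamma, beta)
    implicitly instantiate multiple experts on its hidden activations.
    A sketch of the MoFME idea, not the authors' implementation."""

    def __init__(self, dim: int, hidden: int, num_experts: int):
        super().__init__()
        self.fc1 = nn.Linear(dim, hidden)   # shared expert weights
        self.fc2 = nn.Linear(hidden, dim)
        self.act = nn.GELU()
        # Per-expert modulation parameters (shapes are an assumption).
        self.gamma = nn.Parameter(torch.ones(num_experts, hidden))
        self.beta = nn.Parameter(torch.zeros(num_experts, hidden))
        # Plain linear router; the paper's UaR would calibrate these
        # weights with an uncertainty estimate instead.
        self.router = nn.Linear(dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        w = torch.softmax(self.router(x), dim=-1)  # (B, T, E) routing weights
        h = self.act(self.fc1(x))                  # (B, T, H) shared activations
        # Because fc2 is linear, mixing the FiLM parameters with the routing
        # weights first is equivalent to mixing the expert outputs, so the
        # shared FFN runs only once regardless of the number of experts.
        g = torch.einsum('bte,eh->bth', w, self.gamma)
        b = torch.einsum('bte,eh->bth', w, self.beta)
        return self.fc2(g * h + b)
```

A quick shape check: `FeatureModulatedExperts(dim=64, hidden=256, num_experts=8)(torch.randn(2, 100, 64))` returns a `(2, 100, 64)` tensor, and adding experts only grows the small `gamma`/`beta` tables rather than duplicating FFN weights, which is the source of the parameter savings the abstract reports.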

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2312.16610
Document Type:
Working Paper