1. Domain‐invariant attention network for transfer learning between cross‐scene hyperspectral images.
- Author
Ye, Minchao; Wang, Chenglong; Meng, Zhihao; Xiong, Fengchao; Qian, Yuntao
- Subjects
MACHINE learning; FEATURE extraction; IMAGE sensors
- Abstract
The small-sample-size problem is a persistent challenge in hyperspectral image (HSI) classification. Because similar scenes often share co-occurring land-cover classes, transfer learning can be applied, and cross-scene classification has emerged in recent years as a feasible approach. In cross-scene classification, a source scene with sufficient labelled samples assists the classification of a target scene that has only a few labelled samples. In most situations, different HSI scenes are imaged by different sensors and therefore have different input feature dimensions (i.e. numbers of bands), so heterogeneous transfer learning is required. An end-to-end heterogeneous transfer learning algorithm, named the domain-invariant attention network (DIAN), is proposed to solve the cross-scene classification problem. DIAN mainly contains two modules. (1) A feature-alignment CNN (FACNN) extracts features from the source and target scenes, respectively, projecting the heterogeneous features of the two scenes into a shared low-dimensional subspace. (2) A domain-invariant attention block enforces cross-domain consistency through a specially designed class-specific domain-invariance loss, further reducing the domain shift. Experiments on two cross-scene HSI datasets show that the proposed DIAN achieves satisfactory classification results. [ABSTRACT FROM AUTHOR]
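The abstract describes a two-branch design: per-domain feature extractors that map scenes with different band counts into a shared subspace, an attention block applied to the aligned features, and a class-specific loss that pulls same-class features from the two domains together. The PyTorch sketch below is only an illustration of that idea under stated assumptions; it is not the authors' implementation, and all module names (FeatureAlignCNN, DomainInvariantAttention, DIANSketch), layer sizes, and the particular form of the class-specific loss (per-class feature-mean alignment here) are hypothetical choices.

```python
# Minimal sketch, assuming a simple 1-D spectral CNN per domain and a
# per-class feature-mean alignment as a stand-in for the class-specific
# domain-invariance loss. Layer sizes and names are illustrative only.
import torch
import torch.nn as nn

class FeatureAlignCNN(nn.Module):
    """Per-domain spectral CNN: maps n_bands inputs to a shared_dim feature."""
    def __init__(self, n_bands: int, shared_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(32),
            nn.Flatten(),
            nn.Linear(16 * 32, shared_dim),   # projection into shared subspace
        )

    def forward(self, x):                      # x: (batch, n_bands)
        return self.net(x.unsqueeze(1))        # -> (batch, shared_dim)

class DomainInvariantAttention(nn.Module):
    """Channel attention applied identically to source and target features."""
    def __init__(self, shared_dim: int = 64):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(shared_dim, shared_dim), nn.Sigmoid())

    def forward(self, f):
        return f * self.gate(f)

class DIANSketch(nn.Module):
    def __init__(self, src_bands: int, tgt_bands: int, n_classes: int, shared_dim: int = 64):
        super().__init__()
        self.src_branch = FeatureAlignCNN(src_bands, shared_dim)   # source-scene extractor
        self.tgt_branch = FeatureAlignCNN(tgt_bands, shared_dim)   # target-scene extractor
        self.attention = DomainInvariantAttention(shared_dim)      # shared attention block
        self.classifier = nn.Linear(shared_dim, n_classes)         # shared classifier head

    def forward(self, x, domain: str):
        f = self.src_branch(x) if domain == "source" else self.tgt_branch(x)
        return self.classifier(self.attention(f))

def class_specific_invariance_loss(f_src, y_src, f_tgt, y_tgt, n_classes):
    """Illustrative class-specific loss: penalise the distance between
    per-class mean features of the source and target domains."""
    loss = f_src.new_zeros(())
    for c in range(n_classes):
        s, t = f_src[y_src == c], f_tgt[y_tgt == c]
        if len(s) and len(t):
            loss = loss + (s.mean(0) - t.mean(0)).pow(2).sum()
    return loss / n_classes
```

In this sketch the attention block and classifier are shared across domains, so only the branch CNNs are domain-specific; training would combine the classification losses of both scenes with the alignment term above, which is one plausible reading of the abstract rather than the paper's exact objective.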
- Published
- 2023