Generative Adversarial Network With Transformer for Hyperspectral Image Classification.
- Source:
- IEEE Geoscience & Remote Sensing Letters; 2023, Vol. 20, p1-5, 5p
- Publication Year:
- 2023
Abstract
- In recent years, generative adversarial networks (GANs) have made great progress in the field of hyperspectral image classification (HIC), alleviating the problem of insufficient training samples to a large extent. At present, GANs for HIC are all based on convolutional neural networks (CNNs), but CNNs extract sequence information poorly and struggle to model long-range dependencies. Hyperspectral data, however, are rich in spectral sequence information, and the Transformer has proven adept at processing sequences. Therefore, to process spectral information and alleviate the problem of insufficient training samples for hyperspectral images (HSIs), we propose a new framework, the Transformer with residual upscale GAN (TRUG). TRUG comprises a generator G and a discriminator D. In G, we propose a residual upscale (RU) module to increase the resolution of generated features while also extracting texture features and capturing contextual relationships. In addition, we visualize the generated fake images for more intuitive analysis. In D, we adopt Transformer blocks with progressively decreasing scale and use a grid self-attention mechanism in the first layer to better extract features for classification. GANs are also prone to unstable training; to address this, we improve the normalization algorithm and add relative position coding. We apply a pure Transformer-based GAN to HIC datasets. Experimental results show that the proposed TRUG model outperforms other models on the three datasets. [ABSTRACT FROM AUTHOR]
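The abstract mentions two Transformer ingredients used in the discriminator: self-attention over the spectral token sequence and an additive relative position coding. As a minimal illustration only (this is not the authors' TRUG code; all names, shapes, and the per-offset bias table are assumptions), single-head self-attention with a relative-position bias can be sketched in NumPy as:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_rel_pos(tokens, w_q, w_k, w_v, bias_table):
    """Single-head self-attention over n spectral tokens with an
    additive relative-position bias (hypothetical shapes).

    tokens:     (n, d_model) token embeddings
    w_q/w_k/w_v:(d_model, d) projection matrices
    bias_table: (2n-1,) one bias value per relative offset
    """
    n = tokens.shape[0]
    q, k, v = tokens @ w_q, tokens @ w_k, tokens @ w_v
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                       # (n, n)
    # Relative position coding: look up a bias for each offset i - j.
    idx = np.arange(n)
    scores = scores + bias_table[idx[:, None] - idx[None, :] + n - 1]
    attn = softmax(scores, axis=-1)                     # rows sum to 1
    return attn @ v, attn

# Toy usage with random weights (shapes chosen for illustration).
rng = np.random.default_rng(0)
n, d_model, d = 6, 8, 4
tokens = rng.normal(size=(n, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d)) for _ in range(3))
bias_table = rng.normal(size=(2 * n - 1,))
out, attn = self_attention_rel_pos(tokens, w_q, w_k, w_v, bias_table)
```

Because the bias depends only on the offset i - j, the same table is shared across all token pairs at a given distance, which is one common way relative position coding stabilizes attention over sequences.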
Details
- Language:
- English
- ISSN:
- 1545-598X
- Volume:
- 20
- Database:
- Complementary Index
- Journal:
- IEEE Geoscience & Remote Sensing Letters
- Publication Type:
- Academic Journal
- Accession number:
- 176253628
- Full Text:
- https://doi.org/10.1109/LGRS.2023.3322139