
Deep Hyperspectral Unmixing using Transformer Network

Authors :
Ghosh, Preetam
Roy, Swalpa Kumar
Koirala, Bikram
Rasti, Behnood
Scheunders, Paul
Publication Year :
2022

Abstract

Currently, this paper is under review in IEEE. Transformers have intrigued the vision research community with their state-of-the-art performance in natural language processing. With their superior performance, transformers have found their way into the field of hyperspectral image classification and achieved promising results. In this article, we harness the power of transformers for the task of hyperspectral unmixing and propose a novel deep unmixing model. We exploit the ability of transformers to capture global feature dependencies in order to enhance the quality of both the endmember spectra and the abundance maps. The proposed model combines a convolutional autoencoder with a transformer: the convolutional encoder encodes the hyperspectral data, the transformer captures long-range dependencies between the representations derived from the encoder, and a convolutional decoder reconstructs the data. We applied the proposed unmixing model to three widely used unmixing datasets, i.e., Samson, Apex, and Washington DC Mall, and compared it with the state of the art in terms of root mean squared error and spectral angle distance. The source code for the proposed model will be made publicly available at \url{https://github.com/preetam22n/DeepTrans-HSU}.
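The abstract evaluates unmixing quality with two metrics: root mean squared error (RMSE) and spectral angle distance (SAD). As a minimal sketch of what these metrics compute (the function names `rmse` and `sad` and the toy spectra are illustrative, not taken from the authors' code), the standard definitions can be written with the Python standard library alone:

```python
import math

def rmse(a, b):
    """Root mean squared error between two equal-length spectra."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def sad(a, b):
    """Spectral angle distance (in radians) between two spectra:
    the angle between the vectors, insensitive to scaling."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # Clamp against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

ref = [0.2, 0.4, 0.6]
print(rmse(ref, ref))             # identical spectra: RMSE is 0.0
print(sad(ref, [0.1, 0.2, 0.3]))  # parallel spectra: angle ~ 0
```

SAD is the common companion to RMSE in unmixing studies because a scaled copy of an endmember spectrum (e.g., due to illumination) has zero angular distance even though its RMSE is nonzero.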

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2203.17076
Document Type :
Working Paper
Full Text :
https://doi.org/10.1109/TGRS.2022.3196057