Hyperspectral Feature Extraction Using Sparse and Smooth Low-Rank Analysis
- Authors
- Behnood Rasti, Pedram Ghamisi, and Magnus O. Ulfarsson
- Subjects
classification, constrained penalized cost function, feature extraction, hyperspectral image, low-rank, total variation, sparse features, smooth features, Science
- Abstract
In this paper, we develop a hyperspectral feature extraction method called sparse and smooth low-rank analysis (SSLRA). First, we propose a new low-rank model for hyperspectral images (HSIs) in which the HSI is decomposed into smooth and sparse components. These components are then estimated simultaneously by minimizing a nonconvex constrained penalized cost function (CPCF). The proposed CPCF combines a total variation (TV) penalty, an ℓ1 penalty, and an orthogonality constraint. The TV penalty promotes piecewise smoothness and therefore extracts spatial (local neighborhood) information, while the ℓ1 penalty encourages sparse and spatial structures. Additionally, we show that this new type of decomposition improves the classification of HSIs. In the experiments, SSLRA was applied to the Houston (urban) and Trento (rural) datasets, and the extracted features were fed to a classifier (either support vector machines (SVM) or random forest (RF)) to produce the final classification map. The results confirm an improvement in classification accuracy over state-of-the-art feature extraction approaches.
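The abstract does not state the cost function explicitly; as a rough illustrative sketch only, a constrained penalized objective of the kind described (a data-fit term, a TV penalty on the smooth component, an ℓ1 penalty on the sparse component, and an orthogonality constraint on the basis) could be written as below. The symbols Y (the HSI in matrix form), W (smooth component), S (sparse component), A (orthogonal basis), r, λ_tv, and λ_1 are our own placeholder notation, not necessarily the paper's.

```latex
% Minimal sketch (our notation, not the paper's) of a constrained penalized
% cost function consistent with the abstract: data fit + TV penalty on the
% smooth component W + l1 penalty on the sparse component S, subject to an
% orthogonality constraint on the basis A.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
\begin{equation*}
  \min_{\mathbf{W},\,\mathbf{S},\,\mathbf{A}}\;
  \tfrac{1}{2}\,\lVert \mathbf{Y} - (\mathbf{W}+\mathbf{S})\mathbf{A}^{\top} \rVert_F^{2}
  \;+\; \lambda_{\mathrm{tv}}\,\mathrm{TV}(\mathbf{W})
  \;+\; \lambda_{1}\,\lVert \mathbf{S} \rVert_{1}
  \quad \text{subject to} \quad \mathbf{A}^{\top}\mathbf{A} = \mathbf{I}_{r}.
\end{equation*}
\end{document}
```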
- Published
- 2019