
A Deep Neural Network Combined CNN and GCN for Remote Sensing Scene Classification

Authors :
Jiali Liang
Yufan Deng
Dan Zeng
Source :
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol 13, Pp 4325-4338 (2020)
Publication Year :
2020
Publisher :
IEEE, 2020.

Abstract

Learning powerful discriminative features is the key to remote sensing scene classification. Most existing approaches based on convolutional neural networks (CNNs) have achieved great results. However, they mainly focus on global visual features while ignoring object-based location features, which are important for large-scale scene classification. Remote sensing images contain a large number of scene-related ground objects, and graph convolutional networks (GCNs) have the potential to capture the dependencies among these objects. This article introduces a novel two-stream architecture that combines global visual features and object-based location features to improve feature representation capability. First, we extract appearance features from the whole scene image with a CNN. Second, we detect ground objects and construct a graph over them to learn spatial location features with a GCN. As a result, the network jointly captures appearance information and spatial location information. To the best of the authors' knowledge, we are the first to investigate the dependencies among objects in the remote sensing scene classification task. Extensive experiments on two datasets show that our framework improves the discriminative ability of the features and achieves competitive accuracy against other state-of-the-art approaches.
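The two-stream idea in the abstract can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' exact architecture: it uses the standard Kipf–Welling GCN propagation rule over a hypothetical adjacency matrix of detected objects, mean-pools the node embeddings into a graph-level location feature, and concatenates it with a stand-in for the CNN's global visual feature. All dimensions, the object graph, and the random features are assumptions for demonstration.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN propagation step: H = ReLU(D^-1/2 (A+I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

rng = np.random.default_rng(0)

# Stream 1: a global visual feature vector, as a CNN backbone would produce.
cnn_feat = rng.standard_normal(128)

# Stream 2: five detected ground objects, each with a 16-d descriptor
# (e.g. a class embedding plus normalized box coordinates), linked by an
# assumed adjacency matrix A encoding spatial relations.
X = rng.standard_normal((5, 16))
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 0, 1, 0],
              [1, 0, 0, 1, 1],
              [0, 1, 1, 0, 0],
              [0, 0, 1, 0, 0]], dtype=float)

W = rng.standard_normal((16, 32))
H = gcn_layer(A, X, W)        # (5, 32) node embeddings
gcn_feat = H.mean(axis=0)     # pool nodes into a graph-level location feature

# Fuse the two streams; a classifier head would sit on top of this vector.
fused = np.concatenate([cnn_feat, gcn_feat])
print(fused.shape)  # (160,)
```

In a real pipeline the 128-d vector would come from a pretrained CNN, the node descriptors from an object detector, and both `W` and the classifier would be learned end to end; the fixed random values here only demonstrate how the shapes and the fusion fit together.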

Details

Language :
English
ISSN :
2151-1535
Volume :
13
Database :
Directory of Open Access Journals
Journal :
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Publication Type :
Academic Journal
Accession number :
edsdoj.16a91886a8194615b1c40821a3a6b6be
Document Type :
article
Full Text :
https://doi.org/10.1109/JSTARS.2020.3011333