
Ship Detection under Complex Backgrounds Based on Accurate Rotated Anchor Boxes from Paired Semantic Segmentation.

Authors :
Xiao, Xiaowu
Zhou, Zhiqiang
Wang, Bo
Li, Linhao
Miao, Lingjuan
Source :
Remote Sensing; Nov2019, Vol. 11 Issue 21, p2506, 1p
Publication Year :
2019

Abstract

It is still challenging to effectively detect ship objects in optical remote-sensing images with complex backgrounds. Many current CNN-based one-stage and two-stage detection methods first predefine a series of anchors with various scales, aspect ratios and angles, and then output detection results by performing classification and bounding-box regression once or twice on the predefined anchors. However, most of these predefined anchors have relatively low accuracy and contribute nothing to the subsequent classification and regression. In addition, preset anchors do not transfer robustly to other detection datasets. To avoid these problems, in this paper we design a paired semantic segmentation network that generates fewer but more accurate rotated anchors. Specifically, the paired segmentation network predicts four parts of each ship (i.e., the top-left, bottom-right, top-right, and bottom-left parts). By combining a paired top-left and bottom-right part (or top-right and bottom-left part), we take the minimum bounding box of the two parts as the rotated anchor. This approach is more robust across different ship datasets, and the generated anchors are both more accurate and fewer in number. Furthermore, to effectively exploit fine-scale detail information and coarse-scale semantic information, we use magnified convolutional features to classify and regress the generated rotated anchors. Meanwhile, the horizontal minimum bounding box of each rotated anchor is also used to incorporate more context information. We compare the proposed algorithm with state-of-the-art object-detection methods for natural images and with ship-detection methods, and demonstrate the superiority of our method. [ABSTRACT FROM AUTHOR]
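The core geometric step described above — merging a paired pair of part masks and taking their minimum-area rotated bounding box as an anchor — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function names and the brute-force angle search (instead of an exact rotating-calipers routine or a library call such as OpenCV's `minAreaRect`) are assumptions made here for a self-contained example.

```python
import numpy as np

def min_rotated_bbox(points, angle_step_deg=1.0):
    """Approximate minimum-area rotated bounding box of 2-D points.

    Brute-forces orientations in [0, 90) degrees: rotate the points,
    take the axis-aligned box, and keep the smallest-area candidate.
    Returns (area, angle_deg, width, height, center_xy).
    """
    best = None
    for deg in np.arange(0.0, 90.0, angle_step_deg):
        t = np.deg2rad(deg)
        rot = np.array([[np.cos(t), -np.sin(t)],
                        [np.sin(t),  np.cos(t)]])
        p = points @ rot.T                      # points in the rotated frame
        w, h = p.max(axis=0) - p.min(axis=0)
        area = w * h
        if best is None or area < best[0]:
            # map the box center back to the original frame
            center = (p.min(axis=0) + p.max(axis=0)) / 2 @ rot
            best = (area, deg, w, h, center)
    return best

def rotated_anchor_from_parts(part_a_mask, part_b_mask):
    """Merge a paired part prediction (e.g. top-left + bottom-right masks)
    and return the rotated anchor enclosing both parts."""
    ys, xs = np.nonzero(part_a_mask | part_b_mask)
    pts = np.stack([xs, ys], axis=1).astype(float)  # (x, y) pixel coordinates
    return min_rotated_bbox(pts)
```

In the paper's pipeline the resulting rotated box would serve as an anchor for the subsequent classification and regression stage; its horizontal (axis-aligned) bounding box, obtained from the same point set with a plain min/max, supplies the extra context region mentioned in the abstract.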

Details

Language :
English
ISSN :
20724292
Volume :
11
Issue :
21
Database :
Complementary Index
Journal :
Remote Sensing
Publication Type :
Academic Journal
Accession number :
139548594
Full Text :
https://doi.org/10.3390/rs11212506