Mask Attention-SRGAN for Mobile Sensing Networks

Authors :
Ching-Chun Chang
Yung Hui Li
Chi En Huang
Source :
Sensors (Basel, Switzerland), Vol. 21, Iss. 17, Art. 5973 (2021)
Publication Year :
2021
Publisher :
Multidisciplinary Digital Publishing Institute, 2021.

Abstract

Biometrics has been shown to be an effective solution to the identity recognition problem, and iris recognition and face recognition are among the most accurate biometric modalities. Higher resolution inside the crucial region reveals details of the physiological characteristics, providing discriminative information that enables extremely high recognition rates. Due to the growing need for IoT devices in various applications, image sensors are increasingly integrated into IoT devices to reduce cost, and low-cost image sensors may be preferable to high-cost ones. However, low-cost image sensors may not satisfy the minimum resolution requirement, which inevitably degrades recognition accuracy. Therefore, how to maintain high accuracy for biometric systems in mobile sensing networks without using expensive image sensors becomes an interesting and important issue. In this paper, we propose MA-SRGAN, a single-image super-resolution (SISR) algorithm based on a mask-attention mechanism in a Generative Adversarial Network (GAN). We modify the latest state-of-the-art GAN-based SR model (nESRGAN+) by adding an extra part of the discriminator with an additional loss term that forces the GAN to pay more attention to the region of interest (ROI). Experiments were performed on the CASIA-Thousand-v4 dataset and the Celeb Attribute dataset. The experimental results show that the proposed method successfully learns the details of features inside the crucial region, improving recognition accuracy after image super-resolution (SR).
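The abstract does not specify the exact form of the additional ROI loss term, so the following is only a minimal illustrative sketch of the general idea: a pixel-wise reconstruction loss that weights errors inside a binary ROI mask more heavily, nudging the generator to reproduce detail in the crucial region (e.g., the iris or facial landmarks). The function name, the weighting scheme, and the `roi_weight` parameter are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def masked_sr_loss(sr, hr, mask, roi_weight=2.0):
    """Pixel-wise L1 loss with extra weight inside the ROI mask.

    sr, hr : super-resolved and ground-truth images, shape (H, W).
    mask   : binary ROI mask (1.0 inside the crucial region, 0.0 elsewhere).
    roi_weight : how many times more an ROI pixel counts than a background pixel.
    """
    err = np.abs(sr - hr)
    # Background pixels keep weight 1; ROI pixels get weight roi_weight.
    weights = 1.0 + (roi_weight - 1.0) * mask
    return float((weights * err).mean())

# Toy example: uniform error of 0.1 per pixel on a 4x4 image,
# with a 2x2 ROI in the center.
hr = np.zeros((4, 4))
sr = np.full((4, 4), 0.1)
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0

plain = masked_sr_loss(sr, hr, np.zeros((4, 4)))  # 0.1, ordinary mean L1
roi = masked_sr_loss(sr, hr, mask)                # 0.125, ROI errors count double
```

In the actual MA-SRGAN design this idea is realized adversarially (an extra discriminator part rather than a fixed pixel weighting), but the effect is the same: errors inside the masked region contribute more to the training signal.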

Details

Language :
English
ISSN :
14248220
Database :
OpenAIRE
Journal :
Sensors
Accession number :
edsair.doi.dedup.....16f07a010d631a8bf6933ebbb69d0e7c
Full Text :
https://doi.org/10.3390/s21175973