
Adversarial Attacks on Neural-Network-Based Soft Sensors: Directly Attack Output.

Authors :
Kong, Xiangyin
Ge, Zhiqiang
Source :
IEEE Transactions on Industrial Informatics; Apr 2022, Vol. 18, Issue 4, p2443-2451, 9p
Publication Year :
2022

Abstract

Neural-network-based soft sensors are widely employed in industrial processes and are of great significance to smart manufacturing. Given the strict requirements of industrial production, it is vital to ensure the safety and robustness of these models in actual deployment. However, recent research has shown that neural networks are highly vulnerable to adversarial attacks: by imposing a tiny perturbation on the original sample, a fabricated adversarial sample can fool these models into making wrong decisions. Such a phenomenon may cause serious trouble in the practical application of soft sensors. This article focuses on adversarial attacks on industrial soft sensors. For the first time, we verify and analyze the effectiveness and deficiencies of existing attack methods in the industrial soft sensor scenario. Building on solutions to these defects, this article proposes a novel perspective for attacking soft sensors. We analyze the optimization mechanism behind this new idea and then design two algorithms to perform attacks. The proposed methods conform better to the actual situation. Moreover, compared with existing approaches, the proposed methods have the potential to cause more severe damage, since their attacks are not only better concealed but also more likely to deceive technicians into executing wrong operations. The research and analysis of the proposed methods lay a solid foundation for more thorough defenses against various attacks, which is necessary for making deployed soft sensors more robust and secure. [ABSTRACT FROM AUTHOR]
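The abstract's core idea, imposing a tiny perturbation on the input so the model's output goes wrong, can be illustrated with a generic FGSM-style gradient-sign attack (Goodfellow et al.). This is a minimal sketch, not the paper's proposed output-attack algorithms, which this record does not describe; the toy linear soft sensor, the target value, and the perturbation budget `eps` below are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch: a generic FGSM-style perturbation applied to a toy
# linear "soft sensor". The model, eps, and the targeted objective are
# illustrative assumptions, not the paper's actual method.

rng = np.random.default_rng(0)
w = rng.normal(size=5)          # toy soft-sensor weights
b = 0.1
x = rng.normal(size=5)          # one clean process measurement vector

def predict(x_in):
    """Toy soft sensor: a single linear unit y = w.x + b."""
    return float(w @ x_in + b)

# Attacker's goal: drag the predicted quality variable toward y_target
# while keeping the input change imperceptibly small (inf-norm <= eps).
y_target = predict(x) + 1.0
eps = 0.05

# Gradient of the attacker's loss (pred - y_target)^2 w.r.t. the input:
# d/dx (w.x + b - y_target)^2 = 2 * (pred - y_target) * w
grad = 2.0 * (predict(x) - y_target) * w

# FGSM step: shift each input feature by at most eps, descending the
# loss so the prediction moves toward the attacker's chosen target.
x_adv = x - eps * np.sign(grad)

print("clean prediction:      ", predict(x))
print("adversarial prediction:", predict(x_adv))
print("max input change:      ", np.max(np.abs(x_adv - x)))
```

Because each feature moves by at most `eps`, the perturbed sample can look like ordinary sensor noise to an operator, which is exactly why the abstract stresses the concealment of such attacks.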

Details

Language :
English
ISSN :
1551-3203
Volume :
18
Issue :
4
Database :
Complementary Index
Journal :
IEEE Transactions on Industrial Informatics
Publication Type :
Academic Journal
Accession number :
154800339
Full Text :
https://doi.org/10.1109/TII.2021.3093386