Data Poisoning Attack against Neural Network-Based On-Device Learning Anomaly Detector by Physical Attacks on Sensors.

Authors :
Ino, Takahito
Yoshida, Kota
Matsutani, Hiroki
Fujino, Takeshi
Source :
Sensors (14248220); Oct 2024, Vol. 24, Issue 19, p6416, 18p
Publication Year :
2024

Abstract

In this paper, we introduce a security approach for on-device learning Edge AIs designed to detect abnormal conditions in factory machines. Since Edge AI devices are physically accessible to attackers, they are exposed to security risks from physical attacks. In particular, there is a concern that an attacker may tamper with the training data of an on-device learning Edge AI to degrade its task accuracy, yet few risk assessments of this threat have been reported. Understanding these security risks is an important prerequisite for designing countermeasures. In this paper, we demonstrate a data poisoning attack against an on-device learning Edge AI. Our target is an on-device learning anomaly detection system that uses MEMS accelerometers to measure the vibration of factory machines and detect anomalies. The anomaly detector also adopts a concept drift detection algorithm and multiple models to accommodate multiple normal patterns. For the attack, we tampered with the measurements by exposing the MEMS accelerometer to acoustic waves at a specific frequency. When the anomaly detector was trained on acceleration data falsified in this way, it could no longer detect the abnormal state. [ABSTRACT FROM AUTHOR]
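To make the attack flow in the abstract concrete, the following is a minimal Python sketch of the poisoning idea: an anomaly detector trained online on accelerometer windows, first on clean vibration data and then on data corrupted by simulated acoustic injection, so that anomaly-like signal content is gradually learned as "normal". Everything here is an illustrative assumption, not the authors' code: the signal model, sample rate, window size, injection frequency, and the detector itself (the abstract does not specify the detector internals, so a generic tied-weight linear autoencoder updated by SGD stands in for it).

```python
# Toy illustration of the data poisoning attack described in the abstract.
# All parameters and models are hypothetical stand-ins, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
FS = 1000   # sample rate in Hz (assumed)
WIN = 32    # window length fed to the detector (assumed)

def machine_vibration(n, anomalous=False):
    """Simulated accelerometer trace: 50 Hz fundamental plus noise;
    an 'anomaly' adds a strong 120 Hz component (values invented)."""
    t = np.arange(n) / FS
    x = np.sin(2 * np.pi * 50 * t) + 0.05 * rng.standard_normal(n)
    if anomalous:
        x += 0.8 * np.sin(2 * np.pi * 120 * t)
    return x

def acoustic_injection(x):
    """Model the acoustic attack as an injected sinusoid that lands in
    the anomaly's frequency band after the MEMS front end."""
    t = np.arange(len(x)) / FS
    return x + 0.8 * np.sin(2 * np.pi * 120 * t)

def windows(x):
    return x[: len(x) // WIN * WIN].reshape(-1, WIN)

class TinyDetector:
    """Minimal online anomaly detector: a tied-weight linear autoencoder
    updated by SGD per window; anomaly score = reconstruction error."""
    def __init__(self, dim, hidden=8, lr=1e-3):
        self.W = rng.standard_normal((dim, hidden)) * 0.1
        self.lr = lr

    def score(self, w):
        recon = self.W @ (self.W.T @ w)
        return float(np.mean((w - recon) ** 2))

    def update(self, w):
        h = self.W.T @ w
        err = self.W @ h - w
        # Gradient of 0.5 * ||W W^T w - w||^2 w.r.t. W (tied weights).
        grad = np.outer(err, h) + np.outer(w, self.W.T @ err)
        self.W -= self.lr * grad

det = TinyDetector(WIN)

# Phase 1: benign on-device training on clean normal vibration.
for w in windows(machine_vibration(20_000)):
    det.update(w)
score_before = np.mean(
    [det.score(w) for w in windows(machine_vibration(4000, anomalous=True))])

# Phase 2: poisoning -- training continues while the sensor is under
# acoustic injection, so anomaly-like data is learned as 'normal'.
for w in windows(acoustic_injection(machine_vibration(20_000))):
    det.update(w)
score_after = np.mean(
    [det.score(w) for w in windows(machine_vibration(4000, anomalous=True))])

print(f"anomaly score before poisoning: {score_before:.4f}")
print(f"anomaly score after  poisoning: {score_after:.4f}  (lower = missed)")
```

Under these assumptions, the detector's reconstruction subspace absorbs the injected 120 Hz component during phase 2, so the anomaly score on genuinely abnormal vibration drops and the abnormal state goes undetected, mirroring the outcome reported in the abstract.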

Details

Language :
English
ISSN :
1424-8220
Volume :
24
Issue :
19
Database :
Complementary Index
Journal :
Sensors (14248220)
Publication Type :
Academic Journal
Accession number :
180276123
Full Text :
https://doi.org/10.3390/s24196416