
Can machine learning model with static features be fooled: an adversarial machine learning approach.

Authors :
Taheri, Rahim
Javidan, Reza
Shojafar, Mohammad
Vinod, P.
Conti, Mauro
Source :
Cluster Computing; Dec2020, Vol. 23 Issue 4, p3233-3253, 21p
Publication Year :
2020

Abstract

The widespread adoption of smartphones dramatically increases the risk of attacks and the spread of mobile malware, especially on the Android platform. Machine learning-based solutions have already been used as a tool to supersede signature-based anti-malware systems. However, malware authors leverage features from malicious and legitimate samples to estimate the statistical difference in order to create adversarial examples. Hence, to evaluate the vulnerability of machine learning algorithms in malware detection, we propose five different attack scenarios to perturb malicious applications (apps). By doing this, the classification algorithm inappropriately fits the discriminant function on the set of data points, eventually yielding a higher misclassification rate. Further, to distinguish the adversarial examples from benign samples, we propose two defense mechanisms to counter attacks. To validate our attacks and solutions, we test our model on three different benchmark datasets. We also test our methods using various classifier algorithms and compare them with the state-of-the-art data poisoning method using the Jacobian matrix. Promising results show that generated adversarial samples can evade detection with a very high probability. Additionally, evasive variants generated by our attack models, when used to harden the developed anti-malware system, improve the detection rate by up to 50% when using the generative adversarial network (GAN) method. [ABSTRACT FROM AUTHOR]
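The abstract describes perturbing the static (e.g. binary permission/API-call) features of malicious apps so that a classifier misclassifies them. The paper's actual attack scenarios are not reproduced here; the following is a minimal hypothetical sketch of the general idea against a linear model, where a few feature flips that most reduce the malware score can push a detected sample across the decision boundary. All names, the synthetic dataset, and the greedy flipping heuristic are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch (NOT the paper's code): evading a static-feature
# malware classifier by flipping binary features of a malicious sample.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic dataset: 200 apps x 20 binary static features
# (stand-ins for permissions / API calls). Label 1 = malware, 0 = benign.
X = rng.integers(0, 2, size=(200, 20))
y = (X[:, :5].sum(axis=1) > 2).astype(int)  # malware tied to first 5 features

clf = LogisticRegression().fit(X, y)

def perturb(x, clf, max_flips=5):
    """Greedily flip the binary features whose flip most lowers the
    malware score of a linear model; stop once the sample evades."""
    x = x.copy()
    w = clf.coef_[0]
    # For a linear model, flipping feature i changes the score by
    # -w[i] * (2*x[i] - 1), so w[i] * (2*x[i] - 1) is the "gain".
    gains = w * (2 * x - 1)
    flips = 0
    for i in np.argsort(-gains):
        if gains[i] <= 0 or flips >= max_flips:
            break  # no score-reducing flip left, or budget exhausted
        x[i] = 1 - x[i]
        flips += 1
        if clf.predict(x.reshape(1, -1))[0] == 0:
            break  # sample now classified as benign
    return x

# Take one malicious sample and craft its adversarial variant.
mal = X[y == 1][0]
adv = perturb(mal, clf)
```

Under this sketch, each flip is guaranteed not to raise the malware score, so the adversarial variant's malware probability is at most that of the original sample; the abstract's defenses would then aim to detect exactly such low-score variants.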

Details

Language :
English
ISSN :
13867857
Volume :
23
Issue :
4
Database :
Complementary Index
Journal :
Cluster Computing
Publication Type :
Academic Journal
Accession number :
146658224
Full Text :
https://doi.org/10.1007/s10586-020-03083-5