
Adversarial Attacks and Defenses

Authors:
Han Xu
Jiliang Tang
Wei Jin
Yaxin Li
Source:
KDD
Publication Year:
2020
Publisher:
ACM, 2020.

Abstract

Deep neural networks (DNNs) have achieved unprecedented success in numerous machine learning tasks across various domains. However, the existence of adversarial examples raises serious concerns about applying DNN models to safety-critical tasks such as autonomous driving and malware detection. Adversarial examples are intentionally crafted instances, appearing in either the training or the test phase, that can fool DNN models into making severe mistakes. Researchers have therefore devoted considerable effort to devising more robust models that resist adversarial examples, but these defenses are usually broken by newer, stronger attacks. This arms race between adversarial attacks and defenses has drawn increasing attention in recent years. In this tutorial, we provide a comprehensive overview of the frontiers and advances in adversarial attacks and their countermeasures. In particular, we give a detailed introduction to different types of attacks under different scenarios, including evasion and poisoning attacks, and white-box and black-box attacks. We also discuss how defense strategies have developed to compete against these attacks, and how new attacks have emerged to break these defenses. Moreover, we cover adversarial attacks and defenses in other data domains, especially graph-structured data. We then introduce DeepRobust, a PyTorch adversarial learning library that aims to provide a comprehensive and easy-to-use platform to foster research in this field. Finally, we conclude the tutorial with a discussion of open issues and challenges in adversarial attacks and defenses. Through this tutorial, our audience can grasp the main ideas and key approaches of the game between adversarial attacks and defenses.
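To make the notion of an evasion attack concrete, the following is a minimal sketch of the Fast Gradient Sign Method (FGSM), a classic white-box evasion attack of the kind the tutorial surveys. The model, loss function, and epsilon budget are illustrative assumptions, not part of the tutorial material or the DeepRobust API.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, epsilon=0.03):
    """Craft an adversarial example with the Fast Gradient Sign Method.

    A white-box evasion attack: perturb the input in the direction of the
    sign of the loss gradient, bounded by an L-infinity budget `epsilon`.
    (Illustrative sketch; `epsilon` and the loss choice are assumptions.)
    """
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)  # loss the attacker wants to increase
    loss.backward()
    # Take one signed-gradient step, then clip back to the valid pixel range.
    x_adv = x_adv + epsilon * x_adv.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()
```

Libraries such as DeepRobust package attacks of this kind alongside corresponding defenses, so that new attacks and countermeasures can be benchmarked on a common platform.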

Details

Database:
OpenAIRE
Journal:
Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining
Accession number:
edsair.doi...........ff12bd5df6733c3c71a48eabf75ce9e7
Full Text:
https://doi.org/10.1145/3394486.3406467