A Systematic View of Model Leakage Risks in Deep Neural Network Systems.
- Source :
- IEEE Transactions on Computers; Dec 2022, Vol. 71, Issue 12, p3254-3267, 14p
- Publication Year :
- 2022
Abstract
- As deep neural networks (DNNs) continue to find applications in ever more domains, the exact nature of the neural network architecture becomes an increasingly sensitive subject, due to either intellectual property protection or risks of adversarial attacks. While prior work has explored aspects of the risk associated with model leakage, exactly which parts of the model are most sensitive, and how one infers the full architecture of a DNN when nothing is known about its structure a priori, are problems that have been left unexplored. In this paper, we address this gap, first by presenting a schema for reasoning holistically about model leakage, and then by proposing and quantitatively evaluating DeepSniffer, a novel learning-based model-extraction framework that requires no prior knowledge of the victim model. DeepSniffer is robust to the architectural and system noise introduced by the complex memory hierarchy and diverse run-time system optimizations. Taking GPU platforms as a showcase, DeepSniffer performs model extraction by learning both the architecture-level execution features of kernels and the inter-layer temporal association information introduced by common DNN design practice. We demonstrate experimentally that DeepSniffer works on an off-the-shelf Nvidia GPU platform running a variety of DNN models, and that the extracted models significantly improve attempts at crafting adversarial inputs. The DeepSniffer project has been released at https://github.com/xinghu7788/DeepSniffer. [ABSTRACT FROM AUTHOR]
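- The abstract describes the mechanism only at a high level: a sequence model maps per-kernel GPU execution features to a sequence of DNN layer types, exploiting inter-layer temporal associations. The PyTorch snippet below is a minimal, hypothetical sketch of that idea, not the authors' released implementation (see the GitHub link above for that); the three side-channel features and the layer-type label set are invented for illustration.

```python
# Minimal sketch (assumptions labeled): a recurrent model maps a sequence of
# per-kernel execution features observed on the GPU -- here assumed to be
# latency, memory-read volume, and memory-write volume -- to per-kernel
# DNN layer-type predictions. This is an illustration of the general
# learning-based extraction idea, not DeepSniffer's actual pipeline.
import torch
import torch.nn as nn

LAYER_TYPES = ["conv", "relu", "pool", "fc", "add"]  # hypothetical label set

class KernelSequenceClassifier(nn.Module):
    def __init__(self, n_features=3, hidden=64, n_classes=len(LAYER_TYPES)):
        super().__init__()
        # A bidirectional LSTM captures inter-layer temporal associations
        # in both directions along the observed kernel sequence.
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, kernel_feats):
        # kernel_feats: (batch, seq_len, n_features)
        out, _ = self.lstm(kernel_feats)
        return self.head(out)  # per-kernel layer-type logits

# Toy usage: one trace of 10 kernels, 3 side-channel features per kernel.
model = KernelSequenceClassifier()
trace = torch.randn(1, 10, 3)          # stand-in for a profiled kernel trace
logits = model(trace)                  # shape: (1, 10, 5)
pred = logits.argmax(dim=-1)           # predicted layer type per kernel
print([LAYER_TYPES[int(i)] for i in pred[0]])
```

- Trained on traces from known architectures, such a classifier recovers a layer-type sequence from side-channel observations alone, which is the starting point for reconstructing the full victim architecture.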
- Subjects :
- ARTIFICIAL neural networks
- INTELLECTUAL property
- LEAKAGE
- MATHEMATICAL optimization
Details
- Language :
- English
- ISSN :
- 0018-9340
- Volume :
- 71
- Issue :
- 12
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Computers
- Publication Type :
- Academic Journal
- Accession number :
- 160620885
- Full Text :
- https://doi.org/10.1109/TC.2022.3148235