
An efficient modeling attack for breaking the security of XOR-Arbiter PUFs by using the fully connected and long-short term memory.

Authors :
Fard, Sina Soleimani
Kaveh, Masoud
Mosavi, Mohammad Reza
Ko, Seok-Bum
Source :
Microprocessors & Microsystems. Oct 2022, Vol. 94.
Publication Year :
2022

Abstract

Highlights:
• We utilize LSTM to overcome the resistance of complex XOR-APUFs.
• We combine LSTM with a Fully Connected network to reach better modeling accuracy.
• We use PyTorch and Python 3.7 in Google Colab to implement the attack model.
• We improve the accuracy, runtime, and number of needed CRPs in the proposed attack.

Physical Unclonable Functions (PUFs) have been proposed as potential hardware security primitives for various cryptographic applications in recent years. The idea of generating a unique fingerprint for each silicon chip by leveraging variations in hardware manufacturing has opened up convenient and affordable possibilities. On the other hand, some PUFs have been shown to be vulnerable to modeling attacks. In this study, we investigate new Deep Learning (DL)-based techniques as potential modeling attacks against more complicated PUF structures. To that end, we leverage Long Short-Term Memory (LSTM) networks to attack XOR-Arbiter PUFs (XOR-APUFs) for the first time. By combining the proposed LSTM with a Fully Connected (FC) neural network, our FC-LSTM obtains 99% modeling accuracy on 7-XOR-APUF and 8-XOR-APUF. Furthermore, our experiments show that, despite the availability of precise mathematical models of PUFs, our proposed modeling attacks effectively overcome the constraints imposed by prior studies, in a shorter period of time and with fewer training Challenge-Response Pairs (CRPs). [ABSTRACT FROM AUTHOR]
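For context on the Challenge-Response Pairs mentioned in the abstract: XOR-APUFs are commonly described by the standard linear additive delay model, in which each arbiter chain computes the sign of a weighted sum of parity features of the challenge, and the XOR-APUF XORs k such chains. The sketch below simulates CRPs under that standard model; the stage count (64), XOR degree (7), weight distribution, and CRP count are illustrative choices, not parameters taken from this paper.

```python
import random

def phi(challenge):
    """Parity feature vector of the additive delay model:
    feat[i] = prod_{j >= i} (1 - 2*c_j), with a trailing constant 1."""
    n = len(challenge)
    feat = [1] * (n + 1)
    for i in range(n - 1, -1, -1):
        feat[i] = feat[i + 1] * (1 - 2 * challenge[i])
    return feat

def make_xor_apuf(n_stages, k, rng):
    """k independent Gaussian weight vectors, one per arbiter chain
    (standard modeling assumption for manufacturing variation)."""
    return [[rng.gauss(0.0, 1.0) for _ in range(n_stages + 1)]
            for _ in range(k)]

def response(puf, challenge):
    """XOR (i.e. product in {-1,+1}) of the k arbiter responses,
    mapped back to a {0,1} response bit."""
    feat = phi(challenge)
    out = 1
    for w in puf:
        delta = sum(wi * fi for wi, fi in zip(w, feat))
        out *= 1 if delta > 0 else -1
    return (1 - out) // 2

# Simulate a 7-XOR-APUF with 64-stage chains and collect 1000 CRPs.
rng = random.Random(0)
puf = make_xor_apuf(64, 7, rng)
crps = [(c, response(puf, c))
        for c in ([rng.randint(0, 1) for _ in range(64)]
                  for _ in range(1000))]
```

Datasets like `crps` are what a modeling attack trains on: the attacker fits a model (here, the paper's FC-LSTM) that predicts the response bit from the challenge, and accuracy is measured on held-out CRPs.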

Details

Language :
English
ISSN :
0141-9331
Volume :
94
Database :
Academic Search Index
Journal :
Microprocessors & Microsystems
Publication Type :
Academic Journal
Accession number :
159692072
Full Text :
https://doi.org/10.1016/j.micpro.2022.104667