
Fault injection attacks on SoftMax function in deep neural networks

Authors:
Shivam Bhasin
Dirmanto Jap
Yoo-Seung Won
Source:
CF
Publication Year:
2021
Publisher:
ACM, 2021.

Abstract

Softmax is a commonly used activation function in neural networks that normalizes the output into a probability distribution over the predicted classes. Because it is often deployed in the output layer, it can potentially be targeted by fault injection attacks to create misclassification. In this extended abstract, we perform a preliminary fault analysis of Softmax against single-bit faults.
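
The abstract does not include code; as a minimal sketch, assuming 32-bit IEEE-754 logits and NumPy, the snippet below illustrates how flipping a single bit of one input logit to Softmax (softmax(z)_i = exp(z_i) / sum_j exp(z_j)) can change the predicted class. The helper names softmax and flip_bit are illustrative only and are not taken from the paper.

# Illustrative sketch (not from the paper): single-bit fault on a Softmax input logit.
import struct
import numpy as np

def softmax(z):
    # Numerically stable softmax: subtract the max logit before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def flip_bit(x, bit):
    # Flip one bit of a float32 value (bit 31 = sign, 30-23 = exponent, 22-0 = mantissa).
    (i,) = struct.unpack('<I', struct.pack('<f', float(x)))
    (y,) = struct.unpack('<f', struct.pack('<I', i ^ (1 << bit)))
    return np.float32(y)

logits = np.array([2.0, 1.0, 0.5], dtype=np.float32)
print("fault-free prediction:", np.argmax(softmax(logits)))  # class 0

faulty = logits.copy()
# Flip the most significant exponent bit of the weakest logit (0.5 -> 2**127),
# which drives its Softmax probability to ~1 and changes the predicted class.
faulty[2] = flip_bit(faulty[2], 30)
print("faulted prediction:", np.argmax(softmax(faulty)))  # class 2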

Details

Database:
OpenAIRE
Journal:
Proceedings of the 18th ACM International Conference on Computing Frontiers
Accession number:
edsair.doi...........77096973afce1fe030338badac13e82a