SecBERT: Privacy-preserving pre-training based neural network inference system.

Authors :
Huang, Hai
Wang, Yongjian
Source :
Neural Networks. Apr 2024, Vol. 172.
Publication Year :
2024

Abstract

Pre-trained models such as BERT have achieved great success in natural language processing tasks in recent years. In this paper, we investigate privacy-preserving pre-training based neural network inference in a two-server framework built on the additive secret sharing technique. Our protocol allows a resource-constrained client to request two powerful servers to cooperatively process natural language processing tasks without revealing any useful information about its data. We first design a series of secure sub-protocols for the non-linear functions used in the BERT model. These sub-protocols are expected to have broad applications and are of independent interest. Building on these sub-protocols, we propose SecBERT, a privacy-preserving pre-training based neural network inference protocol. SecBERT is the first cryptographically secure privacy-preserving pre-training based neural network inference protocol. We show the security, efficiency, and accuracy of the SecBERT protocol through comprehensive theoretical analysis and experiments. [ABSTRACT FROM AUTHOR]
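
To make the two-server setting concrete, below is a minimal sketch of additive secret sharing over the ring Z_{2^64}: the client splits its fixed-point-encoded input into two random-looking shares, each server computes a linear layer locally on its share, and only the client reconstructs the result. All names and parameters here (the 64-bit ring, the fixed-point scale, public integer weights) are illustrative assumptions, not the paper's actual protocol; in particular, the non-linear BERT operations (GELU, softmax, layer normalization) are not local on additive shares, and the interactive sub-protocols the paper designs for them are not sketched here.

```python
# Illustrative sketch of two-server additive secret sharing (not the
# authors' implementation). RING/SCALE and all function names are
# assumptions chosen for this example.
import numpy as np

SCALE = 2 ** 16                     # fixed-point scale for encoding reals
rng = np.random.default_rng()

def encode(x):
    """Fixed-point encode reals into Z_{2^64} (negatives wrap to the top half)."""
    return np.round(np.asarray(x, dtype=np.float64) * SCALE).astype(np.int64).astype(np.uint64)

def decode(x):
    """Map ring elements back to reals, reading the top half of the ring as negative."""
    return np.asarray(x, dtype=np.uint64).astype(np.int64).astype(np.float64) / SCALE

def share(x):
    """Split an encoded tensor into two additive shares: x = x0 + x1 (mod 2^64)."""
    x0 = rng.integers(0, 2 ** 64 - 1, size=np.shape(x), dtype=np.uint64, endpoint=True)
    x1 = np.asarray(x, dtype=np.uint64) - x0   # uint64 subtraction wraps mod 2^64
    return x0, x1

def reconstruct(x0, x1):
    """Recombine the two shares (only the client ever does this)."""
    return x0 + x1                             # uint64 addition wraps mod 2^64

# --- client side: secret-share the private input -------------------------
x = np.array([0.5, -1.25, 3.0])
x0, x1 = share(encode(x))          # server 0 receives x0, server 1 receives x1

# --- server side: linear layers are local on shares ----------------------
# With public integer weights W, each server computes W @ x_i on its own
# share; additivity gives (W @ x0) + (W @ x1) = W @ x (mod 2^64).
W = np.array([[1, 2, 0], [0, 1, 1]], dtype=np.uint64)
y0, y1 = W @ x0, W @ x1

# --- client side: reconstruct and decode the result ----------------------
# Non-linear functions (GELU, softmax, layer norm) would require the
# interactive sub-protocols the paper constructs, omitted here.
print(decode(reconstruct(y0, y1)))             # approx. [-2.0, 1.75]
```

Neither server sees more than a uniformly random share of the input, which is the intuition behind the "without revealing any useful information" claim; fixed-point multiplication of shared values (and hence real weight matrices) additionally requires a share-truncation step that this sketch sidesteps by using plain integer weights.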

Details

Language :
English
ISSN :
0893-6080
Volume :
172
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
175643439
Full Text :
https://doi.org/10.1016/j.neunet.2024.106135