
CECILIA: Comprehensive Secure Machine Learning Framework

Authors :
Ünal, Ali Burak
Pfeifer, Nico
Akgün, Mete
Source :
Patterns, 5(9), 2024
Publication Year :
2022

Abstract

Since machine learning (ML) algorithms have proven successful in many different applications, there is also strong interest in privacy-preserving (PP) ML methods for building models on sensitive data. Moreover, the growing number of data sources and the high computational power these algorithms require force individuals to outsource the training and/or inference of an ML model to clouds providing such services. To address this, we propose CECILIA, a secure 3-party computation framework offering PP building blocks that enable complex operations to be computed privately. In addition to adapted, common operations such as addition and multiplication, it offers multiplexer, most significant bit, and modulus conversion. The first two are novel in terms of methodology, and the last is novel in terms of both functionality and methodology. CECILIA also provides two complex novel methods: the exact exponential of a public base raised to the power of a secret value, and the inverse square root of a secret Gram matrix. We use CECILIA to realize private inference on pre-trained recurrent kernel networks (RKNs), which require more complex operations than most other DNNs, for the structural classification of proteins, making this the first study to accomplish PP inference on RKNs. In addition to the successful private computation of the basic building blocks, the results demonstrate that we perform exact and fully private exponential computation, which has so far been done only by approximation in the literature. They also show that we compute the exact inverse square root of a secret Gram matrix up to a certain privacy level, which has not been addressed in the literature at all. We further analyze the scalability of CECILIA in various settings on a synthetic dataset. The framework shows great promise for making other ML algorithms, as well as further computations, privately computable with its building blocks.

Comment: Preprint version of "A privacy-preserving approach for cloud-based protein fold recognition" paper published in Patterns, ~8 pages of the main paper, ~5 pages of Supplement
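As a rough illustration of what PP building blocks over secret-shared data can look like, the sketch below shows additive secret sharing over a 64-bit ring, where addition of two secret values is a purely local operation on shares. This is a simplified, assumption-laden example (a generic sharing among computing parties, no helper party, no multiplication, comparison, or conversion protocols); it is not the paper's actual construction, and all function names here are hypothetical.

```python
# Illustrative sketch only: additive secret sharing over a 64-bit ring,
# a common substrate for secure multi-party computation frameworks.
# The CECILIA protocols themselves (multiplexer, MSB, modulus conversion,
# exact exponential, inverse square root) are not reproduced here.
import secrets

RING = 2 ** 64  # all arithmetic is done modulo 2^64


def share(x: int, n_parties: int = 2) -> list[int]:
    """Split x into additive shares that sum to x modulo RING."""
    shares = [secrets.randbelow(RING) for _ in range(n_parties - 1)]
    shares.append((x - sum(shares)) % RING)
    return shares


def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares modulo RING."""
    return sum(shares) % RING


def add_shares(a: list[int], b: list[int]) -> list[int]:
    """Each party adds its own shares locally; no communication is needed."""
    return [(ai + bi) % RING for ai, bi in zip(a, b)]


if __name__ == "__main__":
    x_shares = share(42)
    y_shares = share(100)
    z_shares = add_shares(x_shares, y_shares)  # shares of 42 + 100
    assert reconstruct(z_shares) == 142
```

Operations such as multiplication or the multiplexer require interaction between the parties (and, in frameworks like the one described, assistance from a third party), which is where the framework's dedicated protocols come in.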

Details

Database :
arXiv
Journal :
Patterns, 5(9), 2024
Publication Type :
Report
Accession number :
edsarx.2202.03023
Document Type :
Working Paper
Full Text :
https://doi.org/10.1016/j.patter.2024.101023