
Privacy-Preserving Deep Inference for Rich User Data on The Cloud

Authors:
Osia, Seyed Ali
Shamsabadi, Ali Shahin
Taheri, Ali
Katevas, Kleomenis
Rabiee, Hamid R.
Lane, Nicholas D.
Haddadi, Hamed
Publication Year:
2017

Abstract

Deep neural networks are increasingly being used in a variety of machine learning applications applied to rich user data on the cloud. However, this approach introduces a number of privacy and efficiency challenges, as the cloud operator can perform secondary inferences on the available data. Recently, advances in edge processing have paved the way for more efficient and private data processing at the source for simple tasks and lighter models, though larger and more complicated models remain a challenge. In this paper, we present a hybrid approach for breaking down large, complex deep models for cooperative, privacy-preserving analytics. We do this by breaking down popular deep architectures and fine-tuning them in a particular way. We then evaluate the privacy benefits of this approach based on the information exposed to the cloud service. We also assess the local inference cost of different layers on a modern handset for mobile applications. Our evaluations show that, by using certain kinds of fine-tuning and embedding techniques and at a small processing cost, we can greatly reduce the level of information available to unintended tasks applied to the data features on the cloud, and hence achieve the desired tradeoff between privacy and performance.

Comment: arXiv admin note: substantial text overlap with arXiv:1703.02952
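The abstract describes splitting a deep model so that the early layers run on the handset and only their intermediate feature, rather than the raw user data, is sent to the cloud for the remaining layers. The following is a minimal sketch of such a partitioning, not the authors' code: the choice of VGG-16 and the split index SPLIT_AT are illustrative assumptions.

    # Minimal sketch: partition a pretrained CNN into an on-device feature
    # extractor and a cloud-side classifier. VGG-16 and SPLIT_AT are
    # illustrative assumptions, not the paper's exact configuration.
    import torch
    import torch.nn as nn
    from torchvision import models

    SPLIT_AT = 10  # hypothetical split point inside the convolutional stack

    full_model = models.vgg16()  # pretrained weights could be loaded here

    # Early layers stay on the handset and produce the intermediate feature.
    device_part = nn.Sequential(*list(full_model.features.children())[:SPLIT_AT])

    # Remaining layers run on the cloud and perform the intended inference.
    cloud_part = nn.Sequential(
        *list(full_model.features.children())[SPLIT_AT:],
        full_model.avgpool,
        nn.Flatten(),
        full_model.classifier,
    )

    # On-device step: compute and transmit only the intermediate representation.
    image = torch.randn(1, 3, 224, 224)   # stand-in for a user image
    with torch.no_grad():
        feature = device_part(image)       # this is all the cloud receives
        prediction = cloud_part(feature)   # cloud completes the primary task

In the paper's setting, the device-side layers would additionally be fine-tuned or embedded so that the transmitted feature supports the intended task while leaking little information useful for secondary inferences.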

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1710.01727
Document Type:
Working Paper