CURE: Privacy-Preserving Split Learning Done Right
- Publication Year: 2024
Abstract
- Training deep neural networks often requires large-scale datasets, which must be stored and processed on cloud servers due to clients' computational constraints. In domains such as healthcare, these procedures must also comply with strict privacy regulations. Split Learning (SL), a framework that divides model layers between client(s) and server(s), is widely adopted for distributed model training. While SL reduces privacy risks by limiting the server's access to the full parameter set, prior research has shown that the intermediate outputs exchanged between server and client can still compromise the client's data privacy. Homomorphic encryption (HE)-based solutions exist for this scenario but often impose prohibitive computational burdens. To address these challenges, we propose CURE, a novel HE-based system that encrypts only the server side of the model and, optionally, the data. CURE enables secure SL while substantially improving communication and parallelization through advanced packing techniques. We propose two packing schemes that consume a single HE level for one-layer networks and generalize our solutions to n-layer neural networks. We demonstrate that CURE achieves accuracy similar to plaintext SL while being 16x faster in runtime than state-of-the-art privacy-preserving alternatives.
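The split described in the abstract can be illustrated with a minimal plaintext sketch: the client holds the first layer, the server holds the rest, and only the intermediate activation crosses the boundary. This is a hypothetical illustration of the generic SL setup, not the CURE implementation; in CURE the server-side weights would additionally be homomorphically encrypted. All names (`client_forward`, `server_forward`, the layer shapes) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Client-side layer: stays on the client, along with the raw data.
W_client = rng.standard_normal((4, 8)) * 0.1

# Server-side layer: in CURE this half of the model would be encrypted under HE.
W_server = rng.standard_normal((8, 3)) * 0.1

def client_forward(x):
    # The client computes its layer locally and sends only the
    # intermediate activation to the server, never the raw input x.
    return np.maximum(x @ W_client, 0.0)  # ReLU

def server_forward(h):
    # The server sees only the intermediate output h -- this exchanged
    # value is what prior work showed can still leak client data.
    return h @ W_server

x = rng.standard_normal((2, 4))   # private client data (batch of 2)
h = client_forward(x)             # intermediate output sent over the wire
logits = server_forward(h)
print(logits.shape)  # (2, 3)
```

The sketch shows why limiting the server to a parameter subset is not enough on its own: `h` is a deterministic function of `x`, which motivates encrypting the server-side computation as CURE does.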
- Subjects: Computer Science - Cryptography and Security
Details
- Database: arXiv
- Publication Type: Report
- Accession number: edsarx.2407.08977
- Document Type: Working Paper