
Scalable Multi-Party Privacy-Preserving Gradient Tree Boosting over Vertically Partitioned Dataset with Outsourced Computations

Authors :
Edemacu, Kennedy
Jang, Beakcheol
Kim, Jong Wook
Publication Year :
2022

Abstract

Due to privacy concerns, multi-party gradient tree boosting algorithms have become widely popular among machine learning researchers and practitioners. However, few existing works focus on vertically partitioned datasets, and those that do are either not scalable or tend to leak information. In this work, we therefore propose SSXGB, a scalable and secure multi-party gradient tree boosting framework for vertically partitioned datasets with partially outsourced computations. Specifically, we employ an additive homomorphic encryption (HE) scheme for security and design two sub-protocols based on the HE scheme to perform the non-linear operations associated with gradient tree boosting algorithms. Next, we propose a secure training algorithm and a secure prediction algorithm under the SSXGB framework. We then provide a theoretical security and communication analysis of the proposed framework, and finally evaluate its performance with experiments on two real-world datasets.
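The abstract's central building block is an additive HE scheme, under which ciphertexts can be combined so that decryption yields the sum of the underlying plaintexts. The record does not name the concrete scheme SSXGB uses, so as a hedged illustration the sketch below uses Paillier, a well-known additively homomorphic cryptosystem, with deliberately toy-sized primes (real deployments use moduli of 2048 bits or more):

```python
import math
import random

def lcm(a, b):
    return a * b // math.gcd(a, b)

def keygen(p=293, q=433):
    # Toy primes for illustration only -- NOT secure at this size.
    n = p * q
    n2 = n * n
    g = n + 1                       # standard simplified Paillier generator
    lam = lcm(p - 1, q - 1)
    # mu = (L(g^lam mod n^2))^{-1} mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu, n)

def encrypt(pk, m):
    n, g = pk
    n2 = n * n
    r = random.randrange(1, n)      # fresh randomness per ciphertext
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    lam, mu, n = sk
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def he_add(pk, c1, c2):
    # Additive homomorphism: E(m1) * E(m2) mod n^2 decrypts to m1 + m2 mod n.
    n, _ = pk
    return (c1 * c2) % (n * n)
```

For example, a party holding only ciphertexts can aggregate encrypted gradient contributions: `decrypt(sk, he_add(pk, encrypt(pk, 7), encrypt(pk, 35)))` recovers 42 without the aggregator ever seeing the individual values. Note that purely additive schemes cannot evaluate non-linear functions directly, which is why the paper introduces dedicated sub-protocols for the non-linear operations in gradient tree boosting.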

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2202.03245
Document Type :
Working Paper