Secure and Efficient Federated Gradient Boosting Decision Trees.
- Source :
- Applied Sciences (2076-3417); Apr 2023, Vol. 13, Issue 7, p4283, 17p
- Publication Year :
- 2023
Abstract
- In recent years, federated gradient boosting decision trees (GBDTs) have gradually replaced traditional GBDTs for structured data mining tasks and have become a focus of academic research. To address the information leakage, insufficient model accuracy, and high communication cost of existing horizontal federated GBDT schemes, this paper proposes a gradient boosting decision tree algorithm based on horizontal federated learning: Secure and Efficient FL for GBDTs (SeFB). The algorithm uses locality-sensitive hashing (LSH) to build trees by collecting similarity information about instances without exposing the participants' original data. In the tree-updating stage, the algorithm aggregates the local gradients of all data participants and computes global leaf weights, improving model accuracy while reducing communication cost. Experimental analysis shows that the algorithm protects the privacy of the original data at low communication cost. Performance on an imbalanced binary data set is also evaluated; the results show that SeFB improves accuracy by 2.53% on average compared with existing horizontal federated GBDT schemes. [ABSTRACT FROM AUTHOR]
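The two mechanisms the abstract describes (LSH-based instance matching without raw-data exchange, and gradient aggregation to compute global leaf weights) can be sketched as follows. This is a minimal illustration assuming random-projection LSH and the standard GBDT leaf-weight formula w = -G / (H + lambda); the function names, the regularization parameter, and the aggregation flow are illustrative assumptions, not the authors' actual SeFB implementation.

```python
import numpy as np

# Illustrative sketch only: hypothetical names and parameters,
# not the SeFB implementation described in the paper.

def lsh_signature(x, planes):
    """Random-projection LSH: a participant shares only the hash bits
    of an instance, not its raw feature vector, so similar instances
    can still be matched across participants."""
    return tuple(bool(b) for b in (planes @ x) > 0)

def local_gradient_sums(grads, hess):
    """Each participant locally sums the gradients/hessians of the
    instances that fall into a given leaf."""
    return grads.sum(), hess.sum()

def global_leaf_weight(local_sums, lam=1.0):
    """The aggregator combines the per-participant sums and computes a
    global leaf weight w = -G / (H + lambda)."""
    G = sum(g for g, _ in local_sums)
    H = sum(h for _, h in local_sums)
    return -G / (H + lam)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    planes = rng.normal(size=(8, 4))   # 8 random hyperplanes, 4 features
    x = rng.normal(size=4)
    print(lsh_signature(x, planes))    # only these bits leave the participant

    # two participants report local (gradient, hessian) sums for one leaf
    sums = [local_gradient_sums(rng.normal(size=10), np.ones(10)),
            local_gradient_sums(rng.normal(size=15), np.ones(15))]
    print(global_leaf_weight(sums))
```

Because each participant transmits only hash signatures and summed gradient statistics, the raw training data never leaves its owner and the per-round message size stays small, which is the intuition behind the privacy and communication-cost claims above.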
- Subjects :
- DECISION trees
DATA privacy
BOOSTING algorithms
DATA mining
COMMUNICATION models
Details
- Language :
- English
- ISSN :
- 2076-3417
- Volume :
- 13
- Issue :
- 7
- Database :
- Complementary Index
- Journal :
- Applied Sciences (2076-3417)
- Publication Type :
- Academic Journal
- Accession number :
- 163038168
- Full Text :
- https://doi.org/10.3390/app13074283