
Breadth-first, Depth-next Training of Random Forests

Authors:
Anghel, Andreea
Ioannou, Nikolas
Parnell, Thomas
Papandreou, Nikolaos
Mendler-Dünner, Celestine
Pozidis, Haris
Publication Year:
2019

Abstract

In this paper we analyze, evaluate, and improve the performance of training Random Forest (RF) models on modern CPU architectures. An exact, state-of-the-art binary decision tree building algorithm is used as the basis of this study. First, we investigate the trade-offs between two tree building strategies, namely breadth-first search (BFS) and depth-first search (DFS). We design a novel, dynamic, hybrid BFS-DFS algorithm and demonstrate that it outperforms both pure BFS and pure DFS, and is more robust across workloads with different characteristics. Second, we identify CPU performance bottlenecks when generating trees with this approach, and propose optimizations to alleviate them. The proposed hybrid tree building algorithm for RF is implemented in the Snap Machine Learning framework, and speeds up the training of RFs by 7.8x on average when compared to state-of-the-art RF solvers (sklearn, H2O, and xgboost) on a range of datasets, RF configurations, and multi-core CPU architectures.
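To make the breadth-first, depth-next idea concrete, below is a minimal, illustrative sketch of a hybrid tree builder. It assumes a simple switching rule (grow levels breadth-first while frontier nodes are still large, then finish small subtrees depth-first); the names `Node`, `best_split`, `split_node`, and the `switch_threshold` parameter are hypothetical and are not the paper's actual API or switching criterion.

```python
# Hedged sketch of a hybrid BFS-DFS decision-tree builder (regression, toy split search).
# Assumption: switching from BFS to DFS is triggered by node size alone.
from collections import deque
from dataclasses import dataclass
from typing import List, Optional

import numpy as np


@dataclass
class Node:
    indices: np.ndarray                  # row indices of samples reaching this node
    feature: Optional[int] = None
    threshold: Optional[float] = None
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    prediction: Optional[float] = None


def best_split(X, y, idx):
    """Toy exact split search: best (feature, threshold) by weighted variance."""
    best = (None, None, np.inf)
    for f in range(X.shape[1]):
        for t in np.unique(X[idx, f])[:-1]:
            mask = X[idx, f] <= t
            l, r = idx[mask], idx[~mask]
            score = l.size * y[l].var() + r.size * y[r].var()
            if score < best[2]:
                best = (f, t, score)
    return best[0], best[1]


def split_node(X, y, node, min_samples=2):
    """Split a node in place; return its children (empty list if it becomes a leaf)."""
    idx = node.indices
    if idx.size < min_samples or np.unique(y[idx]).size == 1:
        node.prediction = float(y[idx].mean())
        return []
    f, t = best_split(X, y, idx)
    if f is None:
        node.prediction = float(y[idx].mean())
        return []
    mask = X[idx, f] <= t
    node.feature, node.threshold = f, t
    node.left, node.right = Node(indices=idx[mask]), Node(indices=idx[~mask])
    return [node.left, node.right]


def build_tree_hybrid(X, y, switch_threshold=64):
    """Breadth-first over large nodes, depth-first once nodes shrink below a threshold."""
    root = Node(indices=np.arange(X.shape[0]))
    frontier = deque([root])             # BFS queue for large nodes
    dfs_stack: List[Node] = []           # small nodes deferred to the DFS phase
    while frontier:
        node = frontier.popleft()
        if node.indices.size <= switch_threshold:
            dfs_stack.append(node)
            continue
        frontier.extend(split_node(X, y, node))
    while dfs_stack:                     # DFS phase: finish small subtrees with a stack
        node = dfs_stack.pop()
        dfs_stack.extend(split_node(X, y, node))
    return root
```

The design intuition behind such a hybrid is the usual BFS/DFS trade-off in tree building: processing wide levels breadth-first gives good data locality over many samples near the root, while switching to depth-first once nodes are small keeps the working set and frontier memory compact deeper in the tree.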

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1910.06853
Document Type:
Working Paper