
Scaling Up Diffusion and Flow-based XGBoost Models

Authors :
Cresswell, Jesse C.
Kim, Taewoo
Publication Year :
2024

Abstract

Novel machine learning methods for tabular data generation are often developed on small datasets which do not match the scale required for scientific applications. We investigate a recent proposal to use XGBoost as the function approximator in diffusion and flow-matching models on tabular data, which proved to be extremely memory intensive, even on tiny datasets. In this work, we conduct a critical analysis of the existing implementation from an engineering perspective, and show that these limitations are not fundamental to the method; with better implementation it can be scaled to datasets 370x larger than previously used. Our efficient implementation also unlocks scaling models to much larger sizes which we show directly leads to improved performance on benchmark tasks. We also propose algorithmic improvements that can further benefit resource usage and model performance, including multi-output trees which are well-suited to generative modeling. Finally, we present results on large-scale scientific datasets derived from experimental particle physics as part of the Fast Calorimeter Simulation Challenge. Code is available at https://github.com/layer6ai-labs/calo-forest.

Comment: Presented at ICML 2024 Workshop on AI for Science
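To illustrate the core idea, the following is a minimal sketch (not the authors' implementation; see their repository above) of flow matching on tabular data with a single multi-output XGBoost model as the function approximator. It assumes XGBoost >= 2.0, where multi_strategy="multi_output_tree" lets one tree ensemble predict all feature velocities jointly, rather than training one ensemble per output. The toy data and hyperparameters are placeholders.

```python
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)

# Toy tabular "dataset": n samples, d features (stand-in for real data).
n, d = 10_000, 8
x1 = rng.normal(loc=2.0, scale=0.5, size=(n, d))  # data samples
x0 = rng.normal(size=(n, d))                      # base noise samples

# Conditional flow matching: interpolate x_t = (1 - t) x0 + t x1 and
# regress the straight-line velocity target v = x1 - x0.
t = rng.uniform(size=(n, 1))
x_t = (1.0 - t) * x0 + t * x1
v_target = x1 - x0

# One multi-output tree ensemble predicts all d velocity components at
# once (multi_strategy="multi_output_tree" requires tree_method="hist").
model = xgb.XGBRegressor(
    tree_method="hist",
    multi_strategy="multi_output_tree",
    n_estimators=200,
    max_depth=6,
)
model.fit(np.hstack([x_t, t]), v_target)

# Sampling: integrate dx/dt = v(x, t) from t=0 to t=1 with Euler steps.
steps = 50
x = rng.normal(size=(n, d))
for i in range(steps):
    ti = np.full((n, 1), i / steps)
    x = x + model.predict(np.hstack([x, ti])) / steps
```

The multi-output strategy is the point of the sketch: with one-output-per-tree training, memory and compute grow with the number of features times the number of noise levels, whereas a single multi-output ensemble shares structure across outputs, which is one of the algorithmic improvements the abstract highlights.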

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2408.16046
Document Type :
Working Paper