
Masked Particle Modeling on Sets: Towards Self-Supervised High Energy Physics Foundation Models

Authors:
Golling, Tobias
Heinrich, Lukas
Kagan, Michael
Klein, Samuel
Leigh, Matthew
Osadchy, Margarita
Raine, John Andrew
Publication Year:
2024

Abstract

We propose masked particle modeling (MPM) as a self-supervised method for learning generic, transferable, and reusable representations on unordered sets of inputs, for use with high energy physics (HEP) scientific data. This work provides a novel scheme for masked-modeling-based pre-training that learns permutation invariant functions on sets. More generally, it is a step towards building large foundation models for HEP that can be generically pre-trained with self-supervised learning and later fine-tuned for a variety of downstream tasks. In MPM, particles in a set are masked and the training objective is to recover their identity, as defined by the discretized token representation of a pre-trained vector quantized variational autoencoder. We study the efficacy of the method on samples of high energy jets from collider physics experiments, including studies of the impact of discretization, permutation invariance, and ordering. We also study the fine-tuning capability of the model, showing that it can be adapted to tasks such as supervised and weakly supervised jet classification, and that it transfers efficiently to new classes and new data domains with small fine-tuning data sets.
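The pre-training scheme described in the abstract can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' implementation: hyperparameters, layer sizes, and the stand-in random "token IDs" (which in the paper would come from a frozen, pre-trained VQ-VAE tokenizer) are all assumptions made for clarity. Masked particles are replaced by a learnable mask embedding, and a Transformer encoder without positional encodings (hence permutation-equivariant on the set) predicts the discrete token identity of each masked particle.

```python
import torch
import torch.nn as nn

class MaskedParticleModel(nn.Module):
    """Illustrative MPM sketch (hypothetical, not the paper's code).

    Each jet is an unordered set of particle feature vectors. A fraction
    of particles is masked; the model predicts, for each masked slot, the
    discrete codebook index that a frozen VQ-VAE tokenizer would assign
    to the original particle.
    """

    def __init__(self, n_features=4, d_model=64, n_tokens=512, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        # Learnable embedding substituted at masked positions.
        self.mask_token = nn.Parameter(torch.zeros(d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        # No positional encodings: the encoder is permutation-equivariant.
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Classification head over the VQ-VAE codebook indices.
        self.head = nn.Linear(d_model, n_tokens)

    def forward(self, particles, mask):
        # particles: (B, N, n_features); mask: (B, N) bool, True = masked.
        x = self.embed(particles)
        x = torch.where(mask.unsqueeze(-1), self.mask_token, x)
        return self.head(self.encoder(x))  # (B, N, n_tokens) logits

# Toy pre-training step with random inputs and random stand-in token IDs.
torch.manual_seed(0)
B, N = 8, 16
model = MaskedParticleModel()
parts = torch.randn(B, N, 4)
mask = torch.rand(B, N) < 0.3                 # mask ~30% of particles
tokens = torch.randint(0, 512, (B, N))        # stand-in for VQ-VAE token IDs
logits = model(parts, mask)
# Cross-entropy only on the masked positions, as in masked modeling.
loss = nn.functional.cross_entropy(logits[mask], tokens[mask])
loss.backward()
```

Because the encoder uses no positional information, permuting the particles within a set permutes the output logits identically, which is the set-symmetry property the paper studies against ordered alternatives.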

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2401.13537
Document Type:
Working Paper