
Generalization capabilities of translationally equivariant neural networks

Authors :
Bulusu, Srinath
Favoni, Matteo
Ipp, Andreas
Müller, David I.
Schuh, Daniel
Source :
Phys. Rev. D 104, 074504 (2021)
Publication Year :
2021

Abstract

The rising adoption of machine learning in high energy physics and lattice field theory necessitates the re-evaluation of common methods that are widely used in computer vision, which, when applied to problems in physics, can lead to significant drawbacks in terms of performance and generalizability. One particular example of this is the use of neural network architectures that do not reflect the underlying symmetries of the given physical problem. In this work, we focus on complex scalar field theory on a two-dimensional lattice and investigate the benefits of using group equivariant convolutional neural network architectures based on the translation group. For a meaningful comparison, we conduct a systematic search for equivariant and non-equivariant neural network architectures and apply them to various regression and classification tasks. We demonstrate that in most of these tasks our best equivariant architectures can perform and generalize significantly better than their non-equivariant counterparts, which applies not only to physical parameters beyond those represented in the training set, but also to different lattice sizes.

Comment: 28 pages, 20 figures, v3: equivalent to the version published in PRD
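The key property the abstract relies on can be checked directly. The following is a minimal illustrative sketch (not the authors' code): a 2D convolution with periodic boundary conditions, as used on a lattice with toroidal topology, commutes with lattice translations — shifting the field and then convolving equals convolving and then shifting. The field, kernel, and shift values below are arbitrary stand-ins.

```python
import numpy as np

def periodic_conv(field, kernel):
    """2D cross-correlation with periodic (lattice) boundary conditions."""
    kh, kw = kernel.shape
    out = np.zeros_like(field)
    for i in range(kh):
        for j in range(kw):
            # roll the field so each kernel tap sees the correct neighbor site
            out += kernel[i, j] * np.roll(
                field, shift=(kh // 2 - i, kw // 2 - j), axis=(0, 1)
            )
    return out

rng = np.random.default_rng(0)
field = rng.normal(size=(8, 8))    # toy scalar field configuration
kernel = rng.normal(size=(3, 3))   # stand-in for a learnable filter
shift = (2, 3)                     # a lattice translation

# Translation equivariance: conv(T(field)) == T(conv(field))
lhs = periodic_conv(np.roll(field, shift, axis=(0, 1)), kernel)
rhs = np.roll(periodic_conv(field, kernel), shift, axis=(0, 1))
print(np.allclose(lhs, rhs))  # True
```

This is exactly the symmetry the paper's equivariant architectures preserve end to end; ordinary dense layers or convolutions with zero padding break it at the boundaries, which is one source of the generalization gap the abstract describes.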

Details

Database :
arXiv
Journal :
Phys. Rev. D 104, 074504 (2021)
Publication Type :
Report
Accession number :
edsarx.2103.14686
Document Type :
Working Paper
Full Text :
https://doi.org/10.1103/PhysRevD.104.074504