
Learning Symmetries via Weight-Sharing with Doubly Stochastic Tensors

Authors:
van der Linden, Putri A.
García-Castellanos, Alejandro
Vadgama, Sharvaree
Kuipers, Thijs P.
Bekkers, Erik J.
Source:
Advances in Neural Information Processing Systems (NeurIPS) 2024
Publication Year:
2024

Abstract

Group equivariance has emerged as a valuable inductive bias in deep learning, enhancing generalization, data efficiency, and robustness. Classically, group equivariant methods require the groups of interest to be known beforehand, which may not be realistic for real-world data. Additionally, baking in fixed group equivariance may impose overly restrictive constraints on model architecture. This highlights the need for methods that can dynamically discover and apply symmetries as soft constraints. For neural network architectures, equivariance is commonly achieved through group transformations of a canonical weight tensor, resulting in weight sharing over a given group $G$. In this work, we propose to learn such a weight-sharing scheme by defining a collection of learnable doubly stochastic matrices that act as soft permutation matrices on canonical weight tensors, which can take regular group representations as a special case. This yields learnable kernel transformations that are jointly optimized with downstream tasks. We show that when the dataset exhibits strong symmetries, the permutation matrices will converge to regular group representations and our weight-sharing networks effectively become regular group convolutions. Additionally, the flexibility of the method enables it to effectively pick up on partial symmetries.

Comment: 19 pages, 14 figures, 4 tables
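The abstract does not specify how the doubly stochastic matrices are parameterized; a standard way to obtain learnable (approximately) doubly stochastic matrices from unconstrained parameters is Sinkhorn normalization, which alternately rescales rows and columns of a positive matrix. The NumPy sketch below illustrates the core idea under that assumption — all variable names and the choice of Sinkhorn iterations are illustrative, not taken from the paper:

```python
import numpy as np

def sinkhorn(logits, n_iters=50):
    """Map an unconstrained real matrix to an (approximately) doubly
    stochastic one by alternating row and column normalization."""
    M = np.exp(logits - logits.max())  # positive entries, numerically stable
    for _ in range(n_iters):
        M /= M.sum(axis=1, keepdims=True)  # rows sum to 1
        M /= M.sum(axis=0, keepdims=True)  # columns sum to 1
    return M

rng = np.random.default_rng(0)
n = 4                             # e.g. kernel positions / group elements
w = rng.normal(size=n)            # canonical weight vector (hypothetical)
logits = rng.normal(size=(n, n))  # learnable parameters, trained end-to-end

P = sinkhorn(logits)

# Applying the soft permutation to the canonical weights yields a
# transformed weight copy; a collection of such P's defines the learned
# weight-sharing scheme. If every P hardens into a true permutation
# forming a regular group representation, this recovers the weight
# sharing of a regular group convolution.
w_shared = P @ w
```

In a full model, the `logits` would be optimized jointly with the downstream task loss, so the degree of symmetry enforced by `P` is learned rather than fixed in advance — which is what lets the method capture partial symmetries.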

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2412.04594
Document Type:
Working Paper