On Over-Squashing in Message Passing Neural Networks: The Impact of Width, Depth, and Topology
- Publication Year: 2023
Abstract
- Message Passing Neural Networks (MPNNs) are instances of Graph Neural Networks that leverage the graph structure to send messages over the edges. This inductive bias leads to a phenomenon known as over-squashing, where a node feature is insensitive to information contained at distant nodes. Despite recent methods introduced to mitigate this issue, an understanding of the causes of over-squashing and of possible solutions is lacking. In this theoretical work, we prove that: (i) neural network width can mitigate over-squashing, but at the cost of making the whole network more sensitive; (ii) conversely, depth cannot help mitigate over-squashing: increasing the number of layers leads to over-squashing being dominated by vanishing gradients; (iii) the graph topology plays the greatest role, since over-squashing occurs between nodes at high commute (access) time. Our analysis provides a unified framework to study different recent methods introduced to cope with over-squashing and serves as a justification for a class of methods that fall under graph rewiring.
- Comment: Accepted at ICML 2023; 21 pages
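The abstract's two central quantities can be illustrated concretely. Below is a minimal sketch, not the paper's method: it assumes a linear GCN-style MPNN with symmetric normalization (so the Jacobian of a node's final feature with respect to a distant node's input reduces to a power of the normalized adjacency), and uses the standard Laplacian-pseudoinverse formula for commute time. All function names and the path-graph example are illustrative choices, not from the paper.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def sensitivity(A, num_layers):
    """For a linear MPNN h^(l+1) = S h^(l), the Jacobian of node u's
    final feature w.r.t. node v's input is (S^L)_{uv}; a small entry
    is a proxy for over-squashing between u and v."""
    S = normalized_adjacency(A)
    return np.linalg.matrix_power(S, num_layers)

def commute_time(A):
    """Commute time via the pseudoinverse of the graph Laplacian:
    C(u, v) = 2|E| * (L+_{uu} + L+_{vv} - 2 L+_{uv})."""
    L = np.diag(A.sum(axis=1)) - A
    L_pinv = np.linalg.pinv(L)
    two_E = A.sum()  # equals 2|E| for an undirected 0/1 adjacency
    n = A.shape[0]
    C = np.zeros((n, n))
    for u in range(n):
        for v in range(n):
            C[u, v] = two_E * (L_pinv[u, u] + L_pinv[v, v] - 2 * L_pinv[u, v])
    return C

# Path graph on 8 nodes (an assumed toy example): the endpoints have high
# commute time and, correspondingly, a tiny Jacobian entry, matching the
# claim that over-squashing occurs between nodes at high commute time.
n = 8
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

J = sensitivity(A, num_layers=n - 1)
C = commute_time(A)
print(f"Jacobian entry (0 -> {n - 1}): {J[0, n - 1]:.2e}")
print(f"Commute time  (0 <-> {n - 1}): {C[0, n - 1]:.1f}")
```

Running this on the path graph shows the pattern the abstract describes: the Jacobian entry between the two endpoints is orders of magnitude smaller than between adjacent nodes, while their commute time is the largest in the graph.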
Details
- Database: arXiv
- Publication Type: Report
- Accession number: edsarx.2302.02941
- Document Type: Working Paper