
Revisiting Random Walks for Learning on Graphs

Authors:
Kim, Jinwoo
Zaghen, Olga
Suleymanzade, Ayhan
Ryou, Youngmin
Hong, Seunghoon
Publication Year: 2024

Abstract

We revisit a simple idea for machine learning on graphs, where a random walk on a graph produces a machine-readable record, and this record is processed by a deep neural network to directly make vertex-level or graph-level predictions. We refer to these stochastic machines as random walk neural networks, and show that we can design them to be isomorphism invariant while being capable of universal approximation of graph functions in probability. A useful finding is that almost any kind of record of a random walk guarantees probabilistic invariance as long as the vertices are anonymized. This enables us to record random walks in plain text and adopt a language model to read these text records to solve graph tasks. We further establish a parallelism to message passing neural networks using tools from Markov chain theory, and show that over-smoothing in message passing is alleviated by construction in random walk neural networks, while over-squashing manifests as probabilistic under-reaching. We show that random walk neural networks based on pre-trained language models can solve several hard problems on graphs, such as separating strongly regular graphs where the 3-WL test fails, counting substructures, and transductive classification on the arXiv citation network without training. Code is available at https://github.com/jw9730/random-walk.

Comment: 41 pages, 11 figures
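The anonymization step described in the abstract is easy to make concrete. The sketch below is a minimal illustration, not the authors' implementation (the linked repository contains that): it simulates a uniform random walk on a small adjacency-list graph and records it as plain text, renaming each vertex to the order of its first appearance so that the record reflects the walk's pattern rather than vertex identities. The function name and toy graph here are hypothetical.

    import random

    def record_anonymized_walk(adj, start, length, rng=random):
        """Record a uniform random walk as plain text with anonymized
        vertices: each vertex is renamed to the index of its first
        appearance in the walk (1, 2, 3, ...)."""
        name = {}      # original vertex -> anonymized id
        record = []
        v = start
        for _ in range(length):
            if v not in name:
                name[v] = len(name) + 1
            record.append(str(name[v]))
            v = rng.choice(adj[v])  # step to a uniformly random neighbor
        return "-".join(record)

    # Toy graph: a 4-cycle given as an adjacency list.
    adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
    print(record_anonymized_walk(adj, start=0, length=8))  # e.g. "1-2-3-2-1-4-3-4"

A text record of this form can then be placed in a language-model prompt to make the vertex-level or graph-level prediction, which is the reading pipeline the abstract outlines.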

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2407.01214
Document Type: Working Paper