
Neural Topological Ordering for Computation Graphs

Authors:
Gagrani, Mukul
Rainone, Corrado
Yang, Yang
Teague, Harris
Jeon, Wonseok
Van Hoof, Herke
Zeng, Weiliang Will
Zappi, Piero
Lott, Christopher
Bondesan, Roberto
Publication Year:
2022

Abstract

Recent work on machine learning for combinatorial optimization has shown that learning-based approaches can outperform heuristic methods in both speed and solution quality. In this paper, we consider the problem of finding an optimal topological order on a directed acyclic graph, with a focus on the memory minimization problem that arises in compilers. We propose an end-to-end machine-learning approach to topological ordering using an encoder-decoder framework. Our encoder is a novel attention-based graph neural network architecture called \emph{Topoformer}, which uses different topological transforms of a DAG for message passing. The node embeddings produced by the encoder are converted into node priorities, which the decoder uses to generate a probability distribution over topological orders. We train our model on a dataset of synthetically generated graphs called layered graphs. We show that our model outperforms, or is on par with, several topological ordering baselines while being significantly faster on synthetic graphs with up to 2k nodes. We also train and test our model on a set of real-world computation graphs, showing performance improvements.

Comment: To appear in NeurIPS 2022
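To make the decoding step concrete: once each node has a scalar priority, a topological order can be produced by repeatedly scheduling, among the nodes whose predecessors have all been scheduled, the one with the highest priority. The sketch below is a minimal greedy illustration of that idea, not the paper's actual decoder (which samples from a probability distribution over orders); the function name and graph encoding are assumptions for illustration.

```python
import heapq

def priority_topological_order(num_nodes, edges, priority):
    """Greedy decode: among all 'ready' nodes (every predecessor already
    scheduled), always pick the one with the highest priority.
    `edges` is a list of (u, v) pairs meaning u must precede v."""
    indegree = [0] * num_nodes
    succ = [[] for _ in range(num_nodes)]
    for u, v in edges:
        succ[u].append(v)
        indegree[v] += 1
    # Max-heap via negated priorities; initially all source nodes are ready.
    ready = [(-priority[v], v) for v in range(num_nodes) if indegree[v] == 0]
    heapq.heapify(ready)
    order = []
    while ready:
        _, u = heapq.heappop(ready)
        order.append(u)
        for v in succ[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                heapq.heappush(ready, (-priority[v], v))
    return order

# Diamond DAG 0 -> {1, 2} -> 3; priorities make node 2 preferred over node 1.
print(priority_topological_order(4, [(0, 1), (0, 2), (1, 3), (2, 3)],
                                 [0.0, 1.0, 5.0, 0.0]))
```

Any assignment of distinct priorities selects exactly one topological order this way, which is what lets the encoder's continuous node embeddings parameterize a discrete scheduling decision.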

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2207.05899
Document Type:
Working Paper