
Graph Generative Pre-trained Transformer

Authors:
Chen, Xiaohui
Wang, Yinkai
He, Jiaxing
Du, Yuanqi
Hassoun, Soha
Xu, Xiaolin
Liu, Li-Ping
Publication Year:
2025

Abstract

Graph generation is a critical task in numerous domains, including molecular design and social network analysis, due to its ability to model complex relationships and structured data. While most modern graph generative models use adjacency matrix representations, this work revisits an alternative approach that represents graphs as sequences of node sets and edge sets. We advocate for this approach due to its efficient encoding of graphs and propose a novel representation. Based on this representation, we introduce the Graph Generative Pre-trained Transformer (G2PT), an auto-regressive model that learns graph structures via next-token prediction. To further exploit G2PT's capabilities as a general-purpose foundation model, we explore fine-tuning strategies for two downstream applications: goal-oriented generation and graph property prediction. We conduct extensive experiments across multiple datasets. Results indicate that G2PT achieves superior generative performance on both generic graph and molecule datasets. Furthermore, G2PT exhibits strong adaptability and versatility in downstream tasks, from molecular design to property prediction.

Comment: preprint
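As a rough illustration of the sequence-based idea described in the abstract, the sketch below flattens a small graph into a node section followed by an edge section and derives next-token prediction targets from the resulting sequence. The token names, section markers, and ordering are assumptions made for this example, not the exact representation proposed in the paper.

```python
# Hypothetical sketch: serialize a graph into one flat token sequence
# (nodes first, then edges) suitable for auto-regressive next-token
# prediction. The vocabulary below is illustrative, not G2PT's actual scheme.

def graph_to_sequence(node_types, edges):
    """Flatten a graph into a token list: node section, then edge section.

    node_types: list of node-type labels, indexed by node id
    edges: list of (src, dst, edge_label) tuples
    """
    tokens = ["<bos>", "<nodes>"]
    for node_id, ntype in enumerate(node_types):
        tokens += [f"n{node_id}", ntype]
    tokens.append("<edges>")
    for src, dst, label in edges:
        tokens += [f"n{src}", f"n{dst}", label]
    tokens.append("<eos>")
    return tokens


# Example: a 3-node path graph (e.g. a tiny molecule fragment C-C-O).
seq = graph_to_sequence(["C", "C", "O"], [(0, 1, "single"), (1, 2, "single")])

# Next-token prediction pairs: the model sees seq[:t] and predicts seq[t].
training_pairs = [(seq[:t], seq[t]) for t in range(1, len(seq))]
print(seq)
```

Placing the edge section after the node section reflects the intuition that an auto-regressive model can condition edge predictions on the full set of generated nodes; the actual ordering and vocabulary used by G2PT may differ.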

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2501.01073
Document Type:
Working Paper