Path-Augmented Graph Transformer Network
- Publication Year :
- 2019
- Publisher :
- American Chemical Society (ACS), 2019.
-
Abstract
- Much of the recent work on learning molecular representations has been based on Graph Convolution Networks (GCN). These models rely on local aggregation operations and can therefore miss higher-order graph properties. To remedy this, we propose Path-Augmented Graph Transformer Networks (PAGTN) that are explicitly built on longer-range dependencies in graph-structured data. Specifically, we use path features in molecular graphs to create global attention layers. We compare our PAGTN model against the GCN model and show that our model consistently outperforms GCNs on molecular property prediction datasets including quantum chemistry (QM7, QM8, QM9), physical chemistry (ESOL, Lipophilicity) and biochemistry (BACE, BBBP).
- Comment: Appears in ICML LRG Workshop
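The core idea in the abstract, global attention whose pairwise logits are biased by path features between atoms, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the scalar path-bias parameterization (`Wp`), and the single-head layout are assumptions for clarity.

```python
import numpy as np

def path_augmented_attention(node_feats, path_feats, Wq, Wk, Wv, Wp):
    """Global attention over all atom pairs, with pairwise path features
    added to the attention logits. A sketch of the idea only; the exact
    parameterization in the paper may differ."""
    Q = node_feats @ Wq                        # (n, d) queries
    K = node_feats @ Wk                        # (n, d) keys
    V = node_feats @ Wv                        # (n, d) values
    d = Q.shape[1]
    # Standard scaled dot-product logits between every pair of atoms,
    # so attention is global rather than restricted to bonded neighbors.
    logits = (Q @ K.T) / np.sqrt(d)            # (n, n)
    # Path features (e.g. an encoding of the shortest path between atoms
    # i and j) are projected to a scalar bias per pair, letting the layer
    # see longer-range graph structure that local GCN aggregation misses.
    logits += (path_feats @ Wp).squeeze(-1)    # (n, n, p) @ (p, 1) -> (n, n)
    # Row-wise softmax, then weighted sum of values.
    weights = np.exp(logits - logits.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V                         # (n, d) updated node states
```

In a full model this layer would be stacked, combined with residual connections, and pooled into a molecule-level representation for property prediction; those details are omitted here.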
- Subjects :
- FOS: Computer and information sciences
Computer Science - Machine Learning
Theoretical computer science
Statistics - Machine Learning
Computer science
Path (graph theory)
Machine Learning (stat.ML)
Graph algorithms
Graph property
Graph
Machine Learning (cs.LG)
Transformer (machine learning model)
Convolution
Details
- Database :
- OpenAIRE
- Accession number :
- edsair.doi.dedup.....3d38bcf6f7e8b85ad2d3e49e519d98f0
- Full Text :
- https://doi.org/10.26434/chemrxiv.8214422