Hyper-Mol: Molecular Representation Learning via Fingerprint-Based Hypergraph.

Authors :
Cui, Shicheng
Li, Qianmu
Li, Deqiang
Lian, Zhichao
Hou, Jun
Source :
Computational Intelligence & Neuroscience. 2/1/2023, Vol. 2023, p1-9. 9p.
Publication Year :
2023

Abstract

With the development of artificial intelligence (AI) in drug design and discovery, learning informative representations of molecules is becoming crucial for AI-driven tasks. In recent years, graph neural networks (GNNs) have emerged as a preferred deep learning architecture and have been successfully applied to molecular representation learning (MRL). Current MRL methods directly apply the message passing mechanism to the atom-level attributes of molecules (i.e., atoms and bonds). However, they neglect latent yet significant hyperstructured knowledge, such as information about pharmacophores or functional classes. Hence, in this paper, we propose Hyper-Mol, a new MRL framework that applies GNNs to encode hypergraph structures of molecules via fingerprint-based features. Hyper-Mol explores hyperstructured knowledge and the latent relationships among fingerprint substructures from a hypergraph perspective. A molecular hypergraph generation algorithm is designed to capture the hyperstructured information together with the physical and chemical characteristics of molecules. The fingerprint-level message passing process can thus encode both the intra-structure and inter-structure information of fingerprint substructures according to the molecular hypergraphs. We evaluate Hyper-Mol on molecular property prediction tasks, and the experimental results on real-world benchmarks show that Hyper-Mol learns comprehensive hyperstructured knowledge of molecules and is superior to state-of-the-art baselines. [ABSTRACT FROM AUTHOR]
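
The abstract describes the key construction: fingerprint substructures become hyperedges over the atoms they cover, and message passing is then carried out at the fingerprint level. The following is a minimal illustrative sketch, not the authors' code, of that hypergraph-generation step, assuming RDKit and Morgan (ECFP-style) fingerprints as the substructure source; the paper's actual fingerprint choice and generation algorithm may differ.

    # Sketch: build a fingerprint-based molecular hypergraph with RDKit.
    # Each Morgan fingerprint substructure (one bit environment) becomes a
    # hyperedge containing the atom indices it covers.
    from rdkit import Chem
    from rdkit.Chem import AllChem

    def fingerprint_hypergraph(smiles, radius=2, n_bits=2048):
        """Return (num_atoms, hyperedges), where each hyperedge is a set of atom indices."""
        mol = Chem.MolFromSmiles(smiles)
        bit_info = {}
        # bit_info maps fingerprint bit -> list of (center_atom, env_radius) pairs
        AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits, bitInfo=bit_info)
        hyperedges = []
        for bit, environments in bit_info.items():
            for center_atom, env_radius in environments:
                if env_radius == 0:
                    atoms = {center_atom}  # radius-0 environment is just the atom itself
                else:
                    bond_ids = Chem.FindAtomEnvironmentOfRadiusN(mol, env_radius, center_atom)
                    atoms = set()
                    for b in bond_ids:
                        bond = mol.GetBondWithIdx(b)
                        atoms.add(bond.GetBeginAtomIdx())
                        atoms.add(bond.GetEndAtomIdx())
                hyperedges.append(atoms)
        return mol.GetNumAtoms(), hyperedges

    # Example: aspirin
    n_atoms, edges = fingerprint_hypergraph("CC(=O)Oc1ccccc1C(=O)O")
    print(n_atoms, len(edges))

The resulting (atoms, hyperedges) incidence structure is the kind of molecular hypergraph on which a hypergraph neural network could run fingerprint-level message passing.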

Details

Language :
English
ISSN :
1687-5265
Volume :
2023
Database :
Academic Search Index
Journal :
Computational Intelligence & Neuroscience
Publication Type :
Academic Journal
Accession number :
161719350
Full Text :
https://doi.org/10.1155/2023/3756102