
Hierarchical graph representation learning for the prediction of drug-target binding affinity.

Authors :
Chu, Zhaoyang
Huang, Feng
Fu, Haitao
Quan, Yuan
Zhou, Xionghui
Liu, Shichao
Zhang, Wen
Source :
Information Sciences. Oct 2022, Vol. 613, p507-523. 17p.
Publication Year :
2022

Abstract

Highlights:
• A novel hierarchical graph representation learning model for drug-target binding affinity prediction.
• Represents drug-target binding affinity data as a hierarchical graph.
• Hierarchically integrates coarse- and fine-level information in a coarse-to-fine manner.

Computationally predicting drug-target binding affinity (DTA) has attracted increasing attention because it can accelerate drug discovery. Numerous deep learning-based prediction models have been proposed, typically with a bi-encoder architecture that focuses on extracting expressive representations for drugs and targets but overlooks modeling explicit drug-target interactions. However, known binding affinities provide underlying knowledge about how drugs interact with targets, which benefits predictive accuracy. In this paper, we propose a novel hierarchical graph representation learning model for DTA prediction, named HGRL-DTA. The main contribution of our model is a hierarchical graph learning architecture that integrates coarse-level information from an affinity graph and fine-level information from drug/target molecule graphs in a well-designed coarse-to-fine manner. In addition, we design a similarity-based representation inference method to infer coarse-level information when it is unavailable for new drugs or targets under the cold-start scenario. Comprehensive experimental results under four scenarios across two benchmark datasets indicate that HGRL-DTA outperforms state-of-the-art models in almost all cases.
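The coarse-to-fine integration described in the abstract can be illustrated with a minimal sketch. The PyTorch code below is a simplified, hypothetical reading of the idea, not the authors' implementation: a graph convolution over the drug-target affinity graph produces coarse-level node embeddings, and a given molecule's coarse embedding is then injected as global context into message passing over its atom-level molecular graph. All module names, dimensions, and the broadcast-and-concatenate fusion scheme are assumptions.

import torch
import torch.nn as nn

class DenseGCNLayer(nn.Module):
    """One graph-convolution step on a dense adjacency: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, adj, h):
        # adj is assumed to be row-normalized with self-loops already added.
        return torch.relu(adj @ self.linear(h))

class CoarseToFineSketch(nn.Module):
    """Hypothetical coarse-to-fine fusion: coarse affinity-graph embeddings
    seed fine-level message passing on a single molecule graph."""
    def __init__(self, coarse_dim, atom_dim, hid_dim):
        super().__init__()
        self.coarse_gcn = DenseGCNLayer(coarse_dim, hid_dim)        # affinity graph
        self.fine_gcn = DenseGCNLayer(atom_dim + hid_dim, hid_dim)  # molecule graph

    def forward(self, aff_adj, aff_feats, mol_adj, atom_feats, node_idx):
        # Coarse level: embed every drug/target node of the affinity graph.
        coarse = self.coarse_gcn(aff_adj, aff_feats)       # (n_nodes, hid_dim)
        # Fine level: broadcast this molecule's coarse embedding to its atoms,
        # then refine with message passing over the molecular graph.
        context = coarse[node_idx].expand(atom_feats.size(0), -1)
        fine_in = torch.cat([atom_feats, context], dim=-1)
        atom_h = self.fine_gcn(mol_adj, fine_in)           # (n_atoms, hid_dim)
        # Read out a molecule-level embedding (mean pooling as a placeholder).
        return atom_h.mean(dim=0)

The readout here is deliberately simple; the point of the sketch is only that fine-level refinement is conditioned on coarse-level affinity-graph structure rather than computed in isolation.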
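For the cold-start scenario, the abstract mentions a similarity-based representation inference method. A minimal sketch of one plausible reading follows: the coarse-level embedding of an unseen drug or target is approximated as a similarity-weighted average of the embeddings of its most similar known entities. The similarity source (e.g., Tanimoto similarity for drugs) and the top-k weighting are assumptions, not the paper's exact formulation.

import torch

def infer_coarse_embedding(sim_to_known, known_embeddings, top_k=5):
    """Approximate the coarse-level embedding of an unseen drug/target as a
    similarity-weighted average of its top-k most similar known entities.

    sim_to_known:     (n_known,) similarity scores to known entities (assumed given).
    known_embeddings: (n_known, dim) coarse embeddings learned on the affinity graph.
    """
    scores, idx = torch.topk(sim_to_known, k=min(top_k, sim_to_known.numel()))
    weights = scores / scores.sum().clamp_min(1e-8)  # normalize to a convex combination
    return weights @ known_embeddings[idx]           # (dim,)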

Details

Language :
English
ISSN :
0020-0255
Volume :
613
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
159928220
Full Text :
https://doi.org/10.1016/j.ins.2022.09.043