
SLAPP: Subgraph-level attention-based performance prediction for deep learning models.

Authors :
Wang, Zhenyi
Yang, Pengfei
Hu, Linwei
Zhang, Bowen
Lin, Chengmin
Lv, Wenkai
Wang, Quan
Source :
Neural Networks. Feb 2024, Vol. 170, p. 285-297. 13p.
Publication Year :
2024

Abstract

The intricacy of the Deep Learning (DL) landscape, brimming with a variety of models, applications, and platforms, poses considerable challenges for the optimal design, optimization, or selection of suitable DL models. One promising avenue for addressing this challenge is the development of accurate performance prediction methods. However, existing methods have critical limitations. Operator-level methods, proficient at predicting the performance of individual operators, often neglect broader graph features, resulting in inaccurate full-network performance predictions. In contrast, graph-level methods excel at overall network prediction by leveraging these graph features but cannot predict the performance of individual operators. To bridge these gaps, we propose SLAPP, a novel subgraph-level performance prediction method. Central to SLAPP is an innovative variant of Graph Neural Networks (GNNs) that we developed, named the Edge Aware Graph Attention Network (EAGAT), which enables superior encoding of both node and edge features. Through this approach, SLAPP effectively captures both graph and operator features, thereby providing precise performance predictions for individual operators and entire networks. Moreover, we introduce a mixed loss design with dynamic weight adjustment to reconcile the predictive accuracy between individual operators and entire networks. In our experimental evaluation, SLAPP consistently outperforms traditional approaches in prediction accuracy, including the ability to handle unseen models effectively. Compared with existing research, our method also demonstrates superior predictive performance across multiple DL models. [ABSTRACT FROM AUTHOR]
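To make the two ideas named in the abstract more concrete, the sketch below shows (a) an attention layer whose scores depend on node and edge features and (b) a mixed loss whose operator/network weighting changes over training. This is a minimal illustrative sketch only: the layer structure, dimensions, and the linear weighting schedule are assumptions for exposition, not the authors' EAGAT implementation or their dynamic-weight rule.

```python
# Hypothetical sketch of an edge-aware graph attention layer and a mixed loss
# with dynamic weighting, loosely following the ideas described in the abstract.
# All names, dimensions, and the weighting schedule are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EdgeAwareAttentionLayer(nn.Module):
    """Single attention head whose scores depend on node *and* edge features."""

    def __init__(self, node_dim, edge_dim, out_dim):
        super().__init__()
        self.node_proj = nn.Linear(node_dim, out_dim, bias=False)
        self.edge_proj = nn.Linear(edge_dim, out_dim, bias=False)
        # attention score computed over [source || destination || edge] embeddings
        self.attn = nn.Linear(3 * out_dim, 1, bias=False)

    def forward(self, x, edge_index, edge_attr):
        # x: (N, node_dim), edge_index: (2, E), edge_attr: (E, edge_dim)
        src, dst = edge_index
        h = self.node_proj(x)          # (N, out_dim) node embeddings
        e = self.edge_proj(edge_attr)  # (E, out_dim) edge embeddings
        score = self.attn(torch.cat([h[src], h[dst], e], dim=-1)).squeeze(-1)
        score = F.leaky_relu(score, 0.2)
        # softmax over the incoming edges of each destination node
        alpha = torch.exp(score - score.max())
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, alpha)
        alpha = alpha / (denom[dst] + 1e-16)
        # aggregate messages that mix neighbor and edge embeddings
        msg = alpha.unsqueeze(-1) * (h[src] + e)
        return torch.zeros_like(h).index_add_(0, dst, msg)


def mixed_loss(op_pred, op_true, net_pred, net_true, epoch, total_epochs):
    """Combine operator-level and network-level errors; here the trade-off
    weight shifts linearly toward the network term as training progresses
    (an assumed schedule, standing in for the paper's dynamic adjustment)."""
    w = epoch / max(total_epochs, 1)   # 0 -> operator focus, 1 -> network focus
    op_loss = F.mse_loss(op_pred, op_true)
    net_loss = F.mse_loss(net_pred, net_true)
    return (1.0 - w) * op_loss + w * net_loss
```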

Details

Language :
English
ISSN :
0893-6080
Volume :
170
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
174842712
Full Text :
https://doi.org/10.1016/j.neunet.2023.11.043