GPT-NAS: Neural Architecture Search Meets Generative Pre-Trained Transformer Model

Authors :
Caiyang Yu
Xianggen Liu
Yifan Wang
Yun Liu
Wentao Feng
Xiong Deng
Chenwei Tang
Jiancheng Lv
Source :
Big Data Mining and Analytics, Vol 8, Iss 1, Pp 45-64 (2025)
Publication Year :
2025
Publisher :
Tsinghua University Press, 2025.

Abstract

The pursuit of optimal neural network architectures is foundational to the progression of Neural Architecture Search (NAS). However, existing NAS methods that rely on traditional search strategies struggle in large, complex search spaces: they cannot discover more effective architectures within a reasonable time, which leads to inferior search results. This research introduces Generative Pre-trained Transformer NAS (GPT-NAS), an approach designed to overcome the limitations inherent in traditional NAS strategies. GPT-NAS improves search efficiency and yields better architectures by integrating a GPT model into the search process. Specifically, we design a reconstruction strategy that uses the trained GPT to reorganize the architectures obtained from the search. In addition, to equip the GPT model with neural architecture design capabilities, we train it on a dataset of neural architectures. For each architecture, the structural information of the preceding layers is used to predict the structure of the next layer, iterating over the entire architecture. In this way, the GPT model efficiently learns the key features of neural architectures. Extensive experimental validation shows that GPT-NAS outperforms both manually designed neural architectures and architectures generated automatically by other NAS methods. Moreover, we validate the benefit of introducing the GPT model in several ways and find that, on image datasets, it improves the accuracy of the searched neural architectures by up to about 9%.
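The next-layer prediction objective described in the abstract (previous layers in, next layer out, repeated over the whole architecture) can be illustrated with a minimal sketch. This is not the paper's method: the layer vocabulary, the example architectures, and the bigram statistics standing in for a GPT are all invented here for illustration.

```python
# Toy sketch of next-layer prediction over architecture sequences.
# Assumptions: the layer tokens and example architectures below are
# hypothetical; a real GPT would condition on the full prefix, not
# just the previous layer as this bigram model does.
from collections import Counter, defaultdict

# Toy "architecture dataset": each architecture is a sequence of layer tokens.
ARCHS = [
    ["conv3x3", "bn", "relu", "conv3x3", "bn", "relu", "pool", "fc"],
    ["conv3x3", "bn", "relu", "pool", "conv3x3", "bn", "relu", "fc"],
    ["conv5x5", "bn", "relu", "conv3x3", "bn", "relu", "pool", "fc"],
]

def train_next_layer_model(archs):
    """Count, for each layer type, which layer follows it."""
    counts = defaultdict(Counter)
    for arch in archs:
        for prev, nxt in zip(arch, arch[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next_layer(model, prev_layer):
    """Return the most frequent successor of prev_layer, or None if unseen."""
    followers = model.get(prev_layer)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

model = train_next_layer_model(ARCHS)
print(predict_next_layer(model, "conv3x3"))  # prints "bn"
```

In the paper's setting the predictor is a pre-trained GPT rather than frequency counts, and its predictions are used to reconstruct candidate architectures found during the search.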

Details

Language :
English
ISSN :
2096-0654
Volume :
8
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Big Data Mining and Analytics
Publication Type :
Academic Journal
Accession number :
edsdoj.84a9c91f629a4c239685ea4edd13310d
Document Type :
article
Full Text :
https://doi.org/10.26599/BDMA.2024.9020036