Cite
GPT-NAS: Neural Architecture Search Meets Generative Pre-Trained Transformer Model
MLA
Yu, Caiyang, et al. “GPT-NAS: Neural Architecture Search Meets Generative Pre-Trained Transformer Model.” Big Data Mining and Analytics, vol. 8, no. 1, Feb. 2025, pp. 45–64. EBSCOhost, https://doi.org/10.26599/BDMA.2024.9020036.
APA
Yu, C., Liu, X., Wang, Y., Liu, Y., Feng, W., Deng, X., Tang, C., & Lv, J. (2025). GPT-NAS: Neural architecture search meets generative pre-trained transformer model. Big Data Mining and Analytics, 8(1), 45–64. https://doi.org/10.26599/BDMA.2024.9020036
Chicago
Yu, Caiyang, Xianggen Liu, Yifan Wang, Yun Liu, Wentao Feng, Xiong Deng, Chenwei Tang, and Jiancheng Lv. 2025. “GPT-NAS: Neural Architecture Search Meets Generative Pre-Trained Transformer Model.” Big Data Mining and Analytics 8 (1): 45–64. https://doi.org/10.26599/BDMA.2024.9020036.