Trend Analysis of Large Language Models through a Developer Community: A Focus on Stack Overflow.
- Source :
- Information (2078-2489), Nov 2023, Vol. 14, Issue 11, p602, 15p.
- Publication Year :
- 2023
Abstract
- In the rapidly advancing field of large language model (LLM) research, platforms like Stack Overflow offer invaluable insights into the developer community's perceptions, challenges, and interactions. This research aims to analyze LLM research and development trends within the professional community. Through the rigorous analysis of Stack Overflow, employing a comprehensive dataset spanning several years, the study identifies the prevailing technologies and frameworks underlining the dominance of models and platforms such as Transformer and Hugging Face. Furthermore, a thematic exploration using Latent Dirichlet Allocation unravels a spectrum of LLM discussion topics. As a result of the analysis, twenty keywords were derived, and a total of five key dimensions, "OpenAI Ecosystem and Challenges", "LLM Training with Frameworks", "APIs, File Handling and App Development", "Programming Constructs and LLM Integration", and "Data Processing and LLM Functionalities", were identified through intertopic distance mapping. This research underscores the notable prevalence of specific Tags and technologies within the LLM discourse, particularly highlighting the influential roles of Transformer models and frameworks like Hugging Face. This dominance not only reflects the preferences and inclinations of the developer community but also illuminates the primary tools and technologies they leverage in the continually evolving field of LLMs. [ABSTRACT FROM AUTHOR]
- Subjects :
- *LANGUAGE models
- *TREND analysis
- *ELECTRONIC data processing
- *SPECTRUM allocation
Details
- Language : English
- ISSN : 2078-2489
- Volume : 14
- Issue : 11
- Database : Academic Search Index
- Journal : Information (2078-2489)
- Publication Type : Academic Journal
- Accession number : 173826554
- Full Text : https://doi.org/10.3390/info14110602