1. Long Term Memory: The Foundation of AI Self-Evolution
- Author
Xun Jiang, Feng Li, Han Zhao, Jiaying Wang, Jun Shao, Shihao Xu, Shu Zhang, Weiling Chen, Xavier Tang, Yize Chen, Mengyue Wu, Weizhi Ma, Mengdi Wang, and Tianqiao Chen
- Subjects
Computer Science - Artificial Intelligence, Computer Science - Machine Learning
- Abstract
Large language models (LLMs) like GPTs, trained on vast datasets, have demonstrated impressive capabilities in language understanding, reasoning, and planning, achieving human-level performance in various tasks. Most studies focus on enhancing these models by training on ever-larger datasets to build more powerful foundation models. While training stronger models is important, enabling models to evolve during inference is equally crucial, a process we refer to as AI self-evolution. Unlike large-scale training, self-evolution may rely on limited data or interactions. Inspired by the columnar organization of the human cerebral cortex, we hypothesize that AI models could develop cognitive abilities and build internal representations through iterative interactions with their environment. To achieve this, models need long-term memory (LTM) to store and manage processed interaction data. LTM supports self-evolution by representing diverse experiences across environments and agents. In this report, we explore AI self-evolution and its potential to enhance models during inference. We examine LTM's role in lifelong learning, allowing models to evolve based on accumulated interactions. We outline the structure of LTM and the systems needed for effective data retention and representation. We also classify approaches for building personalized models with LTM data and show how these models achieve self-evolution through interaction. Using LTM, our multi-agent framework OMNE achieved first place on the GAIA benchmark, demonstrating LTM's potential for AI self-evolution. Finally, we present a roadmap for future research, emphasizing the importance of LTM for advancing AI technology and its practical applications.
- Comment
56 pages, 13 figures
- Published
2024
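The abstract describes LTM as a store of processed interaction data that a model can draw on during inference. As a rough illustration only (not the paper's OMNE framework or its actual LTM design), the sketch below shows what a minimal interaction store with retrieval might look like; the class name LTMStore, the keyword-overlap ranking, and all field names are assumptions made for this example.

```python
from dataclasses import dataclass, field


@dataclass
class Interaction:
    """A single processed interaction the agent chooses to remember."""
    text: str
    keywords: set[str] = field(default_factory=set)


class LTMStore:
    """Hypothetical long-term memory: append interactions, retrieve relevant ones."""

    def __init__(self) -> None:
        self._memories: list[Interaction] = []

    def add(self, text: str) -> None:
        # Keep the raw text alongside a crude keyword index.
        self._memories.append(Interaction(text, set(text.lower().split())))

    def retrieve(self, query: str, k: int = 3) -> list[str]:
        # Rank stored interactions by keyword overlap with the query.
        q = set(query.lower().split())
        ranked = sorted(self._memories, key=lambda m: len(q & m.keywords), reverse=True)
        return [m.text for m in ranked[:k]]


if __name__ == "__main__":
    ltm = LTMStore()
    ltm.add("User prefers concise answers with code examples.")
    ltm.add("Earlier session: answered a GAIA-style web lookup question.")
    print(ltm.retrieve("how should answers with code be formatted for this user?"))
```

In a real system the keyword index would typically be replaced by embedding-based retrieval, and the store would persist across sessions so accumulated interactions can shape later behavior, which is the lifelong-learning role the abstract attributes to LTM.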