
Enhancing Long-Term Memory using Hierarchical Aggregate Tree for Retrieval Augmented Generation

Authors :
Aadharsh Aadhithya A
Sachin Kumar S
Soman K. P.
Publication Year :
2024

Abstract

Large language models have limited context capacity, which hinders reasoning over long conversations. We propose the Hierarchical Aggregate Tree (HAT) memory structure, which recursively aggregates relevant dialogue context through conditional tree traversals. HAT encapsulates information from child nodes, enabling broad coverage with depth control. We formulate finding the best context as an optimal tree traversal problem. Experiments show that HAT improves dialogue coherence and summary quality over baseline contexts, demonstrating the technique's effectiveness for multi-turn reasoning without exponential parameter growth. This memory augmentation enables more consistent, grounded long-form conversations from LLMs.

Comment: 6 pages, 2 figures
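
The abstract describes HAT only at a high level; the paper's actual aggregation and traversal procedures are not reproduced in this record. The following Python sketch illustrates the general idea under stated assumptions: aggregate stands in for an LLM-produced summary and relevance for an embedding-based scorer, both hypothetical placeholders rather than the authors' method, and the branching factor and depth budget are illustrative parameters.

    from dataclasses import dataclass, field

    @dataclass
    class HATNode:
        # Leaves hold raw dialogue turns; internal nodes hold an
        # aggregate (e.g. a summary) of their children's content.
        content: str
        children: list["HATNode"] = field(default_factory=list)

    def aggregate(children: list[HATNode]) -> str:
        # Placeholder aggregation: in practice this would be an
        # LLM summary condensing the children's content.
        return " | ".join(c.content for c in children)

    def build_hat(turns: list[str], branching: int = 2) -> HATNode:
        # Recursively group dialogue turns, aggregating each group
        # into a parent node until a single root remains.
        nodes = [HATNode(t) for t in turns]
        while len(nodes) > 1:
            parents = []
            for i in range(0, len(nodes), branching):
                group = nodes[i:i + branching]
                parents.append(HATNode(aggregate(group), group))
            nodes = parents
        return nodes[0]

    def relevance(query: str, node: HATNode) -> float:
        # Toy relevance score via word overlap; a real system would
        # use embedding similarity or an LLM judgment.
        q = set(query.lower().split())
        c = set(node.content.lower().split())
        return len(q & c) / (len(q) or 1)

    def traverse(root: HATNode, query: str, budget: int = 3) -> list[str]:
        # Conditional traversal: descend into the most relevant child
        # until the depth budget is spent or a leaf is reached,
        # collecting aggregates along the path as retrieved context.
        path, node = [], root
        for _ in range(budget):
            path.append(node.content)
            if not node.children:
                break
            node = max(node.children, key=lambda c: relevance(query, c))
        return path

    if __name__ == "__main__":
        turns = [
            "User asks about the weather in Paris.",
            "Assistant says it is sunny in Paris.",
            "User asks for a dinner recommendation.",
            "Assistant suggests a bistro near the Louvre.",
        ]
        root = build_hat(turns)
        print(traverse(root, "Where should I eat dinner?"))

Because each internal node summarizes its subtree, the traversal returns context that spans the whole conversation at the root while gaining specificity with depth, which is how a fixed-depth path can cover a long dialogue without the context growing linearly in the number of turns.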

Details

Database :
arXiv
Publication Type :
Report
Accession Number :
edsarx.2406.06124
Document Type :
Working Paper