1. End-to-End Graph Flattening Method for Large Language Models
- Author
- Hong, Bin, Wu, Jinze, Liu, Jiayu, Ding, Liang, Sha, Jing, Zhang, Kai, Wang, Shijin, and Huang, Zhenya
- Subjects
- Computer Science - Computation and Language, Computer Science - Artificial Intelligence
- Abstract
- In recent years, breakthroughs in Large Language Models (LLMs) have offered new ideas for achieving universal methods on graph data. The common practice of converting graphs into natural language for LLMs, referred to as graph flattening, exhibits good generalizability and interpretability. However, the poor organization of the textual format leads to weak understanding of long-distance scenarios. Inspired by human cognitive reasoning habits, we propose a novel graph flattening method tailored to LLMs, termed End-to-End DAG-Path prompting (EEDP). Experiments on real-world datasets show that EEDP enhances the reasoning performance of LLMs in long-distance scenarios while maintaining excellent performance in short-distance scenarios, demonstrating good robustness to distance variations. (A toy flattening sketch follows this entry.)
- Comment
- 2024 1st International Conference on Computational Linguistics and Natural Language Processing (CLNLP 2024)
- Published
- 2024
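
The abstract describes graph flattening, i.e., serializing a graph into natural-language text so it can be placed in an LLM prompt. Below is a minimal sketch of that general idea, not the paper's EEDP method: both function names (`flatten_edges`, `flatten_paths`), the prompt wording, and the path-enumeration variant are assumptions for illustration only. The path-based variant hints at why organizing the text around end-to-end paths, rather than a bag of edges, keeps long-distance relations in one contiguous span.

```python
# Illustrative graph flattening for LLM prompts (not the paper's EEDP;
# names and prompt phrasing are hypothetical).

from collections import deque


def flatten_edges(edges):
    """Baseline flattening: describe every edge as a separate sentence."""
    return " ".join(f"Node {u} is connected to node {v}." for u, v in edges)


def flatten_paths(edges, source, target):
    """Path-oriented flattening (hypothetical): enumerate simple paths
    from `source` to `target` so long-range relations appear as one
    contiguous chain of text rather than scattered edge sentences."""
    adjacency = {}
    for u, v in edges:
        adjacency.setdefault(u, []).append(v)

    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            paths.append(" -> ".join(map(str, path)))
            continue
        for nxt in adjacency.get(path[-1], []):
            if nxt not in path:  # skip nodes already on the path (avoid cycles)
                queue.append(path + [nxt])
    return f"Paths from {source} to {target}: " + "; ".join(paths)


if __name__ == "__main__":
    g = [(1, 2), (2, 3), (3, 4), (1, 4)]
    print(flatten_edges(g))
    # -> "Node 1 is connected to node 2. ... Node 1 is connected to node 4."
    print(flatten_paths(g, 1, 4))
    # -> "Paths from 1 to 4: 1 -> 4; 1 -> 2 -> 3 -> 4"
```

The edge-list output scatters the information needed to relate distant nodes, whereas the path-based output presents each source-to-target chain in a single span, which is the kind of organization the abstract argues matters for long-distance reasoning.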