
StoryAnalogy: Deriving Story-level Analogies from Large Language Models to Unlock Analogical Understanding

Authors:
Jiayang, Cheng
Qiu, Lin
Chan, Tsz Ho
Fang, Tianqing
Wang, Weiqi
Chan, Chunkit
Ru, Dongyu
Guo, Qipeng
Zhang, Hongming
Song, Yangqiu
Zhang, Yue
Zhang, Zheng
Publication Year:
2023

Abstract

Analogy-making between narratives is crucial for human reasoning. In this paper, we evaluate the ability to identify and generate analogies by constructing a first-of-its-kind large-scale story-level analogy corpus, StoryAnalogy, which contains 24K story pairs from diverse domains with human annotations on two similarities from the extended Structure-Mapping Theory. We design a set of tests on StoryAnalogy, presenting the first evaluation of story-level analogy identification and generation. Interestingly, we find that the analogy identification tasks are incredibly difficult not only for sentence embedding models but also for recent large language models (LLMs) such as ChatGPT and LLaMa. ChatGPT, for example, only achieved around 30% accuracy on multiple-choice questions (compared to over 85% accuracy for humans). Furthermore, we observe that the data in StoryAnalogy can improve the quality of analogy generation in LLMs, where a fine-tuned FlanT5-xxl model achieves performance comparable to zero-shot ChatGPT.

Comment: Accepted by EMNLP 2023 main conference
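As an illustrative sketch (not the authors' released evaluation code), the multiple-choice identification task described above can be approximated with a sentence-embedding baseline: embed the query story and each candidate, then pick the candidate with the highest cosine similarity. The encoder choice, data layout, and field names below are assumptions made for illustration.

```python
# Hypothetical sentence-embedding baseline for multiple-choice
# story-level analogy identification. Model choice and toy data
# are illustrative assumptions, not the paper's evaluation setup.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # any sentence encoder works here

def pick_analogy(query_story: str, candidates: list[str]) -> int:
    """Return the index of the candidate most similar to the query story."""
    query_emb = model.encode(query_story, convert_to_tensor=True)
    cand_embs = model.encode(candidates, convert_to_tensor=True)
    scores = util.cos_sim(query_emb, cand_embs)  # shape: (1, n_candidates)
    return int(scores.argmax())

# Toy multiple-choice item: one source story, four candidate targets.
query = "The electrons orbit the nucleus, held by electrical attraction."
options = [
    "The planets revolve around the sun, bound by gravity.",  # analogous structure
    "A battery stores chemical energy for later use.",        # topical, not analogous
    "The nucleus of a cell contains genetic material.",       # surface word overlap only
    "Rain falls when clouds become saturated with water.",    # unrelated
]
print("Predicted choice:", pick_analogy(query, options))
```

As the abstract notes, such similarity-based baselines perform poorly on this task: cosine similarity between embeddings tends to track surface or topical overlap rather than the relational structure that defines an analogy.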

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2310.12874
Document Type:
Working Paper