
Think Before You Speak: Cultivating Communication Skills of Large Language Models via Inner Monologue

Authors :
Zhou, Junkai
Pang, Liang
Shen, Huawei
Cheng, Xueqi
Publication Year :
2023

Abstract

The emergence of large language models (LLMs) further improves the capabilities of open-domain dialogue systems, which can now generate fluent, coherent, and diverse responses. However, LLMs still lack a crucial ability: communication skills. This limitation makes them behave more like information-seeking tools than anthropomorphic chatbots. Communication skills such as topic transition, proactively asking questions, concept guidance, empathy, and summarizing often should be taken into consideration to make LLMs more anthropomorphic and proactive during conversation, thereby increasing users' interest and encouraging them to chat for longer. However, enabling these communication skills in black-box LLMs remains a key challenge, because they do not follow the same utterance-formation mode as real people: think before speaking. Inspired by linguistics and cognitive science, we empower LLMs with communication skills through inner monologues. To evaluate various communication skills, we construct a benchmark named Cskills, which also enables a more comprehensive evaluation of a model's dialogue generation ability. Experimental results show that the proposed CSIM strategy improves the backbone models and outperforms the baselines.
Comment: Accepted by NAACL 2024 Findings
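
The abstract describes a two-stage "think before you speak" prompting idea: the model first produces a hidden inner monologue that plans which communication skill to apply, then generates the visible reply conditioned on that plan. The sketch below illustrates that general flow only; it is not the authors' CSIM implementation, and `call_llm` is a hypothetical placeholder for whatever chat-completion API is available.

```python
# Minimal sketch of a "think before you speak" (inner monologue) prompting flow.
# Assumptions: `call_llm` is a stand-in for any black-box LLM call; the prompts,
# skill names, and function names are illustrative, not taken from the paper.

def call_llm(prompt: str) -> str:
    """Placeholder LLM call: replace with a real chat-completion request."""
    return "[model output for: " + prompt[:40] + "...]"

def respond_with_inner_monologue(dialogue_history: str, user_utterance: str) -> str:
    # Stage 1: hidden inner monologue that plans which communication skill to use
    # (e.g. topic transition, proactively asking a question, empathy).
    plan = call_llm(
        "You are planning a reply, not speaking yet.\n"
        f"Dialogue so far:\n{dialogue_history}\n"
        f"User just said: {user_utterance}\n"
        "Briefly note which communication skill to apply and why."
    )

    # Stage 2: visible reply, conditioned on the hidden plan but not revealing it.
    reply = call_llm(
        f"Dialogue so far:\n{dialogue_history}\n"
        f"User just said: {user_utterance}\n"
        f"Your private plan (do not reveal it): {plan}\n"
        "Now write the reply to the user."
    )
    return reply

if __name__ == "__main__":
    print(respond_with_inner_monologue("User: Hi!\nBot: Hello!", "I feel a bit tired today."))
```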

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2311.07445
Document Type :
Working Paper