
Harnessing Attention-Based Graph Recurrent Neural Networks for Enhanced Conversational Flow Prediction via Conversational Graph Construction.

Authors:
Sujatha, R.
Nimala, K.
Source:
Journal of Information & Knowledge Management; Jun 2024, Vol. 23, Issue 3, pp. 1-17
Publication Year:
2024

Abstract

Conversational flow refers to the progression of a conversation, encompassing the arrangement of topics discussed and how responses are delivered. A smooth flow involves participants taking turns to speak and respond naturally and intuitively. Conversely, a more disjointed flow may entail prolonged pauses or difficulties in establishing common ground. Numerous factors influence conversational flow, including the personalities of those involved, their familiarity with each other, and the contextual setting. A conversational graph pattern outlines how a conversation typically unfolds and the underlying structure it adheres to: the combination of sentence types, the sequential order of topics discussed, and the roles played by different participants. Predicting subsequent sentences relies on predefined patterns, the context derived from the prior conversation flow in the data, and the trained model. The accuracy of sentence prediction varies with the probability of identifying sentences that fit the subsequent pattern. We employ the Graph Recurrent Neural Network with Attention (GRNNA) model to generate conversational graphs and perform next-sentence prediction. The model constructs a conversational graph from an adjacency matrix, node features (sentences), and edge features (semantic similarity between sentences). The proposed approach leverages attention mechanisms, recurrent updates, and information aggregation from neighbouring nodes to predict the next node (sentence). The model achieves enhanced predictive capability by updating node representations through multiple iterations of message passing and recurrent updates. Experimental results on the conversation dataset demonstrate that the GRNNA model surpasses the Graph Neural Network (GNN) model in next-sentence prediction, achieving an accuracy of 98.89%.
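The abstract describes the GRNNA pipeline only at a high level: build a conversational graph from sentence embeddings and their pairwise semantic similarity, refine node states with attention-weighted message passing and recurrent updates, then score candidates for the next sentence. The paper's exact layer definitions are not reproduced in this record, so the PyTorch sketch below is only an illustration of that general scheme; the embedding dimension, similarity threshold, number of propagation steps, and the `predict_next_sentence` scoring head are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of attention-based graph recurrent updates over a
# conversational graph. All hyperparameters and layer choices are
# illustrative assumptions, not the architecture reported in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_conversation_graph(sent_emb: torch.Tensor, threshold: float = 0.5):
    """Adjacency matrix and edge weights from pairwise cosine similarity of sentences."""
    sim = F.cosine_similarity(sent_emb.unsqueeze(1), sent_emb.unsqueeze(0), dim=-1)
    adj = (sim >= threshold).float()           # adjacency (assumed similarity threshold)
    adj.fill_diagonal_(0.0)                    # no self-loops
    return adj, sim * adj                      # edge features = masked similarities


class GRNNALayer(nn.Module):
    """One round of attention-weighted message passing followed by a GRU node update."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)
        self.gru = nn.GRUCell(dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor, edge_w: torch.Tensor):
        # Attention scores between connected sentence nodes, biased by semantic similarity.
        scores = self.query(h) @ self.key(h).t() / h.size(-1) ** 0.5
        scores = scores + torch.log(edge_w.clamp_min(1e-9))
        scores = scores.masked_fill(adj == 0, float("-inf"))
        attn = torch.softmax(scores, dim=-1)
        attn = torch.nan_to_num(attn)          # isolated nodes receive zero messages
        messages = attn @ self.value(h)        # aggregate information from neighbours
        return self.gru(messages, h)           # recurrent update of node representations


def predict_next_sentence(h: torch.Tensor, last_idx: int) -> int:
    """Score every node against the most recent sentence; return the best candidate."""
    scores = h @ h[last_idx]
    scores[last_idx] = float("-inf")           # exclude the current sentence itself
    return int(scores.argmax())


if __name__ == "__main__":
    torch.manual_seed(0)
    sentences = torch.randn(6, 64)             # stand-in sentence embeddings
    adj, edge_w = build_conversation_graph(sentences)
    layer = GRNNALayer(64)
    h = sentences
    for _ in range(3):                          # multiple message-passing iterations
        h = layer(h, adj, edge_w)
    print("predicted next sentence index:", predict_next_sentence(h, last_idx=5))
```

In this sketch the same layer is applied for several iterations, mirroring the abstract's description of repeated message passing and recurrent updates; a faithful reproduction would follow the layer sizes, similarity measure, and training objective reported in the paper.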

Details

Language:
English
ISSN:
0219-6492
Volume:
23
Issue:
3
Database:
Complementary Index
Journal:
Journal of Information & Knowledge Management
Publication Type:
Academic Journal
Accession Number:
178117032
Full Text:
https://doi.org/10.1142/S0219649224500382