
Hierarchical Encoder-Decoder Summary Model with an Instructor for Long Academic Papers

Authors:
Shasha Li
Jianling Li
Jie Yu
Wuhang Lin
Jun Ma
Source:
ICCCS
Publication Year:
2021
Publisher:
IEEE, 2021.

Abstract

Summarization models, whether extractive or abstractive, have achieved great success recently. For long academic papers, abstractive models with an encoder-decoder architecture rely mainly on the attentional context vector for generation, unlike humans, who first master the salient information of the source text and thus have full control over what to write. Extracted sentences, in contrast, reliably contain correct and salient information that can be used to control the abstraction process. Therefore, based on a hierarchical encoder-decoder architecture designed specifically for academic papers, we propose a summarization model with an Instructor, in essence an encoder that takes the guiding sentences as input to further control the generation process. On the encoder side, the final hidden state from the Instructor is added directly to the basic hierarchical hidden state of the encoder. Experimental results on arXiv/PubMed show that the model with only this encoder improvement already generates better abstracts. On the decoder side, the context vector from the Instructor is integrated with the original discourse-aware context vector for generation. The results show that the Instructor is effective for control, and our model generates more accurate and fluent abstracts with significantly higher ROUGE values.
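The two fusion points described in the abstract can be sketched as follows. This is a minimal illustrative sketch in NumPy: the tensor shapes and the fixed interpolation weight used to combine the two context vectors are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def encoder_side_fusion(enc_hidden, instructor_final):
    """Encoder side: the Instructor's final hidden state is added directly
    to each hierarchical encoder hidden state (broadcast over positions).

    enc_hidden: (seq_len, d) hierarchical encoder states
    instructor_final: (d,) final hidden state of the Instructor encoder
    """
    return enc_hidden + instructor_final

def decoder_side_fusion(ctx_discourse, ctx_instructor, gate=0.5):
    """Decoder side: the Instructor context vector is integrated with the
    discourse-aware context vector before generation. The fixed `gate`
    weight here is a hypothetical stand-in for the paper's combination.
    """
    return gate * ctx_discourse + (1.0 - gate) * ctx_instructor

# Toy usage with random vectors standing in for learned representations.
rng = np.random.default_rng(0)
enc = rng.standard_normal((5, 4))      # 5 positions, hidden size 4
instr = rng.standard_normal(4)
fused_states = encoder_side_fusion(enc, instr)
fused_ctx = decoder_side_fusion(rng.standard_normal(4),
                                rng.standard_normal(4))
```

The key design point the abstract highlights is that both fusions are lightweight: an elementwise addition in the encoder and a combination of context vectors in the decoder, so the Instructor guides generation without replacing the base architecture.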

Details

Database:
OpenAIRE
Journal:
2021 IEEE 6th International Conference on Computer and Communication Systems (ICCCS)
Accession number:
edsair.doi...........31d21a7701a48b349a1183aaa3aa2095