
Table Caption Generation in Scholarly Documents Leveraging Pre-trained Language Models

Authors :
Xu, Junjie H.
Shinden, Kohei
Kato, Makoto P.
Source :
2021 IEEE 10th Global Conference on Consumer Electronics (GCCE 2021)
Publication Year :
2021

Abstract

This paper addresses the problem of generating table captions for scholarly documents, which often requires additional information outside the table. To this end, we propose a method of retrieving relevant sentences from the paper body and feeding the table content, together with the retrieved sentences, into pre-trained language models (e.g., T5 and GPT-2) to generate table captions. The contributions of this paper are: (1) a discussion of the challenges in table captioning for scholarly documents; (2) the development of DocBank-TB, a publicly available dataset; and (3) a comparison of caption generation methods for scholarly documents with different strategies for retrieving relevant sentences from the paper body. Our experimental results showed that T5 is the better generation model for this task, as it outperformed GPT-2 in BLEU and METEOR, implying that the generated text is clearer and more precise. Moreover, inputting relevant sentences that match the row header or the whole table is effective.
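As a rough illustration of the generation step described above, the following is a minimal sketch using Hugging Face Transformers with T5. The "t5-base" checkpoint, the input linearization format, and the example table and retrieved sentence are assumptions for illustration, not the paper's exact setup or data.

```python
# Hedged sketch: feed linearized table content plus retrieved context
# sentences into T5 and decode a candidate caption.
# Assumptions (not from the paper): "t5-base" checkpoint, the prompt
# format, and the hypothetical table/context strings below.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# Table content flattened to text (hypothetical example).
table_text = "Method | Accuracy ; Baseline | 0.72 ; Proposed | 0.81"
# A sentence retrieved from the paper body, e.g. one matching a row header.
retrieved = "The proposed method improves accuracy over the baseline."

# Concatenate the table and the retrieved context as the encoder input.
input_text = f"generate caption: {table_text} context: {retrieved}"
inputs = tokenizer(input_text, return_tensors="pt", truncation=True)

# Generate a candidate caption with beam search.
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
caption = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(caption)
```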

Details

Database :
arXiv
Journal :
2021 IEEE 10th Global Conference on Consumer Electronics (GCCE 2021)
Publication Type :
Report
Accession number :
edsarx.2108.08111
Document Type :
Working Paper