shs-nlp at RadSum23: Domain-Adaptive Pre-training of Instruction-tuned LLMs for Radiology Report Impression Generation
- Source: BioNLP 2023, Co-located with ACL 2023
- Publication Year: 2023
Abstract
- Instruction-tuned generative large language models (LLMs) such as ChatGPT and Bloomz possess excellent generalization abilities, but they face limitations in understanding radiology reports, particularly in generating the IMPRESSIONS section from the FINDINGS section. They tend to produce either verbose or incomplete IMPRESSIONS, mainly due to insufficient exposure to medical text data during training. We present a system that leverages large-scale medical text data for domain-adaptive pre-training of instruction-tuned LLMs, enhancing their medical knowledge and performance on specific medical tasks. We show that this system performs better in a zero-shot setting than a number of pretrain-and-finetune adaptation methods on the IMPRESSIONS generation task, and ranks 1st among participating systems in Task 1B: Radiology Report Summarization at the BioNLP 2023 workshop.
- Comment: 1st Place in Task 1B: Radiology Report Summarization at BioNLP 2023
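- The record contains no code; as a rough illustration of the approach the abstract describes, here is a minimal sketch using the Hugging Face stack: continued causal-language-modeling pre-training of an instruction-tuned model on medical text, followed by zero-shot IMPRESSIONS generation. The model choice (bigscience/bloomz-560m as a small stand-in), corpus file, prompt wording, and hyperparameters are all assumptions, not the authors' actual setup.

```python
# Illustrative sketch only: domain-adaptive (continued) pre-training of an
# instruction-tuned LLM on medical text, then zero-shot IMPRESSIONS generation.
# Model, corpus file, prompt, and hyperparameters are assumptions, not the
# paper's actual configuration.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "bigscience/bloomz-560m"  # small stand-in for the paper's LLMs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Placeholder corpus: one clinical document per line of plain text.
corpus = load_dataset("text", data_files={"train": "medical_corpus.txt"})
tokenized = corpus["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

# Continued pre-training with the standard causal-LM objective (mlm=False).
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dapt-checkpoint",
                           per_device_train_batch_size=2,
                           num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

# Zero-shot use after adaptation: prompt the model to summarize FINDINGS.
findings = "Lungs are clear. No pleural effusion or pneumothorax."
prompt = ("Summarize the radiology FINDINGS into an IMPRESSIONS section.\n"
          f"FINDINGS: {findings}\nIMPRESSIONS:")
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```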
- Subjects: Computer Science - Computation and Language
Details
- Database: arXiv
- Journal: BioNLP 2023, Co-located with ACL 2023
- Publication Type: Report
- Accession Number: edsarx.2306.03264
- Document Type: Working Paper