Igea: a Decoder-Only Language Model for Biomedical Text Generation in Italian
- Authors
- Buonocore, Tommaso Mario; Rancati, Simone; Parimbelli, Enea
- Subjects
- Computer Science - Computation and Language; Computer Science - Artificial Intelligence; I.2.7; J.3
- Abstract
The development of domain-specific language models has significantly advanced natural language processing applications in various specialized fields, particularly in biomedicine. However, the focus has largely been on English-language models, leaving a gap for less-resourced languages such as Italian. This paper introduces Igea, the first decoder-only language model designed explicitly for biomedical text generation in Italian. Built on the Minerva model and continually pretrained on a diverse corpus of Italian medical texts, Igea is available in three model sizes: 350 million, 1 billion, and 3 billion parameters. The models aim to balance computational efficiency and performance, addressing the challenges of managing the peculiarities of medical terminology in Italian. We evaluate Igea using a mix of in-domain biomedical corpora and general-purpose benchmarks, highlighting its efficacy and retention of general knowledge even after the domain-specific training. This paper discusses the model's development and evaluation, providing a foundation for future advancements in Italian biomedical NLP.
- Comment
- 6 pages, 1 figure, 3 tables
- Published
- 2024