
DARWIN Series: Domain Specific Large Language Models for Natural Science

Authors:
Xie, Tong
Wan, Yuwei
Huang, Wei
Yin, Zhenyu
Liu, Yixuan
Wang, Shaozhou
Linghu, Qingyuan
Kit, Chunyu
Grazian, Clara
Zhang, Wenjie
Razzak, Imran
Hoex, Bram
Publication Year:
2023

Abstract

Emerging tools bring fresh approaches to work, and natural science is no exception. Traditional manual, serial, and labour-intensive workflows are being augmented by automated, parallel, and iterative processes driven by artificial-intelligence-based experimental automation. To add new capabilities to natural science and to accelerate and enrich the automation of the discovery process, we present DARWIN, a series of LLMs tailored for natural science, mainly physics, chemistry, and materials science. The series builds on open-source LLMs, incorporating structured and unstructured scientific knowledge from public datasets and the literature. We fine-tuned the models on more than 60,000 instruction data points, emphasizing factual correctness. During fine-tuning, we introduce the Scientific Instruction Generation (SIG) model, which automates instruction generation from scientific texts. This eliminates the need for manual extraction or domain-specific knowledge graphs and efficiently injects scientific knowledge into the model. We also explore multi-task training strategies, revealing interconnections between scientific tasks. The DARWIN series not only achieves state-of-the-art results on various scientific tasks but also reduces reliance on closed-source AI models. Our research demonstrates the capability of LLMs in the scientific domain, with the overarching goal of fostering prosperity within the broader AI-for-science community.
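To make the fine-tuning step concrete, below is a minimal sketch of instruction fine-tuning an open-source causal LM on (instruction, response) pairs of the kind the abstract describes. The base checkpoint, the JSONL file name, the record schema, the prompt template, and the hyperparameters are all illustrative assumptions, not the paper's actual configuration; only the overall recipe (supervised fine-tuning on instruction data with a standard Hugging Face Trainer) follows the abstract.

```python
# Illustrative sketch of instruction fine-tuning; model name, data path,
# schema, and hyperparameters are assumptions, not DARWIN's actual setup.
import json

import torch
from torch.utils.data import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "meta-llama/Llama-2-7b-hf"  # hypothetical open-source base model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
tokenizer.pad_token = tokenizer.eos_token


class InstructionDataset(Dataset):
    """Wraps (instruction, response) pairs as causal-LM training examples."""

    def __init__(self, path: str, max_length: int = 512):
        with open(path) as f:
            # Assumed schema per line: {"instruction": "...", "response": "..."}
            self.records = [json.loads(line) for line in f]
        self.max_length = max_length

    def __len__(self):
        return len(self.records)

    def __getitem__(self, idx):
        rec = self.records[idx]
        # Assumed prompt template; the paper's template may differ.
        text = (
            f"### Instruction:\n{rec['instruction']}\n"
            f"### Response:\n{rec['response']}"
        )
        enc = tokenizer(
            text,
            truncation=True,
            max_length=self.max_length,
            padding="max_length",
            return_tensors="pt",
        )
        input_ids = enc["input_ids"].squeeze(0)
        attention_mask = enc["attention_mask"].squeeze(0)
        # For causal-LM fine-tuning, labels are the input ids themselves;
        # padding positions are set to -100 so they contribute no loss.
        labels = input_ids.clone()
        labels[attention_mask == 0] = -100
        return {
            "input_ids": input_ids,
            "attention_mask": attention_mask,
            "labels": labels,
        }


model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, torch_dtype=torch.bfloat16)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="darwin-sft",          # illustrative output directory
        per_device_train_batch_size=4,
        num_train_epochs=3,
        learning_rate=2e-5,
        logging_steps=50,
    ),
    train_dataset=InstructionDataset("sci_instructions.jsonl"),  # assumed file
)
trainer.train()
```

In the pipeline the abstract outlines, the (instruction, response) pairs in the JSONL file would be produced automatically by the SIG model from scientific texts, rather than written by hand or extracted via a domain-specific knowledge graph.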

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2308.13565
Document Type:
Working Paper