
Arcee's MergeKit: A Toolkit for Merging Large Language Models

Authors:
Goddard, Charles
Siriwardhana, Shamane
Ehghaghi, Malikeh
Meyers, Luke
Karpukhin, Vlad
Benedict, Brian
McQuade, Mark
Solawetz, Jacob
Publication Year:
2024

Abstract

The rapid expansion of the open-source language model landscape presents an opportunity to merge the competencies of these model checkpoints by combining their parameters. Advances in transfer learning, the process of fine-tuning pretrained models for specific tasks, have resulted in the development of a vast number of task-specific models, each typically specialized in an individual task and unable to utilize the others' strengths. Model merging facilitates the creation of multitask models without the need for additional training, offering a promising avenue for enhancing model performance and versatility. By preserving the intrinsic capabilities of the original models, model merging addresses complex challenges in AI, including the difficulties of catastrophic forgetting and multitask learning. To support this expanding area of research, we introduce MergeKit, a comprehensive, open-source library designed to facilitate the application of model merging strategies. MergeKit offers an extensible framework to efficiently merge models on any hardware, providing utility to researchers and practitioners. To date, thousands of models have been merged by the open-source community, leading to the creation of some of the world's most powerful open-source model checkpoints, as assessed by the Open LLM Leaderboard. The library is accessible at https://github.com/arcee-ai/MergeKit.

Comment: 11 pages, 4 figures
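The simplest merging strategy the abstract alludes to, combining model parameters directly, can be illustrated with linear weight interpolation. The sketch below is a minimal, hypothetical illustration of that idea and does not reflect MergeKit's actual API; the function name, the list-based "state dicts", and the `alpha` parameter are all assumptions for demonstration.

```python
def average_merge(state_dict_a, state_dict_b, alpha=0.5):
    """Illustrative linear merge of two same-architecture checkpoints.

    Each parameter is interpolated as alpha * A + (1 - alpha) * B.
    Real checkpoints hold tensors; plain lists stand in for them here.
    (Hypothetical sketch; not MergeKit's API.)
    """
    merged = {}
    for name, w_a in state_dict_a.items():
        w_b = state_dict_b[name]  # assumes identical parameter names/shapes
        merged[name] = [alpha * x + (1 - alpha) * y for x, y in zip(w_a, w_b)]
    return merged

# Toy "checkpoints" with a single two-element weight vector each
ckpt_a = {"w": [1.0, 3.0]}
ckpt_b = {"w": [3.0, 1.0]}
print(average_merge(ckpt_a, ckpt_b)["w"])  # [2.0, 2.0]
```

More sophisticated methods supported by merging toolkits (e.g. task arithmetic, SLERP, TIES) refine this basic idea by operating on task vectors or resolving sign conflicts between parameters, but the core operation remains elementwise combination of checkpoint weights.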

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2403.13257
Document Type:
Working Paper