
OnlySportsLM: Optimizing Sports-Domain Language Models with SOTA Performance under Billion Parameters

Authors :
Chen, Zexin
Li, Chengxi
Xie, Xiangyu
Dube, Parijat
Publication Year :
2024

Abstract

This paper explores the potential of a small, domain-specific language model trained exclusively on sports-related data. We investigate whether extensive training data combined with a specially designed small model structure can overcome model size constraints. The study introduces the OnlySports collection, comprising OnlySportsLM, the OnlySports Dataset, and the OnlySports Benchmark. Our approach involves: 1) creating a massive 600-billion-token OnlySports Dataset from FineWeb, 2) optimizing the RWKV architecture for sports-related tasks, resulting in a 196M-parameter model with a 20-layer, 640-dimension structure, 3) training OnlySportsLM on part of the OnlySports Dataset, and 4) testing the resulting model on the OnlySports Benchmark. OnlySportsLM achieves a 37.62%/34.08% accuracy improvement over previous 135M/360M state-of-the-art models and matches the performance of larger models such as SmolLM 1.7B and Qwen 1.5B in the sports domain. Additionally, the OnlySports collection presents a comprehensive workflow for building high-quality, domain-specific language models, providing a replicable blueprint for efficient AI development across various specialized fields.

Comment: 13 pages, 4 figures, 4 tables
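To give a sense of step 1, carving a domain-specific corpus out of a general web crawl such as FineWeb: the abstract does not describe the actual OnlySports filtering pipeline, so the keyword-density heuristic below is a hypothetical illustrative sketch, not the authors' method. All names (`SPORTS_TERMS`, `sports_score`, `filter_sports`) and the 5% threshold are assumptions made for illustration.

```python
# Hypothetical sketch of a domain-filtering step for building a sports-only
# corpus from general web text. A real pipeline would likely use a trained
# classifier; this keyword-density heuristic is only an illustrative stand-in.

SPORTS_TERMS = {
    "football", "basketball", "tennis", "olympics", "goal",
    "tournament", "league", "championship", "athlete", "match",
}

def sports_score(text: str) -> float:
    """Return the fraction of tokens that are sports-related keywords."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t.strip(".,!?") in SPORTS_TERMS)
    return hits / len(tokens)

def filter_sports(docs: list[str], threshold: float = 0.05) -> list[str]:
    """Keep documents whose sports-keyword density meets the threshold."""
    return [d for d in docs if sports_score(d) >= threshold]

docs = [
    "The league championship match ended with a late goal.",
    "Quarterly earnings rose on strong cloud revenue.",
]
kept = filter_sports(docs)  # keeps only the sports document
```

In practice, corpus-scale filtering of this kind is typically run distributed over web-crawl shards, with the threshold tuned against a labeled validation sample.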

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2409.00286
Document Type :
Working Paper