
LLMzSzŁ: a comprehensive LLM benchmark for Polish

Authors :
Jassem, Krzysztof
Ciesiółka, Michał
Graliński, Filip
Jabłoński, Piotr
Pokrywka, Jakub
Kubis, Marek
Jabłońska, Monika
Staruch, Ryszard
Publication Year :
2025

Abstract

This article introduces LLMzSzŁ (LLMs Behind the School Desk), the first comprehensive benchmark of this scale for the Polish language. It is based on a coherent collection of Polish national exams, including both academic and professional tests extracted from the archives of the Polish Central Examination Board. The benchmark covers 4 types of exams from 154 domains and consists of almost 19k closed-ended questions. We investigate the performance of open-source multilingual, English, and Polish LLMs to verify their ability to transfer knowledge between languages. We also examine the correlation between LLM and human results at the level of model accuracy and exam pass rates. We show that multilingual LLMs can obtain superior results over monolingual ones; however, monolingual models may be beneficial when model size matters. Our analysis highlights the potential of LLMs in assisting with exam validation, particularly in identifying anomalies or errors in examination tasks.
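As a rough illustration of the evaluation setting described in the abstract, the sketch below scores a model on closed-ended (multiple-choice) questions grouped by exam and then correlates per-exam model accuracy with human pass rates. It is a minimal sketch under stated assumptions, not the paper's released evaluation code: the record fields (exam_id, gold, prediction), the toy exam names, and the human_pass_rate values are hypothetical and introduced here only for illustration.

# Minimal sketch: per-exam accuracy on closed-ended questions and its
# correlation with human pass rates. All field names and data are
# hypothetical; this does not reproduce the paper's actual pipeline.
from collections import defaultdict
from statistics import mean
import math

def exam_accuracies(records):
    """Group question records by exam and compute model accuracy per exam."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["exam_id"]] += 1
        if r["prediction"] == r["gold"]:
            correct[r["exam_id"]] += 1
    return {exam: correct[exam] / total[exam] for exam in total}

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data (hypothetical exams and answers, shortened for brevity).
records = [
    {"exam_id": "matura_math_2022", "gold": "B", "prediction": "B"},
    {"exam_id": "matura_math_2022", "gold": "C", "prediction": "A"},
    {"exam_id": "prof_nursing_2021", "gold": "D", "prediction": "D"},
    {"exam_id": "eighth_grade_polish_2023", "gold": "A", "prediction": "A"},
    {"exam_id": "eighth_grade_polish_2023", "gold": "B", "prediction": "C"},
]
acc = exam_accuracies(records)

# Hypothetical human pass rates per exam, used to mirror the accuracy vs.
# pass-rate comparison mentioned in the abstract.
human_pass_rate = {
    "matura_math_2022": 0.58,
    "prof_nursing_2021": 0.74,
    "eighth_grade_polish_2023": 0.66,
}
exams = sorted(acc)
r = pearson([acc[e] for e in exams], [human_pass_rate[e] for e in exams])
print({e: round(acc[e], 2) for e in exams})
print("correlation with human pass rate:", round(r, 3))

In practice one would aggregate over all ~19k questions and the 154 domains and could swap the hand-rolled pearson for a library routine; the sketch only shows the shape of the comparison, not the paper's actual numbers.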

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2501.02266
Document Type :
Working Paper