
The NLBSE'23 Tool Competition

Authors :
Kallis, Rafael (author)
Izadi, M. (author)
Pascarella, Luca (author)
Chaparro, Oscar (author)
Rani, Pooja (author)
Publication Year :
2023

Abstract

We report on the organization and results of the second edition of the tool competition from the International Workshop on Natural Language-based Software Engineering (NLBSE'23). As in the prior edition, we organized the competition on automated issue report classification, this time with a larger dataset. This year, we also featured an additional competition on automated code comment classification. In this edition of the tool competition, five teams submitted multiple classification models to automatically classify issue reports and code comments. The submitted models were fine-tuned and evaluated on benchmark datasets of 1.4 million issue reports and 6.7 thousand code comments, respectively. The goal of the competition was to improve the classification performance of the baseline models that we provided. This paper reports the details of the competition, including the rules, the participating teams and their models, and the ranking of the models based on their average classification performance across issue report and code comment types.

Green Open Access added to TU Delft Institutional Repository as part of the Taverne project ('You share, we take care!'), https://www.openaccess.nl/en/you-share-we-take-care. Otherwise, as indicated in the copyright section, the publisher is the copyright holder of this work and the author uses the Dutch legislation to make this work public.

Software Engineering

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1427490567
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.1109/NLBSE59153.2023.00007