
Distilled GPT for source code summarization

Authors :
Su, Chia-Yi
McMillan, Collin
Source :
Automated Software Engineering; May 2024, Vol. 31 Issue: 1
Publication Year :
2024

Abstract

A code summary is a brief natural language description of source code. Summaries are usually only a single sentence long, and yet form the backbone of developer documentation. A short description such as “changes all visible polygons to the color blue” can give a programmer a high-level idea of what code does without the effort of reading the code itself. Recently, products based on Large Language Models such as ChatGPT have demonstrated a strong ability to write these descriptions automatically. However, to use these tools, programmers must send their code to untrusted third parties for processing (e.g., via an API call). This loss of custody is not acceptable to many organizations. In this paper, we present an alternative: we train an open source model using sample output generated by GPT-3.5 in a process related to knowledge distillation. Our model is small enough (350M parameters) to be run on a single 16 GB GPU, yet we show in our evaluation that it is large enough to mimic GPT-3.5 on this task.
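
To make the described workflow concrete, below is a minimal sketch of distillation-style fine-tuning: a small open-source causal language model is trained on (code, summary) pairs in which the summaries were produced by a larger teacher model such as GPT-3.5. The student model name, prompt format, and training data here are illustrative assumptions, not the authors' actual configuration.

# Minimal sketch: fine-tune a ~350M-parameter student model on
# teacher-generated code summaries. Model name, prompt format, and
# example data are placeholder assumptions.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

MODEL_NAME = "Salesforce/codegen-350M-mono"  # assumed ~350M-parameter student

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Teacher-generated training pairs: source code plus the GPT-3.5 summary.
pairs = [
    {"code": "def area(r): return 3.14159 * r * r",
     "summary": "Computes the area of a circle given its radius."},
]

def to_features(example):
    # Concatenate the code and the teacher's summary into one training sequence.
    text = f"{example['code']}\n# Summary: {example['summary']}{tokenizer.eos_token}"
    return tokenizer(text, truncation=True, max_length=512)

dataset = Dataset.from_list(pairs).map(to_features, remove_columns=["code", "summary"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="distilled-summarizer",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

In this setup the student never sees the teacher's weights or logits; it only imitates the teacher's text output, which is why the paper describes the process as related to, rather than classical, knowledge distillation.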

Details

Language :
English
ISSN :
0928-8910 and 1573-7535
Volume :
31
Issue :
1
Database :
Supplemental Index
Journal :
Automated Software Engineering
Publication Type :
Periodical
Accession number :
ejs65657331
Full Text :
https://doi.org/10.1007/s10515-024-00421-4