
ComFormer: Code Comment Generation via Transformer and Fusion Method-based Hybrid Code Representation

Authors:
Yang, Guang
Chen, Xiang
Cao, Jinxin
Xu, Shuyuan
Cui, Zhanqi
Yu, Chi
Liu, Ke
Publication Year:
2021

Abstract

Developers often write low-quality code comments due to a lack of programming experience, which can reduce the efficiency of developers' program comprehension. Therefore, developers need code comment generation tools that can illustrate the functionality and purpose of the code. Recently, researchers have mainly modeled this problem as a neural machine translation problem and tended to use deep learning-based methods. In this study, we propose a novel method, ComFormer, based on the Transformer and a fusion method-based hybrid code representation. Moreover, to alleviate the OOV (out-of-vocabulary) problem and speed up model training, we further utilize the Byte-BPE algorithm to split identifiers and the Sim_SBT method to perform AST traversal. We compare ComFormer with seven state-of-the-art baselines from the code comment generation and neural machine translation domains. Comparison results show the competitiveness of ComFormer in terms of three performance measures. Moreover, we perform a human study to verify that ComFormer can generate high-quality comments.

Comment: DSA2021
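
To illustrate the identifier-splitting step mentioned in the abstract, the following is a minimal sketch (not the authors' implementation) of byte-level BPE tokenization of code identifiers using the Hugging Face tokenizers library; the toy corpus, vocabulary size, and example identifier are assumptions for illustration only.

```python
# Minimal sketch: byte-level BPE subword splitting of code identifiers,
# which helps alleviate the OOV problem by representing unseen identifiers
# as sequences of learned subword units instead of a single unknown token.
from tokenizers import ByteLevelBPETokenizer

# Hypothetical training corpus; in practice this would be the source-code
# side of the comment-generation training set.
corpus = [
    "def get_user_name(user_id): return db.lookup(user_id)",
    "def compute_max_value(values): return max(values)",
]

tokenizer = ByteLevelBPETokenizer()
tokenizer.train_from_iterator(corpus, vocab_size=1000, min_frequency=1)

# An identifier not seen verbatim at training time is still encoded as
# a sequence of byte-level subword tokens rather than mapped to OOV.
print(tokenizer.encode("compute_min_value").tokens)
```
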

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2107.03644
Document Type:
Working Paper