
On Leveraging Tests to Infer Nullable Annotations

Authors :
Dietrich, Jens
Pearce, David J.
Chandramohan, Mahin
Publication Year :
2023
Publisher :
Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2023.

Abstract

Issues related to the dereferencing of null pointers are a pervasive and widely studied problem, and numerous static analyses have been proposed to detect them. These are typically based on dataflow analysis and take advantage of annotations indicating whether a type is nullable or not. The presence of such annotations can significantly improve the accuracy of null checkers. However, most code found in the wild is not annotated, and tools must fall back on default assumptions, leading to both false positives and false negatives. Manually annotating code is a laborious task that requires deep knowledge of how a program interacts with clients and components. We propose to infer nullable annotations from an analysis of existing test cases. For this purpose, we execute instrumented tests and capture nullable API interactions. Those recorded interactions are then refined (sanitised and propagated) in order to improve their precision and recall. We evaluate our approach on seven projects from the Spring ecosystem and two Google projects which have been extensively manually annotated with thousands of @Nullable annotations. We find that our approach has high precision and can find around half of the existing @Nullable annotations. This suggests that the proposed method can mechanise a significant part of this very labour-intensive annotation task.

LIPIcs, Vol. 263, 37th European Conference on Object-Oriented Programming (ECOOP 2023), pages 10:1-10:25
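To make the capture step concrete, the following is a minimal, purely illustrative Java sketch of recording nullable API interactions during a test run. It is not the authors' tooling: the NullnessRecorder class, its record/report methods, and the element-naming scheme are assumptions made here, and in a real pipeline the probes would be inserted by bytecode instrumentation rather than written by hand.

    import java.util.HashMap;
    import java.util.Map;

    // Minimal sketch (NOT the paper's implementation) of capturing nullable
    // API interactions during a test run. An instrumenter would insert the
    // record(...) probes automatically; here they are written by hand.
    public class NullnessRecorder {

        // Maps an API element (e.g. "lookup#return", "lookup#arg1")
        // to whether a null value was ever observed flowing through it.
        private static final Map<String, Boolean> observedNull = new HashMap<>();

        // Record one observed value for the given API element.
        static <T> T record(String element, T value) {
            observedNull.merge(element, value == null, Boolean::logicalOr);
            return value;
        }

        // After the test run, emit candidate @Nullable annotation sites.
        static void report() {
            observedNull.forEach((element, sawNull) -> {
                if (sawNull) {
                    System.out.println("candidate @Nullable: " + element);
                }
            });
        }

        // Example "instrumented" method: probes wrap argument and return value.
        static String lookup(Map<String, String> table, String key) {
            record("lookup#arg1", key);
            return record("lookup#return", table.get(key)); // null if absent
        }

        public static void main(String[] args) {
            Map<String, String> table = new HashMap<>();
            table.put("a", "1");
            lookup(table, "a"); // non-null return observed
            lookup(table, "b"); // null return observed -> candidate @Nullable
            report();
        }
    }

Raw observations like these would then go through the refinement steps the abstract mentions (sanitisation and propagation) before being reported as @Nullable candidates; the specific refinement rules are described in the paper itself, not in this sketch.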

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.doi...........0c5900c2c7fbde1a588c68c0b112641e
Full Text :
https://doi.org/10.4230/lipics.ecoop.2023.10