IEEE Access (Jan 2024)

Comprehensive Evaluation of Static Analysis Tools for Their Performance in Finding Vulnerabilities in Java Code

  • Midya Alqaradaghi,
  • Tamas Kozsik

DOI
https://doi.org/10.1109/ACCESS.2024.3389955
Journal volume & issue
Vol. 12
pp. 55824 – 55842

Abstract

Various static code analysis tools have been designed to automatically detect software faults and security vulnerabilities. This paper aims to 1) conduct an empirical evaluation of the performance of five free, state-of-the-art static analysis tools in detecting Java security vulnerabilities, using a well-defined and repeatable approach; and 2) report which vulnerabilities are best and worst detected by static Java analyzers. We used the Juliet benchmark test suite in a controlled experiment to assess the effectiveness of five widely used Java static analysis tools. The detected vulnerabilities were each found by one, two, or three tools; only one vulnerability was detected by four tools. The tools missed 13% of the Java vulnerability categories appearing in our experiment. More critically, none of the five tools could identify all the vulnerabilities in our experiment. We conclude that, despite recent improvements in their methodologies, current state-of-the-art static analysis tools still fall short of identifying the security vulnerabilities present even in a small-scale, artificial test suite.
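To illustrate the kind of artificial test case the Juliet suite contains, the sketch below follows Juliet's convention of pairing a flawed "bad" variant with a corrected "good" variant. It is a hypothetical example modeled on CWE-476 (NULL Pointer Dereference); the class and method names are illustrative and not taken from the paper or the suite itself.

```java
// Hypothetical Juliet-style test case, modeled on CWE-476 (NULL Pointer
// Dereference). Static analyzers are expected to flag bad() and stay
// silent on good().
public class CWE476_Example {

    // "bad" variant: data may become null, then is dereferenced unchecked.
    static int bad(String s) {
        String data = (s != null && s.isEmpty()) ? null : s;
        return data.length(); // potential NullPointerException when s == ""
    }

    // "good" variant: the same flow, but with a null check before use.
    static int good(String s) {
        String data = (s != null && s.isEmpty()) ? null : s;
        return data == null ? 0 : data.length();
    }

    public static void main(String[] args) {
        System.out.println(good(""));    // null path handled safely
        System.out.println(good("abc")); // ordinary path
    }
}
```

A tool's true-positive rate on such a suite is then the fraction of "bad" variants it reports, while reports on "good" variants count as false positives.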

Keywords