PLoS ONE (Mar 2021)

Attention based GRU-LSTM for software defect prediction.

  • Hafiz Shahbaz Munir,
  • Shengbing Ren,
  • Mubashar Mustafa,
  • Chaudry Naeem Siddique,
  • Shazib Qayyum

DOI
https://doi.org/10.1371/journal.pone.0247444
Journal volume & issue
Vol. 16, no. 3
p. e0247444

Abstract


Software defect prediction (SDP) can be used to produce reliable, high-quality software. Current SDP is practiced at coarse program granularities (such as the file, class, or function level), which cannot accurately pinpoint failures. To solve this problem, we propose a new framework called DP-AGL, which uses an attention-based GRU-LSTM for statement-level defect prediction. Using clang to build an abstract syntax tree (AST), we define a set of 32 statement-level metrics. We label each statement, construct a three-dimensional vector from it, and feed it to an automatic learning model that combines a gated recurrent unit (GRU) with a long short-term memory (LSTM). In addition, an attention mechanism is used to generate important features and improve accuracy. To validate our approach, we selected 119,989 C/C++ programs from Code4Bench; the benchmark covers a variety of programs and variant sets written by thousands of programmers. Compared with the state of the art, the recall, precision, accuracy, and F1 measure of our trained DP-AGL under usual conditions improved by 1%, 4%, 5%, and 2%, respectively.
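The abstract describes the architecture only at a high level. As a rough illustration of how a statement-level classifier combining a GRU, a stacked LSTM, and an attention layer could be wired together, a minimal PyTorch sketch follows; the hidden size, stacking order, and sequence length are assumptions for illustration rather than the authors' published configuration (only the 32 statement-level metrics per input vector come from the paper).

import torch
import torch.nn as nn

class DPAGLSketch(nn.Module):
    # Hypothetical attention-based GRU-LSTM classifier; illustrative only.
    def __init__(self, n_metrics=32, hidden=64):
        super().__init__()
        self.gru = nn.GRU(n_metrics, hidden, batch_first=True)   # GRU stage
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)    # stacked LSTM stage
        self.attn = nn.Linear(hidden, 1)                         # additive attention scores
        self.out = nn.Linear(hidden, 1)                          # defect-probability head

    def forward(self, x):                 # x: (batch, seq_len, n_metrics)
        h, _ = self.gru(x)                # encode the statement sequence
        h, _ = self.lstm(h)               # refine with the stacked LSTM
        w = torch.softmax(self.attn(h), dim=1)    # per-statement attention weights
        context = (w * h).sum(dim=1)      # attention-weighted summary vector
        return torch.sigmoid(self.out(context))  # P(defective)

# Example: score a batch of 8 sequences of 50 statements, 32 metrics each.
model = DPAGLSketch()
probs = model(torch.randn(8, 50, 32))     # shape: (8, 1)

The attention layer here scores every statement in the sequence and softmax-normalizes the scores, so the classifier's summary vector emphasizes the statements most indicative of a defect, which is one common way to realize the "important features" the abstract attributes to the attention mechanism.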