IEEE Access (Jan 2022)

Advanced Crowdsourced Test Report Prioritization Based on Adaptive Strategy

  • Penghua Zhu,
  • Ying Li,
  • Tongyu Li,
  • Huimin Ren,
  • Xiaolei Sun

DOI
https://doi.org/10.1109/ACCESS.2022.3176086
Journal volume & issue
Vol. 10
pp. 53522 – 53532

Abstract


Crowdsourced testing is an emerging trend in software testing that takes advantage of the efficiency of crowdsourcing and cloud platforms, and it has gradually been applied in many fields. In crowdsourced software testing, after crowd workers complete their test tasks, they submit the results as test reports. Inspecting the resulting large number of test reports is therefore an arduous but unavoidable software maintenance task. Because crowdsourced test reports are numerous and complex, they need to be prioritized to improve inspection efficiency, yet there is no systematic method for crowdsourced test report prioritization. In regression testing, however, test case prioritization techniques have matured. We therefore migrate test case prioritization methods to crowdsourced test report prioritization and evaluate their effectiveness. We process the text of the test reports with natural language processing and word segmentation, and then prioritize the reports with four methods: the total greedy algorithm, the additional greedy algorithm, a genetic algorithm, and adaptive random testing (ART). The results show that all of these methods perform well in prioritizing crowdsourced test reports, with an average APFD above 0.8.
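To make the abstract's pipeline concrete, the following is a minimal, hypothetical Python sketch: each report is reduced to a set of keywords (standing in for the NLP and word-segmentation step), the reports are ordered with an additional-greedy strategy, and the resulting order is scored with the standard APFD formula. The data, names, and helper functions here are illustrative assumptions, not the authors' implementation.

def additional_greedy(reports):
    """Order reports so each pick covers the most keywords not yet covered."""
    remaining = dict(reports)          # report id -> set of keywords
    covered, order = set(), []
    while remaining:
        best = max(remaining, key=lambda r: len(remaining[r] - covered))
        if not (remaining[best] - covered) and covered:
            covered = set()            # nothing new: reset coverage and re-score
            continue
        order.append(best)
        covered |= remaining.pop(best)
    return order

def apfd(order, faults):
    """APFD = 1 - (sum of first-detection positions)/(n*m) + 1/(2n)."""
    n, m = len(order), len(faults)
    pos = {r: i + 1 for i, r in enumerate(order)}        # 1-based positions
    tf = [min(pos[r] for r in detecting) for detecting in faults.values()]
    return 1 - sum(tf) / (n * m) + 1 / (2 * n)

# Toy example: three reports described by keywords, two underlying faults.
reports = {"r1": {"crash", "login"}, "r2": {"login"}, "r3": {"crash", "ui", "layout"}}
faults = {"f1": {"r1", "r3"}, "f2": {"r2"}}              # fault -> reports that reveal it
order = additional_greedy(reports)
print(order, round(apfd(order, faults), 3))              # e.g. ['r3', 'r1', 'r2'] 0.5

The total greedy, genetic-algorithm, and ART strategies differ only in how the ordering step is computed; the APFD evaluation is the same.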

Keywords