IEEE Access (Jan 2024)

A Comparative Study of Automated Refactoring Tools

  • Maha Alharbi,
  • Mohammad Alshayeb

DOI
https://doi.org/10.1109/ACCESS.2024.3361314
Journal volume & issue
Vol. 12
pp. 18764 – 18781

Abstract


Researchers have proposed several refactoring approaches supported by automated and semi-automated refactoring tools. However, the abundance of automated refactoring tools makes it difficult for developers to decide which one best fits their needs. Moreover, the performance of existing refactoring tools has not been empirically evaluated against other available tools that target the same refactoring opportunities. Therefore, the objective of this research is to conduct a comparative study that systematically compares and evaluates refactoring tools belonging to different categories of refactoring approaches. To this end, we propose an evaluation framework based on the DESMET methodology. The framework is used to empirically compare and evaluate four refactoring tools, namely MultiRefactor, JDeodorant, jSparrow, and Spartanizer, using five open-source projects. The evaluation results show that jSparrow outperforms the other investigated tools by supporting the highest number of quantitative and qualitative features, suggesting that it is the best choice from multiple perspectives. In contrast, Spartanizer demonstrated the least favorable outcomes in terms of both quantitative and qualitative features, including introducing new code smells after applying a refactoring opportunity. The findings of this comparative study will assist developers in understanding the characteristics and capabilities of the studied refactoring tools. They will also help researchers focus their efforts on addressing the identified limitations to improve the smell detection and refactoring process.

Keywords