IEEE Access (Jan 2024)
Siamese Neural Networks Method for Semantic Requirements Similarity Detection
Abstract
Detecting semantic similarity between textual requirements is a crucial task for many natural language processing (NLP)-based requirements engineering (RE) applications. It is also challenging because requirements are written in natural language (NL), embed domain knowledge, and often follow pre-defined templates that contain duplicated words. Recently, deep neural networks (DNNs) have shown promising results in measuring semantic similarity between texts. Siamese neural networks (SNNs), a class of DNNs, are widely used to measure similarity between various data types, and they are largely independent of language and domain. Nevertheless, SNNs have seen limited use in measuring semantic requirements similarity (SRS). In this paper, a novel metric-based learning method using SNNs is proposed that combines a sentence Transformer language model (LLM) and long short-term memory (LSTM) networks with a backward network layer to measure semantic similarity between pairs of requirements. The proposed method is evaluated on an annotated SRS dataset built from public datasets (i.e., PROMISE and PURE) and compared with state-of-the-art methods (i.e., fine-tuning and zero-shot methods) using the accuracy, precision, recall, and F1-score classification metrics. The results show that the proposed method achieves an accuracy of 95.42% and an F1-score of 95.71%, outperforming the state-of-the-art methods.
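The defining property of a Siamese architecture referenced above is that both inputs pass through an encoder with shared weights, and similarity is computed between the two resulting embeddings. The following is a minimal illustrative sketch of that idea only, not the authors' model: it uses a toy bag-of-words encoder with a random shared projection and cosine similarity (all vocabulary, weights, and example requirements are hypothetical assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary (hypothetical; the paper uses a sentence Transformer instead).
VOCAB = {w: i for i, w in enumerate(
    "the system shall allow users to log in export data reports".split())}

def bow(text):
    """Bag-of-words vector over the toy vocabulary."""
    v = np.zeros(len(VOCAB))
    for w in text.lower().split():
        if w in VOCAB:
            v[VOCAB[w]] += 1.0
    return v

# Shared encoder weights: in a Siamese network, BOTH inputs are
# processed by the SAME parameters.
W = rng.normal(size=(len(VOCAB), 8))

def encode(text):
    return bow(text) @ W

def similarity(req_a, req_b):
    """Cosine similarity between the shared-encoder embeddings."""
    a, b = encode(req_a), encode(req_b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

s_close = similarity("The system shall allow users to log in",
                     "The system shall allow users to log in")
s_far = similarity("The system shall allow users to log in",
                   "export data reports")
print(s_close, s_far)
```

In the paper's method, the random projection would be replaced by the trained sentence-Transformer-plus-LSTM encoder, but the shared-weights structure and pairwise similarity score are the same in spirit.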
Keywords