Applied Sciences (Sep 2024)

Residual Dense Optimization-Based Multi-Attention Transformer to Detect Network Intrusion against Cyber Attacks

  • Majid H. Alsulami

DOI
https://doi.org/10.3390/app14177763
Journal volume & issue
Vol. 14, no. 17
p. 7763

Abstract

Achieving cyber-security has become increasingly challenging because of the growing reliance on internet connectivity and the rapid expansion of software-based applications, which demand a robust defense system against a wide range of cyber-attacks. There is therefore a need for a method that can detect and classify cyber-attacks. The developed model comprises three phases: pre-processing, feature selection, and classification. First, min-max normalization was applied to the original data to eliminate the influence of extreme maximum or minimum values on the overall feature characteristics. Next, the synthetic minority oversampling technique (SMOTE) was applied to mitigate the imbalance of the minority attack classes. The significant features were then selected using a Hybrid Genetic Fire Hawk Optimizer (HGFHO), and an optimized residual dense-assisted multi-attention transformer (Op-ReDMAT) model was introduced to classify the selected features accurately. The proposed model's performance was evaluated on the UNSW-NB15 and CICIDS2017 datasets, and a performance analysis was carried out to demonstrate its effectiveness. The experimental results show that, on the UNSW-NB15 dataset, the model attained a precision of 97.2%, an accuracy of 98.82%, an F1-score of 97.8%, a recall of 98.5%, and an error rate of 2.58. On the CICIDS2017 dataset, it achieved a precision of 98.6%, an accuracy of 99.12%, an F1-score of 98.8%, and a recall of 98.2%.
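
The pre-processing phase described in the abstract (min-max normalization followed by SMOTE oversampling) can be illustrated with the short Python sketch below. This is a minimal example using scikit-learn and imbalanced-learn, not the authors' implementation; the CSV file name and the 'label' column are hypothetical placeholders.

```python
# Minimal sketch of the pre-processing phase: min-max normalization followed by
# SMOTE oversampling of minority attack classes. The dataset path and column
# names are hypothetical placeholders, not the paper's actual setup.
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from imblearn.over_sampling import SMOTE

# Hypothetical: a flattened UNSW-NB15-style CSV with a 'label' column.
df = pd.read_csv("unsw_nb15_train.csv")
X = df.drop(columns=["label"])
y = df["label"]

# Min-max normalization: x' = (x - min) / (max - min), scaling every feature
# to [0, 1] so extreme values do not dominate the feature characteristics.
scaler = MinMaxScaler()
X_scaled = scaler.fit_transform(X)

# SMOTE synthesizes new samples for under-represented attack classes,
# mitigating class imbalance before feature selection and classification.
smote = SMOTE(random_state=42)
X_balanced, y_balanced = smote.fit_resample(X_scaled, y)

print(f"Samples before SMOTE: {len(y)}, after: {len(y_balanced)}")
```

The balanced feature matrix would then be passed to the feature-selection stage (HGFHO in the paper) and finally to the Op-ReDMAT classifier.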

Keywords