IEEE Access (Jan 2024)
Empowering Network Security: BERT Transformer Learning Approach and MLP for Intrusion Detection in Imbalanced Network Traffic
Abstract
Intrusion detection systems (IDS) stand as formidable guardians in network security, playing a pivotal role in identifying and mitigating potential threats. As the digital landscape evolves, the need for robust intrusion detection mechanisms grows rapidly. IDS are essential to preserving the integrity, confidentiality, and availability of network resources. In the dynamic realm of evolving cyber threats, IDS act as the frontline defender, constantly monitoring network traffic to pinpoint suspicious activities and preemptively mitigate security breaches. In this paper, we study the effectiveness of combining the transformer-based Bidirectional Encoder Representations from Transformers (BERT) model with the Synthetic Minority Over-sampling Technique (SMOTE) and a Multi-Layer Perceptron (MLP) to enhance classification in Machine Learning (ML) based IDS. We tested our approach on well-known and recent datasets, demonstrating that it achieves high accuracy and robust performance in detecting various attack types even when the datasets are affected by class imbalance. Beyond these results, our research introduces a novel perspective by integrating the interpretability and context awareness of BERT with the efficient classification of an MLP. This approach holds promise for advancing intrusion detection capabilities and contributing to the broader cybersecurity community.
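The abstract's mitigation for class imbalance is SMOTE, which synthesizes new minority-class samples by interpolating between a minority point and one of its nearest minority-class neighbours. As an illustration of the idea (a minimal NumPy sketch, not the paper's pipeline and not the full `imbalanced-learn` implementation; the BERT feature extraction and MLP classifier are assumed to sit around it):

```python
import numpy as np

def smote_oversample(X, y, minority_label, k=5, seed=0):
    """Minimal SMOTE sketch: balance the classes by creating
    synthetic minority samples along the line segment between a
    random minority point and one of its k nearest minority
    neighbours (illustrative only)."""
    rng = np.random.default_rng(seed)
    X_min = X[y == minority_label]
    n_needed = int((y != minority_label).sum() - len(X_min))
    # Pairwise distances within the minority class; exclude self-matches.
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    neighbours = np.argsort(d, axis=1)[:, :k]
    synthetic = []
    for _ in range(n_needed):
        i = rng.integers(len(X_min))
        j = neighbours[i, rng.integers(min(k, len(X_min) - 1))]
        lam = rng.random()  # interpolation coefficient in [0, 1)
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    X_new = np.vstack([X, np.array(synthetic)])
    y_new = np.concatenate([y, np.full(n_needed, minority_label)])
    return X_new, y_new
```

In a pipeline like the one the abstract describes, such oversampling would be applied to the feature vectors (here, BERT-derived representations) of the training split only, before fitting the MLP, so that no synthetic points leak into evaluation.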
Keywords