IEEE Access (Jan 2024)

Attention-Based Feature Fusion With External Attention Transformers for Breast Cancer Histopathology Analysis

  • K. Vanitha,
  • A. Manimaran,
  • K. Chokkanathan,
  • K. Anitha,
  • T. R. Mahesh,
  • V. Vinoth Kumar,
  • G. N. Vivekananda

DOI
https://doi.org/10.1109/ACCESS.2024.3443126
Journal volume & issue
Vol. 12
pp. 126296 – 126312

Abstract

Breast cancer, a common malignancy affecting women worldwide, involves the uncontrolled growth of breast cells. Timely identification and accurate classification of breast cancer into non-cancerous (benign) and cancerous (malignant) categories are crucial for effective treatment planning and improved patient outcomes. Conventional diagnosis depends on histopathological examination of breast tissue samples, a process that can be subjective and time-consuming. The problem addressed here is the development of a computational model that automatically classifies histopathology images as benign or malignant, overcoming the limitations of manual diagnosis. Existing methodologies leverage various machine learning and deep learning techniques, with Convolutional Neural Networks (CNNs) prominently utilized for their effectiveness in image recognition tasks. However, these methods often require substantial computational resources and can suffer from overfitting due to their complex architectures. The objective of this study is to introduce an External Attention Transformer (EAT) model that utilizes external attention mechanisms, providing an efficient approach to breast cancer image classification. This model aims to achieve high accuracy while maintaining computational efficiency. The primary metrics used to assess the model's performance are precision, recall, F1-score, and overall accuracy. The EAT model demonstrated outstanding performance, achieving an accuracy of 99% on the BreaKHis dataset, indicating its potential as a reliable tool for breast cancer classification.
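To make the core idea concrete: unlike self-attention, which compares every token against every other token (quadratic in the number of tokens), external attention compares tokens against a small learnable external memory shared across all inputs, giving linear complexity. Below is a minimal NumPy sketch of this operation as generally formulated in the external-attention literature; the array names, sizes, and random weights are illustrative only and are not taken from the paper.

```python
import numpy as np

def external_attention(x, M_k, M_v):
    """Sketch of external attention over a learnable external memory.

    x   : (N, d) token features (e.g. image-patch embeddings)
    M_k : (S, d) external key memory, shared across all inputs
    M_v : (S, d) external value memory
    Returns (N, d) attended features; cost is O(N * S), linear in N.
    """
    logits = x @ M_k.T                                   # (N, S) token-memory similarity
    # Double normalization: softmax over tokens (axis 0) ...
    e = np.exp(logits - logits.max(axis=0, keepdims=True))
    attn = e / e.sum(axis=0, keepdims=True)
    # ... then L1-normalize over memory slots (axis 1).
    attn = attn / (attn.sum(axis=1, keepdims=True) + 1e-9)
    return attn @ M_v                                    # (N, d) aggregated memory values

rng = np.random.default_rng(0)
N, d, S = 16, 32, 8                 # tokens, feature dim, memory slots (illustrative)
x = rng.standard_normal((N, d))
M_k = rng.standard_normal((S, d))
M_v = rng.standard_normal((S, d))
out = external_attention(x, M_k, M_v)
print(out.shape)  # (16, 32)
```

Because `M_k` and `M_v` are learned parameters independent of the input, the memory can capture dataset-level correlations across samples, which is one reason such models can stay lightweight relative to full self-attention.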

Keywords