IEEE Access (Jan 2024)
A Lightweight Deep Convolutional Neural Network Implemented on FPGA and Android Devices for Detection of Breast Cancer Using Ultrasound Images
Abstract
Breast cancer (BC) remains a leading cause of mortality among women worldwide. Early, automated detection of the disease plays a significant role in clinical practice, enabling better diagnosis and improved survival rates. Ultrasonography is a widely used non-invasive imaging modality for diagnosing BC. Traditional ultrasound-based automated diagnostic systems for detecting BC rely on cloud-based processing, which suffers from high latency, requires constant internet connectivity, and raises privacy concerns about patients' ultrasound image data. This paper proposes a lightweight deep convolutional neural network (LWDCNN) implemented on edge devices (a field-programmable gate array (FPGA) and Android devices) for real-time detection of BC using breast ultrasound (BUS) images. The FPGA is used for real-time processing of BUS images because of its parallel-processing capability, low latency, and low power consumption. The Android device, in turn, provides a portable, user-friendly platform for processing BUS images for automated BC detection. The proposed LWDCNN is trained on a Google Cloud CPU-based framework to obtain an optimized model with weight and bias parameters. Compression methods (pruning and a fixed-point representation of the weight values) are applied to the LWDCNN model, and inference is performed on Android and PYNQ-Z2 FPGA-based edge devices to detect malignant tumors in BUS images. The proposed approach is evaluated using BUS images from the Kaggle and breast ultrasound imaging databases. The experimental results reveal that the LWDCNN achieved average accuracies of 94.15% on the Google Cloud CPU and 93.16% on the FPGA-based edge device for detecting malignant tumors in the inference phase under hold-out validation. Similarly, post-training quantization of the LWDCNN to 8-bit integers (INT8), implemented on an Android device, yielded an accuracy of 93.76% for detecting malignant tumors from BUS images.
The presented deep-learning approach demonstrates higher classification accuracy than several transfer-learning techniques and existing methods evaluated on BUS images from the same databases.
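The abstract names two compression steps applied to the trained LWDCNN before edge deployment: pruning and a fixed-point (INT8) representation of the weights. The paper's exact implementation is not given here; the following is a minimal NumPy sketch, assuming unstructured magnitude pruning and symmetric per-tensor INT8 quantization. Function names and the 50% sparsity target are illustrative, not taken from the paper.

```python
import numpy as np

def prune_by_magnitude(w, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights (unstructured pruning)."""
    threshold = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) < threshold, 0.0, w)

def quantize_int8(w):
    """Symmetric per-tensor INT8 quantization: w ~= scale * q, with q in [-127, 127]."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

# Toy stand-in for one trained layer's weight matrix.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)

w_pruned = prune_by_magnitude(w, sparsity=0.5)
q, scale = quantize_int8(w_pruned)
w_deq = q.astype(np.float32) * scale  # dequantized approximation used at inference

print("sparsity:", np.mean(w_pruned == 0))
print("max quantization error:", np.max(np.abs(w_deq - w_pruned)))
```

In a real deployment pipeline (e.g., TensorFlow Lite for the Android path), the quantization scales would also cover activations and be calibrated on representative BUS images; the sketch above shows only the weight-compression idea.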
Keywords