Egyptian Informatics Journal (Sep 2024)
Enhancing medical image classification via federated learning and pre-trained model
Abstract
The precise classification of medical images is crucial in various healthcare applications, especially in fields like disease diagnosis and treatment planning. Increasingly, machine learning models are expected to operate in remote, distributed settings. However, the privacy concerns that arise from sharing confidential patient information to train traditional centralized machine learning models cannot be ignored. Federated learning (FL) offers a promising method for collaborative training on distributed data held by various entities while preserving the privacy of patient information. This study evaluated the efficiency of pre-trained models in an FL environment for medical image classification. A Convolutional Neural Network (CNN) model with Gray-Level Co-occurrence Matrix (GLCM) and Local Binary Pattern (LBP) features, along with the EfficientNet model, are used as the local models. The trainable parameters from the local models are fed as input for building the global model. Pre-trained models, trained on extensive datasets, possess valuable features that FL models trained on proprietary datasets can exploit. This method can improve the efficacy and precision of FL models while also ensuring data confidentiality. The proposed model is evaluated on two distinct medical imaging datasets: Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) scan images. The research highlights the advantages of utilizing pre-trained models in federated learning for medical image classification (MIC). The model's performance is assessed across several evaluation criteria, achieving satisfactory accuracy rates of 97.4% and 98.8% for MRI and CT scan images, respectively. The model is also evaluated with respect to the Diagnostic Odds Ratio (DOR): the proposed global model achieves a DOR of 1164.54 for MRI images and 6825.17 for CT scan images, outperforming the pre-trained model alone.
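The abstract states that trainable parameters from the local models are aggregated into a global model, but does not specify the aggregation rule. A minimal sketch of one common choice, FedAvg-style weighted parameter averaging (an assumption, not necessarily the paper's exact method), is shown below; the client names and layer shapes are hypothetical:

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Sample-size-weighted average of client parameters (FedAvg-style).

    client_weights: one list of np.ndarray per client (layer by layer)
    client_sizes:   number of local training samples per client
    """
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    global_weights = []
    for layer in range(n_layers):
        # Each client's contribution is proportional to its data size
        avg = sum(w[layer] * (n / total)
                  for w, n in zip(client_weights, client_sizes))
        global_weights.append(avg)
    return global_weights

# Two hypothetical clients, each holding one 2x2 parameter tensor
client_a = [np.array([[1.0, 2.0], [3.0, 4.0]])]
client_b = [np.array([[3.0, 4.0], [5.0, 6.0]])]
global_w = fed_avg([client_a, client_b], client_sizes=[100, 300])
```

In this sketch the second client contributes three quarters of the average because it holds three times as much data; in a real FL round the averaged weights would be broadcast back to the clients for further local training.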