IEEE Access (Jan 2024)
FedADC: Federated Average Knowledge Distilled Mutual Conditional Learning for Waste Classification
Abstract
Federated learning offers a promising approach to waste classification, where diverse datasets are distributed across many sources. This paper introduces the Federated Average Knowledge Distilled Mutual Conditional Learning (FedADC) algorithm, a framework tailored for efficient waste classification in federated settings. FedADC combines federated averaging with knowledge distillation and mutual conditional learning, yielding a hybrid approach to model training and adaptation. The bidirectional learning process lets the global and local models adapt dynamically to the nuances of each local dataset, improving accuracy. A key contribution of this work is a comparative analysis of hyperparameter tuning techniques, showing that grid search outperforms the alternatives considered for fine-tuning in federated learning. The paper additionally compares data augmentation, denoising strategies, and cross-validation techniques, providing insights that improve the adaptability of the proposed model. Together with these contributions, the FedADC algorithm advances the understanding and application of federated learning in waste classification. Through a hybrid learning paradigm, systematic parameter exploration, and careful validation, this research lays a foundation for robust and effective waste-management solutions in federated settings.
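The combination the abstract describes can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the authors' implementation: it pairs standard federated averaging with a distillation term that pulls each client's logistic-regression model toward the global model's soft predictions, then averages the updated local weights back into the global model. The function names (`local_update`, `fedadc_round`), the weighting `alpha`, and the synthetic non-IID data are all hypothetical.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def local_update(w, w_global, X, y, lr=0.1, alpha=0.5, steps=20):
    """Hypothetical client step: cross-entropy on local labels plus a
    distillation penalty toward the global model's soft outputs."""
    n, k = X.shape[0], w.shape[1]
    onehot = np.eye(k)[y]
    for _ in range(steps):
        p = softmax(X @ w)          # local model predictions
        q = softmax(X @ w_global)   # global "teacher" soft targets
        # gradient of CE(labels, p) + alpha * CE(q, p) w.r.t. w
        grad = X.T @ ((p - onehot) + alpha * (p - q)) / n
        w = w - lr * grad
    return w

def fedadc_round(w_global, clients, **kw):
    """One communication round: each client distills from the global
    model locally; the server then averages the local weights (FedAvg)."""
    updated = [local_update(w_global.copy(), w_global, X, y, **kw)
               for X, y in clients]
    return np.mean(updated, axis=0)

rng = np.random.default_rng(0)
d, k = 4, 3
# toy non-IID "waste" data: each client sees a skewed class mix
clients = []
for c in range(3):
    X = rng.normal(size=(60, d))
    y = rng.choice(k, size=60, p=np.roll([0.6, 0.2, 0.2], c))
    X[np.arange(60), y] += 2.0  # make classes linearly separable
    clients.append((X, y))

w = np.zeros((d, k))
for _ in range(10):
    w = fedadc_round(w, clients)
```

The bidirectional flow the abstract mentions appears here in both directions: local models learn from the global model through the distillation term, and the global model learns from the locals through the averaging step.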
Keywords