IEEE Open Journal of the Computer Society (Jan 2024)
Slingshot: Globally Favorable Local Updates for Federated Learning
Abstract
Federated Learning (FL) is a promising distributed learning paradigm that aims to reconcile the data hunger of modern machine learning with increasingly stringent data-privacy requirements. However, clients naturally hold data drawn from different distributions and thus have inconsistent local optima, which degrades the performance of the global model. Many prior methods focus on mitigating this objective inconsistency. Although they can guarantee local objective consistency as the number of communication rounds tends to infinity, they suffer from non-negligible accumulated global drift and restrict the potential of local updates. In this article, we study a new framework for FL under data heterogeneity, in which steering clients' local updates toward the global optimum accelerates FL. We propose a new approach called Slingshot. Slingshot has two design goals: i) to retain the potential of local updates, and ii) to combine local and global trends. Experimental results show that Slingshot makes local updates more globally favorable and outperforms other popular methods under various FL settings. For example, on CIFAR-10, Slingshot achieves a 46.52% improvement in test accuracy and a 48.21× speedup with a lightweight neural network named SqueezeNet.
Keywords