IEEE Access (Jan 2024)
BalancedSecAgg: Toward Fast Secure Aggregation for Federated Learning
Abstract
Federated learning is a promising collaborative learning paradigm for preserving the privacy of training data; however, individual users' local models still risk leaking private information. Secure aggregation protocols based on local model masking are a promising countermeasure against such leakage. Existing secure aggregation protocols sacrifice either computation or communication cost to tolerate user dropouts. A naive secure aggregation protocol achieves a small communication cost by secretly sharing random seeds instead of random masks, but it forces the server to incur a substantial computation cost to reconstruct the random masks of dropped users from their seeds. To avoid this reconstruction, a state-of-the-art secure aggregation protocol secretly shares the random masks themselves; this eliminates the mask-reconstruction cost but incurs a large communication cost because the shared masks are as long as the model itself. In this paper, we design a secure aggregation protocol that mitigates the tradeoff between computation cost and communication cost by combining the complementary strengths of both types of protocols. In our experiments, our protocol runs up to 11.41 times faster than existing protocols while achieving the same level of privacy preservation and dropout tolerance.
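To make the seed-based masking idea in the abstract concrete, the following is a minimal, illustrative Python sketch of pairwise masking: each pair of users derives the same mask vector from a shared seed, one user adds it and the other subtracts it, so the masks cancel in the server's sum. All names (`prg`, `mask_update`) and the toy PRG are assumptions for illustration; a real protocol would use a cryptographic PRG and secret-share the seeds so dropped users' masks can be reconstructed.

```python
import random

MODULUS = 2**16  # toy modulus for arithmetic over masked updates

def prg(seed, length, modulus):
    # Expand a short seed into a pseudorandom mask vector.
    # Illustrative only: a real protocol uses a cryptographic PRG.
    rng = random.Random(seed)
    return [rng.randrange(modulus) for _ in range(length)]

def mask_update(update, self_id, peer_seeds, modulus):
    # Pairwise masks cancel across users: for a shared seed s_ij,
    # the user with the smaller id adds PRG(s_ij) and the other
    # subtracts it, so the sum of all masked updates reveals only
    # the aggregate.
    masked = list(update)
    for peer_id, seed in peer_seeds.items():
        mask = prg(seed, len(update), modulus)
        sign = 1 if self_id < peer_id else -1
        masked = [(m + sign * x) % modulus for m, x in zip(masked, mask)]
    return masked

# Two users share one pairwise seed; their masks cancel in the sum.
shared_seed = 42
u1 = [1, 2, 3, 4]
u2 = [5, 6, 7, 8]
m1 = mask_update(u1, 1, {2: shared_seed}, MODULUS)
m2 = mask_update(u2, 2, {1: shared_seed}, MODULUS)
agg = [(a + b) % MODULUS for a, b in zip(m1, m2)]
print(agg)  # the true sum [6, 8, 10, 12], with masks cancelled
```

The communication saving comes from sharing only the short seed rather than a mask as long as the model; the cost, as the abstract notes, is that when a user drops out the server must re-run the PRG expansion for that user's seeds to remove the uncancelled masks.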
Keywords