Jisuanji kexue [Computer Science] (Nov 2022)

Communication Efficient Asynchronous ADMM for General Form Consensus Optimization

  • WANG Dong-xia, LEI Yong-mei, ZHANG Ze-yu

DOI
https://doi.org/10.11896/jsjkx.211200006
Journal volume & issue
Vol. 49, no. 11
pp. 309–315

Abstract

The distributed alternating direction method of multipliers (ADMM) is one of the most widely used methods for solving large-scale machine learning problems. However, most distributed ADMM algorithms are based on full model updates, and as system scale and data volume grow, communication cost has become the bottleneck of distributed ADMM on big data. To reduce the communication cost in a distributed environment, this paper proposes a general form consensus asynchronous distributed alternating direction method of multipliers (GFC-ADADMM). First, in the GFC-ADADMM, only the associated model parameters, rather than the full model, are transmitted among nodes, and these associated parameters are further filtered according to the characteristics of high-dimensional sparse data sets to reduce the transmission load. Second, the GFC-ADADMM is implemented with an asynchronous allreduce framework that combines the advantages of the asynchronous communication protocol and the allreduce communication mode. Third, by combining the strengths of the stale synchronous parallel (SSP) computing model, the allreduce communication mode, and hybrid programming, the asynchronous allreduce framework is realized with an MPI/OpenMP hybrid programming model, which improves both the computation and communication efficiency of the algorithm. Finally, the GFC-ADADMM is applied to the sparse logistic regression problem. Evaluation on large-scale data sets shows that, compared with state-of-the-art asynchronous distributed ADMM algorithms, the GFC-ADADMM reduces total running time by 15%-63% and achieves higher convergence accuracy.
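The communication-saving idea behind general form consensus can be illustrated in a few lines. Below is a minimal, synchronous, single-process sketch of general form consensus ADMM for L1-regularized (sparse) logistic regression, in which each worker stores and exchanges only the model components associated with its local data. It is not the authors' GFC-ADADMM implementation: the asynchronous allreduce framework, the SSP staleness bound, and the MPI/OpenMP layers are all omitted, and every name and parameter value (n_workers, rho, lam, idx, etc.) is an illustrative assumption.

```python
# Sketch: general form consensus ADMM for sparse logistic regression.
# Each worker i keeps a local variable x_i and dual u_i only over the
# global components its data touches (the "associated" parameters), so
# only those components would ever need to be communicated.
import numpy as np

rng = np.random.default_rng(0)
n_workers, d, n_per = 4, 50, 200      # workers, global dim, samples/worker
rho, lam, n_iters = 1.0, 0.1, 50      # ADMM penalty, L1 weight, iterations

# Synthetic sparse data: each worker's samples touch only a feature subset.
workers = []
for i in range(n_workers):
    idx = np.sort(rng.choice(d, size=d // 2, replace=False))  # associated features
    A = rng.normal(size=(n_per, idx.size))
    w_true = rng.normal(size=idx.size)
    y = np.where(A @ w_true + 0.1 * rng.normal(size=n_per) > 0, 1.0, -1.0)
    workers.append({"idx": idx, "A": A, "y": y,
                    "x": np.zeros(idx.size), "u": np.zeros(idx.size)})

def local_x_update(w, z_local, rho, steps=20, lr=0.1):
    """Approximate argmin_x logistic_loss(x) + (rho/2)||x - z_local + u||^2
    by a few gradient steps (a stand-in for an exact proximal solver)."""
    x = w["x"].copy()
    for _ in range(steps):
        margins = np.clip(w["y"] * (w["A"] @ x), -30.0, 30.0)
        g_loss = -(w["A"].T @ (w["y"] / (1.0 + np.exp(margins)))) / len(w["y"])
        x -= lr * (g_loss + rho * (x - z_local + w["u"]))
    return x

z = np.zeros(d)
counts = np.zeros(d)                  # workers associated with each component
for w in workers:
    counts[w["idx"]] += 1

for k in range(n_iters):
    # x-update on each worker, using only its associated slice of z.
    for w in workers:
        w["x"] = local_x_update(w, z[w["idx"]], rho)
    # z-update: average (x + u) over the workers touching each component,
    # then soft-threshold to enforce the L1 penalty (sparsity).
    acc = np.zeros(d)
    for w in workers:
        acc[w["idx"]] += w["x"] + w["u"]
    avg = np.divide(acc, counts, out=np.zeros(d), where=counts > 0)
    z = np.sign(avg) * np.maximum(
        np.abs(avg) - lam / (rho * np.maximum(counts, 1.0)), 0.0)
    # Dual update, again restricted to the associated components.
    for w in workers:
        w["u"] += w["x"] - z[w["idx"]]

print("nonzeros in consensus model:", int(np.count_nonzero(z)))
```

On high-dimensional sparse data sets, each worker's associated index set covers only a small fraction of the d global components, which is exactly where restricting transmission to the associated parameters (and filtering them further) pays off relative to full-model exchange.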

Keywords