IEEE Access (Jan 2019)

Parameter Communication Consistency Model for Large-Scale Security Monitoring Based on Mobile Computing

  • Rui Yang,
  • Jilin Zhang,
  • Jian Wan,
  • Li Zhou,
  • Jing Shen,
  • Yunchen Zhang,
  • Zhenguo Wei,
  • Juncong Zhang,
  • Jue Wang

DOI
https://doi.org/10.1109/ACCESS.2019.2956632
Journal volume & issue
Vol. 7
pp. 171884 – 171897

Abstract


With the application of mobile computing in the security field, security monitoring big data has begun to emerge, providing valuable support for smart city construction and investment at city scale. Mobile computing takes full advantage of the computing power and communication capabilities of various sensing devices and organizes these devices into a computing cluster. When such clusters are used to train distributed machine learning models, load imbalance and network transmission delay lead to low training efficiency. Therefore, this paper proposes a distributed machine learning parameter communication consistency model based on the parameter server architecture, called the limited synchronous parallel model. Exploiting the fault-tolerant characteristics of machine learning algorithms, the model dynamically limits the size of the parameter server's synchronization barrier, reducing synchronization communication overhead while preserving training accuracy; worker nodes can thus perform bounded asynchronous computation, making full use of the cluster's overall performance. Experiments with cluster dynamic load balancing show that the model fully utilizes cluster performance during distributed model training, maintaining model accuracy while improving training speed.
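To make the idea of a bounded synchronization barrier concrete, the following is a minimal sketch of a parameter server that lets each worker run at most a fixed number of iterations ahead of the slowest worker before it is blocked. The class name, the `staleness` threshold, and the fixed (rather than dynamically adjusted) bound are illustrative assumptions for exposition, not the authors' implementation.

```python
import threading
import numpy as np

class BoundedBarrierServer:
    """Toy parameter server with a bounded synchronization barrier.

    A worker may run at most `staleness` iterations ahead of the slowest
    worker before its pull blocks, approximating a limited (rather than
    strictly bulk-synchronous) barrier. Hypothetical sketch only.
    """

    def __init__(self, dim, num_workers, staleness=2):
        self.params = np.zeros(dim)
        self.clocks = [0] * num_workers      # per-worker iteration counters
        self.staleness = staleness           # allowed clock gap
        self.cond = threading.Condition()

    def push(self, worker_id, gradient, lr=0.1):
        """Apply a worker's gradient update and advance its clock."""
        with self.cond:
            self.params -= lr * gradient
            self.clocks[worker_id] += 1
            self.cond.notify_all()           # wake workers waiting in pull()

    def pull(self, worker_id):
        """Return current parameters, blocking if this worker is too far ahead."""
        with self.cond:
            while self.clocks[worker_id] - min(self.clocks) > self.staleness:
                self.cond.wait()             # wait for stragglers to catch up
            return self.params.copy()


def worker(server, worker_id, steps=5):
    rng = np.random.default_rng(worker_id)
    for _ in range(steps):
        w = server.pull(worker_id)
        grad = 2 * w + rng.normal(size=w.shape)   # synthetic noisy gradient
        server.push(worker_id, grad)


if __name__ == "__main__":
    server = BoundedBarrierServer(dim=4, num_workers=3, staleness=2)
    threads = [threading.Thread(target=worker, args=(server, i)) for i in range(3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("final parameters:", server.params)
```

In this sketch, fast workers keep computing until their iteration count exceeds the slowest worker's by more than the staleness bound, at which point they wait; the paper's contribution is, in addition, to adjust that barrier size dynamically based on cluster load rather than keeping it fixed.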

Keywords