Journal of Applied Mathematics (Jan 2013)
Analysis of Average Shortest-Path Length of Scale-Free Network
Abstract
Computing the average shortest-path length of a large scale-free network requires a great deal of memory and computation time, so parallel computing must be applied. To solve the load-balancing problem of coarse-grained parallelization, we study the relationship between the time needed to compute the single-source shortest-path lengths from a node and the features of that node. We present a dynamic programming model that uses the average outdegree of a node's neighbors at different levels as the variable and the minimum time difference among computing cores as the objective; its coefficients are determined on networks whose computing times can be measured. A representation of the network based on a native array and a multimap is presented to reduce memory consumption, so that large networks can still be loaded into the memory of each computing core. The simplified load-balancing model is applied to a network with tens of millions of nodes. Our experiments show that the model solves the load-imbalance problem of large scale-free networks well, and that its characteristics can meet the requirements of networks of ever-increasing complexity and scale.
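To make the memory-saving representation and the per-node computation concrete, the following is a minimal sketch, not the authors' implementation: it assumes an unweighted directed graph stored in a flat offset/target array (CSR-style) alongside a multimap view, and computes the single-source distance sum by breadth-first search; all identifiers (Network, SingleSourceDistanceSum) are illustrative.

```cpp
#include <cstdint>
#include <map>
#include <queue>
#include <vector>

// Hypothetical compact network representation: a flat array of edge targets
// indexed by per-node offsets, plus a multimap giving an alternative
// node -> neighbor view of the same edges.
struct Network {
    std::vector<std::int64_t> offset;  // offset[v]..offset[v+1]: edge range of node v
    std::vector<std::int32_t> target;  // flat array of edge targets
    std::multimap<std::int32_t, std::int32_t> edges;  // sparse view: node -> neighbor
};

// Sum of shortest-path lengths from `source` to all reachable nodes,
// obtained by BFS on the unweighted graph; summing these values over all
// sources (divided by the number of reachable pairs) yields the
// average shortest-path length.
std::int64_t SingleSourceDistanceSum(const Network& net, std::int32_t source) {
    const std::int32_t n = static_cast<std::int32_t>(net.offset.size()) - 1;
    std::vector<std::int32_t> dist(n, -1);  // -1 marks unvisited nodes
    std::queue<std::int32_t> frontier;
    dist[source] = 0;
    frontier.push(source);
    std::int64_t sum = 0;
    while (!frontier.empty()) {
        const std::int32_t u = frontier.front();
        frontier.pop();
        sum += dist[u];
        for (std::int64_t e = net.offset[u]; e < net.offset[u + 1]; ++e) {
            const std::int32_t v = net.target[e];
            if (dist[v] < 0) {
                dist[v] = dist[u] + 1;
                frontier.push(v);
            }
        }
    }
    return sum;
}
```

In a coarse-grained parallel setting, each core would load this compact structure once and run SingleSourceDistanceSum for its assigned set of source nodes; balancing the predicted per-source times across cores is the role of the load-balancing model described above.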