EURASIP Journal on Advances in Signal Processing (Jan 2019)

Distributed stochastic gradient descent for link prediction in signed social networks

  • Han Zhang,
  • Gang Wu,
  • Qing Ling

DOI: https://doi.org/10.1186/s13634-019-0601-0
Journal volume & issue: Vol. 2019, no. 1, pp. 1–11

Abstract


This paper considers the link prediction problem defined over a signed social network, where the relationship between any two network users can be either positive (friends) or negative (foes). Given a portion of the relationships, the goal of link prediction is to identify the remaining unknown ones. This task amounts to completing the adjacency matrix of the signed social network, which is low rank or approximately low rank. Considering the large scale of the adjacency matrix, in this paper, we adopt low-rank matrix factorization models for the link prediction problem and solve them through asynchronous distributed stochastic gradient descent algorithms. The low-rank matrix factorization models effectively reduce the size of the parameter space, while the asynchronous distributed stochastic gradient descent algorithms enable fast completion of the adjacency matrix. We validate the proposed algorithms using two real-world datasets on a distributed shared-memory computation platform. Numerical results demonstrate that the asynchronous distributed stochastic gradient descent algorithms achieve nearly linear computational speedups with respect to the number of computational threads, and are able to complete an adjacency matrix with ten billion entries within 10 s.
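As a rough illustration of the matrix-factorization approach summarized above, the sketch below fits a signed adjacency matrix as A ≈ UVᵀ using a plain single-threaded stochastic gradient descent loop over the observed entries. The squared loss, regularization weight, rank, step size, and all function names here are assumptions made for illustration; this is not the paper's exact model nor its asynchronous distributed shared-memory implementation.

```python
import numpy as np


def sgd_matrix_factorization(observed, n, rank=10, step=0.05, reg=0.01,
                             epochs=20, seed=0):
    """Fit A ~ U @ V.T on observed signed entries.

    observed: list of (i, j, s) triples with sign s in {+1, -1}.
    n: number of network users (matrix dimension).
    """
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(epochs):
        # Visit the observed entries in a random order each epoch.
        for k in rng.permutation(len(observed)):
            i, j, s = observed[k]
            ui = U[i].copy()
            err = s - ui @ V[j]                    # residual on one entry
            # Gradient steps on the regularized squared loss for this entry.
            U[i] += step * (err * V[j] - reg * ui)
            V[j] += step * (err * ui - reg * V[j])
    return U, V


# Usage: predict an unknown link's sign from the sign of the inner product.
obs = [(0, 1, 1), (1, 2, -1), (0, 2, -1)]
U, V = sgd_matrix_factorization(obs, n=3)
print(np.sign(U[2] @ V[0]))   # predicted sign for the unobserved (2, 0) link
```

In the asynchronous distributed setting described in the abstract, multiple threads would run this inner update loop in parallel on disjoint or lock-free partitions of the observed entries, which is what yields the reported near-linear speedups.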

Keywords