IEEE Access (Jan 2019)

Graph Embedding Matrix Sharing With Differential Privacy

  • Sen Zhang
  • Weiwei Ni

DOI
https://doi.org/10.1109/ACCESS.2019.2927365
Journal volume & issue
Vol. 7
pp. 89390 – 89399

Abstract


Graph embedding maps a graph into low-dimensional vectors, i.e., an embedding matrix, while preserving the graph structure, thereby reducing the high computation and space costs of graph analysis. Matrix factorization (MF) is an effective means of graph embedding because it maintains the utility of the graph structure. However, the personalized structural features encoded in the embedding matrix can identify individuals and thus potentially leak sensitive information about them from the original graph. Protecting individual privacy without compromising utility is therefore the key to sharing the embedding matrix. Differential privacy is the gold standard for publishing sensitive information while protecting privacy. Existing methods for differentially private MF, however, cannot be directly applied to MF-based graph embedding: they suffer from either high global sensitivity or accumulated noise error over iterations, which can severely degrade the utility of the resulting embedding. To address this deficiency, this study proposes PPGD, a differentially private perturbed gradient descent method for sharing MF-based graph embedding matrices. Specifically, a Lipschitz condition on the MF objective function and a gradient clipping strategy are devised to bound the global sensitivity, yielding a scalable sensitivity bound that is independent of the original dataset. Further, a composite noise-addition scheme within the gradient descent is designed to guarantee privacy while enhancing utility. Theoretical analysis shows that PPGD generates a processed embedding matrix with maximized utility while achieving (ε, δ)-differential privacy, and experimental evaluations confirm the effectiveness and efficiency of PPGD.
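To make the general idea concrete, the following is a minimal, hypothetical Python sketch (not the authors' PPGD implementation) of differentially private gradient descent for matrix-factorization-based graph embedding: the adjacency matrix A is factored as A ≈ U Vᵀ, each gradient is clipped to a norm bound, and Gaussian noise calibrated by the standard Gaussian mechanism is added before the update. The function name dp_mf_embedding, the parameters clip, epsilon, and delta, and the naive per-epoch splitting of the privacy budget are illustrative assumptions, not the paper's Lipschitz-based sensitivity bound or its composite noise design.

    # Hypothetical sketch of differentially private MF-based graph embedding
    # (gradient clipping + Gaussian noise), not the paper's exact PPGD method.
    import numpy as np

    def dp_mf_embedding(A, dim=16, epochs=50, lr=0.01,
                        clip=1.0, epsilon=1.0, delta=1e-5, seed=0):
        rng = np.random.default_rng(seed)
        n = A.shape[0]
        U = rng.normal(scale=0.1, size=(n, dim))   # node embeddings
        V = rng.normal(scale=0.1, size=(n, dim))   # context embeddings

        # Noise scale from the Gaussian mechanism, with the budget split
        # naively across epochs (an illustrative composition choice).
        eps_step = epsilon / epochs
        sigma = clip * np.sqrt(2.0 * np.log(1.25 / delta)) / eps_step

        for _ in range(epochs):
            # Gradient of the squared reconstruction error 0.5 * ||A - U V^T||_F^2.
            R = U @ V.T - A
            gU = R @ V
            gV = R.T @ U

            # Clip gradient norms to bound sensitivity, then perturb with noise.
            for g in (gU, gV):
                norm = np.linalg.norm(g)
                g *= min(1.0, clip / max(norm, 1e-12))
            gU += rng.normal(scale=sigma, size=gU.shape)
            gV += rng.normal(scale=sigma, size=gV.shape)

            U -= lr * gU
            V -= lr * gV
        return U  # processed embedding matrix to share

    # Usage example: embed a small random undirected graph.
    A = (np.random.default_rng(1).random((30, 30)) < 0.2).astype(float)
    A = np.triu(A, 1); A = A + A.T
    emb = dp_mf_embedding(A)
    print(emb.shape)  # (30, 16)

The key design point mirrored from the abstract is that clipping makes each noisy update's sensitivity independent of the original dataset, so the noise scale can be fixed in advance; the paper's contribution lies in how the sensitivity bound and the composite noise are chosen so that utility is preserved under (ε, δ)-differential privacy.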

Keywords