PLoS ONE (Jun 2019)

Fast optimization of non-negative matrix tri-factorization.

  • Andrej Čopar,
  • Blaž Zupan,
  • Marinka Zitnik

DOI
https://doi.org/10.1371/journal.pone.0217994
Journal volume & issue
Vol. 14, no. 6
p. e0217994

Abstract

Non-negative matrix tri-factorization (NMTF) is a popular technique for learning low-dimensional feature representations of relational data. Currently, NMTF learns a representation of a dataset through an optimization procedure that typically uses multiplicative update rules. This procedure has had limited success, and its failure cases have not been well understood. Here we perform an empirical study involving six large datasets, comparing multiplicative update rules with three alternative optimization methods: alternating least squares, projected gradients, and coordinate descent. We find that methods based on projected gradients and coordinate descent converge up to twenty-four times faster than multiplicative update rules. Furthermore, the alternating least squares method can quickly train NMTF models on sparse datasets but often fails on dense datasets. Coordinate descent-based NMTF converges up to sixteen times faster compared to well-established methods.
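To make the baseline concrete, the sketch below shows NMTF fit with multiplicative update rules, the optimization procedure the abstract compares against. It factorizes a non-negative matrix X as U S Vᵀ by minimizing the Frobenius reconstruction error; the function name, rank choices, and iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nmtf_mu(X, k1, k2, n_iter=200, eps=1e-9, seed=0):
    """Approximate X ≈ U @ S @ V.T with non-negative factors using
    standard multiplicative update rules for the Frobenius objective.
    (Illustrative sketch; ranks k1, k2 and n_iter are assumptions.)"""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    # Random non-negative initialization of the three factors.
    U = rng.random((n, k1))
    S = rng.random((k1, k2))
    V = rng.random((m, k2))
    for _ in range(n_iter):
        # Each factor is scaled elementwise by the ratio of the negative
        # to the positive part of the gradient; eps avoids division by zero.
        U *= (X @ V @ S.T) / (U @ S @ V.T @ V @ S.T + eps)
        V *= (X.T @ U @ S) / (V @ S.T @ U.T @ U @ S + eps)
        S *= (U.T @ X @ V) / (U.T @ U @ S @ V.T @ V + eps)
    return U, S, V

# Usage on a small random non-negative matrix.
X = np.abs(np.random.default_rng(1).normal(size=(30, 20)))
U, S, V = nmtf_mu(X, k1=4, k2=3)
rel_err = np.linalg.norm(X - U @ S @ V.T) / np.linalg.norm(X)
```

Because each update only rescales entries, the factors stay non-negative throughout; the slow per-iteration progress of exactly these elementwise updates is what the projected-gradient and coordinate-descent alternatives studied in the paper improve on.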