IEEE Access (Jan 2019)
Improving Recommendations by Embedding Multi-Entity Relationships With Latent Dual-Metric Learning
Abstract
Recently, latent vector embedding has become a research hotspot owing to its strong ability to represent latent relationships among different views. However, most studies use the inner product of latent vectors to represent relationships and build their embedding models on this basis. In this paper, we take a close look at existing embedding models and find that using the inner product introduces several problems: 1) in latent space, the inner product among three vectors may violate the triangle inequality; 2) the inner product cannot measure relationships between vectors of the same category, such as user-user and item-item; and 3) the inner product therefore cannot capture the collaborative relationships (user-user and item-item) needed for collaborative filtering. Along this line, we propose a latent vector embedding model for collaborative filtering: latent dual-metric embedding (LDME), which uses the dual Euclidean distance in latent space, instead of the inner product, to represent the different types of relationships (user-user, item-item, and user-item) in a unified framework. Specifically, we design an embedding loss function in LDME that can measure both close and remote relationships between entities, tackle the above problems, and achieve a clearer, better-explained embedding result. Extensive experiments are conducted on several real-world datasets (Amazon, Yelp, Taobao, and Jingdong), where the experimental results demonstrate that LDME outperforms state-of-the-art user-item embedding models and benefits existing collaborative filtering models.
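The first problem the abstract raises, that an inner-product dissimilarity can violate the triangle inequality while a Euclidean distance cannot, can be checked numerically. The following is a minimal sketch with three hypothetical unit-norm latent vectors (not taken from the paper), treating `1 - inner product` as a dissimilarity:

```python
import numpy as np

# Three hypothetical unit-norm latent vectors in a 2-D latent space.
a = np.array([1.0, 0.0])
b = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4)])
c = np.array([0.0, 1.0])

def ip_dissim(x, y):
    """Inner-product-based dissimilarity (1 - <x, y>) for unit vectors."""
    return 1.0 - np.dot(x, y)

def euc(x, y):
    """Euclidean distance, a true metric."""
    return np.linalg.norm(x - y)

# ip_dissim(a, c) = 1.0, while ip_dissim(a, b) + ip_dissim(b, c) ~= 0.586,
# so the "distance" from a to c exceeds the sum via b: triangle inequality fails.
assert ip_dissim(a, c) > ip_dissim(a, b) + ip_dissim(b, c)

# The Euclidean distance always satisfies the triangle inequality.
assert euc(a, c) <= euc(a, b) + euc(b, c)
```

This is why replacing the inner product with a metric such as the Euclidean distance, as LDME does, yields geometrically consistent relationships among users and items.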
Keywords