Wasserstein Distance-based Graph Contrastive Learning for Recommendation

Yonghong Yu, Zhiyu Wang, Yujie Liao, Li Zhang, Rong Gao

Research output: Contribution to journal › Article › peer-review


Abstract

Graph contrastive learning (GCL) learns augmentation-invariant representations from raw data and reduces dependence on labeled data. In recommendation systems, traditional GCL models have become a promising remedy for insufficient supervision signals: they augment the user-item interaction graph and optimize the InfoNCE loss to learn user and item representations. However, existing GCL-based recommendation models suffer from dimensional collapse, which leads to sub-optimal recommendation performance. To tackle this problem, we propose a Wasserstein Distance-based Graph Contrastive Learning model, namely WGCL. Specifically, we integrate a Wasserstein loss into contrastive learning-based recommendation models to align the distribution of user/item representations with an isotropic Gaussian distribution, which makes the actual distribution of representations more uniform and thereby alleviates dimensional collapse. The Wasserstein loss measures the discrepancy between the actual distribution of entity representations and the desired distribution by computing the covariance of the representations learned from the augmented views. As a result, the Wasserstein distance metric not only makes the representations more uniformly distributed on the hypersphere, but also better preserves the original semantic information of entities. Extensive experiments on three widely used datasets demonstrate that WGCL outperforms traditional recommendation models. Our code is released at https://github.com/Sodapease/WGCL
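The abstract describes aligning the covariance of learned representations with an isotropic Gaussian via a Wasserstein loss. As a minimal sketch of that idea (not the authors' implementation — the function name, the choice of the closed-form 2-Wasserstein distance between Gaussians, and the isotropic target N(0, σ²I) are our assumptions), one can fit a Gaussian to a batch of embeddings and apply the closed-form distance, which for a scalar-covariance target reduces to ‖μ‖² + Σᵢ(√λᵢ − σ)², where λᵢ are the eigenvalues of the empirical covariance:

```python
import numpy as np

def wasserstein_to_isotropic_gaussian(z, sigma=1.0):
    """Squared 2-Wasserstein distance between the Gaussian fitted to the
    embeddings z (shape n x d) and an isotropic target N(0, sigma^2 I).

    Uses the closed form for Gaussians; with a scalar-covariance target it
    simplifies to ||mu||^2 + tr(C) + d*sigma^2 - 2*sigma*tr(C^{1/2}).
    """
    n, d = z.shape
    mu = z.mean(axis=0)
    zc = z - mu
    cov = zc.T @ zc / (n - 1)          # empirical covariance of the batch
    # eigenvalues of the symmetric PSD covariance; clip tiny negatives
    eig = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
    # tr(C^{1/2}) is the sum of the square roots of the eigenvalues
    return float(mu @ mu + eig.sum() + d * sigma**2
                 - 2.0 * sigma * np.sqrt(eig).sum())
```

A batch whose mean is zero and whose sample covariance equals the identity gives a distance of zero; any anisotropy in the covariance (the signature of dimensional collapse, where variance concentrates in a few directions) increases the penalty, pushing the representation distribution toward uniformity.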
Original language: English
Article number: 129427
Number of pages: 12
Journal: Expert Systems with Applications
Volume: 297
Early online date: 22 Aug 2025
DOIs
Publication status: Published - 1 Feb 2026
