Contrastive Translation With Dynamical Temperature for Sequential Recommendation

Aoran Zhang, Yonghong Yu, Li Zhang, Rong Gao, Hongzhi Yin

Research output: Contribution to journal › Article › peer-review


Abstract

Contrastive learning is a promising solution to the problem of data sparsity in recommender systems, since it is able to extract self-supervised signals from raw data. Traditional contrastive learning-based sequential recommendation algorithms generate augmentations of the original item sequences by applying crop, mask, and reorder operations. However, these augmentation schemes destroy the underlying semantics of the item sequences, making it difficult to accurately define positive and negative samples. To address this issue, we propose a contrastive translation-based sequential recommendation algorithm, namely CT4Rec. Specifically, CT4Rec generates augmented views of item sequences by injecting noise into the embeddings of users and items, which guarantees that the underlying semantics of the augmented views are consistent with those of the original item sequences. Hence, CT4Rec is able to effectively learn the invariances among the augmented views. In addition, personalized translation operations are utilized to model the third-order relationships among entities. Moreover, it is difficult for contrastive learning-based recommendation algorithms with a static temperature to simultaneously capture the differences among individual users/items and among clusters of users/items. Hence, we adopt a dynamic temperature strategy to enhance CT4Rec, which endows it with the capabilities of both group-wise and instance-level discrimination. Our evaluation on five benchmark datasets shows that CT4Rec outperforms state-of-the-art sequential recommendation methods. Our code is released at https://github.com/zar123123/CT4Rec.
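The abstract names two concrete mechanisms: noise-injected embedding augmentation and a dynamic temperature in the contrastive loss. The sketch below illustrates one plausible reading of each, assuming a SimGCL-style additive-noise scheme and a per-anchor temperature rule driven by distance to a cluster centroid; the function names, the noise distribution, and the temperature rule are assumptions for illustration, not the authors' implementation (see the linked repository for that).

```python
# Hedged sketch (not the paper's code): noise-injection augmentation plus an
# InfoNCE loss with a per-anchor ("dynamic") temperature.
import torch
import torch.nn.functional as F


def noise_augment(emb: torch.Tensor, eps: float = 0.1) -> torch.Tensor:
    """Create an augmented view by adding small unit-norm random noise
    (magnitude eps) to each embedding. Unlike crop/mask/reorder, this
    perturbation leaves the sequence semantics intact."""
    noise = F.normalize(torch.rand_like(emb) - 0.5, dim=-1)
    return emb + eps * noise


def dynamic_tau(z_anchor: torch.Tensor, centroid: torch.Tensor,
                tau_min: float = 0.1, tau_max: float = 0.5) -> torch.Tensor:
    """One possible dynamic-temperature rule (an assumption, not the paper's):
    anchors close to their cluster centroid get a higher temperature
    (group-wise tolerance); distant anchors get a lower one (sharper
    instance-level discrimination)."""
    sim = F.cosine_similarity(z_anchor, centroid, dim=-1)  # in [-1, 1]
    return tau_min + (tau_max - tau_min) * (sim + 1.0) / 2.0


def info_nce_dynamic_tau(z1: torch.Tensor, z2: torch.Tensor,
                         tau: torch.Tensor) -> torch.Tensor:
    """InfoNCE between two augmented views with a per-anchor temperature
    vector tau of shape [batch]; a static temperature is the special case
    where all entries of tau are equal."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = (z1 @ z2.t()) / tau.unsqueeze(1)   # [batch, batch]
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


# Toy usage: two noise-augmented views of a batch of sequence embeddings.
emb = torch.randn(256, 64, requires_grad=True)
z1, z2 = noise_augment(emb), noise_augment(emb)
centroid = F.normalize(z1, dim=-1).mean(dim=0, keepdim=True)  # crude cluster proxy
tau = dynamic_tau(F.normalize(z1, dim=-1), centroid)
loss = info_nce_dynamic_tau(z1, z2, tau)
loss.backward()
```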
Original language: English
Pages (from-to): 1-13
Number of pages: 13
Journal: IEEE Transactions on Systems, Man, and Cybernetics: Systems
Early online date: 26 Mar 2025
Publication status: E-pub ahead of print - 26 Mar 2025
