All you see is the tip of the iceberg: distilling latent interactions can help you find treasures

Zhuo Cai, Guan Yuan*, Xiaobao Zhuang, Xiao Liu, Rui Bing, Shoujin Wang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

Abstract

Recommender systems suffer severely from the data sparsity problem, which can be attributed to the combined action of various possible causes, such as increasingly strict privacy protection policies and exposure bias. In these cases, unobserved items do not always correspond to items that users are uninterested in; they may also result from the inaccessibility of interaction data or users' unawareness of the items. Thus, blindly fitting all unobserved interactions as negative interactions during training leads to incomplete modeling of user preferences. In this work, we propose a novel training strategy to distill latent interactions for recommender systems (abbreviated as DLI). Latent interactions refer to possible links between users and items that reflect user preferences but have not yet occurred. We first design a False-negative interaction selecting module to dynamically distill latent interactions along the training process. We then devise two loss paradigms: Truncated Loss and Reversed Loss. The former reduces the detrimental effect of False-negative interactions by discarding the False-negative samples when computing the loss, while the latter turns them into positive ones to enrich the interaction data. Both loss functions can be further instantiated in a full mode and a partial mode to discriminate between different confidence levels of False-negative interactions. Extensive experiments on three benchmark datasets demonstrate the effectiveness of DLI in improving the recommendation performance of backbone models.
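The two loss paradigms in the abstract can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: here a fixed score threshold stands in for the paper's dynamic False-negative selecting module, and plain binary cross-entropy stands in for the backbone model's loss; the full/partial confidence modes are omitted.

```python
import numpy as np

def bce(pred, label, eps=1e-12):
    # Element-wise binary cross-entropy between predicted
    # probabilities and 0/1 labels.
    return -(label * np.log(pred + eps) + (1 - label) * np.log(1 - pred + eps))

def distill_losses(scores, threshold=0.8):
    """Sketch of Truncated Loss and Reversed Loss over unobserved pairs.

    scores: predicted interaction probabilities for *unobserved*
    (assumed-negative) user-item pairs. Pairs scoring above `threshold`
    are treated as likely False-negative interactions (a hypothetical
    stand-in for the paper's dynamic selecting module).
    """
    scores = np.asarray(scores, dtype=float)
    suspect = scores > threshold  # mask of likely False negatives

    # Truncated Loss: drop suspected False negatives from the loss.
    truncated = bce(scores[~suspect], 0.0).mean()

    # Reversed Loss: flip suspected False negatives into positives.
    labels = suspect.astype(float)
    reversed_ = bce(scores, labels).mean()
    return truncated, reversed_
```

With scores `[0.1, 0.2, 0.95]`, the 0.95 pair is flagged: Truncated Loss averages the negative-label loss over only the first two pairs, while Reversed Loss keeps all three but relabels the flagged pair as positive, so both avoid penalizing the model for ranking a plausible latent interaction highly.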

Original language: English
Title of host publication: Neural Information Processing
Subtitle of host publication: 30th International Conference, ICONIP 2023, Changsha, China, November 20–23, 2023, Proceedings, Part XIV
Editors: Biao Luo, Long Cheng, Zheng-Guang Wu, Hongyi Li, Chaojie Li
Place of publication: Singapore
Publisher: Springer, Springer Nature
Pages: 244-257
Number of pages: 14
ISBN (Electronic): 9789819981816
ISBN (Print): 9789819981809
DOIs
Publication status: Published - 2024
Externally published: Yes
Event: 30th International Conference on Neural Information Processing, ICONIP 2023 - Changsha, China
Duration: 20 Nov 2023 – 23 Nov 2023

Publication series

Name: Communications in Computer and Information Science
Volume: 1968
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 30th International Conference on Neural Information Processing, ICONIP 2023
Country/Territory: China
City: Changsha
Period: 20/11/23 – 23/11/23

Keywords

  • Recommender systems
  • Data sparsity
  • Latent interactions
  • Truncated Loss
  • Reversed Loss
