NEIGHBOR-AUGMENTED TRANSFORMER-BASED EMBEDDING FOR RETRIEVAL

Cited: 0
Authors
Zhang, Jihai [1 ]
Lin, Fangquan [1 ]
Jiang, Wei [1 ]
Yang, Cheng [1 ]
Liu, Gaoge [2 ]
Affiliations
[1] Alibaba Grp, Hangzhou, Zhejiang, Peoples R China
[2] Columbia Univ, New York, NY 10027 USA
Source
2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP) | 2022
Keywords
embedding; neighbor graph; transformer
DOI
10.1109/ICASSP43922.2022.9746140
Chinese Library Classification
O42 [Acoustics]
Subject Classification Codes
070206; 082403
Abstract
With the rapid evolution of e-commerce, it is essential yet challenging to quickly provide recommendation services to users. A recommender system can be divided into two stages: retrieval and ranking. However, most recent academic research has focused on the second stage with datasets of limited size, while the role of retrieval is heavily underestimated. Generally, graph-based or sequential models are used to generate item embeddings for the retrieval task. However, graph-based methods suffer from over-smoothing, while sequential models are strongly affected by data sparsity. To alleviate these issues, we propose NATM, a novel embedding-based method for large-scale learning that incorporates both graph-based and sequential information. NATM consists of two key components: i) neighbor-augmented graph construction from user behaviors to enhance item embeddings and mitigate data sparsity, followed by ii) a transformer-based representation network trained to minimize the NCE loss. The competitive performance of the proposed method is demonstrated through comprehensive experiments, including a benchmark study on the MovieLens dataset and a real-world e-commerce scenario at Alibaba Group.
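The abstract describes the architecture only at a high level. Below is a minimal, illustrative sketch (not the authors' released code) of the general idea it outlines: each item's embedding is augmented with neighbors sampled from a user-behavior graph, the augmented sequence is encoded by a Transformer, and training minimizes an NCE-style contrastive objective. All dimensions, layer counts, the neighbor-sampling scheme, and the use of in-batch negatives are assumptions made purely for illustration.

```python
# Hypothetical sketch of a neighbor-augmented transformer embedding model.
# Hyperparameters and the neighbor/negative sampling scheme are assumptions,
# not details taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NeighborAugmentedTransformer(nn.Module):
    def __init__(self, num_items: int, d_model: int = 64,
                 num_layers: int = 2, nhead: int = 4):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=4 * d_model,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)

    def forward(self, item_ids: torch.Tensor, neighbor_ids: torch.Tensor) -> torch.Tensor:
        """item_ids: (B,); neighbor_ids: (B, K) neighbors from the behavior graph."""
        # Sequence = [target item, neighbor_1, ..., neighbor_K]
        tokens = torch.cat([item_ids.unsqueeze(1), neighbor_ids], dim=1)  # (B, 1+K)
        x = self.item_emb(tokens)                                         # (B, 1+K, d)
        h = self.encoder(x)                                               # (B, 1+K, d)
        # Take the target-item position as the final item embedding.
        return F.normalize(h[:, 0, :], dim=-1)


def nce_in_batch_loss(query: torch.Tensor, positive: torch.Tensor,
                      temperature: float = 0.1) -> torch.Tensor:
    """NCE-style contrastive loss where other items in the batch act as negatives."""
    logits = query @ positive.t() / temperature          # (B, B) similarity matrix
    labels = torch.arange(query.size(0), device=query.device)
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    model = NeighborAugmentedTransformer(num_items=1000)
    items = torch.randint(0, 1000, (8,))
    neighbors = torch.randint(0, 1000, (8, 5))   # sampled from the neighbor graph
    positives = torch.randint(0, 1000, (8,))     # e.g. items co-clicked by the same user
    q = model(items, neighbors)
    p = model(positives, torch.randint(0, 1000, (8, 5)))
    loss = nce_in_batch_loss(q, p)
    loss.backward()
    print(loss.item())
```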
Pages: 3893-3897
Number of pages: 5
Related Papers
50 records in total
  • [1] A Transformer-based Embedding Model for Personalized Product Search
    Bi, Keping
    Ai, Qingyao
    Croft, W. Bruce
    PROCEEDINGS OF THE 43RD INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '20), 2020, : 1521 - 1524
  • [2] Transformer-based embedding applied to classify bacterial species using sequencing reads
    Gwak, Ho-Jin
    Rho, Mina
    2022 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (IEEE BIGCOMP 2022), 2022, : 374 - 377
  • [3] Transformer-Based Distillation Hash Learning for Image Retrieval
    Lv, Yuanhai
    Wang, Chongyan
    Yuan, Wanteng
    Qian, Xiaohao
    Yang, Wujun
    Zhao, Wanqing
    ELECTRONICS, 2022, 11 (18)
  • [4] TRIG: Transformer-Based Text Recognizer with Initial Embedding Guidance
    Tao, Yue
    Jia, Zhiwei
    Ma, Runze
    Xu, Shugong
    ELECTRONICS, 2021, 10 (22)
  • [5] Patent image retrieval using transformer-based deep metric learning
    Higuchi, Kotaro
    Yanai, Keiji
    WORLD PATENT INFORMATION, 2023, 74
  • [6] MULTI-SCALE TRANSFORMER-BASED FEATURE COMBINATION FOR IMAGE RETRIEVAL
    Roig Mari, Carlos
    Varas Gonzalez, David
    Bou-Balust, Elisenda
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 3166 - 3170
  • [7] Transformer-Based End-to-End Speech Translation With Rotary Position Embedding
    Li, Xueqing
    Li, Shengqiang
    Zhang, Xiao-Lei
    Rahardja, Susanto
    IEEE SIGNAL PROCESSING LETTERS, 2024, 31 : 371 - 375
  • [8] PE-Attack: On the Universal Positional Embedding Vulnerability in Transformer-Based Models
    Gao, Shiqi
    Zhou, Haoyi
    Chen, Tianyu
    He, Mingrui
    Xu, Runhua
    Li, Jianxin
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2024, 19 : 9359 - 9373
  • [9] HEART: Historically Information Embedding and Subspace Re-Weighting Transformer-Based Tracking
    Liu, Tianpeng
    Li, Jing
    Beheshti, Amin
    Wu, Jia
    Chang, Jun
    Song, Beihang
    Lian, Lezhi
    IEEE TRANSACTIONS ON BIG DATA, 2025, 11 (02) : 566 - 577
  • [10] Semanformer: Semantics-aware Embedding Dimensionality Reduction Using Transformer-Based Models
    Boyapati, Mallika
    Aygun, Ramazan
    18TH IEEE INTERNATIONAL CONFERENCE ON SEMANTIC COMPUTING, ICSC 2024, 2024, : 134 - 141