A Transformer-Based Substitute Recommendation Model Incorporating Weakly Supervised Customer Behavior Data

Cited: 0
Authors
Ye, Wenting [1 ]
Yang, Hongfei [1 ]
Zhao, Shuai [1 ]
Fang, Haoyang [1 ]
Shi, Xingjian [2 ]
Neppalli, Naveen [1 ]
Affiliations
[1] Amazon Retails, Seattle, WA 98109 USA
[2] AWS AI, Santa Clara, CA USA
Source
PROCEEDINGS OF THE 46TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2023 | 2023
Keywords
substitute recommendation; multilingual; weakly supervised learning; natural language processing; selection bias; implicit feedback;
DOI
10.1145/3539618.3591847
Chinese Library Classification
TP [Automation technology, computer technology];
Discipline code
0812 ;
Abstract
Substitute-based recommendation is widely used in E-commerce to offer customers better alternatives. However, existing research typically relies on customer behavior signals such as co-view and view-but-purchase-another to capture the substitute relationship. Despite its intuitive soundness, such an approach may ignore the functionality and characteristics of products. In this paper, we cast substitute recommendation as a language matching problem: the model takes product title descriptions as input so that product functionality is taken into account. We design a new transformation method to de-noise the signals derived from production data, and we address multilingual support from an engineering point of view. Our proposed end-to-end transformer-based model succeeds in both offline and online experiments. It has been deployed on a large-scale E-commerce website across 11 marketplaces in 6 languages, and an online A/B experiment shows it increases revenue by 19%.
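The language-matching view described in the abstract can be sketched as a bi-encoder: embed each product title, then rank candidate substitutes by embedding similarity. The sketch below is illustrative only; a toy hashed bag-of-words encoder stands in for the paper's transformer encoder, and all function names (`embed_title`, `substitute_score`, `rank_substitutes`) are hypothetical, not from the paper.

```python
import hashlib
import math

def embed_title(title: str, dim: int = 64) -> list[float]:
    """Toy title encoder: an L2-normalized hashed bag-of-words vector.
    In the paper's setting, a transformer sentence embedding would be
    used here instead."""
    vec = [0.0] * dim
    for tok in title.lower().split():
        h = int(hashlib.md5(tok.encode("utf-8")).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def substitute_score(query_title: str, candidate_title: str) -> float:
    """Cosine similarity between title embeddings; a higher score means
    a more plausible substitute under the language-matching view."""
    a = embed_title(query_title)
    b = embed_title(candidate_title)
    return sum(x * y for x, y in zip(a, b))

def rank_substitutes(query_title: str, candidates: list[str]) -> list[str]:
    """Rank candidate products by descending substitute score."""
    return sorted(candidates,
                  key=lambda c: substitute_score(query_title, c),
                  reverse=True)
```

For example, given the query title "wireless bluetooth headphones", a candidate sharing functional terms ("wireless bluetooth earbuds") scores higher than an unrelated product ("garden hose"), which is the behavior a title-based substitute model is meant to capture.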
Pages: 3325-3329
Page count: 5
Related papers
24 entries in total
  • [1] CLFormer: a unified transformer-based framework for weakly supervised crowd counting and localization
    Deng, Mingfang
    Zhao, Huailin
    Gao, Ming
    VISUAL COMPUTER, 2024, 40 (02) : 1053 - 1067
  • [3] Automatic Detection of Sensitive Data Using Transformer-Based Classifiers
    Petrolini, Michael
    Cagnoni, Stefano
    Mordonini, Monica
    FUTURE INTERNET, 2022, 14 (08)
  • [4] Contrastive Transformer-Based Multiple Instance Learning for Weakly Supervised Polyp Frame Detection
    Tian, Yu
    Pang, Guansong
    Liu, Fengbei
    Liu, Yuyuan
    Wang, Chong
    Chen, Yuanhong
    Verjans, Johan
    Carneiro, Gustavo
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION, MICCAI 2022, PT III, 2022, 13433 : 88 - 98
  • [5] Automatic identification of suicide notes with a transformer-based deep learning model
    Zhang, Tianlin
    Schoene, Annika M.
    Ananiadou, Sophia
    INTERNET INTERVENTIONS-THE APPLICATION OF INFORMATION TECHNOLOGY IN MENTAL AND BEHAVIOURAL HEALTH, 2021, 25
  • [6] A weakly-supervised transformer-based hybrid network with multi-attention for pavement crack detection
    Wang, Zhenlin
    Leng, Zhufei
    Zhang, Zhixin
    CONSTRUCTION AND BUILDING MATERIALS, 2024, 411
  • [7] Enhancing the accuracy of transformer-based embeddings for sentiment analysis in social big data
    Zemzem, Wiem
    Tagina, Moncef
    INTERNATIONAL JOURNAL OF COMPUTER APPLICATIONS IN TECHNOLOGY, 2023, 73 (03) : 169 - 177
  • [8] RoBIn: A Transformer-based model for risk of bias inference with machine reading comprehension
    Dias, Abel Correa
    Moreira, Viviane Pereira
    Comba, Joao Luiz Dihl
    JOURNAL OF BIOMEDICAL INFORMATICS, 2025, 166
  • [9] A Systematic Review of Transformer-Based Pre-Trained Language Models through Self-Supervised Learning
    Kotei, Evans
    Thirunavukarasu, Ramkumar
    INFORMATION, 2023, 14 (03)
  • [10] Sentiment Mining in E-Commerce: The Transformer-based Deep Learning Model
    Alsaedi, Tahani
    Nawaz, Asif
    Alahmadi, Abdulrahman
    Rana, Muhammad Rizwan Rashid
    Raza, Ammar
    INTERNATIONAL JOURNAL OF ELECTRICAL AND COMPUTER ENGINEERING SYSTEMS, 2024, 15 (08) : 641 - 650