Predicting stock market trends with self-supervised learning

Cited by: 7
Authors
Ying, Zelin [1 ,2 ]
Cheng, Dawei [3 ,4 ]
Chen, Cen [1 ]
Li, Xiang [1 ]
Zhu, Peng [3 ]
Luo, Yifeng [1 ]
Liang, Yuqi [5 ]
Affiliations
[1] East China Normal Univ, Sch Data Sci & Engn, Shanghai, Peoples R China
[2] ByteDance Inc, Shanghai, Peoples R China
[3] Tongji Univ, Dept Comp Sci & Technol, Shanghai, Peoples R China
[4] Shanghai Artificial Intelligence Lab, Shanghai, Peoples R China
[5] Emoney Inc, Seek Data Grp, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Sequence embeddings; Self-supervised learning; Multi-task joint learning; Stock trends prediction; ARIMA; MODEL; NEWS;
DOI
10.1016/j.neucom.2023.127033
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Predicting stock market trends is a basic daily task for investors in the stock trading market. Traditional trend-prediction models are generally based on hand-crafted factors or features, which rely heavily on costly expert knowledge. Moreover, such models struggle to discover the hidden features contained in stock time series data that would otherwise help predict market trends. In this paper, we propose SMART, a novel stock market trends prediction framework built on S3E, a self-supervised stock technical data sequence embedding model. Specifically, the model encodes stock technical data sequences into embeddings, which are further trained with multiple self-supervised auxiliary tasks. With the learned sequence embeddings, we predict stock market trends using an LSTM followed by a feed-forward neural network. We conduct extensive experiments on the China A-shares market and the NASDAQ market, showing that our model is highly effective for stock market trends prediction. We further deploy SMART at a leading financial service provider in China, and the results demonstrate the effectiveness of the proposed method in real-world applications.
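The prediction pipeline the abstract describes (sequence embeddings fed to an LSTM, then a feed-forward head producing a trend label) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the S3E encoder and its auxiliary tasks are not reproduced here, so the embeddings are random stand-ins, and `TinyLSTM`, `predict_trend`, and all weights are hypothetical names with untrained values.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTM:
    """Minimal single-cell LSTM, a stand-in for the paper's sequence model."""
    def __init__(self, d_in, d_hid):
        s = 1.0 / np.sqrt(d_hid)
        # One weight matrix and bias per gate: input, forget, candidate, output.
        self.W = rng.uniform(-s, s, (4, d_hid, d_in + d_hid))
        self.b = np.zeros((4, d_hid))
        self.d_hid = d_hid

    def forward(self, seq):
        h = np.zeros(self.d_hid)
        c = np.zeros(self.d_hid)
        for x in seq:                                  # one step per trading day
            z = np.concatenate([x, h])
            i = sigmoid(self.W[0] @ z + self.b[0])     # input gate
            f = sigmoid(self.W[1] @ z + self.b[1])     # forget gate
            g = np.tanh(self.W[2] @ z + self.b[2])     # candidate cell state
            o = sigmoid(self.W[3] @ z + self.b[3])     # output gate
            c = f * c + i * g
            h = o * np.tanh(c)
        return h                                       # final hidden state

def predict_trend(seq_embeddings, lstm, W_ff, b_ff):
    """Feed-forward head on the final LSTM state -> up/down probabilities."""
    h = lstm.forward(seq_embeddings)
    logits = W_ff @ h + b_ff
    e = np.exp(logits - logits.max())                  # numerically stable softmax
    return e / e.sum()

# Toy run: 20 days of 8-dim "sequence embeddings" (stand-ins for S3E output).
d_in, d_hid = 8, 16
lstm = TinyLSTM(d_in, d_hid)
W_ff = rng.uniform(-0.1, 0.1, (2, d_hid))
b_ff = np.zeros(2)
seq = rng.normal(size=(20, d_in))
probs = predict_trend(seq, lstm, W_ff, b_ff)
print(probs)  # [P(down), P(up)] for the next period
```

In the actual framework, the embeddings would come from the trained S3E encoder and all weights would be learned jointly with the self-supervised auxiliary tasks; the sketch only shows the shape of the downstream prediction path.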
Pages: 11