DTSMLA: A dynamic task scheduling multi-level attention model for stock ranking

Cited: 0
Authors
Du, Yuanchuang [1 ]
Xie, Liang [1 ]
Liao, Sihao [1 ]
Chen, Shengshuang [1 ]
Wu, Yuchen [1 ]
Xu, Haijiao [2 ]
Affiliations
[1] Wuhan Univ Technol, Sch Sci, Dept Math, Wuhan 430070, Peoples R China
[2] Guangdong Univ Educ, Sch Comp Sci, Guangzhou 510303, Peoples R China
Keywords
Stock ranking; Market index; Multi-task learning; Task scheduling;
DOI
10.1016/j.eswa.2023.122956
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Predicting stock ranking is a complex and challenging task due to the intricate nature of real stock market systems. Two main obstacles prevent current methods from directly using historical trading data to predict stock rankings: (1) stock ranking prediction is influenced by a variety of factors, such as the temporal dependence of each stock, the spatial correlation between stocks, and the influence of the market on individual stocks; (2) the stock market's considerable noise makes directly predicting stock rankings on a given day highly volatile, which complicates model training. To overcome these two challenges, a dynamic task scheduling multi-level attention model (DTSMLA) is proposed to enhance stock ranking prediction through market indices and multi-task learning. To handle the complexity of stock data, we synthesize the factors affecting stocks and propose a multi-level attention prediction framework. To address the noise issue, we introduce multi-task learning and use a task scheduler algorithm to dynamically select auxiliary tasks for model training. Experimental results on four real-world stock datasets demonstrate that our method outperforms several state-of-the-art approaches.
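The dynamic task-scheduling idea mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's DTSMLA scheduler; the loss-proportional sampling rule, function name, and parameters below are assumptions chosen only to show what "dynamically selecting auxiliary tasks during training" could look like:

```python
import math
import random

def schedule_auxiliary_task(aux_losses, temperature=1.0, rng=None):
    """Pick one auxiliary-task index, favoring tasks whose recent training
    loss is still high (i.e., tasks the model has not yet learned well).

    aux_losses  -- recent loss value per auxiliary task
    temperature -- higher values flatten the selection distribution
    rng         -- optional random.Random for reproducibility
    """
    rng = rng or random.Random(0)
    # Softmax over losses: a higher recent loss yields a higher
    # probability of being selected for the next training step.
    exps = [math.exp(loss / temperature) for loss in aux_losses]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sample a task index according to these probabilities.
    r, acc = rng.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1
```

In a training loop, the sampled index would decide which auxiliary loss is added to the main stock-ranking loss at each step; the actual selection criterion in the paper may differ.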
Pages: 14