Classification Method for Tibetan Texts Based on In-depth Learning

Cited: 0
Authors
Wang, Lili [1 ]
Wang, Hongyuan [1 ]
Yang, Hongwu [1 ,2 ,3 ]
Affiliations
[1] Northwest Normal Univ, Coll Phys & Elect Engn, Lanzhou 730070, Gansu, Peoples R China
[2] Engn Res Ctr Gansu Prov Intelligent Informat Tech, Lanzhou 730070, Gansu, Peoples R China
[3] Natl & Prov Joint Engn Lab Learning Anal Technol, Lanzhou 730070, Gansu, Peoples R China
Source
PROCEEDINGS OF 2019 IEEE 8TH JOINT INTERNATIONAL INFORMATION TECHNOLOGY AND ARTIFICIAL INTELLIGENCE CONFERENCE (ITAIC 2019) | 2019
Funding
National Natural Science Foundation of China;
Keywords
Tibetan text classification; word vector space; deep neural network; machine learning model;
DOI
10.1109/itaic.2019.8785789
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Text classification is a key technology in the fields of information retrieval and data mining; it can effectively organize disordered information and locate the information a user needs. This paper presents a comparative study of deep neural network models (CNN, RNN, and LSTM) for Tibetan text classification. First, we train a BiLSTM_CRF model to segment the Tibetan texts to be classified, and we construct a word vector space model to obtain word vectors by removing stop words, calculating word frequencies, and extracting feature words. Second, the word vectors are fed into the classification models to train the Tibetan text classifiers. Finally, we use the trained classifiers to categorize Tibetan texts. Experiments show that the deep neural networks outperform traditional text classification methods when the amount of data is large, with the CNN classifier achieving the best results; when the amount of data is small, the SVM model is more effective.
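The abstract outlines a pipeline of word segmentation, word-vector construction, and neural classification but gives no implementation details. The sketch below is a minimal, hypothetical PyTorch TextCNN illustrating only the classification stage (pre-segmented word IDs -> embedding -> 1D convolutions -> class scores); all layer sizes, hyperparameters, and the vocabulary size are assumptions for illustration, not values taken from the paper.

# Hypothetical sketch of the classification stage described in the abstract:
# segmented Tibetan text encoded as word IDs -> embedding -> 1D CNN -> class label.
# Layer sizes, kernel sizes, and class count are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_classes=8,
                 kernel_sizes=(3, 4, 5), num_filters=100):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One 1D convolution per kernel size, sliding over the word-vector sequence.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes]
        )
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):               # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)           # (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)                   # (batch, embed_dim, seq_len)
        # Convolve, apply ReLU, then max-pool over time for each kernel size.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = torch.cat(pooled, dim=1)     # (batch, num_filters * len(kernel_sizes))
        return self.fc(features)                # unnormalized class scores

# Usage example: a batch of two padded, segmented sentences as random word IDs.
model = TextCNN(vocab_size=50000)
batch = torch.randint(1, 50000, (2, 40))
logits = model(batch)
predictions = logits.argmax(dim=1)

An RNN or LSTM classifier for the same comparison would replace the convolution-and-pooling block with a recurrent encoder over the embedded sequence; the rest of the pipeline (segmentation, stop-word removal, word-vector construction) stays the same.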
Pages: 1231-1235
Number of pages: 5
Related Papers
50 records in total
  • [21] Research on Tibetan Text Classification Method Based on Neural Network
    Li, Zhensong
    Zhu, Jie
    Luo, Zhixiang
    Liu, Saihu
    PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2019, : 379 - 383
  • [22] An in-depth study on adversarial learning-to-rank
    Yu, Hai-Tao
    Piryani, Rajesh
    Jatowt, Adam
    Inagaki, Ryo
    Joho, Hideo
    Kim, Kyoung-Sook
    INFORMATION RETRIEVAL JOURNAL, 2023, 26 (01):
  • [23] In-Depth Research and Analysis of Multilabel Learning Algorithm
    Li, Daowang
    Wang, Canwei
    JOURNAL OF SENSORS, 2022, 2022
  • [24] Negotiated assessment and teacher learning: An in-depth exploration
    Verberg, Christel P. M.
    Tigelaar, Dineke E. H.
    Verloop, Nico
    TEACHING AND TEACHER EDUCATION, 2015, 49 : 138 - 148
  • [25] An in-depth study on adversarial learning-to-rank
    Hai-Tao Yu
    Rajesh Piryani
    Adam Jatowt
    Ryo Inagaki
    Hideo Joho
    Kyoung-Sook Kim
    Information Retrieval Journal, 2023, 26
  • [26] Transformer-based approaches for neuroimaging: an in-depth review of their role in classification and regression tasks
    Zhu, Xinyu
    Sun, Shen
    Lin, Lan
    Wu, Yutong
    Ma, Xiangge
    REVIEWS IN THE NEUROSCIENCES, 2025, 36 (02) : 209 - 228
  • [27] In-Depth Learning of Architectural Heritage with Application of Augmented Reality based on Sequential Scenes
    Lin, Keng-Ho
    Tai, Nan-Ching
    2020 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS - TAIWAN (ICCE-TAIWAN), 2020
  • [28] Transverse alignment analysis and in-depth mining of web-based learning resources
    Wang, Hong
    Hui, Lianxiao
    INTERNATIONAL JOURNAL OF CONTINUING ENGINEERING EDUCATION AND LIFE-LONG LEARNING, 2013, 23 (3-4) : 357 - 366
  • [29] A learning-based approach for performing an in-depth literature search using MEDLINE
    Young, S.
    Duffull, S. B.
    JOURNAL OF CLINICAL PHARMACY AND THERAPEUTICS, 2011, 36 (04) : 504 - 512
  • [30] A Novel Method for the In-Depth Multimodal Analysis of Student Learning Trajectories in Intelligent Tutoring Systems
    Liu, Ran
    Stamper, John
    Davenport, Jodi
    JOURNAL OF LEARNING ANALYTICS, 2018, 5 (01): : 41 - 54