Classification Method for Tibetan Texts Based on In-depth Learning

Cited by: 0
Authors
Wang, Lili [1]
Wang, Hongyuan [1]
Yang, Hongwu [1,2,3]
Affiliations
[1] Northwest Normal Univ, Coll Phys & Elect Engn, Lanzhou 730070, Gansu, Peoples R China
[2] Engn Res Ctr Gansu Prov Intelligent Informat Tech, Lanzhou 730070, Gansu, Peoples R China
[3] Natl & Prov Joint Engn Lab Learning Anal Technol, Lanzhou 730070, Gansu, Peoples R China
Source
PROCEEDINGS OF 2019 IEEE 8TH JOINT INTERNATIONAL INFORMATION TECHNOLOGY AND ARTIFICIAL INTELLIGENCE CONFERENCE (ITAIC 2019) | 2019
Funding
National Natural Science Foundation of China;
Keywords
Tibetan text classification; word vector space; deep neural network; machine learning model;
DOI
10.1109/itaic.2019.8785789
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Text classification is a key technology in the fields of information retrieval and data mining; it can effectively organize messy information and locate the information a user needs. This paper presents a comparative study of the deep neural network CNN, RNN, and LSTM models on Tibetan text classification. Firstly, we train a BiLSTM-CRF model to segment the Tibetan texts to be classified, then construct a word vector space model to obtain word vectors by removing stop words, computing word frequencies, and extracting feature words. Secondly, the word vectors are fed into a classification model to train a Tibetan text classifier. Finally, we use the trained classifier to classify Tibetan texts. Experiments show that deep neural networks achieve better classification performance than traditional text classification methods when the amount of data is large; among them, the CNN classifier performs best. When the amount of data is small, the SVM model is more effective.
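The abstract describes a pipeline of word segmentation, word-vector construction, and neural classification. Below is a minimal, hypothetical Python sketch of the CNN classification stage applied to already-segmented text; the Keras-based implementation, all hyperparameters, and the placeholder names (segmented_texts, labels, vocab_size, etc.) are illustrative assumptions, not the authors' actual code or reported settings.

```python
# Sketch: CNN text classifier over segmented Tibetan text (illustrative only).
# Assumes each document is a string of space-separated segmented words,
# e.g. as produced by a BiLSTM-CRF segmenter mentioned in the abstract.
import numpy as np
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras import layers, models

segmented_texts = ["...", "..."]   # placeholder: segmented Tibetan documents
labels = np.array([0, 1])          # placeholder: integer class labels

vocab_size, max_len, embedding_dim, num_classes = 20000, 200, 128, 2  # assumed values

# Build the word index and convert each document to a padded sequence of word ids.
tokenizer = Tokenizer(num_words=vocab_size)
tokenizer.fit_on_texts(segmented_texts)
x = pad_sequences(tokenizer.texts_to_sequences(segmented_texts), maxlen=max_len)

# A small 1-D CNN: embedding -> convolution -> global max pooling -> softmax.
model = models.Sequential([
    layers.Embedding(vocab_size, embedding_dim),
    layers.Conv1D(128, 5, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, labels, epochs=5, batch_size=32)
```

In the same spirit, the RNN/LSTM comparisons in the paper could be sketched by swapping the Conv1D and pooling layers for a recurrent layer, and the SVM baseline by feeding TF-IDF features to a linear classifier; those variants are omitted here for brevity.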
Pages: 1231-1235
Number of pages: 5
Related Papers
50 records in total
  • [31] A Method of Chinese Texts Sentiment Classification Based on Bayesian Algorithm
    Yang, Aimin
    Zhou, Yongmei
    Lin, Jianghao
    INFORMATION TECHNOLOGY APPLICATIONS IN INDUSTRY, PTS 1-4, 2013, 263-266 : 2185 - +
  • [32] Sentiment Classification Method Based on Blending of Emoticons and Short Texts
    Zou, Haochen
    Xiang, Kun
    ENTROPY, 2022, 24 (03)
  • [33] ACCIDENTS - IN-DEPTH ANALYSIS - TOWARDS A METHOD AIDA
    STOOP, JA
    SAFETY SCIENCE, 1995, 19 (2-3) : 125 - 136
  • [34] BLAGA'S HARMONIOUS MELANCHOLY: AN IN-DEPTH SEMANTIC APPROACH TO SEVERAL POETIC TEXTS
    Marian, Rodica
    DACOROMANIA, 2006, 11-12 : 251 - 261
  • [35] New classification of gastric polyps: An in-depth analysis and critical evaluation
    Liao, Xiao-Hui
    Sun, Ying-Ming
    Chen, Hong-Bin
    WORLD JOURNAL OF GASTROENTEROLOGY, 2025, 31 (07)
  • [36] In-depth analysis of SVM kernel learning and its components
    Roman, Ibai
    Santana, Roberto
    Mendiburu, Alexander
    Lozano, Jose A.
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (12): 6575 - 6594
  • [37] Semi-unsupervised Learning: An In-depth Parameter Analysis
    Davidson, Padraig
    Buckermann, Florian
    Steininger, Michael
    Krause, Anna
    Hotho, Andreas
    ADVANCES IN ARTIFICIAL INTELLIGENCE, KI 2021, 2021, 12873 : 51 - 66
  • [38] Children's Conversations Reveal In-Depth Learning at the Zoo
    Collins, Courtney
    McKeown, Sean
    McSweeney, Lynda
    Flannery, Kevin
    Kennedy, Declan
    O'Riordan, Ruth
    ANTHROZOOS, 2021, 34 (01): 17 - 32
  • [39] Visual attention methods in deep learning: An in-depth survey
    Hassanin, Mohammed
    Anwar, Saeed
    Radwan, Ibrahim
    Khan, Fahad Shahbaz
    Mian, Ajmal
    INFORMATION FUSION, 2024, 108
  • [40] Modern machine learning and particle physics: an in-depth review
    Bhattacherjee, Biplob
    Mukherjee, Swagata
    EUROPEAN PHYSICAL JOURNAL-SPECIAL TOPICS, 2024, 233 (15-16): 2421 - 2424