Multi-Class Document Classification Using Lexical Ontology-Based Deep Learning †

Cited by: 2
Authors
Yelmen, Ilkay [1 ,2 ]
Gunes, Ali [1 ]
Zontul, Metin [3 ]
Affiliations
[1] Istanbul Aydin Univ, Fac Engn, Dept Comp Engn, TR-34295 Istanbul, Turkiye
[2] Turkcell Grp Co Digital Educ Technol Inc, TR-06800 Ankara, Turkiye
[3] Sivas Sci & Technol Univ, Fac Engn & Nat Sci, Dept Comp Engn, TR-58100 Sivas, Turkiye
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 10
Keywords
document classification; multi-class classification; word embeddings; WordNet; BERT; text classification
DOI
10.3390/app13106139
Chinese Library Classification
O6 [Chemistry]
Discipline Code
0703
Abstract
With the recent growth of the Internet, the volume of data has also increased. In particular, the growth of unstructured data makes data management difficult, and classification is needed before the data can be used for various purposes. Since manually classifying an ever-increasing volume of data for analysis and evaluation is impractical, automatic classification methods are required. Moreover, imbalanced, multi-class classification is a challenging task: as the number of classes increases, so does the number of decision boundaries the learning algorithm must resolve. Therefore, this paper proposes an improved model that uses the WordNet lexical ontology together with BERT to learn deeper features of text, thereby improving classification performance. Classification accuracy increased when 11 of WordNet's general lexicographer files, which group synonym sets (synsets) by syntactic category and logical topic, were used. WordNet served as a feature-dimension-reduction step. In the experiments, word-embedding methods were first applied without dimension reduction, and Random Forest (RF), Support Vector Machine (SVM), and Multi-Layer Perceptron (MLP) classifiers were then trained; the experiments were subsequently repeated with WordNet-based dimension reduction. In addition to these machine-learning models, experiments were conducted with a pretrained BERT model, both with and without WordNet. On an unstructured, imbalanced, seven-class dataset, the proposed model achieved the highest accuracy of 93.77%.
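The WordNet-based dimension reduction the abstract describes can be sketched as mapping each word to its lexicographer-file category and counting categories instead of words. The sketch below is a minimal illustration, not the authors' implementation: the `LEXNAME` dictionary is a toy stand-in for the real lookup (in practice, e.g., NLTK's `wn.synsets(w)[0].lexname()`), and the category list is illustrative rather than the paper's 11 files.

```python
from collections import Counter

# Toy stand-in for WordNet's lexicographer-file lookup. In a real pipeline
# each word's synset supplies this label (e.g. NLTK: wn.synsets(w)[0].lexname()).
LEXNAME = {
    "dog": "noun.animal", "cat": "noun.animal",
    "run": "verb.motion", "walk": "verb.motion",
    "idea": "noun.cognition", "theory": "noun.cognition",
}

# A small, fixed set of lexicographer categories replaces the open vocabulary,
# so each document becomes a short, dense count vector (the dimension reduction).
CATEGORIES = ["noun.animal", "verb.motion", "noun.cognition"]

def reduce_features(tokens):
    """Map tokens to lexicographer categories and count occurrences."""
    counts = Counter(LEXNAME[t] for t in tokens if t in LEXNAME)
    return [counts[c] for c in CATEGORIES]

print(reduce_features(["dog", "run", "cat", "idea"]))  # -> [2, 1, 1]
```

The resulting low-dimensional vectors can then be fed to RF, SVM, or MLP classifiers in place of the full word-embedding or bag-of-words features.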
Pages: 22