Transformer-based approach to variable typing

Cited by: 0
Authors
Rey, Charles Arthel [1 ]
Danguilan, Jose Lorenzo [1 ]
Mendoza, Karl Patrick [1 ]
Remolona, Miguel Francisco [1 ]
Affiliations
[1] University of the Philippines Diliman, Department of Chemical Engineering, Chemical Engineering Intelligence Learning Lab, Quezon City 1101, Philippines
Keywords
Natural language processing; Transformers; Entity recognition; Relation extraction; Variable typing; Machine learning; Mathematical knowledge; Named entity recognition
DOI
10.1016/j.heliyon.2023.e20505
Chinese Library Classification
O [Mathematical sciences and chemistry]; P [Astronomy and earth sciences]; Q [Biological sciences]; N [General natural sciences]
Discipline codes
07; 0710; 09
Abstract
The upsurge of multifarious endeavors across scientific fields propelled Big Data in the scientific domain. Despite the advancements in management systems, researchers find that mathematical knowledge remains one of the most challenging to manage due to its inherent heterogeneity. One novel recourse being explored is variable typing, where current works remain preliminary and thus provide wide room for contribution. In this study, a first attempt to implement an end-to-end Entity Recognition (ER) and Relation Extraction (RE) approach to variable typing was made using the BERT (Bidirectional Encoder Representations from Transformers) model. A micro-dataset was developed for this process. According to our findings, the ER model and RE model, respectively, have a Precision of 0.8142 and 0.4919, a Recall of 0.7816 and 0.6030, and F1-Scores of 0.7975 and 0.5418. Despite the limited dataset, the models performed on par with values in the literature. This work also discusses the factors affecting this BERT-based approach.
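For reference, the reported scores are consistent with the harmonic-mean definition F1 = 2PR / (P + R): 2(0.8142)(0.7816) / (0.8142 + 0.7816) ≈ 0.798 for ER and 2(0.4919)(0.6030) / (0.4919 + 0.6030) ≈ 0.542 for RE. The sketch below illustrates what the entity-recognition half of such a pipeline could look like with the Hugging Face transformers library; the BIO tag set, checkpoint, and example sentence are illustrative assumptions, not the authors' dataset or code.

```python
# Minimal, hypothetical sketch of a BERT-based token-classification (entity
# recognition) step for variable typing, using Hugging Face `transformers`.
# The BIO tags, checkpoint, and sentence are assumptions for illustration only.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed BIO tags: mark variable symbols and the phrases that describe them.
LABELS = ["O", "B-VAR", "I-VAR", "B-TYPE", "I-TYPE"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=len(LABELS),
    id2label=dict(enumerate(LABELS)),
    label2id={label: i for i, label in enumerate(LABELS)},
)

sentence = "Let T denote the absolute temperature of the reactor."
inputs = tokenizer(sentence, return_tensors="pt")

model.eval()
with torch.no_grad():
    logits = model(**inputs).logits          # shape: (1, seq_len, num_labels)
pred_ids = logits.argmax(dim=-1).squeeze(0)  # one predicted label per subword

# Print each subword with its (untrained) predicted tag; fine-tuning on an
# annotated corpus is required before these predictions are meaningful.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, pred_ids.tolist()):
    print(f"{token:>15}  {LABELS[label_id]}")
```

A relation-extraction classifier that links each recognized variable span to a candidate type description would sit on top of the same encoder, giving the end-to-end ER and RE shape described in the abstract; it is omitted here for brevity.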
Pages: 12
Related papers
50 records in total
  • [1] Transformer-Based Approach to Melanoma Detection
    Cirrincione, Giansalvo
    Cannata, Sergio
    Cicceri, Giovanni
    Prinzi, Francesco
    Currieri, Tiziana
    Lovino, Marta
    Militello, Carmelo
    Pasero, Eros
    Vitabile, Salvatore
    SENSORS, 2023, 23 (12)
  • [2] A Two-Stage Transformer-Based Approach for Variable-Length Abstractive Summarization
    Su, Ming-Hsiang
    Wu, Chung-Hsien
    Cheng, Hao-Tse
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2020, 28 : 2061 - 2072
  • [3] A Sparse Transformer-Based Approach for Image Captioning
    Lei, Zhou
    Zhou, Congcong
    Chen, Shengbo
    Huang, Yiyong
    Liu, Xianrui
    IEEE Access, 2020, 8 : 213437 - 213446
  • [4] A transformer-based approach to irony and sarcasm detection
    Potamias, Rolandos Alexandros
    Siolas, Georgios
    Stafylopatis, Andreas-Georgios
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (23) : 17309 - 17320
  • [5] TRANSFORMER-BASED APPROACH FOR DOCUMENT LAYOUT UNDERSTANDING
    Yang, Huichen
    Hsu, William
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022 : 4043 - 4047
  • [6] TSLocator: A Transformer-Based Approach to Bug Localization
    Hu, Cheng
    Xiao, Yuliang
    Wuhan University Journal of Natural Sciences, 2021, 26 (02) : 200 - 206
  • [7] Transformer-based Image Compression with Variable Image Quality Objectives
    Kao, Chia-Hao
    Chen, Yi-Hsin
    Chien, Cheng
    Chiu, Wei-Chen
    Peng, Wen-Hsiao
    2023 ASIA PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE, APSIPA ASC, 2023 : 1718 - 1725
  • [8] Smart Transformer-based Frequency Support in Variable Inertia Conditions
    Langwasser, Marius
    De Carne, Giovanni
    Liserre, Marco
    2019 IEEE 13TH INTERNATIONAL CONFERENCE ON COMPATIBILITY, POWER ELECTRONICS AND POWER ENGINEERING (CPE-POWERENG), 2019