Transformer-Based Approaches for Purépecha Translation: Advancing Indigenous Language Preservation

Cited by: 0
Authors
Gonzalez-Servin, Cecilia [1 ]
Sidorov, Grigori [1 ]
Maldonado-Sifuentes, Christian Efrain [2 ]
Nunez-Prado, Cesar Jesus [3 ]
Affiliations
[1] Inst Politecn Nacl, CIC IPN, Mexico City, Mexico
[2] Conahcyt, Mexico City, Mexico
[3] Inst Politecn Nacl, ESIMEZ, Mexico City, Mexico
Keywords
Machine Translation; Transformer Networks; Indigenous Languages; Purépecha; Neural Machine Translation;
DOI
10.61467/2007.1558.2025.v16i1.595
Chinese Library Classification: O29 [Applied Mathematics]
Discipline code: 070104
Abstract
Indigenous languages like Purépecha face significant challenges in the modern era, particularly due to limited digital resources and a dwindling number of speakers. This study, conducted by researchers from CIC-IPN and CONACYT, presents an innovative application of transformer-based neural networks for the automatic translation of Purépecha to Spanish. Unlike previous transformer-based works, this study develops a unique bilingual corpus through an algorithm based on the verbal inflection of Purépecha verbs, generating simple sentences in Purépecha together with their corresponding Spanish translations. This corpus was then used to train a transformer model for automatic translation. The results indicate the potential of artificial intelligence to contribute to the preservation and revitalization of indigenous languages, opening new possibilities in automatic translation and other natural language processing tasks.
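The corpus-generation idea described in the abstract (crossing verb stems with inflection suffixes to yield parallel sentence pairs) could be sketched as follows. This is a minimal illustration only: the stems, suffixes, and glosses below are invented placeholders, not real Purépecha morphology, and the paper's actual inflection tables and generation algorithm are not reproduced here.

```python
from itertools import product

# HYPOTHETICAL morphology: invented stems and suffixes for illustration
# only; they do not reflect the paper's real Purépecha paradigms.
STEMS = {
    "stemA-": "com",  # glossed as Spanish "comer" (to eat), regular -er verb
    "stemB-": "viv",  # glossed as Spanish "vivir" (to live), regular -ir verb
}

# Invented person suffixes paired with a Spanish subject pronoun and the
# regular present-tense ending (identical for -er/-ir in these persons).
SUFFIXES = {
    "-ska1": ("yo", "o"),   # 1st person singular
    "-ska2": ("tú", "es"),  # 2nd person singular
    "-ti":   ("él", "e"),   # 3rd person singular
}

def generate_pairs():
    """Cross every stem with every suffix to form (source, Spanish) pairs."""
    pairs = []
    for (stem, root), (suffix, (pronoun, ending)) in product(
        STEMS.items(), SUFFIXES.items()
    ):
        source = stem.rstrip("-") + suffix   # inflected source-side token
        spanish = f"{pronoun} {root}{ending}"  # matching Spanish sentence
        pairs.append((source, spanish))
    return pairs
```

With two stems and three suffixes this yields six aligned pairs (e.g. `("stemA-ska1", "yo como")`); scaling the stem and suffix inventories is what lets such an algorithm produce a training corpus large enough for a transformer model.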
Pages: 64-74 (11 pages)
Related Articles (50 in total)
  • [1] Incorporating Relative Position Information in Transformer-Based Sign Language Recognition and Translation
    Aloysius, Neena
    Geetha, M.
    Nedungadi, Prema
    IEEE ACCESS, 2021, 9 : 145929 - 145942
  • [3] SignNet II: A Transformer-Based Two-Way Sign Language Translation Model
    Chaudhary, Lipisha
    Ananthanarayana, Tejaswini
    Hoq, Enjamamul
    Nwogu, Ifeoma
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (11) : 12896 - 12907
  • [4] The Case for Translation-Invariant Self-Attention in Transformer-Based Language Models
    Wennberg, Ulme
    Henter, Gustav Eje
    ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 2, 2021, : 130 - 140
  • [5] Adaptive Transformer-Based Deep Learning Framework for Continuous Sign Language Recognition and Translation
    Said, Yahia
    Boubaker, Sahbi
    Altowaijri, Saleh M.
    Alsheikhy, Ahmed A.
    Atri, Mohamed
    MATHEMATICS, 2025, 13 (06)
  • [6] Transformer-based Machine Translation for Low-resourced Languages embedded with Language Identification
    Sefara, Tshephisho J.
    Zwane, Skhumbuzo G.
    Gama, Nelisiwe
    Sibisi, Hlawulani
    Senoamadi, Phillemon N.
    Marivate, Vukosi
    2021 CONFERENCE ON INFORMATION COMMUNICATIONS TECHNOLOGY AND SOCIETY (ICTAS), 2021, : 127 - 132
  • [7] A Review of Transformer-Based Approaches for Image Captioning
    Ondeng, Oscar
    Ouma, Heywood
    Akuon, Peter
    APPLIED SCIENCES-BASEL, 2023, 13 (19):
  • [8] Transformer-Based Music Language Modelling and Transcription
    Zonios, Christos
    Pavlopoulos, John
    Likas, Aristidis
    PROCEEDINGS OF THE 12TH HELLENIC CONFERENCE ON ARTIFICIAL INTELLIGENCE, SETN 2022, 2022,
  • [9] Transformer-based Natural Language Understanding and Generation
    Zhang, Feng
    An, Gaoyun
    Ruan, Qiuqi
    2022 16TH IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP2022), VOL 1, 2022, : 281 - 284
  • [10] RelFormer: Advancing contextual relations for transformer-based dense captioning
    Jin, Weiqi
    Qu, Mengxue
    Shi, Caijuan
    Zhao, Yao
    Wei, Yunchao
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2025, 252