A Heideggerian analysis of generative pretrained transformer models

Citations: 2
Authors
Floroiu, Iustin [1]
Timisica, Daniela [1,2]
Affiliations
[1] Natl Inst Res & Dev Informat ICI Bucharest, Bucharest, Romania
[2] Natl Univ Sci & Technol Politehn Bucharest, Bucharest, Romania
Source
ROMANIAN JOURNAL OF INFORMATION TECHNOLOGY AND AUTOMATIC CONTROL-REVISTA ROMANA DE INFORMATICA SI AUTOMATICA | 2024, Vol. 34, No. 1
Keywords
Martin Heidegger; GPT; Artificial Intelligence; Dasein; Turing test
DOI
10.33436/v34i1y202402
CLC Number
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
To better understand the emergence of new large language models, and their future possibilities with regard to the development of artificial general intelligence, it is essential to analyse the existential implications of these algorithms and draw conclusions from them. Given the rapid pace of technological advancement in deep learning, generative pretrained transformers (GPT) are the closest existing technology to highly independent and intelligent programs, because they manifest creativity and form a worldview model of unprecedented accuracy. For these reasons, this article proposes an analysis of Heidegger's concept of Dasein in the context of recent advances in computational intelligence. The analysis methods described here are intended to bypass the complex problems that the cognitive sciences pose for computational intelligence and to create a highly accurate model of the mental representation and hierarchisation of emergent intelligent algorithms.
Pages: 13-22 (10 pages)