Alternative Approaches to HVAC Control of Chat Generative Pre-Trained Transformer (ChatGPT) for Autonomous Building System Operations

Cited by: 4
Authors
Ahn, Ki Uhn [1 ]
Kim, Deuk-Woo [2 ]
Cho, Hyun Mi [1 ]
Chae, Chang-U [3 ]
Affiliations
[1] Korea Inst Civil Engn & Bldg Technol, Dept Bldg Res, Goyang Si 10223, South Korea
[2] Korea Inst Civil Engn & Bldg Technol, Dept Bldg Energy Res, Goyang Si 10223, South Korea
[3] Korea Inst Civil Engn & Bldg Technol, Res Strateg Planning Dept, Goyang Si 10223, South Korea
Keywords
ChatGPT; large language model; deep Q-network; reinforcement learning; artificial intelligence; autonomous building; energy; optimization; prediction; comfort
DOI
10.3390/buildings13112680
Chinese Library Classification
TU [Building Science]
Subject Classification Code
0813
Abstract
Artificial intelligence (AI) technology has rapidly advanced and transformed the nature of scientific inquiry. The recent release of the large language model Chat Generative Pre-Trained Transformer (ChatGPT) has attracted significant attention from the public and various industries. This study applied ChatGPT to autonomous building system operations, specifically by coupling it with an EnergyPlus reference office building simulation model. The operational objective was to minimize the energy use of the building systems, comprising four air-handling units, two chillers, a cooling tower, and two pumps, while keeping the indoor CO2 concentration below 1000 ppm. The performance of ChatGPT in autonomous operation was compared with control results based on a deep Q-network (DQN), a reinforcement learning method. ChatGPT and the DQN lowered total energy use by 16.8% and 24.1%, respectively, compared with the baseline operation, while maintaining an indoor CO2 concentration below 1000 ppm. Notably, unlike the DQN, ChatGPT-based control does not require a learning process to develop intelligence for building control. In real-world applications, the high generalization capability of ChatGPT-based control, which results from its training on vast and diverse data, could make it more effective.
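The control loop described in the abstract can be pictured as follows. The sketch below is not the authors' implementation; it only illustrates, under assumed names, the idea of serializing the simulated building state into a natural-language prompt, letting the language model propose an action, and sanitizing that action against the 1000 ppm CO2 constraint. ZoneState, build_prompt, parse_action, control_step, and the query_llm callable are hypothetical placeholders, and the single outdoor-air damper ratio stands in for the paper's larger action space (air-handling units, chillers, cooling tower, and pumps).

```python
# Minimal sketch (not the authors' code): one decision step of an LLM-in-the-loop
# HVAC controller. The building state is serialized into a natural-language prompt,
# the language model replies with an action, and the reply is parsed and clamped
# before being applied. `query_llm` is a hypothetical callable standing in for a
# ChatGPT API request; the damper-ratio action is likewise an illustrative assumption.
from dataclasses import dataclass

CO2_LIMIT_PPM = 1000  # indoor CO2 constraint stated in the abstract


@dataclass
class ZoneState:
    co2_ppm: float         # indoor CO2 concentration
    outdoor_temp_c: float  # outdoor air temperature
    power_kw: float        # current HVAC electric power


def build_prompt(state: ZoneState) -> str:
    """Describe the current state and the control objective in plain text."""
    return (
        f"Indoor CO2 is {state.co2_ppm:.0f} ppm, outdoor temperature is "
        f"{state.outdoor_temp_c:.1f} C, and HVAC power is {state.power_kw:.1f} kW. "
        f"Choose an outdoor-air damper ratio between 0.0 and 1.0 that minimizes "
        f"energy use while keeping CO2 below {CO2_LIMIT_PPM} ppm. "
        f"Reply with a single number."
    )


def parse_action(reply: str, fallback: float = 1.0) -> float:
    """Extract a damper ratio from the model's reply; fall back to full ventilation."""
    try:
        return min(max(float(reply.strip()), 0.0), 1.0)
    except ValueError:
        return fallback  # unparseable reply: choose the CO2-safe default


def control_step(state: ZoneState, query_llm) -> float:
    """One control decision: prompt the model, sanitize the answer, enforce the limit."""
    action = parse_action(query_llm(build_prompt(state)))
    if state.co2_ppm >= CO2_LIMIT_PPM:
        action = 1.0  # hard override: maximize outdoor air once the limit is reached
    return action
```

In a co-simulation, control_step would be invoked once per EnergyPlus timestep, with query_llm wrapping the actual ChatGPT API call and the returned ratio written back to the simulation; a DQN baseline would replace query_llm with a trained Q-network's greedy action.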
Pages: 19
Related Papers
27 records in total
  • [21] A Comparative Analysis of Conventional and Chat-Generative Pre-trained Transformer-Assisted Teaching Methods in Undergraduate Dental Education
    Bhatia, Amrita P.
    Lambat, Apurva
    Jain, Teerthesh
    CUREUS JOURNAL OF MEDICAL SCIENCE, 2024, 16 (05)
  • [22] Evaluating the accuracy of Chat Generative Pre-trained Transformer version 4 (ChatGPT-4) responses to United States Food and Drug Administration (FDA) frequently asked questions about dental amalgam
    Buldur, Mehmet
    Sezer, Berkant
    BMC ORAL HEALTH, 2024, 24 (01)
  • [23] Diagnosing Glaucoma Based on the Ocular Hypertension Treatment Study Dataset Using Chat Generative Pre-Trained Transformer as a Large Language Model
    Raja, Hina
    Huang, Xiaoqin
    Delsoz, Mohammad
    Madadi, Yeganeh
    Poursoroush, Asma
    Munawar, Asim
    Kahook, Malik Y.
    Yousefi, Siamak
    OPHTHALMOLOGY SCIENCE, 2025, 5 (01)
  • [24] Is generative pre-trained transformer artificial intelligence (Chat-GPT) a reliable tool for guidelines synthesis? A preliminary evaluation for biologic CRSwNP therapy
    Maniaci, Antonino
    Saibene, Alberto Maria
    Calvo-Henriquez, Christian
    Vaira, Luigi
    Radulesco, Thomas
    Michel, Justin
    Chiesa-Estomba, Carlos
    Sowerby, Leigh
    Lobo Duro, David
    Mayo-Yanez, Miguel
    Maza-Solano, Juan
    Lechien, Jerome Rene
    La Mantia, Ignazio
    Cocuzza, Salvatore
    EUROPEAN ARCHIVES OF OTO-RHINO-LARYNGOLOGY, 2024, 281 (04): 2167-2173
  • [26] Development and evaluation of a program based on a generative pre-trained transformer model from a public natural language processing platform for efficiency enhancement in post-procedural quality control of esophageal endoscopic submucosal dissection
    Ma, Huaiyuan
    Ma, Xingbin
    Yang, Chunxiao
    Niu, Qiong
    Gao, Tao
    Liu, Chengxia
    Chen, Yan
    SURGICAL ENDOSCOPY AND OTHER INTERVENTIONAL TECHNIQUES, 2024, 38 (03): 1264-1272