We have to go back, back to the future! Reflecting on 75 years of human factors in the UK to shape a future of responsible artificial intelligence innovation

Cited by: 0
Authors
Roberts, Aaron P. J. [1 ]
Parnell, Christopher J. [2 ]
Patel, Menisha [3 ]
Affiliations
[1] Thales, Maritime Systems, England
[2] Defence Science and Technology Laboratory, Salisbury, Wiltshire, England
[3] King's College London, Informatics, London, England
Keywords
History of human factors; artificial intelligence; academic practitioner gap; responsible research innovation; HUMAN FACTORS/ERGONOMICS; INDUSTRIAL-ENGINEERS; ERGONOMICS RESEARCH; BENEFITS; SCIENCE; PUBLICATIONS; DISCIPLINE; AUTOMATION; SYSTEMS; DESIGN
DOI
10.1080/00140139.2024.2392779
CLC Classification
T [Industrial Technology]
Discipline Classification Code
08
Abstract
The origins of Human Factors (HF) are rooted in the Second World War. It is a sign of the times that, 75 years on from the formation of the Ergonomics Research Society, discussions now occur as to whether Artificial Intelligence (AI) could, or should, be capable of controlling weaponry in a theatre of war. HF can support the design of safe, ethical, and usable AI, but there is little evidence of HF influencing the industrial organisations developing AI. A review of the history of HF was conducted to understand how the discipline's influence on AI development may be optimised. The field may need to become broader and more inclusive, given the potential implications of innovations such as AI. The field of Responsible Research and Innovation can help the HF practitioner ensure that the design and application of AI-based technology serves to improve human well-being and optimise system performance over the next 75 years.

Practitioner summary: A review of the history and origins of Human Factors was conducted. The review aimed to learn from the development of the discipline over the last 75 years and to provide insights into what can be done to optimise the influence of HF in designing safe, ethical, and usable artificial intelligence.
Pages: 968-986
Number of pages: 19