Differences in Trust between Human and Automated Decision Aids

Cited by: 8
Authors
Pearson, Carl J. [1 ]
Welk, Allaire K. [1 ]
Boettcher, William A. [2 ]
Mayer, Roger C. [3 ]
Streck, Sean [4 ]
Simons-Rudolph, Joseph M. [1 ]
Mayhorn, Christopher B. [1 ]
Institutions
[1] North Carolina State Univ, 2310 Stinson Dr,Poe Hall 700, Raleigh, NC 27695 USA
[2] North Carolina State Univ, Campus Box 8102, Raleigh, NC 27695 USA
[3] North Carolina State Univ, Nelson Hall 1328, Raleigh, NC 27607 USA
[4] North Carolina State Univ, 1021 Main Campus Dr,Suite 310, Raleigh, NC 27606 USA
Source
SYMPOSIUM AND BOOTCAMP ON THE SCIENCE OF SECURITY | 2016
Keywords
Trust; reliance; automation; decision-making; risk; workload; strain
DOI
10.1145/2898375.2898385
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Subject Classification Code
081202;
Abstract
Humans can easily find themselves in high-cost situations where they must choose between suggestions made by an automated decision aid and a conflicting human decision aid. Previous research indicates that humans often rely on automation or on other humans, but not on both simultaneously. Expanding on previous work by Lyons and Stokes (2012), the current experiment measures how trust in automated and human decision aids varies with perceived risk and workload. The simulated task required 126 participants to choose the safest route for a military convoy; they were presented with conflicting information from an automated tool and a human. Results demonstrated that as workload increased, trust in automation decreased, and as perceived risk increased, trust in the human decision aid increased. Individual differences in dispositional trust correlated with increased trust in both decision aids. These findings can inform training programs for operators who may receive information from both human and automated sources, in contexts such as air traffic control, aviation, and signals intelligence.
Pages: 95-98
Page count: 4
Related Papers
50 records in total
  • [1] Similarities and differences between human-human and human-automation trust: an integrative review
    Madhavan, P.
    Wiegmann, D. A.
    THEORETICAL ISSUES IN ERGONOMICS SCIENCE, 2007, 8 (04) : 277 - 301
  • [2] Assessment of operator trust in and utilization of automated decision-aids under different framing conditions
    Bisantz, AM
    Seong, Y
    INTERNATIONAL JOURNAL OF INDUSTRIAL ERGONOMICS, 2001, 28 (02) : 85 - 97
  • [3] Human Performance Consequences of Automated Decision Aids: The Impact of Time Pressure
    Rieger, Tobias
    Manzey, Dietrich
    HUMAN FACTORS, 2022, 64 (04) : 617 - 634
  • [4] A Conceptual Model of Trust, Perceived Risk, and Reliance on AI Decision Aids
    Solberg, Elizabeth
    Kaarstad, Magnhild
    Eitrheim, Maren H. Ro
    Bisio, Rossella
    Reegard, Kine
    Bloch, Marten
    GROUP & ORGANIZATION MANAGEMENT, 2022, 47 (02) : 187 - 222
  • [5] Trust in hybrid human-automated decision-support
    Kares, Felix
    Koenig, Cornelius J.
    Bergs, Richard
    Protzel, Clea
    Langer, Markus
    INTERNATIONAL JOURNAL OF SELECTION AND ASSESSMENT, 2023, 31 (03) : 388 - 402
  • [6] Decision support aids with anthropomorphic characteristics influence trust and performance in younger and older adults
    Pak, Richard
    Fink, Nicole
    Price, Margaux
    Bass, Brock
    Sturre, Lindsay
    ERGONOMICS, 2012, 55 (09) : 1059 - 1072
  • [7] Human Performance Consequences of Automated Decision Aids in States of Sleep Loss
    Reichenbach, Juliane
    Onnasch, Linda
    Manzey, Dietrich
    HUMAN FACTORS, 2011, 53 (06) : 717 - 728
  • [8] The uncertain advisor: trust, accuracy, and self-correction in an automated decision support system
    Lochner, Martin
    Smilek, Daniel
    COGNITIVE PROCESSING, 2023, 24 (01) : 95 - 106
  • [9] Automated Decision Aids: When Are They Advisors and When Do They Take Control of Human Decision Making?
    Strickland, Luke
    Boag, Russell J.
    Heathcote, Andrew
    Bowden, Vanessa
    Loft, Shayne
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-APPLIED, 2023, 29 (04) : 849 - 868