Adopting AI: how familiarity breeds both trust and contempt

Times Cited: 36
Authors
Horowitz, Michael C. [1 ]
Kahn, Lauren [2 ]
Macdonald, Julia [3 ]
Schneider, Jacquelyn [4 ]
Affiliations
[1] Univ Penn, Philadelphia, PA 19104 USA
[2] Council Foreign Relat, Washington, DC USA
[3] Univ Denver, Denver, CO USA
[4] Stanford Univ, Stanford, CA USA
Keywords
Artificial intelligence; Autonomy; Public opinion; Trust; Familiarity; Transportation; Medicine; Autonomous weapons; Cyber; TECHNOLOGY ACCEPTANCE; UNITED-STATES; POLITICS; SCIENCE; PERCEPTIONS; AUTOMATION; ATTITUDES; VEHICLES; BELIEFS; SAFETY
DOI
10.1007/s00146-023-01666-5
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Despite pronouncements about the inevitable diffusion of artificial intelligence and autonomous technologies, in practice it is human behavior, not technology in a vacuum, that dictates how technology seeps into, and changes, societies. To better understand how human preferences shape the adoption and spread of AI-enabled autonomous technologies, we examine representative adult samples of US public opinion in 2018 and 2020 on the use of four types of autonomous technologies: vehicles, surgery, weapons, and cyber defense. By focusing on these four diverse uses of AI-enabled autonomy, which span transportation, medicine, and national security, we exploit the inherent variation among these use cases. We find that respondents with familiarity with and expertise in AI and similar technologies were more likely than those with a limited understanding of the technology to support all of the autonomous applications we tested, except weapons. Individuals who had already delegated the act of driving by using ride-share apps were also more positive about autonomous vehicles. However, familiarity cuts both ways: individuals are less likely to support AI-enabled technologies when they are applied directly to their own lives, especially when the technology automates tasks they are already accustomed to performing themselves. Finally, we find that familiarity plays little role in support for AI-enabled military applications, for which opposition has slightly increased over time.
Pages: 1721-1735
Page count: 15