How do users interact with algorithm recommender systems? The interaction of users, algorithms, and performance

Cited by: 91
Authors
Shin, Donghee [1 ]
Affiliation
[1] Zayed Univ, Coll Commun & Media Sci, POB 144534, Abu Dhabi, U Arab Emirates
Keywords
Algorithm; News recommendation system; Algorithm heuristic; Algorithm user experience; User-centered algorithm; TRANSPARENCY; TRUST;
DOI
10.1016/j.chb.2020.106344
Chinese Library Classification (CLC)
B84 [Psychology];
Discipline Classification Codes
04; 0402;
Abstract
Although algorithms are widely used to deliver useful applications and services, it is unclear how users actually experience and interact with algorithm-driven services. This ambiguity is even more troubling for news recommendation systems (NRS), where issues of transparency and trust are especially thorny. This study investigates the user experience and usability of algorithms by focusing on users' cognitive processes, examining how system qualities and features are perceived and transformed into experience and interaction. It analyzes how users perceive and feel about issues in news recommendation and how they interact and engage with algorithm-recommended news. It proposes an algorithm experience model of news recommendation that integrates the heuristic processing of cognitive, affective, and behavioral factors. The underlying algorithm can affect the user's perception of, and trust in, the system in different ways. Heuristic affect occurs when users' subjective feelings about transparency and accuracy act as a mental shortcut: users considered transparent and accurate systems to be convenient and useful. The mediating role of trust suggests that establishing algorithmic trust between users and NRS could enhance algorithm performance. The model illustrates users' cognitive processes of perceptual judgment as well as the motivation behind user behaviors. The results highlight the link between news recommendation systems and user interaction, providing a clearer conceptualization for the user-centered development and evaluation of algorithm-based services.
Pages: 10