Robot-Mediated Inclusive Processes in Groups of Children: From Gaze Aversion to Mutual Smiling Gaze

Cited by: 11
Authors
Tuncer, Sylvaine [1]
Gillet, Sarah [1]
Leite, Iolanda [1]
Affiliations
[1] Royal Inst Technol KTH, Stockholm, Sweden
Source
FRONTIERS IN ROBOTICS AND AI | 2022, Vol. 9
Keywords
interactions in groups; robot-mediated interaction; video analysis; gaze behaviour; conversation analysis; ingroup inclusion; interdisciplinary study; YOUNG-CHILDREN; BEHAVIOR
DOI
10.3389/frobt.2022.729146
CLC number
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
Our work is motivated by the idea that social robots can help inclusive processes in groups of children, focusing on the case of children who have newly arrived from a foreign country and their peers at school. Building on an initial study where we tested different robot behaviours and recorded children's interactions mediated by a robot in a game, we present in this paper the findings from a subsequent analysis of the same video data drawing from ethnomethodology and conversation analysis. We describe how this approach differs from predominantly quantitative video analysis in HRI; how mutual gaze appeared as a challenging interactional accomplishment between unacquainted children, and why we focused on this phenomenon. We identify two situations and trajectories in which children make eye contact: asking for or giving instructions, and sharing an emotional reaction. Based on detailed analyses of a selection of extracts in the empirical section, we describe patterns and discuss the links between the different situations and trajectories, and relationship building. Our findings inform HRI and robot design by identifying complex interactional accomplishments between two children, as well as group dynamics which support these interactions. We argue that social robots should be able to perceive such phenomena in order to better support inclusion of outgroup children. Lastly, by explaining how we combined approaches and showing how they build on each other, we also hope to demonstrate the value of interdisciplinary research, and encourage it.
Pages: 15