Context-Aware Automated Analysis and Annotation of Social Human-Agent Interactions

Cited by: 29
Authors
Baur, Tobias [1 ]
Mehlmann, Gregor [1 ]
Damian, Ionut [1 ]
Lingenfelser, Florian [1 ]
Wagner, Johannes [1 ]
Lugrin, Birgit [1 ]
Andre, Elisabeth [1 ]
Gebhard, Patrick [2 ]
Affiliations
[1] Univ Augsburg, Human Ctr Multimedia, Univ Str 6a, D-86159 Augsburg, Germany
[2] DFKI GmbH, Berlin, Germany
Funding
EU Horizon 2020
Keywords
Social cue recognition; virtual job interviews; serious games; automated behavior analysis; interaction design;
DOI
10.1145/2764921
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The outcome of interpersonal interactions depends not only on the contents that we communicate verbally, but also on nonverbal social signals. Because a lack of social skills is a common problem for a significant number of people, serious games and other training environments have recently become the focus of research. In this work, we present NovA (Nonverbal behavior Analyzer), a system that analyzes and facilitates the interpretation of social signals automatically in a bidirectional interaction with a conversational agent. It records data of interactions, detects relevant social cues, and creates descriptive statistics for the recorded data with respect to the agent's behavior and the context of the situation. This enhances the possibilities for researchers to automatically label corpora of human-agent interactions and to give users feedback on strengths and weaknesses of their social behavior.
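The processing pipeline sketched in the abstract (record an interaction, detect social cues, derive descriptive statistics per session) can be illustrated with a minimal example. The sketch below is not taken from the paper or from NovA itself; the Cue record, its fields, and the summarize() helper are assumptions about how time-stamped cue detections might be aggregated into simple per-cue statistics.

from collections import defaultdict
from dataclasses import dataclass

# Hypothetical cue record; NovA's actual data format is not shown in this record.
@dataclass
class Cue:
    name: str         # e.g. "smile", "gesture", "long_pause"
    start: float      # onset time in seconds
    end: float        # offset time in seconds
    confidence: float # detector confidence in [0, 1]

def summarize(cues):
    """Aggregate detected cues into per-cue descriptive statistics."""
    acc = defaultdict(lambda: {"count": 0, "duration": 0.0, "conf_sum": 0.0})
    for cue in cues:
        entry = acc[cue.name]
        entry["count"] += 1
        entry["duration"] += cue.end - cue.start
        entry["conf_sum"] += cue.confidence
    return {
        name: {
            "count": e["count"],
            "total_duration_s": round(e["duration"], 2),
            "mean_confidence": round(e["conf_sum"] / e["count"], 2),
        }
        for name, e in acc.items()
    }

if __name__ == "__main__":
    session = [
        Cue("smile", 1.0, 2.5, 0.9),
        Cue("smile", 10.0, 11.0, 0.8),
        Cue("long_pause", 4.0, 7.0, 0.95),
    ]
    print(summarize(session))

Running the example prints counts, total durations, and mean confidences per cue type, the kind of descriptive statistics that could then be related to the agent's behavior and the situational context.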
Pages: 33