Bodystorming Human-Robot Interactions

Cited by: 22
Authors
Porfirio, David [1 ]
Fisher, Evan [2 ]
Sauppe, Allison [2 ]
Albarghouthi, Aws [1 ]
Mutlu, Bilge [1 ]
Affiliations
[1] Univ Wisconsin, Madison, WI 53706 USA
[2] Univ Wisconsin, La Crosse, WI 54601 USA
Funding
U.S. National Science Foundation;
Keywords
Human-robot interaction; interaction design; ideation; bodystorming; design tools; program synthesis; DESIGN;
DOI
10.1145/3332165.3347957
Chinese Library Classification (CLC)
TP3 [Computing technology; computer technology];
Discipline code
0812;
Abstract
Designing and implementing human-robot interactions requires numerous skills, from having a rich understanding of social interactions and the capacity to articulate their subtle requirements, to the ability to then program a social robot with the many facets of such a complex interaction. Although designers are best suited to develop and implement these interactions due to their inherent understanding of the context and its requirements, these skills are a barrier to enabling designers to rapidly explore and prototype ideas: it is impractical for designers to also be experts on social interaction behaviors, and the technical challenges associated with programming a social robot are prohibitive. In this work, we introduce Synthe, which allows designers to act out, or bodystorm, multiple demonstrations of an interaction. These demonstrations are automatically captured and translated into prototypes for the design team using program synthesis. We evaluate Synthe in multiple design sessions involving pairs of designers bodystorming interactions and observing the resulting models on a robot. We build on the findings from these sessions to improve the capabilities of Synthe and demonstrate the use of these capabilities in a second design session.
Pages
479-491 (13 pages)