Force-feedback interaction with a neural oscillator model: for shared human-robot control of a virtual percussion instrument

Cited by: 0
Authors
Edgar Berdahl
Claude Cadoz
Nicolas Castagné
Affiliations
[1] Association pour la Création et la Recherche sur les Outils d'Expression (ACROE) and ICA Laboratory, Grenoble Institute of Technology, Grenoble, France
Source
EURASIP Journal on Audio, Speech, and Music Processing, Volume 2012
Keywords
force-feedback; neural oscillator; physical modeling; human-robot interaction; new media; haptic
DOI
Not available
Abstract
A study on force-feedback interaction with a model of a neural oscillator provides insight into enhanced human-robot interactions for controlling musical sound. We provide differential equations and discrete-time computable equations for the core oscillator model developed by Edward Large for simulating rhythm perception. Using a mechanical analog parameterization, we derive a force-feedback model structure that enables a human to share control of a virtual percussion instrument with a "robotic" neural oscillator. A formal human subject test indicated that strong coupling (STRNG) between the force-feedback device and the neural oscillator provided subjects with the best control. Overall, the human subjects predominantly found the interaction to be "enjoyable" and "fun" or "entertaining." However, there were indications that some subjects preferred a medium-strength coupling (MED), presumably because they were unaccustomed to such strong force-feedback interaction with an external agent. With related models, test subjects performed better when they could synchronize their input in phase with a dominant sensory feedback modality. In contrast, subjects tended to perform worse when an optimal strategy was to move the force-feedback device with a 90° phase lag. Our results suggest an extension of dynamic pattern theory to force-feedback tasks. In closing, we provide an overview of how a similar force-feedback scenario could be used in a more complex musical robotics setting.
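The abstract refers to Edward Large's neural oscillator model, for which the paper provides differential equations and discrete-time computable equations. As an illustrative sketch only, the Python snippet below integrates a driven Hopf-type canonical oscillator, the general class of model on which Large's rhythm-perception work builds; the parameter names (alpha, beta, f0, coupling), the forward-Euler step, and the pulse-train stimulus are assumptions made for demonstration and are not the paper's exact discrete-time formulation.

```python
import numpy as np

def simulate_oscillator(stimulus, fs=1000.0, f0=2.0, alpha=1.0, beta=-1.0, coupling=0.5):
    """Forward-Euler integration of a driven Hopf-type oscillator:
    dz/dt = z * (alpha + i*2*pi*f0 + beta*|z|^2) + coupling * x(t).

    Parameter names and discretization are illustrative assumptions,
    not the paper's exact equations.
    """
    dt = 1.0 / fs
    z = 0.1 + 0.0j                       # small nonzero initial state
    out = np.empty(len(stimulus), dtype=complex)
    for n, x in enumerate(stimulus):
        dz = z * (alpha + 1j * 2.0 * np.pi * f0 + beta * abs(z) ** 2) + coupling * x
        z = z + dt * dz                  # forward Euler step
        out[n] = z
    return out

if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(0.0, 5.0, 1.0 / fs)
    taps = (np.sin(2.0 * np.pi * 2.0 * t) > 0.99).astype(float)  # sparse 2 Hz "taps"
    z = simulate_oscillator(taps, fs=fs, f0=2.0)
    print("final oscillator amplitude:", abs(z[-1]))
```

With alpha = 1 and beta = -1 the unforced oscillator settles onto a unit-amplitude limit cycle, and a periodic stimulus near f0 entrains its phase; loosely, this entrainment is the kind of behavior that coupling a human-driven force-feedback device to the oscillator relies on.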