Democratizing access to collaborative music making over the network using air instruments

Cited by: 6
Authors
Cocchiara, Davide [1 ]
Turchet, Luca [1 ]
Affiliations
[1] Univ Trento, Dept Informat Engn & Comp Sci, Trento, Italy
Source
PROCEEDINGS OF THE 17TH INTERNATIONAL AUDIO MOSTLY CONFERENCE, AM 2022 | 2022
Keywords
Networked music performance systems; digital musical instruments; computer vision; Internet of Musical Things
DOI
10.1145/3561212.3561227
Chinese Library Classification
O42 [Acoustics]
Subject Classification Codes
070206; 082403
Abstract
To date, little research has been conducted on tools capable of democratizing access to collaborative music making over the network. This paper describes a system based on interconnected air instruments conceived to introduce musically untrained people to collaborative music playing. The system consists of an application that controls synthesizers via real-time finger tracking on input from a consumer-grade camera, used in conjunction with a basic networked music performance system that communicates control messages. Moving fingers in the air is one of the simplest movements that anybody can perform, and it was therefore selected as a suitable basis for a musical instrument accessible to everybody. A user study involving ten pairs of participants with no musical expertise was conducted to assess the experience of interacting with the system. Overall, results showed that participants found the system effective in providing a positive user experience and adequate for enabling non-musicians to play together at a distance. Moreover, the system was judged capable of promoting music playing among non-musicians, thus easing access to collaborative music making. A critical reflection on the results is provided, along with a discussion of the study's limitations and possible future work.
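The abstract describes a pipeline in which tracked fingertip positions drive synthesizers through control messages exchanged over the network. The paper itself includes no code; the following is a minimal illustrative sketch, not the authors' implementation, assuming a vision tracker (e.g., a hand-landmark model) that yields normalized fingertip coordinates in [0, 1], a hypothetical pitch/velocity mapping, and a plain UDP/JSON message format standing in for the control-message layer.

```python
import json
import socket

def fingertip_to_control(x, y, scale=(60, 72)):
    """Map a normalized fingertip position (x, y in [0, 1]) to a
    MIDI-style (pitch, velocity) pair. Hypothetical mapping: vertical
    position selects a pitch within `scale` (higher hand -> higher
    pitch), horizontal position sets velocity. The paper does not
    specify its actual mapping."""
    low, high = scale
    pitch = low + round((1.0 - y) * (high - low))
    velocity = max(1, min(127, round(x * 127)))
    return pitch, velocity

def send_control(sock, addr, pitch, velocity):
    """Send one JSON-encoded control message over UDP, standing in for
    the lightweight networked-performance layer described above."""
    msg = json.dumps({"pitch": pitch, "vel": velocity}).encode("utf-8")
    sock.sendto(msg, addr)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # e.g., a fingertip detected near the top-centre of the camera frame
    pitch, vel = fingertip_to_control(0.5, 0.1)
    send_control(sock, ("127.0.0.1", 9000), pitch, vel)
    sock.close()
```

Sending only compact control messages, rather than audio streams, keeps bandwidth and latency requirements low, which is consistent with the abstract's emphasis on a basic networked setup with consumer-grade hardware.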
Pages: 211-218
Number of pages: 8