Modosc: A Library of Real-Time Movement Descriptors for Marker-Based Motion Capture

Cited: 3
Authors
Dahl, Luke [1 ]
Visi, Federico [2 ]
Affiliations
[1] Univ Virginia, Dept Mus, Charlottesville, VA 22903 USA
[2] Univ Hamburg, Inst Systemat Musicol, Hamburg, Germany
Source
PROCEEDINGS OF THE 5TH INTERNATIONAL CONFERENCE ON MOVEMENT AND COMPUTING (MOCO'18) | 2018
Funding
European Research Council;
Keywords
Motion capture; motion descriptors; motion analysis; expressive movement; interaction design; Max; Open Sound Control; modosc;
DOI
10.1145/3212721.3212842
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Marker-based motion capture systems that stream precise movement data in real-time afford interaction scenarios that can be subtle, detailed, and immediate. However, challenges to effectively utilizing this data include having to build bespoke processing systems which may not scale well, and a need for higher-level representations of movement and movement qualities. We present modosc, a set of Max abstractions for computing motion descriptors from raw motion capture data in real time. Modosc is designed to address the data handling and synchronization issues that arise when working with complex marker sets, and to structure data streams in a meaningful and easily accessible manner. This is achieved by adopting a multiparadigm programming approach using o.dot and Open Sound Control. We describe an initial set of motion descriptors, the addressing system employed, and design decisions and challenges.
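The paper itself contains no code; modosc is distributed as a set of Max abstractions. Purely as an illustration of the kind of per-frame processing such a library abstracts away, the following Python sketch computes an instantaneous speed descriptor from two consecutive marker frames and labels it with a hypothetical hierarchical OSC-style address. The address shown is an assumption for illustration only; the actual modosc namespace is described in the paper.

    import math

    def speed(prev_pos, curr_pos, dt):
        """Instantaneous speed of one marker from two consecutive frames (units/s)."""
        dx, dy, dz = (c - p for c, p in zip(curr_pos, prev_pos))
        return math.sqrt(dx * dx + dy * dy + dz * dz) / dt

    # Two consecutive frames of a single marker, sampled at 100 Hz (dt = 0.01 s).
    prev = (0.10, 1.42, 0.55)   # x, y, z in metres
    curr = (0.11, 1.43, 0.55)
    dt = 0.01

    # Hypothetical hierarchical address in the spirit of Open Sound Control namespaces.
    address = "/modosc/markers/head/speed"
    print(address, speed(prev, curr, dt))

In a real marker-based pipeline this computation would run for every marker on every frame, which is precisely the data handling and synchronization burden the abstract describes modosc taking over.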
Pages: 4