Expressive facial gestures from motion capture data

Cited by: 10
Authors
Ju, Eunjung [1 ]
Lee, Jehee [1 ]
Affiliations
[1] Seoul Natl Univ, Seoul 151, South Korea
DOI
10.1111/j.1467-8659.2008.01135.x
Chinese Library Classification (CLC)
TP31 [Computer software];
Subject classification codes
081202; 0835;
Abstract
Human facial gestures often exhibit such natural stochastic variations as how often the eyes blink, how often the eyebrows and the nose twitch, and how the head moves while speaking. The stochastic movements of facial features are key ingredients for generating convincing facial expressions. Although such small variations have been simulated using noise functions in many graphics applications, modulating noise functions to match natural variations induced from the affective states and the personality of characters is difficult and not intuitive. We present a technique for generating subtle expressive facial gestures (facial expressions and head motion) semi-automatically from motion capture data. Our approach is based on Markov random fields that are simulated in two levels. In the lower level, the coordinated movements of facial features are captured, parameterized, and transferred to synthetic faces using basis shapes. The upper level represents independent stochastic behavior of facial features. The experimental results show that our system generates expressive facial gestures synchronized with input speech.
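The abstract's upper level models independent stochastic behavior of individual facial features (e.g., how often the eyes blink). As a rough illustration only, and not the authors' Markov-random-field formulation, such a channel can be approximated by a simple two-state Markov chain sampled per frame; all function names and rate parameters below are hypothetical.

```python
import random

# Illustrative sketch: approximate one stochastic facial-feature channel
# (eye blinking) as a two-state Markov chain sampled at a fixed frame rate.
# All parameters here are hypothetical, chosen only for demonstration.

def simulate_blinks(num_frames, fps=30.0, blink_rate_hz=0.3,
                    blink_len_frames=4, seed=0):
    """Return a per-frame eyelid track: 0 = open, 1 = closed."""
    rng = random.Random(seed)
    p_blink = blink_rate_hz / fps      # chance of starting a blink each frame
    track, remaining = [], 0
    for _ in range(num_frames):
        if remaining > 0:              # mid-blink: eyelid stays closed
            track.append(1)
            remaining -= 1
        elif rng.random() < p_blink:   # stochastically start a new blink
            track.append(1)
            remaining = blink_len_frames - 1
        else:
            track.append(0)            # eyes open
    return track

track = simulate_blinks(300)           # 10 seconds at 30 fps
print(sum(track), "closed frames out of", len(track))
```

Raising `blink_rate_hz` or `blink_len_frames` would crudely mimic how affective state or personality modulates such variations, which is the kind of control the paper derives from motion capture data instead of hand-tuned noise.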
Pages: 381 - 388
Page count: 8
Related Papers (50 total)
  • [1] Expressive Robot Performance based on Facial Motion Capture
    Beskow, Jonas
    Caper, Charlie
    Ehrenfors, Johan
    Hagberg, Nils
    Jansen, Anne
    Wood, Chris
    INTERSPEECH 2021, 2021, : 2343 - 2344
  • [2] Exaggeration of facial expressions from facial motion capture data
    Chin, Seongah
    Lee, Chung-Yeon
    CHINESE OPTICS LETTERS, 2010, 8 (01) : 29 - 32
  • [3] Facial animation by optimized blendshapes from motion capture data
    Liu, Xuecheng
    Mao, Tianlu
    Xia, Shihong
    Yu, Yong
    Wang, Zhaoqi
    COMPUTER ANIMATION AND VIRTUAL WORLDS, 2008, 19 (3-4) : 235 - 245
  • [4] A novel visualization system for expressive facial motion data exploration
    Sucontphunt, Tanasai
    Yuan, Xiaoru
    Li, Qing
    Deng, Zhigang
    IEEE PACIFIC VISUALISATION SYMPOSIUM 2008, PROCEEDINGS, 2008, : 103 - +
  • [5] Hierarchical facial expression animation by motion capture data
    Wang, Shuyang
    Sha, Jinzheng
    Wu, Huai-yu
    Fu, Yun
    2014 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2014
  • [6] Facial muscle activations from motion capture
    Sifakis, E.
    Fedkiw, R.
    2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol 2, Proceedings, 2005, : 1195 - 1195
  • [7] Automatic determination of facial muscle activations from sparse motion capture marker data
    Sifakis, E.
    Neverov, I.
    Fedkiw, R.
    ACM TRANSACTIONS ON GRAPHICS, 2005, 24 (03) : 417 - 425
  • [8] Analysis of Facial Motion Capture Data for Visual Speech Synthesis
    Zelezny, Milos
    Krnoul, Zdenek
    Jedlicka, Pavel
    SPEECH AND COMPUTER (SPECOM 2015), 2015, 9319 : 81 - 88
  • [9] Evaluation of Inertial Sensor Data by a Comparison with Optical Motion Capture Data of Guitar Strumming Gestures
    Freire, Sergio
    Santos, Geise
    Armondes, Augusto
    Meneses, Eduardo A. L.
    Wanderley, Marcelo M.
    SENSORS, 2020, 20 (19) : 1 - 27