Exploring style transfer algorithms in Animation: Enhancing visual

Cited by: 0
Authors
He, Jia [1]
Affiliations
[1] Luan Vocat Tech Coll, Luan 237000, Peoples R China
Keywords
Computational processing; Human motion; Dynamic animations; Motion style transfer; Deep learning algorithms; Artificial intelligence
DOI
10.1016/j.entcom.2023.100625
Chinese Library Classification (CLC)
TP3 [Computing technology; computer technology]
Discipline classification code
0812
Abstract
The need to process human motion computationally in order to produce realistic, dynamic animations is growing with the fourth industrial revolution. Motion style transfer offers an appealing alternative to authoring motions from scratch: it reuses previously recorded motion data to generate realistic motion samples automatically. Deep learning algorithms, and deep neural networks (DNNs) in particular, have transformed motion style transfer techniques; their ability to anticipate future motion styles makes them well suited to motion synthesis tasks. A style transfer method, CNN-BiLSTM-ATT (Convolutional Neural Network-Bidirectional Long Short-Term Memory with Attention), is proposed to analyze spatiotemporal features. By combining CNNs, BiLSTMs, and attention mechanisms, the approach aims to synthesize and represent the intricacy of human motion realistically. Spectral intensity representations are extracted from the reference and source styles, and the difference between them is converted into a novel motion that may contain previously unseen movements. A temporally sliding window filter is added so that heterogeneous motion can be analyzed locally in time, which substantially improves processing. As a result, the method can be used to enrich style databases by filling in missing actions and to improve the effectiveness of earlier style transfer techniques. The proposed approach is evaluated through controlled user studies and quantitative experiments; the results show a notable advance over earlier work and highlight the method's capacity to produce thorough and accurate motion sequences.
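The abstract names a CNN-BiLSTM-ATT architecture applied over a temporally sliding window but gives no implementation details. The following is a minimal, hypothetical sketch in PyTorch of how such a spatiotemporal encoder with windowed processing could be wired; the layer sizes, joint count, window length, and stride are illustrative assumptions, not values taken from the paper.

# Minimal sketch (assumed PyTorch; dimensions are illustrative, not from the paper)
# of a CNN-BiLSTM-attention encoder over a window of motion frames, in the spirit
# of the CNN-BiLSTM-ATT model described in the abstract.
import torch
import torch.nn as nn

class CNNBiLSTMAtt(nn.Module):
    def __init__(self, n_joints=21, channels=3, conv_dim=64, lstm_dim=128, n_heads=4):
        super().__init__()
        in_dim = n_joints * channels              # flattened joint features per frame
        self.conv = nn.Sequential(                # 1-D convolution over time captures local patterns
            nn.Conv1d(in_dim, conv_dim, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.bilstm = nn.LSTM(conv_dim, lstm_dim, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * lstm_dim, n_heads, batch_first=True)
        self.head = nn.Linear(2 * lstm_dim, in_dim)   # per-frame stylized pose features

    def forward(self, x):                         # x: (batch, time, joints * channels)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)   # (batch, time, conv_dim)
        h, _ = self.bilstm(h)                     # (batch, time, 2 * lstm_dim)
        h, _ = self.attn(h, h, h)                 # self-attention re-weights frames by relevance
        return self.head(h)                       # (batch, time, joints * channels)

# Sliding-window use over a long, heterogeneous clip (window length and stride assumed):
model = CNNBiLSTMAtt()
clip = torch.randn(1, 240, 21 * 3)                # 240 frames of 21 joints x 3 coordinates
window, stride = 60, 30
outputs = [model(clip[:, t:t + window]) for t in range(0, clip.shape[1] - window + 1, stride)]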
Pages: 8