Classification of Indian Dance Forms using Pre-Trained Model-VGG

Cited: 0
Authors
Biswas, Snigdha [1 ]
Ghildiyal, Anirudh [1 ]
Sharma, Sachin [1 ]
Affiliations
[1] Graphic Era Deemed Univ, Dept Comp Sci & Engn, Dehra Dun, Uttarakhand, India
Source
2021 SIXTH INTERNATIONAL CONFERENCE ON WIRELESS COMMUNICATIONS, SIGNAL PROCESSING AND NETWORKING (WISPNET) | 2021
Keywords
Transfer-Learning; VGG; ICD; dance classification; Multi-Class problem;
DOI
10.1109/WISPNET51692.2021.9419426
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
Indian classical dance, often referred to as ICD, mirrors the rich cultural heritage of India, with each form originating in a different state of the country. The Sangeet Natak Akademi, India's national academy for the performing arts, recognizes eight classical dance forms: 'Bharatanatyam', 'Kathak', 'Kathakali', 'Kuchipudi', 'Manipuri', 'Mohiniyattam', 'Odissi', and 'Sattriya'. In this paper, we use transfer learning to solve this multi-class classification problem. Pre-trained models, namely VGG16 and VGG19, were used to classify images of the eight dance forms, drawing on the ICD dataset from Kaggle, which consists of eight classes with multiple images per class.
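The transfer-learning setup the abstract describes can be sketched roughly as follows: a pre-trained VGG16 convolutional base is frozen and a small classifier head for the eight ICD classes is trained on top. This is a minimal illustration, not the paper's implementation; the head layer sizes, input resolution, and optimizer are assumptions, and `weights=None` is used here only so the sketch runs offline (in practice one would pass `weights="imagenet"` to reuse the pre-trained features).

```python
# Hedged sketch of transfer learning with VGG16 for an 8-class problem.
# Head sizes and hyperparameters are assumptions, not taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 8  # the eight ICD dance forms named in the abstract

# Load VGG16 without its ImageNet classifier head. Use weights="imagenet"
# in practice; weights=None here only keeps the sketch runnable offline.
base = VGG16(weights=None, include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the convolutional base

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),          # pool conv features to a vector
    layers.Dense(256, activation="relu"),     # assumed head size
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Training would then call `model.fit` on the Kaggle ICD images, resized to 224x224; swapping `VGG16` for `VGG19` changes only the base line.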
Pages: 278 - 282
Page count: 5
Related Papers
50 records in total
  • [11] Comparison of Pre-Trained CNNs for Audio Classification Using Transfer Learning
    Tsalera, Eleni
    Papadakis, Andreas
    Samarakou, Maria
    [J]. JOURNAL OF SENSOR AND ACTUATOR NETWORKS, 2021, 10 (04)
  • [12] Pre-trained Language Models with Limited Data for Intent Classification
    Kasthuriarachchy, Buddhika
    Chetty, Madhu
    Karmakar, Gour
    Walls, Darren
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [13] Performance Evaluation of CNN and Pre-trained Models for Malware Classification
    Habibi, Omar
    Chemmakha, Mohammed
    Lazaar, Mohamed
    [J]. ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING, 2023, 48 (08) : 10355 - 10369
  • [15] Analyzing the Potential of Pre-Trained Embeddings for Audio Classification Tasks
    Grollmisch, Sascha
    Cano, Estefania
    Kehling, Christian
    Taenzer, Michael
    [J]. 28TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2020), 2021, : 790 - 794
  • [16] PTMA: Pre-trained Model Adaptation for Transfer Learning
    Li, Xiao
    Yan, Junkai
    Jiang, Jianjian
    Zheng, Wei-Shi
    [J]. KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, KSEM 2024, 2024, 14884 : 176 - 188
  • [17] Pre-trained Language Model for Biomedical Question Answering
    Yoon, Wonjin
    Lee, Jinhyuk
    Kim, Donghyeon
    Jeong, Minbyul
    Kang, Jaewoo
    [J]. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2019, PT II, 2020, 1168 : 727 - 740
  • [18] Improved Siamese Palmprint Authentication Using Pre-Trained VGG16-Palmprint and Element-Wise Absolute Difference
    Ezz, M.
    Alanazi, W.
    Mostafa, A. M.
    Hamouda, E.
    Elbashir, M. K.
    Alruily, M.
    [J]. Computer Systems Science and Engineering, 2023, 46 (02): : 2299 - 2317
  • [19] Skin Lesion Classification Using Pre-Trained DenseNet201 Deep Neural Network
    Jasil, S. P. Godlin
    Ulagamuthalvi, V.
    [J]. ICSPC'21: 2021 3RD INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATION (ICPSC), 2021, : 393 - 396
  • [20] Efficient pollen grain classification using pre-trained Convolutional Neural Networks: a comprehensive study
    Rostami, Masoud A.
    Balmaki, Behnaz
    Dyer, Lee A.
    Allen, Julie M.
    Sallam, Mohamed F.
    Frontalini, Fabrizio
    [J]. JOURNAL OF BIG DATA, 2023, 10 (01)