Object Recognition and Classification Based on Improved Bag of Features using SURF and MSER Local Feature Extraction

Cited by: 1
Authors
Ramya, P. P. [1 ]
James, Ajay [1 ]
Affiliation
[1] Govt Engn Coll, Dept Comp Sci & Engn, Trichur, Kerala, India
Source
PROCEEDINGS OF 2019 1ST INTERNATIONAL CONFERENCE ON INNOVATIONS IN INFORMATION AND COMMUNICATION TECHNOLOGY (ICIICT 2019) | 2019
Keywords
Bag of Features (BoF); Object Recognition; SURF; MSER; Spatial Pyramid Matching; Classification; SVM
DOI
10.1109/iciict1.2019.8741434
CLC Number
TP301 [Theory, Methods];
Subject Classification Code
081202;
Abstract
Object recognition and classification is a challenging task in computer vision because of the large variation in shape, size, and other attributes within the same object class. Further challenges include noise, haze, occlusion, low illumination, blur, and cluttered backgrounds. For these reasons, object recognition and classification has gained attention in recent years, and many researchers have proposed different methods to address the recognition problem. This paper proposes a method for object recognition and classification based on an improved bag of features using SURF (Speeded Up Robust Features) and MSER (Maximally Stable Extremal Regions) local feature extraction. Combining the SURF and MSER feature extraction algorithms improves recognition efficiency, and classification accuracy is further improved by spatial pyramid matching. SURF and MSER extract the local features of an image, which are quantized against a codebook to generate an image histogram. Spatial pyramid matching is applied to this histogram, which improves classification accuracy, and classification is performed with an SVM. The experiments are conducted on the Caltech 101 and Caltech 256 datasets.
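The pipeline described in the abstract (local feature extraction, codebook construction, spatial-pyramid histogram encoding, SVM classification) can be sketched in Python. The sketch below is illustrative only, not the authors' implementation: it assumes an OpenCV contrib build with the non-free modules enabled (required for SURF), scikit-learn for k-means and the SVM, a hypothetical dataset folder "caltech101_subset" arranged as <root>/<class>/<image>.jpg, and illustrative choices of a 200-word vocabulary, two pyramid levels, and a linear SVM.

# Minimal bag-of-features sketch: SURF + MSER keypoints, SURF descriptors,
# k-means codebook, 2-level spatial pyramid histogram, SVM classifier.
# Assumptions: opencv-contrib-python built with non-free modules (for SURF),
# scikit-learn installed, images under <root>/<class_name>/<image>.jpg.
import cv2
import numpy as np
from pathlib import Path
from sklearn.cluster import MiniBatchKMeans
from sklearn.svm import LinearSVC

VOCAB_SIZE = 200          # illustrative codebook size
PYRAMID_LEVELS = 2        # levels 0 and 1 -> 1 + 4 = 5 spatial cells

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # needs non-free build
mser = cv2.MSER_create()

def describe(gray):
    # Detect both SURF and MSER keypoints, describe all of them with SURF.
    kps = list(surf.detect(gray, None)) + list(mser.detect(gray))
    kps, des = surf.compute(gray, kps)
    if des is None:
        return (), np.empty((0, 64), np.float32)
    return kps, des

def pyramid_histogram(kps, des, shape, kmeans):
    # Concatenate per-cell visual-word histograms over the pyramid levels
    # (level weights of the original SPM formulation omitted for brevity).
    h, w = shape
    words = kmeans.predict(des) if len(des) else np.empty(0, dtype=int)
    pts = np.array([kp.pt for kp in kps]) if len(kps) else np.empty((0, 2))
    feats = []
    for level in range(PYRAMID_LEVELS):
        cells = 2 ** level
        if len(pts):
            cx = np.minimum((pts[:, 0] * cells / w).astype(int), cells - 1)
            cy = np.minimum((pts[:, 1] * cells / h).astype(int), cells - 1)
        else:
            cx = cy = np.empty(0, dtype=int)
        for i in range(cells):
            for j in range(cells):
                in_cell = (cy == i) & (cx == j)
                hist = np.bincount(words[in_cell], minlength=VOCAB_SIZE)
                feats.append(hist.astype(np.float32))
    v = np.concatenate(feats)
    return v / (np.linalg.norm(v) + 1e-8)

# Training: build the vocabulary, encode every image, fit the SVM.
root = Path("caltech101_subset")                    # hypothetical dataset folder
paths = sorted(root.glob("*/*.jpg"))
labels = [p.parent.name for p in paths]
grays = [cv2.imread(str(p), cv2.IMREAD_GRAYSCALE) for p in paths]
detections = [describe(g) for g in grays]

all_des = np.vstack([d for _, d in detections if len(d)])
kmeans = MiniBatchKMeans(n_clusters=VOCAB_SIZE, random_state=0).fit(all_des)

X = np.array([pyramid_histogram(k, d, g.shape, kmeans)
              for (k, d), g in zip(detections, grays)])
clf = LinearSVC(C=1.0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))

In practice the codebook would be built on a training split only and accuracy reported on a held-out split; the vocabulary size, pyramid depth, and SVM kernel are the main tuning knobs of such a pipeline.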
Pages: 4
Related Papers
50 records in total
  • [41] Improved Classification Performance With Autoencoder-Based Feature Extraction Using Cross-Disorder Datasets
    Zhang-James, Yanli
    BIOLOGICAL PSYCHIATRY, 2020, 87 (09) : S87 - S88
  • [42] EEG-Based Multi-Modal Emotion Recognition using Bag of Deep Features: An Optimal Feature Selection Approach
    Asghar, Muhammad Adeel
    Khan, Muhammad Jamil
    Fawad
    Amin, Yasar
    Rizwan, Muhammad
    Rahman, MuhibUr
    Badnava, Salman
    Mirjavadi, Seyed Sajad
    SENSORS, 2019, 19 (23)
  • [43] Automatic Facial Expression Recognition: A Survey Based on Feature Extraction and Classification Techniques
    Kauser, Nazima
    Sharma, Jitendra
    PROCEEDINGS OF 2016 INTERNATIONAL CONFERENCE ON ICT IN BUSINESS INDUSTRY & GOVERNMENT (ICTBIG), 2016,
  • [44] Hierarchical classification of SAR data with feature extraction method based on texture features
    Kasapoglu, NG
    Yazgan, B
    RAST 2003: RECENT ADVANCES IN SPACE TECHNOLOGIES, PROCEEDINGS, 2003, : 355 - 358
  • [45] Affine invariant fusion feature extraction based on geometry descriptor and BIT for object recognition
    Yu, Lingli
    Xia, Xumei
    Zhou, Kiajun
    Zhao, Lijun
    IET IMAGE PROCESSING, 2019, 13 (01) : 57 - 72
  • [46] 3D object feature extraction and classification using 3D MF-DFA
    Wang, Jian
    Han, Ziwei
    Jiang, Wenjing
    Kim, Junseok
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2023, 232
  • [47] Object-oriented Classification of remote sensing image based on SPM feature extraction
    Li, Xingang
    Hu, Yan
    INFORMATION TECHNOLOGY FOR MANUFACTURING SYSTEMS II, PTS 1-3, 2011, 58-60 : 1997 - 2001
  • [48] Spatial Domain Entropy Based Local Feature Extraction Scheme for Face Recognition
    Fattah, Shaikh Anowarul
    Islam, Md. Shafiqul
    Islam, Md. Saiful
    2012 7TH INTERNATIONAL CONFERENCE ON ELECTRICAL AND COMPUTER ENGINEERING (ICECE), 2012,
  • [49] Foetal neurodegenerative disease classification using improved deep ResNet classification based VGG-19 feature extraction network
    Gopinath Siddan
    Pradeepa Palraj
    Multimedia Tools and Applications, 2022, 81 : 2393 - 2408
  • [50] Finger Vein Recognition Using Minutia-Based Alignment and Local Binary Pattern-Based Feature Extraction
    Lee, Eui Chul
    Lee, Hyeon Chang
    Park, Kang Ryoung
    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2009, 19 (03) : 179 - 186