Pre-trained CNNs Models for Content based Image Retrieval

Cited by: 0
Authors
Ahmed, Ali [1]
Affiliations
[1] King Abdulaziz Univ Rabigh, Fac Comp & Informat Technol, Rabigh 21589, Saudi Arabia
Keywords
Pre-trained deep neural networks; transfer learning; content-based image retrieval; classification
DOI
10.14569/IJACSA.2021.0120723
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
Content-based image retrieval (CBIR) is a widely used approach to image retrieval that rests on two pillars: extracted features and similarity measures. Low-level image representations based on colour, texture, and shape properties are the feature extraction methods most commonly used by traditional CBIR systems. Since such handcrafted features require good prior domain knowledge, inaccurate features can widen the semantic gap and lead to very poor retrieval performance. Feature extraction methods that are independent of domain knowledge and can learn automatically from the input image are therefore highly useful. Recently, pre-trained deep convolutional neural networks (CNNs) with transfer learning have shown the ability to generate and extract accurate and expressive features from image data. Unlike deep CNN models trained from scratch, which require huge amounts of data and massive processing time, pre-trained CNN models have already been trained on thousands of classes over large-scale image data, and their learned representations can easily be reused and transferred. ResNet18 and SqueezeNet are successful and effective examples of pre-trained CNN models used recently in many machine learning applications, such as classification, clustering, and object recognition. In this study, we develop CBIR systems based on features extracted with the ResNet18 and SqueezeNet pre-trained CNN models. The two groups of features are extracted and stored separately, and are later used for online image searching and retrieval. Experimental results on two popular image datasets, Corel-1K and GHIM-10K, show that the ResNet18-based CBIR method achieves overall accuracies of 95.5% and 93.9% on the two datasets, respectively, greatly outperforming CBIR methods based on traditional handcrafted features.
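The pipeline the abstract describes (offline feature extraction with a pre-trained CNN, then online similarity search over the stored features) can be illustrated with a short Python sketch. This is a minimal illustration only, assuming PyTorch with a recent torchvision and cosine similarity as the distance measure; the paper does not specify its framework or similarity function, and the helper names extract_features and retrieve are hypothetical.

# Minimal CBIR sketch in the spirit of the paper (illustrative, not the authors' implementation).
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Load a pre-trained ResNet18 and drop its classification head, so the
# network acts as a fixed feature extractor (transfer learning).
resnet = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
extractor = torch.nn.Sequential(*list(resnet.children())[:-1]).eval()

# Standard ImageNet preprocessing, matching what the network was trained on.
preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(path: str) -> torch.Tensor:
    """Return a 512-d feature vector for one image (run offline over the whole dataset)."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    return extractor(x).flatten()

def retrieve(query_path: str, index: dict[str, torch.Tensor], top_k: int = 10):
    """Online step: rank indexed images by cosine similarity to the query."""
    q = extract_features(query_path)
    scores = {p: torch.nn.functional.cosine_similarity(q, f, dim=0).item()
              for p, f in index.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

Swapping models.resnet18 for models.squeezenet1_0 (with its own feature dimensionality) would give the paper's second feature group; keeping the two feature indexes separate, as the abstract notes, lets the two models be compared on the same queries.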
Pages: 200-206
Page count: 7