Constrained Center Loss for Convolutional Neural Networks

Cited by: 8
Authors
Shi, Zhanglei [1 ]
Wang, Hao [2 ]
Leung, Chi-Sing [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Elect Engn, Hong Kong, Peoples R China
[2] Shenzhen Univ, Guangdong Multimedia Informat Serv Engn Technol R, Shenzhen 518060, Peoples R China
Keywords
Convolutional neural networks; Field-flow fractionation; Training; Prototypes; Feature extraction; Optimization; Clustering algorithms; Center loss (CL); constrained center loss (CCL); convolutional neural networks (CNNs); image classification
DOI
10.1109/TNNLS.2021.3104392
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
From the feature representation's point of view, the feature learning module of a convolutional neural network (CNN) transforms an input pattern into a feature vector. This feature vector is then multiplied with a number of output weight vectors to produce softmax scores. The common training objective in CNNs is based on the softmax loss, which ignores intra-class compactness. This brief proposes a constrained center loss (CCL)-based algorithm to extract robust features. The training objective of a CNN consists of two terms: the softmax loss and the CCL. The softmax loss pushes the feature vectors from different classes apart, while the CCL clusters the feature vectors so that feature vectors from the same class are close together. Instead of using stochastic gradient descent (SGD) algorithms to learn all the connection weights and the cluster centers at the same time, our CCL-based algorithm uses an alternating learning strategy. We first fix the connection weights of the CNN and update the cluster centers with an analytical formula, which can be implemented on a minibatch basis. We then fix the cluster centers and update the connection weights for a number of SGD minibatch iterations. We also propose a simplified CCL (SCCL) algorithm. Experiments are performed on six commonly used benchmark datasets. The results demonstrate that the two proposed algorithms outperform several state-of-the-art approaches.
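The abstract describes the alternating scheme only in outline; it gives neither the loss weighting nor the exact analytical center-update formula. The following PyTorch-style sketch is therefore an illustrative reading, not the authors' implementation: the model returning (features, logits), the per-class minibatch mean used to refresh the centers, and the lambda_ccl weight are all assumptions.

```python
# Illustrative sketch only: the class-mean center update and lambda_ccl below are
# assumptions standing in for the paper's analytical formula and loss weighting.
import torch
import torch.nn.functional as F


def ccl_term(features, labels, centers):
    # Mean squared distance between each feature vector and its class center
    # (the intra-class compactness term).
    return ((features - centers[labels]) ** 2).sum(dim=1).mean()


@torch.no_grad()
def update_centers(model, loader, centers, device):
    # Phase 1: connection weights fixed; recompute each class center from the
    # features accumulated over minibatches (assumed class-mean rule).
    sums = torch.zeros_like(centers)
    counts = torch.zeros(centers.size(0), device=device)
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        feats, _ = model(x)  # model assumed to return (features, logits)
        sums.index_add_(0, y, feats)
        counts.index_add_(0, y, torch.ones_like(y, dtype=torch.float))
    seen = counts > 0
    centers[seen] = sums[seen] / counts[seen].unsqueeze(1)
    return centers


def update_weights(model, loader, centers, optimizer, device, lambda_ccl=0.01):
    # Phase 2: cluster centers fixed; SGD minibatch iterations on
    # softmax loss + lambda_ccl * CCL update the connection weights.
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        feats, logits = model(x)
        loss = F.cross_entropy(logits, y) + lambda_ccl * ccl_term(feats, y, centers)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Compared with learning the centers by SGD alongside the connection weights, as in the classical center loss, alternating the two phases lets each center update use a closed-form statistic of the current features, which is the design point the abstract emphasizes.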
Pages: 1080-1088
Page count: 9
Related Papers (50 records in total)
  • [1] Centrosymmetric constrained Convolutional Neural Networks
    Zheng, Keyin
    Qian, Yuhua
    Yuan, Zhian
    Peng, Furong
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (07) : 2749 - 2760
  • [2] Hybrid Loss-Constrained Lightweight Convolutional Neural Networks for Cervical Cell Classification
    Chen, Wen
    Shen, Weiming
    Gao, Liang
    Li, Xinyu
    SENSORS, 2022, 22 (09)
  • [3] Convolutional Neural Networks at Constrained Time Cost
    He, Kaiming
    Sun, Jian
    2015 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2015, : 5353 - 5360
  • [4] Constrained Convolutional Neural Networks for Weakly Supervised Segmentation
    Pathak, Deepak
    Kraehenbuehl, Philipp
    Darrell, Trevor
    2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015, : 1796 - 1804
  • [5] Optimization of Convolutional Neural Networks on Resource Constrained Devices
    Arish, S.
    Sinha, Sharad
    Smitha, K. G.
    2019 IEEE COMPUTER SOCIETY ANNUAL SYMPOSIUM ON VLSI (ISVLSI 2019), 2019, : 19 - 24
  • [6] Pairwise Gaussian Loss for Convolutional Neural Networks
    Qin, Yuxiang
    Yan, Chungang
    Liu, Guanjun
    Li, Zhenchuan
    Jiang, Changjun
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2020, 16 (10) : 6324 - 6333
  • [7] Dissecting Convolutional Neural Networks for Efficient Implementation on Constrained Platforms
    Laguduva, Vishalini R.
    Mahmud, Shakil
    Aakur, Sathyanarayanan N.
    Karam, Robert
    Katkoori, Srinivas
    2020 33RD INTERNATIONAL CONFERENCE ON VLSI DESIGN AND 2020 19TH INTERNATIONAL CONFERENCE ON EMBEDDED SYSTEMS (VLSID), 2020, : 149 - 154
  • [8] Designing convolutional neural networks with constrained evolutionary piecemeal training
    Dolly Sapra
    Andy D. Pimentel
    Applied Intelligence, 2022, 52 : 17103 - 17117
  • [9] Designing convolutional neural networks with constrained evolutionary piecemeal training
    Sapra, Dolly
    Pimentel, Andy D.
    APPLIED INTELLIGENCE, 2022, 52 (15) : 17103 - 17117
  • [10] Efficient Loss Landscape Reshaping for Convolutional Neural Networks
    Chen, Liangming
    Jin, Long
    Shang, Mingsheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024,