Fast Sparse Deep Neural Networks: Theory and Performance Analysis

Cited by: 3
Authors
Zhao, Jin [1 ]
Jiao, Licheng [1 ]
Affiliations
[1] Xidian Univ, Sch Artificial Intelligence, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710071, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Sparse representation; extreme learning machine; deep neural networks; convex approximation; fast sparse deep neural networks; EXTREME LEARNING-MACHINE; FACE RECOGNITION; REGRESSION;
DOI
10.1109/ACCESS.2019.2920688
CLC number
TP [Automation Technology, Computer Technology];
Discipline code
0812 ;
Abstract
In this paper, fast sparse deep neural networks are proposed, aiming to offer an alternative way of learning in a deep structure. We examine several optimization algorithms for traditional deep neural networks and find that their training is time-consuming because of the large number of connection parameters between layers. To reduce this cost, the proposed fast sparse deep neural networks are designed around two ideas. First, the parameters of each hidden layer are obtained via closed-form solutions, in contrast to the iterative updating strategy of the BP algorithm. Second, fast sparse deep neural networks estimate the output target by summing multi-layer linear approximations, which differs from most deep neural network models. Unlike traditional deep neural networks, fast sparse deep neural networks achieve excellent generalization performance without fine-tuning. In addition, they effectively overcome shortcomings of the extreme learning machine and the hierarchical extreme learning machine. Extensive experimental results on benchmark datasets demonstrate that, compared with existing deep neural networks, the proposed model and optimization algorithms are feasible and efficient.
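The abstract describes two design choices: closed-form parameter learning at each hidden layer (as in extreme learning machines) and a summation of layer-wise linear approximations of the target. The following is a minimal sketch of that general scheme, not the authors' exact method: it assumes random hidden-layer weights, a ridge-regression closed-form readout per layer, and each layer fitting the residual left by the previous layers; all function names and hyperparameters are illustrative.

```python
import numpy as np

def fit_fsdnn(X, T, n_layers=3, n_hidden=64, reg=1e-3, seed=0):
    """Stage-wise fit: each layer's readout has a closed-form (ridge)
    solution, and the model output is the sum of per-layer readouts."""
    rng = np.random.default_rng(seed)
    params, residual, H = [], T.copy(), X
    for _ in range(n_layers):
        # Random, fixed hidden weights -- no iterative BP updates
        W = rng.standard_normal((H.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        H = np.tanh(H @ W + b)
        # Closed-form ridge solution: beta = (H'H + reg*I)^{-1} H' residual
        beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ residual)
        params.append((W, b, beta))
        residual = residual - H @ beta  # next layer models the leftover error
    return params

def predict_fsdnn(X, params):
    H, out = X, 0.0
    for W, b, beta in params:
        H = np.tanh(H @ W + b)
        out = out + H @ beta  # summation of layer-wise linear estimates
    return out

# Toy regression: approximate y = sin(x)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X)
params = fit_fsdnn(X, T)
pred = predict_fsdnn(X, params)
```

Because every layer is solved in one linear-algebra step, training cost is dominated by a few matrix factorizations rather than many gradient iterations, which is the speed argument the abstract makes.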
Pages: 74040-74055
Page count: 16