An Introductory Review of Deep Learning for Prediction Models With Big Data

Times Cited: 347
Authors
Emmert-Streib, Frank [1 ,2 ]
Yang, Zhen [1 ]
Feng, Han [1 ,3 ]
Tripathi, Shailesh [1 ,3 ]
Dehmer, Matthias [3 ,4 ,5 ]
Affiliations
[1] Tampere Univ, Fac Informat Technol & Commun Sci, Predict Soc & Data Analyt Lab, Tampere, Finland
[2] Inst Biosci & Med Technol, Tampere, Finland
[3] Univ Appl Sci Upper Austria, Sch Management, Steyr, Austria
[4] Univ Hlth Sci Med Informat & Technol UMIT, Dept Biomed Comp Sci & Mechatron, Hall In Tirol, Austria
[5] Nankai Univ, Coll Artificial Intelligence, Tianjin, Peoples R China
Source
FRONTIERS IN ARTIFICIAL INTELLIGENCE | 2020 / Volume 3
Keywords
deep learning; artificial intelligence; machine learning; neural networks; prediction models; data science; NEURAL-NETWORKS; CLASSIFICATION; LSTM; REPRESENTATIONS; DIMENSIONALITY; RECOGNITION; AUTOENCODER; SPEECH; NETS
DOI
10.3389/frai.2020.00004
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Deep learning models represent a new learning paradigm in artificial intelligence (AI) and machine learning. Recent breakthrough results in image analysis and speech recognition have generated massive interest in this field, since applications in many other domains that provide big data also seem feasible. On the downside, the mathematical and computational methodology underlying deep learning models is very challenging, especially for interdisciplinary scientists. For this reason, this paper presents an introductory review of deep learning approaches, including Deep Feedforward Neural Networks (D-FFNN), Convolutional Neural Networks (CNNs), Deep Belief Networks (DBNs), Autoencoders (AEs), and Long Short-Term Memory (LSTM) networks. These models form the major core architectures of deep learning currently in use and should belong in any data scientist's toolbox. Importantly, these core architectural building blocks can be composed flexibly, in an almost Lego-like manner, to build new application-specific network architectures. Hence, a basic understanding of these network architectures is important in order to be prepared for future developments in AI.
Pages: 23