Transfer Learning Using Rotated Image Data to Improve Deep Neural Network Performance

Times Cited: 14
Authors
Amaral, Telmo [1]
Silva, Luis M. [1,2]
Alexandre, Luis A. [3]
Kandaswamy, Chetak [1]
de Sa, Joaquim Marques [1,4]
Santos, Jorge M. [1,5]
Affiliations
[1] Univ Porto, Inst Engn Biomed INEB, Rua Campo Alegre 823, P-4100 Oporto, Portugal
[2] Univ Aveiro, Dept Matemat, Aveiro, Portugal
[3] Univ Beira Interior, Inst Telecomun, Covilha, Portugal
[4] Univ Porto, Dept Eng Elect & Comp, Fac Eng, Porto, Portugal
[5] Inst Super Engn Inst Politecn Porto, Dept Matemat, Porto, Portugal
Source
IMAGE ANALYSIS AND RECOGNITION, ICIAR 2014, PT I | 2014, Vol. 8814
Keywords
Transfer learning; Deep learning; Stacked auto-encoders
DOI
10.1007/978-3-319-11758-4_32
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this work we explore the idea that, when only a small training set of images is available, it can be beneficial to use that set itself to obtain a transformed training set (by applying a random rotation to each sample), train a source network on the transformed data, and then retrain the source network on the original data. Applying this transfer learning technique to three different types of character data, we achieve average relative improvements of between 6% and 16% in the classification test error. Furthermore, we show that relative improvements of between 8% and 42% are possible in cases where the amount of original training samples is very limited (30 samples per class), by introducing not just one but several random rotations per sample.
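The sketch below illustrates the rotation-based transfer scheme described in the abstract: build a "source" training set by randomly rotating each original sample, train a network on that rotated set, and then continue training the same network on the original, unrotated data. The paper works with stacked auto-encoders; here a plain scikit-learn MLP with warm_start is used as a simplified stand-in, and the dataset (load_digits), rotation range, and hyper-parameters are illustrative assumptions rather than the paper's actual setup.

```python
# Minimal sketch of rotation-based transfer learning (assumptions noted above).
import numpy as np
from scipy.ndimage import rotate
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# 8x8 handwritten-digit images stand in for the paper's character data.
digits = load_digits()
X, y = digits.images, digits.target
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.2, stratify=y, random_state=0)  # deliberately small training set


def randomly_rotated(images, max_angle=45.0):
    """Return one randomly rotated copy of each image (angle range is an assumption)."""
    out = np.empty_like(images)
    for i, img in enumerate(images):
        angle = rng.uniform(-max_angle, max_angle)
        out[i] = rotate(img, angle, reshape=False, order=1, mode="nearest")
    return out


def flat(images):
    """Flatten images and scale pixel values to [0, 1]."""
    return images.reshape(len(images), -1) / 16.0


# 1) Train the "source" network on the rotated version of the training set.
net = MLPClassifier(hidden_layer_sizes=(100,), max_iter=200,
                    warm_start=True, random_state=0)
net.fit(flat(randomly_rotated(X_train)), y_train)

# 2) Retrain (fine-tune) the same network on the original, unrotated data.
net.fit(flat(X_train), y_train)

print("test accuracy after transfer:", net.score(flat(X_test), y_test))
```

To mimic the paper's second experiment (several random rotations per sample when data is very scarce), one could stack multiple calls to randomly_rotated and the corresponding label copies before the first fit; the retraining step on the original data stays the same.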
Pages: 290-300
Number of pages: 11