Restricted Boltzmann Machines Without Random Number Generators for Efficient Digital Hardware Implementation

Cited by: 3
Authors
Hori, Sansei [1 ]
Morie, Takashi [1 ]
Tamukoh, Hakaru [1 ]
Affiliations
[1] Kyushu Inst Technol, Grad Sch Life Sci & Syst Engn, Wakamatsu Ku, 2-4 Hibikino, Kitakyushu, Fukuoka 8080196, Japan
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2016, PT I | 2016, Vol. 9886
Keywords
Restricted Boltzmann machines; Deep learning; Random number generators; Digital hardware; FPGA;
DOI
10.1007/978-3-319-44778-0_46
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Restricted Boltzmann machines (RBMs) have been actively studied in the field of deep neural networks. RBMs are stochastic artificial neural networks that can learn the probability distribution of an input dataset. However, they require considerable computational resources, long processing times, and high power consumption because of the huge number of random numbers that must be generated to obtain stochastic behavior. Dedicated hardware implementations of RBMs are therefore desired for consumer applications on low-power devices. To implement RBMs in hardware in a massively parallel manner, each unit must include a random number generator (RNG), and these RNGs occupy a large amount of hardware resources. In this paper, we propose a hardware-oriented RBM algorithm that does not require RNGs. In the proposed method, the underflow bits produced during the calculation of the firing probability are employed as random numbers. We developed a software implementation of fixed-point RBMs to evaluate the proposed method. Experimental results show that a 16-bit fixed-point RBM can be trained with the proposed method and that the underflow bits can serve as random numbers during RBM training.
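The core idea in the abstract can be illustrated with a minimal Python sketch, not the authors' exact circuit: the fixed-point format (8 fraction bits here), the piecewise-linear sigmoid approximation, and all names below are illustrative assumptions, not details taken from the paper.

```python
FRAC_BITS = 8                 # fraction bits of the assumed fixed-point format
SCALE = 1 << FRAC_BITS

def to_fixed(x):
    """Convert a float to the assumed fixed-point representation."""
    return int(round(x * SCALE))

def fixed_sigmoid(a_fx):
    """Hardware-friendly piecewise-linear sigmoid approximation:
    clamp(a/4 + 1/2, 0, 1), returned with FRAC_BITS fraction bits."""
    return max(0, min(SCALE, (a_fx >> 2) + (SCALE >> 1)))

def sample_unit(weights_fx, inputs_fx, bias_fx):
    """Stochastic binary unit without an explicit RNG: the fraction bits
    discarded when truncating the double-width accumulator serve as the
    pseudo-random threshold for sampling."""
    acc = bias_fx << FRAC_BITS            # align bias to 2*FRAC_BITS fraction
    for w, v in zip(weights_fx, inputs_fx):
        acc += w * v                      # products carry 2*FRAC_BITS fraction
    act_fx = acc >> FRAC_BITS             # truncated activation
    underflow = acc & (SCALE - 1)         # the bits the truncation threw away
    p_fx = fixed_sigmoid(act_fx)          # firing probability in [0, SCALE]
    return (1 if underflow < p_fx else 0), p_fx

# Example: one hidden unit with two inputs (all values illustrative)
w = [to_fixed(0.5), to_fixed(-0.25)]
v = [to_fixed(1.0), to_fixed(0.6)]
h, p = sample_unit(w, v, to_fixed(0.1))
```

In such a design, the quality of these "free" random bits depends on the data and weight statistics, which is what the paper evaluates with its 16-bit fixed-point software implementation.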
Pages: 391-398
Page count: 8
References (10 entries)
[1] [Anonymous], 2012, P INT C MACH LEARN
[2] [Anonymous], 2015, P RISP INT WORKSH NO
[3] [Anonymous], 2010003 UTML TR U TR
[4] Fischer, A.: In: Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications: Proceedings of the 17th Iberoamerican Congress, CIARP 2012, p. 14 (2012). DOI 10.1007/978-3-642-33275-3_2
[5] Hinton, G.E., Osindero, S., Teh, Y.-W.: A fast learning algorithm for deep belief nets. Neural Computation 18(7), 1527-1554 (2006)
[6] Kim, S.K., McMahon, P.L., Olukotun, K.: A large-scale architecture for restricted Boltzmann machines. In: 18th IEEE Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM 2010), pp. 201-208 (2010)
[7] Kim, S.K.: In: International Conference on Field Programmable Logic and Applications, p. 367 (2009). DOI 10.1109/FPL.2009.5272262
[8] Krizhevsky, A., Sutskever, I., Hinton, G.E.: ImageNet classification with deep convolutional neural networks. Communications of the ACM 60(6), 84-90 (2017)
[9] Le Ly, D., Chow, P.: High-performance reconfigurable hardware architecture for restricted Boltzmann machines. IEEE Transactions on Neural Networks 21(11), 1780-1792 (2010)
[10] Park, S.: In: ISSCC Digest of Technical Papers, vol. 58, p. 80 (2015). DOI 10.1109/ISSCC.2015.7062935