Neural network training algorithms on parallel architectures for finance applications

Cited by: 8
Authors
Thulasiram, RK [1 ]
Rahman, RM [1 ]
Thulasiraman, P [1 ]
Affiliations
[1] Univ Manitoba, Dept Comp Sci, Winnipeg, MB R3T 2N2, Canada
Source
2003 INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING WORKSHOPS, PROCEEDINGS | 2003
DOI
10.1109/ICPPW.2003.1240376
CLC Classification
TP3 [Computing Technology, Computer Technology];
Subject Classification
0812;
Abstract
We focus on the neural network training problem, which can be used for price forecasting and other purposes in finance. We design and develop four parallel, multithreaded backpropagation neural network algorithms: neuron parallelism and training-set parallelism on a distributed-memory architecture using MPI, and loop-level (fine-grain) and coarse-grain parallelism on a shared-memory architecture using OpenMP. We conducted various experiments to study the performance of these algorithms and compared our results with a traditional autoregression model to establish their accuracy. The comparison between our MPI and OpenMP results suggests that training-set parallelism outperforms all the other types of parallelism considered in the study.
Pages: 236 - 243
Page count: 8