GCRNN: Group-Constrained Convolutional Recurrent Neural Network

Cited by: 50
Authors
Lin, Sangdi [1]
Runger, George C. [2]
Affiliations
[1] Arizona State Univ, Sch Comp Informat & Decis Syst Engn, Tempe, AZ 85281 USA
[2] Arizona State Univ, Dept Biomed Informat, Sch Comp Informat & Decis Syst Engn, Tempe, AZ 85281 USA
Keywords
Convolutional neural network (CNN); deep learning; recurrent neural network (RNN); regularization; sparse group lasso (SGL); time-series classification (TSC); similarity
DOI
10.1109/TNNLS.2017.2772336
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose a new end-to-end deep neural network model for time-series classification (TSC) that emphasizes both accuracy and interpretability. The proposed model contains a convolutional network component to extract high-level features and a recurrent network component to enhance the modeling of the temporal characteristics of TS data. In addition, a feedforward fully connected network with sparse group lasso (SGL) regularization generates the final classification. The proposed architecture not only achieves satisfactory classification accuracy, but also gains interpretability through the SGL regularization. All of these networks are connected and jointly trained in an end-to-end framework, which can be applied to TSC tasks across different domains without manual feature engineering. Our experiments on various TS data sets show that the proposed model outperforms the traditional convolutional neural network model in classification accuracy, and also demonstrate how the SGL contributes to better model interpretation.
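The pipeline the abstract describes (a convolutional feature extractor, a recurrent layer for temporal modeling, and a fully connected classifier penalized with a sparse group lasso) can be sketched as below. This is a minimal illustration and not the authors' exact GCRNN: the layer sizes, the choice of a GRU, the pooling, and the column-wise SGL grouping are assumptions made only for the example.

```python
# Minimal sketch of a CNN + RNN + fully connected classifier with an SGL
# penalty on the classifier weights. All hyperparameters are illustrative
# assumptions, not the published GCRNN configuration.
import torch
import torch.nn as nn


class ConvRecurrentSGL(nn.Module):
    def __init__(self, in_channels=1, n_classes=5, conv_channels=32, hidden=64):
        super().__init__()
        # Convolutional component: extracts local, high-level features.
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Recurrent component: models temporal dependencies of the features.
        self.rnn = nn.GRU(conv_channels, hidden, batch_first=True)
        # Fully connected classifier; the SGL penalty is applied to fc.weight.
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, channels, time)
        h = self.conv(x)                     # (batch, conv_channels, time/2)
        h, _ = self.rnn(h.transpose(1, 2))   # (batch, time/2, hidden)
        return self.fc(h[:, -1, :])          # classify from the last hidden state


def sparse_group_lasso(weight, alpha=0.5, lam=1e-3):
    """SGL penalty: lam * (alpha * ||W||_1 + (1 - alpha) * sum_g ||W_g||_2),
    with one group per input feature (column) of the classifier weights."""
    l1 = weight.abs().sum()
    group_l2 = weight.norm(dim=0).sum()      # L2 norm of each column (group)
    return lam * (alpha * l1 + (1 - alpha) * group_l2)


# Usage sketch: add the SGL term to the cross-entropy loss during training.
model = ConvRecurrentSGL()
x = torch.randn(8, 1, 128)                   # 8 univariate series of length 128
y = torch.randint(0, 5, (8,))
loss = nn.CrossEntropyLoss()(model(x), y) + sparse_group_lasso(model.fc.weight)
loss.backward()
```

The group term of the penalty can drive whole columns of the classifier weights to zero, which is what makes the selected feature groups, and hence the model's decisions, easier to interpret.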
Pages: 4709-4718 (10 pages)