Low Computational Cost for Sample Entropy

Cited by: 52
Authors
Manis, George [1 ]
Aktaruzzaman, Md [2 ]
Sassi, Roberto [3 ]
Affiliations
[1] Univ Ioannina, Dept Comp Sci & Engn, GR-45110 Ioannina, Greece
[2] Islamic Univ Kushtia, Dept Comp Sci & Engn, Kushtia 7003, Bangladesh
[3] Univ Milan, Dipartimento Informat, I-26013 Crema, Italy
Keywords
Sample Entropy; algorithm; fast computation; kd-trees; bucket-assisted algorithm;
DOI
10.3390/e20010061
Chinese Library Classification (CLC)
O4 [Physics];
Subject Classification Code
0702;
Abstract
Sample Entropy is the most popular definition of entropy and is widely used as a measure of the regularity/complexity of a time series. On the other hand, it is a computationally expensive method which may require a large amount of time when applied to long time series or to a large number of signals. The computationally intensive part is the similarity check between points in m-dimensional space. In this paper, we propose new algorithms or extend already proposed ones, aiming to compute Sample Entropy quickly. All algorithms return exactly the same value for Sample Entropy, and no approximation techniques are used. We compare and evaluate them using cardiac inter-beat (RR) time series. We investigate three algorithms. The first one is an extension of the kd-trees algorithm, customized for Sample Entropy. The second one is an extension of an algorithm initially proposed for Approximate Entropy, again customized for Sample Entropy, but also improved to deliver even faster results. The last one is a completely new algorithm, presenting the fastest execution times for specific values of m, r, time series length, and signal characteristics. These algorithms are compared with the straightforward implementation, directly resulting from the definition of Sample Entropy, in order to give a clear picture of the speedups achieved. All algorithms assume the classical approach to the metric, in which the maximum norm is used. The key idea of the last two proposed algorithms is to avoid unnecessary comparisons by detecting them early. We use the term unnecessary to refer to those comparisons for which we know a priori that they will fail the similarity check. The number of avoided comparisons is shown to be very large, resulting in a correspondingly large reduction of execution time, making them the fastest algorithms available today for the computation of Sample Entropy.
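For orientation, the sketch below shows the kind of straightforward, brute-force baseline the abstract refers to: it counts template pairs of length m and m + 1 whose maximum-norm (Chebyshev) distance is at most r and returns -ln(A/B). This is a minimal illustration in Python with NumPy, not the authors' implementation; the function name, the synthetic RR-like series, and the choice r = 0.2 times the standard deviation are illustrative assumptions.

```python
import numpy as np


def sample_entropy(x, m=2, r=0.2):
    """Brute-force Sample Entropy: -ln(A / B) under the maximum (Chebyshev) norm.

    B counts pairs of length-m templates within tolerance r, and A counts the
    same pairs extended to length m + 1; self-matches are excluded.  This is
    the naive O(N^2) approach that follows directly from the definition.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    B = 0  # similar template pairs in m-dimensional space
    A = 0  # similar template pairs in (m + 1)-dimensional space
    for i in range(n - m):
        for j in range(i + 1, n - m):
            # similarity check with the maximum norm in m dimensions
            if np.max(np.abs(x[i:i + m] - x[j:j + m])) <= r:
                B += 1
                # extend the comparison by one sample for dimension m + 1
                if abs(x[i + m] - x[j + m]) <= r:
                    A += 1
    if A == 0 or B == 0:
        return float("inf")  # Sample Entropy is undefined without matches
    return -np.log(A / B)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rr = rng.normal(0.8, 0.05, 300)  # synthetic RR-interval-like series (seconds)
    print(sample_entropy(rr, m=2, r=0.2 * rr.std()))
```

The two nested loops make the cost quadratic in the series length; this similarity-check bottleneck is exactly what the kd-tree and bucket-assisted algorithms described in the abstract are designed to reduce by skipping comparisons that are known in advance to fail.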
Pages: 15