Information-theoretical analysis of statistical measures for multiscale dynamics

Cited by: 1
Authors
Asuke, Naoki [1 ]
Yamagami, Tomoki [1 ]
Mihana, Takatomo [1 ]
Rohm, Andre [1 ]
Horisaki, Ryoichi [1 ]
Naruse, Makoto [1 ]
Affiliations
[1] Univ Tokyo, Grad Sch Informat Sci & Technol, Dept Informat Phys & Comp, 7-3-1 Hongo,Bunkyo Ku, Tokyo 1138656, Japan
Funding
Japan Science and Technology Agency (JST); Japan Society for the Promotion of Science (JSPS)
Keywords
OPTICAL FEEDBACK; ENTROPY;
DOI
10.1063/5.0141099
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Multiscale entropy (MSE) has been widely used to examine nonlinear systems involving multiple time scales, such as biological and economic systems. Meanwhile, the Allan variance has been used to evaluate the stability of oscillators, such as clocks and lasers, from short to long time scales. Although these two statistical measures were developed independently, for different purposes and in different fields, both aim to characterize the multiscale temporal structure of the phenomena under study. We show that, from an information-theoretical perspective, they share common foundations and exhibit similar tendencies. We experimentally confirm that the MSE and Allan variance behave similarly for low-frequency fluctuations (LFF) in chaotic lasers and for physiological heartbeat data. Furthermore, we derive the condition under which this consistency between the MSE and Allan variance holds, which is expressed in terms of certain conditional probabilities. Heuristically, physical systems in nature, including the aforementioned LFF and heartbeat data, mostly satisfy this condition, and hence the MSE and Allan variance demonstrate similar properties. As a counterexample, we construct an artificial random sequence for which the MSE and Allan variance exhibit different trends.
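To make the two measures concrete, the following is a minimal NumPy sketch of the standard textbook formulations (not the authors' implementation): the non-overlapping coarse-graining step that MSE and the Allan variance share, the Allan variance at an averaging factor m, and the MSE curve built from sample entropy of the coarse-grained series.

```python
import numpy as np

def coarse_grain(x, m):
    """Non-overlapping block means at scale factor m (the coarse-graining
    step used by multiscale entropy)."""
    x = np.asarray(x, dtype=float)
    n = len(x) // m
    return x[:n * m].reshape(n, m).mean(axis=1)

def allan_variance(x, m):
    """Allan variance at averaging factor m: half the mean squared
    difference between successive block means."""
    y = coarse_grain(x, m)
    return 0.5 * np.mean(np.diff(y) ** 2)

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r): negative log of the conditional
    probability that sequences matching for m points also match for m+1."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)

    def match_count(mm):
        # all length-mm templates and their pairwise Chebyshev distances
        t = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        # count pairs within tolerance r, excluding self-matches
        return np.sum(d <= r) - len(t)

    return -np.log(match_count(m + 1) / match_count(m))

def multiscale_entropy(x, scales, m=2, r=None):
    """MSE curve: sample entropy of the coarse-grained series at each scale."""
    if r is None:
        r = 0.2 * np.std(x)  # fix r from the original series, as is customary
    return [sample_entropy(coarse_grain(x, s), m=m, r=r) for s in scales]
```

Both measures start from the same block-averaged series, which is why they probe the same multiscale structure: the Allan variance summarizes it with a second moment of successive block means, while MSE summarizes it with a conditional-probability (entropy) statistic.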
Pages: 15
Related papers
(50 in total)
  • [1] Information-theoretical analysis of the statistical dependencies among three variables: Applications to written language
    Hernandez, Damian G.
    Zanette, Damian H.
    Samengo, Ines
    PHYSICAL REVIEW E, 2015, 92 (02)
  • [2] Information-Theoretical Analysis of EEG Microstate Sequences in Python
    von Wegner, Frederic
    Laufs, Helmut
    FRONTIERS IN NEUROINFORMATICS, 2018, 12
  • [3] Information-Theoretical Analysis of the Neural Code in the Rodent Temporal Lobe
    Maidana Capitan, Melisa B.
    Kropff, Emilio
    Samengo, Ines
    ENTROPY, 2018, 20 (08)
  • [4] Information-theoretical complexity for the hydrogenic abstraction reaction
    Esquivel, Rodolfo O.
    Molina-Espiritu, Moyocoyani
    Carlos Angulo, Juan
    Antolin, Juan
    Flores-Gallegos, Nelson
    Dehesa, Jesus S.
    MOLECULAR PHYSICS, 2011, 109 (19) : 2353 - 2365
  • [5] Information-Theoretical Entropy as a Measure of Sequence Variability
    Shenkin, PS
    Erman, B
    Mastrandrea, LD
    PROTEINS-STRUCTURE FUNCTION AND GENETICS, 1991, 11 (04): 297 - 313
  • [6] An information-theoretical perspective on weighted ensemble forecasts
    Weijs, Steven V.
    van de Giesen, Nick
    JOURNAL OF HYDROLOGY, 2013, 498 : 177 - 190
  • [7] Three information-theoretical methods to estimate a random variable
    Lind, NC
    JOURNAL OF ENVIRONMENTAL MANAGEMENT, 1997, 49 (01) : 43 - 51
  • [8] Information-theoretical properties of a sequence of quantum nondemolition measurements
    Ban, M
    PHYSICS LETTERS A, 1998, 249 (03) : 167 - 179
  • [9] Diversity and interdisciplinarity: Should variety, balance and disparity be combined as a product or better as a sum? An information-theoretical and statistical estimation approach
    Mutz, Ruediger
    SCIENTOMETRICS, 2022, 127 (12) : 7397 - 7414
  • [10] Concurrent Phenomena at the Transition Region of Selected Elementary Chemical Reactions: An Information-Theoretical Complexity Analysis
    Esquivel, Rodolfo O.
    Molina-Espiritu, Moyocoyani
    Dehesa, Jesus S.
    Carlos Angulo, Juan
    Antolin, Juan
    INTERNATIONAL JOURNAL OF QUANTUM CHEMISTRY, 2012, 112 (22) : 3578 - 3586