The Kullback-Leibler divergence rate between Markov sources

Cited by: 92
Authors
Rached, Z [1 ]
Alajaji, F [1 ]
Campbell, LL [1 ]
Affiliations
[1] Queens Univ, Dept Math & Stat, Kingston, ON K7L 3N6, Canada
Keywords
classification; decision theory; Kullback-Leibler divergence rate; nonnegative matrices; pattern recognition; Perron-Frobenius theory; rate of convergence; Shannon entropy rate; time-invariant Markov sources
DOI
10.1109/TIT.2004.826687
Chinese Library Classification
TP [automation technology, computer technology]
Subject Classification
0812
Abstract
In this work, we provide a computable expression for the Kullback-Leibler divergence rate $\lim_{n\to\infty} \frac{1}{n} D(p^{(n)} \| q^{(n)})$ between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions $p^{(n)}$ and $q^{(n)}$, respectively. We illustrate it numerically and examine its rate of convergence. The main tools used to obtain the Kullback-Leibler divergence rate and its rate of convergence are the theory of nonnegative matrices and Perron-Frobenius theory. Similarly, we provide a formula for the Shannon entropy rate $\lim_{n\to\infty} \frac{1}{n} H(p^{(n)})$ of Markov sources and examine its rate of convergence.
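For orientation, in the classical special case of stationary, irreducible, first-order chains with transition matrices $P$ and $Q$, the divergence rate reduces to the well-known expression $\sum_i \pi_i \sum_j P_{ij} \log(P_{ij}/Q_{ij})$, where $\pi$ is the stationary distribution of $P$, and the entropy rate to $-\sum_i \pi_i \sum_j P_{ij} \log P_{ij}$. The following NumPy sketch computes these two rates under those assumptions only; the matrices P and Q below are illustrative, and the paper's general expression additionally covers arbitrary order, arbitrary initial distributions, and chains where these classical formulas do not directly apply.

```python
# Sketch of the classical special case: KL divergence rate and entropy rate
# for stationary, irreducible, first-order Markov chains. The paper's general
# result (via nonnegative matrices and Perron-Frobenius theory) is broader;
# this sketch is not the paper's construction.
import numpy as np

def stationary_distribution(P):
    """Stationary distribution of an irreducible row-stochastic matrix P,
    i.e., the left Perron-Frobenius eigenvector for eigenvalue 1."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    return pi / pi.sum()

def kl_divergence_rate(P, Q):
    """lim (1/n) D(p^(n) || q^(n)) = sum_i pi_i sum_j P_ij log(P_ij / Q_ij),
    in nats per symbol. Assumes Q_ij > 0 wherever P_ij > 0."""
    pi = stationary_distribution(P)
    mask = P > 0  # terms with P_ij = 0 contribute 0 by convention
    ratio = np.where(mask, P, 1.0) / np.where(mask, Q, 1.0)
    return float(np.sum(pi[:, None] * np.where(mask, P, 0.0) * np.log(ratio)))

def entropy_rate(P):
    """lim (1/n) H(p^(n)) = -sum_i pi_i sum_j P_ij log P_ij, in nats/symbol."""
    pi = stationary_distribution(P)
    mask = P > 0
    safe_P = np.where(mask, P, 1.0)
    return float(-np.sum(pi[:, None] * np.where(mask, P, 0.0) * np.log(safe_P)))

# Illustrative transition matrices (hypothetical, not taken from the paper).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
Q = np.array([[0.5, 0.5],
              [0.5, 0.5]])
print(kl_divergence_rate(P, Q))  # divergence rate of the P-chain from the Q-chain
print(entropy_rate(P))           # entropy rate of the P-chain
```

The stationary distribution is extracted as the left eigenvector for the Perron-Frobenius eigenvalue 1, mirroring the matrix-theoretic tools the abstract names.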
Pages: 917-921 (5 pages)