Successive refinement for hypothesis testing and lossless one-helper problem

Cited by: 36
Authors
Tian, Chao [1 ]
Chen, Jun [2 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Sch Comp & Commun Sci, CH-1015 Lausanne, Switzerland
[2] McMaster Univ, Dept Elect & Comp Engn, Hamilton, ON L8S 4K1, Canada
Keywords
entropy characterization; error exponent; hypothesis testing; image size characterization; information bottleneck; one-helper problem; successive refinement
DOI
10.1109/TIT.2008.928951
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline code
0812
Abstract
We investigate two closely related successive refinement (SR) coding problems: 1) In the hypothesis testing (HT) problem, the bivariate hypothesis test H_0: P_{XY} against H_1: P_X P_Y, i.e., testing against independence, is considered. One remote sensor collects the data stream X and sends summary information, constrained by SR coding rates, to a decision center which observes the data stream Y directly. 2) In the one-helper (OH) problem, X and Y are encoded separately and the receiver seeks to reconstruct Y losslessly. Multiple levels of coding rates are allowed at the two sensors, and the transmissions are performed in an SR manner. We show that the SR-HT rate-error-exponent region and the SR-OH rate region can be reduced to essentially the same entropy characterization form. Single-letter solutions are thus provided in a unified fashion, and the connection between them is discussed. These problems are also related to the information bottleneck (IB) problem, and through this connection we provide a straightforward operational meaning for the IB method. Connections to the pattern recognition problem, the notion of successive refinability, and two specific sources are also discussed. A strong converse for the SR-HT problem is proved by generalizing the image size characterization method, which shows that the optimal type-two error exponents under constant type-one error constraints are independent of the exact values of those constants.
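For background, a minimal sketch of the single-rate special case that the SR-HT setting above generalizes: this is the classical Ahlswede-Csiszar problem of testing against independence, where the sensor observing X sends a message of rate at most R and the decision center observes Y. The auxiliary variable U and the displayed characterization below are assumptions drawn from that classical setting, not claims about this paper's multi-level result:

\theta(R) \;=\; \max_{P_{U\mid X}\,:\; I(U;X)\,\le\, R} I(U;Y), \qquad \text{with } U - X - Y \text{ a Markov chain},

where \theta(R) is the optimal type-two error exponent under a fixed type-one error constraint. The trade-off between the compression rate I(U;X) and the relevance I(U;Y) is exactly the information bottleneck trade-off alluded to in the abstract; the SR-HT region extends it to multiple refinement levels.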
Pages: 4666-4681
Number of pages: 16