Least Squares Superposition Codes of Moderate Dictionary Size Are Reliable at Rates up to Capacity

Cited by: 81
Authors
Joseph, Antony [1 ]
Barron, Andrew R. [1 ]
Affiliations
[1] Yale Univ, Dept Stat, New Haven, CT 06520 USA
Keywords
Achieving capacity; compressed sensing; exponential error bounds; Gaussian channel; maximum likelihood estimation; subset selection; INFORMATION-THEORETIC LIMITS; PARITY-CHECK CODES; SIGNAL RECOVERY; SPARSITY RECOVERY; MODEL SELECTION; REPRESENTATIONS; APPROXIMATION; CONSISTENCY; REGRESSION; ALGORITHM
DOI
10.1109/TIT.2012.2184847
CLC number
TP (Automation technology; computer technology)
Discipline code
0812
Abstract
For the additive white Gaussian noise channel with average codeword power constraint, coding methods are analyzed in which the codewords are sparse superpositions, that is, linear combinations of subsets of vectors from a given design, with the possible messages indexed by the choice of subset. Decoding is by least squares (maximum likelihood), tailored to the assumed form of codewords being linear combinations of elements of the design. Communication is shown to be reliable with error probability exponentially small for all rates up to the Shannon capacity.
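The abstract describes the scheme at a high level: a codeword is the sum of a subset of columns of a design (dictionary) matrix, and the least-squares decoder returns the subset whose column sum best fits the received vector. The Python sketch below is only a toy illustration of that idea; the dictionary shape, subset size, power scaling, and the brute-force subset search are assumptions made for illustration, not the paper's construction or analysis.

```python
import itertools

import numpy as np

# Toy sketch of sparse superposition coding over an AWGN channel
# (illustrative parameters; not the construction analyzed in the paper).
rng = np.random.default_rng(0)

n, N, k = 32, 16, 3   # block length, dictionary size, subset size (assumed)
snr = 7.0             # average codeword power / noise variance (assumed)

# Design matrix: i.i.d. Gaussian columns, scaled so the sum of k columns
# has average power snr per coordinate (the power constraint).
X = rng.normal(0.0, np.sqrt(snr / k), size=(n, N))

# Each message is identified with a k-subset of dictionary columns.
subsets = list(itertools.combinations(range(N), k))

def encode(msg: int) -> np.ndarray:
    """Codeword = sum of the dictionary columns selected by the message."""
    return X[:, list(subsets[msg])].sum(axis=1)

def decode(y: np.ndarray) -> int:
    """Least-squares (maximum-likelihood) decoding by exhaustive subset search."""
    errors = [np.sum((y - X[:, list(s)].sum(axis=1)) ** 2) for s in subsets]
    return int(np.argmin(errors))

msg = int(rng.integers(len(subsets)))
y = encode(msg) + rng.normal(size=n)   # unit-variance white Gaussian noise
print(decode(y) == msg)                # expected True at this rate and SNR
```

In this toy setting the rate, log2(C(16, 3))/32 bits per channel use, is far below the capacity (1/2) log2(1 + snr), so exact recovery is overwhelmingly likely; the paper's contribution is showing that the error probability stays exponentially small for all rates up to capacity.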
Pages: 2541-2557
Number of pages: 17