A Joint Typicality Approach to Compute-Forward

Cited by: 15
Authors
Lim, Sung Hoon [1 ]
Feng, Chen [2 ]
Pastore, Adriano [3 ]
Nazer, Bobak [4 ]
Gastpar, Michael [5 ]
Affiliations
[1] Korea Inst Ocean Sci & Technol, Busan 49111, South Korea
[2] Univ British Columbia, Sch Engn, Kelowna, BC V1V 1V7, Canada
[3] CTTC CERCA, Ctr Tecnol Telecomunicac Catalunya, Castelldefels 08860, Spain
[4] Boston Univ, Dept Elect & Comp Engn, Boston, MA 02215 USA
[5] Ecole Polytech Fed, Sch Comp & Commun Sci, CH-1015 Lausanne, Switzerland
Funding
National Research Foundation of Singapore; Natural Sciences and Engineering Research Council of Canada;
Keywords
Linear codes; joint decoding; compute-forward; multiple-access channel; relay networks; ACHIEVABLE RATE REGION; MULTIPLE-ACCESS CHANNEL; DISTRIBUTED COMPRESSION; APPROXIMATE CAPACITY; GROUP CODES; INTERFERENCE; FREEDOM; SUM; INFORMATION; STRATEGIES;
DOI
10.1109/TIT.2018.2872053
CLC number
TP [automation technology, computer technology];
Discipline classification code
0812;
Abstract
This paper presents a joint typicality framework for encoding and decoding nested linear codes in multi-user networks. This framework provides a new perspective on compute-forward within the context of discrete memoryless networks. In particular, it establishes an achievable rate region for computing a linear combination over a discrete memoryless multiple-access channel (MAC). When specialized to the Gaussian MAC, this rate region recovers and improves upon the lattice-based compute-forward rate region of Nazer and Gastpar, thus providing a unified approach for discrete memoryless and Gaussian networks. Furthermore, the framework offers valuable insight into establishing the optimal decoding rate region for compute-forward by considering joint decoders, moving beyond most previous works, which consider successive cancellation decoding. Specifically, the paper establishes an achievable rate region for simultaneously decoding two linear combinations of nested linear codewords from K senders.
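The abstract's central object, computing a linear combination of codewords at a receiver, rests on a basic algebraic fact: when all senders use the same linear code over a finite field, the (symbol-wise) sum of codewords is itself the codeword of the summed messages, so a relay can decode the combination directly instead of decoding each message. The sketch below is an illustration of that linearity property only, not of the paper's joint typicality scheme; the field size, code dimensions, generator matrix, and messages are all arbitrary choices for demonstration.

```python
# Hedged illustration of the linearity behind compute-forward:
# with a common linear code over GF(p), the symbol-wise sum of two
# codewords equals the codeword of the summed messages.

p = 5          # prime field size (illustrative choice)
k, n = 2, 4    # message length and codeword length (illustrative)

# An example full-rank generator matrix over GF(p); any such matrix works.
G = [[1, 0, 2, 3],
     [0, 1, 4, 1]]

def encode(m):
    """Encode a length-k message m to the codeword x = m G (mod p)."""
    return [sum(m[i] * G[i][j] for i in range(k)) % p for j in range(n)]

m1, m2 = [1, 3], [4, 1]          # two senders' messages (arbitrary)
x1, x2 = encode(m1), encode(m2)  # both use the same linear code

# The relay observes the symbol-wise sum of codewords (mod p)...
y = [(a + b) % p for a, b in zip(x1, x2)]

# ...which is exactly the codeword of the linear combination m1 + m2 mod p,
# so decoding y against the single code recovers the combination directly.
m_sum = [(a + b) % p for a, b in zip(m1, m2)]
assert y == encode(m_sum)
print("decoded linear combination:", m_sum)
```

Over a noisy MAC the relay of course sees a corrupted version of the sum; the paper's contribution is the achievable rate region for recovering such combinations via joint typicality decoding, which this algebraic identity merely makes possible.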
Pages: 7657-7685
Page count: 29