Neural Distributed Compressor Discovers Binning

Cited by: 0
Authors
Ozyilkan, Ezgi [1 ]
Ballé, Johannes [2]
Erkip, Elza [1 ]
Affiliations
[1] NYU, Dept Elect & Comp Engn, Brooklyn, NY 11201 USA
[2] Google Res, New York, NY 10011 USA
Source
IEEE JOURNAL ON SELECTED AREAS IN INFORMATION THEORY | 2024, Vol. 5
Keywords
Distributed source coding; binning; Wyner-Ziv coding; learning; lossy compression; neural networks; rate-distortion theory; MULTILAYER FEEDFORWARD NETWORKS; SCALAR QUANTIZATION; SIDE INFORMATION; BINARY SOURCES; QUANTIZERS; ALGORITHM; CAPACITY; CODES
DOI
10.1109/JSAIT.2024.3393429
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
We consider lossy compression of an information source when the decoder has lossless access to a correlated one. This setup, also known as the Wyner-Ziv problem, is a special case of distributed source coding. To this day, practical approaches for the Wyner-Ziv problem have neither been fully developed nor heavily investigated. We propose a data-driven method based on machine learning that leverages the universal function approximation capability of artificial neural networks. We find that our neural network-based compression scheme, based on variational vector quantization, recovers some principles of the optimum theoretical solution of the Wyner-Ziv setup, such as binning in the source space as well as optimal combination of the quantization index and side information, for exemplary sources. These behaviors emerge although no structure exploiting knowledge of the source distributions was imposed. Binning is a widely used tool in information theoretic proofs and methods, and to our knowledge, this is the first time it has been explicitly observed to emerge from data-driven learning.
Pages: 246-260
Number of pages: 15
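
For context, the "optimum theoretical solution of the Wyner-Ziv setup" mentioned in the abstract is the Wyner-Ziv rate-distortion function. With source X, side information Y available only at the decoder, an auxiliary random variable U satisfying the Markov chain U - X - Y, and a decoder reconstruction map g, it reads

\[
R_{\mathrm{WZ}}(D) \;=\; \min_{\substack{p(u \mid x),\; g \\ \mathbb{E}\left[ d\left( X,\, g(U,Y) \right) \right] \,\le\, D}} \bigl( I(X;U) - I(U;Y) \bigr).
\]

Learned compressors of the kind described in the abstract are typically trained end-to-end on a rate-distortion Lagrangian; the expression below is a generic, illustrative sketch, and the paper's exact variational objective may differ:

\[
\mathcal{L} \;=\; \mathbb{E}\bigl[ d\bigl( X, \hat{X} \bigr) \bigr] \;+\; \lambda\, R, \qquad \hat{X} = g(U, Y),
\]

where U is the quantization index produced by the encoder from X alone, R is its estimated rate, and \lambda trades off rate against distortion.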