Generalized Gaussian Model for Learned Image Compression

Cited: 0
Authors
Zhang, Haotian [1 ]
Li, Li [1 ]
Liu, Dong [1 ]
Affiliations
[1] University of Science and Technology of China, MOE Key Laboratory of Brain-Inspired Intelligent Perception and Cognition, Hefei, 230093, China
Funding
National Natural Science Foundation of China
DOI
Not available
Abstract
In learned image compression, probabilistic models play an essential role in characterizing the distribution of latent variables. The Gaussian model with mean and scale parameters has been widely used for its simplicity and effectiveness. Probabilistic models with more parameters, such as Gaussian mixture models, can fit the distribution of latent variables more precisely, but at higher complexity. To balance compression performance and complexity, we extend the Gaussian model to the generalized Gaussian family for more flexible latent distribution modeling, introducing only one additional shape parameter, beta, compared with the Gaussian model. To enhance the performance of the generalized Gaussian model by alleviating the train-test mismatch, we propose improved training methods, including beta-dependent lower bounds for scale parameters and gradient rectification. Our proposed generalized Gaussian model, coupled with the improved training methods, is demonstrated to outperform the Gaussian and Gaussian mixture models on a variety of learned image compression networks. © 1992-2012 IEEE.
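As background for the abstract above: the generalized Gaussian family adds a shape parameter beta to the mean/scale parameterization, and recovers the ordinary Gaussian as a special case (beta = 2). A minimal sketch of its density, using the standard textbook form (not code from the paper; the function name and parameter names here are illustrative):

```python
import math

def gen_gaussian_pdf(x, mu=0.0, alpha=1.0, beta=2.0):
    """Generalized Gaussian density (standard textbook form):
    f(x) = beta / (2 * alpha * Gamma(1/beta)) * exp(-(|x - mu| / alpha) ** beta)
    beta controls tail shape: beta=1 gives a Laplacian, beta=2 a Gaussian.
    """
    coeff = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return coeff * math.exp(-((abs(x - mu) / alpha) ** beta))

# Sanity check: beta = 2 with scale alpha matches a Gaussian
# with standard deviation sigma = alpha / sqrt(2).
sigma = 1.0 / math.sqrt(2.0)
gauss_at_zero = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
print(abs(gen_gaussian_pdf(0.0, beta=2.0) - gauss_at_zero) < 1e-12)
```

In an entropy model, the probability mass of a quantized latent would then be obtained by integrating this density over each quantization bin; the paper's contribution is in how beta is used and trained, which this sketch does not cover.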
Pages: 1950-1965