Rate-Distortion Tradeoffs under Kernel-Based Distortion Measures

Cited by: 0
Authors
Watanabe, Kazuho [1]
Affiliations
[1] Toyohashi Univ Technol, Dept Comp Sci & Engn, Toyohashi, Aichi, Japan
Source
2017 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT) | 2017
Keywords
FEATURE SPACE;
DOI
None available
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Subject classification code
081202
Abstract
Kernel methods have been used to turn linear learning algorithms into nonlinear ones. These nonlinear algorithms measure distances between data points by the distance in the kernel-induced feature space. In lossy data compression, the optimal tradeoff between the number of quantized points and the incurred distortion is characterized by the rate-distortion function. However, the rate-distortion functions associated with distortion measures involving the kernel feature mapping have yet to be analyzed. We consider two reconstruction schemes, reconstruction in input space and reconstruction in feature space, and provide bounds on the rate-distortion functions for these schemes. Comparison of the derived bounds with the quantizer performance obtained by the kernel K-means method suggests that the rate-distortion bounds for input-space and feature-space reconstructions are informative at low and high distortion levels, respectively.
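The abstract's central object, distance measured in the kernel-induced feature space, can be computed without ever forming the feature map explicitly. A minimal Python sketch (not taken from the paper; the Gaussian kernel and the cluster-mean distortion used by standard kernel K-means are assumptions here):

```python
import math

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq)

def feature_space_sq_dist(x, y, kernel=rbf_kernel):
    """Squared distance in the kernel-induced feature space,
    ||phi(x) - phi(y)||^2 = k(x,x) - 2 k(x,y) + k(y,y),
    evaluated via the kernel alone (the "kernel trick")."""
    return kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)

def sq_dist_to_cluster_mean(x, cluster, kernel=rbf_kernel):
    """Squared feature-space distance from phi(x) to the mean of
    {phi(z) : z in cluster} -- the per-point distortion kernel K-means
    incurs when x is assigned to that cluster (a feature-space
    reconstruction)."""
    n = len(cluster)
    cross = sum(kernel(x, z) for z in cluster) / n
    within = sum(kernel(zi, zj) for zi in cluster for zj in cluster) / n ** 2
    return kernel(x, x) - 2.0 * cross + within
```

For a normalized kernel such as the RBF, k(x, x) = 1, so the feature-space squared distance is 2 - 2 k(x, y) and is bounded by 2; this boundedness of the feature-space distortion is one reason the feature-space and input-space bounds behave differently across distortion levels.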
Pages: 1928-1932
Page count: 5