Information-theoretic Limits on the Classification of Gaussian Mixtures: Classification on the Grassmann Manifold

Cited by: 0
Authors
Nokleby, Matthew [1 ]
Calderbank, Robert [1 ]
Rodrigues, Miguel R. D. [2 ]
Affiliations
[1] Duke Univ, Durham, NC 27708 USA
[2] UCL, London WC1E 6BT, England
Source
2013 IEEE INFORMATION THEORY WORKSHOP (ITW) | 2013
Keywords
COMMUNICATION; MODELS;
DOI
Not available
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline code
081202;
Abstract
Motivated by applications in high-dimensional signal processing, we derive fundamental limits on the performance of compressive linear classifiers. By analogy with Shannon theory, we define the classification capacity, which quantifies the maximum number of classes that can be discriminated with low probability of error, and the diversity-discrimination tradeoff, which quantifies the tradeoff between the number of classes and the probability of classification error. For classification of Gaussian mixture models, we identify a duality between classification and communications over non-coherent multiple-antenna channels. This duality allows us to characterize the classification capacity and diversity-discrimination tradeoff using existing results from multiple-antenna communication. We also identify the easiest possible classification problems, which correspond to low-dimensional subspaces drawn from an appropriate Grassmann manifold.
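The setting described in the abstract can be illustrated numerically. The sketch below (a hypothetical toy example, not the authors' construction) builds a Gaussian mixture in which each class is concentrated on a low-dimensional subspace, i.e. a point on a Grassmann manifold, compresses the signals with a random linear map, and classifies them with a maximum-likelihood rule on the compressed measurements. All dimensions, variable names, and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the paper):
# ambient dim n, compressed dim m, subspace dim k, number of classes L.
n, m, k, L = 20, 8, 2, 4
noise_var = 1e-3

# Each class: zero-mean Gaussian supported near a random k-dimensional
# subspace of R^n (a point on the Grassmann manifold G(n, k)).
subspaces = [np.linalg.qr(rng.standard_normal((n, k)))[0] for _ in range(L)]

# Random compressive linear measurement matrix A with m < n.
A = rng.standard_normal((m, n)) / np.sqrt(m)

def sample(c, num):
    """Draw `num` signals from class c and compress them with A."""
    x = subspaces[c] @ rng.standard_normal((k, num))
    x += np.sqrt(noise_var) * rng.standard_normal((n, num))
    return A @ x  # shape (m, num)

# ML classifier: the compressed class-c distribution is Gaussian with
# covariance A (U_c U_c^T + noise_var I) A^T.
covs = [A @ (U @ U.T + noise_var * np.eye(n)) @ A.T for U in subspaces]
inv_covs = [np.linalg.inv(S) for S in covs]
logdets = [np.linalg.slogdet(S)[1] for S in covs]

def classify(y):
    """Pick the class maximizing the Gaussian log-likelihood of y."""
    scores = [-(logdets[c] + y @ inv_covs[c] @ y) for c in range(L)]
    return int(np.argmax(scores))

trials = 200
errors = sum(classify(sample(c, 1)[:, 0]) != c
             for _ in range(trials // L) for c in range(L))
error_rate = errors / trials
```

With well-separated random subspaces and small off-subspace noise, the empirical error rate is low even though the classifier only sees m < n linear measurements, which is the regime the paper's classification-capacity analysis addresses.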
Pages: 5
References: 16 in total