Towards Sharper Generalization Bounds for Structured Prediction
Cited by: 0
Authors:
Li, Shaojie [1,2]
Liu, Yong [1,2]
Affiliations:
[1] Renmin Univ China, Gaoling Sch Artificial Intelligence, Beijing, Peoples R China
[2] Beijing Key Lab Big Data Management & Anal Method, Beijing, Peoples R China
Source:
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021) | 2021
Funding:
National Natural Science Foundation of China;
Keywords:
COVERING NUMBER;
DOI:
Not available
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory];
Discipline codes:
081104; 0812; 0835; 1405;
Abstract:
In this paper, we investigate the generalization performance of structured prediction learning and obtain state-of-the-art generalization bounds. Our analysis is based on the factor graph decomposition of structured prediction algorithms, and we present novel margin guarantees from three different perspectives: Lipschitz continuity, smoothness, and a space capacity condition. In the Lipschitz continuity scenario, we improve the square-root dependency on the label set cardinality of existing bounds to a logarithmic dependency. In the smoothness scenario, we provide generalization bounds that have not only a logarithmic dependency on the label set cardinality but also a faster convergence rate of order O(1/n) in the sample size n. In the space capacity scenario, we obtain bounds that do not depend on the label set cardinality and converge faster than O(1/√n). In each scenario, applications are provided to suggest that these conditions are easy to satisfy.
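The three improvements claimed in the abstract can be summarized schematically. This is an illustrative comparison assembled from the abstract's own statements, not the paper's exact theorem forms; here c denotes the label set cardinality and n the sample size:

```latex
% Schematic progression of the bound's dominant term across the three scenarios:
%   existing bounds      -> Lipschitz continuity -> smoothness -> space capacity
\underbrace{O\!\left(\frac{\sqrt{c}}{\sqrt{n}}\right)}_{\text{existing}}
\;\longrightarrow\;
\underbrace{O\!\left(\frac{\log c}{\sqrt{n}}\right)}_{\text{Lipschitz}}
\;\longrightarrow\;
\underbrace{O\!\left(\frac{\log c}{n}\right)}_{\text{smoothness}}
\;\longrightarrow\;
\underbrace{o\!\left(\frac{1}{\sqrt{n}}\right)\ \text{with no dependence on } c}_{\text{space capacity}}
```

Each arrow tightens either the dependence on c (square-root to logarithmic to none) or the rate in n (from 1/√n toward 1/n), matching the per-scenario claims above.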
Pages: 14