On lower complexity bounds for large-scale smooth convex optimization

Cited by: 41
Authors
Guzman, Cristobal [1 ]
Nemirovski, Arkadi [1 ]
Affiliation
[1] Georgia Inst Technol, H Milton Stewart Sch Ind & Syst Engn, Atlanta, GA 30332 USA
Funding
U.S. National Science Foundation;
Keywords
Smooth convex optimization; Lower complexity bounds; Optimal algorithms;
DOI
10.1016/j.jco.2014.08.003
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline classification code
081202;
Abstract
We derive lower bounds on the black-box oracle complexity of large-scale smooth convex minimization problems, with emphasis on minimizing smooth convex functions (those with Hölder continuous gradient, with given exponent and constant) over high-dimensional ‖·‖_p-balls, 1 ≤ p ≤ ∞. Our bounds turn out to be tight (up to factors logarithmic in the design dimension) and can be viewed as a substantial extension of the existing lower complexity bounds for large-scale convex minimization, which cover the nonsmooth case and the "Euclidean" smooth case (minimization of convex functions with Lipschitz continuous gradients over Euclidean balls). As a byproduct of our results, we demonstrate that the classical Conditional Gradient algorithm is near-optimal, in the sense of Information-Based Complexity Theory, when minimizing smooth convex functions over high-dimensional ‖·‖_∞-balls and their matrix analogues, i.e., spectral norm balls in the spaces of square matrices. (C) 2014 Elsevier Inc. All rights reserved.
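For readers unfamiliar with the Conditional Gradient (Frank-Wolfe) method mentioned in the abstract, the following is a minimal sketch of the method over an ‖·‖_∞-ball, the setting in which the paper shows it is near-optimal. The quadratic objective, radius, and step-size rule are illustrative assumptions for this sketch, not details taken from the paper.

```python
import numpy as np

def conditional_gradient(grad, x0, radius, iters=200):
    """Minimize a smooth convex function over {x : ||x||_inf <= radius}."""
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        # Linear minimization oracle for the l_inf ball: a vertex of the box.
        s = -radius * np.sign(g)
        gamma = 2.0 / (k + 2.0)  # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

if __name__ == "__main__":
    # Illustrative smooth objective: f(x) = 0.5 * ||A x - b||^2 (an assumption).
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((50, 200)), rng.standard_normal(50)
    grad = lambda x: A.T @ (A @ x - b)
    x_hat = conditional_gradient(grad, np.zeros(200), radius=1.0)
    print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2)
```

Each iteration calls only a gradient oracle and a linear minimization oracle over the feasible set, which for the box reduces to a sign computation; this cheap per-iteration cost is what makes the method attractive in the high-dimensional regime the paper studies.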
Pages: 1-14
Page count: 14