The problem of finding the best rank-one approximation to higher-order tensors has extensive applications in engineering and statistics. It is well known that this problem is equivalent to a homogeneous polynomial optimization problem. In this paper, we study theoretical results and numerical methods for this problem, focusing on the case of fourth-order symmetric tensors. First, we reformulate the polynomial optimization problem as a matrix program and show the equivalence between the two problems. Then, we prove that there is no duality gap between the reformulation and its Lagrangian dual problem. To deal with the problem, we propose two relaxed models. The first is a convex quadratic matrix optimization problem regularized by the nuclear norm; the second is a quadratic matrix program regularized by the truncated nuclear norm, which is a difference-of-convex (D.C.) function and therefore nonconvex. To overcome the difficulty of solving this nonconvex problem, we approximate the nonconvex penalty by a convex term. We solve both relaxed models with the proximal augmented Lagrangian method. To obtain a global solution, we propose an alternating least eigenvalue method to be applied after solving the relaxed models and prove its convergence.
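As background for the equivalence mentioned above, a standard formulation is sketched below; the notation $\mathcal{A}x^4$ for the associated homogeneous polynomial is assumed here for illustration and is not taken verbatim from the paper. For a symmetric tensor $\mathcal{A} = (a_{ijkl}) \in \mathbb{R}^{n \times n \times n \times n}$,

% Best rank-one approximation of a fourth-order symmetric tensor
% and its equivalent homogeneous polynomial optimization problem.
\[
\min_{\lambda \in \mathbb{R},\ \|x\|_2 = 1} \left\| \mathcal{A} - \lambda\, x \otimes x \otimes x \otimes x \right\|_F^2
\quad \Longleftrightarrow \quad
\max_{\|x\|_2 = 1} \left| \mathcal{A} x^4 \right|,
\qquad
\mathcal{A} x^4 := \sum_{i,j,k,l=1}^{n} a_{ijkl}\, x_i x_j x_k x_l .
\]

Under this formulation, a maximizer $x^\ast$ of the right-hand problem yields the best rank-one approximation $\lambda^\ast\, x^\ast \otimes x^\ast \otimes x^\ast \otimes x^\ast$ with $\lambda^\ast = \mathcal{A}(x^\ast)^4$.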
Authors and affiliations:
Wang, Xuezhong: Hexi Univ, Sch Math & Stat, Zhangye 734000, Peoples R China; Fudan Univ, Sch Math Sci, Shanghai 200433, Peoples R China
Che, Maolin: Southwest Univ Finance & Econ, Sch Econ Math, Chengdu 611130, Sichuan, Peoples R China
Wei, Yimin: Fudan Univ, Sch Math Sci, Shanghai 200433, Peoples R China; Fudan Univ, Shanghai Key Lab Contemporary Appl Math, Shanghai 200433, Peoples R China