Tighter αBB relaxations through a refinement scheme for the scaled Gerschgorin theorem

Cited: 0
Authors
Nerantzis, Dimitrios [1 ]
Adjiman, Claire S. [1 ]
Institutions
[1] Imperial Coll London, Dept Chem Engn, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Global optimization; Branch-and-bound; Convex underestimators; Interval matrix; GLOBAL OPTIMIZATION METHOD; DIFFERENTIABLE CONSTRAINED NLPS; CONVEX UNDERESTIMATORS; CLUSTER PROBLEM; EIGENVALUES;
DOI
10.1007/s10898-018-0718-y
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Discipline classification codes
070105; 12; 1201; 1202; 120202;
Abstract
Of central importance to the αBB algorithm is the calculation of the α values that guarantee the convexity of the underestimator. Reducing these values can yield tighter underestimators and thus improve the performance of the algorithm. For instance, Wechsung et al. (J Glob Optim 58(3):429-438, 2014) showed that the emergence of the cluster effect can depend on the magnitude of the α values. Motivated by this, we present a refinement method that can reduce the magnitude of the α values given by the scaled Gerschgorin method and thus produce tighter convex underestimators for the αBB algorithm. We apply the new method and compare it with the scaled Gerschgorin method on randomly generated symmetric interval matrices as well as on interval Hessians taken from test functions. As a measure of comparison, we use the maximal separation distance between the original function and the underestimator. Based on the results obtained, we conclude that the proposed refinement method can significantly reduce the maximal separation distance compared to the scaled Gerschgorin method. This approach therefore has the potential to improve the performance of the αBB algorithm.
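As background to the abstract, the classical scaled Gerschgorin bound that the paper refines computes, from elementwise lower/upper bounds on an interval Hessian, nonnegative α values guaranteeing convexity of the αBB underestimator; the maximal separation distance used as the comparison measure is then (1/4)·Σᵢ αᵢ(xᵢᵁ − xᵢᴸ)². The sketch below shows the standard formula only (not the paper's refinement scheme); function names and the default scaling vector `d` are illustrative assumptions.

```python
import numpy as np

def scaled_gerschgorin_alphas(H_lo, H_hi, d=None):
    """Classical scaled Gerschgorin alpha values for a symmetric
    interval matrix with elementwise bounds H_lo <= H <= H_hi.
    alpha_i = max(0, -0.5*(H_lo[i,i] - sum_{j!=i} max(|H_lo[i,j]|,
    |H_hi[i,j]|) * d_j/d_i)). Names here are illustrative."""
    n = H_lo.shape[0]
    if d is None:
        # A common choice in practice is d_i = x_i^U - x_i^L;
        # uniform scaling is used here as a simple default.
        d = np.ones(n)
    alphas = np.zeros(n)
    for i in range(n):
        off_diag = sum(max(abs(H_lo[i, j]), abs(H_hi[i, j])) * d[j] / d[i]
                       for j in range(n) if j != i)
        alphas[i] = max(0.0, -0.5 * (H_lo[i, i] - off_diag))
    return alphas

def max_separation(alphas, x_lo, x_hi):
    """Maximal separation distance between f and its alphaBB
    underestimator over the box: 0.25 * sum_i alpha_i*(x_i^U-x_i^L)^2."""
    return 0.25 * np.sum(alphas * (x_hi - x_lo) ** 2)
```

For a degenerate (point) interval matrix H = [[-2, 1], [1, -2]], each αᵢ = max(0, −0.5·(−2 − 1)) = 1.5, and over the unit box the maximal separation is 0.25·(1.5 + 1.5) = 0.75; smaller α values, such as those produced by the paper's refinement, shrink this gap directly.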
Pages: 467-483
Page count: 17