A stochastic multiple gradient descent algorithm

Times cited: 36
Authors
Mercier, Quentin [1 ]
Poirion, Fabrice [1 ]
Desideri, Jean-Antoine [2 ]
Affiliations
[1] Univ Paris Saclay, ONERA DMAS, Onera French Aerosp Lab, 29 Ave Div Leclerc, F-92320 Chatillon, France
[2] INRIA, 2004 Route Lucioles, F-06902 Valbonne, France
Keywords
Multiple objective programming; Multiobjective stochastic optimization; Stochastic gradient algorithm; Multiple gradient descent algorithm; Common descent vector; MULTIOBJECTIVE OPTIMIZATION; ROBUST OPTIMIZATION; UNCERTAINTY;
DOI
10.1016/j.ejor.2018.05.064
Chinese Library Classification
C93 [Management Science];
Discipline classification codes
12; 1201; 1202; 120202;
Abstract
In this article, we propose a new method for multiobjective optimization problems in which the objective functions are expressed as expectations of random functions. The method extends the classical stochastic gradient algorithm using a deterministic multiobjective algorithm, the Multiple Gradient Descent Algorithm (MGDA). In MGDA, a descent direction common to all specified objective functions is identified through a result of convex geometry. Incorporating this common descent vector and the definition of Pareto stationarity into the stochastic gradient algorithm enables it to solve multiobjective problems. Mean-square and almost-sure convergence of the new algorithm are proven under the classical stochastic gradient algorithm hypotheses. The algorithm's efficiency is illustrated on a set of benchmarks of varying complexity and assessed against two classical algorithms (NSGA-II, DMS) coupled with a Monte Carlo expectation estimator. (C) 2018 Elsevier B.V. All rights reserved.
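The common-descent-vector construction the abstract describes can be sketched for two objectives, where the convex-geometry subproblem (minimum-norm point of the convex hull of the gradients) has a closed form. The following is an illustrative Python sketch under stated assumptions, not the authors' implementation: the quadratic test problem, the shared noise model, and the step-size schedule are all assumptions made for the example.

```python
import numpy as np

def common_descent_vector(g1, g2):
    """Minimum-norm point of the convex hull of {g1, g2} (MGDA step, two objectives).

    Solves min_{t in [0,1]} ||t*g1 + (1-t)*g2||^2 in closed form. The result
    is a descent direction for both objectives and vanishes exactly at
    Pareto-stationary points (where 0 lies in the convex hull of the gradients).
    """
    diff = g1 - g2
    denom = diff @ diff
    if denom < 1e-12:                      # gradients (almost) coincide
        return g1
    t = np.clip((g2 - g1) @ g2 / denom, 0.0, 1.0)
    return t * g1 + (1.0 - t) * g2

def smgda(grad_samples, x0, n_iter=2000, c=0.5):
    """Stochastic MGDA sketch: draw one noisy gradient per objective per step,
    combine them into a common descent vector, and take a Robbins-Monro step
    (step sizes s_k = c/k^0.7: sum s_k diverges, sum s_k^2 converges)."""
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        g1, g2 = grad_samples(x)
        x = x - (c / k ** 0.7) * common_descent_vector(g1, g2)
    return x

# Hypothetical test problem (not from the paper): minimize simultaneously
#   f1(x) = E||x - a + xi||^2  and  f2(x) = E||x - b + xi||^2,  xi ~ N(0, 0.1^2 I).
# The Pareto set is the segment between a and b.
rng = np.random.default_rng(0)
a, b = np.array([0.0, 0.0]), np.array([1.0, 1.0])

def grad_samples(x):
    xi = rng.normal(scale=0.1, size=2)     # one shared noise sample per iteration
    return 2.0 * (x - a + xi), 2.0 * (x - b + xi)

x_star = smgda(grad_samples, x0=[3.0, -2.0])
# x_star should lie close to the Pareto segment {(s, s) : 0 <= s <= 1}
```

On the Pareto segment the two sampled gradients point in opposite directions, the closed-form weight settles near t = 1/2, and the common descent vector shrinks toward zero, so the decreasing step sizes keep the iterate near the Pareto set despite the noise.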
Pages: 808-817
Number of pages: 10
Related papers
50 records in total
  • [1] Optimal stochastic gradient descent algorithm for filtering
    Turali, M. Yigit
    Koc, Ali T.
    Kozat, Suleyman S.
    DIGITAL SIGNAL PROCESSING, 2024, 155
  • [2] Stochastic Multiple Target Sampling Gradient Descent
    Phan, Hoang
    Tran, Ngoc N.
    Le, Trung
    Tran, Toan
    Ho, Nhat
    Phung, Dinh
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [3] A new stochastic gradient descent possibilistic clustering algorithm
    Koutsimpela, Angeliki
    Koutroumbas, Konstantinos D.
    AI COMMUNICATIONS, 2022, 35 (02) : 47 - 64
  • [4] Fast Convergence Stochastic Parallel Gradient Descent Algorithm
    Hu Dongting
    Shen Wen
    Ma Wenchao
    Liu Xinyu
    Su Zhouping
    Zhu Huaxin
    Zhang Xiumei
    Que Lizhi
    Zhu Zhuowei
    Zhang Yixin
    Chen Guoqing
    Hu Lifa
    LASER & OPTOELECTRONICS PROGRESS, 2019, 56 (12)
  • [5] Guided Stochastic Gradient Descent Algorithm for inconsistent datasets
    Sharma, Anuraganand
    APPLIED SOFT COMPUTING, 2018, 73 : 1068 - 1080
  • [6] Stochastic Approximate Gradient Descent via the Langevin Algorithm
    Qiu, Yixuan
    Wang, Xiao
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 5428 - 5435
  • [7] A stochastic gradient descent algorithm for structural risk minimisation
    Ratsaby, J
    ALGORITHMIC LEARNING THEORY, PROCEEDINGS, 2003, 2842 : 205 - 220
  • [8] The Improved Stochastic Fractional Order Gradient Descent Algorithm
    Yang, Yang
    Mo, Lipo
    Hu, Yusen
    Long, Fei
    FRACTAL AND FRACTIONAL, 2023, 7 (08)
  • [9] Convergence behavior of diffusion stochastic gradient descent algorithm
    Barani, Fatemeh
    Savadi, Abdorreza
    Yazdi, Hadi Sadoghi
    SIGNAL PROCESSING, 2021, 183
  • [10] STOCHASTIC GRADIENT DESCENT ALGORITHM FOR STOCHASTIC OPTIMIZATION IN SOLVING ANALYTIC CONTINUATION PROBLEMS
    Bao, Feng
    Maier, Thomas
    FOUNDATIONS OF DATA SCIENCE, 2020, 2 (01): : 1 - 17