A FULLY BAYESIAN GRADIENT-FREE SUPERVISED DIMENSION REDUCTION METHOD USING GAUSSIAN PROCESSES

Cited by: 0
Authors
Gautier, Raphael [1 ]
Pandita, Piyush [2 ]
Ghosh, Sayan [2 ]
Mavris, Dimitri [1 ]
Affiliations
[1] Georgia Inst Technol, Aerosp Syst Design Lab, Atlanta, GA 30308 USA
[2] GE Res, Probabilist Design, Niskayuna, NY 12309 USA
Keywords
surrogate modeling; high-dimensional input space; dimensionality reduction; uncertainty quantification; active subspace; Bayesian inference; Gaussian process regression; RIDGE FUNCTIONS; REGRESSION; SUBSPACE; DESIGN; MODELS;
DOI
Not available
Chinese Library Classification
T [Industrial Technology];
Discipline Classification Code
08;
Abstract
Modern-day engineering problems are ubiquitously characterized by sophisticated computer codes that map parameters or inputs to an underlying physical process. In other situations, experimental setups are used to model the physical process in a laboratory, ensuring high precision while being costly in materials and logistics. In both scenarios, only a limited amount of data can be generated by querying the expensive information source at a finite number of inputs or designs. This problem is compounded further in the presence of a high-dimensional input space. State-of-the-art parameter space dimension reduction methods, such as active subspace, aim to identify a subspace of the original input space that is sufficient to explain the output response. These methods are restricted by their reliance on gradient evaluations or copious data, making them inadequate for expensive problems without direct access to gradients. The proposed methodology is gradient-free and fully Bayesian, as it quantifies uncertainty in both the low-dimensional subspace and the surrogate model parameters. This enables a full quantification of epistemic uncertainty and robustness to limited data availability. It is validated on multiple datasets from engineering and science and compared to two other state-of-the-art methods based on four aspects: (a) recovery of the active subspace, (b) deterministic prediction accuracy, (c) probabilistic prediction accuracy, and (d) training time. The comparison shows that the proposed method improves the active subspace recovery and predictive accuracy, in both the deterministic and probabilistic sense, when only a few model observations are available for training, at the cost of increased training time.
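For context, the classical *gradient-based* active subspace method that the abstract contrasts with can be sketched as follows. This is an illustrative sketch only, not the paper's gradient-free Bayesian approach: the dominant eigenvectors of the expected outer product of gradients, C = E[∇f ∇fᵀ], span the active subspace. The toy ridge function, sample sizes, and variable names below are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 1000, 10                        # gradient samples, input dimension

# Toy ridge function f(x) = (w^T x)^2: its active subspace is span{w}.
w = np.zeros(n)
w[0], w[1] = 3.0, 4.0
w /= np.linalg.norm(w)
grad_f = lambda x: 2.0 * (x @ w) * w   # analytic gradient of f

X = rng.uniform(-1.0, 1.0, size=(m, n))
G = np.array([grad_f(x) for x in X])   # stack of m gradient samples

# Monte Carlo estimate of C = E[grad f grad f^T]; its leading
# eigenvectors define the active subspace.
C = G.T @ G / m
eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
W1 = eigvecs[:, -1:]                   # leading eigenvector = subspace basis

# Recovery check: |cosine| between estimated and true direction is ~1.
print(abs(float(W1[:, 0] @ w)))
```

This procedure requires access to ∇f at each sample, which is exactly the limitation the paper's gradient-free formulation removes by inferring the subspace and Gaussian process surrogate jointly from function values alone.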
Pages: 19-51 (33 pages)
Related Papers
50 records
  • [21] Efficient Gradient-Free Variational Inference using Policy Search
    Arenz, Oleg
    Zhong, Mingjun
    Neumann, Gerhard
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [22] A conjecture on global optimization using gradient-free stochastic approximation
    Maryak, JL
    Chin, DC
    JOINT CONFERENCE ON THE SCIENCE AND TECHNOLOGY OF INTELLIGENT SYSTEMS, 1998, : 441 - 445
  • [23] Multi-parameter gradient-free automatic history matching method
    Zhang, Kai
    University of Petroleum, China, 1600, (38)
  • [24] Accelerating Groundwater Data Assimilation With a Gradient-Free Active Subspace Method
    Yan, Hengnian
    Hao, Chenyu
    Zhang, Jiangjiang
    Illman, Walter A.
    Lin, Guang
    Zeng, Lingzao
    WATER RESOURCES RESEARCH, 2021, 57 (12)
  • [25] Asynchronous Gossip-Based Gradient-Free Method for Multiagent Optimization
    Yuan, Deming
    ABSTRACT AND APPLIED ANALYSIS, 2014,
  • [26] A novel model-based hearing compensation design using a gradient-free optimization method
    Chen, Z
    Becker, S
    Bondy, J
    Bruce, IC
    Haykin, S
    NEURAL COMPUTATION, 2005, 17 (12) : 2648 - 2671
  • [27] Fully Bayesian differential Gaussian processes through stochastic differential equations
    Xu, Jian
    Lin, Zhiqi
    Chen, Min
    Yang, Junmei
    Zeng, Delu
    Paisley, John
    KNOWLEDGE-BASED SYSTEMS, 2025, 314
  • [28] Gradient-free aerodynamic shape optimization using Large Eddy Simulation
    Karbasian, Hamid R.
    Vermeire, Brian C.
    COMPUTERS & FLUIDS, 2022, 232
  • [29] Gradient-Free Aeroacoustic Shape Optimization Using Large Eddy Simulation
    Hamedi, Mohsen
    Vermeire, Brian
    AIAA JOURNAL, 2025,
  • [30] Gradient-free training of recurrent neural networks using random perturbations
    Fernandez, Jesus Garcia
    Keemink, Sander
    van Gerven, Marcel
    FRONTIERS IN NEUROSCIENCE, 2024, 18