AeroINR: Meta-learning for Efficient Generation of Aerodynamic Geometries

Cited by: 0
Authors
Bamford, Tom [1 ]
Toal, David [1 ]
Keane, Andy [1 ]
Affiliations
[1] Univ Southampton, Southampton SO16 7QF, Hants, England
Source
MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: APPLIED DATA SCIENCE TRACK, PT IX, ECML PKDD 2024 | 2024, Vol. 14949
Keywords
AI-Aided Design; Variational Auto-Encoders (VAE); Hypernetworks; Implicit Neural Representations (INR); Meta-Learning; DESIGN;
DOI
10.1007/978-3-031-70378-2_28
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Effective optimisation of aerodynamic shapes requires high-quality parameterisation of candidate geometries. In recent years, the growing availability and applicability of data - driven by advances in computational power, GPUs, cloud storage and AI - has motivated data-driven approaches to the parameterisation problem, particularly those that can process the image-based data produced by scanned design parts. In this paper, a novel approach to aerodynamic shape parameterisation is proposed that leverages meta-learning within a generative deep learning framework. The proposed solution, AeroINR, learns continuous neural representations as surrogates of the discrete field data used for shape representation in image-based applications. This transforms the learning problem from predicting grid-based field values directly to modelling the distribution of surrogate-model weights across candidate geometries, which can reduce the number of variables describing each geometry by an order of magnitude or more. Benchmarking is carried out against three state-of-the-art deep-learning-based aerofoil parameterisations, with AeroINR outperforming these models on two of the three metrics considered. Ablation studies show that the approach is robust to the choice of generative framework and of discrete field representation.
Pages: 452-467
Page count: 16
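
The abstract describes learning the distribution of surrogate-model (INR) weights rather than grid-based field values, with hypernetworks and VAEs listed among the keywords. The following minimal PyTorch sketch illustrates that general idea only; the architecture, layer sizes and names (HyperINR, latent_dim, the SIREN-style sinusoidal activations, the signed-distance-style output field) are illustrative assumptions and are not taken from the AeroINR paper.

# Illustrative sketch only: a hypernetwork-conditioned implicit neural
# representation (INR). A latent code (e.g. a VAE sample in a generative
# framework) is mapped to the weights of a small coordinate MLP that
# represents one geometry as a continuous field f(x, y). All names and
# sizes here are assumptions, not the paper's architecture.
import torch
import torch.nn as nn


class HyperINR(nn.Module):
    """Maps a latent code z to the weights of a coordinate MLP."""

    def __init__(self, latent_dim=16, hidden=32):
        super().__init__()
        # Target INR: 2 -> hidden -> hidden -> 1 (coordinates to field value).
        self.shapes = [(hidden, 2), (hidden,),       # layer 1 W, b
                       (hidden, hidden), (hidden,),  # layer 2 W, b
                       (1, hidden), (1,)]            # output W, b
        n_params = sum(int(torch.tensor(s).prod()) for s in self.shapes)
        # Hypernetwork: latent code -> flattened INR weight vector.
        self.hyper = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, n_params),
        )

    def forward(self, z, coords):
        # Split the predicted flat weight vector back into per-layer tensors.
        flat = self.hyper(z)
        params, i = [], 0
        for s in self.shapes:
            n = int(torch.tensor(s).prod())
            params.append(flat[i:i + n].view(*s))
            i += n
        w1, b1, w2, b2, w3, b3 = params
        # Evaluate the generated INR at the query coordinates.
        h = torch.sin(coords @ w1.T + b1)   # sinusoidal activations, commonly
        h = torch.sin(h @ w2.T + b2)        # used for INRs (SIREN-style); assumed here
        return h @ w3.T + b3                # one field value per coordinate


if __name__ == "__main__":
    model = HyperINR()
    z = torch.randn(16)                     # latent code, e.g. drawn from a VAE prior
    # Query the continuous representation on an arbitrary grid of (x, y) points.
    xy = torch.stack(torch.meshgrid(
        torch.linspace(-1, 1, 64), torch.linspace(-1, 1, 64),
        indexing="ij"), dim=-1).reshape(-1, 2)
    field = model(z, xy)                    # (4096, 1) field values
    print(field.shape)

In a setup of this kind, each geometry is described by its latent code (or the generated weight vector) rather than by thousands of grid samples, and the INR can be queried at any resolution, which is the kind of reduction in the number of variables per geometry that the abstract refers to.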