SAG-FlowNet: self-attention generative network for airfoil flow field prediction

Cited by: 4
|
Authors
Wang, Xiao [1 ]
Jiang, Yi [2 ]
Li, Guanxiong [1 ]
Zhang, Laiping [3 ]
Deng, Xiaogang [1 ,4 ]
Affiliations
[1] Sichuan Univ, Sch Comp Sci, Chengdu 610065, Peoples R China
[2] Acad Mil Sci, Inst Syst Engn, Beijing 100082, Peoples R China
[3] Natl Innovat Inst Def Technol, Unmanned Syst Res Ctr, Beijing 100071, Peoples R China
[4] Acad Mil Sci, Beijing 100190, Peoples R China
Keywords
Flow field; Generative networks; Self-attention; Prediction; SIMULATION;
DOI
10.1007/s00500-023-09602-x
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Flow field prediction is essential for airfoil design, but obtaining the flow fields around an airfoil is time-consuming. Convolutional neural networks (CNNs) have been applied to flow field prediction in recent years. However, CNN-based methods rely heavily on convolutional kernels that process information within local neighborhoods, making it difficult to capture global information. In this paper, we propose a novel self-attention generative network, referred to as SAG-FlowNet, for flow field prediction around both original and optimized airfoils. We combine a self-attention mechanism with a multi-layer convolutional generative network. The self-attention module captures relationships within and between flow fields, and with its help the CNN can exploit strongly related features regardless of their spatial distance, yielding better flow field predictions. Through extensive experiments, we evaluate the performance of the proposed SAG-FlowNet. The results show that the method reconstructs and predicts flow fields accurately and generalizes to both original and optimized airfoils. SAG-FlowNet is promising for fast flow field prediction and has potential applications in accelerating airfoil design.
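The paper's exact SAG-FlowNet architecture is not reproduced in this record, but the core idea in the abstract — a self-attention module that lets a convolutional generator weight features by relationship strength regardless of spatial distance — can be illustrated with a minimal NumPy sketch of a SAGAN-style self-attention block. The function name, weight shapes, and the learnable scale `gamma` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def self_attention_2d(x, wf, wg, wh, gamma=1.0):
    """Self-attention over a (C, H, W) feature map.

    wf, wg, wh are 1x1-convolution weights (a 1x1 conv is a per-pixel
    matmul): wf, wg map C -> C' (queries/keys), wh maps C -> C (values).
    Every output pixel becomes a weighted sum over *all* pixels, so
    strongly related features contribute regardless of their distance.
    """
    C, H, W = x.shape
    flat = x.reshape(C, H * W)                   # (C, N), N = H*W pixels
    f = wf @ flat                                # queries (C', N)
    g = wg @ flat                                # keys    (C', N)
    h = wh @ flat                                # values  (C,  N)
    logits = f.T @ g                             # (N, N) pairwise similarity
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(logits)
    attn /= attn.sum(axis=1, keepdims=True)      # softmax over source pixels
    out = h @ attn.T                             # attention-weighted values
    # residual connection: gamma scales how much attention contributes
    return (gamma * out + flat).reshape(C, H, W)
```

With `gamma = 0` the block reduces to the identity, which is why such modules are often initialized that way: the network starts as a plain CNN and learns how much global context to mix in.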
Pages: 7417-7437
Page count: 21
Related papers (50 in total)
  • [21] Vehicle Interaction Behavior Prediction with Self-Attention
    Li, Linhui
    Sui, Xin
    Lian, Jing
    Yu, Fengning
    Zhou, Yafu
    SENSORS, 2022, 22 (02)
  • [22] CSAN: Contextual Self-Attention Network for User Sequential Recommendation
    Huang, Xiaowen
    Qian, Shengsheng
    Fang, Quan
    Sang, Jitao
    Xu, Changsheng
    PROCEEDINGS OF THE 2018 ACM MULTIMEDIA CONFERENCE (MM'18), 2018, : 447 - 455
  • [23] SUPER-RESOLUTION AND SELF-ATTENTION WITH GENERATIVE ADVERSARIAL NETWORK FOR IMPROVING MALIGNANCY CHARACTERIZATION OF HEPATOCELLULAR CARCINOMA
    Li, Yunling
    Huang, Hui
    Zhang, Lijuan
    Wang, Guangyi
    Zhang, Honglai
    Zhou, Wu
    2020 IEEE 17TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI 2020), 2020, : 1556 - 1560
  • [24] MSAN: Multiscale self-attention network for pansharpening
    Lu, Hangyuan
    Yang, Yong
    Huang, Shuying
    Liu, Rixian
    Guo, Huimin
    PATTERN RECOGNITION, 2025, 162
  • [25] A new deep self-attention neural network for GNSS coordinate time series prediction
    Jiang, Weiping
    Wang, Jian
    Li, Zhao
    Li, Wudong
    Yuan, Peng
    GPS SOLUTIONS, 2024, 28 (01)
  • [27] Self-attention Based Collaborative Neural Network for Recommendation
    Ma, Shengchao
    Zhu, Jinghua
    WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2019, 2019, 11604 : 235 - 246
  • [28] Multiple Self-attention Network for Intracranial Vessel Segmentation
    Li, Yang
    Ni, Jiajia
    Elazab, Ahmed
    Wu, Jianhuang
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [29] Multilayer self-attention residual network for code search
    Hu, Haize
    Liu, Jianxun
    Zhang, Xiangping
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (09)
  • [30] Diversifying Search Results using Self-Attention Network
    Qin, Xubo
    Dou, Zhicheng
    Wen, Ji-Rong
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 1265 - 1274