Image Representations of Numerical Simulations for Training Neural Networks

Cited by: 45
Authors
Zhang, Yiming [1 ]
Gao, Zhiran [1 ]
Wang, Xueya [1 ]
Liu, Qi [2 ]
Affiliations
[1] Hebei Univ Technol, Sch Civil & Transportat Engn, Tianjin 300401, Peoples R China
[2] Nanjing Univ Informat Sci & Technol, Sch Comp & Software, Nanjing 210044, Peoples R China
Source
CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES | 2023, Vol. 134, Issue 2
Funding
National Natural Science Foundation of China;
Keywords
Numerical simulations; neural network; pre-/post-processing; data compression; DISCONTINUITY LAYOUT OPTIMIZATION; DEEP; PLASTICITY; CONCRETE; MODEL; RISK;
DOI
10.32604/cmes.2022.022088
Chinese Library Classification (CLC)
T [Industrial Technology];
Subject Classification Code
08;
Abstract
A large amount of data partly assures good fitting quality for trained neural networks. While the quantity of experimental or on-site monitoring data in engineering practice is commonly insufficient and its quality is difficult to control, numerical simulations can provide large amounts of controlled, high-quality data. Once neural networks are trained on such data, they can predict the properties/responses of engineering objects almost instantly, saving the computational effort of further simulations. Correspondingly, a strategy for efficiently transferring the input and output data used in, and obtained from, numerical simulations to neural networks is desirable for engineers and programmers. In this work, we propose a simple image-representation strategy for numerical simulations, in which the input and output data are all represented as images. The temporal and spatial information is preserved while the data are greatly compressed. In addition, the results are readable not only by computers but also by humans. Examples are given that indicate the effectiveness of the proposed strategy.
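As a rough illustration of such a strategy (a minimal sketch, not the authors' implementation), the code below assumes a simulation that produces a time series of 2D scalar fields on a regular grid and encodes each snapshot as an 8-bit grayscale PNG: spatial layout is kept in pixel positions, the time axis is kept in the file ordering, and fixed normalization bounds allow approximate decoding. The helper names `field_to_image` and `image_to_field` are hypothetical.

```python
# Minimal sketch (assumption: simulation output is a series of 2D scalar fields,
# e.g., stress or displacement on a regular grid). Each snapshot is normalized to
# [0, 255] and stored as one grayscale PNG, yielding compact, image-based samples
# that an image-oriented neural network could be trained on.
import numpy as np
from PIL import Image


def field_to_image(field: np.ndarray, vmin: float, vmax: float) -> Image.Image:
    """Map one 2D field snapshot to an 8-bit grayscale image using fixed bounds
    so that all snapshots share a common scale (needed for later decoding)."""
    scaled = (np.clip(field, vmin, vmax) - vmin) / (vmax - vmin)
    return Image.fromarray((scaled * 255).astype(np.uint8), mode="L")


def image_to_field(img: Image.Image, vmin: float, vmax: float) -> np.ndarray:
    """Approximate inverse of field_to_image (quantized to 256 gray levels)."""
    return np.asarray(img, dtype=np.float64) / 255.0 * (vmax - vmin) + vmin


if __name__ == "__main__":
    # Hypothetical stand-in for solver output: 50 time steps of a 64x64 field.
    rng = np.random.default_rng(0)
    snapshots = np.cumsum(rng.normal(size=(50, 64, 64)), axis=0)
    vmin, vmax = snapshots.min(), snapshots.max()
    for t, snap in enumerate(snapshots):
        field_to_image(snap, vmin, vmax).save(f"step_{t:03d}.png")
    # The PNG files compress the raw floating-point arrays considerably and remain
    # viewable by a person, while pixel values can be mapped back to field values.
```

In this sketch the compression and human readability come from the image encoding itself; how the images are then arranged into network inputs and outputs would follow the paper's own scheme.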
Pages: 821-833
Number of pages: 13