NiaNet: A framework for constructing Autoencoder architectures using nature-inspired algorithms

Cited by: 0
Authors
Pavlic, Saso [1 ]
Karakatic, Saso [1 ]
Fister, Iztok, Jr. [1 ]
Affiliations
[1] Univ Maribor, Fac Elect Engn & Comp Sci, Koroska Cesta 46, SLO-2000 Maribor, Slovenia
Source
PROCEEDINGS OF THE 2022 17TH CONFERENCE ON COMPUTER SCIENCE AND INTELLIGENCE SYSTEMS (FEDCSIS) | 2022
Keywords
AutoML; autoencoder; deep learning; nature-inspired algorithms; optimization; differential evolution; neural networks
DOI
10.15439/2022F192
Chinese Library Classification (CLC) code
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The autoencoder (AE), an hourglass-shaped deep neural network capable of learning data representations in a lower dimension, has performed well in various applications. However, developing a high-quality AE for a specific task relies heavily on human expertise, which limits its widespread adoption. Meanwhile, automated machine learning for building deep learning systems without human intervention has been gaining momentum, yet methods for automatically designing particular deep neural networks such as AEs remain scarce. This study presents the NiaNet method and the corresponding software framework for designing AE topology and hyper-parameter settings using nature-inspired algorithms. Our findings show that it is possible to discover the optimal AE architecture for a specific dataset without human expert assistance. The future potential of the proposed method is also discussed.
Pages: 109-116
Number of pages: 8
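
The abstract describes searching for an AE topology and hyper-parameter settings with a nature-inspired algorithm, scoring each candidate by the reconstruction quality of the trained network. The Python sketch below illustrates that general idea only; it is not the NiaNet implementation. The genotype encoding, the helper names (decode, build_autoencoder, AEDesignProblem), and all parameter ranges are assumptions made for illustration, and the optimizer is differential evolution as exposed by the NiaPy 2.x API.

# Illustrative sketch (not NiaNet itself): a candidate vector in [0, 1]^4 is
# decoded into an autoencoder design, trained briefly in PyTorch, and scored
# by its reconstruction error; NiaPy's differential evolution then searches
# for the lowest-error vector. Encoding and ranges are assumptions.
import numpy as np
import torch
import torch.nn as nn
from niapy.problems import Problem
from niapy.task import Task
from niapy.algorithms.basic import DifferentialEvolution


def decode(x, n_features):
    """Map a vector in [0, 1]^4 to concrete AE settings (assumed encoding)."""
    n_layers = 1 + int(x[0] * 2.999)                    # 1-3 encoder layers
    bottleneck = max(2, int(x[1] * (n_features - 1)))   # latent width
    activation = [nn.ReLU, nn.Tanh, nn.Sigmoid][int(x[2] * 2.999)]
    lr = 10 ** (-4 + 3 * x[3])                          # learning rate 1e-4..1e-1
    return n_layers, bottleneck, activation, lr


def build_autoencoder(n_features, n_layers, bottleneck, activation):
    """Symmetric hourglass: widths shrink to the bottleneck, then mirror back."""
    widths = np.linspace(n_features, bottleneck, n_layers + 1).astype(int)
    enc, dec = [], []
    for a, b in zip(widths[:-1], widths[1:]):
        enc += [nn.Linear(int(a), int(b)), activation()]
    for a, b in zip(widths[::-1][:-1], widths[::-1][1:]):
        dec += [nn.Linear(int(a), int(b)), activation()]
    return nn.Sequential(*enc, *dec[:-1])               # linear output layer


class AEDesignProblem(Problem):
    """Fitness of a candidate = reconstruction loss of a briefly trained AE."""

    def __init__(self, data, epochs=20):
        super().__init__(dimension=4, lower=0.0, upper=1.0)
        self.data = torch.tensor(data, dtype=torch.float32)
        self.epochs = epochs

    def _evaluate(self, x):
        n_layers, bottleneck, activation, lr = decode(x, self.data.shape[1])
        model = build_autoencoder(self.data.shape[1], n_layers, bottleneck, activation)
        optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(self.epochs):
            optimizer.zero_grad()
            loss = loss_fn(model(self.data), self.data)
            loss.backward()
            optimizer.step()
        return float(loss.detach())


if __name__ == "__main__":
    data = np.random.rand(256, 16)                      # placeholder dataset
    task = Task(problem=AEDesignProblem(data), max_evals=100)
    algorithm = DifferentialEvolution(population_size=10)
    best_x, best_loss = algorithm.run(task)
    print("best design:", decode(best_x, data.shape[1]), "loss:", best_loss)

Because NiaPy algorithms share the same run(task) interface, any other nature-inspired optimizer from the library could be swapped in for differential evolution without changing the problem definition above.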