Automatic Optimization-Based Methods in Machine Learning: A Systematic Review

Cited by: 0
Authors
Shahrabadi, Somayeh [1 ,2 ]
Adao, Telmo [1 ,2 ]
Alves, Victor [2 ]
Magalhaes, Luis G. [2 ]
Affiliations
[1] Ctr Comp Graph CCG, Campus Azurem,Edificio 14, P-4800058 Guimaraes, Portugal
[2] Univ Minho, ALGORITMI Res Ctr, Guimaraes, Portugal
Source
INTELLIGENT SYSTEMS AND APPLICATIONS, VOL 2, INTELLISYS 2023 | 2024 / Vol. 823
Keywords
Deep Learning; Machine Learning; Artificial Intelligence; Computer Vision; Dataset Learning Optimization; Auto ML; Automatic Data Augmentation; CNN Architecture; Hyperparameter Optimization; Neural Architecture Search; HYPERPARAMETER OPTIMIZATION; DATA AUGMENTATION; SEARCH; ALGORITHMS;
DOI
10.1007/978-3-031-47724-9_21
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Machine Learning (ML) is a subfield of Artificial Intelligence (AI) that has been applied across fields ranging from industry to medicine to perform Computer Vision and related tasks such as image classification, image segmentation, object detection, and language modeling. Nevertheless, obtaining a model with practical applicability requires performing appropriate structural operations on datasets, building adequate CNN architectures from scratch or adopting those available in the state of the art, and, either way, parameterizing them to improve learning performance, usually in a trial-and-error fashion. In this context, several semi- or fully automatic approaches can be found in the literature (e.g., grid search for hyperparameter fine-tuning, auto-Machine Learning for self-configurable model development, and automatic methods for data arrangement and augmentation), and they are often combined to establish automatic pipelines for the effective implementation of AI-powered solutions; surveys documenting this topic, however, remain scarce. Therefore, the main goal of this work is to present an updated yet extensive literature review of this class of approaches, considering the importance of their role from the perspective of ML Optimization.
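Of the automatic approaches the abstract lists, grid search is the simplest hyperparameter-optimization strategy: exhaustively evaluate every combination in a predefined parameter grid and keep the best-scoring configuration. A minimal self-contained sketch follows; the `validation_loss` function is a hypothetical stand-in for training and evaluating a real model on each configuration.

```python
import itertools

# Hypothetical stand-in objective: in practice this would train a model
# with the given hyperparameters and return its validation loss.
def validation_loss(lr, batch_size):
    return (lr - 0.01) ** 2 + (batch_size - 32) ** 2 / 1e4

# Predefined grid of candidate hyperparameter values.
grid = {
    "lr": [0.001, 0.01, 0.1],
    "batch_size": [16, 32, 64],
}

# Exhaustively evaluate every combination (the Cartesian product of the grid).
best_cfg, best_loss = None, float("inf")
for values in itertools.product(*grid.values()):
    cfg = dict(zip(grid.keys(), values))
    loss = validation_loss(**cfg)
    if loss < best_loss:
        best_cfg, best_loss = cfg, loss

print(best_cfg)  # -> {'lr': 0.01, 'batch_size': 32}
```

The exhaustive sweep makes grid search trivially parallel but exponentially costly in the number of hyperparameters, which is what motivates the random-search and Bayesian alternatives the review compares it against.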
Pages: 309-326
Page count: 18