Autoencoder Evaluation and Hyper-parameter Tuning in an Unsupervised Setting

Cited by: 3

Authors:
Ordway-West, Ellie [1]
Parveen, Pallabi [1]
Henslee, Austin [1]

Affiliation:
[1] AT&T, Dallas, TX 75202 USA

Source:
2018 IEEE INTERNATIONAL CONGRESS ON BIG DATA (IEEE BIGDATA CONGRESS), 2018

DOI:
10.1109/BigDataCongress.2018.00034

CLC classification:
TP [Automation technology; computer technology]

Discipline code:
0812

Abstract:
This paper introduces a new methodology for evaluating autoencoder performance and for shortening the time spent on heuristic analysis during hyper-parameter tuning. Existing approaches to evaluating hyper-parameter tuning focus on finding known anomalies in a labeled set or on minimizing the average per-row reconstruction error as the criterion for model selection. This paper focuses on anomaly detection in a completely unsupervised setting, where labels are not known during model training or evaluation. The approach uses the approximate Full Width Half Max (FWHM) of the histogram of the per-row reconstruction error, in conjunction with the average per-row reconstruction error and the number of anomalies found, to define a new model-selection method that aims to maximize the FWHM while minimizing the average per-row reconstruction error. This methodology simplifies and speeds up model evaluation by presenting model results in an intuitive manner and by reducing the heuristic analysis needed to determine the "best" model.
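A minimal sketch of the selection quantities described in the abstract, assuming a NumPy setting. The helper names (per_row_reconstruction_error, approximate_fwhm, score_model), the squared-error metric, the fixed anomaly threshold, and the combined FWHM-to-error score are illustrative assumptions drawn from the abstract, not the authors' published implementation.

```python
import numpy as np

def per_row_reconstruction_error(X, X_hat):
    """Mean squared reconstruction error for each row of the input (assumed metric)."""
    return np.mean((X - X_hat) ** 2, axis=1)

def approximate_fwhm(errors, bins=100):
    """Approximate Full Width Half Max of the per-row reconstruction-error histogram."""
    counts, edges = np.histogram(errors, bins=bins)
    half_max = counts.max() / 2.0
    above = np.where(counts >= half_max)[0]  # bins whose count reaches half the peak
    # Width from the left edge of the first such bin to the right edge of the last
    return edges[above[-1] + 1] - edges[above[0]]

def score_model(errors, anomaly_threshold):
    """Summarize one candidate model: wider FWHM and lower average error score higher.

    The ratio used as a single score is an assumption; the paper only states the
    goal of maximizing FWHM while minimizing average per-row error.
    """
    fwhm = approximate_fwhm(errors)
    avg_error = float(errors.mean())
    n_anomalies = int(np.sum(errors > anomaly_threshold))
    return {
        "fwhm": fwhm,
        "avg_error": avg_error,
        "n_anomalies": n_anomalies,
        "score": fwhm / (avg_error + 1e-12),
    }
```

Under these assumptions, each candidate hyper-parameter configuration would be scored on its reconstructions and the configuration with the widest FWHM and lowest average per-row error preferred, with the anomaly count reported alongside for inspection.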
Pages: 205 - 209
Page count: 5

Related papers (10 of 50 shown):
  • [1] Bayesian Optimization for Accelerating Hyper-parameter Tuning
    Vu Nguyen
    2019 IEEE SECOND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND KNOWLEDGE ENGINEERING (AIKE), 2019, : 302 - 305
  • [2] ONLINE HYPER-PARAMETER TUNING FOR THE CONTEXTUAL BANDIT
    Bouneffouf, Djallel
    Claeys, Emmanuelle
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3445 - 3449
  • [3] Hyper-Parameter Tuning for the (1+(λ, λ)) GA
    Nguyen Dang
    Doerr, Carola
    PROCEEDINGS OF THE 2019 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE (GECCO'19), 2019, : 889 - 897
  • [4] Facilitating Database Tuning with Hyper-Parameter Optimization: A Comprehensive Experimental Evaluation
    Zhang, Xinyi
    Chang, Zhuo
    Li, Yang
    Wu, Hong
    Tan, Jian
    Li, Feifei
    Cui, Bin
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2022, 15 (09): : 1808 - 1821
  • [5] Hyper-parameter Tuning under a Budget Constraint
    Lu, Zhiyun
    Chen, Liyu
    Chiang, Chao-Kai
    Sha, Fei
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 5744 - 5750
  • [6] HYPER-TUNE: Towards Efficient Hyper-parameter Tuning at Scale
    Li, Yang
    Shen, Yu
    Jiang, Huaijun
    Zhang, Wentao
    Li, Jixiang
    Liu, Ji
    Zhang, Ce
    Cui, Bin
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2022, 15 (06): : 1256 - 1265
  • [7] Hyper-parameter Tuning for Quantum Support Vector Machine
    Demirtas, Fadime
    Tanyildizi, Erkan
    ADVANCES IN ELECTRICAL AND COMPUTER ENGINEERING, 2022, 22 (04) : 47 - 54
  • [8] Hyper-parameter Tuning of a Decision Tree Induction Algorithm
    Mantovani, Rafael G.
    Horvath, Tomas
    Cerri, Ricardo
    Vanschoren, Joaquin
    de Carvalho, Andre C. P. L. F.
    PROCEEDINGS OF 2016 5TH BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS 2016), 2016, : 37 - 42
  • [9] Effectiveness of Random Search in SVM hyper-parameter tuning
    Mantovani, Rafael G.
    Rossi, Andre L. D.
    Vanschoren, Joaquin
    Bischl, Bernd
    de Carvalho, Andre C. P. L. F.
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015
  • [10] Effects of Random Sampling on SVM Hyper-parameter Tuning
    Horvath, Tomas
    Mantovani, Rafael G.
    de Carvalho, Andre C. P. L. F.
    INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS (ISDA 2016), 2017, 557 : 268 - 278