A Meta-Learning Approach to Predicting Performance and Data Requirements

Cited by: 5
Authors
Jain, Achin [1]
Swaminathan, Gurumurthy [1]
Favaro, Paolo [1]
Yang, Hao [1]
Ravichandran, Avinash [1]
Harutyunyan, Hrayr [1,2]
Achille, Alessandro [1]
Dabeer, Onkar [1]
Schiele, Bernt [1]
Swaminathan, Ashwin [1]
Soatto, Stefano [1]
Affiliations
[1] AWS AI Labs, Seattle, WA, USA
[2] Univ Southern Calif, Los Angeles, CA USA
Source
2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR | 2023
Keywords
BENCHMARK;
DOI
10.1109/CVPR52729.2023.00353
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We propose an approach to estimate the number of samples required for a model to reach a target performance. We find that the power law, the de facto principle to estimate model performance, leads to a large error when using a small dataset (e.g., 5 samples per class) for extrapolation. This is because the log-performance error against the log-dataset size follows a nonlinear progression in the few-shot regime followed by a linear progression in the high-shot regime. We introduce a novel piecewise power law (PPL) that handles the two data regimes differently. To estimate the parameters of the PPL, we introduce a random forest regressor trained via meta-learning that generalizes across classification/detection tasks, ResNet/ViT based architectures, and random/pre-trained initializations. The PPL improves the performance estimation on average by 37% across 16 classification and 33% across 10 detection datasets, compared to the power law. We further extend the PPL to provide a confidence bound and use it to limit the prediction horizon, which reduces over-estimation of data by 76% on classification and 91% on detection datasets.
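The abstract describes a piecewise power law (PPL): in log-log space, error versus dataset size is nonlinear in the few-shot regime and linear (an ordinary power law) in the high-shot regime. The sketch below is only a minimal illustration of fitting such a curve with SciPy; the specific parameterization (a quadratic term below a switch point n_c, continuous with the linear piece), the function name ppl_log_error, and the sample measurements are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: fit a piecewise power law (PPL) to (dataset size, error)
# points measured from small pilot runs, then extrapolate to a larger dataset size.
# The parameterization below is an assumption for illustration, not the paper's.
import numpy as np
from scipy.optimize import curve_fit

def ppl_log_error(log_n, log_nc, a, b, c):
    """log(error) as a function of log(dataset size).

    At/above the switch point log_nc: linear in log_n (high-shot regime,
    i.e. an ordinary power law). Below log_nc: an extra quadratic term models
    the nonlinear few-shot regime; the two pieces meet at log_nc.
    """
    linear = a * log_n + b
    quad = a * log_n + b + c * (log_n - log_nc) ** 2
    return np.where(log_n < log_nc, quad, linear)

# Hypothetical pilot measurements: (dataset size, test error) pairs
n = np.array([5, 10, 20, 40, 80, 160, 320])
err = np.array([0.62, 0.55, 0.44, 0.33, 0.25, 0.19, 0.15])

log_n, log_e = np.log(n), np.log(err)
p0 = [np.log(40.0), -0.5, 0.0, 0.1]  # initial guess: switch point, slope, intercept, curvature
params, _ = curve_fit(ppl_log_error, log_n, log_e, p0=p0, maxfev=10000)

# Extrapolate the fitted PPL to a larger target dataset size
n_target = 2000
pred_err = float(np.exp(ppl_log_error(np.log(n_target), *params)))
print(f"Predicted error at n={n_target}: {pred_err:.3f}")
```

Note that in the paper the PPL parameters are predicted by a random forest regressor trained via meta-learning across tasks, rather than curve-fitted per dataset as in this sketch, and the extrapolation is further limited by a confidence bound on the prediction horizon.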
Pages: 3623-3632
Page count: 10