HawkEye: A robust loss function for regression with bounded, smooth, insensitive zone characteristics

Cited: 0
Authors
Akhtar, Mushir [1 ]
Tanveer, M. [1 ]
Arshad, Mohd. [1 ]
Affiliations
[1] Indian Inst Technol Indore, Dept Math, Indore 453552, Madhya Pradesh, India
Keywords
Supervised learning; Support vector regression; Loss function; HawkEye loss function; Inverse-free optimization; Adam algorithm
DOI
10.1016/j.asoc.2025.113118
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Support vector regression (SVR) encounters challenges when confronted with outliers and noise, primarily due to the limitations of the traditional ε-insensitive loss function. To address this, bounded loss functions have gained traction for their robustness and improved generalization. More recent advancements, such as the BLINEX and bounded least squares losses, focus on smooth bounded loss functions that enable efficient gradient-based optimization. However, these approaches lack an insensitive zone, which is crucial for mitigating small deviations and noise. The challenge of designing a loss function that combines boundedness, smoothness, and an insensitive zone remains unresolved in the current literature. To address this issue, we develop the HawkEye loss, a novel formulation that integrates boundedness, smoothness, and the presence of an insensitive zone. This unique combination enhances the robustness and generalization capabilities of SVR models, particularly in the presence of noise and outliers. Notably, the HawkEye loss is the first in the SVR literature to simultaneously incorporate boundedness, smoothness, and an insensitive zone. Leveraging this breakthrough, we integrate the HawkEye loss into the least squares framework of SVR and obtain a new robust and scalable model termed HE-LSSVR. The optimization problem inherent to HE-LSSVR is addressed by harnessing the adaptive moment estimation (Adam) algorithm, known for its adaptive learning rate and efficacy in handling large-scale problems. To our knowledge, this is the first time Adam has been employed to solve an SVR problem. To empirically validate the proposed HE-LSSVR model, we evaluate it on UCI, synthetic, time series, and age datasets. The experimental outcomes unequivocally reveal the superiority of the HE-LSSVR model in terms of its remarkable generalization performance and its efficiency in training time. The code of the proposed model is publicly available at https://github.com/mtanveer1/HawkEye.
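Illustrative Sketch
The record does not reproduce the HawkEye loss formula, so the Python (PyTorch) snippet below is a minimal sketch, under stated assumptions, of the three properties the abstract claims (boundedness, smoothness, and an ε-insensitive zone) together with an Adam-based primal solve. The surrogate bounded_insensitive_loss is a generic bounded exponential loss, not the paper's actual formulation, and every name and hyperparameter value (eps, a, lam, C, the learning rate) is a hypothetical placeholder.

import torch

def bounded_insensitive_loss(residual, eps=0.1, a=1.0, lam=1.0):
    # NOTE: illustrative surrogate, not the HawkEye loss from the paper.
    # Exactly zero inside the insensitive zone [-eps, eps]; differentiable elsewhere.
    dev = torch.clamp(residual.abs() - eps, min=0.0)
    # Saturates at lam as |residual| grows, so outliers have bounded influence.
    return lam * (1.0 - torch.exp(-a * dev ** 2))

# Toy regression data with a few injected outliers.
torch.manual_seed(0)
X = torch.randn(200, 5)
y = X @ torch.ones(5) + 0.1 * torch.randn(200)
y[:10] += 8.0  # outliers; a bounded loss caps their gradient contribution

# Primal least-squares-style SVR model f(x) = w.x + b, trained with Adam
# so no kernel-matrix inversion is needed (an "inverse-free" solve).
w = torch.zeros(5, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([w, b], lr=0.05)
C = 10.0  # trade-off between the regularizer and the empirical loss

for _ in range(500):
    opt.zero_grad()
    residual = y - (X @ w + b)
    objective = 0.5 * w.dot(w) + C * bounded_insensitive_loss(residual).mean()
    objective.backward()
    opt.step()

Because the surrogate saturates at lam, the gradient contributed by a large residual vanishes, which is the standard mechanism behind robustness claims for bounded losses; solving the primal by gradient descent with Adam matches the "Inverse-free optimization" keyword above.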
Pages: 14