Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/16071
Full metadata record
DC Field | Value | Language
dc.contributor.author | Akhtar, Mushir | en_US
dc.contributor.author | Tanveer, M. | en_US
dc.contributor.author | Arshad, Mohd. | en_US
dc.date.accessioned | 2025-05-14T16:55:26Z | -
dc.date.available | 2025-05-14T16:55:26Z | -
dc.date.issued | 2025 | -
dc.identifier.citation | Akhtar, M., Tanveer, M., & Arshad, M. (2025). HawkEye: A robust loss function for regression with bounded, smooth, and insensitive zone characteristics. Applied Soft Computing, 176. https://doi.org/10.1016/j.asoc.2025.113118 | en_US
dc.identifier.issn | 1568-4946 | -
dc.identifier.other | EID(2-s2.0-105003879125) | -
dc.identifier.uri | https://doi.org/10.1016/j.asoc.2025.113118 | -
dc.identifier.uri | https://dspace.iiti.ac.in/handle/123456789/16071 | -
dc.description.abstract | Support vector regression (SVR) encounters challenges when confronted with outliers and noise, primarily due to the limitations of the traditional ɛ-insensitive loss function. To address this, bounded loss functions have gained traction for their robustness and improved generalization. More recent advancements, such as BLINEX and bounded least square loss, focus on smooth bounded loss functions that enable efficient gradient-based optimization. However, these approaches lack an insensitive zone, which is crucial for mitigating minor deviations and noise. The challenge of designing a loss function that combines boundedness, smoothness, and an insensitive zone remains unresolved in the current literature. To address this issue, we develop the HawkEye loss, a novel formulation that integrates boundedness, smoothness, and the presence of an insensitive zone. This unique combination enhances the robustness and generalization capabilities of SVR models, particularly in the presence of noise and outliers. Notably, the HawkEye loss is the first in SVR literature to simultaneously incorporate boundedness, smoothness, and an insensitive zone. Leveraging this breakthrough, we integrate the HawkEye loss into the least squares framework of SVR, yielding a new robust and scalable model termed HE-LSSVR. The optimization problem inherent to HE-LSSVR is addressed by harnessing the adaptive moment estimation (Adam) algorithm, known for its adaptive learning rate and efficacy in handling large-scale problems. To our knowledge, this is the first time Adam has been employed to solve an SVR problem. To empirically validate the proposed HE-LSSVR model, we evaluate it on UCI, synthetic, time series, and brain age datasets. The experimental outcomes unequivocally reveal the superiority of the HE-LSSVR model both in terms of its remarkable generalization performance and its efficiency in training time. The code of the proposed model is publicly available at https://github.com/mtanveer1/HawkEye. © 2025 Elsevier B.V. | en_US
dc.language.iso | en | en_US
dc.publisher | Elsevier Ltd | en_US
dc.source | Applied Soft Computing | en_US
dc.subject | Adam algorithm | en_US
dc.subject | HawkEye loss function | en_US
dc.subject | Inverse-free optimization | en_US
dc.subject | Loss function | en_US
dc.subject | Supervised learning | en_US
dc.subject | Support vector regression | en_US
dc.title | HawkEye: A robust loss function for regression with bounded, smooth, and insensitive zone characteristics | en_US
dc.type | Journal Article | en_US
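
Illustrative note: the abstract highlights three design requirements for the loss (boundedness, smoothness, and an ɛ-insensitive zone) together with gradient-based, inverse-free training via Adam. The sketch below is a minimal, hedged illustration of that workflow only. The loss formula, the parameter names eps, lam, and sigma, and the toy linear model are hypothetical stand-ins with the three stated properties; they are not the HawkEye loss or the kernelized HE-LSSVR model, whose exact formulation is given in the paper and in the authors' code at https://github.com/mtanveer1/HawkEye.

```python
# Hedged sketch: a hypothetical bounded, smooth loss with an insensitive zone,
# minimized with Adam on a toy linear model. NOT the authors' HawkEye loss.
import torch

def bounded_smooth_insensitive_loss(residual, eps=0.1, lam=1.0, sigma=2.0):
    """Zero for |r| <= eps (insensitive zone), continuously differentiable,
    and bounded above by lam, so large outlier residuals saturate."""
    excess = torch.clamp(residual.abs() - eps, min=0.0)        # insensitive zone
    return lam * (1.0 - torch.exp(-(excess ** 2) / sigma ** 2))  # smooth, bounded

# Toy data: y = 2x + 1 with Gaussian noise and a few injected outliers.
torch.manual_seed(0)
X = torch.randn(200, 1)
y = 2.0 * X.squeeze() + 1.0 + 0.1 * torch.randn(200)
y[:5] += 10.0                                   # outliers

w = torch.zeros(1, requires_grad=True)          # slope
b = torch.zeros(1, requires_grad=True)          # intercept
optimizer = torch.optim.Adam([w, b], lr=0.05)   # adaptive learning rate, no matrix inverse

for step in range(1000):
    optimizer.zero_grad()
    pred = X.squeeze() * w + b
    loss = bounded_smooth_insensitive_loss(pred - y).mean()
    loss.backward()
    optimizer.step()

print(f"fitted w = {w.item():.3f}, b = {b.item():.3f}")  # fitted parameters
```

The robustness intuition this sketch demonstrates is the one the abstract claims for bounded losses: because the loss saturates, its gradient decays toward zero for very large residuals, so outliers exert only a capped influence on the fitted parameters, while small residuals inside the insensitive zone contribute nothing at all.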
Appears in Collections: Department of Mathematics

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
