Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/15797
Full metadata record
DC Field | Value | Language
dc.contributor.author | Akhtar, Mushir | en_US
dc.contributor.author | Tanveer, M. | en_US
dc.contributor.author | Arshad, Mohd. | en_US
dc.date.accessioned | 2025-03-26T09:59:08Z | -
dc.date.available | 2025-03-26T09:59:08Z | -
dc.date.issued | 2025 | -
dc.identifier.citation | Akhtar, M., Tanveer, M., & Arshad, M. (2025). RoBoSS: A Robust, Bounded, Sparse, and Smooth Loss Function for Supervised Learning. IEEE Transactions on Pattern Analysis and Machine Intelligence, 47(1), 149–160. https://doi.org/10.1109/TPAMI.2024.3465535 | en_US
dc.identifier.issn | 0162-8828 | -
dc.identifier.other | EID(2-s2.0-86000375114) | -
dc.identifier.uri | https://doi.org/10.1109/TPAMI.2024.3465535 | -
dc.identifier.uri | https://dspace.iiti.ac.in/handle/123456789/15797 | -
dc.description.abstract | In the domain of machine learning, the significance of the loss function is paramount, especially in supervised learning tasks. It serves as a fundamental pillar that profoundly influences the behavior and efficacy of supervised learning algorithms. Traditional loss functions, though widely used, often struggle to handle outlier-prone and high-dimensional data, resulting in suboptimal outcomes and slow convergence during training. In this paper, we address the aforementioned constraints by proposing a novel robust, bounded, sparse, and smooth (RoBoSS) loss function for supervised learning. Further, we incorporate the RoBoSS loss within the framework of the support vector machine (SVM) and introduce a new robust algorithm named LRoBoSS-SVM. For the theoretical analysis, the classification-calibrated property and generalization ability are also presented. These investigations are crucial for gaining deeper insights into the robustness of the RoBoSS loss function in classification problems and its potential to generalize well to unseen data. To validate the potency of the proposed LRoBoSS-SVM, we assess it on 88 benchmark datasets from the KEEL and UCI repositories. Further, to rigorously evaluate its performance in challenging scenarios, we conduct an assessment using datasets intentionally infused with outliers and label noise. Additionally, to exemplify the effectiveness of LRoBoSS-SVM within the biomedical domain, we evaluate it on two medical datasets: the electroencephalogram (EEG) signal dataset and the breast cancer (BreaKHis) dataset. The numerical results substantiate the superiority of the proposed LRoBoSS-SVM model, both in terms of its remarkable generalization performance and its efficiency in training time. © 1979-2012 IEEE. | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE Computer Society | en_US
dc.source | IEEE Transactions on Pattern Analysis and Machine Intelligence | en_US
dc.subject | Classification | en_US
dc.subject | loss functions | en_US
dc.subject | RoBoSS loss function | en_US
dc.subject | supervised machine learning (SML) | en_US
dc.subject | support vector machine (SVM) | en_US
dc.title | RoBoSS: A Robust, Bounded, Sparse, and Smooth Loss Function for Supervised Learning | en_US
dc.type | Journal Article | en_US
dc.rights.license | All Open Access | -
dc.rights.license | Green Open Access | -
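
The abstract above describes replacing the conventional hinge loss in the SVM objective with a loss that is robust, bounded, sparse, and smooth. As a rough illustration of how a bounded, smooth margin loss plugs into an SVM-style objective, the Python sketch below trains a linear classifier by gradient descent on a regularized empirical risk. The saturating loss used here is a generic placeholder, not the RoBoSS loss defined in the paper, and the function names, hyperparameters (C, lr, epochs), and toy data are illustrative assumptions rather than the authors' LRoBoSS-SVM solver.

```python
# Illustrative sketch only: a linear SVM-style classifier trained with a
# bounded, smooth surrogate loss in place of the hinge loss.
# The loss below is a generic saturating placeholder, NOT the RoBoSS loss
# from the paper; names and hyperparameters (C, lr, epochs) are assumptions.
import numpy as np

def bounded_smooth_loss(u):
    """Placeholder bounded, smooth margin loss: 0 for u <= 0, saturates at 1."""
    v = np.maximum(u, 0.0)
    return 1.0 - np.exp(-v ** 2)

def bounded_smooth_grad(u):
    """Derivative of the placeholder loss w.r.t. u (continuous everywhere)."""
    v = np.maximum(u, 0.0)
    return 2.0 * v * np.exp(-v ** 2)

def train_linear_classifier(X, y, C=1.0, lr=0.01, epochs=200):
    """Gradient descent on 0.5*||w||^2 + C * sum(loss(1 - y*(X@w + b)))."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = 1.0 - y * (X @ w + b)      # u_i = 1 - y_i f(x_i)
        g = bounded_smooth_grad(margins)     # per-sample loss derivative
        grad_w = w - C * (X.T @ (g * y))     # gradient of the objective in w
        grad_b = -C * np.sum(g * y)          # gradient of the objective in b
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy usage on linearly separable data with one injected outlier:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
X[0] = [10.0, 10.0]   # outlier; the bounded loss caps its influence
w, b = train_linear_classifier(X, y)
print("train accuracy:", np.mean(np.sign(X @ w + b) == y))
```

Because the placeholder loss saturates, the injected outlier can contribute at most a fixed penalty to the objective, which is the intuition the abstract points to when it motivates bounded losses for outlier-prone data.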
Appears in Collections: Department of Mathematics

Files in This Item:
There are no files associated with this item.


