Please use this identifier to cite or link to this item:
https://dspace.iiti.ac.in/handle/123456789/15113
Title: | Granular Ball Twin Support Vector Machine |
Authors: | Quadir, A.; Sajid, M.; Tanveer, M. |
Keywords: | Granular ball (GB);granular computing;large-scale dataset;structural risk minimization (SRM) principle;support vector machine (SVM);twin SVM (TSVM) |
Issue Date: | 2024 |
Publisher: | Institute of Electrical and Electronics Engineers Inc. |
Citation: | Quadir, A., Sajid, M., & Tanveer, M. (2024). Granular Ball Twin Support Vector Machine. IEEE Transactions on Neural Networks and Learning Systems. Scopus. https://doi.org/10.1109/TNNLS.2024.3476391 |
Abstract: | Twin support vector machine (TSVM) is an emerging machine learning model with versatile applicability in classification and regression endeavors. Nevertheless, TSVM confronts noteworthy challenges: 1) the imperative demand for matrix inversions presents formidable obstacles to its efficiency and applicability on large-scale datasets; 2) the omission of the structural risk minimization (SRM) principle in its primal formulation heightens the vulnerability to overfitting; and 3) TSVM exhibits a high susceptibility to noise and outliers and also demonstrates instability when subjected to resampling. In view of the aforementioned challenges, we propose the granular ball TSVM (GBTSVM). GBTSVM takes granular balls (GBs), rather than individual data points, as inputs to construct a classifier. These GBs, characterized by their coarser granularity, exhibit robustness to resampling and reduced susceptibility to the impact of noise and outliers. We further propose a novel large-scale GBTSVM (LS-GBTSVM). The optimization formulation of LS-GBTSVM ensures two critical facets: 1) it eliminates the need for matrix inversions, streamlining the computational efficiency of LS-GBTSVM; and 2) it incorporates the SRM principle through regularization terms, effectively addressing the issue of overfitting. The proposed LS-GBTSVM exemplifies efficiency, scalability to large datasets, and robustness against noise and outliers. We conduct a comprehensive evaluation of the GBTSVM and LS-GBTSVM models on benchmark datasets from UCI and KEEL, both with and without added label noise, and compare them with existing baseline models. Furthermore, we extend our assessment to the large-scale NDC datasets to establish the practicality of the proposed models in such contexts. Our experimental findings and rigorous statistical analyses affirm the superior generalization prowess of the proposed GBTSVM and LS-GBTSVM models compared to the baseline models. The source code of the proposed GBTSVM and LS-GBTSVM models is available at https://github.com/mtanveer1/GBTSVM. © 2012 IEEE. |
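
Note: The abstract only outlines the granular ball (GB) idea at a high level. Below is a minimal Python sketch of how GBs are commonly generated in granular computing (recursive 2-means splitting until each ball meets a purity threshold, then summarizing each ball by its center, radius, and majority label). This is an illustrative assumption, not the authors' implementation; the function names (generate_granular_balls, purity_threshold, etc.) are hypothetical. The official GBTSVM/LS-GBTSVM code is at https://github.com/mtanveer1/GBTSVM.

import numpy as np

def make_ball(Xb, yb):
    """Summarize a group of points as (center, radius, majority label)."""
    center = Xb.mean(axis=0)
    radius = np.linalg.norm(Xb - center, axis=1).mean()
    return center, radius, np.bincount(yb).argmax()

def purity(yb):
    """Fraction of points in the group carrying the majority label."""
    _, counts = np.unique(yb, return_counts=True)
    return counts.max() / counts.sum()

def two_means_assign(Xb, rng, n_iter=10):
    """Crude 2-means pass; returns a 0/1 cluster assignment for each point."""
    centers = Xb[rng.choice(len(Xb), size=2, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(Xb[:, None, :] - centers[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        for k in range(2):
            if np.any(assign == k):
                centers[k] = Xb[assign == k].mean(axis=0)
    return assign

def generate_granular_balls(X, y, purity_threshold=0.95, min_points=2, seed=0):
    """Recursively split the data until every ball is pure enough or too small."""
    rng = np.random.default_rng(seed)
    balls, queue = [], [(X, y)]
    while queue:
        Xb, yb = queue.pop()
        if len(Xb) <= min_points or purity(yb) >= purity_threshold:
            balls.append(make_ball(Xb, yb))
            continue
        assign = two_means_assign(Xb, rng)
        if assign.min() == assign.max():  # degenerate split: stop recursing
            balls.append(make_ball(Xb, yb))
        else:
            queue.append((Xb[assign == 0], yb[assign == 0]))
            queue.append((Xb[assign == 1], yb[assign == 1]))
    return balls

# Toy usage: two labeled Gaussian blobs.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(4.0, 1.0, (100, 2))])
    y = np.array([0] * 100 + [1] * 100)
    balls = generate_granular_balls(X, y)
    print(f"{len(balls)} granular balls (center, radius, label) would then be "
          "fed to the twin-SVM-style classifier instead of the raw points.")

The point of the sketch is only to show why GBs help with noise and scale: the downstream classifier sees a much smaller set of coarse-grained (center, radius, label) summaries rather than every raw, possibly mislabeled, point.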
URI: | https://doi.org/10.1109/TNNLS.2024.3476391 https://dspace.iiti.ac.in/handle/123456789/15113 |
ISSN: | 2162-237X |
Type of Material: | Journal Article |
Appears in Collections: | Department of Mathematics |
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.