Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/6530
Full metadata record
DC Field	Value	Language
dc.contributor.author	Tanveer, M.	en_US
dc.contributor.author	Sharma, Anurag	en_US
dc.date.accessioned	2022-03-17T01:00:00Z	-
dc.date.accessioned	2022-03-21T10:49:44Z	-
dc.date.available	2022-03-17T01:00:00Z	-
dc.date.available	2022-03-21T10:49:44Z	-
dc.date.issued	2021	-
dc.identifier.citation	Tanveer, M., Sharma, A., & Suganthan, P. N. (2021). Least squares KNN-based weighted multiclass twin SVM. Neurocomputing, 459, 454-464. doi:10.1016/j.neucom.2020.02.132	en_US
dc.identifier.issn	0925-2312	-
dc.identifier.other	EID(2-s2.0-85090061095)	-
dc.identifier.uri	https://doi.org/10.1016/j.neucom.2020.02.132	-
dc.identifier.uri	https://dspace.iiti.ac.in/handle/123456789/6530	-
dc.description.abstract	The K-nearest neighbor (KNN) based weighted multi-class twin support vector machine (KWMTSVM) is a novel multi-class classification method. In this paper, we propose a least squares version of KWMTSVM, called LS-KWMTSVM, obtained by replacing the inequality constraints with equality constraints and minimizing the slack variables with the square of the 2-norm instead of the conventional 1-norm. This simple modification leads to a very fast algorithm with much better results. The modified primal problems in the proposed LS-KWMTSVM require solving only two systems of linear equations, whereas KWMTSVM requires solving two quadratic programming problems (QPPs). Like KWMTSVM, the proposed LS-KWMTSVM employs a weight matrix in the objective function to exploit the local information of the training samples. To exploit inter-class information, we use weight vectors in the constraints of the proposed LS-KWMTSVM. If any component of these vectors is zero, the corresponding constraint is redundant and can be omitted. Eliminating redundant constraints and solving systems of linear equations instead of QPPs make the proposed LS-KWMTSVM more robust and faster than KWMTSVM. Like KWMTSVM, the proposed LS-KWMTSVM arranges all the training data points into a "1-versus-1-versus-rest" structure, and thus generates a ternary output {-1, 0, +1}, which helps to deal with imbalanced datasets. Numerical experiments on several UCI and KEEL imbalanced datasets (with high imbalance ratios) clearly indicate that the proposed LS-KWMTSVM achieves better classification accuracy than other baseline methods with remarkably less computational time. © 2020 Elsevier B.V.	en_US
dc.language.iso	en	en_US
dc.publisher	Elsevier B.V.	en_US
dc.source	Neurocomputing	en_US
dc.subject	Classification (of information)	en_US
dc.subject	Constraint theory	en_US
dc.subject	Learning systems	en_US
dc.subject	Linear equations	en_US
dc.subject	Nearest neighbor search	en_US
dc.subject	Numerical methods	en_US
dc.subject	Quadratic programming	en_US
dc.subject	Classification accuracy	en_US
dc.subject	K nearest neighbor (KNN)	en_US
dc.subject	Least squares versions	en_US
dc.subject	Multi-class classification	en_US
dc.subject	Quadratic programming problems	en_US
dc.subject	System of linear equations	en_US
dc.subject	Systems of linear equations	en_US
dc.subject	Twin support vector machines	en_US
dc.subject	Support vector machines	en_US
dc.subject	adult	en_US
dc.subject	article	en_US
dc.subject	least square analysis	en_US
dc.subject	twin support vector machine	en_US
dc.title	Least squares KNN-based weighted multiclass twin SVM	en_US
dc.type	Journal Article	en_US
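
Note on the method described in the abstract: replacing the inequality constraints with equalities and penalizing the slack variables by the squared 2-norm turns each twin-SVM sub-problem into an unconstrained least squares problem, so each hyperplane is obtained from a single linear system rather than a QPP. The sketch below illustrates that idea for one weighted least squares twin-SVM hyperplane. It is a minimal sketch, not the paper's formulation: the function names (knn_weight_matrix, ls_twin_hyperplane), the particular KNN weighting scheme, the parameter c1, and the small ridge term are assumptions made for illustration, and the paper's multiclass "1-versus-1-versus-rest" structure and inter-class weight vectors are not reproduced here.

import numpy as np
from scipy.spatial.distance import cdist

def knn_weight_matrix(A, k=5):
    """Diagonal weight matrix from intra-class KNN counts (illustrative assumption,
    a rough proxy for the KNN-based local weighting described in the abstract)."""
    d = cdist(A, A)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]                 # k nearest neighbours per sample
    counts = np.bincount(nn.ravel(), minlength=len(A)).astype(float)
    return np.diag(1.0 + counts / k)                  # denser samples get larger weights

def ls_twin_hyperplane(A, B, c1=1.0, k=5):
    """Solve one least squares twin-SVM hyperplane via a linear system.

    A: samples the plane should pass close to; B: samples it should be far from.
    With equality constraints and squared 2-norm slacks, the sub-problem reduces
    to the normal equations below (standard least squares twin-SVM algebra)."""
    D = knn_weight_matrix(A, k)                       # KNN-based weights on class A
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    F = np.hstack([A, e1])                            # [A  e], augmented with bias column
    E = np.hstack([B, e2])
    # Normal equations: (F^T D F + c1 E^T E) z = -c1 E^T e2, with z = [w; b]
    lhs = F.T @ D @ F + c1 * (E.T @ E)
    lhs += 1e-8 * np.eye(lhs.shape[0])                # small ridge for numerical stability (assumption)
    rhs = -c1 * (E.T @ e2)
    z = np.linalg.solve(lhs, rhs)
    return z[:-1].ravel(), float(z[-1])               # (w, b)

In a "1-versus-1-versus-rest" setting, such a solve would be repeated for each pair of focused classes (with the remaining classes entering through additional weighted constraints), and test points would be labeled via the ternary {-1, 0, +1} outputs, as the abstract describes.
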
Appears in Collections: Department of Mathematics

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
