Please use this identifier to cite or link to this item:
https://dspace.iiti.ac.in/handle/123456789/1122
Title: | Optimization algorithms for non-parallel support vector machines and its applications |
Authors: | Sharma, Anshul |
Supervisors: | Tanveer, M. |
Keywords: | Mathematics |
Issue Date: | 22-Jun-2018 |
Publisher: | Department of Mathematics, IIT Indore |
Series/Report no.: | MS060 |
Abstract: | Least squares twin multi-class classification support vector machine (LST-KSVC) [18] and K-nearest neighbor-based weighted multi-class twin support vector machine (KWMTSVM) [23] are novel multi-class classifiers based on the least squares twin support vector machine (LSTSVM) [17] and the twin support vector machine (TSVM) [14], respectively. LST-KSVC and KWMTSVM obtain two non-parallel hyperplanes for the focused classes by solving a system of linear equations and two small-sized quadratic programming problems (QPPs), respectively. LST-KSVC neglects the local information of the data points: it constructs the optimal hyperplanes by assuming that every data point contributes the same weight, whereas in general each data point has a different influence on the optimal hyperplanes. KWMTSVM, in turn, solves QPPs, which take more time to compute the optimal hyperplanes. To reduce these drawbacks, in chapter 4 of this thesis we propose a novel algorithm based on a least squares version of KWMTSVM, termed the least squares K-nearest neighbor-based weighted multi-class twin support vector machine (LS-KWMTSVM). To exploit the local information, our algorithm introduces the weight matrices Di (i = 1, 2) into the objective functions of the QPPs, and to exploit the inter-class information it uses the weight vectors Fv1, Fv2, and Hv in the constraints. If any component of these vectors is zero, the corresponding constraint is redundant and can be discarded. Eliminating the redundant constraints and solving a system of linear equations instead of QPPs makes our algorithm faster than KWMTSVM. LS-KWMTSVM evaluates all the training data points in a "1-versus-1-versus-rest" structure, so it produces ternary outputs, which helps in dealing with imbalanced datasets. The classical TSVM uses the hinge loss function [4], which is sensitive to noise and unstable under re-sampling.
To improve the performance of TSVM, in chapter 5 of this thesis we introduce a novel algorithm termed the general twin support vector machine with pinball loss. In this algorithm we use the quantile distance [15, 20] and the pinball loss function [15, 20] instead of the shortest distance and the hinge loss, respectively. We justify theoretically and experimentally that the proposed algorithm is insensitive to noise, i.e., it gives better classification results on noise-corrupted data. |
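As an illustrative sketch (not code from the thesis), the K-nearest-neighbor weighting idea behind the matrices Di in LS-KWMTSVM can be pictured in Python. The function name and the exact weighting scheme here are our assumptions: each point receives a diagonal weight counting how often it appears among the k nearest neighbors of other points in its class, so locally influential points get larger weights.

```python
import numpy as np

def knn_weight_matrix(X, k=3):
    """Hypothetical sketch of a KNN-based diagonal weight matrix D
    for the samples X of one class (not the thesis's exact construction)."""
    # Pairwise squared Euclidean distances within the class.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # a point is not its own neighbor
    # Adjacency W: W[i, j] = 1 if x_j is among the k nearest neighbors of x_i.
    W = np.zeros_like(d2)
    idx = np.argsort(d2, axis=1)[:, :k]
    rows = np.repeat(np.arange(len(X)), k)
    W[rows, idx.ravel()] = 1.0
    # Diagonal weight d_j = sum_i W[i, j]: how often x_j is chosen as a neighbor.
    return np.diag(W.sum(axis=0))
```

With this kind of D, an outlier far from its class is rarely anyone's neighbor, so its diagonal weight (and hence its pull on the hyperplane) shrinks toward zero.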
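The noise-sensitivity contrast between the hinge loss and the pinball loss can also be sketched (again, illustrative code with our own function names, not the thesis's implementation). Writing u = 1 - y f(x) for the margin residual, the hinge loss max(0, u) ignores all correctly classified points, while the pinball loss with parameter tau also penalizes points with u < 0 at slope -tau, which is what makes it less sensitive to noise around the decision boundary.

```python
import numpy as np

def hinge_loss(u):
    # Classical hinge: penalizes only margin violations (u > 0).
    return np.maximum(0.0, u)

def pinball_loss(u, tau=0.5):
    # Pinball: slope 1 for u >= 0, slope -tau for u < 0, so even
    # well-classified points contribute, stabilizing the solution under noise.
    return np.where(u >= 0, u, -tau * u)
```

For a point well inside its class (u = -2), the hinge loss is 0 while the pinball loss with tau = 0.5 is 1, so the pinball solution depends on the bulk of the data rather than only on boundary points.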
URI: | https://dspace.iiti.ac.in/handle/123456789/1122 |
Type of Material: | Thesis_M.Sc |
Appears in Collections: | Department of Mathematics_ETD |
Files in This Item:
File | Description | Size | Format
---|---|---|---
MS60_Anshul_Sharma_1603141001.pdf | | 821.59 kB | Adobe PDF