Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/11149
Title: Inverse free reduced universum twin support vector machine for imbalanced data classification
Authors: Moosaei, H.
Ganaie, M. A.
Hladík, M.
Tanveer, M.
Keywords: Classification (of information);Inverse problems;Matrix algebra;Support vector machines;Class imbalance learning;Imbalanced dataset;Rectangular kernel;Reduced universum twin support vector machine;Twin support vector machine;Universum;Universum twin support vector machine;Learning algorithms;Machine learning
Issue Date: 2023
Publisher: Elsevier Ltd
Citation: Moosaei, H., Ganaie, M. A., Hladík, M., & Tanveer, M. (2023). Inverse free reduced universum twin support vector machine for imbalanced data classification. Neural Networks, 157, 125-135. doi:10.1016/j.neunet.2022.10.003
Abstract: Imbalanced datasets are prominent in real-world problems. In such problems, the number of samples in one class significantly exceeds that in the other classes, even though the minority classes may be the more important ones. Standard classification algorithms tend to assign all data to the majority class, which is a significant drawback of most standard learning algorithms, so imbalanced datasets need to be handled carefully. The twin support vector machine (TSVM), a well-established algorithm, performs well on balanced data but poorly on imbalanced data. To improve the classification ability of TSVM on imbalanced datasets, a reduced universum twin support vector machine for class imbalance learning (RUTSVM), motivated by the universum twin support vector machine (UTSVM), was recently proposed. A key drawback of RUTSVM is that both its dual problem and the construction of its classifiers require matrix inverse computations. In this paper, we improve RUTSVM and propose an improved reduced universum twin support vector machine for class imbalance learning (IRUTSVM). In the proposed IRUTSVM approach, we derive alternative Lagrangian functions for the primal problems of RUTSVM by moving one of the terms in the objective function into the constraints. As a result, we obtain a new dual formulation for each optimization problem, so that no matrix inverses need to be computed either during training or when constructing the classifiers. Moreover, smaller rectangular kernel matrices are used to reduce the computational time. Extensive experiments on a variety of synthetic and real-world imbalanced datasets show that the IRUTSVM algorithm outperforms the TSVM, UTSVM, and RUTSVM algorithms in terms of generalization performance. © 2022 Elsevier Ltd
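The abstract notes that twin-SVM variants such as RUTSVM require matrix inverse computations when solving for the two nonparallel hyperplanes. A minimal sketch of a least-squares twin SVM (a simplified relative of TSVM, not the paper's IRUTSVM) makes this concrete: each class plane is obtained by solving a small linear system, which is exactly the inverse computation the proposed inverse-free dual formulation avoids. All data, parameter values, and function names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced 2-D data: 20 minority (+1) vs 80 majority (-1) samples.
A = rng.normal(loc=(0.0, 0.0), scale=0.5, size=(20, 2))   # minority class
B = rng.normal(loc=(3.0, 3.0), scale=0.5, size=(80, 2))   # majority class

def lstsvm_fit(A, B, c1=1.0, c2=1.0):
    """Least-squares twin SVM: one nonparallel plane per class.

    Each augmented weight vector u = [w; b] comes from a linear system
    (here solved directly); TSVM variants like RUTSVM need the analogous
    matrix inverse, which IRUTSVM's reformulated dual eliminates.
    """
    H = np.hstack([A, np.ones((A.shape[0], 1))])  # [A  e]
    G = np.hstack([B, np.ones((B.shape[0], 1))])  # [B  e]
    # Plane 1 hugs class A and is pushed away from class B.
    u1 = -c1 * np.linalg.solve(H.T @ H + c1 * G.T @ G, G.T @ np.ones(G.shape[0]))
    # Plane 2 hugs class B and is pushed away from class A.
    u2 = c2 * np.linalg.solve(G.T @ G + c2 * H.T @ H, H.T @ np.ones(H.shape[0]))
    return u1, u2

def lstsvm_predict(X, u1, u2):
    """Assign each point to the class whose plane lies nearer."""
    Xe = np.hstack([X, np.ones((X.shape[0], 1))])
    d1 = np.abs(Xe @ u1) / np.linalg.norm(u1[:-1])  # distance to class-A plane
    d2 = np.abs(Xe @ u2) / np.linalg.norm(u2[:-1])  # distance to class-B plane
    return np.where(d1 <= d2, 1, -1)

u1, u2 = lstsvm_fit(A, B)
X = np.vstack([A, B])
y = np.hstack([np.ones(len(A)), -np.ones(len(B))])
acc = (lstsvm_predict(X, u1, u2) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because each class gets its own plane, the minority class is fitted by its own objective rather than being swamped by the majority, which is one reason twin-SVM variants are attractive for class imbalance learning.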
URI: https://doi.org/10.1016/j.neunet.2022.10.003
https://dspace.iiti.ac.in/handle/123456789/11149
ISSN: 0893-6080
Type of Material: Journal Article
Appears in Collections:Department of Mathematics

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
