Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/17669
Title: BLS-CIL: Class Imbalance Broad Learning System via Dual Weighting and Layer Trimming
Authors: Tanveer, M. Sayed
Mishra, Akshat
Sajid, M.
Quadir, A.
Keywords: Broad learning system (BLS);Class imbalance (CI) learning;Random vector functional link (RVFL) network;Randomized neural networks (RNNs);Trimming
Issue Date: 2026
Publisher: Elsevier Ltd
Citation: Tanveer, M. S., Mishra, A., Sajid, M., & Quadir, A. (2026). BLS-CIL: Class Imbalance Broad Learning System via Dual Weighting and Layer Trimming. Pattern Recognition, 174. https://doi.org/10.1016/j.patcog.2025.112940
Abstract: Randomized neural networks (RNNs) offer fast learning by randomly initializing hidden-layer weights, making them efficient for classification tasks. The broad learning system (BLS), a notable RNN variant, is built on a flat, incrementally expandable architecture that avoids deep structures and backpropagation, enabling rapid and efficient learning. However, traditional BLS assumes equal importance for all training samples, which limits its performance on real-world datasets often affected by noise and class imbalance. Class imbalance occurs when one class has significantly more samples than others, causing the model to favor the majority class and leading to poor generalization on the minority class. This issue, combined with noisy data, degrades BLS performance and highlights the need for more robust generalization under such conditions. To tackle these problems, we propose a broad learning system for class imbalance learning (BLS-CIL) and a trimmed broad learning system for class imbalance learning (BLS-CIL-TRIM), which improve overall robustness while addressing the challenges posed by noise and class imbalance in the datasets. In BLS-CIL, we incorporate a dual-level weighting scheme that considers both class-level and instance-level significance: at the instance level, an intra-class evaluation assesses each sample's importance based on its distance from the class center. This approach enables the model to emphasize minority-class patterns and enhances robustness against noise and intra-class variability. Moreover, in the BLS-CIL-TRIM model, the proposed trimming scheme functions as a powerful normalization approach, enhancing feature consistency by eliminating extreme values. By removing a fixed percentage of the highest and lowest data points, this technique emphasizes the central data distribution and minimizes the influence of noise while preserving important patterns.
We assess the performance of our proposed BLS-CIL and BLS-CIL-TRIM models on 25 KEEL class imbalance benchmark datasets, as well as the BreakHis and ADNI datasets. The proposed BLS-CIL and BLS-CIL-TRIM models effectively address class imbalance and reduce the impact of noise. © 2025 Elsevier Ltd
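The dual-level weighting and trimming ideas described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the inverse-frequency class weights, the Gaussian decay of instance weights with distance from the class center, and the per-feature clipping used as a stand-in for trimming are all assumptions made for the example.

```python
import numpy as np

def dual_weights(X, y, sigma=1.0):
    """Illustrative dual-level sample weights (assumed forms, not the paper's
    exact formulas): class-level weight = inverse class frequency; instance-level
    weight decays with distance from the class center."""
    n = len(y)
    classes = np.unique(y)
    w = np.empty(n)
    for c in classes:
        idx = np.where(y == c)[0]
        # Class-level: minority classes get larger weights.
        class_w = n / (len(classes) * len(idx))
        # Instance-level: samples near the class center count more,
        # down-weighting outliers and noisy points.
        center = X[idx].mean(axis=0)
        dist = np.linalg.norm(X[idx] - center, axis=1)
        inst_w = np.exp(-dist**2 / (2 * sigma**2))
        w[idx] = class_w * inst_w
    return w

def trim_features(X, p=0.05):
    """Illustrative trimming via per-feature clipping: values below the p-th
    and above the (1-p)-th quantile are pulled to those bounds, emphasizing
    the central data distribution."""
    lo = np.quantile(X, p, axis=0)
    hi = np.quantile(X, 1 - p, axis=0)
    return np.clip(X, lo, hi)
```

In a weighted ridge-style BLS solver, such per-sample weights would typically enter the least-squares objective as a diagonal weight matrix, so minority-class and center-proximal samples contribute more to the output-layer solution.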
URI: https://dx.doi.org/10.1016/j.patcog.2025.112940
https://dspace.iiti.ac.in:8080/jspui/handle/123456789/17669
ISSN: 0031-3203
Type of Material: Journal Article
Appears in Collections:Department of Mathematics

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
