Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/17669
Full metadata record
DC FieldValueLanguage
dc.contributor.authorTanveer, M. Sayeden_US
dc.contributor.authorMishra, Akshaten_US
dc.contributor.authorSajid, M.en_US
dc.contributor.authorQuadir, A.en_US
dc.date.accessioned2026-01-09T13:21:16Z-
dc.date.available2026-01-09T13:21:16Z-
dc.date.issued2026-
dc.identifier.citationTanveer, M. S., Mishra, A., Sajid, M., & Quadir, A. (2026). BLS-CIL: Class Imbalance Broad Learning System via Dual Weighting and Layer Trimming. Pattern Recognition, 174. https://doi.org/10.1016/j.patcog.2025.112940en_US
dc.identifier.issn0031-3203-
dc.identifier.otherEID(2-s2.0-105025520697)-
dc.identifier.urihttps://dx.doi.org/10.1016/j.patcog.2025.112940-
dc.identifier.urihttps://dspace.iiti.ac.in:8080/jspui/handle/123456789/17669-
dc.description.abstractRandomized neural networks (RNNs) offer fast learning by randomly initializing hidden layer weights, making them efficient for classification tasks. The broad learning system (BLS), a notable RNN variant, is built on a flat, incrementally expandable architecture that avoids deep structures and backpropagation, enabling rapid and efficient learning. However, traditional BLS assumes equal importance for all training samples, which limits its performance on real-world datasets often affected by noise and class imbalance. Class imbalance occurs when one class has significantly more samples than others, causing the model to favor the majority class and leading to poor generalization on the minority class. This issue, combined with noisy data, degrades BLS performance and highlights the need for more robust generalization to improve reliability under such conditions. To tackle these problems, we propose a broad learning system for class imbalance learning (BLS-CIL) and a trimmed broad learning system for class imbalance learning (BLS-CIL-TRIM), which improve overall robustness while addressing the challenges posed by noise and class imbalance in the datasets. In BLS-CIL, we incorporate a dual-level weighting scheme that considers both class-level and instance-level significance. Simultaneously, an intra-class evaluation assesses each sample's importance based on its distance from the class center. This approach enables the model to emphasize minority class patterns and enhances robustness against noise and intra-class variability. Moreover, in the BLS-CIL-TRIM model, the proposed trimming scheme functions as a powerful normalization approach, enhancing feature consistency by eliminating extreme values. By removing a fixed percentage of the highest and lowest data points, this technique emphasizes the central data distribution and minimizes the influence of noise while preserving important patterns.
We assess the performance of the proposed BLS-CIL and BLS-CIL-TRIM models on 25 KEEL class imbalance benchmark datasets, as well as the BreakHis and ADNI datasets. The proposed BLS-CIL and BLS-CIL-TRIM models effectively address class imbalance and reduce the impact of noise. © 2025 Elsevier Ltden_US
dc.language.isoenen_US
dc.publisherElsevier Ltden_US
dc.sourcePattern Recognitionen_US
dc.subjectBroad learning system (BLS)en_US
dc.subjectClass imbalance (CI) learningen_US
dc.subjectRandom vector functional link (RVFL) networken_US
dc.subjectRandomized neural networks (RNNs)en_US
dc.subjectTrimmingen_US
dc.titleBLS-CIL: Class Imbalance Broad Learning System via Dual Weighting and Layer Trimmingen_US
dc.typeJournal Articleen_US
Appears in Collections:Department of Mathematics

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
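The two mechanisms described in the abstract, dual-level (class plus instance) weighting and percentile trimming of extreme values, can be sketched as follows. This is an illustrative sketch only, not the paper's exact formulation: the function names, the inverse-frequency class weights, the Gaussian kernel on distances to the class center, and the quantile-clipping variant of trimming are all assumptions made for the example.

```python
import numpy as np

def dual_level_weights(X, y):
    """Hypothetical dual-level weighting: class-level weights are inversely
    proportional to class frequency (minority classes get larger weights),
    and instance-level weights decay with a sample's distance from its
    class center (distant, likely noisy samples are down-weighted)."""
    n = len(y)
    classes, counts = np.unique(y, return_counts=True)
    # Class-level term: n / (num_classes * class_count).
    class_w = {c: n / (len(classes) * cnt) for c, cnt in zip(classes, counts)}
    w = np.empty(n)
    for c in classes:
        idx = np.where(y == c)[0]
        center = X[idx].mean(axis=0)
        d = np.linalg.norm(X[idx] - center, axis=1)
        # Instance-level term: Gaussian kernel on distance to the class
        # center, so weights lie in (0, 1] and shrink with distance.
        inst_w = np.exp(-d**2 / (2 * (d.std() + 1e-12) ** 2))
        w[idx] = class_w[c] * inst_w
    return w

def trim_features(H, pct=0.05):
    """Hypothetical per-feature trimming, here realized as winsorizing:
    clip each feature to its [pct, 1 - pct] quantile range so extreme
    values cannot dominate, emphasizing the central data distribution."""
    lo = np.quantile(H, pct, axis=0)
    hi = np.quantile(H, 1 - pct, axis=0)
    return np.clip(H, lo, hi)
```

In a BLS-style pipeline these would plug in at two different points: the sample weights would enter the (regularized) least-squares solve for the output weights as a diagonal weight matrix, while the trimming would be applied to the feature/enhancement-node activations before that solve.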