Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/11887
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Bilas Pachori, Ram (en_US)
dc.date.accessioned: 2023-06-20T15:34:51Z
dc.date.available: 2023-06-20T15:34:51Z
dc.date.issued: 2023
dc.identifier.citation: Atrey, K., Singh, B. K., Bodhey, N. K., & Bilas Pachori, R. (2023). Mammography and ultrasound based dual modality classification of breast cancer using a hybrid deep learning approach. Biomedical Signal Processing and Control, 86. doi:10.1016/j.bspc.2023.104919 (en_US)
dc.identifier.issn: 1746-8094
dc.identifier.other: EID(2-s2.0-85152899173)
dc.identifier.uri: https://doi.org/10.1016/j.bspc.2023.104919
dc.identifier.uri: https://dspace.iiti.ac.in/handle/123456789/11887
dc.description.abstract: Traditional methods of diagnosing breast cancer (BC) suffer from human error, are less accurate, and are time-consuming. A computer-aided detection (CAD) system can overcome these limitations and help radiologists make accurate decisions. However, existing studies using single imaging modalities have shown limited clinical use due to their low diagnostic accuracy and reliability compared to multimodal systems. Thus, we aim to develop a hybrid deep learning bimodal CAD algorithm for the classification of breast lesions using mammogram and ultrasound imaging modalities combined. A combined convolutional neural network (CNN) and long short-term memory (LSTM) model is implemented using images from both mammogram and ultrasound modalities to improve the early diagnosis of BC. A new real-time dataset consisting of 43 mammogram images and 43 ultrasound images collected from 31 patients is used in this work; each group consists of 25 benign and 18 malignant images. The number of images is increased to 1032 (516 per modality) using different data augmentation techniques. The proposed bimodal CAD algorithm achieves a classification accuracy of 99.35% and an area under the receiver operating characteristic curve (AUC) of 0.99, compared with the traditional unimodal CAD systems, which attain classification accuracies of 97.16% and 98.84% using mammogram and ultrasound, respectively. The proposed bimodal CAD algorithm using combined mammogram and ultrasound thus outperforms the traditional unimodal CAD systems and can help avoid unnecessary biopsies, encouraging its clinical application. © 2023 Elsevier Ltd (en_US)
dc.language.iso: en (en_US)
dc.publisher: Elsevier Ltd (en_US)
dc.source: Biomedical Signal Processing and Control (en_US)
dc.subject: Breast cancer (en_US)
dc.subject: Dual modality (en_US)
dc.subject: Feature fusion (en_US)
dc.subject: Hybrid deep learning (en_US)
dc.title: Mammography and ultrasound based dual modality classification of breast cancer using a hybrid deep learning approach (en_US)
dc.type: Journal Article (en_US)
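
For context on the approach described in the abstract, the following is a minimal, hypothetical sketch of a dual-branch CNN-LSTM classifier with feature-level fusion, written in TensorFlow/Keras. It only illustrates the general idea of combining a mammogram branch and an ultrasound branch; all layer sizes, input dimensions, the reshape-to-sequence step, and the fusion strategy are assumptions and do not reproduce the authors' published architecture.

    # Hypothetical sketch of a dual-modality CNN-LSTM classifier, loosely following
    # the abstract: one branch per imaging modality (mammogram, ultrasound), CNN
    # features reshaped into a sequence for an LSTM, then fused for a binary
    # benign/malignant decision. All hyperparameters are illustrative assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    def modality_branch(name: str, input_shape=(128, 128, 1)) -> tf.keras.Model:
        """CNN feature extractor followed by an LSTM over the spatial feature map."""
        inp = layers.Input(shape=input_shape, name=f"{name}_input")
        x = layers.Conv2D(32, 3, activation="relu", padding="same")(inp)
        x = layers.MaxPooling2D()(x)
        x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
        x = layers.MaxPooling2D()(x)
        # Treat each row of the CNN feature map as one time step for the LSTM.
        x = layers.Reshape((x.shape[1], x.shape[2] * x.shape[3]))(x)
        x = layers.LSTM(64)(x)
        return models.Model(inp, x, name=f"{name}_branch")

    mammo = modality_branch("mammogram")
    ultra = modality_branch("ultrasound")

    # Feature-level fusion of the two modality branches, then a binary head.
    fused = layers.Concatenate()([mammo.output, ultra.output])
    fused = layers.Dense(64, activation="relu")(fused)
    out = layers.Dense(1, activation="sigmoid", name="benign_vs_malignant")(fused)

    model = models.Model([mammo.input, ultra.input], out)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])
    model.summary()

Training such a model would require paired mammogram and ultrasound inputs per patient, consistent with the dual-modality dataset described in the abstract.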
Appears in Collections: Department of Electrical Engineering

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
