Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/14028
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Nalwaya, Aditya (en_US)
dc.contributor.author: Das, Kritiprasanna (en_US)
dc.contributor.author: Pachori, Ram Bilas (en_US)
dc.date.accessioned: 2024-07-18T13:48:20Z
dc.date.available: 2024-07-18T13:48:20Z
dc.date.issued: 2024
dc.identifier.citation: Nalwaya, A., Das, K., & Pachori, R. B. (2024). An automated framework for human emotion detection from multichannel EEG signals. IEEE Sensors Journal. Scopus. https://doi.org/10.1109/JSEN.2024.3398050 (en_US)
dc.identifier.issn: 1530-437X
dc.identifier.other: EID(2-s2.0-85193543581)
dc.identifier.uri: https://doi.org/10.1109/JSEN.2024.3398050
dc.identifier.uri: https://dspace.iiti.ac.in/handle/123456789/14028
dc.description.abstract: This paper presents a novel electroencephalogram (EEG) rhythm-based approach for emotion recognition. Recognizing multiple classes of emotion is a challenging task, and several attempts have been made at it earlier. The proposed work presents a simple and efficient framework for emotion recognition. Instead of using separate methods for signal quality enhancement and signal component extraction, the current study employs a single advanced signal processing method that addresses both issues. A joint time-frequency domain feature is proposed; it helps estimate the effect of emotion elicitation on the time-frequency distribution of each rhythm computed across all the channels. Additionally, channel-wise separated EEG rhythm features are extracted, and these features are used to determine the emotional state with a machine learning model. In EEG, several oscillatory rhythms exist which reflect the brain's neural activity. The current study assesses changes in EEG rhythms due to audiovisual elicitation. Four classes of emotion, namely happy, sad, fear, and neutral, are studied in this paper. The subject-wise mean accuracy obtained is 95.91%. The proposed framework uses the multivariate variational mode decomposition (MVMD) method to separate the raw signal into the various EEG rhythms. It has also been found that the higher-frequency rhythms carry more emotion-related information than the lower-frequency rhythms. A simple approach with good accuracy makes the proposed methodology significant. (en_US)
dc.language.iso: en (en_US)
dc.publisher: Institute of Electrical and Electronics Engineers Inc. (en_US)
dc.source: IEEE Sensors Journal (en_US)
dc.subject: EEG (en_US)
dc.subject: emotion recognition (en_US)
dc.subject: joint time-frequency analysis (en_US)
dc.subject: MVMD (en_US)
dc.subject: rhythms (en_US)
dc.title: An automated framework for human emotion detection from multichannel EEG signals (en_US)
dc.type: Journal Article (en_US)
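
The abstract above describes a concrete processing pipeline: decompose the multichannel EEG into oscillatory rhythms, compute time-frequency features for each rhythm and channel, and feed those features to a machine learning classifier for the four emotion classes. Below is a minimal Python sketch of that pipeline shape. It is illustrative only: the paper uses multivariate variational mode decomposition (MVMD) to extract the rhythms, and since no standard Python MVMD API is assumed here, a zero-phase Butterworth band-pass filter bank over the canonical EEG bands stands in for it; the sampling rate, band edges, mean log spectrogram-energy feature, and random forest classifier are likewise assumptions, not the authors' exact configuration.

    import numpy as np
    from scipy.signal import butter, filtfilt, spectrogram
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    FS = 128  # sampling rate in Hz (an assumption, not from the paper)

    # Canonical EEG rhythm bands in Hz. The paper separates rhythms with
    # MVMD; this band-pass filter bank is a simple stand-in for illustration.
    BANDS = {
        "delta": (0.5, 4.0),
        "theta": (4.0, 8.0),
        "alpha": (8.0, 13.0),
        "beta": (13.0, 30.0),
        "gamma": (30.0, 45.0),
    }

    def extract_rhythms(eeg):
        """Split a (channels, samples) EEG array into per-band signals."""
        rhythms = {}
        for name, (lo, hi) in BANDS.items():
            b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
            rhythms[name] = filtfilt(b, a, eeg, axis=-1)  # zero-phase filter
        return rhythms

    def joint_tf_features(eeg):
        """Mean log time-frequency energy per rhythm and channel, flattened
        into one vector (a rough analogue of the paper's joint
        time-frequency features, not the authors' exact definition)."""
        feats = []
        for rhythm in extract_rhythms(eeg).values():
            for ch in rhythm:  # iterate over channels
                _, _, sxx = spectrogram(ch, fs=FS, nperseg=FS)
                feats.append(np.log(sxx.mean() + 1e-12))
        return np.array(feats)

    # Toy usage: random stand-in "EEG" trials with labels for the four
    # emotion classes (0=happy, 1=sad, 2=fear, 3=neutral).
    rng = np.random.default_rng(0)
    trials = rng.standard_normal((40, 8, 4 * FS))  # 40 trials, 8 ch, 4 s
    labels = np.arange(40) % 4                     # 10 trials per class
    X = np.stack([joint_tf_features(t) for t in trials])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(cross_val_score(clf, X, labels, cv=5).mean())

Replacing the filter bank with a true MVMD implementation would restore the decomposition step the paper actually uses; the rest of the pipeline keeps the same shape.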
Appears in Collections: Department of Electrical Engineering

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.