Please use this identifier to cite or link to this item:
https://dspace.iiti.ac.in/handle/123456789/14028
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Nalwaya, Aditya | en_US |
dc.contributor.author | Das, Kritiprasanna | en_US |
dc.contributor.author | Pachori, Ram Bilas | en_US |
dc.date.accessioned | 2024-07-18T13:48:20Z | - |
dc.date.available | 2024-07-18T13:48:20Z | - |
dc.date.issued | 2024 | - |
dc.identifier.citation | Nalwaya, A., Das, K., & Pachori, R. B. (2024). An automated framework for human emotion detection from multichannel EEG signals. IEEE Sensors Journal. Scopus. https://doi.org/10.1109/JSEN.2024.3398050 | en_US |
dc.identifier.issn | 1530-437X | - |
dc.identifier.other | EID(2-s2.0-85193543581) | - |
dc.identifier.uri | https://doi.org/10.1109/JSEN.2024.3398050 | - |
dc.identifier.uri | https://dspace.iiti.ac.in/handle/123456789/14028 | - |
dc.description.abstract | This paper presents a novel electroencephalogram (EEG) rhythm-based approach for emotion recognition. Recognizing multiple classes of emotion is a challenging task, and several earlier attempts have been made. The proposed work presents a simple and efficient framework for emotion recognition. Instead of using separate methods for signal quality enhancement and signal component extraction, the current study focuses on a single advanced signal processing method that addresses both issues. A joint time-frequency domain feature is proposed; it helps estimate the effect of emotion elicitation on the time-frequency distribution of each rhythm computed across all channels. Additionally, channel-wise separated EEG rhythm features are extracted, and these features are used to determine the emotional state with a machine learning model. In EEG, several oscillatory rhythms exist which reflect the brain's neural activity. The current study assesses changes in EEG rhythms due to audiovisual elicitation. Four classes of emotion, namely happy, sad, fear, and neutral, are studied in this paper. The subject-wise mean accuracy obtained is 95.91%. The proposed framework uses the multivariate variational mode decomposition (MVMD) method to separate the raw signal into various EEG rhythms. It has also been found that higher-frequency rhythms carry more emotion-related information than lower-frequency rhythms. A simple approach with good accuracy makes the proposed methodology significant. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | en_US |
dc.source | IEEE Sensors Journal | en_US |
dc.subject | EEG | en_US |
dc.subject | emotion recognition | en_US |
dc.subject | joint time-frequency analysis | en_US |
dc.subject | MVMD | en_US |
dc.subject | rhythms | en_US |
dc.title | An automated framework for human emotion detection from multichannel EEG signals | en_US |
dc.type | Journal Article | en_US |
Appears in Collections: | Department of Electrical Engineering |
Files in This Item:
There are no files associated with this item.
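The abstract above describes a three-stage pipeline: decompose the multichannel EEG into oscillatory rhythms, compute joint time-frequency features per rhythm and channel, and classify the emotional state with a machine learning model. The sketch below illustrates that pipeline under stated assumptions rather than reproducing the authors' code: the paper's MVMD step is replaced here by fixed Butterworth bandpass filters over the classical EEG bands (no standard SciPy MVMD routine is assumed), the log spectrogram energy feature is one plausible joint time-frequency feature rather than the paper's definition, and the RBF SVM, the 128 Hz sampling rate, and the `trials`/`labels` inputs are illustrative choices.

```python
import numpy as np
from scipy.signal import butter, filtfilt, spectrogram
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Classical EEG rhythm bands in Hz. The paper extracts rhythms with
# multivariate variational mode decomposition (MVMD), which adapts the
# bands to the data; fixed bandpass filters are substituted here only
# because no off-the-shelf SciPy MVMD implementation is assumed.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def extract_rhythms(eeg, fs):
    """Split an (n_channels, n_samples) EEG array into rhythm bands."""
    rhythms = []
    for lo, hi in BANDS.values():
        b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
        rhythms.append(filtfilt(b, a, eeg, axis=-1))
    return np.stack(rhythms)          # (n_bands, n_channels, n_samples)

def joint_tf_features(eeg, fs):
    """One plausible joint time-frequency feature: mean log spectrogram
    energy of each rhythm, per channel (not the paper's exact feature)."""
    feats = []
    for rhythm in extract_rhythms(eeg, fs):
        nperseg = min(256, rhythm.shape[-1])
        _, _, Sxx = spectrogram(rhythm, fs=fs, nperseg=nperseg, axis=-1)
        feats.append(np.log(Sxx + 1e-12).mean(axis=(1, 2)))  # one value per channel
    return np.concatenate(feats)      # length n_bands * n_channels

def evaluate(trials, labels, fs=128):
    """5-fold cross-validated accuracy for the four emotion classes
    (happy / sad / fear / neutral); the RBF SVM is an illustrative
    classifier choice, not taken from the paper."""
    X = np.stack([joint_tf_features(t, fs) for t in trials])
    return cross_val_score(SVC(kernel="rbf"), X, np.asarray(labels), cv=5).mean()
```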