Human Emotion Classification Based on EEG Using FFT Band Power and LSTM Classifier
DOI: https://doi.org/10.30983/knowbase.v5i2.10145
Keywords: EEG, Emotion, FFT, SEED, LSTM
Abstract
This study investigates human emotion recognition using electroencephalogram (EEG) signals, focusing on the Shanghai Jiao Tong University Emotion EEG Dataset (SEED), which consists of recordings from 62 EEG channels categorized into three emotion classes: positive, neutral, and negative. The main challenges in EEG-based emotion classification include the limited amount of available data and the nonlinear, non-stationary nature of EEG signals. To address these challenges, this study evaluates the effectiveness of Fast Fourier Transform (FFT) band power as input features and employs a stacked Long Short-Term Memory (LSTM) network as the classifier. Model validation was conducted using stratified 10-fold cross-validation, and performance was assessed using accuracy, F1-score, and Cohen's kappa. Experimental results show that the proposed method achieved an average accuracy of 89.87%, an F1-score of 90.10%, and a Cohen's kappa value of 0.848, with minimal variation across folds, demonstrating high model stability. Unlike many recent studies that rely on image-based representations or Generative Adversarial Network (GAN)-driven data augmentation, this study demonstrates that FFT band power combined with a sequential LSTM classifier can achieve strong performance without synthetic data generation or complex feature transformations. These findings indicate that the combination of FFT band power features and the LSTM classifier can serve as a solid baseline for further research.
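The core feature described in the abstract, FFT band power, can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: the band edges, the 200 Hz sampling rate (a common SEED preprocessing choice), and the use of a raw FFT periodogram rather than a windowed estimate are all assumptions here.

```python
import numpy as np

def fft_band_power(signal, fs, bands):
    """Per-band power summed from the FFT magnitude spectrum of one EEG channel."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)          # frequency axis for real FFT
    psd = np.abs(np.fft.rfft(signal)) ** 2 / n      # simple periodogram estimate
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in bands.items()}

# Conventional EEG band edges in Hz; the paper does not list its exact cut-offs.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 50)}

fs = 200                                # assumed sampling rate
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 10 * t)          # pure 10 Hz tone: power should fall in alpha
powers = fft_band_power(x, fs, BANDS)
print(max(powers, key=powers.get))      # alpha
```

In the full pipeline one such band-power vector would be computed per channel and per time window, and the resulting sequence of feature vectors fed to the stacked LSTM classifier.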
References
X. Li et al., “EEG Based Emotion Recognition: A Tutorial and Review,” ACM Comput. Surv., vol. 55, no. 4, pp. 1–57, Apr. 2023, doi: 10.1145/3524499.
C. Li, Z. Zhang, X. Zhang, G. Huang, Y. Liu, and X. Chen, “EEG-based Emotion Recognition via Transformer,” vol. 19, no. 4, pp. 1–10, 2022.
W. Tao et al., “EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention,” IEEE Trans. Affect. Comput., vol. 14, no. 1, pp. 382–393, Jan. 2023, doi: 10.1109/TAFFC.2020.3025777.
D. W. Prabowo, H. A. Nugroho, N. A. Setiawan, and J. Debayle, “A systematic literature review of emotion recognition using EEG signals,” Cogn. Syst. Res., vol. 82, p. 101152, 2023, doi: 10.1016/j.cogsys.2023.101152.
N. S. Suhaimi, J. Mountstephens, and J. Teo, “EEG-Based Emotion Recognition: A State-of-the-Art Review of Current Trends and Opportunities,” Comput. Intell. Neurosci., vol. 2020, pp. 1–19, Sep. 2020, doi: 10.1155/2020/8875426.
H. Liu et al., “EEG-Based Multimodal Emotion Recognition: A Machine Learning Perspective,” IEEE Trans. Instrum. Meas., vol. 73, pp. 1–29, 2024, doi: 10.1109/TIM.2024.3369130.
A. K. Singh and S. Krishnan, “Trends in EEG signal feature extraction applications,” Front. Artif. Intell., vol. 5, Jan. 2023, doi: 10.3389/frai.2022.1072801.
X. Geng, D. Li, H. Chen, P. Yu, H. Yan, and M. Yue, “An improved feature extraction algorithms of EEG signals based on motor imagery brain-computer interface,” Alexandria Eng. J., vol. 61, no. 6, pp. 4807–4820, Jun. 2022, doi: 10.1016/j.aej.2021.10.034.
I. Rakhmatulin, M.-S. Dao, A. Nassibi, and D. Mandic, “Exploring Convolutional Neural Network Architectures for EEG Feature Extraction,” Sensors, vol. 24, no. 3, p. 877, Jan. 2024, doi: 10.3390/s24030877.
S. Liang et al., “Domain-Generalized EEG Classification With Category-Oriented Feature Decorrelation and Cross-View Consistency Learning,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 31, pp. 3285–3296, 2023, doi: 10.1109/TNSRE.2023.3300961.
X.-C. Zhong et al., “EEG-DG: A Multi-Source Domain Generalization Framework for Motor Imagery EEG Classification,” IEEE J. Biomed. Heal. Informatics, vol. 29, no. 4, pp. 2484–2495, Apr. 2025, doi: 10.1109/JBHI.2024.3431230.
H. Song, Q. She, F. Fang, S. Liu, Y. Chen, and Y. Zhang, “Domain generalization through latent distribution exploration for motor imagery EEG classification,” Neurocomputing, vol. 614, p. 128889, Jan. 2025, doi: 10.1016/j.neucom.2024.128889.
H. Zhao, Q. Zheng, K. Ma, H. Li, and Y. Zheng, “Deep Representation-Based Domain Adaptation for Nonstationary EEG Classification,” IEEE Trans. Neural Networks Learn. Syst., vol. 32, no. 2, pp. 535–545, Feb. 2021, doi: 10.1109/TNNLS.2020.3010780.
P. O. de Paula, T. B. da Silva Costa, R. R. de Faissol Attux, and D. G. Fantinato, “Classification of image encoded SSVEP-based EEG signals using Convolutional Neural Networks,” Expert Syst. Appl., vol. 214, p. 119096, Mar. 2023, doi: 10.1016/j.eswa.2022.119096.
Z. Gao et al., “Complex networks and deep learning for EEG signal analysis,” Cogn. Neurodyn., vol. 15, no. 3, pp. 369–388, Jun. 2021, doi: 10.1007/s11571-020-09626-1.
X. Du et al., “Electroencephalographic Signal Data Augmentation Based on Improved Generative Adversarial Network,” Brain Sci., vol. 14, no. 4, p. 367, Apr. 2024, doi: 10.3390/brainsci14040367.
F. Fahimi, S. Dosen, K. K. Ang, N. Mrachacz-Kersting, and C. Guan, “Generative Adversarial Networks-Based Data Augmentation for Brain–Computer Interface,” IEEE Trans. Neural Networks Learn. Syst., vol. 32, no. 9, pp. 4039–4051, Sep. 2021, doi: 10.1109/TNNLS.2020.3016666.
G. Bao et al., “Data Augmentation for EEG-Based Emotion Recognition Using Generative Adversarial Networks,” Front. Comput. Neurosci., vol. 15, Dec. 2021, doi: 10.3389/fncom.2021.723843.
A. G. Habashi, A. M. Azab, S. Eldawlatly, and G. M. Aly, “Generative adversarial networks in EEG analysis: an overview,” J. Neuroeng. Rehabil., vol. 20, no. 1, p. 40, Apr. 2023, doi: 10.1186/s12984-023-01169-w.
M.-A. Li, J.-F. Han, and L.-J. Duan, “A Novel MI-EEG Imaging With the Location Information of Electrodes,” IEEE Access, vol. 8, pp. 3197–3211, 2020, doi: 10.1109/ACCESS.2019.2962740.
R. Alam, H. Zhao, A. Goodwin, O. Kavehei, and A. McEwan, “Differences in Power Spectral Densities and Phase Quantities Due to Processing of EEG Signals,” Sensors, vol. 20, no. 21, p. 6285, Nov. 2020, doi: 10.3390/s20216285.
Y. Zhang, G. Yan, W. Chang, W. Huang, and Y. Yuan, “EEG-based multi-frequency band functional connectivity analysis and the application of spatio-temporal features in emotion recognition,” Biomed. Signal Process. Control, vol. 79, p. 104157, Jan. 2023, doi: 10.1016/j.bspc.2022.104157.
K. P. Wagh and K. Vasanth, “Performance evaluation of multi-channel electroencephalogram signal (EEG) based time frequency analysis for human emotion recognition,” Biomed. Signal Process. Control, vol. 78, p. 103966, Sep. 2022, doi: 10.1016/j.bspc.2022.103966.
W.-L. Zheng and B.-L. Lu, “Investigating Critical Frequency Bands and Channels for EEG-Based Emotion Recognition with Deep Neural Networks,” IEEE Trans. Auton. Ment. Dev., vol. 7, no. 3, pp. 162–175, Sep. 2015, doi: 10.1109/TAMD.2015.2431497.
D. W. Prabowo et al., “Enhancing EEG-Based Emotion Recognition Using Asymmetric Windowing Recurrence Plots,” IEEE Access, vol. 12, pp. 85969–85982, 2024, doi: 10.1109/ACCESS.2024.3409384.
M. Roshanaei, H. Norouzi, J. Onton, S. Makeig, and A. Mohammadi, “EEG-based functional and effective connectivity patterns during emotional episodes using graph theoretical analysis,” Sci. Rep., vol. 15, no. 1, p. 2174, Jan. 2025, doi: 10.1038/s41598-025-86040-9.
S. Koelstra et al., “DEAP: A Database for Emotion Analysis Using Physiological Signals,” IEEE Trans. Affect. Comput., vol. 3, no. 1, pp. 18–31, Jan. 2012, doi: 10.1109/T-AFFC.2011.15.
M. Klug and N. A. Kloosterman, “Zapline‐plus: A Zapline extension for automatic and adaptive removal of frequency‐specific noise artifacts in M/EEG,” Hum. Brain Mapp., vol. 43, no. 9, pp. 2743–2758, Jun. 2022, doi: 10.1002/hbm.25832.
M. Tröndle, T. Popov, A. Pedroni, C. Pfeiffer, Z. Barańczuk-Turska, and N. Langer, “Decomposing age effects in EEG alpha power,” Cortex, vol. 161, pp. 116–144, Apr. 2023, doi: 10.1016/j.cortex.2023.02.002.
N. Avital, N. Shulkin, and D. Malka, “Automatic Calculation of Average Power in Electroencephalography Signals for Enhanced Detection of Brain Activity and Behavioral Patterns,” Biosensors, vol. 15, no. 5, p. 314, May 2025, doi: 10.3390/bios15050314.
E. Tan et al., “Theta activity and cognitive functioning: Integrating evidence from resting-state and task-related developmental electroencephalography (EEG) research,” Dev. Cogn. Neurosci., vol. 67, p. 101404, Jun. 2024, doi: 10.1016/j.dcn.2024.101404.
G.-H. Shin, Y.-S. Kweon, M. Lee, K.-Y. Jung, and S.-W. Lee, “Quantifying Sleep Quality Through Delta-Beta Coupling Across Sleep and Wakefulness,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 33, pp. 1905–1915, 2025, doi: 10.1109/TNSRE.2025.3569283.
S. Chikhi, N. Matton, and S. Blanchet, “EEG power spectral measures of cognitive workload: A meta‐analysis,” Psychophysiology, vol. 59, no. 6, Jun. 2022, doi: 10.1111/psyp.14009.
S. Sanei and J. A. Chambers, EEG Signal Processing. West Sussex, England: John Wiley & Sons Ltd, 2007. doi: 10.1002/9780470511923.
X. Du et al., “An Efficient LSTM Network for Emotion Recognition From Multichannel EEG Signals,” IEEE Trans. Affect. Comput., vol. 13, no. 3, pp. 1528–1540, Jul. 2022, doi: 10.1109/TAFFC.2020.3013711.
A. Samavat, E. Khalili, B. Ayati, and M. Ayati, “Deep Learning Model With Adaptive Regularization for EEG-Based Emotion Recognition Using Temporal and Frequency Features,” IEEE Access, vol. 10, pp. 24520–24527, 2022, doi: 10.1109/ACCESS.2022.3155647.
J. Li, S. Li, J. Pan, and F. Wang, “Cross-Subject EEG Emotion Recognition With Self-Organized Graph Neural Network,” Front. Neurosci., vol. 15, pp. 1–10, Jun. 2021, doi: 10.3389/fnins.2021.611653.
L. Farokhah, R. Sarno, and C. Fatichah, “Cross-Subject Channel Selection Using Modified Relief and Simplified CNN-Based Deep Learning for EEG-Based Emotion Recognition,” IEEE Access, vol. 11, pp. 110136–110150, 2023, doi: 10.1109/ACCESS.2023.3322294.
M. Aslan, “CNN based efficient approach for emotion recognition,” J. King Saud Univ. - Comput. Inf. Sci., Aug. 2021, doi: 10.1016/j.jksuci.2021.08.021.
J. R. Landis and G. G. Koch, “The Measurement of Observer Agreement for Categorical Data,” Biometrics, vol. 33, no. 1, p. 159, Mar. 1977, doi: 10.2307/2529310.
C. Wei, L. Chen, Z. Song, X. Lou, and D. Li, “EEG-based emotion recognition using simple recurrent units network and ensemble learning,” Biomed. Signal Process. Control, vol. 58, p. 101756, Apr. 2020, doi: 10.1016/j.bspc.2019.101756.
M. Khateeb, S. M. Anwar, and M. Alnowami, “Multi-Domain Feature Fusion for Emotion Classification Using DEAP Dataset,” IEEE Access, vol. 9, pp. 12134–12142, 2021, doi: 10.1109/ACCESS.2021.3051281.
D. W. Prabowo, N. A. Setiawan, J. Debayle, and H. A. Nugroho, “Multi-Representation Convolutional Neural Network to Recognize Human Emotion,” in 2023 8th International Conference on Information Technology and Digital Applications (ICITDA), Nov. 2023, pp. 1–6. doi: 10.1109/ICITDA60835.2023.10427131.
N. A. Setiawan, D. W. Prabowo, and H. A. Nugroho, “Benchmarking of feature selection techniques for coronary artery disease diagnosis,” in 2014 6th International Conference on Information Technology and Electrical Engineering (ICITEE), 2014, pp. 1–5, doi: 10.1109/ICITEED.2014.7007898.
F. Wang, S. Zhong, J. Peng, J. Jiang, and Y. Liu, “Data Augmentation for EEG-Based Emotion Recognition with Deep Convolutional Neural Networks,” in Lecture Notes in Computer Science, vol. 10705 LNCS, Shenzhen: Springer, Cham, 2018, pp. 82–93. doi: 10.1007/978-3-319-73600-6_8.
Y. Luo and B.-L. Lu, “EEG Data Augmentation for Emotion Recognition Using a Conditional Wasserstein GAN,” in 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jul. 2018, vol. 2018-July, pp. 2535–2538. doi: 10.1109/EMBC.2018.8512865.
Y. Luo, L.-Z. Zhu, and B.-L. Lu, “A GAN-Based Data Augmentation Method for Multimodal Emotion Recognition,” in Advances in Neural Networks – ISNN 2019. ISNN 2019. Lecture Notes in Computer Science, vol. 11554, Shanghai: Springer, Cham, 2019, pp. 141–150. doi: 10.1007/978-3-030-22796-8_16.
A. Zhang, L. Su, Y. Zhang, Y. Fu, L. Wu, and S. Liang, “EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN,” Complex Intell. Syst., vol. 8, no. 4, pp. 3059–3071, Aug. 2022, doi: 10.1007/s40747-021-00336-7.
Copyright (c) 2025 Dwi Wahyu Prabowo

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.