Automated Emotion Recognition Using Hybrid CNN-RNN Models on Multimodal Physiological Signals

https://doi.org/10.61710/kjcs.v3i2.100

Authors

  • AMMAR AZEEZ, Ministry of Electricity - General Directorate of Electrical Energy Transmission

Keywords:

Emotion Recognition, Hybrid Models, CNN-RNN, Multimodal, Physiological Signals

Abstract

Emotion recognition has emerged as a cornerstone of human-computer interaction, opening new frontiers in healthcare, education, and entertainment. Automating emotion recognition with hybrid Convolutional Neural Network-Recurrent Neural Network (CNN-RNN) models offers a promising avenue for decoding complex emotional states. This study develops an approach that integrates electrocardiogram (ECG), galvanic skin response (GSR), and facial-expression data for accurate and efficient emotion recognition. The hybrid architecture combines the strengths of CNNs in spatial feature extraction with those of RNNs in modeling temporal dependencies, addressing the challenges inherent in multimodal data. Extensive experiments on publicly available benchmark datasets show that the proposed hybrid model outperforms unimodal and traditional methods in both classification accuracy and robustness. The study not only demonstrates the potential of hybrid models for advancing emotion recognition but also provides a scalable framework adaptable to real-world applications such as mental health monitoring and adaptive learning systems. The results underline how deep learning techniques can bridge the gap between subjective emotional experiences and objective computational analysis.
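
The architecture described above can be illustrated with a short sketch. The PyTorch code below is a minimal, hypothetical instance of the general design: each physiological signal passes through its own 1-D CNN branch for spatial feature extraction, the branch outputs are fused by concatenation, and an LSTM models temporal dependencies before a linear layer predicts the emotion class. All layer sizes, the LSTM choice, the four-class output, and the two-modality setup (a facial-expression branch would typically add a 2-D CNN) are assumptions for illustration, not the paper's actual configuration.

    import torch
    import torch.nn as nn

    class HybridCnnRnn(nn.Module):
        """Illustrative hybrid CNN-RNN for multimodal emotion recognition.

        All hyperparameters are assumptions; the paper does not specify
        its exact architecture.
        """
        def __init__(self, n_classes=4):
            super().__init__()
            # 1-D CNN branches extract per-window spatial features
            # from each physiological signal.
            self.ecg_cnn = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.MaxPool1d(4),
            )
            self.gsr_cnn = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.MaxPool1d(4),
            )
            # An LSTM models temporal dependencies over the fused features.
            self.rnn = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
            self.classifier = nn.Linear(64, n_classes)

        def forward(self, ecg, gsr):
            # ecg, gsr: (batch, channels=1, time)
            f_ecg = self.ecg_cnn(ecg)                 # (batch, 16, time/4)
            f_gsr = self.gsr_cnn(gsr)                 # (batch, 16, time/4)
            fused = torch.cat([f_ecg, f_gsr], dim=1)  # feature-level fusion
            fused = fused.transpose(1, 2)             # (batch, time/4, 32)
            out, _ = self.rnn(fused)
            return self.classifier(out[:, -1])        # last time step -> logits

    model = HybridCnnRnn()
    logits = model(torch.randn(8, 1, 256), torch.randn(8, 1, 256))
    print(logits.shape)  # torch.Size([8, 4])

Feature-level concatenation is only one possible fusion strategy; decision-level fusion, where each modality gets its own classifier and the predictions are combined, is a common alternative for multimodal signals.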

Published

2025-06-25

How to Cite

AZEEZ, A. (2025). Automated Emotion Recognition Using Hybrid CNN-RNN Models on Multimodal Physiological Signals. AlKadhim Journal for Computer Science, 3(2), 20–29. https://doi.org/10.61710/kjcs.v3i2.100