Paper: Deep Emotion Recognition through Upper Body Movements and Facial Expression

Authors: Chaudhary Muhammad Aqdus Ilyas 1,2; Rita Nunes 3; Kamal Nasrollahi 2; Matthias Rehm 1 and Thomas B. Moeslund 2

Affiliations: 1 Human-Robot Interaction Lab, Aalborg University, Aalborg, Denmark; 2 Visual Analysis of People Lab, Aalborg University, Aalborg, Denmark; 3 Department of Electronic Systems, Aalborg University, Aalborg, Denmark

Keyword(s): Emotion Recognition, Facial Expressions, Body Movements, Deep Learning, Convolutional Neural Networks.

Abstract: Despite significant recent advances in human emotion recognition, exploiting upper body movements alongside facial expressions remains a severe challenge in human-robot interaction. This article presents a model that learns emotions from upper body movements and their correspondence with facial expressions. Once this correspondence is mapped, emotion and gesture recognition can be performed from facial features and movement vectors. Our method uses a deep convolutional neural network trained on benchmark datasets exhibiting various emotions and the corresponding body movements. Features obtained from facial movements and body motion are fused to improve emotion recognition performance, and we implement several fusion methodologies to integrate the multimodal features for non-verbal emotion identification. Our system achieves 76.8% emotion recognition accuracy from upper body movements alone, surpassing the previous 73.1% on the FABO dataset. In addition, multimodal compact bilinear pooling with temporal information surpasses the state-of-the-art method with an accuracy of 94.41% on the FABO dataset. Such a system can lead to better human-machine interaction by enabling robots to recognize emotions and body actions and react according to the user's emotional state, thus enriching the user experience.
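
The abstract's best-performing fusion, multimodal compact bilinear (MCB) pooling, approximates the outer product of two feature vectors by circularly convolving their Count Sketch projections. The sketch below is a rough illustration only, not the authors' implementation: the feature dimensions, the 1024-d output size, and the signed-square-root normalisation step are assumptions, and the paper's temporal modelling is omitted.

```python
import numpy as np

def count_sketch(x, h, s, d):
    """Count Sketch projection of x into R^d: input index i is hashed
    to bucket h[i] and multiplied by the random sign s[i]."""
    sketch = np.zeros(d)
    np.add.at(sketch, h, s * x)
    return sketch

def mcb_pool(face_feat, body_feat, d=1024, seed=0):
    """Multimodal compact bilinear pooling of a face feature vector and
    a body-motion feature vector. The outer product of the two modalities
    is approximated by the circular convolution (via FFT) of their
    Count Sketches."""
    rng = np.random.default_rng(seed)
    h1 = rng.integers(0, d, size=face_feat.shape[0])
    s1 = rng.choice([-1.0, 1.0], size=face_feat.shape[0])
    h2 = rng.integers(0, d, size=body_feat.shape[0])
    s2 = rng.choice([-1.0, 1.0], size=body_feat.shape[0])

    sk_face = count_sketch(face_feat, h1, s1, d)
    sk_body = count_sketch(body_feat, h2, s2, d)

    # Circular convolution computed in the frequency domain.
    fused = np.fft.irfft(np.fft.rfft(sk_face) * np.fft.rfft(sk_body), n=d)

    # Signed square-root and L2 normalisation, a common post-processing
    # step after bilinear pooling, before a softmax emotion classifier.
    fused = np.sign(fused) * np.sqrt(np.abs(fused))
    return fused / (np.linalg.norm(fused) + 1e-12)

# Example with hypothetical CNN features: a 512-d face descriptor and a
# 256-d upper-body motion descriptor fused into one 1024-d vector.
face_feat = np.random.rand(512)
body_feat = np.random.rand(256)
emotion_descriptor = mcb_pool(face_feat, body_feat)
print(emotion_descriptor.shape)  # (1024,)
```

In practice the fused descriptor would be fed to the final classification layers; per-frame descriptors could also be stacked over time to supply the temporal information the abstract refers to.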

License: CC BY-NC-ND 4.0

Paper citation in several formats:
Ilyas, C.; Nunes, R.; Nasrollahi, K.; Rehm, M. and Moeslund, T. (2021). Deep Emotion Recognition through Upper Body Movements and Facial Expression. In Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 5: VISAPP; ISBN 978-989-758-488-6; ISSN 2184-4321, SciTePress, pages 669-679. DOI: 10.5220/0010359506690679

@conference{visapp21,
author={Chaudhary Muhammad Aqdus Ilyas and Rita Nunes and Kamal Nasrollahi and Matthias Rehm and Thomas B. Moeslund},
title={Deep Emotion Recognition through Upper Body Movements and Facial Expression},
booktitle={Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 5: VISAPP},
year={2021},
pages={669-679},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010359506690679},
isbn={978-989-758-488-6},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 5: VISAPP
TI - Deep Emotion Recognition through Upper Body Movements and Facial Expression
SN - 978-989-758-488-6
IS - 2184-4321
AU - Ilyas, C.
AU - Nunes, R.
AU - Nasrollahi, K.
AU - Rehm, M.
AU - Moeslund, T.
PY - 2021
SP - 669
EP - 679
DO - 10.5220/0010359506690679
PB - SciTePress