Synch-Graph: multisensory emotion recognition through neural synchrony via graph convolutional networks

Esma Mansouri Benssassi, Juan Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Human emotions are essentially multisensory: emotional states are conveyed through multiple modalities such as facial expression, body language, and non-verbal and verbal signals. Multimodal or multisensory learning is therefore crucial for recognising emotions and interpreting social signals. Existing multisensory emotion recognition approaches focus on extracting features from each modality while ignoring the importance of constant interaction and co-learning between modalities. In this paper, we present Synch-Graph, a novel bio-inspired approach based on neural synchrony in audio-visual multisensory integration in the brain. We model multisensory interaction using spiking neural networks (SNN) and explore the use of Graph Convolutional Networks (GCN) to represent and learn neural synchrony patterns. We hypothesise that modelling interactions between modalities will improve the accuracy of emotion recognition. We have evaluated Synch-Graph on two state-of-the-art datasets and achieved overall accuracies of 98.3% and 96.82%, which are significantly higher than those of existing techniques.
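To make the GCN step described in the abstract concrete, the sketch below applies one standard graph-convolution layer (the Kipf and Welling propagation rule) to a toy synchrony graph, where nodes stand in for neuron groups from the audio and visual SNN maps and edge weights stand in for pairwise spike synchrony. The node count, feature dimensions, and random values are illustrative assumptions, not the paper's actual architecture.

```python
# Minimal sketch (not the authors' implementation): one graph-convolution layer
# over a hypothetical "synchrony graph". All sizes and values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_nodes, in_dim, out_dim = 6, 8, 4          # hypothetical graph and feature sizes
A = rng.random((n_nodes, n_nodes))          # pairwise synchrony scores in [0, 1]
A = (A + A.T) / 2                           # synchrony is symmetric
np.fill_diagonal(A, 0.0)

H = rng.standard_normal((n_nodes, in_dim))  # per-node features (e.g. firing statistics)
W = rng.standard_normal((in_dim, out_dim))  # learnable layer weights

# Standard GCN rule: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)
A_hat = A + np.eye(n_nodes)                 # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
H_next = np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

print(H_next.shape)                         # (6, 4): updated node embeddings
```

In the full model, stacked layers of this kind would produce graph-level embeddings that a classifier maps to emotion labels; the weights here are random only to keep the sketch self-contained.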
Original language: English
Title of host publication: Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-20)
Publisher: AAAI Press
Pages: 1351-1358
Number of pages: 8
ISBN (Print): 9781577358350
DOIs
Publication status: Published - 3 Apr 2020
Event: Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20) - New York, United States
Duration: 7 Feb 2020 - 12 Feb 2020
Conference number: 34
https://aaai.org/Conferences/AAAI-20/aaai20call/

Publication series

Name: Proceedings of the AAAI Conference on Artificial Intelligence
Publisher: AAAI
ISSN (Print): 2159-5399
ISSN (Electronic): 2374-3468

Conference

Conference: Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)
Abbreviated title: AAAI-20
Country/Territory: United States
City: New York
Period: 7/02/20 - 12/02/20
Internet address
