University of Leicester

Learning Person-Specific Cognition From Facial Reactions for Automatic Personality Recognition

journal contribution
posted on 2024-01-04, 12:50 authored by Siyang Song, Zilong Shao, Shashank Jaiswal, Linlin Shen, Michel Valstar, Hatice Gunes

This article proposes to recognise the true (self-reported) personality traits of a target subject from their cognition, simulated from facial reactions. This approach builds on two findings in cognitive science: (i) human cognition partially determines expressed behaviour and is directly linked to true personality traits; and (ii) in dyadic interactions, individuals' nonverbal behaviours are influenced by their conversational partner's behaviours. In this context, we hypothesise that during a dyadic interaction, a target subject's facial reactions are driven by two main factors: their internal (person-specific) cognitive processes, and the externalised nonverbal behaviours of their conversational partner. Consequently, we propose to represent the target subject's (defined as the listener) person-specific cognition as a person-specific CNN architecture with unique architectural parameters and depth, which takes the audio-visual nonverbal cues displayed by the conversational partner (defined as the speaker) as input and reproduces the target subject's facial reactions. Each person-specific CNN is found via Neural Architecture Search (NAS) with a novel adaptive loss function, and is then encoded as a graph representation for recognising the target subject's true personality. Experimental results show that the produced graph representations are strongly associated with target subjects' personality traits in both human-human and human-machine interaction scenarios and significantly outperform existing approaches, and also demonstrate that the proposed novel strategies help in learning more reliable personality representations.
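The core idea of the abstract can be illustrated with a toy sketch. This is an assumption-laden simplification in plain NumPy, not the authors' NAS-searched CNNs or their graph encoding: each listener is modelled by a person-specific architecture (here just a list of hidden-layer widths) that maps speaker cues to a simulated facial reaction, and the architecture description itself, rather than the learned weights, is used as the personality feature.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_person_model(arch, in_dim, out_dim, rng):
    """Build weight matrices for a person-specific MLP whose depth and
    layer widths come from `arch` (a list of hidden-layer widths).
    Stands in for the paper's person-specific CNN; hypothetical helper."""
    dims = [in_dim] + list(arch) + [out_dim]
    return [rng.standard_normal((dims[i], dims[i + 1])) * 0.1
            for i in range(len(dims) - 1)]

def forward(weights, x):
    """Map speaker nonverbal cues x to a simulated listener reaction."""
    h = x
    for w in weights[:-1]:
        h = np.maximum(h @ w, 0.0)   # ReLU hidden layers
    return h @ weights[-1]

def arch_representation(arch, max_depth=4):
    """Encode an architecture as a fixed-length feature vector
    (zero-padded widths plus depth). The paper instead uses a graph
    representation; this vector is a deliberately crude stand-in."""
    widths = list(arch) + [0] * (max_depth - len(arch))
    return np.array(widths + [len(arch)], dtype=float)

# Toy example: two listeners with different "searched" architectures.
arch_a, arch_b = [16, 8], [32]
model_a = build_person_model(arch_a, in_dim=10, out_dim=5, rng=rng)
speaker_cues = rng.standard_normal(10)       # speaker's audio-visual cues
reaction = forward(model_a, speaker_cues)    # simulated facial reaction

rep_a = arch_representation(arch_a)          # listener A's personality feature
rep_b = arch_representation(arch_b)          # listener B's personality feature
```

In the actual method, the searched architecture is trained to reproduce the listener's real facial reactions before its structure is read out as a personality representation; the sketch only shows why distinct per-person architectures yield distinct, comparable features.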

Funding

Adaptive Robotic EQ for Well-being (ARoEQ)

Engineering and Physical Sciences Research Council


European Union's Horizon 2020 research and innovation programme (Grant Number: 82623)

National Natural Science Foundation of China (Grant Number: 82261138629)

Guangdong Basic and Applied Basic Research Foundation

Shenzhen Municipal Science and Technology Innovation Council (Grant Number: JCYJ20220531101412030)

History

Author affiliation

School of Computing and Mathematical Sciences, University of Leicester

Version

  • AM (Accepted Manuscript)

Published in

IEEE Transactions on Affective Computing

Volume

14

Issue

4

Pagination

3048 - 3065

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

eISSN

1949-3045

Copyright date

2022

Available date

2024-01-04

Language

en
