Learning Person-Specific Cognition From Facial Reactions for Automatic Personality Recognition
This article proposes to recognise true (self-reported) personality traits from the target subject's cognition, simulated from their facial reactions. The approach builds on two findings in cognitive science: (i) human cognition partially determines expressed behaviour and is directly linked to true personality traits; and (ii) in dyadic interactions, an individual's nonverbal behaviours are influenced by their conversational partner's behaviours. In this context, we hypothesise that during a dyadic interaction, a target subject's facial reactions are driven by two main factors: their internal (person-specific) cognitive processes and the externalised nonverbal behaviours of their conversational partner. Consequently, we propose to represent the target subject's (defined as the listener) person-specific cognition as a person-specific CNN architecture with unique architectural parameters and depth, which takes the audio-visual nonverbal cues displayed by the conversational partner (defined as the speaker) as input and is able to reproduce the target subject's facial reactions. Each person-specific CNN is discovered via Neural Architecture Search (NAS) guided by a novel adaptive loss function, and is then encoded as a graph representation for recognising the target subject's true personality. Experimental results show that the produced graph representations are well associated with target subjects' personality traits in both human-human and human-machine interaction scenarios and outperform existing approaches by clear margins; they also demonstrate that the proposed novel strategies help to learn more reliable personality representations.
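As an illustration of the pipeline outlined in the abstract, the following minimal PyTorch sketch shows one way its three stages could fit together: a person-specific CNN that maps speaker cues to listener facial reactions, a graph encoding of that CNN's architecture, and a head that predicts personality traits from the graph encoding. This is not the authors' implementation: all class and function names, the fixed feature dimensions, the chain-graph adjacency encoding, and the MLP personality head are assumptions made for brevity, and the NAS procedure and adaptive loss are omitted.

```python
# Illustrative sketch only, NOT the authors' implementation. Names,
# dimensions, the chain-graph encoding, and the MLP head are assumptions.
import torch
import torch.nn as nn


class PersonSpecificCNN(nn.Module):
    """Listener-specific 1-D CNN whose depth and channel widths would come
    from an architecture search; maps the speaker's audio-visual feature
    sequence to the listener's predicted facial-reaction sequence."""

    def __init__(self, in_dim, out_dim, channels):
        super().__init__()
        layers, prev = [], in_dim
        for c in channels:                      # searched depth = len(channels)
            layers += [nn.Conv1d(prev, c, kernel_size=3, padding=1), nn.ReLU()]
            prev = c
        layers.append(nn.Conv1d(prev, out_dim, kernel_size=1))
        self.net = nn.Sequential(*layers)

    def forward(self, x):                       # x: (batch, in_dim, time)
        return self.net(x)                      # (batch, out_dim, time)


def architecture_graph(channels, max_layers=8):
    """Encode the searched architecture as a chain-graph adjacency matrix
    plus per-node channel widths, zero-padded to a fixed size."""
    n = len(channels)
    adj = torch.zeros(max_layers, max_layers)
    for i in range(min(n, max_layers) - 1):
        adj[i, i + 1] = 1.0                     # layer i feeds layer i + 1
    feats = torch.zeros(max_layers)
    feats[:n] = torch.tensor(channels, dtype=torch.float32)
    return torch.cat([adj.flatten(), feats])    # fixed-length graph vector


class PersonalityHead(nn.Module):
    """Regresses the Big-Five traits from the flattened graph encoding."""

    def __init__(self, graph_dim, n_traits=5):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(graph_dim, 64), nn.ReLU(),
                                 nn.Linear(64, n_traits))

    def forward(self, g):
        return self.mlp(g)


if __name__ == "__main__":
    channels = [32, 64, 32]                     # stand-in for a NAS result
    cnn = PersonSpecificCNN(in_dim=40, out_dim=17, channels=channels)
    speaker = torch.randn(1, 40, 100)           # dummy speaker cue sequence
    reactions = cnn(speaker)                    # predicted listener reactions

    g = architecture_graph(channels)
    head = PersonalityHead(graph_dim=g.numel())
    traits = head(g.unsqueeze(0))               # (1, 5) trait predictions
    print(reactions.shape, traits.shape)
```

In the method described above, a searched, person-specific architecture would replace the hand-fixed `channels` list, and a learned graph model would replace the flattened-adjacency MLP used here for simplicity.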
Funding
Adaptive Robotic EQ for Well-being (ARoEQ)
Engineering and Physical Sciences Research Council
European Union's Horizon 2020 research and innovation programme (Grant Number: 82623)
National Natural Science Foundation of China (Grant Number: 82261138629)
Guangdong Basic and Applied Basic Research Foundation
Shenzhen Municipal Science and Technology Innovation Council (Grant Number: JCYJ20220531101412030)
Author affiliation
School of Computing and Mathematical Sciences, University of Leicester
Version
- AM (Accepted Manuscript)