Date: April 2024.
Source: arXiv:2404.07867v1 [cs.CV]. https://doi.org/10.48550/arXiv.2404.07867.
Abstract: Facial expression-based human emotion recognition is a critical research area in psychology and medicine. State-of-the-art classification performance is reached only by end-to-end trained neural networks. However, such black-box models lack transparency in their decision-making processes, prompting efforts to ascertain the rules that underlie classifiers’ decisions. Analyzing single inputs alone fails to expose systematic learned biases. These biases can be characterized through facial properties that summarize abstract information such as age or medical conditions. Understanding a model’s prediction behavior therefore requires an analysis rooted in causality along such selected properties. We demonstrate that up to 91.25% of classifier output behavior changes are statistically significant with respect to basic properties, among them age, gender, and facial symmetry. Furthermore, the medical use of surface electromyography significantly influences emotion prediction. We introduce a workflow to evaluate explicit properties and their impact. These insights might help medical professionals select and apply classifiers with respect to their specialized data and its properties.

Article: The Power of Properties: Uncovering the Influential Factors in Emotion Classification.
Authors: Tim Büchner, Niklas Penzel, Orlando Guntinas-Lichius, Joachim Denzler, Friedrich Schiller University Jena, Jena, Germany.
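
The kind of property-wise significance analysis the abstract describes can be illustrated with a paired test on classifier outputs before and after a single property is altered on the same faces. The sketch below is hypothetical and not the authors' actual workflow: the model object, its predict_proba interface, and the edit_property function are placeholder assumptions, and a Wilcoxon signed-rank test stands in for whichever statistical test the paper employs.

    import numpy as np
    from scipy.stats import wilcoxon

    def property_impact_test(model, images, edit_property, alpha=0.05):
        """Paired test of per-class probability shifts when one facial
        property (e.g., simulated sEMG electrodes) is edited onto the
        same images. Returns {class: (statistic, p_value, significant)}."""
        p_before = model.predict_proba(images)                 # (n, classes)
        p_after = model.predict_proba(edit_property(images))   # (n, classes)
        results = {}
        for c in range(p_before.shape[1]):
            diff = p_after[:, c] - p_before[:, c]
            if np.allclose(diff, 0):  # wilcoxon rejects all-zero differences
                results[c] = (0.0, 1.0, False)
                continue
            stat, p_value = wilcoxon(p_before[:, c], p_after[:, c])
            results[c] = (stat, p_value, p_value < alpha)
        return results

    if __name__ == "__main__":
        # Toy demonstration with a dummy model and a crude "property edit"
        # (a constant feature shift); real inputs would be face images.
        rng = np.random.default_rng(0)

        class DummyModel:
            def predict_proba(self, x):
                logits = np.stack([x.mean(axis=1)] * 3, axis=1)
                logits += rng.normal(0, 0.1, logits.shape)
                e = np.exp(logits)
                return e / e.sum(axis=1, keepdims=True)

        imgs = rng.random((50, 8))
        print(property_impact_test(DummyModel(), imgs, lambda x: x + 0.2))

A paired design is used here because the edit is applied to the same inputs, so per-image output differences, rather than pooled group statistics, carry the signal about the property's influence.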