Paper
13 November 2024 Classifying emotions via analysis of facial physiological response without relying on expressions
Abstract
Assessing a person’s emotional state may be relevant to security in situations where it is beneficial to evaluate one’s intentions or mental state. In various situations, facial expressions, which often indicate emotions, may not be communicated or may not correspond to the actual emotional state. Here we review our study, in which we classify emotional states from very short facial video signals. The emotion classification process relies neither on stereotypical facial expressions nor on contact-based methods. Our raw data are short facial videos recorded under several different known emotional states. A facial video includes a component of light diffusely reflected from the facial skin, which is modulated by cardiovascular activity and may in turn be influenced by the emotional state. From the short facial videos, we extracted unique spatiotemporal physiology-affected features, which served as input to a deep-learning model. Results show an average emotion classification accuracy of 47.36%, compared with a 20% chance level for five emotion classes, which can be considered high for cases where expressions are hardly observed.
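As a rough illustration of the kind of preprocessing the abstract describes, the sketch below averages the green channel (the channel most sensitive to blood-volume changes in remote photoplethysmography) over a grid of facial regions, producing one temporal signal per region. This is a hedged, minimal stand-in, not the authors' actual pipeline: the grid size, the green-channel choice, and the `spatiotemporal_green_signal` function are illustrative assumptions.

```python
import numpy as np

def spatiotemporal_green_signal(frames, grid=(4, 4)):
    """Illustrative sketch (not the paper's method): per-region
    green-channel means over time, a crude spatiotemporal
    physiology-related feature map.

    frames: float array of shape (T, H, W, 3), values in [0, 1].
    Returns: array of shape (T, grid[0] * grid[1]), one temporal
    signal per facial region, suitable as input to a learned model.
    """
    T, H, W, _ = frames.shape
    gh, gw = grid
    signals = np.zeros((T, gh * gw))
    for i in range(gh):
        for j in range(gw):
            # Green channel (index 1) of one spatial patch, all frames.
            patch = frames[:, i * H // gh:(i + 1) * H // gh,
                              j * W // gw:(j + 1) * W // gw, 1]
            signals[:, i * gw + j] = patch.mean(axis=(1, 2))
    # Remove the per-region DC level so only temporal variation remains.
    return signals - signals.mean(axis=0, keepdims=True)

# Example: 30 synthetic frames of an 16x16 "face" with a weak
# sinusoidal green modulation standing in for the pulse signal.
t = np.arange(30)
frames = np.full((30, 16, 16, 3), 0.5)
frames[..., 1] += 0.01 * np.sin(2 * np.pi * t / 15)[:, None, None]
features = spatiotemporal_green_signal(frames, grid=(2, 2))
print(features.shape)  # (30, 4)
```

In a real system these per-region signals would be sampled from skin pixels only (after face detection and skin segmentation) before being fed to the classifier.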
© (2024) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yitzhak Yitzhaky, Shaul Shvimmer, Shlomi Talala, Rotem Simhon, and Michael Gilad "Classifying emotions via analysis of facial physiological response without relying on expressions", Proc. SPIE 13206, Artificial Intelligence for Security and Defence Applications II, 132060I (13 November 2024); https://doi.org/10.1117/12.3034867
KEYWORDS: Emotion, Video, Convolution, Cameras, Deep learning, Signal detection, Skin