Automatic Facial Expression Analysis as a Measure of User-Designer Empathy
A1 Original article in a scientific journal
Journal of Mechanical Design, Volume 145, Issue 3
Abstract
In human-centered product design and development, understanding the users is essential. Empathizing with the user can help designers gain deeper insights into the user experience and user needs. However, few studies have captured empathy in real time during user interactions. Accordingly, the degree to which empathy occurs and enhances user understanding remains unclear. To narrow this gap, a study was performed exploring the use of video-based facial expression analysis during user interviews as a means to capture empathy related to understanding vehicle driving experiences under challenging conditions. Mimicry and synchrony have been shown to be predictors of empathy in cognitive psychology. In this study, we adapted this method to study 46 user-designer interviews. The results show that the users and designers exhibited mimicry in their facial expressions, which indicates that affective empathy can be captured via simple video-based facial recognition. However, we found that a user's facial expressions might not represent their actual emotional tone, which can mislead the designer into false empathy. Further, we did not find a link between the observed mimicry of facial expressions and the understanding of mental contents, which indicates that the affective and some cognitive components of user empathy may not be directly connected. Further studies are needed to understand how facial expression analysis can be used to study and advance empathic design.
Keywords: Design methodology, Design theory and methodology, Empathy, Facial expression analysis, Product design, Product development, User understanding, User-centered design
Salmi, A., Li, J. & Holtta-Otto, K. 2023, 'Automatic Facial Expression Analysis as a Measure of User-Designer Empathy', Journal of Mechanical Design, vol. 145, no. 3, 031403. https://doi.org/10.1115/1.4056494