Faces in the Wild: Understanding Real-World Communication of Emotions

The recognition that someone else is experiencing a specific emotion is a fundamental aspect of human cognition. This seemingly straightforward cognitive act guides the unfolding of simple social interactions, such as the decision to approach or avoid another person, as well as more complex forms of cooperation and joint action. How do we know that someone else is experiencing an emotion? What is the nature of the signal? Traditionally, scientists and lay people alike have believed that emotions are conveyed, perhaps universally, by the contraction of specific facial muscles that together comprise what are usually called facial expressions of emotion. The belief in specific facial expressions of emotion is foundational to research in scientific disciplines spanning affective and cognitive science to computer vision and artificial intelligence. Recently, however, some theorists have claimed that the high variability in the form of facial expressions, together with the fact that their interpretation can be influenced by context, indicates that reliable and generalizable facial expressions of internal states of emotion do not exist.

Both the traditional view and its critics, however, rely on conclusions drawn from decades of research that tended to involve (1) very small datasets containing (2) still images of (3) posed (i.e., not spontaneous) facial expressions (4) made in the absence of a perceiver or other audience and (5) displayed by White actors. The proposed research responds to these limitations by investigating the types of facial expressions that occur in the real world, displayed by diverse populations, and how they are recognized by perceivers. The research steps include (1) creation of an initial dataset of facial expressions displayed in emotion-eliciting dyadic tasks, (2) innovative modeling of the resulting facial features, (3) analysis of the categories of expressions that emerge, (4) creation of an “in-the-wild” dataset collected with wearable devices from a diverse population of individuals, (5) comparison of the initial models on the new dataset, and (6) evaluation of perceivers’ categorizations of instances of the discovered facial expression categories.

Findings will have important implications for models of the signaling and recognition of emotion. In addition, results will have fundamental implications for the use of emotion in legal judgments, policy decisions, national security protocols, educational practices, and the commercial applications used in all of these activities, most notably so-called facial expression recognition software. Such software is based on fundamentally flawed prior research, yet it is used for everything from assessing people’s preferences for products evaluated in marketing campaigns to predicting the likely performance of players on teams in the National Basketball Association. Theory, training, and product development and use should await the findings of the proposed research.
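To make steps (2) and (3) more concrete, the following is a minimal illustrative sketch, not the proposal's actual pipeline, of how facial expression categories might be discovered in a data-driven way: facial movements are represented as vectors of facial action unit (AU) intensities, and an unsupervised clustering procedure groups similar configurations without presupposing a fixed set of emotion categories. The synthetic AU matrix, the choice of k-means, and the range of candidate cluster counts are all assumptions made for illustration only.

```python
# Illustrative sketch (not the proposal's pipeline): discovering candidate
# facial-expression categories from facial action unit (AU) intensity vectors
# via unsupervised clustering. All inputs below are synthetic placeholders.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)

# Placeholder data: 1,000 video frames, each described by the intensity of
# 17 hypothetical facial action units (real data would come from the
# emotion-eliciting dyadic tasks in step 1).
au_intensities = rng.random((1000, 17))

# Standardize AU features so no single action unit dominates the distances.
features = StandardScaler().fit_transform(au_intensities)

# Search over candidate numbers of expression categories rather than
# assuming a predefined set of "basic emotions" in advance.
best_k, best_score = None, -1.0
for k in range(2, 11):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
    score = silhouette_score(features, labels)
    if score > best_score:
        best_k, best_score = k, score

print(f"Best-supported number of expression categories: {best_k} "
      f"(silhouette = {best_score:.2f})")
```

The point of the sketch is only the design choice it embodies: letting the number and form of expression categories emerge from observed facial behavior, which is the logic behind comparing the initial models against the in-the-wild dataset in step (5).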