Social and economic decisions cannot be fully explained by "rational" attempts to maximize monetary gain, even in very simple game-theoretic scenarios. Complex emotional processes such as anger, guilt, or generosity act as hidden forces behind observable actions. Such "non-rational" motivations can drive our own decisions, and they also shape our beliefs about what motivates others' decisions. The goal of this project is to use automatic measurement of dynamic facial expressions, in combination with other measurements such as functional MRI (fMRI) and eye-tracking, to investigate the role of non-rational motivations in social decision making. The core of the approach is to use state-of-the-art computer vision techniques to extract facial actions from video in real time while participants interact with a computer or with each other, in some cases viewing live video of each other's faces. The investigators will use powerful statistical machine learning techniques to make inferences about participants' internal emotional states during the interactions. The goal is to use these inferences about emotional state (a) to predict participants' behavior; (b) to explain why a decision is made in terms of the hidden forces driving it; and (c) to build autonomous agents that can use this information to drive their interactions with humans.
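The inference step described above, mapping observed facial actions to a hidden emotional state, can be sketched as a simple probabilistic classifier over facial action unit (AU) intensities. This is an illustrative sketch only: the AU labels follow the standard Facial Action Coding System (AU6 cheek raiser, AU12 lip corner puller, AU4 brow lowerer), but the logistic weights and the `p_enjoyment` helper are hypothetical stand-ins, not the project's actual model.

```python
import math

# Hypothetical logistic model: P(hidden state = "enjoyment" | AU intensities).
# AU intensities are on the FACS 0-5 scale. The weights are illustrative,
# not fitted to data: AU6 + AU12 together mark a felt (Duchenne) smile,
# while AU4 (brow lowerer) counts as evidence against enjoyment.
WEIGHTS = {"AU6": 0.9, "AU12": 1.1, "AU4": -0.8}
BIAS = -3.0

def p_enjoyment(au_intensities):
    """Return the logistic probability of enjoyment given observed AUs."""
    z = BIAS + sum(WEIGHTS.get(au, 0.0) * v for au, v in au_intensities.items())
    return 1.0 / (1.0 + math.exp(-z))

# A Duchenne smile (AU6 + AU12 active) scores higher than a
# "social" smile showing AU12 alone.
duchenne = p_enjoyment({"AU6": 3.0, "AU12": 3.0})
social = p_enjoyment({"AU12": 3.0})
```

In the project's setting such a scorer would run frame-by-frame on the AU stream extracted by the vision front end; a trained model would replace the hand-set weights.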

This multidisciplinary project contributes to several fields, including psychology, neuroscience, and economics. First, it develops new methodologies to study decision processes. Second, it uses these methods to test hypotheses about social decision-making and to bridge the gap between observable actions and the internal states that generated them. Third, the investigators intend to make available a dataset and toolset that should be extremely useful to other investigators analyzing facial expression in multiple contexts. Additionally, automatic and on-line decoding of internal motivational states lays the groundwork for "affectively-aware" interactive computers: artificial systems that can make inferences about the emotions and intentions of their users. Through the development of these systems, this project will make a significant contribution to the growing field of human-machine interaction.

[Supported by Perception, Action and Cognition, Decision, Risk and Management Sciences, and Robust Intelligence]

Project Start:
Project End:
Budget Start: 2012-09-01
Budget End: 2016-08-31
Support Year:
Fiscal Year: 2012
Total Cost: $166,977
Indirect Cost:
Name: University of Arizona
Department:
Type:
DUNS #:
City: Tucson
State: AZ
Country: United States
Zip Code: 85719