People recognize dramatic situations and attribute roles and intentions to perceived characters, even when presented with extremely simple cues. As any cartoon viewer can attest, two animated shapes are sufficient to depict a scene involving tender lovers, brutal bullies, tense confrontations, and hair-raising escapes. These basic notions of agency and intentionality are foundational to our social perception of the world: they provide the first discriminations between agents and objects, delineate which elements of the world can move with goal-directed purpose, and supply a primitive structure for describing cause and effect. Extensive laboratory experiments using controlled stimuli have described many of the basic properties that produce these perceptions. However, there have been only limited attempts to quantify these processes, and none to test whether the same properties hold for real-world activity patterns.

This project models the human ability to perceive agency, intentionality, and goal-directed behavior in dynamic real-world environments. Using off-the-shelf real-time localization systems, the movements of people and objects are recorded as they engage in unstructured activity and staged group games. Drawing on both this empirical data and theories from the psychophysical literature, computational models are constructed that quantify, explain, and predict real-world social and goal-directed behavior. The benefits of this work include: (1) modeling tools for use within behavioral studies, (2) a real-world grounding for psychophysical studies, and (3) a computational model of social and intentional behavior that would enhance human-computer and human-robot interfaces.

Project Report

This project was funded as a one-year exploratory effort to develop a methodology for validating existing results in human visual psychophysics regarding the perception of animacy. Over the past seven decades, psychologists have investigated the phenomenon in which the two-dimensional movement of simple shapes evokes the appearance of high-level features such as animacy and intentionality. This work is interesting in part because perceptions of animacy and intentionality, which are high-level and complex evaluations, are consistently triggered almost automatically and irresistibly by low-level perceptions of 2D motion. Along similar lines, roboticists have been interested in how robot motion affects perceptions of robot intelligence, friendliness, and animacy. By applying research methods and topics from cognitive psychology to a social robotics domain, we hope to leverage the research in both fields to better understand the features that cue intentionality and to design a robot model for recognizing and manipulating intentionality.

To explore which features of two-dimensional motion cue the perception of intentionality, some cognitive psychologists have chosen to analyze particular subsets of intentional motion, such as chasing. To identify chasing (as opposed to mere following), a viewer must attribute goal-directedness, that is, intention, to the agent's behavior. Chasing is therefore a good test case for intentional behavior as a whole.

As a result of our exploratory effort, we developed a methodology that matches the movement of traditional on-screen moving dots with the movement of a set of small robots moving on the floor. The two displays (on-screen and real-world) demonstrate the same movement and can be tested independently and cross-validated. This is the first methodology that links decades of work in psychophysics to three-dimensional objects in the real world.
This research opens the door for a series of experiments that ground results in psychophysics to immediate judgments made by human observers in real settings.
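The chasing-versus-following distinction above lends itself to a simple quantitative cue. The sketch below is purely illustrative (it is not the project's actual model): assuming 2D trajectory samples like those produced by on-screen dot displays or a localization system, it scores how consistently one agent's heading points at another agent's current position. A chaser should score near 1, while an agent wandering or moving away should not.

```python
import math

def heading_alignment(chaser, target):
    """Mean cosine between the chaser's instantaneous velocity and the
    direction to the target's current position.

    chaser, target: lists of (x, y) positions sampled at matching timesteps.
    Returns a value in [-1, 1]; values near 1 suggest pursuit-like motion.
    """
    scores = []
    for i in range(1, len(chaser)):
        # Chaser's displacement over the last timestep (velocity direction).
        vx = chaser[i][0] - chaser[i - 1][0]
        vy = chaser[i][1] - chaser[i - 1][1]
        # Direction from the chaser to the target's current position.
        tx = target[i][0] - chaser[i][0]
        ty = target[i][1] - chaser[i][1]
        norm = math.hypot(vx, vy) * math.hypot(tx, ty)
        if norm > 0:
            scores.append((vx * tx + vy * ty) / norm)
    return sum(scores) / len(scores) if scores else 0.0

# Toy trajectories: one dot heads straight at a stationary goal,
# another moves directly away from it.
approach = [(float(i), 0.0) for i in range(5)]
retreat = [(-float(i), 0.0) for i in range(5)]
goal = [(10.0, 0.0)] * 5
```

A threshold on this score (or on a windowed version of it) gives one crude way to separate "chasing" from "following," since a follower retraces the target's past path rather than aiming at its current position.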

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 0968538
Program Officer: Tatiana D. Korelsky
Budget Start: 2010-09-01
Budget End: 2011-08-31
Fiscal Year: 2009
Total Cost: $250,000
Name: Yale University
City: New Haven
State: CT
Country: United States
Zip Code: 06520