Social interaction is a complex and highly demanding task: it can unfold in a harmonious and effortless way, yet it can also fail catastrophically. A critical determinant of success is whether the partners are able to establish rapport. Rapport matters in all communication contexts, from initial contact formation to private conflict resolution and business negotiations. While humans are sensitive to flaws in rapport, they frequently fail to identify the reasons or take countermeasures. This project is motivated by the idea that new media technologies, such as social virtual reality (VR), can augment people's social-cognitive capacities in this regard and improve our communication skills in daily life interactions. In relevant respects, machine capabilities can be superior to, and less biased than, human social perception. On the one hand, interactions taking place in VR allow the system to register behavioral details affecting rapport (such as movement, eye gaze, and facial expressions). Moreover, mobile sensor technologies can be seamlessly integrated into VR devices, such as headsets or controllers, to measure the neurophysiological correlates of emotional, motivational, and attentional attunement. On the other hand, computational power now allows us to run highly complex machine learning algorithms on standard personal computers. This project will leverage VR capture technologies, mobile neurophysiological sensing, and deep learning methods to develop a bio-behavioral model of rapport. Based on this model, the project will develop and evaluate tools to monitor rapport in ongoing interactions and administer feedback that enables corrective actions to improve rapport.

The investigators will pursue two objectives. First, they will collect an annotated interaction database of 150 dyads (pairs of subjects in conversation) performing three different interaction tasks, with an overall duration of 30 minutes. The database will include speech, movement, gaze, EEG measures of concurrent brain activity, and cardiovascular measures. The interaction protocols will be annotated for rapport by groups of observers, and the subjects themselves will evaluate interaction quality and outcomes. Second, they will develop and validate machine learning algorithms that identify bio-behavioral rapport signatures in the annotated multichannel database and predict perceived rapport and physiological responses from nonverbal behavior. This development will lead to a bio-behavioral model of rapport, which provides the basis for social AI components capable of monitoring and facilitating rapport in ongoing avatar interactions. Long-term goals are to integrate these tools into communication media beyond social VR, or into real-life interactions, contingent on more advanced, portable, and unobtrusive sensing devices.
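The general shape of the second objective, predicting observer-annotated rapport from multichannel behavioral features, can be sketched as a regression problem. The sketch below is purely illustrative: the abstract does not specify the project's actual models or features, so the feature names (gaze synchrony, movement mimicry, cardiovascular coupling, EEG attunement) are hypothetical and the data are synthetic.

```python
# Illustrative sketch only: shows a regression from hypothetical per-dyad
# multichannel features to a synthetic observer-rated rapport score.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-dyad features (all synthetic): gaze synchrony,
# movement mimicry, cardiovascular coupling, EEG attunement.
n_dyads, n_features = 150, 4
X = rng.normal(size=(n_dyads, n_features))

# Synthetic "ground truth": observer rapport ratings generated as a
# noisy linear combination of the features.
true_w = np.array([0.8, 0.5, 0.3, 0.2])
y = X @ true_w + rng.normal(scale=0.1, size=n_dyads)

# Closed-form ridge regression: w = (X^T X + lam * I)^{-1} X^T y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# With 150 dyads and low noise, the fitted weights closely recover
# the generating weights, so held-out rapport ratings are predictable.
print(w)
```

In the real project, the linear model would presumably be replaced by the deep learning methods the abstract names, operating on time-resolved speech, movement, gaze, EEG, and cardiovascular channels rather than summary features.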

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1907807
Program Officer: Ephraim Glinert
Budget Start: 2019-10-01
Budget End: 2022-09-30
Fiscal Year: 2019
Total Cost: $300,000
Name: Michigan State University
City: East Lansing
State: MI
Country: United States
Zip Code: 48824