This interdisciplinary effort develops, integrates, and refines a suite of activity and behavioral observation tools supporting the automatic collection, annotation, access, analysis, and archiving of behavioral data for individuals and groups. The tools capture a continuous audiovisual record of human activity in various settings and apply machine intelligence to process that record automatically, so that analytical observers can efficiently monitor situational behavior over time. The annotated record provides a completeness not feasible with human observers, enabling, for the first time, large-scale longitudinal behavioral and clinical research based on continuously captured and processed data; extensible interfaces make these voluminous records accessible in a user-friendly yet utilitarian manner. The record will be processed with data-reduction and extraction technologies that recognize faces and speech, track moving individuals, and identify social interactions, while protecting the confidentiality of participants' identities in collaboratively accessible, computerized databases.
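The confidentiality requirement above could be met, for example, by pseudonymizing identities before annotations enter the shared database. The sketch below is illustrative only, not the project's actual design; all names and the hashing scheme are assumptions.

```python
import hashlib
from dataclasses import dataclass

# Hypothetical sketch: raw identities detected in the audiovisual record are
# replaced with salted one-way hashes, so longitudinal analysis can link
# events to the same person without storing the raw identity.

@dataclass(frozen=True)
class AnnotatedEvent:
    timestamp_s: float     # offset into the continuous record
    event_type: str        # e.g. "face", "speech", "interaction"
    participant_key: str   # pseudonymous key, never the raw identity

def pseudonymize(identity: str, salt: str) -> str:
    """Map a raw identity to a stable pseudonymous key."""
    return hashlib.sha256((salt + identity).encode()).hexdigest()[:12]

def annotate(detections, salt):
    """Convert raw (time, type, identity) detections into events
    safe to store in a collaboratively accessible database."""
    return [AnnotatedEvent(t, kind, pseudonymize(who, salt))
            for t, kind, who in detections]

events = annotate([(12.5, "face", "Alice"), (13.0, "speech", "Alice")],
                  salt="study-42")
# The same person maps to the same key, so behavior can still be tracked
# over time, but the raw name never appears in the annotated record.
assert events[0].participant_key == events[1].participant_key
assert "Alice" not in repr(events)
```

The salted one-way hash preserves linkability across the longitudinal record while keeping raw identities out of the collaboratively accessible database.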
The tools focus on automatically identifying features within the audiovisual record to improve the accuracy and completeness of manual, labor-intensive rating instruments. Through computer vision, speech recognition, sensor integration, and machine learning, multimedia data-extraction technologies will be developed for individual behavioral measurements. Additional tools will mine the resulting annotated longitudinal datasets for insights into individual interactions and reactions. The tools' relevance will be demonstrated, refined, and validated through an initial challenge application and environment: elderly residents of a continuing care retirement community. Collaborating studies of parent-child, teacher-student, and autistic-child social interactions will validate portability and utility across domains of behavioral research. The tools will be extensible, enabling behavioral scientists to accommodate novel situations and source material. Specifically, a scientist will be able to identify the need for a new class of audiovisual detection, supply training material for that class, and iteratively evaluate and improve the resulting automatic classification, produced via machine learning and rule-based techniques.
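The supply-train-evaluate loop described above can be sketched as follows. This is a toy illustration, not the project's actual pipeline: a nearest-centroid rule stands in for the real machine-learning models, short feature vectors stand in for audiovisual clips, and the "laughter" class is a made-up example.

```python
from statistics import mean

# Hypothetical sketch of the iterative workflow: the scientist labels
# examples of a new detection class, a classifier is trained, accuracy is
# checked on held-out examples, and more training material is supplied
# until performance is acceptable.

def train(examples):
    """examples: list of (feature_vector, label) -> per-class centroids."""
    by_label = {}
    for x, y in examples:
        by_label.setdefault(y, []).append(x)
    return {y: tuple(mean(c) for c in zip(*xs)) for y, xs in by_label.items()}

def predict(model, x):
    """Assign x to the class with the nearest centroid."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda y: dist(model[y], x))

def accuracy(model, test_set):
    return mean(predict(model, x) == y for x, y in test_set)

# Round 1: sparse training material for a new class, e.g. "laughter".
round1 = [((0.9, 0.1), "laughter"), ((0.1, 0.9), "speech")]
held_out = [((0.8, 0.3), "laughter"), ((0.7, 0.2), "laughter"),
            ((0.2, 0.8), "speech"), ((0.3, 0.6), "speech")]
model = train(round1)

# Round 2: after reviewing the evaluation, the scientist supplies
# additional labeled clips and the classifier is retrained.
round2 = round1 + [((0.7, 0.3), "laughter"), ((0.3, 0.7), "speech")]
model = train(round2)
assert accuracy(model, held_out) == 1.0
```

The point is the loop, not the classifier: each pass of labeling, retraining, and evaluation lets the behavioral scientist steer the detector toward a new class without touching the underlying learning machinery.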
The project's broader impacts include supplying social and behavioral scientists with automated tools that support novel approaches to creating and analyzing data in their efforts to characterize human behavior. Sociologists, psychologists, anthropologists, and clinical medical investigators will be able to conduct research in new modes, with greater observational accuracy and precision than has heretofore been possible or practical. Ethical issues are explicitly and proactively addressed to engage the larger social concerns raised by continuous observation.