The goal of this project is to develop methods that will permit researchers to remotely and automatically monitor the behavior of primates and other highly social animals. The PIs will collect behavioral data from cameras and microphones. They will then develop statistical models and computational algorithms to track the individuals in the group and to recognize facial expressions and vocalizations. Patterns in movements, expressions, and vocalizations will be used to develop behavior-identifying algorithms that recognize different behaviors such as aggression, submission, grooming, eating, and sleeping. The project is a collaboration between computer scientists and primatologists. A key element of this project is the observation that complex social interactions can often be regarded as composed of sequences of elementary behaviors that occur frequently and consist of relatively simple and distinct gestures. Thus, the task of modeling complex social interactions can be broken down into two regimes: elementary behaviors spanning short durations, and their stochastic sequences spanning relatively longer timescales.
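The two-regime decomposition described above might be sketched as follows: short time windows are first classified into elementary behavior labels, and the longer-timescale structure is then captured by a simple first-order Markov chain estimated over those labels. This is a minimal illustration, not the project's actual method; the behavior names, the toy sequence, and the `transition_probs` function are all invented for the example.

```python
# Hypothetical sketch of the two-regime idea: elementary behavior labels
# (assumed to come from a short-window classifier) are modeled at the
# longer timescale by an empirical first-order Markov transition matrix.
from collections import defaultdict

def transition_probs(sequence):
    """Estimate P(next behavior | current behavior) from one label sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[cur][nxt] += 1
    probs = {}
    for cur, nxts in counts.items():
        total = sum(nxts.values())
        probs[cur] = {b: n / total for b, n in nxts.items()}
    return probs

# Toy sequence of elementary behaviors observed over time (invented data)
seq = ["eat", "groom", "groom", "sleep", "eat", "groom", "sleep"]
P = transition_probs(seq)
# e.g. P["groom"]["sleep"] is the empirical chance that grooming
# is followed by sleeping in this toy sequence
```

In practice one would likely replace the plain Markov chain with a hidden Markov or hierarchical model, since the elementary-behavior labels produced by a classifier are themselves noisy.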
Apart from advancing computational science, the new methods for recording behavior unobtrusively and analyzing it at a high data rate are likely to be of interest to behavioral ecologists, sociobiologists, and neuroscientists in studies of primates and other highly social animals. With these new tools, scientists can study and understand behavior, for example, in the context of planning conservation efforts for threatened species, building accurate animal models for health research, and supporting animal husbandry decisions in zoos. The project will provide an extensive, annotated data repository and associated algorithms, and will also fund graduate students who will gain hands-on training in all aspects of the project.