This project is a collaborative effort by computer scientists and engineers from Texas A&M and UC Berkeley, in consultation with natural scientists and documentary filmmakers. The goal is to advance the fundamental understanding of automated and collaborative systems that combine sensors, actuators, and human input to observe and record detailed natural behavior in remote settings. Currently, scientific study of animals in situ requires vigilant observation of detailed animal behavior over weeks or months. When animals live in remote or inhospitable locations, observation can be an arduous, expensive, dangerous, and lonely experience for scientists.

The project proposes a new class of hybrid teleoperated/autonomous robotic "observatories" that allow groups of scientists, via the internet, to remotely observe, record, and index detailed animal activity. Such observatories are made possible by emerging advances in robotic cameras, long-range wireless networking, and distributed sensors. The project will investigate the algorithmic foundations for such observatories: new metrics, models, data structures, and algorithms that will form a robust mathematical framework for collaborative observation. The project will build on past work to extend and formally characterize hybrid models of collaborative and automated observation that draw on computational geometry, stochastic modeling, and optimization.

The project will advance fundamental understanding of networked robotics and develop efficient algorithms for collaborative observation that combine human and sensor input. This effort is intended to benefit biological scientists and facilitate collaboration among researchers. It will produce working prototypes that will be accessible via the internet to scientists, students, and the public worldwide.
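To illustrate the kind of optimization such a framework involves, consider a minimal sketch (not the project's actual algorithm, and all names here are hypothetical): when several remote scientists each request a camera frame, a shared robotic camera must choose a single pan-tilt-zoom setting that best satisfies the group. Under a simplified model where frames are axis-aligned rectangles, a candidate frame can be scored against each request by its overlap area relative to the larger of the two frames, and the best candidate chosen by brute-force search.

```python
from itertools import product

def satisfaction(candidate, request):
    """Overlap-based score in [0, 1] between two axis-aligned frames.

    Frames are (x, y, w, h) tuples. The score is the intersection area
    divided by the larger frame's area, so both truncating a request and
    zooming out too far are penalized. (Illustrative metric only.)
    """
    cx, cy, cw, ch = candidate
    rx, ry, rw, rh = request
    ix = max(0.0, min(cx + cw, rx + rw) - max(cx, rx))
    iy = max(0.0, min(cy + ch, ry + rh) - max(cy, ry))
    return (ix * iy) / max(cw * ch, rw * rh)

def best_frame(requests, candidates):
    """Return the candidate frame maximizing total satisfaction."""
    return max(candidates, key=lambda c: sum(satisfaction(c, r) for r in requests))

# Three hypothetical scientists request overlapping views of one scene.
requests = [(0, 0, 4, 3), (1, 1, 4, 3), (2, 0, 4, 3)]
# Candidate frames: a coarse grid of positions at one fixed zoom level.
candidates = [(x, y, 5, 4) for x, y in product(range(4), range(2))]
print(best_frame(requests, candidates))
```

A real system would search the camera's continuous pan-tilt-zoom space rather than a coarse grid, which is where the computational geometry mentioned above comes in; the brute-force loop here only conveys the shape of the problem.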