This award supports the preparation and sharing of computational neuroscience data as part of an exploratory activity aimed at catalyzing rapid and innovative advances in computational neuroscience and related fields. The data to be shared are recordings of eye movements from subjects watching video clips under natural free-viewing conditions. The data will be made available in both raw and processed forms, along with the corresponding video stimuli. Code will be provided for calibrating the eye-movement traces, and code, training data, and validation data will be provided to facilitate the development of gaze-prediction algorithms. These data were originally collected to develop an information-theoretic model of visual saliency and visual attention, and they are anticipated to be useful for a broad range of questions in neuroscience, cognitive psychology, and computer vision. Saliency maps and raw feature maps derived from the information-theoretic model will also be made available, so that users interested in quantifying which low-level visual features may most strongly attract human attention and gaze can easily perform such analyses.
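As an illustration of the kind of quantitative analysis the shared saliency maps enable, the sketch below computes the standard normalized scanpath saliency (NSS) metric: the mean value of a z-scored saliency map at fixated pixel locations. This is a minimal sketch under assumed conventions (a 2-D NumPy array for the map, fixations as (row, col) pixel tuples); it is not the project's actual data format or analysis code.

```python
import numpy as np

def normalized_scanpath_saliency(saliency_map, fixations):
    """Mean saliency at fixated locations, measured in units of the
    map's standard deviation (the standard NSS metric)."""
    s = (saliency_map - saliency_map.mean()) / saliency_map.std()
    rows, cols = zip(*fixations)
    return s[list(rows), list(cols)].mean()

# Toy example: a map with one bright pixel, with fixations landing on it.
smap = np.zeros((4, 4))
smap[1, 2] = 1.0
score = normalized_scanpath_saliency(smap, [(1, 2), (1, 2)])
print(score)  # large positive: fixations fall on high-saliency locations
```

Higher NSS values indicate that a feature map better predicts where human gaze lands; computing NSS separately for each raw feature map is one simple way to rank low-level features by how strongly they attract attention.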