Modern experimental techniques in many fields of neuroscience produce large quantities of data that can be processed and modeled in many ways, often providing the opportunity to answer questions beyond the original experimental motivation. In addition, increasingly sophisticated algorithms are being developed to analyze large neural data sets. An infrastructure that allows routine sharing of data and algorithms would help advance the field while facilitating the validation of published results. Data sharing is now commonplace in other fields such as genomics and astrophysics, where it has accelerated the pace of research. Setting up and maintaining an infrastructure for sharing data and tools involves numerous technical and social challenges. The purpose of this meeting is to bring together a core group of experimentalists and modelers to examine how such a system could be structured to best benefit both the scientific community and the individual investigator.