For decades, neuroscientists have recorded from single brain cells (neurons) to understand how the brain senses, makes decisions, and controls movements. We can now record from hundreds of neurons simultaneously, but we are still at an early stage in developing tools to determine how networks of neurons work together to perceive the world and to generate the control signals needed to produce coordinated movement. Focusing on movement, this project brings the power of deep learning, a class of powerful new machine learning algorithms, to bear on the problem of understanding neural activity. Because deep learning thrives on big data, the investigators can leverage massive-scale brain recordings. These include month-long recordings chronicling the activity of 100 neurons as a monkey goes about its daily business, and recordings from thousands of neurons for hours in the mouse, each neuron identified with an exact location in the brain and tied to the mouse's ongoing behaviors. These approaches will open new windows on how neurons act together, moment by moment, to produce movement. The investigators will develop simple descriptions of the underlying processes to be shared with the public through venues including online tutorials, a new open course that will be developed at Emory University and Georgia Tech, the Atlanta Science Festival, and Atlanta's Brain Awareness Month. They will also make their data sets publicly available and host data tutorials and modeling competitions at key scientific meetings to accelerate progress by engaging the broader scientific community.
In the fifty years since Ed Evarts first recorded from single neurons in the primary motor cortex (M1) of behaving monkeys, great effort has been devoted to understanding the relation between these individual neural signals and movement-related signals collected during highly constrained motor behaviors performed by over-trained monkeys. In parallel, theoreticians have posited that the computations performed in the brain depend critically on network-level phenomena: dynamical laws in brain circuits that constrain neural activity and dictate how it evolves over time. The goal of this project is to develop a powerful new suite of tools, based on deep learning, to analyze these dynamics at unprecedented temporal and spatial scales. The investigators will leverage month-long recordings of M1 electrophysiology, EMG, and behavior during natural behaviors in monkeys, together with vast numbers of neurons recorded with two-photon imaging in behaving mice. Novel machine learning techniques based on sequential autoencoders will enable the investigators to learn the dynamics underlying these data. This combination will provide windows into the brain's control of motor behavior that have never before been possible. The novel analytical framework developed here will be extensible from motor behaviors to higher-level problems of error processing, decision making, and learning.
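To illustrate the flavor of the sequential-autoencoder approach, the following is a minimal sketch, not the investigators' actual model or architecture: it assumes PyTorch, a bidirectional GRU encoder that compresses each trial of binned spike counts into an initial condition for a GRU "generator", whose states are mapped to low-dimensional factors and then to Poisson firing rates. All layer sizes, the Poisson observation model, and the synthetic data are illustrative assumptions.

    # Minimal sequential autoencoder for binned spike counts (illustrative sketch only).
    import torch
    import torch.nn as nn

    class SequentialAutoencoder(nn.Module):
        def __init__(self, n_neurons, enc_dim=64, gen_dim=64, factor_dim=8):
            super().__init__()
            # Encoder reads the whole trial and summarizes it in its final hidden states.
            self.encoder = nn.GRU(n_neurons, enc_dim, batch_first=True, bidirectional=True)
            self.to_g0 = nn.Linear(2 * enc_dim, gen_dim)      # initial condition of the generator
            self.generator = nn.GRUCell(1, gen_dim)           # input-free dynamics (dummy input)
            self.to_factors = nn.Linear(gen_dim, factor_dim)  # low-dimensional latent factors
            self.to_rates = nn.Linear(factor_dim, n_neurons)  # log firing rate for each neuron

        def forward(self, spikes):                            # spikes: (batch, time, neurons)
            batch, T, _ = spikes.shape
            _, h = self.encoder(spikes)                       # h: (2, batch, enc_dim)
            g = torch.tanh(self.to_g0(torch.cat([h[0], h[1]], dim=-1)))
            dummy = spikes.new_zeros(batch, 1)                # generator evolves autonomously
            log_rates, factors = [], []
            for _ in range(T):
                g = self.generator(dummy, g)
                f = self.to_factors(g)
                factors.append(f)
                log_rates.append(self.to_rates(f))
            return torch.stack(log_rates, dim=1), torch.stack(factors, dim=1)

    if __name__ == "__main__":
        # Placeholder spike counts standing in for real recordings.
        n_trials, T, n_neurons = 32, 50, 100
        spikes = torch.poisson(torch.rand(n_trials, T, n_neurons))
        model = SequentialAutoencoder(n_neurons)
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.PoissonNLLLoss(log_input=True)           # Poisson likelihood on log rates
        for step in range(200):
            log_rates, _ = model(spikes)
            loss = loss_fn(log_rates, spikes)
            opt.zero_grad()
            loss.backward()
            opt.step()

The key design idea captured here is that the network is forced to explain the recorded population activity through a low-dimensional dynamical system unrolled in time, so the learned factors serve as an estimate of the underlying dynamics rather than a trial-averaged summary.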
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.