This project aims to establish the existence in the brain of a qualitatively better class of analog error-correcting codes than previously known, to show how such codes can be used and decoded, and to develop the theory for quantifying their performance.

Information theory was introduced into neuroscience relatively early, and the theory of efficient (source) coding has been widely embraced in the sensory neurosciences. However, the second branch of information theory, channel coding, which deals with the maximally parsimonious addition of redundancy to recover signal from noise, has curiously made few inroads in neuroscience. Shannon's channel coding theorem revealed the existence of codes that make error correction possible at efficiencies previously thought impossible.
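For context, the theorem's standard textbook statement (added here for reference, not quoted from the proposal): a noisy channel with input X and output Y has capacity

\[
C \;=\; \max_{p(x)} I(X;Y),
\]

and for every rate R < C there exist codes of rate R whose decoding error probability vanishes as the block length grows, while no reliable codes exist at rates above C.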

The investigator's central hypothesis is that the brain routinely employs such error-correcting codes, along with the machinery required to decode and work with them. The hypothesis is motivated by a recent analysis of the grid cell code for animal location by the investigator and colleagues, showing that it has unprecedented error-correction properties compared to known population codes in the brain (Sreenivasan & Fiete, 2011); a toy illustration of this modular coding principle follows the aims below. The investigator proposes to:

1) Develop definitions and constraints for analog neural codes, so that the channel coding framework can be applied to neural codes to characterize their "goodness" at error correction.

2) Identify high-level coding properties that enable strong error correction, and search for these properties in observed but poorly understood neural codes. In parallel, explore strong theoretical error-correcting codes that the brain might plausibly implement.

3) Model plausible neural mechanisms for decoding such codes. Since decoding is inference, this question can be framed more generally as exploring neural mechanisms for hierarchical inference.
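To make the grid-code motivation concrete, here is a minimal sketch of a modular, residue-style code; the periods and noise level are illustrative assumptions, not the parameters or model of Sreenivasan & Fiete (2011). Each "module" reports position modulo its period, and jointly decoding the noisy phases recovers position over a range far larger than any single period, which is the source of the code's error-correcting power:

import numpy as np

# Toy modular code: each "grid module" reports position modulo its period.
periods = np.array([3.0, 4.0, 5.0])   # coprime toy periods: unambiguous range = 60

def encode(x, periods):
    """Per-module phases (position mod period)."""
    return np.mod(x, periods)

def decode(phases, periods, x_max=60.0, dx=0.01):
    """Brute-force maximum-likelihood-style decoder over candidate positions."""
    xs = np.arange(0.0, x_max, dx)
    d = np.mod(xs[:, None] - phases[None, :], periods[None, :])
    d = np.minimum(d, periods[None, :] - d)        # circular distance per module
    return xs[np.argmin((d ** 2).sum(axis=1))]

rng = np.random.default_rng(0)
x_true = 41.7
phases = encode(x_true, periods) + 0.05 * rng.standard_normal(3)  # noisy readout
print(decode(phases, periods))   # recovers ~41.7 despite per-module phase noise

With three modules the range grows multiplicatively (3 x 4 x 5 = 60) while noise only perturbs each phase locally, so adding modules buys exponential range or, equivalently, strong error correction at a fixed range.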

This project is computational and theoretical, and also involves close collaboration with neurophysiologists: applying the quantification techniques to neural data, and working with experiments to inform the theories and test their predictions.

Project Report

This project has focused on the role that noise in neurons plays in degrading memory and learning, and on whether good coding strategies can mitigate the effects of noise, and to what degree. Because memory involves holding information over time, ongoing noise that perturbs the activity of neurons storing that information tends to reduce the fidelity of the representation over time. Further, noise during development, when networks of neurons must organize into useful computational structures, can impede that process.

Our work has quantified exactly the time course of degradation of information stored in a large class of neural networks whose architectures are tuned to represent and store analog variables, as a function of the number of neurons involved in such storage [1].

In a related project, we asked whether clever "coding" strategies can improve memory accuracy over time even when the noise per neuron is held fixed, and if so, what is the best achievable accuracy. For instance, thanks to the clever coding of music written on a CD, it is possible to get a good rendition of the piece even when the CD is substantially scratched. We applied techniques and results from modern coding theory to models in neuroscience to obtain a bound on the best possible accuracy in the presence of noise. In collaboration with psychologists, we showed that human memory behavior appears to follow a time dependence similar to the bound [2].

Finally, we have shown that observed rules for how neural connection strengths change can lead to the development of memory systems, in particular the grid cell system that plays a role in spatial memory and navigation. We show that the very specific architectures required for spatial computation and memory can be learned by these rules and, moreover, are robust to a degree of ongoing neural noise. This work yields a number of testable predictions for experiments on how grid cell activity will gradually evolve during development, and raises new questions for theory and experiment on the role of noise during network development [3].

These projects have led to a better understanding of how noise affects computations in the brain, and how the brain can rein in the undesirable effects of noise with appropriate strategies for development and for representing information. This award has helped to train a number of undergraduate, graduate, and postdoctoral students in how to generate interesting questions and solve new problems, adapt tools from one field for use in another, generate new ideas and conceptual viewpoints, and effectively present findings to international audiences.

Finally, with help from this award, the PI has developed a new undergraduate class called Quantitative Methods in Neuroscience. The aim is to train a new generation of biology and neurobiology students in the quantitative tools, techniques, data analytics, and ways of thinking that are fast becoming indispensable as biology transitions from a primarily descriptive field into a quantitative one. Students who pursue careers outside academia will find that quantitative tools like those developed in this course are in demand across professions, now that data collection and analysis are becoming integral to fields as diverse as marketing and advertising, finance, medicine, fashion, and biotechnology.
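As a toy illustration of the scaling quantified in [1] (a minimal sketch under our own simplifying assumptions, not the published network model): an analog value held by N noisy neurons drifts diffusively, so the error variance of the stored value grows roughly linearly in time and falls as the number of neurons grows:

import numpy as np

# Cartoon diffusive-drift model of an analog memory (illustrative assumptions
# only): the stored value performs a random walk whose step variance scales as
# 1/N, so the error variance grows roughly as sigma^2 * t / N.
def drift_variance(N, T, sigma=1.0, trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    steps = sigma / np.sqrt(N) * rng.standard_normal((trials, T))
    return np.cumsum(steps, axis=1).var(axis=0)   # empirical variance vs. time

for N in (100, 400, 1600):
    v = drift_variance(N, T=500)
    print(f"N={N:5d}  empirical var at t=500 ~ {v[-1]:.3f}  (sigma^2*t/N = {500/N:.3f})")

Quadrupling the number of neurons cuts the drift variance by roughly a factor of four, consistent with the inverse-in-N degradation described in [1].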
References

[1] Y. Burak and I.R. Fiete. Fundamental limits on persistent activity in networks of noisy neurons. Proceedings of the National Academy of Sciences 109(43): 17645-17650 (2012).
[2] O. Koyluoglu, Y. Pertzov, S. Manohar, M. Husain, and I.R. Fiete. Fundamental limits on capacity and forgetting in human short-term memory. (In revision for Nature Neuroscience.)
[3] J. Widloski and I.R. Fiete. A model of grid cell development based on spatial exploration and spike time-dependent plasticity. Neuron 83(2): 481-495 (2014).

Agency
National Science Foundation (NSF)
Institute
Division of Information and Intelligent Systems (IIS)
Type
Standard Grant (Standard)
Application #
1148973
Program Officer
Kenneth C. Whang
Project Start
Project End
Budget Start
2011-10-01
Budget End
2014-09-30
Support Year
Fiscal Year
2011
Total Cost
$175,000
Indirect Cost
Name
University of Texas Austin
Department
Type
DUNS #
City
Austin
State
TX
Country
United States
Zip Code
78759