Here we focus on determining the algorithms that enable highly similar visual information to be transformed into diverse, behaviorally relevant outputs, and on identifying the mechanisms that give rise to these algorithms. Understanding how visual information is transformed into representations relevant for behavior is key to restoring sensorimotor transformations in those who are blind or visually impaired, or who suffer from sensory processing disorders.

As visual inputs, we use looming stimuli: the 2-D projections of an object approaching on a direct collision course. Looming stimuli elicit a conserved diversity of behavioral responses across species that are necessary for survival. This diversity is thought to emerge through parallel sensorimotor processing pathways that differentially transform visual features of a looming stimulus into motor outputs. However, limited access to the circuit components that encode and integrate these visual features has hindered the development and biological validation of the algorithms used across pathways. We circumvent these limitations by using Drosophila melanogaster, which provides the necessary electrophysiological and genetic access to the cell types that participate in these sensorimotor transformations. Our preliminary data suggest that looming information is transformed within eight descending neuron (DN) sensorimotor pathways that receive features of looming stimuli from up to six optic lobe columnar projection neuron (OLCPN) cell types.

In this interdisciplinary grant, we capitalize on the complementary expertise of Dr. von Reyn (PI), who has pioneered electrophysiological, behavioral, and genetic methods for investigating feature integration within OLCPNs and DNs, and Dr. Ausborn (co-PI), who has broad expertise in developing mechanistic biophysical circuit models for analyzing neural computations in mammalian and invertebrate systems. Here we characterize the extent to which different DN intrinsic properties and circuit mechanisms account for the observed output diversity.
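To make the looming stimulus concrete, the sketch below computes the angular size and angular expansion velocity seen by the eye for an object of half-size r approaching at constant speed v, with projected collision at t = 0. This is a minimal illustration of standard looming kinematics; the parameter values, variable names, and use of Python are assumptions for illustration only and do not represent the project's actual stimuli or analysis code.

```python
import numpy as np

# Illustrative looming kinematics: an object of half-size r approaching
# at constant speed v, with projected collision at t = 0 (t < 0 before contact).
# The values of r and v below are arbitrary assumptions, not project parameters.
r = 0.01                                   # object half-size (m)
v = 0.50                                   # approach speed (m/s)

t = np.linspace(-1.0, -0.01, 1000)         # time before collision (s)
distance = -v * t                          # object-to-eye distance (m)
theta = 2.0 * np.arctan(r / distance)      # angular size on the retina (rad)
theta_dot = np.gradient(theta, t)          # angular expansion velocity (rad/s)
```

The ratio r/v sets how quickly the projected image expands and is the standard way looming stimuli are parameterized in behavioral experiments.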
In Aim 1, we combine electrophysiology, RNAi silencing, and computational modeling to establish, at a molecular level, the intrinsic integration mechanisms of each DN.
In Aim 2, we combine electrophysiology, optogenetics, and computational modeling to determine the OLCPN synaptic inputs to each DN.
In Aim 3, through concurrent modeling and experimental probing, we evaluate the dominant mechanisms that determine the looming feature integration algorithms used across the DN population. This project will provide a thorough understanding of general principles for transforming sensory information into higher-order, behaviorally relevant representations.
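As one hedged illustration of what a feature-integration model could look like, the sketch below treats each DN as a simple leaky integrator that linearly weights the angular size and angular expansion velocity of the looming stimulus. This is a minimal sketch under assumed dynamics and arbitrary parameter values, not the project's biophysical model; the function name, weights, and time constant are hypothetical.

```python
import numpy as np

# Looming stimulus (same illustrative kinematics as in the earlier sketch).
r, v = 0.01, 0.50
t = np.linspace(-1.0, -0.01, 1000)
theta = 2.0 * np.arctan(r / (-v * t))      # angular size (rad)
theta_dot = np.gradient(theta, t)          # angular expansion velocity (rad/s)

def dn_response(theta, theta_dot, t, w_size, w_vel, tau=0.05):
    """Leaky integration of linearly weighted looming features.

    Illustrative only: w_size, w_vel, and tau are arbitrary assumed values,
    not measured DN parameters.
    """
    dt = t[1] - t[0]
    drive = w_size * theta + w_vel * theta_dot
    v_m = np.zeros_like(t)
    for i in range(1, len(t)):
        # forward-Euler update of a leaky integrator with time constant tau
        v_m[i] = v_m[i - 1] + dt * (drive[i] - v_m[i - 1]) / tau
    return v_m

# Two hypothetical DNs with different feature weightings respond differently
# to the same looming input, illustrating how output diversity could arise.
dn_size_weighted = dn_response(theta, theta_dot, t, w_size=1.0, w_vel=0.0)
dn_velocity_weighted = dn_response(theta, theta_dot, t, w_size=0.2, w_vel=0.05)
```

In the project itself, such models would presumably be constrained by the intrinsic DN properties established in Aim 1 and the OLCPN synaptic inputs determined in Aim 2, rather than by assumed weights as above.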

Public Health Relevance

Those who are blind or visually impaired are unable to transform visual information into the behaviorally relevant representations that guide selection of appropriate actions. This project seeks to derive the algorithms for sensorimotor transformations, which can guide technologies that restore visual-motor behaviors through neuroprosthetics. It also seeks to uncover the mechanisms underlying these algorithms, which provide targets for restoring sensorimotor transformations in neurodegenerative diseases and neurodevelopmental disorders.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Neurological Disorders and Stroke (NINDS)
Type
Research Project (R01)
Project #
1R01NS118562-01A1
Application #
10052610
Study Section
Mechanisms of Sensory, Perceptual, and Cognitive Processes Study Section (SPC)
Program Officer
Gnadt, James W
Project Start
2020-09-15
Project End
2025-06-30
Budget Start
2020-09-15
Budget End
2021-06-30
Support Year
1
Fiscal Year
2020
Total Cost
Indirect Cost
Name
Drexel University
Department
Type
DUNS #
002604817
City
Philadelphia
State
PA
Country
United States
Zip Code
19102