Even at the earliest cortical stages, visual processing is influenced by internally generated information. Top-down influence on visual processing is found in very different behavioral contexts. As a result, early visual areas are increasingly considered as "interfaces" or "blackboards" where bottom-up information is confronted with top-down representations. Yet, our knowledge of the cellular and network mechanisms by which top-down signals modulate the integrative properties of visual cortical neurons is still fragmentary. Moreover, the impact on visual processing and visual perception of the relatively small top-down modulations described in early visual cortices remains debated. Finally, it is unknown to what extent top-down modulation is flexible and can be adapted to a novel behavioral goal. Our long-term goal is to better understand the functions and mechanisms of top-down influences on early visual processing using a recent and tractable model: the modulation by sound of visual processing in V1. We have recently shown that the representation of the orientation and direction of the visual stimulus in V1 is improved in the audiovisual context, through a potentiation of the response of neurons with preferred orientations matching the orientation of the visual cue, and a suppression of the activity of neurons coding for orthogonal orientations and opposite directions. The overall objective of this application is to establish the cellular and network mechanisms, functional impact and flexibility of sound modulation on early visual processing. Our central hypotheses, based on our preliminary data, are that sound modulation in V1 results from the activation of local mechanisms controlling the orientation and direction tuning of V1 neurons, that modest improvements of the representation of visual stimuli can improve visual perception, and that sound modulation in V1 is flexible and can be adapted to novel behavioral goals by training. The rationale for the proposed research is to better understand why, when and how internal representations bias sensory perception. To test our central hypotheses, we propose the following specific aims: 1) Determine the cellular mechanisms underpinning the orientation- and direction-dependent sound modulation of V1 neurons by performing whole-cell recordings in awake mice, a technique for which our laboratory has unique expertise; using two-photon calcium imaging in mice performing Go/NoGo behavioral tasks, we will also 2) Determine the impact of sound modulation on the representation of the visual stimulus in V1 and on visual perception; and 3) Demonstrate that sound modulation is flexible and can be adapted to a novel behavioral task through training. Our approach is innovative because we will bring to the mouse a topic so far mostly addressed in primates, in order to benefit from functional imaging, genetic tools and our unique skill at performing whole-cell recordings in awake behaving mice. The proposed research is significant because it is expected to vertically advance our understanding of how top-down inputs adapt visual processing to the behavioral goal. Ultimately, such knowledge will inform the pathophysiology of several developmental and degenerative disorders in which top-down control of sensory processing is impaired.
The proposed research is relevant to public health because several neurodevelopmental, mental and neurodegenerative disorders, including autism spectrum disorders (ASD), schizophrenia and prodromal Alzheimer's disease (AD), are associated with difficulties in processing sensory information that could result from an abnormal top-down control of sensory processing. As our goal in this project is to determine the function and mechanisms of top-down modulation of visual processing, it is relevant to the part of the NEI mission that pertains to supporting research that advances our knowledge of how the visual system functions in both health and disease.