In daily life, even in the face of multiple sound sources, our brain binds together frequency components that belong to the same source and recognizes individual sound objects. In humans, the grouping of spectral components into a single sound percept relies on the precise synchrony (< 30-ms window) of their onset timings, and this grouping plays a critical role in our perception of language. Despite the importance of this "sensory feature binding," we still know little about the neuronal circuit mechanisms by which the brain integrates spectrally and temporally distributed sound inputs. To address this gap in knowledge, this project will define the neuronal circuits underlying the binding of harmonic sounds, using mouse auditory cortex as a model system. Mouse auditory cortex consists of five areas that are interconnected to form hierarchical processing streams. Our preliminary data indicate that a higher auditory cortical area, A2, selectively represents multi-frequency sounds with coincident onset timings. We hypothesize that inhibitory circuits in A2 gate the integration of tones in a synchrony-dependent manner, and that this gating gives mice the ability to detect harmonic sounds. Our goal is to test this hypothesis by taking advantage of cutting-edge two-photon calcium imaging and in vivo whole-cell recording techniques available in mice. To achieve this goal, the project aims to (1) determine the distinct spectro-temporal integration across auditory areas (macroscopic and cellular-level calcium imaging), (2) dissect the circuit mechanisms underlying spectro-temporal integration (in vivo whole-cell recordings), and (3) determine the perceptual roles of higher auditory cortices in processing harmonics (optogenetics during behavior). Findings in the simple mouse cortex should provide a first step toward an ultimate understanding of the "feature binding" circuits that enable verbal communication, and of how they fail in diseased brains.

Public Health Relevance

In our daily conversations, we rely on our brains' ability to recognize individual sound objects, such as consonants and syllables. However, we know little about how our brains integrate spectrally and temporally distributed sounds to extract these features. Our research will reveal the neuronal circuit mechanisms underlying this computation, ultimately leading to discoveries that help develop better treatments for hearing problems.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Deafness and Other Communication Disorders (NIDCD)
Type
Research Project (R01)
Project #
1R01DC017516-01A1
Application #
9817163
Study Section
Auditory System Study Section (AUD)
Program Officer
Poremba, Amy
Project Start
2019-07-01
Project End
2024-06-30
Budget Start
2019-07-01
Budget End
2020-06-30
Support Year
1
Fiscal Year
2019
Total Cost
Indirect Cost
Name
University of North Carolina Chapel Hill
Department
Psychiatry
Type
Schools of Medicine
DUNS #
608195277
City
Chapel Hill
State
NC
Country
United States
Zip Code
27599