Research has revealed much about the mechanisms of the visual system. However, perceptual experience is usually multimodal, with close relationships between the visual and auditory modalities. Auditory signals influence neural activation throughout the visual pathways, including the midbrain and primary visual cortex. It is therefore important to extend rigorous theories of vision to multimodal contexts. Prior research on auditory-visual interactions has focused primarily on the perception of space, timing, duration, motion, and speech, whereas recent research has also demonstrated auditory-visual interactions in the perception of objects and faces. The goal of the proposed research is to fill the gap in our understanding of auditory-visual interactions at the level of visual feature processing.

We will characterize which acoustic patterns uniquely interact with the processing of low-level (e.g., spatial frequency), intermediate-level (e.g., material texture and 2D shape), and high-level (e.g., common objects, words, face identity, and facial expressions) visual features. To understand these interactions, we will combine psychophysics and computational modeling (AIM 1) to determine how associated sounds influence basic mechanisms of visual feature processing: those that control image visibility (front-end signal-to-noise ratio and sampling efficiency), those that control signal competition for visual awareness, and those that control the strength and reliability of neural population coding of visual features in the presence of between- and within-receptive-field signal interactions. The results will provide an integrative understanding of how sounds influence visual signals, sampling, competition, and coding in the processing of low-, intermediate-, and high-level visual features.

The proposed research will also allow the development of cross-modal methods for assisting visual perception by enhancing specific spatial scales, materials, shapes, objects, and facial expressions. For example, our preliminary results suggest that sounds can be used to boost and tune the perception of facial expressions and to direct attention to specific spatial frequencies. In the translational aim (AIM 2), we will systematically investigate how sounds can be used to aid visual perception: for example, to direct attention to an object, material, word, or facial expression during search; to facilitate object recognition by directing attention to diagnostic spatial-frequency components; and to enrich scene understanding by directing attention to multiple spatial scales. Because feature-specific auditory signals are readily presented over headphones, the proposed research may provide a means to counter biased perception (e.g., perceiving facial expressions as negative due to social anxiety) and to direct attention to specific objects and spatial scales (e.g., details versus gist) for individuals with visual challenges such as low vision or stroke-related visual deficits, or with attention disorders. Thus, the proposed research will not only systematically integrate auditory influences into current models of visual feature processing, but may also provide a means of aiding visual processing with auditory signals.
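The abstract does not spell out the visibility model; one standard way to separate the two front-end factors it names (signal-to-noise ratio and sampling efficiency) is the equivalent-noise formulation from observer-model psychophysics, sketched here as an illustration rather than as the authors' own model. Threshold signal energy $E_\theta$, measured in external visual noise, is assumed to follow

\[
E_\theta \;=\; \frac{k^2}{\eta}\,\bigl(N_{\mathrm{ext}} + N_{\mathrm{eq}}\bigr),
\]

where $N_{\mathrm{ext}}$ is the power spectral density of the external noise, $N_{\mathrm{eq}}$ is the observer's equivalent input noise (the front-end signal-to-noise limit), $\eta$ is sampling efficiency, and $k$ is the threshold signal-to-noise ratio of an ideal observer. Under this formulation, a sound that improves visibility would show up as a reduced $N_{\mathrm{eq}}$, an increased $\eta$, or both, which is exactly the kind of dissociation the AIM 1 experiments are positioned to test.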

Public Health Relevance

Visual signals are often accompanied by related auditory signals, so understanding auditory influences on visual processing is important for understanding how the visual system works in realistic contexts. Recent results suggest that auditory-visual interactions extend to the perception of objects; for example, playing the characteristic sound of a target object (e.g., a meow for a cat) facilitates visual search even when the sound is spatially uninformative. Understanding the nature of these interactions may provide new insights for alleviating vision problems, such as age-related visual impairments, by using auditory stimulation.

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Research Project (R01)
Project #
5R01EY021184-02
Application #
8313865
Study Section
Cognition and Perception Study Section (CP)
Program Officer
Wiggs, Cheri
Project Start
2011-09-01
Project End
2014-08-31
Budget Start
2012-09-01
Budget End
2013-08-31
Support Year
2
Fiscal Year
2012
Total Cost
$374,888
Indirect Cost
$124,888
Name
Northwestern University at Chicago
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
160079455
City
Evanston
State
IL
Country
United States
Zip Code
60201
Mossbridge, Julia; Zweig, Jacob; Grabowecky, Marcia et al. (2017) An Association between Auditory-Visual Synchrony Processing and Reading Comprehension: Behavioral and Electrophysiological Evidence. J Cogn Neurosci 29:435-447
Menceloglu, Melisa; Grabowecky, Marcia; Suzuki, Satoru (2017) Comparing the effects of implicit and explicit temporal expectation on choice response time and response conflict. Atten Percept Psychophys 79:169-179
Menceloglu, Melisa; Grabowecky, Marcia; Suzuki, Satoru (2017) Temporal expectation weights visual signals over auditory signals. Psychon Bull Rev 24:416-422
Parrott, Stacey; Guzman-Martinez, Emmanuel; Orte, Laura et al. (2015) Direction of Auditory Pitch-Change Influences Visual Search for Slope From Graphs. Perception 44:764-78
Sherman, Aleksandra; Grabowecky, Marcia; Suzuki, Satoru (2015) In the working memory of the beholder: Art appreciation is enhanced when visual complexity is compatible with working memory. J Exp Psychol Hum Percept Perform 41:898-903
Skogsberg, KatieAnn; Grabowecky, Marcia; Wilt, Joshua et al. (2015) A relational structure of voluntary visual-attention abilities. J Exp Psychol Hum Percept Perform 41:761-89
Zweig, L Jacob; Suzuki, Satoru; Grabowecky, Marcia (2015) Learned face-voice pairings facilitate visual search. Psychon Bull Rev 22:429-36
Brang, David; Towle, Vernon L; Suzuki, Satoru et al. (2015) Peripheral sounds rapidly activate visual cortex: evidence from electrocorticography. J Neurophysiol 114:3023-8
List, Alexandra; Iordanescu, Lucica; Grabowecky, Marcia et al. (2014) Haptic guidance of overt visual attention. Atten Percept Psychophys 76:2221-8
Paller, Ken A; Suzuki, Satoru (2014) The source of consciousness. Trends Cogn Sci 18:387-9

Showing the most recent 10 out of 29 publications