This research investigates intersensory mechanisms that underlie the influence of auditory stimuli on visual perception. Findings from single-cell neurophysiology, event-related brain potentials, and functional MRI indicate that neurons in cortical areas once thought to be unisensory can be driven by stimuli from multiple modalities. Moreover, the timing of this multisensory-driven activity makes it unlikely to be due solely to feedback from anatomically later cortical areas. A set of recently reported behavioral effects, in which informationally unrelated but concomitantly presented auditory stimuli enhance visual perception, appears to reflect these multisensory mechanisms. To date, however, these effects have not been understood in terms of any common intersensory mechanism.

This research tests the audiovisual temporal-encapsulation hypothesis, which states that an abrupt, transient auditory stimulus defines a temporal window within which the representations of visual stimuli appearing in that window are isolated from interaction with temporally surrounding visual information. This hypothesis can account for all of the effects described above, and it makes novel predictions beyond them, which are the focus of the present proposal. Significantly, the intersensory effects reported in the literature are primarily examples of visual facilitation by concomitant auditory stimuli. The new predictions derived from the audiovisual temporal-encapsulation hypothesis are examples of impaired visual perception caused by concomitant auditory stimuli, which makes them especially powerful tests of the hypothesis. The general idea is that if visual information is encapsulated by concomitant auditory stimuli, then tasks that require integrating that encapsulated visual information with information sampled at a different time will be impaired. This idea will be tested in three specific aims.
Specific Aim 1 tests the effects of auditory stimuli in a visual integration task.
Specific Aim 2 tests the effects of auditory stimuli on the processes by which visible persistence for moving stimuli is actively suppressed, known as motion deblurring.
Specific Aim 3 tests the effects of auditory stimuli on the processes by which visual information is suppressed during eye movements, known as saccadic suppression. If confirmed, the audiovisual temporal-encapsulation hypothesis could explain why people are susceptible to auditory distraction when engaged in what are primarily visual tasks (e.g., driving). Moreover, the encapsulation process might be a mechanism that is subject to failure in impaired states due to drugs, alcohol or disease. It may also be a mechanism that is subject to decline with age.
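To make the logic of the encapsulation prediction concrete, the following toy Monte Carlo sketch shows how encapsulation would be expected to lower accuracy in a two-frame visual integration task of the kind targeted by Specific Aim 1. It is an illustrative sketch only, not part of the proposed methods; the parameter values (baseline integration rate, encapsulation rate, and a 16-alternative guessing rate) are arbitrary assumptions introduced here for illustration.

```python
import random

# Toy Monte Carlo sketch of the temporal-encapsulation prediction.
# All parameter values are illustrative assumptions, not estimates from the proposal.

P_INTEGRATE_BASE = 0.90   # chance the two visual frames integrate when no tone is present
P_ENCAPSULATE = 0.60      # chance a coincident tone isolates frame 1 from frame 2
P_GUESS = 1.0 / 16        # guessing rate for a hypothetical 16-location integration task
N_TRIALS = 100_000

def run_condition(tone_present: bool, rng: random.Random) -> float:
    """Return simulated accuracy in a two-frame visual integration task."""
    correct = 0
    for _ in range(N_TRIALS):
        encapsulated = tone_present and rng.random() < P_ENCAPSULATE
        if not encapsulated and rng.random() < P_INTEGRATE_BASE:
            correct += 1          # frames integrate: response is correct
        elif rng.random() < P_GUESS:
            correct += 1          # frames fail to integrate: observer guesses
    return correct / N_TRIALS

if __name__ == "__main__":
    rng = random.Random(0)
    print(f"no tone:   {run_condition(False, rng):.3f}")
    print(f"with tone: {run_condition(True, rng):.3f}")
```

Under these assumed parameters, simulated accuracy drops when a tone accompanies the first visual frame, which is the direction of effect the hypothesis predicts for tasks requiring integration across the encapsulation window.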
This project seeks to understand the sensory mechanisms underlying the interaction between audition and vision under dynamic conditions (i.e., when stimuli are changing over time). These are important mechanisms because they determine when sensory stimulation in dynamic situations, such as when driving a car, will be distracting and possibly overwhelming. Moreover, they may be a source of non-optimal functioning with drugs, disease, development and/or aging.
Attarha, Mouna; Moore, Cathleen M; Vecera, Shaun P (2016) The time-limited visual statistician. J Exp Psychol Hum Percept Perform 42:1497-504 |
Jardine, Nicole L; Moore, Cathleen M (2016) Losing the trees for the forest in dynamic visual search. J Exp Psychol Hum Percept Perform 42:617-30 |
Attarha, Mouna; Moore, Cathleen M (2015) The capacity limitations of orientation summary statistics. Atten Percept Psychophys 77:1116-31 |
Fiedler, Anja; Moore, Cathleen M (2015) Illumination frame of reference in the object-reviewing paradigm: A case of luminance and lightness. J Exp Psychol Hum Percept Perform 41:1709-17 |
Attarha, Mouna; Moore, Cathleen M (2015) The perceptual processing capacity of summary statistics between and within feature dimensions. J Vis 15:9 |