The brain has a remarkable ability to change its own function. In the visual system, such plasticity is evident when children learn to read, when the elderly adjust to impaired vision, and when new drivers master navigating traffic. The mechanisms that produce long-lasting plasticity in the visual brain remain unclear, however. With support from the National Science Foundation, Dr. Stephen Engel of the University of Minnesota is carrying out research to elucidate mechanisms of long-term visual plasticity. The work builds upon effects of short-term plasticity that are relatively well understood. For example, when observers view a bright pattern of stripes for a few minutes, their visual systems adapt, and fairly dramatic consequences can result: faint patterns of similar stripes that were previously apparent can be briefly rendered completely invisible. But few laboratory studies have examined perceptual adaptation over the long term, primarily due to methodological limitations. This project uses recently developed technology to overcome those roadblocks. Participants view video on a head-mounted display; the video originates in a head-mounted camera and is image-processed in real time by a laptop computer. The system allows participants to live in a visual world that has been digitally altered. Four experiments use this system to test hypotheses about long-lasting plasticity in the visual system. The first hypothesis is that longer-duration adaptation will produce longer-lasting plasticity. Participants are placed for up to five days in environments similar to those that produce short-term perceptual adaptation effects, and the persistence of adaptation is measured using traditional tests of perceptual abilities. The second hypothesis is that the visual system adapts to discount noisy, uninformative visual input, as suggested by preliminary results. Experiments in this project measure the strength of this novel form of adaptation over the long term. The third hypothesis is that both kinds of adaptation effects arise in early visual cerebral cortex. To test this, the neural bases of long-term adaptation are measured with functional magnetic resonance imaging (fMRI). Together, these experiments can strongly constrain both empirical and theoretical accounts of long-lasting visual plasticity.
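As an illustration only (not the project's actual software), the sketch below shows the kind of capture, process, and display loop such an altered-reality system runs. It assumes a generic camera on device 0 and uses Python with OpenCV; process_frame is a hypothetical placeholder standing in for whatever digital alteration an experiment applies.

```python
# Minimal sketch of an altered-reality video loop (illustration only):
# grab frames from a head-mounted camera, alter them, and show the
# altered frames on the head-mounted display.
import cv2

def process_frame(frame):
    # Placeholder: return the image unchanged. A real experiment would
    # digitally alter the frame here before it reaches the display.
    return frame

cap = cv2.VideoCapture(0)                            # head-mounted camera
try:
    while True:
        ok, frame = cap.read()                       # grab current image
        if not ok:
            break
        cv2.imshow("display", process_frame(frame))  # show altered image
        if cv2.waitKey(1) & 0xFF == ord("q"):        # press 'q' to quit
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```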
This research is advancing understanding of how the human visual system can modify its own operation. This ability, visual plasticity, underlies the acquisition of a wide array of human skills, from learning to read, to learning to hit a baseball, to learning to see patterns in satellite imagery. The project uses novel technology to place observers in a digitally altered world for up to five days, and measures how their visual perception and their visual cortex adapt to this challenge. Results can help identify the specific factors, for example the length and kind of visual stimulation, that lead to long-lasting changes in visual performance. These in turn should have important applications in a diverse array of fields where visual plasticity is critical, from education to the military to public health.
The human visual system has the remarkable ability to change its own function in order to keep us seeing well when our visual environment changes. When people cope with eye disease, when they train as baggage scanners or artists, and even when they simply move from the city to the country, the kind of visual information reaching their brains changes. Such challenges produce large differences in the basic visual features (the colors, line segments, and simple shapes) that reach the eye. The visual parts of our brains are specialized for processing those features, and they automatically adjust to handle different balances of them. This process of visual adaptation allows us to continue to see well in an ever-changing environment. Understanding how it works has profound implications for many applications, from improving the learning of reading to developing new therapies to manage the consequences of eye disease. The neural processes that control this adaptation have been examined in dozens of studies of short-term changes, lasting a few minutes. But due to a lack of technology, little to no laboratory work has examined how our visual systems adapt to long-term challenges (goggles that invert or flip the world, the subject of many studies, present mainly challenges for integrating vision with our motor systems).

We developed new "altered reality" technology that allows us to change the mix of features subjects view for hours and days at a time. Subjects view the world through a video camera that feeds, via a laptop computer, into virtual reality goggles. The computer can process the video in real time, allowing subjects to live in a "photoshopped" world. We altered the balance between vertical and horizontal in subjects’ visual world by removing vertical information from the video. Neurons in visual cortex are specialized to respond to different orientations, so this manipulation effectively shut off input to millions of vertical-preferring neurons in the brain.

The work in this project investigated how the visual brain adjusts to this challenge over the long term. Our first major discovery was that long-term adaptation is a distinct process, different from the short-term adaptation already studied in dozens of papers. That is to say, adaptation to short-term changes and adaptation to long-term changes are under different and independent control. To demonstrate this, we had subjects adapt for 4 hours to a world without vertical by wearing our altered reality technology. Deprivation increased the responsiveness of vertical-preferring neurons, as they tried to signal any small amount of vertical still present in the input. We measured this effect perceptually: when we briefly showed subjects small amounts of vertical, it appeared brighter and more noticeable than normal. In addition, other orientations appeared tilted towards vertical. These effects grew quite large by the end of 4 hours of living in the world without vertical. To test for independent control of rapid, short-term adaptation and longer-term adaptation, we then briefly allowed subjects to view the normal world through the virtual reality goggles. This experience rapidly eliminated the effects of the 4 hours of deprivation, as short-term adaptation brought subjects’ vision back to its normal state. However, when we tested subjects in a neutral condition (just a blank gray screen), the effects of vertical deprivation gradually reappeared.
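For readers who want a concrete picture of the manipulation described above, the sketch below shows one standard way to strip a single orientation band from an image: zero out a wedge of its Fourier spectrum. This is an illustration of the general technique, not necessarily the filtering used in our system, and the 20-degree half-bandwidth is an arbitrary choice. Vertically oriented structure (vertical stripes and edges) has its energy near the horizontal frequency axis, so suppressing a wedge around that axis removes vertical content while leaving horizontal content largely intact.

```python
# Illustrative only: remove one orientation band from an image by zeroing
# a wedge of its Fourier spectrum (NumPy only).
import numpy as np

def remove_orientation(gray, center_deg=0.0, half_bw_deg=20.0):
    """Suppress Fourier energy whose frequency vector lies within
    half_bw_deg of center_deg (0 deg = horizontal frequency axis,
    i.e., vertically oriented stripes and edges)."""
    h, w = gray.shape
    fy = np.fft.fftfreq(h)[:, None]               # vertical frequencies
    fx = np.fft.fftfreq(w)[None, :]               # horizontal frequencies
    angle = np.degrees(np.arctan2(fy, fx))        # frequency-vector angle
    d = np.abs((angle - center_deg + 90.0) % 180.0 - 90.0)  # angular distance
    keep = (d > half_bw_deg) | ((fx == 0) & (fy == 0))      # spare the mean
    return np.real(np.fft.ifft2(np.fft.fft2(gray) * keep))

# A vertical grating is almost entirely removed ...
x = np.arange(256)
vertical = np.cos(2 * np.pi * x / 16)[None, :] * np.ones((256, 1))
print(np.abs(remove_orientation(vertical)).max())                 # ~0
# ... while a horizontal grating passes through nearly unchanged.
horizontal = vertical.T
print(np.abs(remove_orientation(horizontal) - horizontal).max())  # ~0
```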
This striking recovery of adaptation, the reappearance of the deprivation effects after brief re-exposure to the normal world, proved that short-term readaptation did not wipe out the long-term effects of vertical deprivation. The processes that control long-term adaptation in our visual system must be distinct from, and independent of, those that control short-term adaptation.

We next extended the adaptation duration to 4 days and made two additional major discoveries. Subjects wore the altered reality system during the day and spent nights blindfolded, with a nurse on call for assistance. Results of this experiment showed that the shorter-term adaptation measured in our previous study does not last over the long term. Our perceptual measures increased over the first day, as in our other experiments, but then surprisingly reversed themselves after about a day. Short- and medium-term adaptation apparently reach some limit, after which they decline. The study also revealed that very gradual, long-term changes can step in to improve vision. These were evident as a gradual increase in perceptual measures of adaptation over the last two and a half days of the experiment. This slower process appeared both to maximize the positive effects of visual adaptation and to minimize its unintended negative side effects.

The project also produced several smaller-scale studies, as well as several papers addressing general methodological issues important to the field. The most important of these developed a new way of displaying the results of neuroimaging studies. Our approach should minimize the errors of interpretation that the general public and seasoned investigators alike make when viewing "brain mapping" images.
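As a purely illustrative aside on the 4-day time course described above, the toy simulation below (an assumption of ours, not a model reported by the project) shows how a faster process that builds and then declines, combined with a much slower process that ramps up later, can produce adaptation measures that rise over the first day, reverse, and then climb again over the remaining days; all amplitudes and time constants are arbitrary.

```python
# Toy two-process illustration of the qualitative 4-day time course
# described above (arbitrary parameters, not fitted to any data).
import numpy as np

t = np.linspace(0.0, 4.0, 401)                      # days of deprivation
fast = 1.0 * (np.exp(-t / 2.0) - np.exp(-t / 0.3))  # rises, peaks, declines
slow = 0.8 * (1.0 - np.exp(-np.clip(t - 1.5, 0.0, None) / 1.5))  # late ramp
total = fast + slow                                  # measured adaptation

print(f"fast process peaks near day {t[np.argmax(fast)]:.1f}")
print(f"adaptation at day 1: {total[100]:.2f}, at day 4: {total[-1]:.2f}")
```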