Many researchers study how nonhuman animals see the world. To date, however, only certain types of experiments have been possible, given the limitations of animal learning and response capabilities. With funding from the National Science Foundation, Drs. Nakayama and Pepperberg at Harvard and Brandeis Universities will address questions about visual processing in Grey parrots by querying the birds in English, much as one queries human subjects. Grey parrots' ability to mimic human speech is one characteristic that has made them popular as pets. Pepperberg's Grey parrot Alex (now deceased) was able to respond verbally to simple optical illusions (e.g., the Müller-Lyer illusion, in which two lines appear to humans to differ in length but in reality do not; Alex responded as humans do). Drs. Nakayama and Pepperberg will train two birds, Griffin and Arthur, to label various colors and shapes using the sounds of English speech. The current project will then examine whether parrots, like people, can (a) complete the shape of a partially covered object (e.g., recognize a figure as a square, and not as a five-cornered object, even though one of its corners is covered by a circle), a process formally known as "amodal completion," and (b) "see" objects that are not actually present, such as a triangle that "appears" among three Pac-Man-like partial circles arranged in a triangular pattern, a phenomenon formally known as an "illusory contour" or "Kanizsa figure." One might expect a parrot to be able, for example, to infer the presence of a predator that is not fully visible, but no one has been able to ask any nonhuman such questions directly. Future research will involve more complex tasks designed to study how birds pay attention to objects in their visual environment.

The underlying long-term goal of this research is to determine whether such tasks reveal differences in perceptual processing between birds and humans. Success in training the parrots will enable Drs. Nakayama and Pepperberg eventually to examine a broad range of visual tasks and determine which perceptual abilities share the same underlying mechanisms in birds and humans and which do not. Similarities between these two species with very different brain sizes will show which components of perception can be implemented with smaller-scale neural architecture; differences will indicate which components require greater brain size and/or complexity. The data will guide future comparisons with other species, provide insights into the structure and function of the human brain, and may inform the design of artificial visual processors.

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Type: Standard Grant (Standard)
Application #: 0920878
Program Officer: Lawrence Robert Gottlob
Project Start:
Project End:
Budget Start: 2009-10-01
Budget End: 2010-09-30
Support Year:
Fiscal Year: 2009
Total Cost: $79,998
Indirect Cost:
Name: Harvard University
Department:
Type:
DUNS #:
City: Cambridge
State: MA
Country: United States
Zip Code: 02138