Despite advances in computer-controlled automation, the technological capabilities of robotic and 'smart' instruments are still far exceeded by those of the human brain. The next generation of intelligent devices will need to combine features the brain already exhibits, such as perceptual ability, rapid learning, and the capacity to generalize. To clarify a fundamental component of intelligent behavior, this project studies how cellular interactions within the cerebral cortex underlie problem-solving strategies in behaving primates, using new technologies for recording brain activity, designing microchips, and performing pattern-recognition analysis. The objectives are to characterize the learning methods used to map sensory cues (from sight, sound and touch) onto relevant motor behaviors (directed arm movements), to determine how different learning strategies affect the dynamic relations among functional groups of cortical nerve cells during learning, and to design a brain-machine interface that samples and processes neuronal activity in real time in behaving animals. Technological goals include a virtual reality environment for behavioral testing, a wireless multichannel microchip for transmitting brain activity to a remote receiver, and pattern-recognition algorithms for analyzing complex patterns of activity across multiple cells. This project, part of the Knowledge and Distributed Intelligence (KDI) initiative, will have broad impact on fields such as neurobiology, bioengineering and computer science, has potential applications in technology for intelligent interactive robotics, and will provide excellent cross-disciplinary training for a range of students and postdoctoral researchers.