This module serves the computing needs of CVS Core users. The past five years have seen exciting developments in vision science and translational research in ophthalmology at Rochester, and with those developments the need for new software and new computing technologies has increased significantly. Perhaps most important among these changes have been the development of new multi-user research resources and the spread of common technologies across a number of different research communities within CVS.

Multi-user resources within CVS include two virtual reality labs for the study of perceptual learning and multi-sensory integration in both normal subjects and patient populations, and a 52-processor computing cluster for data analysis and simulation of large-scale neural networks. Common technologies include a number of shared imaging devices (e.g., adaptive optics systems) used by both the retinal imaging and vision correction communities, common multi-electrode recording systems for electrophysiology, and common stimulus display systems used by both neurophysiology and behavioral labs.

These shared resources will continue to require significant support from the computing module, which will make important contributions to projects that include massively parallel simulations of neural networks, new virtual-reality software for multi-sensory integration experiments in both human and non-human primates, software development for multi-electrode recordings in non-human primates, and imaging of the living human eye using adaptive optics. The computing module is staffed by three highly qualified full-time applications programmers who work closely with CVS investigators to implement novel software and hardware designs that enable new research projects.

National Institutes of Health (NIH)
National Eye Institute (NEI)
Center Core Grants (P30)
Study Section: Special Emphasis Panel (ZEY1-VSN)
University of Rochester
United States
Bosen, Adam K; Fleming, Justin T; Brown, Sarah E et al. (2016) Comparison of congruence judgment and auditory localization tasks for assessing the spatial limits of visual capture. Biol Cybern 110:455-471
Chapman, Robert M; Gardner, Margaret N; Mapstone, Mark et al. (2016) ERP C250 shows the elderly (cognitively normal, Alzheimer's disease) store more stimuli in short-term memory than Young Adults do. Clin Neurophysiol 127:2423-35
Wimmer, Klaus; Ramon, Marc; Pasternak, Tatiana et al. (2016) Transitions between Multiband Oscillatory Patterns Characterize Memory-Guided Perceptual Decisions in Prefrontal Circuits. J Neurosci 36:489-505
Sharma, Robin; Schwarz, Christina; Williams, David R et al. (2016) In Vivo Two-Photon Fluorescence Kinetics of Primate Rods and Cones. Invest Ophthalmol Vis Sci 57:647-57
Kim, HyungGoo R; Pitkow, Xaq; Angelaki, Dora E et al. (2016) A simple approach to ignoring irrelevant variables by population decoding based on multisensory neurons. J Neurophysiol 116:1449-67
Jaynes, Molly J; Schieber, Marc H; Mink, Jonathan W (2016) Temporal and kinematic consistency predict sequence awareness. Exp Brain Res 234:3025-36
Kim, HyunGoo R; Angelaki, Dora E; DeAngelis, Gregory C (2016) The neural basis of depth perception from motion parallax. Philos Trans R Soc Lond B Biol Sci 371:
Dieter, Kevin C; Melnick, Michael D; Tadin, Duje (2016) Perceptual training profoundly alters binocular rivalry through both sensory and attentional enhancements. Proc Natl Acad Sci U S A :
Schwarz, Christina; Sharma, Robin; Fischer, William S et al. (2016) Safety assessment in macaques of light exposures for functional two-photon ophthalmoscopy in humans. Biomed Opt Express 7:5148-5169
Sharma, Robin; Williams, David R; Palczewska, Grazyna et al. (2016) Two-Photon Autofluorescence Imaging Reveals Cellular Structures Throughout the Retina of the Living Primate Eye. Invest Ophthalmol Vis Sci 57:632-46

Showing the most recent 10 out of 178 publications