Brain-computer interfaces (BCIs) extract and classify electrical signals recorded from the brain, allowing users to bypass the motor system and control hardware such as a computer cursor or a prosthetic device by thought alone. Practical applications for locked-in patients with ALS or severe spinal cord injury have been demonstrated in laboratory settings; however, current methods rely on subjects training their thoughts using feedback, which often leads to mental fatigue and unreliable performance under non-laboratory, self-paced conditions. The translational research proposed here seeks to address these issues by extracting information from electrical activity in parietal regions of the brain associated with the intended targets of voluntary reaching movements. An algorithm will be written to translate these electrophysiological signatures of the intended spatial goal, encoded in eye-centered coordinates, into spatial locations on the computer screen. This inferential method is expected to provide more robust detection of the user's intentions under self-paced, non-laboratory conditions. Contributions include improved assistive technologies for disabled populations as well as new generalizable knowledge in basic neuroscience regarding sensorimotor processing.
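
To make the proposed two-stage decoding idea concrete, the sketch below illustrates one plausible form it could take. The proposal does not specify the decoding algorithm, feature representation, or coordinate transform; here, as labeled assumptions, neural activity is summarized as a per-trial feature vector, a ridge-regression decoder stands in for the actual algorithm, and the eye-centered estimate is shifted by gaze position (from a hypothetical eye tracker) to obtain a screen location. All names and parameters are hypothetical placeholders.

```python
import numpy as np

# Minimal sketch of the two-stage decoding idea, under the assumptions
# stated above. Stage 1 decodes the intended target in eye-centered
# coordinates from neural features; Stage 2 maps that estimate onto the
# computer screen. Ridge regression here is a placeholder, not the
# proposal's actual algorithm.

rng = np.random.default_rng(0)

# --- Stage 1: decode eye-centered target coordinates from neural features ---
n_trials, n_features = 200, 64
X = rng.standard_normal((n_trials, n_features))            # neural features per trial (hypothetical)
W_true = rng.standard_normal((n_features, 2))
Y = X @ W_true + 0.1 * rng.standard_normal((n_trials, 2))  # eye-centered target (x, y), in visual degrees

lam = 1.0  # ridge penalty (arbitrary choice for this sketch)
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

def decode_eye_centered(features: np.ndarray) -> np.ndarray:
    """Estimate the intended target in eye-centered coordinates (degrees)."""
    return features @ W

# --- Stage 2: map eye-centered coordinates onto the screen ---
def eye_centered_to_screen(target_eye: np.ndarray,
                           gaze_screen: np.ndarray,
                           pixels_per_degree: float = 40.0) -> np.ndarray:
    """Shift the decoded eye-centered offset by the current gaze position.

    The screen location is the fixation point plus the decoded offset,
    scaled from visual degrees to pixels. This ignores screen geometry
    and head position, which a real system would have to model.
    """
    return gaze_screen + pixels_per_degree * target_eye

# Example: decode one trial and place the target on a 1920x1080 screen.
features = X[0]
gaze = np.array([960.0, 540.0])  # current fixation, in pixels (from an eye tracker)
print(eye_centered_to_screen(decode_eye_centered(features), gaze))
```

Separating the neural decoder from the coordinate transform mirrors the proposal's framing: the eye-centered representation is what the parietal signals are expected to carry, while the screen mapping is a display-specific step that can be recalibrated without retraining the decoder.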