The PI's objective in this research is to empower children with certain kinds of disabilities so they can participate fully in initial programming environments (IPEs) that are used to teach computer science. Specifically, the PI will investigate the science and necessary tool construction to support speech-enabled adaptation of IPEs such as Scratch, Lego LabVIEW for Mindstorms, and Alice, which were originally designed for manual input with a keyboard and mouse, in order to allow children with limited use of their limbs to interact via alternative interfaces. These IPEs traditionally rely on user interfaces involving windows, icons, and other graphical widgets, and require a mode of program input that can pose a barrier to those with upper limb motor impairments, who may lack the dexterity and mobility needed to control a mouse or keyboard with their hands. The PI's approach is to imitate the common mouse and keyboard interactions with a voice-driven interface customized for each IPE. To this end, the PI will develop a speech-aware application that runs in parallel to the IPE, listens to the user's voice commands, interprets the commands according to a grammar influenced by the IPE's concepts, and performs the corresponding mouse and keyboard operations within the IPE. Core research questions include how such assistive customizations can be automated using reverse engineering and model-driven engineering.
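The pipeline described above (listen for a command, interpret it against an IPE-influenced grammar, and replay the equivalent mouse/keyboard operation) can be sketched roughly as follows. This is a minimal illustration, not the project's implementation: all names are hypothetical, and the speech recognizer and OS-level event injection are stubbed out with plain functions.

```python
# Hypothetical sketch of a voice-command interpreter for an IPE.
# Real speech recognition and synthetic input injection are stubbed out.
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class Action:
    kind: str    # "click", "drag", or "type"
    target: str  # widget or block name within the IPE

# A toy grammar: each rule pairs a phrase pattern with an action kind.
# A real grammar would be generated per-IPE from its interface model.
GRAMMAR = [
    (re.compile(r"^click (?P<target>.+)$"), "click"),
    (re.compile(r"^drag (?P<target>.+) to script$"), "drag"),
    (re.compile(r"^type (?P<target>.+)$"), "type"),
]

def interpret(utterance: str) -> Optional[Action]:
    """Parse a recognized utterance against the grammar."""
    text = utterance.strip().lower()
    for pattern, kind in GRAMMAR:
        match = pattern.match(text)
        if match:
            return Action(kind=kind, target=match.group("target"))
    return None  # utterance not covered by the grammar

def execute(action: Action) -> str:
    """Stand-in for injecting a synthetic mouse/keyboard event.
    A real implementation would locate the target widget on screen
    and dispatch the event through the operating system."""
    return f"{action.kind}:{action.target}"
```

For example, `interpret("drag move 10 steps to script")` yields a drag action targeting the "move 10 steps" block, which `execute` would translate into the drag gesture a mouse user performs in Scratch.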

During the first year of the project, the PI will extend a previously developed proof-of-concept to cover the entire Scratch interface. The result will be a robust tool that enables Programming by Voice in Scratch, and which will serve as the evaluation instrument for a target group of children with disabilities. The lessons learned from the first phase of the project will drive a generalization of the steps needed to customize a speech interface for an existing application. Techniques involving screen scraping and reverse engineering, as well as model-driven engineering, will be investigated to automate the process of adapting IPEs to Programming by Voice. The resulting tools will be applied to a new IPE, the Lego LabVIEW for Mindstorms, to allow children with disabilities to program robots. The design and evaluation of the project will be performed in collaboration with United Cerebral Palsy of Birmingham, which will recruit participants into the project and provide resources for training, evaluation, and feedback on the project design.

Broader Impacts: This project unites ideas from human-computer interaction and computer science education to provide customized assistive environments for teaching computational thinking to children with disabilities (targeting grades 6-12). The work will advance our ability to automate the generation of software development environments that support Programming by Voice, enabling children with upper limb motor impairments to participate fully in computer science education opportunities. The PI will involve graduate students in the research, supervise undergraduate Honors projects, and mentor high school students from underrepresented groups on science fair projects related to this work. The results of the research will be disseminated through a project web page that will include open source software, teaching materials, video demonstrations, and publications.

National Science Foundation (NSF)
Division of Information and Intelligent Systems (IIS)
Standard Grant (Standard)
Program Officer
Ephraim P. Glinert
University of Alabama Tuscaloosa
United States