The goal of this exploratory project is to determine whether people can learn to write legibly without visual feedback. Specifically, the PI will explore ways to train blind individuals to use handwriting as an accurate and convenient method for entering text into a Personal Digital Assistant (PDA). He will determine equipment requirements, and will design and implement proof-of-concept software and hardware that enable low-cost, off-the-shelf PDAs or smart remote controls, which are currently unusable by people who are blind, to replace expensive Braille devices. If this proves feasible, determining the optimum feedback strategy will require evaluating many different modalities and techniques. Blind and visually impaired subjects will learn the shapes of letters by tracing them on laser-cut stencils. Various sizes and fonts will be evaluated to determine the best fit for writing on a PDA screen. Various audio and haptic feedback techniques will be evaluated for reinforcing muscle memory, with the goal of providing a virtual audio and haptic template, whose exact nature is not yet known, after the mechanical template is removed. The software will record the trajectory of the physical template, and any deviation from it will trigger changes in sounds or pseudo-haptic vibration patterns in the stylus. Similar feedback will inform the user about the position and orientation of the writing. In addition to feedback that guides the handwriting, it will be necessary to integrate feedback, such as synthesized speech, from the application being accessed; this raises interesting research questions, such as how to minimize the cognitive load imposed on the user and how to prevent different feedback channels from interfering with one another.
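The deviation-based feedback described above can be sketched in code. This is only an illustrative assumption about how such a system might work, not the project's actual design: the function names, the tolerance values, and the nearest-point deviation measure are all hypothetical.

```python
import math

def nearest_deviation(template, point):
    """Distance from a stylus sample to the closest point on the
    recorded template trajectory (a list of (x, y) tuples)."""
    return min(math.dist(p, point) for p in template)

def feedback_level(deviation, tolerance=5.0, max_dev=20.0):
    """Map a deviation (in screen units) to a 0..1 feedback intensity.

    Within `tolerance` the writer is considered on-track (silence);
    beyond it, intensity ramps linearly and saturates at `max_dev`.
    The resulting value could drive audio volume/pitch or the
    amplitude of a vibration pattern in the stylus.
    """
    if deviation <= tolerance:
        return 0.0
    return min((deviation - tolerance) / (max_dev - tolerance), 1.0)
```

For example, a stylus sample 3 units off a straight-line template yields a deviation of 3.0 and, under the tolerances above, no feedback; a sample 12.5 units off yields an intensity of 0.5.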
As time allows, the PI also plans to explore whether strategies that prove effective for blind users will be equally effective for sighted people who must function in eyes-busy or poorly illuminated situations.
Broader Impacts: If successful, this project will lead to low-cost alternatives to Braille devices, will enable blind people to pick and choose from off-the-shelf consumer devices, and will eliminate the need to learn Braille for people who lose their sight after they have already learned handwriting. The project will also lead to alternative ways for aging people to interact with devices such as TV remotes when physical or visual impairments make it difficult for them to read the legends on the buttons or to operate the buttons. These benefits are not limited to older or disabled people, however. Handwriting input provides a convenient input modality for any device that employs natural language processing. Most current devices that accept natural language input require either a keyboard or a speech recognition system; unfortunately, speech recognition is still unreliable in many domestic and work environments, and inappropriate in many social situations. Initial experiments with a PDA have shown that it is much easier and quicker to write commands (such as "tv on") than to navigate menus and select the required option. The PI's Archimedes Project has previously developed, and received several patents for, an Intent Driven Interface called iTASK that uses natural language text input to control computers or appliances and to enter information into a computer. The proposed handwriting input strategies will be fully compatible with iTASK and will enhance its usability in a very broad range of home, work, school, and leisure applications.