The purpose of this exploratory project is to test a proof of concept for an emerging and potentially transformative mobile system that enhances recognition and way-finding for users with severe visual impairments. The PI's Near and Far Environmental Awareness System (NaFEAS) will run on a mobile phone platform in concert with RFID tags and sensors, which serve as inputs to and outputs from a knowledge database that the user can update. NaFEAS will recognize tagged objects and provide audio and other feedback to users within a designated envelope around the sagittal and coronal body planes. In addition, NaFEAS will facilitate way-finding using tagged objects in the near and far environments. The focus here is on developing a low-fidelity prototype through participatory design; the theoretical basis is embodied cognition, which emphasizes the full use of senses, gestures, and space to understand objects and the environment. The PI will take special care to minimize the cognitive biases that arise when sighted designers or researchers retrofit visually dominant interaction patterns onto accessible devices. The prototype will be evaluated by members of the target group using an experimental design with self-report, behavioral, and physiological variables.
Broader Impacts: In addition to yielding a prototype of direct value to the target population, the findings from the experiments will advance our knowledge of how individuals with severe visual impairments successfully interact with next-generation systems, including mobile, wearable, and ubiquitous systems. This knowledge will generalize to technologies beyond the test bed the PI will use.