This project will apply touch feedback technology (skin stretch feedback) in a game controller, resulting in a refined game controller with integral skin stretch feedback. Tactile interaction effects that can be portrayed on this controller will be integrated with a commercial video game to demonstrate the technology. This is expected to improve both game immersion and player performance, with performance gains in the form of faster completion times, improved spatial awareness and tracking capability, and reduced cognitive load for the user.
The game controller with embedded skin stretch feedback has the potential to change the gaming world and to help this technology proliferate into other applications. The tactile effects portrayed in game play can be refined and applied to any application that can benefit from an additional modality of communicated information, including mobile navigation in cars, airplanes, and helicopters; military applications such as improved control of mobile robots or unmanned aerial, marine, and ground vehicles; medical applications such as robotic surgery and rehabilitation; and guidance for the blind. Because the game controller also has great appeal to young adults and to the public in general, it has been and will continue to be used for educational activities. The controller serves as a platform to explain the importance of a strong technical background in engineering and computer science and to show that careers in engineering can be rewarding.
This project has discovered an innovative way to emulate force feedback that meets the challenging requirements of providing compelling touch feedback for Virtual Reality (VR) while still allowing users to gesture and move their hands naturally in VR. An immersive VR experience requires engaging all of the senses, and the sense of touch has been largely ignored, so this project's technology innovation could have a significant commercial impact on VR and video games, as well as on applications such as upper limb rehabilitation and robotic surgery. This project has transformed the fundamental research that the Principal Investigator (PI) was conducting on tactile perception under his NSF CAREER grant (IIS-0746914) so that it can be applied to virtual reality and video games. This transformation was the result of a customer-centric approach that developed new tactile devices and experiences and gathered feedback from potential customers in an iterative fashion (the customer discovery process). Through this approach the project team identified a new means of emulating force feedback using a new form of tactile feedback developed by the PI.

This new means of emulating force feedback is the major intellectual merit of this research. It is of particular merit because the technology can portray force-like interactions that are more realistic than current vibration feedback (also known by the trade name "Rumble" feedback), yet does not require a robotic arm connected to a desktop, as most force feedback devices do. The technology therefore allows users a large range of motion and can be produced at considerably lower cost than force feedback devices. In contrast to vibration feedback, this new form of tactile feedback uses sliding plates in the handle of a motion controller (e.g., a Nintendo Wii controller) to subtly shear the user's skin, presenting the hand with the friction forces it would encounter when holding real-life objects. We refer to this form of touch feedback as "shear feedback." The project has developed and refined functional motion controller prototypes and paired these devices with conventional video game interaction scenarios, such as sword fighting, shooting a slingshot, or fishing, using the cross-platform development environment Unity. A variety of demos were developed to highlight the wide range of tangible physical interactions that can be portrayed using this newly developed form of touch feedback. Unity was chosen for these demos because it is rapidly becoming the game engine of choice for many developers. The project's Principal Investigator has formed a company, Tactical Haptics, to commercialize the outcomes of this I-Corps project.

The broader impacts of this research lie in creating a new form of touch (haptic) feedback that emulates force feedback in a low-cost, untethered package. This makes the technology attractive for virtual reality (VR) and video games; however, it could also be used as a feedback technology for space or surgical telerobotics, or to gamify applications such as physical therapy. Furthermore, the developed technology could create more engaging science, technology, engineering, and math (STEM) education by portraying the interaction forces of molecules, magnets, wind, etc.
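As a rough illustration of how demos like the ones described above might couple simulated game physics to the shear plates, the following sketch maps a simulated interaction force (for example, line tension in a fishing demo) to a plate displacement command. All names and values here (ShearCommand, MAX_SHEAR_MM, the linear saturation mapping) are assumptions made for illustration and do not reflect the actual Tactical Haptics hardware interface or the Unity scripts used in the project.

# Hypothetical sketch: mapping a simulated in-game force to a shear-plate
# displacement command. Names and constants are illustrative assumptions,
# not the actual Tactical Haptics or Unity API.

from dataclasses import dataclass

MAX_FORCE_N = 20.0    # assumed force at which the shear plates saturate
MAX_SHEAR_MM = 3.0    # assumed maximum plate travel in millimeters

@dataclass
class ShearCommand:
    """Displacement command for one sliding plate in the controller handle."""
    plate_id: int
    displacement_mm: float

def force_to_shear(plate_id: int, force_n: float) -> ShearCommand:
    """Convert a simulated tangential force into a plate displacement.

    A simple linear mapping with saturation; a real device driver would
    also account for plate dynamics and skin mechanics.
    """
    clipped = max(-MAX_FORCE_N, min(MAX_FORCE_N, force_n))
    displacement = (clipped / MAX_FORCE_N) * MAX_SHEAR_MM
    return ShearCommand(plate_id=plate_id, displacement_mm=displacement)

# Example: a fishing demo where line tension grows as the fish pulls.
if __name__ == "__main__":
    for tension in (0.0, 5.0, 12.0, 30.0):   # newtons of simulated line tension
        cmd = force_to_shear(plate_id=0, force_n=tension)
        print(f"tension {tension:5.1f} N -> plate {cmd.plate_id} "
              f"shear {cmd.displacement_mm:+.2f} mm")

In such a scheme, the game engine would compute the interaction force each frame and stream the resulting displacement commands to the controller, so the plates track the simulated friction forces in real time; the linear saturating map above is only one plausible choice for that conversion.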