The proposed technology is a high-performance optoelectronic architecture for multi-finger interaction. As computing resources grow cheap and ubiquitous, embodied interaction becomes increasingly important, so that people can express themselves to computers with their bodies in the same rich ways that we communicate with each other. Multi-touch and free-air sensing are important new modalities for embodied interaction. Most multi-touch devices on the market are smartphones and tablets that use capacitive sensing. However, capacitive technologies do not scale well to large displays because of cost and technological challenges, and they cannot sense gloved fingers, which precludes their use in many application scenarios. In contrast, the proposed sensing structure has a competitive cost structure when scaled to large displays: its cost increases linearly with display size, as opposed to the quadratic increase exhibited by capacitive sensing. The technology does not require actual touching, only interruption of a sensing plane, enabling gloved interaction in emerging markets such as automobiles. It works outdoors in sunlight and in hazardous conditions, and is suitable for TV, tele-operation, and 3-D scanning. Beyond multi-touch sensing, the technology enables other new modalities, such as pen+touch, haptics+touch, and free-air interaction. The range of potential markets is wide and deep.
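The linear-versus-quadratic cost argument can be illustrated with a toy element count: an edge-mounted optical frame needs emitter/receiver elements only along the display's perimeter, while a capacitive grid needs electrodes at every row-column intersection across the display's area. The pitch and counts below are illustrative assumptions, not actual ZeroTouch hardware figures:

```python
# Toy comparison of sensor-element counts for a square display.
# (Illustrative assumptions only -- not ZeroTouch specifications.)

def optical_edge_elements(side_cm, pitch_cm=0.5):
    """Edge-mounted optical sensing: elements line the perimeter,
    so the count grows linearly with display side length."""
    return int(4 * side_cm / pitch_cm)

def capacitive_grid_elements(side_cm, pitch_cm=0.5):
    """Projected-capacitive sensing: electrode intersections fill
    the area, so the count grows quadratically with side length."""
    lines = side_cm / pitch_cm
    return int(lines * lines)

# Phone-, monitor-, and wall-sized displays (side length in cm):
for side in (10, 50, 100):
    print(side, optical_edge_elements(side), capacitive_grid_elements(side))
```

Scaling the side length 10x multiplies the edge-mounted element count by 10, but the grid element count by 100, which is the cost gap the abstract refers to.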

The proposed technology can transform the impact of embodied interaction on society. Potential integration with large-scale LCDs can transform users' experiences of media in living rooms and conference rooms alike. The high-precision, many-finger tracking of the team's approach, applied in large formats and in social, innovation-oriented contexts, will enable new forms of Computer Supported Cooperative Work and Play in the information-age workplace, education, and home. Integration with automobiles will first enable easier interaction by gloved hands on small control panels; more innovative solutions will use the platform for interaction across the whole dashboard, and use the windshield as a heads-up display. If successfully commercialized, the technology has the potential to open up a broad new array of industrial applications.

Project Report

ZeroTouch is a new technology that aims to bring multi-touch sensing to your desktop monitors, televisions, and more. You're probably most familiar with multi-touch from using your tablet or smartphone; anytime you pinch or zoom an image, or try to knock out mischievous pigs in Angry Birds, the multi-touch sensor in your phone is sending the x,y coordinates of each of your fingers to its internal software. However, the sensors used in phones and tablets rely on a technology that doesn't scale well to large screen sizes, like those used on your desktop or in your living room. ZeroTouch aims to fill that gap.

Our goal in the NSF I-Corps program was to find commercialization opportunities for the ZeroTouch technology developed at the Interface Ecology Lab. During the course of the summer, the Principal Investigator and Entrepreneurial Lead teamed up with a business mentor and startup expert, and participated in an intensive crash course in entrepreneurship. During this process, we reached out to 10+ potential customers each week, including computer manufacturers like Dell, software manufacturers like Microsoft, and other companies that we felt could benefit from the technology. During the development of ZeroTouch, we received a lot of positive feedback from the research community regarding its potential applications and the possibilities it creates for enabling human-computer interaction researchers to develop new, interesting forms of interaction. However, when we began to investigate the multi-touch marketplace, we discovered that the production costs of the sensor weren't as amenable to mass production and commercialization as we had hoped. While appropriate for a research environment, where staying on the cutting edge of technology is essential, ZeroTouch simply wasn't ready for the mass market.
Since then, we've continued to use ZeroTouch as an integral part of our research into new forms of human-computer interaction: establishing a research group dedicated to investigating the "living room of the future", performing basic research on simultaneous pen+touch interaction (like you do every day with a pen and paper), and further exploring ways to reduce the cost of ZeroTouch for future commercialization. ZeroTouch is also involved in research collaborations with TEEX Urban Search and Rescue and with the University of Nottingham Mixed Reality Lab and Horizon Center of Digital Economy.

Agency: National Science Foundation (NSF)
Institute: Division of Industrial Innovation and Partnerships (IIP)
Type: Standard Grant (Standard)
Application #: 1242538
Program Officer: Rathindra DasGupta
Budget Start: 2012-07-01
Budget End: 2012-12-31
Fiscal Year: 2012
Total Cost: $50,000
Name: Texas A&M University
City: College Station
State: TX
Country: United States
Zip Code: 77845