Simulated virtual prototypes have removed much trial-and-error from the design process in many fields. But to be useful as a virtual prototype, a simulation must reliably predict the relevant characteristics of the thing being designed: strength for a bridge, lift for an aircraft, or appearance for a fabric. Previous advances in rendering have led to an explosion in the use of rendered images to prototype the appearance of objects, and even to represent them to customers, but only where the materials involved can be represented well. Textiles are currently difficult to represent because of their intricate construction and subtle reflectance, and because their appearance is inextricably linked to their shape and motion. Although fairly suitable for movies, the state of the art in simulating cloth appearance and deformation is not predictive of real appearance and is therefore not usable in a design process, where realistic but inaccurate results are useless. This project, a collaboration between computer scientists at Cornell University and textile designers at the Rhode Island School of Design (RISD), aims to create the predictive simulation tools needed for virtual prototyping to revolutionize the textile and garment industries in the same way it has already revolutionized so many other fields of manufacturing. New models and algorithms created by the Cornell team will be integrated into the Loomit tool for fabric design; the RISD team will then use the enhanced tool to produce new textiles, which will be shipped back to Cornell, where the changes in the RISD design process resulting from Loomit's enhanced capabilities will be studied. At RISD, this work will engage many student artists and designers, helping them develop the skills to work with increasingly technological media in their future careers, while at Cornell the project will give computer science students experience in understanding and solving problems faced by artists and designers.

Textiles remain a challenge for appearance simulation because of many unsolved fundamental problems. The state-of-the-art models for realistic cloth rendering and simulation were developed for entertainment applications and do not accurately predict the behavior of real textiles. In this project, the PI team will compare appearance and deformation models against measurements of textiles from CT imaging and other modalities, then improve the models as needed so that they can fit real materials. Current methods cannot capture the properties of an existing fabric well enough to predict its appearance under close visual inspection; the PI team will therefore develop new methods for jointly capturing the appearance and mechanical properties of textiles, combining measurement with model tuning to produce predictive working models of textiles and of the garments and furnishings made from them. Detailed cloth simulations and renderings are currently impractical for moderately complex objects such as complete garments; scalability fundamentally requires multi-scale models that simulate fine details only when they are actually needed, and the PI team will create such multi-scale models to enable interactive, predictive visualization during the design process. Since the targeted applications all require accurate prediction, a major focus of the research is validation: testing both the accuracy of the individual technical components and the usefulness of the new techniques in actual design practice.
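The abstract does not specify the fitting machinery, but the measurement-plus-model-tuning loop it describes is, at its core, a parameter-estimation problem: adjust the parameters of an appearance model until it reproduces measured data. The sketch below is a minimal, hypothetical illustration of that idea using a toy two-parameter reflectance model fit to synthetic measurements with nonlinear least squares; the model form, parameter names, and data are illustrative assumptions and do not represent the project's Loomit tool or its actual appearance models.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Hypothetical measured reflectance samples: reflectance observed at a set of
# incidence angles. In practice such data might come from image-based capture
# or a gonioreflectometer; here it is synthesized for illustration.
angles = np.linspace(0.0, np.pi / 3, 20)
measured = 0.04 + 0.9 * np.cos(angles) ** 5 + rng.normal(0.0, 0.01, angles.shape)

def model(params, theta):
    """Toy reflectance model: a diffuse base plus a cosine-lobe term."""
    diffuse, shininess = params
    return diffuse + (1.0 - diffuse) * np.cos(theta) ** shininess

def residuals(params):
    """Difference between model prediction and measurement at each angle."""
    return model(params, angles) - measured

# Fit the two parameters so the model reproduces the measurements.
fit = least_squares(residuals, x0=[0.1, 2.0], bounds=([0.0, 0.1], [1.0, 50.0]))
print("fitted (diffuse, shininess):", fit.x)
```

A real pipeline would replace the toy model with a physically based cloth appearance model and the synthetic samples with calibrated measurements, but the structure of the tuning loop, residuals between model predictions and data driving a bounded optimizer, would be similar.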

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1513954
Program Officer: Ephraim Glinert
Budget Start: 2015-09-01
Budget End: 2019-08-31
Fiscal Year: 2015
Total Cost: $244,236
Name: Rhode Island School of Design
City: Providence
State: RI
Country: United States
Zip Code: 02903