This Phase II project will complete the development and study of a computer-based product intended primarily for teaching children with intellectual disabilities. The product, "SymbolTeacher," addresses a pivotal skill - symbolic matching to sample - which is a target of many current programs for teaching this population. In a symbolic matching task, students are presented with an array of two- or three-dimensional stimuli and required to select the item that "goes with" a sample; hence the name "matching to sample." Unlike an identity matching-to-sample task, in which a match is made on the basis of physical identity, symbolic matches involve stimuli that are not identical; matches are made on the basis of instruction and feedback. For example, a student is taught to match the picture of a dog (referent) to the printed word DOG (symbol); there are no physical stimulus properties that define the symbol-referent relation, and thus nothing inherent in the stimuli to guide a student who has had no previous experience with them.

The project has two major objectives. First, we will adapt well-developed, extensively researched laboratory methods and software for use by parents, teachers, and other helping professionals. Second, we will evaluate the resulting product to determine whether it can be used effectively in typical teaching situations.

There is a manifest need for the product. Symbolic matching provides a foundation for teaching a wide variety of discrimination, reading-readiness, and symbolic communication skills. However, many children with intellectual disabilities do not learn symbolic matching readily (or at all) through conventional instructional methods. Over the past decade, a substantial investment of NIH research funding has led to the development of methods that can reliably establish symbolic matching in children with disabilities. Despite these advances, the relevant knowledge and tools are not readily accessible to the professionals, parents, and children who would benefit. We will use the STTR funding mechanism to give broader access to this behavioral technology.
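
To make the paradigm concrete for readers unfamiliar with it, the following is a minimal sketch of a single symbolic matching-to-sample trial. It is an illustration only, not the SymbolTeacher implementation; the stimulus names, the symbol-referent pairs, and the simulated learner response are all hypothetical.

```python
import random

# Arbitrary symbol-referent pairs (hypothetical examples): nothing in the
# stimuli themselves links the printed word to the picture, so the
# relation must be established through instruction and feedback.
SYMBOL_TO_REFERENT = {
    "DOG": "picture_of_dog",
    "CAT": "picture_of_cat",
    "CUP": "picture_of_cup",
}


def run_trial(sample_symbol: str) -> bool:
    """Present a sample symbol and an array of comparison referents;
    return True if the learner selects the referent taught to
    'go with' the sample."""
    correct = SYMBOL_TO_REFERENT[sample_symbol]

    # Comparison array: the correct referent plus distractors, shuffled
    # so position cannot cue the answer.
    comparisons = list(SYMBOL_TO_REFERENT.values())
    random.shuffle(comparisons)

    print(f"Sample: {sample_symbol}")
    for i, item in enumerate(comparisons, start=1):
        print(f"  {i}. {item}")

    # Stand-in for the learner's selection (here, a random pick).
    choice = random.choice(comparisons)

    # Feedback, not physical identity, defines the correct match.
    if choice == correct:
        print("Correct -- reinforce the selection.")
        return True
    print(f"Incorrect -- the taught match for {sample_symbol} is {correct}.")
    return False


if __name__ == "__main__":
    run_trial("DOG")
```

In an identity matching-to-sample trial, by contrast, the correct comparison would be physically identical to the sample, so the match could in principle be made without any prior teaching.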