This project is being conducted by a large team across six institutions. It builds on previously developed open-ended, constructed-response versions of well-established concept inventories that can be accurately assessed with existing computer-automated analysis resources. The computer-automated analyses predict human ratings of students' work on these topics and have demonstrated higher inter-rater reliability than a group of trained expert human graders. Constructed-response assessments reveal more about student thinking and the persistence of misconceptions than multiple-choice questions do, but they require more analysis on the part of the educator. In past work, items designed to identify important disciplinary constructs were created based on prior research. The items were then administered via online course management systems, where students entered responses, and lexical and statistical analysis software was used to predict expert ratings of those responses. To date, the work has focused primarily on biology and on chemistry in biological contexts.
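The idea of predicting expert ratings from lexical features can be illustrated with a minimal sketch. The term lists and scoring rule below are hypothetical stand-ins; the actual AACR software uses richer lexical and statistical models trained against expert ratings, which this abstract does not describe.

```python
# Minimal sketch: score a constructed response by matching it against
# a hypothetical lexicon of expert-like terms and misconception terms.
import re

# Hypothetical lexicon for a natural-selection item: terms experts
# associate with correct ideas, and terms that signal a common
# "need-based" misconception.
CORRECT_TERMS = {"variation", "heritable", "allele", "population", "generations"}
MISCONCEPTION_TERMS = {"need", "needs", "choose", "want", "try"}

def predict_rating(response: str) -> int:
    """Return 1 (expert-like) or 0 (misconception-like)."""
    words = set(re.findall(r"[a-z]+", response.lower()))
    correct_hits = len(words & CORRECT_TERMS)
    misconception_hits = len(words & MISCONCEPTION_TERMS)
    return 1 if correct_hits > misconception_hits else 0

print(predict_rating("selection acts on heritable variation in a population"))  # 1
print(predict_rating("organisms evolve because they need new traits"))  # 0
```

In practice such predictions are validated against human raters, which is how the inter-rater reliability comparisons described above are made.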

The current project leverages the previous research on Automated Assessment of Constructed Response (AACR) and extends the work to other institutions and other STEM disciplines. The specific goals of this project are to:

1. Create a community web portal for the AACR assessments to expand and deepen collaborations among STEM education researchers, providing the infrastructure for growing the community of researchers and supporting the adoption and implementation of the innovative instructional materials by instructors at other institutions.

2. Propagate the innovations by providing instructors with professional development and long-term, ongoing support in using the assessments. This includes information about common student conceptions revealed by the questions, instructional materials for addressing conceptual barriers, and the opportunity to join a community of practitioners who are using the AACR questions and exchanging materials.

3. Expand the basic research to create and validate AACR questions in introductory chemistry, chemical engineering, and statistics.

4. Engage in ongoing project evaluation for continuous quality improvement and to document the challenges and successes the project encounters.

5. Lay the foundation for sustainability by providing interfaces for e-text publishers, Learning Management System vendors, and Massive Open Online Courses as potential revenue streams to operate and maintain the online infrastructure.

Intellectual Merit: Improving STEM education requires valid and reliable instruments that provide insight into student thinking. The automated analysis of constructed-response assessments has the potential to assess "big ideas" in STEM in a richer, more multi-faceted manner than multiple-choice instruments can. This project is extending the number of these items and providing an online community where instructors may obtain, score, and contribute to the library of items and the resources necessary for their analysis.

Broader Impacts: The web portal extends the use of the products created in this project to instructors nationwide. In addition, it provides the foundation for a national collaboration of science and engineering educators interested in developing deeper conceptual assessment tools, and it supports and mentors postdoctoral research fellows and graduate research assistants in STEM education research.

National Science Foundation (NSF)
Division of Undergraduate Education (DUE)
Program Officer: Ellen Carpenter
State University of New York at Stony Brook, Stony Brook, United States