Numerous reports on the effectiveness of U.S. higher education in the Science, Technology, Engineering, and Mathematics (STEM) disciplines call for increased emphasis on conceptual learning rather than rote memorization. Suitable assessments (tests) of conceptual learning, often referred to as concept inventories or diagnostic question clusters, are few, however, and their use is constrained by the difficulty of scoring them cost-effectively. Multiple-choice assessments (selected responses) are more widespread in higher education, especially at medium and large institutions with large classes, where automated scoring provides the essential cost-effectiveness. In contrast, written-response assessments (constructed responses), which are widely held to be superior at revealing actual student thinking, are rare in practice because of the time and effort required for manual scoring. This project applies current computerized tools and statistical techniques to make constructed-response assessments more broadly available. Computer automation allows these more insightful conceptual questions and tests to be used with much larger numbers of students, thereby providing an enhanced understanding of students' conceptual learning. Project personnel work with developers of conceptual testing instruments to create constructed-response versions of the tests, coupled with the necessary computerized scoring tools, with the eventual goal of providing computer-automated evaluation of conceptual thinking. The project is a collaboration among three major public universities.

Project Report

The assessments that faculty use are critical to determining what and how students learn about science: "What is tested becomes what is important." Large introductory science courses often concentrate on disconnected scientific "facts" that students memorize in order to pass multiple-choice tests. Yet science is about constructing explanations. Scientists do not take multiple-choice tests; rather, they write, proposing models and theories to explain the natural world.

The Automated Analysis of Constructed Response (AACR) research group explores computerized techniques for analyzing student writing in these large introductory courses, so that faculty can ask students to engage in the more authentic task of constructing written explanations rather than relying on multiple-choice tests. While computers remain unable to "understand" free-form writing, they can rapidly and accurately identify words and phrases in text, and they surpass human evaluators in processing large volumes of text-based responses into their essential terms, phrases, and concepts quickly, accurately, and consistently.

The AACR group developed questions that address challenging concepts in biology and biochemistry, along with the computer techniques to analyze responses and predict how expert humans would score them. The agreement between computer predictions and expert human scorers is as good as or better than the agreement among expert human scorers themselves. The group has created, tested, and implemented 120 AACR questions about evolution, biomolecules, genetics, metabolism, and thermodynamics. The project has produced 9 journal articles, 43 conference papers, 5 invited talks, 17 posters, 21 YouTube videos, and a project website (www.msu.edu/~aacr). With additional NSF funding (1323162, 1347740), the AACR team is expanding the national network to include 6 collaborating universities. Faculty at those institutions are using AACR questions in their classes, receiving reports that identify students' scientific ideas and misconceptions about crucial, foundational concepts in biology, and developing instruction to address the learning challenges those questions reveal.
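The report does not describe the AACR group's actual analysis pipeline, so the following is only a minimal sketch of the general approach described above: extract terms and phrases from free-text responses, train a model to predict expert human scores, and check computer-human agreement. The example data, the TF-IDF features, and the logistic-regression classifier are illustrative assumptions, not the project's published methods.

    # Sketch: predict expert scores for constructed responses from lexical
    # features. Data and model choice are hypothetical, for illustration only.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import cohen_kappa_score
    from sklearn.model_selection import train_test_split

    # Hypothetical expert-scored responses (1 = scientifically correct idea).
    responses = [
        "Natural selection acts on heritable variation in a population",
        "Organisms evolve because they want to survive",
        "Alleles that raise fitness become more common over generations",
        "Animals change their genes to adapt to the environment",
    ] * 25  # repeated only so the toy dataset has enough rows to split
    expert_scores = [1, 0, 1, 0] * 25

    train_text, test_text, y_train, y_test = train_test_split(
        responses, expert_scores, test_size=0.25, random_state=0
    )

    # Reduce free-form writing to term and phrase features (uni- and bigrams).
    vectorizer = TfidfVectorizer(ngram_range=(1, 2))
    X_train = vectorizer.fit_transform(train_text)
    X_test = vectorizer.transform(test_text)

    # Train a classifier to predict how an expert human would score a response.
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    predicted = model.predict(X_test)

    # Cohen's kappa quantifies scorer agreement while correcting for chance;
    # here it compares the computer's predictions with the expert scores.
    print("Computer-human agreement (kappa):", cohen_kappa_score(y_test, predicted))

In practice the comparison that matters is the one the report makes: whether computer-human agreement on held-out responses matches or exceeds human-human agreement on the same responses.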

Agency: National Science Foundation (NSF)
Institute: Division of Undergraduate Education (DUE)
Type: Standard Grant (Standard)
Application #: 1022653
Program Officer: Myles Boylan
Budget Start: 2010-09-01
Budget End: 2014-08-31
Fiscal Year: 2010
Total Cost: $458,575
Institution: Michigan State University
City: East Lansing
State: MI
Country: United States
Zip Code: 48824