This project advances undergraduate STEM education by improving how science majors who anticipate becoming secondary-level (middle school and high school) science teachers are prepared to engage in formative assessment reasoning grounded in evidence of student learning and understanding.

Traditional secondary school assessment methods stress grading, with too little emphasis on the role of assessment in fostering student learning. This encourages rote, superficial learning, with teaching practices that tend to emphasize completion of work over the quality of student understanding. This project seeks to transform teachers' assessment practices, a change crucial to advancing educational reform. The work focuses especially on formative assessment, the type of assessment that holds the most direct promise for enhancing student learning in classrooms.

The prospective teachers in this project are undergraduate science majors preparing to become middle school and high school science teachers through a teacher preparation program in the College of Science at the University of Arizona. The project is being accomplished in four phases: (1) development of assessment probes to diagnose prospective science teachers' formative assessment reasoning skills at different points in the science teacher preparation program; (2) creation of two "maps," one characterizing the progression of prospective teachers' reasoning throughout the program and a second characterizing expert teacher standards for formative assessment reasoning; (3) development and implementation of instructional activities designed to improve the alignment between prospective teachers' reasoning progression and the expert teacher reasoning map; and (4) analysis of the learning outcomes associated with the implemented instructional activities, followed by revision of the activities.

The products created in this project, assessment probes and instructional activities, will be sustainable and directly incorporated into daily instructional practice in the teacher preparation program at the University of Arizona. All products and research findings of this project will be fully available to other science teacher educators for use in teacher preparation programs across the nation.

Project Report

Through this project, we investigated how prospective science teachers (college students preparing to become science teachers) approached the analysis of student responses to written assessments. Analysis of participants' responses yielded insights into the design of meaningful learning experiences for prospective teachers that elicit, challenge, and enrich their conceptions of student understanding. Based on our findings, we created and implemented instructional interventions to help prospective teachers evaluate student work more effectively.

The first phase of the project identified what prospective science teachers notice when evaluating evidence of student understanding in another teacher's inquiry-based unit. Analysis of our data revealed two major categories of elements noticed by our study participants: Task-General and Task-Specific. Task-General elements included attention to learning objectives, independent student work, and presentation issues; these often served to guide or qualify the specific inquiry skills that were evaluated. Task-Specific elements included noticing students' abilities to perform different components of an investigation. In general, study participants paid attention to important general and specific aspects of student work in the context of inquiry. However, they showed preferential attention to the process skills associated with designing an investigation over practices related to analyzing data and generating conclusions. Additionally, their interpretations of assessment outcomes were largely focused on the demonstration of general science process skills; much less attention was paid to the epistemological validity or scientific plausibility of students' ideas.

In the second phase of the project, we investigated how prospective science teachers evaluate students' written responses. We identified which elements of students' written work were noticed, what types of inferences about student understanding were built, and what these noticed elements and inferences revealed about levels of sophistication in assessing student understanding. The results of this study suggest that analyzing teachers' assessment of student understanding requires attending to both domain-general and domain-specific aspects of teacher reasoning. Domain-general factors characterize how a teacher frames the assessment of student understanding; domain-specific aspects characterize how the teacher attends to relevant disciplinary ideas. Beginning prospective teachers' abilities to generate appropriate inferences varied and were influenced by their content knowledge and pedagogical content knowledge, as well as by contextual factors. Study participants often focused on describing and qualifying student work, making fewer attempts to interpret and make sense of student ideas.

Results from these studies have led to the development of a reasoning map describing how prospective science teachers vary in sophistication when evaluating student written work. The map includes both domain-general and domain-specific areas of understanding and describes a spectrum from novice to expert understanding. Our work has also led to a new instructional unit for prospective science teachers to improve their understanding of, and approaches to, the evaluation of student work. This unit is currently used in our courses for prospective science teachers at the University of Arizona and is being integrated into other education courses for current teachers.

Agency: National Science Foundation (NSF)
Institute: Division of Undergraduate Education (DUE)
Type: Standard Grant
Application #: 1043159
Program Officer: Kathleen Bergin
Budget Start: 2011-02-15
Budget End: 2014-07-31
Fiscal Year: 2010
Total Cost: $199,269
Name: University of Arizona
City: Tucson
State: AZ
Country: United States
Zip Code: 85719