This project is a three-year collaboration between the Concord Consortium and CORD that extends earlier ATE-funded work on assessing competencies with electronic circuits and test equipment. The new assessments provide finer-grained analysis of student understanding and misconceptions, enabling specific, well-targeted individual interventions. Each assessment challenges the user to accomplish a task, such as making a measurement or troubleshooting a circuit. The computer monitors the student and generates reports for use by the student, the instructor, or both. The project aims to improve learning by providing timely and informative feedback on students’ progress, as inferred from their performance on realistic tasks.

The PI has found that students’ scores on question-and-answer tests are not reliable predictors of their ability to perform a cognitively demanding task, as shown by their performance on simulation-based assessments. In particular, the PI has shown that the assessment tools used to evaluate students’ learning must be seriously reconceptualized if they are to constitute reliable measures of the targeted knowledge and skills. By extension, the project investigates the central question of how to evaluate whether a student has learned something.

Project Report

Goals of the project

The United States does an admirable job of training upper-level STEM (Science, Technology, Engineering, and Mathematics) professionals. We turn out some of the most creative and productive scientists and engineers on the planet. That’s why so many other countries send their best and brightest to study at our universities. But our success at the college and graduate school level is not matched by our pre-college education system, with the result that many students graduate from high school and college unprepared to enter STEM careers. In a cruel irony, we are forced to recruit foreigners for jobs for which our own graduates are unprepared. The SPARKS Project was aimed at helping students succeed in introductory electronics courses at technical high schools and two- and four-year colleges. We believe that the approach we adopted – using web-based simulators linked to personalized feedback – would work equally well with older students, including adults seeking new employment opportunities, and that our success in electronics could be replicated in other domains.

Why simulators?

One of the barriers that keep people from entering STEM careers is the cost of the equipment used to train them. Instruments like digital multimeters and oscilloscopes cost thousands of dollars, and schools cannot afford to buy one for every student who is interested in electronics. Moreover, laboratory sessions must be supervised, not only to assist the students but to ensure that they don’t damage sensitive equipment. So lab time is limited and students have inadequate opportunity to "learn by doing." Simulations can help. They are easily implemented (the technology for simulating electronic circuits has been around for decades) and can be made to run on just about any computer – including tablets and smartphones. They eliminate the risk of breakage and can log each student’s actions and analyze that data to guide instructors.
They also give students an opportunity to hone their skills through practice – having them troubleshoot a faulty circuit again and again, for instance, but varying the fault each time. A simulator can also provide intelligent feedback. Because it "knows" what task the student is attempting, the computer can assign students a grade based on their success or failure at that task. More important, it can tell the students where they went wrong and how to improve their performance, and offer them a tutorial customized to their particular problem. And the computer can go beyond merely detecting "right" and "wrong" answers; it can say, for example, in response to a troubleshooting task, "You found the fault, but you made a lot of measurements that you didn’t need to make."

What about the real world?

When we describe our project we often get the reaction summed up in the phrase: "I wouldn’t want to be operated on by a surgeon whose only training had been on a simulator!" It’s a good point. Although we have tried to make our simulated circuits and test equipment look as realistic as possible, we acknowledge that nothing can replace the experience of working with the real thing, and no computer can replace a human being when it comes to teaching. We are not attempting to do either of these things. Our goal is to make students better prepared to get the most out of their limited lab time, and to give both students and teachers feedback that helps them detect misunderstandings and confusions before they show up on the end-of-unit test. Our simulations are intended to supplement, not supplant, the classroom and the laboratory. The SPARKS simulations can be accessed at http://concord.org/projects/sparks and used free of charge. Originally, the project supported classroom management functions, reporting, and long-term storage of results. Now that the project has concluded, users can access only their own reports during the current session.

What’s next?
With support from the Google Summer of Code program we are adding non-linear components, such as transistors, to the simulations so that they cover digital as well as analog circuits. We are also planning to add a social dimension to SPARKS by implementing a communication link between breadboards, which will let users build more complex circuits that span multiple breadboards. This will enable us to engage students in collaborative problem-solving tasks and teach them how to work effectively in teams. Beyond simulating hands-on electronics tasks, SPARKS has demonstrated the advantages of using a computer to "look over the shoulder" of students and to evaluate and coach each one’s performance in preparation for the hands-on laboratory experience that follows. And while we chose introductory electronics for this project’s simulations, we are confident that many other fields of expertise, in STEM and elsewhere, can benefit from a similar strategy.
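The "look over the shoulder" feedback described above can be sketched in a few lines of code. This is a hypothetical illustration only: the function name, data structures, and scoring rule (a five-point penalty per unneeded probe) are made up for this sketch and are not taken from the SPARKS code base.

```python
# Hypothetical sketch of log-based troubleshooting feedback.
# All names and the scoring rule are illustrative, not SPARKS internals.

def grade_attempt(logged, minimal, fault_found):
    """Grade a troubleshooting attempt from the simulator's action log.

    logged      -- measurement points the student actually probed, in order
    minimal     -- a minimal set of points sufficient to localize the fault
    fault_found -- True if the student identified the faulty component
    """
    # Probes outside the minimal diagnostic set were not needed.
    redundant = [m for m in logged if m not in minimal]
    score = 100 if fault_found else 0
    score -= 5 * len(redundant)  # small penalty per unneeded measurement

    feedback = []
    if fault_found and redundant:
        feedback.append("You found the fault, but measurements at "
                        + ", ".join(redundant) + " were not needed.")
    elif not fault_found:
        feedback.append("Fault not identified; try isolating it stage by stage.")
    return max(score, 0), feedback

# Example: the student finds the fault but probes two extra points.
score, notes = grade_attempt(["R1", "C2", "Q3", "R4"], {"R1", "R4"}, True)
```

Because the simulator knows both the task and the full action log, a rule like this can go beyond a right/wrong grade and name the specific redundant measurements, which is the kind of targeted feedback the project was built around.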

Agency
National Science Foundation (NSF)
Institute
Division of Undergraduate Education (DUE)
Type
Standard Grant (Standard)
Application #
0903243
Program Officer
V. Celeste Carter
Project Start
Project End
Budget Start
2009-05-01
Budget End
2013-04-30
Support Year
Fiscal Year
2009
Total Cost
$942,501
Indirect Cost
Name
Concord Consortium
Department
Type
DUNS #
City
Concord
State
MA
Country
United States
Zip Code
01742