An award has been made to San José State University, the University of Wisconsin-Madison, and the University of Colorado at Boulder to pilot a validated assessment instrument in order to collectively assess the impacts of the Biology Research Experience for Undergraduates (Bio REU) Program. The Bio REU Program has exposed talented undergraduates to scientific research for more than 20 years. Sites are funded by NSF to support 8- to 12-week summer programs that culminate in a scientific symposium where students present their findings. Site-specific assessment data are collected by Site program directors and then submitted to NSF through annual progress reports. These data cannot be used to determine the effectiveness and impact of the Bio REU Program as a whole, because each program uses its own assessment instrument. This project will adapt and test a validated, online assessment tool using the Undergraduate Research Student Self-Assessment (URSSA) platform. The project will: (1) pilot test an adapted instrument on a subset of students participating in the REU site programs being conducted in summer 2010; (2) revise, retest, and deploy the instrument to a larger group of REU program participants in summer 2011; (3) transfer responsibility for managing the tool and data to the Bio REU Leadership Council (LC) in 2011; and (4) make the instrument available to all Bio REU programs starting summer 2012. The PI and Co-PI of this project are members of the Bio REU LC and serve as co-chairs of the Bio REU Assessment subcommittee. These individuals will take primary responsibility for working with the URSSA team to deliver the products. At least 50 REU programs are anticipated to participate in the 2010 pilot study, and an additional 100 programs are expected to participate by 2012. The pilot group of REU sites represents a variety of REU programs with varying scientific foci and targeted participant groups.
The pilot study in summer 2010 will provide data that allow the core set of questions to be refined so it is more applicable to the diversity of programs (e.g., types of institutions, sub-disciplines of biology, student populations served). Bio REU Site directors interested in participating in the pilot assessment should contact Dr. Julio G. Soto (San Jose State University) at Julio.Soto@sjsu.edu, or Dr. Janet Branchaw (University of Wisconsin-Madison) at firstname.lastname@example.org. Information about this project can be found on the Bio REU website: www.bioreu.org/.
The primary objective of the project was to develop, administer, and analyze a customized version of the Undergraduate Research Student Self Assessment (URSSA) survey for the BIO-REU group. We also wanted to gather validity evidence about the URSSA so it can be used confidently by REU programs. We helped the NSF BIO-REU group administer the URSSA survey over three years to 1603 students. Over the three years we collated and merged data from students at multiple REU sites. We then analyzed the data, wrote up the results for the three years (2010-2012), and produced reports for the BIO-REU group. We helped BIO-REU customize the URSSA for its own use, set up the survey on the salgsite.org departmental site, and then responded to difficulties instructors experienced with setting up their own instruments. Merging data required creating an SPSS database from the individual data of more than 100 sites; merging was complicated by the fact that each site used a slightly different version of the instrument. Yearly analysis of the survey compared results across years and between BIO-REU and non-BIO-REU groups. Results showed consistent improvement in some areas over the years. We also assessed the validity of the URSSA through factor analysis, item analysis, and comparisons between textual and quantitative responses. Weston computed descriptive statistics, used inferential statistical tests to compare years and groups, and tested regression models examining the effects of variables such as gender, race/ethnicity, and class standing on survey variables. The validity of the survey was assessed in several ways. Factor analysis examined the latent structure of the survey to learn whether the hypothesized structure matched the empirical response patterns of students taking the survey. We used confirmatory factor analysis (CFA) to test model fit with four-, three-, and one-factor solutions.
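The data-merging step described above can be sketched in outline. This is a minimal illustration, not the project's actual SPSS workflow: it assumes each site exports responses as a list of dicts keyed by item label, and aligns records on the union of items so that questions absent from a site's version of the instrument become missing values. All names and values are hypothetical.

```python
def merge_site_data(site_records):
    """Align records from multiple sites on the union of item keys,
    filling items a site's instrument version lacked with None (missing)."""
    all_items = sorted({key
                        for records in site_records
                        for row in records
                        for key in row})
    merged = []
    for records in site_records:
        for row in records:
            # row.get(item) returns None for items this site did not ask
            merged.append({item: row.get(item) for item in all_items})
    return merged

# Two hypothetical sites: site B's instrument lacks item "q3" and adds "q4".
site_a = [{"q1": 4, "q2": 5, "q3": 3}]
site_b = [{"q1": 5, "q2": 4, "q4": 2}]

combined = merge_site_data([site_a, site_b])
```

After merging, every record carries the same set of item keys, so site-by-site and year-by-year comparisons can treat the absent items uniformly as missing data.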
Results indicated that the scales used as the basis of the survey conformed with its factor structure but were moderately correlated with each other, indicating a strong underlying factor common to all items. We also used item analysis to examine survey item functioning for floor and ceiling effects, redundant items, and item variability. The reliability of composite variables was assessed with Cronbach's alpha. Item analysis identified only three poorly functioning items, which were removed from the survey. All composite scales showed adequate reliability. The validity of ratings by site was assessed by coding textual responses to open-ended questions for positive and negative views of learning gains and overall satisfaction with the REU. Average ratings were then correlated with numerical responses by site to learn whether numerical responses reflected those found in students' written responses. Overall, student written responses conformed with numerical ratings, especially for satisfaction items. Correlations were evident both at the student level and when averaged over students at the site level. Norms were also created with both the BIO-REU and the wider URSSA data. A regression model with student and institutional data will allow sites to generate expected scores on URSSA indicators based on the characteristics of their institution and the demographic characteristics of their students. We believe the project impacted both the BIO-REU and the wider REU community. The project allowed the BIO-REU group to survey its members with the URSSA. Information from the survey informed BIO-REU about student assessments of their own learning and whether self-assessments varied substantially among sites. Other information from the survey included summaries of the types of research activities conducted by students (e.g., poster presentations or attending conferences) and satisfaction with their REU experience.
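The reliability check mentioned above uses the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), where k is the number of items in the scale. A minimal sketch with hypothetical data (not the project's code or results):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of respondents' item-score lists.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(item_scores[0])
    items = list(zip(*item_scores))  # transpose: one tuple of scores per item
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical responses: 4 students x 3 items on a 1-5 scale.
scores = [[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]]
alpha = cronbach_alpha(scores)  # close to 1 when items vary together
```

Values near 1 indicate that a scale's items rise and fall together across respondents, which is the sense in which the report describes the composite scales as showing "adequate reliability."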
Information gained on strengths and weaknesses of programs provided feedback that PIs could use to make changes in their own laboratories. We also examined the validity of the URSSA instrument and found that its scales and items performed as anticipated. Information gained from students' open-ended accounts of their experiences in REU programs conformed to their numerical ratings of similar content. While further validation of the instrument is needed, the validity evidence to date allows for more confident use of the URSSA by other individuals and programs. The use of the survey with over 2500 students (BIO-REU and other users) also provides a robust basis for comparative analysis within programs or with other programs. For BIO-REU, program personnel compared findings across years and between BIO-REU and non-BIO-REU students. We also produced "norms" for institutional comparisons in the 2012 report. These comparisons show averages and ranges by institutional type and demographic characteristics of REUs. We anticipate placing these norms on the new Student Assessment of Learning Gains website as a reference for comparisons. This resource will benefit those using the URSSA to assess their REUs. Weston created three final reports for the project, along with short summary reports.
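The norms described above amount to grouped summary statistics: an average and range of a score for each institutional type. A hypothetical sketch, assuming site-level records with an institution-type field; the field names and numbers are illustrative only, not the project's data:

```python
from collections import defaultdict

def norms_by_group(records, group_key, score_key):
    """Average and range of a score, grouped by an institutional characteristic."""
    groups = defaultdict(list)
    for row in records:
        groups[row[group_key]].append(row[score_key])
    return {group: {"mean": sum(vals) / len(vals),
                    "min": min(vals),
                    "max": max(vals)}
            for group, vals in groups.items()}

# Hypothetical site-level satisfaction means by institution type.
sites = [
    {"inst_type": "R1", "satisfaction": 4.2},
    {"inst_type": "R1", "satisfaction": 4.6},
    {"inst_type": "PUI", "satisfaction": 4.4},
]
norms = norms_by_group(sites, "inst_type", "satisfaction")
```

A site could then compare its own mean against the norm for its institutional type, which is the kind of reference comparison the report envisions for the Student Assessment of Learning Gains website.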