The Quantitative Literacy Reasoning Assessment (QLRA) project is developing a non-proprietary QLR instrument, piloting it at participating institutions across the country to begin building a database of QLR abilities, and establishing an online resource portal for QLR assessment. Quantitative Literacy/Reasoning is a relatively new and growing field, with many institutions replacing traditional math requirements with introductory QLR requirements such as Liberal Arts Mathematics and Finite Math. The developmental/introductory math program in this country is undergoing a profound paradigm shift as the focus moves from traditional algebra-based curricula to the development of the quantitative skills and habits of mind required for decision making in our personal, civic, and workplace lives. Groups underrepresented in STEM (minorities and women) are often overrepresented in these traditional developmental courses. The mathematics point of entry for these groups is a crucial time to nurture interest and engagement with mathematics that could lead to further STEM involvement. The QLRA project provides the assessment needed for curriculum innovation and coherence in these point-of-entry courses. Dissemination via the online portal allows institutions to easily adapt the non-proprietary instrument to their own needs. The QLRA project provides the necessary assessment infrastructure and a collaborative platform as QLR requirements evolve around the nation.

Project Report

Quantitative Literacy/Reasoning (QLR) has been part of the academic landscape for over two decades. For the latter half of this period, academic institutions across the US have been shifting the focus of introductory/general education undergraduate math courses toward QLR, emphasizing the quantitative tools that students will need for successful decision making in their personal, professional, and civic lives. While QLR courses and curricula are finding wide dissemination, assessment of QLR (in terms of the skills of individual students and the effectiveness of curricula) remains primarily a local activity. There have been publications such as Achieving QL and the AAC&U QL VALUE rubric; most current assessment efforts, however, are localized to a single campus, a single course, or even a single classroom. This is due, in part, to the difficulties involved in assessing QLR skills. Even at institutions where QLR tests are administered for placement purposes, those tests are not being used for end-result assessment. Bowdoin College, Colby-Sawyer College, and Wellesley College have existing instruments that provided a starting point for this project. However, none of these instruments allow for easy comparison across institutions, since each has only been administered locally. For example, Colby-Sawyer College developed and administered its QLR test to freshmen and seniors to evaluate the impact of an NSF-supported QL Across the Curriculum initiative. At the end of the four-year evaluation process, the lack of national data on student QLR abilities left the Colby-Sawyer community wondering about the extent of the initiative's impact. This dilemma is not new, nor is it restricted to Colby-Sawyer; Rita Colwell expressed it succinctly when she stated: "We do not really know if we are making progress [since]...we do not have genuine benchmarks for what constitutes quantitative literacy."
This sentiment was echoed in 2008 in a paper in the American Mathematical Monthly, which again stressed that most of these internal assessment tools have no national norms to compare against and that the actual construct of "quantitatively literate" remains undeveloped. The QLR project described here aims to meet these needs by developing a valid and reliable test of QLR skills. In particular, the goals set forth in this NSF-supported project (NSF Grant DUE #1140562) were to design a QLR instrument that is non-proprietary, provides a baseline of national QLR scores from a variety of educational environments, is reliable, and has content validity. The QLRA test was created and piloted by diverse institutions across the country. In 2012, 10 schools piloted the test with 1,659 students. In 2013, 11 institutions and 2,172 students piloted the refined version of the QLRA, and 5 schools used our online test site. In 2014 we continued to offer the online test site through our one-year no-cost extension; the online site had 24 unique schools, 58 separate uses, and 2,169 students taking the test. In addition, at least 15 more schools used the test on their campuses with more than 2,000 additional students (not all schools shared data, as the pilot period of the grant was over). In 2015 we did not receive funding for our phase 2 proposal and so have had to discontinue the online site; most schools have switched to administering the test using software such as Blackboard and Moodle. As of March 19, 2015, we have had 18 new requests for use of the test. We wrote a paper, "Towards Developing a Quantitative Literacy and Reasoning Assessment Instrument," summarizing our results from the first two years of the pilot project; it was published in the journal Numeracy in July 2014 (http://scholarcommons.usf.edu/numeracy/vol7/iss2/art4/). We were able to establish the content validity and reliability of the instrument.
We were also able to establish a baseline of national QR scores for institutions to use for comparison when assessing their students' QR abilities. The QLRA continues to generate much interest, as indicated by the 18 new requests for information on its use that we received in the first 3 months of 2015. Establishing the reliability and validity of the test gives schools the confidence to make use of the QLRA, and the national baseline of QR scores established by our project allows schools to put their students' QR abilities into a national context. The Western Association of Schools and Colleges has mandated QR as an essential learning outcome for accreditation of its participating schools, which has accelerated the need for valid and reliable instruments such as the QLRA. Many states are beginning to replace College Algebra requirements with QR requirements, again creating a need for valid and reliable QR assessment. Many schools have begun using the QLRA in a pre/post format to measure growth.

Agency: National Science Foundation (NSF)
Institute: Division of Undergraduate Education (DUE)
Type: Standard Grant (Standard)
Application #: 1140562
Program Officer: John Haddock
Budget Start: 2012-02-15
Budget End: 2015-01-31
Fiscal Year: 2011
Total Cost: $193,253
Name: Bowdoin College
City: Brunswick
State: ME
Country: United States
Zip Code: 04011