This Small Business Innovation Research (SBIR) Phase I project aims to improve the quality of online education and training by making it easier to add relevant quiz questions and knowledge checks to content as it is being created. The project applies recent advances in the ability of computers to detect topics and themes in text and to process natural language, and it builds on ideas from projects such as the Federal Learning Registry, which associate contextual data with educational objects so that they can be better discovered and used. This Phase I project will create a proof-of-concept system that applies these methods to align existing questions with educational and training outcomes and to generate new questions relevant to the text being authored. Research objectives include validating that the system (a) effectively aligns content and assessments with outcomes, (b) supports curriculum design, and (c) generates quality questions. The project will also research technical requirements for integration with enterprise learning platforms.

The broader impact/commercial potential of this project comes from significantly lowering the barrier to adding assessments to online content and from ensuring that these assessments are aligned with educational and training outcomes. This will help educational and commercial customers evaluate and demonstrate the effectiveness of online programs. It will also reduce the time and cost required to develop and deliver quality online training that focuses on specific competencies and learning objectives. More generally, the proposed technology has the potential to transform any content into an active and engaging learning experience by adding assessments. The proposed technology will be designed for integration, as a software service, into existing learning systems and learning content authoring systems. Initial collaborators include the assessment subsidiary of one of the largest learning management system companies and the National STEM Digital Library Resource Center.

Project Report

This Phase I project developed ASPOA, a system that automatically generates assessment questions from the textual portion of any digital learning material written in English. The questions are aligned with learning outcomes that are either automatically detected by ASPOA or identified by an instructor. The project resulted in a demonstration version of ASPOA in which instructors could edit the questions, save them, and export them in IMS Question and Test Interoperability (QTI) format for use in learning management systems and other instructional technologies. The system meets student expectations by using technology and interfaces that work on mobile devices as well as desktops.

ASPOA applies a combination of natural language processing, artificial intelligence, and learning analytics to generate questions and align them with outcomes. Its innovations include improved methods of generating questions and a method for aligning questions with outcomes and source text that overcomes the inherent difficulty of applying standard text analysis techniques to short passages whose interpretation requires significant domain knowledge.

The ASPOA team researched and developed a set of evaluation criteria for assessment questions and then used the demonstrator to evaluate ASPOA's performance on questions, source material, and outcomes in chemistry. The test data came from the ChemEd Digital Library, an instance of Moodle, Connexions, instructor-generated online course material, and the online version of a standard published text. Evaluations were carried out independently of the team developing the ASPOA technology. They showed that ASPOA was effective in generating questions related to the source text: very few questions were of poor quality as judged by metrics such as grammaticality and intelligibility, and, using ASPOA, evaluators were able to generate questions in about one-quarter of the time required to compose them from scratch. The team also used the demonstrator to engage in discussions about ASPOA with faculty and administrators at multiple universities. As a result of these discussions, the project has committed beta-test sites that will provide feedback and test ASPOA during its ongoing development.

Intellectual Merit: The Phase I research forms the basis of a web-based "Software as a Service" solution for quickly generating, grading, and analyzing outcomes-aligned assessments. Assessment questions are used to evaluate student comprehension and are also powerful learning tools: decades of research show that answering questions is a better way to gain and retain knowledge than re-reading and reviewing previously studied material. The difficulty in translating this research into results is that assessments take a long time to write and to grade. ASPOA removes this barrier and can be used with articles, case studies, web pages, and similar materials that do not come with pre-existing exercises. It provides affordances for improving teaching and learning: instructors can use it to diagnose student learning and adapt their instruction on a class or individual basis, and students can use it for self-testing and self-guided learning. In addition, the Phase I research produced scientific contributions to automated question generation and outcomes alignment and uncovered ways to further improve question quality and reduce authoring times in the future.
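To make the outcomes-alignment idea concrete, the sketch below shows a generic text-similarity baseline for matching candidate questions to learning outcomes. It is not ASPOA's actual (unpublished) method; the outcomes, questions, and the use of scikit-learn TF-IDF vectors with cosine similarity are illustrative assumptions, and this is exactly the kind of standard technique whose limits on short, domain-heavy text ASPOA's alignment method was designed to overcome.

```python
# Illustrative sketch only: a generic TF-IDF / cosine-similarity baseline for
# aligning questions with learning outcomes. ASPOA's actual method is not
# described here; all data below is hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

outcomes = [
    "Balance chemical equations for simple combustion reactions.",
    "Explain periodic trends in atomic radius and electronegativity.",
]
questions = [
    "Which coefficients balance the combustion of methane?",
    "Why does atomic radius decrease across a period?",
]

# Fit one vocabulary over outcomes and questions so the vectors are comparable.
vectorizer = TfidfVectorizer(stop_words="english")
vectorizer.fit(outcomes + questions)

# Rows are questions, columns are outcomes.
scores = cosine_similarity(
    vectorizer.transform(questions), vectorizer.transform(outcomes)
)

# Report the best-matching outcome for each question.
for question, row in zip(questions, scores):
    best = row.argmax()
    print(f"{question!r} -> outcome {best} (similarity {row[best]:.2f})")
```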
Broader Impacts: The ASPOA project is targeting higher education in the United States, where over 4,500 colleges and universities serve more than 21 million students. These institutions face serious and costly problems with retention (due in part to student failure) and are required to collect data on student learning outcomes for program evaluation and accreditation. ASPOA addresses these problems by improving learning through assessment and by providing assessment data correlated with outcomes. Moreover, the ability to generate assessments in real time can have a transformative impact on the way instructors interact with their students in the classroom. ASPOA is also applicable to K-12, where teachers can use it to improve student learning in ways that meet the requirements of Common Core standards and No Child Left Behind initiatives. One of the beta sites for further development of ASPOA includes two departments in a College of Education, where ASPOA is viewed as a tool that K-12 teachers can use in the classroom. It is anticipated that ASPOA will be disseminated to K-12 teachers through professional preparation programs in Colleges of Education, adding to its overall impact.

Budget Start: 2012-07-01
Budget End: 2013-05-31
Fiscal Year: 2012
Total Cost: $149,998
Name: Eduworks Corporation
City: Corvallis
State: OR
Country: United States
Zip Code: 97333