Today's clinical learning environments do not provide the level of deliberate practice, direct supervision, and rigorous assessment and feedback needed to develop diagnostic reasoning expertise. Clinical performance assessment emphasizes learner evaluation over learner development and lacks the rigor and utility needed for developmental purposes, and clinical teachers report particular difficulty diagnosing reasoning deficits for remediation. Further, medical students' diagnostic reasoning does not improve over the course of clinical training, and senior medical students show highly variable diagnostic performance that is often rated below expectations according to theory-based, validated scoring criteria. Independent practice does not necessarily improve the context for clinical reasoning; the majority of physicians' medical errors are thought to be diagnostic in nature. We propose to improve undergraduate medical education, minimizing the time to clinical competency for first-year residents, through targeted diagnostic reasoning skill development that (1) integrates basic science and clinical instruction; (2) provides deliberate practice with structured, case-based learning opportunities; and (3) enables anytime/anywhere learning that fits the demanding schedules of most medical students. Southern Illinois University School of Medicine (SIUSOM) is a recognized leader in using performance-based clinical competency exams to enhance reasoning skill acquisition among medical students. These exams feature clinical scenarios with standardized patients followed by diagnostic justification (DXJ) essays, in which students explicitly describe the thought process used to reach a final diagnosis. These essays are the most reliable method of assessing diagnostic strategies but are not in use at the majority of medical schools, though interest in improving diagnostic reasoning instruction and assessment during undergraduate medical education is widespread. Barriers to widespread adoption of this approach are (1) the time-consuming need to hand score each essay and (2) the difficulty of accurately and consistently identifying the causes of strategy failures. This project will develop an application that provides automated scoring of DXJ essays, identifies the underlying causes of failure when students perform poorly, and delivers feedback with instructional strategies for remediation specific to each deficit. We propose the following specific aims: (1) improve the reliability of human scoring of DXJ essays; (2) extend the automated scoring algorithms; (3) automate reasoning failure categorization and remediation; (4) complete the software development required to deliver the commercial product; and (5) evaluate the predictive validity of automatically scored DXJ essays. The proposed product represents a significant shift in undergraduate medical training and, through Phase III dissemination, will address a critical gap between education and practice in academic medicine.
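The abstract does not specify the scoring algorithms themselves; as a rough illustration only, automated scoring of free-text justification essays is often approached as supervised text regression against human rubric scores. The sketch below assumes that framing, and the example essays, the 0-10 score scale, and the TF-IDF-plus-ridge pipeline are hypothetical choices, not the project's actual method.

```python
# Illustrative sketch only: one common way to automate essay scoring is to train
# a supervised model on essays that humans have already scored with a rubric.
# The pipeline, features, and data here are assumptions for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Hypothetical training data: DXJ essays paired with human rubric scores (0-10).
essays = [
    "Chest pain radiating to the arm with ST elevation suggests acute MI ...",
    "I attributed the fatigue to stress and did not weigh the abnormal TSH ...",
]
human_scores = [9.0, 3.0]

# Word and bigram TF-IDF features feeding a regularized linear regression.
scorer = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),
    Ridge(alpha=1.0),
)
scorer.fit(essays, human_scores)

# Predict a rubric-style score for a new, unseen justification essay.
new_essay = ["Dyspnea and leg swelling point to heart failure; I excluded PE because ..."]
print(scorer.predict(new_essay))
```

In practice such a model would be trained on a large corpus of hand-scored essays and validated against human raters; the project's Aim 1 (improving the reliability of human scoring) would supply the reference scores any automated approach depends on.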

Public Health Relevance

Today's clinical learning environments do not provide the level of deliberate practice, direct supervision, and rigorous assessment and feedback needed to develop diagnostic reasoning expertise. Better preparation during undergraduate medical education can shorten the time to competency for first-year residents, improving patient outcomes. We propose to develop and test a technology-enabled, deliberate-practice approach to training diagnostic strategy that includes automated scoring of diagnostic justification essays, identification of specific diagnostic strategy failures, and targeted remediation. The proposed product represents a significant shift in undergraduate medical training and, through Phase III dissemination, will address a critical gap between education and practice in academic medicine.

Agency
National Institutes of Health (NIH)
Institute
National Institute of General Medical Sciences (NIGMS)
Type
Small Business Technology Transfer (STTR) Grants - Phase II (R42)
Project #
5R42GM108104-03
Application #
9537633
Study Section
Special Emphasis Panel (ZRG1)
Program Officer
Cole, Alison E
Project Start
2014-06-01
Project End
2019-07-31
Budget Start
2018-08-01
Budget End
2019-07-31
Support Year
3
Fiscal Year
2018
Total Cost
Indirect Cost
Name
Parallel Consulting, LLC
Department
Type
DUNS #
121080290
City
Petaluma
State
CA
Country
United States
Zip Code
94952