Jim Carlson, PhD, discusses a program his school uses to enhance clinical reasoning competency.
Jim Carlson, PhD, PA-C, describes Rosalind Franklin University's use of a web-based virtual patient simulator to enhance case-based learning in teaching clinical reasoning.
Clinical reasoning is complex, requiring significant time and experience to master. Traditionally, this competency is acquired through didactic coursework followed by an apprenticeship phase where learners are mentored by experienced clinicians.1 Case-based learning (CBL) is frequently used to teach clinical reasoning.2 This paper describes the use of i-Human, a web-based virtual patient encounter simulator, to enhance CBL in both physician assistant and medical school curricula.
CBL is an instructional approach that uses patient vignettes to prepare students for clinical training, linking theory to practice.2 CBL follows an inquiry-based method in which learners, working independently or in groups, engage in clinical reasoning by collecting clinical data (the history and physical examination, or H&P), determining differential diagnoses, ordering and interpreting clinical tests, and constructing a management plan. Case vignettes are selected by faculty to meet specific clinical reasoning learning objectives. CBL is similar to problem-based learning (PBL) and even bedside teaching with actual patients. However, CBL is generally more structured, relying on a guided inquiry model in which learners are directed through a series of steps that model clinical practice while receiving feedback to hone decision-making accuracy.
CBL patient vignettes may be text-, computer-, or video-based, or may use other forms of simulation-based training (SBT). Standardized patient (SP) or mannequin-based cases are frequently used to mirror realistic encounters, reliably assess learner performance, and provide highly specific, individualized feedback. However, SP and mannequin-based simulations are often resource intensive. Faculty time is required to write case objectives and scripts, SPs must be trained by qualified educators, and regular quality assurance is needed to ensure accurate case portrayal and calibration to the script and rating tools. High-fidelity mannequins require "animation" by trained simulation technicians, and faculty presence is needed to guide the flow of case simulations and to review and assess student performance. While high in face validity and effective for teaching and assessment, the complexities associated with such simulations create scalability challenges that can prevent regular implementation, especially for programs with large class sizes.
Virtual patients (VP) provide an opportunity to engage students in CBL with greater frequency and fewer resources than other forms of simulation. Cook and Triola define a VP as a "specific type of computer program that simulates real-life clinical scenarios; learners emulate the roles of healthcare providers to obtain a history, conduct a physical exam, and make diagnostic and therapeutic decisions".3,4 Assessment may be automated, and case debriefing can be individualized as with SP or mannequin-based simulation but delivered with fewer faculty resources. Once developed, a VP can be disseminated to an unlimited number of users, offering a solution to the scalability problems associated with other forms of SBT. Based on these advantages, our institution decided to adopt and integrate i-Human Patients® or "i-Human", www.i-human.com, a web-based virtual patient platform, to provide greater access to CBL and complement our existing mannequin and SP-based activities.
The i-Human case player simulates a wide range of clinical conditions and reasoning activities. Users interact with a patient avatar to perform discrete clinical reasoning tasks: conducting comprehensive or problem-focused H&Ps, determining differential diagnoses, ordering and interpreting tests, and determining final diagnoses and management plans. Instructors can draw from an existing bank of cases or create their own using an authoring tool. Patients are highly customizable, including male, female, adult, and pediatric presentations of varied cultural backgrounds. Case authors can customize the case flow to include strategies that provoke specific clinical reasoning behaviors, such as identifying pertinent findings, linking findings to diagnoses, ranking diagnostic hypotheses, and linking each test ordered to the differential diagnosis selected. Figure 1 demonstrates the i-Human user interface.
i-Human case play follows a guided inquiry approach where students ask questions and make decisions as the case player advances sequentially through the patient encounter vignette, mirroring the clinical reasoning process for the learner. Students have the freedom to collect clinical information and commit to clinical decisions, but feedback can be given at each step to teach optimal decision making and prevent learners from straying too far from the appropriate decision-making process. Conversely, feedback can also be withheld at any or all points to allow learners the opportunity to make decision errors and learn from those mistakes. This flexibility allows instructors to calibrate case difficulty to learner skill level, a feature shown to promote effective learning.5,6 To strengthen our clinical reasoning instruction and assessment, and to explore how a virtual case player might be effectively used within a larger curriculum, our faculty decided to pilot i-Human in four specific ways:
- Engage and assess individual Physician Assistant (PA) student clinical reasoning in a CBL course.
- Promote collaborative learning in small group CBL within the medical curriculum.
- Improve interactive learning in larger group case studies in PA and medical training.
- Study diagnostic reasoning behavior, cognitive bias, and diagnostic error.
Engaging and assessing individual PA student clinical reasoning
i-Human was incorporated into the redesign of a clinical reasoning course within our Physician Assistant (PA) curriculum. Clinical Decision Making (CDM) is a required didactic course for all first-year PA students during their pre-clinical training. Course objectives include learning how to perform comprehensive and focused history and physical examinations (H&P), developing differential diagnoses and treatment plans, and documenting clinical encounters. Content is delivered using CBL, allowing students to actively engage in and receive feedback on the clinical reasoning process.
Prior to the redesign, the course was largely delivered in a classroom format where students collected the case H&P verbally in a "round robin" format, with the instructor playing the role of the patient and answering questions as the class asked them. Students were then tasked with determining a differential diagnosis, linking the H&P findings to support the diagnostic hypotheses suggested, and ordering initial testing. After class, students received the results of the tests they ordered, determined a final diagnosis, and submitted a case write-up detailing the pathophysiology of the disease diagnosed. Because the H&P was taken as a class, this format did not allow for individualized engagement and assessment of these skills. In recent years, standardized patient and mannequin-based simulations were introduced to give students more immersive experiences. These additions provided an opportunity to better assess individual student reasoning proficiency, but use of mannequin-based and standardized patient cases was sporadic due to resource and time limitations.
Case studies typically delivered by faculty during in-class face to face sessions were developed into i-Human cases for individualized case play. Students now perform a focused H&P, determine a differential diagnosis, select diagnostic tests, review the returned diagnostic tests, determine a management plan, and document the case in a SOAP note format all within i-Human. This format allows for case play in a “flipped” classroom design where students experience the case prior to class sessions and come prepared to discuss case findings rather than spending precious class time collecting clinical information.
Course surveys demonstrate that 100% of students agreed or strongly agreed that virtual case play was a valuable addition to the course and that individual case play was helpful to learning. Specifically, students said i-Human cases brought a greater sense of realism, allowing them to feel engaged in authentic clinical reasoning. Faculty were pleased with greater access to individual student data for grading and debriefing purposes. SP and mannequin-based case studies are still used to assess psychomotor constructs such as interpersonal communication and direct physical examination. However, reasoning is now more regularly reviewed using i-Human, allowing faculty the opportunity to provide feedback specific to individuals and the class overall. i-Human continues to be used in conjunction with SP and mannequin-based cases to engage and assess PA student clinical reasoning behaviors in this CBL course.
Promoting collaborative learning in small group CBL
We also implemented i-Human to augment small group case-based learning within the medical school curriculum. Our M2 Clinical Skills Course is a two-week immersive session offered before students transition to M3 clinical rotations. Objectives include reinforcement of clinical reasoning skills learned during pre-clinical didactic coursework. Content is organized around specific disciplines, including surgery, internal medicine, women's health, pediatrics, and other core clinical rotations, to help students apply their developing clinical skill set to the specialties and environments in which they will practice during the third year. Course content is delivered in a CBL format where small groups of 7-8 students engage patient case studies or solve clinical problems under the guidance of two senior-level (M4) students participating in a teaching elective.
i-Human is used on an interactive SMART board (see Figure 2). Cases are scripted to deliberately engage small group collaboration using several tools embedded in the i-Human platform. Students perform a focused H&P, answer basic science and clinical questions relevant to the case topics, determine a differential diagnosis, and develop initial plans for management. Ambiguous clinical information is included to promote discussion and debate between learners, an exercise that strengthens critical thinking and reasoning. For example, if the case includes abnormal heart or lung sounds, the group is asked to come to a consensus on what the finding is (e.g., wheezing vs. crackles) and its significance to the case presentation. Additionally, quizzing features and breakout exercises (e.g., acid-base calculations, anatomy review) built into i-Human can be completed on the SMART board to link basic science with clinical concepts presented in the case.
Student surveys and faculty feedback indicate that the virtual platform enhances collaborative learning. Prior versions of the course relied on paper case studies presented in larger group sessions, offering limited realism and interaction between learners. i-Human appears to have strong face validity in that learners report feeling engaged in authentic decision making. M4 facilitators, all of whom took prior iterations of the course without i-Human, found the virtual patient format engaging and effective at promoting small group discussion and collaboration. They also found it helpful in structuring their teaching and keeping the group on task, since it modeled the clinical decision-making sequence and provided learning exercises that could be used to promote group discussion and consensus building.
Promoting collaborative learning in larger groups
As noted, using virtual patient technology to engage students in CBL-based clinical reasoning in a small group format was effective and well received. This is not surprising, since studies show that CBL is often most effective when it promotes social learning and collaboration.2 However, recruiting small group facilitators and finding space for many small groups can be difficult to sustain on a regular basis. Additionally, scripting highly immersive, detailed case studies for individualized student case play is resource intensive.
CBL has been used, both informally and formally, during large group faculty-led sessions at our institution for decades. Classroom-based, faculty-led case studies are well received by students and faculty because they efficiently model the clinical reasoning process. Traditionally, these have been delivered as paper or PowerPoint cases during lecture-oriented sessions, a format limited in student interaction and unable to place students in learning activities where they feel they are making authentic clinical decisions (suspension of disbelief).
We use i-Human to bring greater collaboration and realism to these traditional case studies. Using a large room with tables seating eight students each, up to 300 students can be accommodated. A central computer "plays" i-Human while the case output is replicated on monitors at each table, allowing students to view and experience the case (see Figure 3). Thus, a single faculty member can provide a guided small group CBL experience for 300 students. Case play is frequently paused to encourage students to discuss and reach consensus on the meaning of H&P findings, laboratory values, and final diagnostic hypotheses. As with the small group sessions in the M2 skills course, the facilitated i-Human case player enables larger groups to realistically and collaboratively work through a variety of case studies and more fully experience the clinical reasoning process. Student and faculty feedback has been positive, and students now use i-Human more regularly in small groups and individually. Faculty note that the i-Human case player brings an element of realism and student engagement that was difficult to attain with paper or PowerPoint case studies.
Studying diagnostic reasoning behavior, cognitive bias, and diagnostic error
Missed or delayed diagnoses occur in approximately 15% of patient cases, and efforts are underway to better understand how to train providers to avoid errors and improve accuracy.7,8 Cognitive reasoning failures are frequently associated with diagnostic error, and most experts use a dual process (type 1 and type 2 reasoning) model to explain how clinicians make diagnostic decisions and to understand how these errors might occur.
Experts tend to make decisions by relying on pattern recognition learned from experience (type 1 reasoning), while novices tend to use a more thorough but cumbersome deductive approach (type 2 reasoning). Understanding, studying, and measuring the decision-making process is challenging.
Virtual patient technology offers a powerful platform to study and address this need. i-Human can engage clinicians in authentic patient vignettes while capturing information about the H&P they perform, the tests they order, and the diagnoses they consider. Authors can construct cases of varying levels of difficulty to study how difficulty affects accuracy. Similarly, case studies can be presented to both novice and experienced clinicians of different types (physician assistants, physicians, nurse practitioners, etc.) in an effort to better understand how each group uses clinical information when making diagnostic decisions. We are currently putting protocols in place to study these variables using the i-Human case player, in an effort to inform both teaching and practice.
Within our curricula, the i-Human virtual case player has been well received by faculty and students participating in CBL. It has allowed more frequent assessment and feedback on individual student clinical reasoning and greater collaboration in small and large group case studies. It serves as a method to study clinical reasoning behaviors and patterns. Our programs continue to rely heavily on mannequin and SP-based simulation, but adding a virtual case platform has been a valuable efficient way to more regularly engage students in meaningful learning activities focused on clinical reasoning.
About the Author
James (Jim) Carlson, PhD, PA-C, is the Vice Dean, College of Health Professions, and Associate Vice President for Healthcare Simulation at Rosalind Franklin University of Medicine and Science (RFUMS). He has over 12 years of experience in teaching, health professions simulation, and assessment. His research interests include simulation-based assessment, interprofessional education, and clinical reasoning evaluation.
References
- Bowen JL. Educational strategies to promote clinical diagnostic reasoning. N Engl J Med. 2006;355(21):2217-2225. doi:10.1056/NEJMra054782.
- Thistlethwaite JE, Davies D, Ekeocha S, et al. The effectiveness of case-based learning in health professional education. A BEME systematic review: BEME Guide No. 23. Med Teach. 2012;34(6):e421-e444. doi:10.3109/0142159X.2012.680939.
- Cook DA, Triola MM. Virtual patients: a critical literature review and proposed next steps. Med Educ. 2009;43(4):303-311. doi:10.1111/j.1365-2923.2008.03286.x.
- Cook DA, Erwin PJ, Triola MM. Computerized virtual patients in health professions education: a systematic review and meta-analysis. Acad Med. 2010;85(10):1589-1602. doi:10.1097/ACM.0b013e3181edfe13.
- Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10-28. doi:10.1080/01421590500046924.
- McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44(1):50-63. doi:10.1111/j.1365-2923.2009.03547.x.
- Berner ES, Graber ML. Overconfidence as a Cause of Diagnostic Error in Medicine. Am J Med. 2008;121(5):S2-S23. doi:10.1016/j.amjmed.2008.01.001.
- Graber ML. Educational strategies to reduce diagnostic error: can you teach this stuff? Adv Health Sci Educ. 2009;14(1):63-69. doi:10.1007/s10459-009-9178-y.