Competency, Mastery and Deliberate Practice: Revisiting the Goals of Simulation-Based Assessments


Carla Pugh, MD, addresses lingering questions regarding which type of simulation and assessment venue best meets specific learning objectives.

Use of simulation-based assessments has continued to evolve. In early iterations, mannequin-based trainers were used for classroom and group-based training in cardiopulmonary resuscitation for basic and advanced cardiac life support certification. With the advent of objective, checklist-driven assessments, the 1970s brought us the Objective Structured Clinical Examination (OSCE). This venue allowed for several types of simulations, including task trainers, mannequins and standardized patients.1 History taking, physical examination and procedural skills were among the clinical competencies assessed. Subsequently, the development of full-body physiologic mannequins and virtual reality trainers allowed for more dynamic assessments in which decision making, communication and complex procedural skills could be assessed. Despite these advances, there are still lingering questions regarding which type of simulation and assessment venue best meets specific learning objectives. The goal of this article is to revisit some of the broader end points of assessment: competency, mastery and deliberate practice.

Competency

While mastery may be the implicit goal of learners exploring career-specific content domains, competency is the tried-and-true minimum requirement expected and used for workplace decisions. In 1999, the Accreditation Council for Graduate Medical Education and the American Board of Medical Specialties identified six core competencies: Patient Care and Procedural Skills (updated in 2010); Medical Knowledge; Interpersonal and Communication Skills; Professionalism; Practice-Based Learning and Improvement; and Systems-Based Practice. Similarly, competencies were developed for nurse residents and nurse professionals. For example, the National CNS (Clinical Nurse Specialist) Competency Task Force published a list of core competencies for clinical nursing that includes: Direct Care; Consultation; Systems Leadership; Collaboration; Coaching; Research; and Ethical Decision-Making, Moral Agency and Advocacy.

Today, most healthcare professionals have a specific set of core competencies by which they are assessed and taught. A common theme in the process of developing core competencies is the use of consensus agreement and a focus on providing high-quality patient care. Well-defined core competencies provide a strong framework for simulation-based assessments. Curriculum developers and simulationists may use the competencies to ensure global coverage of important clinical areas. Transitioning from a list of competencies to an actual assessment requires careful attention to the desired learning outcomes. While this can be facilitated by defining the knowledge, skills and attitudes necessary for successful patient care interactions, the wide variety of behaviors, responses and situations in clinical practice adds considerable complexity to the process.2


Figure 1. The multi-step process of developing valid and reliable simulation-based assessments. Image Credit: Carla Pugh, MD

In a 2003 article entitled “Reliability and Validity of a Simulation-based Acute Care Skills Assessment for Medical Students and Residents,”3 the authors detail the process of developing and evaluating a simulation-based assessment of acute care skills. Figure 1 highlights the major steps in this process. The starting point was a needs assessment. In reviewing the literature, the group found that the knowledge base for acute care skills had traditionally been assessed with pencil-and-paper tests, while physical examination skills had traditionally been assessed using standardized patients. The authors noted that critical care events are not easily modeled with standardized patients. Moreover, the ability to cite and describe acute care management strategies does not ensure that a physician can actually provide treatment. Despite these known weaknesses in the curriculum, graduating physicians are expected to be able to manage acute care patients.

Developing the simulation-based assessment of acute care skills was a multistep process. Once a full-body physiologic simulator was chosen, faculty used a consensus approach to select 10 clinical scenarios that fairly represented the content domain and the skills unique to critical and emergency situations. The next step involved development of performance checklists. Notable was the depth of attention applied to checklist development: “The checklist items were limited to less than 20 actions. A scoring weight ranging from 1 to 4 was also provided for each checklist item. The magnitude of the weight reflected the importance of the particular action in terms of patient care.” The checklists were then piloted to determine how well each scenario matched the clinical environment and expected clinical actions. After updating the checklists and scenarios based on the pilot, the formal assessment was administered. The assessment was video recorded, and debriefing sessions were conducted to review participant performance and experience with the simulation-based assessment. The raters were chosen carefully, and final scoring was conducted from the video recordings. The results of this study indicate that reasonably reliable and valid measures of clinical performance can be obtained from simulation exercises, provided that care is taken in the development and scoring of the scenarios. Another important finding was that a relatively large number of performance scenarios is likely required to obtain sufficiently accurate ability estimates.
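As a rough illustration of how such a weighted checklist can translate observed actions into a scenario score, consider the sketch below; the item names, weights and observed actions are hypothetical and are not drawn from the study itself.

```python
# Hypothetical weighted performance checklist for one simulated scenario.
# Item names, weights (1-4) and the observed actions are illustrative only.
CHECKLIST = [
    ("Checks responsiveness",     4),
    ("Calls for help",            3),
    ("Administers oxygen",        3),
    ("Orders appropriate fluids", 2),
    ("Documents reassessment",    1),
]

def score_scenario(performed_actions):
    """Return the percentage of weighted credit earned for one scenario."""
    earned = sum(w for item, w in CHECKLIST if item in performed_actions)
    possible = sum(w for _, w in CHECKLIST)
    return 100.0 * earned / possible

observed = {"Checks responsiveness", "Calls for help", "Administers oxygen"}
print(f"Scenario score: {score_scenario(observed):.1f}%")  # 76.9%
```

Weighting the items this way means that omitting a high-stakes action costs the learner far more than omitting a low-weight documentation step.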

Core competencies provide a useful framework for developing simulation-based assessments. However, the transition from a list of competencies to a reliable and valid performance assessment is a complex process.4,5 Lastly, it appears that a wide variety of clinical scenarios must be used to adequately assess competency in a pre-defined content domain.
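One way to see why scenario count matters is the Spearman-Brown prophecy formula, which projects how the reliability of a composite score grows as comparable scenarios are added. The sketch below is an illustrative shortcut only, with an assumed single-scenario reliability; it is not the psychometric analysis reported in the cited study.

```python
# Spearman-Brown projection: how composite reliability grows with more scenarios.
# r_single is an assumed reliability for a single-scenario assessment (illustrative only).
def projected_reliability(r_single: float, n_scenarios: int) -> float:
    """Project the reliability of the mean score across n comparable scenarios."""
    return (n_scenarios * r_single) / (1 + (n_scenarios - 1) * r_single)

r_single = 0.30  # assumed value, not an estimate from the cited study
for n in (1, 5, 10, 15):
    print(f"{n:2d} scenarios -> projected reliability {projected_reliability(r_single, n):.2f}")
# 1 -> 0.30, 5 -> 0.68, 10 -> 0.81, 15 -> 0.87
```

Under this illustrative assumption, it takes roughly ten scenarios before the projected reliability exceeds 0.8, which echoes the study's caution that a relatively large number of scenarios is needed.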

Mastery and Deliberate Practice

Mastery is defined as comprehensive knowledge or skill in a subject. It has also been defined as control or superiority over someone or something. While competency focuses on minimum proficiency in a broad content domain and is often the goal of certification examinations, mastery focuses on high-end expertise and is usually a self-motivated goal. In addition, attainment of mastery is historically judged by consensus among colleagues. The term mastery learning was coined by Benjamin Bloom in 1971. In a mastery learning classroom, students are helped to master each learning unit before proceeding to more advanced tasks. Formative assessment is frequent and inherent in the process, and performance criteria are explicit and inform the learning process. As such, mastery learning is more about the step-by-step achievement process, as opposed to traditional content-focused curricula in which students are serially exposed to content unrelated to achievement goals. Additionally, mastery learning requires well-defined learning objectives and performance criteria and focuses on overt behaviors that can be observed and measured. In this environment, students must show evidence of understanding specific material before moving on to the next lesson. Using criterion-referenced assessments, students are able to focus on achieving their personal best. Critics of mastery learning cite time constraints as a major flaw, noting their preference for, and need to, cover a large amount of material in a short amount of time. This is a breadth-versus-depth argument.


Figure 3. Dr. Anne O’Rourke, a trauma surgeon at the University of Wisconsin, in the performance measurement and motion tracking laboratory helping to set criteria for advanced suturing techniques.

When applied in healthcare learning environments using simulation, mastery learning requires that learners meet or exceed a minimum passing score on a simulated examination prior to performing the procedure or skill in actual clinical practice. Simulation-based mastery learning featuring deliberate practice gives residents and fellows the opportunity for individualized skills development and feedback.
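The gating logic itself is simple, as the minimal sketch below shows; the minimum passing score, the example scores and the retest limit are hypothetical and not taken from any specific program.

```python
# Sketch of a simulation-based mastery-learning gate: the learner repeats
# deliberate-practice attempts until the minimum passing score (MPS) is met,
# and only then is cleared to perform the skill in supervised clinical practice.
MPS = 85  # hypothetical minimum passing score, set in advance by faculty

def mastery_loop(assess, give_feedback, max_attempts=10):
    """Repeat simulated assessments until the MPS is met or attempts run out."""
    for attempt in range(1, max_attempts + 1):
        score = assess()              # simulated examination, scored 0-100
        if score >= MPS:
            return attempt, score     # cleared for clinical practice
        give_feedback(score)          # targeted, formative feedback between attempts
    raise RuntimeError("MPS not reached; escalate to faculty review")

# Toy demonstration with a learner whose scores improve after each round of feedback.
scores = iter([72, 80, 88])
attempt, final = mastery_loop(
    assess=lambda: next(scores),
    give_feedback=lambda s: print(f"{s} < {MPS}: more deliberate practice needed"),
)
print(f"Cleared on attempt {attempt} with a score of {final}")
```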

Deliberate practice has been defined as activities designed for the sole purpose of effectively improving specific aspects of an individual's performance. Formative assessments and timely feedback are integral components of deliberate practice. Research has shown that levels of expertise are highly correlated with the number of hours spent practicing. In a classic study of musicians, researchers found that high-level experts (or masters) had spent, on average, around 10,000 hours in solitary practice during their musical development by age 20, whereas the least accomplished expert musicians had spent around 5,000 hours and amateurs around 2,000 hours. The same trend has been noted for athletes, chess players and other professionals.


Figure 2. Medical students at Northwestern University engaging in a classroom-based deliberate practice exercise using a mannequin-based simulation with computer feedback.

A closer examination of the path to expertise reveals the distinct development of pattern recognition and information retrieval processes that are not present in amateurs or novices. Advances in our understanding of how experts think derive primarily from studies in which experts are instructed to think aloud while completing representative tasks in their domains, such as chess, music, physics, sports and medicine. A close look reveals that experts select relevant information from a situation, encode it in special representations in working memory and use this information for planning, evaluation and reasoning about alternative courses of action. In essence, the difference between experts and novices is not the amount and complexity of accumulated knowledge that can be memorized; the difference is more closely related to how knowledge is organized and represented. Experts' organization of knowledge around key domain-related concepts and solutions allows for rapid and reliable retrieval of relevant information. In contrast, novices encode knowledge using everyday concepts that are not domain specific, which makes retrieval and use of relevant knowledge difficult and unreliable. In addition, experts typically acquire domain-specific memory skills that allow them to rely heavily on long-term memory and dramatically expand the amount of information that can be kept accessible during planning and reasoning about alternative courses of action. The superiority of experts' mental representations allows them to adapt rapidly to changing circumstances and to anticipate future events. These same representations appear to be an essential component of experts' ability to monitor and evaluate their own performance. Deliberate practice is a natural part of this process.

Simulation-based assessments can greatly facilitate deliberate practice and help pave the road to mastery. To achieve this goal, a mastery learning approach with well-defined performance criteria appears essential.

Conclusion

We have reviewed the major components of competency and mastery. While both rely heavily on assessment, using simulation to achieve these goals requires different approaches. When the focus is competency, it is highly advisable to use nationally accepted, pre-defined core competencies as a framework. Designing simulation-based experiences for competency assessment is a complex process requiring consensus on the scope of content and continuous evaluation of scripted scenarios to ensure applicability. A wide variety of carefully designed scenarios is essential to achieving this goal. In contrast, mastery learning requires access to expert-based performance criteria. A unique aspect of this process is the potential for exceeding the performance criteria: if the pre-defined criteria are met or exceeded, the learner may then focus on achieving a personal best. This can be a continuous process guided by self-assessment and individually defined achievement goals based on the desired level of expertise.

About the Author

Carla Pugh, MD, PhD, FACS, is Associate Professor, Vice-Chair for Education and Patient Safety, and Clinical Director of the University of Wisconsin Health Clinical Simulation Program.

In 2011 Dr. Pugh received the Presidential Early Career Award for Scientists and Engineers. Dr. Pugh is also the developer of several decision-based simulators that are currently being used to assess intra-operative judgment and team skills.

She received her MD from Howard University College of Medicine and her PhD from the Stanford University School of Education, and completed a clinical fellowship in acute care surgery at the University of Michigan.

Dr. Pugh’s research interests include the use of simulation technology for medical and surgical education. She holds a method patent on the use of sensor and data acquisition technology to measure and characterize the sense of touch. Currently, more than 100 medical and nursing schools use one of Dr. Pugh’s sensor-enabled training tools for their students and trainees. The use of simulation technology to assess and quantitatively define hands-on clinical skills is one of her major research areas.

REFERENCES

1 Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. BMJ. 1975;1:447-451.

2 Cannon-Bowers J, Salas E. Teamwork competencies: the interaction of team member knowledge, skills and attitude. In: O’Neil HF, ed. Workforce Readiness: Competencies and Assessments. Mahwah, NJ: Lawrence Erlbaum Associates; 1997:151-174.

3 Boulet JR, Murray D, Kras J, Woodhouse J, McAllister J, Ziv A. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology. 2003 Dec;99(6):1270-80.

4 Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Technology-enhanced simulation to assess health professionals: a systematic review of validity evidence, research methods, and reporting quality. Acad Med. 2013 Jun;88(6):872-83.

5 Linn RL, Baker EL, Dunbar SB. Complex, performance-based assessment: expectations and validation criteria. Educational Researcher. 1991;20(8):15-21.
