Simulation in Physician Training
Read the article and then use the corresponding link to take the CME test. Members must be logged in to take the test for free.
Duval County Medical Society CME Portal, January 2020
Date of Release: January 1, 2020
Date Credit Expires: January 1, 2022
Estimated Completion Time: 1 hour
The Duval County Medical Society (DCMS) is proud to provide its members with free continuing medical education (CME) opportunities in subject areas mandated and suggested by the State of Florida Board of Medicine to obtain and retain medical licensure. The DCMS would like to thank the St. Vincent’s Healthcare Committee on CME for reviewing and accrediting this activity in compliance with the Accreditation Council for Continuing Medical Education (ACCME). This month, the DCMS CME Portal includes the article “Simulation in Physician Training,” authored by Leslie Simon, DO, FACEP and Amanda Crichlow, MD, MSMS, FAAEM, which has been approved for 1 AMA PRA Category 1 Credit™. For a full description of CME requirements for Florida physicians, please visit www.dcmsonline.org.
Leslie Simon, DO, FACEP, Chair of Emergency Medicine and Medical Director at J. Wayne and Delores Barr Weaver Simulation Center, Mayo Clinic Florida and Amanda Crichlow, MD, MSMS, FAAEM, Attending Physician, Emergency Medicine at Florida Emergency Physicians of TeamHealth.
Learning Objectives:
1. Understand the reasoning behind simulation in physician education.
CME Credit Eligibility:
A minimum passing grade of 70% must be achieved. Only one re-take opportunity will be granted. If you take your test online, a certificate of credit/completion will be automatically downloaded to your DCMS member profile. If you submit your test by mail, a certificate of credit/completion will be emailed within 4 weeks of submission. If you have any questions, please contact the DCMS at 904-355-6561 or firstname.lastname@example.org.
Disclosure of Conflicts of Interest:
St. Vincent’s Healthcare (SVHC) requires speakers, faculty, CME Committee members and other individuals who are in a position to control the content of this educational activity to disclose any real or apparent conflict of interest they may have related to the content of this activity. All identified conflicts of interest are thoroughly evaluated by SVHC for fair balance, scientific objectivity of studies mentioned in the presentation and educational materials used as the basis for content, and appropriateness of patient care recommendations.
Leslie Simon, DO, FACEP and Amanda Crichlow, MD, MSMS, FAAEM report no significant relationships to disclose, financial or otherwise, with any commercial supporter or product manufacturer associated with this activity.
Joint Sponsorship Accreditation Statement:
This activity has been planned and implemented in accordance with the Essential Areas and policies of the Accreditation Council for Continuing Medical Education through the joint sponsorship of St. Vincent’s Healthcare and the Duval County Medical Society. St. Vincent’s Healthcare designates this educational activity for a maximum of 1 AMA PRA Category 1 Credit™. Physicians should only claim credit commensurate with the extent of their participation in the activity.
Medical simulation is used in medical education to facilitate the transition from medical student to practicing physician by allowing learners to apply medical knowledge in a safe, controlled setting. Simulation scenarios can be designed to assess each of the six Accreditation Council for Graduate Medical Education (ACGME) core competencies: patient care and procedural skills, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. It allows residents to learn through active participation, feedback, and self-reflection, and assists faculty in standardizing the evaluation process.
Most physicians can remember experiences as trainees when they were faced with a clinical situation that required knowledge or skills beyond their level of expertise. Simulation provides the opportunity to immerse trainees in life-like scenarios in a controlled environment that allows for “immediate feedback about questions, decisions and actions”1 for both technical and non-technical skills. It “complements, but does not duplicate,”1 education involving real patients and is “best employed to prepare learners for real patient contact.”1
The use of simulation in physician education is based on several educational theories. In 1946, Dale introduced his “Cone of Experience,”2 which suggested that the effectiveness of a learning intervention depends on how engaged learners are in the intervention. Simulation allows for active participation in the learning process, in contrast to passive educational methods such as lectures. Simulation is also based on the principles of experiential learning. Kolb and Gibbs described experiential learning as the process of participating in and reflecting on an experience and then applying what was learned in the future.3 Grant further expounded on these concepts, explaining that learning occurs when one has an experience, thinks about that experience, identifies learning needs that would improve future performance, enacts a plan to incorporate what was learned, and then applies the new knowledge in similar future experiences.3 After a simulation scenario, a debriefing allows trainees to reflect both in action and on action so they can continue to improve the targeted knowledge, skills, or attitudes.
Over the last several decades, research has consistently demonstrated that simulation is an effective educational strategy. Numerous meta-analyses have shown that its use produces better learning outcomes for knowledge, skills, and behaviors than traditional teaching methods.1,4,5,6,7 Mastery learning is a competency-based educational strategy whose goal is to ensure that learners achieve all educational objectives with “little or no variation in outcome.”5 Simulation-based medical education integrating mastery learning has been shown, for skills and procedures such as ACLS, lumbar puncture, and central line insertion, to produce better skill acquisition than traditional training modalities.5,6
The ACGME developed six core competencies that serve as common requirements for all ACGME-accredited residency and fellowship training programs. From these competencies, specialty-specific milestones and entrustable professional activities (EPAs) have been developed, which training programs use to assess their trainees in various domains. Simulation has been and continues to be used both to train and to assess trainees in the six core competencies: patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice.8 Simulation-based curricula can be tailored to teach and evaluate specific milestones and EPAs. They also allow for both the identification of knowledge gaps or skills deficiencies and the implementation of targeted interventions to address them.
Patient Care & Procedural Skills:
The ACGME requirement for patient care and procedural skills states that trainees must be capable of providing patient care that is both “appropriate and effective for the treatment of health problems and the promotion of health.”8 Appropriate and effective care involves the ability to obtain an accurate medical history, perform a physical examination, and synthesize the information gathered to formulate a suitable diagnostic and therapeutic plan for the patient’s chief complaint. Simulation-based medical education provides an environment for trainees to learn and improve these skills through the use of either standardized patients or high-fidelity manikins. In addition, under duty-hour restrictions, trainees have fewer patient encounters than their predecessors, which may lead to inadequate exposure to certain pathologies or patient populations.
Consequently, the ability to supplement real patient encounters with simulated ones helps fill these gaps and ensures that critical skill acquisition opportunities are provided for each trainee. Procedural competency is paramount in many specialties; the ACGME states that trainees must be able to “competently perform all medical, diagnostic and surgical procedures considered essential for the area of practice.”8 Research has shown that length of experience, reputation, and perceived mastery of knowledge and skill do not correlate with observed performance of a skill.9 Expert performance, however, develops through deliberate practice, which is the “provision of immediate feedback, time for problem-solving and evaluation, and opportunities for repeated performance to refine behavior.”9 Simulation-based procedural education, especially when combined with mastery learning and deliberate practice, has been shown in the literature not only to improve trainee performance but also to improve patient outcomes, with decreased complications in central line insertion, thoracentesis, and laparoscopic TEP hernia repair.7 Demonstration of procedural competency using simulation is a requirement for credentialing in ultrasound-guided central line placement at Mayo Clinic and has decreased both infection rates and rates of accidental arterial insertion.
Medical Knowledge:
The ACGME requires that trainees “demonstrate knowledge of established and evolving biomedical, clinical, epidemiological and social-behavioral sciences and apply this knowledge to patient care.”8 Miller’s assessment framework10 describes a pyramid in which the lowest level is knowledge (knows), followed by competence (knows how), performance (shows how), and action (does) at the top. The use of standardized written examinations to evaluate medical knowledge continues to be a component of medical education. In traditional medical education, most of the time is dedicated to knowledge acquisition rather than knowledge application. Simulation, however, provides a standardized mechanism for trainees to demonstrate the ability to apply their medical knowledge. At the University of Florida College of Medicine-Jacksonville, emergency medicine residents (during their PGY2 resuscitation block) complement their clinical exposures with participation in standardized simulation cases to ensure they are capable of medically managing critically ill patients.
Practice-Based Learning and Improvement:
Practice-based learning and improvement includes the ability of trainees to “evaluate their care of their patients, to appraise and assimilate scientific evidence and to continuously improve patient care based on constant self-evaluation and life-long learning.”8 Adult learners are much more receptive to change and to shifting their mental models when they recognize deficits in themselves. A facilitator skilled in debriefing helps trainees self-identify their strengths, their opportunities for improvement, and ways to incorporate evidence from scientific studies into clinical practice. Simulation training also allows formative evaluation feedback to be incorporated into practice and provides an opportunity to practice both delivering and receiving feedback in a non-threatening setting. At the University of Florida College of Medicine-Jacksonville, during interdisciplinary patient safety simulation scenarios, residents receive faculty-facilitated debriefing and conduct peer debriefings on the use of proper team communication techniques.
Interpersonal & Communication Skills:
The ACGME describes interpersonal and communication skills as the ability of trainees to effectively exchange information and collaborate with patients, their families, and health professionals.8 An in-depth knowledge of pathophysiology does not correlate with the ability to effectively explain a diagnosis to a patient or to communicate effectively as the leader of a medical team. A wide array of simulation-based curricula can be developed to teach and assess trainees’ ability to communicate with patients and to teach and assess team leadership and team communication skills. Interdisciplinary mock code training, obstetric emergency training such as a shoulder dystocia scenario, and trauma team resuscitation training are all examples of simulation-based team communication initiatives. Simulation-based communication curricula can also provide opportunities for trainees to practice addressing issues such as informed consent, refusal of care, medical error disclosure, and delivering bad news to patients and family members. For example, at Mayo Clinic, hematology fellows use simulation to learn to deliver a terminal cancer diagnosis to patients and their families.
Professionalism:
Professionalism incorporates the demonstration of a commitment to carrying out one’s responsibilities with adherence to ethical principles.8 Physicians must demonstrate compassion, integrity, and respect for others. This includes sensitivity and receptiveness to a diverse population. Simulation scenarios can be designed to specifically address religious, cultural, age, disability, gender, and sexual orientation issues. For example, a resident may be asked to manage a patient with severe anemia who refuses blood products for religious reasons, or an uncooperative, injured adult patient with autism. Simulation can also be used to teach sensitivity and appropriate sexual history taking in transgender patients. It is especially powerful when standardized patients can deliver feedback to providers on how they perceived their care during these trainings. Outside of patient satisfaction surveys, physicians often receive very little feedback on how patients respond to their “bedside manner” or insight into how subtleties such as body language and tone may shape patient perceptions. For example, internal medicine residents at Mayo Clinic use simulation to practice obtaining a sexual history from LGBT patients and then receive feedback on how they are perceived by both peer observers and the standardized patients themselves.
Systems-Based Practice:
Trainees must demonstrate an awareness of and responsiveness to the larger context and system of healthcare, as well as the ability to call effectively on other resources in the system to provide optimal healthcare. This includes working in inter-professional teams to enhance patient safety and patient care quality, identifying system errors, and implementing potential system solutions. The simulation center may be used in many innovative ways to enhance healthcare delivery. It has been used to address quality and safety issues such as blood bank policies, bone marrow harvest procedures, and central line infections. Simulations for housekeeping staff have enhanced cleaning procedures to reduce transmission of Clostridium difficile infections. Simulation can also be performed in situ, allowing learners to train and test the system in their own workspace. Operating room (OR) scenarios focusing on everything from adherence to procedural time-out policies to managing OR fires, malignant hyperthermia, or complications during liver transplantation continue to advance quality of care and safety. At the University of Florida College of Medicine-Jacksonville, in situ multidisciplinary neonatal resuscitation simulations provide opportunities to improve the management of critically ill neonates.
Safety issues identified while conducting simulations frequently have practice implications. Reporting these issues as near misses or adverse events may help prevent them from happening in the clinical setting, where actual provider injury could occur.
Conclusion:
Experiential learning through medical simulation can help resident physicians transition from students with medical knowledge to physicians with the skills needed for clinical practice. Simulation can be used to assess, refine, and remediate learners in all the ACGME competencies and can be used in creative ways to help both educators and trainees gain insight into their own skills and opportunities for growth.
References:
1. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005 Jan;27(1):10-28.
2. Lee SJ, Reeves TC. Edgar Dale: A significant contributor to the field of educational technology. Educational Technology. 2007; 47(6),56.
3. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc. 2007 Summer;2(2):115-25.
4. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. JAMA. 2011 Sep 7;306(9):978-88.
5. McGaghie WC, Issenberg SB, Barsuk JH, et al. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014 Apr;48(4):375-85.
6. McGaghie WC, Issenberg SB, Cohen ER, et al. Does simulation-based education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011 Jun;86(6):706–11.
7. Griswold-Theodorson S, Ponnuru S, Dong C, et al. Beyond the simulation laboratory: A realist synthesis review of clinical outcomes of simulation-based mastery learning. Acad Med. 2015 Nov;90(11):1553-60.
8. Common program requirements [Internet]. Chicago (IL): Accreditation Council for Graduate Medical Education; 2018 [cited Jul 2018]. Available from: http://www.acgme.org/What-We-Do/Accreditation/Common-Program-Requirements.
9. Ericsson KA. Deliberate practice and acquisition of expert performance: A general overview. Acad Emerg Med. 2008 Nov;15(11):988-94.
10. Miller G. The assessment of clinical skills/competence/performance. Acad Med. 1990 Sep;65(9 Suppl):S63–7.
To take the test and earn CME credit, click here.