Evaluation of Faculty Debriefing Post-Simulation Events
Introduction
Debriefing is a core component of simulation-based learning. Reflection on the events of a completed simulation is a crucial step whereby participants learn and modify their behavior.[1] This reflection is generally guided by a facilitator whose goal is to identify knowledge gaps and attempt to address them.[2] Commercially available courses exist to teach debriefing, and workshops devoted to improving these skills are often part of the curriculum at national and international simulation courses. In addition, many programs host their own internal debriefing training. Numerous debriefing models exist to help facilitate the debriefing of participants within a simulation.[3]
Fewer models exist to facilitate the debriefing of the faculty running the simulation. As of this publication date, the International Nursing Association for Clinical Simulation and Learning (INACSL) has created a curriculum of best practices in simulation, including session facilitation and debriefing. While the summary article mentions maintaining debriefing skills through observed practice and peer evaluation, these evaluation tools are not explicitly addressed.[4]
Several post-simulation evaluation tools have been created to meet this need, but they vary in focus: some assess the simulation as a whole, others focus only on the debriefing portion, and a few focus specifically on the facilitator.
OSAD
One early tool created to evaluate the faculty debriefer is the Objective Structured Assessment of Debriefing (OSAD).[5] First described in the surgery literature in 2012, the OSAD was developed through a review of the existing literature and focused interviews with providers and recipients of debriefing. These results were synthesized into eight features essential to an effective debriefing, which became the categories of the final OSAD: approach, environment, engagement, reaction, reflection, analysis, diagnosis, and application. Trained observers rate each category on a 5-point Likert scale with descriptive anchors. The benefits of the OSAD include its brevity (estimated at 5 minutes to complete), validity, and interrater reliability.[5]
A similar development process was used to develop a pediatric-specific OSAD.[6] This tool had related categories and 5-point Likert benchmarks. Several committees and simulation centers later validated and adopted it for debriefing standardization and faculty development.[6]
Citing concerns for the traditional, paper-based form of the OSAD, other researchers have created a modified electronic version (eOSAD).[7] These authors note the benefits of good interrater reliability, the ease of use of video-recorded debriefing sessions, and the ability to add comments, which were missing from the traditional OSAD survey. They also describe protection from data corruption as a benefit, namely by eliminating the risk of losing paper copies or exposing data to transcription errors. However, one major limitation of the eOSAD is its requirement for real-time computer and internet access.
DASH
An alternative to the OSAD for evaluating faculty debriefing and simulation sessions is the Debriefing Assessment for Simulation in Healthcare (DASH). This tool, first described in 2012, evaluates six elements of debriefing.[8] Each element is rated on a 7-point effectiveness scale using descriptions of observable behaviors as anchors. The DASH has been validated and has demonstrated reliability, but it requires standardized rater training, which generally takes place via webinar.
The authors of the DASH note that it applies to simulations across various domains and disciplines.[8] Indeed, since its development, it has been used to evaluate debriefing in several contexts. In addition to its use in faculty development, it has measured outcomes in research studies with a simulation component. For example, a modified version (the DASH student version) has been used to compare faculty-led and resident-led debriefing sessions for medical students and residents.[9][10][11] The DASH has also been used to evaluate interprofessional simulation debriefings.[12]
PADI
More recently, the allied health literature introduced an alternative evaluation tool, the Peer Assessment Debriefing Instrument (PADI), which adds a self-evaluation component to the post-debriefing assessment.[13] The PADI evaluates eight aspects of planning and conducting a simulation debriefing. The debriefer and an evaluator each rate performance on multiple elements within each domain on a 4-point Likert scale. They then compare responses, opening a conversation that allows the debriefer to focus the feedback on the particular areas they want the evaluator to address.[13] The tool is reliable and valid for evaluating debriefing across healthcare disciplines.
The creators of the PADI suggest its benefits include the short time necessary to learn and implement the tool.[14] They also suggest it could be a valuable data source when evaluating teaching skills and effectiveness.[14]
Other Tools
Some authors have created debriefing evaluation tools for their own studies, but these are not commonly used outside that setting.[15][16] Others modify existing scales to fit their needs; for example, a self-reported debriefing quality scale based on the OSAD and DASH was created for the TeamGAINS debriefing approach.[17]
Still others dedicate a portion of a more comprehensive instrument to debriefing, as in the Facilitator Competency Rubric, a holistic evaluation of nursing simulation facilitators.[18]
Several tools seek participant evaluations of the debriefer as part of a larger assessment of the simulated learning session. These include the Simulation Design Scale, created by the National League for Nursing, the Simulation Effectiveness Tool-Modified, and the Debriefing Experience Scale.[19][20] Each is a post-event evaluation in which students describe their perception of a faculty member's or simulation event's effectiveness.
Clinical Significance
Evaluations of faculty debriefing after simulations have no direct clinical significance. Instead, tools such as the OSAD, DASH, and PADI allow faculty to receive an assessment of their debriefing skills during a single simulation event; this feedback may improve debriefing skills, which may in turn improve learning and, ultimately, clinical care.
Enhancing Healthcare Team Outcomes
Simulation-based medical education is a growing component of medical and nursing education. It is used to test systems, enhance communication, and improve teamwork.[21][22] The post-simulation debriefing is the primary venue for exploring and addressing knowledge and behavior gaps. Principles such as psychological safety and a nonjudgmental attitude are crucial to enhancing this learning environment.[23] Faculty development for the facilitators leading these debriefing sessions may improve debriefing quality and, by extension, the team's learning.
References
Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2007 Summer:2(2):115-25. doi: 10.1097/SIH.0b013e3180315539. Epub [PubMed PMID: 19088616]
Dismukes RK, Gaba DM, Howard SK. So many roads: facilitated debriefing in healthcare. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2006 Spring:1(1):23-5 [PubMed PMID: 19088569]
Sawyer T, Eppich W, Brett-Fleegler M, Grant V, Cheng A. More Than One Way to Debrief: A Critical Review of Healthcare Simulation Debriefing Methods. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2016 Jun:11(3):209-17. doi: 10.1097/SIH.0000000000000148. Epub [PubMed PMID: 27254527]
Sittner BJ, Aebersold ML, Paige JB, Graham LL, Schram AP, Decker SI, Lioce L. INACSL Standards of Best Practice for Simulation: Past, Present, and Future. Nursing education perspectives. 2015 Sep-Oct [PubMed PMID: 26521497]
Arora S, Ahmed M, Paige J, Nestel D, Runnacles J, Hull L, Darzi A, Sevdalis N. Objective structured assessment of debriefing: bringing science to the art of debriefing in surgery. Annals of surgery. 2012 Dec:256(6):982-8. doi: 10.1097/SLA.0b013e3182610c91. Epub [PubMed PMID: 22895396]
Runnacles J, Thomas L, Sevdalis N, Kneebone R, Arora S. Development of a tool to improve performance debriefing and learning: the paediatric Objective Structured Assessment of Debriefing (OSAD) tool. Postgraduate medical journal. 2014 Nov:90(1069):613-21. doi: 10.1136/postgradmedj-2012-131676. Epub 2014 Sep 8 [PubMed PMID: 25201993]
Zamjahn JB, Baroni de Carvalho R, Bronson MH, Garbee DD, Paige JT. eAssessment: development of an electronic version of the Objective Structured Assessment of Debriefing tool to streamline evaluation of video recorded debriefings. Journal of the American Medical Informatics Association : JAMIA. 2018 Oct 1:25(10):1284-1291. doi: 10.1093/jamia/ocy113. Epub [PubMed PMID: 30299477]
Brett-Fleegler M, Rudolph J, Eppich W, Monuteaux M, Fleegler E, Cheng A, Simon R. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2012 Oct [PubMed PMID: 22902606]
Cooper DD, Wilson AB, Huffman GN, Humbert AJ. Medical students' perception of residents as teachers: comparing effectiveness of residents and faculty during simulation debriefings. Journal of graduate medical education. 2012 Dec:4(4):486-9. doi: 10.4300/JGME-D-11-00269.1. Epub [PubMed PMID: 24294426]
Adams T, Newton C, Patel H, Sulistio M, Tomlinson A, Lee W. Resident versus faculty member simulation debriefing. The clinical teacher. 2018 Dec:15(6):462-466. doi: 10.1111/tct.12735. Epub 2017 Nov 16 [PubMed PMID: 29144023]
Doherty-Restrepo J, Odai M, Harris M, Yam T, Potteiger K, Montalvo A. Students' Perception of Peer and Faculty Debriefing Facilitators Following Simulation-Based Education. Journal of allied health. 2018 Summer:47(2):107-112 [PubMed PMID: 29868695]
Brown DK, Wong AH, Ahmed RA. Evaluation of simulation debriefing methods with interprofessional learning. Journal of interprofessional care. 2018 Jul 19 [PubMed PMID: 30024297]
Saylor JL, Wainwright SF, Herge EA, Pohlig RT. Peer-Assessment Debriefing Instrument (PADI): Assessing Faculty Effectiveness in Simulation Education. Journal of allied health. 2016 Fall:45(3):e27-30 [PubMed PMID: 27585622]
Saylor JL, Wainwright SF, Herge EA, Pohlig RT. Development of an Instrument to Assess the Clinical Effectiveness of the Debriefer in Simulation Education. Journal of allied health. 2016 Fall:45(3):191-8 [PubMed PMID: 27585615]
Kable AK, Levett-Jones TL, Arthur C, Reid-Searl K, Humphreys M, Morris S, Walsh P, Witton NJ. A cross-national study to objectively evaluate the quality of diverse simulation approaches for undergraduate nursing students. Nurse education in practice. 2018 Jan:28():248-256. doi: 10.1016/j.nepr.2017.10.010. Epub 2017 Oct 12 [PubMed PMID: 29195107]
Gururaja RP, Yang T, Paige JT, Chauvin SW. Examining the Effectiveness of Debriefing at the Point of Care in Simulation-Based Operating Room Team Training. 2008 Aug [PubMed PMID: 21249934]
Kolbe M, Grande B, Spahn DR. Briefing and debriefing during simulation-based training and beyond: Content, structure, attitude and setting. Best practice & research. Clinical anaesthesiology. 2015 Mar:29(1):87-96. doi: 10.1016/j.bpa.2015.01.002. Epub 2015 Jan 28 [PubMed PMID: 25902470]
Leighton K, Mudra V, Gilbert GE. Development and Psychometric Evaluation of the Facilitator Competency Rubric. Nursing education perspectives. 2018 Nov/Dec:39(6):E3-E9. doi: 10.1097/01.NEP.0000000000000409. Epub [PubMed PMID: 30335707]
Franklin AE, Burns P, Lee CS. Psychometric testing on the NLN Student Satisfaction and Self-Confidence in Learning, Simulation Design Scale, and Educational Practices Questionnaire using a sample of pre-licensure novice nurses. Nurse education today. 2014 Oct:34(10):1298-304. doi: 10.1016/j.nedt.2014.06.011. Epub 2014 Jul 9 [PubMed PMID: 25066650]
Leighton K, Ravert P, Mudra V, Macintosh C. Updating the Simulation Effectiveness Tool: Item Modifications and Reevaluation of Psychometric Properties. Nursing education perspectives. 2015 Sep-Oct [PubMed PMID: 26521501]
Miller D, Crandall C, Washington C 3rd, McLaughlin S. Improving teamwork and communication in trauma care through in situ simulations. Academic emergency medicine : official journal of the Society for Academic Emergency Medicine. 2012 May:19(5):608-12. doi: 10.1111/j.1553-2712.2012.01354.x. Epub [PubMed PMID: 22594369]
Auerbach M, Roney L, Aysseh A, Gawel M, Koziel J, Barre K, Caty MG, Santucci K. In situ pediatric trauma simulation: assessing the impact and feasibility of an interdisciplinary pediatric in situ trauma care quality improvement simulation program. Pediatric emergency care. 2014 Dec:30(12):884-91. doi: 10.1097/PEC.0000000000000297. Epub [PubMed PMID: 25407035]
Rudolph JW, Simon R, Dufresne RL, Raemer DB. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2006 Spring:1(1):49-55 [PubMed PMID: 19088574]