
Record and Data Retention in Medical Simulation

Editor: Stormy M. Monks Updated: 7/10/2023 2:18:04 PM

Introduction

Simulation can produce multiple types of data, and in turn, these data can provide significant support for simulation planning, grading simulated activities, assessing attainment of educational goals and objectives, providing simulation usage reports for strategic planning and financial decisions, and processing academic records. Simulation data can be complex, often involving multiple data points for multiple learners and systems. Additionally, managing simulation records and ensuring database security across a range of simulation encounters can be challenging.[1]


Function

Data collection is the process of gathering relevant information to address a specific need. The need for data collection is increasing in the field of medical education, including simulation education, but growing data sources and regulation are making it difficult to retain and process information.[2] There are various methods of data collection, and the one chosen depends on the information required and the systems available. The complexity of the data collected can bring about challenges, such as ensuring that the correct amount and type of data addresses the appropriate need.

Data are imperative for tracking simulation utilization and for demonstrating and documenting the need for space, equipment, and personnel. Data points that simulation centers collect include, but are not limited to, set-up time, tear-down time, and the duration of the educational activity itself. Common utilization and cost metrics may include learner contact hours and the time faculty and healthcare simulation technology specialists spend preparing, facilitating, teaching, and evaluating a session. Room and equipment utilization are also important considerations for growth and strategic planning. The total duration of an activity and the number of individuals involved must be considered together with percent utilization to help form an effective model for scheduling efficiency.
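The utilization metrics above can be computed directly once set-up, activity, and tear-down times are recorded. A minimal sketch follows; the session figures, field names, and weekly room availability are illustrative assumptions, not a standard schema.

```python
from dataclasses import dataclass

# Hypothetical per-session record; fields and values are illustrative assumptions.
@dataclass
class SimSession:
    setup_min: int      # time to set up the room and equipment
    activity_min: int   # the educational activity itself
    teardown_min: int   # time to tear down and reset
    learners: int

    def total_min(self) -> int:
        return self.setup_min + self.activity_min + self.teardown_min

def room_utilization_pct(sessions, available_min: int) -> float:
    """Percent of available room time consumed, including set-up and tear-down."""
    used = sum(s.total_min() for s in sessions)
    return 100.0 * used / available_min

week = [
    SimSession(setup_min=30, activity_min=120, teardown_min=30, learners=8),
    SimSession(setup_min=45, activity_min=180, teardown_min=30, learners=12),
]

# Assuming one room available 8 hours/day, 5 days/week = 2400 minutes
print(f"{room_utilization_pct(week, 2400):.1f}% utilized")  # 18.1% utilized
```

Counting set-up and tear-down toward utilization, as here, reflects the true demand on the room; centers that schedule only activity time will understate how occupied a space really is.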

Capital equipment is an essential component of simulation training and, in many instances, is a unique factor that attracts customers to the simulation center. By tracking capital equipment utilization, purchasing decisions can be based on the number and frequency of uses. Understanding and tracking these items can help directors and financial stakeholders assess the need for warranties and estimate useful life before replacement. Appropriate data to support these business decisions will help ensure the sustainable delivery of simulation-based education. The type and sheer number of learners are important details for stakeholders, as the simulation center adapts to the learners who use it most.

A critical aspect of data collection relates to the cost of operation. Contact hours are defined as the number of hours that a simulation activity was run multiplied by the number of learners impacted. Contact hours in simulation can be used in lieu of clinical hours in some specialties, such as nursing, but this is not universally viewed as equivalent across specialties.[3]
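The contact-hours definition above (activity hours multiplied by learners impacted) can be computed and aggregated directly. The session figures below are illustrative assumptions.

```python
def contact_hours(activity_hours: float, learners: int) -> float:
    """Contact hours = hours the simulation activity ran x number of learners impacted."""
    return activity_hours * learners

# Example: a 2-hour scenario run for a cohort of 10 learners
print(contact_hours(2.0, 10))  # 20.0

# Aggregating across a semester of (hours, learners) sessions:
semester = [(2.0, 10), (1.5, 8), (3.0, 6)]
total = sum(contact_hours(h, n) for h, n in semester)
print(total)  # 50.0
```

Reported this way, contact hours scale with cohort size, which is why a short activity for a large group can outweigh a long activity for a small one in utilization reports.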

Beyond the amount of time a student spends in a simulation activity, their performance is of greater importance as educational programs move to competency-based mastery learning.[4] In these models, attainment of specific levels of skill and performance is used as a marker for advancement. The careful documentation of learner performance, and the ability to review or contest such performance, can become an area of contention. Simulation, with its experiential and performance-based activities, plays a unique role in the current healthcare education system.

While capturing data can provide a better understanding of how students learn, it can also put simulation centers at risk of breaching learner confidentiality. Traditional sources of evaluation data include manual records, learner notes, and direct observations of performance. Healthcare simulation adds electronic sources of evaluation data, such as automated performance reports from manikin and task trainer systems, and online and virtual learning activities now allow even more sources.[5]

Issues of Concern

The Family Educational Rights and Privacy Act (FERPA) addresses a learner’s educational records privacy, including rules regarding access to, modification of, and disclosure of such data. While FERPA addresses guidelines surrounding the educational records of minors, it is also relevant to learners of any age and educational path, including those engaged in learning through simulation (FERPA 34 CFR § 99.31).[6] It is also important to note that simulation centers may offer educational activities for minors, such as high school-level healthcare training. If this relationship exists, the simulation center may share educational information, such as performance ratings, regarding the minor with the minor's parents. FERPA’s general consent guidelines state that an educational institution cannot disclose educational records without prior consent from the learner (FERPA 34 CFR § 99.30).

There are exceptions to these guidelines, such as the allowance of providing “directory information” as well as providing information to parents of a “dependent student” without consent from the student (FERPA 34 CFR § 99.31(a)(11)). One exception that may be more relevant to simulation center learners relates to the dismissal of a learner due to academic failure and/or legal ramifications of performance. Other issues and concerns relating simulation centers to FERPA include, but are not limited to, potential liability when simulation reveals a system concern, videotaping of educational activities, in-situ simulation encounters, and permission related to images and social media (FERPA 34 CFR § 99.31).[6] It is important to note that learner and/or participant consent is necessary for simulation audio/video recording and images.

In addition to being created, data must eventually be purged and/or destroyed. Sensitive data, such as FERPA- or HIPAA-related data, must be handled according to federal, state, or institutional guidelines. With increased reliance on digital data sets, additional safeguards must be considered for the removal and disposal of computer systems and data-containing devices.[7] Although data retention and storage capacity may seem infinite, maintaining records beyond required regulation subjects the simulation program to additional review or audit. Data should be used to fulfill a purpose, such as teaching, assessment, or credentialing. Upon completion of that purpose, the data should be destroyed.[8]
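The purge-on-purpose-completion policy described above can be sketched as a periodic pass over a record store. This is a minimal illustration; the record fields and the 5-year retention period are assumptions for the example, since actual retention schedules come from the federal, state, or institutional guidelines cited above.

```python
from datetime import date, timedelta

# Hypothetical retention period; real values come from applicable regulations.
RETENTION = timedelta(days=5 * 365)

# Hypothetical record store: (record_id, purpose, date_purpose_completed)
records = [
    ("stu-001-checklist", "assessment", date(2017, 5, 1)),
    ("stu-002-video", "teaching", date(2023, 9, 15)),
]

def records_to_purge(records, today: date):
    """Return records whose purpose was fulfilled longer ago than the retention period."""
    return [r for r in records if today - r[2] > RETENTION]

for rec_id, purpose, _ in records_to_purge(records, date(2024, 1, 1)):
    print(f"purge {rec_id} (purpose '{purpose}' fulfilled)")
```

Keying the purge to the date the purpose was completed, rather than the creation date, matches the guidance above that data exists to fulfill a purpose and should be destroyed once that purpose is met.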

Curriculum Development

Simulation center usage, access, storage, maintenance, and audio/visual recording all capture data. Recorded data are used to enhance learner education, primarily through the assessment and evaluation of learner performance. Additionally, data can assist with curriculum development and enrichment. Gathering and retaining data on student performance is necessary for faculty to monitor learners and intervene early when remediation is required to improve performance. To identify learner performance, medical student and residency programs collect data on educational objectives met by simulation activities.

The Accreditation Council for Graduate Medical Education (ACGME) provides a framework of physician competency domains as a means of recording data related to a resident’s specialty performance milestones. Medical students are evaluated using the Association of American Medical Colleges (AAMC) Entrustable Professional Activities (EPAs), defined as responsibilities that medical students are expected to be able to perform unsupervised.[9] Milestones and EPAs are both observable and measurable and are currently assessed using simulation activities.

Procedural Skills Assessment

Skill assessment is a vital part of simulation; deliberate practice exercises and task trainer usage are included in most simulation educational activities.[10] Photos and/or videos of learners engaging in such activities require the same level of protection and privacy as other types of educational record data. If a photo and/or video is “directly related” to a learner and/or retained by the simulation center, the simulation center must follow the same legal requirements as for any other educational record (FERPA 34 CFR § 99.3).

However, images and videos of simulation activities typically include multiple learners, and the use of such records can only occur if the individual learner’s participation can be isolated from that of the other learners. If this is not possible without hindering the integrity of the educational activity, the learner would still have the ability to review the image and/or video under FERPA guidelines (FERPA 34 CFR § 99.3).

Medical Decision Making and Leadership Development

Enterprise-level simulation management systems have been designed by multiple manufacturers to assist with the assessment and evaluation of learners, inventory management, case and curriculum storage and tracking, center utilization, and learner attendance. These systems can also support financial decision-making based on the data collected and maintained. Many of these systems are web-based and can store data on-site or in remote, cloud-based solutions. Individual simulators may offer learner performance tracking specific to the product or system. It is important to ensure consistent storage and use practices across all systems used within a single center or institution.

Continuing Education

Due to its importance in higher education, refresher FERPA training is an important component for all faculty and staff of educational institutions. In addition to traditional data sources, computer simulators, VR systems, and other online and virtual learning platforms provide a novel set of data resources that must be appropriately managed. Although these systems and their underlying technology are new, their appropriate storage, use, and destruction are already outlined in federal guidelines. Although COVID-19 has reminded the simulation community of public health concerns, it is important to remember that healthcare status is protected on campus under FERPA and in clinical environments under HIPAA. Contact tracing does not negate this fact.[11][12]

Clinical Significance

Simulation training has become a common and effective method for assessing clinical competency. Great benefit exists through repetitive performance and deliberate practice. In the past, clinical skills were practiced on other learners or in a clinical environment. The Accreditation Council for Graduate Medical Education (ACGME) has claimed that simulation is the best way to assess procedural skills in the patient care domain.[13] One additional benefit of simulation-based training is scheduled, reproducible training opportunities that can match a curriculum plan. Skills can be evaluated using the same evaluation instruments, such as checklists, for objective measurement.[13]

Pearls and Other Issues

Simulation offers the ability to demonstrate the safe and proficient performance of both basic and advanced procedural skill sets, along with overall patient care performance outcomes, utilizing cognitive, technical, and ergonomic factors. To ensure these aspects are met, the simulation center must not only capture this data but also provide a reporting structure that gives both the learner and faculty data demonstrating proficiency. FERPA guidelines apply to all learners utilizing simulation centers in higher education; therefore, simulation center administrators, faculty, and staff must be knowledgeable of the guidelines that apply to their institution (FERPA 34 CFR § 99.3).

As simulation continues to evolve into the future with hybrid, in-situ, and virtual simulations, the various aspects of distance learning will bring about creative challenges for simulation centers to record and safeguard educational records. 

Enhancing Healthcare Team Outcomes

One example of using simulation data to affect clinical practice and team training is the International Simulation Data Registry (ISDR). This registry, established in 2014, archives data related to pulseless arrest, malignant hyperthermia, and difficult airway simulations.[14] The registry was created as a partnership between the American Heart Association (AHA) and the University of Toronto. It is a simulation equivalent of the AHA's stroke/TIA-specific Get With The Guidelines Registry, intended to standardize simulation cases and allow the review of simulation performance data.[14]

References


[1]

Bland AJ, Tobbell J. Developing a multi-method approach to data collection and analysis for explaining the learning during simulation in undergraduate nurse education. Nurse education in practice. 2015 Nov:15(6):517-23. doi: 10.1016/j.nepr.2015.07.006. Epub 2015 Aug 4     [PubMed PMID: 26302649]


[2]

Ellaway RH, Topps D, Pusic M. Data, Big and Small: Emerging Challenges to Medical Education Scholarship. Academic medicine : journal of the Association of American Medical Colleges. 2019 Jan:94(1):31-36. doi: 10.1097/ACM.0000000000002465. Epub     [PubMed PMID: 30256249]


[3]

Rutherford-Hemming T, Nye C, Coram C. Using Simulation for Clinical Practice Hours in Nurse Practitioner Education in The United States: A Systematic Review. Nurse education today. 2016 Feb:37():128-35. doi: 10.1016/j.nedt.2015.11.006. Epub 2015 Nov 10     [PubMed PMID: 26608389]

Level 1 (high-level) evidence

[4]

McGaghie WC, Harris IB. Learning Theory Foundations of Simulation-Based Mastery Learning. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2018 Jun:13(3S Suppl 1):S15-S20. doi: 10.1097/SIH.0000000000000279. Epub     [PubMed PMID: 29373384]


[5]

Thoma B, Turnquist A, Zaver F, Hall AK, Chan TM. Communication, learning and assessment: Exploring the dimensions of the digital learning environment. Medical teacher. 2019 Apr:41(4):385-390. doi: 10.1080/0142159X.2019.1567911. Epub 2019 Apr 11     [PubMed PMID: 30973801]


[6]

Pfeifer CM. Privacy, Trainee Rights, and Accountability in Radiology Education. Academic radiology. 2017 Jun:24(6):717-720. doi: 10.1016/j.acra.2016.09.028. Epub 2017 May 16     [PubMed PMID: 28526512]


[7]

Bergren MD. Data destruction. The Journal of school nursing : the official publication of the National Association of School Nurses. 2005 Aug:21(4):243-6     [PubMed PMID: 16048370]


[8]

Zeide E. The Structural Consequences of Big Data-Driven Education. Big data. 2017 Jun:5(2):164-172. doi: 10.1089/big.2016.0061. Epub     [PubMed PMID: 28632444]


[9]

Ten Cate O. Nuts and bolts of entrustable professional activities. Journal of graduate medical education. 2013 Mar:5(1):157-8. doi: 10.4300/JGME-D-12-00380.1. Epub     [PubMed PMID: 24404246]


[10]

Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Academic emergency medicine : official journal of the Society for Academic Emergency Medicine. 2008 Nov:15(11):988-94. doi: 10.1111/j.1553-2712.2008.00227.x. Epub 2008 Sep 5     [PubMed PMID: 18778378]

Level 3 (low-level) evidence

[11]

Bergren MD. HIPAA-FERPA revisited. The Journal of school nursing : the official publication of the National Association of School Nurses. 2004 Apr:20(2):107-12     [PubMed PMID: 15040763]


[12]

Kiel JM, Knoblauch LM. HIPAA and FERPA: competing or collaborating? Journal of allied health. 2010 Winter:39(4):e161-5     [PubMed PMID: 21184019]


[13]

Evans LV, Dodge KL. Simulation and patient safety: evaluative checklists for central venous catheter insertion. Quality & safety in health care. 2010 Oct:19 Suppl 3():i42-46. doi: 10.1136/qshc.2010.042168. Epub     [PubMed PMID: 20959318]

Level 2 (mid-level) evidence

[14]

Calhoun AW, Nadkarni V, Venegas-Borsellino C, White ML, Kurrek M. Concepts for the Simulation Community: Development of the International Simulation Data Registry. Simulation in healthcare : journal of the Society for Simulation in Healthcare. 2018 Dec:13(6):427-434. doi: 10.1097/SIH.0000000000000311. Epub     [PubMed PMID: 29672467]