Guest Column | July 23, 2018

Six Principles For Developing A Robust Training Program

By Natalie Abts and Lindsay Williams, MedStar Health


A well-designed medical device starts with a robust, user-centered design process. Devices should be assessed continuously during development to ensure they are designed appropriately for end users to perform the expected device tasks within the specified use environments. By applying human factors (HF) techniques iteratively during this process, designs can be updated to optimize safety and usability before a product is finalized.

Though HF-guided device design is the primary way to optimize usability and mitigate potential misuse and unsafe acts, other device elements requiring user interaction also should be optimized to assist the user when design mitigations are not possible.  This can include labeling, instructional materials, and training.

Training programs, in particular, often receive less attention and assessment throughout development. This becomes particularly problematic with complex devices that, even when designed well, are too complicated to be operated by untrained users. In such situations, where training is one of the only mitigation strategies for use error, it is critical to examine formal training programs to maximize information retention and ensure correct understanding of procedures. By incorporating good design principles into training from the beginning, and performing HF evaluations throughout development, the probability of misuse can be reduced. The following principles should be considered as a foundation for developing and assessing training.

  1. Begin With The End In Mind

Backwards planning is critical to the design and delivery of an effective training program. Start by asking, “What should the learner be able to do by the end of this experience?” For example, are there specific tasks that a user needs to accomplish when operating this device in the real world?

Medical device training programs often need to focus learning around tasks that may have negative safety-related implications if omitted or performed incorrectly (i.e., critical tasks). This ensures patient and user safety are given top priority. Once you have determined the intended behaviors, ask, “What does the learner need to know to acquire the appropriate skills and knowledge?” The answer to this question can be used not only to help design the training, but also to design means by which to evaluate the training.

  2. Consider The Learner’s Characteristics

To understand users, consider any prior knowledge they bring to the learning experience. This starts with their background, skills, and familiarity with relevant devices and procedures. For example, a nurse and a surgeon may bring very different knowledge to the table, but there could also be significant differences between a cardiac nurse and an emergency department nurse. Thus, devices intended for use by a variety of learners may need customized training programs. Programs designed universally, for the lowest level of knowledge, could cause frustration for more skilled and experienced users. Furthermore, different types of users may perform different tasks with a device, so customized training may be necessary simply to cover the material relevant to each group.

Personal identities, motivations, and abilities are additional learner profile aspects to consider. Imagine a group of nurses currently using a similar medical device in their work environment, with overwhelmingly negative experiences. Trainers may need to consider how these experiences will shape the nurses’ attitude towards learning a new device. Though it may be difficult to predict individual user reactions, some of this information can be gathered by conducting user research.

This type of research is intended to build a more thorough understanding of end users, and of how their individual and collective characteristics might affect both device use and the learning experience. Sample methods include generating user profiles or personas, which contain information on specific users or groups of users. When user groups are defined broadly, breaking them down into smaller subsets with shared characteristics can help identify the factors that shape each subset’s approach to device use.

  3. Position The Learner To Do The Heavy Lifting

Learning is enhanced when the audience invests cognitive effort to make new meaning. Conversely, retention and understanding are reduced when learning is passive. Training that is primarily lecture-based is an example of a passive learning experience. Long periods of sitting and listening without interaction fail either to engage the learner or to provide opportunities to form meaning.

Training programs should instead be centered on the learner and engage them to interact, ask questions, and discover information on their own. Even the lecture-based portions of the training can be enhanced to provide interaction opportunities. For example, rather than focusing entirely on technical use of the new device, begin by asking trainees about their previous experiences with similar devices. Not only will this encourage a more interactive session, but it can also shape the training to address any concerns.

  4. Provide Practice Opportunities

Learners should have multiple opportunities to practice using the device, outside of any final evaluations or skills tests completed at the end of the session. Just as long lectures fail to promote interaction, they are also more likely to result in memory lapses. To ensure learning is retained, it is important that training is designed to interrupt the forgetting process. To take advantage of this principle, the learning experience can be “chunked” into intervals, with practice interleaved at multiple stages. Short, similar tasks can be practiced individually, which will make learning less cumbersome and decrease the cognitive effort of remembering many successive tasks that make up an entire procedure.

Once smaller pieces of a larger task are covered, learners will be able to more easily perform longer procedures. However, successful device use in the real world is more probable if practice is performed realistically. After learners have demonstrated mastery of isolated steps, it is important that they practice potential use scenarios in their entirety. Incorporating realism into practice opportunities also involves mimicking the use environment to the extent possible.

Though it may not be realistic to conduct training in the actual use environment, aim for more than a standard conference room with tables and chairs. If training on an infusion pump, for example, the device should be mounted on a pole, as it would be in the use environment. Other equipment that would be present in the environment should be included to mimic potential barriers. If a hospital bed isn’t available, arrange tables in a similar configuration to how a bed would be positioned in relation to the device. Ensure accessories — such as medication tubing — also are available, so scenarios can be fully performed.

  5. Embrace Failure

People are more likely to remember if they have failed along their learning path. Thus, users should be allowed to attempt tasks independently and without redirection, rather than being interrupted by a trainer who immediately corrects their performance. This can be difficult for trainers who want to see devices used correctly, but immediate correction does not typically leave a lasting impression. If users are not allowed to fail, it will be more difficult for them to remember the correct path.

It also is important that users receive realistic feedback when they do fail, and this should go beyond a simple “do this, not that” message. Ask users to explain what they did wrong and what they would do differently next time. Also, ensure that users understand why their actions were incorrect. Understanding the rationale behind certain requirements gives actions importance beyond simple memorization of task steps. If certain actions are associated with safety risks, make those risks part of the discussion to help create meaning.

  6. Evaluate And Validate

As a training program is being developed, a key component to maximize the probability of success is the application of iterative HF evaluations. In early stages, both data-driven results and subjective feedback can be used as evaluation tools. Some examples of evaluation methods include:

  • Heuristic evaluation (i.e., assessment against known HF standards) of any documents or handouts meant to supplement training
  • Performance testing, in which a user completes tasks after training and without the use of additional help resources
  • Comparative user testing, in which performance data is gathered from both trained and untrained users
  • Post-training comprehension assessments
  • Post-training interviews or surveys to gather feedback and recommendations

Though this may seem like a lot of assessment for a training program, many of these activities can be combined with formative stage device testing to maximize efforts. A training program that has undergone rigorous evaluation during development is better prepared for assessment during the HF validation study. To meet FDA requirements, the validation study must include a realistic representation of user training on the final device design, delivered by appropriate training personnel.  This way, assessment of performance during the device evaluation can correctly reflect the expected amount of background knowledge.  

Training programs can be assessed during validation in a variety of ways, some of which may be similar to those used during development. The primary difference is that any assessment completed before use scenario testing must be designed so as not to influence performance in an unnatural manner. For instance, post-training questions should be more general (e.g., adequacy of training material, feedback on general structure, preparedness for task performance) and not focus on knowledge assessment.

Questions should be designed so as not to clue users in to the content of performance testing, and discussion should avoid correcting any misunderstood information. To avoid bias, a third-party study moderator should conduct the interview after the trainer leaves the room, ensuring that participants do not feel pressured to provide positive feedback.

Once testing is completed and performance cannot be affected, additional data can be gathered. You might see changes in user opinions on training once they have had to perform tasks independently, so it is important to gather training feedback at the end of the study, as well as directly after training is completed. However, participants may sometimes be inclined to blame training for poor performance once confronted with their use errors. Thus, this data should be examined carefully to ensure that any true training deficiencies are identified and corrected.

A training program is critical to any complex medical device. Well-designed training maximizes learner retention and understanding, and focuses on the information needed for safe and successful use. By starting the design and evaluation process early and engaging with it iteratively, potential problems can be prevented downstream.

Though training provides users with the foundation for device operation, it is important to understand that training should be used only as a supplementary risk management method, rather than the primary strategy for error prevention. To the extent possible, devices should be designed to mitigate safety risks and use error. This will reduce the need for users to rely on information learned from training, and make it easier for trainers to focus on key risks that cannot be mitigated through design.

About The Authors

Natalie Abts is the Senior Program Manager for the Usability Services division of the National Center for Human Factors in Healthcare. She manages the technical and quality aspects of usability projects conducted both for the medical device industry and within MedStar Health. Natalie has specialized experience in planning and executing both formative stage usability evaluations and validation studies for medical devices and combination products on the FDA approval pathway. She also leads an initiative to incorporate usability testing into the medical device procurement process in the MedStar Health system, and is active in delivering educational presentations to the medical device industry and other special interest groups. Natalie holds a master’s degree in industrial engineering, with a focus on human factors and ergonomics, from the University of Wisconsin, where she was mentored by Dr. Ben-Tzion Karsh.

Lindsay Williams is the Manager of the Instructional Design Team at MedStar Institute for Innovation's Simulation Training and Education Lab. She leads her team to design and develop eLearning courses, simulations, mobile apps, videos, and job aids with the goal of improving the performance of medical professionals and thereby improving patient care in the communities where they practice. Lindsay applies her experience as a classroom teacher and teacher coach, a passion for constructivist learning, and a deep understanding of adult learning theories and design principles to advocate for learners. Lindsay is pursuing a Master’s in Learning and Design from Georgetown University and holds a Bachelor of Science in Science, Technology, and Culture from Georgia Institute of Technology, where she focused her studies on biomedicine and culture.