4 Tips to Creating More Effective Instructional Materials
A company has conducted several formative human factors studies, yet use errors and difficulties associated with the IFU or other instructional materials persist. The instructions have been tested and iterated several times by different people, but the issues remain. What could be going wrong?
This is a very common scenario we hear in our consulting practice, as well as at workshops and conferences when colleagues describe their experiences with human factors testing of instructional materials. This article covers some of the common root causes behind these persistent problems and how to course correct.
Issue #1: Instructional design is not integrated with human factors
If you have ever tried to build a house of cards, carefully stacking cards together in hopes they will hold each other up, you know one puff of air or one wrong move and the whole stack can come tumbling down. Why? Because the cards are just propped up together and there is no foundation or secure framework on which you are building this house. This is like creating an IFU or other critical instructional materials without considering the foundational human factors inputs which are the framework needed to create successful instructional materials.
Many companies do not have an in-house instructional designer experienced in creating performance-based instructional materials. In these cases, the individual in the organization responsible for the instructions often mimics the instructional design process but lacks full knowledge of its intricacies, and so falls into avoidable pitfalls. Failing to use human factors inputs as a foundation is one of the major pitfalls. Expert instructional designers know that you first need to understand the characteristics of the target audience and the conditions under which that audience will perform tasks in order to create instructional materials that work. In the world of medical devices and combination products, the target audience and conditions equate to the human factors user and use-environment profiles.
Writing instructions that promote safe and effective performance by end users may be done in different ways for different types of users. For example, instructions written for lay users may need to be written differently than instructions for a healthcare professional. Instructions for a cancer patient or heart failure patient should be written so they are still easy to use and understand, even when the user is sick or symptomatic. The same holds true for the use environment. If the user is in a busy medical setting or in the back of an ambulance with two other people performing CPR, the design of the instructions should consider these environmental factors. User and environment characteristics directly impact the ability of users to interact with instructional materials and should be considered during the instructional design process first and foremost.
Another critical human factors input to instructional materials is the task and use-error analysis (task analysis). The instructional designer (in this case, the individual at your organization charged with creating instructions) should be able to leverage the task analysis to start organizing and drafting instructions. Best practice, and the best-case scenario, is a task analysis that includes a perception, cognition, action (PCA) analysis, which provides many details the instructional designer can draw upon for instructional content. Additionally, if the task analysis provides traceability to the risk assessment, the instructional designer can easily identify the high-risk areas in the instructions and give those instructions the appropriate attention.
A less ideal situation, or worst case, is when this best practice is not followed and the task analysis lacks the detail needed to create effective instructional materials. In this case, the instructional designer may discover missing use cases, user tasks, steps, or sub-steps, or inaccuracies in the task analysis, and should alert the human factors team. The key is having the right person responsible for instructional design: someone who understands that the instructions need to come from the user's point of view, must be designed for the user's environment, and must contain the details required to support correct user performance.
Issue #2: Instructional materials are designed for aesthetics rather than performance
Expert instructional designers also recognize the need for the target audience (end users) to be motivated to learn. The design of instructional materials should entice the user to interact with them and support the user during use. In the case of medical devices and combination products, visual appearance is often not recognized as an element that influences user motivation or performance. Instead, teams frequently say that "users don't look at or read the instructions," without recognizing that the visual appearance of the instructions demotivates users or makes the materials difficult to use. A couple of examples are:
Lay user instructions consisting of one large page of dense text in a tiny font, colorful lines all over the page, colorful images and graphics, warning and caution symbols, bold and italic text, regulatory disclaimer information, etc. Large sheets of instructions can overwhelm end users, with folds that interfere with readability and a size, organization, or format that makes the sheet difficult to navigate.
Healthcare professional instructions with long blocks of text written in prose style, like a medical text, that are not put into the context of the users' typical tasks or environment, have no images, and group all warnings in an inconspicuous location at the end of the chapter. Users, including healthcare professionals, skim dense text and may miss important steps or information. Additionally, placing warnings at the end, instead of in the steps where they are most relevant, may not mitigate risks as intended.
Branding elements and styling choices that interfere with the usability of the instructional materials. Common issues include fonts and color combinations that are difficult to read, low contrast on the page, and text on a colored background that end users cannot read easily.
Strategies to avoid these common problems include:
Keep instructions short and concise, using direct active-voice language.
Include the information needed to avoid use errors or mitigate risks without adding too much detail. When in doubt, leave it out. Test the instructional materials in a formative study and use the results to determine whether additional information is needed.
Spend time separating information with white space to give the instructions a simple appearance. For example, one manual grew by several pages after dense text was broken up and white space added. Because of the added white space and clean design, users found the longer manual easier to use and simpler than its shorter, crowded predecessor.
Issue #3: Instructional materials represent every stakeholder's input
Imagine a restaurant where several cooks add their flavors and preferences to a dish and once all combined, the dish is presented to the diner. The diner takes a bite and says the dish tastes awful. Now assume the restaurant is your company, the cooks are all the stakeholders involved in creating the instructional materials and the diner is your end user. If too many people in your organization (who are typically not representative of your end users) are making instructional design decisions, it is likely that in a human factors study the users will say your dish (instructions) tastes awful.
It is very tempting for the stakeholders that make up product development cross-functional teams to all provide their input into the instructions. It is also common for these same people to dismiss good instructional design practices because they feel the device or product is "easy to use." Unfortunately, these stakeholders are not representative of the end users, and their logic is flawed when they let their own expertise and experience with the product stand in for the actual end-user perspective.
A more effective approach is to put early drafts of the instructions in front of representative users as soon as a prototype is available. Through early human factors testing that includes these drafts, learn what works for end users and what doesn't, and optimize based on user feedback applied within good instructional design techniques. Remember, just because a user says to make everything bold and red doesn't mean that is a good idea. Instead, interpret the feedback from an instructional design perspective: it may mean the information in question did not stand out to the user. Consider more effective fixes, such as adding white space, separating information, or improving the organization of information, rather than taking the user literally and making everything bold and red.
Issue #4: Tweaking around the edges
After a formative evaluation, human factors data show several root causes attributed to the instructional materials. As a result, several small changes are made to the instructions, but the next study shows the same issues with no improvement in performance. Tweaking around the edges may be the problem here, especially if the same area of the instructions remains problematic for end users even after iterations. Look at how two different companies handled the issue of lingering use errors related to instructional design:
Company A decided to go back to the drawing board completely after a failed validation whose results specifically linked root causes to instructional materials poorly designed for low-literacy users. Company A made comprehensive changes to the instructional materials, including the detailing and separation of steps, the styling of images, and the layout of pages. Within two formative studies, Company A was on its way to a successful validation and an on-time submission, and it ultimately received on-time FDA clearance.
Company B also had a failed validation with root causes linked to the instructional materials, but decided to make only slight modifications to the instructions, including to the critical tasks with associated use errors. Formative after formative, the issues remained unresolved, and the instructional materials were never optimized sufficiently to demonstrate readiness for validation.
Being overly cautious when changing instructional materials can cause continuous iteration without resolution, as Company B's small tweaks demonstrate. A more assertive approach that implements the significant changes needed to promote correct performance gets better results and shortens the time to resolution and successful human factors testing.
If your organization is conducting repeated human factors testing of instructional materials without much improvement in the results, check for these common instructional design problems. Examine your organization's process for creating and optimizing instructions based on human factors data, and you will likely find at least one of these scenarios is the culprit. Once you know the problem, you can take action to implement better, more informed strategies that ultimately improve the instructional materials.