Chapter Fifteen
Revising Instructional and Noninstructional Solutions Based on Data

There are several reasons instructional designers engage in evaluation, including providing key stakeholders with important information that raises awareness and informs their decisions. Evaluation helps to ascertain learner satisfaction, knowledge/skill acquisition, transfer of learning leading to behavior change, and organizational results or business outcomes. Another primary objective of evaluation is to make revisions and improvements to solutions based on data collected during the evaluation process. Strategies for making these revisions are the focus of this chapter.

According to The Standards (Koszalka, Russ-Eft, and Reiser 2013, 60), “the competency itself is rated as essential since all instructional designers should be able to revise solutions based on data and feedback that are received.” The performance statements associated with this competency, one of which is advanced and two of which are essential, indicate that instructional designers should be able to (Koszalka, Russ-Eft, and Reiser 2013, 60): “(a) Identify product and program revisions based on review of evaluation data (advanced); (b) Revise the delivery process based on evaluation data (essential); and (c) Revise products and programs based on evaluation data (essential).”

Types of Revisions

Two primary categories of revisions are typically made. Product or program revisions involve changes to the content or messages being conveyed through the intervention. Besides the content itself, revisions could also be made to exercises, activities, assessments, sequencing or flow, and the overall duration and length of segments. Adjustments to the delivery process involve changes to the manner in which the solution is deployed. Delivery processes could include self-paced, job or performance aid, web-based, in-person, or “blended” approaches that combine multiple methods. These types of revisions can encompass everything that happens before, during, and even after the intervention that pertains to how it is executed.

Sources of Input for Potential Revisions

Revisions to instructional and noninstructional interventions do not occur in a vacuum. There are many sources of input that can and should be sought and used. If a formal report of the evaluation results, discussed extensively in Chapter 14, is created, it can serve as a source of input the instructional design professional can use to identify changes. Feedback from participants, facilitators, training managers, logistics support professionals, and others allows designers to hear directly from stakeholders about what could be improved. Such direct stakeholder feedback can be solicited in real time as the intervention is being deployed, immediately after, or at a later date through a formal debrief discussion.

Another source of input is a change in professional practices within the organization that requires updates to the program. A major upgrade to an organization's performance management system, for example, may necessitate significant revisions to the training that supported the previous version. A final source of revisions is a change to organizational policy. A policy decision that no employee air travel can be booked within two weeks of flying, or that nonbusiness-critical travel is restricted to two trips per year, could lead to revisions to delivery practices related to participant and facilitator travel, or to the introduction of virtual delivery methods.

Reviewing Evaluation Reports and Other Data to Identify Needed Revisions

You may recall that Chapter 14 described the common elements in an evaluation report, which can take a variety of forms and can be delivered in a variety of ways. If an evaluation approach like Kirkpatrick's four-levels framework is used, the report will include insights (and corresponding recommendations) related to participant satisfaction with various aspects of the intervention, knowledge and skill acquisition, transfer of learning and corresponding behavior change, and business and organizational impact. Ideally, the evaluation report will suggest revisions to the content and provide program delivery suggestions. For example, participant reactions to the registration process may have been negative, suggesting that improvements are needed. An evaluation report may also reveal low test scores on a particular item, which may indicate the need for more descriptive content or examples in an online learning module. When such revisions are not called out directly in the evaluation report, the instructional designer may need to review the findings, recommendations, and the raw data to dig deeper and identify potential adjustments that may be helpful.
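To make that kind of item-level review concrete, the short sketch below shows one way a designer might flag weak assessment items in exported results. It is a minimal illustration only: the file name, the column names, and the 60 percent cutoff are all assumptions rather than part of any standard, and would need to be adapted to your own data and success criteria.

```python
# Minimal illustrative sketch, not a standard tool. Assumes item-level test
# results were exported to a CSV with hypothetical columns "item_id" and
# "correct" (1 if the learner answered the item correctly, 0 if not).
import csv
from collections import defaultdict

FLAG_THRESHOLD = 0.60  # assumed cutoff; adjust to your own criteria

totals = defaultdict(lambda: [0, 0])  # item_id -> [correct answers, attempts]
with open("assessment_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        stats = totals[row["item_id"]]
        stats[0] += int(row["correct"])
        stats[1] += 1

# Flag items scoring below the threshold as candidates for clearer content,
# added examples, or a rewritten question.
for item_id, (correct, attempts) in sorted(totals.items()):
    pct_correct = correct / attempts
    if pct_correct < FLAG_THRESHOLD:
        print(f"Item {item_id}: {pct_correct:.0%} correct -- review content")
```

A review like this does not by itself say what to change; it simply points the designer toward the items where deeper inspection of the findings and raw data is most likely to pay off.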

Revising Delivery Processes to Improve Outcomes

The delivery process is the way the intervention is implemented. Delivery processes span the time horizon of activities that occur before, during, and after the intervention. Delivery processes that happen before the intervention may include but are not limited to registration, communication (welcome, overview of the intervention, logistical details), prework (readings, assessments), instructions, program material, and so forth. These processes should make the rollout of the intervention more streamlined and should enhance the likelihood of achieving the instructional objectives.

Assessing and Predicting Efficacy of Potential Revisions

Once potential changes to the content or the delivery process have been identified, the instructional designer may be tempted to simply make these changes and move on. It is critical, however, to stop and carefully consider the likelihood that the adjustments will succeed. What are the potential implications that may stem from the change? How will participants and others respond to the change? Might there be any unintended consequences that must be addressed? Are the changes major or minor? What's the best way to phase in the changes: all at once or more gradually? Perhaps most important, will the proposed change lead to the desired outcomes? These and other questions help to anticipate the efficacy of the potential revisions so that the designer can be more confident in them before they are made. If doubts arise when answering these questions, this can prompt the designer to adjust plans so that the potential revisions are more likely to achieve better outcomes.

Revising the Content of Products or Programs to Improve Outcomes

Once proposed revisions have been finalized, decided upon, or approved if needed, it is time to implement them so the products or programs achieve better results. One type of revision is to the content, which includes the key messages, narrative, and substantive information in the product or program. Revisions to content may include reducing or expanding its volume so that coverage matches what learners need most. Revisions may also make the content clearer, more accurate, or more relevant to learners in order to better achieve desired outcomes. Improving clarity involves making the messages easier for learners to understand. Improving accuracy includes correcting inaccurate information. Improving relevance involves adding material or examples that make the content more meaningful to learners and more closely linked to on-the-job performance.

Besides changes to the content, another type of revision is to the exercises, activities, or methods by which the content is delivered. When an intervention is piloted or deployed and feedback is gathered, problems may surface with some of the instructional methods used. First, the instructional method originally chosen may be replaced. The original design of a Leveraging Inclusive Diversity program may have included a large group discussion of a topic. Through the evaluation, it may be determined that participants are reluctant to speak up in front of the full group. The program design can be revised so that small group discussions followed by large group report-outs are used instead of going straight to the large group discussion. Second, changes may be made within the instructional method itself. Part of the design of a Developing Your Career workshop includes a card sort activity in which participants select 10 cards (out of a possible 50) that best reflect what they enjoy doing most at work. Through the pilot, it is determined that participants are struggling with some of the items on the cards, and that two major revisions must be made to the activity: the descriptions on the individual cards must be clearer, and there must be greater differentiation among the cards.

As discussed in Chapter 12, most instructional programs have participant guides, instructor or facilitator guides, presentations, and other material that make up the complete instructional package. Another focal point for potential revisions involves changes to such material, which may be needed for various reasons. There may be a lack of clarity in the instructions, an exercise, or the way content, material, or messages are presented. Any time learners or facilitators become confused about what to do, what something means, or how something is relevant or meaningful, the instructional designer must take action to remedy this and bring greater clarity. Any time inaccuracies in grammar, punctuation, wording, or spelling exist, they must be corrected to ensure the material is credible and usable. Graphics or visuals used in participant guides, posters, or presentations may be unclear, inaccurate, or poorly chosen in the first place, and must be corrected or replaced.

By the time a complete instructional package is ready for use, it must be error-free, clear, accurate, and relevant. When this is the case, the designer can be confident that the solution is poised to succeed, with a higher probability that the intended outcomes will be achieved. This confidence engenders a sense of pride and satisfaction in the instructional designer.

Gaining Stakeholder Support for Revisions

The instructional design professional doesn't work in a vacuum at any stage of the ID process. Many parties with varying, and sometimes conflicting, interests and priorities are involved in any design project. While it may seem insignificant compared to other key decisions made throughout the process, support for revisions must be garnered. Minor revisions, such as correcting spelling or formatting errors, may not require extensive stakeholder sign-off and are often implemented by the designer and perhaps the manager. More substantive revisions, such as increasing the length of a program, using externally procured content that requires additional funding, or adding a new exercise like a case study, may need additional stakeholder sign-off. The first step in gaining support is to identify who is likely to have a vested interest or a say in the change. Stakeholders are usually fairly easy to identify, as they have likely been involved in the process already; however, sometimes a new stakeholder emerges and must be engaged. Once these individuals or groups are identified, a strategy for securing their support is developed and then executed. Strategies to secure stakeholder support for revisions are usually fairly straightforward and involve engagement approaches ranging from written communication, to a group meeting, to a one-on-one formal or informal interaction.

Whatever the engagement approach used, the goal is to receive approval for the proposed changes. Some organizations require formal approval, such as a written sign-off authorizing the changes, while others are more informal and verbal confirmation of the change is secured. In less formal environments this often suffices, but it may be useful to document the approval as a future reference point. See Exhibit 15.1 for a list of stakeholder questions to assist in improving the intervention.

Implementing Revisions to Delivery of Products/Programs

Once approval has been received, the revisions are implemented. The nature, complexity, and number of changes being made determine the approximate time required, as well as the resources needed for implementation. An editor or graphic artist may be the one to change the material or visual aids. A subject matter expert may be needed to develop or alter content. A procurement professional may be needed to purchase a newly identified off-the-shelf tool or activity. The training administrator or Learning Management System (LMS) team may be needed to change the delivery or technology support. Often, though, the needed revisions fall to the instructional designer to make. Regardless of the change and who is making it, the instructional design professional typically oversees the changes and is ultimately accountable for ensuring they are made as intended.

Evaluating Effectiveness of Revisions

Once revisions have been implemented, the question becomes “Did the changes lead to improved outcomes?” To answer this question, we can return to the four levels of evaluation and determine whether the revisions were effective. Were participants more satisfied with the modifications (Level 1)? Did test results improve (Level 2)? Was there greater transfer of learning from the classroom to the workplace (Level 3)? Were business results and organizational outcomes achieved (Level 4)? The effectiveness of revisions can also be evaluated by many other means beyond the four-levels framework.
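As a simple illustration of this before-and-after comparison, the sketch below contrasts hypothetical pre- and post-revision metrics for each level. The metric names and values are invented placeholders rather than real evaluation data; the point is only the pattern of comparing the same measures before and after the revisions are in place.

```python
# Minimal illustrative sketch with invented placeholder numbers, not real
# evaluation results. Each entry pairs a four-levels metric with its value
# before and after the revisions were implemented.
PRE = {
    "Level 1 satisfaction (1-5)": 3.6,
    "Level 2 average test score": 72.0,
    "Level 3 transfer rating (1-5)": 2.9,
    "Level 4 business goal attainment (%)": 55.0,
}
POST = {
    "Level 1 satisfaction (1-5)": 4.2,
    "Level 2 average test score": 81.0,
    "Level 3 transfer rating (1-5)": 3.4,
    "Level 4 business goal attainment (%)": 63.0,
}

# Report the direction and size of change for each metric.
for metric, before in PRE.items():
    after = POST[metric]
    change = after - before
    verdict = "improved" if change > 0 else "flat or declined"
    print(f"{metric}: {before} -> {after} ({change:+.1f}, {verdict})")
```

A comparison like this only shows whether the numbers moved in the right direction; judging whether the movement is meaningful still rests with the designer and the evaluation approach being used.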

Making Revisions in a Rapid Prototyping Environment

In today's hyper-paced environment, long development cycles that require many months to design, test, adjust, and implement are too slow, and therefore often not acceptable. Instead, rapid prototyping is becoming more commonplace in the world of instructional design. In such environments, a less-than-perfect product is sometimes enough to begin piloting and gauging effectiveness. Revisions are made, almost in real time, and are followed by additional “pressure testing,” evaluation, and adjustment. In this way, cycle times may be greatly reduced compared to traditional design approaches, in which a great amount of time is spent up front “getting it right” before the solution is piloted, evaluated, and adjusted. Whether due to a client's sense of urgency for faster speed to market, the emergence of technology and design tools that facilitate more rapid design work, or the new norm of instantaneous everything, it is increasingly common for the instructional design professional to work at breakneck speed using rapid design methodologies. Striking the balance between rigor and speed is an ethical dilemma sometimes faced by practicing instructional designers. While it's important to adapt and respond to ever-faster cycle times, the instructional designer must also hold to the core ISD process, which balances efficiency with effectiveness; effectiveness may be shortchanged unless the designer holds firm.
