Chapter 18

Reporting Evaluation Results

Tom Broslawsky

In This Chapter

Communicating evaluation results is as significant as achieving them. This chapter explains the importance and challenges of communicating your results to diverse levels of management in an unbiased and objective manner. You will learn

  • the importance of communication
  • the main principles for communicating results
  • how best to design your impact study report
  • how to communicate results to different stakeholders.
 

Why Reporting Results Is Critical

Each step in the evaluation process model requires planning and great attention to detail so one can achieve the desired outcomes of the evaluation project. Yet, the communication step does not always receive the attention it requires because it is multifaceted and somewhat complex. Evaluating data and producing a successful outcome are meaningless unless the findings are communicated properly and in a timely manner to stakeholders so that actions can be taken accordingly. There are three foundational reasons for making this step a high priority.

Communication Is Necessary to Make Improvements

The timeliness and quality of communications are critical when adjustments and improvements are needed. Deliver updates to the appropriate groups throughout the process so that adjustments can be made promptly. Communicating the final report provides comprehensive data for aggressively enhancing current or future programs.

Communication About Results Is a Sensitive Issue

Never assume that the results of your study will be greeted with praise by all, particularly in tough economic times. A program that is tied to internal political issues or the performance of others may distress some individuals but gratify others.

Different Target Audiences Require Different Information

A “one-size-fits-all” report most likely won’t meet the individualized needs of your stakeholders. Consider the information each group requires, the format and media, and the appropriate time for discussing the results. The composition of the target audience determines which communication process is most appropriate.

Principles for Communicating Results

Style can be as important as substance when reporting results. Managing the complexity of the message, audience, or delivery medium can be demanding, but you can achieve exceptional results by following a few general principles outlined here.

Timely Communications

Timing the communication of results can affect how well they are accepted and how effectively and quickly the findings are acted upon. Communicate results as soon as they are known so that decision makers can take immediate action. Realistically, however, it may be best to communicate at a time that is most convenient to the audience in an effort to maximize the effects of the study. Is the audience prepared for the results given their current work environment or other events affecting the business? Are they expecting the results?

I once scheduled the review of a nine-month return-on-investment (ROI) study with a product marketing team who had demanding schedules. The day before the session, a major issue developed with a leading product, and an “all hands” meeting was scheduled before our review session. As expected, the ROI study results were met with vague indifference. Several key players were not present, and those who did attend were distracted.

Practitioner Tip

Maximize the evaluation study’s results by ensuring that the greatest number of key stakeholders will attend the presentation and that the current work environment is favorable to a robust discussion of the findings.

Targeted Communication

Communication will be more effective if it is designed early in the planning process with a particular group in mind and tailored to their interests, needs, and expectations. The program results discussed in this chapter mirror outcomes at all levels, including the six types of data discussed in previous chapters: reaction or satisfaction, learning, job application, business impact, ROI, and intangible benefits. Some data are gathered early in the project and communicated as soon as possible to implement changes, whereas other data are collected after implementation and communicated in a follow-up study. The results, in the broadest sense, may range from early feedback in qualitative terms to ROI values in varying degrees of quantitative terms.
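When ROI values are included in the report, it often helps to show the arithmetic rather than only the final percentage. Below is a minimal sketch of the benefit-cost ratio and ROI calculations commonly used with the ROI methodology cited in the references; the benefit and cost figures are hypothetical and used for illustration only.

    # Minimal sketch: benefit-cost ratio (BCR) and ROI for a program.
    # The benefit and cost figures are hypothetical, for illustration only.
    program_benefits = 750_000.0   # monetary benefits attributed to the program
    program_costs = 500_000.0      # fully loaded program costs

    bcr = program_benefits / program_costs                                   # 1.50
    roi_percent = (program_benefits - program_costs) / program_costs * 100   # 50%

    print(f"BCR: {bcr:.2f}")
    print(f"ROI: {roi_percent:.0f}%")

Walking an audience through a calculation like this supports the principle, discussed later in this chapter, of letting objective, credible, and accurate results speak for themselves.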

Appropriate Media Selection

Selecting the correct media is important because the form of communication can determine the effectiveness of the message. The more directly your audience is affected by the results, the more personal the media should be. Whenever appropriate, direct communication is the preferred method to ensure that the intended message is received; in other cases, a traditional memo distributed to select top executives may be more effective. Company publications offer an opportunity to tell the story of training success, as do detailed case studies.

Practitioner Tip

Developing case studies based on evaluation results can provide a great teaching tool for your audience because they take the conceptual model of gathering data and apply it to authentic business situations. Supporters of the case study method point out that these studies produce a more comprehensive report than one generated solely from statistical or data analysis. The audience for case studies based on evaluation data can convincingly argue that although the case study may be founded on one organization’s key business metrics, the final study is more robust because it addresses conditions that go beyond the numbers, such as corporate vision, economic environment, or leadership mandates.

Unbiased Communication

Let objective, credible, and accurate results speak for themselves. Sometimes, in the excitement of “wanting to make a splash,” over-inflated or controversial opinions may leak into a report. Although these declarations may get attention, they can detract from the accuracy of the results and turn off the recipients. Use the following pointers in developing your communication process:

  • focus on the training process, not the program or training department
  • give credit to participants and their supervisors
  • fully address evaluation methodology to give credibility to the findings
  • clarify data sources and explain why they are credible
  • state any assumptions made during the analysis and reporting and clarify why these assumptions were made
  • be pragmatic and make no claims that are not supported by the data.

Testimonials from Respected Individuals

Testimonials from individuals with recognized stature, leadership ability, or influence can have a strong bearing on how effectively messages are accepted. Opinions of your audience can be strongly swayed with this level of support. Conversely, testimonials from individuals who command little respect can have a negative effect on the message.

Communication Strategy Shaped by the Audience’s Opinion of the Program Team

Consider the credibility of the program team when developing the communication strategy. A reputation for integrity and reliability will make it easier to get buy-in from decision makers. However, if the program team has a weak reputation and low credibility, then presenting facts alone may not be enough to change perceptions. Facts alone may strengthen the voice of those who already agree with the results.

Communication Plan

Strict attention to detail is required to ensure that each targeted audience receives the proper information in a timely fashion and that appropriate actions are then taken. There are four basic steps to planning the communication of your evaluation project:

1. analyze the need for communication

2. identify the target audience

3. develop communication documents

4. deliver reports through appropriate media.

Table 18-1 describes the communication documents, targets, distribution method, and reasons for the communication.

Perhaps the most important audience member is the sponsor. This individual (or group) initiates the program, reviews the data, and weighs the final assessment of the effectiveness of the program. Senior management is responsible for allocating resources to the program and needs information to justify expenditures and gauge the effectiveness of efforts. Selected groups of managers (or all managers) are important to increase both support and credibility. Communicating with the participants’ team leaders or immediate supervisors is essential because they will most likely be in a position to encourage the participants to put the proposals into practice and reinforce the objectives of the program.

Table 18-1. Communication Plan


(Columns: communication document; communication target(s); distribution method; reason for communication)

Complete report with appendices
  Targets: Program sponsor; workplace learning staff; team manager
  Method: Distribute and discuss in a special meeting
  Reasons (program sponsor): To secure approval for the project; to drive action for improvement; to show the complete results of the project; to underscore the importance of measuring results, if applicable; to explain techniques used to measure results
  Reasons (workplace learning staff): To secure approval for the project; to drive action for improvement; to underscore the importance of measuring results
  Reasons (team manager): To gain support for the project; to prepare reinforcement of the process; to create desire for a participant to be involved

Executive summary (eight pages or less)
  Targets: Senior management in the business units; senior corporate management
  Method: Distribute and discuss in routine meeting
  Reasons: To secure support for the project; to build credibility for the learning and performance process; to stimulate interest in the workplace learning team

General interest overview and summary without an actual ROI calculation, if applicable
  Target: Participants
  Method: Mail with detailed explanatory letter
  Reasons: To secure agreement with the issues; to enhance results and quality of feedback

General interest article (one page)
  Target: All employees
  Method: Publish in company publications
  Reason: To demonstrate accountability for all expenditures

Brochure highlighting program, objectives, and specific results
  Target: Team leaders with an interest in the program
  Method: Informational brochure
  Reasons: To gain support for the project; to secure agreement with the issues

  Target: Prospective sponsors
  Method: Include with other marketing materials
  Reason: To market future projects
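If the plan needs to be tracked over the life of a project, the rows of table 18-1 can also be captured in a simple data structure. The sketch below is illustrative only; the record fields and sample entries are assumptions drawn from the table, not part of any prescribed tool or methodology.

    # Illustrative only: communication plan rows as plain records for tracking.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class CommunicationItem:
        document: str        # what is being communicated
        targets: List[str]   # who receives it
        method: str          # how it is distributed
        reasons: List[str]   # why it is being communicated
        delivered: bool = False

    plan = [
        CommunicationItem(
            document="Complete report with appendices",
            targets=["Program sponsor", "Workplace learning staff", "Team manager"],
            method="Distribute and discuss in a special meeting",
            reasons=["Secure approval", "Drive action for improvement"],
        ),
        CommunicationItem(
            document="Executive summary (eight pages or less)",
            targets=["Senior management in the business units",
                     "Senior corporate management"],
            method="Distribute and discuss in routine meeting",
            reasons=["Secure support", "Build credibility"],
        ),
    ]

    # Which communications still need to be delivered?
    pending = [item.document for item in plan if not item.delivered]
    print(pending)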

Analyze the Need for Communication

The program, the setting, and the sponsor’s unique needs will dictate the specific reasons for communicating program results. The most frequent reasons are

  • securing approval for a program and allocating time and money
  • gaining support for a program and its objectives
  • enhancing the credibility of a program or a program team
  • reinforcing the processes used in the program
  • driving action for program improvements
  • preparing participants for a program
  • showing the complete results of the training program
  • underscoring the importance of measuring results
  • explaining techniques used to measure results
  • marketing future projects.

Although this list is fairly inclusive, there can be other reasons based on the individual situation, context, and audience.

Select Audience

As part of developing the overall communication plan, one should give significant thought to who should receive results of a program. The audience targeted to receive information is likely to be diverse in terms of job levels and responsibilities, so communication pieces should be delivered accordingly. Examining the reason for the communication is always a sound basis for determining audience selection.

Preliminary Issues

A helpful exercise when developing the audience list is to ask the following questions:

  • Does the audience have interest in the program?
  • Does the audience want to receive the information?
  • Has a commitment been made to this audience about receiving the communication?
  • Is the timing right for communicating with this audience?
  • How would this audience prefer to receive the results?
  • Is this audience likely to find the results threatening?
  • Which medium will be most convincing to this audience?

When scrutinizing each member of the audience, consider these three actions:

  • The project manager should know and understand the target audience.
  • The program team should examine each audience’s specific needs and the reasons behind them. Each group will want its own level of detail in the information it receives.
  • The program team should try to understand any audience bias or differing views. Although some will immediately support the results, others may be skeptical or neutral.

The Communication Document

The type of formal evaluation report will correlate with the level of detailed information presented to the different audiences. Brief summaries of results with appropriate charts may be sufficient for some communication efforts. Projects that require a high level of approval and considerable funding will necessitate a much more comprehensive write-up, possibly a full-blown impact study. The flow of your communication should address the following issues:

  • why the program is being studied
  • why the evaluation is being designed
  • the evaluation methodology used
  • the results generated
  • conclusions and recommendations achieved.

An example of a comprehensive impact study is outlined in figure 18-1.

A few last things to consider:

  • give credit for success entirely to the participants and their immediate leaders
  • avoid bragging about results
  • ensure that the methodology is well understood.

Select Media

Even the most positive program results can fall flat if they are not delivered in the appropriate format or setting. Certain media may be more effective for a particular group than others, such as face-to-face meetings, special bulletins, memos, or company newsletters.

An array of electronic media has exploded in the past few years with the advent of wikis, blogs, and Twitter. Case studies represent an effective way to communicate the results of a project, particularly when explaining the measurement methodologies in a group setting.

Scorecards

Scorecards are a performance management tool that concentrates on various performance indicators, which can include financial outcomes, operations, marketing, process performance, customer perspective, or any other appropriate measure. In many organizations, each business unit develops its own scorecard, and these are rolled up into one that ultimately reflects the vision of the whole business. Robert Kaplan and David Norton first explored the concept behind scorecards in their pioneering book The Balanced Scorecard, which organizes measures into four perspectives: financial, customer, internal business process, and learning and growth.

Figure 18-1. Format of Impact Study Report

• Executive summary

• General information

— Background

— Objectives of study

• Methodology for impact study (builds credibility for the process)

— Levels of evaluation

— Collection procedures

— Data analysis procedures

o Isolating the effects of training

o Converting data to monetary values

o Assumptions

• Program categories

• Results: general information

— Response profile

— Success with objectives

(The results sections that follow present the six measures: Levels 1, 2, 3, 4, and 5, plus intangible benefits.)

• Results: reaction and satisfaction

— Data sources

— Data summary

— Key issues

• Results: learning

— Data sources

— Data summary

— Key issues

• Results: application and implementation

— Data sources

— Data summary

— Key issues

• Results: business impact

— General comments

— Links with business measures

— Key issues

— Barriers and enablers

• Results: ROI and its meaning

• Results: intangible measures

• Conclusions and recommendations

— Conclusions

— Recommendations

• Exhibits

 

The scorecard method provides a snapshot comparison of key business impact measures and how they align with overall corporate strategies. Whether the scorecard is set up with numerical indicators, a traffic-light configuration (green = success, yellow = mixed results, red = unsatisfactory), or a dashboard design, the purpose is to reflect the current condition of the organization. Many corporations now use the terms scorecard and dashboard interchangeably because both convert strategies into accountability and measure progress to date. Figure 18-2 shows a SenseiROI dashboard depicting the results of impact studies on line manager absenteeism.
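As one concrete illustration of the traffic-light configuration, a scorecard entry can be reduced to a simple rule that compares an actual value with its target. The thresholds, measure names, and figures below are hypothetical assumptions for this sketch, not part of any particular scorecard or dashboard product.

    # Illustrative traffic-light rating for scorecard measures.
    # Thresholds, measure names, and figures are hypothetical assumptions.
    def traffic_light(actual: float, target: float, tolerance: float = 0.10) -> str:
        """Green = met or beat target, yellow = within tolerance, red = below."""
        if actual >= target:
            return "green"   # success
        if actual >= target * (1 - tolerance):
            return "yellow"  # mixed results
        return "red"         # unsatisfactory

    measures = {
        "ROI (%)": (42.0, 25.0),                 # (actual, target)
        "Absenteeism reduction (%)": (4.6, 5.0),
        "Participant application rate (%)": (61.0, 80.0),
    }

    for name, (actual, target) in measures.items():
        print(f"{name}: {traffic_light(actual, target)}")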

Routine Feedback on Project Progress

Routinely collecting reaction and learning data provides prompt feedback so that adjustments can be made throughout the project; that feedback may need to go to several different audiences through various media. This process becomes complex and must be proactively managed. The following steps, some based on the recommendations of Peter Block in his book Flawless Consulting, are suggested for providing feedback and managing the overall process:

  • communicate quickly and appropriately
  • simplify the data
  • examine the role of the project team and the client in the feedback process
  • use negative data in a constructive way
  • use positive data in a cautious way
  • ask the client for reactions to the data
  • ask the client for recommendations
  • use support and confrontation carefully
  • react to and act on the data
  • secure agreement from all key stakeholders.

Following these steps will help move the project forward and generate useful feedback, often ensuring that adjustments are supported and can be executed.

Presenting Results to Senior Management

Presenting the results of an evaluation study to senior management can be one of the most challenging and possibly stressful types of communication. The challenges are convincing this highly intuitive group that outstanding results have been achieved (assuming they have), addressing the salient points clearly and concisely, and ensuring the managers understand the process.

Two potential reactions can create problems. If the results are impressive, it may be difficult to persuade the managers to accept the data. Conversely, if the data are negative, managers may overreact to the results. The following guidelines can help ensure that this process is planned and executed properly.

Arrange a face-to-face group meeting with senior team members to review the results and ensure they understand the process, even if they are already familiar with the measurement methodology. Although this presentation consumes precious executive time, once the group has received the methodology presentation a few times, an executive summary may suffice.

Bottom-line results should not be disclosed until the end of the session, allowing adequate time to present the process and collect audience reactions. If accuracy is questioned, illustrate that the trade-off for more accuracy and validity is often greater expense; address this issue when necessary, agreeing to collect more data if required. Gather concerns, reactions, and issues with the process and adjust future presentations accordingly.

Practitioner Tip

Present results in an organized and focused manner. Make sure the project team understands the importance of explaining the measurement methodology, data gathering, and data analysis steps before sharing final results; skipping these steps can minimize the full effect of the findings.

Reactions to Communication

The level of commitment and support expressed by the managers, executives, and sponsors will mirror how successfully the results of a project have been communicated. Top management may also reflect their positive perception of the results by allocating requested resources and voicing their commitment. A final measure will be the comments, nonverbal language, and attitudes expressed by the audience.

When major project results are communicated, sometimes a feedback questionnaire is administered to the audience or a sample of the audience to evaluate their understanding and/or acceptance of the information presented.

Summary

The communication step does not always receive the attention it requires because it is multifaceted and somewhat complex. Effectively communicating results is a critical step in the overall evaluation process. If this step is not executed properly, the full effect of the results will not be realized, and therefore the value of the study may be lost.

Knowledge Check: Reporting Evaluation Results

In the first quarter of last year, you joined a project team to develop marketing plans for increasing the percentage of environmentally green goods sold by the household cleaning business unit. Because the effect on the company’s bottom line was potentially dramatic, the team included senior managers, marketing reps, sales training, human resources, and workplace learning and performance staff, along with a few high-performing sales reps. A senior marketer had agreed to be the project sponsor.

A major part of the strategy was a product training course for the sales team. After three weeks of home study, 500 representatives were brought to corporate headquarters for a week of training on new products, competition, and business planning.

During the initial strategic planning session, the team decided that a comprehensive study of the training would be conducted to gather results from reaction through business impact. Because all representatives were required to undergo training, a study with experimental and control groups to measure impact was not feasible. Results would best be measured by using the participants’ estimates of impact on key business metrics.

A reaction and learning evaluation was completed by all participants at the end of the home study and in-house training. Three months after the in-house training, the evaluation team emailed a comprehensive behavioral change evaluation, which included questions on estimation and confidence. In all, 366 of the 500 participants completed the evaluation.
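Because the study relies on participant estimates, the impact figures reported to stakeholders would typically be adjusted for each respondent’s confidence before being rolled up. The sketch below is a minimal, hypothetical illustration of that kind of adjustment; the numbers are invented and are not data from this scenario.

    # Hypothetical illustration: adjusting participant impact estimates for
    # attribution to training and confidence. All figures are invented.
    responses = [
        # (estimated monetary impact, % attributed to training, confidence)
        (20_000.0, 0.40, 0.80),
        (12_000.0, 0.60, 0.50),
        (35_000.0, 0.25, 0.70),
    ]

    adjusted = [impact * attribution * confidence
                for impact, attribution, confidence in responses]

    print(f"Adjusted benefit across respondents: ${sum(adjusted):,.0f}")  # $16,125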

Finally, after eight months, a report was emailed to all containing some results, partial recommendations, and promises of a more comprehensive report later in the fourth quarter.

1. As someone involved in this training and study, would you consider the reporting of the evaluation results acceptable? If not, why not?

2. If you were on the team designing the measurement study, how would you have designed the communication plan?

Check your answers in the appendix.

About the Author

Tom Broslawsky, BSIE, is the owner of Managing thru Measurement, LLC, a company that specializes in assisting corporations and institutions to develop measurement programs that align training objectives with strategic corporate business goals. He has international experience as a consultant, speaker, author, and facilitator with expertise in the Phillips ROI Methodology, along with 30 years’ experience in education, training, healthcare, wellness, pharmaceuticals, and manufacturing. His most current venture is as the U.S. account manager for SenseiROI, a one-of-a-kind software product that incorporates the Phillips ROI Methodology into a simple, sustainable, and cost-effective measurement tool. He can be contacted at [email protected].

References

Block, P. (1981). Flawless Consulting. San Francisco: Pfeiffer.

Kaplan, R. S., and D. P. Norton. (1996). The Balanced Scorecard: Translating Strategy into Action. Boston: Harvard Business School Press.

Phillips, J. J., and P. P. Phillips. (2007). Show Me the Money: How to Determine ROI in People, Projects, and Programs. San Francisco: Berrett-Koehler.

Phillips, P. P., J. J. Phillips, R. D. Stone, and H. Burkett. (2006). The ROI Fieldbook: Strategies for Implementing ROI in HR and Training. Burlington, MA: Butterworth-Heinemann.

Phillips, J. J., and W. F. Tush. (2008). Communication and Implementation: Sustaining the Practice. San Francisco: Pfeiffer.
