Chapter 20

Using Evaluation Results

James D. Kirkpatrick and Wendy Kayser Kirkpatrick

In This Chapter

This chapter presents the idea that unless evaluation results are effectively used, you have likely expended a lot of energy but have stopped just short of realizing the ultimate purpose of training and learning—to improve organizational effectiveness. Upon completion of this chapter, you will be able to

  • identify the results most meaningful to each key partner group
  • approach each audience in the most effective way to share relevant results with them
  • use results to improve the effectiveness of training program delivery, on-the-job application, and related business results.
 

Your Job Is Not Done Until You Create, Demonstrate, and Present Value

For decades, learning professionals have believed that their job is to design, develop, and deliver training programs. They have believed their work was done when program participants left the classroom or completed their e-learning modules. Many still believe they are truly effective when they fill classrooms, get 4.7 out of 5.0 on their reaction sheets, and close “skills gaps.” Although designing, developing, and delivering training programs are indeed important parts of the training discipline, they are not enough on their own. Extending learning into the business and creating and demonstrating value to business stakeholders is critical for training professionals who wish to remain viable into the future. Training professionals no longer have the luxury of just concerning themselves with learning events, if they ever did. Evaluation is key to being able to demonstrate business value by contributing to organizational effectiveness.

Leverage Data and Information

Using evaluation results to improve programs and show value is not a new concept. Rather, it supports the underlying purpose of training. According to the ASTD Competency Study: Mapping the Future (Bernthal and others, 2004), [measurement and evaluation is about] gathering data and information to answer specific questions regarding the value of learning and performance solutions; focusing on the impact of individual programs and creating overall measures of system effectiveness, leveraging findings to provide recommendations for change and to increase organizational effectiveness.

Our job as learning professionals is to “provide learning and performance solutions” and to “increase organizational effectiveness.” Thus, our job is to take the data and information we gather while conducting various learning events and, afterward, turn them into something useful for all.

Results Go Beyond Training

A historic problem with training evaluation is that trainers too often define “results” at Kirkpatrick Levels 1 and 2. Level 1 reaction sheets and Level 2 pre- and posttest results may tell a lot about the delivery of programs, but they are not a measure of value to the business. Data and information must be gathered at higher levels of evaluation (Kirkpatrick Levels 3 and 4) to show business value and justify training budgets. Figure 20-1 gives a brief summary of each of the Kirkpatrick four levels.

As you have learned in previous chapters, if you wait until after a training event is over to determine what data to collect, you have compromised your ability not only to measure the value of the program you have delivered, but also to create that value in the first place. You must build your evaluation plan into the design and development of your programs. Also, make sure that facilitators and trainers inform participants how evaluation will take place after training. Participants will be more comfortable because they will be expecting the reinforcing, monitoring, measuring, and encouraging activities back on the job. It also supports learning: if training participants know there will be accountability for what they learned, they are more likely to stay engaged during class.

Figure 20-1. Kirkpatrick Four Levels

Level 1: Reaction

To what degree participants react favorably to the learning event.

Common measurement tools

• reaction sheet / survey

• focus group

• interview

Level 2: Learning

To what degree participants acquire the intended knowledge, skills, and attitudes based on their participation in the learning event.

Common measurement tools

• written knowledge test

• role play and simulation

• activities and games

Level 3: Behavior

To what degree participants apply what they learned during training when they are back on the job.

Common measurement tools

• survey, interview, or focus group

• observation

• work review

Level 4: Results

To what degree targeted outcomes occur as a result of the learning event(s) and subsequent reinforcement.

Common measurement tools

• borrowed metrics (that is, existing company and human resource reports)

• survey

• focus group

“Results” are not always about Kirkpatrick Level 4 business results, like sales numbers, cost savings, customer retention, or human resource results, like the retention of top talent. Results in this chapter are defined more broadly, encompassing all four Kirkpatrick levels. This chapter will outline how to use results from all four levels most effectively.

Effective Training versus Training Effectiveness

Results can generally be divided into two types:

  • those that can be used to improve programs (effective training)
  • those that can be used to improve organizational effectiveness (training effectiveness).

You first want to ensure that your training is effective, or in other words, that the training program results in the successful imparting of the intended knowledge, skills, or attitudes to the participants. If you think there is room for improvement when you deliver your programs, first use lower-level results to enhance your training programs before turning your attention to business results. Be sure to do this as efficiently as possible, however; your business leaders will likely not wait long for you to show them business impact. In most cases, a lack of training effectiveness is not due to the delivery of the training programs themselves.

Your next and larger concern is that you accomplish training effectiveness so that the knowledge, skills, and attitudes learned during training are applied on the job and yield a measurable business result. The majority of the breakdowns occur at this stage. If you are in this majority and feel confident that you are delivering good quality training programs, continue to monitor the lower levels of evaluation (reaction and learning), but focus your attention on whether the training is yielding the desired behaviors and results.

Results Most Relevant to Each Key Partner Group

To create a cohesive training effort, share a summary of the results of a training program with everyone involved. This should be brief and high level. Then give each group the results that are most important to them. Different results will be most relevant for different groups in creating positive business impact, so emphasize the key results for each group accordingly.

The key partner groups that will use training results include

  • training participants
  • instructional designers and trainers
  • training leaders and consultants
  • business supervisors and managers
  • business executives.

If you made good decisions when building and delivering your various learning events, you will have results that can be useful to each of these groups. In addition to knowing what results to share with each group, you should also be clear about why the results are being shared and how they can be used.

Results for Training Participants

Strategy is executed one employee at a time, and training affects one employee at a time. The degree to which individuals are engaged, learn, and then apply what they have learned is the key to performance, confidence, engagement, retention, customer satisfaction, and business success. Therefore, it is wise to go over individual Level 1 and 2 results with participants to remind them of and reinforce the reasons they attended training in the first place: to learn, perform, and contribute to the organization. This means asking them about their reaction to the training program and any recommendations for improvement they may have. It also entails reviewing their own Level 2 learning scores for tests, activities, or demonstrations they performed during the class.

Results for Instructional Designers and Trainers

Instructional designers and trainers are primarily responsible for the quality of the training program itself and any required improvements, including both the content and delivery. Provide them with Level 1 reaction sheet data and Level 2 information from all in-class testing, activities, and other skills practice for this purpose.

Here is a list representative of the types of results that can help instructional designers build stronger programs and trainers deliver that material more effectively:

  • the degree to which participants were engaged during the event
  • the degree to which participants found the training relevant to their jobs
  • the degree to which participants learned what was targeted
  • the adequacy of time to practice key skills during the course
  • the ease of navigation of e-learning modules
  • the degree to which participants used social learning methods to enhance learning.

What often seems to happen with Level 1 and Level 2 results is that trainers glance over the data immediately after a program to see if participants say nice things about them, or if there are “reasonable suggestions” for improving programs. Instead, instructional designers and trainers should study the information to look for patterns. If the data suggest something is not strong—relevance, engagement, learning, and so on—dig deeper. Consider using stronger evaluation methods, such as interviews or focus groups, to determine exactly what is subpar. Then, you will be able to make the proper improvements to the course.

Too often, reaction sheets and learning data are gathered during class only to collect dust in the corner of someone’s office. With proper analysis, this information can be extremely useful to ensure that the learner achieves the foundational knowledge, skills, and attitudes that will set the table for performance results down the road.

Results for Training Leaders and Consultants

The most critical results for training leaders, such as managers, directors, consultants, and chief learning officers, to study in detail are Levels 3 and 4. There is a very critical cause and effect chain involving required drivers, critical behaviors, and desired results. In short, the degree to which required drivers are used determines how consistently critical behaviors are performed. The more consistently critical behaviors are performed, the more likely you are to accomplish the desired results.

Required Drivers. Processes and systems that reinforce, monitor, encourage, or reward performance of critical behaviors on the job.

Critical Behaviors. The few key behaviors that employees will have to consistently perform on the job to bring about targeted outcomes.

Consider this example that illustrates the cause and effect chain. Training participants have just completed a class where they learned how to use a new order entry system. The order entry system will reduce order entry time, saving the company $2 million annually once it is fully up and running. The new order entry portal has been loaded on customer service representatives’ computers, but the old system remains there as well during a transitional period. However, just because people learn a new skill doesn’t mean they are eager to apply it, particularly if there is a more familiar or easier way to accomplish a similar end. So in this case, drivers will be critical to make sure that the training graduates enter orders using the new system. Drivers could include a scoreboard showing the number or percentage of orders entered in the new system, coaching and encouragement from supervisors to use the new system, and an incentive to enter 100 percent of orders in the new system by a certain date. Training leaders should ensure that drivers like these are in place and are being used once the training program is complete.

Training leaders and consultants also should ensure that the programs for which they are responsible are being delivered effectively from a high level. They should review the Level 1 and 2 summary data and ensure that the instructional designers and trainers make any required content or delivery improvements.

Here are some examples of higher-level results that training leaders and consultants can monitor, analyze, and use as the basis of sound decisions:

  • action plan progress
  • coaching frequency
  • incidences of noncompliance
  • performance rates for critical on-the-job behaviors
  • reasons for lack of application
  • rewards for positive application
  • early business results
  • early human resource results.

Higher-level results can be used to make good decisions to maximize on-the-job application. For example, use data to identify and remove barriers to application and create focus on critical behaviors. This will directly influence the desired Level 4 results.

Results for Business Supervisors and Managers

To improve organizational effectiveness, learning professionals need to get the data and information into the hands of business people who affect execution—the front line leaders. Business supervisors and managers need to see Levels 3 and 4 results to ensure that they reinforce the key behaviors that will generate the targeted business results.

Supervisors and mid-level managers like to see these kinds of results:

  • applying knowledge and skills on the job
  • identifying and eliminating performance barriers
  • taking employee engagement scores
  • accomplishing action plans
  • demonstrating individual key performance indicators (KPIs)
  • showing operations efficiencies
  • executing cost savings
  • contributing to overall strategy.

Sometimes business managers and supervisors want specific suggestions on how to improve performance, and sometimes they don’t. If they do, that’s great. Talk with them and provide recommendations for action along with the potential results. If they don’t, give them what they do want. However, you might consider doing some “client shaping”; gently but clearly show them how you might be able to offer them more targeted recommendations in the future. Always keep in mind that they are your customers and it is your job to make their job of coaching and reinforcing learning as easy as possible.

Results for Business Executives

Business executives are at the highest level in the organization. It follows that results of training at the highest level are of the most interest to them. When providing data to the executive level, it is appropriate to focus on Levels 3 and 4. There are three reasons for this. First, business executives can influence change and growth in the organization more than any others. Second, this connects training directly to executing business strategy. Finally, it helps to dispel the common yet false belief that evaluation is no more than “smile sheets” and pre- and posttests.

Many training professionals regrettably provide Level 1 and 2 results to executives in great detail. This is what has created the myth that training can only yield results on those levels. Resist the urge to share data like number of courses available, sessions held, people trained, and hours of training completed with your executives. Focus on presenting to what degree information learned is being applied on the job and what key business results those actions are supporting.

To find out what types of ultimate results your senior sponsors and other executives are looking for, you will have to talk with them. A key question to ask them to get the information you need is, “What will success look like to you?” Here are some examples of results that balance being pleasing to executives and realistic to achieve:

  • efficient operations
  • compliance
  • retention of top talent
  • customer satisfaction scores
  • sales volume.

Summary of Key Results for Key Partner Groups

Every group that uses the results of training evaluation data should see all results in a high-level summary. However, certain results will be of the most interest for each group. Figure 20-2 provides a summary of the level of results that will be most useful and compelling to each of your key groups, along with representative decisions they can make using the information.

As you move further up the corporate ladder, you also move up the Kirkpatrick levels in terms of what type of information is appropriate and most meaningful to emphasize. Focusing your presentation of data following this guide will show both your sensitivity to limited resources and your business acumen.

Providing Evaluation Data to Each Group Effectively

Each group requires different evaluation data. The following sections discuss requirements for all key partner groups collectively, and then for training participants, instructional designers and trainers, business supervisors and managers, and business executives.

All Key Partner Groups

For any initiative, everyone involved should know the highest-level objective that the program supports. Using the example of the new order entry system from the last section, training participants should know that the reason they are learning a new way to enter orders is to save the company $2 million. This adds a higher level of meaning to the training that typically results in better learner engagement, which in turn, supports higher retention and application.

Figure 20-2. Results Most Important to Each Key Group

Levels 1 and 2: Reaction and Learning

Key partner group: Instructional designers and trainers

Targeted decisions:

• Improving program development and delivery

• Ensuring training is targeted to strategic goals

Level 3: Behavior

Key partner group: Training leaders and consultants

Targeted decisions:

• Improving follow-up and reinforcement to increase on-the-job application

• Improving business partnerships

Key partner group: Business supervisors and managers

Targeted decisions:

• Improving decisions about training choices for direct reports

• Enhancing engagement of direct reports through support and accountability

• Improving performance of direct reports

Level 4: Results

Key partner group: Training leaders and consultants

Targeted decisions:

• Ensuring training offered is in alignment with key strategic initiatives and company goals

• Reducing costs by trimming nonstrategic training

Key partner group: Business supervisors and managers

Targeted decisions:

• Improving department / division KPI metrics

Key partner group: Business executives

Targeted decisions:

• Communicating strategic objectives to focus training and reinforcement efforts

• Modeling and communicating the business partnership approach to training, performance, and strategy execution

A good way to communicate the overall meaning of a program to all audiences is through a compelling Chain of Evidence. This links the intended outcomes at each of the Kirkpatrick levels to show how one supports the other. This takes a training initiative from being just a class to something that supports the highest organizational goals. Figure 20-3 presents an example of the Chain of Evidence for the new order entry system. In this example, the program is complete. At the onset of the program, the same Chain of Evidence can be used to show the cascading goals the organization hopes to achieve. The rest of this section is dedicated to explaining how to expand on the high-level Chain of Evidence to provide more detail to each key partner group for the areas of the most interest and use to them.

Training Participants

Training participants typically need to know just their individual results. Often, they know these before they leave the classroom. If feasible, the trainer, a peer, or a supervisor can meet with them individually after the program to talk about future actions for individual improvement and higher-level contributions. A career opportunity discussion can also be part of the conversation if relevant and appropriate. Personalized conversations, when possible, yield many benefits. For participants, such conversations make them feel important. For the training organization, they punctuate the importance of training. For the organization as a whole, they increase employee engagement and satisfaction.

Figure 20-3. Chain of Evidence Example

Chain of Evidence

New Order Entry System Implementation Initiative

Level 1: Reaction   Overall course rating: 4.6 / 5.0
Level 2: Learning   Hands-on practice participation: 100%
    Posttest average score: 92%
Level 3: Behavior   Percentage of orders entered in new system:
    November: 48%
    December: 79%
    January: 99%
Level 4: Results   Cost savings that resulted from reductions in
    order entry hours:
    November: $83,250
    December: $132,000
    January: $149,000
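
For readers who track these reports in a script or spreadsheet export, the following is a minimal sketch of how a Chain of Evidence like the one in figure 20-3 could be represented as a simple data structure. The class name, fields, and printed layout are illustrative assumptions rather than a prescribed Kirkpatrick format; the figures simply echo the example above.

```python
# Illustrative sketch only: the structure and field names are assumptions,
# not an official Kirkpatrick reporting format.
from dataclasses import dataclass

@dataclass
class LevelEvidence:
    level: int      # Kirkpatrick level (1-4)
    name: str       # Reaction, Learning, Behavior, or Results
    measures: dict  # metric name -> observed value

chain_of_evidence = [
    LevelEvidence(1, "Reaction", {"Overall course rating": "4.6 / 5.0"}),
    LevelEvidence(2, "Learning", {"Hands-on practice participation": "100%",
                                  "Posttest average score": "92%"}),
    LevelEvidence(3, "Behavior", {"Orders entered in new system (January)": "99%"}),
    LevelEvidence(4, "Results",  {"Order entry cost savings (January)": "$149,000"}),
]

# Print the chain from Level 1 through Level 4 so each link is visible.
for item in chain_of_evidence:
    print(f"Level {item.level}: {item.name}")
    for metric, value in item.measures.items():
        print(f"  {metric}: {value}")
```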

 

Instructional Designers and Trainers

Instructional designer and trainer efforts need to be in alignment, so we advise meeting with them as a group to review evaluation results. Look for and generate action steps to improve engagement, learning, and follow through for participants. Taking the time to meet with these groups of people will eliminate any possibility that the results are skimmed over and then set aside. Take the initiative to make sure that results are used for future program improvements if indicated.

Business Supervisors and Managers

When working with business supervisors and managers, discuss what needs to be done collaboratively. Review data and information with them in the context of executing the strategy of their department in relation to employee engagement and contribution, and to corporate goals and directives. Constantly remind business supervisors and managers of the connection between training, their direct reports’ learning and performance, and their own support and accountability. Don’t think you can just email them a report and they will take it from there. The personal relationship that you maintain with the managers and supervisors who will reinforce, monitor, encourage, and reward the performance of critical behaviors on the job is key to training success.

During Jim’s time as the training director for First Indiana Bank, he found that he spent most of his time working with supervisors and managers at all levels. He constantly supplied them with data and information that, in his opinion, could help them improve the morale and/or productivity of their employees. At first he thought it would be enough to send this information and, surely, they would see things the way he did and make the indicated changes. He quickly found, however, that to really create change, he needed to make a business case to managers and supervisors that these were opportunities to actually execute their strategies and improve their department’s KPIs. He also learned that they wanted more than data. They wanted to know how to interpret it in relation to changing behavior and positively affecting future results.

In between personal meetings with key supervisors and managers, dashboards are an efficient and effective way to communicate program progress. The “stoplight variety” (green for actual results on target, yellow for results somewhat below target, red for results significantly below target) is an effective way to visually communicate program status. We recommend the metrics in the dashboard include measures of learning, drivers, and critical behaviors. They should be sequenced to show a cause-and-effect relationship among training, learning, reinforcement, performance improvement, and lower and upper tier results. This allows managers and training professionals to use the results to determine what to continue to support, and where to intervene to improve learning, performance, and subsequent results. Table 20-1 shows an example of a dashboard that could be used for the previously mentioned new order entry system training initiative.

Table 20-1. Dashboard Example


New Order Entry System Usage
  Target Actual
Training completed for all associates November 1 October 15
Orders entered in new system:
November 50% 48%
December 75% 79%
January 100%  
Order accuracy:
December 95% 92%
January 98%  
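
To illustrate the stoplight logic described above, here is a minimal sketch of how dashboard colors could be assigned from target and actual values. The 10 percent warning band and the sample metrics are assumptions chosen for illustration, not thresholds prescribed by the Kirkpatrick model or by table 20-1.

```python
# Illustrative sketch only: the warning band and sample figures are assumptions.

def stoplight(actual: float, target: float, warn_band: float = 0.10) -> str:
    """Return a dashboard color for an actual result versus its target.

    green  -> at or above target
    yellow -> somewhat below target (within warn_band of the target)
    red    -> significantly below target
    """
    if actual >= target:
        return "green"
    if actual >= target * (1 - warn_band):
        return "yellow"
    return "red"

# Hypothetical metrics echoing the new order entry system example
metrics = [
    ("Orders entered in new system (November)", 0.48, 0.50),
    ("Orders entered in new system (December)", 0.79, 0.75),
    ("Order accuracy (December)",               0.92, 0.95),
]

for name, actual, target in metrics:
    print(f"{name}: {stoplight(actual, target)}")
```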

 

Business Executives

As you have read in previous chapters, it is critical to negotiate expectations and reporting formats ahead of time with executives. Rather than automatically sending them dashboards, scorecards, and “executive reports,” determine what results they want and need to see, and how often. Two categories of results that interest business executives are

  • ongoing results that show progress (or lack thereof) with specific strategic initiatives
  • a final Chain of Evidence that presents a story of ultimate value to the business.

Ongoing results that show the progress of an initiative are often overlooked. A dashboard or some other simple reporting method should be used regularly (typically monthly) to show program sponsors and other interested stakeholders to what degree things are moving along toward the ultimate goals. This is important for two reasons. One, when key indicators of progress fall below standard, you will likely need executive influence to get back on track. Two, this is a way of reassuring them that all is well. It is unwise to wait months for final Level 4 results to arrive to be able to demonstrate your value. Worse, if the initiative becomes derailed for any reason during the execution, you want to identify and correct the issues as quickly as possible so the initiative doesn’t fail to produce the intended results.

When a program is complete, you may need to present evidence that your efforts have brought value to the bottom line. This may be for key company initiatives or those where the value is being questioned. The emphasis is not to rely solely on ultimate business and human resource metrics, but instead, to show results at all four levels with your Chain of Evidence. This demonstrates the power of everyone working together and solidifies the role of training in a business partnership.

If you are asked to present the results of a training program, request the opportunity to do so in person. If you are granted this privilege, you will have the best chance of showing how the Chain of Evidence and training support the business goal in question. A conversation about the program in general also gives you the chance to build your business partnership with the executive level and keep the pump primed for receiving the information you need for future training initiatives. Positioning yourself as a business partner in this way will help you as a training professional, and ultimately your entire organization as you align to meet the highest objectives of the company. See figure 20-4 for an example of how a training professional can support business managers.

Figure 20-4. Example of Successful Trainer-Business Manager Cooperation

Linda Hansen works as a high-level training manager at a large midwestern healthcare network. She led the learning and development arm of a program to increase the rate at which nurses complete electronic patient charting entries. These entries were an important component in the effort to increase patient safety. The training for this skill went off without a hitch. Attendees not only responded to the training with high Level 1 engagement and satisfaction scores, but also demonstrated high levels of knowledge and competency at Level 2. The key to ultimately improving patient safety at Level 4 was to get the nurses to actually perform their newly developed skills on the job (Level 3).

Linda spent time both before and after training ensuring managers at all levels were on board with this initiative, knowing that training in and of itself would not bring about enough change and subsequent results. She and her team partnered with the information technology department and developed and administered an automated Level 3 assessment, which indicated that the amount of on-the-job application was below standards. They queried the nurses’ supervisors, who responded, “Yes, we know they are not doing it, but if we watch them, they will, and then when we don’t, they won’t.” In short, the supervisors had no data to tell them who was and who wasn’t complying.

To resolve the problem, she provided the data to the supervisors showing them who was compliant and who wasn’t. Because the degree that the nurses used the application on the job significantly affected the supervisors’ and managers’ key performance indicators, they were only too glad for the specific data. They immediately went to work ensuring all were in compliance at Level 3, thus leading to success at Level 4.

 

Case Example: Using Evaluation Results at 7-Eleven

Field Consultant Certification Training (FCCT) is one of 7-Eleven’s most important initiatives. The program, which has been running since February 2007, is designed to prepare business consultants to effectively produce results for the organization. In each of the three distinct training phases, the company’s focal team members gather for a week at the corporate headquarters in Dallas, Texas, to learn about leadership skills, the core processes involved in doing their job correctly and efficiently, and the analytical skills necessary to produce tangible results to the bottom line.

As with all graduation programs, the end truly is the beginning for business consultants who attend this training program. The graduation marks a new way of doing business. The intensive certification process ensures that the training the business consultants received in Dallas is applied and verified back on the job.

Results of this program at all four Kirkpatrick levels are tracked diligently and monitored monthly by senior leadership to ensure the significant cost associated with the program is producing suitable results. Scorecards for all functions involved in this program include metrics designed to capture performance, focusing on application results rather than simply tracking the number of people who complete the training. Measured performance is targeted directly at the behaviors and results associated with the strategic objectives and tactics on the 7-Eleven strategy map. Table 20-2 provides a summary of the metrics tracked for this initiative.

Table 20-2. Summary of Field Consultant Certification Training Metrics

Level 1: Reaction

Metric: Participant Daily Course Evaluations

Who tracks: Facilitator Team

Who receives: (1) Facilitator Team, (2) Learning Management, (3) Learners

Purpose of metric:

1. Help facilitators improve training delivery.

2. Help the learning management team verify facilitator performance and monitor participant satisfaction.

3. Show learners that their feedback is valued and used to improve future programs.

Level 2: Learning

Metric: Knowledge Verification

Who tracks: Facilitator Team (using an online assessment tool)

Who receives: (1) Learners, (2) Managers, (3) Facilitator Team, (4) Development Team

Purpose of metric:

1. The pretest focuses learners on knowledge gaps, maximizing learning.

2. Provide supervisors and managers with individual learner reports for each of their direct reports to help them coach more effectively.

3. Give the facilitator team information to verify they are covering material adequately, or indicate places where material or delivery should be modified for more effective learning.

4. Give the development team information to gauge entering and exiting participant knowledge to tailor course materials to their needs, while meeting stakeholder expectations of content mastery.

Level 3: Behavior

Metric: Business Plan

Who tracks: Supervisors/Managers, Facilitator Team

Who receives: (1) Learners, (2) Supervisors, (3) Learning Team, (4) Senior Leadership

Purpose of metric:

1. Learners are provided valuable feedback from various sources (division vice president, merchandising, their supervisor, human resources, training, and others) on their performance of a major job component (business planning).

2. Supervisors use the Business Plan opportunity in a variety of ways, including verifying current performance and coaching learners for improved performance.

3. The learning team uses the information as part of the verification of whether the training program was effective in changing behavior on the job.

Level 4: Results

Metric: Store Sales Results (six months after certification)

Who tracks: Financial Planning, Training Team

Who receives: Senior Leadership, Financial Planning, Supervisors

Purpose of metric:

1. Sales are tracked to make sure that certification impacts sales (that is, those who are certified have higher sales increases than those who are not). Senior leadership and the learning team use these results to verify that the program is working.

2. Financial Planning uses this information to justify the training cost to the organization and senior leadership.

3. Supervisors use this information to justify the expense (even though it is not theirs) of sending their employees to training.

Knowledge Check: Measuring the Value of a New Hire On-Boarding Program

You are the leader of the learning and development team for a large consumer products manufacturing company. The company has found an exciting new market niche and will therefore be hiring new employees in the coming months.

Until now, new hires were trained with an orientation program through the human resources department. You made the case for a different program run by the training department called the new hire on-boarding program. You have been granted the chance to pilot the program with the understanding that you will report program results throughout the implementation, and make a formal report to the executive committee in six months.

The targeted success outcomes of the new program are

• increased new employee engagement scores

• decreased employee turnover during the first year

• shortened time to targeted performance levels

• improved unit productivity.

For each of the following groups, what level of targeted results will you gather and share and what targeted decisions can be made from the results provided to each group? Check your answers in the appendix.

1. Instructional designers and trainers

Level of results:

Targeted decisions:

 

2. Training leaders and consultants

Level of results:

Targeted decisions:

 

3. Business supervisors and managers

Level of results:

Targeted decisions:

 

4. Business executives

Level of results:

Targeted decisions:

About the Authors

James D. Kirkpatrick, PhD, is a senior consultant with Kirkpatrick Partners. He provides workshops and consulting for Fortune 500 companies around the world on the topics of business partnership and four-level evaluation. His clients include Harley-Davidson, Booz Allen Hamilton, L’Oreal, Clarian Health Care, Edward Jones, Ingersoll Rand, Navy Federal Credit Union, Honda Manufacturing, the Federal Reserve Bank of St. Louis, the U.S. Department of Defense, the Royal Air Force, Petronas Oil Company, and the Abu Dhabi Police Department.

Kirkpatrick’s approach goes beyond the science of training, reinforcement, and evaluation. His emphasis is on the art of developing and sharing business cases with would-be partners on the benefits of and need for a business partnership approach. He uses metaphors and the testimonials of successful employees and managers to bring his message to life.

Kirkpatrick has cowritten three books with his father, Don Kirkpatrick, the developer of the Kirkpatrick four levels. He has coauthored two books with his wife and business partner Wendy: Kirkpatrick Then and Now (2009) and Training on Trial (2010). He can be reached at [email protected].

Wendy Kayser Kirkpatrick is the founder of Kirkpatrick Partners, LLC, a company dedicated to helping organizations become more effective through business partnership. She applies her skills as a certified instructional designer and expert presenter and facilitator to lead companies to measurable success.

Kirkpatrick’s results orientation stems from her career beginnings in retail, holding positions in merchandising, direct importing, and product development with Venture Stores and ShopKo Stores. From there she held marketing positions with Springs Industries and Rubbermaid. Most recently she was a training manager for Hunter Douglas Window Fashions, managing the curriculum for 1,500 sales and customer service representatives in North America. She can be reached at wendy.[email protected].

References

Bernthal, P. R., K. Colteryahn, P. Davis, J. Naughton, W. J. Rothwell, and R. Wellins. (2004). ASTD Competency Study: Mapping the Future. Alexandria, VA: ASTD Press.

Additional Reading

Brinkerhoff, R. O. (2003). The Success Case Method. San Francisco: Berrett-Koehler.

Kirkpatrick, D. L. and J. D. Kirkpatrick. (2007). Implementing the Four Levels. San Francisco: Berrett-Koehler.

Kirkpatrick, J. K. and W. K. Kirkpatrick. (2009). Kirkpatrick Then and Now. St. Louis, MO: Kirkpatrick Publishing.

Kirkpatrick, J. K. and W. K. Kirkpatrick. (2010). Training on Trial. New York: AMACOM.

Phillips, J. J. and P. P. Phillips. (2007). Show Me the Money. San Francisco: Berrett-Koehler.
