In interviews, an individual responds orally to questions asked orally by one or more persons. Interviews may be conducted face-to-face or via telephone or video conferencing. This chapter provides basic guidelines for developing and applying interviews to collect data for evaluating training and other development programs.
There are two basic types of interviews: structured and unstructured. Descriptions of each are listed below.
In a structured interview, the interviewer asks a series of preplanned questions to all interviewees. The interviewee may be asked to select one of several choices provided (close-ended questions) or may be asked to respond in his or her own words (open-ended questions).
Tables 6-1 and 6-2 are examples of close-ended and open-ended questions.
In unstructured interviews, each interviewee may be asked different questions, and no response choices are provided. The interviewers are free to develop their own questions based on the interviewees’ particular characteristics or experiences or their responses to previous questions.
Structured interviews are usually more efficient and accurate for collecting factual information. For this type of information, consider using close-ended questions with a list of response choices for each question; however, allow for other responses. Using a response scale makes analyzing the data much easier and faster. Open-ended questions can provide the most in-depth information for complex issues and topics related to people's personal experiences and ideas. Many interviews combine both structured and unstructured techniques. For example, an interviewer may begin with a series of standard, preplanned questions about an employee's overall reactions to a training program and then tailor subsequent questions to better understand the employee's responses.
Table 6-1. Example of Close-Ended Question
How often do you think you will apply what you learned in class to your work?
a. Every day
b. Once a week
c. Once a month
d. Every few months
e. Less than every few months
Table 6-2. Example of Open-Ended Question
How do you plan to apply what you learned in class to your work?
Interviews are a useful tool for assessing participants’ and other stakeholders’ perceptions of a training or development program and the extent to which participants have applied their learning to the job (that is, how their work behaviors have changed). Interviews capture a range of perspectives and can be easily modified for different stakeholders, both within and outside an organization. For example, when I was working at a large aerospace firm, a colleague and I conducted an evaluation of a pilot 360-degree feedback process. We interviewed the engineers, their raters, their managers, and the top executive. Although the topics of the interviews were the same, we worded the questions differently for each group. Each group provided a different perspective on the usefulness of the process and how it could be improved.
Employees, their coworkers and managers, and other stakeholders can be interviewed to determine how a training or performance improvement program has changed the work performance of employees. For example, managers can be questioned about the quality of their employees’ work in the targeted area before and after a training program. For employees who have direct client contact, the clients could also be interviewed about the service provided to them.
Interviews can be used to collect anecdotal data about the value of the training, both through success stories in which the training program played an important role in preparing an employee to successfully resolve a problem and through incidents in which the training did not provide adequate preparation. Anecdotal data provide dramatic demonstrations of the effect of a training or development program. They spark interest in an evaluation report or presentation because they give real-life examples and provide the human story behind the research.
Interviews are also an effective approach to identifying obstacles to the success of the training program. For example, using carefully crafted questions, the interviewer can guide interviewees to candidly discuss the problems they faced in applying their classroom learning to the job. This information can then be used to plan how to better support transfer of learning to the job.
As with any other data collection method, interviews have distinct advantages and disadvantages. Several are summarized below.
Advantages
Disadvantages
To ensure that the evaluation data you collect are accurate and reliable, use more than one method of collecting information. All research methods have weaknesses and strengths. When you use interviews to collect data, also use a method with complementary strengths and weaknesses. Multiple methods provide a sounder basis for your conclusions. The data collected with different instruments also illuminate multiple facets of the issue and, therefore, provide deeper and broader insights. The data, and therefore the evaluation, will have more credibility with your internal or external clients. For example, because interview administration and data analysis are time consuming, interviews are usually not practical for large groups. However, they can be used to collect initial information to develop questions for surveys or focus groups that will allow for input from much larger populations to verify the interview data collected. Interviews also can be used to follow up surveys or focus groups to collect more in-depth information from a subset of the survey or focus group respondents.
The eight steps of planning and conducting an effective interview are presented below.
The first step in planning an interview is to clearly define the information you want to collect and identify the people who can best provide that information. Do you want to know how employees feel about a program, what they learned, or if they applied what they learned to the job? Who can most accurately answer your questions: the employees themselves, their managers, their customers, or their coworkers?
For example, we want to know if the 30 managers who completed the “Giving Employees Useful Feedback” course are applying what they learned on the job. Because employees are the recipients of the feedback, we will ask them about the feedback provided by their managers. We do not have the resources to interview all 240 employees of the managers, so we will draw a random sample of one employee per manager to interview.
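A draw of this kind, one randomly chosen employee per manager, can be sketched in a few lines of Python. The roster below is hypothetical, and fixing the random seed is an assumption made so the sample can be reproduced and audited later:

```python
import random

# Hypothetical roster: manager -> that manager's employees.
# Names and structure are illustrative, not from the evaluation itself.
roster = {
    "Manager A": ["Lee", "Ortiz", "Chen", "Patel"],
    "Manager B": ["Kim", "Nguyen", "Baker"],
    "Manager C": ["Singh", "Diaz", "Mori", "Cohen", "Ruiz"],
}

random.seed(42)  # fixed seed so the same sample can be redrawn if questioned

# Draw one employee at random per manager (the chapter's example has
# 30 managers; three are shown here).
sample = {manager: random.choice(staff) for manager, staff in roster.items()}

for manager, employee in sample.items():
    print(f"{manager}: interview {employee}")
```

Sampling one employee per manager, rather than 30 employees from the combined pool of 240, ensures every manager's feedback behavior is represented by at least one interview.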
Because the methods used to collect and analyze the data are interdependent, it is important to plan how you will analyze the data before making a final decision to use interviews for evaluation data collection. You need to determine if the resources and skills required for summarizing and interpreting the data collected will be available to you. Close-ended interview questions typically produce quantitative data, while open-ended questions result in qualitative data. The usual methods of summarizing quantitative data obtained from interviews are frequencies, percentages, and cross-tabulations; these may then be supplemented with more advanced statistical methods, such as correlations or tests of statistical significance (for example, chi-square tests, t-tests, and analysis of variance). Content analysis techniques are typically used for qualitative interview data.
Interviewers should develop an interview protocol or script to ensure all needed information is collected and the interviews are consistent for all interviewees. The protocol should include
Figure 6-1 is an example of an interview protocol.
When preparing the interview questions, do not ask for information that can be easily obtained elsewhere, such as in personnel records. Organize questions by topic and arrange them from general to specific within each topic. At the conclusion, allow time to ask the interviewees if there is anything they would like to add.
• Include only one idea in each question
• Use simple, familiar language
• Use complete sentences
• Do not use words or terms that have multiple meanings
• Use positive rather than negative inquiries whenever possible
• Screen the content, wording, and tone for potential offensiveness
Figure 6-1. Sample Interview Protocol
Interview for the “Giving Employees Useful Feedback” Course Evaluation
Instructions to Interviewer are in italics.
Opening Statement
Read the following statement to the interviewee:
Hello, [name]. My name is …… . [Take a moment here for small talk to build rapport.] We are talking to a sample of the employees of the managers who recently participated in the “Giving Employees Useful Feedback” course to find out if they are applying what they learned in class on the job. The information you provide will be kept confidential. It will only be seen by the evaluation analysts who will summarize the data for all employees. This information will help us improve the training course.
I will ask you several questions about the feedback your manager has given you in the last month. Depending on your responses, the interview will take from five to 30 minutes. We are defining feedback as information about your performance that explains what you did well or how you could improve. Feedback can range from a few words as your manager passes you in the hall to a long discussion in his or her office.
Questions
Ask the interviewee the following questions and note the responses below each question.
1. In the last month, has your manager given you feedback?
Circle employee’s response: Yes No
2. Did you request feedback from your manager in the last month?
Circle employee’s response: Yes No
If the employee responded no to both Questions 1 and 2, skip to the Closing Statement.
If the employee responds yes to Question 2, ask:
2a: Approximately how many times did you ask the manager for feedback?
2b: Please describe the situations in which you asked for feedback.
Proceed with Question 3 if the employee responded yes to Question 1. Otherwise, skip to the Closing Statement.
3. How would you rate the helpfulness of the feedback your manager has given you in the past month?
____ Very helpful
____ Helpful
____ Somewhat helpful
____ Not helpful
____ Harmful
4. Please describe the most helpful feedback your manager has given you in the past month. As best as you can remember, tell me about the situation and what the manager said.
5. Why did you find this feedback especially helpful?
(Note: Additional questions would appear here.)
Closing Statement
Read the following statement to the interviewee:
This concludes the interview. Thank you for taking the time to help us improve managerial training. Your input is valuable. We will present the results of the interviews to the executive team next month.
Pilot testing is a vital component of interview development. In a pilot test, interviews are conducted with a small group of participants to identify revisions needed to the questions, instructions, and data-recording procedures. You should also try out the data analysis techniques and preview the potential difficulty and time requirements for the full analysis.
In most cases, limit each interview to 45 minutes or less. Many people find it difficult to focus fully for longer periods. Schedule sufficient time between interviews for the interviewers to refine their notes on the open-ended questions. Immediately after each interview, the interviewers should type up and expand their notes while their memory is fresh.
Plan to conduct the interviews in a private, quiet place, preferably in a neutral location, such as a conference room rather than the interviewee’s or interviewer’s office. Invite the selected employees to participate in the interviews at least two weeks in advance. Inform them of the purpose of the interviews and the time, location, and expected duration of the interview. Send interviewees a reminder a few days before their scheduled interview, restating the purpose and benefits of the interview; providing the date, time, and location; and expressing appreciation for their participation.
Select interviewers who have strong listening, analytical, and writing skills and are also personable and friendly. The interviewers should not be in the employee’s chain of command and should be perceived as objective. The accuracy and completeness of the interview record can be increased by having two interviewers conduct each interview.
Provide training for the interviewers one or two days before the interviews begin. During the training, explain the purpose of the interviews and how the collected data will be used. Review the interview protocol and have each interviewer conduct at least two practice interviews, followed by feedback on their performance. Also include the following points in the training:
A coordinator should be assigned to monitor the interview process to ensure everyone is in the right place at the right time, confirm that the interviewers have the materials they need, answer questions, follow up if interviewees do not appear as scheduled, and alert the evaluation project manager of any issues. The project manager should be readily available so that any problems can be quickly resolved. Following the interview, send each interviewee a thank you note.
As soon as the interviewers have prepared their interview notes for submission, the coordinator should review the notes and check that the interview was fully and accurately documented. If responses are missing or unclear, the notes should be returned to the interviewer for prompt clarification.
The selected analysis techniques should be used by an individual who is well trained in data analysis. It is important to check and recheck calculations and summaries of qualitative responses to be sure they are correct.
You have been asked to create an interview protocol for an evaluation of a year-long new employee on-boarding program. The objective of the interviews is to determine if employees who completed the program in the previous six months found it useful and, if so, how. Develop a sample protocol that includes at least three questions. Check your answer in the appendix.
Anne F. Marrelli, PhD, is a senior organizational psychologist in the Organizational Effectiveness group in Air Traffic Operations in the Federal Aviation Administration. She has more than 25 years of experience in organizational performance improvement. Former employers include the U.S. Merit Systems Protection Board, American Express, Hughes Electronics, Educational Testing Service, and the County of Los Angeles. Marrelli earned a doctoral degree in learning and development from the University of Southern California and has published numerous journal articles and book chapters. She may be reached at [email protected].
Fowler, F. J. (2002). Survey Research Methods, 3rd ed. Thousand Oaks, CA: Sage Publications.
Marrelli, A. F. (1998). “Ten Evaluation Instruments for Technical Training.” In Another Look at Evaluating Training Programs, D. L. Kirkpatrick ed. Alexandria, VA: American Society for Training and Development, 58–65.
Pershing, J. L. (2006). “Interviewing to Analyze and Evaluate Human Performance Technology.” In Handbook of Human Performance Technology: Principles, Practices, Potential, J. A. Pershing ed. San Francisco: Pfeiffer, 780–94.
Phillips, J. J. (1991). Handbook of Training Evaluation and Measurement Methods, 2nd ed. Houston, TX: Gulf.
Rubin, H. J., and I. S. Rubin. (1995). Qualitative Interviewing: The Art of Hearing Data. Thousand Oaks, CA: Sage Publications.
Webb, E. J., D. T. Campbell, R. D. Schwartz, and L. Sechrest. (2000). Unobtrusive Measures, rev. ed. Thousand Oaks, CA: Sage Publications.
Kvale, S. and S. Brinkmann. (2009). InterViews: Learning the Craft of Qualitative Research Interviewing. Thousand Oaks, CA: Sage Publications.
Patton, M. Q. (2002). Qualitative Research and Evaluation Methods, 3rd ed. Thousand Oaks, CA: Sage Publications.
Rubin, H. J., and I. S. Rubin. (2005). Qualitative Interviewing: The Art of Hearing Data, 2nd ed. Thousand Oaks, CA: Sage Publications.