CHAPTER 8

______________

Steps 1 and 2: Planning and Conducting the Employee-Engagement Survey

Employee-engagement surveys are the most direct—and therefore, I believe, the best—way to assess engagement. However, since almost nothing is ever perfect, surveying has its advantages and disadvantages and must be approached and planned carefully, especially in government.

One obvious advantage is that well-designed and well-administered surveys quantify the level of employee engagement with a precision that other approaches can’t match. For example, consider one of the “agree/disagree” statements suggested in the “diagnostic checklist” cited in Chapter 6 as a way to determine whether employees are engaged: employees are asked if “performance assessment and development often feel like transactional activities that are done to the employee as opposed to being driven by the employee.”1 This is an important question, and rather than trying to answer it anecdotally or subjectively, a survey can answer it (and other important questions) quantitatively and therefore more precisely. In addition to identifying areas to focus on to improve engagement in the short term, survey results can also provide benchmarks to assess and improve engagement over time.

Another advantage is that good surveys deliver results that are clear and actionable. For example, if survey results show that engagement is low because employees aren’t receiving regular and useful feedback from their supervisors or aren’t sure what their roles and responsibilities are, managers can act to address these shortcomings. In addition, surveys like those used by the U.S. Merit Systems Protection Board (MSPB), the U.S. Office of Personnel Management (OPM), and the U.K. government allow the surveying organization to develop an engagement index that summarizes the engagement level across the organization. This can be an important benchmark for assessing overall engagement. One survey firm also provides a “change index,” a summary number (on a 0–100 scale) that rates an organization’s readiness to make the kind of change needed to improve engagement.

But surveying can be a double-edged sword. A poorly planned survey, or a survey with no follow-up, can create more problems than it solves. Therefore, planning requires considering a range of issues:

• Deciding on survey governance

• Deciding whom to survey

• Deciding how often and when to conduct the survey

• Locking in senior leadership support

• Communicating the survey purpose, process, and results

• Designing the survey

• Preparing for possible public visibility

PLANNING A SURVEY

Deciding on Survey Governance

An important starting point is for the agency to consider how the survey process will be “governed”—that is, who will be responsible for managing the process (e.g., a steering committee, human resources). The U.K. Ministry of Justice, which has 80,000 employees, formed an engagement steering group of high-level business sponsors, in large part to achieve executive buy-in across the organization. The ministry also created a working group, representing all sectors of its workforce, that dealt with practical issues such as how the survey would be coordinated, what questions to ask, and who should receive results reports.

Decisions about governance are particularly critical in public-sector organizations, which, as described earlier, operate in a fishbowl environment where decisions and operations are subject to intense scrutiny by multiple stakeholders with often-conflicting interests. Smart agencies will involve as many stakeholders as possible in some capacity, perhaps through a steering committee. Given that labor unions still have strong influence in the public sector, they can also play a key role in governance.

The U.S. Postal Service (USPS), which has had its share of employee problems, provides an example of what can happen if unions aren’t involved. In 2010, the American Postal Workers Union, which represents more than 200,000 current and former USPS employees, called for its members to boycott the 2012 survey because, according to the union, “the Postal Service has misrepresented the results of employee opinion surveys in the past, when it used survey data to justify claims that employees supported its wage proposals.”2

Often, HR manages the employee-engagement survey process, but this depends on several factors, including HR’s reputation and credibility. Unfortunately, HR is not viewed as a credible business partner in some organizations. This is often true in government, where HR can be perceived as the “personnel police” because it is charged with enforcing the many personnel and civil-service rules that govern public-sector jurisdictions and agencies.

When we conducted an engagement survey at the University of Wisconsin, HR coordinated the survey process but we also worked hard to ensure that this was not perceived as “just another HR project.” With the active support of our executive sponsor, we continually reminded our group of senior executives that they themselves developed the strategic goal that drove the engagement initiative—that is, the goal to “create an environment of respect and inclusiveness through opportunities for employee engagement.” HR’s intent was simply (or not so simply) to help implement that goal.

Deciding Whom to Survey

Does the agency want to survey all employees or just a sample? The Postal Service surveys 25 percent of its employees every three months. In 2012, the OPM transitioned from surveying a sample of federal employees to surveying the entire workforce. The University of Wisconsin Hospital and Clinics surveys all employees who have been on board for at least three months, since newer employees would not have a solid basis on which to answer the survey questions.

Another issue is what types of employees to include: permanent full-time, part-time, temporary, or intermittent employees; contract employees; senior executives; employees in all locations; and so on.

Most government agencies that conduct surveys allow their employees to complete surveys while they are at work, on the clock. This decision needs to be communicated across the organization, particularly to supervisors.

There are several statistical issues related to surveying that may need to be considered, particularly if the agency decides to administer the survey to a sample of employees rather than to the entire workforce. A detailed discussion of these technical issues is beyond the scope of this book.

However, one example is a technique known as “weighting” the survey results to ensure they represent the true demographics of the entire workforce. For example, if an agency’s workforce is 50 percent women and 50 percent men, but only 40 percent of the survey respondents are women, the results would have to be proportionately adjusted (weighted) in favor of women so that the data reflect the workforce’s true gender split.
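
To make the weighting arithmetic concrete, here is a minimal sketch in Python. All of the numbers (the workforce split, the respondent mix, and the favorable rates) are hypothetical, chosen only to illustrate the adjustment:

```python
# Minimal illustration of weighting survey results to match workforce
# demographics. All numbers are hypothetical.

population_share = {"women": 0.50, "men": 0.50}   # true workforce mix
respondent_share = {"women": 0.40, "men": 0.60}   # mix among respondents

# Each group's weight = its population share / its respondent share
weights = {g: population_share[g] / respondent_share[g] for g in population_share}
# women: 1.25, men: ~0.83, so women's answers count proportionally more

# Unweighted percent favorable by group (hypothetical)
favorable = {"women": 0.70, "men": 0.55}

# Weighted overall result: respondent share * weight collapses back to the
# true population share, so each group contributes in proper proportion
weighted_overall = sum(
    respondent_share[g] * weights[g] * favorable[g] for g in favorable
)
print(f"Weighted percent favorable: {weighted_overall:.1%}")  # 62.5%
```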

Another statistical technique is to calculate “confidence intervals” to estimate how closely the results from a sample approximate the results that would occur if the entire workforce responded to the survey. This is the technique public-opinion polling organizations use to analyze and report their results (i.e., to estimate the extent of sampling error). We often see the results expressed with a “margin of error”—that is, if the sample percentage of positive responses to a question is 60 and the margin of error is plus or minus 3 percent, then the true percentage of positive responses for the entire population most likely falls between 57 and 63 percent.
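
For readers curious about the formula behind a margin of error, here is a short sketch of the standard normal-approximation confidence interval for a proportion, at the 95 percent confidence level pollsters typically use; the sample size is hypothetical:

```python
import math

# 95% confidence interval for a proportion (normal approximation)
p = 0.60   # sample proportion of positive responses
n = 1000   # hypothetical number of respondents
z = 1.96   # z-value for 95% confidence

margin = z * math.sqrt(p * (1 - p) / n)   # about 0.03, i.e., +/- 3 points
print(f"{p:.0%} +/- {margin:.1%}: between {p - margin:.1%} and {p + margin:.1%}")
```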

These types of statistical issues can best be considered and addressed by someone with statistics expertise. Consulting firms that conduct surveys and analyze the results should provide this expertise.

Deciding How Often and When to Conduct the Survey

Ideally, jurisdictions and agencies should conduct engagement surveys on a regular schedule. This means deciding how often to conduct the survey. In the United States, the federal government now conducts the Federal Employee Viewpoint Survey annually. (The survey was initially conducted every other year.) The Canadian federal government conducts its survey every three years. While annual surveys continually refresh the engagement database, the counterargument is that one year is not enough time to act on the survey results and produce real change. The city of Tamarac, Florida, conducts surveys every two years but supplements these with “minisurveys” it administers more often to make sure the city is on the right track. The decision about survey timing can also hinge on resources, particularly in staff- and cash-strapped government agencies, because conducting an engagement survey costs both time and money.

After making the decision on the survey cycle, the next step is to decide when during the cycle to conduct the survey. Finding a survey time that everyone involved agrees is ideal may not be possible.

There may be times to avoid, like when the budget is being prepared or right before an election. The University of Wisconsin Hospital conducts its annual engagement survey in February or March, after the hospital budget process is completed but before the annual performance evaluation cycle begins. The city of Juneau conducts its annual engagement survey in May, when the city finally emerges from the long Alaska winter (and employees are presumably in a more positive frame of mind in general).

In 2012, the OPM conducted the Federal Employee Viewpoint Survey in March but waited until a few weeks after the presidential election to release the results, which showed a decline in both federal-employee satisfaction and engagement.

For a university, the beginning or end of the semester should be avoided. At the University of Wisconsin, we had lengthy conversations about finding the “right time” to survey our employees. Some executives were concerned that employees might still be reeling from actions taken by the state legislature and governor to drastically restrict public employees’ collective bargaining rights and also increase employee contributions to retirement and health insurance. The fear was that employees would again voice their frustrations through the survey, even though the university did not have the authority to roll back these changes. Another concern was the “squeaky wheel” feeling that only disaffected employees would complete the survey and use it to vent their concerns.

After much back and forth with our division directors, we realized that there was no perfect time for us to conduct an employee-engagement survey. We also pointed out that the survey instrument we planned to use, developed by the MSPB, really focused on conditions in the workplace, not factors beyond our control (i.e., decisions made by the governor and legislature). Plus, we needed to establish a baseline and, if engagement was indeed low, it was important for us to know that empirically. Our solution was to conduct the survey in early spring—the time of year that seemed to present the fewest obstacles, in our case.

Locking in Senior Leadership Support

Government organizations that have conducted employee-engagement surveys and then acted on the results consistently emphasize the absolutely essential need for executive support. This is particularly true in government where, as we have discussed, decision making is complex and often not transparent, and leadership can change rapidly. When I arrived at the University of Wisconsin, the strategic goal to focus on employee engagement was in place but not much work had been done on it. Therefore, the timing was right for us to make the case for moving forward using a survey to measure engagement.

We made this business case to our vice chancellor and, after he endorsed our approach, we made a similar case to our division directors. Despite what we thought was a compelling case for collecting survey data as a starting point and baseline, several directors were clearly uncomfortable with this approach. It took a strong statement from the vice chancellor, reinforcing that (1) we were committed to employee engagement, (2) collecting empirical engagement data was consistent with our ranking as one of the top 20 research institutions in the world, and (3) we were therefore going ahead with a survey, to move past this discomfort. To his credit, he also decided to administer the survey to his own direct reports. His commitment and leadership enabled us to proceed with the survey approach.

This was also true of the survey strategy of the Canadian province of Alberta. Without the ongoing support of the province’s deputy ministers, Alberta would not have been able to sustain its survey programs for 16 years and counting.

Surveys will almost always reveal problems, so agency leadership also has to be prepared to take the results seriously and make a firm commitment to responding. In public-sector agencies, where politics can rule and change is hard, this can take real courage. With survey data, management can no longer plead ignorance about the problems its employees identified. These employees—who have invested time and effort to complete a survey—expect management to act on the results. Failure to act will lead to cynicism, jeopardize future attempts to obtain employee feedback, and perhaps even lead to employee disengagement.

Communicating the Survey Purpose, Process, and Results

The organization must communicate to employees, right from the start, about why the survey is being conducted, how the initiative relates to the agency mission and strategy, what the survey is intended to accomplish, why the results will matter, how the survey will be administered, and how the results will be used. Without a well-developed communication strategy, confusion (and rumor and speculation) can sabotage the initiative. The agency must also emphasize to employees that their responses will be anonymous. Otherwise, they may not respond or, if they do, may not provide candid answers.

After the survey, results should also be communicated promptly and completely. Juneau and Oregon Metro, for example, post all their survey results online so employees can access them. Then, when an agency takes action to improve engagement, it must describe these actions, explain how they link to the survey results, and then provide regular progress reports.

The University of Wisconsin Hospital prepares a standard PowerPoint presentation on the survey results for its 400 managers. Each manager then tailors this presentation to include his or her unit-specific results: response rate, engagement score, work-group key drivers, and the five questions that showed the greatest positive percentage changes and the five with the greatest negative percentage changes. In this way, the hospital-wide message is consistent but individual unit managers present their own survey results to their units.
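
As a simple illustration of how a unit might pull its biggest movers out of the data, here is a minimal sketch in Python; the question wording and percentage-point changes are invented:

```python
# Toy example: find the questions with the largest positive and negative
# changes in percent favorable between two surveys. Data are hypothetical.

change_by_question = {
    "My supervisor gives me useful feedback": +8.0,
    "I know my roles and responsibilities":   -6.5,
    "I am recognized for good work":          +3.2,
    "I have the resources to do my job":      -1.1,
    # ...one entry per survey question
}

ranked = sorted(change_by_question.items(), key=lambda kv: kv[1], reverse=True)
print("Greatest positive changes:", ranked[:5])
print("Greatest negative changes:", ranked[-5:])
```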

The Chorley Borough Council (a local government jurisdiction in the United Kingdom) made presentations on its survey results to its staff, highlighting organizational strengths and weaknesses. At the end of the sessions, employees were asked, “What is the one thing that would improve your working day?” Participants were also asked to write answers on Post-its and stick them on the wall before leaving. This generated 200 Post-its and lots of ideas, which were then presented to a staff forum—made up of around 25 employee representatives—for consideration and implementation.

While communication is critical for organizations in any sector that want to improve employee engagement, it is particularly important in government, where low turnover and long tenures can create highly entrenched workforces that often resist change. Getting rank-and-file employees on board is therefore essential, and communication is an important tool to make this happen.

Appendix 2 is a sample communication plan to support an employee-engagement initiative.

PREPARING FOR POSSIBLE PUBLIC VISIBILITY

In many government jurisdictions, employee-engagement surveys and results are subject to laws or ordinances regarding freedom of information and open records. This level of public access and transparency simply doesn’t exist in private-sector organizations. While this shouldn’t prevent the agency from conducting a survey, it is a planning consideration. How will the organization, including senior leaders and elected and appointed officials, react if the results reveal problems, as they inevitably will? What if a survey of law enforcement, emergency-services personnel, or firefighters shows that these workforces are not fully engaged? Is the jurisdiction/agency prepared to deal not only with internal employee-engagement issues but also with potential political, media, and public scrutiny? This possibility, including how to respond, needs to be anticipated.

CONDUCTING THE ENGAGEMENT SURVEY

So how should a public agency proceed with the engagement survey, including deciding which questions to ask and how to administer it?

There are many reputable firms that conduct employee-engagement surveys. I’ve mentioned some, including consulting firms such as BlessingWhite, Towers Watson, Kenexa, and Gallup. Engagement surveys are also conducted by other organizations such as the Great Place to Work Institute (which produces Fortune magazine’s “100 Best Companies to Work For” list) and the Chronicle of Higher Education (which produces the “Great Colleges to Work For” list). These firms and organizations can provide invaluable support in reporting and analyzing the survey results and, most important, taking action to respond to them. Unfortunately, there are also organizations that may not be as reputable or experienced, especially when it comes to working with government agencies.

With these issues in mind, there are several options when designing and administering an engagement survey:

• The organization can design and conduct the survey itself.

• The organization can hire an outside survey organization.

• The organization can hire an outside organization to conduct the initial survey(s) but then administer follow-up surveys itself.

DESIGNING AND CONDUCTING THE SURVEY IN-HOUSE

One alternative is for the government organization or agency to handle the entire process itself, including designing the survey. Some government agencies have done this, but it takes time, expertise, and resources. The Canadian interjurisdictional survey, for example, was developed internally after extensive research on what other organizations had done. The MSPB engagement survey questions were developed based on a review of the literature on engagement and engagement surveys, as well as on a series of statistical techniques (including factor analysis and measurements of reliability and validity) to determine which group of questions could best measure employee engagement. In other words, some fairly sophisticated statistical work. Other widely used engagement surveys were developed in similar fashion.

Therefore, while it may seem attractive for a jurisdiction or agency to develop its own survey, there are existing surveys with proven statistical power and validity. Given the number of surveys that have already been tested and shown to quantify engagement, it may not make much sense to develop a new survey from scratch.

However, a viable alternative is for an agency to use an existing survey but manage the actual survey administration process itself. As we’ve described, there are surveys, like those used by MSPB and OPM, that are valid and accessible because they’re in the public domain. These surveys can also generate engagement composite scores or indexes—a useful way to summarize survey results (e.g., overall and/or by work unit or manager).

Doing it yourself requires resources and technical expertise to do the following:

• Decide what survey questions to ask.

• Develop and implement the communication strategy, particularly internally for agency employees.

• Design and administer the survey (typically online, with options for employees who can’t access the survey online and/or don’t speak English as their first language).

• Follow up to maximize response rates.

• Summarize, analyze, and report the results.

• Act on the survey data to maintain strengths and improve areas of weakness.

• Follow up, including repeating the survey regularly to assess whether the needle of engagement is moving in the right direction.

If an agency doesn’t have the expertise and horsepower to handle these steps itself, or can’t commit to this level of effort, it probably shouldn’t design and conduct the survey without help.

The U.K. Ministry of Justice—Developing and Implementing an Engagement Survey

One government agency that managed its own survey process is the Ministry of Justice, the third-largest agency in the U.K. national government. The ministry surveyed its 80,000 employees and achieved a 65 percent response rate. The project framework included a high-level steering group to achieve executive buy-in across the entire ministry. A working group that included representatives from all parts of the organization grappled with practical issues such as what questions to ask, what reports to produce, and who should receive them. “Employee-engagement champions” were selected to help bring the process to life in units throughout the ministry.

The Ministry of Justice also initiated a major internal marketing push, with the slogan “Start a chain reaction,” which was part of a ministry communications campaign as well as a vehicle for local champions and advocates to build on.

What began as an engagement “project” ultimately became business as usual and was absorbed into the ministry’s corporate university, the Justice Academy.3

USING AN OUTSIDE SURVEY ORGANIZATION

Another alternative is to contract with an organization that has employee-survey experience and expertise (e.g., Kenexa, BlessingWhite, Towers Watson, Gallup). This can be an effective strategy even though it can cost more, at least in direct expenses, than if the agency conducts the survey itself. The University of Wisconsin Hospital and Clinics, for example, has successfully used Kenexa for its surveys.

One variation on this approach is to hire an outside organization to conduct an agency-developed survey or one of the publicly available surveys. The province of Alberta used this approach, conducting research and working with an outside survey design expert to develop its own engagement survey and then contracting with an outside firm to administer it. Alberta did this largely to reassure its employees that no one internally would see their responses. This helped achieve a 71 percent response rate.

At the University of Wisconsin, we used the MSPB survey but hired a survey research firm to administer it. Like Alberta, a key reason for us to hire the outside firm was to help convince our employees that no one at the university would see their individual responses.

USING AN OUTSIDE SURVEY ORGANIZATION WITH AGENCY FOLLOW-UP

In this blended approach, the agency hires a consulting firm to conduct the initial survey (and maybe the first follow-up as well), but then the agency takes over. This is the approach used by the city of Juneau, Alaska. Juneau conducts an annual employee-engagement survey that was initially done by an outside contractor, but the city now administers it under a licensing agreement with the firm that owns the survey.

The advantage of this option is that the organization has outside help to do the initial survey planning, setup, communication, administration, and analysis. Then, after this development work is done, the agency can take over. This assumes that the agency has (or can acquire) permission to administer the survey, as Juneau did through a licensing agreement to use the consulting firm’s proprietary survey.

Selecting and then working with an employee-engagement survey contractor can be a complex process in government. Most, if not all, public-sector organizations have procurement rules they must follow to contract with an outside firm. This usually involves issuing a request for proposal (RFP), evaluating the responses, and then selecting the contractor that submits the best proposal. The following are some questions a jurisdiction or agency may want to ask if it decides to select an outside firm to either provide and/or administer a survey. In fact, even if an agency decides to conduct the engagement survey itself, it should consider many of these issues:

Does the firm have public-sector experience? While I agree that there are aspects of how government operates that should be more businesslike, this can be a slippery slope.

That’s why I think it’s important to work with an outside organization that understands the environment and unique character of the public sector, including the challenges identified in Chapter 4: how government operates, its mission and culture, how decisions are made, the complexities of operating in a political environment, the role of labor unions, the public visibility of government activities, and so on. Acting on survey results almost always means change and, particularly in government, this usually means culture change. So those who are helping to create that change must understand the public-sector environment and culture.

Does the firm have a validated survey that includes benchmark data (preferably from the public sector) to compare results against? Having a valid survey is critical to ensure that the results accurately reflect the level of engagement in the organization. Just because a set of questions looks like it can measure employee engagement doesn’t mean the questions are truly valid. What evidence does the company have that its survey will validly measure the level of engagement? Has the survey been validated across a large sample of employees and organizations? Does the company have benchmark data from other organizations and employees—preferably from the public sector—to compare results to? How will employees’ narrative comments be analyzed and reported?

How will the firm administer the survey, including follow-up, to maximize the response rate? Will the consultants test/pilot the survey before administering it across the organization? How will it reach employees who don’t have access to computers and/or have limited or no English language skills? How and when will the contractor follow up to maximize the response rate? How will the anonymity of individual respondents be ensured? How long will it take employees to complete the survey? Can they complete it partially, save their responses, and then return later to finish it? Will the survey allow respondents to provide narrative comments in addition to answering the specific survey questions?

To boost response rates, some agencies appoint employees to serve as “survey champions” to promote the survey and urge their colleagues to respond, including by sending reminders. In the city of Minneapolis, survey champions are trained and then responsible for (1) helping with the process of surveying and taking action (with an emphasis on “helping” since survey champions themselves are not responsible for conducting the survey or acting on the results), (2) providing expertise and helping managers/leaders interpret survey reports and create action plans, (3) driving leader/manager ownership and accountability to share results and create action plans, and (4) encouraging union buy-in to support solutions.

The city of Juneau also relied on survey champions to help generate an exceptionally high 94 percent response rate. Likewise, survey champions appointed by the U.K. Ministry of Justice helped this 80,000-employee organization generate a 65 percent response rate.

How will the contractor help communicate across the agency? Communication is critical—before, during, and after the survey. What is the communication plan to inform all staff about the survey, the results, and follow-up actions? How and when will the results be reported to management and employees (e.g., email, meetings, focus groups, online)?

How will the contractor analyze and report on the survey data (e.g., in what formats, and can it analyze and report the results by work units, managers, and demographic groups)? After the survey is completed, how will the consultant deliver the results? How will data be tabulated, analyzed, and reported? What data “cuts” will the consultant deliver (e.g., by work units, demographic groups, job titles/occupational groups, tenure groups, work shifts, locations, managers compared to staff)? How will results be reported? Will reports include raw data (i.e., question-by-question results), a summary, or both? In what format (e.g., Word, Excel)?

Does the consultant have a proven methodology to calculate engagement indexes and drivers? Will the firm calculate an index that summarizes the results across all the engagement questions? This index can be an important tool to compare an agency’s scores against other similar organizations, compare units within the agency, or assess progress over time. Also, can the consulting firm do statistical analysis to identify the drivers of engagement (i.e., the survey questions that are most important to employees and therefore most directly influence their engagement levels)?
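
Consulting firms use a variety of proprietary methods here, but one common approach to driver analysis is to correlate each survey item with an overall engagement measure and rank the items by the strength of that relationship. Here is a minimal sketch of that idea; the items and responses are hypothetical, and a real analysis would use far more respondents and more robust statistics:

```python
# Toy key-driver analysis: rank survey items by their correlation with an
# overall engagement score. Items and responses are hypothetical.
from statistics import correlation  # available in Python 3.10+

responses = {
    "My supervisor gives me useful feedback": [4, 2, 5, 3, 4, 1],
    "I know my roles and responsibilities":   [5, 3, 5, 4, 4, 2],
    "I have the resources to do my job":      [3, 3, 4, 2, 5, 3],
}
overall = [4, 2, 5, 3, 5, 1]   # one overall engagement score per respondent

drivers = sorted(
    ((item, correlation(scores, overall)) for item, scores in responses.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for item, r in drivers:   # items listed from strongest to weakest driver
    print(f"{r:+.2f}  {item}")
```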

Will the contractor help you decide how to act on the data to maintain engagement strengths and shore up weaknesses? Collecting employee-engagement data is the beginning, not the end, of the engagement process. Survey results often raise questions that the results alone won’t answer. Conducting a survey and then not acting on the results can leave the organization in worse shape than if it hadn’t even done a survey. After the survey, employees will be eager to not only see the data and results but learn about what’s next. So the consultants should be prepared to help the agency move forward. Moving forward can mean helping create engagement teams and then working with the teams to identify and implement action plans, monitor progress, and measure results.

So Many Engagement Surveys to Choose From—What to Do?

The wide variety of available employee-engagement surveys can be confusing. The different surveys include different questions and can also produce different engagement indexes and lists of engagement drivers.

Given my contention that solid engagement surveys have a lot in common, I believe that it’s most important to make sure that the survey selected is valid and will truly measure engagement. Surveys developed by many consulting firms, as well as organizations such as the MSPB and the OPM, meet this standard. While each survey has its proponents, any of these (and others) can enable an agency to measure engagement with confidence. And the real payoff comes from acting on the results. The goal should not be to try to develop or find a perfect survey. Instead, the idea is to (1) select an instrument that will work for the agency, its strategy, and culture; (2) conduct the survey and analyze the results; and (3) act on the data to maintain areas of strength and improve areas of weakness.

CONDUCTING AN ENGAGEMENT SURVEY AT THE UNIVERSITY OF WISCONSIN

At the University of Wisconsin, we administered an employee-engagement survey to about 4,500 employees in the 13 units that provide administrative support to our campus. These units include HR, facilities planning and management, the police department, the student union, student health services, and student housing. These units represent a wide range of employees, occupations, and demographic groups, ranging from custodians to medical doctors. Our workforce also includes many employees who do not have computer access and/or do not speak English as their first language.

We started by piloting the survey in the Office of Human Resources, both to work out any kinks and to show our campus colleagues that HR was willing to take this step itself before asking other units to do the same. We used the MSPB survey, plus the three questions from the OPM Federal Employee Viewpoint Survey that the Partnership for Public Service uses to calculate “Best Places to Work in the Federal Government” scores.

For the pilot, the university survey research center administered the survey to our 150-person HR staff (including student employees). The survey center provided us with overall summary results for the Office of Human Resources, plus summaries for the specific work units in HR. We did not see any individual employee responses, thus preserving employee anonymity.

Then, after the pilot, we expanded the survey to the other 12 units but used an outside contractor to administer the survey and report the results. This firm was selected through a competitive bidding process.

Our contractor conducted the survey largely online by providing each of the 4,500 employees surveyed with a customized web address to access the survey. Creating 4,500 employee-specific web addresses may seem overly complicated, but this was an important step to guarantee confidentiality—no university employee had access to these web addresses or any individual employee’s responses. We supplemented the online survey with paper copies (with random identification numbers, not names) for employees who don’t have easy access to computers, and we translated the survey into Spanish, Hmong, and Tibetan (that’s right, Tibetan—the university has almost 100 employees who speak this as their native language).

Our contractor kept the survey open initially for three weeks and then extended it for another two weeks to boost the response rate. The firm sent three email reminders.

We also worked directly with the other 12 units to help them communicate with their employees about the survey and prepare to receive, and act on, the results. We spoke with each director to explain the initiative, including the business case for engagement; distributed talking points (see box) to each director; and worked with each division to identify the subunits in their divisions they wanted reports on. We also asked each division to appoint a “data director” who was responsible for receiving the survey results and understanding how to explain and use them.

Employee Engagement Communication Suggestions and Talking Points for University of Wisconsin Division Directors

The following is a list of suggested talking points we distributed to managers at the University of Wisconsin to help them explain the employee engagement initiative to their staff members.

Suggested Communication Process/Talking Points

1. Involve managers and supervisors on the front end in communication. They may be able to help design the process and/or anticipate questions from staff. Make sure they are prepared to explain the initiative when staff members follow up with questions.

2. Make a deliberate choice about how you communicate. This may be different for different divisions. Venues could include the following:

• All-staff forums with time for questions

• Unit forums in smaller groups

• Emails to staff

• Combinations of these approaches

3. Follow up your initial communication by discussing staff reactions with managers and supervisors. Identify questions that should be answered at the outset to avoid unnecessary confusion or concern.

Suggested Communication Messages

1. Describe the employee engagement initiative. One of the goals in the strategic plan for the Office of Vice Chancellor for Finance and Administration is to “create an environment of respect and inclusiveness through opportunities for employee engagement.” To accomplish this goal, each division strives to engage employees more effectively in both how work is accomplished and how decisions are made.

2. Explain why we are doing this. Greater employee engagement will not only improve how individual employees feel about their jobs and the unit but also produce better unit decisions and better organizational performance. Research in both the public and private sectors confirms the power of employee engagement to improve individual and organizational performance.

Anticipated Benefits and Desired Outcomes

1. Greater personal meaning in work

2. Heightened connection to work, the university and division mission, and coworkers

3. Increased involvement and collaboration in division decision making, resulting in better organizational performance

4. More informed decision making by each division to help use resources effectively

5. Stronger partnerships across the organization

Division Process

1. Division creates engagement team with a mix of employees and managers/supervisors.

2. Contractor conducts employee-engagement survey.

3. Contractor collects and analyzes survey data.

4. Division-engagement team reviews baseline survey data.

5. Team establishes work plans and timelines to sustain areas of strength and address areas for improvement.

6. Team works across division to implement plans, monitor progress, learn, and make adjustments.

7. Division resurveys employees after about one year.

8. Division reports on best practices and shares learning across divisions.

The Survey

• The survey includes 20–25 questions/statements that have been shown to assess employee-engagement levels.

• The survey is administered online with hard copies for employees who don’t have access to computers.

• A third party administers the survey and analyzes the data. No one in the division or HR sees any employee’s individual responses.

• The contractor administering the survey provides summaries of the results (but not individual responses) to the divisions for analysis and action.

After the survey, our contractor provided each division with a series of spreadsheets containing unit-specific data, broken down question by question for the division’s work units and demographic groups. Each unit also received employee-engagement index scores based on the 16 MSPB engagement questions/statements. Because these questions were scaled from 1 to 5 (1 was “strongly disagree” and 5 was “strongly agree”), the index aggregated and summarized the unit’s overall responses to these 16 questions into a single number between 1 and 5, reflecting the overall level of engagement.

Our contractor also provided question-by-question results, as follows (an illustrative calculation follows the list):

• The average score for each survey question (on the 1–5 scale)

• The percentage of “favorable” (4–5 on the 5-point scale), “neutral” (3), and “unfavorable” (1–2) responses for each question
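
To make the index and breakdown arithmetic concrete, here is a minimal sketch in Python; the responses are invented, and the actual index aggregated the 16 MSPB items rather than the four questions shown here:

```python
# Toy engagement index and per-question breakdown from 1-5 survey data.
# All responses are hypothetical.

unit_responses = [   # one inner list per respondent, one score per question
    [4, 5, 3, 4],
    [2, 3, 3, 2],
    [5, 4, 4, 5],
]

# Engagement index: the average across all respondents and all questions,
# yielding a single number between 1 and 5
all_scores = [s for respondent in unit_responses for s in respondent]
index = sum(all_scores) / len(all_scores)
print(f"Unit engagement index: {index:.2f}")   # 3.67 here

# Average and favorable (4-5) / neutral (3) / unfavorable (1-2) shares
# for question 1
q1 = [respondent[0] for respondent in unit_responses]
n = len(q1)
print(f"Q1 average: {sum(q1) / n:.2f}")
print(f"Q1 favorable: {sum(s >= 4 for s in q1) / n:.0%}, "
      f"neutral: {sum(s == 3 for s in q1) / n:.0%}, "
      f"unfavorable: {sum(s <= 2 for s in q1) / n:.0%}")
```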

The contractor also delivered to each division a summary that aggregated the results (question by question and via the engagement index) from across all 13 divisions to serve as a benchmark; division leaders could then compare this benchmark against their division’s results. Finally, each division received a summary of its employees’ narrative, verbatim comments, summarized by themes.

* * *

While there are several different ways for a government jurisdiction/agency to assess employee engagement, conducting a well-designed employee-engagement survey is the best approach to generate actionable data on an agency’s level of engagement.

Selecting and then administering an employee-engagement survey can seem like a daunting challenge. While there is no perfect solution, the key decision is to conduct the survey. Then it’s about selecting a survey instrument that will work for the organization and its mission, values, strategy, and culture; conducting the survey; analyzing the survey data; communicating the results; and acting on the data.
