Chapter 20
Benchmarking
John T. Hansmann
Introduction
Have you ever asked yourself, "Is there a better way to do this?" Or, possibly, "There's got to be a better way to do this." Or, maybe, "I'm sure someone else has figured out a better way to do this." Those are three layman's phrases that describe the concept of benchmarking. This chapter focuses on benchmarking: what it is, and practical tools and examples for using comparative analysis and benchmarking successfully to improve your operations and overall performance.
Contents
Introduction
Definition of Benchmarking
Benchmarking in Healthcare
  Keys to Successful Benchmarking Efforts
Benchmarking: The Eight-Step Process
  Step 1: Data Comparison, Internal and External
  Step 2: Variance Analysis: Identify Performance Variance and Potential Improvement Opportunities
  Step 3: Identify a Specific Area of Opportunity, the Focused Department or Topic
  Step 4: Identify the Comparison Department/Hospital
  Step 5: Plan the Contact with Benchmark Partner: Survey, Telephone, and/or Site Visits
  Step 6: Perform Interviews/Site Visits
  Step 7: Review Findings from Contact; Summarize and Develop Action Plans to Implement Improvement Opportunities
  Step 8: Implement Action Plans and Monitor to Sustain Changes
Other Uses for Benchmarking/Comparison Analysis
Summary
Denition of Benchmarking
A benchmark is “(1) a point of reference from which measurements may be made or, (2) something
that serves as a standard by which others may be measured or judged.
*
e concept of benchmark-
ing in business dates back to at least the 1970s, but it was really brought to the forefront in the 1980s
by Xerox Corporation. Xerox had lost a lot of their market share to other companies, and needed
to do something dierent. Robert Camp, who led a number of Xerox benchmarking initiatives,
dened benchmarking as “the search for industry best practices that lead to superior performance.
Benchmarking provides information to help make more informed decisions about the busi-
ness. It assists organizations to remain competitive and productive. It helps develop best practices.
Benchmarking is an organized approach to determine if someone does something better than you,
and to learn from them to improve your operations. It results in organizations seeking to become
the best by identifying and implementing best practices.
Benchmarking in Healthcare
Formal, step-by-step benchmarking initiatives have been successful in many other industries.
Healthcare has applied the concepts, but typically not as rigorously as other industries. e time
and resources necessary to perform benchmarking typically have not been provided. e practice in
healthcare tends to focus more on high-level nancial metrics and less on the underlying processes
or service and quality metrics. In most cases, a management engineer or process improvement
professional gets an assignment from the hospital administrative team to reduce costs by using
the provided comparative data, often called benchmarking data. e benchmarking data typically
shows a large calculated savings opportunity that the administration wants to realize. e manage-
ment engineer/process improvement professional needs to use a process adapted from the formal
benchmarking steps used in other industries to realize any of the potential savings opportunity.
Keys to Successful Benchmarking Efforts
An argument could be made that any benchmarking effort, whether small or large, that created a learning opportunity and identified potential improvements was successful. But for any benchmarking initiative to be considered a success, it needs to include many of the following key components.
Links to key business objectives, improvement initiatives, or strategies. The time and effort necessary to complete a successful benchmarking initiative require that it be done in support of key business objectives or strategies. Initiatives not linked to key business objectives dilute or waste resources needed for critical projects.
Good metrics. The comparison data are the starting point for any benchmarking initiative. Without the data, the organization doesn't know where its potential opportunities are or who may be doing something better. And without the data, no baseline would exist against which to measure any improvements that are made.
* Merriam-Webster, http://www.merriam-webster.com/dictionary/benchmark.
† Robert C. Camp, Benchmarking: The Search for Industry Best Practices That Lead to Superior Performance (Wisconsin: ASQC Quality Press, 1989), 12.
Know your own processes. An organization must know itself, understanding its own strengths and weaknesses, before finding out about another organization. Knowing yourself helps determine which priorities to pursue first and directs the analyses to the appropriate functions. Then, when you partner with someone else, you can concentrate on the things that are important to the organization, focusing resources and expertise.
Structured approach for interviews, data collection, and fact finding. This allows the organization's resources to be as efficient as possible and uses everyone's time wisely. An organized, well-thought-out discussion or interview also shows your partner organization that you are serious about the process and respectful of their time. The focused effort minimizes "industrial tourism."
Propensity to action. Once all the analytical work has been completed and potential solutions developed, they need to be acted upon. Implementation plans with assigned responsibilities need to be developed. Action needs to be taken, adapting as necessary to modify the original plans. The good ideas need to be implemented for any results to be attained.
Manage/deal with change and transition. With all change comes the need to manage the human side of the transition: the emotions and the ambiguity. All change requires understanding the impact on the processes and people involved. While the actual change event or physical change has to be managed, the more difficult part is managing the people's transition to the new way of working.
Benchmarking: The Eight-Step Process
Many organizations have developed their own x-step methodology to complete a successful benchmarking project. The commonality in all of them is that they follow the fundamental scientific method for problem solving. A generic eight-step approach to successfully complete a benchmarking initiative follows.
Step 1: Data Comparison, Internal and External
A key attribute of benchmarking is the need for data to measure current and future performance. The first step of any benchmarking initiative is to collect data in your organization prior to looking outside at anyone else. Typical benchmarking/comparison analysis data compare like departments on operational metrics such as hours, dollars, and volume. Specific metrics are used, such as cost per unit of service (UOS), productive or paid hours per UOS, average length of stay (ALOS), salary/wage/benefits (SWB) cost per UOS, and supply cost per UOS. Some organizations include service-level and quality metrics, such as CMS core measures, in the analysis. Comparisons can be done at the hospital level, but should be done at the departmental level to provide actionable solutions.
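The per-UOS metrics listed above are simple ratios of departmental totals to volume. A minimal sketch (the function name and all figures are illustrative assumptions, not from the chapter):

```python
# Sketch: computing common unit-of-service (UOS) comparison metrics for
# one department. All input values below are illustrative.

def uos_metrics(productive_hours, paid_hours, swb_cost, supply_cost, volume_uos):
    """Return the per-UOS metrics typically used in benchmarking data."""
    return {
        "productive_hours_per_uos": productive_hours / volume_uos,
        "paid_hours_per_uos": paid_hours / volume_uos,
        "swb_cost_per_uos": swb_cost / volume_uos,
        "supply_cost_per_uos": supply_cost / volume_uos,
    }

# A nursing department with 9,297 patient days and ~74,589 productive hours
# works out to about 8.02 productive hours per patient day.
m = uos_metrics(74_589, 84_531, 2_147_000, 310_000, 9_297)
print(round(m["productive_hours_per_uos"], 2))
```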
Once an organization has collected data about itself, it needs to collect the same data from other organizations. These data may be obtained through public sources, but most likely need to be purchased from companies specializing in comparison data. The typical scenario involves the organization purchasing data from an external source that specializes in comparison data. In this scenario, data from the participating organizations are compared against each other. The value or success of comparison data revolves around consistent data definitions and having a like comparison peer group, or hospitals/departments that are similar in operations. The data definitions provide a map to determine which accounts to include or exclude, and specifically where the different data elements originate. Included in the data definitions is the timeframe (annual, year to date [YTD], quarterly) of the data, to provide a consistent comparison.
Comparison peer groups are typically decided by factors such as size of operation (e.g., bed size, number of patient days, number of procedures), patient type/mix, community populations, and/or geographic distributions. In many cases the data need to be normalized for wage differences (wage index) or patient mix (using case mix index [CMI]), or severity adjusted to include patient complexity in the comparisons. Additional operational characteristics, such as unit design (circle, square, X, H, I), unit storage (centralized vs. bedside), and available technology (automatic dispensing cabinet), refine the comparison even more. The point of the comparison is to select other hospitals that are as similar in operations as possible, to which you can compare your hospital/department. The more similar the operations, the more any variances may be attributable to actionable differences.
Step 2: Variance Analysis: Identify Performance Variance and Potential Improvement Opportunities
The second step of a benchmarking/comparison analysis initiative is to analyze the comparison data and determine whether a potential improvement or savings opportunity exists. This variance analysis calculates the difference between how one organization performs compared to another. The variance, or potential savings opportunity, is calculated by taking the difference in operating performance (e.g., productive hours per unit of service) between the two organizations and multiplying it by the volume of the focused organization. This effectively calculates the focused organization's cost of operations using the comparison organization's cost structure and the focused organization's volume. In other words, it measures how much could be saved if the focused organization operated at the other organization's cost structure.
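The variance calculation just described is a single multiplication; a minimal sketch (all rates and volumes are illustrative, and 2,080 paid hours per FTE-year is an assumption):

```python
# Sketch of the Step 2 variance calculation: the potential savings
# opportunity is the performance gap times the focused organization's
# volume. All numbers below are illustrative.

def savings_opportunity_hours(focus_rate, target_rate, focus_volume):
    """Potential productive-hour savings if the focused department ran at
    the target department's rate (hours per unit of service)."""
    return max(focus_rate - target_rate, 0.0) * focus_volume

def hours_to_ftes(hours, annual_hours_per_fte=2080):
    """Convert an hours gap to full-time equivalents (2,080 hrs/FTE assumed)."""
    return hours / annual_hours_per_fte

# A department running 10.0 hrs/UOS against a 9.0 hrs/UOS target,
# at 8,000 units of service per year:
gap_hours = savings_opportunity_hours(10.0, 9.0, 8_000)
print(round(hours_to_ftes(gap_hours), 2))
```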
Most comparisons use either a quartile or a percentile grouping methodology. Quartile and percentile groupings organize the comparison data in rank order from best to worst to identify a quartile or percentile for each hospital. In a quartile ranking system, the best or top quartile is the 1st quartile, the 2nd quartile is the next best, and so on. In a percentile ranking, the data are rank ordered from best to worst, with the percentile rank identifying the percent of hospitals that are worse on the list. For example, if a hospital is at the 75th percentile, its performance is better than 75% of the hospitals on the list.
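The percentile ranking just described can be sketched as follows (the hospital values are illustrative, and distinct values are assumed):

```python
# Sketch of the percentile ranking described above: each hospital's
# percentile is the share of the other hospitals it outperforms
# (lower hours per UOS is better). Values are illustrative.

def percentile_ranks(hours_per_uos):
    """Map each value to the fraction of the other hospitals that are worse.
    The best hospital gets 1.0 (100%), the worst gets 0.0, matching the
    100%-to-0% column seen in typical comparison tables."""
    n = len(hours_per_uos)
    return {
        v: sum(1 for other in hours_per_uos if other > v) / (n - 1)
        for v in hours_per_uos
    }

ranks = percentile_ranks([8.02, 8.70, 9.56, 10.30, 11.28])
print(ranks[8.02])  # best hospital: better than all of the others
```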
Normally a high-level target is established to perform at the 50th, 67th, or 75th percentile level. This is equivalent to the best and second-best quartiles. The methodology ultimately identifies a specific department or hospital as the target. The target hospital is used to calculate the variance or potential savings opportunity. It is important that the comparison target is an actual hospital that is really performing at the identified level, versus an average or other mathematical construct that is unattainable.
Many options exist for setting an improvement goal for the department. The ultimate goal is to achieve the same or better performance as the target department, but in some cases that may not be realistic in the short term. Practice and experience show that in a short time period (defined as one year), it is rarely feasible to make changes greater than 20% of the total expenditure. Another concept that has been used to set the goal is to improve one spot in the rankings. This is especially useful when the ultimate target would be a real stretch goal for the department. A third concept creates a potential savings range, sometimes described using terms such as conservative and aggressive targets. However it is done, the point is to identify a hospital that is performing better than you. Improving one spot, being at the median of the list, or being better than 75% of the hospitals is much more realistically attainable than reaching the top of the list. This methodology also protects against comparing to a potential outlier whose performance may not be attainable.
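Two of the goal-setting options above, improving one spot and reaching the median hospital, can be sketched directly from a ranked list (the rates below are illustrative, and the function name is an assumption):

```python
# Sketch of two goal-setting options discussed above: the "improve one
# spot" rate (the next-best hospital) and the median hospital's rate.
# All rates are illustrative hours per UOS, listed best (lowest) to worst.

def improvement_goals(focus_rate, ranked_rates):
    """Return candidate goal rates for a focused department."""
    better = [r for r in ranked_rates if r < focus_rate]
    one_spot = better[-1] if better else focus_rate   # next hospital up the list
    median = sorted(ranked_rates)[len(ranked_rates) // 2]  # middle hospital
    return {"one_spot": one_spot, "median": median}

goals = improvement_goals(10.30, [8.02, 8.70, 9.56, 9.81, 10.10, 10.30, 11.28])
print(goals)
```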
The table in Figure 20.1 shows typical comparison data and potential opportunity savings for Med/Surg nursing departments. It compares like departments using productive dollars and hours per patient day. The metrics are calculated for each department and then rank ordered from lowest to highest productive hours per patient day. This example uses the percentile methodology, where the target performance is the 50th percentile, or better than 50% of the departments listed.
Many available tools provide graphical views of the data in addition to the table of numbers. The data displayed in Figure 20.1 show the departments aligned in ascending order of productive hours per patient day. Organizing the data in ascending order makes it easier to see clustering of hospitals/departments, which tends to indicate a consistent practice or performance.
In the example, the target hospital (50th percentile) is hospital R. Its performance is 9.56 productive hours per patient day. The focus hospital is hospital N, running at 10.30 productive hours per patient day.
Comparison Group: Med/Surg
UOS: Patient Days
Comparison Target: Median (50th Percentile)

Rank  Hosp ID  Hospital Name         Dept UOS  Prod Hrs/UOS  Prod FTEs  Paid FTEs  Prod Dollars  Percentile  Prod FTE Var to Target
1     A        Hospital A            9,297     8.02          35.86      40.64      $2,147        100%        0.00
2     P        Hospital P            7,400     8.70          30.95      35.31      $1,806        94%         0.00
3     L        Hospital L            12,420    8.90          53.15      56.56      $3,135        88%         0.00
4     H        Hospital H            11,091    9.14          48.74      55.92      $2,420        82%         0.00
5     B        Hospital B            7,694     9.18          33.97      40.80      $2,190        77%         0.00
6     I        Hospital I            8,255     9.30          36.90      42.31      $1,722        71%         0.00
7     J        Hospital J            8,665     9.46          39.41      43.54      $2,341        65%         0.00
8     D        Hospital D            7,775     9.51          35.56      37.67      $1,915        59%         0.00
9     R        Hospital R (Target)   9,290     9.56          42.68      46.61      $2,508        53%         0.00
10    O        Hospital O            10,494    9.68          48.85      52.66      $3,032        47%         (0.64)
11    K        Hospital K            8,654     9.69          40.32      47.35      $2,282        41%         (0.56)
12    M        Hospital M            13,356    9.76          62.67      70.49      $3,760        35%         (1.31)
13    Q        Hospital Q            13,427    9.81          63.33      72.39      $3,731        30%         (1.64)
14    F        Hospital F            8,633     10.10         41.90      45.42      $1,957        24%         (2.24)
15    N        Hospital N (Focus)    9,350     10.30         46.29      51.14      $2,408        18%         (3.34)
16    G        Hospital G            8,437     10.34         41.96      45.87      $2,256        12%         (3.20)
17    E        Hospital E            10,769    10.65         55.14      61.56      $2,650        6%          (5.66)
18    C        Hospital C            10,450    11.28         56.68      62.00      $3,041        0%          (8.67)

[Bar chart: productive hours per patient day for the peer comparison group, hospitals A through C in ascending order, with hospital R marked as the target (9.56) and hospital N marked as the focus (10.30).]
Figure 20.1 Typical comparison data and potential opportunity savings for Med/Surg nursing
departments.
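The FTE variance column in Figure 20.1 can be checked with a short sketch using hospital N (focus) and hospital R (target); the 2,080 paid hours per FTE-year conversion is an assumption, so the result only approximates the table's figure:

```python
# Worked check of the Figure 20.1 opportunity for hospital N (focus)
# against hospital R (target, 50th percentile).
# 2,080 hours per FTE-year is an assumed conversion factor.

target_hours_per_day = 9.56   # hospital R, productive hrs/patient day
focus_hours_per_day = 10.30   # hospital N, productive hrs/patient day
focus_patient_days = 9_350    # hospital N's Dept UOS

gap_hours = (focus_hours_per_day - target_hours_per_day) * focus_patient_days
gap_ftes = gap_hours / 2080
print(round(gap_ftes, 2))  # close to the (3.34) FTE variance shown in the table
```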