
Chapter 3

Real Live Pay-offs: Five Non-Monetary Cases

The first non-monetary case concerns improved decision-making and the second an issue of national security. The others involve saving lives. All concern C-level decision makers.

Determining objective selection criteria for legacy system consolidation (and avoiding a Congressional inquiry in the process)

The first author was employed by the Defense Information Systems Agency during the early 1990s. One of the first conversations he had with his boss’ boss began with her stating: “Your task is to keep me from having to testify in front of a congressional inquiry!” To a newbie federal government employee, this was an unpromising start. It seemed that the DISA Center for Information Management had been tasked with developing recommendations regarding Department of Defense payroll systems. At the time, there were 37 payroll systems generating payments to civilian employees. Both the number of systems and upcoming preparation for Y2K were putting pressure on the department to simplify its environment.

In addition to the number of systems, more profound data-related challenges were revealed. Various individuals, attempting to determine the total number of civilians employed by the department, asked each of the individual systems to report on items such as the number of employees it covered. The inevitable rejoinder from each system was something like: “What do you mean by employee?” followed by “This is how we define it for our system.”

While it might have seemed impertinent, the question was quite reasonable. Since each system defined employee (a very important Master Data concept) differently, each request from headquarters had to be further qualified so that the “real” count could be determined. For example, did the request pertain only to full-time employees, or did it also include part-time employees? That the systems used different definitions was due to a lack of data standards within the department.

There was a further, theoretically unnecessary cost resulting from the lack of standardization as well. The actual (and, more importantly, perceived) cost of answering what appeared to be a simple question was high because of the time spent manually reconciling the disparate data into standardized responses. For example, if a given system returned the combined total of full- and part-time employees, then the number of part-time employees had to be manually subtracted from that total whenever the initial query pertained only to full-time employees.
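To illustrate the kind of manual reconciliation involved, here is a minimal sketch; the systems, counts, and field names are hypothetical, but the normalization step mirrors the manual subtraction described above.

```python
# Illustrative sketch: hypothetical systems, counts, and field names.
# Each payroll system reports "employees" under its own definition, so a headquarters
# query for full-time employees must first be normalized to a common definition.

system_reports = {
    "system_a": {"definition": "full_time_only", "count": 12_400},
    "system_b": {"definition": "full_and_part_time", "count": 9_800, "part_time": 1_150},
    "system_c": {"definition": "full_and_part_time", "count": 7_300, "part_time": 640},
}

def full_time_count(report: dict) -> int:
    """Normalize one system's report to a full-time-only count."""
    if report["definition"] == "full_time_only":
        return report["count"]
    # Systems that lump in part-time staff must have them subtracted manually.
    return report["count"] - report["part_time"]

total = sum(full_time_count(r) for r in system_reports.values())
print(f"Reconciled full-time count: {total}")
```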

This manual-reconciliation arrangement presented a secondary challenge, as it preserved the status quo. In spite of valiant efforts by the participants, not only was virtually every request frustrating to satisfy, but the data comprising the response was inevitably of poor quality as well, which contributed to poor quality decision-making. Many initial requests for DoD-wide information went unsatisfied as the cost of clarifying the initial (seemingly reasonable) request grew to unreasonable proportions. So efforts to improve operations were actually hampered by poor departmental data management practices.

Back to the challenge. The request for recommendations was clearly motivated by a desire to reduce the number of departmental civilian pay systems, ideally to just one. The effort was to involve a study and recommendations regarding which system to keep and how to merge all others into the surviving system. Because the 37 systems were located in 37 states, local congressional delegations, anxious to demonstrate each system's unique capabilities and virtues to the study team (and to preserve local jobs), invited the team to visit and personally inspect the various systems on-site.

It quickly became apparent that, with the desired outcome being a single system to perform all DoD civilian payroll processing, there would be 36 losing systems (or 37 losers if the decision was to merge all 37 into a new system). With so many potential losers, a daunting congressional inquiry seemed inevitable. What was clearly needed was a means to objectively compare the 37 systems against departmental requirements. The newbie's boss' boss was also aware that a previous process modeling exercise for each system had produced thoroughly uninteresting and unhelpful results: each system was virtually indistinguishable from the other 36 systems, from a process perspective. After all, how much variation can a payroll process exhibit and still comply with organizational processing standards?

The method designed was eventually documented in a DoD-mandated instruction (Aiken 1996). It permitted objective evaluation and comparison of the systems, using data reverse-engineering to determine the data types processed by each system. This made it possible to show, for example, that the system in Florida supported maintenance of more departmental data requirements than another system because it included requirements similar to the following: one-legged engineers + working in waist-deep water + under rotating helicopter blades + on overtime.
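The comparison idea can be sketched in a few lines; the requirement labels, system names, and coverage below are invented for illustration only (the actual method is documented in the instruction cited above). Each system is scored by how many departmental data requirements it can maintain.

```python
# Illustrative sketch: the requirement labels, system names, and coverage are invented.
# Each system is scored by how many departmental data requirements it can maintain.

departmental_requirements = {
    "base_pay", "overtime", "hazard_pay", "part_time_proration",
    "disability_accommodation", "environmental_differential",
}

system_support = {
    "florida_system": {"base_pay", "overtime", "hazard_pay",
                       "disability_accommodation", "environmental_differential"},
    "other_system": {"base_pay", "overtime", "part_time_proration"},
}

for name, supported in system_support.items():
    covered = supported & departmental_requirements
    print(f"{name}: maintains {len(covered)} of {len(departmental_requirements)} requirements")
```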

When congressional delegations from the various “losing” systems inquired as to the reasons their systems were not selected, the “one-legged engineer” example was often enough to satisfy their questions. The feared congressional inquiry never materialized, and the department was able to begin planning system consolidation with minimal fuss.

Data migration planning was based directly on metadata discovered during the study and, as a result, cost a fraction of what was originally anticipated. The department saved millions through simplified processing, reduced technology footprints and economies of scale. More importantly, the department was finally able to answer specific questions now that it possessed a single source of standardized data.

Interestingly, no one attempted to calculate what it would have cost to implement standardized data in the first place and thereby avoid the cost of systems consolidation altogether.

Everyone has bills to pay (but some bills are more equal than others)

The Pentagon contracts for services with many service providers and, consequently, processes many, many invoices. Because this data management challenge evolved over time and was never engineered (or periodically re-engineered) for optimal performance, processing slowed and, on occasion, halted. The result? Invoices did not get paid. On the other side of the accounting equation, when service providers do not get paid for services previously rendered, a natural response is to halt further services. And when the service in question provides Internet connectivity to the Pentagon, the consequences can be enormous.

The operational unit responsible for the accounting system shared by Pentagon operations and other DoD organizations often encountered poor system reliability and, as a result, relied heavily on manual processing. Things came to a head when the Pentagon's Internet connection was almost turned off because of unpaid invoices for service. The threat of the Pentagon losing Internet connectivity provided the motivation necessary to identify and quantify opportunities to improve data quality, entry and reporting. We were called in to develop the plan and deliver results quickly, using better reporting practices that permitted the service provider to identify “important” accounts. A data-centric fix let the billing company assign a flag to the various accounts indicating that automatic cutoff procedures should not apply to flagged accounts.
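A minimal sketch of such a data-centric fix follows; the account records and field names are hypothetical, not the billing company's actual schema. The point is that the exemption lives in the data, so the automatic cutoff logic barely needs to change.

```python
# Illustrative sketch: hypothetical account records and field names, not an actual schema.
# A per-account flag exempts mission-critical services from automatic cutoff.

CUTOFF_THRESHOLD_DAYS = 90

accounts = [
    {"account_id": "A-1001", "days_overdue": 95, "no_auto_cutoff": True},   # e.g., Pentagon connectivity
    {"account_id": "A-1002", "days_overdue": 95, "no_auto_cutoff": False},
]

for account in accounts:
    overdue = account["days_overdue"] > CUTOFF_THRESHOLD_DAYS
    if overdue and not account["no_auto_cutoff"]:
        print(f"{account['account_id']}: automatic cutoff applies")
    elif overdue:
        print(f"{account['account_id']}: flagged account, escalate for manual review instead")
    else:
        print(f"{account['account_id']}: account in good standing")
```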

Can an anywhere-near-accurate monetary value be assigned to the Pentagon being unable to access the Internet specifically because poor data management led to poorly designed bill processing on the part of the Pentagon's Internet provider?

Identifying payment error correction and boosting troop morale? (priceless)

And speaking of payroll, please help us place a true value on a compensation payment error, especially a compensation error occurring over multiple years to an army private who is serving his country under warfighting conditions. First, one must ask: What is the actual cost of diverting the attention of not just the warfighter, but also, in this instance, the private's lieutenant, also under fire, who is concerned with correcting a four-year compensation underpayment for one of his soldiers? In this case, the correction could not be accomplished easily or in a straightforward manner.

For a private in the armed services, who has a greater need for immediate cash management than a higher-ranking member of the military, a pay issue can be critical to the wellbeing of family members. Moreover, pay issues have a negative impact on troop and unit morale, focus and mission outcomes. Brig. Richard Nugee, in his keynote address at the 2010 Data Governance Conference, reported that resolving just such an issue took many, many hours over a period of months; the processing error had resulted in an underpayment of approximately $12 thousand (Nugee and Seiner 2010). The reported cause was a rather simple data management failure: poor quality data was originally input to the system. Good quality data management processes would have prevented the poor quality data from entering the system, rather than allowing the individual who entered it to believe that it had been entered correctly. Data governance principles aimed at preventing incorrect data from entering the payroll system would have mandated the design and implementation of upfront checking, instead of attempts to validate data items after initial input.
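A minimal sketch of what upfront checking might look like follows; the field names, grades, and thresholds are invented for illustration and are not the actual payroll system's rules.

```python
# Illustrative sketch: hypothetical pay-record fields and validation rules.
# Upfront checking rejects bad data at entry time rather than after it is stored.

VALID_GRADES = {"private", "lance_corporal", "corporal", "sergeant"}
MIN_ANNUAL_PAY = 15_000   # hypothetical floor; real rules would come from the pay tables
MAX_ANNUAL_PAY = 200_000

def validate_pay_record(record: dict) -> list[str]:
    """Return validation errors; an empty list means the record may be accepted."""
    errors = []
    if record.get("grade") not in VALID_GRADES:
        errors.append(f"unknown grade: {record.get('grade')!r}")
    pay = record.get("annual_pay", 0)
    if not (MIN_ANNUAL_PAY <= pay <= MAX_ANNUAL_PAY):
        errors.append(f"annual pay {pay} outside plausible range")
    return errors

# The entry clerk sees the problem immediately instead of believing the record was correct.
print(validate_pay_record({"grade": "private", "annual_pay": 2_700}))
```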

To quantify the cost of the example presented, we can (1) total the number of hours spent by each of the two individuals addressing the issue and (2) multiply those hours by their respective hourly pay rates. The lowest active private annual pay rate in the British Armed Forces is approximately $27 thousand, while the lowest annual pay rate for a lieutenant is approximately $38 thousand.19 In fiscal terms, every hour they spent resolving the error added directly to its cost. However, there is simply no way to include the non-quantifiable costs associated with this situation in the final total, because we cannot precisely quantify the presumably negative impact on individual and unit morale, loss of focus and mission outcome.
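As a rough illustration of this arithmetic, the following sketch uses the approximate annual pay figures cited above, a nominal 2,080-hour work year, and invented resolution hours; it computes only the quantifiable portion of the cost.

```python
# Illustrative arithmetic only: resolution hours are invented; the annual pay figures
# are the approximate rates cited in the text, converted using a 2,080-hour work year.

PRIVATE_ANNUAL = 27_000      # approx. lowest active private pay, per the text
LIEUTENANT_ANNUAL = 38_000   # approx. lowest lieutenant pay, per the text
WORK_HOURS_PER_YEAR = 2_080

private_hourly = PRIVATE_ANNUAL / WORK_HOURS_PER_YEAR        # roughly 13 per hour
lieutenant_hourly = LIEUTENANT_ANNUAL / WORK_HOURS_PER_YEAR  # roughly 18 per hour

# Hypothetical hours each spent chasing the error over several months.
private_hours, lieutenant_hours = 120, 80

resolution_cost = private_hours * private_hourly + lieutenant_hours * lieutenant_hourly
underpayment = 12_000

print(f"Quantifiable resolution cost: ~${resolution_cost:,.0f}")
print(f"Underpayment being corrected:  ${underpayment:,}")
# Morale, focus and mission-outcome costs are real but cannot be added to this total.
```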

Horrifyingly but realistically, the true cost of either individual's diversion could be a fatality. Most certainly, it makes no sense for the cost of making the correction to have exceeded the amount of the adjustment itself. Unfortunately, at this point in the data management profession's maturity, there is no toolset that permits precise identification of the total cost of such a data management error. With practice we can get better but, so far, outside of Brig. Nugee's unit, this skill has not been recognized as necessary, so very few practitioners or researchers are exploring it.

Saving warfighter lives (friendly fire death prevention)

The story below, posted to an on-line forum often visited to follow discussions regarding computer risk,20 is, unfortunately, not an isolated tale. Keen interest in the role played by data in these types of incidents has made it apparent that the issue is of growing importance. The story as posted to the forum follows.

Date: Tue, 26 Mar 2002 10:47:52 -0500

From:
Subject: Friendly Fire deaths traced to dead battery

In one of the more horrifying incidents I’ve read about, U.S. soldiers and allies were killed in December 2001 because of a stunningly poor design of a GPS receiver, plus “human error.” http://www.washingtonpost.com/wp-dyn/articles/A8853-2002Mar23.html

A U.S. Special Forces air controller was calling in GPS positioning from some sort of battery-powered device. He “had used the GPS receiver to calculate the latitude and longitude of the Taliban position in minutes and seconds for an airstrike by a Navy F/A-18.” According to the *Post* story, the bomber crew “required” a “second calculation in ‘degree decimals’” -- why the crew did not have equipment to perform the minutes-seconds conversion themselves is not explained. The air controller had recorded the correct value in the GPS receiver when the battery died. Upon replacing the battery, he called in the degree-decimal position the unit was showing -- without realizing that the unit is set up to reset to its *own* position when the battery is replaced. The 2,000-pound bomb landed on his position, killing three Special Forces soldiers and injuring 20 others. If the information in this story is accurate, the RISKS involve replacing memory settings with an apparently-valid default value instead of blinking 0 or some other obviously-wrong display; not having a backup battery to hold values in memory during battery replacement; not equipping users to translate one coordinate system to another; and using a device with such flaws in a combat situation.
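As an aside, the “degree decimal” conversion the bomber crew required is simple arithmetic, which makes the design failure all the more striking. A minimal sketch, assuming coordinates expressed in degrees, minutes, and seconds:

```python
# Minimal sketch: convert degrees/minutes/seconds to decimal degrees.
def dms_to_decimal(degrees: int, minutes: int, seconds: float, negative: bool = False) -> float:
    """Return decimal degrees; pass negative=True for southern or western coordinates."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if negative else value

# Example: 34 degrees, 15 minutes, 30 seconds -> approximately 34.2583 decimal degrees
print(round(dms_to_decimal(34, 15, 30), 4))
```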

The very idea that any contractor would sell to the US Armed Forces a device that, when its batteries are replaced, resets the target to the soldier's own position without sufficient warning is inconceivable. While the incident could prompt a broader discussion of product development lifecycles and approaches, our concern here is good data management practice, which dictates that alerts related to recently changed values be provided; such alerts are not only standard in modern operating systems but also expected by the users of today's smartphone apps!

Saving warfighter lives (US Army suicide prevention: a clear data governance success)

We were fortunate to play a small role in support of the Army’s suicide prevention efforts and, at one point, made a significant contribution. As part of the effort, extensive and detailed coordination was needed to manage project-critical data from a variety of organizations. Diagrams (such as Figure 22) were considered key to coordinating the various data requests.

Deadlines were tight; we needed to make the various coordination meetings as efficient and effective as possible. While all the participating organizations wanted to support such a good cause, “Our data is bound by certain terms and conditions...” was a phrase repeated over and over again. Worse, each such condition was interpreted as requiring more analysis and discussion. This, of course, demanded the one resource we lacked: time.

 

Figure 22 Attempted mapping of sources and uses of data

A meeting was arranged at which all the involved data stewards were gathered. Also invited was Thomas E. Kelly III, then Deputy Under-Secretary of the U.S. Army and the senior officer present, who, after repeatedly hearing the phrase “my data,” reminded all present that the data actually belonged to him, since everyone in the room reported to him through the existing chain of command. Immediately following this reminder, the phrase “my data” was banished from the group's vocabulary, except when referring to the senior officer's data. The reminder was issued tactfully, but with enough force to carry the weight of an order.

At the end of the meeting, the senior officer reiterated the process for handling questions as to future data ownership: “Make an appointment to speak directly with me!” This single event, known in information technology circles as executive buy-in, served several purposes, all of which saved valuable time in the effort to prevent future soldier suicides. By agreeing to take command of the data, the senior officer provided a very heavy dose of management support for the project.

If all CEOs took similar direct action and responsibility, there would be an immediate positive impact on the bottom lines of their organizations. It would save millions (and, for some organizations, billions) annually.

Perhaps most important, however, was the inherent empowerment of the team. The conversation turned immediately from “Can this be done?” to “How are we going to accomplish this?”

Deputy Under-Secretary Kelly also made clear the need for speed, and that mistakes along the way would be tolerated. A perfect solution was not essential; instead, a workable solution in prototype form was acceptable, as it would provide lessons learned while an improved, permanent version was implemented. This case demonstrates in particular how an act of data governance increased the speed at which a prototype solution was developed!

 

 

 

 
