Chapter 3
How Organizations Struggle with Data Fluency

Few people need to be convinced about the value of using data to inform decisions. A 2012 survey conducted by the technology consulting firm Avanade found that 84 percent of senior leaders think using data has helped them make better business decisions. Furthermore, data use can help drive growth: 73 percent of respondents claim to have already leveraged data to increase revenue.1 Increasingly, the opportunities and benefits of data are well understood. What is generally not yet understood is how to actually use data and convert the potential energy stored within this information into kinetic energy that will propel growth and facilitate goal attainment.

When tracking the progress of data use, many organizations value quantity over quality. In 2014, Facebook shared that it was storing 300 petabytes of user data, an amount that had tripled since the previous year.2 That’s more than a CD’s worth of data for each man, woman, and child in the United States. A study by Oracle indicates that the average data warehousing project costs $1.1 million and takes 10 months to deliver.3 Tremendous volumes of data and massive investments are common ways that organizations demonstrate that they are on the big data bandwagon. Measures of frantic reporting activity can be equally misguided. When we worked at AOL, the business intelligence group was proud of the hundreds of reports it generated every month and of how that number kept growing.

Investments in technology and massive quantities of raw data don’t guarantee you can have thoughtful conversations using data. Data fluency means having quality discussions, a common use of measures, and shared appreciation for the meaning of your unique data. This chapter is dedicated to the development of these skills.

Your organizational strategy may be defined by a few big decisions by your leadership team. However, execution requires many small decisions.

Strategy communications should always be accompanied by metrics, which help frontline employees take ownership over their roles in the execution. The message should be two-fold: This is what we are trying to achieve and this is how we will measure if we are achieving it.

Amy Gallo, Harvard Business Review4

When an organization is data-fluent, individuals use data in everyday activities. They understand key metrics and how they relate to strategy. Useful, usable data products and tools influence how people think about their decisions.

According to IDG Enterprise’s Big Data report for 2014, the average organization will spend approximately $8 million on big data projects this year.5 Whether that number sounds like loose change or a king’s ransom, you want to ensure your organization’s data investments aren’t squandered.

This chapter describes some of the pitfalls that organizations face when trying to make good use of their data. The problem is seldom a lack of commitment or investment. As you learned in Chapter 2, “The Data Fluency Framework,” data fluency requires a strong foundation in all four quadrants of the framework. A weak data culture, poor individual skills, or a fragile ecosystem can doom organizations to cycles of wasted effort.

By the end of this chapter, you will be able to:

  • Identify common ways organizations struggle with data fluency
  • Understand how these struggles are aligned with weaknesses in elements of the data fluency framework

Pitfalls on the Path to Data Fluency

By itself, data isn’t enough. A singular focus on investing in technology to manage and visualize your data won’t get it done. Leaders committed to a data-driven enterprise are great . . . but still more is needed. Even a team of good people isn’t enough. While all of these factors are necessary for data fluency, the pieces alone are not sufficient to transform your organization’s use of data.

Below we discuss a few illustrative ways organizations fail to align all the factors necessary to reach data fluency. There are, of course, many ways to fail; we describe only a handful. In doing so, we hope you can draw comparisons to your own organization so that it becomes clearer how a weakness in your organization’s culture, skills, or capabilities can act as a roadblock that stifles your best efforts at data-informed decision-making.

In sharing the cases below, we also hope you will feel a little less alone as you and your organization seek greater levels of data fluency.

Report Proliferation

Reports have a way of multiplying like rabbits. Start with a perfectly useful and important report: a monthly report, sent to strategic accounts, that highlights upcoming product enhancements and past utilization metrics. Customers see the information and want to know more. The report grows. A missing metric is added, along with a detailed breakout. The report expands to the point where it gets split into separate reports, each targeted at a separate audience. Then a new executive arrives with her own perspective on the best way to present the same data. New reports are spawned, but the old ones don’t go away. Someday, somebody might still find them useful.

We’ve seen this report proliferation in all kinds of organizations—hospitals, schools, technology companies, insurance companies, and manufacturers. “If we report on everything, surely the right information will exist somewhere in a report.” Perhaps they’re right, but if no one can find what they need, everyone’s left sorting rabbits.

Even sophisticated analytics companies can scarcely control the tendency toward report carpet-bombing. We consulted for a large healthcare company that was fighting an institutional culture that always pushed for more data, more reporting, and more metrics. An important part of its services is delivering annual summary reports to clients about the costs that affect health benefits. The company’s analysts would spend thousands of hours preparing detailed PowerPoint presentations, chock-full of hundreds of “key” metrics. When it came time to present to clients, the discussions quickly dove into the details, and important recommendations were lost in a pile of minutiae. The first step in recovery is to recognize you have a problem. Our client knew this volume of information wasn’t good for the data product authors or the data consumers.

Balkanized Data

Organizations which design systems . . . are constrained to produce designs which are copies of the communication structures of these organizations.

Conway’s Law

Departments in an organization can easily become independent silos, operating with their own set of norms, conventions, and terminology. This impacts what you can do with your data and what you can understand. Almost 50 years ago, computer programmer Melvin Conway observed a similar issue with software development and coined Conway’s Law. You’ve experienced this problem if you’ve ever been on a customer service call where you give all your personal information at the start of the call and then have to give it all again every time you’re transferred.

Although Conway’s Law was partly meant in jest, it’s an accurate sociological observation that has been confirmed in studies. Communication is hard and people with different motivations and backgrounds will create a fractured design.

A similar challenge applies to data. Each department may use different data systems, terminology, processes, and conventions in its data conversations and products.

We recently experienced this in a large urban school district. The Office of Assessment and Accountability (OAA) is a data-lover’s paradise, with the latest technologies, a team of skilled data scientists, and an effective data warehouse beloved by principals and teachers alike.

However, policy changes were encouraging school leaders to think about teacher quality and performance alongside the more traditional measures of student achievement. With this in mind, members of the OAA began to coordinate with the human resources (HR) department to connect information on teachers with their students. Unfortunately, HR had no systematic way of generating electronic records on teachers. In fact, most of the records were kept as hard copies in file drawers. When HR began entering the information by hand into spreadsheets, it found that teacher identification numbers changed annually and didn’t match state identification numbers. Years of student data could not be connected back to the teachers responsible for those students.
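To make the linkage failure concrete, here is a minimal pandas sketch with hypothetical IDs and column names: because HR’s local teacher IDs share no stable key with the state IDs used in the student records, the join comes back empty.

```python
import pandas as pd

# Hypothetical HR records: local teacher IDs were reissued every year,
# so the same teacher shows up under a different ID each year.
hr = pd.DataFrame({
    "teacher_id": ["T-2013-041", "T-2014-387"],  # same person, two school years
    "name": ["J. Rivera", "J. Rivera"],
    "year": [2013, 2014],
})

# Hypothetical student records, keyed by the state's stable teacher ID,
# which HR never captured in its spreadsheets.
students = pd.DataFrame({
    "state_teacher_id": ["STX-90412", "STX-90412"],
    "student_id": ["S-1001", "S-1002"],
    "year": [2013, 2014],
})

# With no shared, stable key, the join produces nothing.
linked = hr.merge(
    students, left_on="teacher_id", right_on="state_teacher_id", how="inner"
)
print(len(linked))  # 0 -- years of student data can't be tied back to teachers
```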

Data Elitism

Working with data can require a lot of technical skill. And data can tell stories and reveal truths that an organization may not want to share broadly. Why not centralize your efforts and limit access to data to the highly trained few who can be trusted to bring order to chaos? This is the viewpoint Tom Davenport takes in his book Competing on Analytics. In his view, the best analytical organization is one that has centralized control:

Stage 5 organizations develop a robust information management environment that provides an enterprise wide set of systems, applications, and governance processes. They begin by eliminating legacy systems and old spaghetti code and press forward to eliminate silos of information like data marts and spreadsheet marts. They hunt for pockets of standalone analytic applications and either migrate them to centralized analytic applications or shut them down.

(Harvard Business Review Press, 2007)

Like an over-eager police force hunting down deviants, this IT-led vision of business intelligence focuses on control, consistency, and data management. Such an extreme approach, however, comes at the expense of the individuals who use the data. Distancing analysis from the people who must act on it leaves data producers and their products disconnected from the decision-making process. Data products aren’t trusted, and they often aren’t useful. All the problems of a command-and-control economy emerge.

We’ve seen this happen in credit card organizations where data scientists built models to predict customer behaviors, only to discover that the models clashed with the regulatory environment. The knowledge of which decisions could actually be made under that environment was in the hands of the data’s end users. Getting the data scientists to understand the real business environment took extra time and effort, and the initial set of models was useless.

The Supermodel

It’s easy to mistakenly believe that dashboards must look slick and flashy with eye-catching visualizations that sizzle with graphical luster. Unfortunately, dashboards of this type often say little, and what they do manage to communicate can be a distraction.

Banks are notorious for creating supermodel dashboards that conceal as much as they present. One large bank displays the retirement accounts of its clients with a snazzy dashboard. The landing page of the website gives you many options, including the ability to compare yourself to others. Enter a few data points (age, marital status, savings, and salary), and you’ll soon be comparing your retirement nest egg to that of “people like you.” The presentation is capped by a colorful animated pie chart showing your funds by asset class (large cap growth, global/international, and small/mid/specialty).

If you want to substantively analyze your retirement account year-over-year, you’ll have to break out your calculator and gather old PDF statements. If you want to see how much your fund grew due to contributions versus market performance, you’re on your own. Good luck finding the impact of fees on your investment returns over time.
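The arithmetic the dashboard withholds is not complicated. Below is a minimal sketch of the decomposition with hypothetical figures; it ignores the timing of contributions within the year, which a more careful method such as Modified Dietz would weight for.

```python
def decompose_growth(begin_balance, end_balance, contributions, fees=0.0):
    """Split one year's account growth into contributions vs. market performance.

    Rough model: end = begin + contributions + market_gain - fees.
    Ignores when during the year each contribution landed (a Modified
    Dietz calculation would weight contributions by time invested).
    """
    total_growth = end_balance - begin_balance
    market_gain = total_growth - contributions + fees
    return {"total": total_growth, "contributions": contributions,
            "market": market_gain, "fees": fees}

# The account "grew" $9,000 -- but $7,500 of that was the saver's own money.
print(decompose_growth(begin_balance=100_000, end_balance=109_000,
                       contributions=7_500, fees=450))
# {'total': 9000, 'contributions': 7500, 'market': 1950, 'fees': 450}
```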

These dashboards distract customers from what’s actually important and steer them toward options that work for the bank. They might look fancy, but as soon as you start asking questions, you hit a wall. Dashboards must display the right information, with a clear message, in a limited space.

Searching for Understanding

Your visions will become clear only when you can look into your own heart. Who looks outside, dreams; who looks inside, awakes.

C.G. Jung

An organization’s capability to make fluent decisions from data depends on how well the organization knows itself. Self-awareness helps you answer the difficult questions: What does success look like? Are we moving in the right direction? Who should we compare ourselves to?

For a new organization—especially one in an emerging market—it takes time to figure out what matters most. These organizations often lack focus in their data analysis, measurement, and communication while on the path of discovery. Even with the best intentions, organizations can struggle to make good use of their data as they search for the information and metrics that will align with their emerging strategy.

We experienced this with a client that was innovating in the health benefits market. This company had a new approach for helping families with healthcare issues using high-touch customer service. By giving every employee access to a health concierge, the company encouraged preventative care and supported employees with serious, long-term conditions like diabetes. Helping employees navigate the healthcare system led to lower benefits costs for employers. Despite its success, the company wasn’t sure how to explain the connection between its model and the reduction in costs.

The company leadership wanted to use data to communicate with its customers as well as to measure internal performance. Without clear links between its actions and health outcomes, it struggled to make its case even with a growing volume of data.

Presentations to clients varied from telling anecdotes to deep-dive analyses, but the company never quite demonstrated how its services led to financial impact. Internally, the organization had not yet arrived at a common data language. Different areas used different metrics to track performance, often focusing on things that were easy to measure, like call volume or customer satisfaction surveys. The focus was on activities rather than on the outcomes that affected costs.

It was obvious to us that this company was on a journey. Operating in a market that was not yet well understood, it first needed to know itself; that self-knowledge could then be expressed through its reporting, dashboards, and communications with clients.

Data Care

Sometimes data quality is a proxy for organizational discipline and morale. Do people throughout an organization really care, or are they just collecting a paycheck? Team members who actually care about an organization and its mission will go the extra mile to make sure data is correctly populated, erroneous data is fixed, and extra data needed for future reference is added. We have found issues of data care at large publicly traded companies, academic institutions, and nonprofits.

Working with the fundraising team of a large yet struggling nonprofit organization, we found both data care and organizational morale low. The fundraising team used Raiser’s Edge, a popular fundraising software platform, to capture the all-important donor information.

Gift officers were given reports listing the individuals in their region who had contributed more than $5,000 in any of the last three calendar years. Armed with a regional list, they were asked to make donor calls, visit donors personally, and raise a certain dollar amount (often five to seven times their salary). None of the gift officers reached their goals. Why? Poor data care.
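The selection rule behind those reports is simple to state precisely. A minimal pandas sketch, with hypothetical column names and years, might look like this:

```python
import pandas as pd

def major_donors(gifts: pd.DataFrame, region: str,
                 years=(2011, 2012, 2013), threshold=5_000):
    """Donor IDs in a region whose total giving exceeded `threshold`
    in any single one of the given calendar years.

    Expects columns: donor_id, region, year, amount (all hypothetical).
    """
    in_scope = gifts[(gifts["region"] == region) & (gifts["year"].isin(years))]
    totals = in_scope.groupby(["donor_id", "year"])["amount"].sum()
    return totals[totals > threshold].index.get_level_values("donor_id").unique()
```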

When trained on Raiser’s Edge, gift officers were asked to keep track of e-mails and visits with prospective and current donors. Although this information would be reported at a biweekly teleconference with the whole fundraising team, little effort was made to enter it into the company’s data product. As a result, each gift officer kept track of donor knowledge independently, and institutional knowledge was never built. When gift officers left the organization, their replacements started with little knowledge about the donors in their region. A replacement could see past gift history, but not why donors gave, what they liked, what each gift was meant to convey, or even a preferred contact number. Knowledge that would be needed for future reference was never recorded and, therefore, never accumulated for the organization.

Metric Fixation

Leaders and organizations can become mesmerized by one metric at the expense of others. Examples range from the management of a customer service call center to the derailment of a university president to an increase in hospital-acquired infections. Laser focus can be helpful in achieving a goal; however, leaders need to ensure that important things that are temporarily out of focus are still being monitored.

For example, customer service call centers can become fixated on metrics to the detriment of both their employees and their customers. Common call center metrics include average call wait time, average call length, whether the issue was resolved on the first call (that is, the customer doesn’t call back within 30 days), and customer retention (keeping customers who are trying to cancel service, even if it means moving them to a lesser subscription package). Incentives for call center employees, typically low-wage earners, are often tied to metrics like these. Consider one metric: customer retention. If customer service employees are fixated on customer retention and their livelihood depends on it, a whole host of tactics will be used to keep folks subscribed to their current service (phone, cable, internet, gym, magazine, and so on). Transferring calls to retention specialists, dropping calls, and offering a different subscription at a lower monthly cost are a few.
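Even a metric as simple-sounding as first-call resolution embeds definitional choices that someone must implement. A minimal sketch, assuming a hypothetical call log with customer_id and call_time columns, might compute it like this:

```python
import pandas as pd

def first_call_resolution_rate(calls: pd.DataFrame, window_days: int = 30) -> float:
    """Fraction of customers whose first call drew no follow-up call within
    `window_days` -- one common proxy for "resolved on the first call."

    Expects columns: customer_id, call_time (datetime64).
    """
    def resolved(times: pd.Series) -> bool:
        first = times.min()
        follow_ups = times[(times > first) &
                           (times <= first + pd.Timedelta(days=window_days))]
        return follow_ups.empty

    return calls.groupby("customer_id")["call_time"].apply(resolved).mean()
```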

Adrianne Jeffries reports on the pressure Comcast customer service employees face to retain customers and how their monthly income could be greatly affected when customers dropped services. Jeffries writes that “metrics-obsessed reps are . . . trying to reach a predetermined outcome in the call, and they’re trying to do it in under 11 minutes. Comcast has turned its customer service reps into sales reps.” Even customers who explicitly tried to cancel their service found it difficult to do so.

In another setting, the president of a private liberal arts college was hyper-focused on increasing enrollment and lost sight of other important metrics. Increasing enrollment is important for a variety of reasons, and the strategy had worked for him at a previous institution. In his mind, more enrollment meant more revenue, which would fund new programs and pay for new buildings.

Everybody at the college was soon busy increasing the number of students applying, being admitted, and enrolling. Tuition was significantly discounted in year one, via competitive financial aid and merit scholarships, to attract students. The hyper-focus led to more students in the short run. Meanwhile, little attention was paid to retention.

The college lacked the capacity to properly handle the influx: students became frustrated by the inability to get into the classes they wanted, by overcrowded first-year dormitories, and by reduced financial aid in subsequent years. Consequently, the 6-year graduation rate dropped. The revenue gained by enlarging the freshman class was soon lost to students transferring or dropping out. The board lost confidence in the president, and a leadership change was made.

Hospital leadership can also become hyper-focused on a single metric to the detriment of others. One hospital, for example, had an internal problem with hospital-acquired sepsis, a life-threatening infection. Concerned, hospital educators created competencies around hospital-acquired sepsis for all nurses. Continuing education focused on sepsis, and the chief nursing officers held lunch-and-learn sessions. Awareness was raised, and patient outcomes improved. However, although sepsis decreased, urinary tract infections increased. Attention paid to one infection type meant attention diverted from another. As one chief nursing officer lamented, “You can have your team focus only on so many things.”

In response to metric fixation, thought leaders have tried to introduce broader concepts that measure and score many underlying metrics at once. In healthcare, for example, patient experience is a newer measure that providers are beginning to track. The Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS), a survey administered to a sample of patients leaving any hospital setting and reported to the government, attempts to measure patient experience. The jury is still out on whether this will result in better health outcomes, but it is an attempt to move organizations beyond metric fixation.
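Mechanically, such a composite is typically a weighted roll-up of sub-scores. The following sketch is illustrative only: the measures and weights are hypothetical and are not HCAHPS’s actual methodology.

```python
def composite_score(scores: dict, weights: dict) -> float:
    """Weighted average of 0-100 sub-scores; weights need not sum to 1."""
    total_weight = sum(weights.values())
    return sum(scores[m] * w for m, w in weights.items()) / total_weight

# Hypothetical sub-measures and weights, for illustration only.
survey = {"nurse_communication": 82, "infection_prevention": 74,
          "discharge_information": 88}
weights = {"nurse_communication": 0.4, "infection_prevention": 0.4,
           "discharge_information": 0.2}
print(round(composite_score(survey, weights), 1))  # 80.0
```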

Finding Balance

This chapter explored a few of the ways organizations struggle to make the most of their data. Whether it is an issue with too many reports or too few, a lack of commitment to using data or a fixation on a particular metric, many organizations have found it challenging to build data fluency.

The path toward data fluency requires balance. In particular, organizations should look for the following:

  • Balance between the needs of data users and the interests of the data producers
  • Balance between letting the data speak and allowing the data author to define the message
  • Balance between the business users who make decisions using data and the technologists who manage and deliver the data
  • Balance between focusing on a few key metrics and the flexibility to change measures of success as needs evolve
  • Balance between the deep analysis of data scientists and the direct reporting that provides the most important data in a simple format
  • Balance between data presentation that puts understanding first and data products that use aesthetics to capture an audience’s imagination

The data fluency framework can help provide guidance about how to make the right choices. The following chapters explain the four quadrants of the framework and the organizational capabilities needed to avoid the struggles just described.

Notes
