Chapter 4
Analytics

The Engine of Information Monetization

Mobilink is Pakistan’s leading provider of voice and data services, based in Islamabad, with thirty-eight million subscribers communicating over nine thousand cell sites and 6,500 kilometers of fiber optic backbone. But as with most telecommunication providers, and many other kinds of businesses, its leadership is as concerned about customer churn as it is about gaining new customers. Faced with fierce competition in an exploding market, it was compelled to generate improved economic value from the data on its millions of subscribers and two hundred thousand retailers across ten thousand cities, towns, and villages. More specifically, how could it answer the age-old but increasingly important questions: How can we offer the right services to the right customers at the right time? How can we monetize all of this information, including social media data, in building customer trust, improving loyalty, and decreasing churn while maintaining profit margins?

To answer these questions, Mobilink deployed SAP’s InfiniteInsight1 predictive modeling solution against its storehouse of Big Data, resulting in better targeted and more effective promotions and campaigns, which increased customers’ usage of high margin, value-added services like text messages, ringtones, and music. The results of this effort included an eightfold increase in the uptake of customer retention offers (from 0.5 percent to 4 percent), a 380 percent boost in campaign response rates (mostly from social network analysis), and a predictive modeling approach that enables Mobilink to deploy new predictive models in less than a day.2

Unless you are licensing information directly or trading it for goods and services, it’s more likely you are monetizing it indirectly via some form of analytics or digitalization. Even then, most licensed or bartered information has some degree of analytics applied to it before it is shared. And even when licensing information, most organizations add value by generating and selling the insights or analysis instead of, or alongside, the raw information itself.

However, the evolution beyond traditional business intelligence (BI), represented by enterprise reporting and end-user query tools, has been slow to materialize in many organizations. Not only have lagging organizations lost out on the opportunity to understand their businesses and markets better, but they have squandered opportunities to generate measurable economic benefits from (i.e., monetize) their information assets. In this chapter, we will explore the case for reaching beyond business intelligence, and how embracing these ideas can lead to improved economic benefits for your organization.

Beyond Basic Business Intelligence

“From what I have been able to determine, we have over 100 distinct internal BI implementations producing some 15,000 reports, most weekly, some monthly or quarterly,” confided the new CIO of a Big-4 systems integrator. “And that’s just in the U.S.” He went on to question the value these implementations and reports generate for the organization: “I have no idea who is using them, and if they’re using them at all, for what purpose. But these systems are costing us millions, so I’m considering shutting some down just to see who, if anyone, complains.”

This story is repeated over and over in varied vernacular by IT executives I speak with, most often by those who have inherited a gaggle of data warehouses, data marts, and BI applications. Often it’s followed by a common proclamation: I’m desperate to get IT out of the report-writing business. Their real concern isn’t the cost or resources required by BI—it’s the inability to link it to discernible economic gains. I’m certain that if IT executives could attribute top- or bottom-line value propositions—or any key organizational metrics, for that matter—to these implementations, then they would be clamoring to keep them within IT.

From Descriptive to Diagnostic, Predictive, and Prescriptive Analytics

Basic BI implementations are everywhere, in every corner of every organization. They range from personal spreadsheets to financial and production reports to executive dashboards. The sprawl of these applications, particularly as a result of commonplace data warehouse implementations over the past twenty years, no doubt has improved enterprise transparency and influenced improvements in productivity, customer and partner relationships, and compliance. But slicing and dicing and reporting on information, in most cases, has fallen woefully short of producing measurable economic benefits. And where these implementations may have been measurable, few organizations have actually measured them, other than with poor proxies of value such as “user satisfaction.” Ultimately, BI and data warehousing have become a significant IT cost sink, in many instances with only soft, unmeasured business benefits acknowledged. Yet the inertia to continue generating hindsight-oriented reports and dashboards is a function of having chased after them for the past twenty or thirty years.

Typical BI implementations tend to be far removed from actually monetizing data, or from connecting the dots between them and top- or bottom-line value propositions. More advanced analytic capabilities that generate actionable insights, predictions, or explicit recommendations can be connected more readily to economic improvements. Therefore, there’s an imperative for organizations to reach for advanced analytics capabilities (Figure 4.1).

However, advanced analytics initiatives tend to be more challenging and vocational—that is, targeted at a particular business problem or opportunity rather than “the enterprise.” And because information reporting and exploration continue to serve a valid purpose in organizations (especially with the emergence of self-service BI), it can be helpful to consider BI and advanced analytics as distinct initiatives, each with its own value proposition, staff, and technologies.

Figure 4.1 Gartner Analytic Ascendancy Model


The Advanced Analytics Advantage

Fifty years after Coca-Cola acquired the Minute Maid brand in 1965, a customer called to complain about the consistency of orange juice from week to week and from one season to another. Except this wasn’t just any customer. It was a popular fast food restaurant with over two thousand locations, and it threatened to switch to another “orange drink” provider.3

If there’s anything Coca-Cola prides itself on, it’s the quality and consistency of its products. Oh, and not losing customers. So this call set into motion a project to determine just how to establish orange juice consistency regardless of the season, supply chain disruptions, or regional orange prices and availability. Our job was to “take Mother Nature and standardize it,” said Jim Horrisberger, director of procurement at Coke’s massive juice packaging plant an hour south of Disney World. “Mother Nature doesn’t like to be standardized.”4

But that’s just what Coke did. With the help of Revenue Analytics, a consultancy that also helped Delta Air Lines maximize its revenue per mile, “We basically built a flight simulator for our juice business,” said Doug Bippert, Coke’s VP of business acceleration. The super-secret methodology, called “Black Book,” analyzes a quintillion decision variables, including weather patterns, anticipated crop yields, and cost pressures, plus the six hundred flavors Coca-Cola scientists determined comprise an orange. Then the algorithm dictates how to blend and batch oranges, based on their flavor analysis, for consistent juice.5

Now, Bippert said, “If we have a hurricane or a freeze, we can re-plan the business in five or ten minutes because we’ve mathematically modeled it.” So the next time you drink orange juice, remember you’re drinking an algorithm.6

Whereas BI implementations typically are appropriate for informing business managers of performance indicators, advanced analytics implementations can provide far-reaching organizational benefits over basic BI. The requisite BI tool/platform features of data aggregation, summarization, selection, slicing, drilling, and charting are tuned for presenting information interesting to users, but not necessarily information important to optimizing business processes. Today, only strategic decisions (and not even all of them) may be made at a rate slower than the speed of business. Tactical and operational decisions increasingly must be made faster than humans are capable of making them.

Despite continued corporate reliance on spreadsheets, analytics is now a core competency in most organizations. However, most implementations—particularly enterprise implementations—entail basic decision support solutions for business managers and executives. In addition, much information is still left to interpretation, pockets of analytics result in information clashes between departments, and many users choose not to rely on analytic output to guide their decisions and behavior, or at least limit their reliance on it. Resolving this demands producing analytic results beyond simple summarizations, and delivering those results directly to business processes, not necessarily to people. Directing analytic output merely at eyeballs continues to be one of the great fallacies and limitations of BI.

However, advanced analytics in all its variety of instantiations (for example, data mining, prediction, artificial intelligence, complex-event processing, visualization, and simulation) can be not only difficult to implement, but also difficult to articulate and coordinate. This is why even the most well-intentioned analytic initiatives can too easily devolve into a pedestrian presentation of lagging and loosely relevant performance indicators.

Ultimately, the keys to generating measurable economic value with information involve implementing forms of advanced analytics focused upon:

  • Exploiting Big Data. Increases in the volume, velocity, and variety of information assets are a boon for data monetization. Analytic capabilities have struggled to keep up with the size, speed, and range of available information. But organizational capabilities are just as important.
  • Improved and actionable decision making. Analytic technologies and solutions can drive improved and actionable decision making, leading to increased business performance. Solutions for decision making tend to embrace complexity, including sophisticated scenarios. And they frequently involve improving or automating business processes, or reducing risk.
  • Identifying monetizable insights. In addition to classic decision making and automation, monetizing information through advanced analytics can be achieved through deeper understanding of processes, people, and markets; improved foresight; and innovating products and services.

While many of the ideas you conceive will overlap some of the above, each of these should be a starting point for exercises in identifying ways to generate improved economic benefits from available information. Ask yourself and others in your organization: How can we exploit the characteristics of Big Data? How can we use information to make better/faster decisions? What questions can we ask now that couldn’t be addressed earlier? The answer to each of these most certainly will lead to opportunities for monetizing your (and others’) information assets.

Exploiting Big Data

At the beginning of the millennium, I started to notice a set of rising information management challenges among clients. These challenges related to the emergence of e-commerce, an increasing number of documents, images, and other unstructured data types, and electronic interactions among businesses. While many clients were lamenting and many vendors were seizing the opportunity of these fast-growing data stores, I also realized that something else was going on. Sea changes in the speed at which data was flowing, mainly due to electronic commerce, along with the increasing breadth of data sources, structures, and formats due to the Y2K-centered ERP application boom, were at least as challenging to data management teams as the increasing quantity of data. I defined these three-dimensional challenges as a confluence of a rising volume of data, increasing velocity of data, and widening variety of data.7 These “three Vs,” as they’re known, formed the basis for understanding and defining what today is known as “Big Data.”8 While most information management professionals and nearly all vendors obsess over the size (volume) of information, this represents only one facet of magnitude (i.e., bigness). And additional “Vs” which others have cleverly (?) posited, such as “veracity” or “value” or “vulnerability,” are neither measures of magnitude nor defining characteristics of bigness.9

Regardless of how one defines Big Data, it represents a fertile resource for monetizing, sometimes by licensing it to or trading it with others, but more often by analyzing it and executing on the resulting insights. Not just reporting on it, but actually analyzing it. Merely reporting on Big Data yields the same pie charts, bar charts, and tables you would get by analyzing a sample of smaller data—albeit with perhaps some improved confidence. Increased data volume can improve analytic accuracy, increased data velocity can improve analytic precision, and increased variety can improve analytic completeness.
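
To make the volume-accuracy point concrete, here is a minimal Python sketch of my own (not drawn from any implementation in this chapter): the confidence interval around an estimated conversion rate narrows as the number of observations grows, roughly with the square root of the sample size. The true rate and sample sizes are invented for illustration.

```python
# A minimal sketch, assuming a hypothetical conversion-rate estimate:
# larger samples narrow the confidence interval around the estimate,
# illustrating how data volume improves analytic accuracy.
import math
import random

random.seed(42)
TRUE_RATE = 0.04  # hypothetical true conversion rate

for n in (1_000, 100_000, 1_000_000):
    conversions = sum(random.random() < TRUE_RATE for _ in range(n))
    estimate = conversions / n
    # Normal-approximation 95% confidence interval for a proportion
    margin = 1.96 * math.sqrt(estimate * (1 - estimate) / n)
    print(f"n={n:>9,}  estimate={estimate:.4f}  +/- {margin:.4f}")
```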

Monetizing the Increased Volume of Information

With regard to Big Data volumes, basic BI solutions tend to be challenged. But advanced analytic solutions, particularly data mining, machine learning, and predictive applications, flourish with large datasets. The larger the dataset, the easier it is to uncover hidden, subtle patterns and trends, and therefore opportunities which are likely to evade the competition.10 In addition, subtle signals are magnified by large datasets, and a larger breadth of dimensions can be established to discern monetizable business correlations. If the goal of a business process is to increase the probability of a customer transaction happening, then having a wealth of data about, for example, that customer, similar customers, market/shopping conditions, and product and upstream supplier inventories, along with subtransactional data11 about the customer’s real-time shopping behavior (online or in-store), can be used by advanced analytic applications.
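
As a hedged illustration of the kind of scoring such applications perform, the sketch below fits a simple logistic regression to estimate the probability of a customer transaction. Every feature name, coefficient, and data point is invented; a production model would draw on the far richer inputs just described.

```python
# A sketch of transaction-probability scoring on synthetic data.
# All features and relationships here are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 50_000
X = np.column_stack([
    rng.poisson(3, n),       # hypothetical feature: prior purchases
    rng.exponential(5, n),   # hypothetical feature: session minutes
    rng.poisson(1, n),       # hypothetical feature: items in cart
])
# Synthetic ground truth: purchase odds rise with each signal
logits = -3 + 0.2 * X[:, 0] + 0.05 * X[:, 1] + 0.8 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logits))

model = LogisticRegression(max_iter=1000).fit(X, y)
shopper = [[5, 12.0, 2]]  # a live shopper's signals
print(f"purchase probability: {model.predict_proba(shopper)[0, 1]:.2f}")
```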

One excellent example of a company taking advantage of massive volumes of data is Vestas. Founded in 1898 as a blacksmith shop in western Denmark, Vestas has breezed into a global leadership position in the wind turbine business with over 75 gigawatts’ worth of generating capacity worldwide.12 To optimize the placement of wind turbines, Vestas thus far has collected about 20 petabytes of weather data. This includes information on temperature, barometric pressure, humidity, precipitation, and wind direction and speed, from ground level up to 300 feet at the tip of the blades. Feeding this information into an IBM supercomputer has enabled Vestas to reduce wind forecast modeling time from three weeks down to fifteen minutes, and to precisely optimize the placement of turbines within a 10-square-meter grid—whereas previously it could only optimize placement within a 27-square-kilometer grid. With the ability to develop turbine sites one month faster, Vestas installs one turbine every three hours somewhere around the world, passing along these savings and margins to customers, employees, investors, and product development.1314

Monetizing the Increased Velocity of Information

Recall the earlier Lockheed Martin project insights example: the increased velocity and variety of available project information enables the company to anticipate complex design and development program issues much earlier, thereby saving millions of dollars in cost overruns and rework.

Typical BI solutions, in which analytic output is directed at people rather than processes, can falter as the speed of data inputs ratchets up. Increased information velocity implies an increased speed of business. At even low levels of velocity, humans become incapable of ingesting and using information efficiently, at which point business process effectiveness can suffer. This is the point at which advanced analytic applications are required that consume and respond to swift streams of data (think of algorithmic stock trading). These applications can either make recommendations to users responsible for administering an operational business process or be integrated with the business process applications themselves.
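
The following sketch gives a rough flavor of such machine-speed stream processing: it tracks an exponentially weighted moving average over a simulated tick stream and reacts the moment a reading deviates sharply from the recent trend. The feed, threshold, and smoothing factor are my own illustrative assumptions, not any particular trading or monitoring system.

```python
# A minimal stream-processing sketch: flag ticks that deviate sharply
# from an exponentially weighted moving average (EWMA) of the feed.
import random

random.seed(7)

def tick_stream(n=500):
    """Simulated high-velocity feed (a stand-in for market or sensor data)."""
    price = 100.0
    for _ in range(n):
        price += random.gauss(0, 0.2)
        if random.random() < 0.01:
            price += random.choice([-5, 5])  # occasional shock
        yield price

ewma, alpha = None, 0.1  # illustrative smoothing factor
for i, price in enumerate(tick_stream()):
    if ewma is not None and abs(price - ewma) > 2.0:  # illustrative threshold
        print(f"tick {i}: {price:.2f} deviates from trend {ewma:.2f}; acting")
    ewma = price if ewma is None else alpha * price + (1 - alpha) * ewma
```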

Monetizing the Increased Variety of Information

While many BI solutions can report on information from multiple sources on demand, or make use of integrated data in data warehouses or marts, they don’t truly take advantage of this diversity of data. A query tool can readily co-present data from differing sources, but tends to falter at correlating them in ways beyond standard regression analysis. Advanced analytics technologies, however, specialize in relating multiple data sources to discover unforeseen relationships, such as matching employee candidate qualifications to the skill sets and experience of an organization’s best employees.
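
One simple way to relate such varied sources, sketched below with invented skills and scores, is to represent each person as a vector over a shared skill vocabulary and rank candidates by their cosine similarity to a profile averaged from the organization's top performers.

```python
# A hedged candidate-matching sketch: rank candidates by cosine
# similarity to a top-performer skill profile. All names and scores
# are invented for illustration.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Shared vocabulary: python, sql, statistics, negotiation, logistics
top_performer_profile = [0.9, 0.8, 0.7, 0.2, 0.1]  # averaged from HR data

candidates = {
    "candidate_a": [0.8, 0.9, 0.6, 0.1, 0.0],
    "candidate_b": [0.1, 0.2, 0.1, 0.9, 0.8],
}
ranked = sorted(candidates.items(),
                key=lambda kv: -cosine(kv[1], top_performer_profile))
for name, vec in ranked:
    print(name, round(cosine(vec, top_performer_profile), 3))
```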

By a two-to-one margin over the issues of information volume and velocity, IT and business professionals contend that the variety of information is not only the greatest information management challenge, but also the biggest opportunity for generating economic benefits from information.1516

This is why organizations like Coca-Cola, discussed earlier, find huge measurable benefits in integrating information on crop yields, cost pressures, GPS data, and the six hundred flavors comprising an orange to optimize orange juice production consistency.

Even public sector organizations can find ways to monetize a variety of information. Pinellas County in Florida implemented the Centers for Disease Control’s Youth Risk Behavior Surveillance System (YRBSS), which monitors six types of health-risk behaviors considered leading causes of death and disability among youth and young adults. It integrates data from systems which track injuries and violence, sexual behavior and STD rates, alcohol and drug use, tobacco use, dietary behaviors, level of physical activity, and the prevalence of obesity and asthma. The youth risk index is then used to implement one or more programs within three areas: parental engagement and parenting practices, school connectedness, and/or protective services. Through improved understanding of these issues and the factors which drive them, Pinellas County has been able to make strides in reducing the unfortunate expenses associated with them.17

The increased information volume, velocity, and variety characteristic of Big Data can also help magnify and identify—and thereby monetize—hidden nuggets of insight.

Improved and Actionable Decision Making

Analytic technologies and solutions can drive improved and actionable strategic, tactical, and operational decision making, leading to business performance gains that deliver tangible economic benefits. Solutions involving decision making often are targeted at situations in which the number of actors, the amount of information, the number of information sources, and the urgency or importance of potential outcomes are great. As such, they often involve improving or automating business processes or reducing risk, not merely informing individuals.

Embracing Complexity, Unexpected Activity, and Changing Conditions

In addition to embracing complexity, advanced analytic solutions can handle unexpected data and dynamic business/market conditions more effectively. Machine learning algorithms, and neural networks in particular, make sense of more randomized inputs, adeptly dealing with outliers or information that doesn’t fit otherwise prescribed formats, categories, or time frames.

Until recently, an Australian insurance company had segmented customers on six dimensions using an off-the-shelf BI tool. This led to insurance products and marketing campaigns which failed to differentiate the company or gain market share. After supplementing its BI implementation with an initial deep analysis of customer data, resulting in an awareness of more than one hundred significant dimensions (including artificially derived ones), the company was able to identify and target gaps in the market quite easily.

One of the most dynamic activities in any business is the configuration and evolution of complex products. At some point NCR business leadership realized its ATMs were offered with tens of thousands of configuration options. Dealing with this kind of SKU proliferation on top of increasing product complexity adversely affected the sales cycle. Across the value chain, NCR experienced a lack of alignment between customer demand, solutions management, operations, and sales. So with the help of analytics software from Emcien, it produced a demand-shaping pattern analysis for determining the optimal number of product configuration options, resulting in a $110 million bump in revenues and a 5 percent increase in sales efficiency.18

Optimizing Business Processes

Ultimately, any form of information monetization is the result of some business process or combination of business processes. BI tools generally are standalone with respect to the business processes that they support. Even when embedded into business applications, they tend to present charts or numbers in an application window. Ideally, output is updated to reflect the user’s activity and needs, but less often is it used to affect the business process directly. Evolving to complex-event processing solutions, recommendation engines, rule-based systems, or artificial intelligence (AI), combined with business process management and workflow systems, can help to optimize business processes more directly, either supplementing or supplanting human intervention.

Case in point: a company formed from a collection of shopping stalls in 1919 by an English trader named Jack Cohen today has hardwired its thousands of refrigeration units to a data warehouse. By 1996, Cohen’s company would become the United Kingdom’s largest retailer, collecting £1 of every £7 spent in the U.K.19 Tesco now has over 3,100 stores across the U.K., Europe, and Asia.

“Ideally, we keep our refrigerators at between -21 and -23 degrees Celsius, but in reality we found we were keeping them colder,” said John Walsh, Tesco’s energy and carbon manager for Ireland. “That came as a surprise to us.” In a project collaborating with IBM Dublin’s research laboratories, Tesco worked with refrigeration manufacturers to collect data from in-store controllers and feed it into a dedicated data warehouse. The system takes unit readings twenty times per minute and overlays the analytics on a Google map illustrating their performance. Each unit produces seventy million data points per year. With sophisticated statistical processing and machine learning, the system can identify refrigeration units which are performing suboptimally, even while knowing nothing about how refrigerators should perform. As a result, upon complete rollout, Tesco expects to deliver €20 million per year to its bottom line via a 20 percent reduction in refrigeration energy expense, and to eliminate 7,000 tons of CO2.20
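
Tesco's production system is far more sophisticated, but the core unsupervised idea can be sketched simply: with no model of how a refrigerator should behave, flag units whose readings are statistical outliers relative to the rest of the fleet. The simulated readings and three-sigma threshold below are my assumptions.

```python
# A minimal unsupervised anomaly-detection sketch over simulated
# fleet temperatures; the 3-sigma cutoff is an illustrative assumption.
import random
import statistics

random.seed(1)
# Simulated mean temperature per unit; most near -22 C, two drifting
fleet = {f"unit_{i}": random.gauss(-22.0, 0.4) for i in range(200)}
fleet["unit_13"] = -25.1   # overcooling: wasted energy
fleet["unit_77"] = -18.2   # undercooling: spoilage risk

mean = statistics.fmean(fleet.values())
sd = statistics.stdev(fleet.values())
for unit, temp in sorted(fleet.items()):
    z = (temp - mean) / sd
    if abs(z) > 3:
        print(f"{unit}: {temp:.1f} C (z={z:+.1f}), flag for inspection")
```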

Automating Governance, Risk, and Compliance

Even before mandatory governance, risk, and compliance (GRC) reports are produced—typically with BI or corporate performance management platforms—organizations should have the means to identify potential, imminent issues before they occur. Advanced analytic applications, tied to data streams consisting of leading GRC indicators, can make business process owners aware of issues before trailing indicators surface them.

Deeper analytic capabilities are needed to uncover the ultimate source of discovered GRC or fraud issues through the forensic examination of transactions and other business activity, including contrasting those activities with an understanding of written contracts. Additionally, as a matter of course during prescribed audits, robust analytic tools can go beyond the manual sampling still done by most auditors today to assess every record fully.

One of the biggest risks anywhere is driver fatigue, especially for those driving big rigs. “Trucking and logistics is an extremely challenging and competitive industry,” explains Lauren Dominick, Predictive Modeler at FleetRisk Advisors, a 25-person commercial fleet consultancy based in Alpharetta, Georgia. “These companies need to be able to deliver goods reliably and on schedule, while ensuring driver safety and avoiding accidents that can cause delays, damage cargoes, and even cost lives.” Dominick also describes how the work environment is stressful with an average employee turnover rate exceeding 100 percent annually.21

The idea behind FleetRisk Advisors is one of pure information monetization: that companies have a mountain of unused information (dark data) about their drivers and vehicles. Originally, FleetRisk’s process suffered under the strain of manual calculations and basic types of reporting. Today, it automatically compiles near real-time telematics data from each driver’s electronic logbook and integrates it with employee information from other systems. With this integrated information asset, FleetRisk analyzes drivers’ pay versus one another and industry averages, and combines this with other stress factors and employment history. As a result, it has been able to help reduce employee attrition by more than 30 percent, saving on training and other hiring-related costs. More important, explains Dominick, “Across our client-base, we’re seeing a minimum of 20 percent reduction in the overall accident rate, and an 80 percent reduction in severe accidents such as rollovers, driving off the road or rear-end collisions. By identifying the risk factors—especially those that contribute to fatigue—we help our clients to intervene before accidents happen.”22

Enhancing Scenario Planning

Even beyond forecasting and other kinds of prediction, advanced analytic capabilities can include comparative risk-reward assessments (for example, Monte Carlo simulations). This is the realm of sophisticated economic modeling. Yet most business leaders are barely aware of such methods, let alone trained in them. It is becoming increasingly critical for organizations to embrace advanced analytical solutions, like scenario planning, that can be used to drive optimal organizational strategies and performance.
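
For readers unfamiliar with the method, here is a minimal Monte Carlo sketch: it simulates two hypothetical pricing strategies under uncertain, price-sensitive demand, then compares expected profit against downside risk. Every distribution, price, and cost is invented for illustration.

```python
# A Monte Carlo risk-reward sketch comparing two invented strategies.
import random
import statistics

random.seed(3)

def simulate(price, unit_cost, fixed_cost, trials=100_000):
    """Simulate profit under uncertain, price-sensitive demand."""
    profits = []
    for _ in range(trials):
        # Illustrative demand model: mean falls as price rises
        demand = max(0.0, random.gauss(20_000 - 1_200 * price, 3_000))
        profits.append(demand * (price - unit_cost) - fixed_cost)
    profits.sort()
    return statistics.fmean(profits), profits[int(0.05 * trials)]

strategies = {"premium": (12.0, 5.0, 15_000), "volume": (8.0, 5.0, 5_000)}
for name, params in strategies.items():
    mean, p5 = simulate(*params)
    print(f"{name}: expected profit {mean:>10,.0f}; 5th percentile {p5:>10,.0f}")
```

Comparing the fifth-percentile outcomes, not just the means, is what distinguishes a risk-reward assessment from a simple forecast.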

One planning process we all know could use some improvement is the scheduling of airline gate and ground personnel. But the problem isn’t with the personnel themselves—rather, it’s with their scheduling. It’s a little-known fact that estimated times of arrival (ETAs) are provided by the pilots themselves. Ten percent of the time these estimates are off by more than ten minutes, and 30 percent of the time they’re off by more than five minutes. Pilots may know their speed, altitude, direction, and distance to the airport, but they lack other detailed information and the analytic capability to provide more accurate estimates.

But a system developed by PASSUR and being installed at airports around the world analyzes information about weather, flight schedules, and historical flight data, plus data feeds from its network of passive radar stations installed near airports. This immense multidimensional body of information enables the computation of refined ETAs by evaluating current conditions against prior “sky scenarios” to model and balance likely outcomes and risks. At airports where this system is in place, the improved efficiency of personnel scheduling and throughput can yield several million dollars per year. And at such outfitted airports, United Airlines claims its fleet is avoiding two to three airport diversions per week.2324 With each diversion costing up to $200,000, sky scenario planning is a significant way to monetize this information.
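
PASSUR's methodology is proprietary, but the "sky scenario" idea can be approximated with a nearest-neighbor sketch: estimate a refined ETA by averaging the outcomes of the most similar historical approaches. The features and records below are entirely invented.

```python
# A nearest-neighbor "sky scenario" sketch with invented features:
# (headwind in knots, arrivals queued, miles out) -> minutes to gate.
import math

history = [
    ((30, 12, 80), 24), ((5, 3, 80), 15), ((25, 10, 75), 22),
    ((10, 4, 85), 17), ((35, 14, 90), 28), ((8, 2, 70), 13),
]

def refined_eta(current, k=3):
    """Average the outcomes of the k most similar past approaches."""
    nearest = sorted(history, key=lambda rec: math.dist(rec[0], current))[:k]
    return sum(minutes for _, minutes in nearest) / k

print(f"refined ETA: {refined_eta((28, 11, 82)):.1f} minutes")
```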

Identifying Monetizable Insights

In addition to classic decision making and automation, monetizing information through advanced analytics can be achieved through deeper understanding of processes, people, and markets; improved foresight; and innovating products and services.

Understanding Unstructured Information

BI technologies and implementations are inherently hard-coded to accept certain kinds of data from known sources in understood formats. Advances and extensions in BI technology have made it possible to perform rudimentary analysis on unstructured text. But truly analyzing it, using language- and syntax-aware algorithms, requires graduating to advanced analytic technologies which specialize in this. And beyond written text, sources of enterprise intelligence and performance increasingly are in even more complex formats, such as audio and video. As discussed in chapter 2, much of this unstructured information lies dormant in the form of what’s called “dark data.” Much like the dark matter of the universe, we know dark data is there because it exerts a gravitational effect on our business and operations even as it remains unused and unmeasured.

As I have long advised clients, there are only three ways to generate economic benefits from unstructured information (or content) before you structure it: you can read it, search it, and sell it. But the process of extracting most forms of value from unstructured information, even the process of categorizing or parsing content, implies some form of pre-processing. Maximizing the economic value of most content involves analyzing it in some way, either to identify patterns or structure it so as to render it as input to some automated process.

This is just what the Birmingham, Alabama-based insurer Infinity Property and Casualty did to identify indicators of fraudulent insurance claims. Infinity’s senior vice president of claims operations, Bill Dibble, doesn’t believe in “silver bullets,” but he had an idea: insurance claims could be scored using predictive analytics in much the same way as credit applications. While Infinity already had a rudimentary system which screened questionable claims based on “red flags,” it still required quite a bit of manual intervention, slowing down the claims process and affecting customer service. Moreover, its hard-coded flagging process had trouble catching emerging fraud patterns. However, by text mining the content of previous claims known to be fraudulent, Infinity matches these patterns to the content of police reports, medical files, and other accident-related documents. These patterns of “narrative inconsistency” indicate probable fraud.

With this new predictive claims analytics system in place, Infinity’s success rate in pursuing fraudulent claims jumped from 55 percent to 88 percent, and in the first six months of operation the system increased subrogation recovery by $12 million. In addition, it has reduced its cost of claims adjustment, leading to improvements in its Loss Adjustment Expense (LAE) ratio. As Dibble exclaimed, “With predictive analytics, we were basically able to close a hole in our pocket where money was leaking out steadily.”2526
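
Infinity's system is of course far richer, but the underlying text-mining pattern can be sketched as follows: learn word patterns from past claims labeled fraudulent or legitimate, then score new claim narratives against them. The tiny corpus below is invented; a real system trains on thousands of adjudicated claims.

```python
# A hedged claim-scoring sketch: TF-IDF features plus logistic
# regression over an invented corpus of claim narratives.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

claims = [
    "vehicle stolen from lot no witnesses police report filed late",
    "rear ended at light minor bumper damage photos attached",
    "total loss fire shortly after coverage increase no receipts",
    "hail damage to roof and hood claim filed next morning",
]
labels = [1, 0, 1, 0]  # 1 = known fraudulent, 0 = legitimate

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(claims, labels)

new_claim = ["car stolen no witnesses coverage increase last week"]
print(f"fraud score: {model.predict_proba(new_claim)[0, 1]:.2f}")
```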

Identifying Faint Signals

Competitive advantage, and sometimes saving lives, increasingly relies on the ability to identify and monetize faint signals. These are signals that present themselves either infrequently (for example, fraud or missing input) or in subtle patterns (for example, a supplier quality glitch, inventory leakage, or underserved market subsegments). However, human fingers and eyes clicking through and viewing data with a BI tool can all too easily miss these signals. We are at a tipping point in the evolution of the Information Age: decision making is increasingly automated, handled by machine-based models far more effectively than any human could manage.

Illustrating these faint signals is the web crawling and analytic technology behind HealthMap, a system that really got on the map in 2014 when it identified a cluster of “mystery hemorrhagic fever” cases in Guinea. It wasn’t until ten days later that the World Health Organization, having been notified by the health ministry of Guinea a day earlier, announced an outbreak of Ebola. Developed by researchers, epidemiologists, and software developers at Boston Children’s Hospital, HealthMap scours billions of posts from tens of thousands of social media sites, local news, government publications, infectious-disease physicians’ online discussion groups, RSS feeds, and other sources to detect, plot, and track disease outbreaks. Every hour the application repeats the process, analyzes the text in fifteen languages, filters out noise such as “an outbreak of home runs” or “Justin Bieber fever,” and determines the geographic location referenced in the content. It also incorporates a machine learning algorithm which improves over time—matching confirmed disease outbreaks with the information it collected on them, and learning to anticipate their spread.

“Getting case information and data of very specific epidemiology, of outbreaks, and fit that into disease models, where it might spread to, how bad it might be, then integrate it with climate data or transportation data, all of a sudden it’s data you wouldn’t necessarily have access to,” said John Brownstein, epidemiologist and HealthMap co-founder. The ability to quickly identify outbreaks can help local health organizations respond to and contain them faster, not just saving lives, but also saving these organizations and governments money and resources.

To support the cash-strapped WHO in deploying scarce resources, HealthMap continued to track the disease across nearby West African countries.2728 Currently, HealthMap is helping track and now predict the spread of the mosquito-borne Zika virus outbreak.29

Evolving to Insight and Foresight

BI tools are designed to present historical data, not so much to assist in looking into the future. Slicing, dicing, and drilling data, as is the specialty of BI tools, also can offer users insight into not just what occurred but why it occurred. But ask any stock trader, for example, which is more monetizable: knowing the past, or anticipating the future. The answer is clear.

True, even commonplace spreadsheet software includes a host of statistical formulae and chart types to expose and illustrate trend lines. But this is where spreadsheets stop: advanced analytic technologies greatly exceed basic BI tools in the visualization, pattern-matching and correlative algorithms, predictive modeling techniques, simulation, forecasting, and scenario-planning capabilities they make available to organizations.

Just ask Mike Higgins, manager of sales, analysis, and strategy at Advanced Drainage Systems, whether it’s better to know the past or have a jump on the future. Headquartered in Hilliard, Ohio, Advanced Drainage Systems (ADS) is the world’s largest manufacturer of products for stormwater management and sanitary sewer applications. Previously, it would draw upon reports from operational, sales, and marketing systems such as its Oracle ERP implementation to estimate forecasts, do planning, and set budgets. Then it evolved to pulling this data into an Excel spreadsheet.

“We look at how sales and order activity is performing currently, match that with a five-year historic trend of the seasonality of the business on a month-to-month basis, and produce a rolling twelve-month forecast,” said Higgins. To supplement and tweak the model, “We seek input from our sales managers, based on what they know and data from field intelligence about different types of business activity,” he explained. But that wasn’t enough. Forecasts were still less than 80 percent accurate, often either leaving money on the table or product in the warehouse.
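
The spreadsheet method Higgins describes can be sketched in a few lines: derive monthly seasonality indices from five years of history, then project a rolling twelve-month forecast from the current run rate. The history values below are invented, and a real model would also fold in the field intelligence he mentions.

```python
# A seasonal rolling-forecast sketch over invented monthly history.
import statistics

# Five years of monthly sales (summer-peaking, slowly growing)
history = [[100 + 30 * (m in (4, 5, 6, 7, 8)) + y * 5 for m in range(12)]
           for y in range(5)]

annual_means = [statistics.fmean(year) for year in history]
# Seasonality index: average ratio of each month to its year's mean
season = [statistics.fmean(history[y][m] / annual_means[y] for y in range(5))
          for m in range(12)]

current_run_rate = 135.0  # latest deseasonalized monthly level
forecast = [current_run_rate * s for s in season]
print([round(f, 1) for f in forecast])
```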

The next step of ADS’s analytic evolution involved working with the advanced analytics company Prevederé Software to assess over one million external datasets for their relevance in enhancing the forecasts. With this access to weather data, global and local economic data, and building and construction industry data, ADS identified external leading indicators for its business, creating over 150 predictive models in six weeks. By stripping the bias out of its forecasting and staying heads-up to a multitude of external factors, ADS improved its monthly forecast accuracy to 98 percent, leading to a 26 percent increase in monthly year-over-year revenue and a 20 percent growth in monthly sales.30

Instigating Innovation

Beyond basic BI, a market is emerging for analytic solutions that drive business innovation. The advanced analytic capabilities previously mentioned—identifying new patterns, performing simulations, and exposing weak signals—can be used to spur novel ways of looking at the business and its operations, strategies, emergent customer needs, and market shifts. As organizations struggle to squeeze yet more cost out of the business, the upside that innovation offers can—and should—be driven by a deeper and more exploratory analysis of available data, both internal and external.

Clinical pathways themselves are a fairly recent innovation in the history of medicine. Clinical pathways (or “carepaths”) are used by health care providers to prescribe the set or cycle of recommended tests, treatments, and waiting or recovery periods for a particular patient or type of medical condition. According to OpenClinical.org:

Clinical pathways were introduced in the early 1990s in the UK and the USA, and are being increasingly used throughout the developed world. Clinical pathways are structured, multidisciplinary plans of care designed to support the implementation of clinical guidelines and protocols.31

As databases of electronic medical records (EMRs) quickly replace years of pen-and-paper-based diagnoses, lab results, surgical records, and notes from doctors, nurses, physical therapists, etc., this wealth of data is fueling additional innovations in health care—particularly its mining to develop new or improved clinical pathways. The imperative is there too. In the U.S., Medicare, Medicaid, and the Affordable Care Act (Obamacare) each incentivize health care providers for the quality of care, and disincentivize repeat visits by a patient for the same ailment.

With this kind of innovating in mind, Mercy Hospitals, one of the top five health systems in the U.S., worked with advanced analytics provider Ayasdi to identify situations in which clinical variation could be narrowed, and narrowed toward protocols which provided better outcomes. The topological data analysis solution, leveraging existing computational biology algorithms and machine learning, surfaces groups of similar patient procedures and generates clinical pathways that result in improved patient outcomes and reduced expense. It pinpoints the treatments, nursing orders, patient comorbidities, prescriptions, and even equipment that contributed the most to variations in cost and outcomes. And it can be used to proactively model the impact a care path change will likely have.3233 According to Mercy’s VP of clinical informatics, Dr. Todd Stewart, the adoption of clinical pathways by Mercy physicians increased more than 20 percent, clinical pathways are deployed more quickly, and over forty new pathways have been defined covering 80 percent of all clinicians. All told, Mercy is saving more than $18 million per year, a figure that is increasing as more pathways are incorporated.34
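
Ayasdi's solution rests on topological data analysis; as a heavily simplified stand-in, the sketch below clusters encounters for a single procedure by cost and length of stay, surfacing a low-cost, short-stay group as a candidate source for a pathway. The data and features are invented.

```python
# A simplified clustering sketch (k-means, not topological data
# analysis) over invented patient-encounter data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
# Simulated encounters: [total cost ($ thousands), length of stay (days)]
groups = [rng.normal([18, 4], [2, 0.5], (100, 2)),   # one care pattern
          rng.normal([26, 6], [3, 1.0], (100, 2)),   # another pattern
          rng.normal([40, 9], [5, 1.5], (30, 2))]    # high-variation care
encounters = np.vstack(groups)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(encounters)
for c in range(3):
    members = encounters[km.labels_ == c]
    print(f"cluster {c}: n={len(members)}, mean cost "
          f"${members[:, 0].mean():.0f}k, mean stay {members[:, 1].mean():.1f} days")
```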

From Information Monetization to Information Management

Throughout this section we have explored ways of understanding, conceiving, developing, and implementing ideas for monetizing information. We have broken down the conceptual barriers to thinking about ways to generate economic benefits from available information assets—yours and those of others. We have taken a look at how dozens of organizations spanning nearly every industry and geography are monetizing information. And we have stepped through a basic process for making it happen in your organization.

However, there’s one big roadblock to monetizing information at an enterprise level. If you noticed, nearly all of the examples throughout this section are one-off, functionally targeted ideas and implementations. No doubt some new businesses during the past 10–20 years have been based on an economic architecture of monetizing information. As Gartner analysts Saul Brand and Dale Kutnick wrote about the novel concept of economic architecture, “Enterprises must find new digital business opportunities, driven by macroeconomic and microeconomic forces. These will enable them to modify their income statements and balance sheets to improve capital deployment and, ultimately, to restructure their businesses.”35

But few other organizations have been able to complete the transformation from a fixation on the tired triumvirate of people-process-technology to a digital, information-driven business. Why is this? Because they still fail to manage their information with the same discipline as their traditional assets—those assets represented in black and white on their financial statements. Many IT and business executives talk about information as an asset, and may even have the phrase “information is an asset” included in their data strategy documents or data governance principles. But for the most part, such declarations are just a meme, lip service, or—at best—a whiff of a vision for information. To me, increasingly they’re a clarion call to hire a chief data officer.

In the next section we’ll examine why organizations are so challenged with managing information well, and how they can overcome this by adopting some new and reclaimed ideas rooted in economics and traditional asset management practices.

The next step in your infonomics journey is to become adept at managing information as an actual asset. If you’re an experienced information professional or CDO, this may require you to cast aside some traditions. On the other hand, this may be just what you’ve been looking for. And if you’re a CEO, CFO, or business executive experienced with managing other kinds of assets, it’s likely you will ask, Why haven’t we been doing this all along?

Notes

1 Formerly KXEN.

2 SAP.com Customer Successes, accessed 05 June 2016, www.sap.com/bin/sapcom/fi_fi/downloadasset.2014-08-aug-27-15.predictive-analytics-customer-successes-pdf.html.

3 Coca-Cola executive, interview with author, 13 October 2015.

4 Duane Stanford, “Coke Engineers Its Orange Juice—with an Algorithm,” Bloomberg, 31 January 2013, www.bloomberg.com/news/articles/2013-01-31/coke-engineers-its-orange-juice-with-an-algorithm#p1.

5 Ibid.

6 Ibid.

7 Doug Laney, “Deja VVVu: Others Claiming Gartner’s Construct for Big Data,” Gartner Blog Network, 14 January 2012, http://blogs.gartner.com/doug-laney/deja-vvvue-others-claiming-gartners-volume-velocity-variety-construct-for-big-data/.

8 It is believed that John Mashey, a scientist at Silicon Graphics, Inc., or perhaps Michael Cox and David Ellsworth in a 1997 ACM article, were the first to publicly use the term “Big Data” in the mid-1990s, not I.

9 For further thoughts on additional Big Data “V”s, see my blog “Batman on Big Data,” Gartner Blog Network, 13 November 2013, http://blogs.gartner.com/doug-laney/batman-on-big-data/.

10 Presuming the dataset is of sound accuracy, completeness, timeliness, and other quality characteristics I detail in chapter 13.

11 “Subtransactional data” is a term I began using in the 1990s to refer to the lower level of detail beneath business entity, customer-related, and transaction data. It is data reflecting activities between discernible business events. Examples include log data, website clicks, IoT device communications, and video feeds.

12 Vestas, accessed 05 June 2016, www.vestas.com.

13 “Vestas Wind Systems Turns to IBM Big Data Analytics for Smarter Wind Energy,” ibm.com, 24 October 2011, www-03.ibm.com/press/us/en/pressrelease/35737.wss.

14 “Vestas, Turning Climate into Capital with Big Data,” ibm.com, 2011, www-01.ibm.com/common/ssi/cgi-bin/ssialias?htmlfid=IMC14702USEN&appname=wwwsearch.

15 Douglas Laney, “Methods for Monetizing Your Data,” Gartner Webinar, 22 August 2015, www.gartner.com/webinar/3098518.

16 Information variety represents a greater challenge than volume or velocity because it cannot be solved simply by scaling or swapping infrastructure.

17 Sue Hildreth, “Data+ Awards: Florida Youth Welfare Agency Pinpoints Aid with BI,” Computerworld, 26 August 2013, www.computerworld.com/article/2483944/enterprise-applications/data--awards--florida-youth-welfare-agency-pinpoints-aid-with-bi.html.

18 “Cashing in on Improved Profitability through Pattern Detection and Big Data Analytics,” emcien.com, 2015, http://emcien.com/wp-content/uploads/2015/10/NCR-Success-Story.pdf.

19 Denise Winterman, “Tesco: How One Supermarket Came to Dominate,” BBC News Magazine, 09 September 2013, www.bbc.com/news/magazine-23988795.

20 “Data Analytics Solution Used to Optimize Refrigerators and Reduce Energy Costs in Grocery Stores,” ibm.com, www-03.ibm.com/software/businesscasestudies/mx/es/corp.

21 “FleetRisk Advisors Helps Clients Reduce Accident Rates and Driver Turnover,” IBM, accessed 09 February 2017, http://presidionwp.s3-eu-west-1.amazonaws.com/wp-content/uploads/2014/09/FleetRisk.pdf.

22 IBM, “FleetRisk Advisors.”

23 Andrew McAfee and Erik Brynjolfsson, “Big Data: The Management Revolution,” Harvard Business Review, October 2012, http://hbr.org/2012/10/big-data-the-management-revolution/ar/2.

24 Passur.com, accessed 09 February 2017, www.passur.com/success-stories-airlines.htm.

25 “IBM Enables Infinity Property & Casualty Insurance to Combat Fraud,” youtube.com, uploaded 05 May 2011, www.youtube.com/watch?v=qoFYo60rlC0.

26 James Taylor, “Putting Predictive Analytics to Work at Infinity Insurance,” JT on EDM, 15 September 2009, http://jtonedm.com/2009/09/15/putting-predictive-analytics-to-work-at-infinity-insurance/.

27 Lyndsey Gilpin, “How an Algorithm Detected the Ebola Outbreak a Week Early, and What It Could Do Next,” TechRepublic, 26 August 2014, www.techrepublic.com/article/how-an-algorithm-detected-the-ebola-outbreak-a-week-early-and-what-it-could-do-next/.

28 Zoe Schlanger, “An Algorithm Spotted the Ebola Outbreak Nine Days before WHO Announced It,” Newsweek, 11 August 2014, www.newsweek.com/algorithm-spotted-ebola-outbreak-9-days-who-announced-it-263875.

29 “2016 Zika Outbreak,” healthmap.org, accessed 09 February 2017, www.healthmap.org/zika/#timeline.

30 “Case Study | Improving Forecast Accuracy with Prevederé Software,” prevedere.com, 2013, www.prevedere.com/wp-content/uploads/2016/03/CaseStudy-ADS.pdf.

31 “Clinical Pathways,” openclinical.org, accessed 09 February 2017, www.openclinical.org/clinicalpathways.html.

32 “The Journey from Volume to Value-Based Care Starts Here,” ayasdi.com, accessed 09 February 2017, www.ayasdi.com/applications/clinical-variation/.

33 “The Science of Clinical Carepaths,” ayasdi.com, 11 February 2015, www.ayasdi.com/blog/bigdata/the-science-of-clincial-carepaths/.

34 Dr. Todd Stewart, email to author, 13 January 2017.

35 Saul Brand and Dale Kutnick, “Digital Business Success Will Be Driven by Economic Architecture,” Gartner, 04 December 2015.
