3
The Analytics Team

Thomas H. Davenport

Technology, Operations, and Information Management, Babson College, Wellesley, MA, USA

3.1 Introduction

Analytics are created or initiated by people. People frame the question to be answered by analytics, select the data to be analyzed, propose and test hypotheses, and determine how well the hypotheses are supported in the data. Even in relatively automated machine learning environments, analysts or data scientists select the data and the tools, and kick off the process of finding a model that fits the data. The capabilities of human analysts are among the most important factors in determining the success of an analytics initiative.

In organizations of any size, it is impossible for one analyst to do all the necessary analytics work. Therefore, the topic of human analytical resources quickly becomes one of assembling and managing an analytical team. In addition, high-quality analytical work often requires more skills than any one person can possess, so some sort of division of labor and skills across a team is usually necessary.

This chapter, then, will focus on assembling and managing teams of analytical people to analyze data and assist the organization in making analytical and data-driven decisions. It addresses not only an organization's requirements for analytical capabilities but also the individual skills required to make analytics successful. It will also address some of the ways in which analytical teams can be organized within a company.

Although this chapter appears in a book published by INFORMS, it is not a commercial for that organization. Nevertheless, there should be little doubt that certification of analytical skills is a useful exercise to ensure that the necessary skills are present in an individual's repertoire. INFORMS has created one of the more effective certification programs in its CAP, or Certified Analytics Professional. The CAP certification requires work experience in analytics, but the Associate CAP (aCAP) does not. I won't discuss these further–there are many materials available on the program's website (https://www.certifiedanalytics.org/)–but as a member of the Analytics Certification Board, I will testify to its quality and urge individuals and organizations to pursue this certification.

3.2 Skills Necessary for Analytics

The skills necessary to work with analytics have evolved considerably over the several decades that companies have been pursuing business analytics. I'll describe the evolution in this chapter, beginning with the basic skills that have been necessary since the 1960s or 1970s, when the use of analytics in businesses began to take off.

Quantitative skills–broadly speaking, the ability to extract meaning from numbers–are the core requirement for any type of quantitative analyst. But fitting a regression equation or manipulating a spreadsheet is only the beginning. Effective analysts need to be proficient not only with data but also with people.

  • Quantitative and technical skills are the foundation. All analytical people must be proficient in both general statistical analysis and the quantitative disciplines specific to their industry or business function: lift analysis in marketing, stochastic volatility analysis in finance, biometrics in pharmaceutical firms, and informatics in health care organizations, for example. Some types of analysts–those involved in “business intelligence” or reporting work–may get by without substantial statistical knowledge, but this lack would probably limit their careers today. Analytical people must also know how to use the specific software associated with their type of analytical work, whether it be to build statistical models, generate visual analytics, define decision-making rules, conduct “what-if” analyses, or present a business dashboard.
  • Business knowledge and design skills enable analysts to be more than simple backroom statisticians. They must be familiar with the business functions and processes to which analytics are being applied–marketing, finance, HR, new product development, and the like. They need enough general business background to work at the interfaces of business processes and problems. They also must have insight into the key opportunities and challenges facing the company, and know how analytics can be used to drive business value. One study of quantitative analysts suggested that they have more business acumen than their nonanalytical counterparts [1].
  • Data management skills are perhaps even more important to analytical professionals than statistical and mathematical expertise. It is often observed that such professionals spend the majority of their time manipulating data–finding, integrating, cleaning, matching, and so on. And the software skill most commonly sought by employers of data scientists is not a statistical package but SQL, a query language for data management [2]. There is little doubt that analytical professionals need skills in managing and manipulating data, and for some this activity will constitute a major component of their jobs.
  • Relationship and consulting skills enable analysts to work effectively with their business counterparts to conceive, specify, pilot, and implement analytical applications. Relationship skills–advising, negotiating, and managing expectations–are vital to the success of all analytical projects. Furthermore, an analyst needs to communicate the results of analytical work: either within the business to share best practices and to emphasize the value of analytical projects or outside the business to shape working relationships with customers and suppliers, or to explain the role of analytics in meeting regulatory requirements (e.g., utility company rate cases). This skill has been described as “telling a story with data [3].”
  • Coaching and staff development skills are essential to an analytical organization, particularly when a company has a large or fast-growing pool of analysts, or when its analytical talent is spread across business units and geographies. Not every analytical professional needs these skills, but they are certainly required for supervisors and managers of large teams. When analytical talent is not centralized, coaching can ensure that best practices are shared across the company. Good coaching not only builds quantitative skills but also helps people understand how data-driven insights can drive business value.
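
The data management work described in the list above can be made concrete. Below is a minimal sketch, using Python's built-in sqlite3 module and two invented tables, of the kind of SQL query–joining, handling missing values, and aggregating–that occupies so much of an analyst's time. The table and column names are purely illustrative.

```python
import sqlite3

# In-memory database with two hypothetical tables an analyst might need to join.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE transactions (customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'East'), (2, 'West'), (3, NULL);
INSERT INTO transactions VALUES (1, 120.0), (1, 80.0), (2, 50.0);
""")

# A typical preparation query: clean a missing value, join, and aggregate.
rows = conn.execute("""
SELECT COALESCE(c.region, 'Unknown') AS region,
       SUM(t.amount)                 AS total_spend
FROM customers c
LEFT JOIN transactions t ON t.customer_id = c.id
GROUP BY region
ORDER BY region
""").fetchall()

for region, total in rows:
    print(region, total)
```

Even in this toy example, most of the effort goes into shaping the data (missing regions, customers with no transactions) rather than into any statistical calculation, which mirrors how analytical professionals report spending their time.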

One survey of quantitative analysts' activities suggested that there are really several categories of the role [4]. Based on their self-reported time allocation across 11 different analytical activities, the analytical professionals surveyed were clustered into four groups: generalists, data preparation specialists, programmers, and managers. Every participant indicated doing a little of each activity, but managers mostly managed, programmers mostly programmed, and data preparation specialists mostly worked on data acquisition and preparation. The generalists performed all of these activities, of course, but focused more on analysis, interpretation, and presentation than on the others. Across all four categories, the least amount of time was spent on data mining and visualization.

Of course, few individuals come equipped with the full array of skills I've described; this is where teaming comes in. To build effective teams, a company needs the right mix of analytical talent. For example, it is often a good idea to balance hard-core quantitative experts–who focus on the more advanced analytical techniques–with business-oriented “translators”–who have a broader skill set combining strong analytics with business design and management skills, and who link analytical professionals to their customers.

3.2.1 More Advanced or Recent Analytical and Data Science Skills

The practice of analytics has changed substantially over several different “eras [5].” However, the skills I've described for basic analytical work don't go away over time. Companies still need descriptive analytics and the other activities of earlier periods, and the skills required in each era carry over into the next. In other words, the skills for doing analytics are, unfortunately, cumulative. None of the statistical, business acumen, relationship, data management, and coaching capabilities required for traditional quantitative analysis go away when organizations move into the era of “big data.” That era began around the turn of the twenty-first century in Silicon Valley, when organizations needed new data management and analytical approaches to cope with the rise of online business.

But there are new skills required in the big data era. Data scientists–the new term for people doing high-skill analytical and data management work in this environment–typically have advanced degrees in technical and scientific fields [6]. Because they are testing many different approaches to online operations and commerce, they need experimentation skills, as well as the ability to transform unstructured data into structures suitable for analysis. In Silicon Valley, performing these tasks also requires a familiarity with open-source development tools. If the data scientists are going to help develop “data products”–products and services based on data and analytics–they need to know something about product development and engineering. Perhaps because visual displays are a good way to comprehend a very large data set, the time that big data took off was also the time that visual analytics became widely practiced in large organizations, so a familiarity with visual display of data and analytics also became important during this period.

The next era, which I would argue began around 2012 or 2013 in the most sophisticated companies, involved the combination of both big and small data for analytics within large organizations. What skills got added at this point? In addition to mastering the new technologies used in combining big and small data, there's a lot of organizational and process change to be undertaken. If operational analytics means that data and analytics will be embedded into key business processes, there's going to be a great need for change management skills. At UPS, for example, which initiated a large real-time driver routing initiative called ORION during this period, the most expensive and time-consuming factor by far in the project was change management–teaching about and getting drivers to accept the new way of routing. This period was also marked by the early use of statistical machine learning approaches, which were necessary to handle the large and fast-changing data environment of the period.

The current era, which started perhaps 5 years ago in online businesses and 2 years ago in other industries, involves extensive use of artificial intelligence or cognitive technologies. This means that analysts and data scientists need a heavy dose of new technical skills–machine and deep learning, natural language processing, and so forth. There is also a need for work design skills to determine what tasks can be done by smart machines, and which ones can be performed by (hopefully) smart humans.

The cumulative nature of these additional skills over time means that it is even more important to take a team-based approach to analytical and data science projects. It is impossible to imagine, for example, that someone who possesses the rare skill of deep learning analytics would also have all the other skills I've mentioned thus far in this chapter. The only way to have all the necessary skills on a team is to staff projects with people who hold different–and hopefully complementary–skill sets.

3.2.2 The Larger Team

Analytics were initially created to improve human decision-making. But there are many circumstances in organizations in which analytics aren't enough to ensure an effective decision, even when orchestrated by a human analyst. In order for analytics to be of any use, a decision-maker has to assess the analytical outcomes, make a decision on the basis of them, and take action. Since decision-makers may not have the time or ability to perform analyses themselves, such interpersonal attributes as trust and credibility between analysts and decision-makers come into play. If the decision-maker doesn't trust the analyst or simply doesn't pay attention to the results of the analysis, nothing will result from the analytical work, and the statistics might as well never have been computed.

I cited one such example in my first book on analytics [7]. In the course of research for that book, I talked to analysts at a large New York bank who were studying the profitability of the bank's branch network. The analysts went through a detailed and highly analytical study in the New York area–identifying and collecting activity-based costs, allocating overheads, and even projecting current cost and revenue trends for each branch in the near future. The outcome of the analysis was an ordered list of all branches and their current and future profitability, with a clear red line drawn to separate the branches that should be left open from those that should be closed.

The actual outcome, however, was that not a single branch was shut down. The retail banking executive who sponsored the study was mostly just curious about the profitability issue, and he hardly knew the analysts. He probably wasn't aware of all the work that would go into the analysis process. He knew–but the analysts didn't–that there were many political considerations involved in, say, closing the branch in Brooklyn near where the borough president had grown up, no matter where it ranked on the ordered list of branches. Basing actions on analytics often requires a close, trusting relationship between analyst and decision-maker, and that was missing at this bank. Because of the missing relationship, the analysts didn't ask the right questions, and the executive didn't frame the question for them correctly.

Instead of just analysts and data scientists, there are really three groups whose analytical skills and orientations are at issue within organizations. One is the senior management team–including the CEO–that sets the tone for the organization's analytical culture and makes the most important decisions. Then there are the professional analysts and data scientists, who gather and analyze the data, interpret the results, and report them to decision-makers. The third group is a diverse collection I have referred to as analytical amateurs. They comprise a large category of “everybody else,” whose daily use of the outputs of analytical processes is critical to their job performance. These could range from frontline manufacturing workers, who have to make multiple small decisions on quality and speed, to middle managers, who also have to make decisions with respect to their functions and units–which products to continue or discontinue, for example, or what price to charge for them. IT employees who put in place the software and hardware for analytics also need some familiarity with analytical topics, and also qualify as analytical amateurs.

To really succeed with analytics, a company will need to acquaint a wide variety of employees with at least some aspects of analytics. Managers and business analysts are increasingly being called on to conduct data-driven experiments, interpret data, and create innovative data-based products and services [8]. Many companies have concluded that their employees require additional skills to thrive in a more analytical environment. One survey found that more than 63% of respondents said their employees need to develop new skills to translate big data analytics into insights and business value [9]. Bob McDonald, at one point CEO of Procter & Gamble and later Secretary of the U.S. Department of Veterans Affairs, said about the topic of analytics (and business intelligence more broadly) within P&G:

We see business intelligence as a key way to drive innovation, fueled by productivity, in everything we do. To do this, we must move business intelligence from the periphery of operations to the center of how business gets done.

With regard to the people who would do the analysis, McDonald stated:

I gather there are still some MBAs who believe that all the data work will be done for them by subordinates. That won't fly at P&G. It's every manager's job here to understand the nature of statistical forecasting and Monte Carlo simulation. You have to train them in the technology and techniques, but you also have to train them in the transformation of their behavior [10].

Of course, not all senior executives are as aggressive as McDonald in their goals for well-trained analytical amateurs. But in companies that are even moderately sophisticated with analytics, there will be some expectations for analytical skills among amateurs of various types. As Jeanne Harris and I wrote in the new edition of our book Competing on Analytics,

To succeed at an analytical competitor, information workers and decision-makers need to become adept at three core skills [11]:

  1. Experimental: Managers and business analysts must be able to apply the principles of scientific experimentation to their business. They must know how to construct intelligent hypotheses. They also need to understand the principles of experimental testing and design, including population selection and sampling, in order to evaluate the validity of data analyses. As randomized testing and experimentation become more commonplace in the financial services, retail, and telecommunications industries, a background in scientific experimental design will be particularly valued. Google's recruiters know that experimentation and testing are integral parts of their culture and business processes. So job applicants are asked questions such as “How many tennis balls would fit in a school bus?” or “How many sewer covers are there in Brooklyn?” The point isn't to find the right answer but to test the applicant's skills in experimental design, logic, and quantitative analysis.
  2. Numerate: Analytical leaders tell us that an increasingly critical skill for their workforce is to become more adept in the interpretation and use of numeric data. VELUX's [Anders] Reinhardt [until recently global head of business intelligence at the Danish window company] explains that “Business users don't need to be statisticians, but they need to understand the proper usage of statistical methods. We want our business users to understand how to interpret data, metrics, and the results of statistical models.” Some companies, out of necessity, make sure that their employees are already highly adept at mathematical reasoning when they are hired. Capital One's hiring practices are geared toward hiring highly analytical and numerate employees into every aspect of the business. Prospective employees, including senior executives, go through a rigorous interview process, including tests of their mathematical reasoning, logic, and problem-solving abilities.
  3. Data literate: Managers increasingly need to be adept at finding, manipulating, managing, and interpreting data, including not just numbers but also text and images. Data literacy is rapidly becoming an integral aspect of every business function and activity. Procter & Gamble's former chairman and CEO Bob McDonald is convinced that “data modeling, simulation, and other digital tools are reshaping how we innovate.” And that changed the skills needed by his employees. To meet this challenge, P&G created “a baseline digital-skills inventory that's tailored to every level of advancement in the organization.” The current CEO, David Taylor, also supports and has continued this policy. At VELUX, data literacy training for business users is a priority. Managers need to understand what data are available, and to use data visualization techniques to process and interpret them. “Perhaps most importantly, we need to help them to imagine how new types of data can lead to new insights,” notes Reinhardt [12].
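
To make the experimental skill in point 1 concrete: the core calculation behind a randomized test of, say, two web-page versions is a two-sample comparison of proportions. Here is a minimal sketch; the function name and all figures are invented for illustration and are not drawn from any company mentioned above.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 10,000 visitors randomly assigned to each version.
z = two_proportion_z(conv_a=500, n_a=10_000, conv_b=570, n_b=10_000)
print(round(z, 2))  # prints 2.2; |z| > 1.96 rejects H0 at the 5% level
```

An analytical amateur doesn't need to derive this formula, but should understand what randomization, sample size, and the significance threshold are doing here before acting on such a result.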

As with analytical professionals, amateurs may need additional function- or business unit-specific expertise in analytics. In the case of IT professionals, for example, those who provision and support data warehouses and lakes should have some sense of what analyses are being performed on the data, so that they can ensure that stored data are in the right formats for analysis. HR workers need to understand something about analytics so that they can hire people with the right kinds of analytical skills–and understand how analytics can be employed to identify promising employees, or those likely to leave the company soon. With the rise of artificial intelligence, even the corporate legal staff may need to understand the implications of a firm's approach to automated decision-making in case something goes awry in the process. There is also an increasing number of AI applications in corporate litigation.

3.3 Managing Analytical Talent

In addition to hiring people with the right kinds and levels of skills, a number of activities are involved in the ongoing management of analytical talent [13]. One such activity is to conduct an assessment of the analytical skills within your organization and a “gap analysis” of the differences between the current state and the desired future state. The level of detail in the assessment may vary with the purposes of the organization, but it may include a roles inventory of analytical jobs and their locations within the organization, a skills inventory, and an analytics talent map. The skills inventory might include a listing of the analytical skills required, and a comparison to the desired skill levels and the numbers of people possessing them. A talent map is a high-level mapping of current roles and skills, comparisons to desired future objectives, and elements of plans to close the gap–all ideally in some visual format that is easily comprehensible by busy executives.

The factors measured in the assessment will also differ by the strategies and priorities of the organization conducting the assessment. Some typical examples of factors include the following:

  • How many people are there in each of the major analytics functions?
  • What percentage of analytical professionals are capable of predictive and prescriptive analytics, as opposed to descriptive analytics?
  • How many data scientists are able to use machine learning to build models?
  • What percentage of data management-oriented employees have any experience with Hadoop and other big data tools?
  • How many employees are familiar with each of the software tools in our approved portfolio for analytics?
  • How many analytics staff have close and trusting relationships with the business leaders in the units and functions they serve?
  • What analytics/technical skills exist within the current staff by type and number of years of experience?
  • What percentage of analytics team members have more than 3 years, or less than 1 year, of experience in analytics?
  • Which software/tools have the largest and smallest numbers of skilled resources available for development and support activities?

Answering these types of questions can allow analytics leaders to build the initial foundation of their organization's talent strategy and roadmap. In order to ensure that the information is relevant to the entire enterprise, it is important to involve all analytics leaders within the company and to include questions or decision points that address the unique nature of the organization and industry. In addition to providing important information, for highly decentralized analytics groups such an inventory can also be a first step toward building a greater level of cohesion.

After doing the assessment, a company will normally want to formulate some objectives and plans for what to do about the results, with a time frame for planned changes. One company, for example, determined that only 5% of its analytics staff were comfortable with predictive analytics, and it wanted to shift to 95% with that skill over 5 years. Another organization determined that its staff lacked close relationships with business leaders, so it developed clearer assignments of analysts to business units, and asked business leaders to participate in annual performance assessments for analytics staff.
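
At bottom, the gap analysis is simple arithmetic over the skills inventory. A minimal sketch, with invented head counts, of turning current and desired skill levels into a prioritized gap list:

```python
# Hypothetical skills inventory: current vs. desired head count per skill area.
current = {"descriptive": 40, "predictive": 5, "machine learning": 2}
desired = {"descriptive": 20, "predictive": 35, "machine learning": 10}

# Positive gap = people to hire or train; negative gap = skills in surplus.
gaps = {skill: desired[skill] - current.get(skill, 0) for skill in desired}

for skill, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    status = "hire/train" if gap > 0 else "surplus"
    print(f"{skill:>16}: {gap:+d} ({status})")
```

The value of the exercise lies less in the arithmetic than in forcing agreement on the categories and targets; the output is the starting point for the training or hiring plans described above.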

A one-time talent assessment is of limited value. People, their skills, and objectives for new capabilities change all the time. Organizations should reassess their analytical roles and skills every year or two. Once an assessment process is in place, it can be repeated relatively easily.

3.3.1 Developing Talent

Many analytics organizations think primarily about hiring people with needed skills. But it is often less expensive and more effective to develop skills through education and training programs, either in-house or in partnership with universities. If an organization identifies an analytical skill that is particularly important, it is not difficult to arrange a training program for it. There are, for example, training programs available for organizations that want their analysts to achieve CAP certification from INFORMS.

For another example within a specific firm, Cisco Systems has been expanding for several years into advanced services that analyze the data thrown off by devices like routers and switches. In addition, Cisco has been using analytics extensively for internal purposes, such as sales propensity modeling and demand/production forecasting.

However, managers within Cisco felt that they lacked the data science skills to effectively perform all these activities. Desmond Murray, a Senior Director for IT at Cisco, was running Enterprise Data and Data Security for the company in 2012. His team was adopting new big data technologies (Hadoop, SAP HANA, etc.) for the company to use, but demand within the business was limited. He concluded that a set of educational offerings around data science would build awareness and stoke demand for these new technologies.

Murray designed a distance education program on data science with two different universities (an obvious approach, given Cisco's distance conferencing business). The program lasts 9 months and results in a certificate in data science from the university. Students attend virtual classes a couple of nights a week and, of course, have homework as well. Cisco is now on its sixth student cohort, with 40 students in each. About 300 data scientists have been trained and certified, and they are now based in a variety of different functions and business units at Cisco.

But Murray, by now head of the Enterprise Data Science Office within the IT organization, didn't stop there. He realized that the newly trained data scientists needed some support from their managers if they were going to be satisfied in their new roles. So Cisco also created a 2-day executive program led by business school professors on what analytics and data science are and how they are typically applied to business problems. The program also covers how to manage analysts and data scientists, and how to know whether their work is effective. Cisco's initiatives to develop its employees' analytics and data science skills are relatively unusual, but they don't have to be. Any company that is serious about analytics and data science could undertake similar steps.

3.3.2 Working with the HR Organization

Analytics and data science organizations in companies can do a lot to identify and inculcate needed skills. At some point, however, it will probably be wise to collaborate with a company's human resources (HR) function to institutionalize talent management processes. HR groups can help to establish formal job titles, create linkages between skill and seniority levels and compensation, and provide internal and external resources for training. If analytics and data science skills are considered strategic, HR groups can help to source, nurture, and manage them. Many HR organizations are themselves interested in doing more with analytics in their own functions, so a partnership with analytics groups can be mutually beneficial.

HR organizations can provide guidance about the type of future skills that the organization will need. Additionally, HR leadership can describe the types of nontechnical skills that they are planning to develop or already have available to support the analytics function (e.g., business acumen, relationship, or communication skills).

At Cisco, the creation of data science skill development programs revealed that there was no standard–at Cisco or at many other firms, for that matter–for who is a serious data scientist and who isn't. So the company created a “Pyramid of Analytic Knowledge” to classify different levels of expertise and establish a career track. Murray and his successor worked with Cisco's HR organization to incorporate these levels into official job classifications and compensation bands.

3.4 Organizing Analytics [14]

One of the key questions to address in managing analytical teams is “How should we best organize our analysts and data scientists?” It is a common question arising from a common situation: Analysts and analytics/big data projects are often scattered across the organization. That is how companies get started with analytics–here and there as pockets of interest arise. However, when an organization starts to get serious about analytics and data science, it often adopts an enterprise perspective in order to develop analysts effectively and deploy them where they create the greatest business value. In order to achieve these objectives, pockets of analytics and data science usually need to be coordinated, consolidated, or centralized.

The trend over the past decade has clearly been toward centralization of analysts, and that makes sense for several good reasons. If a company wants to differentiate itself in the marketplace through its analytical capabilities, it doesn't make sense to manage analytics locally. Skilled and experienced analysts and data scientists are a scarce and high-demand resource. A central function can deploy them on the most important projects, including cross-functional and enterprise-wide projects that may be otherwise difficult to staff. Centralization also facilitates analyst development because people have more opportunity to connect with and learn from one another. In addition, a central group with a critical mass of people helps with recruiting analysts by demonstrating the organization's commitment to analytics and providing new hires with a community. Finally, research led by my frequent coauthor Jeanne Harris [15] suggests that analysts in centralized and well-coordinated structures are more engaged and less likely to leave their employer than their decentralized counterparts.

However, recent trends suggest that analytics and data science teams are not immune from the normal pressures that move centralized functions in a more decentralized direction. Previously centralized analytics groups have been decentralized and dispersed in several different companies over the past year or two. The leaders of these groups cite several reasons for the decentralization, including the visibility of centralized budgets, complaints of lack of responsiveness by business unit and function leaders, and perceptions of excessive bureaucracy in large analytics groups. It seems likely, then, that despite the efficiency and effectiveness benefits of centralization, there will be the usual oscillation between centralization and decentralization in analytics and data science groups.

Another common situation among organizations I encounter is a significant analytics presence in one or two business functions, plus small pockets of analytics across the rest of the organization. The lead functions vary by industry–risk management and trading in financial services, engineering and supply chain in manufacturing, and marketing in consumer businesses. The challenge here is simultaneously to connect the pockets of analytics and spread the wealth of expertise resident in the advanced units. In these cases, full centralization could be unnecessarily disruptive, so the organization needs other mechanisms to coordinate analyst talent supply.

In the book Analytics at Work, Jeanne Harris, Bob Morison, and I discuss five common organizational models [16]. They are a useful place to start, but organizing your analysts isn't as simple as just picking one. There are different organizational circumstances, with many variables and mitigating factors in play, and many variations on these five options. This section attempts to decompose the organizational models for analysts and data scientists, and provides tools for developing and tuning your own model.

3.4.1 Goals of a Particular Analytics Organization

When debating alternative organizational structures for analytical and data science groups, it is important to keep the overriding goals for the organization in mind. Typically, the following are some of the goals of analytical groups and their leadership within companies:

  • Supporting business decision-makers with analytical capabilities
  • Helping to develop new customer-oriented products and features involving data and analytics
  • Providing leadership and a “critical mass” home for analytical and data science-oriented people, and the ability to easily share ideas and collaborate on projects across analysts
  • Fostering visibility for analytics and big data throughout the organization, and ease in finding help with analytical problems and decisions
  • Creating standardized methodological approaches, tools, and processes
  • Researching and adopting new analytical and data science practices
  • Reducing the cost to deliver analytical outcomes
  • Building and monitoring analytical capabilities and expertise

Different priorities for these goals may lead to different organizational models. For example, the goal of supporting business decision-makers with analytics may be best served by locating analysts directly in business units and functions that those decision-makers lead. That decentralized approach may also be the most effective one for development of products and services based on analytics and data. However, such decentralization may work against the goal of giving analysts and data scientists the ability to easily share ideas and collaborate.

Note that throughout this section (and the chapter in general) I have generally mentioned analysts and data scientists in the same breath. This usage is intentional; I believe that it was always difficult to clearly differentiate between “traditional” quantitative analysts and data scientists, and it is becoming increasingly difficult over time. At one point, data scientists tended to be more experimentally focused than traditional analysts, and also were likely to write code to transform unstructured data into structured formats for analysis. But now the tasks that these two groups perform certainly overlap, and the cachet of the data scientist title means that it is being applied to more jobs. My assumption is that whatever organizational structure makes sense for one group also makes sense for the other; that is, analysts and data scientists should be part of the same larger group. Of course, there are always exceptions to any organizational structure rule.

As I suggested above, no set of organizational structures and processes is perfect or permanent, so organizations must decide what particular goals are most important at any point in their analytical life cycles. For example, if an organization has had a centralized group of analysts and data scientists for a while and it has become unresponsive to business unit needs, it may be time to establish stronger ties between analysts and specified business units and leaders. A company with highly localized analytics may need to switch, at least for a while, to a more centralized structure. If possible, however, organizations should avoid rapid swings from highly centralized structures to highly decentralized structures, and back again. There are usually less disruptive ways to achieve the desired goals.

3.4.2 Basic Models for Organizing Analytics

Figure 3.1 shows the common organizational models described in Analytics at Work.


Figure 3.1 Common organizational models described in Analytics at Work. (Adapted from Davenport, Harris, and Morison, 2010.)

In a centralized model, all analyst groups are part of one corporate organization. Even if located in or primarily assigned to business units or functions, all analysts report to the corporate unit. This obviously makes it easier to deploy analysts on projects with strategic priority, as well as to develop skills and build community. However, especially if the analysts and data scientists are all housed in the corporate location, it can create distance between them and the business (although this can be mitigated by other factors, as I describe below). Implementing a centralized model for analytics is easiest where there is successful precedent for operating other functions or managing scarce resources as centralized shared services.

In a consulting model, all analysts/data scientists are part of one central organization, but instead of being deployed from corporate to business unit projects, the business units “hire” analysts for their analytical projects. This model is more market driven, and especially important here is the analyst/consultants' ability to educate and advise their customers on how to utilize analytical services–in other words, to make the market demand smart. This model can be troublesome if enterprise focus and targeting mechanisms are weak, because analysts may end up working on whatever business units choose to pay for (or whatever wheel is squeakiest) rather than what delivers the most business value.

In a functional or “best home” model, there is one major analyst/data scientist unit that reports to the business unit or function that is the primary consumer of analyst services. This analyst unit typically also provides services in a consulting fashion (or even better, strategic prioritization) to the rest of the corporation. As already mentioned, many financial services and manufacturing firms have, in effect, a functional model today, with one or two well-established analyst groups in functions like marketing or risk management. The best home may migrate as analytical applications are completed and the analytical orientation of the corporation changes, typically from operations to marketing.

A center of excellence model is a somewhat less centralized approach that still incorporates some enterprise-level coordination. In this structure, analysts are based primarily in business functions and units, but their activities are coordinated by a small central group. The CoE is typically responsible for issues such as training, adoption of analytical tools, and facilitating communication among analysts. The CoE builds a community of analysts/data scientists and can organize or influence their development and their sharing across units. It is most appropriate for large, diverse businesses with a variety of analytical needs and issues that would still benefit from central coordination. This is perhaps the most popular of the five models. In the era of business intelligence, this model was sometimes called a “business intelligence competency center.”

There are many variations on this model, depending on the powers of the CoE. Do analysts report to it on a dotted line? Does it control the staff development agenda and resources? Does it double as a Program Management Office (PMO), with powers to coordinate priorities and resources across business units? Or are the business units solidly in charge of their analysts?

In a decentralized model, analyst groups are associated with business units and functions, and there is likely an analytics group or groups for corporate functions, but there is no corporate reporting or consolidating structure. This model makes it difficult to set enterprise priorities and difficult to develop and deploy staff effectively through borrowing and rotation of staff. It is most appropriate in a diversified multibusiness corporation where the businesses have little in common. But even then it makes sense to build a cross-business community of analysts so that they can share experience. As a result, this is the model I (and my Analytics at Work coauthors) am least likely to endorse.

Beneath the surface, each of these models is essentially either centralized or decentralized. The consulting and functional models are variations on centralization–the consulting model has different funding and deployment methods, and the functional model is centralized, just not at corporate. The CoE model is an overlay on a decentralized structure. So are other hybrid models, most commonly a combination of decentralized analyst groups in business units plus a central group at corporate that focuses on cross-functional, cross-organizational, and enterprise-wide initiatives.

These five models have pros and cons and trade-offs in terms of deployment and development and other objectives. Figure 3.2 indicates the strengths of each in terms of four specific goals.


Figure 3.2 The strength of the five models. (Adapted from Davenport, Harris, and Morison, 2010.)

3.4.3 Coordination Approaches

One basic structure may be the best general fit, but no model will be best in terms of meeting all goals. Whatever the basic model, there will be a need to coordinate across analyst groups or across different parts of the business that are consuming analyst services. In a sense, all models are hybrids. Even if all analysts and data scientists work in one centralized corporate unit, the customers for their services are spread across the enterprise. You need coordination mechanisms to manage and meet demand for analytics.

There are a variety of common coordination mechanisms, some of which we've already mentioned. The mechanisms can supplement the formal reporting structure for purposes of enabling groups to plan and work together, and developing an enterprise view of priorities and resources. Think of them as ways of supplementing and fine-tuning a basic centralized or decentralized model, or of compensating for its inherent weaknesses. And note that all present challenges.

Program Management Office

This is a formal corporate unit for setting enterprise priorities, coordinating analytics and big data initiatives, influencing resource deployment on key initiatives, and facilitating the borrowing of staff across analytics groups. As mentioned above, it may be a function within a center of excellence. PMOs are especially useful where potential business value from analytics is high and resources are scarce and distributed. Under a PMO, the deployment process must be sophisticated to meet the dual needs of project staffing and analyst development.

Federation

Analyst groups and their associated business units work together on priorities, coordination of initiatives, resource deployment, and analyst development under a set of “guidelines of federation.” The most basic form of federation is a clearly chartered enterprise governance or steering committee. These committees add an immediate enterprise view, but they sometimes lack clout and even commitment. Some firms have considered federation as a sixth type of organization model.

Community

Decentralized analysts can be encouraged to share ideas and analytical approaches in a community. Such a community would typically involve occasional meetings, seminars, written communications, or electronic discussions or portals. It may be facilitated by a community organizer, and typically benefits from some budget. In most cases, this is a relatively weak coordination mechanism.

Matrix

Analyst groups report both to their associated business units and to a corporate analytics unit, with one line solid and the other dotted. Establishing dotted-line reporting to a central organization injects an imperative to get coordinated, but dotted lines can lose their force over time if they're not regularly exercised.

Rotation

Some of the analysts in a centralized model are physically located in and dedicated to business units on a rotational basis. Or there is an enterprise-wide program facilitating the lending and migration of analysts across decentralized units. The strength and success of rotation programs are easy to gauge–whether analysts really do have mobility across the enterprise.

Assigned Customers

Some centralized analytics groups, such as the one at Procter & Gamble, have assigned or “embedded” analysts to work exclusively with particular business units and the leaders of those units. The assignments fall short of a matrixed tie in the organizational structure, but they help to ensure that the analytical needs of the units and their leaders are met. Recently, however, some of the embedded analysts at P&G have been put into a matrix structure; business units and functions were more comfortable having their analysts report to them.

For purposes of deploying analysts on the most important business initiatives, the PMO is the strongest mechanism. For purposes of developing analysts, all of the mechanisms can help the cause, but rotation programs may have the most profound effect. The coordination mechanisms can be used in combination–for example, a PMO focused on deployment and a community focused on development, or a federation focused on coordination and a matrix focused on ensuring alignment with business needs.

What Model Fits Your Business?

Any basic organizational design for analysts may look good on paper, but it's got to work in the context of how the business already operates. To evaluate, design, implement, and refine organizational structures, you've got to look behind the organization chart and consider some basic variables that have to be working together for any organizational model to succeed. These factors can either mitigate or strengthen the effects of any particular organizational structure. Figure 3.3 lists six key variables [17].


Figure 3.3 Six key variables for tuning organizational designs.

Home location is the geographical location where analyst groups officially reside for administrative purposes. Home base and formal reporting lines have been the dominant variables in organizational design, especially in companies where more headcount has indicated more power. However, in today's more fluid and collaborative organizations, home location means less and less (especially if coordination mechanisms are effective). Home location is a matter of convenience, with the goals of limiting travel to work locations, accommodating employees' preferences, and getting enough people in one place regularly to sustain a community. In many firms today, analysts are based offshore, either as employees or contractors.

Work location is where the work of business analytics is performed, typically a mix of in the field (wherever the business customers of analytical models and services may be) and in regional or corporate analytics centers (where colleagues and support services are readily available). It is generally best to locate analytics work, insofar as possible, where the corresponding business work is. This greatly facilitates communication with business leaders and those who perform the work process under analysis. Make sure that home location and reporting structure don't erect barriers to analysts' working close to the business.

Reporting structure is the formal lines of connection, direction, and administration. Analysts and their groups typically report to local business units, to corporate, or to an intermediate unit (e.g., business sector or region) if the corporation is so structured. Some reporting structures are matrixed, with analysts reporting solid-line to business units and dotted-line to the corporate analytics organization, or vice versa. Reporting structure may be predetermined if analytics is part of another organization, such as marketing or IT. Reporting lines should not be so rigid as to impede the flexible staffing and development of analysts. Given the advantages of enterprise coordination of analytics, at least a dotted line to a central group or CoE makes sense in most organizations.

Business structure is the shape of the enterprise. Are its business units highly autonomous? Or are they closely coordinated? To what extent do business units already share functions, services, and important-but-scarce resources? Is power concentrated at the regional level? Centralizing analysts and data scientists may seem the logical thing to do, but it may prove very impractical if it flies in the face of a locally autonomous or regionalized business structure.

Centralized analytics groups are a natural match for an integrated “one business” business. If business units are intertwined and must work with and rely on one another regularly, you need a centralized or consulting model, or else a strong federation. If business units are autonomous with little interconnection, analysts may stay decentralized, but a center of excellence helps in sharing experience and building the analyst community. And if the enterprise relies extensively on business partners to perform major processes, you may need a centralized structure, especially if there's need or opportunity to coordinate analytics with partners.

Funding sources are seldom considered in the context of organizational design, even though paralysis is guaranteed if organizational structure and funding sources are at odds. Friction is minimal if funding follows the lines of formal reporting, but matters are seldom that simple because business services like analytics often have multiple funding sources. These may include funds from corporate, business unit assessments, direct funds from business units, chargeback to business units for analyst time, and project-based funding from the sponsoring business unit or units. The organizational questions are as follows: To what extent does the basic model under consideration align with funding sources? How does funding need to be revised or influenced by coordination mechanisms to support the analytics organization and its work?

Project-based funding is the most market and demand driven, but it requires a certain level of maturity among business customers in setting analytics ambitions and priorities, and among analyst groups in advising customers and marketing their services. Project-based funding (or other funding for services performed) should in most cases be supplemented by seed funding (to foster innovation) and infrastructure funding (to build capability), usually from corporate.

Infrastructure includes the configuration and ownership of other essential resources, especially technology and data. This variable is similar to funding sources–alignment is essential to success, but the variable is seldom considered in organizational design. Analysts cannot work across business processes and units if local systems and databases, inconsistent tools, and fragmented infrastructure prevent it. And business units cannot incorporate new technologies and techniques for analytical applications if corporate standards prevent it. To capitalize on analytics, the infrastructure must be local-but-interoperable or corporate-but-flexible.

As a practical matter, those six variables are never perfectly aligned, and organizations will have to experiment with and adjust the coordination mechanisms over time. As a common example, if data and technical infrastructure are fragmented, a company might phase an organizational consolidation alongside (or slightly in advance of) the rationalization and consolidation of those resources.

3.4.4 Organizational Structures for Specific Analytics Strategies and Scenarios

There are at least seven scenarios [18] for how enterprises approach and employ analytics (Table 3.1). These different emphases suggest different basic organizational models.

Table 3.1 Other factors driving effectiveness of analytics organizational structures.

Scenario | Definition | Basic model
Traditional analytics and BI | Make analytics tools and resources available to meet a broad variety of business needs | Centralized
Analytics for the masses | “Democratize” analytics and spread their use broadly across the organization | Centralized, with considerable effort to create self-service approaches
Big data | Tap the analytical potential of unstructured and nonquantitative data | Functional if one unit is in the lead leveraging these data; otherwise, consulting or centralized
Decision-centered | Enable the rapid and accurate execution of business decisions–both frequent/structured and infrequent/new | Model relatively unimportant if analysts can work closely with decision-makers, with a means of sharing methods and experience
Embedded analytics | Make real-time, automated analytical decisions part of core business processes and systems | Centralized or consulting, and close relationship with IT
Function- or process-specific analytics | Use specialized analytical technologies and applications to excel at a differentiating business process | Functional if there's an organization focused on the process; otherwise, consulting or centralized
Industry-specific analytics | Use specialized analytical technologies and applications to excel at processes common to an industry | Centralized or consulting, or functional if focus is on very specific applications

3.4.5 Analytical Leadership and the Chief Analytics Officer

Another key organizational question is the leadership role for analytics within organizations. There are already a substantial number of Chief Analytics Officers (CAOs), and I expect that more will emerge. The role may not always have that title (it may, for example, be combined with Chief Data Officer–particularly in financial services), but there is a need–at least for each of the three centrally coordinated models described above–for someone to lead the analytics organization. The CAO could be either a permanent role or a transitional role for an organization wanting to improve its analytical capabilities. There are a few Chief Data Scientists in organizations, but often these roles are combined with Chief Analytics Officer titles.

The roles of a Chief Analytics Officer could include any or all of the following:

  • Mobilizing the needed data, people, and systems to make analytics succeed within an organization.
  • Working closely with executives to inject analytics into company strategies and important decisions.
  • Supervising the activities and careers of analytical people.
  • Consulting with business functions and units on how to take advantage of analytics in their business processes.
  • Surveying and contracting with external providers of analytical capabilities.

One key issue for the CAO role is whether analytical people across the organization should report to it. While an indirect reporting relationship (as one dimension of a matrixed organization) may be feasible, a CAO without any direct or indirect reports seems unlikely to be effective.

In one insurance firm, for example, the CEO was passionate about the role of analytics, and named a CAO as a direct report. But the CAO had only a couple of staff; all other analytics people in the organization did not report to him. The CEO did not want to “rock the organizational boat” by having such traditional analytical functions in insurance as actuaries and underwriters report to the CAO. As a result, the CAO felt he had no ability to carry out his objectives; he resigned from the role, and the CEO did not replace him.

3.5 To Where Should Analytical Functions Report?

There are a variety of different places in the organization to which centralized analytical/data science groups and their CAO leaders can report. While there is no ideal reporting relationship, each one has its strengths and weaknesses. Each alternative is discussed below.

Information Technology

Some organizations, such as a leading consumer products firm, have built analytical capabilities within the IT organization, or transferred them there. There are several reasons why this reporting relationship makes sense:

  • Analytics are heavily dependent upon both data and software, and expertise on both of these is most likely to reside in an IT function.
  • The IT function is used to serving a wide variety of organizational functions and business units.
  • Analytics are closely aligned with some other typical IT functions, for example, business intelligence and data warehousing.

Of course, there are some disadvantages as well. IT organizations are sometimes slow to deliver analytical capabilities, and may have a poor reputation as a result. They may also overemphasize the technical components of analytics, and not focus sufficiently on business, organization, behavior, skill, and culture-related issues. Finally, IT organizations typically want to produce standardized and common solutions, and this may inhibit one-off analytical projects. In principle, however, there is no reason why IT organizations cannot overcome these problems.

Strategy

A few analytical groups, including those at a large retailer, report to a corporate strategy organization. This relationship allows analysts to become privy to the key strategic initiatives and objectives of the organization. Another virtue is that strategy groups are often staffed by analytically focused MBAs who may understand and appreciate analytical work, even if they cannot perform it themselves. The possible downsides to this reporting relationship are that strategy groups may not be able to marshal the technical and data resources to make analytical projects succeed, and strategy groups are usually relatively small.

Shared Services

In organizations with a shared administrative services organization, an analytics group can simply be part of that capability. The primary benefit of such a reporting structure is that analysts can serve anyone in the company–and often there are charging and resource allocation mechanisms in place for doing so. The downside is that analytics may be viewed as a low-value, nonstrategic resource like some other shared service functions. With the appropriate mechanisms in place, this problem can surely be avoided.

Finance

Being a numbers-focused function, finance organizations have the potential to be a home for business analytics groups. The obvious virtue of this arrangement would be the ability to focus analytics on the issues that matter most to business performance, including enterprise performance management itself. For some unknown reason, however, most CFOs have not embraced analytics, and the finance function remains a logical, if uncommon, home for analytical groups. At some firms, however, including Deloitte (for internal analytics) and Ford, the finance function is beginning to play a much stronger role in championing analytical projects and perspectives.

Marketing or Other Specific Function

As noted above, if an organization's primary analytical activities are concentrated on marketing or some other specific function, then it makes sense to incorporate the analytical group within it. The resulting structure allows a close focus on the analytics applications and issues in the functional area; Caesars Entertainment, for example, has put analytics in a reporting relationship to marketing. Obviously, such an arrangement also makes it more difficult for analytical initiatives outside the functional area to be pursued.

Product Development

The most likely industries for having analytics (and data science) reporting to product development are those–like online businesses–where there are a substantial number of “data products,” or products and services based on analytics and data. There are, for example, analytics groups at LinkedIn, Facebook, and Google that report to product development organizations.

3.5.1 Building an Analytical Ecosystem

Most of the foregoing discussion about analytical capabilities has been focused on organizing and developing internal analytical capabilities. But there is a broad set of analytical offerings that are available from a wide variety of external providers as well. The providers include consultants, IT (primarily software) vendors, offshore analytical outsourcers, data providers, and other categories of assistance. Some provide general analytical help across industries, but in almost every industry there are also specialized analytics and data providers. Many firms can benefit from working with such “analytical ecosystems” to improve their capabilities.

The key in constructing an effective analytical ecosystem is not to let it grow at random, but to identify the analytical capabilities the organization needs overall. Then a decision should be made as to whether internal or external capabilities are most appropriate to fill a specific need. In general, external capabilities make sense when the need is highly specialized, not likely to be needed frequently, and not critical to the organization's ongoing analytical capabilities.

A major pharmaceutical firm's Commercial Analytics group, for example, has a well-developed ecosystem. There is a large group of internal analysts–more than 30–but their capabilities are supplemented by outside help when necessary. The group has worked with specialized consultants on analysis of physician targeting, for example. The company's primary prescription data provider also works with it on analytics issues. Software vendors have consulted on analytical methods and techniques. Finally, the group supplements its work with help from an offshore analytics vendor in India.

3.5.2 Developing the Analytical Organization over Time

A final point is that analytical organization structures should develop and evolve over time. An internal structure and ecosystem that makes sense at the beginning of developing analytical capabilities will become obsolete later on. For example, it may be very reasonable to have a highly decentralized organizational model early on, but most firms create mechanisms for coordination and collaboration around analytics as they mature in their analytical orientations. It may also make sense to “borrow” a number of external resources in a firm's early stages of analytical maturity before making the commitment to build internal capabilities. In addition, companies may want to add data science capabilities to existing analytics groups to take advantage of the potential of big data.

The best way to adapt organizational capabilities to current needs is with a strategy or plan. Admittedly, in the early stages, there may not be anyone with the formal authority to even create a plan. However, if it appears that analytics are going to be key to an organization's future, it may make sense for a small group of analysts or data scientists to get together and create a bottom-up one.

At a large U.S. bank, for example, the head of the distribution organization (including physical branches, call centers, ATMs, and online channels) realized that she had a large number of analysts in her organization, but they weren't providing the value of which they were capable. She met with the managers of the diverse analytics and reporting groups in her business unit, and asked one of them to take the lead in assessing the problem. His work determined that the vast majority of the groups worked on reports rather than more predictive analytics, and that there were virtually no resources devoted to cross-channel analytics. With this start, the group began to develop a plan to remedy the situation and shift the balance toward predictive analytics and a cross-channel perspective. There was also a major focus on reducing the amount and frequency of reports. Later, this same sponsor moved a different business unit toward heavier use of machine learning technologies.

Plans should probably be revised every year or so, or with major changes in the demand or supply around analytics. There are usually clear signs–if anyone is looking–that the current model has become dysfunctional. It is a key step in an organization's analytical development that someone takes responsibility–either informally or formally–for assessing the organization of analytical resources, and for creating a better model.

No set of skills, plans, or organizational structures is perfect–even for a given time and situation–and every structure or skillset, if taken beyond its limits, will become a limitation. The leaders of contemporary organizations will need to become conversant with their analytical capabilities and how they are organized. Most importantly, they will need to realize when their current organizational approach and team no longer functions effectively, and needs to be restructured and/or reskilled.

References

  1. Harris J, Craig E, Egan H (2010) Counting on Analytical Talent. Accenture Institute for High Performance, March.
  2. Muenchen RA (2017) The Popularity of Data Science Software. R4Stats website, June 19 version. Available at http://r4stats.com/articles/popularity/.
  3. Davenport TH (2014) Ten Kinds of Stories to Tell with Data. Harvard Business Review blog post, May 5. Available at https://hbr.org/2014/05/10-kinds-of-stories-to-tell-with-data.
  4. Roberts P, Roberts G (2013) Research Brief: Four Functional Clusters of Analytics Professionals. Talent Analytics Corp, July. Available at http://www.talentanalytics.com/wp-content/uploads/2012/05/ResearchBriefFunctionalClusters.pdf.
  5. Davenport TH (2013) Analytics 3.0. Harvard Business Review, December. Available at https://hbr.org/2013/12/analytics-30.
  6. Davenport TH, Patil DJ (2012) Data scientist: the sexiest job of the 21st century. Harvard Business Review 90(10): 70–76.
  7. Davenport TH, Harris JG (2007) Competing on Analytics (Harvard Business Review Press).
  8. Davenport TH, Kudyba S (2016) Designing and developing analytics-based data products. MIT Sloan Management Review, Fall 2016. Available at http://sloanreview.mit.edu/article/designing-and-developing-analytics-based-data-products/.
  9. Hartman T (2012) Is Big Data Producing Big Returns? Avanade Insights blog post, June 5. Available at http://blog.avanade.com/avanade-insights/data-analytics/is-big-data-producing-big-returns/.
  10. McDonald quotations are from Davenport TH, Iansiti M, Serels A (2013) Managing with Analytics at Procter & Gamble. Harvard Business School case study, April.
  11. The three core skills needed by analytical managers are adapted from research originally published in Harris J (2012) Data is useless without the skills to analyze it. Harvard Business Review, September 13. Available at https://hbr.org/2012/09/data-is-useless-without-the-skills.
  12. Davenport TH, Harris JG (2017) Competing on Analytics (Harvard Business Review Press, revised edition).
  13. This section draws on content from a research brief by Harrington E (2014) Building an Analytics Team for Your Organization. International Institute for Analytics, September. Available at http://iianalytics.com/research/building-an-analytics-team-for-your-organization-part-i.
  14. This section is a revised and updated version of a chapter by Morison R, Davenport TH (2012) Organizing analysts. In Davenport TH, ed., Enterprise Analytics (Prentice Hall).
  15. Accenture Institute for High Performance (2010) Counting on Analytical Talent.
  16. Davenport TH, Harris JG, Morison R (2010) Analytics at Work: Smarter Decisions, Better Results (Harvard Business Press), pp. 104–109.
  17. Framework is based on Building an Analytical Organization. Business Analytics Concours and nGenera Corporation, 2008.
  18. Framework is based on Morison R, Davenport TH (2008) Mastering the Technologies of Business Analytics. Business Analytics Concours and nGenera Corporation.