6 CONTENT, STRUCTURE AND ALIGNMENT

‘Good strategy requires leaders who are willing and able to say no to a wide variety of actions and interests. Strategy is at least as much about what an organization does not do as it is about what it does.’

Richard Rumelt1

We have reached the point in the book to start considering how you are going to write the strategy. What should it contain? What shouldn't I include? How do I strike a balance between being too verbose and providing the clarity to make it understood by others? What's the right level for a data strategy to be set at, and how do I know if I have got it right if this is the first time my organisation has seen a data strategy?

There are so many questions. Worse still, the answer to a number of them is the dreaded two-word response – it depends.

The intent of this chapter is to guide you through the process. It is not intended to deliver a templated data strategy – even if such a thing could be developed, it almost certainly would not fit your organisation, whether in terms of content, style, scale or fit with audience – nor a definitive list of contents. What it will provide is an indication of what should be considered for inclusion and in what level of detail, how to structure the data strategy to link the key themes together and act as a reminder of the importance of strategic alignment with other strategies within your organisation.

The key is to link the three activities of people capability, delivery across the information ecosystem and organisational maturity (Figure 6.1) together across the data strategy. This will be explored further in this chapter.

There are many points to bear in mind at this juncture, some of which have been covered in the preceding chapters:

  • Who is the audience for the data strategy, and what are the various perspectives through which this document will be viewed?
  • What level of detail is appropriate to ensure all who read it are engaged, whilst striking a balance between the risk of being too high level to be meaningful or too detailed to retain the reader’s interest?
  • How do I keep it grounded such that it is clear in its intent to define implementation plans that are true to the goals of the data strategy?
  • How ambitious do I want to appear, and how specific should the waymarkers be to give enough guidance to make the ambition seem realistic?
  • How do I give relevancy to the data strategy, such that the end destination is seen as one which delivers corporate goals and is realistic in the expectations set?

Figure 6.1 Linking the triumvirate of people capability, information delivery and organisational maturity


If you have clarity on these five questions then you are well placed to begin the process of defining the data strategy. If you are clear on fewer than three, there is more work to be done, and more groundwork to be covered, before you can be confident you are heading on the right track. Anywhere in between suggests you have enough to make a start, whilst being conscious of the need to clarify the ambiguous questions as you go, ideally as early as you can.

6.1 APPROACH TO DEFINING THE CONTENT

Where to start? This is the conundrum many have wrestled with when deciding where to begin their data strategy, spending anxious hours wondering whether their first foray into drafting it is on the right track. Clearly, this is a very personal challenge. Some relish getting to the stage of putting thoughts into words, feeling the rush of satisfaction at finally seeing visible evidence of progress before them. Others find the information gathering, analysis and discussion the easiest part of the process, find putting it into words particularly difficult, and may make several abortive efforts before finally getting it under way.

In truth, there isn’t a right or wrong way; our personal traits lead each of us to approach this in our own way that best suits us.

Two methods that I have observed to be really effective involve a collective approach. In the first, one person takes the lead as author and others play different parts, depending on the numbers available to assist. One can act as the researcher, feeding key inputs as the author progresses, and both can challenge one another to check the strategy follows a logical thread. A third person can act as a reviewer of the first raw draft, tweaking and tightening sections as it takes shape, and a fourth can then review either the whole document or large parts of it with the detachment of having sat apart from the drafting process.

The second approach is similar to the first, but involves a more collaborative authoring approach, in which there is a team-based approach to writing sections – possibly relatively short elements of the document, such as a couple of paragraphs – and there is an editor who commissions and then reviews those elements as they come together to ensure they link, and smooths out the idiosyncratic writing styles of each contributor. Again, someone should review the final copy and sense-check for clarity, continuity and coherence, based upon a level of understanding of what the data strategy is seeking to achieve and the audience it is targeting.

Of course, you may be faced with having to draft the data strategy entirely on your own, either due to the size of organisation or lack of wider awareness of its importance. In such cases, you will be acting as judge and jury of the document you draft unless you can get someone with an impartial perspective from elsewhere in the organisation to provide the sense check – possibly someone who has had some strategy experience, either in your organisation or elsewhere, or who is familiar with presenting documents to senior stakeholders and knows what style tends to work and, just as importantly, what will jar.

I have mentioned Agile as a recommended approach to developing your data strategy, deploying iterative stages to progress things and set expectations of what is likely to be achieved within short windows of time. The benefit of this approach is the focus it provides on targeting specifics rather than being faced with the task of grappling with the whole document. This involves breaking down the data strategy based on your intended structure – we shall cover this shortly – and making it discrete, so there is a logical start, middle and end to the Agile sprint you construct.

Just remember, throughout the drafting process it is easier to shorten a document than to extend it. Whilst the latter is feasible, it often involves more of a rewrite than a précis, and so is more time-consuming and challenging. Therefore, if you have succeeded in getting a flow going, it is easier to run with it, knowing it is too detailed, and trim once finished.

Finally, if you are ready to start to commit words to paper (or whatever software takes your fancy), do refer back to Chapter 2 on CLEAR. As a prompt to make sure you are focused on what is needed to make your data strategy a success, this section is an important reminder and should be used as a yardstick to ensure you are on the right path.

6.2 DETERMINING THE CONTENT

I do not want to be too specific in terms of specifying the content of your data strategy – to some extent, it depends on the nature and maturity of your organisation, the audience, and whether you are the first to define a data strategy or are following tracks laid before by others. It also depends on the sophistication of the strategy activity as a function within your organisation, which we touched on in an earlier chapter.

In general, it is important to reflect on the data needs that you can identify within the corporate strategy that should be reflected in the data strategy. As a minimum, you would expect to ensure these are fully covered and aligned appropriately.

There are some key elements you would expect to find in a data strategy:

  • a strong data management vision;
  • a coherent business case to support investment (even if that is just resources within the organisation, these still come at a cost);
  • some guiding principles, values and alignment to the corporate vision;
  • clearly articulated goals related to data;
  • evaluation criteria and metrics to track success;
  • clarity on the data strategy programme vision to be delivered;
  • clarity of roles and responsibilities.

My recommendation is to contain the data strategy within 12–20 pages, and to focus on the high-level direction setting whilst providing waymarkers as a guide to drive the expected pace in the implementation phase. There are many data strategies out there which overshoot on this – one I had a workshop group review recently was in excess of 70 pages in length. Do not underestimate the challenge of writing something shorter but more focused. The French mathematician Blaise Pascal famously wrote, ‘I have made this longer than usual because I have not had time to make it shorter.’2 There is an art to being concise, complete and accurate, but it is unlikely your first iteration will succeed in this goal. The target of 20 pages will take time, effort and constant revision and is likely, therefore, to start out as a much larger document before being trimmed back to the recommended size.

I tend to think a data strategy should have an executive summary, with the goal to condense in about a page the key content of the document. It is a challenge to achieve, but it must set:

  • context – why a data strategy, what is its alignment to the corporate strategy, why now and what it is seeking to achieve;
  • scope – be clear on what is to be delivered, by when and how;
  • what is required to deliver it – resources, dependencies, sponsorship, accountabilities;
  • when it will be achieved – waymarker headlines only;
  • the value it is expected to deliver.

This needs to be written tightly, avoiding flowery language and being absolutely clear on all points.

In terms of the rest of the document, I would expect the waymarkers to be displayed graphically along a timeline, perhaps a page or two, that can be referenced through the data strategy with the evaluation criteria and roles, responsibilities and accountability taking another page or so. The bulk of the content will be discussing data management and data exploitation in the context of a vision set out at the start of the data strategy – potentially over a couple of pages, to include reference to the business case or rationale for why a data strategy is required by the organisation – which is clearly aligned to the corporate strategy. As mentioned earlier, do consider the use of diagrams wherever possible, as these can convey a large amount of information succinctly and also help those who are more comfortable receiving information in a pictorial form than blocks of text.

There are some core topics I would recommend you consider in defining your own data strategy content, detailed below.

6.2.1 Purpose

Chapter 4 introduced ‘purpose’ as the first term of the PRIDE acronym. It is an essential component in stating your remit in defining a data strategy and therefore should be the opening statement of your data strategy. You need to be able to set the context of the data strategy, outlining any restrictions or constraints in the scope it covers, the context in which it sits (for instance, is this supporting key deliverables within the corporate strategy, or other documents which should be read in conjunction with the data strategy to provide the entire context?) and the time period it covers.

It is worth reflecting on a few key pointers when establishing purpose. Organisations typically want a data strategy to achieve a number of important goals, which may include some of the following (if not, it is still worth bearing these in mind when defining the purpose of your data strategy):

  • Manage data securely, compliantly and consistently, which is critical to the organisation’s success in minimising risk.
  • Establish a credible approach to move the organisation to be insight-led, utilising data at the heart of the way it operates, to improve the quality of decisions made.
  • Drive innovation and recognise data as an asset which can deliver value.
  • Embrace digital opportunities, which require a more effective approach to data management and exploitation.
  • Plan for future trends and be prepared for them to maximise opportunity and mitigate risk.
  • Use data to establish competitive advantage in a sustainable and innovative way.

6.2.2 Introduction

It is likely that you will need to define the data strategy – what it contains, and just as importantly, what it does not. This is particularly important if this is the first time your organisation has had a data strategy; setting it in terms that the organisation comprehends is essential for it to be understood, contextualised and adopted.

This is the point at which you need to articulate what is meant by ‘data’ in the context of a data strategy. Most organisations use the term data strategy to mean the whole host of information requirements, not just those specific to data. I shall, therefore, refer to data strategy to cover the spectrum of data, information and the exploitation of data in the following sections. I recommend that you follow a similar course in scoping the breadth of the data strategy for your organisation.

It would be usual, rather like a book, to acknowledge those who contributed to the data strategy, recognising their contribution but also, by implication, the wide reach the data strategy has had in its compilation and the breadth of inputs it has incorporated. There should be recognition of the sponsor and the author(s), and remember to include the publication date. I strongly advise the inclusion of a recommendation regarding the review cycle and process (for instance, which governance board within the organisation ‘signs off’ on the data strategy and how it is commissioned). Whilst the recommendation may not be adopted, it does provide a prompt to the organisation to determine the appropriate review cycle and process following the adoption of the data strategy.

6.2.3 Data management

The first section of your data strategy usually covers data management. Before I go further, let me define what I mean by data management.

The challenge of defining the approach to data in a strategy document entitled ‘data strategy’ is how you refer to the data elements of it if the scope of the data strategy embraces all aspects of information (see Chapter 3 for further elaboration on scope). I am using the term ‘data management’ to encompass the full range of data-specific activities, which I shall cover below, and which are all related as elements of the management of data. The term often has specific meanings to some, so for those individuals I am taking liberties in how I use ‘data management’, but this is the challenge with the taxonomy in this field; it has evolved and so the terminology has morphed into having numerous meanings.

There are many frameworks available to define the data management spectrum. Two of the most established and commonly used are provided by DAMA and the CMMI (Capability Maturity Model Integration) Institute – the latter had been collaborating with the EDM Council but ceased to partner on the approach in 2014, with each having their own data maturity models as a result – which I will cover further in this section. There are numerous others, some of which I will also reference to highlight areas that are potentially of interest to explore further should you wish to depart from either the DAMA or CMMI frameworks.

I have added the Accenture data maturity model as I think it is helpful to demonstrate the transition from an ad hoc type of approach to what is termed an ‘industrial’ level as maturity develops across five key themes. The sequential nature of the flow is perhaps not to be seen literally, as it is feasible (and often desirable) to be making progress on multiple points simultaneously, but is indicative of a growing level of data management sophistication.

I do not intend to go into too much detail on each section of data management, simply because there are many publications available which cover this topic extensively. Remember, too, that this is a data strategy, so keep your content in this section relevant to the audience you are seeking to engage – it is not a research document or solely for data management professionals to digest.

6.2.3.1 Data maturity assessment

The starting point of a data strategy should be an assessment of where you are starting from as an organisation, particularly your maturity in data management. There are many data maturity assessment methodologies out there, but most are variants of one another, typically with 5–6 levels and 20–25 categories on which to score current performance. The main variance tends to be in the level of detail within criteria; some of the outlying criteria may differ, but, as is to be expected, the core components tend to be relatively common (for instance, data architecture, data governance, data quality, design and standards).
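The mechanics of such a levels-and-categories assessment can be sketched in a few lines. This is a minimal illustration only, not the official method of any framework: the five level names and the example category scores are assumptions for the sake of the sketch, and real frameworks such as DAMA or CMMI define their own categories and scoring criteria.

```python
# Hypothetical five-level scale; named frameworks use their own labels.
LEVELS = {1: "Initial", 2: "Managed", 3: "Defined", 4: "Measured", 5: "Optimised"}

def assess(scores: dict[str, int]) -> tuple[float, str]:
    """Average the per-category scores (1-5) and map to a named level."""
    for category, score in scores.items():
        if not 1 <= score <= 5:
            raise ValueError(f"{category}: score must be 1-5, got {score}")
    average = sum(scores.values()) / len(scores)
    return average, LEVELS[round(average)]

# Illustrative baseline against four commonly shared core categories
baseline = assess({
    "data architecture": 2,
    "data governance": 1,
    "data quality": 2,
    "design and standards": 3,
})
print(baseline)  # (2.0, 'Managed')
```

Scoring each category separately, rather than only keeping the average, is what later lets the strategy point at specific gaps (here, data governance at level 1) when setting waymarkers.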

The benefit of establishing a baseline is the opportunity it gives to set realistic expectations of where you are seeking to reach within the time frame of the data strategy and the steps required to get there. This will enable you to establish a common understanding of your starting position, clearly articulating the basis on which that judgement has been reached and referencing specifics within your organisation that provide meaningful context, and therefore to outline the approach needed to progress. Organisations with a low level of data maturity often lack trust in their data and are not exploiting it fully for that reason. By contrast, those with a higher level of data maturity will be demonstrating how important data is to the organisation through its use in evidence-led decision making, with trust and transparency underpinning this approach, and will be more confident in their direction setting.

This provides a point on which to ground what follows in the data management section of your data strategy, which is why I usually advocate starting with it rather than, as some data strategies choose, ending with it as a summary. I think it has more influence as a contextual scene-setter that gains the audience's attention than it does under the risky assumption that the reader has made it to the end of the section, fully cognisant of how it all comes together in the data maturity assessment.

The data maturity assessment should be used throughout the period the data strategy covers to drive coherent actions focused on how to enhance the maturity whilst embedding such changes in the organisation. It is an important baseline, enabling the organisation to have a standardised and agreed methodology for measurement. The assessment can be used as a barometer of progress to ensure those things it has committed to achieve are being supported and, where appropriate, funded to enable progress.

There are a number of data maturity assessment models available to use, two of which are shown in Figures 6.2 and 6.3. These are some of the more commonly used versions, but most consulting firms have devised variations that they will offer and provide consulting services to deliver. There are also maturity models which are specific to data governance (see Figure 6.4), as well as BI and analytics.

6.2.3.2 Data and its purpose

It is worth making a definitive statement within your data strategy about the relevance of data to your organisation. This might sound obvious, but treating data as an asset within your organisation is more likely to move the discussion towards a recognition that, as with any asset, it needs managing and funding to maintain it at its highest level of quality. If your organisation aspires to be information- and insight-led, then the quality of the data is aligned directly to the value its use will deliver in its processes and decision making.

Figure 6.2 Data management maturity model CMMI ©2014 All rights reserved. Used with permission.


Figure 6.3 Data maturity model Copyright © 2018 Accenture


This leads on to the concept of being ‘fit for purpose’. Not all organisations necessarily have the understanding to know what fit for purpose amounts to, and therefore how good data needs to be. There is a balance to be struck with data quality. There is an investment required to make it highly performant for the organisation, and it must be considered in the context that not all data is equal in its importance. As a result, investing time and resources in data quality is potentially a trade-off between the need for accuracy, precision and timeliness and the data being good enough for decisions to be made. Of course, this has to reflect the need to be compliant, and so it is a complex matrix that determines what fit for purpose data amounts to whilst remaining secure, safe and compliant.

Defining purpose also provides a way to set out the stall of the data strategy in any pitch for resources and/or funding. If there is broad agreement as to what constitutes fit for purpose, the data maturity assessment will provide a path to achieving this and highlight what needs to be addressed. This can be a useful springboard to a business case to deliver the operational activities that underpin the implementation of the data strategy.

6.2.3.3 Data governance

The data strategy needs to explain the approach to data across the spectrum of activities within what is collectively known as data governance. As the term suggests, the governance of data is focused on the end-to-end management of data and typically consists of common themes based on data architecture, data modelling and design, data security, master data management, data standards, data quality and the key aspects of how that is managed collectively, whether through operational data stores or data warehouses, or in document and content management systems.

There are numerous ways in which data governance is typically characterised, but one of the most established is the DAMA wheel. The definition DAMA uses to describe data governance is more encompassing than some, and this is where the terms data management and data governance can sometimes be interchangeable. For instance, DAMA would go so far as to suggest a data strategy is part of data governance, and in a narrower definition solely focused on data management this might have some merit, but in the rather broader (or, some might say, looser) definition of a data strategy encompassing the exploitation of data, this doesn't hold true: data governance would typically be part of the data strategy.

The key difference between the DAMA wheel and the CMMI Institute model is that the former is based on knowledge, the latter on process. This leads to a slightly different focus, in which DAMA has 11 top-level domains compared to just five in the CMMI model (six, if you include supporting processes), though there is a greater level of granularity in the latter, with 20 process areas and five supporting areas underpinning the five categories of the model.

In practice, both models give a view on how to progress maturity, and so it is as much a question of preference whether you wish to base the maturity assessment on knowledge, process, competency (the IBM model fits this approach) or business capability (the DCAM model). The Accenture model is in some ways a blend: it also has five key dimensions underpinning the approach, but is more extensive than some of the others in its reach into data exploitation.

It doesn’t matter to me how you define data governance or data management – they are labels which, as we have seen, carry different definitions in well-established reference guides that exist to enable the implementation, and ongoing delivery, of data governance and management activities. What is key is to ensure there is a common understanding of the terminology used within your own organisation, and the data strategy presents an important opportunity to establish the terminology you would like to see adopted going forward. Therefore, ensure you seize the moment to be clear on your definitions, both in the appropriate parts of the document and in the glossary, which will be the authoritative list of definitions used in the data strategy and a handy reference for establishing the lexicon of the organisation.

Figure 6.4 The DAMA wheel Copyright © 2017 DAMA International.


Rather like the DAMA wheel, the Gartner data governance maturity model3 implies that data governance spans the full range of data management activities. It refers to information as early as level one of its multi-level approach, identifying this as the stage at which the value of information is recognised and an enterprise information management strategy is required (for which, again, the term ‘data strategy’ is often used interchangeably!).

So, what place does data governance have within the data strategy?

Regardless of how wide you wish to make the definition, the need to establish principles of controls and standards is essential to make progress in the wider context of a data strategy. The definition may change, but the same principles apply.

Common data standards are essential to effective data governance and are the lynchpin of your data management capability. Data must be structured so that sources are known, defined and secure, yet accessible as needed to utilise them effectively. Data quality is the essential standard by which you certify that the data is reliable to use, or highlight its deficiencies. This applies to data whether held in recognised systems or stored by any other means, and it is pivotal to your data strategy.

I have applied a kitemark approach to make the quality of data visible to all, flagging any MI or reporting issued to illustrate the reliability and veracity of the data within the report: green meaning the data is of high quality and decisions can be made with confidence; amber indicating it is broadly of acceptable quality but has known issues, so is good enough for indicative purposes though not accurate; red demonstrating that the quality is limited, so only broad-brush decisions should be made, and only if absolutely necessary, due to the risk of error.
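The kitemark logic can be sketched as a simple mapping from a measured quality score to a RAG flag to stamp on a report. The 0.95 and 0.80 thresholds here are illustrative assumptions, not figures from the text; an organisation would set its own thresholds against its agreed quality standards.

```python
def kitemark(quality_score: float) -> str:
    """Map a data-quality score in [0, 1] (e.g. the proportion of
    records passing validation) to a RAG kitemark for a report."""
    if not 0.0 <= quality_score <= 1.0:
        raise ValueError("quality score must be between 0 and 1")
    if quality_score >= 0.95:
        return "GREEN"   # high quality: decisions with confidence
    if quality_score >= 0.80:
        return "AMBER"   # acceptable for indicative purposes, known issues
    return "RED"         # limited quality: broad-brush decisions only

print(kitemark(0.97))  # GREEN
print(kitemark(0.85))  # AMBER
print(kitemark(0.60))  # RED
```

The value of the kitemark is less in the mapping itself than in making the rating, and the standard behind it, visible on every report that leaves the data team.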

The data maturity assessment will flush out where the gaps lie in your organisation’s data governance, as well as other aspects of data management and its exploitation. The challenge is to align the organisation to put bandwidth and funding, if required, into resolving these gaps. The data strategy must set out clearly how this is to be approached, defining waymarkers to be achieved that align with the expected rate of progress that is at the heart of the data strategy.

How this looks for your organisation will entirely depend on its maturity and appetite to drive change. Much of this is cultural, behavioural and, sadly in many organisations, seen to be either someone else’s problem or a burden being imposed on staff who are already busy. Assigning data owners and stewards, and setting quality standards with their input, is an essential first step to getting the corrective steps in place.

The compliance lever is always an option to get movement in the data governance arena. Personally, I tend to think this is a blunt instrument as there are usually plenty of positive reasons to improve data integrity and quality within an organisation. In a private sector organisation there are almost certainly revenue-generating opportunities to be had from knowing your customer better, or efficiencies to be gained by removing duplication of effort and/or making data more readily available. Those organisations outside the private sector can focus on efficiencies and effectiveness, especially in providing services of a higher quality at a lower cost by getting it right first time and presenting the whole picture to those in customer-facing roles.

That is not to say that the compliance angle is not important, as it most certainly is; just look at the scale of the fines levied on the companies highlighted in Chapter 3 and consider how your own organisation might be affected. Whether or not hefty fines or increased pressure to comply are a realistic risk, there is little to be gained from teetering on the edge of compliance: it is usually more expensive to fix retrospectively than to operate compliantly today, not least because the time and cost of resolving issues can be prohibitive given the scale and complexity involved. Better to be ahead of the curve, operating compliantly and maintaining a level which is affordable and ensures the organisation is both safe and legal.

In many organisations, data governance initiatives start by addressing specific tactical issues to get a level of traction and the potential for recognition if such initiatives can assist in solving them. This may, initially, constrain the scope of the data governance activity (for instance, to a specific function or project), but it could also enable it to be delivered more deeply through having something to focus on, thereby delivering more value and increasing the evidence on which it is judged to have succeeded.

As evidence grows of its success and awareness increases, so the data governance initiative can expand its scope and use the success as a case study to demonstrate the approach taken. This will enable the breadth of activity within data governance to be delivered, as the opportunities that lie ahead require more elements of data governance to be deployed. As a result, data standards, data quality and consistency in master data management will start to take hold in the organisation.

There are many fine texts out there on the topic of data governance, along with frameworks and models which can be used to measure the current state and track progress (further details in the bibliography of this book). Use these, as they are effective in providing clarity and cohesion across the organisation on what is needed – don’t forget, data governance may not be the most exciting topic for many senior stakeholders to discuss at any length, so make it easy by making it accessible to them.

6.2.3.3.1 Data integrity The integrity of data is an area which typically forms part of the data governance activity. It is often used interchangeably with data quality, but this misunderstands the breadth of what data integrity encompasses. There are various definitions of data integrity from a variety of sources; some of the most commonly referenced are below.

Good Manufacturing Practice/US Food and Drug Administration (FDA) 21 CFR Parts 210–12 – ALCOA – is used in the pharmaceutical industry and, as this is a regulated sector, requires evidence of compliance to the data integrity principles (subsequently expanded to ALCOA Plus with four additional criteria).4 Some of the definitions are specific to the nature of the sector:

  • Attributable – data should clearly demonstrate who observed and recorded it, when it was observed and recorded, and who it is about.
  • Legible – data should be easy to understand, recorded permanently and original entries should be preserved.
  • Contemporaneous – data should be recorded as it was observed, and at the time it was executed.
  • Original – source data should be accessible and preserved in its original form.
  • Accurate – data should be free from errors and conform with the protocol.

The Dodd–Frank Wall Street Reform and Consumer Protection Act CFTC 1.73 – an act passed in 2010 to increase accountability and transparency in the light of the financial crisis of 2008 – expects financial organisations to maintain data integrity, enforced by a variety of rules and regulations across multiple jurisdictions. This particularly requires a data integrity focus on the following:5

  • Entity and Referential Integrity: trusted identification and relationships exist between tables, with confidence that the structure is maintained and improved as the firm continues to grow.
  • Context of Data: bad feeds and/or data points are addressed, corrected, documented and communicated to the required stakeholders.
  • Incomplete Data: this causes disruption for consumers of the data and needs to be actioned by the team. It could be a bad file from a vendor or a failed internal process, but it will be easier to identify and investigate with a centralised team that understands the firm’s entire data sets and their uses.
  • Timeliness of Data: files and processes need to run seamlessly for the proper information to flow through the organisation.
  • Changing Environments: as system and software changes are made, attention needs to be paid to the impact on the data. The team is responsible for ensuring that everything is cut over properly and seamlessly.

Generally, data integrity focuses on the end-to-end lifecycle of data. It is wider than data quality, albeit quality plays a part in integrity, extending through to the retention, archiving and redundancy of data. It encapsulates the auditing and tracking of data so that the maintenance of integrity against defined standards can be evidenced. It is related to, but separate from, data security, which in turn depends on data integrity to support business continuity and data loss prevention.
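One common technique for evidencing integrity across the lifecycle is to record a cryptographic checksum when data enters the estate and re-verify it at later stages such as archiving, restoration or transfer. The sketch below is illustrative, with made-up file names and data:

```python
import hashlib

def checksum(payload: bytes) -> str:
    # SHA-256 digest acts as a tamper-evident fingerprint of the data
    return hashlib.sha256(payload).hexdigest()

# On ingestion, store the checksum alongside the data in an audit record.
original = b"customer_id,balance\n1001,250.00\n"
audit_record = {"object": "accounts_2024_01.csv", "sha256": checksum(original)}

# At any later lifecycle stage, integrity can be evidenced by recomputation.
def verify(payload: bytes, record: dict) -> bool:
    return checksum(payload) == record["sha256"]

assert verify(original, audit_record)                     # unchanged: integrity holds
assert not verify(original + b"tampered", audit_record)   # altered: flagged
```

In practice the audit records themselves would be held separately from the data they describe, so that a single point of compromise cannot silently rewrite both.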

Reflect on how data integrity features in your data strategy. Just like the other elements of data management, it may already be in train but it should be called out and referenced, with a clear view of where the organisation needs to get to in the period covered by the data strategy. Even if these activities are under way within the organisation, they should be incorporated into the data strategy, bringing coherence to the whole of data management in the wider context of how it enables data compliance and exploitation.

6.2.3.3.2 Data standards It is important to consider data standards within your data strategy. Whilst not necessarily called out explicitly in many of the models, data standards define the way you choose to operate within your organisation. Their role is to provide a common understanding, setting a standard all are expected to follow, so that quality and integrity can be assessed from that basis. Without clearly defined and communicated data standards, each member of your organisation will carry their own view of what the standard is, and you will find you are operating with many variations, making data governance an impossible task as each person will believe their interpretation to be correct.

Many organisations do not have data standards, or have something akin to these but have never documented them. This leads to confusion, as one interpretation of data entities and attributes differs from another, and so the way in which data is captured varies markedly. This carries a cost, through the daily effort in re-work that goes on at an individual level to transform the data to make it consistent before it can be used, as well as hindering the ability to bring data sets together through the lack of consistent data standards.
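The value of documenting a standard is that it can be expressed as an executable rule rather than tribal knowledge. The sketch below shows a hypothetical standard for a single attribute (a UK-style postcode field, chosen purely for illustration) and a check for conformance:

```python
import re

# Hypothetical documented standard for one attribute. The pattern is a
# simplified illustration, not an authoritative postcode specification.
POSTCODE_STANDARD = {
    "name": "postcode",
    "description": "Upper case, single space before the final three characters",
    "pattern": re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? \d[A-Z]{2}$"),
}

def conforms(value: str, standard: dict) -> bool:
    return bool(standard["pattern"].match(value))

# With a shared standard every team applies the same rule; without one,
# each of these variants might be someone's 'correct' interpretation.
assert conforms("SW1A 1AA", POSTCODE_STANDARD)
assert not conforms("sw1a1aa", POSTCODE_STANDARD)
```

Capturing standards this way also makes the re-work cost visible: data failing the check can be counted and reported, rather than silently corrected at individual desks.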

There are data standards published by type of activity as well as by sector. Just a few of the established versions include:

  • ISO 55000 – asset management data and decision making – defines data as an asset, covers how data is there to meet stakeholder needs for information, highlights the data lifecycle and includes archiving and deletion, as well as the retention of data and document management.
  • The FDA has established standards and a programme to enable organisations to align. The CBER-CDER6 Data Standards Strategy provides a consistent methodology for pre- and post-market regulatory review to ensure safe and effective medical products are available to patients. The goals of the CBER-CDER Data Standards Program Action Plan are set out in Figure 6.5.
  • Common Education Data Standards is a US-based model that creates a schema which clearly identifies and standardises educational organisations and their relationships with others, with clear naming conventions and metadata ready to be adopted by those within the sector. Its use is entirely voluntary, but its goal is to link and streamline the exchange of data across institutions and sectors.

Figure 6.5 CBER-CDER data standards strategy goals Source: USFDA Data Standards Program Action Plan https://fda.report/media/149624/Data_Standards_Program_Joint_Action_Plan_v5.1.pdf.


Data standards are exceedingly important. For instance, if your organisation manufactures physical products, there are almost certainly manufacturing data standards to be met to confirm the products are appropriate for use. Where products must conform to a consistent standard, such as AA batteries or the type approval of a car or component, these standards are clearly articulated and industry-wide. Without data standards there would be a lack of consistency, and the cost of commonplace items would be significantly higher.

In 2020 the UK government established the Data Standards Authority, which consists of a multidisciplinary team drawn from a wide range of backgrounds in technology, strategy and policy. Working with experts across the wider public sector and devolved administrations across the UK, as well as academia and private sector organisations, the DSA identifies, improves and helps implement data standards that meet user needs.

It is important to establish which data standards your organisation is following, and the level of compliance with them. If you cannot find a standard, then there is probably a strong case for introducing one or more to drive greater efficiency, further cementing the data strategy at the heart of making the organisation both data aware and conscious of the importance of the data strategy.

6.2.3.4 Data retention and compliance

One of the areas closely associated with data governance, as well as compliance, is the policy of the organisation towards data retention. In many jurisdictions, there are specified retention periods for various types of data but these may vary from one location to another, so if your organisation operates across boundaries and the data strategy has to support all parts of the organisation, it may be necessary to reflect differing compliance regimes in the data strategy.

Within the European Union, GDPR imposes the need for a data retention policy, covering the seven key GDPR principles:

  1. Lawfulness, fairness and transparency.
  2. Purpose limitation.
  3. Data minimisation.
  4. Accuracy.
  5. Storage limitation.
  6. Integrity and confidentiality.
  7. Accountability.
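The storage limitation principle in particular translates naturally into a retention schedule: each category of data carries a defined retention period, and records past that period are flagged for archive or deletion. The sketch below is a hypothetical illustration only; the categories and periods are invented and are not legal advice.

```python
from datetime import date, timedelta

# Hypothetical retention schedule: category -> retention period in days.
RETENTION_DAYS = {
    "marketing_consent": 365 * 2,
    "invoice": 365 * 6,
    "recruitment_unsuccessful": 180,
}

def due_for_deletion(category: str, created: date, today: date) -> bool:
    # Storage limitation: keep data no longer than the agreed period
    return today > created + timedelta(days=RETENTION_DAYS[category])

today = date(2024, 1, 1)
assert due_for_deletion("recruitment_unsuccessful", date(2023, 1, 1), today)
assert not due_for_deletion("invoice", date(2023, 1, 1), today)
```

Agreeing the numbers in such a schedule is the hard part, as the next section on data owners and stewards discusses; once agreed, enforcement is largely mechanical.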

Be aware of local regulations in the domain in which you are operating and, if yours is a global institution, how these differ by country and therefore the need for clarity in how these are complied with at a local level. Remember, basic concepts like the use and retention of data may differ, and you will be subject to local laws for data pertaining to that jurisdiction.

There is a growing number of data privacy laws established around the world, many of which are modelled on GDPR. These are just some you might need to be aware of:

  • Australia – the Privacy Amendment (Notifiable Data Breaches) to Australia’s Privacy Act came into effect in 2018.
  • Brazil – Lei Geral de Proteção de Dados, modelled on GDPR, came into effect in 2020.
  • Canada – the Digital Charter Implementation Act has been, at the time of writing, proposed by the Canadian government and will reshape the Canadian privacy landscape through the introduction of the Consumer Privacy Protection Act to overhaul the existing Personal Information and Electronic Documents Act.
  • Chile – in addition to amendments to Chile’s constitution, a series of updates to Ley 19,628 (the legislation known as ‘On Protection of Private Life’) have been passed and current proposals are reflecting changes to bring it more into line with GDPR.
  • China – at the end of 2020, the Personal Data Protection Law was released in draft form and impacts any organisation doing business in China, rather than just those operating via a physical presence in the country.
  • India – the Personal Data Protection Bill 2019 has been progressing through the Indian parliamentary scrutiny process for some time but is expected to be debated post-committee stage via a final report in autumn 2021.
  • Japan – the Act on Protection of Personal Information was amended to apply to both foreign and domestic companies in 2017, and has a ‘reciprocal adequacy’ agreement in place with the EU.
  • New Zealand – the Privacy Act of 1993 was amended in 2020, though it is some way short of being as comprehensive as GDPR.
  • South Africa – the Protection of Personal Information Act became law in 2020 and has some key differences to GDPR, not least the application of criminal charges.
  • South Korea – privacy standards in South Korea have been well established, with the Personal Information Protection Act in law since 2011.
  • Thailand – the Personal Data Protection Act was passed in 2019 but came into effect in 2021. It is similar to GDPR, but as with South Africa, criminal charges can apply.
  • USA – the strictest, and most high-profile, data privacy legislation is the California Consumer Privacy Act, which has some similarities with GDPR. Several states have followed California’s lead in seeking to implement their own such legislation. Most interesting for those operating on both sides of the Atlantic, the EU–US Privacy Shield framework was invalidated by the European Court of Justice due to surveillance fears and concerns over a lack of privacy standards in the US. Data transfers from the EU to the USA typically rely on standard contractual clauses, non-negotiable legal contracts drawn up in the EU, until such time as a new accord can be reached.

In addition to the above, trading blocs such as the Organisation for Economic Co-operation and Development and Asia-Pacific Economic Cooperation have devised their own data privacy guidelines for cross-border data transfer. However, these guidelines tend to be less rigorous than those devised in-country and so have little effect when trading with nations that have adopted more stringent data privacy laws.

Do not be surprised if your investigations prior to developing the data strategy reveal there to be much more data out there than you were initially aware of. Organisations frequently store documents off-site, and the phrase ‘out of sight, out of mind’ is often very true when it comes to unstructured documents which may not have been recorded particularly well when sent off-site. You wouldn’t be the first to find off-site storage belonging to the organisation that has no reference as to what documents are contained therein.

Seeking to get agreement to data retention periods sounds simple enough, but is often challenging, not least because different parts of the organisation may have different requirements of that data. This is where the data governance roles of data owner (the individual accountable for specific data, typically defined by domains such as finance, HR and so on) and data steward (who is responsible for implementing governance initiatives on behalf of the data owner and would usually be an individual with deep knowledge of that data domain) play a key part, identifying the stakeholders for data in the organisation and seeking to gain agreement to the retention policy for that data entity. Once established, of course, the rules need to be applied, which should also be borne in mind when defining the data strategy, as this will become a critical element of the implementation plan.

The final stage of the data lifecycle is to archive or, in most cases, destroy the data. In some (rare) cases, the data may not be destroyed due to the legal requirement to retain it or it being a matter for public record. However, most organisations can and should archive and subsequently delete data once it exceeds the data retention policy, and often will have an imperative to do so to remain compliant.

In some cases, systems may not enable hard deletion of records and this presents a significant challenge from a compliance point of view. This will be particularly so if those systems are relatively old and hence a soft deletion, in which the data is not visible to the user, may be the only option open to you. In such cases, there is a need to align your activity with that of the information risk community and IT systems architects to explore what their plans are to remediate this issue.
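The soft-deletion pattern described above can be sketched simply: where a system cannot remove rows, a flag hides the record from users, but the data persists, which is exactly the compliance gap to raise with IT and information risk colleagues. This is an illustrative example with made-up data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT, deleted INTEGER DEFAULT 0)")
conn.execute("INSERT INTO customer VALUES (1, 'Alice', 0), (2, 'Bob', 0)")

# Soft delete: 'delete' by flagging, not removing -- the row persists.
conn.execute("UPDATE customer SET deleted = 1 WHERE id = 2")

# Every user-facing query must now filter on the flag.
visible = conn.execute("SELECT name FROM customer WHERE deleted = 0").fetchall()
print(visible)  # [('Alice',)]

# A hard delete, where the system permits it, actually removes the data.
conn.execute("DELETE FROM customer WHERE deleted = 1")
```

Note the operational fragility of the soft approach: one query that forgets the `deleted = 0` filter re-exposes data the organisation believes it has erased.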

6.2.3.5 Open data

There is a growing trend towards transparency of data, making it more accessible to those outside the organisation where it is not specifically a competitive threat to do so. Open data standards are being developed rapidly to enable data to flow freely and to enable entrepreneurial organisations and individuals to access data from multiple sources to develop new applications.

There are many open data initiatives. I have listed just a few here (Table 6.1) but if you are interested to know more there are a number of groups you can engage with through the International Open Data Charter,7 which works across governments and wider organisations worldwide to facilitate the sharing of knowledge of open data.

Table 6.1 Selected open data standards

  • GovEx (the Centre for Government Excellence, Johns Hopkins University, USA) – developing a list of civic data standards.
  • Geothink (McGill University, Canada) – project to enable municipal open data publishers to standardise data sets.
  • European Data Portal (data.europa.eu), a joint venture with Johns Hopkins University and McGill University – the Open Data Standards Directory, listing over 60 open data standards.
  • Open Data Institute – developing an open data ecosystem with governments and organisations.
  • Open Standards Board – UK government body to define open standards for data, technology and services.
  • OpenActive – UK-based organisation providing open data standards for sport and physical activity.
  • Open Data Standards – develops the Open Exposure Data Standard (OED) and Open Results Data Standard (ORD) to support Oasis loss modelling framework-based models in the reinsurance market.

To enable the adoption of open data, you will almost certainly need to adopt a data classification approach to align the data governance effort with the sensitivity of that data, enabling it to be classified and accessed accordingly. Whilst this is potentially a low-level activity in itself, the trend towards a more open data landscape is gathering pace, and so this may well require an acknowledgement within the data strategy and a statement of intent and direction if it is something your organisation needs to embark upon.
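A data classification approach of the kind described can be sketched as a small set of sensitivity levels gating what may be published openly. The levels, labels and catalogue entries below are hypothetical; a real scheme will reflect your own regulatory context.

```python
from enum import IntEnum

# Illustrative classification levels only.
class Sensitivity(IntEnum):
    OPEN = 0          # publishable externally
    INTERNAL = 1      # organisation-wide use only
    CONFIDENTIAL = 2  # restricted, named access

def publishable(classification: Sensitivity) -> bool:
    # Only data explicitly classified as OPEN leaves the organisation
    return classification == Sensitivity.OPEN

catalogue = {
    "bus_timetables": Sensitivity.OPEN,
    "staff_salaries": Sensitivity.CONFIDENTIAL,
}
open_data = [name for name, c in catalogue.items() if publishable(c)]
assert open_data == ["bus_timetables"]
```

The point of the sketch is that classification makes the open data decision auditable: each data set carries an explicit label, rather than publication being decided case by case.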

6.2.3.6 Data acquisition

If your organisation is dependent upon data acquired from other organisations, such as credit reference agencies or other third-party service providers, then it would be appropriate to map out the acquisition strategy in the overall data strategy. This may not change in the period the data strategy covers, though with the increasing volumes of data in the world there are an ever more varied range of data sets that can be acquired, and so it is perhaps a timely opportunity to consider whether this innovation has created an opening to enhance your own data, or seize new business opportunities.

Setting out the current position, the value extracted from the data acquired and the potential still to be exploited enables all within the organisation to understand the value of the investment being made. This is particularly relevant if the data in question is key to the delivery of activity, and without it the activity would be hindered or not viable at all, so that there is a clear understanding of why the investment should be retained through the period the data strategy covers.

6.2.3.7 Data systems

Whilst the data systems should be covered within an IT strategy produced elsewhere in your organisation, it is appropriate to cross-reference any significant changes or investments being made in new systems within the period covered by the data strategy. It is highly likely that there will be critical data migration tasks to be undertaken which will present opportunities to improve data quality (though it is depressingly common for organisations to make minimal effort to enhance data prior to migration, only to discover the significant investment in the new system is thoroughly undermined by the poor quality of the data). Such instances should be called out in the data strategy, with dependencies mapped across to the IT strategy.

6.2.3.8 Data capabilities

Depending on the current sophistication of data expertise within the organisation, the data strategy should highlight the current state and desirable future state of data capabilities within the organisation, and propose an approach to address this, either through training or recruitment. In many ways, a data capability assessment is as key to the organisation’s success as the data maturity assessment, given one is identifying the steps needed to be taken and the other the means to be able to deliver these.

There are two levels: capability to deliver, and capability to utilise what is delivered. It is essential to establish the position of both, as these will either inhibit or enable your data strategy to realise its goals, and so there must be a clear focus on how any gaps will be resolved. It may require buying in expertise, either through recruitment or consulting routes (more a short-term expediency, as the expertise will need embedding in your organisation as soon as possible to be effective).

It is likely that you will need to specify how any gaps can be filled, as the nature of the specialism will probably mean that there is not an understanding as to how to resolve this within your organisation at the moment. There will also need to be consideration given to the potential to upskill your resources, as this is cheaper and has the benefit of building on the corporate knowledge those members of staff already have of your organisation. However, such an undertaking must not be done lightly, as it is equally a significant commitment from those colleagues who are willing to learn as it is for the organisation to invest in delivering training to them.

There is also a need to identify safe opportunities to learn from experience for staff who are either new to the organisation with relevant skills or being upskilled within it. It is important not to set them up to fail by taking on something too big or complex as their first opportunity to deploy those skills. Establishing confidence is key. Newcomers will become effective quickly by feeling their way into the organisation and establishing trusted networks; existing staff need to be seen to succeed, as their newly acquired skills may be undermined if they are exposed in front of colleagues who may be sceptical of their new-found capabilities.

6.2.4 Data exploitation

Now the foundations have been covered, assuming this data strategy has the broader definition of covering the entire information landscape, it is time to consider the exploitation of the data.

6.2.4.1 Reporting and management information

I mentioned in the previous section the confusion that arises from terms in this arena having different definitions, often causing complication through misinterpretation. MI and reporting – the terms are often used interchangeably in organisations – provide a backward look at what happened, tracking measures such as performance (how many stock items were sold last month), productivity (how many stock items were produced last month) and resources (how many days were lost last month due to absence, what was the return on investment last month).

What is the difference between MI and reporting? Again, it is largely preference and established practice within an organisation. MI is produced to enable decision making based on what has occurred, and underpins control and coordination of activities. It covers any business activity, so whilst the term ‘finance MI’ may be commonplace, MI does not have to be exclusively financial in nature. Without MI, organisations cannot report financial performance or compliance, or track sales or staff activities. It is therefore at the heart of any organisation wanting to know what is happening, whether by the minute, hour, day, week, month or even year, and typically is presented in the same format each month to ease the executive in spotting the key information of interest.

Reporting is a more generic term, and is confusing as MI is also a form of reporting. It can be used more colloquially to mean simple or routine reports, for example the quick insight generated by querying the system for basic information such as annual leave entitlement, stock levels for a particular item, staff in post at a specific grade or location – all things that the operational system can usually provide from its in-built reporting functionality without having to link numerous data sets and/or develop complex formulas to calculate the required information.

Whether you see benefit in having both terms in use in your organisation or not, it is likely that the genie is out of the bottle and these are already commonplace in the lexicon of the organisation. It may therefore be simpler, if you wish to avoid the terms being used as if synonymous, to try to differentiate along the lines above, in which reporting is low-level activity and not the calculated information generated as MI.

Many organisations will have an MI strategy: a coherent plan on how to use BI tools to deliver the outputs needed to run operational and financial activities. Assuming you are taking the track of creating a data strategy that encompasses the broader information landscape, the MI strategy would form a sub-strategy, providing a more detailed view of what is to be delivered than rightly belongs in the data strategy.

In addition, the MI strategy will dovetail with the IT strategy in terms of the BI tools, data warehousing or data lakes, data storage mechanisms to support the MI strategy and the data retention and integration strategies (increasingly, in a cloud computing environment, the traditional ways of integrating data are being superseded by new methodologies). There will also be overlap with the approach to information risk and security, along with data protection regulations, given the accessibility and retention policies play heavily into the reporting space.

The investment in a central repository, such as a data warehouse or lake, will be key to your ability to integrate data sets to support reporting. Whilst some organisations are able to avoid this sort of investment through deploying data virtualisation techniques, if your data is fragmented – especially if you have outsourced some of your activity and that data is managed by a third party on a system outside your organisation – then the integration of data is likely to require a means to bring these together, which is what a data warehouse or a data lake can do for you.

Taking extracts from multiple places and consolidating them in one place, using common keys and links to join data sets together, transforms the ability to deliver reporting in a consistent manner, opening up opportunities to build once and automate, thereby saving resources. It can also be very beneficial in surfacing opportunities that would otherwise have been overlooked, as the first time data is linked together it may reveal something which would have gone unnoticed in the organisation. It is such moments that deliver real value and accelerate the return on investment (ROI).
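The consolidation step can be sketched in miniature: extracts from two systems are brought together on a common key, enabling a single consistent report to be built once and automated. The systems, fields and data below are entirely made up for illustration.

```python
# Extract from a hypothetical sales system.
sales = [{"customer_id": 1, "amount": 120},
         {"customer_id": 2, "amount": 80}]

# Extract from a hypothetical CRM, keyed on the same customer_id.
crm = {1: {"name": "Acme Ltd", "region": "North"},
       2: {"name": "Brix plc", "region": "South"}}

# Join on the shared key, as a warehouse would via conformed keys,
# producing a single enriched view for reporting.
report = [
    {"name": crm[row["customer_id"]]["name"],
     "region": crm[row["customer_id"]]["region"],
     "amount": row["amount"]}
    for row in sales
]
```

At warehouse scale this join is done in SQL or an ETL tool rather than application code, but the principle is identical: agreed common keys are what make the consolidation possible.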

The use of maturity models has been referenced in other parts of the book, and one I particularly like – albeit one of the oldest, dating back to 2005 – is produced by an organisation formerly known as The Data Warehousing Institute, but now rebranded as Transforming Data With Intelligence (TDWI).8 It has developed a number of maturity models, but the original BI maturity model was an effective way to assess progress of your organisation on its adoption of an investment in data warehouse technology to becoming intelligence-led. It assesses five key aspects of an investment in BI capabilities over six stages of maturity to determine how advanced your organisation has become in enabling greater exploitation of the information:

  1. BI adoption curve – moving from a cost-based model to a strategic resource that drives the business and ultimately shapes the market. Low on this one and you are in the Japanese knotweed of spreadsheets. High, and yours has progressed to be a truly analytically driven organisation.
  2. Local control versus enterprise standards – the balancing act between a devolved model with limited standards and a more strategic approach embedded in the organisation with greater clarity on standards to be followed. As progress is made along the six stages, the transition is from ‘think local, resist global’ to ‘plan global, act local’.
  3. BI usage – an initial early adopter approach, with power users exerting influence that shifts as the organisation utilises reports to drive more effective operations to be more empowered, with the capability extended more widely and becoming more customer-focused.
  4. BI insight – shifting from a model of historic views of the data to drive a better awareness of issues within the organisation towards a more action-orientated approach, which ultimately exploits data nearer real time to optimise decisions, identify opportunities and embed such intelligence via models into core systems to lead to automated decision-making.
  5. Business value and ROI – the transition from the initial investment, with limited exploitation to release value due to data integration challenges, to an enterprise resource with stable costs and a platform to be exploited through automation of decisions, insight leading to competitive advantage and a more outward perspective on the opportunities to drive the market.

6.2.4.2 Analytics

The exploitation of data is at its most advanced in the analytics arena, moving from the retrospective of what has happened to looking at what is likely to happen – predictive and prescriptive analytics. The goal of analytics is to shape the future and enable the organisation to either adapt to those situations it cannot influence, mitigating the impact and steering a course to minimise the effect, or to positively influence a future situation to the advantage of the organisation. Of course, this only works if the organisation has the maturity to comprehend the message and adapt, and this is the tricky balance so many organisations fail to achieve – they embark on wanting to exploit analytics, but are not prepared for the negative discoveries they find, as they have perceived only upsides.

In my first role, I was tasked with rebuilding a forecasting model that predicted the operational volumes and revenue for the organisation. It worked well for a number of months, and then suddenly indicated volumes were about to drop dramatically, plunging the company into a loss-making situation. The model, which had been heralded for its relative accuracy for many months, was now dismissed as flawed and inaccurate. However, in a matter of weeks volumes fell off a cliff as the recession of 1990 hit home, and the company was plunged into crisis. If only faith had been retained in the same model which had been in vogue weeks before, and time had been spent on preparation rather than continuing as if oblivious to what was coming.

In much the same way as a data maturity assessment can demonstrate how advanced the organisation is in its management of data, there are analytics maturity models that serve a similar purpose. Most consulting firms use these, and there are certainly versions, like the McKinsey maturity model on analytic capability and utilisation,9 that are relatively easy to apply. Whilst these might be used in a less outwardly visible fashion than the data maturity assessment you undertake, it is worth exploring the maturity of the organisation with regard to becoming analytically driven.

The key to establish with the analytics maturity model when defining the data strategy is twofold: firstly, what capability you have to be able to deliver on the ambition in the data strategy (indeed, recruitment of analytics staff may be a part of the data strategy, if this is part of growing the capacity and capability); secondly, what the demand for analytics within the organisation is to be able to deploy this successfully.

In a lot of organisations, a few enlightened individuals recognise the potential of analytics to turn theirs into an insight-led organisation, utilising evidence and analytical skills to drive future outcomes. However, this may come unstuck if a similar demand does not exist across the whole organisation, such that the analytics capability is not well understood, it fails to gain traction and the organisation continues to operate as it always has – using intuition, repetition (and either failing or producing suboptimal outcomes) and flawed information garnered through asking the wrong question of poor-quality data.

I have often found myself being asked why, if these failings are so prevalent, more organisations don’t fail or lose influence in their markets. Surely, there should be an almost Darwinian shift towards those who invest in and exploit data effectively. My answer, unfortunately, is twofold. The majority of companies still haven’t truly found the sweet spot in analytics adoption to have outperformed their competitors and so fully evidence the significant differentiation that analytics brings. And where this has occurred, companies hold dear the knowledge of the competitive advantage that the application of analytics provides, rather than being overt in publicising what they are doing that is making such a difference.

I recall, many years ago, being prevented from presenting ground-breaking activity at conferences precisely because the organisation concerned did not want that sort of information leaking out to competitors.

In addition, just look at the vast majority of newer ‘tech-based’ companies and you will see that analytics is at the heart of the organisation’s culture and strategy. This is why it is important for you to take the initiative, and challenge the organisation and its practices if there is scope to manoeuvre your organisation to be a leader and not a follower.

In terms of content, the analytics section of your data strategy will be guided by the maturity of the existing capability and appetite of the organisation to embrace it. There will be a temptation to leap ahead in this field to advanced analytics, but there are pitfalls in such an approach, not least the cultural embedding and data quality issues you are most likely to find. Keep a focus on what the goal of exploiting data through analytics is. It may be to increase profitability by focusing on the right blend of customers to generate more sales volume, expanding their product holding, driving cost out of processes by identifying ways to increase efficiency or a myriad of other things the organisation could look to achieve. It may also vary depending on business unit or function – whilst not endless, there is likely to be a lot of data, which in turn creates a lot of scope to deploy analytics.

I recommend a balance of opportunism – identifying where there are stand-alone activities that could result in ‘quick wins’ – mixed with a more pragmatic focus on getting the foundations laid to support a significant investment in analytics. The latter does not have to be seen as a constraint; there is plenty of value to be gained along the way, and the timescale for delivering the foundations can be as bullish as the organisation can afford, commit to and resource. You need to position it accordingly to retain the link in key stakeholders’ minds that there is a need to invest in constructing the foundations to fully realise the greater ambition of the data strategy.

Bear in mind that delivering analytics early can increase its value, especially if it enables your organisation to be the first to spot and exploit an opportunity and so gain first mover advantage.10 In 2017 EY and Forbes identified through research11 that only 7 per cent of organisations move fast in exploiting analytics, with 38 per cent of this group incorporating analytics into the design of business initiatives.

Figure 6.6 provides a good illustration of the key points to be reached on the journey to embedding analytics fully into the organisational DNA. Without achieving the steps along the way, regardless of how quickly they are achieved, progress will be illusory, as it will not be building on the foundations of what went before. This is a continuum, and as such requires progression to make the gains stick.

As with the other sections of this chapter on content, I won’t dwell too long on the specifics of analytics. Again, many highly regarded texts can be found that cover this in significant detail, and the bibliography of this book provides some I have found useful.

Remember that the content needs to be coherent for your audience, and this is a section with the potential to run away with advanced statistical terminology, making it daunting or forbidding to the reader. This is the opposite of what we are aiming to achieve.

6.2.4.3 Data science

Data science12 is an oft-used term which is misapplied more commonly than it should be. Indeed, some would articulate data science and analytics to be one and the same, and from some of the descriptions applied to the former it is hard to distinguish it from the latter. For my purposes, I regard data science as an extension of analytics that has embraced the exponential growth in computing power and intelligence to provide additional capabilities, extending analytics into a programming environment far broader than a traditional analytics team would provide. This incorporates machine learning, AI, deep learning and advanced coding, all of which utilise the power of computing alongside the algorithmic influences of statisticians.13

Figure 6.6 FP&A strategy data maturity model to accelerate finance transformation (J. Myers and A. Alhagi, 2017)


The field of data and analytics has never quite recovered from the marketing hype that, since the turn of the millennium, has unleashed a number of terms and phrases,14 such as ‘big data’,15 ‘data is the new oil’, ‘open data’ and ‘data engineering’, to name only a few. Those of us engaged within the data and analytics community would struggle either to define these consistently or to agree on whether they are of any value to us today. Indeed, my own experience has been that they have made the challenge of engaging with those eager to embrace what we can deliver harder, introducing more terminology which, in itself, creates a further barrier to simplifying and standardising our language to make it accessible. It is all the more important to recognise these terms do exist, but to use them cautiously and – if it is indeed necessary to incorporate them – with the caveat of providing as standardised a definition as possible of what is meant within the data strategy.

Should you include a section on data science in your data strategy? Increasingly, I would advocate that you should, even if this is nascent in your organisation’s development, given the rate of growth of this discipline and the transformative effects it is having across all industries and the challenges it is posing to sectors which are being disrupted by it. If you are operating in one of those, data science should be part of your overall strategy, let alone data strategy, if you are to stay in business.

Examples of data science making significant headway proliferate across sectors, public and private alike. Indeed, there is plenty of discussion as to how data science is democratising data16 in a way perhaps not seen previously, making major changes to our way of working (and living) that transcend not just the routine but the previously perceived complex. Radiography, for example, which required highly skilled practitioners trained over many years, is now seeing AI achieve higher accuracy rates than qualified radiographers. AI and machine learning are also applied in facial recognition, typically by law enforcement agencies across the world, to identify individuals, analyse behaviour traits and track criminals amongst large crowds, though significant ethical and accuracy concerns have arisen, ranging from racial profiling to the infringement of human rights. Legal cases are beginning to identify instances in which the use of facial recognition and other AI applications is deemed unlawful due to privacy concerns and a lack of consent.

These are just two examples of radically different approaches which transform workplaces in a way we would not have envisaged a decade or two ago, but which provoke a growing disquiet over the advancement of technology and how it is balanced against the question of ethics. Data science is advancing faster than the ethical debate can keep pace with it, and although many institutions are seeking to provide guidance,17, 18 this remains a relentless challenge. The ethical debate should therefore also be referenced within your data strategy, to alert the reader to the challenges and to be prepared to engage – this is integral to the relationship your organisation has with its customers, consumers and stakeholders.

I would recommend, if this topic is pertinent to your organisation and therefore the data strategy, that you investigate further the direction of AI and machine learning and its impact on your organisation – even if you are not investing in this technology, you may find your competitors are, and this may put you at a significant disadvantage in the relatively near future if you are not prepared. It is also essential that you investigate the ethical dimension, and understand how this applies to your organisation, the sector in which you operate and the customers you engage with. The profile of the ethical concerns with this technology is growing, and your customers may well have concerns at any move your organisation makes to utilise it, so it is prudent and a key compliance step to communicate this element of your strategy proactively.

6.3 LOGICALLY STRUCTURING YOUR CONTENT

Establishing an understanding of what should be considered in your data strategy is clearly an early task in determining the look and feel of the strategy. The maturity models mentioned in this chapter provide helpful context, a framework to build a consensus within your organisation to establish an agreed baseline, and a focus for minds on the aspiration and outcomes the data strategy should encapsulate; however, a key thread is needed to link the range of topics together into an easy-to-follow strategy document that the lay person can access and action.

As with so much in the world of data and analytics, what is needed is a story to cut through the detail that can seem to be forbidding to those who have little background or desire to explore too far – they just want to know the what, why, when and how. Of course, the data strategy alone isn’t going to answer all of these, but it needs to ‘sell’ the vision and direction, such that the implementation plan is building on the positivity, bringing all parties together to make it happen.

Figure 6.7 shows how the steps outlined within this chapter build on one another to add value and intelligence and, as such, are not discrete islands of activity. The key to giving your content a story is to build this narrative into a picture that flows seamlessly from one aspect of the strategy to another. For many who work on a data strategy, this is one of the most challenging parts of delivering the strategy, as the flow is both obvious and, ironically, difficult to describe without resorting to the same words used to describe the content. It is an area where the analytical mindset has to converge with a creative partner to be able to describe, in whatever form it might take, how the series of interacting content blocks come together to deliver something recognisable and desirable to the wider organisation.

Figure 6.7 The transition to an intelligent, data-driven organisation generating business value


What is the risk of simply presenting the data strategy as a series of goals or statements of deliverables over a time period? To put it simply, unless you are in an organisation which is entirely data-driven or geared around data, most authors of a data strategy will discover that it is a hard ‘sell’ to be pushing any strategy, let alone one which is focused on data and its exploitation. An article in the Journal of Management & Organization in 201519 reported that around 50 per cent of strategies fail, though it acknowledged that identifying failures in executing strategies is challenging.

The CLEAR acronym was provided earlier in this book to remind those embarking on data strategy definition of the essential building blocks for positioning the data strategy so that it remains relevant to all within the organisation. The challenge when structuring the content is to fit it to the CLEAR goals, with clarity and relevancy two of the key points to bear in mind. If the final data strategy does not reflect the CLEAR principles, then you may find yourself making little progress despite devoting time and effort to it.

There are many ways the data strategy could be structured, and as with so many aspects of guiding the process of defining a data strategy, there is no ‘one size fits all’ solution. The maturity of the organisation, the drivers behind why a data strategy is being drafted in the first place, the sponsorship and wider alignment of the data strategy, not to mention the recognition of the need to change the organisation if the data strategy is likely to bear fruit, are all dependencies in the thinking required in approaching how to structure the strategy content. Nonetheless, there are some key principles to be considered.

Firstly, many strategies fail to reach execution because, in reality, they are not strategies at all.20 This might sound obvious but, as various studies have found, one of the commonest forms of failure is the strategy not being executable because it is an aspirational statement with no grounding in the rationale behind it or the steps to achieve it. Therefore, do not fall into the trap of failing to set out your strategy, and remember to use CLEAR as a constant check.

Secondly, the strategy has to be relevant to all who are expected to read it, which should be a broad base across the organisation given that data touches all. It should be made accessible, interesting and informative – if the reader does not take something away with them having read the strategy, then nothing is going to change.

Thirdly, if the data strategy does not dovetail with the strategies already in place, especially the corporate strategy, then it is likely to be detached from the priorities of the wider organisation and therefore overlooked in terms of implementation.

Fourthly, and perhaps most alarmingly from a strategy perspective, a global study21 conducted by Strategy&, the strategy consulting division of PwC, found only 8 per cent of company leaders were said to excel at both strategy and execution, with only 16 per cent said to excel at one or the other. Let me restate this, to reinforce the impact it is likely to have on what you are embarking on in defining and executing a data strategy – fewer than one in six company leaders excels at either strategy or its execution, yet you are about to embark on delivering both.

The task in getting the data strategy through the engagement and adoption phases to deliver benefits is one littered with failures in some of the largest organisations around the world. Do not underestimate the scale of the challenge, and consider wisely the choice of sponsor for the data strategy and those working with you to deliver it.

In making the data strategy accessible, interesting and informative, tell a compelling story. There are some great exponents of storytelling who can provide real insight into how to do this effectively, so I will keep it brief in this book and, for those who want to know more – hopefully many of you, given how important this is to overcoming the four hurdles above – signpost those who can provide the expertise in this critical area.

6.3.1 Strategy storytelling

The essence of devising a strategy story is to own it, by which I mean that you have to personalise your own understanding and develop the story to go with it. This is a collective effort, so if there are others who are part of the data strategy journey, they too must have the same level of understanding but, most importantly, be true to the story in telling it in their own way.

It may seem surprising to hear that personalisation is a key factor, as many will tell you to stick to a script and avoid deviation from its core, but a rigid script is an artificial approach that prevents a leader from being true to themselves and their teams in retelling the story. It will be apparent to anyone who knows that leader when their delivery is wooden or stilted, the words are not their own and they are fixated on delivering every one of them as written. It becomes yet another corporate cascade to be delivered without deviation, at which point it is hollow and fools no one.

The essence is to keep the core messaging the same, so that the heart of the story is retold time and time again, but each time with a delivery personalised to the individual giving it, carrying their passion and credibility to the audience.

Another aspect of storytelling the strategy is to balance the vision with reality. If the strategy is made to sound like a walk in the park, then it is open to challenge on the basis of ‘why haven’t we done this already if it is so simple?’ or ‘is this truly aspirational, or underplaying where we should be aiming?’

In telling the story it is important to reflect on what could undermine the strategy or, at least, derail it in part, and to be prepared to counter these potential points in advance of their being raised. In other words, demonstrate that significant thinking has gone into the strategy definition process and that it has been challenged along the way. Reflect on what could go wrong, but also recognise that for those who have been in the organisation for some time there may be elements of revisiting past failures that those newer to the company (some of whom may have devised the strategy) are unaware of. Acknowledging that the strategy will tackle areas which may have failed previously, while learning from the past, recognises that history but, importantly, embraces the future with experience behind the new approach.

There are different ways of delivering the strategy. One view is that the engagement of a larger audience in presenting and sharing the strategy is an effective way of bringing a collegiate mentality together, so all are immersed in the experience and get the opportunity to discuss, share anecdotes and views, and ultimately form a common bond and experience to take away a positive outlook to engage further.

There is also a view that the campfire approach, in which smaller groups gather to share the experience and participate in the discussion, is the way to go in building a strong bond and allegiances to take on to further groups. Successful engagement is likely to lead to informal discussions, whether gathered around the water cooler or coffee area or similar unofficial meeting spots, and if the data strategy programme team can participate in these in a relaxed way it helps bind people into the story through the sharing of updates, personal perspectives and anecdotes, bringing it to life.

In my experience, a blend of these approaches works, especially as some parts of the organisation may have more introverted or reflective personalities than others. Either way, it is a discussion, a story shared with contributions welcomed at an early stage, rather than a formulaic PowerPoint presentation with a tight script. These approaches can be supported by engagement groups, dropping into existing team meetings or other opportunities to keep the profile high, bearing in mind there is a need to have a story to tell and not all employees respond in the same way, so there may need to be some variation in trialling engagement.

The challenge with this is that the data strategy is fundamentally a document, and it needs to support its dissemination as a story. In that sense, the data strategy needs to provide the common core of a message to be delivered and so needs to read as such. If there is not a common core, then there is unlikely to be the level of consistency in the messaging required to keep everyone on board. However, the data strategy becomes the underpinning document that an audience can refer to having heard the delivery of the story, rather than the story itself, as it is essential the audience relates to the data strategy content. You therefore need to work with your senior executive audience to frame the story based upon that core messaging, and work with them until they have fully bought into owning the data strategy. Achieve this, and you have a data strategy ready to navigate through the choppy waters where most strategies sink and sail through to a successful implementation.

Further references on where to look for resources to help you shape your strategy story are in the bibliography.

6.3.2 Balancing vision and detail

There is always a temptation to provide more detail than is required – it is a flaw in so many data strategies that exist today, often because the author is the person most passionate about the subject and so wishes to go a step further than is needed. On the other hand, there are some data strategies in the public domain which are so light on detail that they are barely recognisable as strategies at all, falling foul of being visionary or aspirational statements with little grounding or foundation from which to build.

It is a tricky balance, but one which can be resolved by active engagement of a wider cross-section of those who know the audience and are able to advise on where best to pitch the content to make it land successfully. For the data strategy to be credible, it needs sufficient basis to gain agreement on the starting point, potentially with a brief acknowledgement of the past that has led to that position, but it should be forward-focused, with a positive message that sets a destination and rationale. Throughout the drafting stage, iterate and review based on stakeholder feedback, utilising your connections to senior stakeholders to ensure they are aware that a data strategy is coming, what it is shaping up to contain and how it aligns with the wider corporate goals. Do not forget to provide the means to develop a story from the core of the data strategy, so seek the opportunity also to gain a sense of how comfortable others are with how that story is shaping up, to build understanding and familiarity.

As covered in the earlier chapters, remember that the audience for the data strategy is not necessarily those who are its architects. It must therefore be accessible and framed in a way that aligns with the corporate strategy as a key enabler, so the linkage is obvious to all. A common failing is to expect the reader to make the links, as if they are so obvious they do not need to be covered at all, when this is almost certainly not the case. Your stakeholders are likely to be busy and, if they have not seen a data strategy before, may even disregard it as not relevant to them, unless the groundwork has been put in first to position it and then deliver on making it an essential read.

Be mindful as to what you are expecting to achieve. Those who are most effective at storytelling will paint a vision of where they are heading in a compelling way based upon providing the rationale for how this is achievable, so make sure that there is a vision of what the data strategy will enable that is markedly different at the end of its period, whether three, five or ten years (or any other variant). This is the core of the story, how it builds to gain buy-in to the series of objectives to be attained and how this impacts the organisation.

6.4 STRATEGY ALIGNMENT

I have referenced the need to align the existing strategies in the organisation with the data strategy, none more so than the corporate strategy. If it has credibility within the organisation, the corporate strategy will command particular focus, as it is likely to drive shareholder value and executive bonuses (for a private sector enterprise), as well as careers and credibility in delivering on promises made. If you are seeking to make the data strategy a key partner to the strategies already at play in your organisation, then you need to become a team player: align your data strategy with them, and acknowledge dependencies and, importantly, enablers, so that it is embraced into the fold.

Clearly, the corporate strategy will be highly contextual and specific in nature to your organisation. It is therefore impossible to define how the data strategy might best be aligned, other than to give a few pointers as to areas you might like to consider incorporating into the data strategy.

6.4.1 Customer strategy

The nature of what constitutes a customer may differ markedly between organisations, and in some the notion of a customer may be rather more liberal than in others (for example, an individual or organisation being the recipient of services from a monopoly supplier – as may be the case in the public sector or regulated industries). However, regardless of the type of organisation, what is common is that a service (or product) is provided in some shape or form to an end consumer, resulting in an experience which may be positive or negative and which creates a perception, regardless of the outcome. The degree of choice the customer has may vary, and this may also lead to a difference in the quality of the experience based upon the resources or effort expended by the supplier.

It is essential that the data strategy recognises that it is performing a key role in enabling the organisation to manage its customer relationships more effectively, potentially resulting in a better customer experience. This may be through providing better MI to those making decisions to speed up the process, linking data together to provide a more holistic view of the customer, or anticipating what the customer wants and tailoring a proposition accordingly; however it is achieved, data is at the heart of making it possible.

In addition, there will be a myriad of customer touchpoints with the organisation, and being able to capture these and ensure that the experience has been a positive one, learning from what engagement has taken place, is pivotal to enhancing the experience the next time. This involves capturing the data compliantly, linking it with other relevant data and turning this through analysis into meaningful insights that can be acted upon.

If improvements to the customer experience can be achieved through better-quality data, leading, in turn, to better-informed decisions, then the data strategy becomes a critical enabler to the corporate strategy.

6.4.2 Digital strategy

Increasingly, organisations are also developing digital strategies to enable customers to access services in an ever-expanding range of options and providing staff with the means to interact with customers through whatever channel is appropriate in a joined-up way. There is close alignment between the definition and implementation of digital capabilities and the data and analytics to underpin making these as effective as they can be. It would therefore be remiss not to link these two strategies if they both exist in the organisation, given the clear interdependencies. If new digital channels or capabilities are being implemented, then the data strategy should recognise the need to facilitate the support and data requirements necessary to make these work, in addition to determining how data can be extracted most effectively from new digital services to optimise analytics and so develop a better understanding of the customer.

6.4.3 Other strategies

Depending on the sophistication of the strategic planning capabilities in your organisation, there may be many other strategies you need to dock in with, aligning them with the data strategy and identifying opportunities to flag dependencies and enablers between the strategies. The following is a brief list, but is not necessarily exhaustive.

6.4.3.1 Marketing strategy

There will be significant demands for data and analytics support for a marketing strategy, to deliver customer experience and to determine key activities such as forecasting, pricing, channel strategies, targeting, optimisation strategies, customer propositions and next best activity for a customer, to name just a few. It is highly likely that a data and analytics capability will be working closely with marketing, given their dependency on data and analytics to drive engagement.

6.4.3.2 HR strategy

The relationship the organisation has with its own staff is also rooted in data. Employee engagement is directly correlated with customer satisfaction, so the HR team is likely to be a data-intensive environment, ensuring employees are supported, managed and developed in a mutually beneficial way that optimises the HR budget and resources. This area has historically lagged in the more sophisticated exploitation of data, focusing more on data capture and measurement, but this is certainly changing globally, and HR is becoming one of the functions with most to gain from becoming insight-led as the battle for talent in much of the world becomes more challenging.

6.4.3.3 Technology strategy

The organisation will almost certainly have some form of technology strategy, given the constantly changing world of networking, hardware, software and IT consumables. The pace of change is accelerating, and technological advances are now causing a degree of concern amongst the public, as their nature is seen to be potentially more invasive and discriminatory than at any time before.

In a 2019 survey conducted by Fujitsu,22 39 per cent of the UK public interviewed said they had less trust in organisations than they did five years ago, and a sizeable minority were resistant to adopting new technologies such as driverless cars, drone technology and cryptocurrencies. The same survey found that 58 per cent of business leaders had chosen not to adopt some technologies due to customer nervousness, citing data security perceptions and concerns about AI and quantum computing as two specific examples. Thirty-four per cent of leaders in manufacturing organisations highlighted employee resistance to technological change as their biggest concern.

The alignment between technology and data is obvious – one supports the other to capture data to be processed in operational environments and provides capabilities to exploit it – but the two are increasingly converging, with AI, machine learning and data science utilising technological advances to drive data exploitation in more complex and sophisticated ways. The data strategy will therefore be both an enabler to the technology strategy and dependent on what it provides to manage and exploit data.

6.4.3.4 Finance, risk and compliance

In many organisations there will be standards and policies focused on finance, on the risk approach and appetite, and on the compliance regime in which the organisation operates. There will be a data requirement to support these, but there will also be dependencies on the use of data to inform and shape approaches, and these will either be explicit in such standards and policies or specified in instructions that define processes and the use of data.

In addition, the compliance issues discussed in detail earlier in this book will need to be tracked via a data protection officer within your organisation, and evidence provided to demonstrate compliance and the processes underpinning this. This will also include adherence to the data privacy laws in those jurisdictions in which your organisation operates. The data strategy should reflect what is required to support finance, risk and compliance activities.

6.5 RELEVANCY IS KEY TO ENABLING EFFECTIVE ADOPTION

As highlighted in the CLEAR acronym in Chapter 2, it is essential to keep a strict focus on the audience for the data strategy. Chapter 4 also explained the need to compose the data strategy following the RAVE principles, and it is important to frame and define the content in as clear a way as possible to make the data strategy accessible to all. This combines the concept of storytelling with the need to focus on achieving the goals and waymarkers.

If there is one all-too-common failing in data strategies, it is the temptation to make them too detailed, either by straying into implementation activities or by providing too much information. The key is to recognise the level of information that needs to be imparted to make the data strategy coherent and likely to be endorsed, with as little information as is necessary to make the point cogently. Brevity, and associated clarity in what needs to be achieved and why, is a winning formula in gaining senior executive sponsorship.

If you have followed the outline in this book so far, you will have ensured that your approach has been inclusive, iterative and tested along the way. This provides an essential route to plotting the delicate balance between too little and too much information within the data strategy. It may sound obvious, but each reader will look at it through an individual lens, based upon their own perspective, role and personal experiences, and so no two individuals will arrive at exactly the same interpretation of what the data strategy means – it is entirely contextual, based upon a complexity you will never fully be able to comprehend. Therefore, the data strategy has to be as clear and direct as possible. Remember, you want it to be endorsed but, more than that, to be the subject of campfire discussions across the organisation, with the content remaining consistent despite the many flavours in how it is delivered.

So how do you ensure you strike the right balance?

There are several things you can do to ensure you are focused on what is needed to deliver a great data strategy.

6.5.1 Take on the implementation role

It may seem to contradict my advice throughout this book to this point, but take on the implementation role as you progress through the drafting stage. The acid test should be whether the data strategy is coherent enough to be picked up and delivered in the way you intended, recognising there will always be a degree of latitude required as events unfold and time tests the assumptions made earlier.

It may be that you will be tasked with both the strategy definition and delivery phases, as is more than likely in smaller organisations with fewer resources to split these tasks apart. However, even if this is not the case, as the author of the data strategy you should be able to provide direction and consultation to those who do have implementation responsibilities. Therefore, you need to start to think about the practicalities of the implementation phase:

  • What resources are you assuming will be devoted to implementation, both directly (coordinating the implementation activity) and indirectly (delivering or enabling parts of the strategy)?
  • Is there funding required to deliver some elements of the strategy and, if so, is this secured, or linked in with another activity in which it is clearly understood and managed?
  • What baseline have you used to start from, and has this been validated?
  • What risks, assumptions, issues and dependencies (RAID) have you identified, and are these documented?
  • Who is sponsoring the implementation, and is that individual fully behind it, with the right governance in place to see it through?
  • What authority has been delegated to you to drive this forward and to own the programme to drive the change?
  • Have you provided clear evidence to underpin the baseline such that it could stand up to audit or other investigation should there be any queries raised through implementation?
  • How front-end loaded is the implementation, and how will momentum be maintained from strategy approval into implementation to keep pace with the strategic objectives?
  • How will progress be tracked and measured, to ensure there is clarity on how effective and successful the data strategy has proved to be?
  • How will lessons learnt be captured through the implementation phase, such that the experience gained in delivery informs subsequent activity?
  • What is the intent to keep the data strategy as a living document, rolling it forward a year at a time, and how will this be managed within the context of the implementation?

These questions represent only a small proportion of those likely to arise in the course of implementation, but they are a good starting point for challenging how thoroughly your data strategy sets a course for those tasked with implementing it. Bear in mind my recommendation that the data strategy is kept relatively short, so inevitably it has to be high level, with sufficient guidance for those implementing it to know what was intended and the pace to be set from the outset.

If you can challenge your own thinking by exercising a sufficient degree of distance and objectivity – easier said than done – then swapping hats to consider how the above will be achieved is a valuable exercise to undertake as the data strategy begins to take shape. Invite others involved in shaping the content to do likewise, as it will inform their thinking and sharpen their content, framing it in a way that makes it clearer to implement.

6.5.2 Reflect on the waymarkers as guide rails

A key input to guide the implementation activity is the use of waymarkers through the data strategy. As discussed previously, the waymarkers serve as guide rails to inform the implementation and should not be a straitjacket constraining decisions which need to be made at the time. However, the advantage of including waymarkers is the sense of pace, progress and delivery they convey, providing clarity and confidence in terms of outcomes to both those signing off the data strategy and those who implement it. Waymarkers also provide a means of assessing whether the organisation has maintained the tempo upon which the data strategy was devised.

As a result, the clarity of the waymarkers as guide rails is critically important. If the waymarkers are unclear, it is likely that the data strategy will fail to maintain the rhythm required to deliver what it has stated, due to differing interpretations or understanding. There is often a tendency to overstate the tempo rather than build in the contingency that will be required to accommodate other events or underestimation in the RAID timescales underpinning the data strategy.

Do review the waymarkers with a critical eye before finalising the draft data strategy, as they will almost certainly form the basis for the implementation plan and inform those leading the implementation as to what ‘good’ looks like in delivering the data strategy. Ask those who are providing input to the data strategy to review the waymarkers to ensure they are clear and specific as to what is being delivered and the anticipated impact.

I would also recommend a review of the consistency of the waymarkers in terms of the links between the activities to be achieved (for instance, it would be a major oversight to have overlooked a critical dependency which arrives via a later waymarker) and, in particular, the clarity of why those specific activities have been called out in the waymarker. It is likely that the rationale is the link between a corporate strategic objective and the data strategy, but it could also be a critical enabler which unblocks significant progress on a wider front across the data strategy.

Do make sure that the waymarkers are called out consistently throughout the data strategy. It is often apparent that different hands have played a part in writing a data strategy and, as a result, the means to track progress (via waymarkers or any other statement of what is to be delivered) is presented in different ways throughout, making it a challenge in its own right to piece together. If the waymarkers are to be successful, they need to be pulled through the text, either into a specific section of the data strategy or as distinct elements of the content – for instance, a note at the end of each section highlighting the waymarker in its own text box.

6.5.3 Develop your own story

As discussed earlier in this chapter, strategy storytelling will become a key part of the data strategy implementation. As the lead on the data strategy, you get to start the story, telling it in your own way to achieve the desired level of commitment through to approval. Start to develop your story with the team around you, the wider group you have consulted as SMEs in pulling together the data strategy content and those you work with closely who may not have been directly involved.

How you paint the picture of where the data strategy is heading, its importance to the organisation and the vision of the future will all help shape understanding and advocacy. Test your story to see how it resonates with others and refine it based on feedback. Get others to play back their version of the story a day or two later, giving them time to reflect on what they heard. It will evolve, and you are not looking for repetition; rather, the message should remain consistent at its core but be told in words that have meaning and commitment for that individual. We are aiming for individual storytelling of a common narrative: so long as the core remains true, it is achieving its purpose.

6.5.4 Steal with pride – learn from others who have delivered successful strategies

The task of devising a data strategy is a daunting one, especially if there is no precedent within your organisation. I am always surprised when I meet prospective authors of a data strategy who have not looked at examples from others who have developed data strategies, as many are accessible in the public domain via a basic internet search.

I structure my workshops to place learning from others at the heart of the approach. It seems obvious to do this, but the value participants gain from critiquing actual, publicly accessible data strategies is the most useful part of those workshops. Recognise that the process of developing a data strategy is broadly the same for anyone, regardless of the organisation or experience in doing so. For instance, we have the same activities to write about in our data strategies – we all undertake the basic activities of data capture, management and exploitation, albeit in many different ways.

Therefore, the essence of the data strategy you are about to write should have commonalities with those you can access. Even better, those that are of a higher standard will also give you a sense of their baseline, culture and insight into their challenges – legacy data and/or system issues, proliferation of reporting leading to multiple versions of the same information, for example. Reflect on how you might construct your own review of a baseline to gain broad agreement as to where you are starting from.

If you are particularly impressed by a data strategy you come across, possibly one written a year or two ago, consider reaching out to the author to see how implementation has gone and to learn from their experience. Many people welcome the opportunity to share experience, especially where it is proactively sought based on positive first impressions. However, tread carefully. As discussed in this book, most strategies fail to achieve a successful outcome, often falling at the implementation hurdle, so an impressive data strategy does not necessarily translate into a positive implementation story. Be prepared for this to be a sensitive topic, but if you can, learn what prevented even the most favourably regarded data strategy from translating into delivery for that organisation, and adapt your own approach accordingly.

6.5.5 Red team the final draft

A common practice in some organisations, and especially so amongst consultancy firms, is to ‘red team’ as you are approaching the completion of the final draft of your document. The practice is commonplace in procurement, security and military fields, but it can also be applied just as effectively to any situation in which a proposal is being delivered and there is a desire to test it as extensively as possible to ensure it can be enhanced to withstand the deepest scrutiny.

Creating a red team involves getting colleagues with similar knowledge and expertise to those who would typically approve and implement the data strategy (though these could be run as two separate exercises, given the different nature of the challenge) to review the proposed final draft and provide as much challenge as they can, testing the authors on its content. This is meant to be an extensive process and not one to be undertaken lightly. Anyone chosen to participate in the red team is there to be as demanding as possible, with the goal of leaving no stone unturned in identifying any weakness or oversight in the document.

This process can be a difficult one, so those representing the data strategy must be ready for a challenging encounter in which they will feel grilled by the red team. However, it will ensure a better-quality product, with higher integrity and clarity than if the data strategy were submitted without this input. It will also catch most of the likely errors and gaps in the data strategy that the document’s authors might otherwise miss, either through being too close to it or through making an assumption that is not stated in the document.

If you have the means to take such an approach, it is a highly effective one to:

  • improve the final draft before it is submitted for approval;
  • identify the obvious weaknesses and bolster or change the draft prior to submission;
  • highlight the way in which those who have to approve and/or implement the data strategy are likely to interpret it;
  • give a sense of how ready the final draft is, to determine whether you are on track to meet the proposed submission date;
  • clarify the take-away messages to ensure the priorities and tone of the data strategy set the implementation team on the right path;
  • test the validity of the RAID and waymarkers to determine whether the basis for proceeding is correct and the tempo of delivery is realistic.

It is essential that you prepare the team prior to the red team session. The whole point is to make it as challenging as possible, saving time, reputation and credibility down the line. It is better to be put under this level of scrutiny in advance of publishing the data strategy than to find the issues later, when they are much harder to address and could call the whole basis of the data strategy into question.

6.6 TEN TO TAKE AWAY

In summary, the key points from this chapter are:

  1. Always have someone within your team who can review the proposed content before a final review by an independent reviewer.
  2. There are some common themes or topics you would typically expect to find in a data strategy: do consider whether you have got these covered in your approach.
  3. The final data strategy is likely to be between 12 and 20 pages, but it is often easier to start with more and refine it to less than to aim for this from the outset.
  4. Undertaking a data maturity assessment is an effective way to establish a common understanding of a baseline and to have a structured way to monitor progress. There are a number of models available that could be used.
  5. Data governance, quality, integrity and standards are all essential components of data management, and so should be considered for inclusion in establishing your foundations.
  6. Assessing capabilities within your organisation is important to identify any gaps which may hinder the implementation of the data strategy.
  7. Strategy implementation is a high-risk activity; most fail. Therefore, be aware of the pitfalls and ensure you engage and communicate effectively with stakeholders. Strategy storytelling is a highly effective way to build understanding and momentum behind the data strategy as you transition into execution.
  8. Strategy is visionary, so avoid too much detail. Align to other strategies within the organisation to make it coherent as to where the data strategy fits, particularly the corporate strategy.
  9. Reflect on the implementation phase – the practicalities of funding, resources, responsibilities, sponsorship, timing, amongst others – to consider how the transition can be seamless. The data strategy, at the sign-off stage, should ensure these are known and agreed prior to implementation commencing.
  10. Learn from other data strategies available in the public domain. These will help formulate your own thoughts as you determine the structure and content for your organisation, and review these in the context of the points in this book. You can learn a lot from doing some simple research in advance that can help guide your own approach.

 

1 R. Rumelt, Good Strategy, Bad Strategy: The Difference and Why It Matters. London: Profile Books, 2012.

2 Blaise Pascal, Lettres Provinciales, 1657.

3 K. Taylor, Data Governance Maturity Models Explained. https://www.hitechnectar.com/blogs/data-governance-maturity-models-explained/.

4 U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research (CDER), Center for Biologics Evaluation and Research (CBER), Center for Veterinary Medicine (CVM), Pharmaceutical Quality/Manufacturing Standards (CGMP) Data Integrity and Compliance With CGMP Guidance for Industry. Draft Guidance. 2016. https://www.fda.gov/media/97005/download.

5 T. Kauzlarich, D. Wertheimer and J. Heigel, Data Integrity’s Central Role in Financial Compliance: Maintaining Regulatory Compliance with Dodd-Frank Rule 1.73. Sagence, May 2014.

6 The Center for Biologics Evaluation and Research and the Center for Drug Evaluation and Research, both part of the FDA.

7 https://opendatacharter.net/.

8 For further detail on the original 2005 TDWI data maturity model and other BI maturity models, this blog provides a helpful overview: James Serra, Business Intelligence Maturity Assessment. 2013. https://www.sqlservercentral.com/blogs/business-intelligence-maturity-assessment.

9 G.R.M. Wood, Data & Analytics SIG: ‘Data and Analytics the Key to Success & Growth’, Organisational Maturity Survey (slide 12). 2016. https://www.slideshare.net/graemermwood/aiiadataanalyticsprojectexternal20160721.

10 For an insight into companies that had first mover advantage, see D. Tyre, The First-Mover Advantage, Explained. 2018. https://blog.hubspot.com/sales/first-mover-advantage.

11 E. Maguire, Data & Advanced Analytics: High Stakes, High Rewards. Forbes. 2017. https://www.forbes.com/sites/data-and-analytics/2017/02/15/data-advanced-analytics-high-stakes-high-rewards-2/.

12 For a short but interesting background to the evolution of data science, see G. Press, A Very Short History of Data Science. 2013. https://www.forbes.com/sites/gilpress/2013/05/28/a-very-short-history-of-data-science/.

13 For a really insightful example of what data scientists are delivering in a range of organisations, see Hugo Bowne-Anderson, What Data Scientists Really Do, According to 35 Data Scientists. Harvard Business Review, 15 August 2018. https://hbr.org/2018/08/what-data-scientists-really-do-according-to-35-data-scientists.

14 Some terms and phrases were commonly used prior to 2000 – data mining and data warehousing being two examples from the preceding decades – but I would suggest the terminology has become less obvious and hence definitions increasingly diverse.

15 Francis X. Diebold, ‘Big Data’ Dynamic Factor Models for Macroeconomic Measurement and Forecasting. 8th World Congress of the Econometric Society, 2000. https://www.sas.upenn.edu/~fdiebold/papers/paper40/temp-wc.pdf.

16 This term has become increasingly commonplace. See Democratizing Data Science. https://news.mit.edu/2019/nonprogrammers-data-science-0115, and 4 Ways to Democratize Data Science in Your Organization. https://hbr.org/2021/03/4-ways-to-democratize-data-science-in-your-organization.

17 Amongst many others, the Alan Turing Institute, the Institute and Faculty of Actuaries in conjunction with the Royal Statistical Society, and Harvard University have all devised frameworks for data science ethics in recognition of the need to provide some guidance.

18 A helpful framework for those conducting data science, which also incorporates ethics, is to be found in Sallie Ann Keller, Stephanie S. Shipp, Aaron D. Schroeder and Gizem Korkmaz, Doing Data Science: A Framework and Case Study. Harvard Data Science Review, 21 February 2020. https://hdsr.mitpress.mit.edu/pub/hnptx6lq/release/8.

19 Carlos J.F. Cândido and Sérgio P. Santos, Strategy Implementation: What Is the Failure Rate? Journal of Management & Organization, 21 (2), 237–262, 2015.

20 Freek Vermeulen, Many Strategies Fail Because They’re Not Actually Strategies. Harvard Business Review. 2017.

21 Paul Leinwand, Cesare Mainardi and Art Kleiner, Strategy That Works. Boston, MA: Harvard Business Review Press, 2016.

22 Fujitsu, Driving a Trusted Future in a Radically Changing World. 2019. https://www.fujitsu.com/uk/imagesgig5/driving-a-trusted-future-research-report-uk.pdf.
