10 FLEXIBILITY IN EXECUTION

‘It is not the strongest of the species that survive, nor the most intelligent, but the one most responsive to change.’

Charles Darwin1

The previous chapters have outlined the key ingredients to achieving a successful outcome for your data strategy. It is not easy, but these are the steps to follow if you are going to do what many would contend is the hard part – turning strategy into successful execution. Strategy sets direction and execution plots the course; adopting a dynamic, but aligned, approach to delivery is essential to achieving a successful outcome. So, you might ask, what does a further chapter on flexibility in execution provide beyond what has been covered so far?

There are many reasons why the subject of this chapter is important, not least that you are not in control of the environment in which you operate. Many things can occur which impact your plan, and you are not necessarily in command of when or how they will manifest themselves, but it is important to recognise that change is the norm – for most organisations it is a myth that there is any constant in the environment in which they operate. In a speech delivered in Edinburgh in 1867, months before becoming prime minister, Benjamin Disraeli said: ‘Change is inevitable. … Change is constant.’2

An effective plan should not be a mould in which the execution has little scope to adapt. If this is the case, it will not survive long. The plan will have to flex to the challenges that arise, which could range from resources being deployed elsewhere at short notice to solve an unforeseen and urgent task, to delays outside the control of the implementation team due to supplier issues, to a shift in priorities. The impact of these will vary widely, depending on the nature and length of time that they act as a distraction, but all will lead to a need to reset the plan around the optimal way forward given the new outlook for the programme.

It is likely that the impact of change will be felt more extensively where it is longer lasting or changes the direction of the organisation. In large organisations it is important that data strategy execution is managed within a wider portfolio of programmes, as this will provide important insight into changes that you need to be aware of. If the PMO (or similar) team is doing its job effectively, they should be tracking dependencies and benefits, and so be able to spot the implications of changes elsewhere on the data strategy implementation.

If you are operating within a smaller organisation, it is probably a case of keeping close contact with those who are privy to change occurring elsewhere which could have a bearing on your own plan. Your sponsor, for instance, should have access to the sort of information that is essential for you to know in order to navigate around obstacles and keep your overall goals on track.

The risks of change are such that you should focus on capturing your dependencies as soon as you start work on defining your plan. These can range from the relatively obvious (risk of resources not being available to start the implementation) to the more esoteric (uncertainty in business demand which could place risk on technology investment being sustained throughout the planning period), but you should look at the wider impact such shifts could have on each key deliverable, manage them via a recognised risk management framework, and use that framework to score and track each risk.
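
To make this concrete, a risk register can be as simple as a structured list with a likelihood × impact score. The sketch below (in Python, with illustrative field names and a notional 1–5 scale rather than any prescribed framework) shows one minimal way to capture, score and rank risks so they can be tracked throughout the planning period.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Risk:
    """One entry in a simple risk/dependency register (illustrative fields)."""
    ref: str                  # e.g. 'R-014'
    description: str
    likelihood: int           # 1 (rare) to 5 (almost certain)
    impact: int               # 1 (negligible) to 5 (severe)
    owner: str
    deliverables: list[str] = field(default_factory=list)  # deliverables at risk
    raised: date = field(default_factory=date.today)

    @property
    def score(self) -> int:
        # Classic likelihood x impact scoring, giving a 1-25 scale.
        return self.likelihood * self.impact

def risks_to_escalate(register: list[Risk], threshold: int = 12) -> list[Risk]:
    """Risks at or above the escalation threshold, highest score first."""
    return sorted((r for r in register if r.score >= threshold),
                  key=lambda r: r.score, reverse=True)

# Example: a supplier delay threatening two deliverables.
register = [Risk('R-014', 'Supplier delay to platform migration', 4, 4,
                 'Programme lead', ['Data platform', 'Reporting rollout'])]
for risk in risks_to_escalate(register):
    print(risk.ref, risk.score, risk.description)
```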

The changes that impact upon data strategy implementation could be short or long term. The response will vary, of course, but the approach to monitoring and tracking them should be the same. However, clearly the extent of the impact upon the implementation plan will determine the response. Depending on the assessment, it will likely result in a number of next steps:

  • Determine if the nature or scale of the change is unavoidable, especially in the context of the impact upon the data strategy implementation. This is a tricky call to make, and one which will almost certainly need to be escalated to the sponsor if it is an organisation-wide change or requires the executive board to make the call.
  • Assess options. Can the impact be mitigated by changes to timing within the implementation plan, realigning priorities or reassigning resources to lessen the effect on the programme? Is there a way to deliver part of the solution now and return to complete it later in the programme? What is the impact of not doing the activity or task at all – does it have a material impact on the ability to deliver the data strategy? Provide options to the sponsor and agree the best course of action.
  • Seek trade-offs. If the change is unavoidable, is there a way to find an alternative means to deliver the plan? For instance, if resources are no longer available, could you bring in external resources and gain funding to ensure you can remain on track? Is there a way to backfill resources being taken off the implementation of the data strategy through using alternatives, even if they need training to step up? These types of approach may have a consequence on the budget you are working to, but there may be a willingness to keep momentum and therefore a willingness to fund beyond the original budget if it ensures that progress is not stalled.
  • Replan. The need to determine the revised approach will be informed by answers to the above, as appropriate, leading to some element of replanning. Whatever the outcome, there is a need to communicate this to all stakeholders and to reset the current status and projected outcome over the coming period. Do not forget to take the wider community with you. It is essential that there is understanding of what has led to the change and to retain confidence in the implementation programme – this is not due to programme failure, and it is important to retain credibility as well as maintain the morale of those involved in the delivery of the programme.

10.1 MANAGING THE IMPACT OF CHANGE

The change has been assessed, there is no alternative, and so you have to proceed with the implications to the implementation plan as you have assessed them. In part, this is a test of the agility of the implementation phase you are in, as you should have developed the appropriate mechanisms to track dependencies and assess the likelihood of the risk emerging. Of course, there may be things which you did not foresee when the plan was first approved, though an effective programme has dependency tracking built into it and so should spot new dependencies as they emerge – it is an ever-changing picture, and like radar on a ship, needs to be monitored continuously to see new threats or risks as they appear.

As we discovered earlier, the misapprehension that a successful implementation means sticking rigidly to the plan is itself a likely cause of failure. Most organisations operate in a dynamic marketplace with a constantly changing landscape as each organisation seeks advantage over its competitors, the economy shifts and customers make choices. To think this has no impact on your data strategy implementation is to miss the point of why a data strategy is important to your organisation.

Referring back to the research conducted by Sull et al.,3 some of the findings are stark in this space. They found that organisations do not adapt quickly enough to changing market conditions (see Figure 10.1) and miss opportunities through a lack of agility and pace of decision making. Whilst only 10 per cent of managers thought a complete failure to adapt was the problem in strategy execution, it is the pace of that adaptation which presents the real problem: only 29 per cent of managers said they were able to seize fleeting opportunities or mitigate emerging threats, whilst 24 per cent believed their organisations reacted quickly but lost alignment with the corporate strategy.

Figure 10.1 Organisations failing to adapt quickly enough to market conditions


Perhaps most damning of all, just 11 per cent of managers believed that all of the organisation’s strategic priorities had the financial and human resources needed for success – indicating that nearly 90 per cent expected at least some strategic priorities to fail for lack of resources. The key message in the research is that agility must be balanced with alignment, and the findings bear out the risks if this is not the case.

The need to manage change within the implementation phase requires strong stakeholder communications and clarity of messaging. The assessment of the change needs to be undertaken quickly, but reliably, as the message will convey the impact of the change both in terms of what will not be achieved and the consequence of replanning to mitigate that impact and deliver what remains feasible. Clearly, you must assess the relevance and impact of the change for each stakeholder. If a particular stakeholder or group is going to be impacted negatively, then there needs to be greater engagement, and discussions put in place to explore how to strike a balance in the mitigation – which might involve the stakeholder providing resources or funding to assist in finding a solution.

Of course, the change may not have a direct, clear-cut effect on the implementation programme. There may be options to even out the impact on the plan that mitigate the effect but result in the burden being carried by several stakeholders. This will need a collective will to take a group approach to keeping things moving as a whole, and there may be wider dependencies upon the delivery of elements of your own implementation that then impact other programmes. All of this should be understood in advance, through an effective stakeholder assessment that identifies dependencies not only within your own implementation plan but also those you are responsible for addressing in the implementation plans of others.

If you are able to work through the impact assessment and come up with options, then you may draw up recommendations to put before the stakeholder group. It is likely that the impact analysis will cover a number of considerations, including:

  • wider dependencies (for instance, on other programmes or commitments) and the cost/benefit implications;
  • opportunity cost of the delay (due to the alignment of activities as planned which would be lost through delay);
  • resource availability at a future point if activities are delayed;
  • sequencing implications of any reordering of the tasks to be undertaken.

You may need to put several different proposals in front of your stakeholders, with the consequences of each made abundantly clear so all parties are aware of what they would be signing up to. Clearly, this would also need to go through more formal governance via your sponsor too, but with stakeholder feedback to guide a preferred decision.

The prioritisation process is, therefore, a complex one. It has to incorporate:

  • dependencies in both directions – those you have within your own programme and those you are carrying within other programmes;
  • resource constraints;
  • value-based judgements;
  • delivery sequencing (there is no point moving something forward due to resource availability only to find it cannot progress because a critical deliverable is planned for later delivery – see the sketch after this list);
  • material progress being demonstrable within your programme to avoid subsequent challenges on the evidence of delivery being strong enough.
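
To make the sequencing point concrete, the sketch below (Python, using hypothetical activity names) shows a dependency-aware ordering check: any activity pulled forward ahead of its predecessors simply cannot progress, and a circular dependency signals that the plan needs rework. It is a minimal illustration under those assumptions, not a full prioritisation model.

```python
from graphlib import TopologicalSorter, CycleError

# Hypothetical plan activities mapped to the activities they depend on.
dependencies = {
    'migrate_customer_data': {'agree_data_model', 'provision_platform'},
    'provision_platform':    {'secure_funding'},
    'agree_data_model':      {'appoint_data_owners'},
    'appoint_data_owners':   set(),
    'secure_funding':        set(),
}

try:
    # A delivery order that respects every dependency; anything moved
    # forward of its predecessors here cannot actually progress.
    order = list(TopologicalSorter(dependencies).static_order())
    print('Feasible sequence:', ' -> '.join(order))
except CycleError as err:
    # A circular dependency means the plan cannot be sequenced as drawn.
    print('Replan needed - circular dependency:', err.args[1])
```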

It is a complex web to have to negotiate, so prepare yourself as best you can in the planning stage to have as much of this information to hand, so that you do not have to spend too much time collating it when you need it – at the point of having to implement this type of change, time will be of the essence and you will potentially be rushed into a decision.

The role of the sponsor is critical if and when you reach this point. The sponsor not only has potentially greater awareness of the political dimension driving the wider change decision, but also the network to influence some of the audience you need to engage, enabling you to gather all of your facts and get the right level of buy-in to the recommendations you put forward. The sponsor has to be party to the decisions to be reached, and their implications and consequences, and be ready to fight your corner should there be any senior-level resistance to the proposed direction you intend to take. The sponsor might have to compromise, but if fully briefed on the background behind the proposed direction, they can hold their own in finalising agreement on your behalf.

10.2 ASSESSING IMPACT OF CHANGE

This chapter has referred to the need to assess the impact of change on the data strategy implementation plan. The practical steps require a methodical approach to capturing all aspects of the change, to ensure the bearing it will have on the implementation as a whole is fully comprehended and to enable you to make a decision fully cognisant of the totality of the impact. This forms a fundamental part of the programme management discipline, and so I do not intend to go into detail in this book when there are many other sources that provide more comprehensive coverage of this topic (see the bibliography at the end of this book for further references). However, I want to outline the basics, so you have a level of understanding should you need to do change impact assessments yourself for the first time.

You may have heard the term ‘change control’ within your organisation, and so may have experienced this for yourself without being fully aware of why or how it is undertaken. The process simply seeks to gather all aspects of the change and to manage it in a way that provides the rigour to underpin a decision ultimately made on the facts provided. It is relatively straightforward to do, though it may at times feel time-consuming to gather all the inputs before getting to a decision.

In the context of the data strategy implementation plan, change control is usually a five-step process (a minimal sketch of a change log follows the list):

  1. Identify and capture/log the change.
  2. Undertake an assessment of impact across the programme.
  3. Recommend and agree a decision.
  4. Implement the change.
  5. Close and undertake change review (typically for larger changes, to learn from the process to improve subsequent instances).
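
As a minimal sketch of what a change log might capture at each of these steps (the field names and the strict step ordering below are illustrative assumptions, not a prescribed standard):

```python
from dataclasses import dataclass, field
from enum import Enum

class Step(Enum):
    LOGGED = 1       # 1. identified and captured/logged
    ASSESSED = 2     # 2. impact assessed across the programme
    DECIDED = 3      # 3. recommendation made and decision agreed
    IMPLEMENTED = 4  # 4. change implemented
    CLOSED = 5       # 5. closed, with a change review for larger changes

@dataclass
class ChangeRecord:
    ref: str
    summary: str
    raised_by: str
    stakeholders: list[str] = field(default_factory=list)
    impact_assessment: str = ''
    decision: str = ''
    step: Step = Step.LOGGED

    def advance(self, to: Step) -> None:
        # Enforce that the five steps are taken in order, with no skipping.
        if to.value != self.step.value + 1:
            raise ValueError(f'cannot move from {self.step.name} to {to.name}')
        self.step = to

change = ChangeRecord('C-007', 'Key architect reassigned to urgent incident', 'PMO')
change.advance(Step.ASSESSED)   # fine: step 1 to step 2
# change.advance(Step.CLOSED)   # would raise: steps cannot be skipped
```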

The process is relatively clear; the challenge is usually identifying all stakeholders (step 2) to ensure all impacts are captured and that this part of the process is undertaken as extensively as necessary to progress to a decision. If you have to assess multiple options to identify which change is to be implemented, then the process can be accelerated by assessing the potential changes in parallel, especially if a similar group of stakeholders is involved in each of the changes.

If the data strategy implementation has gone through a thorough review of key stakeholders at the outset, when the implementation is being planned, you should have a good understanding of who to engage in the change control process. If not, then this stage could be time-consuming in itself, which could have been avoided through earlier planning. Ideally, a stakeholder map or list is in place to support communications, as well as ownership or interests in part of the implementation plan, and there should be resources outlined in the plan that would guide you to the right areas to engage on those activities.

You may also find that those who you believe need to be engaged in making the decision, especially signing off on a recommendation, delegate it to others. If this is the case, then make the implications clear – you are accepting a delegate from them on the basis that the delegate is empowered to make decisions; otherwise that individual is of limited value to you.

10.2.1 Resources

The implementation plan will detail the resources – human and financial – assigned to deliver the outputs and when these are intended to be deployed. Any change to the plan will need to be impact-assessed, as it is more than likely that resources will need to be realigned to those changes. Of course, it may not be feasible to switch human resources, due to other commitments, and the skillset required needs to be factored in if there is any proposed substitution of resource to cover activities reassigned in the plan.

Operating in the data strategy space is a delicate balance, from marshalling scarce resources with specific data-related skills, on the one hand, to mobilising operational teams across the entire organisation to drive the outcomes needed, on the other. The interdependencies in this data ecosystem necessitate a complex web of activities to be delivered, with relatively limited options to change sequencing. Ultimately, the dependencies between activities are likely to constrain further progress until each critical activity is delivered.

It is, therefore, essential to know where the pinch points are in the plan for the most challenging skillsets you need, so you can do all you can to try to minimise the risk. Alongside this, explore the options for substituting those named within your plan with others within the organisation if necessary, and, if budgets to source additional resources are not totally out of the question, investigate the options to buy in the skills you require and identify what might be available – whether through the contract, interim or consulting market. External hires of this kind, if the right person can be found, bring deep skills and experience, just not the depth of organisational knowledge that an employee would bring.

The financial resources available to deliver the implementation phase are likely to be impacted by any change. This may arise through the replanning required, which no longer plots a logically sequenced approach but incurs additional costs through a less efficient order in which activities are delivered. It may require retaining resources through gaps between activities (otherwise they will be redirected), or the use of higher-cost resources (especially if resources need to be drafted in from outside the organisation). There is also the potential for delays to push up costs, whether through price rises for items such as software or hardware, or simply an increased cost base through moving between financial years.

It is important to explore the opportunities to accrue budget to offset some unforeseen delays or cost increases, especially if it is feasible to bring some of that into a current financial year rather than delay into the next year. Clearly, this will be guided by the finance function in your organisation, but if you are constrained to fixed budgets in-year to deliver the strategy implementation, then this is something you will need to consider if you are facing a shift of costs into a subsequent year.

If your funding is based upon achieving milestones in your implementation plan, then the risk is that slippage in the plan caused by the change imposed upon it will have a greater bearing than would otherwise be the case. Whilst funding linked to milestones is more usually applied to externally delivered projects, it is increasingly a way to keep focus on the activities within and to ensure there is a results mentality behind the strategy implementation, rather than a less focused course being navigated through its delivery. I am personally an advocate of such an approach – it provides a degree of certainty to those who are tasked with delivery and sets an expectation for those funding it, which guarantees alignment in the understanding of what is expected. However, those things outside the immediate control of the implementation programme need to be spotted, impact-assessed and called out at the earliest opportunity to provide an early warning that the timeline is no longer achievable, enabling time to work through the consequences and agree a reset amenable to both parties.
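
Where funding is tied to milestones, even a simple forecast-versus-baseline comparison provides that early warning. The sketch below (Python, with hypothetical milestone names, dates and tolerance) flags milestones whose forecast dates have slipped beyond an agreed threshold.

```python
from datetime import date, timedelta

# Illustrative milestones: (name, baseline date, current forecast).
milestones = [
    ('Governance framework approved', date(2024, 3, 29), date(2024, 4, 5)),
    ('Master data platform live',     date(2024, 6, 28), date(2024, 8, 16)),
    ('First analytics use case',      date(2024, 9, 27), date(2024, 9, 27)),
]

TOLERANCE = timedelta(days=10)  # slippage tolerated before escalation

for name, baseline, forecast in milestones:
    slip = forecast - baseline
    if slip > TOLERANCE:
        # Funding linked to this milestone is at risk: escalate early.
        print(f'EARLY WARNING: {name!r} forecast {slip.days} days late')
```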

10.3 CAPABILITY REASSESSMENT AND THE ROLE OF LEARNING AND DEVELOPMENT

Chapter 6 covered the importance of understanding the capability of the organisation and those within it at some length. It is important to comprehend the capability and readiness to embark on a data strategy from the outset, otherwise the expectation versus what is realistically achievable may differ and scupper the data strategy from the start. Similarly, it is essential to embark on the strategy implementation fully aware of the capabilities of the organisation to turn strategy into reality. Implementation is a different skillset to strategy development, and whilst it is not unheard of to find people skilled equally in these two capabilities, the need to have a team (whether formally or virtually) you can trust to deliver the implementation is critical to your likely success.

The evolving nature of implementation requires someone who is as comfortable dealing with ambiguity as they are marshalling granular detail into a comprehensive plan. Without the agility to adapt and flex to the unplanned and unforeseen that lies ahead, those in the implementation space will soon find their programme in difficulties, and ultimately those who lead in these situations have to be effective communicators to ensure those around them are equally comfortable with uncertainty and ambiguity.

The point of revisiting the topic of capabilities in this chapter is to highlight that there will inevitably be changes through the course of the strategy implementation. People move roles in an organisation, especially those with scarce skillsets and a proven track record of success in programme implementation, and will also potentially leave the organisation. Demands within the organisation may simply determine that someone within the implementation programme team is needed elsewhere and switch them out at relatively short notice.

10.3.1 Handling the risks of losing key members of the implementation programme team

In many respects, the more successful your programme is proving to be, the more attractive your team will become to others who are looking to staff programmes of their own or have challenges within their function that need someone to troubleshoot. It is one of the great ironies that retaining a great team becomes harder as the profile of the programme is raised and it is recognised for being successful. Whilst this is a tremendous accolade to you and your programme, and a moment of pride and acknowledgement, in the short to medium term it is a big challenge.

Anyone who has operated for any time within a strategy implementation environment will become knowledgeable about the organisation, the strategy, the rationale for embarking on it and the direction the implementation is taking. Those in the team will build a strong rapport amongst themselves but, most importantly, with those stakeholders in the wider organisation whom it is important to engage and keep motivated and positive towards the programme – rapport that opens doors to staff in their functions and buys trust.

Losing this knowledge base inevitably causes a loss of momentum, no matter how effective the replacement may be or how quickly they can be in situ. The formalities – the background to the data strategy, what it is seeking to achieve and why, the implementation plan and the progress to date – can all be acquired relatively quickly. What is much harder to achieve is mutual confidence between the newly appointed individual and the stakeholder group, and awareness of the nuances: what the colleagues you engage with really think, where they see their own role in the implementation, and their level of commitment to what you are trying to achieve (do not always assume that what people tell you is what they actually think about the work you are leading).

That individual is starting out for the first time but, from a stakeholder perspective, the programme is once again building trust; they have to grow their understanding quickly, because the implementation is already under way and is not going to stop to accommodate their need to learn.

Think of it along the lines of competing in the Le Mans 24 hour motor race. The previous driver has been at the front of the pack, a clear leader and driving the race of their life – which is why they got hand-picked for promotion, a bigger opportunity or a career move. Rather suddenly, and abruptly, that leader has called ahead to say that on the next lap the car will be stopping at the pits and another driver needs to be found – a change which had not been anticipated so early in the race.

In the scramble to find an alternative – as it is unlikely you have a reserve driver already lined up ready to go so soon – you have to alert people to the change about to be made, find a new driver and get that individual fully briefed on the car, the race, the tactics and the conditions, all whilst the car completes its current lap. As the car heads for the pits, the other cars go racing by whilst you enact the change. If you are lucky, the original driver may have a few words for the new driver whilst heading for the pits and then as they change places.

Your new driver, still getting familiar with the car, rejoins the race towards the back of the pack rather than the lead, and you have to support them in gaining confidence to tackle finding their way towards the front again, to re-establish that momentum you had worked so hard to build and lost in an instant. If the driver doesn’t get familiar with the car quickly, they will not be working through the pack but risk being lapped by those who were behind only a lap or two ago.

This analogy demonstrates that the risk is only fully known at the point you have to deal with it; from there you have to find a course which gets you back in the race, building the confidence of the new driver to make up the lost ground and get back to performing at a similar level to the previous driver.

It is often overlooked that at the start of an implementation there is an opportunity to undertake a period of mobilisation – identifying what needs to be done, refining the plan, and assigning resources and briefing those individuals accordingly. What is not considered, once the implementation is under way, is the impact of having to join midway through compared to the relative luxury of having the time for mobilisation. Of course, the benefit of joining part-way through is that the implementation has potentially been running long enough to gain a positive reputation, such that you join something with momentum, which had yet to be established at the outset.

Nonetheless, the challenge for those joining a programme and taking a leading role from someone who has built a strong reputation is always daunting, and as the implementation lead it is something you need to consider – it is worth investing time in supporting your newcomer in whatever way enables them to achieve their own momentum, for the greater good of the programme.

10.3.2 Learning and development

One area that programmes tend to lose sight of is the need to continue to invest in those undertaking leading roles within the implementation programme. They are likely to need continuous development to keep their skills fresh and relevant to the tasks they are undertaking in your programme, and they also need to consider their own career development, building knowledge and skills to enhance their career prospects.

The staff within the organisation will have been assessed, to some degree, on their capability to lead the data strategy implementation. It is essential that the findings of this activity are not lost at the outset of the implementation, as these will need to be factored into the selection of the implementation team and used to address the skills and/or capability gaps identified at the earliest opportunity. I would recommend capturing this in a skills and capability matrix, so there is at least a point-in-time understanding to drive appropriate development. Ideally, this would then be kept up to date, though clearly that is a significant overhead.
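
A skills and capability matrix need not be sophisticated. The sketch below (Python, with hypothetical team members, skills and a notional 0–3 rating scale) captures a point-in-time view and surfaces the gaps that should drive development activity.

```python
REQUIRED_LEVEL = 2  # the bar the team must meet for each skill (0-3 scale)

# Hypothetical point-in-time ratings per team member per skill.
matrix = {
    'Asha':  {'data governance': 3, 'stakeholder management': 2, 'data modelling': 1},
    'Ben':   {'data governance': 1, 'stakeholder management': 3, 'data modelling': 1},
    'Chloe': {'data governance': 2, 'stakeholder management': 1, 'data modelling': 1},
}

# For each skill, check whether anyone in the team meets the bar; gaps
# found here should drive shadowing, mentoring or training plans.
skills = {skill for ratings in matrix.values() for skill in ratings}
for skill in sorted(skills):
    best = max(ratings.get(skill, 0) for ratings in matrix.values())
    if best < REQUIRED_LEVEL:
        print(f'Capability gap: no one at level {REQUIRED_LEVEL} in {skill}')
```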

In the meantime, you need to consider how you operate most effectively given there are potential constraints in capability within the team, which might involve accelerating the learning by doubling up on resources so those needing to learn do so by shadowing someone within the team with the relevant experience. There is also the opportunity to take this further, using mentoring or coaching sessions with the team, using the skills and capabilities already within the team or possibly the wider organisation. You could also adopt a more project-team-based approach to delivery, utilising your resources to work in groups to collectively deliver on activities, rather than assigning each item to one or two members of the team. This spreads the risk and also accelerates the implementation team getting to know one another, building rapport and pooling skills to deliver as a group.

Learning can take a variety of forms, and people tend to learn differently, so try to understand what works best for members of your team and provide appropriate opportunities.

Just as I have described the risk of losing some of the better members of your programme team, do not overlook that you may have talent within the team ready to make that step up, which of course mitigates some of the risk of losing programme knowledge. If you are running a programme which is going to last two to three years then you should be thinking about succession planning, risk mitigation plans and rotation of the team – often, moving people within the team to take on different elements enables them to keep learning, and constantly challenges the programme team to think out of the box in how to engage most effectively with the wider organisation to deliver a successful outcome.

You may well have some fantastic skills and capabilities within your programme team: if so, encourage the team to run knowledge-sharing sessions to upskill through pooling the team’s capabilities. For those at an earlier stage in their career than some in the team, it is often these types of sessions that have the most impact, developing knowledge through seeing and then doing, reinforced by the approaches taken by more experienced members of the team. I can certainly remember moments in programmes I have been part of where I took away things I either observed or actively learnt from others in the team.

One of the ways you can build better links with the wider organisation is to actively seek opportunities to get members of functions across the organisation to talk through what they do and how they operate, and to give some insight into the specific skills and capabilities that underpin their work. It is often underrated how much satisfaction individuals get from being given the chance to talk with pride about what they do, especially if there is an attentive audience eagerly taking it in. Engaging in this way is often about building relationships on a one-to-one basis, establishing trust and respect for what one another brings to the organisation and getting the synergies from the collective effort. Of course, you may also find your next round of talented team members through such wider engagements, which also brings in fresh perspectives and diversity of backgrounds to the team.

Build learning and development into the programme. Do not regard it as a luxury; done well, it will provide payback far greater than the time taken to undertake it.

10.3.3 Reassessing maturity through deploying new skills

The implementation of the data strategy should be measured by progression in the data maturity assessment, amongst other KPIs. A key part of this is to focus effort on upskilling the organisation to be more mature in its understanding of data and the importance of becoming information literate (that is, most organisations and consultants talk of being data literate, but this usually misrepresents the goal being sought, which is to be more intelligent in the acquisition and use of information, not just the handling of raw data), and to be more insight-led to drive more evidence-based decision making.

The challenge, in trying to upskill an organisation, is how do you know it is sticking? Many organisations track the number of courses delivered and the hours accumulated undertaking e-learning or classroom teaching, or even rely on a simple post-course feedback sheet on the training provided before the individual has had the chance to put it to the test and demonstrate that they can now do things more effectively than was previously the case. This audits that the activity has taken place, but it certainly does not demonstrate value delivered through any assessment of the impact achieved.

The various assessment models highlighted earlier in the book are designed to measure progression, and therefore assume there will be some evidence beyond the audit style of approach to demonstrate the difference. It may seem obvious, but very few organisations seem to have a methodical approach to conducting maturity assessments other than treating them as a point-in-time appraisal. This seems a rather haphazard and risky way to track whether one of the key elements of a data strategy has actually borne fruit and whether the organisation has made a fundamental shift in its ways of working.

As with any major deliverable within the implementation programme, there should be a strand of activities aligned to increasing information literacy and proving a shift to an evidence-based model of decision making. This should comprise a series of specific tasks to be delivered which, in turn, should be measurable, showing how each contributes to the overall goal. This has to be rooted in practical evidence of progress, tracking how ways of working have changed to reach decisions in a way the organisation would not have followed previously.
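
To move beyond auditing activity, the reassessment needs to show movement against the baseline. A minimal sketch (Python, with hypothetical maturity dimensions and 1–5 scores) of comparing a baseline assessment with a reassessment:

```python
# Hypothetical maturity dimensions scored 1-5 at each assessment point.
baseline     = {'data quality': 2, 'governance': 1, 'literacy': 2, 'analytics': 1}
reassessment = {'data quality': 3, 'governance': 2, 'literacy': 2, 'analytics': 3}

for dimension, before in baseline.items():
    after = reassessment[dimension]
    print(f'{dimension:>14}: {before} -> {after} ({after - before:+d})')

# Dimensions with no movement need scrutiny: has the training actually
# been applied, or merely delivered?
stalled = [d for d, b in baseline.items() if reassessment[d] <= b]
if stalled:
    print('No measured progression in:', ', '.join(stalled))
```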

This can be hard to define, but it may be clearer if you take a discrete piece of work – either a project or a well-defined task undertaken within the organisation – where you have a ‘before’ to act as a baseline and have taken the relevant staff through the training so that they approach it by putting into practice what the training has instructed. These examples become your case studies, those involved become your advocates and the evidence of progress becomes the momentum to build confidence that change will lead to a positive transformation for the organisation.

The maturity assessments provide detail behind the various steps and, applied correctly, have rigour in how these are scored. It is therefore easy to identify the activities needed for the organisation to progress to greater maturity, and the programme can structure itself to provide the means to achieve this. Of course, what the reassessment is seeking to evidence is that the learning has been applied, not merely that the training has been provided, and so the implementation programme has to pursue the case beyond delivering information literacy, following it through until it becomes embedded practice. This is where stakeholder commitment, providing the opportunities to demonstrate how powerful the change can be, is so critical to your success.

I will also refer back to Chapter 7 at this point. The art of being able to make progress in this area is as much about communication as providing the technical training. It needs commitment from the leadership that this is the direction the organisation is going, a reinforcement of the maturity assessment as a critical measure of success (some organisations embed the maturity assessment progress into objectives of managers to demonstrate this is not optional) and an ongoing drip-feed of success stories to demonstrate the benefits of shifting from the familiar to a new way of working. You need to ensure that you have the communications pipeline ready and the key messages from those who are influential to tell the story of why the change is needed and the impact it has made, and to call out those teams who have successfully embraced the new approach to give recognition.

10.3.4 Recruitment

The capability assessment undertaken at the data strategy definition stage will have highlighted where there are key gaps in the organisation’s ability to transform to a new way of working. This could be down to a lack of knowledge, but it could just as easily be resistance to change. Without confronting these in the data strategy implementation, the success of the programme will be undermined from the start.

Whilst learning and development is going to be a key weapon in the implementation armoury, there may well be a need for greater impetus from the start to get change moving. This could be through the implementation programme having a third party assist in its delivery, providing consulting support and expertise in getting change driven through an organisation similar to your own.

However, do remember that this will come at a high cost to the implementation programme. You are likely to need to commit to the data strategy for a number of years, so building a dependency on a third party either leading or advising your programme risks them becoming a prop rather than a supplement, making separation at the end of their engagement a significant risk to the endurance of the changes made or still to be achieved.

It might be that there are some key posts which require additional expertise or supplementary resources due to availability pressures impacting the amount of time that key stakeholders can provide to support your implementation. Consider whether hiring specific resource to cover these posts is a viable option, either as contractors or interim consultants, and whether they are backfilling to cover the current tasks for the individual in post or engaging with the programme on their behalf.

There may be posts which are either vacant or do not exist in the current organisation but which you identify as key to the data strategy being implemented. In the former case, take a look at the specification of the post in question and assess whether there is an opportunity to strengthen key inputs which you are likely to require in that post to be able to drive through the data strategy implementation.

You might be looking at how the current organisation is configured to build on the data strategy and, ultimately, to own it if there is not a logical place in the organisation for it to sit. Many organisations have introduced the CDO role, albeit with numerous variations as to what this post undertakes – from a specific focus on data (whether in systems or in unstructured data outside systems, such as offsite storage facilities, and its quality and accessibility) to a broader brief encompassing its exploitation too. This is not to say that the CDO role should be tightly defined: many other roles with universally recognised titles have variations in responsibilities, not least finance, which can also incorporate loosely related activities such as procurement, risk, legal, audit and estates.

In some organisations, data is misleadingly placed with the CIO on the grounds that it resides in systems and so logically belongs to the CIO. Aside from the fact that not all data is in systems – unstructured data is often still paper-based, keeping the document management companies that provide offsite storage very comfortable satisfying an insatiable demand which continues despite decades of talk of the paperless office – the CIO controls very little data, providing the plumbing for the data to be contained and flow through but having no control over the staff within the organisation who enter and use the data. To get traction, data responsibility should sit with those who lead the staff who generate and use the data, rather than those who provide the systems that host it.

Consider, as part of your data strategy and maturity, the readiness of your organisation to adopt a CDO role. It doesn’t need to be titled as such, and many organisations operate with posts which have the same responsibilities you would expect a CDO to fulfil but with a different title (for example, chief analytics officer, head/vice-president/senior vice-president of data, chief data scientist). However, having a focal point at a high level of seniority for data and its exploitation is becoming increasingly commonplace in medium and large organisations and those operating in the public sector, and this is perhaps something you should consider in the data strategy (or its implementation) to ensure there is a senior role to take on leadership responsibilities for the strategy in the future.

There may be key roles that you need to add to your organisation to facilitate specific changes that the data strategy will introduce. For instance, if you are intending to commence a data governance programme it would be wise to seek to hire someone who has had experience in doing this in another organisation, ideally with similarities to your own in terms of the challenges you face (not necessarily the same sector – data and its governance are broadly the same – but preferably of similar scale and maturity). A plan to move into predictive analytics for the first time, for example, will require someone experienced in building that capability and engaging with stakeholders to define and/or demonstrate the art of the possible in what can be achieved, and then build a team to roll out that capability.

There may be a case to develop these skills in-house, particularly if there isn’t the budget to recruit, but there is a steep learning curve in introducing these capabilities if you have not done so before, especially as the capability will be new for the organisation as a whole to comprehend and get behind.

Therefore, the data strategy should identify what is required to achieve a successful implementation of the waymarkers and, if it doesn’t, the implementation plan needs to specify these as critical dependencies and ensure appropriate budgets are assigned for recruitment and ongoing staff costs. It is also important to be pragmatic about the timescales to recruit and, if you need someone on board in advance of the approval and likely start date, to consider hiring a contractor, interim or consultant, to ensure the implementation plan does not stall.

10.4 COMMUNICATING CHANGE

The data strategy is a key agent of change within the organisation, transforming the way in which all staff within the organisation capture, maintain and use data. As Chapter 7 explained, much of the success of the data strategy implementation will depend on how effectively you are able to get the message across that the data strategy is something that impacts the whole organisation, rather than something distant for just the executive part of the organisation to think about.

The constant state of flux that many organisations experience in the fast-paced world we operate in will invariably lead to a degree of tacking and changing direction in the execution of the strategy. This is to be expected, but it also needs to be communicated. It is a feature of the human mind that we deal with change on a constant basis – things not turning out quite as we expected, unplanned-for changes confronting us in our daily lives, even the choices we make for dinner needing to be rethought if the store doesn’t have the key ingredient or the restaurant no longer has that menu option available. Whilst these are commonplace in our lives, many people seem to regard work as a constant, doing the same thing routinely, and so change to this pattern becomes more significant than would be expected.

Of course, some react well to change and others positively thrive on it, and there is not a blanket rejection of change, otherwise we would not have change professionals in many of our organisations. However, even our most enthusiastic change professionals can, at times, overlook the resistance to change you may encounter. If you are leading a data strategy implementation, handling change resistance is going to be a key aspect of your role, so do not underestimate the extent of it and, more importantly, how changes to your implementation programme may be positioned as a failing of the programme when it isn’t necessarily anything of the sort.

The landscape of the organisation is important to consider when embarking on change. What has been the history of change in the recent past (up to the last decade, say), and how do staff refer to previous change programmes?

In many organisations, the most animated discussions about change will be of the ‘remember the time when …’ sort, recalling occasions when change was imposed and staff rallied round to resist the ill-thought-out ideas of the time. People often choose to remember the low points of programmes, the parts which did not land well and had to be dropped or redefined to get traction. It is remarkable how, if you explore such discussions further to consider the positives or the overall intent of the programme, there will be tacit or reluctant acknowledgement that some of the programme was a success and may well have made a positive change – but it is the parts that went wrong that enter folklore.

The way you choose to implement the data strategy will almost certainly depend on the backdrop of your own organisation, the approach taken to strategic implementation programmes and the confidence to communicate boldly with the wider organisation on change. If your organisation is rather conservative in its approach, you may well have to work hard to create the oxygen to enable your implementation programme to breathe and get wider engagement in the organisation. By contrast, if your organisation is much bolder and confident in promoting change, then you will need to be alert as to how to seize this opportunity to engage with the wider organisation to make what you are doing relevant and of interest to them.

Assuming you are in a positive situation to build a coherent communications plan around your data strategy implementation, you should be mindful that your direction may change, and hence you need to provide some scope in your communication to set direction without boxing yourself in on the tactics used to get there. The communications need to be focused on the need for change, the direction of travel and the benefits of getting there, not the detail that will become a focal point if you are seen to keep changing tactics within the implementation programme. Define the steps that lie immediately ahead in greater detail, where you have certainty, but not those which lie further ahead – detailing the distant steps will lose focus for those reading it and blur the key messages you are trying to get across.

It is common for data strategy implementation to get lost in the wider scheme of transformational activity being undertaken within an organisation. This is particularly the case if there is a function or team that leads on data and analytics, which therefore makes the implementation seem like a ‘business as usual’ task for those in that team but not for wider consideration. This is completely flawed thinking, as data pervades the organisation: it is no exaggeration to say that organisations would not operate if it were not for data – how would we conduct transactions, know how to compile a product or service, assign resources, bill a customer, pay an employee and so on? – hence a data strategy is for all to understand and support its implementation.

I have found that many data initiatives fail in organisations due to three things: a lack of sponsorship and drive; limited executive commitment to achieving the goals; or an inability to secure the focus needed to gain buy-in to change across the organisation. They have been trying to do the right thing, but trying to execute it in the wrong way.

Many of my peers across the data profession will be able to trade stories about the difference the three key elements above can make, regardless of how compelling the concept may be or how right the timing is to execute. Back to the Le Mans analogy: no matter how great the driver, if the wider team is not committed to the same goals with the intensity needed to make the execution of the race a success, then the driver alone cannot rescue the team from failing to win. Breaking down in the early stages of the race with engine trouble is not the fault of the driver or the original concept; the issue is to be found in the preparation and the commitment to ensuring successful execution.

Your challenge is to ensure you embark on the data strategy implementation cognisant of the importance of having the communications plan running as a key stream within your own plan, and to make sure it is resourced and has executive commitment throughout. Do not overpromise, nor delve into reams of detail, but keep the focus on why the programme is needed, what it is seeking to achieve, how it is being delivered and the part that you need each individual within the organisation to understand so that they are informed, ready and willing when called upon to play their part. Be as open as possible about the trajectory having to adapt but ensure that the core message remains the same, such that you demonstrate a degree of control over change that may be imposed on you and the programme, rather than being caught out unawares and having to make decisions as you go along. Whilst the latter may be the case from time to time – every complex programme has to deal with uncertainty, and it would be folly to think otherwise – it does not need to be evident in the communication delivered to the organisation at large.

Make the most of such uncertainty by turning it into an opportunity, and identify any benefit that accrues as a result. Unforeseen changes can often yield positive outcomes: whilst a change might delay or impact some part of the programme at the time, there can be unintended consequences that can be seized upon to deliver benefits which were not originally envisaged, or to bring forward other activity in the plan that would otherwise have come much later.

If you have to adapt, look for what that brings in return. Even if this may be tenuous in your mind there is a positive story to be told and one which is focused on continuing to make progress – do not give the impression of stalling or losing your way. For instance, if you need to bring in expertise due to resources having to be realigned, look at the positives of bringing in knowledge and capabilities which were not originally anticipated, and the benefits of learning from what other organisations have done in this space. If you have to delay an activity due to a dependency slipping, then what does this enable instead to keep you moving forward, even if only at half the pace you had anticipated previously? In other words, retain control of the messaging to portray the positives about what you are doing and the difference you are making.

Should you feel that there is limited scope to communicate change within your organisation due to its risk-averse nature, the lack of communications expertise to give a clear message on change or the fear of ‘they have heard it all before’, then consider how you can share progress in a way which is not change-focused. This may seem a little odd, given the data strategy implementation is a change programme, but for many people change is a difficult thing to embrace, as it often cannot give the individual the degree of longer-term certainty they seek.

In such cases, the focus of communications has to be more subtle, highlighting collaborative activities that have led to something which can be viewed positively as a constructive use of time and resource to make things better for teams and individuals within the teams. This is much smaller in scale and ambition, but achieves much the same outcome – a willingness to work with the implementation team in a way which is more about driving local improvements with those directly involved, to enable those individuals to have input and potentially shape outcomes that they are satisfied will work and are happy to adopt.

If you think this sounds more like continuous improvement then I would not entirely disagree, other than to say that this is adopting those techniques in the context of a much larger planned series of activities that need to be implemented across the organisation. It feels rather more organic, getting people on board to work with you through building trust and a willingness to talk openly about frustrations and opportunities to do things better, and capturing this in a way that can be incorporated into your own approach to implementing the data strategy.

It does have a degree of stealth about it: there is a plan behind the engagement model which is not shared as overtly as it would be in an organisation more open to transformation. But it is also about overcoming fear and mistrust in order to deliver change, engaging those at the sharp end of activity so they buy in to it from the outset. Building trust in this way can be progressed into a more engaged model, where the programme more proactively promotes what it wants to do to build on the initial successes; it is about getting those who might be most involved in any change, and who experience the greatest impact, to feel they have a stake in its execution. In many ways, it is building advocacy from an initial position of suspicion, such that these individuals can be active promoters of this evolving approach to other groups who may have a similar mindset.

10.5 A DYNAMIC DATA STRATEGY

The earlier chapters of this book talked briefly about the strategy cycle within your organisation, and you may find that you are tied to a fixed period that the strategy is intended to cover. Whilst this provides certainty of direction to the executive members of the organisation, it is a relatively arbitrary way of working in most cases (there are exceptions, driven by the likes of regulatory compliance or contractual positions, where the timing is entirely beyond the organisation’s control) and tends to bind the thinking to a fixed period of time when strategy is an evolving activity – no organisation should stand still, given the complexities of the world in which it operates and the changing landscape that results from the impact of competitors, regulation, customer expectations and financial performance.

Historically, organisations have tended to work to fixed periods in defining strategy to ensure there was clarity for shareholders, regulators and the executive board about what was intended to be achieved, and by when. It is often referred to as a ‘plan then do’ approach, in which the strategy is prescriptive and the execution purely tactical in converting the strategy into delivery. Whilst this has served organisations relatively well, the dynamics of the modern world are making it not only less relevant but also too pedestrian to allow organisations to adapt to change. Decisions which affect the strategic direction of organisations arise faster than ever before, and frequently these either disrupt the strategy or make it obsolete.

The case for moving to a more dynamic approach to strategy definition has been growing in recent years as the world becomes rather more challenging to operate in without some element of flexibility. Strategy has also become more fluid, less obsessed with needing to provide all the answers, and more focused on the direction and leaving the execution of the strategy more open to translation by those closest to the task at the time. The increased focus on techniques like Agile has led to a recognition that the prescription of the past is no longer practicable, and hence strategic execution is becoming much more a ‘test and learn’ model of adaptability, which enables those leading the charge on implementation to refine and repeat according to what has been learnt from the previous activity.

The other change is a more performance-based approach to strategy execution, seeking to measure what has been achieved in shorter windows rather than taking a longer-term view of whether the strategy has borne fruit. This demands a more responsive approach to strategy, as it lends itself to constant review of the implementation approach, with adaptation and evolution applied to increase its effectiveness. It is important to stress that the measures need to apply at multiple levels in such a model: the strategy still needs to demonstrate its effect over a longer time frame, showing the scale of what has been achieved from the original baseline, because a more granular level of performance will not necessarily present that bigger picture. This also ensures that performance in strategy execution is blended with Agile iteration so as to retain focus on the overall direction, not losing sight of it for more opportunistic short-term benefits which would detract from the bigger goal over a slightly longer horizon.

In terms of data strategy, throughout my career I have advocated adopting a rolling view rather than operating to fixed points in time. I had a moment of clarity early on, when tasked with devising a ten-year strategy in an organisation that was embarking on a more strategic approach to its whole business than had previously been the case. I was leading a relatively new part of the larger business and had challenges in mapping out beyond a year; three years was the limit of realistic ambition, given there were established competitors in a market still relatively new to us, and it was difficult to see how our recent arrival into that market would play out.

The ten-year vision was delivered, but there were two key components within it: a one-year strategy, which was easily transferable into an implementation plan, and a three-year strategy, which identified the funding, people and product development needed to reach a point of greater maturity and, hence, stability, having passed beyond being a new entrant to the market. The ten-year strategy contained grand visionary statements that were little more than aspirational, caveated with plenty of assumptions and forecasts that inevitably lacked robust evidence but were the best that could be produced at the time.

This demonstrated to me, as someone with a keen interest in strategy, that the situation you are operating within – often assessed through PESTLE analysis, encompassing political, economic, sociological, technological, legal and environmental factors – determines how long a strategy remains viable. Elsewhere in the organisation, investment decisions were made on the strength of a ten-year revenue projection based on likely demand; whilst this might be questionable in the later years, without a view of a positive revenue outcome from the investment required, the project would likely have been curtailed or blocked.

Data strategy endures. It is not a business strategy in the sense of deciding which opportunities to grasp and where the organisation sees its future (expansion through organic growth, acquisition, stabilising or retrenching, for example). Data persists in organisations and hence needs appropriate controls and direction. This requires a data strategy to ensure the organisation remains compliant (or becomes compliant, of course), can access its data efficiently and can exploit it effectively. There is therefore a strong case that data strategy needs to be a rolling strategy, conforming to the timelines imposed by the wider organisation but continuing to define its future as it goes.

How do you keep a data strategy relevant to the organisation? My own view is that it has to be dynamic, flexing to reflect what the organisation needs to achieve whilst retaining a purpose and vision of how data can help make the organisation a more effective and efficient performer in a compliant and responsible manner.

In the data strategy workshops I have delivered, I have encouraged attendees to favour a three-year vision of what is to be achieved, reviewed annually so that it always remains a three-year direction of travel. Some suggest that a data strategy struggles to be of value if it extends beyond a single year, but I still harbour doubts that a data strategy covering a year is much more than an implementation plan.

If you operate in an organisation of complexity and scale, across multiple countries, or even one that is not used to change (or to having a data strategy), a year is too short a time to demonstrate the real impact needed for an executive group to buy in to the change. An effective data strategy should contain waymarkers setting out what will be achieved within a year, for those hungry for that level of immediacy. For many organisations, however, especially those early in their data strategy journey, there is so much to be done that real impact will only be felt beyond the first year, which will be taken up with securing alignment and agreement, putting the infrastructure in place, and running the first series of pilots to demonstrate the art of the possible and whet the appetite for more to follow.

Establishing a three-year data strategy, and reflecting on a completed year one to test and learn before setting the ambition for a new third year of the rolling window (year four in calendar terms), provides the opportunity to reassess and reset the ambition of the data strategy. As year two of the original strategy becomes the next year in question, there is an opportunity to tighten its assumptions and expectations based on the learning and experience gained within the implementation team.
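As a minimal sketch of that rolling mechanism, the Python fragment below moves the window forward by a year. The year labels, plan descriptions and the revision step are all hypothetical, chosen only to illustrate the window advancing, not to prescribe content.

    # Illustrative sketch only: a rolling three-year strategy window.
    from collections import deque

    def roll_forward(window: deque, revised_next_year: str, new_third_year: str) -> deque:
        # Drop the completed year, tighten the year now at the front of the
        # window, and add a newly defined third year to keep a three-year horizon.
        window.popleft()
        window[0] = revised_next_year
        window.append(new_third_year)
        return window

    # The original three-year window (hypothetical plan descriptions).
    strategy = deque([
        "Year 1: alignment, infrastructure and first pilots",
        "Year 2: scale governance and data quality",
        "Year 3: exploit data for measurable business value",
    ])

    # After year one completes, apply the lessons learnt and roll forward.
    strategy = roll_forward(
        strategy,
        revised_next_year="Year 2 (revised): scale governance, adjusted for resistance encountered",
        new_third_year="Year 4: embed information literacy and optimise value delivery",
    )
    print(list(strategy))  # a refreshed three-year direction of travel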

One of the best-known tenets of iterative development is learning from experience, and that is exactly what I am advocating here: learn from your first year of strategy implementation, observe what went to plan and what did not, and adapt your approach and set your sights accordingly for the years ahead. A three-year data strategy should treat what has been discovered during the first year as valuable insight, used to revisit the data strategy in the context of the evidence gained.

It could be that the data strategy was not ambitious enough: progress and engagement have surpassed expectations, implementation has raced ahead, and a reset is needed. Conversely, and possibly more likely, the first year might reveal more resistance than anticipated: the assumptions have not played out as expected, and there has been less progress but a lot of learning about the nature of the organisation, the teams within it and the priority assigned to the data strategy implementation. Either way, you have learnt something which now needs to be applied as evidence to the remaining years of the data strategy and to shape the new third year.

If you do not apply this evidence, then you are failing one of the basic premises likely to underpin the data strategy itself: a desire to be more evidence-based in decision making. You also run the risk of a split between the data strategy and the implementation plan, as those leading the latter are unlikely to repeat the same (possibly painful) mistakes the following year and so will adapt their approach. It is essential to keep the data strategy and the implementation plan aligned; if there is a risk of changing one but not the other, that is a sign of drift and should be called out and dealt with immediately.

I recognise that a rolling view of the data strategy may not fit with the approach taken to strategy definition within your organisation. I also accept that a dynamic approach, in which an annual review recalibrates the relevance and pace of the data strategy, may be out of alignment with the way your organisation works. However, these things can be done either in partnership with those who control the strategy process or, if necessary, independently.

There is little to be gained from a data strategy that seeks to be a realistic vision but does not reflect potentially significant changes in the organisation, or the learning from the progress of the implementation (recognising that resistance to change could have been either over- or underplayed). There is also a risk of losing impetus or direction if you are almost at the end of the data strategy before defining what lies ahead. As I stated earlier, the data strategy differs from many other organisational strategies in that it represents a continuum; data will persist in the organisation regardless of the direction the corporate strategy takes.

Therefore, I recommend operating a rolling and dynamic approach to data strategy definition, which in turn ensures a closer coupling between definition and execution – the lack of which, as stated earlier, is the biggest cause of strategies failing to deliver.

10.6 TEN TO TAKE AWAY

The key themes to take away from this chapter are:

  1. Data strategy implementation needs to interlock with other change programmes or activities to ensure there is continued alignment, with a particular focus on any impact on your dependencies. Ensure you are tracking any wider activity that could have a bearing on the data strategy implementation.
  2. Respond to changes with agility and flexibility, as an inability to adapt quickly enough is a known cause of implementation failure. Consider the impact on the implementation programme in terms of resources, dependencies and benefits.
  3. Incorporate a change control process into the implementation of the data strategy and engage your stakeholder network appropriately to ensure there is an informed agreement to any changes to be made.
  4. Consider the impact of the strategy implementation on resources – human and financial – to assess whether these can be addressed within the implementation programme or need to be escalated.
  5. The implementation team will change over time and it is important to balance the risks of losing knowledge, continuity and personal relationships with the opportunity to bring in different perspectives and to keep the team energised. Plan for this as best you can: consider succession planning, broadening knowledge across the team and how to accelerate the integration of newcomers to the team.
  6. Do not overlook the development of skills within the implementation team once the programme is up and running. Assess training needs, ensure breadth of knowledge supplemented by the experience the programme can offer, and bring in people from the areas engaged in the data strategy implementation to share their experience. Allow time for learning and development in the programme.
  7. Use maturity assessments to review progress through implementation. Information literacy will be key to making the shift that underpins the data strategy stick within the organisation, and it needs to be embedded in the implementation programme as a key transformational deliverable.
  8. Consider the resourcing available, and assess whether there is a need for additional resources – whether recruited permanently or as contractors or consultants – to enable the implementation to be delivered at the required pace and quality. The data strategy may be a conduit to defining a CDO role for the first time, providing a focal point for data management and exploitation in the future.
  9. Build advocacy in the organisation as you communicate change. Bear resistance to change in mind; understand how change has been managed within your organisation to learn from experience. Build communications into your implementation plan and look to establish ways to share across the organisation and build some momentum to establish trust and understanding.
  10. Explore the feasibility of running a dynamic, rolling data strategy to keep it moving forward, reviewing it constantly and providing the flexibility to accelerate or decelerate as appropriate. Ensure the implementation is focused on measurement, to demonstrate the value the data strategy is delivering to the organisation.

 

1 Attributed to C. Darwin, On the Origin of Species by Means of Natural Selection. London: John Murray, 1859.

2 B. Disraeli, Speech on Reform Bill of 1867, delivered in Edinburgh, 29 October 1867.

3 D. Sull, R. Homkes and C. Sull, Why Strategy Execution Unravels – and What to Do About It. Harvard Business Review, March 2015. https://hbr.org/2015/03/why-strategy-execution-unravelsand-what-to-do-about-it.
