9 EXECUTING THE STRATEGY – PART TWO: DELIVERY

‘You have to be fast on your feet and adaptive or else a strategy is useless.’

Attributed to Charles de Gaulle

The data strategy is drafted and approved, and you have completed the mobilisation stage discussed in the preceding chapter. It is now time for action.

Mobilisation has given you the opportunity to review the continued relevance of the data strategy at a more detailed level, challenging the RAID log and enhancing it with the level of data you will need to track it actively as part of the implementation of the data strategy. Resources have been allocated (or, potentially, you know there are no further resources coming – still a certainty, even if a more challenging one!), timescales and priorities agreed, and now it is purely about delivering on the promise.

How you structure the implementation, whether as a programme or a loose arrangement of activities undertaken through other routes but tracked centrally, is entirely down to you, your organisation’s preference and the resource you have available to bring formality to the strategy’s delivery. Nevertheless, there are some practical tips and personal preferences I will share with you in this chapter, providing some guidance for whichever data strategy implementation path you choose.

9.1 ASSIGNING ROLES AND RESPONSIBILITIES

It is important to establish the roles people will play across the organisation as part of the delivery of the data strategy. The standard model used for some time in programme management is the RACI model, which identifies responsibility, accountability, those needing to be consulted and, finally, those to be informed. There are variations on this theme; I quite like the addition of support into the mix (those who will help and/or support those responsible), but it is a well-understood model and as such is probably familiar to some of your colleagues, if not to you yourself. See Figure 9.1.

I do not intend to cover in detail the thinking behind the usage of RACI (or RASCI) models as there are plenty of reference sources available should you wish to pursue this topic in more depth than space permits here. I shall instead provide a generic view of how you may wish to apply your RACI in the specific context of the data strategy and leave you to investigate other, broader, perspectives in the mass of literature on this topic.

Whilst the acronym leads with those responsible, it is perhaps more pertinent to start with who is accountable (A). Someone has ultimate accountability for the implementation of the data strategy. This may be your role, reporting in to a board or senior team to outline and demonstrate progress, or that of a sponsor for the programme. Accountability should never be shared (unless a role is, of course, shared, such as a job share in which the role is managed carefully as one despite being undertaken by two). The point of accountability is that it is clear to all who has final say and determines direction and the corralling of resources to deliver that for which they are accountable.

Figure 9.1 RASCI (Source: D. Mann (2017) Working Well with Distributed Teams. Akoonu. https://www.akoonu.com/blog/working-well-with-distributed-teams/)


It is feasible to have accountability split across multiple places at a more detailed level, and this is actually quite common for large programmes such as the implementation of a data strategy. This entails a hierarchical approach, breaking down the programme into a series of deliverables, each of which could have a set of tasks attached to it. In such a scenario, the accountability may be defined by groups of activities to be undertaken; it could even sit at a task level, though this tends to be unusual unless that task is so discrete and significant that it requires individual accountability.

Responsible (R) may apply at any level, from the programme delivery owner, reporting in to a sponsor who is held accountable, down to detailed tasks for which someone is responsible for delivery. Unlike accountability, responsibility can be shared: there could be multiple people responsible for achieving a task if it involves a collaborative effort. Those responsible would be expected to track and report on progress regularly, as this is what helps guide the accountable party in determining whether the programme is on target to deliver outcomes to intent, time and budget.

Those who are consulted (C) will be key agents in the delivery of tasks, whether individually or collectively. Their role is often to bring subject matter expertise, specialised input and alignment of activities to that which lies outside the programme, such as other programmes or business-as-usual activities which are undergoing changes and therefore need to be considered. They therefore perform a critical role in advising and guiding the programme to ensure there is integrity between the programme deliverables and those supporting the everyday running of the organisation. Whilst they are there to be consulted and to advise, it would be challenging to go against their opinion or recommendations without documenting and reaching agreement as to why the programme needs to do so. Therefore, the consulted can have more teeth in blocking the programme if it is not seen to be consistent with the wider direction, and so the term may not be as passive as perhaps it first sounds.

Finally, the informed (I) group will be a mixture of those who are impacted but not directly participating (they may have to work in a different way in the future as a result of the data strategy, but their views or opinions would be swept up by one or more individuals consulted on their behalf); those who have an interest in tracking your progress and therefore receive regular performance reporting (could be the board, or some group your sponsor is accountable to within the organisation); and those who have an interest that is peripheral unless the data strategy implementation strays into territory outside that which was initially considered (hence the importance of tracking changes throughout the programme). This could arise if there is a compliance issue that was unforeseen, resulting in those who were to be informed suddenly needing to be consulted, for instance.

It is key to keep the informed group engaged, which is why the communications approach is so critical to get right throughout the implementation of the data strategy; otherwise there is a risk that the changes delivered via the implementation fail to land effectively through a lack of buy-in or simply low levels of awareness.

The addition of support (S) into the RACI provides a clear line of sight to those who can enable the successful implementation of the data strategy. This group will play a role in helping you deliver, whether through the provision of resources to support those who hold responsibility, the delivery of messaging via communications, or the tracking of progress via a structured programme management and financial control process. Capturing this additional area to assign supportive roles is vitally important to ensure those who need to work with the implementation programme know what is expected, when, why and how it is to be achieved. It also enables the responsible group to know whom they can turn to in order to achieve the desired outcome.

With a clear RACI (or RASCI) in hand, you should have a good understanding of not only what is to be delivered through your plan but also who is undertaking what roles to bring it to life. One simple, and relatively obvious, point – do not focus on named individuals per se, as there may well be changes in personnel over the life of the implementation plan, but capture the roles that are essential to delivering on your plan, with the names of individuals currently in those roles as a secondary-level input. You are seeking to fix the RACI within the organisation; do not come unstuck as people within your organisation come and go.
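
To make this tangible, the sketch below shows one minimal way a RASCI matrix might be captured, keyed on roles rather than named individuals. The deliverables, role names and post-holders are illustrative assumptions only, not a prescribed structure; keeping the post-holders separate from the matrix means personnel changes do not force a rewrite of the RASCI itself.

```python
# A minimal, illustrative sketch of a RASCI matrix keyed on roles, not people.
# Deliverables, role names and post-holders are assumptions for the example.

RASCI = {
    "Data quality baseline": {
        "Accountable": "Chief Data Officer",
        "Responsible": ["Data Quality Lead", "Domain Data Stewards"],
        "Supporting":  ["IT Service Desk"],
        "Consulted":   ["Compliance Manager"],
        "Informed":    ["Executive Board"],
    },
    "Reporting platform migration": {
        "Accountable": "Head of Data & Analytics",
        "Responsible": ["BI Platform Lead"],
        "Supporting":  ["PMO Analyst", "Communications Partner"],
        "Consulted":   ["Finance Systems Owner"],
        "Informed":    ["All department heads"],
    },
}

# Current post-holders are held separately, so personnel changes do not
# invalidate the matrix itself.
post_holders = {"Chief Data Officer": "A. Example", "BI Platform Lead": "B. Example"}

def check_single_accountability(matrix):
    """Flag any deliverable that does not have exactly one accountable role."""
    return [d for d, roles in matrix.items()
            if not isinstance(roles.get("Accountable"), str)]

if __name__ == "__main__":
    print("Deliverables lacking a single accountable role:",
          check_single_accountability(RASCI) or "none")
```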

One final observation. The use of RACI within organisations does not mean that it is necessarily followed. This may seem a perverse statement: after all, the whole point is to assign roles and responsibilities. Often, it is the grey areas which cause issues, with people being too obsessed with what their responsibility is rather than being flexible and responsive in how they engage with others when such vagaries arise. Therefore, avoid making the RACI bureaucratic; it needs to be indicative, but there is still no excuse for a lack of collaboration. A good idea is to connect the responsibilities within the RACI in a value-added way, demonstrating ‘why’ something should be done, in a manner similar to a value chain. This focuses the mind on completing end-to-end activities to release value, with the RACI acting as a guide to who plays what role in making it happen.

9.2 PLAN FOR ACTION, PREPARE FOR CHANGE

Operations keep the lights on, strategy provides a light at the end of the tunnel, but project management is the train engine that moves the organization forward.

Joy Gumz1

The implementation phase, once in action, will take the plan, defined through the mobilisation stage, and assign resources. At this point, the plan is communicated to all stakeholders to ensure that there is a thorough understanding of how, when and where the implementation will take place – the detail on what needs to be achieved by the organisation as a whole, and specific functions in particular. Bear in mind that the strategy may well have only touched a minority of staff in your organisation in its formulation, so the need to communicate the strategy takes on greater importance if you are to succeed in taking your colleagues with you.

It is highly likely that the implementation plan will go into greater detail for the months or year immediately ahead, whilst sketching out the rest of the data strategy implementation in less detail. Do consider the validity of the waymarkers when determining the features of the first year, as consideration will have been given, in defining them, to the cadence to be expected in delivering the data strategy. If you know more detail such that it changes the focus, accelerates or decelerates the pace, or introduces new inputs which align with the core intent of the data strategy, then you should not feel a hostage to the waymarkers. However, these will have been a factor in the sign-off process, so you should communicate the need for change openly and vigorously, to ensure there is no confusion as to the rationale later.

This plays to one of the key themes that will be at the heart of project managing the implementation of the data strategy – change management. It is essential to recognise the likelihood of change throughout the implementation phase, and it is important to engage all parties openly on the need for (and approval of) change, as well as to maintain clear lines of communication as to why change is required, the impact it has and the demand it now places on others to meet the revised plan.

The significance of preparation in implementation planning was highlighted in the previous chapter, suitably referencing Abraham Lincoln’s quote about preparation time for the axe to cut down the tree. Ask experienced programme managers what made for successful programme delivery and one of their key messages will almost certainly be the need to focus your plan on what you intend to do, even if reality quickly diverges from it. As Winston Churchill put it: ‘Those who plan do better than those who do not plan even though they rarely stick to their plan.’

Therefore, familiarise yourself with the need to construct effective change management controls in your plan and to recognise that there will be divergence, whether that takes the shape of time, resources, focus or a mixture of all three. An inability to flex, make effective decisions and communicate those changes with clarity to all who need to be aware will result in your plan – and, thereby, the data strategy – failing almost as soon as it has started.

The first change you find needing to be implemented will be the opportunity to put your programme governance and controls to the test. I suggest you trial this prior to going live, as there is little to be gained from losing momentum by having to work out how to respond on the hoof; it will only cost time and confidence in your plan. Walking through the change management approach will check how effectively it engages those parties who need to know about the change, and establish how efficiently those areas are able to adapt to it, informing you of the flexibility and adaptability your organisation has built into its DNA. You may well have a feel for this yourself, as it is one of those things that underpin the type of culture your organisation has, so whilst you might not be able to spot it via any documented evidence, you may have observed the inherent inertia or resistance to change on other activities you have been party to.

Change is inevitable; if your plan does not change then it is more likely that you are aloof from the challenges your organisation is facing than that your environment is so stable the implementation plan does not need to adapt and evolve. Unless you operate in a highly regulated, slow-moving sector – perhaps nuclear energy being one of the few examples – your organisation will be buffeted by the winds of competition, compliance, economic stability, resource availability, sector changes, supplier delivery availability and pricing impacts; the list is almost endless, and applies equally to small businesses and to those operating on a global scale.

The coronavirus pandemic has shown the fragility of our economies to factors outside our control that can have a global impact, regardless of the size of business or the sector in which you operate. Even those able to ride the storm and see demand soar have faced pressures in sourcing raw materials and in resource constraints, not to mention trading patterns that changed so regularly they could not easily be predicted. Many organisations with perishable goods in their supply chain saw those assets disposed of due to an inability to trade. Whilst this has been an unusual event in its scale, it shows the vulnerability of even the best-run organisations regardless of size.

I have stressed the likelihood of change, the importance of adaptability and the value of being able to take it in your stride as essential features of a successful outcome. I should add that if you find yourself having to diverge significantly from the underlying data strategy that you are tasked with implementing, there will be a trigger point at which that change is so great that the data strategy needs to be paused and consideration given to updating it. This results in a reset of the implementation; there are too many instances of strategies of all types no longer reflecting the reality of the implementation, with a confused outcome resulting from a lack of common understanding and agreement on the direction being taken.

As Michael Porter said: ‘Strategy must have continuity. It can’t be constantly reinvented,’ adding: ‘continuity of strategic direction and continuous improvement in how you do things are absolutely consistent with each other. In fact, they’re mutually reinforcing.’2 Indeed, Masaaki Imai, regarded as the father of kaizen – defined as gradual, unending improvement, doing ‘little things’ better; setting, and achieving, ever higher standards – describes incremental change as a key part of continuous improvement. He says it is ‘not a paradigm shift or invention, but slow and steady progress is the most innovative. It helps apply change easier, as well as giving the reins to the organization rather than having to respond to external forces.’3

You will also require some form of programme governance to track what you are achieving, assess progress versus the plan via metrics and bring the RACI to life to ensure there is the right engagement at all levels to keep things on track. This will be a critical window into the programme for your sponsor and/or those expecting you to deliver a successful implementation, but you should also treat it as a shop window to demonstrate the change you are making to the organisation. This chapter will cover the other key aspects of the governance you will require, but ensure you have representation across your stakeholders in the way you govern the implementation phase, as it is better to have all parties inside the tent focusing attention and effort collectively on the same objective than to have groups isolated, outside the governance but able to disrupt at a more senior level within your organisation.

9.3 CUSTOMER ENGAGEMENT

The theme of customer, or stakeholder, engagement has run throughout this book, so it should not come as a surprise to have it raise its head here, in this important chapter about the delivery of the data strategy.

The preceding section talked of the RACI (or RASCI, if you prefer) and aligning the implementation plan to the resources across the organisation. It is essential to consider the engagement across the organisation, as any data strategy needs total buy-in from all of it to succeed. There is not a single function within the organisation that operates in a vacuum without using data in some way, so this data strategy is for every function – it is neither optional nor limited in its application. You will need all parts to engage, and the more they do the more likely you are to succeed.

9.3.1 Communications

Many of the communications challenges were covered in Chapter 7. However, in the context of customer engagement during strategy execution, the important factor is how to compete for communications space and deliver a compelling message that makes your colleagues take notice. This needs to be your focus when devising your approach, and the constant refrain in your discussions with communications colleagues, to ensure it stays at the forefront of the messaging. It is essential to consider how you will deliver impact through communications rather than assume the act of communicating is, in itself, the answer.

Most organisations have far too much going on and, to put it bluntly, many messages from your communications function are likely to fail to land as well as hoped. This may be due to the sheer amount of noise in the organisation, making it difficult to comprehend how any of this comes together and what takes priority, or it might be that the communications are failing to reach the audience. I have experienced the latter, especially in those organisations that determine that the ‘go to’ place should be the intranet, or some social media type platform. That is not to say these platforms are not a good option, but do some analysis of the channels (it is a data strategy you are delivering, after all, so this should be familiar to you!) to assess effectiveness (measurement), breadth (coverage of the organisation using it) and frequency (how often they are accessed) to understand what they bring and how best to utilise them. A blended approach will almost certainly reap better rewards, especially if these are linked to drive a coherent story.
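
By way of illustration, the sketch below shows one way such channel analysis might look, computing simple effectiveness, breadth and frequency measures from hypothetical access-log records. The channel names, headcount and log fields are assumptions for the purpose of the example rather than a description of any particular intranet or platform.

```python
# A minimal sketch of channel analysis from hypothetical access-log records.
# Channel names, staff counts and log fields are illustrative assumptions.
from collections import defaultdict
from datetime import date

TOTAL_STAFF = 1200  # assumed headcount, used for the breadth calculation

# Each record: (channel, employee_id, access_date)
access_log = [
    ("intranet", "e001", date(2024, 3, 1)),
    ("intranet", "e001", date(2024, 3, 8)),
    ("intranet", "e002", date(2024, 3, 2)),
    ("team_briefing", "e003", date(2024, 3, 5)),
]

stats = defaultdict(lambda: {"hits": 0, "unique_staff": set()})
for channel, employee, _ in access_log:
    stats[channel]["hits"] += 1
    stats[channel]["unique_staff"].add(employee)

for channel, s in stats.items():
    reach = len(s["unique_staff"])
    print(f"{channel}: hits={s['hits']}, "           # effectiveness (crude proxy)
          f"breadth={reach / TOTAL_STAFF:.1%}, "      # coverage of the organisation
          f"frequency={s['hits'] / reach:.1f} per person")
```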

Often, those who are the most engaged will utilise the broadest range of channels to garner information, particularly within organisations that have multiple communication delivery routes available. This can lead to a sense that social media platforms are exceedingly popular due to the number of hits obtained, but actually it is a minority of the workforce using those channels disproportionately. Invariably, it is those who tend not to utilise the broad range of channels who are the hardest to reach but the ones you want to influence the most. This requires some creativity to identify how to get to the breadth of the organisation, and is where having local knowledge comes in very useful.

Wearing my analytics hat, it is disappointing (but maybe not surprising) how lacking communications delivery is in terms of value-added metrics, counting hits (and usually not unique ones at that) or treating the mere fact that something ‘is out there’ as job done. I have to say I’ve worked in far more organisations with intranets whose search engines seem designed to find what you don’t want than ones where search works well – the experience everyone expects is akin to an internet search, and the return of hundreds or thousands of hits in which nothing on the first two pages is even related to the query simply undermines the credibility of the intranet.

What does this mean for your communications strategy and customer engagement in this implementation stage? Simple: don’t leave it to the communications team; take the bull by the horns and devise messaging for those who are connected to the programme to deliver. Remember the advice in the earlier chapters to get people aware of and involved with the data strategy, the show and tells, the opportunistic seeking out of ways to embed your message alongside others and to weave a common thread, the strategy storytelling approach of Chapter 6. If you own the implementation, you own the communications. Remember what the RASCI indicated. You are supported by the communications team but you are accountable for this activity, so make it work.

Remember, too, we all learn and absorb information differently. It is quite likely that your organisation hasn’t developed a sophisticated approach to communications in which it segments the channel, message, even the timing, based on what works best for the individual (this may be one of the goals of your data strategy in due course). As someone with some awareness of data and analytics, you should not be surprised to discover that we all like to listen and learn differently, so a ‘one size fits all’ approach is never going to work.

Think about this carefully. Try to remember the last six to ten corporate messages you were told, set them down and try to identify the important ‘takeaway’ message in each. Do the same with some colleagues in different areas who should have had the same corporate messages. Compare notes. See how consistent they are, from whether you have the same list through to the detail of the takeaways. Invariably, you will discover differences and may not even have the same list – some messages will resonate for certain groups and completely pass others by, due to the brain filtering out those which do not seem relevant on the surface. Ask how the others received, or gathered, the information. Chances are there will be differences there too, ranging from the opportunity to discuss and engage leaders on such things through to simply receiving these messages via a written update.

9.3.2 Employee engagement with the strategy implementation

De Bussy and Suprawan conducted a national study into which stakeholders had the greatest impact on corporate financial performance.4 The coefficient for employees was 0.84, which left customers (0.36), suppliers (0.35), communities (0.32) and shareholders (0.08) in the distance. Whilst this research relates to a different field – financial performance – the findings would seem to hold true with regard to data strategy implementation too. Employees have to be engaged to drive the impact – if they are, then there is a clear association with driving successful outcomes.

Think about the governance you intend to put in place for the implementation of the data strategy. How do you want to ensure there is transparency, evidence and total commitment to your cause? In practice, this will demand that you make this everyone’s business, something they want to participate in and influence in terms of the practical delivery steps. This will be onerous, challenging (having engaged stakeholders can be more of a handful than a seemingly passive organisation) and time-consuming, but it will bear fruit and achieve the best outcome through a really effective collaborative approach. You will need to be prepared, able to demonstrate progress and in control of the narrative and direction, but be ready and willing to adapt and flex to achieve those outcomes you want. Getting there with everyone behind you, even if the route is on the surface a little slower and delivers something not quite what you had expected, is an achievement.

Having identified the key players to be involved in the governance group, ensure that there is clarity on how you expect them to communicate and the messaging to be delivered (and how this is defined and agreed by the group). Seek inputs across the group, and do not be afraid to abandon the flawed ‘one size fits all’ strategy – explore the reasons and the pitfalls of such an approach with the group and agree on the alternative, even if it is tailored to different areas of the organisation and takes advantage of some understanding of communications segmentation.

Many strategy implementations fail as the communications all but dry up, starting off with gusto and then turning into a trickle some months down the line, till there is no communication being issued at all. Whilst there is a risk of communicating with nothing to say, you do need to recognise that across the spectrum of the strategy implementation there are likely to be successes and progress to be reported, and often the art is ensuring you keep your programme front of mind through those periods when otherwise the stories might dry up. Remember, there will be communications of some sort in each part of the organisation on a daily basis; it is as much about seeking opportunities to align the work the implementation programme is doing as it is to have headline stories. Consider how you can devise links between such announcements into the strategy implementation, and support the relevant team members to make the most of such opportunities within their own functions.

Continuous engagement is essential if you are to keep the momentum behind the strategy implementation. A loss of engagement is one of the most common causes of implementation failure,5 and programmes are prone to over-elaborating the level of content required before communications can be released, which leads to delay and, when ready for release, too much information being issued to be absorbed effectively. Little and often – as often as possible – is the order of the day to keep progress at the forefront of minds and the lines of communication open.

At every step of the strategy implementation, the impact on various parts of the organisation should be assessed and relevant communications drafted in conjunction with the key stakeholders representing those areas of the organisation. The programme implementation team should have key links into the relevant business areas, and be able to tailor messaging, spot opportunities to integrate messages and brief the appropriate people for onward dissemination of those messages.

In my experience, there is always an angle to be found in linking data strategy to the communications activity within the organisation. However, I have also tended to utilise key messages within ‘local’ communications to reinforce elements of the data strategy implementation, rather than call them out specifically as data strategy deliverables. Putting the message into the localised context, in words that sound familiar to the audience receiving that message, increases the likelihood of adoption and support. It also binds the local leadership into the message they deliver, ensuring support for the data strategy implementation, which is key to your success.

9.3.3 Alignment with the corporate strategy

Clearly, the link between data strategy and the organisation’s corporate strategy is integral to providing the important grounding and purpose behind the data strategy. This should be enshrined in the data strategy itself, but is an important communication message in its own right. Alignment between data strategy deliverables and the roles these play in enabling delivery of the corporate strategy should be clear for all, so the implementation plan needs to retain sight of these and track the wider impact as the implementation progresses.

Just as the data strategy needs to be flexible and have a coherent and integrated change management approach, so the corporate strategy should have adopted those same principles. Connecting the change management activities of both strategies is therefore essential to keep the alignment between the two strategies and to make sure that any change in the corporate strategy is assessed as early as possible (ideally, well in advance of the change being approved), and any dependencies or enablers reflected in the data strategy and how it plays into facilitating the strategic objectives of the organisation.

In the same way that the corporate strategy has been distilled into a series of divisional and functional objectives via an implementation plan, so the data strategy needs to align itself to a similar construct. This is essential, as otherwise a disconnect will develop between the corporate strategy implementation and your own data strategy implementation that will threaten to undermine the effectiveness of the latter.

9.3.4 Retaining agility

The role of Agile in the process of defining and implementing the data strategy has been touched on at various points in this book. Much of the communications approach also fits well with an Agile approach, blending the milestones in the implementation plan with the opportunism of identifying how to communicate messages effectively alongside other plans to disseminate information across either part or all of the organisation.

As with any complex implementation programme over a longer time frame, keeping the messaging fresh and relevant becomes a challenge, especially if the communication does not directly impact those in receipt of the message. This is not unique to strategy implementation, but messaging can become detached if it loses the sense of ‘what does it mean for me?’ in what is being disseminated. This is where the blend of Agile and local expertise in understanding the audience is so important.

There is always a risk with any programme implementation that the focus on milestones, the longer-term success and the overall intent of the strategy is too distant for some in the organisation to understand. I recommend taking a more thematic approach to turning the deliverables into use cases (rather than user stories), enabling your colleagues to understand the cause and effect of the change in the context of their work as the most effective way to get traction and retain it over a period of time.

Use cases in an Agile context work well to describe functional behaviours in a flow of events that enables fine-tuning of the outcome, which is particularly suited to data strategy implementation, given there may be several ways to deliver a change to the benefit of the end user. The important thing is to focus on value, and not to spend too much time defining use cases to absolute levels of detail: they are indicative rather than a blueprint, and you will iterate the solution from the foundation of the original use case.

Agile provides the means to deliver outputs and outcomes within a self-imposed constrained timescale that move things forward. There are a number of Agile approaches, and I particularly like the dynamic systems development method (DSDM) version as it is more project-based than many (Agile had its roots in software product development). However, there are many aspects of the Agile approach that can work and, if appropriate, I would suggest taking the best, or most pertinent, elements that these offer and defining a way of making it work for you (even if this is departing from the strict definitions of each model, it is being agile!). I have also operated using the Scrum approach, which is very team-focused and is simpler to operate due to being easier to explain and therefore quicker to adopt. Indeed, it is not uncommon to blend DSDM and Scrum as an approach, bringing the best of both together to get flexibility and effective stakeholder engagement as a key focus.

Agile is an approach which can be just as effective when applied to your communications strategy. Identify the things to be achieved over a given period – I would suggest no longer than a month, possibly shorter – and devise the practical communications that deliver a message telling your colleagues ‘what it means for me’ and also, if appropriate, what they need to do to help you keep moving things forward. Bringing them into the communication, making them feel part of the delivery, will make for a more inclusive approach to sharing the burden and, consequently, the success; the vast majority of us get a buzz from being part of a successful outcome.

Look for the key themes you want to feature through the month, start to build a range of use cases, communication pieces and briefings to share, and be prepared at the end of the month to review the progress made. As with all Agile activity, without the test and learn of what worked and what did not land as expected, you are likely to be in a cycle of repeating past mistakes. Improve, month on month, and you will start to build a following, with colleagues engaging and getting actively involved with your communications approach and sharing in the success of your implementation.

Share the measurement of your Agile communication sprints (a sprint being a time-boxed period used within the Scrum methodology to deliver an agreed workload) as a key part of your overall progress. Don’t forget, a successful implementation is in large part about overcoming those communication issues which hinder the majority of strategy implementations, so this should be one of your programme KPIs.

9.4 PROJECT TEAMS, A PMO AND THE DATA AND ANALYTICS FUNCTION

Inevitably, the data and analytics function (should you have one; if not it may be a collective of those with similar skills working informally across an organisation, or possibly just yourself as strategist, advocate and enabler) will play a key role in the definition and implementation of the data strategy. It is likely that many in such a function would have shaped the thinking of the data strategy and input to it along the way, possibly even helping with the final drafts. It is unlikely therefore that the data strategy implementation will come as a surprise.

The implementation will need a level of project coordination and tracking if it is to succeed. It is always a risk, once the strategy has been defined, that the daily activities of the organisation lead to the implementation of the strategy being lost, so the organisation reverts to a tactically driven approach, failing to achieve the things which move it forward in a meaningful way. It is easy to spot such organisations: they tend to end up with numerous workarounds to key activities and a lack of design in how things work, and are then handicapped by major challenges such as technical debt – a dependency on systems which are no longer effective, compliant and/or supportable in delivering what the organisation needs.

The scale of the project team will need to align with the size and type of organisation you operate within. Some organisations will already have these aspects in place, in the form of a programme (or portfolio) management office (a PMO or, if overseeing a portfolio, a PfMO) that oversees all programmes within a portfolio. If either a PMO or PfMO is deployed to coordinate and direct the implementation, it is important that there is absolute clarity on all aspects of the data strategy: the intent of the waymarkers, the resources to be applied and from which parts of the organisation, the overall goals of the data strategy, the RACI (or RASCI) and an understanding of the RAID log – they will need to utilise the latter two in their PMO planning.

To understand the complexity of what you are trying to do, consider the number of moving parts you have probably got within your data strategy. Data features in the work of every part of the organisation, whether capture, maintenance, use, manipulation, creation or storage and retention, through to deletion. The data strategy will therefore impact everyone in the organisation, as it is going to change some of the activities in the data lifecycle. You need to ensure everyone is playing their part in delivering whatever changes you are introducing over the course of the data strategy implementation, and that needs communication, understanding and buy-in to make it a reality, all of which needs to be tracked to confirm it is happening and, more importantly, to identify where it is not and why.

In the implementation of the data strategy, relatively small things can knock the whole programme sideways. If you find the organisation still capturing data inadequately, for instance, with missing or incorrect data, then it impacts the rest of the data lifecycle. One of the slightly frustrating aspects of having worked in the data and analytics space as long as I have is that the problems relating to data quality still persist despite effective technologies, increased awareness, greater resources in the data and analytics arena, and a myriad of articles written about the topic being available today. Indeed, we have more data today, and arguably it is less accurate in totality at a time when there is increasing demand to use it.

The project team may be a generic programme management function (if such a team exists) or may be left to you or someone similar who has had some influence in the design of the data strategy or is seen as an appropriate person to lead on its implementation. Either way, the task of the project lead is to quickly turn the data strategy, its waymarkers, RACI (or RASCI) and RAID log into an implementation plan with milestones, detailed deliverables, activity owners and the mechanism to track progress and measure success. This will need inputs from others, and some coordination to do so, which is where the PMO (or PfMO) comes into play to support the definition of the implementation plan in a way that can be tracked.
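
As an illustration of the kind of tracking structure this implies, the sketch below shows one minimal way milestones and RAID entries might be captured and escalated for governance. All deliverables, owners, dates and severities are hypothetical; the point is that owners are roles from the RASCI and the RAID log is actively evaluated rather than merely stored.

```python
# A minimal, illustrative sketch of the tracking structures a project lead
# might build from the waymarkers, RASCI and RAID log. All names, dates and
# statuses are assumptions for the example.
from dataclasses import dataclass
from datetime import date

@dataclass
class Milestone:
    name: str
    owner_role: str               # a role from the RASCI, not a named person
    due: date
    status: str = "Not started"   # e.g. Not started / In progress / Complete

@dataclass
class RaidEntry:
    category: str                 # Risk, Assumption, Issue or Dependency
    description: str
    severity: str                 # e.g. Low / Medium / High
    owner_role: str
    open: bool = True

plan = [
    Milestone("Data quality baseline agreed", "Data Quality Lead", date(2024, 6, 30)),
    Milestone("Master data sources catalogued", "Data Architect", date(2024, 9, 30)),
]

raid_log = [
    RaidEntry("Risk", "Key analyst resource may be diverted to BAU reporting",
              "High", "Head of Data & Analytics"),
    RaidEntry("Dependency", "CRM upgrade must complete before customer data migration",
              "Medium", "PMO"),
]

# A simple escalation view: open high-severity items for the next governance meeting.
for entry in raid_log:
    if entry.open and entry.severity == "High":
        print(f"ESCALATE [{entry.category}] {entry.description} (owner: {entry.owner_role})")
```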

The PMO may undertake a number of activities, such as the following ten common examples:

  • Governance – provision of support across the breadth of the implementation programme, capturing decisions and tracking actions to enable an effective governance regime to function.
  • Performance management – delivery of reporting at all levels of the implementation programme to set standards. Proactive tracking of issues and assessment of current or future risk of performance failings with mitigation identified where possible.
  • Planning – focused on milestones and tracking the plan to ensure deliverables are on course. Includes breaking down the plan into activities and ensuring these are tracked and aligned at all times to provide consistency and line of sight to any risks.
  • Risk management – capture and maintenance of RAID logs and assessing scale of risks. Includes issues and dependencies, as well as tracking assumptions to ensure these remain valid. Evaluates RAID to determine escalation as appropriate.
  • Human resources – ensures the resources are optimised to deliver the programme, identifies potential gaps and risks, and seeks to mitigate these through prioritisation and planning. Maintains the resource view of the implementation plan to track utilisation and need to roll on/off resources.
  • Financial resources – oversight of budget, tracking progress and forecasting future spend based upon plan and commitments. Production of (at least) monthly reporting to senior stakeholders on tracking spend versus progress to flag whether on track.
  • Supplier/stakeholder management – ensures third parties and other delivery arms within the organisation are aligned and delivering to plan, on time, to quality. Reporting performance wherever appropriate.
  • Communications – oversight of communication plan, tracking deliverables and ensuring these align to the key messages the programme wants to provide to stakeholders. Manages sign-off processes and identification of opportunities to open new channels within the organisation.
  • Quality control – provides a quality assurance function to ensure delivery meets requirements as captured and obtains sign-off. Undertakes benefits analysis in conjunction with stakeholders to ascertain these have met expectations, and conducts lessons learnt activities from such reviews.
  • Document management – capture, cataloguing and availability of programme documentation. Ensures there is a consistent format to programme documentation.

Alongside the capability assessment, discussed in the previous chapter, there is a need to establish the capability of the data and analytics team to be more directly involved in driving the activity in the implementation plan, taking ownership and integrating it into their own goals and objectives. How effective an approach this will be will depend on:

  1. how skilled the data and analytics team are to step into this space;
  2. their influence and reach within the organisation to drive this through;
  3. their capacity to do so, given other work demands and priorities.

It is an ideal opportunity, should it be needed, to bring the data and analytics team (or community, in the absence of such a team) to the forefront of the organisation as all too often they operate in the background and lack the opportunity to exert influence on the organisation’s thinking. This provides a great platform to be able to elevate their profile, demonstrate their wider business appreciation (assuming this is found to be the case through the capability assessment – not all data and analytics professionals possess contextual knowledge or the ability to apply it in a wider business environment) and see the impact that their work has in making a difference. In addition, it provides the opportunity to develop broader skills, whether those are in project management, benefits tracking and realisation or organisational understanding.

Just as the wider organisation may lack the information literacy skills that are essential to embed an intelligence-led approach to decision making, so the data and analytics team may lack the business awareness or depth of understanding to be truly effective in reaching in to drive change across the organisation. This is where finding those with the ability to bridge both of the groups is pivotal to a successful implementation. Without the business appreciation and comprehension of how the organisation operates, delivering change becomes a remote and, therefore, detached activity that lacks the necessary impact or the appropriate level of awareness across the organisation.

The capability assessment should have established how big a hill this is to climb and, indeed, this may be one of the earlier deliverables in your implementation programme, as the increased levels of awareness across the organisation are key to driving real change. If this is not tackled, then do not expect there to be any appetite to change to something unknown and potentially seen to be irrelevant. Resistance may be driven by fear of change, but it is as likely to be an indicator of failing to sell the benefits and importance of the change to those you need to influence. There will be change resistance but, equally, it may be just as much a failing to engage and inform.

Do not assume everyone ‘gets’ the data strategy: communicate it effectively and with context for the various parties you need to engage, and prepare the ground for plenty of challenge on the case for change.

There is a strong case for bringing data and analytics expertise into the lead roles in implementation delivery, and not just because of the subject matter expertise such individuals bring. The opportunity to see the end-to-end nature of the data strategy enabling the corporate strategy to succeed will enlighten those in the data and analytics function not only as to the impact of their work but also as to the way in which it enables the execution of the ultimate outcome. It is often the lack of opportunity to gain wider experience that holds back the influence of data and analytics professionals within an organisation, yet they have insight and expertise not found elsewhere in the business, able to define and deliver innovative solutions to business problems that might not otherwise be considered, let alone devised.

This is likely to require some additional skills within the data and analytics function, especially in operating as part of an implementation programme rather than simply as a contributor to it, so do bear this in mind when selecting those you wish to include. Make time to explain the benefits and opportunities this will bring, and the importance of being able to listen and learn to capture insight on business problems which the wider team could tackle – this is potentially bringing the data and analytics team nearer the frontline activities, which is where the biggest impact is often felt.

The extra skills the data and analytics function may need include understanding how Agile delivery methods operate, so that its members can be effective proponents of such techniques. I am a big advocate of utilising Agile in the approach to delivery in data and analytics, as I feel it provides rapid value to a customer, builds a level of understanding and sets expectations of needing to learn through delivery, given the answer to complex problems is often nigh on impossible to define through detailed requirements up front.

If the lead role in the data strategy implementation is taken on by the data and analytics function, utilising Agile methodologies, the awareness of programme management will increase within that function. It will also enhance the way in which intelligence is used within a PMO/PfMO, given the skillset likely to be found amongst those in a data and analytics function, and support the coordination of the range of moving parts within the implementation phase. It gives further opportunities to see how the wider organisation engages with information, increasing awareness of how future deliveries could be made easier to digest and act upon effectively without having to second-guess what the analysis indicates to be the right decision. Having subject matter expertise embedded in the implementation programme is also likely to keep a level of focus on the aims of the data strategy and an informed view on the impact of deviating from what was defined within the strategy.

The goal of having members of the analytics team involved directly in the implementation also presents an opportunity for those who might otherwise tend to be in the background to come to the fore and present their skills in a transformative way. It is a proactive approach, enabling the analyst to fulfil the role of being a change agent to link their work directly to the impact on decision making, increase their business awareness, enhance their stakeholder network and improve visibility of their capability to the wider organisation.

The impact of having clear benefits and being able to translate these into KPIs is covered elsewhere in this book. I do not propose to go into great detail on KPIs other than to suggest that the data and analytics function should probably have some ownership or oversight of how these are compiled and represented due to the nature of the implementation programme being focused on data strategy. The KPIs must be relevant and significant in what they measure whilst also being deliverable – do not define what cannot be reported due to a lack of data or a lack of clarity in their definition. There is little to be gained from defining KPIs which are, in themselves, riddled with quality and reliability issues, as this will only serve to undermine the implementation of the data strategy.
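
To illustrate the point about deliverability, the sketch below shows one way a KPI definition might record its data source explicitly and only be adopted where that source actually exists. The KPI names, targets and source systems are hypothetical assumptions, not a recommended set of measures.

```python
# A minimal, illustrative sketch of KPI definitions that are only adopted
# where the underlying data source exists. Names, targets and sources are
# assumptions for the example.
from dataclasses import dataclass

@dataclass
class Kpi:
    name: str
    definition: str
    data_source: str
    target: float
    unit: str

available_sources = {"hr_training_system", "dq_profiling_tool"}  # assumed inventory

candidate_kpis = [
    Kpi("Data literacy training completion",
        "Percentage of staff completing the data awareness module",
        "hr_training_system", 80.0, "%"),
    Kpi("Critical data element completeness",
        "Percentage of critical data elements passing completeness checks",
        "dq_profiling_tool", 95.0, "%"),
    Kpi("Decision latency",
        "Average days from insight delivery to decision taken",
        "decision_register", 10.0, "days"),   # no such register exists yet
]

reportable = [k for k in candidate_kpis if k.data_source in available_sources]
deferred = [k for k in candidate_kpis if k.data_source not in available_sources]

print("Adopt now:", [k.name for k in reportable])
print("Defer until the data exists:", [k.name for k in deferred])
```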

The data and analytics function will be an important delivery vehicle for the implementation programme, both in what it delivers and in the wider spectrum of contacts and insight within the organisation it may be able to provide. As such, the function is a critical dependency for the successful implementation of the data strategy, and it will be essential that all the resources in this function are earmarked to support the implementation.

If resources prove to be constrained, then I would question the prioritisation process and treat this as a potential indicator that the goalposts in the organisation have moved, and that the relevance of the data strategy implementation may have been lost or made redundant due to wider events taking priority. An effective implementation programme with strong leadership, sponsorship and an active PMO/PfMO should be able to spot this emerging long before it becomes an issue, and adapt the programme accordingly.

The scale of the data and analytics function will reflect the maturity of the organisation and the level of investment it has made to date in this area. If you are just starting out on the journey, you may lack the depth of resources within the organisation and also be constrained on funding to bring in external expertise to assist. However, the data strategy covers the breadth of the information lifecycle, not just data management activities, and so needs to represent this in its implementation (see Figure 2.1). Many of the core data activities are critical enablers of those which exploit the data, and whilst it might be overplaying it to suggest that without progress on the data front the exploitation activities would not be feasible, it is almost certainly the case that they would be highly suboptimal, with work having to be repeated or actions compromised because of issues with the data. Do not lose sight of the nuances, the dependencies and the potential efficiency gains which, whilst they may be embedded in the data and analytics function, can constrain or release resource and so have a massive impact on what can be delivered to the wider organisation.

9.5 THE PRIORITISATION CHALLENGE

The implementation of the data strategy will need to be assigned to the relevant individuals and teams within the organisation based on the RACI (or RASCI) within the plan. However, those individuals and teams will have existing commitments and the anticipation of working towards future deliverables, so the engagement prior to confirming the implementation plan must determine the resources needed to commit to the data strategy deliverables. This leads on to a discussion regarding prioritisation.

Clearly, there is a strong likelihood that a proportion of the current workload will contribute in some way to the implementation of the data strategy – it is unlikely that the data strategy will start implementation with a major shift from the current approach, but if this is the intent, there is a need to be clear on it and to make sure the communication process provides not only the rationale but also the implications for current activity and direction. Therefore, an initial assessment of the workload within teams, the direction in which they are heading, and the availability of the right resource – those with the relevant skills and knowledge – is a high priority for engagement at the start of the implementation phase.

The key lever to recall throughout the discovery stage at the start of implementation is the direct relationship between corporate strategy and the data strategy. The latter is a critical deliverable to make the corporate strategy a reality, either directly or indirectly. Understanding this is important when it comes to the prioritisation calls that lie ahead. All activities in the organisation should be selected on the basis that they are compatible with, and enabling, the corporate strategy to become reality. There are a few instances where this may not be quite so clear, such as compliance activity – unless your corporate strategy has some reference to effective governance or operating at a high level whilst remaining safe and legal – but these should also enable the data strategy implementation, as they are consistent with the foundational principles of sound data management practice.

Alignment to those activities which are clearly attributable to delivering the corporate strategy will bind the data strategy activities into the same reporting mechanisms and ensure the dependencies are tracked accordingly. This strengthens the importance of these activities being delivered, thus making the data strategy implementation more integrated and likely to reach a successful outcome.

Where there are activities which do not support the data strategy implementation, there is a need to investigate the ownership of such activities and the key drivers as to why these are consuming resources. If these are contradictory to the direction of the data strategy, or consuming key resources needed to move the data strategy implementation forward, then there is a case for exploring which of these takes priority.

The data strategy implementation, though effectively a programme in its own right, may find itself having to fight battles on several fronts across the organisation to align activities to deliver in the structured approach the implementation plan has devised. It is worth using the advance planning that the waymarkers provide to define the milestones and deliverables, and to get these planned into priorities across the functions within the organisation as early as possible.

A further challenge related to prioritisation is the competition for scarce resource, especially in data and analytics functions which are likely to be key to the successful implementation of the data strategy. The previous chapter highlighted the need to undertake a capability assessment, and, if this is done, it will highlight where challenges are likely to lie, enabling those leading on strategy implementation to plan ahead where pinch points are likely to occur.

These can be offset, of course, if the data strategy has the influence to increase capacity in areas where demand so clearly outstrips supply, but there is always a time lag in recruitment (from business case submission and approval through to advertising, hiring and having the individual start), followed by a period in which the successful applicant has to familiarise themselves with what has gone before and the nature and structure of the organisation, identify key networks and become truly productive. This could take most of the first year of the implementation plan to achieve, and so it would be wise to target any recruitment dependencies within the data strategy as early as possible to mitigate this potential constraint on progressing the strategy implementation.

The data strategy is an enabler of the wider corporate strategy and, by extension, of others achieving their own goals. The other lever to use in the prioritisation challenge likely to lie ahead of those leading the implementation is therefore mobilising relevant stakeholders to apply pressure to increase the importance of implementation activities. If you are able to get those stakeholders driving the deliverables, recognising how these impact their own activities and contribute to the corporate strategy, you have a powerful alliance that increases the likelihood of getting key activities prioritised.

The most effective way to get the data strategy implementation to succeed is to have stakeholders across the organisation taking ownership of various parts of it so that they have an active stake, rather than leaving it solely to an implementation team (or individual) to shoulder the load. I can say from personal experience that an active sponsor and motivated stakeholders make or break the successful implementation of a data strategy. It is well worth the investment in getting to the point at which those stakeholders are making the implementation happen, and the work of the implementation team becomes one of coordination, guidance and advising on details within the implementation activities.

Finally, understand the governance within your organisation. Prioritisation should be undertaken through some sort of approach with a degree of formality behind it, with stakeholders assigned either as customers or as functions delivering the activity, timelines and resources agreed, and a clear understanding amongst all parties as to what success looks like as a result of delivering the activity. The mechanics through which activities are agreed and resources assigned will become your focus throughout the implementation, so be prepared for the time it takes to fully engage with these groups, understand how they operate, and learn who is influential within them and how to get activity into the prioritisation process and achieve the right outcome. This will be your day-to-day way of operating in the months ahead, so get ready to become a master of the art of governance and prioritisation.

9.6 REQUIREMENTS

Just as understanding the governance is important to appreciate how things get done within your organisation, so the ability to construct the activities within the implementation plan into requirements is essential to get them through that governance and onto the work stack of the relevant teams in the organisation.

Generally, organisations use a particular approach to capturing requirements, and there are several in common use. If your organisation does not have a formalised approach to requirements gathering, this might be an opportune time to introduce something – a study by Stieglitz (2012) found over 70 per cent of failed projects were due to a lack of requirements gathering.6 I am not intending to review the full range of requirements-gathering techniques, but I will reference a few I have found most common and/or useful, which you might want to research further if you are keen to develop your knowledge in this area.

The collation of requirements for data strategy implementation may involve one or more of the techniques listed below (or others not referenced here), but the process of wrapping up the information collated from these approaches tends to fall into one of two camps: user stories and use cases (with larger stories often grouped into epics), the cornerstone of the Agile approach; or formal, documented requirements, the more traditional, waterfall style of project approach.

I described use cases earlier in this chapter; user stories capture, in the words of those specifying the need, what the outcome should achieve, feel like or deliver. They are relatively detailed, and are then used in the product and sprint backlog process of Agile to determine what will be delivered to the customer at the end of a sprint. They tend to follow a broadly common format, along the lines of:

As a [job role], I want to achieve [goal] so that [the benefit realised or the outcome the requirement enables].
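To make the template concrete, the sketch below shows one way, purely illustrative and not tied to any particular tool or method described in this book, that such stories could be captured in a structured form so they can be reviewed, signed off and traced into a backlog. The class, field and function names are assumptions for the example.

```python
# A minimal, illustrative sketch of capturing user stories in a structured form.
# The class and field names are assumptions for this example, not a standard.
from dataclasses import dataclass, field
from typing import List

@dataclass
class UserStory:
    role: str                      # the [job role]
    goal: str                      # the [goal] the role wants to achieve
    benefit: str                   # the [outcome/reason for the requirement]
    acceptance_criteria: List[str] = field(default_factory=list)

    def as_sentence(self) -> str:
        """Render the story in the conventional 'As a ... I want ... so that ...' form."""
        return f"As a {self.role}, I want to {self.goal} so that {self.benefit}."

story = UserStory(
    role="marketing analyst",
    goal="access a single, quality-assured customer dataset",
    benefit="I can build campaign reports without manual reconciliation",
    acceptance_criteria=["Data refreshed daily", "Source fields documented"],
)
print(story.as_sentence())
```

Even a lightweight structure like this makes it easier to check that every story names a role, a goal and a benefit before it enters the backlog.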

In contrast, a documented requirement has historically supported a more conventional, or waterfall, approach, in which the expectation is clearly defined up front and the project or programme then seeks to deliver to that requirement. These requirements tend to be extremely detailed, since they may be supporting a long-term activity whose outcome or output is some time off, and so they have to be robust enough to remain valid over that sort of timescale.

Documented requirements have a breadth of detail encompassing all aspects – technical, functional, operational – and so tend to be created by those closer to the technical detail at the delivery end of the activity, which can reduce the level of engagement with the end stakeholder that is more closely associated with the user story. The nature of these requirements also makes changing them much harder, typically involving a detailed change process, because of the impact across the breadth of activity over a longer time frame than a user story covers.

Both approaches have their place, and it is often quite clear which one to use, depending on the complexity, the time to complete and the feasibility of adopting the more dynamic approach associated with Agile.

There are some useful pointers in requirements gathering, regardless of the approach to capturing them:

  • Establish goals and objectives early – capture them in full and have them signed off so that everyone is aligned on why they are required.
  • Document detailed notes – capture all stakeholder discussions as it may take more than one meeting to formulate the requirement.
  • Share documentation – provide access to the requirement documentation as it takes shape to gather feedback. This will lead to a better end product and speed up sign-off.
  • Ensure you engage all relevant stakeholders – it may seem that you have identified the key stakeholder for the requirement, but there may be others, including those who simply exert influence or can assist in promoting your activity. Without the buy-in of the wider group it may prove harder to garner full support and engagement – they may even block or delay progress – so do take a moment to assess if there are others who need to be involved.
  • Capture and validate all assumptions – whilst an assumption may seem obvious to you, there is a risk that you may not be aware of a reason why it is flawed. State any assumptions and be prepared to work them through – generally, the less to be assumed the better.

Finally, some advice on how to engage stakeholders. It may be easy to gather the requirement; the stakeholder(s) may have a clear view on their need and benefits and be able to articulate it clearly. However, this is not always the case. A good approach to adopt is called active listening, which goes beyond what is said to comprehend the way it is said, the body language and the sentiment behind the words. Build trust and confidence, be as transparent as you can be, establish empathy, and comprehend the issue and its impact, and you are on a good footing to capture the requirement in a way likely to result in a positive outcome for your stakeholder and establish a good ongoing relationship.

9.6.1 Interviews

One-to-one discussions with customers, or key stakeholders, are a traditional but effective way of exploring the end goal – what outcome is needed, and why – of the activity you are about to undertake. These can be extended to groups, if appropriate, but do be mindful that the larger the group, the more it can be influenced by one or more individuals, and some participants may be less willing to speak up or may be intimidated by others in the group.

Think about group dynamics. If you know a little about the likely participants, do some research to assess how they are likely to interact when put together, and try to keep the group focused – too wide a spread of interests can make it difficult to capture the breadth of inputs you receive, or can skew where the focus lies, leading to unbalanced requirements. Have a clear structure to the interviews, with a line of questioning that provides a framework for all participants to follow, and try to make sure everyone has their say – it is often wise to follow up individually after a group interview to ensure that those who were less engaged or quieter have a second opportunity to participate and contribute.

Finally, prepare detailed outputs from the interviews to share with those who participated, both to confirm you have captured everything accurately and to give them the opportunity to add to the information now that they have had a chance to reflect since the interview itself. It is surprising how few people follow up interviews in any way, and yet many participants will continue to think about what they heard and evolve their thinking – remember, you may have caught them relatively cold on the day, and even information sent in advance may have sat unread in their inbox if they were awash with other priorities, so the quality time may be that reflective period after the interview. Do not forget to ask: even if the individual does not volunteer further reflections, there may be a nugget lurking in their mind which could be key to your success.

9.6.2 Focus groups

Not to be confused with wider group interviews, focus groups are a way of having (as the name suggests) focused discussions on a particular aspect or topic of the activity you are seeking to deliver, usually involving those recognised as the subject matter experts (SMEs) in their field within the organisation. These groups are moderated by someone who leads the discussion, so the SMEs are directed towards the purpose of the focus group to help ensure a positive outcome, whereas an interview is steered to a greater degree by those who participate.

Typically, focus groups are used to work through a more complex issue, or one where multiple stakeholders all have to come together to contribute to a successful outcome. They are therefore a good approach to defining a requirement where multiple teams must work together on a common goal, helping to avoid ambiguity or disagreement in the critical delivery phase.

9.6.3 Delphi technique

Similar in principle to focus groups but often used on a wider range of issues, the Delphi technique brings the knowledge of SMEs to bear on a particular issue or deliverable through a third-party facilitator. The SMEs provide answers to a set of questions posed by the facilitator, often remotely; once the facilitator has analysed the feedback, it is shared anonymously with the group as a whole, so the collective inputs can be seen by all SMEs. The answers may be similar or may conflict, but the group reviews the content provided by the facilitator and works towards a revised view based on the collective responses. This may be repeated several times until a consensus is reached, or at least until the thinking has been raised through each SME contributing and stimulating fresh ideas or challenge to the original views. Once the SMEs have reached an agreed point, or have converged on a broadly similar view, the facilitator structures the responses into a set of requirements.
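As a toy illustration only, and assuming the questions yield numeric answers (for example, effort estimates in days), the sketch below shows the facilitator's side of the loop: anonymise a round's responses, share the summary, and check whether the group has converged. In practice responses are often qualitative and the facilitator's synthesis is a matter of judgement, so this sketches the mechanics rather than the technique's real value; the names, convergence rule and tolerance are all assumptions.

```python
# A toy sketch of the facilitator's role in a Delphi round, assuming numeric answers.
# The convergence rule and tolerance value are illustrative assumptions.
from statistics import median

def summarise_round(estimates: dict) -> dict:
    """Strip respondent names and return the anonymised figures shared with the group."""
    values = sorted(estimates.values())
    return {"responses": values,
            "median": median(values),
            "spread": max(values) - min(values)}

def has_converged(summary: dict, tolerance: float = 2.0) -> bool:
    """Treat consensus as the spread of estimates falling within a tolerance."""
    return summary["spread"] <= tolerance

round_one = {"expert_a": 10, "expert_b": 25, "expert_c": 14}  # days of effort
summary = summarise_round(round_one)
print(summary)                 # shared anonymously with all SMEs
print(has_converged(summary))  # False: run another round using this feedback
```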

9.6.4 Brainstorming

Brainstorming is a commonly used and well-known technique for a number of activities, including defining requirements. The key to a successful brainstorming session is the principle that there is no such thing as a bad idea: the focus is on gathering as many ideas as possible for subsequent discussion. An effective session is one in which everyone can contribute without critique during idea generation, and sometimes the more way-out ideas are the ones taken forward, simply because, having been regarded as left field, they are found on exploration to open up opportunities well worth further investigation.

A well-run brainstorming session can deliver genuinely innovative thinking that more structured approaches would not generate. Do not discount brainstorming, therefore, if there is a need for a more creative and exploratory approach to capturing requirements; it can be highly effective at challenging the traditional approach and making others think differently.

9.6.5 Strawman approach (or prototyping)

In situations where the requirement is difficult to articulate, or involves a series of decisions that would be time-consuming to work through, developing a strawman (also called a prototype) can be a very effective way of focusing the minds of a group of SMEs, who critique and refine an initial proposal to shape a more developed approach. The benefit is that it gives the group a common point of reference, or focus, for something that would otherwise be difficult to define and therefore challenging to progress with a wider group.

The essence of the strawman is that it is a starting point: the final solution may look totally different, or may end up very similar to the strawman. The point is that it does not matter; its purpose is solely to provide common ground on which to focus effort to improve, enhance or develop a better solution, using a common starting point developed by a much smaller group in the first instance (perhaps one to three participants; the key is to have as few as reasonably necessary to develop the strawman). The requirements will be more robust for having had a common review point, and for the clarity gained on the changes made to improve on the strawman.

9.6.6 Nominal group technique

This is a more rapid way of identifying how to achieve an outcome, relying on greater spontaneity than many other techniques, and so is often used where time is short. Figure 9.2 shows how it is undertaken, but essentially it works on the basis of individuals being brought together for this purpose, with no previous input or engagement, to share ideas and select the best of them through a voting process.

Figure 9.2 Nominal group technique Diagram courtesy of Shiv Shenoy, PM Exam Smartnotes (pmexamsmartnotes.com). S. Shenoy, How to Collect Requirements – Part 1. PM Exam Smartnotes https://www.pmexamsmartnotes.com/collect-requirements-process-tools-and-techniques-part-1-of-2/.


9.6.7 Observation

This approach is effective where the customers or key stakeholders of the requirement are not skilled in crafting requirements, or lack the time to do so effectively. It involves someone with good observational skills working with the relevant area(s) to identify the problem, articulate the requirement and hence define success criteria in a structured way that can then be signed off by the customer. Usually an individual with good requirement-capture skills shadows key team members to observe the issue, builds a complete picture (which may be complex in totality) and presents it in a way that lets the customer relate to the process that has led to the requirements as defined.

9.6.8 Document review

If there is no opportunity to engage with knowledgeable stakeholders, an alternative is to capture information from documents to pull together a complete picture and form a requirement. This approach obviously has limitations, not least that it is only as good as the quality, relevance and currency of the documents, and so may misrepresent reality if any of these are lacking. It also requires a sufficient breadth of documents to form a complete enough picture, and ease of access to them.

In my experience, this is often a last resort due to the lack of stakeholder engagement and may be used as an input to some of the other techniques listed above rather than on its own.

9.7 BENEFITS DEFINITION AND TRACKING

Once the implementation plan defines the activities, the milestones and the resources required to achieve the outcome, it is essential to define what success looks like and to turn this into a measurable benefit. This might sound obvious, at least at a theoretical level, but in my experience the data strategies I have seen across a wide range of organisations worldwide are particularly weak in this area. There is also the rather academic debate over how to account for benefits – if the data strategy is an enabler of some other activity, especially one that is a key component of the corporate strategy, who gets to claim the benefit?

The difficulty with this debate is that it is largely futile. Without the enabler, the positive outcome realised elsewhere would likely fail to materialise, or at best be suboptimal. The enabler is therefore a clear contributor to the outcome, if not a direct outcome in itself.

The main impact of how your organisation tracks enablers is the visibility they receive and hence the importance attached to their resourcing and delivery. Many organisations fail to track adequately the enablers and dependencies of a strategy implementation, which leads to periods of paralysis as these bumps in the road become blockers or insurmountable obstacles. Tackling them only when they rear up in front of the organisation is costly: the time it takes, the resources needed and the delays caused far outweigh the cost of running a structured programme that keeps the road ahead smooth and eases the implementation journey.

I find the best way to bring this to life, which is particularly apt given that strategy has its roots in warfare, is to outline the implications of marching forces across open ground unaware that there is a significant river in the distance with no bridge to enable the troops to cross. Reacting to the river only when it is in sight is probably just what the enemy were hoping for, and your forces are sitting ducks under heavy artillery fire whilst the message is relayed to find engineers who can construct some sort of crossing. Having a map, knowledge and surveillance of what lies ahead, and an advance force of engineers able to construct the bridges before your arrival and defend them, makes for a faster and more effective assault, but it requires priority to be given to supporting a front line of those who capture information and prepare the ground for what lies ahead.

The same applies to your own data strategy implementation. Don’t be caught cold, unaware of something which may derail your implementation; continue to do your homework and be several steps ahead of the main force delivering in the here and now. This means operating on multiple fronts, but it is worth it to avoid being stranded and exposed. In data strategy implementation terms, this may make the difference in retaining credibility and, with it, support from your stakeholders and maintaining momentum. As the phrase goes: ‘By failing to prepare, you are preparing to fail.’7

The key to successful benefits definition is to identify the owner of each benefit and to ensure that that individual signs off the benefit as a first step to taking accountability for its delivery. The PMO team will continue to track these benefits, as they are integral to the successful implementation of the data strategy, even though ownership of them may become fragmented. However, unless there is some accountability for the totality of the benefits to be realised – which may reside with the sponsor of the data strategy implementation, of course – there is a risk that the benefits will not manifest in the way intended and will fail to deliver the expected value. After all, the data strategy was devised as a whole and will have been designed to deliver optimal benefits in their entirety, and this would have been a key factor in the data strategy being signed off in the first place.
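To illustrate the point, the sketch below shows one possible shape, purely hypothetical, for a benefits register entry a PMO might keep: each benefit has a named owner, a sign-off flag and a target against which realised value can be reported. The field names and figures are assumptions for the example, not a prescribed register format.

```python
# An illustrative sketch of a benefits register entry; field names are assumptions.
from dataclasses import dataclass

@dataclass
class Benefit:
    name: str
    owner: str                  # the individual accountable for realising the benefit
    signed_off: bool = False    # owner has accepted accountability for delivery
    target_value: float = 0.0   # e.g. hours freed per month, or currency saved
    realised_value: float = 0.0

    def percent_realised(self) -> float:
        """How much of the target has been delivered so far."""
        if self.target_value == 0:
            return 0.0
        return 100 * self.realised_value / self.target_value

register = [
    Benefit("Reduced report preparation effort", owner="Head of Finance",
            signed_off=True, target_value=1200, realised_value=450),
]
for benefit in register:
    print(f"{benefit.name}: {benefit.percent_realised():.0f}% of target "
          f"(owner: {benefit.owner}, signed off: {benefit.signed_off})")
```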

Finally, I would like to highlight the importance of transparency in the reporting of benefits, to assure those engaged in some capacity that they are contributing to the bigger prize of delivering change within their organisation. It is also a key factor in retaining the support of your most influential stakeholders, and it will enable you to control the narrative on the progress of the implementation programme. Whilst it may not always be moving along as you might like, and there are bound to be twists and turns to deal with along the way, being open and honest about these will build trust that you are on top of implementation and managing the many strands which deliver the data strategy. Failure to do so will only lead to speculation, rumour and misinterpretation, all of which will suck energy from the implementation effort itself and may delay important approvals and resources being realised in a timely fashion.

Jeff Austin, the former Vice-President, Strategy Planning, at DuPont Pioneer, said: ‘Are we doing what we said we would be doing?’8 This is a useful prompt as you progress through the implementation journey. Keep in mind the intent, direction and impact the data strategy aimed to achieve and remain focused on the objectives, or goals, it set out to deliver. The data strategy was signed off as the direction to which the organisation wanted to commit itself, and should be aligned to the corporate strategy. Austin's question is a simple challenge to put to the programme team and those around you focused on delivering it.

9.7.1 Non-financial benefits definition and tracking

A further area that can confuse benefits definition from the outset is the question of activities which result in non-financial outcomes but contribute to the overall goal. It is not that those involved in implementing the data strategy fail to recognise the value of these activities; it is more a question of how to account for their contribution to the overall delivery.

The task is to convert such non-financial benefits into something that can be tracked and whose impact can be assessed: a measure that is easy to articulate and backed by clear evidence that the cause and effect have been quantified in some way.

Take employee engagement, for example: those who have to delve into the furthest reaches of the organisation looking for data, are hampered by its quality and timeliness when they find it, and then struggle to repeat the exercise will almost certainly find this a demotivating part of their role within the organisation. Improving that experience, making the data accessible, reliable, consistent and timely, would allow that individual to focus their effort on the task they are seeking to achieve, rather than spending the majority of their time in preparation and only a minority adding the value the role is paid to deliver. This, in turn, may well lead to an increase in employee engagement, so the cause and effect can be related, and a target improvement in employee engagement set and measured via KPIs.

There is also evidence that an increase in employee engagement has a direct impact on customer satisfaction,9 so improving that individual’s performance is likely to feed through to the customer experience. As Richard Branson put it: ‘empowering and taking care of your staff is the best way to look after your customers and keep them coming back for more’.10

There is a handy seven-step process (Figure 9.3) for defining and measuring both financial and non-financial benefits which is worth using as a prompt. It gives a consistent approach to tracking business benefits and can be adapted to your own organisation. For instance, if you are in a not-for-profit environment, the value may be the opportunity cost of the action being taken, which in turn focuses your decisions on where business benefit is greatest both for your own organisation and for the wider stakeholders you are there to serve.

Figure 9.3 Seven steps to legitimise, measure, and value financial and non-financial business benefits Copyright © 2020 by Marty Schmidt. Used with permission. https://www.business-case-analysis.com/business-benefit.html.


9.7.2 Benefits dependency network

In the discipline of programme management, a process called the benefits dependency network (BDN) is used to capture five critical pieces of information, all of which need to be tracked to assess the benefit realised. The five categories are:

  1. Objectives – the desired end state.
  2. Benefits – the benefit to the organisation of the desired end state.
  3. Outcomes – the specific aspects of the end state.
  4. Projects – those activities to be undertaken to deliver outcomes.
  5. Enablers – facilitators in the delivery of the projects and programmes leading to the outcomes. These can be direct activities or indirect ones, contributing an essential conduit or efficiency step that makes delivery faster or more effective (for example, removing barriers or creating a new environment or process which reduces time and effort).

The value of adopting this type of approach is that it can be used for two directly related purposes. Read the BDN in the sequence outlined above and it provides the rationale for structuring the data strategy into projects and programmes to deliver outcomes, whilst recognising the key enablers needed to achieve these. Working backwards, it gives the purpose and structure to assure delivery through demonstrating the rigour of the approach in achieving effective outcomes, which in turn delivers the benefits and achieves the overall objective.

This is therefore a useful mechanism for challenging whether, in the context of a dynamic business environment, the approach from end to end is still relevant, and for identifying the impact of change on the whole rather than just the immediate task at hand. It also provides visibility of the critical path through a raft of activities, keeping the focus on managing risks and dependencies.
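By way of illustration only, the sketch below holds the five categories as a small dependency structure so that the chain can be read from objective down to enablers; working back up the same links gives the assurance view described above. All entries and names are invented for the example and this is not a formal BDN tool.

```python
# An illustrative sketch of a benefits dependency network held as a small graph.
# All entries are invented examples.
bdn = {
    "objective": "Decisions across the organisation are driven by trusted data",
    "benefits": {  # benefit -> outcomes that deliver it
        "Faster, better-evidenced decisions": ["Single customer view in use"],
    },
    "outcomes": {  # outcome -> projects that deliver it
        "Single customer view in use": ["Build customer data platform"],
    },
    "projects": {  # project -> enablers it depends on
        "Build customer data platform": ["Data quality rules agreed",
                                         "Cloud environment provisioned"],
    },
}

def read_forwards(network: dict) -> None:
    """Objective -> benefits -> outcomes -> projects -> enablers: the delivery rationale."""
    print("Objective:", network["objective"])
    for benefit, outcomes in network["benefits"].items():
        print("  Benefit:", benefit)
        for outcome in outcomes:
            print("    Outcome:", outcome)
            for project in network["outcomes"][outcome]:
                enablers = network["projects"][project]
                print("      Project:", project, "| enablers:", ", ".join(enablers))

read_forwards(bdn)
```

Reversing the traversal, from enablers back up to the objective, gives the same structure as an assurance check when circumstances change.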

9.8 TEN TO TAKE AWAY

This chapter has sought to set you on the path of strategy implementation. Key points to take away are:

  1. Establish the roles and responsibilities at the start of your data strategy implementation. Utilise the RASCI model, as support will be vital to your programme in many ways (communication and learning, for example), and focus on the role as well as the person – people move on; you need the responsibility to remain with the post – whilst using it to guide delivery rather than create rigid boundaries.
  2. Change is inevitable, so do not stick rigidly to the plan. Constant review is necessary, but if you find a need to diverge so far that it no longer relates to the data strategy as defined, you may need to redefine the data strategy. Do not lose sight of the link between the data strategy and its implementation.
  3. Utilise communication as a key activity of the implementation programme, and engage stakeholders at all levels of the organisation. Do not depend on a ‘one size fits all’ approach: there will be different challenges across the organisation, and your success depends on messages being delivered locally that bring the data strategy into people’s thinking. Employee engagement is a key element in determining your likely success.
  4. Adopt an Agile approach to the data strategy implementation, especially communications. Break down your plan into a series of month-sized deliverables that you can wrap a series of communications around.
  5. Build your programme team to harness a range of expertise. Incorporate skills in project coordination, data and analytics, and local knowledge from various parts of the organisation, and blend these into a cohesive capability that brings out the best of the team. Your ability to resource the team will depend on organisation size and commitment to the data strategy implementation, but seek to ensure there is a strong programme management office to oversee your own programme and the dependencies you are tracking elsewhere.
  6. Undertake a capability assessment of the programme team at the outset to identify gaps and weaknesses and seek to address these through a variety of learning approaches.
  7. Assess prioritisation within your implementation plan. The data strategy should align to the corporate strategy, which should assist in the process, but there are other constraints and challenges along the way, not least the availability of scarce resources. This may entail some negotiation or rephasing of the implementation plan.
  8. Determine the appropriate governance to put in place to oversee the data strategy implementation. It is a key forum to raise visibility of the implementation programme, to determine prioritisation calls, and to seek to gain resources from key stakeholders who are therefore able to understand the need and timing of the request.
  9. Utilise requirements gathering as part of the implementation programme. These may be captured via use cases or more formal documented requirements, but it is essential to get these signed off to ensure that deliverables within the data strategy implementation are aligned to what the organisation needs to be able to progress.
  10. Ensure the implementation has clear links between its deliverables and benefits tracking to demonstrate the value the data strategy has delivered. Ensure that the benefits and requirements remain focused on the objectives, or goals, the data strategy defined.

 

1 J. Gumz, Risk on Complex Projects: A Case Study. Newtown Square, PA: Project Management Institute, 2012.

2 M.E. Porter, Competitive Strategy: Techniques for Analyzing Industries and Competitors. New York: Free Press, 1980.

3 M. Imai, Kaizen: The Key to Japan’s Competitive Success. New York: McGraw-Hill, 1986.

4 N.M. De Bussy and L. Suprawan, Most Valuable Stakeholders: The Impact of Employee Orientation on Corporate Financial Performance. Public Relations Review, 38, 280–287, 2012. https://espace.curtin.edu.au/handle/20.500.11937/62193.

5 Harvard Business Review and Strativity Group identified that 62 per cent of failures – the highest rate – were due to poor communications. Referenced in S. Percy, Why Do Change Programmes Fail? Forbes, 13 March 2019. https://www.forbes.com/sites/sallypercy/2019/03/13/why-do-change-programs-fail/#112e91872e48.

6 C. Stieglitz, Beginning at the End: Requirements Gathering Lessons from a Flowchart Junkie. Paper presented at PMI® Global Congress 2012, North America, Vancouver, British Columbia, Canada. Newtown Square, PA: Project Management Institute, 2012.

7 A saying often attributed to Benjamin Franklin, though there is no written evidence of his having uttered it.

8 Economist Intelligence Unit, Why Good Strategies Fail: Lessons for the C-Suite. 2013. https://eiuperspectives.economist.com/strategy-leadership/why-good-strategies-fail.

9 See Institute of Customer Service, The Customer Knows. 2017. https://www.instituteofcustomerservice.com/product/the-customer-knows-how-employee-engagement-leads-to-greater-customer-satisfaction-and-loyalty/.

10 R. Branson, Like a Virgin: Secrets They Won’t Teach You at Business School. London: Virgin, 2013.
