Chapter 12


Launching the ship – successful deployment in the organization

‘This is something I got wrong. I thought it was all about technology. I thought if we hired a couple thousand technology people, if we upgraded our software, things like that, that was it. I was wrong’ —Jeff Immelt, (former) CEO of General Electric.70

Successful data initiatives can bring tremendous business and scientific value, but many die on the launch pad because of inadequate preparation, internal resistance or poor programme management. How can you raise the odds that your data initiative will succeed? What principles can help reduce the cost and effort required?

We start by presenting an unfortunate case study of an ambitious big data project, which was launched with significant media hype, but which ended in painful failure.

Case study – The 62-million-dollar failure

You probably first heard of IBM’s artificial intelligence program Watson in 2011, when it beat two living, breathing humans to win the popular American game show Jeopardy! Two years later, IBM proposed a nobler use for its champion AI program, teaming up with the M.D. Anderson Cancer Center at the University of Texas in a highly publicized project that would employ Watson in the fight against cancer. The goal was to expedite clinical decision making by having Watson match patients to archives of documented clinical trials. The world watched in expectation of the coming revolution in cancer treatment.

By the end of 2016, however, the project had proven to be a failed investment: $62 million plus significant expenditure of internal resources at the Anderson Cancer Center: staff time, technology infrastructure and administrative support. It was a sobering lesson in the reality that massively promising projects can become massive failures.

Indications are that this was not a failure of big data or of Watson technology but rather of poor project execution. A University of Texas audit cited numerous problems related to service providers, and it seems Watson was never even successfully integrated with the medical centre’s new electronic medical records system.

In hindsight, experts realized there was apparently not enough data for Watson in this application, even if it had been successfully integrated with Anderson systems. Because so many treatment options have not yet been explored in the medical literature, and because there were relatively few high-quality clinical trials on record, Watson did not have sufficient research literature to draw on. The original project motivation was that Watson could process every article ever written to produce the best recommendation for treatment, but the reality was that oncologists often need to choose between drugs that have never been directly compared in randomized trials.

Mary Chris Jaklevic, a healthcare journalist who reported on the Watson–Anderson failure in 2016, highlighted the extreme mismatch between media hype about the project’s potential and the complete failure of the project to produce results. She ended with a point we should take to heart in this rapidly developing world of big data and AI: ‘… make a habit of pointing out gaps between what’s claimed and what’s been demonstrated to work.’89

Why our projects fail

Although most organizations won’t make headlines with such expensive failures, relatively few are successful in making a breakthrough in their analytics programs. In a recent survey of leaders innovating in big data analytics, three quarters reported revenue or cost improvements of less than 1 per cent.90 In another study, only 27 per cent reported success in their big data initiatives.32 The Harvard Business Review describes an MIT researcher recently addressing a group of 150 machine learning enthusiasts. He started with the question, ‘How many of you have built a machine learning model?’ to which roughly one third raised their hands. He then asked how many had also deployed and/or used that model to generate value and then evaluated the results. None kept their hands up. None.91

In my experience, and in speaking with colleagues about their experiences, many organizations have taken steps, sometimes significant steps, to find new value from data and analytics, only to reap little practical benefit. Sometimes this is because the problems are very difficult, but it more often reflects problems with staffing, project management or organizational dynamics. Still, we’ve seen other organizations launch analytics projects and reap substantial returns.

So how can you maximize the likelihood that your big data and data science initiatives will succeed?

Here are some principles to follow.

Become data-driven

Keep asking questions about your business

Ask basic questions, such as ‘What customers or products account for the top 20 per cent of our revenue?’ Ask more nuanced questions, such as ‘What motivates our customers to purchase?’ and ‘What sequences of cross-channel actions are the strongest signals that I might soon lose a valued customer?’ You can come up with hundreds of questions like these. Focus on answering the ones most critical for your business.
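The first of those questions can often be answered with a few lines of code against your sales data. Here is a minimal sketch in Python; the customer names and revenue figures are purely illustrative, not taken from any real dataset.

```python
# Hypothetical revenue per customer -- illustrative numbers only.
revenue = {"Acme": 50_000, "Birch": 30_000, "Cato": 12_000,
           "Dune": 5_000, "Elk": 3_000}

def top_revenue_customers(revenue, share=0.20):
    """Return the smallest set of customers that together account
    for at least `share` of total revenue, largest first."""
    total = sum(revenue.values())
    running, top = 0.0, []
    for name, amount in sorted(revenue.items(),
                               key=lambda kv: kv[1], reverse=True):
        top.append(name)
        running += amount
        if running >= share * total:
            break
    return top

print(top_revenue_customers(revenue))  # the customers behind the top 20%
```

In practice the dictionary would be replaced by a query against your sales database, but the shape of the answer is the same: a short, ranked list you can act on.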

Challenge your basic assumptions

Especially do this if you are very familiar with your business. When colleagues propose answers to your (sometimes obvious) questions, ask for data to back up those answers. In their book Yes, And,92 Kelly Leonard and Tom Yorton describe how a bit of data shattered some basic assumptions they had held about their 50-year-old Chicago theatre. When an outsider asked why they thought their guests were coming to the theatre, they immediately gave the obvious answer: the guests wanted to see the show playing that night. The questioner then surveyed the guests, who gave very different reasons. The guests were actually using the theatre as a novelty event: to celebrate a birthday or the achievement of a business milestone, to entertain out-of-town guests, or because they had received the tickets as gifts or purchased them at charity events. Not a single patron gave the expected answer. Not a single one was there simply because they wanted to see the show playing that night. Seasoned management had been completely certain and yet completely wrong in their assumptions.

Create and monitor KPIs

If you’re not keeping score, you’re just practising. Don’t simply monitor the obvious KPIs, such as revenue. Track your micro- and macro-conversion rates and your churn rate. Track your lead indicators, including those from customer activity. Track stickiness, including frequency metrics. Display the KPIs your teams can influence in places where they can see them. Set goals. Celebrate the goals. This part isn’t rocket science.
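As a concrete illustration, the two most basic rate KPIs mentioned above reduce to simple ratios. The sketch below uses hypothetical counts; the function names are my own, not a standard API.

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    """Macro-conversion rate: completed goals divided by total visitors."""
    return conversions / visitors

def monthly_churn_rate(customers_at_start: int, customers_lost: int) -> float:
    """Share of the customers you had at the start of the month who left during it."""
    return customers_lost / customers_at_start

# Illustrative figures: 30 purchases from 1,000 visitors,
# 10 of 400 customers lost this month.
print(conversion_rate(30, 1000))        # 0.03, i.e. 3 per cent
print(monthly_churn_rate(400, 10))      # 0.025, i.e. 2.5 per cent
```

The arithmetic is trivial; the discipline lies in computing these numbers consistently, displaying them where teams can see them, and tracking them against goals.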

Get new ideas

Technology applications quickly spread within industry sectors as employees change jobs or attend industry events, but to stay ahead you’ll want to look at what’s happening in other sectors. If you’re in banking, look at what e-commerce companies are doing. If you’re in e-commerce, look at what logistics companies are doing. Go to industry conferences and talk to vendors and analysts about use cases they’ve seen.

Organize your data

If you follow the advice above, you should very quickly become frustrated with the current state of your data systems. Hire and engage people who can shepherd and draw insights from your data. Train staff across your organization to use your BI tools, particularly self-service tools which allow them to explore data on their own. Selectively move data from silos to a central data warehouse.

Get the right people on board

Hire people who understand how to apply data science to business. Hire data scientists and hire data-driven people across the organization, ideally from the top down. The higher the level of buy-in within the organization, the better the chance that analytics initiatives will be funded and supported and that the entire organization will catch the vision. Top level buy-in is still relatively rare, as demonstrated by a recent industry survey. When CEOs were asked whether they were leading their companies’ analytics agendas, 38 per cent said yes. However, when the other C-suite executives were asked, only 9 per cent said the CEO was indeed leading that agenda.90

Be aware that analytics efforts often illuminate internal shortcomings, some of which directly implicate powerful colleagues. Expect internal resistance, sometimes as ambiguous criticism or stalled cooperation, often appearing after the release of incriminating analysis.

A data-driven approach affects hiring and training throughout your organization, not only in data and analytics teams. Consider the case of General Electric (GE). GE started a major digital initiative around the beginning of 2010, acquiring companies and creating thousands of roles related to data science. In a recent interview,70 GE’s then-CEO Jeff Immelt recounted some key lessons from this process. He described how, even beyond staffing the data science roles, GE found it needed to hire thousands of new product managers and different types of commercial people. The impact of the transformation extended to onsite support and even sales staff.

Keep in mind

Transforming into a data-driven organization requires changes throughout your organization. It’s not enough to simply create a data and analytics team.

I have seen companies start data science initiatives by bringing in a few newly minted ‘data scientists’ and setting them loose to find their own way within the organization, hoping to somehow reap tangible benefits. We wouldn’t do this with an IT initiative, and we shouldn’t do it with an analytics initiative. Projects should be done in project teams consisting of well-vetted staff with complementary skills who are connected in meaningful ways with use cases and stakeholders. The stakeholders, in turn, should continually feed business intuition back into the development process. This should all go without saying in a mature organization, and yet we often don’t see it happening.

I would suggest you stop relying on the same vendors to meet your staffing and project needs. Talk with newer, smaller companies you haven’t yet used. Your new initiatives should start small, so let a small company with a few competent, creative professionals help you start them. Don’t expect that large service providers can provide top-notch staff for every engagement, and don’t expect that updated job titles reflect updated capabilities. One of the problems highlighted in the audit of the Watson–Anderson shipwreck was a weak vendor-selection process.93

As an analytics programme matures, it will likely grow into a hybrid of centralized teams and decentralized analysts sitting within business units. The centralized teams will include a BI team and one or more teams of analytics specialists. Some of the decentralized analysts located within business units will have originally joined those units in non-analytic roles, over time assuming analytic responsibilities. As you transform your organization in its use of data, keep these people in their analytic roles if they can efficiently retrieve data, ask relevant business questions, perform basic analysis, and communicate results clearly. If not, cut your losses and replace them in this function with more analytically adept staff.

Find senior analytics leadership who can form a vision, a roadmap, and a team. Although an organization may organically grow in its ability to be data-driven by hiring or re-purposing decentralized analysts, it will generally be limited to spreadsheet-level analytics until it commits to recruiting a senior analytics leader and building out a strong analytics team. Empower that team not only with the resources and flexibility they need to collect data and build models but also with access to stakeholders and key decision makers.

Without such a team, typically centralized, you will be very limited in your ability to recruit top analytics talent, and the talent you do secure will repeatedly be pulled into ‘urgent’ business problems and have little time for long-term strategic initiatives. In addition, effectively deploying analytic projects such as recommendation engines, natural language processing, advanced customer segmentations and deep learning models will typically require the synergy of a centralized team of experts.

Break down silos

Data silos severely limit your ability to draw maximum value from your data, but you’ll need extensive stakeholder management and significant technical resources to consolidate the siloed data spread across your functional units and legal entities (particularly following acquisitions). Business units tend to be protective, if not of their data then at least of their IT resources. How to best navigate this gauntlet depends on how your organization functions, but executive-level support goes a long way.

Focus on business value

It is very important to keep your data scientists focused on providing business value. There are non-technical people in your company who have developed a deep understanding of the customer, the product and the market. Your data scientists should speak with them at the very start of an analytics project. They should go back to them on a regular basis to show data and intermediate results. The business colleagues will quickly identify flawed assumptions or inappropriate interpretations of data. In some cases, they can even provide valuable assistance in constructing your analytic models.

Measure results

We talked earlier about promoting the use of KPIs within the organization, and this applies to data science efforts. Don’t start a data science project unless you know why you’re doing it and what it looks like when it succeeds. Are you looking to increase conversion rates? Marketing ROI? Market share? Customer lifetime value? Measure your starting point, set a target, and estimate resulting revenue gains. By the end of the year, you may have an ROI for the analytics programme itself.
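The ‘measure, target, estimate’ step above can be reduced to back-of-the-envelope arithmetic. The sketch below is a minimal illustration with hypothetical figures; the function name and inputs are my own framing, not a standard formula from the text.

```python
def projected_roi(baseline_revenue: float, uplift_rate: float,
                  programme_cost: float) -> float:
    """Estimated return on an analytics programme:
    (revenue gain - programme cost) / programme cost."""
    gain = baseline_revenue * uplift_rate
    return (gain - programme_cost) / programme_cost

# Illustrative scenario: $1m baseline revenue, a targeted 5 per cent
# conversion-driven uplift, and a $20k programme cost.
print(projected_roi(1_000_000, 0.05, 20_000))  # 1.5, i.e. 150 per cent ROI
```

The point is not the arithmetic but the discipline: record the baseline before the project starts, or you will have no defensible ROI figure at year end.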

Stay agile

Remember to stay agile, starting with minimum viable products (MVPs) and working in short delivery cycles. It goes against our academic training, but we need to work iteratively through incomplete solutions rather than holding out for an immediate 100 per cent solution. Start your analysis on just a sample of the data. If you start by collecting and cleaning all possible data, you’re no longer working with an MVP, and you’ll waste weeks or months before getting far enough to see any pitfalls in your approach. Start with simple models, such as decision trees, statistical regression and naïve Bayes. Refine your models once you’ve found applications with demonstrable business value.
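To make the ‘start simple’ point concrete, here is a sketch in plain Python of the simplest possible tree model: a one-split decision stump fitted to a tiny made-up sample. A real project would use a library implementation of a full decision tree, but the MVP spirit is the same — a crude model on a small sample tells you quickly whether the signal is there.

```python
def fit_stump(xs, ys):
    """Pick the threshold on a single feature that maximizes
    training accuracy for the rule: predict 1 if x >= threshold."""
    best_t, best_acc = None, -1.0
    for t in sorted(set(xs)):
        acc = sum((1 if x >= t else 0) == y
                  for x, y in zip(xs, ys)) / len(ys)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# Toy sample (invented): sessions per month vs. whether the customer renewed.
xs = [1, 2, 3, 8, 9, 10]
ys = [0, 0, 0, 1, 1, 1]
threshold = fit_stump(xs, ys)
predict = lambda x: 1 if x >= threshold else 0
```

If a ten-line stump already separates renewers from churners, you have found demonstrable value and can justify refining the model; if it finds nothing, you have lost an afternoon rather than a quarter.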

As far as possible, get specialists working on specialized problems. Find people to extract and clean data who are skilled in this, rather than asking your statisticians and AI experts to do it.

Don’t let your data scientists reinvent the wheel; instead leverage as much existing tooling and software as possible. Don’t spend several months re-building an AI tool that is already available on a pay-per-use basis from Amazon, Google or Salesforce unless you need a custom feature or have hit a usage threshold making it more cost-effective to develop in-house. Your in-house efforts should be spent fitting existing tooling to your business.
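The usage threshold mentioned above can be estimated with a simple break-even calculation. The sketch below uses entirely hypothetical costs; the function and its inputs are my own illustration, not figures from any vendor.

```python
def break_even_calls_per_month(build_cost: float, monthly_run_cost: float,
                               per_call_price: float, months: int = 12) -> float:
    """Monthly API-call volume above which building in-house costs less
    than a pay-per-use service, over a given horizon."""
    total_in_house = build_cost + monthly_run_cost * months
    return total_in_house / (per_call_price * months)

# Illustrative: $120k to build, $1k/month to run, vs. $0.25 per call,
# compared over one year.
print(break_even_calls_per_month(120_000, 1_000, 0.25))  # 44000.0 calls/month
```

Below that volume, the pay-per-use service from a cloud provider wins on cost alone, before counting the opportunity cost of your data scientists’ time.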

In conclusion

The quantity and types of data available to you today present a tremendous opportunity. Understanding how to use this resource can improve your strategy, tactics and operations in more ways than you might expect, providing valuable insights, raising KPIs, reducing costs, and ultimately enabling better customer experiences. The key technologies are in place and others have already blazed trails for you to follow, often across boundaries that traditionally shielded industries from competitors. I wish you success on your journey!

Takeaways

  • Many analytics projects fail or produce little value due to poor programme management or insufficient project scoping.
  • It is critical to keep short feedback loops with business stakeholders and to work towards clear KPIs.
  • Siloed data and internal resistance may hinder analytics projects.
  • Analytics initiatives often fail without senior analytics leadership.
  • Leverage existing technology, but don’t expect off-the-shelf technology to deliver a complete solution.

Ask yourself

  • Who in your company is allowed to make decisions based on ‘gut feel’ alone? Does anyone challenge this person’s decisions?
  • Which of your initiatives have clear KPIs and measurable targets? Remember: if you’re not keeping score, you’re just practising.
  • If your organization keeps duplicate copies of information within different data systems, how do you guarantee that the data is consistent? Data that is copied from a source system can quickly become stale or corrupt and cause havoc with your reporting.
  • Who in your organization decides which data to leave in siloed data centres and which data to collect in a central data repository? Do you have a chief data officer, or an expert in the field of master data management?
  • How are you monitoring developments in the fields of data and analytics, including new technologies that you could leverage or new methods that may already be giving a competitive edge to other companies? Perhaps you’ll want to attend a leading analytics conference, such as one of the Strata Data or Gartner Data and Analytics conferences.