4

COMPETING ON ANALYTICS
WITH
INTERNAL PROCESSES

Financial, Manufacturing, R&D,
and Human Resource Applications

ANALYTICS CAN BE APPLIED TO many business processes to gain a competitive edge. We’ve divided the world of analytical support for business processes into two categories: internal and external. The next chapter will address external applications—customer- and supplier-driven processes—and this one will focus on internal applications (refer to figure 4-1). It’s not always a perfectly clean distinction; in this chapter, for example, the internal applications sometimes involve external data and entities, even though they are not primarily about supply and demand, customers, or supply chains. But the focus here is on such internally facing functions as general management, finance and accounting, R&D, manufacturing, and human resource management. These are, in fact, the earliest applications of “decision support.” The original intent was that this data and these systems would be used to manage the business internally. Only more recently have they been applied to working with customers and suppliers, as companies have accumulated better data about the outside world with which they interact.

The challenge, then, is not simply to identify internal applications of business analytics but to find some that are clearly strategic and involve competitive advantage. Any company, for example, can have a financial or operational scorecard, but how does it contribute to a distinctive, strategic capability? Customer-related applications are more likely to be strategic almost by definition; internal applications have to work at it to make a strategic impact. They have to lead to measurable improvements to financial or operational performance. In the box “Typical Analytical Applications for Internal Processes,” we list some common approaches.

FIGURE 4-1

Application domains for analytics

Typical Analytical Applications for Internal Processes

Activity-based costing (ABC). The first step in activity-based management is to allocate costs accurately to aspects of the business such as customers, processes, or distribution channels; models incorporating activities, materials, resources, and product-offering components then allow optimization based on cost and prediction of capacity needs.

Bayesian inference (e.g., to predict revenues). A numerical estimate of the degree of belief in a hypothesis before and after evidence has been observed.
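To make the definition concrete, the before-and-after update can be sketched with Bayes’ rule. The hypothesis, probabilities, and revenue framing below are invented for illustration, not drawn from any company:

```python
# Illustrative Bayes' rule update: degree of belief in a hypothesis
# before and after observing evidence. All probabilities are invented.

def bayes_update(prior, likelihood, likelihood_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    evidence = likelihood * prior + likelihood_if_false * (1 - prior)
    return likelihood * prior / evidence

# Hypothesis: the quarter's revenue target will be met.
prior = 0.60                  # belief before any monthly data arrives
p_strong_month_if_met = 0.80  # P(strong first month | target met)
p_strong_month_if_missed = 0.30

posterior = bayes_update(prior, p_strong_month_if_met, p_strong_month_if_missed)
print(round(posterior, 3))
```

Observing a strong first month raises the degree of belief from 0.60 toward certainty; weak evidence would lower it by the same mechanics.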

Biosimulation (e.g., in pharmaceutical “in silico” research). Manipulation of biological parameters using mathematics and/or rule bases to model how cells or other living entities react to chemical or other interventions.
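As a hedged sketch of the idea (not any pharmaceutical company’s actual model), a biological parameter can be manipulated in a simple differential model; here an invented drug-concentration parameter alters simulated cell growth:

```python
# Toy biosimulation: Euler-step a differential model of a cell population
# growing logistically while a drug kills cells at a rate proportional to
# its concentration. All parameters are invented for illustration.

def simulate(drug_conc, hours=48, dt=0.1):
    cells = 1_000.0
    growth, capacity, kill = 0.08, 50_000.0, 0.05
    for _ in range(int(hours / dt)):
        dcdt = growth * cells * (1 - cells / capacity) - kill * drug_conc * cells
        cells += dcdt * dt          # simple Euler integration step
    return cells

untreated = simulate(drug_conc=0.0)
treated = simulate(drug_conc=2.0)
print(round(untreated), round(treated))
```

Varying the concentration parameter and rerunning the model is the “in silico” analogue of a wet-lab dose-response experiment.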

Combinatorial optimization (e.g., for optimizing a product portfolio). The efficient allocation of limited resources to yield the best solution to particular objectives when the values of some or all of the variables (e.g., a given number of people) must be integers (because people can’t be split into fractions) and there are many possible combinations. Also called integer programming.
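A tiny instance of the idea can be solved by exhaustive search when the number of combinations is small. The products, costs, values, and budget below are invented; real problems at scale use integer-programming solvers rather than brute force:

```python
# A toy integer-programming problem solved by exhaustive search: choose
# which candidate products to fund, subject to a budget, maximizing total
# expected value. Data are invented for illustration.
from itertools import combinations

products = {            # name: (cost, expected value)
    "A": (4, 7),
    "B": (3, 5),
    "C": (5, 8),
    "D": (2, 2),
}
budget = 9

best_value, best_set = 0, ()
for r in range(len(products) + 1):
    for subset in combinations(products, r):
        cost = sum(products[p][0] for p in subset)
        value = sum(products[p][1] for p in subset)
        if cost <= budget and value > best_value:
            best_value, best_set = value, subset

print(sorted(best_set), best_value)
```

The solution must fund whole products (the integer constraint); fractional answers such as “fund 60 percent of C” are not allowed.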

Constraint analysis (e.g., for product configuration). The use of one or more constraint satisfaction algorithms to specify the set of feasible solutions. Constraints are expressed as rules or procedures that narrow candidate configurations and designs to those satisfying every constraint.
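The feasible-set idea can be sketched by enumerating candidate configurations and keeping those that satisfy every rule. The components and compatibility rules below are invented:

```python
# A tiny constraint-satisfaction sketch for product configuration:
# enumerate candidate configurations and keep those satisfying all
# compatibility rules. Components and rules are invented.
from itertools import product

cpus = ["basic", "fast"]
psus = ["150W", "300W"]
gpus = ["none", "discrete"]

def feasible(cpu, psu, gpu):
    if gpu == "discrete" and psu != "300W":
        return False          # a discrete GPU needs the bigger supply
    if cpu == "fast" and psu == "150W":
        return False          # the fast CPU also needs more power
    return True

configs = [c for c in product(cpus, psus, gpus) if feasible(*c)]
print(configs)
```

Production configurators use the same principle with thousands of parts and rules, which is why specialized constraint solvers replace simple enumeration.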

Experimental design (e.g., for Web site analysis). In the simplest type of experiment, participants are randomly assigned to two groups that are equivalent to each other. One group (the program or treatment group) gets the program and the other group (the comparison or control group) does not. If the program results in statistically significant differences in the outcome variable, it is assumed to have the hypothesized effect.
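The two-group comparison can be sketched with a two-sample t statistic; a large absolute value suggests the difference between groups is unlikely to be chance. The outcome data below are invented:

```python
# A minimal sketch of the two-group comparison described above: compare
# an outcome variable between treatment and control groups with a Welch
# two-sample t statistic (data invented for illustration).
import math
import statistics

treatment = [5.1, 5.8, 6.2, 5.9, 6.4, 5.7, 6.1, 5.5]
control   = [4.9, 5.2, 4.8, 5.4, 5.0, 5.1, 4.7, 5.3]

def welch_t(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))   # standard error of the difference
    return (ma - mb) / se

t = welch_t(treatment, control)
print(round(t, 2))
```

With groups of roughly this size, an absolute t value above about 2.1 would correspond to statistical significance at the conventional 5 percent level.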

Future-value analysis. The decomposition of market capitalization into current value (extrapolation of existing monetary returns) and future value, or expectations of future growth.
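The decomposition can be illustrated with back-of-the-envelope arithmetic: model current value as a perpetuity of today’s earnings, and treat whatever the market pays beyond that as future value. The figures and the perpetuity assumption are illustrative only:

```python
# A back-of-the-envelope future-value decomposition: current value is
# modeled as a perpetuity of today's earnings, and future value is
# whatever the market pays beyond that. All figures are invented.

market_cap = 50_000_000_000       # total market capitalization ($)
current_earnings = 2_000_000_000  # current annual monetary returns ($)
cost_of_capital = 0.08            # assumed discount rate

current_value = current_earnings / cost_of_capital  # perpetuity value
future_value = market_cap - current_value           # growth expectations

print(current_value, future_value)
```

In this invented case, half of market capitalization rests on expectations of future growth rather than extrapolation of existing returns.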

Monte Carlo simulation (e.g., for R&D project valuation). A computerized technique used to assess the probability of certain outcomes or risks by repeatedly simulating a hypothetical event, drawing its uncertain inputs from predefined probability distributions over many trials.
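A minimal Monte Carlo sketch of R&D project valuation follows; the success probability, payoff distribution, and cost are invented assumptions:

```python
# Monte Carlo sketch of R&D project valuation: sample uncertain inputs
# (technical success, peak revenue) from assumed distributions over many
# trials and summarize the resulting value. All parameters are invented.
import random
import statistics

random.seed(42)

def simulate_project_value():
    succeeds = random.random() < 0.30        # 30% chance of technical success
    if not succeeds:
        return -50.0                         # sunk R&D cost, $M
    revenue = random.gauss(400.0, 120.0)     # uncertain payoff, $M
    return max(revenue, 0.0) - 50.0

trials = [simulate_project_value() for _ in range(100_000)]
expected_value = statistics.mean(trials)
downside = sum(v < 0 for v in trials) / len(trials)

print(round(expected_value, 1), round(downside, 3))
```

The simulation yields not just an expected value but a full risk profile, here the probability that the project loses money.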

Multiple regression analysis (e.g., to determine how nonfinancial factors affect financial performance). A statistical technique whereby the influence of a set of independent variables on a single dependent variable is determined.
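The technique can be illustrated by fitting two independent variables to one dependent variable via the normal equations. The data below are synthetic, generated from known coefficients so the fit can be checked:

```python
# A small illustration of multiple regression via the normal equations:
# estimate how two nonfinancial factors (say, customer satisfaction and
# employee turnover) relate to a financial outcome. Data are synthetic.

def fit_two_factor(xs1, xs2, ys):
    """Ordinary least squares for y = b0 + b1*x1 + b2*x2."""
    rows = [[1.0, a, b] for a, b in zip(xs1, xs2)]   # design matrix [1, x1, x2]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    # Solve the 3x3 normal equations by Gauss-Jordan elimination.
    m = [xtx[i] + [xty[i]] for i in range(3)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

x1 = [1, 2, 3, 4, 5, 6]
x2 = [2, 1, 4, 3, 6, 5]
y = [10 + 2 * a + 3 * b for a, b in zip(x1, x2)]   # exact known relationship

b0, b1, b2 = fit_two_factor(x1, x2, y)
print(round(b0, 4), round(b1, 4), round(b2, 4))
```

Because the synthetic data contain no noise, the fit recovers the generating coefficients exactly; real data would yield estimates with standard errors.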

Neural network analysis (e.g., to predict the onset of disease). Systems modeled on the structure and operation of the brain, in which the state of the system is modified by training until it can discriminate between the classes of inputs; used on large databases. Typically, a neural network is initially “trained,” or fed large amounts of data and rules about data relationships—for example, “A grandfather is older than a person’s father.”
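The training idea can be sketched with the simplest possible case: a single artificial neuron whose weights are adjusted from labeled examples until it discriminates between two classes of inputs. The toy data and learning rate are invented:

```python
# A minimal sketch of neural network training: one neuron (a perceptron)
# adjusts its weights from labeled examples until it separates the two
# classes. Toy, linearly separable data for illustration only.

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
b = 0.0
lr = 0.1   # learning rate

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(25):                       # a few training epochs
    for x, target in data:
        error = target - predict(x)       # 0 when the example is classified right
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in data])
```

Real neural networks stack many such units in layers and train them on large databases, but the adjust-until-it-discriminates loop is the same idea.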

Textual analysis (e.g., to assess intangible capabilities). Analysis of the frequency, semantic relationships, and relative importance of particular terms, phrases, and documents in online text.

Yield analysis (e.g., in semiconductor manufacturing). Employing basic statistics (mean, median, standard deviation, etc.) to understand yield volume and quality, and to compare one batch of items with another—often displayed visually.
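A minimal sketch of such a batch comparison, using only the basic statistics named above; the yield figures are invented:

```python
# Compare two production batches on basic statistics, as described above.
# Yield figures (% good die per wafer) are invented for illustration.
import statistics

batch_a = [91.2, 92.5, 90.8, 93.1, 91.9, 92.2]
batch_b = [88.4, 90.1, 87.9, 89.5, 88.8, 89.2]

for name, batch in [("A", batch_a), ("B", batch_b)]:
    print(name,
          round(statistics.mean(batch), 2),
          round(statistics.median(batch), 2),
          round(statistics.stdev(batch), 2))
```

In practice these summaries are usually plotted as control charts or histograms so engineers can spot drift at a glance.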

Financial Analytics

We’ll start by discussing financial applications, since they of course have the most direct tie to financial performance. There are several categories of financial analytics applications, including external reporting, enterprise performance management (management reporting and scorecards), investment decisions, shareholder value analysis, and cost management.

External Reporting to Regulatory Bodies and Shareholders

External reporting doesn’t lead to competitive advantage under “business as usual” conditions. It’s not normally an advantage to report more quickly and accurately beyond a certain level. Cisco Systems, for example, has touted over the last several years its ability to close the books instantaneously and to report its results almost immediately at the end of a financial period. But we often wondered why it bothered; the SEC doesn’t demand instantaneous closes, and it’s not clear what else a company could do with financial results intended for external consumption. But it is a different story with the information used by managers to make better strategic, investment, and operational decisions.

Reporting and scorecards, both financial and operational, are some of the most common applications of business intelligence and decision support. They’re obviously important to managing any business, and they’re increasingly important (with the advent of Sarbanes-Oxley legislation, for example) to keeping senior executives out of jail. While organizations do not compete on the basis of their reporting and scorecards, having systems in place to monitor progress against key operating metrics and against plan is critical to strategy execution.1 As we argued in chapter 1, reporting activities have typically not made extensive use of analytics, but there are exceptions. One is the prediction of future performance.

Public companies have to make regular predictions of future performance for investors and analysts. The consequences of poor predictions can be dramatic; investors may severely punish companies that fail to “meet their numbers.” Most companies make straightforward extrapolations of results from earlier periods to later ones. Yet in certain industries with a high degree of change and uncertainty, extrapolations can be problematic and yield poor predictions.

The information technology industry is one of those uncertain industries. Products and customer desires change rapidly, and disproportionate sales volumes take place at the ends of reporting periods. Hewlett-Packard (HP) found that it was very difficult to make accurate predictions of revenues in such an environment, and in one quarter of 2001 had a “nearly disastrous” revenue growth forecasting error of 12 percent.2 HP executives decided to put some of their quantitative researchers in HP Labs onto the problem of creating more accurate revenue forecasts.

The researchers used a Bayesian inference approach (see the box) to predict monthly and quarterly revenues from data thus far in the period. After several adjustments, the algorithm yielded significantly better revenue predictions than the more straightforward approach used previously. HP incorporated the new algorithm into its performance-reporting dashboard, and the company’s (then) CFO, Bob Wayman, noted, “It is reassuring to have a solid projection algorithm. It’s crisper and cleaner, it has rigor and methodology as opposed to my own algorithm.”3
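HP has not published the details of its algorithm, so the following is only a hedged sketch of the general idea: combine a prior forecast of period revenue with a noisy quarter-to-date signal, weighting each by its precision. All figures are invented:

```python
# Hedged sketch (not HP's actual algorithm): a conjugate normal Bayesian
# update combining a prior quarterly revenue forecast with a noisy
# quarter-to-date run-rate estimate. All numbers are invented.

def normal_update(prior_mean, prior_var, obs_mean, obs_var):
    """Posterior mean/variance for a normal prior and normal observation."""
    w = prior_var / (prior_var + obs_var)      # weight on the new evidence
    post_mean = prior_mean + w * (obs_mean - prior_mean)
    post_var = prior_var * obs_var / (prior_var + obs_var)
    return post_mean, post_var

prior_mean, prior_var = 1000.0, 40.0 ** 2        # plan-based forecast, $M
runrate_mean, runrate_var = 1080.0, 60.0 ** 2    # noisy in-quarter signal

mean, var = normal_update(prior_mean, prior_var, runrate_mean, runrate_var)
print(round(mean, 1), round(var ** 0.5, 1))
```

The posterior forecast shifts toward the in-quarter evidence and, crucially, is always less uncertain than the prior alone, which is what makes such projections "crisper and cleaner" than simple extrapolation.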

Better prediction of future performance from an operational standpoint also allows companies to take action sooner. Using “near-real-time” operational data, managers can quickly identify emerging trends, make predictions, and take prompt action. For example, during the last recession, Dell executives used predictive models to recognize months before their competitors how thoroughly sales were going to soften. They took preemptive action on prices and products, which resulted in better (or at least, less awful) financial performance during the downturn. More importantly, as the recession ended, they were able to adjust again to boost their market share.

Enterprise Performance Management and Scorecards

Another exception to financial-reporting applications involving analytics is when companies try to explain financial performance from nonfinancial factors. A successful enterprise performance management initiative requires companies not just to predict performance accurately but also to come to grips with some broader questions: Which activities have the greatest impact on business performance? How do we know whether we are executing against our strategy? An organization needs quantifiable insight into the operational factors that contribute to business results and a way of measuring progress against them.

Over the past decade or so, the biggest advance in management reporting has been the adoption of management dashboards or balanced scorecards. These scorecards report not only on financial performance but also on such nonfinancial domains as customer relationships, employee learning and innovation, and operations. These have been a great step forward in understanding performance. But most adopters of balanced scorecards aren’t very balanced, focusing primarily on financial reporting. It’s great to add nonfinancial metrics to a scorecard, but if investors and regulators don’t really care about them, it’s natural to emphasize the financial numbers.

Another problem with most scorecards is that even when companies do have both financial and nonfinancial measures in their scorecards, they don’t relate them to each other. Management professors David Larcker and Chris Ittner studied several companies that had adopted balanced scorecards.4 None of the companies they examined had causal models that related nonfinancial to financial performance.

Nonfinancial or intangible resources (such as human knowledge capital, brand, and R&D capability) are growing in importance to both company business performance and external perceptions of company value.5 Even the most favorable studies only show an explanatory power of approximately 50 percent for the effect of financial measures such as earnings per share, net income, economic profit, or return on invested capital on a company’s market value.6 In some industries, EPS accounts for less than 5 percent of value. We believe that a management team that manages all its sources of value—tangible and intangible, current and future—has a significant advantage over those who do not.

Some companies are working to develop a holistic understanding of both financial and nonfinancial value drivers. A few companies at the frontier are seeking to manage both their current and future shareholder value.7 These organizations are exploring how to infuse their scorecards with data from Wall Street analysts and future-value analytics to gain better insight into the implications of decisions on shareholder value. Companies that develop such a capability might be well on the way to competitive advantage.

Reporting and scorecards are most likely to lead to competitive advantage when the business environment is changing dramatically. In those circumstances, it’s particularly important to monitor new forms of performance. New measures need to be developed, new models of performance need to be created, and new management behaviors need to emerge. The speed and effectiveness of organizational transformation in such periods can itself be a source of competitive advantage. For example, a property and casualty insurance company we worked with needed to transform itself. It was a poor financial performer, with losses over the last four years of well over a billion dollars. It paid out $1.40 in claims and expenses for every premium dollar it took in. The company had monitored its financial results, but it didn’t have a handle on what was creating the poor performance.

As part of a major corporate transformation, the company focused on three key processes: producer (agent) relationships, profitable underwriting, and claims execution. In addition to redesigning those processes, the company created new measures of them and collected them in a balanced scorecard. The scorecard served as a means to rapidly communicate the performance of and changes in the processes, and the success of the change initiative overall, to the management team. The scorecard assessed the company’s ability to deliver on such objectives as:

  • Selecting profitable new markets to enter
  • Attracting and selecting the right customers
  • Driving pricing in accordance with risk
  • Reducing the severity of claims

Reward systems for employees and managers at all levels of the company were changed to tie to performance on these objectives. Some automated analytics were embedded into underwriting systems to speed the process and improve the quality of pricing decisions.

The company used these process changes and reporting approaches to dramatically turn itself around. It began to make substantial amounts of profit and was eventually acquired by another insurance firm for $3.5 billion—up from perhaps zero value a few years earlier.

A medium-sized semiconductor firm is similarly undergoing a major corporate transformation today. The company has historically been driven by its engineering capabilities and has sold to customers what it could design and manufacture. Today, however, its executives have concluded that it needs to change its capabilities to focus on marketing and customer relationships. The entire industry is changing, and market leaders such as Intel have concluded that identifying markets and applications for semiconductors (such as mobile computing and multimedia technology) is more important than squeezing a few more MIPS out of microprocessors.

The alternative is to understand markets and customers better and to begin to serve the most profitable customers. In the past, the company had worked diligently to satisfy customers even if doing so wasn’t even profitable. Now new metrics of customer profitability have been developed, and both customer-facing personnel and management are being measured on this criterion. Like the insurance company, the semiconductor firm is putting its new measures in a balanced scorecard and has developed “strategy maps” that link the new measures and reporting structures to its new strategy. The new reporting approach has only recently been adopted, but virtually every senior manager believes that if the company is to compete and prosper in this new competitive environment, the new reporting system will be a critical component.

Cost Management

Although some might question whether cost management can lead to a competitive advantage, there are an increasing number of examples of firms that have made effective analysis and management of costs a strategic capability. In some cases the new insights into costs are applied for internal management only; in others they are used to help set the prices that customers pay and the customer behaviors that companies attempt to influence. We’ll give one example of each type. Cost allocation is not in itself a highly analytical activity, but once completed, it allows the optimization of costs through modeling and rapid, accurate decisions about how business relationships and pricing should be treated. For example, it can be used to decide which distribution channel a particular customer should be encouraged to use and what price she should be charged.

One of the pressing reasons for managing costs well is when a bankruptcy court judge orders you to do so. That was the situation faced by MCI, which went into bankruptcy because of poor (and at times illegal, according to the courts) financial management when the company was known as WorldCom. One of the company’s problems was that it had little or no control over, or even awareness of, its product/service costs. Various services were offered over the company’s networks, but there was little sense of the resources and costs necessary to make those services available. Under these conditions, it was easy to view the company’s revenue as purely incremental, with no costs—and if people think things are free, they are often priced too low. The company had many revenue reports but very few relating to costs. As a result, it was impossible to determine the profitability of offerings to customers.

This revenue orientation may have been reasonable at one time, but today the telecommunications business has become more commoditized. It’s harder to grow your way into profitability. So MCI embarked upon a major initiative to calculate its activity-based costs. It created 5,000 cost centers, grouped into 970 cost groups. A set of costing factors (business versus consumer focus, for example) was created to apply to each of the cost centers each month. The costing factors are reviewed annually. MCI created a collection point for the cost information at the enterprise level, which was much more effective than the previous approach. As John Nolan, MCI’s vice president for analysis and planning, put it, “We used to try to line up data fields from 300 spreadsheets. When we started using an activity-based costing tool, there were tears of joy about what this tool could do.”8

As with many other major analytical initiatives, there was considerable support from MCI’s senior management for the cost management initiative. Mike Capellas, MCI’s CEO at the time, had been known for a strong data orientation at Compaq, which he had previously headed. He believed strongly in allocating all costs to P&Ls. At MCI he consistently championed the costing effort, and he publicized his preferences by refusing to look at any P&L without fully allocated costs. The cost and profitability information at MCI is now being used to determine compensation, and has begun to change management behavior and refocus attention.

Most importantly, the cost information allowed MCI to understand the profitability of its service offerings and customer segments, which was a condition of emerging from bankruptcy. The new focus on profitability impressed Wall Street, which consequently valued MCI’s equity more highly. In 2005, MCI was sold to Verizon for a price of $8.4 billion, and there is little doubt that the cost analytics helped increase the value of the company.9

For an example of cost management involving customers (i.e., still an example of internal use of analytics but with external benefit), we can turn to RBC Financial Group, whose major business unit is better known as Royal Bank of Canada—the largest bank in that country. RBC wasn’t losing money or in bankruptcy court, but around the year 2000, a couple of senior executives and an IT manager determined that they could make substantially more money if they achieved a better understanding of customer profitability. Doing so would allow the bank to understand which customers were profitable, which services ought to be offered to which customers, and how unprofitable customers might be turned into profitable ones.

RBC had some advantages in customer profitability analytics. The company had long been customer focused. As early as the late 1970s, a visionary started a customer information file—a universal customer record for retail and commercial banking—that contained information on all of the bank’s products. There was even a customer profitability model, which had been built in 1992. The problem was that it wasn’t very accurate. It used aggregate information, rather than individual customer data, to place customers into three different profitability groups. According to Kevin Purkiss, RBC’s senior vice president of client profitability analytics:

The model placed customers into three large “buckets”: A, B, and C. The “A” customers made the most profit, the “B” customers made some, and the “C” customers broke even or lost money. This information was delivered to the field office. It helped align the sales force around customer profitability and planted the seeds for the new customer-centric organization. However, it wasn’t refined enough for advanced channel optimization or relationship pricing. In addition, we realized later that in some instances, customers were treated without consideration of the potential business they could contribute.10

Before it could create a better customer profitability model, RBC first needed better information on its costs. The bank had employed activity-based costing for over twenty-five years, but until the late 1990s, it wasn’t able to assign costs to customer service channels such as tellers, ATMs, and telephone banking. At that point a new initiative was undertaken to identify the costs of those channels, as Chitwant Kohli, vice president of costing and profitability (it is unusual even to have such a role), explained:

As a services company, we are most interested in tracking labor costs, which make up over 60 percent of non-interest expenses. We extract expense data quarterly from the general ledger for each individual cost center. These cost centers are then grouped with like units based on the products, services, and activities performed by the unit . . . For example, domestic branches, which sell and service the same product lines using the same processes, became a unit group . . . Within the 30 to 40 unit type groups, we establish total staff time consumed by each activity based on unit time per activity multiplied by volumes processed. This enables the proper allocation of the unit’s salary cost to products and activities and also forms the basis for apportioning premises and general operating costs. Once we have these drivers by product, activity, and channel, we can aggregate costs across all units to arrive at both transaction and total product cost. These costs are then available for use in profitability models. For example, we can report the full end-to-end product cost of residential mortgages including acquisition and renewal costs by channel, back office processing, call center support, system costs, Head Office and regional overheads. For every customer we can then arrive at the costs associated with “ownership” of each separate product in the customer’s portfolio, based on transaction usage and channel preference.11
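The allocation arithmetic Kohli describes (unit time per activity multiplied by volumes, then apportioning the unit’s salary cost in proportion to the time consumed) can be sketched as follows; the activities, times, volumes, and cost figures are invented:

```python
# Hedged sketch of activity-based cost allocation as Kohli describes it:
# staff time per activity = unit time x volume, and the unit's salary
# cost is apportioned in proportion to that time. Figures are invented.

activities = {                 # activity: (minutes per transaction, monthly volume)
    "open_account": (30.0, 1_200),
    "process_deposit": (2.0, 90_000),
    "mortgage_renewal": (45.0, 800),
}
unit_salary_cost = 500_000.0   # the unit's monthly salary cost ($)

minutes = {a: t * v for a, (t, v) in activities.items()}
total_minutes = sum(minutes.values())
allocated = {a: unit_salary_cost * m / total_minutes for a, m in minutes.items()}

for activity, cost in allocated.items():
    print(activity, round(cost, 2))
```

Summing these per-activity costs across products and channels is what yields the end-to-end product costs, and ultimately the per-customer costs, that the profitability models consume.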

While the math to compute activity-based costs isn’t that difficult, coming up with the needed data does require a substantial degree of diligent investigation. In RBC’s case, however, it clearly was worth the trouble. The bank could use this cost information to assign costs to the different products, channels, and transaction types that customers use, and from that could determine the transaction and lifetime projected profitability of customers. RBC now feels that it has high-quality customer profitability information models and no longer needs to refine the data or the models. Instead, the focus is on how the information is used, according to Kevin Purkiss: “For us, ongoing competition is based on how you use the information. At this point rather than refining the model, we get more bang for the buck out of how we can use the information in our business. We’ve done lots of work on value propositions—trying to understand different segments and how they relate to product lines and new products. We gained lots of insight on that two years ago, and now we’re trying to better understand geographical variations.”12

Purkiss feels that today about 10 percent of large banks have ABC and profitability information, so it is less of a competitive advantage than it was when RBC did it around the year 2000. However, most banks have not employed these analytics across all products, services, and geographies, as RBC has largely done by this point. The barriers to doing so are perhaps less technological and more organizational, and are part of taking an enterprise-wide approach to analytics.

Merger and Acquisition Analytics

Mergers and acquisitions (M&A) have historically not been the focus of a lot of analytical activity, perhaps with the exception of detailed cash flow analyses. There is usually little attention to operational analytics involving supply chain efficiencies, predicted customer reactions, and impacts on costs within the combined organization. This may be a reason why a high proportion of M&A deals—estimates run as high as 70 to 80 percent—are not terribly successful in terms of producing economic value.

It doesn’t have to be that way, however. Certainly some M&A deals must be done so quickly that extensive analytics would be difficult or impossible. When Bank of America was given the opportunity to acquire Fleet Bank, for example, it had about forty-eight hours to make a decision. But for most deals, there’s plenty of time to undertake analyses. At Procter & Gamble, for example, the acquisition of Gillette was considered for more than a year before the deal was announced. P&G’s analysis identified significant savings (in its supply chain, through workforce reductions and potential gains from customer synergies), which were used to determine how much P&G offered for Gillette in the deal. Similarly, CEMEX, the global cement company, uses analytics to quantify expected benefits from increased market share and improved profitability by enforcing its processes and systems on the takeover target.

Manufacturing, Operations, and Quality Analytics

One analytical domain that has long existed in companies is operations, especially manufacturing and quality. This was the original home, for example, of Total Quality Management and Six Sigma, which, when done seriously, involve detailed statistical analysis of process variations, defect rates, and sources of problems. Manufacturing and quality analytics have had an enormous impact on the global manufacturing industries, but their impact has been less revolutionary for service industries and nonmanufacturing functions within manufacturing companies. For the great majority of organizations, it seems difficult to summon the required levels of discipline and rigor to apply statistical quality control or even a strong process orientation outside of manufacturing. This means, of course, that it becomes difficult for firms to compete broadly on the basis of analytics, since they are often limited to the manufacturing function and generally focused on achieving productivity improvements rather than innovations to gain a competitive advantage.

Real analytical competitors in manufacturing, then, are those that go beyond just manufacturing. There are a few great examples of this approach. One is at a small steel manufacturer in the United States, and it illustrates that analytical competition applies both to smaller firms and to the manufacture of commodity goods. Rocky Mountain Steel Mills, a steel rail-, rod-, and bar-making division of Oregon Steel, faced a critical decision about manufacturing capacity in early 2005. It had shut down its seamless pipe mill in 2003 because of price pressures, but the primary customers for seamless pipe were oil drillers, and by 2005 oil prices had risen enough to make Rob Simon, the company’s vice president and general manager, consider reopening the pipe mill. However, he found the rules of thumb and standard costing/volume analyses previously used for such decisions to be too simplistic for a fast-changing mix of demand, pricing, production constraints, and industry capacity.

Simon decided to turn to a more analytical approach, and Rocky Mountain installed an analytical software tool called Profit InSight. He began to work with monthly analyses to determine whether the plant should be reopened. Potential customers and other managers thought that the high prices for pipe clearly justified the capital outlay that would be needed to reopen, but Simon’s analyses suggested that increased revenues for pipe would be offset by lower production of rod and bar, and would not be profitable. Only when pipe prices continued to rise throughout 2005 did Simon decide to reopen the plant in December of that year. Even when production started, his models suggested holding off on taking orders because prices were predicted to rise. Indeed, by January 2006 they had risen by 20 percent over the previous quarter.

Rocky Mountain estimates that in addition to the higher prices it received, it averted a $34 million loss it would have faced from production constraints had it reopened the plant earlier in 2005. The success with the new pipe mill was also a key factor in a substantial rise in Oregon Steel’s stock price. Profit InSight is now used as a weekly strategic planning tool, and Rocky Mountain Steel Mills has completely abandoned its previous “rhetorical” sales planning and forecasting approach for the new analytical methods. They are measurably superior, but Simon still had to demand that everyone listen to what the analyses show and quit second-guessing using the old approaches.

Analytics can also be applied to assess manufactured quality. Honda, for example, has long been known for the quality of its automobiles and other products. The company certainly has analytical individuals in its manufacturing quality department. However, it goes well beyond that function in identifying potential quality problems. Honda instituted an analytical “early warning” program to identify major potential quality issues from warranty service records. These records are sent to Honda by dealers, and they include both categorized quality problems and free text. Other text comes from transcripts of calls by mechanics to experts in various domains at headquarters and from customer calls to call centers. Honda’s primary concern was that any serious problems identified by dealers or customers would be noticed at headquarters and addressed quickly. So Honda analysts set up a system to mine the textual data coming from these different sources. Words appearing for the first time (particularly those suggesting major problems, such as fire) and words appearing more than predicted were flagged for human analysts to look at. Honda won’t go into details about any specific problems it’s nipped in the bud, but says that the program has been very successful.
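Honda has not disclosed how its system works beyond this description, but the flagging logic (words appearing for the first time, and words appearing more often than predicted) can be sketched against a historical baseline. The records, baseline counts, and threshold below are invented:

```python
# Toy version of the flagging logic described above: compare term counts
# in the current period's warranty text against a historical baseline,
# flagging words that are new or far more frequent than expected.
# Records, baseline counts, and the threshold are invented.
from collections import Counter

baseline = Counter({"rattle": 40, "stall": 12, "leak": 25})
baseline_total = sum(baseline.values())

current_records = [
    "engine fire reported after short drive",
    "fuel leak near engine fire risk",
    "rattle in dashboard",
    "stall at idle",
    "leak under vehicle",
]
current = Counter(w for rec in current_records for w in rec.split())
current_total = sum(current.values())

flags = sorted(
    w for w, c in current.items()
    if (baseline[w] == 0 and c >= 2)   # brand-new term, seen repeatedly
    or (baseline[w] > 0
        and c > 3 * baseline[w] / baseline_total * current_total)
)
print(flags)
```

A word like “fire,” absent from the baseline but recurring in fresh records, is exactly the kind of term routed to human analysts for review.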

Toshiba Semiconductor Company is another business that has made extensive use of analytics—in particular, visual representation of statistical analysis—in manufacturing quality. The initial applications focused on advanced analysis for new product and technology development, but they expanded quickly into other areas such as sales, marketing, development, production, and quality assurance. The company’s executives are strong advocates of analytics and have championed the concept for over six years. The overall approach at Toshiba is encompassed by a broader initiative entitled “Management Innovation Policy and Activity.”

The visual analytics approach was first used by engineers in several semiconductor fabrication plants (fabs) for yield analysis—a key problem in the industry. According to Shigeru Komatsu, the company’s chief knowledge officer (it is rare for analytics to be addressed in such a role, but they are at Toshiba Semiconductor), “We have worked on standardizing performance indicators, we have built shared databases, we have made efforts to share analysis cases and results, and we have implemented analytic software such as Minitab and Spotfire DecisionSite in order to increase our efficiency in analytics.”13

One key issue with analytics in manufacturing is how well production workers can actually use sophisticated analytical tools, but Toshiba has found that visual analytics help to address this problem. “We observed that only 20 percent of the possible users of yield management software, for example, actually use it effectively. This is not only because the complicated special features make it hard to use for many engineers, but also because it only has a part of the data we need to analyze,” said Koji Kimura, the yield manager at Toshiba’s bipolar fab in Kitakyushu.14 Using the visual approach, Toshiba has been able to take an integrated approach to analytics in its business. The company combines analytics with a highly disciplined approach to quality and both operational and managerial innovation.

Another key aspect of manufacturing analytics is to ensure that the right products are being manufactured. We’ll refer to it as the configuration problem—making sure that the products offered to the market are those that the market wants. The configuration problem, like the ones described earlier, is cross-functional; it takes place at the intersection of sales and manufacturing, and usually also involves the supply chain, financial, and even human resource processes of the company. To compete on the basis of configuration is thus by definition an enterprise-wide activity. Configuration is highly analytical. It involves predictive modeling of what customers want to buy, as well as complex (usually rule-based) analysis of what components go with what other components into what finished products.

What companies compete on the basis of configuration? Some high-technology companies, such as Dell, are known for their configurable products. Wireless telecommunications companies may have many different service plans; some have developed automated analytical applications to find the best one for each customer. They also tailor their services to each corporate account. Automobile companies need to compete on configuration, though U.S. and European manufacturers have traditionally done it poorly. Since manufacturing a car from scratch to a customer’s specification is viewed as taking too long (at least outside of Japan, where it is commonly done), car companies have to forecast the types of vehicles and options customers will want, manufacture them, and send them to dealers. Far too often, the mixes of models and option packages have not been what customers want, so cars have to be substantially discounted to be sold during promotions or at the end of a model year. The mismatch between consumer desires and available product has been one of the biggest problems facing Ford and General Motors.

Both companies are trying to do something about configuration, but Ford is probably the more aggressive of the two. The company has begun to change its focus from producing whatever the factory could produce, and worrying later about selling it, to trying to closely match supply and demand. As part of this initiative, Ford is using configuration software to maintain rules about options and components, which will reduce the number of production mistakes and more closely match dealer orders to production schedules. Through a manufacturer/dealer partnership called FordDirect, Ford and its dealers also allow customers to browse dealer inventory, design a vehicle and match it to dealer inventory, and get a price quote and financing for a specific vehicle through the FordDirect.com Web site. FordDirect software is also used by dealers to track leads and measure the results from advertising. Ford has not yet entirely mastered the art of competing on configuration, but it is clearly making strides in that direction.

The industrial and farm equipment manufacturer Deere & Company is another firm that has invested heavily in analytics around configuration and product availability. On any particular Deere product line—say, for a tractor—there may be tens of thousands of valid configurations. Deere’s goal was to reduce inventory and complexity by eliminating configurations that were least desirable to make and sell. Deere analysts worked with academics to find the optimal number of configurations on two product lines at Deere. Profits on the two lines were increased by 15 percent as a result of the analytical work, which allowed 30 to 50 percent reductions in the number of configurations offered to customers while maintaining high levels of customer service and satisfaction.15
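A toy version of the trade-off behind this work—keep the configurations that cover most of the demand, drop the long tail—might look like the greedy sketch below. The data and coverage target are invented, and Deere’s actual optimization was far richer, balancing manufacturing complexity against customer service levels.

```python
def trim_configurations(demand, coverage_target=0.95):
    """demand: {config_id: units sold}. Keep the fewest configurations
    (most popular first) that cover the target share of demand."""
    total = sum(demand.values())
    kept, covered = [], 0
    for config, units in sorted(demand.items(), key=lambda kv: -kv[1]):
        if covered / total >= coverage_target:
            break
        kept.append(config)
        covered += units
    return kept

demand = {"A": 500, "B": 300, "C": 120, "D": 50, "E": 20, "F": 10}
print(trim_configurations(demand))  # → ['A', 'B', 'C', 'D']
```

Even this crude cut illustrates the finding above: a large share of configurations can be eliminated while still satisfying nearly all customer demand.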

For Internet-based businesses, operations means churning out the basic service for which customers visit a Web site. Successful online businesses use analytics to test virtually all aspects of their sites before implementing them broadly. At Google, for example, the primary reason customers visit the site is to use its search capabilities. Google has a very extensive program of testing and analytics with regard to its search engine. Google employs a wide range of operational and customer data and analytics to improve search attributes, including relevance, timeliness, and the user experience. The company developed many of its own proprietary measures of search relevance. Most of the metrics are gathered in an automated fashion, such as the percentage of foreign results, how deeply users go in the retrieved items list, the percentage of users who go to each page of the search result, and the measures of search latency or timeliness. But Google also collects human judgments on the search process, and even observes individual users (at Google headquarters and in users’ homes) as they use the site for individual queries and entire sessions. One technique employed is eye tracking, from which “heat maps” are created showing which areas of a page receive the most attention.

Google is heavily committed to experimentation before making any change in its search site. As Bill Brougher, search product manager for Google, put it:

Experiments are a requirement of launching a new feature. It’s a very powerful tool for us. We have been experimenting for years and have accumulated a lot of institutional knowledge about what works. Before any new feature ships, it has to pass through a funnel involving several tests. For example, any change to our search algorithms has to be tested against our base-level search quality to make sure the change is a substantial improvement over the base. A little blip in quality isn’t significant enough to adopt.16

Google’s methods for analytical operations are as rigorous as any firm’s, and the nature of the business makes a large amount of data available for analysis.
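The gate Brougher describes—adopt a change only when it is a substantial improvement over baseline, not a “little blip”—is, in spirit, a standard significance test on two observed rates. The sketch below is a generic two-proportion z-test with invented sample sizes and rates, not a description of Google’s actual methodology.

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic and one-sided p-value for variant B's rate exceeding A's."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the normal CDF
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    return z, p_value

# Invented example: baseline 4.8% success rate vs. 5.1% for the new feature
z, p = two_proportion_z(4_800, 100_000, 5_100, 100_000)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```

With samples this large, even a 0.3-point lift clears the bar; with small samples, the same lift would be indistinguishable from noise—which is exactly why a “little blip” is not enough to ship.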

Research and Development Analytics

Research and development (R&D) has perhaps been the most analytical function within companies. It was the primary corporate bastion of the scientific method, featuring hypothesis testing, control groups, and statistical analysis.

Of course, some of this highly analytical work still goes on within R&D functions, although much of the basic research in R&D has been supplanted by applied research (which can still use analytical methods) and the creation of extensions of existing products. In several industries, research has become more mathematical and statistical in nature, as computational methods replace or augment traditional experimental approaches.

We’ll describe the analytical environment for R&D in one industry that’s changing dramatically with regard to analytics. In the pharmaceutical industry, analytics—particularly the analysis of clinical trials data to see whether drugs have a beneficial effect—always have been important. Over the last several years, however, there has been a marked growth in systems biology, in which firms attempt to integrate genomic, proteomic, metabolic, and clinical data from a variety of sources, create models and identify patterns in this data, correlate them to clinical outcomes, and eventually generate knowledge about diseases and their responses to drugs. This is a considerable challenge, however, and firms are just beginning to address it. We interviewed three pharmaceutical firms—one large “big pharma” company and two smaller, research-oriented firms—and found that all of them have efforts under way in this regard, but they are a long way from achieving the goal.

Analytics is also being used effectively to address today’s challenges in R&D, and this is one way that Vertex Pharmaceuticals, Inc. competes.

Vertex, a global biotech firm headquartered in Cambridge, Massachusetts, has taken a particularly analytical approach to R&D, and its results are beginning to show the payoff. Vertex’s president and CEO, Joshua Boger, PhD, is a strong believer in the power of analytics to raise drug development productivity. As early as 1988 (when he left Merck & Co. to found Vertex), he argued, “What you need in this business is more information than the other guy. Not more smarts. Not more intuition. Just more information.”17

Vertex has undertaken a variety of analytical initiatives—in research, development, and marketing. In research, Vertex has focused on analyses that attempt to maximize the likelihood of a compound’s success. This includes developing multiple patent lead series per project and ensuring that compounds have favorable drug-like attributes. Vertex’s approach to drug design is known as rational or structural. This approach seeks to “design-in” drug-like properties from the beginning of a drug development project and it enables Vertex to determine as early as possible whether a compound will have drug-like attributes.

More of Vertex’s efforts in using analytics have gone into the development stage of R&D, where its analyses indicate that most of the industry’s cost increases have taken place. One particularly high cost is the design of clinical trials. Poor clinical trial design leads to either ambiguous trial results or overly large trials, causing substantial delays and raising costs. Vertex has addressed this challenge by developing new trial simulation tools that enable it to design more informative and effective clinical trials in substantially less time. Vertex can now perform trial simulations hundreds of times faster than was previously possible, and these simulations also reduce the risk of failed or ambiguous trials caused by faulty trial design. This simulation advantage allows Vertex to optimize trial design in a fraction of the time required by industry-standard design tools, thereby shortening the trial cycle time.
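The general idea of trial simulation can be illustrated with a small Monte Carlo sketch: simulate many hypothetical trials of a given size and count how often the trial detects a true treatment effect. This is a generic textbook technique with invented response rates and success criteria, not Vertex’s proprietary tooling.

```python
import random

def estimated_power(n_per_arm, control_rate, treat_rate, runs=2000, seed=0):
    """Fraction of simulated trials whose observed effect clears a
    crude success criterion (a 10-point difference in response rates)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(runs):
        control = sum(rng.random() < control_rate for _ in range(n_per_arm))
        treated = sum(rng.random() < treat_rate for _ in range(n_per_arm))
        if (treated - control) / n_per_arm >= 0.10:
            hits += 1
    return hits / runs

# Invented scenario: 30% response on control, 45% on treatment
print(f"power at n=100/arm: {estimated_power(100, 0.30, 0.45):.2f}")
```

Running such simulations across candidate designs shows, before any patient is enrolled, which trials are likely to come back ambiguous or needlessly large—the failure modes the passage describes.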

Clinical trial operation also represents some of the highest cost increases across the pharmaceutical industry. The operation of clinical trials, as with trial design, takes place within the development stage of R&D. Operational activities that are not automated lead to significant costs. Vertex uses analytics to automate and enhance clinical trial operations; examples include tools for patient accruals and electronic data capture (EDC).

Whether in R&D or elsewhere, the company begins with the right metric for the phenomenon it needs to optimize. Analysts determine how to obtain the appropriate data and what analyses to conduct. With these findings, Vertex constantly compares itself to competitors and to best practice benchmarks for the pharmaceutical industry. “We compete on analytics and culture,” says Steve Schmidt, Vertex’s chief information officer. “We encourage fearless pursuit of innovations but we ruthlessly measure the effect these innovations have on our core business. We’re always looking for new meaningful analytic metrics, but where we look is driven by our strategy, our core corporate values and strengths, and our understanding of the value proposition to our business.”18 Vertex is a great example of applying analytics to product R&D, and as a result the company has an impressive array of new drugs in all stages of development.

The pharmaceutical industry is also pursuing analytical approaches that don’t even involve the laboratory. So-called in silico research uses computational models of both patients and drugs to simulate experiments more rapidly and cheaply than they could be performed in a lab. One systems biology start-up, Entelos, Inc., has produced computer program “platforms” to simulate diseases and treatments in the areas of cardiovascular disease, diabetes, inflammation, and asthma, among others. Entelos partners with pharmaceutical companies and other research organizations to identify and test new compounds. The goal is to use computational simulations to reduce the very high cost, long cycle times, and high failure rates of conventional laboratory research in the pharmaceutical industry. One collaboration on a diabetes drug between Entelos and Johnson & Johnson, for example, led to a 40 percent reduction in time and a 66 percent reduction in the number of patients necessary in an early-phase clinical trial.19

Of course, R&D today involves not only product innovation but also innovation in other domains: processes and operations; business models; customer innovations such as marketing, sales, and service; and new management approaches. In a very important sense, the idea behind this book is that each area in which an organization does business can be one in which R&D is conducted. In the previous chapter, we noted how Capital One identifies new offerings through its test-and-learn market research approach. At Internet-oriented firms such as Amazon.com, Yahoo!, and Google, every change to a Web page is treated as a small R&D project. What’s the baseline case for measures such as page views, time spent on the site, and click-throughs? How does the change work on a small scale? How does it work when it is scaled more broadly? This test-and-learn approach to operational R&D is just as important as R&D on products.

Operational, business-model R&D also doesn’t have to involve the Internet. Harvard professor Stefan Thomke has described, for example, the experimental approaches that Bank of America has adopted for customer service innovations in its retail branches.20 Just as if it were developing new products in the laboratory, the bank treats each change in its service processes as an experiment and applies such approaches to experimental rigor as control groups, pilot studies, and statistical analysis.

In health care, research on service operations means finding better evidence-based medicine and treatment strategies for specific diseases. Despite the seemingly scientific nature of medicine, several studies suggest that only one-quarter to one-third of medical decisions are based on science. A growing industry of health care providers, insurance companies, and third-party data and analytical service providers is working to make health care more efficient and effective through analytics.

One increasingly common approach, for example, is to predict the likelihood that members of a health plan will develop higher risk for more severe disease over time. Healthways is one company that works with insurers to make such predictions and to identify ways to improve health outcomes and thereby reduce the likely cost to the insurer. Healthways uses members’ demographic, claims, prescription, and lab data to predict (using artificial intelligence neural network technology) which members will be at highest risk for the greatest total medical expenditures over the next year. Healthways employs more than fifteen hundred registered nurses who then provide telephonic and direct-mail interventions from one of its ten call centers nationwide to help members develop healthy behaviors that reduce the severity of the disease, improve outcomes, and reduce the cost to the health plan. This approach to risk management can also reduce ongoing health maintenance costs and lower the risk of disease recurrence.21
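The scoring idea can be sketched in miniature. The passage says Healthways uses neural networks; here a simple logistic score stands in, with invented weights and features purely for illustration—the point is only that member attributes are converted into a ranked risk list for nurse outreach.

```python
from math import exp

def risk_score(age, claims_last_year, chronic_rx_count):
    """Toy logistic risk score; all weights are invented for illustration."""
    w0, w_age, w_claims, w_rx = -6.0, 0.05, 0.3, 0.8
    linear = w0 + w_age * age + w_claims * claims_last_year + w_rx * chronic_rx_count
    return 1 / (1 + exp(-linear))  # probability-like score in (0, 1)

# Rank hypothetical members from highest to lowest predicted risk
members = [("M1", 34, 1, 0), ("M2", 67, 6, 3)]
ranked = sorted(members, key=lambda m: -risk_score(*m[1:]))
print([m[0] for m in ranked])  # → ['M2', 'M1']
```

A production model would be trained on historical claims rather than hand-set weights, but the output is the same in kind: a prioritized list of members most likely to incur high future costs.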

Other approaches to evidence-based health care involve developing appropriate treatment protocols for patients in health care facilities. Partners HealthCare in Boston, for example, has several initiatives under way to improve health care outcomes through protocols and “clinical decision support” for physicians and other care providers. Partners’ first efforts were primarily around drug prescriptions because they are the least ambiguous approach to clinical care, but in several disease areas the organization has also developed a set of care protocols. Many of the clinical decision support “rules” are embedded in systems that are used when physicians input an order for a patient.22 Other steps in evidence-based medicine for Partners include monitoring of patient data and events, making the right decisions the easiest decisions for caregivers, and predictive modeling of treatment and medical performance.

Human Resource Analytics

The final internal analytical application we’ll discuss in this chapter is in human resources. As with other parts of organizations, the tools for employing analytics in HR are becoming widely available. Most large organizations now have in place human resource information systems (HRIS), which record basic HR transactions such as hiring date, compensation, promotions, and performance ratings. Some go well beyond that level, and record skill levels on a variety of aptitudes and learning programs undertaken to improve those skills. Companies increasingly have the ability to relate their investments in human capital to their returns on financial capital. Whether they have the desire, however, is another question. People may be “our most important asset,” and even our most expensive asset, but they are rarely our most measured asset. Many companies may be beginning to employ HR analytics, but they are hardly competing on them.

The most conspicuous exception, of course, is in professional sports. Baseball, football, basketball, and soccer teams (at least outside the United States) pay high salaries to their players and have little other than those players to help them compete. Many successful teams are taking innovative approaches to the measurement of player abilities and the selection of players for teams. We’ve already talked about the analytical approach to player evaluation in baseball that was well described in Michael Lewis’s Moneyball. In American professional football, the team that most exemplifies analytical HR is the New England Patriots, which has won three of the last five Super Bowls.

The Patriots take a decidedly different approach to HR than other teams in the National Football League (NFL). They don’t use the same scouting services that other teams employ. They evaluate college players at the smallest, most obscure schools. They evaluate potential draft selections on the basis of criteria that other teams don’t use—intelligence, for example, and a low level of egotism. As coach Bill Belichick puts it: “When you bring a player onto a team, you bring all that comes with him. You bring his attitude, his speed, strength, mental toughness, quickness. Our evaluation is comprehensive. Scott Pioli [vice president of player personnel] and the scouting department do a great job of getting a lot of details by going into the history of a player, his mental and physical makeup as well as attitude and character. He eventually receives one grade and that establishes his overall value to the team.”23

Belichick often refers to the nonphysical attributes of players as “intangibles,” and he is perfectly comfortable discussing them with players and media representatives.

The Patriots manage the potential player data in a Draft Decision Support System, which is updated daily with new reports from scouts. Cross-checkers at the team’s offices check the scout ranking by comparing West Coast scout ratings with similar East Coast ratings (i.e., does the 6.2 UCLA player compare with the 6.2 Georgia Tech player?). No detail that can provide an edge is overlooked.

We don’t know of any businesses that compete on HR analytics to this degree (other than perhaps Capital One, which we discuss later in the chapter), but the phenomenon is beginning to emerge. Companies are measuring more consistently across global HR processes and putting the information in systems. There are varying approaches to quantitative HR, including 360-degree evaluations, forced rankings, predictions about attrition, and so forth. None of these approach “rocket science,” but they do connote a much more disciplined and methodical approach. At American Express, for example, which has employees in eighty-three countries, an HR executive in Asia commented, “Everything that we touch has a metric attached. Whatever we do we use globally consistent processes, measurements and databases. We do things in a methodical, thought out, consistent manner, and we have consistent platforms.”24

Such global discipline is a prerequisite for serious HR analytics, and it’s not hard to imagine that HR metrics will be used more for prediction and optimization in the future.

One organization taking HR analytics seriously is Sprint, the Fortune 50 wireless telecommunications company. The human resources team at Sprint discovered that employee relationships follow a predictable life cycle that is very similar to that of their customer relationships. Chad Jones, the vice president of customer management, worked with the HR team to translate Sprint’s six-stage customer relationship life cycle into a series of similar stages and questions for HR, such as:

How do employees learn about Sprint?

How do we make sure we find and hire the most qualified candidates?

How are we going to get the employee up and running and get them productive?

How are we going to get them their first paycheck and make sure they are happy with that first paycheck?

How are we going to intervene when the employee is unhappy?

And how are we going to get the employee recommitted year after year to the business?

Sprint attempts to measure as many of these issues as possible for its employees and has found that the insights gained from this analysis help the company optimize each stage of its relationship with employees and promote a motivated, positive workforce.

The shift to “talent management” is also driving the move toward HR analytics. One manufacturing company we interviewed, for example, has developed a “talent management index” from four proprietary measures, and it uses the index to evaluate how each organizational unit manages its investments in human capital. Goldman Sachs, which, like professional sports teams, pays its employees extremely well, is beginning to apply analytical approaches to its workforce. General Electric, Accenture, and Procter & Gamble seek quantitative reasoning abilities in potential recruits. Harrah’s, another analytical competitor in general, also uses HR analytics extensively in the recruiting process.

Another factor driving the shift to HR analytics is increasing rigor in staffing and recruiting processes. Companies are increasingly viewing these processes as activities that can be measured and improved; people are almost becoming just another supply chain resource. This is particularly noticeable in relationships with external staffing firms. Apex Systems, a fast-growing ($300 million in 2006 revenues) IT staffing firm based in Richmond, Virginia, has observed a long-term move among its clients toward more rigorous staffing processes and metrics, and it’s trying to stay ahead of the trend by adopting more analytical measures and management approaches in its own business. Apex looks at a variety of staffing metrics, including the following:

  • Time to respond with a first, second, and third qualified candidate
  • Number of candidates a customer sees
  • Frequency of payroll errors and customer invoice defects
  • Speed of customer issue resolution
  • Overall customer satisfaction levels

Apex’s customers are increasingly using vendor management system (VMS) software to track the efficiency and effectiveness of staffing processes, so the company needs to establish and understand its own analytics to stay ahead of demand.

If there is any company that rivals the New England Patriots for HR analytics in the business world, it’s Capital One. The company uses analytics extensively in the hiring process, requiring potential employees at all levels to take a variety of tests that measure their analytical abilities. The company uses mathematical case interviewing, a variety of behavioral and attitudinal tests, and multiple rounds of interviewing to ensure that it gets the people it wants. The process applies at all levels—even to the senior vice presidents who head business functions. For example:

When Dennis Liberson flew into Washington to interview for the top human resources position at Capital One Financial Corp., he was told that his interviews with the 16 top executives would have to wait. First, he was whisked off to a hotel room to take an algebra exam and write a business plan. Liberson, who jokes nearly seven years later that he might have been “the only HR guy who could pass their math test,” got the job and is now one of the company’s executive vice presidents. He also got an early taste of Capital One’s obsession with testing.25

Liberson is no longer the head of HR, but the focus on testing remains. A candidate for a managerial role at Capital One, for example, is still asked to review a set of financial statements and charts for a publishing company and then answer questions like these:

What was the ratio of sales revenue to distribution costs for science books in 2002? (Round to nearest whole number):

A. 27 to 1     B. 53 to 1      C. 39 to 1      D. 4 to 1      E. Cannot say

Even the most senior executive need not apply without some quantitative skills.

Overall, however, other than these few companies and professional sports teams, few organizations are truly competing on HR analytics. Perhaps this emphasis will come with time, but what seems to be lacking most is the management desire to compete on HR in the first place. Perhaps as people costs continue to rise and constitute higher percentages of organizations’ total costs, and as executives realize that their people are truly their most critical resource, analytics will catch on and proliferate in HR.

This chapter has considered a wide variety of internal applications of analytics. In each case, our objective was to illustrate not just that analytics are possible in a particular function but that they can be the basis for a different approach to competition and strategy. We hope these examples will drive senior executives to think about their own strategies and how they perform their internal activities. The next chapter, which addresses the use of analytics in external (e.g., customer and supplier) relationships, offers even more possibilities for competition.
