5
Applying the FAROUT Method

Choosing from an increasing array of techniques can be one of the most difficult tasks facing an analyst. Research on the use of conceptual tools and techniques suggests that executives strongly believe that those who use the right tools will succeed, while simultaneously recognizing that most tools over-promise and under-deliver results.1 We think one reason for this is that the tools analysts select and use are frequently ill-suited to their needs.

There are hundreds of analytical tools and techniques identified in the literature.2 Some techniques are mentioned only briefly, while others are treated in book-length treatises. Many more are applied in unique or proprietary ways by analysts in addressing their challenges. We are not surprised that "tools" are seen as over-ambitious and under-performing. The well-worn analogy that a worker with only a hammer sees every task as simply a decision on how many nails to use holds true in the case of competitive analysis. We might also suggest that just because a worker has hundreds of different tools, it does not mean they can all be applied effectively. Most specialists have a handful that they prefer and can apply most effectively. Fortunately, we believe that there are some considerations analysts can make to lessen the likelihood of using the wrong tool for the job.

Techniques have an important role to play in the larger competitive and strategic analysis process. Individuals who study for a Master of Business Administration degree (MBA) or other higher-level business program will be exposed to some of these techniques in their marketing or strategy courses. Among the benefits that analysts and their organizations gain from using techniques correctly are the following:

  1. Provides greater understanding of relationships and situations—Virtually every technique and combination of techniques requires the analyst to ask numerous questions, including "what?" "how?" "when?" "who?" "where?" and, most importantly, "why?" These questions lessen the likelihood that the analyst will miss or overlook important facets of the analysis being undertaken.
  2. Focuses the analyst initially on data and facts—Most techniques require data and facts first and discourage the use of unqualified opinions, beliefs, rumors, or feelings. Although some techniques are highly qualitative and others highly quantitative, nearly all of them require the analyst to maintain a keen understanding of the soundness of the data input.
  3. Guides efficient data collection efforts—Once the questions to be answered are agreed upon, the analyst can consider which techniques will be of most use. This then drives the data collection effort and lessens the likelihood that time and resources will be spent collecting unnecessary or redundant information.
  4. Encourages analysts to be rigorous—Most techniques compel analysts to consider a wider and deeper range of possibilities than they would otherwise consider on their own. This is exemplified by the processes we outline in this book, which can require multiple steps and, in turn, many checks and balances between those steps. Hasty analyses, badly organized information, and reliance on only the most convenient data invariably lead to dissatisfied decision makers and short careers for the analysts who operate in this fashion.
  5. Forces analysts to think critically—Analysts should consider the benefits and limitations inherent in looking at data and information in specific ways. Most of the techniques presented here, along with the information we urge analysts to consider, should help them to prepare defensible and well-reasoned insights that will stand up to the critical scrutiny of demanding decision makers. Many of these techniques are subsequently modified for particular proprietary applications in enterprises, which often then become part of the enterprise's analytical repertoire.
  6. Promotes a proactive attitude toward analysis—Most techniques require the analyst to consider the options and think through the relative value of each before use. Using selection criteria to choose the best technique(s) for a particular challenge causes analysts to think ahead about the data they will need to apply each technique and the type of outcome each will deliver. This helps to determine their suitability for addressing the question(s) being asked.

Studies of the use of competitive analysis tools and techniques have demonstrated the extent of their use, as well as perceived judgments of their effectiveness. According to A. Gib and R. Gooding's survey of competitive analysts, the most-utilized tools included competitor profiling, product/market analysis, industry analysis, qualitative research methods, and customer satisfaction surveys. These tools all tended to be rated highly in terms of their perceived effectiveness, along with management profiling. The least-utilized tools included spire analysis, dialectic inquiry/devil's advocacy, gaming theory, force field analysis, and experience curve analysis.

As advisors and consultants in the area of competitive and strategic analysis, we are not surprised by the finding that a tool's extensive use and its perceived effectiveness would be highly correlated. Analysts will use tools they perceive to be effective and will shy away from ones they perceive to be ineffective. Having said that, we have no knowledge of how well trained the analysts were in applying the thirty tools that were rated, the nature of the questions or topics that their decision-making clients had asked them to address, the context in which they applied the tools, or the quality of the data/information used in employing the techniques.

Applying the Techniques

There is a process to properly identify analysis techniques, and analysts should think through a series of questions before they make their choice:

  1. What is the full range of techniques that can be used to respond to the question asked?
  2. What is the focus and scope of the competitive phenomenon being analyzed?
  3. What are the constraints—personal, informational, organizational, resources, and contextual—that might affect the analysis process?

Every technique we detail in this book delivers certain things very well, but most also have drawbacks of which the astute analyst must be aware, and we draw attention to these individually.

In the "Background" section of each technique, we provide our readers with the context in which the tool was originally developed. A good number of the tools presented in this and our prior book3 were developed decades ago by individuals who recognized the problems of their time and sought a conceptual and/or methodological means of solving them. These tools often became popularized through a management guru or large consultancy practice.4 Many of them have been used, taught, and improved through the years to the point where they are now viewed as standard models for use by an analyst. These tools are also the most likely to have been customized or adapted for use by an enterprise, especially as they are more likely to be well understood and regularly employed in analysis work.

Other techniques have been developed more recently, often in response to newly encountered phenomena. These tools have not had the benefit of decades of critical scholarly scrutiny or improvement through practice and teaching, and they may still evolve dramatically. This does not make them more or less useful to the analyst, and time will be the best arbiter of whether they too become part of the standard competencies of the competitive analyst.

We try to point out the "strategic rationale" for the tool's development, and we demonstrate the important links the tool has to other strategic concepts. We also suggest what implications the analyst's application of the tool and the results that are generated from its application will have for the specific decisions being made.

In each section, we look at both the "strengths and advantages" and "weaknesses and limitations" of each tool identified. We caution readers against reading too much into the length or the number of strengths or weaknesses identified. It was not our goal to be exhaustive but to highlight the more prominent items that the analyst must consider. Readers should critically consider each of the points made in this section independently and factor them into their application of the tool. This will help to guide the level of confidence by which they communicate their findings.

Very few questions can be satisfactorily answered through the application of a single analytical technique. If it were that easy, there would be no need for this book. Most strategic, business, and competitive questions are complex, dynamic, and cognitively challenging. This requires the analyst to identify the sequence and range of techniques to be applied; different sequences can yield different answers, and some will be more efficient and effective than others. The analyst also needs to identify the nature and range of data that is already available or can be acquired before making a decision. Some techniques will have to be discarded simply because the required information cannot be obtained in a timely manner.

An Evaluation Scheme for Assessing the Adequacy of Tools and Techniques: FAROUT

Any form of intelligence generated must ultimately satisfy a decision maker and the organization's needs. An effective analyst needs to know how the intelligence generated by the application of a technique will be used. Although these principles may appear simple to apply on the surface, there are a variety of objective considerations that make the execution of an analyst's responsibilities far more difficult in practice.

After years of conducting strategic and competitive analysis projects, some authors realized that there were a limited number of key considerations common to all high-value analysis.5

A unique concept we have developed for analysts is the FAROUT approach. The FAROUT profile we provide for every technique can help analysts choose from the techniques detailed in this book. Applying FAROUT to the selection process will support analysts in selecting the particular technique, or combination of techniques, best suited to their unique situations. Having said that, we want to be clear that FAROUT is only a guide and, as such, is only one input among many that an analyst needs to consider in developing their craft. It is designed to help analysts recognize which techniques are appropriate for any given situation.

FAROUT is based on the premise that for analytical output to be insightful, intelligent, and valuable to business decision makers, it needs to meet a number of common characteristics. The output needs to be

Future-oriented
Accurate
Resource efficient
Objective
Useful and
Timely

Failure to meet all these criteria to a satisfactory level will result in the analytical output being less than optimal and of lesser value to the decision maker.

The six components of FAROUT are described next.

Future-orientation: Relying on the past as a predictor of the future can be dangerous. This is particularly true when innovation, science, and technology factors can quickly disrupt a market, as has been very evident in the rapid adoption and development of e-commerce, which has caused the disintermediation of entire industries. By definition, the intelligence resulting from an analyst's work must be future-oriented, looking both deeply and broadly at what might happen. Analysts must be willing to take some risk by being both inventive and predictive. Early warning, foresight, prescience, or prevision cannot be adequately generated from historical data focused entirely on the past. The better analytical methods for developing intelligence are indeed future-oriented.

Accuracy: The effective analyst should develop outputs that aim for high levels of accuracy. This means that the insights gained are precise. Accuracy also means that the analyst's insights are as closely matched as possible to the actual conditions of the phenomena being analyzed. High levels of accuracy are difficult to attain in practice when the data underlying the analysis

  • has come from only one source;
  • has not been cross-validated against both hard and soft information;
  • is collected under time constraints that restrict the comprehensiveness of the collection process;
  • needs to be converted from sources in ways that it was not originally designed for; and/or
  • comes from sources filled with high levels of bias in the first place.

Although achieving the highest levels of accuracy is theoretically desirable, the analyst usually has to trade accuracy off against other conceptual and pragmatic considerations. Experienced practitioners have suggested that in a good proportion of competitive marketplace contexts, accuracy or precision may often be much less important than developing an enhanced understanding or perspective.

Resource efficiency: To produce effective analysis, data needs to come from sources that cost less than the resultant output is worth. This calculation rests on the marginal value of gathering the additional information required: at the margin, the subsequent use of a tool is valuable to the extent that it increases the value of the insights by more than the resources the enterprise's analyst expends in applying it. Executives commonly lament that their organizations gather enormous amounts of data on the premise that it may eventually be needed. Their experience suggests that much, if not most, of that data will lie dormant for years inside contemporary, high-volume digital storage devices. Although an Internet search may conveniently produce volumes of apparent "hits," one phone call to a well-placed and knowledgeable contact would most likely produce far superior information in a fraction of the time.

Objectivity: The application of a given method affects the degree of bias held by the analyst, analyst groups, and/or organizations.6 Too many otherwise "good" analyses are clouded by cognitive or social biases, ranging from prior hypothesis bias through recency and availability biases to groupthink, all of which provide comfort in dealing with risk or uncertainty.7 To minimize the potentially destructive nature of these common biases, analysts should view data and information in a dispassionate, impersonal, rational, and systematic manner. In other words, objectivity helps to minimize the destructive potential of analytical and decision-oriented biases. It is also essential for analysts to recognize and avoid the selective use of facts to support pre-ordained or desired conclusions. Experienced analysts recognize that delivering bad news is just as much a part of their job as delivering good news, and they will tackle each with equal skill and objectivity.

Usefulness: Almost by definition, valuable analytical outputs will meet the critical intelligence needs specified by a particular decision maker. The output of some techniques and models can be quickly understood, whereas others may be less easily digested and can require the decision maker to further engage with the analyst before they are confident of making decisions. It helps if the analyst and the decision maker can design a process that helps each of them to develop a clear understanding of the problems and intelligence needs, as well as a deep and broad appreciation of the decision context. This understanding does not always come easily, and for many, will only emerge over a lengthy period of time, which engenders trust and respect for the tasks and responsibilities of each. Ultimately, effective analysts strive to produce outputs that meet, or surpass, the requirements of the decision maker.

Timeliness: Strategic business information or competitive data frequently has a limited "shelf life," especially where decisions are being made in dynamic, hyper-competitive, or turbulent environmental contexts. Consequently, raw data loses its value the longer it remains excluded from the decisions underlying organizational action. Some methods of analysis may provide the intelligence clients require for their decisions but take far too long to develop. This can happen when the data must be subjected to multiple phases of analysis, when a volume or variety of data that is not quickly accessible must be gathered, or when less readily available individuals must be brought into the process. On the other hand, some methods of analysis may require little time to perform but fail to deliver the other required features of objectivity, accuracy, usefulness, and resource efficiency. Valuable analysis will provide decision makers with enough time to implement the course of action it recommends.

Using the FAROUT Rating System

Managing the analysis of business and competitive data is a challenging task, and we are not aware of any Analysis for Dummies books or magic software that can replace an analyst who knows how to employ a good balance of both science and creativity. We do know from experience that good analytical output is highly unlikely to be based on just one analytical method or tool. Rather, a combination of several techniques will be required.

Each analytical method has unique limitations, and these limitations multiply when placed in specific organizational contexts. The FAROUT system will enable the analyst to mix the appropriate tools to be applied in analysis tasks so as to maximize the insights generated for their decision makers. It is our view that the more successful analysts recognize and are sensitive to the limitations associated with any particular analytical method or technique. The sensitized analyst can address these issues throughout the whole of the competitive and strategy analysis process to overcome such limitations.

We utilize a five-point rating scale to assess each analytical technique contained in Part Two. The five-point scale ranges from low (1) to high (5) and is expanded in Table 5.1. Every technique is assessed against the six FAROUT elements. Our objective in offering the FAROUT framework is to assist analysts in assessing the outputs of different analytical methods to ensure high intelligence value. If the analysis delivers on all six characteristics, analysts and decision makers can be reasonably confident that the output will make a difference. All the techniques and their ratings are summarized for easy reference in Table 5.2.
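To make the rating scheme concrete, the sketch below shows one simple way an analyst might record and compare FAROUT profiles in code. This is our own illustration, not part of the book's method: the technique names and scores are hypothetical placeholders, and only the six elements and the 1 (low) to 5 (high) scale come from the scheme described above.

```python
# Minimal sketch of a FAROUT rating table. Element scores use the
# five-point scale: 1 (low) to 5 (high). The technique names and
# scores below are illustrative placeholders, not the book's ratings.

FAROUT_ELEMENTS = ("future_orientation", "accuracy", "resource_efficiency",
                   "objectivity", "usefulness", "timeliness")

ratings = {
    "Technique A": {"future_orientation": 4, "accuracy": 3,
                    "resource_efficiency": 2, "objectivity": 3,
                    "usefulness": 5, "timeliness": 4},
    "Technique B": {"future_orientation": 2, "accuracy": 5,
                    "resource_efficiency": 4, "objectivity": 4,
                    "usefulness": 3, "timeliness": 2},
}

def weak_elements(profile, threshold=3):
    """Return the FAROUT elements scoring below the threshold."""
    return [e for e in FAROUT_ELEMENTS if profile[e] < threshold]

def total_score(profile):
    """Sum of the six element ratings (possible range: 6 to 30)."""
    return sum(profile[e] for e in FAROUT_ELEMENTS)

for name, profile in ratings.items():
    print(name, total_score(profile), weak_elements(profile))
```

A layout like this makes it easy to see where a candidate technique falls short on one or more elements, which is the kind of comparison the summary table is intended to support.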

Table 5.1
Assessment of Analysis Techniques Using the FAROUT Scheme

[Table 5.1 image not reproduced]

Table 5.2
FAROUT Summary of Methods

[Table 5.2 image not reproduced]

References

Clark, D.N. (1997). "Strategic management tool usage: A comparative study," Strategic Change, 6(7), 417–427.

Fahey, L. (1999). Competitors: Outwitting, Outmaneuvering, and Outperforming. New York, NY: John Wiley & Sons.

Fleisher, C.S., and B.E. Bensoussan (2003). Strategic and Competitive Analysis: Methods and Techniques for Analyzing Business Competition. Upper Saddle River, NJ: Prentice Hall.

Gib, A., and R. Gooding (1998). "CI tool time: What's missing from your toolbag?," pp. 25–39 in the Proceedings of the 1998 international conference of the Society of Competitive Intelligence Professionals, Chicago, IL.

Harris, S.G. (1994). "Organizational culture and individual sense making: A schema-based perspective," Organization Science, 5(3), 309–321.

Hawkins, S., and R. Hastie (1990). "Hindsight: Biased judgments of past events after the outcomes are known," Psychological Bulletin, 107(3), 311–327.

Hogarth, R.M. (1980). Judgment and Choice: The Psychology of Decision. New York, NY: John Wiley & Sons.

Hogarth, R.M., and S. Makridakis (1981). "Forecasting and planning: An evaluation," Management Science, 27(2), 115–138.

Mathey, C.J. (1990). "Competitive analysis mapping," Competitive Intelligence Review, 1(2), Fall, 16–17.

McGonagle, J.J. (2004). "Analytical techniques," Competitive Intelligence Magazine, 7(4), 51, 54.

Prescott, J.E. (1986). "A process for applying analytic models in competitive analysis," pp. 222–251 in King, W. and D. Cleland [eds.], Strategic Planning and Management Handbook. New York, NY: Van Nostrand Reinhold and Company.

Rigby, D.K. (2003). Management Tools 2003. White Paper. Boston, MA: Bain & Company, Inc.

Rigby, D.K. (2001). "Putting the tools to the test: Senior executives rate 25 top management tools," Strategy and Leadership, 29(3), 4–12.

Sandman, M.A. (2000). "Analytical models and techniques," pp. 69–98 in Miller, J. [ed.], Millennium Intelligence: Understanding and Conducting Intelligence in the Digital Age. Medford, NJ: Information Today.

Webster, J., Reif, W.E., and J.S. Bracker (1989). "The manager's guide to strategic planning tools and techniques," Planning Review, 17(6), Nov/Dec., 4–13.

Endnotes

1 Rigby, 2001.

2 For example—Clark, 1997; Fahey, 1999; Fleisher and Bensoussan, 2003; Mathey, 1990; McGonagle, 2004; Prescott, 1986; Sandman, 2000; Webster et al., 1989.

3 See Fleisher and Bensoussan, 2003.

4 Rigby, 2003.

5 Fleisher and Bensoussan, 2003.

6 Harris, 1994; Hawkins and Hastie, 1990.

7 Hogarth, 1980; Hogarth and Makridakis, 1981.
