8

Avoid Predictable Surprises

IT HAD BEEN two years since Carrie Brice’s agency had been selected to receive the National Academy of Public Administration’s Award for Excellence, one of just ten government organizations to be so honored. Since then she had participated in panel discussions and appeared as a conference speaker on the subject of “making government work” on several occasions. Inside the agency, an agricultural support services organization, the award was heavily promoted by senior management as evidence of the workforce’s talent and dedication in carrying out reforms begun several years earlier. As the deputy director for field operations, Carrie had received special recognition for her role in the process. She had led a comprehensive field office restructuring that seemed on track to vastly improve service delivery. She was considered the prime candidate to replace the agency head, who was planning retirement within the year.

It therefore came as a shock when the director called her at home late one evening to inform her that six employees at the agency’s Phoenix supply center had been arrested by local police and the FBI and charged with dealing in narcotics and stolen government property. After a sleepless night, Carrie arrived at her office at six o’clock in the morning and immediately began going through the audit and management review reports covering the Phoenix center’s operations, as well as the regular monthly reports she received from the center’s manager. Nothing in these reports gave even the slightest hint of irregularities in center operations.

As Carrie’s shock morphed into anger, she was determined to get to the bottom of this costly and acutely embarrassing event. It had dangerous ramifications not only for the agency’s reputation but also for her personal career aspirations. The arrests generated interest from the media, which attracted the attention of several members of Congress who served on the agency’s appropriations committee. The department-level inspector general had no choice but to launch his own independent investigation of the Phoenix center.

Knowing that the agency would be dealing with the fallout from this for quite some time, Carrie was determined to get out in front of the crisis. She appointed a small investigative task force composed of her immediate and most trusted assistant, the agency’s chief counsel, the head of human resources, and the head of security. She charged it with answering this basic question: why had there been no early warning of the situation?

As the external and internal investigations proceeded, disturbing answers began to emerge. It became apparent that the agency’s pride in its success, and its promotion of its reputation for excellence among its employees, had created a climate of self-congratulation. On the one hand, this helped reward the workforce for a difficult job well done. On the other hand, it sent a subtle and inadvertent message that any information contradicting this positive image would not be welcome. First-line supervisors and middle managers, who were in the best sentry positions to spot irregularities, operated under the unspoken assumption that to deliver bad news was to incur disapproval from on high. So when confronted with suspicious circumstances, they tended to look the other way in the hope that what they suspected would never materialize into a real problem.

With only good news moving up the reporting chain, everyone, especially the supervisors and middle managers involved, was kept happy. The cost of this stifling filtration of critical information became painfully clear to Carrie and the other agency leaders as they digested the various investigative reports they received. In the senior managers’ well-intended efforts to motivate and reward the workforce, as well as promote the agency’s reputation for excellence among its peers and external stakeholders, they neglected a cardinal rule of preventing a predictable surprise such as the one at the Phoenix supply center: bad news is usually much more valuable than good news and is stifled at great risk.

Defining Predictable Surprises

A predictable surprise is an event or series of events that takes an individual or organization by surprise, despite prior awareness or availability of all the information necessary to anticipate the events and their possible consequences.1 If you do not seek to identify, evaluate, and mitigate such ticking time bombs, all your efforts to plan for a successful transition and to lay the foundation for longer-term success could be for naught—because when those bombs do explode, all your energy will go into firefighting. Your hopes for systematically getting established and building momentum will be dashed.

Of course, true surprises really do happen. And when they do, you simply must confront the consequences and surmount the resulting crisis as best you can. But far more often, new leaders are taken off track by surprises that shouldn’t have been surprising at all, had the warning signs been known and heeded. This often happens because the new leader, like Carrie Brice and her associates, simply doesn’t look in the right places or ask the right questions.

We all have preferences about the types of problems we like to work on and those we prefer to avoid or don’t feel competent to address. But as a new leader, you will have to discipline yourself either to dig into areas in which you are not comfortable or interested, or to find trustworthy people with the necessary expertise to do so.

In the complex operating environment of most government organizations, surprises can come from such external sources as political shifts, trends in public opinion, catastrophic national events, and quickly changing economic conditions—and from such internal sources as information system crashes, key personnel losses, individual improprieties of various kinds, major product or service quality failures, and organizational political intrigue. In either case, it is essential to regularly gather as much information as possible about those areas that pose the greatest threats. Otherwise, you may find yourself facing the very unpleasant task of dealing with a predictable surprise.

Sources of Vulnerability

What renders organizations vulnerable to being predictably surprised? Organizations become vulnerable when they lack the capacity either to (1) sense and respond to emerging threats in a timely manner or (2) learn from experience and disseminate the resulting lessons learned to the right people and places. In the former case, organizations—like Carrie’s—fail to see emerging threats or are unable to mount an effective response in time. In the latter case, organizations squander opportunities to learn from experience—good and bad—and so are doomed to repeat the same mistakes. We term these, respectively, sense-and-respond failures and learn-and-disseminate failures.

How do organizations avoid falling prey to these failures? They put in place processes explicitly designed to increase the organization’s capacity to sense and respond and to learn and disseminate. This is illustrated in figure 8-1. At the top of the figure is the sense-and-respond (SR) loop, which consists of processes to recognize emerging threats, establish priorities, and mobilize an effective response. At the bottom of the figure is the learn-and-disseminate (LD) loop, which consists of processes to learn from experience, to embed the resulting insights into relevant parts (people and processes) of the organization, and to prevent people from forgetting. The SR loop and the LD loop both draw upon and augment organizational capabilities.

To avoid predictable surprises, the SR loop and the LD loop must operate efficiently (fast enough) and effectively (focusing attention and resources in the right ways). Each loop is necessary; neither one on its own is sufficient to protect the organization. An organization with an effective SR loop, for example, responds well to emerging threats with which it has some previous experience. But if it lacks an LD loop, it cannot learn from sense-and-respond failures and do better the next time. Likewise, the capacity to learn from failures isn’t worth much without the capacity to sense and respond to emerging threats.

FIGURE 8-1

SR and LD loops


Sense-and-Respond Failures

As illustrated in figure 8-1, the SR loop consists of three key subprocesses—recognition, prioritization, and mobilization. A failure in any of these can leave the organization vulnerable to being predictably surprised.

Recognition Failures

Some disasters can’t be foreseen. No one, for instance, could have predicted in the 1960s that HIV would jump the species barrier from monkeys to infect humans on such a vast scale. But many of the disasters that take organizations by surprise could and should be recognized in advance, because they happen for predictable reasons. These reasons include:

Preconceived Notions. Cognitive biases—systematic weaknesses in the way people observe events and make decisions—may blind individuals and organizations to emerging threats. Preconceived ideas about what is “possible” or “impossible,” for example, can cause a leader to focus attention on certain types of problems while inadvertently allowing more serious ones to develop almost in plain sight. These recognition failures occur when leaders discount or ignore evidence that does not fit with their beliefs. Any time you hear someone say, “That’s impossible,” a warning bell should sound.

Confirmation Bias. A related problem arises when multiple, competing sources of information and analysis exist within an organization—one as large as the U.S. government or as small as a local public-service agency. When this is the case, some leaders will gravitate to assessments that confirm what they want to hear and tune out dissonant views. Media, congressional, and special-commission scrutiny of intelligence gathering and analysis before 9/11 and the Iraq invasion, for example, demonstrated how pervasive and crippling this kind of self-censorship was among those charged with identifying potential threats to national security.

This phenomenon is evident at all layers of an organization: to win bureaucratic wars and retain influence with key decision makers, some leaders, like those in the first-line and middle-management ranks of Carrie’s agency, quickly learn to tell those in power only what they want to hear. This failure can be reinforced by the complacency that sets in when, despite mounting evidence, leaders believe that because a problem hasn’t happened before, it probably won’t happen now.

Inoculation. When the signals associated with a threat are masked by a high level of background noise, the result can be false alarms that “inoculate” leaders, making them resistant to seeing truly serious problems. By signals we mean clues, indications, and other evidence of serious vulnerability to an impending problem. Noise refers to conflicting information that points to other problems or presents more benign explanations for the threat. When the signal-to-noise ratio is low (i.e., when there are relatively few signals and a lot of noise), it becomes very difficult for even the most aware leader to distinguish genuine threats from false indications.

The signal-to-noise problem is further compounded by the organization’s response to previous false alarms. Analysts tend to err on the side of caution—better to be chastised for being too cautious than for being too optimistic. This tendency among analysts may cause multiple false alarms and create “crisis fatigue” among leaders who grow leery of repeated warnings of problems that never seem to materialize. But when those responsible for scanning the environment allow themselves to drift from being overreactive to being underreactive, the organization as a whole may drift into perilous waters and become vulnerable to a true threat when one inevitably emerges.
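
A simple illustration, with deliberately hypothetical numbers, shows how quickly this dynamic compounds. Suppose a monitoring process flags one hundred anomalies a year, and experience shows that only two of them ever reflect genuine problems. The other ninety-eight are noise, so an analyst who escalates every flag will be “wrong” ninety-eight times out of a hundred, and leaders who have sat through that many false alarms are primed to shrug off the next flag, which may be the one that matters.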

Silos. Recognition failures also can occur because of the way organizations are structured. Most organizations have distinct silos that contain and vertically move valuable information. But often there are barriers that prevent information from being seen by other parts of the organization. Leaders must constantly make trade-offs between the need to create and protect these deep pools of expertise and information, and the need to integrate and synthesize information across the organization.

The simplest type of integration problem occurs when various members of an organization have pieces of the puzzle, but no one has them all—and, critically, no one knows who knows what. In short, the organization’s knowledge never equals the sum of its members’ knowledge. While various parts of the organization may have all the information necessary to perceive and deter a predictable surprise, no one person in the organization is capable of putting it all together.

In theory, senior management should play the role of synthesizer, compiling the information into the big picture to avoid stovepipe syndrome. But the barriers to this goal are formidable. There is immense pressure within bureaucracies to filter information as it rises through the hierarchy. The temptation to withhold or gloss over sensitive, confusing, or embarrassing information is great. Those at the top inevitably receive incomplete and distorted data, and overload may prevent them from keeping up with incoming information.

Illusory Consensus. Finally, organizations can suffer from illusory consensus, a problem rooted in the twin desires of most bureaucracies to avoid expending energy and incurring blame. It is all too easy, especially for the new leader, to interpret a lack of opposition to an initiative as positive support. Those who harbor doubts may keep quiet because they assume decision makers are armed with better information, or because they want to avoid accountability for mistakes. As soon as a predictable surprise occurs, however, those who were silent suddenly have an incentive to distance themselves from failure by going public with an “I told you so” message.

Illusory consensus is closely related to the concept of groupthink, which describes how members of an organization suppress their critical doubts and allow the false appearance of consensus to emerge. This is how Carrie and her associates got into the danger zone of ignorance without realizing it.

The mirror image of the illusion of consensus is suppressed dissent. Suppressed dissent can arise when one part of the organization is vested with too much responsibility for a particular issue and seeks to retain its primacy. In such situations, other parts of the organization, even those with valuable information or perspectives to add, aren’t consulted or, in the worst case, may be pushed out of the decision-making process. The result is that too narrow a focus is brought to bear on the issue and potential problems go unrecognized or are given too low a priority.

To avoid recognition failures, leaders must strive to mitigate the impact of biases and ensure that organizational resources are appropriately allocated. One way to determine whether a recognition failure occurred is to assess whether leaders marshaled adequate resources to scan the environment for emerging threats. That means determining whether leaders did a reasonable job of directing the organization to gather, integrate, analyze, and interpret available data. Did the leader conduct an ongoing scan of those elements of the external operating environment on which the agency is most dependent or to which it is most vulnerable? Did the leader strive to integrate and analyze information from multiple sources to produce insights that can be acted upon? If the leaders did not do an adequate job, the organization’s systems for recognizing emerging threats must be strengthened and the leaders’ responsibility for crisis avoidance must be clarified and reinforced.

Prioritization Failures

Predictable surprises also occur when threats are recognized, but prevention is not given appropriate priority. Failures of prioritization are commonplace and result from cognitive, organizational, and political factors. Individuals may inappropriately discount the future, organizations may inadequately assess the likelihood of potentially damaging events, and special-interest groups may attempt to distort perceptions of potential costs and benefits to protect their perquisites.

Competing Priorities. How can leaders prioritize emerging threats when they are beset by competing demands on their attention? How can they possibly distinguish the surprise that will happen from the myriad potential surprises that won’t? Of course, they can’t make such distinctions with 100 percent accuracy. Uncertainty always exists—high-probability disasters sometimes do not occur, and low-probability ones sometimes do. Therefore, if an organization undertakes careful cost-benefit analyses and gives priority to those threats that would inflict the highest costs, its leaders should not be held accountable for a failure of prioritization. If the leaders fail to take these steps, they must concentrate on strengthening systems for setting priorities.

Overload. Those responsible for scanning the environment can also suffer from information overload, which keeps them from responding to all serious potential threats. As a result, their efforts either become too diffuse to be useful, or they are forced to ignore lower-priority areas. In either case, the organization risks failing to see an emerging threat until it is too late. Overload can occur when the resources devoted to environmental scanning are insufficient for the volume of information to be processed, or when the range of environmental sources of information increases over time without accompanying increases in scanning resources. Experienced managers will recognize selective attention, noise, and overload as the common state of most organizations. But the critical lesson for new leaders is that a failure to establish an ongoing process of environmental scanning will almost surely at some point result in the unpleasantness of a predictable surprise.

Secrecy. For government organizations engaged in national security, law enforcement, health issues, and the like, secrecy is necessary due to various security classifications and privacy concerns. But often the impulse toward secrecy extends to far less sensitive information because of tradition or misguided coveting of a valuable resource by those who see it as a source of influence. The net result is that important information is not shared internally, and even top leaders can remain uninformed. This trap offers a key lesson for new managers: when you are developing strategy, do not cut yourself off from consultation with those who you believe have potentially valuable input. To ward off a predictable surprise, you must temper your natural optimism with a thorough examination of all potential obstacles.

Conflicts of Interest. Conflicts of interest that result in poor prioritization are a major concern for government organizations. Myriad layers of controls have been implemented to address such conflicts, from annual financial disclosure requirements to regular audits, to rigid ethics rules and criminal statutes governing whom you may work for, and what kind of access you may have to your former employer, after leaving government service. Such transparency is not perfect—there are many loopholes—but overall it serves as a strong deterrent to corruption in carrying out a government leader’s public-service responsibilities.

Discounting the Future. Predictable surprises often play out over time frames substantially longer than the expected tenure of many organizational leaders, especially those who serve in politically appointed positions that oversee the operations of ongoing government functions. This can create a variation of the free-rider problem. “Why,” such a leader might ask, “should I be the one to grapple with this problem and take the heat when nothing is likely to go wrong during my watch? Better to focus on my short-term goals and reap rewards for their attainment.”

Low-Probability Events. A related problem concerns dealing with wild cards—potential problems that have low probabilities but very high costs. If terrorists were to detonate a nuclear device (not necessarily a fission or fusion weapon; a radiological device would do) in a major city, it could do hundreds of billions—even trillions—of dollars of damage to the world economy. In theory, governments should allocate resources to avoiding such a disaster—for example, by helping gather up unsecured nuclear materials in the former Soviet Union or by interdicting global trade in nuclear materials—based on a combined assessment of the likelihood and the cost of such an event. In practice, however, governments tend to underinvest in preventing wild-card events because the best intelligence and technical analyses available deem them unlikely to occur, even though the potential impact would be catastrophic. At the same time, governments often overinvest in politically sensitive but less threatening crises, such as swine flu inoculations.
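
A back-of-the-envelope comparison, using deliberately hypothetical numbers, shows why this pattern is a prioritization failure rather than a sound bet. Consider two threats:

Threat A (salient): a 20 percent annual likelihood of a $10 million loss, for an expected annual cost of $2 million.

Threat B (wild card): a 1 percent annual likelihood of a $1 billion loss, for an expected annual cost of $10 million.

Judged on expected cost (likelihood times consequence), the “unlikely” wild card merits five times the preventive investment of the familiar threat, even though intuition, and often politics, ranks them the other way around.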

To avoid prioritization failures, leaders must strive to employ systematic and disciplined processes for establishing priorities. Tools such as decision analysis and risk analysis can help focus attention on issues that have a low likelihood but high consequences. Leaders must also be systematic in auditing their organization’s incentive systems to ensure that conflicts of interest are not blocking action on emerging threats. Finally, leaders must lead organizational assessment and dialogue processes that focus attention on critical priorities.

Mobilization Failures

When an emerging threat has been determined to have serious potential consequences, leaders must mobilize to prevent it. This means marshaling support, educating important external constituencies, focusing the attention of key people in the organization, and making surprise prevention a personal priority. Organizational and political barriers often impede leaders’ efforts to catalyze an adaptive response to emerging problems. Organizational inertia and complexity often erect major barriers to timely action. The actions of special-interest groups to delay or block action likewise can prevent leaders from addressing emerging threats until they explode into full-blown crises.

Collective Action Problems. One class of incentive failures that arises in all types of organizations and can create predictable surprises is known as collective action problems. For example, an agency’s incentive awards plan may in fact create unhealthy internal competition between functions that draw on the same information resources, perhaps leading one to withhold information from the other. There are also situations in which members of an organization either try to take a “free ride” in the hope that others will assume responsibility for emerging problems, or behave as if someone else were in charge of heading off looming problems. In both cases, no one feels compelled to act. This situation becomes particularly dangerous when organizational members perceive that taking risky preventive action will yield them little reward if they are right and significant penalties if they are wrong.

Special-Interest Groups. Efforts to address pressing problems often yield broad but modest gains for the many, and large, painful costs for special interests. The result? Special interests are strongly motivated to block action, while the many who stand to benefit are harder to energize. Too often, disaster has to strike before the blocking power of special interests can be overcome.

Sometimes it is impossible to overcome opposition, even opposition that is likely to result in disaster. But the leader does have tools at his or her disposal to mobilize support to prevent predictable surprises. The most important of these is the courage to commit him- or herself to addressing looming disasters and the associated willingness to spend political capital to achieve this goal. Leaders also can employ the tools of strategic coalition building to analyze potential support and opposition and to build winning alliances in support of action. If leaders embrace the challenge of mobilization and exert effort commensurate with the risks involved, they should not be held accountable if a surprise occurs. If they fail to take preventive action, they must strengthen their capacity for mobilizing effective responses.

Learn-and-Disseminate Failures

Organizations suffer learn-and-disseminate failures when they fail to focus attention on learning from experience, embedding these lessons within the organization, and preventing their employees from forgetting. Once again, failures can occur in each of these key LD loop subprocesses and contribute to predictable surprises.

Focus Failures

Organizations fail to learn from past mistakes because they lack the mechanisms needed to share and codify, to the greatest extent possible, key lessons learned. Overcoming this tendency means setting up groups to analyze crisis experiences and generate lessons learned. Too often, however, organizations don’t take those steps. Sometimes a leader who simply doesn’t recognize the importance of learning from experience is to blame. More often, however, responsibility lies with the organization, which is caught in a vicious cycle of firefighting and is too busy dealing with current crises to learn from past ones. Such circumstances set the stage for predictable surprises.

To avoid this problem, leaders must find ways to carve out the time and other resources necessary to invest in learning from experience. In a cost-constrained environment, investment in learning is too often the first thing to be jettisoned, so leaders must be prepared to fight hard for the organizational “slack” necessary for effective learning to occur.

Embedding Failures

Even when leaders focus attention on capturing lessons learned, the organization may still fail to disseminate the lessons appropriately, especially to the front lines. As a result, the benefits of experience do not get embedded into the organization’s sense-and-respond systems, setting the stage for a future repetition of problems the organization already has confronted.

To understand why dissemination may not occur, it’s important to distinguish between individual knowledge and relational knowledge, as well as between tacit knowledge and explicit knowledge. The four types of organizational knowledge are summarized in table 8-1.

The implications for the creation and preservation of organizational knowledge are far-reaching:

• Tacit knowledge gained by an individual who has confronted a problem is more difficult for an organization to capture than explicit knowledge. Think of employees responsible for maintaining a complex information system: they come to know all the system’s idiosyncrasies, but that knowledge is very difficult to codify and transmit.

• Relational knowledge gained by a group confronting a problem is more difficult to capture than individual expertise. When faced with a crisis, for example, experienced teams know which members are going to perform which tasks and who is going to react in which ways; they don’t have to consult procedures to mount a quick and effective response. That knowledge is difficult to capture.

• Tacit-relational knowledge—the shared know-how that members of a group possess but cannot easily articulate to others—is the glue that holds the organization together and is by far the most difficult type to preserve.

TABLE 8-1

Types of organizational knowledge

Individual knowledge (possessed by individuals about how to do their jobs)

• Explicit (transferable verbally or through writing): rules; laws; procedures; the “science” of a profession

• Tacit (transferable by being taught or by working with someone who has experience): rules of thumb; techniques; approaches to individual decision making and problem solving; the “art” of a profession

Relational knowledge (how to work effectively as a group)

• Explicit (transferable verbally or through writing): organizational charts; formal decision-making processes; plans for coordination; written communication protocols

• Tacit (transferable by being taught or by working with someone who has experience): approaches to group decision making and problem solving; negotiated divisions of responsibility; key sources of information and influence; trust and credibility

To avoid dissemination failures, leaders must ensure that organizations match dissemination mechanisms to the type of knowledge to be preserved. Explicit lessons can be taught to individuals in the form of cause-and-effect models and rules of thumb, or they may be codified into more formal guidelines, checklists, regulations, and policies. More tacit and relational knowledge often must be transferred in the minds and hearts of people.

Forgetting Failures

Too many organizations fail to remember lessons from the past. This often occurs with the loss of people, including a new leader’s predecessor, who may have walked away with a valuable bank of knowledge. If possible, consult the person who held your job before you; doing so can be a valuable early learning experience and a good way to get a head start on avoiding a predictable surprise.

Fortunately, organizational memory typically contains significant redundancy. In a given unit, it is rare for all experienced personnel to depart at the same time, and those who remain can help educate new members. At the same time, the erosion of capability in critical areas can be subtle and all the more pernicious when it goes unnoticed. Costly and unnecessary memory loss afflicts most organizations. Any time there is a significant change in personnel, important knowledge can be irretrievably lost. Similarly, any time that responsibility for a critical activity, such as the start-up of a new program, is transferred from one part of the organization to another, critical insights can fall through the cracks. Leaders, therefore, must make knowledge preservation a core activity. This means identifying when key transfers of personnel and responsibility occur in the organization and intervening to ensure that thorough and accurate transfer of knowledge is high on the agenda.

Avoiding Transition Surprises

As a new leader, you are particularly at risk of being predictably surprised. In part, this is because you lack the critical information and key relationships that would help you spot emerging problems. The organization you are entering may also be overly siloed or have incentive problems or learning disabilities that further increase your vulnerability. Or there may be entrenched special interests that will fight to block needed change from occurring.

While these organizational risk factors are unquestionably problematic, the greatest risks may lie in how you approach your new leadership role. You may be susceptible because you have a particular mind-set that leads you to see some things and not others, and to gravitate toward certain problems and avoid others. What can you do to avoid being predictably surprised? The answers are easy to list but hard to realize:

• Discipline yourself to look in areas and ways that are not your preferences. This means forcing yourself to move out of your comfort zone. If you have a particular functional background, for example, you risk viewing every problem through the lens of that discipline’s mental models. To a person with a hammer, everything looks like a nail.

• Build a team with complementary skills. It is all too easy to staff your core team with people just like you. It’s a natural impulse because it’s easy and comfortable to surround yourself with people who think the way you do. Strive to get more cognitive and stylistic diversity on your team and do the hard work to integrate it. Also don’t forget to explicitly task your staff with checking for predictable surprises lurking in their functions.

• Embed early-warning systems directly into the frontline processes of your organization. It can be tempting to treat sense-and-respond and learn-and-disseminate systems as add-ons. For example, some organizations set up units explicitly tasked to scan the external environment. While potentially helpful for integrating information and insight, such systems work only if sensing and learning are embedded at a more micro level in the organization, where information about emerging problems will first become available. People at the front line need to be clear about what to do with such information and, critically, must be incentivized to share and not conceal it.

Conclusion

The failures in organizations’ sense-and-respond and learn-and-disseminate loops described in this chapter are key contributors to predictable surprises. A weakness in any link in this sentinel chain of information processing—for Carrie’s agency it was the SR loop—renders an organization, and its leaders, vulnerable. The types of failure also often compound and reinforce each other. Learning failures, for example, can result from a lack of integration, or from a lack of incentives among the organization’s leaders to invest in capturing lessons learned. In any case, leaders, especially those new to their positions, must be acutely aware that predictable surprises are awaiting them if they do not take effective action to neutralize their potential. There likely will be enough legitimate surprises during your tenure; work to deter the predictable ones by following the advice in this chapter.

ACCELERATION CHECKLIST

1. Are there areas of your organization that might harbor potential surprises? If so, what will you do to assess the risks?

2. Is the organization vulnerable to recognition failures? If so, what can you do to better recognize potential threats?

3. Do the organization’s systems for planning and prioritization do a good job of assessing risk and setting priorities? If not, what can you do to strengthen prioritization?

4. Are there barriers to mobilizing responses to avert potential crises? If so, what can you do to mobilize resources and build coalitions?

5. Does the organization do a good job of focusing attention on learning, distilling lessons learned, and embedding them in sense-and-respond systems? If not, what can you do to enhance organizational learning?

6. Is the organization at risk due to memory loss? If so, what can you do to prevent or slow forgetting?
