10
Uncertainty: Accept Uncertainty and Ambiguity

As we discussed in the introduction, uncertainty is a natural element of crises and crisis communication. Crises and disasters are by definition abnormal, unpredictable, and uncertain events. Many crises occur as surprises, thereby requiring time to assess what has happened, why it has happened, and what steps to take to address the impending damage or threat. Other crises, such as hurricanes, are predictable, but the actual course they will follow and their intensity when making landfall are all uncertain. Accordingly, a best practice of crisis communication is to acknowledge the uncertainty and ambiguity inherent in a crisis situation.

Despite the uncertainty inherent in crises, crisis spokespersons often feel a need to be overly certain and overly reassuring in their messages to publics. This intuitive urge may be largely a consequence of a belief that publics cannot accept uncertainty in situations and need certainty in the face of a crisis, even when information is simply unavailable. Unfortunately, overly reassuring statements in the face of an uncertain and equivocal situation may reduce a spokesperson's credibility. This potential for diminished credibility is particularly high when crises evolve in startling, unpredictable ways.

What Causes Uncertainty for Publics?

Uncertainty occurs when we have difficulty drawing conclusions from the available information; we cannot predict what will happen. The discomfort of this uncertainty increases for individuals closest to the crisis (Spence et al., 2005). Specifically, uncertainty makes comprehension difficult when the available information is insufficient, multifaceted, or doubted (Brashers, 2001). Recognizing these three signs of uncertainty enables crisis spokespersons both to acknowledge and to take steps toward resolving the confusion and frustration publics experience. We discuss these three features of uncertainty in the following paragraphs.

When we say information about a crisis is insufficient, we mean it is either “unavailable or inconsistent” (Brashers, 2001, p. 478). Crises typically instill shock on multiple levels, affecting those tasked with responding to the situation, those harmed by it, and those observing it. For example, when a downtown Minneapolis, Minn., bridge on I‐35W collapsed into the Mississippi River during rush hour, the community was both horrified and confused. Because the bridge was on a commonly used route, residents were at once concerned about the victims of the crisis and uncertain about how to navigate their way through the city. In response to this uncertainty, Nelson, Spence, and Lachlan (2009) observed that many residents in the area turned to live media coverage of the event. In this case, local media kept residents informed of the deaths, injuries, rescues, and damage while also suggesting alternate transportation routes.

Information is multifaceted when it is “ambiguous, complex, unpredictable, or probabilistic” (Brashers, 2001, p. 478). Weick (1995) explains that ambiguous information is particularly frustrating for publics. He sees ambiguity as “an ongoing stream that supports several different interpretations at the same time” (pp. 91–92). Crisis situations with prolonged waiting periods, such as hurricanes, often create such ambiguity. For example, Hurricane Matthew created considerable consternation for residents along the east coast of Florida in 2016. Some predictions had the hurricane hitting the coast at full strength. As the hurricane came closer to landfall, the projections showed a lower likelihood that the hurricane would maintain its strength or direction. Yet messages from Florida's governor continued to warn residents of extreme danger. In the end, Hurricane Matthew produced little damage in Florida but did cause considerable flooding in states farther north. The drastically different interpretations of the available evidence created ambiguity that many residents resented after the hurricane passed.

Information is doubted “when people feel insecure in their own state of knowledge or the state of knowledge in general” (Brashers, 2001, p. 478). Publics often feel doubt about the available information in response to politically motivated acts of violence. For example, almost half of Americans participating in a 2016 Pew Research Center poll still worried that the government's antiterrorism policies had not done enough since 9/11 to protect the United States (Pew Research Center, 2016). Such lingering concerns and doubts about the ability to defend against politically motivated violence are intentionally cultivated. For such acts to produce political or policy influence, they must “inculcate fear” (Tuman, 2003, p. 7). Consequently, such acts are typically committed seemingly at random, without a clear focus on who will be victimized. The resulting uncertainty, then, is the means by which anxiety is nurtured.

How Do Publics Respond to Uncertainty?

In their most severe forms, crises instill a sense of uncertainty that Weick (1993) characterizes as a cosmology episode. Cosmology episodes occur when, in response to crises, people suddenly and profoundly feel that the order they once assumed in the universe no longer exists. Those experiencing a cosmology episode are so traumatized they momentarily cannot make sense of what is happening around them. Such incidents are devastating because both the ability to comprehend what is occurring and the means to rebuild that level of comprehension collapse together. Weick summarizes the human reaction to cosmology episodes in three statements:

  • I have never been here before,
  • I have no idea where I am,
  • and I have no idea who can help me (pp. 634–635).

Imagine, for example, the initial reaction of motorists either on or approaching the Minneapolis bridge as it collapsed into the Mississippi River. The thought of an eight‐lane bridge in the middle of a major city collapsing had not likely crossed the minds of these individuals. Those whose cars were stranded on the bridge likely wondered what was happening and whether or how they could be rescued. One of the surgeons who operated on injured drivers and passengers described his patients by saying, “They were in shock, they were happy to be alive, but they felt sad for all the people they had seen” (Levy, 2007, para. 9).

In the end, 13 people lost their lives on the bridge the day it collapsed and 145 were injured (Ferraro & Clarey, 2012). The bridge collapse also created a larger crisis of confidence in the city's infrastructure for residents. As Weick (1995) explains, “people frequently see things differently when they are shocked into attention, whether the shock is one of necessity, opportunity, or threat” (pp. 84–85). The shock of the bridge collapse created a sense of urgency that inspired inspections of transportation infrastructure in Minnesota and across the country. The Minneapolis bridge was rebuilt with added safety features and Minnesota intensified the assessment and repair of its bridges. To do so, Minnesota undertook a 10‐year initiative to repair all bridges with observed problems. Minnesota's state bridge engineer described the results of the bridge safety initiative, saying, “Certainly we've done a lot to ensure the safety of bridges so it doesn't happen again, so I feel confident about the advancements we've made in bridge design, construction, inspection, maintenance and that all works toward good bridge safety” (Schaper, 2017, para. 16). Ultimately, this initiative sought to replace the uncertainty Minnesotans felt about the safety of their bridges with confidence that far‐reaching and long‐term improvements had been made.

What Kind of Information Do Publics Seek to Reduce Their Uncertainty?

Uncertainty is an uncomfortable state creating a drive for uncertainty reduction, usually by seeking more information (Berger, 1987). As publics seek to reduce uncertainty, they initially pursue information about what actions they can take to protect themselves. We discuss the importance and nature of messages that empower publics to protect themselves in the following section. In addition to empowerment in the face of crises, publics also frequently ask questions seeking to resolve multiple interpretations of evidence, the intent behind the actions preceding the crisis, and the assignment of responsibility (Ulmer & Sellnow, 2000).

Early in crises, limited evidence is available about the cause of the crisis. As investigations proceed, evidence explaining what happened can reduce uncertainty for publics. Conversely, competing interpretations of this evidence can create a degree of ambiguity that actually heightens uncertainty. For example, British Petroleum's (BP) tragic 2010 Deepwater Horizon oil spill in the Gulf of Mexico caused tremendous damage that was readily observable by publics. As of 2018, the company was still engaged in a massive cleanup effort projected to cost tens of billions of dollars. Although the well‐being of wildlife and the surface appearance of the Gulf have improved notably, “scientists debate just how much oil from Deepwater Horizon is still out in the environment” (Ferris, 2017, para. 15). Much of the oil was degraded by naturally occurring microbes below the ocean's surface. How much oil remains and the potential for this oil to cause long‐term environmental damage are still contested. Complex evidence, in this case technical measures of oil residue on the ocean floor, often creates ambiguity because of competing interpretations of the same findings. As such, this kind of scientific debate can actually increase postcrisis uncertainty for both decision makers and their publics.

Questions of intent focus on the motives and planning process that occurred prior to the crisis. Answering questions of intent can, for example, explain whether the crisis was caused by an accident or malfeasance. If an organization had good intentions prior to the crisis and the events are seen as purely accidental, publics tend to be more sympathetic. If, however, the organization was motivated by profit, thereby causing the organization to cut corners in its crisis planning or safety measures, publics are much less forgiving (Coombs, 2015). As the investigation into the Gulf crisis unfolded, BP was plagued by accusations that, for the sake of profit, the company failed to engage in adequate safety measures before beginning to drill at extreme depths in the Gulf. BP was even portrayed in the feature film Deepwater Horizon as prioritizing profit over safety. The movie grossed over $100 million (Deepwater Horizon, 2016). The pressing question for publics as they watched oil gushing into the Gulf day after day from the vantage point of an underwater camera was, “How could this happen?” In this case, BP's intentions have been consistently portrayed as impure. BP provided a rebuttal to the accusations that its intentions were self‐serving, but public opinion has largely rejected these counterarguments.

Questions of responsibility involve assigning blame for the crisis. Assigning blame can reduce uncertainty for publics by identifying a source that is accountable for corrective action and liable for compensating those harmed. With natural disasters, responsibility is often interpreted as an act of God. By contrast, for industrial accidents and similar crises, responsibility is typically assigned to an organization or organizations. Because this responsibility carries with it a significant legal and financial toll, the assignment of blame is often debated in both public discussion and the courtroom. For example, BP and Transocean, BP's contractor that actually performed the drilling in the Gulf, initially blamed each other for the crisis. BP accepted its responsibility for the cleanup but insisted that the failures in the technology in place to prevent the type of blowout that occurred on the Deepwater Horizon were the fault of Transocean. Transocean countered that BP oversaw another contractor, Halliburton, that had failed to adequately create a cement plug essential to preventing the crisis (Quinn, 2010). Such debates over responsibility do little to reassure publics the crisis will be resolved. As we mentioned previously, focusing on legal and reputational matters while people are disoriented and suffering in the wake of a crisis shows a lack of compassion.

How Can Organizations Avoid Overreassuring Their Publics?

When responding to the fear or outrage of publics, organizations may be tempted to overreassure their stakeholders. Ulmer and Pyle (2016) explain, “organizations that are focused on protecting their image most often use communication strategies to absolve themselves from blame, minimize the crisis, or over‐reassure the public about the impact of the crisis” (p. 114). To do so, organizations often manipulate or withhold information. In the short term, overreassuring audiences can create a momentary calming effect. In the long term, however, Ulmer and Pyle explain that overreassuring audiences deprives the organization of the credibility it urgently needs to communicate effectively in the postcrisis phase. As we explained previously, being open and honest at all phases of a crisis helps organizations maintain their credibility—even if doing so causes initial damage to the organization's reputation.

We acknowledge overreassurance may also be intertwined with the uncertainty surrounding the crisis. Spokespersons may minimize the perceived impact of a crisis because they sincerely believe the risk is minimal or perhaps nonexistent. For example, when Thomas Eric Duncan was diagnosed in a Dallas hospital as having contracted the Ebola virus, Tom Frieden, then‐director of the Centers for Disease Control and Prevention (CDC), said, “I have no doubt that we'll stop this in its tracks in the U.S.” (Muskal, 2014, para. 2). Not long after making this comment, two nurses who had treated Duncan were diagnosed with Ebola. This news of Ebola having spread within the hospital caused public confidence in the CDC's ability to contain the disease to plummet. Indeed, Frieden's comments, seen as overreassurance, haunted him throughout the crisis. Frieden said what he believed at the time. Had he tempered his initial comments by acknowledging the risk of the disease spreading, Frieden likely would have created considerable alarm. Yet, having to backtrack on what was perceived as overreassurance hampered his credibility for the remainder of the crisis. The lesson from this case is that caution in reassuring statements early in the crisis may cause short‐term challenges, but such restraint can help spokespersons avoid long‐term credibility problems.

What Are Some Other Ways to Manage Uncertainty?

A best practice of crisis communication, then, is to acknowledge the uncertainty inherent in the crisis situation with statements such as, “The situation is fluid” and “We do not yet have all the facts.” As we discussed earlier, this form of ambiguity is strategic in that it allows the communicator to refine the message as more information becomes available and to avoid statements likely to be shown as inaccurate as more information is obtained. Many of the best practices will help manage and reduce uncertainty. For example, having a precrisis plan helps people know what to do during a crisis. Preevent relationships with stakeholders can help crisis managers predict responses. Maintaining open relationships with publics and the media will increase the flow of information.

Karl Weick's extensive work on sensemaking during crises provides a useful framework for maintaining flexible communication that acknowledges uncertainty during crises. Foremost for Weick (1988) is the need for organizations to willingly let go of their previous assumptions once a crisis occurs. Weick explains that organizational leaders often provide tenacious justifications that their previous assumptions and planning can minimize the crisis. Unfortunately, these assumptions are often proven false. Frieden's insistence in absolute terms that the Ebola virus would not spread within the Dallas hospital is an example of a tenacious justification for existing plans. Many spokespersons engage in tenacious justification or rigidity in their crisis response because, like Frieden, they assume certainty where it does not exist. Avoiding statements with such certainty is strongly advised.

To fully engage in sensemaking, spokespersons should recognize that crisis communication is a process of uncertainty reduction. This process unfolds progressively through the crisis and postcrisis phases and continues as the organization returns to a new precrisis phase. The ongoing discovery and clarification of information require steady adaptation and constant communication. Weick's (1995) explanation of the sensemaking process can be summarized in four general strategies for crisis spokespersons: let go of previous assumptions, take action, collect feedback, and retain the lessons taught by the crisis.

First, uncertainty reduction is based on letting go of assumptions that can blind the organization to the true nature of the crisis. In many cases, the flawed assumptions made by an organization result in missed opportunities to prevent crises or, once a crisis occurs, limit the capacity for the organization to respond. Weick (1995) explains that organizations must first move beyond tenacious justifications of previous assumptions and prepare to take whatever novel actions may be needed to manage the crisis.

The uncertainty or ambiguity created by the crisis should not prevent the organization from taking action. Enacting a reasonable strategy to address the needs of the organization's stakeholders is essential. Central to this enactment process is understanding how the organization's previous actions contributed or may have contributed to the crisis (Weick, 2001). From the sensemaking perspective, then, the first step is to stop doing more of the wrong thing. This step is accomplished through a thorough and frank assessment of how previous assumptions and actions may have caused the crisis or diminished the organization's readiness to respond. Recognizing these failures and replacing them with novel actions is essential to uncertainty reduction.

Once an organization reconsiders its previous actions, the major objective is obtaining as much feedback as quickly as possible about the effectiveness or ineffectiveness of whatever alternate strategies the organization has enacted. Feedback from publics is essential in this step of the sensemaking process. Weick (2001) explains, “Managers literally must wade into the ocean of events that surround the organization and actively try to make sense of them” (p. 244). For Weick, this process involves gathering as much data as possible, interpreting the information, and deriving lessons learned. These lessons enable organizations to select the best‐known actions for preventing or responding to crises.

Based on the feedback obtained and the interpretation of it, organizations retain the lessons learned about the most effective strategies for crisis management. The retention of these lessons contributes to organizational memory and learning. The goal is for these retained strategies to inform the organization's precrisis planning and continuous environmental scanning of risk issues. Thus, retention of lessons learned allows the uncertainty reduction process to come full circle from crisis to postcrisis assessment and back to precrisis planning. These lessons provide “cause maps” that heighten organizations' sensitivity to small failures that, if acknowledged, could prevent future crises (Weick, 2001, p. 305).

In summary, spokespersons can manage uncertainty through four steps of sensemaking:

  1. Accept uncertainty by letting go of previous assumptions and preparing to address the novel aspects of the crisis.
  2. Take reasonable actions that prioritize the needs of stakeholders over concerns for the organization's reputation.
  3. Constantly seek feedback on the success or failure of the initial actions and be prepared to enact alternative solutions.
  4. Retain the lessons learned about how to best respond to similar crises and include these lessons in precrisis planning.

By acknowledging failure, taking reasonable actions in response to these failures, carefully scrutinizing these actions, and deriving lessons learned from the crisis, an organization can both reduce uncertainty about the current crisis and avoid repeating similar mistakes in the future.

What Are the Ethical Standards for Managing Uncertainty?

Ideally, organizations seek to reduce uncertainty as soon as possible after a crisis begins and to quickly share what is known with all stakeholders. Doing so enables stakeholders to make informed decisions about protecting themselves and about the organization's effectiveness in responding to the crisis. Conversely, organizations can engage in the unethical practice of strategically increasing ambiguity through the manipulation of information and, in so doing, deflect responsibility or criticism for the crisis. As we have discussed, information related to crises is often ambiguous. From an ethical perspective, however, we set forth the following distinction:

  1. Strategic ambiguity is ethical when it contributes to the complete understanding of an issue by posing alternative views based on complete and unbiased data that aim to inform.
  2. Strategic ambiguity is unethical if it poses alternative interpretations using biased and/or incomplete information that aims to deceive (Ulmer & Sellnow, 1997, p. 217).

The ongoing discussion about BP's cleanup process in response to the Deepwater Horizon crisis is a fitting example. Claims that much of the oil is being consumed by naturally occurring microbes are ethical if such claims are based on the best science and the best evidence available. Such claims could, however, be unethical if they are made in an attempt to relieve the company of its responsibility simply because much of the oil is difficult to find in the vast Gulf of Mexico. We base this distinction on Nilsen's (1974) ethic of significant choice.

Nilsen (1974) argues that free and informed choice is essential to any democracy. Therefore, publics cannot exercise their full freedom when making significant choices that may profoundly affect their lives unless they have “the best information available when the decision must be made” (p. 45). Nilsen accepts that the whole truth cannot be known through human interaction. In addition to the uncertainty caused by crises, some degree of bias or ambiguity is always present. However, he maintains “there should be no less information provided, no less rigor of reasoning communicated, and no less democratic spirit fostered than circumstances make feasible” (p. 73). When organizations seek to mislead publics by denying them access to information or by misrepresenting the information that is known, they are violating the ethic of significant choice.

Returning to our earlier example, after the Minneapolis bridge collapse, Minnesota's legislature was forthright in sharing information about all bridges in the state known to have structural deficiencies. One state legislator acknowledged that, before the collapse, Minnesotans had a “general sense” that bridges in the state were deteriorating, but they did not comprehend “the scale of the problem” (Montgomery, 2017, para. 7). The transparency of Minnesota's legislature in discussing the dangers posed by other bridges in the state helped generate support for a higher gas tax, much of which was devoted to repairing and replacing problematic bridges. In the end, “Minnesota not only replaced the collapsed bridge, but also made a concerted effort to repair and replace hundreds of other bridges around the state” (Montgomery, 2017, para. 2). The open communication and public involvement evident in Minnesota's response to its bridge infrastructure needs clearly meet the criteria of Nilsen's (1974) ethic of significant choice.

Summary

By their nature, crises and disasters are abnormal, unpredictable, and uncertain events. Thus, spokespersons are advised to accept this uncertainty and to avoid overly certain and overly reassuring statements. Eventually, the errors prevalent in such overstatements typically reduce a spokesperson's credibility. Alternatively, spokespersons should make statements that reflect the fluidity of the information available. Maintaining the types of relationships advised in the third and sixth best practices can help create a flow of information that accommodates this degree of uncertainty.

References

  1. Berger, C. R. (1987). Communicating under uncertainty. In M. E. Roloff, & G. R. Miller (Eds.), Interpersonal processes: New directions for communication research (pp. 39–62). Newbury Park, CA: Sage.
  2. Brashers, D. E. (2001). Communication and uncertainty management. Journal of Communication, 51, 477–497.
  3. Coombs, W. T. (2015). Ongoing crisis communication: Planning, managing, and responding (4th ed.). Thousand Oaks, CA: Sage.
  4. Deepwater Horizon. (2016). Box Office Mojo. Retrieved from http://www.boxofficemojo.com/movies/?id=deepwaterhorizon.htm
  5. Ferraro, N., & Clarey, J. (2012, July 30). I‐35 bridge collapse: Five stories, five years later. Pioneer Press. Retrieved from http://www.twincities.com/2012/07/30/i‐35w‐bridge‐collapse‐five‐stories‐five‐years‐later
  6. Ferris, R. (2017, June 26). Much of the Deepwater Horizon oil spill has disappeared because of bacteria. CNBC. Retrieved from https://www.cnbc.com/2017/06/26/much‐of‐the‐deepwater‐horizon‐oil‐spill‐has‐disappeared‐because‐of‐bacteria.html
  7. Levy, P. (2007, November 29). 4 dead, 79 injured, 20 missing after dozens of vehicles plummet into river. StarTribune. Retrieved from http://www.startribune.com/4‐dead‐79‐injured‐20‐missing‐after‐dozens‐of‐vehicles‐plummet‐into‐river/11593606
  8. Montgomery, D. (2017, July 30). Many bridges found deficient after I‐35W collapse. Here's how Minnesota responded. Twin Cities Pioneer Press. Retrieved from https://www.twincities.com/2017/07/30/after‐collapse‐minnesota‐fixed‐deficient‐bridges
  9. Muskal, M. (2014, October 16). Four Ebola quotes that may come back to haunt CDC's Tom Frieden. Los Angeles Times. Retrieved from http://www.latimes.com/nation/nationnow/la‐na‐four‐ebola‐quotes‐haunt‐frieden‐20141016‐story.html
  10. Nelson, L. D., Spence, P. R., & Lachlan, K. A. (2009). Learning from the media in the aftermath of a crisis: Findings from the Minneapolis bridge collapse. Electronic News, 3(4), 176–192.
  11. Nilsen, T. R. (1974). Ethics of speech communication (2nd ed.). Indianapolis, IN: Bobbs‐Merrill Company.
  12. Pew Research Center. (2016, September 7). 15 years after 9/11, a sharp partisan divide on ability of terrorists to strike U.S. Retrieved from http://www.people‐press.org/2016/09/07/15‐years‐after‐911‐a‐sharp‐partisan‐divide‐on‐ability‐of‐terrorists‐to‐strike‐u‐s
  13. Quinn, J. (2010, May 11). BP and Transocean blame each other for Gulf of Mexico oil spill. The Telegraph. Retrieved from http://www.telegraph.co.uk/finance/newsbysector/energy/oilandgas/7712652/BP‐and‐Transocean‐blame‐each‐other‐for‐Gulf‐of‐Mexico‐oil‐spill.html
  14. Schaper, D. (2017, August 1). 10 years after bridge collapse, America is still crumbling. National Public Radio. Retrieved from https://www.npr.org/2017/08/01/540669701/10‐years‐after‐bridge‐collapse‐america‐is‐still‐crumbling
  15. Spence, P. R., Westerman, D., Skalski, P. D., Seeger, M., Ulmer, R. R., Venette, S., & Sellnow, T. L. (2005). Proxemic effects on information seeking after the September 11 attacks. Communication Research Reports, 22(1), 39–46.
  16. Tuman, J. S. (2003). Communicating terror: The rhetorical dimensions of terrorism. Thousand Oaks, CA: Sage.
  17. Ulmer, R. R., & Pyle, A. S. (2016). International organizational crisis communication: A simple rules approach to managing crisis complexity. In A. Schwarz, M. W. Seeger, & C. Auer (Eds.), The handbook of international crisis communication research (pp. 108–118). Malden, MA: Wiley Blackwell.
  18. Ulmer, R. R., & Sellnow, T. L. (1997). Strategic ambiguity and the ethic of significant choice. Communication Studies, 48, 215–233.
  19. Ulmer, R. R., & Sellnow, T. L. (2000). Consistent questions of ambiguity: Jack in the box as a case study. Journal of Business Ethics, 25, 143–155.
  20. Weick, K. E. (1988). Enacted sensemaking in crisis situations. Journal of Management Studies, 25, 305–317.
  21. Weick, K. E. (1993). The collapse of sensemaking in organizations: The Mann Gulch disaster. Administrative Science Quarterly, 38, 628–652.
  22. Weick, K. E. (1995). Sensemaking in organizations. Thousand Oaks, CA: Sage.
  23. Weick, K. E. (2001). Making sense of the organization. Malden, MA: Blackwell Business.