
You may not have been able to prevent all uncertainties from influencing your project; the preceding normality has cascaded into a crisis in which your project’s continuation is threatened. Unsurprisingly, this is a time of considerable stress. Yet, it is a time in which clarity is required, and mindless actions need to be avoided. With emotions being stretched to the limit, objectivity is of the utmost importance to enable an appropriate recovery.

The lure of a ‘great escape’

Imagine that all that has been written in the previous chapters has not worked out. Uncertainty and complexity have taken their toll, and unless action is taken swiftly, the project might be suspended or stopped altogether. With so much at stake, what do you do? When a crisis hits a project, there are often specific types of behaviour, similar to the ones described in the previous chapters yet considerably amplified. These, too, are counterproductive because, ironically, they can reinforce the chaos instead of helping with a solution.

The defensive retreat

In a crisis characterised by chaos, one is likely to lose orientation: ‘Where are we?’, ‘What is happening?’ and ‘What shall we do (quickly)?’ are some of the questions that we ask ourselves. As a result, and because of that uncomfortable feeling of ‘being lost’, we often tend to fall back on our basic instinct of self-preservation. We become more inward-looking and try to cover ourselves as individuals. Divisions that emerged during the incubation period (if there was one) may intensify. Instead of more collaboration, we shift towards more adversarial relationships. We start building a wall around us that offers an illusion of comfort. Mindful decision-making becomes less likely.

Ostrich effect

The term ‘ostrich effect’ was coined in the early 2000s (Galai and Sade 2006) to describe the inclination of people to avoid thinking about negative issues in their life, even though those issues might be pressing and need to be dealt with. The behaviour is likened to the myth of ostriches sticking their heads in the sand in the face of extreme danger rather than fleeing (in reality, ostriches do not do this: they will always try to flee and, if they cannot, will flop to the ground and lie prone, ready to escape should the opportunity present itself). The metaphor of people sticking their ‘head in the sand’ rather than facing up to problems is not a new one, but it took a long time to be identified as a specific cognitive bias. From a psychological perspective, it is the result of the tension between what people recognise to be essential and what people anticipate will be painful. The uncomfortable reality does not go away, but people prefer to delude themselves into thinking there is no problem for as long as they can, before responding with panic and stress when they are forced to act (Babad and Katz 1991).

What is true of everyday life applies to organisational life as well. For example, in some situations where people are confronted with risks, they may delay responding in the hope things will improve (Kutsch and Hall 2005). Elsewhere, it was found that market traders continually monitored stock performance when markets were performing well but much less frequently when markets were flat or falling, thereby not taking action in time to protect their portfolio values (Karlsson et al. 2009).

Broken communication

Going hand-in-hand with increased defensiveness in the face of a crisis is a change in the way we communicate. Communication can be used to preserve our integrity and even to damage others. Information is exchanged, not necessarily to explore what is happening or what we should do next but to lessen any potential blame attaching to oneself. Emails sent for the purpose of explaining one’s decisions and leaving a trail to be used later to justify one’s actions make sense in that narrow personal context, but may not help the project in the here-and-now. Communication can degenerate into accusations and thus becomes more destructive than helpful in crisis recovery. It is especially important to look for this in inter-departmental or client-provider communication as each group may well retreat to its own domain for relative safety if it seems as if the project is unravelling. When trust falters, resorting to formal communication alone can virtually eliminate the spontaneity, collaboration and improvisation that may be vital for a resolution at that moment.

Centralising power

There is a tendency in crises, when we do not see progress from others, to feel the need to assume control. We believe that we can do better and that by transferring power to ourselves we can single-handedly deal with the situation more effectively than we could as a group. Unfortunately, this form of centralisation (and the accompanying power games) is often a side-effect of a crisis in projects.

Thinking and acting in the past

It is likely that a crisis, in its totality, is of a type that you have not directly experienced before. However, human nature means that it is also likely that you will rely on recovery mechanisms you have deployed in the past. Tackling a novel crisis that requires quick and decisive action through drawing on past solutions is not only likely to be ineffective but might exacerbate the crisis. We are habitual creatures, and we tend to rely on our past experiences and schemas of action. However, these past-informed habits may not fit the present crisis. Being aware of this may enable us to focus more clearly on new, more creative solutions to the current crisis.

Hindsight bias

People tend to revise the probabilities of an event or phenomenon happening after it has happened. They exaggerate the extent to which that event or phenomenon might have been predicted beforehand. This tendency is known as hindsight bias (sometimes colloquially called the ‘I-knew-it-all-along effect’) and is one of the most widely studied decision traps due to its ubiquitous nature.

This cognitive bias is remarkably robust across all populations and in all situations. For example, it has been found in people’s views on general knowledge questions, in predictions of sporting events and political elections, in business and organisational decision-making and in medical diagnoses (Christensen-Szalanski and Willham 1991).

The effect of hindsight bias is that people believe they knew the outcome of an event after that event has already happened. There are three factors at play in hindsight bias (Pohl and Erdfelder 2004). First, people tend to misremember or distort their previous predictions. Second, people regard events as having been inevitable all along and, finally, people assume events could have been foreseen. Where all three factors are in play at the same time, hindsight bias is very likely to occur.

In organisational activities, hindsight bias can distort decision-making profoundly. For example, it significantly distorts the evaluation of strategic decisions in organisations and, as a result, distorts projections for the future (Bukszar and Connolly 1988). Elsewhere, among start-ups, entrepreneurs recalling their previous venture experiences were, as a result of hindsight bias, over-confident in their belief that they would be successful in the future (Cassar and Craig 2009).

Tunnel vision

Under increased stress, the scope of our radar narrows and shortens. Our minds tend to fixate on a single action and, when that response fails to alleviate the situation, we compensate by executing it with even greater fixation, sometimes over and over. Such fixation may prevent us from remaining sensitive to the bigger picture of what is going on around us.

Key enablers to recovering

The behaviours that typically emerge in crises are ones that need to be actively managed, in a mindful manner. Usually, the goals, when faced with a crisis, are its immediate resolution and ‘damage control’. This is important, but so too is a necessary shift in the way the crisis is managed. As quickly as possible, the recovery stage needs to be initiated.

Project continuity

This book is very much about the ‘soft’ aspects of project management, about behaviours and applied practice. Nevertheless, one cannot ignore the need for some mindless structure that can be relied upon quickly. Every crisis response involves business continuity planning. For project work, this continuity plan defines which functions in the project are critical to address, protect and maintain. For example, in a software development project, a vital function might be the ‘testing environment’ in which sections of code are brought together and functionally evaluated. If these critical functions come to a standstill, the whole project could be put on hold. In a crisis, such functions deserve special attention, so that a plan can be activated quickly, even if that means working through it in a mindless, checklist manner.

Checklists

In a crisis, the resulting stress and ‘tunnel vision’ may be countered by the relative ‘automation’ of a checklist. A checklist should not replace situated human cognition, but it can help to probe the situation and aid a project manager’s state of mindfulness. A useful checklist (a minimal sketch follows this list) is one that is:

  • Short and simple. Simplicity forces project managers to accept a stimulus and to interpret it.
  • Focussed. It aims at the critical functions of a project; insignificant components are not included.
  • Practical. It only offers probes based on actions that are feasible.
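
As a minimal sketch of such a checklist, using the ‘testing environment’ example from above (the specific probes and follow-up actions are illustrative assumptions, not taken from the text):

```python
from dataclasses import dataclass

@dataclass
class Probe:
    """One short, practical check tied to a critical project function."""
    critical_function: str  # focussed: only critical functions are listed
    question: str           # short and simple: forces a yes/no interpretation
    action_if_no: str       # practical: a feasible follow-up action

# Hypothetical crisis checklist for a software project whose critical
# function is the testing environment.
CRISIS_CHECKLIST = [
    Probe("testing environment",
          "Can code still be integrated and functionally tested today?",
          "Escalate to the platform team and pause non-critical feature work."),
    Probe("testing environment",
          "Is the latest build reproducible from source control?",
          "Freeze dependency changes until a clean build is confirmed."),
]

def actions_needed(answers: dict[str, bool]) -> list[str]:
    """Return the follow-up action for every probe answered 'no'."""
    return [p.action_if_no for p in CRISIS_CHECKLIST
            if not answers.get(p.question, True)]
```

The point of keeping the structure this small is that the checklist remains a prompt for interpretation rather than a substitute for it.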

Closeness

We might have all the necessary plans in place, but because of the unfamiliar character of a crisis, we may not be ready to exercise those plans, let alone be reflective and creative. The question this raises is how to sensitise a project team to a crisis, in a safe environment, before the crisis happens. One answer lies in simulating worst-case scenarios. Crisis simulations have the great benefit of getting close to extreme situations. Playing through worst-case settings to test one’s endurance and adaptability in a ‘live’ but safe environment is at the core of simulating crises.

It is a puzzle that, in many projects where substantial value is at stake, planning and preparation involve tools and techniques that do not incorporate the emotive side of managing uncertainty. Most planning approaches advocated in project management seem to exclude the behavioural side of a crisis. If you can, simulate a crisis that allows you to receive immediate feedback on crucial stakeholders’ behaviours and skills under high-stress conditions. It is too late to test out aspects of mindful behaviour when a real crisis is already unfolding.

Tiger Teams

When a crisis strikes, seeking an outsider’s perspective can be vital. Internal politics tend to take over in the middle of a major problem as people can become insensitive and defensive and may entrench themselves in their silos. If we wish to find a swift solution to avert disaster, this silo mentality needs to be broken up. Tiger Teams (whether they are called that or go by some other name) can deliberately be set up as high-performing teams aiming to reconcile potentially opposing views and to facilitate solution-finding in out-of-control situations.

They need to be on stand-by or to hover around a project, monitoring the situation and ready to provide the project manager with support. They can be parachuted in when the situation warrants it. A Tiger Team must not replace the project manager but support the project manager in the following:

  • Listening and asking questions from multiple perspectives about what is happening and yet not rushing to conclusions despite the pressure to act quickly.
  • Imagining worst-case implications, together with the details of complex, potentially dynamically changing, tasks.
  • Suppressing members’ egos in terms of ‘knowing the answer’ yet remaining inquisitive in creating options.
  • Being willing to break existing rules and processes, with the ability to think outside the usual methods of operation.
  • Creating solutions that work at the technical, process and human levels.
  • Maintaining a continuously high level of focus and intensity of action.
  • Sustaining all of this to achieve rapid project recovery while operating within challenging timeframes under the ‘spotlight’ of senior management.

In projects, the role of the project manager is frequently an ‘everything’ function. They are often expected to switch seamlessly from managing ‘business as usual’ to being a crisis manager. The shift from normality to a crisis-like situation, however, can be difficult for those of us who are emotionally and structurally attached to the project. Instead, a set of seasoned managers, possibly currently involved in other projects but with scope to provide the necessary support to a project in trouble, can be used to provide that valuable input.

Mindful practices

Tiger Teams

In Intel development projects, unexpected problems occasionally require some form of trouble-shooting. At the centre of initiating a problem-solving process are so-called ‘Tiger Teams’: ad hoc small groups of subject matter experts coming together to deal with routine, everyday problem-solving (and also crises). These Tiger Teams are often ‘self-selected’, with experts drawn from within and outside the domain in which the project resides. Availability is driven by an interest in engaging with a challenging problem. The length of engagement is initially estimated, but there is flexibility in extending the time the experts belong to the team until the problem is solved.

Setting up a Tiger Team for a limited time requires flexibility in the organisation to provide expertise on an ad hoc basis. This is because critical problems are generally not planned for, they emerge. Cost-centred thinking about lending resources to deal with a temporary problem is not a constraint at Intel.

Another factor to consider is uncertainty in how long a Tiger Team is likely to be deployed. Without certainty about when the problem will be solved, stakeholders and cost centres need to be informed, and they also need to show time flexibility, allowing their scarce resources to remain engaged with the problem until they are no longer required.

What is mindful about it? A Tiger Team, often referred to as a response team or a ‘hit squad’, provides additional mindful capabilities in times of extreme complexity and uncertainty: a crisis.

Tiger Teams are not just a collection of experts; they are experts driven by a purpose or problem statement. Their strength lies in initiating an open, honest and constructive ‘struggle’ to understand multidisciplinary, cross-functional problems by reconciling different perspectives and allowing holistic, big-picture thinking. A well-functioning Tiger Team exhibits the following characteristics:

  • Openness, trust and respect: everyone is encouraged to speak freely from his or her disciplinary perspective. An opinion is neither right nor wrong; all opinions are respected.
  • Common goal: the goal is to get the problem resolved, and the common goal of trouble-shooting a multi-disciplinary problem is prioritised.
  • Commitment: although Tiger Teams are a temporary form of troubleshooting, every expert shows a desire to contribute to problem-solving.

Logistical independence

The resources that have been provided for project ‘normality’ may not be the most suitable to help in a crisis. Indeed, one may argue that the resources deployed in the incubation phase of a crisis were insufficient to prevent it from occurring in the first place. The Tiger Team has to draw on a pool of readily available, or quickly mobilised, resources, be they people or additional funds. This access to resources should be detached from the daily business of the project. Lengthy, otherwise sensible, change-related processes need to be unhooked or circumvented.

Nevertheless, this does not imply that the provision and deployment of resources to stem a crisis should be allowed to unfold haphazardly. Similarly, the question of how this extra resource is paid for should not add to the difficulties of the project. Preparations should ideally be made in advance of any crisis occurring. Arrangements may, for example, include the provision of a budget for these resources (as yet unspecified, since the nature of any future crisis is unknown) in advance of their mobilisation. Because crises invite time-consuming ‘blame-games’, the risk of a silo mentality, associated with lengthy and often futile searches for root causes, can be overcome by switching to a contractual model that ‘shares’ the costs of managing a crisis, regardless of which party ‘caused’ it. This focuses minds on solutions, not blame.

Leading the art of recovering

A crisis in a project is often perceived as threatening, a period of confusion which requires an urgent remedy. It is at these moments that project managers need to be ‘leaders’ as their staff will look to them to ‘do something’. The challenge for leaders in projects is to ‘bring things back to normal’.

Readiness to initiate a radical shift in the mode of management is required in advance; from a phase characterised by shock, confrontation and increased response rigidity, to one of reflection, collaboration and adaptation. Leadership is necessary to prepare stakeholders for such an essential transition and to facilitate a move from potential inaction and rigidity towards recovery.

Readying stakeholders

Stakeholders need to be educated for ‘when’ it happens, not ‘if’. Of course, if at all possible, we want to prevent a crisis from ever happening, and this is the focus of all the planning that goes into projects. The result of this effort is an unspoken assumption that failure will not or cannot happen, and this makes readying stakeholders for engaging with a crisis all the more difficult. Doing so is an implicit acknowledgement that failure is possible. It also costs time and effort to prepare stakeholders for a crisis in the absence of one. Why prepare for something that has not happened yet and may not happen anyway? We have to allow time and effort, often in advance of the execution of the project, for techniques that enable key stakeholders (for example, the client) to rehearse a crisis and to test the response capability to deal with one. Whether such rehearsing involves the development of plans, simulations of scenarios or storytelling does not matter, as long as the approach helps to sensitise people to the emotive factors of a crisis. Words in isolation, in the form of a dry, impersonal plan, are inadequate to convey the behavioural side of a crisis.

Just-world hypothesis

The just-world hypothesis (also called the just-world phenomenon, bias, belief, or fallacy) is a cognitive bias whereby people view the social environment as, primarily, a fair one (Lerner 1980). The result is that people devalue the experience of victims within that environment. Because people want to believe the world is essentially a fair place, they rationalise or explain away injustices, often blaming victims for their own misfortune. This is the case even when those victims have little or no control over events. People also take unjustifiable credit when things go well, believing this to be entirely or mainly the result of their hard work, superior intelligence or some other personal capacity.

It has been suggested that the just-world fallacy is held by people for several reasons (Hafer and Bègue 2005). Chief among these appears to be that people fear facing vulnerability. Because people fear becoming victims themselves, when they hear of other people becoming victims of some event, they try to place the blame for the event on the victim’s behaviour. A classic example of this is the blaming of rape victims. This allows people to believe they can avoid becoming victims by avoiding certain behaviours associated with victims. Another explanation for the just-world phenomenon is that people are seeking to reduce the anxiety caused by the world’s injustices. Believing people are responsible for their misfortunes allows people to believe the world is fair and just.

Within organisations, the just-world fallacy can arise in a variety of contexts. It can affect business ethics: lower ethical standards have been found among managers who hold a strong belief in a just world (Ashkanasy et al. 2006). It can also be found in recruitment and promotion and, through this, in power differentials in organisational hierarchies (Pfeffer 2010). The effect can also be felt more widely across organisations. For example, when competitors go out of business, managers in an organisation may feel confident that they can take over that competitor’s business or, in some other way, capitalise on that competitor’s demise, failing to see that the reasons for the competitor’s failure may also affect their own organisation. The just-world fallacy means that managers assume the failure of competitors to be the result of those competitors’ own failings rather than something systemic (Pfeffer 2010).

Being reluctant to press the panic button

We need to set expectations in circumstances where people look to us for guidance. The temptation is to convey messages that the situation is likely to turn out for the best. On the one hand, this optimistic perspective instils confidence and motivates, but it might also lead to illusions of control and thus to blind spots. If people absorb the belief that everything will go well, they may become less vigilant and less adaptive in their understanding. The result may be greater rigidity and inaction in response to the unfolding situation.

On the other hand, portraying a doomsday scenario encourages fatalistic behaviour, in which people sit back and let fate play its cards. It is down to the leader to find an appropriate balance in setting expectations, between being too optimistic and too pessimistic. Generally, we tend to underestimate the severity of a crisis and overestimate our capability to deal with one, so the default should be to challenge over-optimism.

Restraint bias

Restraint bias is a well-recognised cognitive bias that describes the tendency of people to overestimate their ability to control their basic impulses, temptations and desires (Nordgren et al. 2009). This leads people to increase their exposure to these temptations and urges and, in so doing, to increase the likelihood that they will succumb to them. It arises from a particular type of empathy gap: people are unable to empathise with their future mind-state and thus fail to consider how their mental processes will operate at a future date.

Restraint bias manifests itself in many ways in everyday life. Examples include the inability to avoid smoking and the tendency of people to become over-fatigued (Nordgren et al. 2009).

In marketing, consumer restraint bias is used to positive effect for businesses seeking to promote their products and services. Indeed, in some cases, advertising relies on restraint bias to encourage consumers to place themselves in the way of temptation and, hence, to consume (Campbell and Warren 2015).

Being hesitant to centralise

Defensive retreats often go hand-in-hand with the urge to centralise. We may lose trust in other people because they did not prevent the crisis from happening in the first place. As a result, we may be tempted to rein in many of the tasks we have previously delegated. Be warned! If we do this, we may find it challenging to focus on what a leader is supposed to do in a crisis: facilitate an understanding of the problem and its solution. Being bogged down in detail prevents us from maintaining sensitivity to what is happening.

A crisis requires courageous action, yet it is not up to us to take all the brave steps. Indeed, we need to let go and facilitate action where it matters, close to the problem, and develop the commitment to respond in the most sensible and valuable way. Such determination cannot be taken for granted, given that people may retreat into their shells and exhibit rigidity, only responding to cover themselves. Such self-preservation in times of upheaval is common (and understandable) and needs to be addressed by project leaders. This commitment should be supported by the provision of an extensive response repository for those ‘firefighters’. Substantial power needs to be channelled down to front-line staff, while leaders remain sensitive to what is going on and intervene only when necessary. However, interventions may be interpreted as a sign of distrust, or as a signal that power remains centralised and that no response is required from those closest to the problem. If we intervene, we need to explain and clarify the purpose of the intervention.

Maintaining trust

A crisis can be deliberately triggered because of hidden agendas. Crises encourage us to be more defensive, and walls are consequently built around silos (defensive retreats). In this situation, trust can evaporate. Trust can be established (or re-established) if we focus on showing compassion and concern. In a crisis, people may think that their work and importance are diminished. If, for example, a Tiger Team has been parachuted in to mediate, other stakeholders may find themselves side-lined. Show ‘real’ concern to every project team member. Consider the necessity of shifting power, for example to Tiger Teams, and explain the rationale for the decisions and leadership interventions that have been made.

Showing concern goes hand-in-hand with being honest and transparent. Communicate that there are conflicting perspectives and expectations; be honest about the pressure that people are under but show optimism that solutions are feasible. Provide pertinent information in ‘real-time’. Outdated information may be misinterpreted as following a hidden agenda.

However, although communication in a crisis is essential, it needs to be controlled. Unreliable information may only add to rumours and fuel false impressions. People will want to air their opinions, especially when their own positions and departments are involved. Unwarranted speculation about what is happening or what might be done is detrimental if project managers fail to control it. We should offer our project members valid information and plenty of opportunities to voice their opinions, but information (or a lack thereof) should not be turned into ammunition to serve political agendas. Facilitate communication in an open manner while assuring the reliability of its content.

Manage behaviour, not plans

Crisis management plans or checklists (outlining sets of predefined, ‘mechanistically’ and thus mindlessly performed actions) are there for a reason: to provide some form of structure and order in an environment perceived as being chaotic. However, these plans can be a double-edged sword. On the positive side, they help to trigger behaviours quickly and efficiently. Conversely, they may suppress situated human cognition. If plans do not adequately match the situation at hand, we may blindly walk into disaster. We constantly need to reflect on the appropriateness of plans and checklists and their execution. If a plan does not appear to match the situation as we perceive it, we must deviate from it. Ultimately, we manage behaviour; plans are there to support this, not vice versa.

Channel resources to where they are needed first

Contingency plans do help to pinpoint the critical functions in a project. The question ‘What must not go further wrong?’, a question often not asked, drives a greater understanding of where resources matter most. In a crisis, we may throw resources at anything that poses a threat. A much more effective course of action is to prioritise the deployment of resources to where they matter. For example, if a project delivers a range of benefits (e.g. functions), then systematically categorising what must not or should not go wrong (any further) is a sensible approach. The ‘must not’ category requires the greatest attention and forces clear prioritisation.
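
A minimal sketch of this ‘must not’/‘should not’ categorisation and of channelling scarce resources to the ‘must not’ category first; the function names, categories and numbers below are hypothetical, not drawn from the text:

```python
# Hypothetical project functions, tagged by how badly they must be protected.
FUNCTIONS = [
    {"name": "testing environment", "category": "must not fail",   "people_needed": 3},
    {"name": "release pipeline",    "category": "must not fail",   "people_needed": 2},
    {"name": "reporting dashboard", "category": "should not fail", "people_needed": 2},
]

def allocate(people_available: int) -> dict[str, int]:
    """Assign scarce people to functions, serving 'must not fail' first."""
    priority = {"must not fail": 0, "should not fail": 1}
    allocation: dict[str, int] = {}
    for f in sorted(FUNCTIONS, key=lambda f: priority[f["category"]]):
        assigned = min(f["people_needed"], people_available)
        allocation[f["name"]] = assigned
        people_available -= assigned
    return allocation

print(allocate(people_available=4))
# -> {'testing environment': 3, 'release pipeline': 1, 'reporting dashboard': 0}
```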

Learning from crises

A crisis in which a project stands at the edge of disaster requires reflection and learning. However, we often want to detach ourselves from this uncomfortable experience. The urge to forget and move on to other tasks can leave the potential for learning untapped. Learning from a crisis, if it happens, often takes the form of analysing, documenting and allocating root causes, with the purpose of standardising responses to any future crisis. In its own way, this appears to be a sensible solution unless we operate in an environment in which crises unfold in random patterns. However, expecting a similar predicament to hit you another time may, in itself, form a root cause for future failure.

Learning should go beyond the past, and we, as learners, should be reluctant to replace valuable human cognition with yet another layer of prescribed process and procedure without a strong rationale for doing so. But how? Storytelling is considered a potent mechanism for conveying rich context, such as an event or crisis, and for providing a platform for the ‘listeners’ to develop their learning.

A ‘good’ story:

  • is authentic and one that the listener is familiar with and can relate to;
  • combines words with images and audio to appeal to all our senses;
  • is connected to an organisational narrative or a bridge is built so that the story is linked to the context of the listener;
  • provides a clear structure, often helped by a timeline;
  • is simple and relatively short, to maintain attention.

People naturally make sense of experience through storytelling, and therefore, it can be a very powerful learning tool. Storytelling done well can encourage reflection, inspire current and future collaborative approaches, stimulate enquiry and help to build knowledge and understanding. Additionally, cultural and emotional contexts can be understood and acknowledged as being important. It is only one way to reflect upon practice and find ways of making sense of crises but, compared with dry, codified knowledge that may never be read, it is a very accessible means of learning. Indeed, it happens anyway. In social gatherings, we may trade ‘war stories’ of what went wrong in projects we have been involved in.

Law of triviality

Initially observed by the British naval historian Cyril Northcote Parkinson, the law of triviality describes the tendency for people to devote inordinate amounts of time and effort to thinking about and resolving minor, trivial details while ignoring significant or crucial matters (Keller and Meaney 2017). This differs from the more famous Parkinson’s law, which observes that work expands to fill the time allocated for it. The law of triviality has become known as ‘bikeshedding’ after Parkinson’s fictional committee convened to approve the plans for a nuclear power station: the committee spent an inordinate amount of time on the trivial question of the material to be used for the bikesheds.

In organisations, the law of triviality can manifest itself in many ways. The most obvious is that organisations tend to give disproportionate weight to trivial issues. This can mean that the time devoted to any particular item is inversely proportional to the amount of money involved, and that the energy, effort and time generated by proposed changes are inversely proportional to the complexity of those changes.

There would seem to be some underlying reasons explaining behaviours consistent with the law of triviality. Perhaps the most important of these is that people prefer to focus on and form an opinion about problems that are easier to understand than more complex issues. Connected with this, making decisions about more essential matters brings with it more responsibility for those decisions, and people often seek to avoid or dodge responsibility for decisions. Often, managers will assume that people responsible for complex decisions will already have done their job and assessed the issue. Finally, people are drawn to trivial issues as these require less time, effort and money to resolve.

Mindful practices

Challenging ways of working

In The Technology Partnership (TTP), it is the responsibility of project managers to explore beyond the ‘known expertise’. They do this through extensive empowerment but this only works if access to additional know-how is provided:

If you need particular expertise on a project, you can pull it from anywhere in the company.

The provision of additional expertise gives project managers multiple perspectives but does not constitute a delegation of responsibility. Deference to expertise, as exercised at TTP, is aimed at:

  • seeing your project from a different perspective;
  • encouraging scepticism;
  • acknowledging adversarial views;
  • challenging your assumptions.

All of this is done in the interest of making fewer assumptions, noticing more and ignoring less. It is about addressing risk blindness, that is, building the ability to notice blind spots, and is carried out at TTP using an elaborate process of ‘peer reviews’. This process acts as a ‘sensor’ to highlight blind spots and, in TTP, is carried out by independent functions.

TTP’s peer reviews are not designed to ‘check’ whether project managers are compliant with the organisation’s rules and procedures. Instead, they are designed to make project managers think about what they are doing and, most importantly, why.

Even in times of urgency, when there might be a temptation to rely on an ‘autopilot’ mentality and replicate what one has done in the past, deference to expertise provides a ‘sanity check’.

What is mindful about it? The implementation of such a peer-review system has its challenges, too. Project managers might see it as ‘Big Brother’ watching over them and telling them what is right and what is wrong. Or they might rely on their peers as a crutch to help them make decisions rather than making decisions for themselves. This is why challenging assumptions to detect risk blindness should not include the imposition of ‘answers’. In TTP, the peer-reviewing mechanism acknowledges the ‘folly of imposed solutions’ and offers support to make a project manager think and be creative in his or her problem-solving; it is not about making the project manager obey. This is in stark contrast with many other organisations, where ‘auditing’ is used to ensure that employees remain within a supposedly self-evidently correct management framework. Not so TTP, which uses expertise ‘just’ to inform.

The impact of recovering on relationships

A crisis is a time of high emotion, with the project’s viability at stake. The threat of stopping or suspending the work, and the resulting potential damage to the reputation of all the parties involved, hangs like a dark cloud over the heads of stakeholders. It is vital that crisis management efforts are targeted not only at the most vulnerable and most critical functions but also at those people who are most affected (and these are not necessarily those in the thick of it). Relationships are at stake.

Establish clarity

The antithesis of a defensive retreat is to break down barriers and share information freely while being sure to control the accuracy of the information. Sharing information should be done with the help of clear contact points. Stakeholders should not need to seek out sources of information on crisis updates or how the crisis is being dealt with. An obvious choice is the project manager, who is most often closest to the evolving situation and has the most unobstructed view of events.

Listening

An important aspect of crisis management is that of caring through listening. Listening is not as easy as it may seem, though. Reflective listening involves both content checking and feeling checking. Content checking implies mutual acknowledgement of each other’s understanding of what has been said; restating content provides reassurance of a shared understanding. Feeling checking is not so much about the content as about the emotions, involving feedback and reflection on each other’s emotional state.

As with so many skills that are important in a crisis, listening to show that one cares about another’s content and emotions is not without barriers:

  • Anticipating a message: you may already think or expect in advance what the person is going to say and hence you might interrupt them.
  • Rehearsing an answer: while the person is trying to convey their message, you may already be thinking about an answer, and thus you will not give them the attention they deserve.
  • Thought wandering: a cue by a person may make your thoughts wander off. This may lead to misinterpretation and the need for that person to repeat the message.
  • Premature conclusion: you may already have come up with a conclusion, although the message is incomplete.

Collectively owning a crisis

A crisis, regardless of whether it is sudden or creeping, is often caused by a multitude of factors. Hence, searching for a single root cause is often a futile exercise; the absence of a single root cause does not, however, negate the collective ‘ownership’ of the crisis. Ownership is not to be mistaken for accountability. A project manager may be held accountable for what he or she does to resolve the crisis, by providing timely feedback and measuring progress toward recovery. Ownership, though, is the obligation of the stakeholders collectively. It is created by establishing collaboration and a sense of partnership, in the belief that recovering from a crisis is in the best interest of all parties. Commitment from all parties involved to engage in timely (and often costly) troubleshooting is highly valuable and provides a sense of cohesion.

Think long-term relationship

A characteristic of a defensive retreat is myopia: short-term thinking. We tend to ask ourselves during a crisis how to recover from it in the short term. Our horizon may not move beyond the phase of recovery. However, not only is the project at risk but also the long-term relationships with stakeholders. Questions need to be asked about what happens after a successful recovery, and how (potentially) damaged trust between stakeholders can be re-established. It is dangerous to wait until after the crisis has passed to consider how groups and individuals could and should work together in the longer term. Projects are (by definition) transient and, although it is difficult when current work is in turmoil, it is important to consider future projects and the sustainable working relationships that will be necessary to support them.

Hyperbolic discounting

A behaviour commonly found among people is that their perceptions of rewards vary over time. It is found that the way people value a relative reward or return at some point in the future differs when it is compared to a valuation of that reward at an earlier date. This non-consistent, time-based valuation is described by the theory of hyperbolic discounting (Frederick and Loewenstein 2002). It is called hyperbolic discounting because the behaviour observed is found to invariably follow a hyperbolic curve (Bradshaw et al. 1976). This is because rewards in the here and now are weighted by people more heavily than future ones. If the reward is far into the future, then its value diminishes to virtually nothing. This kind of delay discounting helps to explain impulsive behaviour by people. The tendency to seek immediate gratification can be seen in all sorts of situations, such as craving a cigarette, over-indulging in food or alcohol, and general procrastination.

Primarily, hyperbolic discounting exists because of the nature of time: people have limited time available to them and instinctively recognise that resources stored for the future cannot be used if they are unable to avail themselves of them. Because the future is uncertain, and the distant future highly unpredictable, the result is a present bias: consume now (Azfar 1999).
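
To make the shape of this bias concrete, here is a minimal formal sketch using the standard hyperbolic discount function from the discounting literature; the symbols and the illustrative value of k are assumptions for illustration rather than anything stated in the text above:

```latex
% Present (subjective) value V of a reward A received after a delay D,
% with k > 0 an assumed discount-rate parameter:
V(D) = \frac{A}{1 + kD}
% Exponential (time-consistent) discounting, for comparison:
V_{\mathrm{exp}}(D) = A \, e^{-kD}
% Worked example with A = 100 and k = 1 per year:
%   D = 1 year:  V = 100 / (1 + 1) = 50
%   D = 9 years: V = 100 / (1 + 9) = 10
```

The hyperbolic form falls steeply for short delays and then flattens, which is why rewards in the here and now dominate and distant rewards shrink towards almost nothing, as described above.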

Sharing the burden of recovery

It is tempting in a crisis to look for a root cause and to allocate the burden (costs, emotions, responsibility for recovering from it) to those believed to have triggered the predicament. This search for single-point failures and single-point accountability is often already manifested in the choice of contract. Projects most often rely on a ‘traditional’ type of contract, with a focus on the position of one party to the contract relative to the actions of the other parties. Essentially, this can set up an adversarial relationship, with each party to the contract protecting its position and looking to maximise its benefit. The agreement itself can encourage and exacerbate the adversarial stance taken by the various parties delivering the project. The focus can shift to personal gain rather than the goals of the project. This is the antithesis of project partnering, wherein there is an implicit (and often explicit) assumption that all parties involved in the project are committed to a single goal while recognising the different and shared needs of the various organisations involved. This is well understood in many project environments and has led to the development of a variety of alternative, more collaborative, contract forms. With collaboration comes the incentive to share the costs of tackling problems. Typical of these types of contract is some form of pain/gain share agreement, whereby the costs of failing to meet milestones or objectives are shared among project participants. By the same token, if the project is delivered below budget, or early, then all participants share in this better-than-expected outcome.
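
As a minimal sketch of how such a pain/gain share might be settled (the target cost, actual cost, party names and share ratios below are hypothetical assumptions, not figures from the text):

```python
def pain_gain_settlement(target_cost: float, actual_cost: float,
                         shares: dict[str, float]) -> dict[str, float]:
    """Distribute the cost overrun (pain) or underrun (gain) among parties.

    `shares` maps each party to its agreed share of pain/gain (must sum to 1.0).
    A positive result means the party bears extra cost; a negative result means
    it receives part of the saving.
    """
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    variance = actual_cost - target_cost  # overrun (+) or underrun (-)
    return {party: round(variance * share, 2) for party, share in shares.items()}

# Hypothetical example: a £10m target cost, a £10.6m actual cost, with the
# client and two suppliers sharing pain/gain 50/25/25.
print(pain_gain_settlement(10_000_000, 10_600_000,
                           {"client": 0.5, "supplier_a": 0.25, "supplier_b": 0.25}))
# -> {'client': 300000.0, 'supplier_a': 150000.0, 'supplier_b': 150000.0}
```

Because every party carries a slice of any overrun and receives a slice of any saving, the contractual incentive points towards joint problem-solving rather than towards proving which party ‘caused’ the variance.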

The key aim of a partnering contract is to encourage swift and efficient collaborative problem-solving while avoiding the sometimes crippling transaction costs that are so often the outcome of more traditional contract forms. These transaction costs have two effects on the project: they can mire the project participants in bickering over where the fault lies, rather than focusing on resolution, and they involve inordinate record-keeping and costly arguments over who is to blame, often resulting in protracted legal disputes long after the work has been completed.

To avoid some of these problems, collaborative, partnering-type projects will typically be ‘open-book’ whereby the client and the project team can view each other’s project documents. There are two main justifications for this: project parties have to trust each other, and it avoids the need for costly claims. Hidden agendas are removed, and the project staff and workers start to focus on the needs of the project rather than those of their organisations.

Celebrate a victory

Overcoming a crisis is a feat to celebrate. Crises do occur and often are not preventable. They are high-pressure situations in which emotions take the upper hand and recovering from one deserves recognition. However, it is tempting to lay the memories of such a painful phase of the life cycle to rest, to forget and to move on. Stakeholders need to recognise their successful recovery and rebuild potentially damaged relationships. By celebrating victory over a crisis, negative connotations about its occurrence can, at least to some extent, be alleviated.

Negativity bias

Negativity bias (sometimes known as the negativity effect) describes the tendency of people to give more weight to negatively perceived events than they do to positively viewed events, even where those positive events are just as important as (or better than) the negative ones. In other words, ‘bad … is stronger than good’, by a factor of as much as three (Baumeister et al. 2001). Adverse events can take many forms, from everyday activities to significant life traumas. Whatever they are, they will almost always outweigh similar good events in terms of their importance to people. Events may be outcomes of close relationships, social networks or physical events.

There are several explanations for people’s innate negativity bias. It may be that we pay more attention to adverse events than to positive ones. Another explanation is that we learn more from adverse events than from positive ones. From an organisational point of view, people also tend to make decisions based on negative data rather than positive data, and people are more motivated to complete tasks if they think they will lose something than if they believe they will gain something.

In organisations, negativity bias manifests itself in a host of ways. In particular, it has been found to affect people in their interest in doing something new or seeking to innovate. This is because managers and other organisational decision-makers tend to recall when innovation and change failed more readily than when it succeeded (Luthans et al. 2011). In addition, workplace discipline and subsequent productivity can be impacted by negativity bias (Skaggs et al. 2018).

Apple – a success of recovering

In 1997, Apple was about 90 days from going bankrupt. The Silicon Valley pioneer had been founded in 1976 with a mission to mass-market small, simple and, most of all, affordable computers. From the mid-1980s, Apple failed to compete with Microsoft. A lack of new ideas and failed products, as well as the gamble of taking over ailing companies such as NeXT, took their toll. The company was burning through cash. By 1997, Apple had lost $867 million, and its market value had fallen to around $3 billion.

In July 1997, Steve Jobs returned to Apple (after he had been ousted in 1985). His first step to get Apple out of the red was criticised by many: he aligned Apple with its key competitor, Microsoft. Bill Gates, the CEO of Microsoft, and Jobs announced a cooperation that would allow the release of an updated Mac version of Microsoft Office, as well as a significant investment in Apple of around $150 million. Jobs argued:

‘We have to let go of this notion that for Apple to win, Microsoft has to lose’ (Shontell 2010).

On such a solid footing, feeding off the successful products of Microsoft, Jobs imposed his authority, confidence and good ideas to lead Apple back into profitability. The iMac G3, introduced in 1998, has defined the hallmark of Apple ever since: pioneering technology, underlined by simplicity and appeal in design. The ‘all-in-one’ iMac became a hit, with 800,000 units sold in the first five months. The iMac was followed by the iPod in 2001 and the debut of the iPhone in 2007. At the launch of the iPhone, Apple was worth $73.4 billion.

In many ‘failing’ project-based organisations, the first reaction may well be to ‘entrench’ the company’s existing capabilities and culture for delivering projects, that is, to think and act more mindlessly by reinvigorating the very past that may well have triggered the crisis in the first place. Like Apple, an organisation needs to break with the past, free itself of past crises, be reluctant to press the panic button, and decisively shape a future that creates opportunities. The discomfort of mindfully shaping that future lies in empowering employees to look past the crisis, engaging stakeholders in the process of creating mindful behaviours, and building sustained commitment to embracing uncertainty beyond what is known from the past.

Towards an art of recovering

A crisis is something to be anticipated. Uncertainty will sometimes slip through our defences, and complexity will do the rest in creating a state that threatens the viability of a project. This threat increases the pressure for us to act mindlessly, to jump to conclusions, and emotions will run high. Counterproductive behaviour is to be expected and needs to be managed carefully. Defensive retreats need to be broken down and objectivity re-established. Additional capabilities, for example, the use of a Tiger Team, may need to be parachuted in, to provide mindful interventions.

The previous sections provided a range of suggestions on how to cope with a crisis in a project. These suggestions (and to remind the reader, they are just suggestions) are encapsulated in the statements of the reflection exercise below.

Reflection

How well do the following statements characterise your project? For each statement, select the one number that best reflects your conclusion, on a scale from 1 (fully agree) through 3 (neither agree nor disagree) to 5 (fully disagree).

  • We have a common understanding that crises are a normal, yet infrequent, part of project life. (1 2 3 4 5)
  • A crisis includes the management of behaviours, and less so of plans. (1 2 3 4 5)
  • We prepare for crises by experiencing them via simulations and do not rely exclusively on a written crisis management plan. (1 2 3 4 5)
  • We use independent personnel to give objectivity and help develop solutions in crises. (1 2 3 4 5)
  • We accept that decision rigidity may hamper the development of an effective response, and we provide appropriate levels of freedom to enable creative solutions to emerge. (1 2 3 4 5)
  • We try to care for every stakeholder by keeping them informed of the evolving situation and addressing their particular needs. (1 2 3 4 5)
  • Managing the full flow of information is critically important in a crisis. We provide relevant and timely information in a calm, orderly, and controlled manner. (1 2 3 4 5)
  • We recognise the importance of sharing the burden of a crisis. (1 2 3 4 5)
  • We reflect on our behaviour in a crisis to learn to deal with future events. (1 2 3 4 5)

Scoring: Add the numbers. If you score higher than 27, your mindful capability to recover from a crisis is good. If you score 27 or lower, please consider how you may be able to enhance your capability to manage a crisis.
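
A minimal sketch of the scoring rule described above; the nine response values are hypothetical:

```python
# One selected value (1-5) per statement in the reflection above (hypothetical answers).
responses = [4, 3, 5, 4, 2, 4, 3, 4, 3]

total = sum(responses)  # 32 for these example answers
if total > 27:
    print(total, "- your mindful capability to recover from a crisis is good")
else:
    print(total, "- consider how you may enhance your capability to manage a crisis")
```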

References

Ashkanasy, N. M., C. A. Windsor, and L. K. Treviño. 2006. “Bad Apples in Bad Barrels Revisited: Cognitive Moral Development, Just World Beliefs, Rewards, and Ethical Decision-Making.” Business Ethics Quarterly 16(4): 449–73.

Azfar, O. 1999. “Rationalizing Hyperbolic Discounting.” Journal of Economic Behavior and Organization 38(2): 245–52.

Babad, E., and Y. Katz. 1991. “Wishful Thinking—Against All Odds.” Journal of Applied Social Psychology 21(23): 1921–38.

Baumeister, R. F., E. Bratslavsky, C. Finkenauer, and K. D. Vohs. 2001. “Bad Is Stronger Than Good.” Review of General Psychology 5(4): 323–70.

Bradshaw, C. M., E. Szabadi, and P. Bevan. 1976. “Behavior of Humans in Variable-Interval Schedules of Reinforcement.” Journal of the Experimental Analysis of Behavior 26(2): 135–41.

Bukszar, E., and T. Connolly. 1988. “Hindsight Bias and Strategic Choice: Some Problems in Learning from Experience.” Academy of Management Journal 31(3): 628–41.

Campbell, M. C., and C. Warren. 2015. “The Progress Bias in Goal Pursuit: When One Step Forward Seems Larger than One Step Back.” Journal of Consumer Research 41(5): 1316–31.

Cassar, G., and J. Craig. 2009. “An Investigation of Hindsight Bias in Nascent Venture Activity.” Journal of Business Venturing 24(2): 149–64.

Christensen-Szalanski, J. J. J., and C. F. Willham. 1991. “The Hindsight Bias: A Meta-Analysis.” Organizational Behavior and Human Decision Processes 48(1): 147–68.

Frederick, S., and G. Loewenstein. 2002. “Time Discounting and Time Preference: A Critical Review.” Journal of Economic Literature 40: 351–401.

Galai, D., and O. Sade. 2006. “The ‘Ostrich Effect’ and the Relationship between the Liquidity and the Yields of Financial Assets.” Journal of Business 79(5): 2741–59.

Hafer, C. L., and L. Bègue. 2005. “Experimental Research on Just-World Theory: Problems, Developments, and Future Challenges.” Psychological Bulletin 131(1): 128–67.

Karlsson, N., G. Loewenstein, and D. Seppi. 2009. “The Ostrich Effect: Selective Attention to Information.” Journal of Risk and Uncertainty 38(2): 95–115.

Keller, S., and M. Meaney. 2017. “High-Performing Teams: A Timeless Leadership Topic.” McKinsey Quarterly 1(3): 81–87.

Kutsch, E., and M. Hall. 2005. “Intervening Conditions on the Management of Project Risk: Dealing with Uncertainty in Information Technology Projects.” International Journal of Project Management 23(8): 591–99.

Lerner, M. J. 1980. The Belief in A Just World: A Fundamental Delusion. New York: Plenum.

Luthans, F., C. M. Youssef, and S. L. Rawski. 2011. “A Tale of Two Paradigms: The Impact of Psychological Capital and Reinforcing Feedback on Problem Solving and Innovation.” Journal of Organizational Behavior Management 31(4): 333–50.

Nordgren, L. F., F. Van Harreveld, and J. Van Der Pligt. 2009. “The Restraint Bias: How the Illusion of Self-Restraint Promotes Impulsive Behavior.” Psychological Science 20(12): 1523–28.

Pfeffer, J. 2010. Power: Why Some People Have It and Others Don’t. New York: Harper Business.

Pohl, R. F., and E. Erdfelder. 2004. “Hindsight Bias.” In: Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, edited by Rüdiger F. Pohl, 363–78. Hove: Psychology Press.

Shontell, A. 2010. “TIP OF THE DAY: ‘We Have To Let Go Of This Notion That For Apple To Win, Microsoft Has To Lose’.” Business Insider. https://www.businessinsider.com/tip-of-the-day-we-have-to-let-go-of-this-notion-that-for-apple-to-win-microsoft-has-to-lose-2010-9?r=US&IR=T.

Skaggs, B. C., C. C. Manz, M. C. B. Lyle, and C. L. Pearce. 2018. “On the Folly of Punishing A while Hoping for A: Exploring Punishment in Organizations.” Journal of Organizational Behavior 39(6): 812–15.
