the load is less than the capacity, the situation is by definition considered normal. That is, however, correct only from a quantitative point of view. From a qualitative point of view, information input underload matters because it negatively affects the operator’s ability to be in control. Information input underload may occur either because information is missing (a true underload condition) or because it has been discarded for one reason or another, for instance in response to a preceding overload condition.
Table 4.2: Coping Strategies for Information Input Underload

Extrapolation: Existing evidence is ‘stretched’ to fit a new situation; extrapolation is usually linear, and is often based on fallacious causal reasoning.

Frequency gambling: The frequency of occurrence of past items/events is used as a basis for recognition/selection.

Similarity matching: The subjective similarity of past to present items/events is used as a basis for recognition/selection.

Trial-and-error (random selection): Interpretations and/or selections do not follow any systematic principle.

Laissez-faire: An independent strategy is given up in lieu of just doing what others do.
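Two of the strategies in Table 4.2, frequency gambling and similarity matching, are recognition heuristics that admit a compact computational illustration. The sketch below is not from the original text; the function names, the event labels and the crude feature-overlap measure of ‘subjective similarity’ are illustrative assumptions only.

```python
from collections import Counter

def frequency_gambling(past_events):
    """Pick the interpretation that has occurred most often in the
    past, regardless of how well it fits the present cues."""
    interpretation, _ = Counter(past_events).most_common(1)[0]
    return interpretation

def similarity_matching(present_cues, known_patterns):
    """Pick the stored pattern whose features overlap most with the
    present cues (a crude stand-in for subjective similarity)."""
    def overlap(pattern):
        return len(set(present_cues) & set(pattern["features"]))
    return max(known_patterns, key=overlap)["label"]

# Sparse input forces a gamble on sheer past frequency.
past = ["pump trip", "sensor drift", "pump trip", "pump trip"]
print(frequency_gambling(past))  # -> 'pump trip'

# Richer cues allow a similarity match instead.
patterns = [
    {"label": "pump trip", "features": {"low flow", "high vibration"}},
    {"label": "sensor drift", "features": {"low flow", "stable vibration"}},
]
print(similarity_matching({"low flow", "stable vibration"}, patterns))
# -> 'sensor drift'
```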
Whereas coping strategies here have been described as responses to input
conditions of either overload or underload, it is more in line with CSE to see
them as typical ways of responding and not just as determined by the
working conditions. The choice of a coping strategy not only represents a short-term or temporary adjustment but may equally well indicate a more permanent style. One example of that is the balance between intuition and analysis mentioned above; another is a generic efficiency-thoroughness trade-off (ETTO; cf. Hollnagel, 2004). Indeed, many of the judgment heuristics
described by Tversky & Kahneman (1974), such as representativeness,
availability, adjustment and anchoring, or even the concept formation
strategies described by Bruner, Goodnow & Austin (1956), such as focus
gambling, conservative focusing, simultaneous scanning, and successive
scanning, represent not only temporary adjustments to save the situation but
also long-term strategies that are used for many different situations. Coping
with complexity in a long-term perspective may require a JCS to conserve
effort and keep spare capacity for an ‘emergency’, hence to make a trade-off
before it is objectively required by the current situation.
DESIGNING FOR SIMPLICITY
Growing system complexity can lead to a mismatch between (task) demand and (controller) capacity. This mismatch can in principle be reduced either by reducing the demands, by increasing the capacity, or by doing both at the same time. As described in Chapter 2, the history of technology
can be seen as a history of amplification of human capabilities, directly or
indirectly. In our time this has opened up a world of decision support
systems, pilot’s associates, human-centred automation, and so on, of which
more will be said in Chapter 6. This has been a major area of research and
development since the early 1980s and remains so despite the often
disappointing results.
Another way to extend capacity is to select people who excel in some way or other. Selection can be supplemented by training, to ensure that people have the necessary competence to accomplish the required tasks. An extreme
example of that is the training of astronauts. More down to earth is the
training of operators for complex processes, such as nuclear power plants or
fighter aircraft, where the training may take years. Since training, however, is
costly and furthermore must be maintained, considerable efforts are put into
making systems easier to operate, for instance by increased automation
and/or better design of the work environment, thereby reducing the demands
for specialised training.
A reduction of the mismatch can also be achieved by reducing the demand through smart system design, specifically by simplifying the information presentation. This approach has been taken to the extreme in the pursuit of user-friendly products of all kinds. Here the ideal is to eliminate altogether the need to cope with complexity by making the use of the artefact ‘intuitive’. This means that everyone should be able to use the artefact immediately, or at least after reading the ‘quick start’ instructions that today accompany most consumer products with some modicum of functionality. This has led to the adage of ‘designing for simplicity’, which, if taken seriously, means that the complexity becomes hidden behind the exterior, so that the requisite variety is reduced to something that corresponds to the ‘innate’ abilities of humans.
Putting it this way makes it easy to see that the approach is doomed to failure, for the reason that complexity or variety is not actually reduced but rather increased, since the entropy of the modified artefact increases. Another problem is that there is little or no consensus about how the ‘innate’ or minimal abilities of humans should be determined.
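The appeal to requisite variety can be stated compactly. In one common entropy formulation of Ashby’s law, given here as an illustration rather than as a quotation from any of the cited sources,

\[
H(E) \;\geq\; H(D) - H(R),
\]

where \(H(D)\) is the variety of the disturbances, \(H(R)\) the variety of the responses available to the regulator, and \(H(E)\) the residual variety of the outcomes. Hiding complexity behind a simple exterior reduces neither \(H(D)\) nor the \(H(R)\) that effective control requires; it merely relocates where the variety must be absorbed.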
Simplicity-Complexity Trade-Off
In discussing the trade-off between simplicity and complexity there are a
number of inconvenient facts that must be recognised. The first is that both
complexity and simplicity are epistemological rather than ontological
qualities, as discussed above. This means that the degree of complexity or
simplicity of the description of a system is relative to the user, to the point of
view taken, and to the context and purpose of the description. This relativity
means that any description is vulnerable to the n+1 fallacy. The n+1 fallacy refers to the fact that while it is possible to describe the system for n different conditions, hence to account for the complexity under these conditions, there will always be a condition that has not been accounted for: the n+1 condition. This is so regardless of how large n is. The consequence for system design is that it cannot be guaranteed that the system description will also be simple for the n+1 situation; the very principle of designing for simplicity therefore has a built-in limitation.
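A trivial sketch makes this built-in limitation concrete. Assume, purely for illustration, a display configuration keyed to the n conditions enumerated in a design base; the condition and display names below are invented. However large n is made, a condition outside the enumeration falls through to a default whose simplicity the design cannot guarantee.

```python
# Hypothetical design base: n conditions mapped to tailored displays.
DESIGN_BASE = {
    "startup": "startup_overview",
    "full_power": "power_summary",
    "shutdown": "shutdown_checklist",
}  # n = 3 here; the argument is the same for any finite n

def select_display(condition):
    """Return the display designed for the condition, or fall back.
    The fallback is exactly the n+1 case: a condition the designers
    did not (and could not exhaustively) account for."""
    return DESIGN_BASE.get(condition, "raw_data_dump")

print(select_display("full_power"))     # designed for: simple
print(select_display("partial_scram"))  # the n+1 condition: fallback
```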
Any local improvement will invariably be offset by an increase in the complexity and variety of other situations, essentially those that have not been included in the design base, and will therefore lead to an overall increase in complexity. This may not have any discernible effects for long periods of time, but it nevertheless remains a potential risk, similar to the notion of latent conditions in epidemiological accident theories (Reason, 1997). In this case the risk is unfortunately one that has been brought into the system by well-meaning interface designers, rather than one that has occurred haphazardly or unexpectedly.
Information Structuring
The common basis for interface design has been the ‘right-right-right’ rule,
according to which the solution is to display or present the right information,
in the right form, and at the right time. This design principle is simplicity
itself and would undoubtedly have a significant effect on practice if it were
only possible to realise it. The basic rationale for the design principle is the
hindsight bias to which we all succumb from time to time – although some do
so more often than others. The essence of that bias is that when we analyse an
incident or accident that has happened, such as the Apollo 13 problem
(Woods, 1995), we can very easily propose one or several solutions that
would have avoided the problem, if only they had been implemented in time.
(The hindsight bias is thus equivalent to the fallacy of relying on
counterfactual conditionals.) That is, if only information X had been
presented in format Y, then the poor operators caught by unfortunate
circumstances would have understood the situation correctly, hence not have
failed. The shortcomings of reasoning in this manner are, however, easily
exposed if we look at each of the elements of the ‘right-right-right’ rule in
turn.
The Right Information
The right information can be determined in two fundamentally different ways. One is by referring to a specific and preferably ‘strong’ theory of human action. In practice, this has often been a theory of human information processing, which emphatically is not the same as a theory of human action. The literature is awash with various display design principles, although some are more principled than others. Early examples are the work of Goodstein (1981), Rouse (1981) and, of course, Rasmussen & Vicente (1987) (see also Vicente & Rasmussen, 1992). The latter is representative of what we talk about here, because the display design refers to one of the better-known versions of a human information-processing model.
Another way of determining the right information in advance is to consider specific situations as they are defined by the system design. An almost trivial example of that is a situation described by operating procedures, both for normal operations and for emergencies. In these cases it is clearly possible to derive from the procedure what the information demands are, both in content and in form. Assuming that the operators follow the procedures rigidly, and also that the procedures are valid for the situation, the display problem is in principle solvable. Examples of that are found in task-based information displays (O’Hara et al., 2002) and in the various ways of structuring procedures (event-based, symptom-based, critical-function based), e.g., Colquhoun (1984).
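Under the two assumptions just stated, rigid procedure following and valid procedures, the derivation of information demands can in principle be mechanised. The sketch below is an illustration of the idea only; the step structure and field names are invented and do not describe any of the cited systems.

```python
# A procedure step, hypothetically, names the parameters it needs
# (content) and how each should be shown (form).
PROCEDURE = [
    {"step": 1, "action": "verify primary pressure < 155 bar",
     "needs": [("primary_pressure", "trend")]},
    {"step": 2, "action": "confirm both feedwater pumps running",
     "needs": [("pump_A_status", "indicator"),
               ("pump_B_status", "indicator")]},
]

def information_demands(procedure, current_step):
    """Derive display content and form from the active step. Valid
    only if operators follow the procedure rigidly and the procedure
    itself fits the situation."""
    step = next(s for s in procedure if s["step"] == current_step)
    return step["needs"]

print(information_demands(PROCEDURE, 2))
# -> [('pump_A_status', 'indicator'), ('pump_B_status', 'indicator')]
```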
A less stringent criterion is to identify the information that is typically required for a range of situations and that can be derived from, e.g., a control-theoretic analysis of the system (Lind & Larsen, 1995). An example that has been widely used by industry is the so-called star display developed in the 1980s. Another example, which also has considerable practical merit, is the concept of critical function monitoring (Corcoran et al., 1981). Similar examples may be found in aviation, where much ingenuity and creativity go into the design of the EDIC, precisely because it is a very limited display area and hence enforces a keyhole effect (Woods & Watts, 1997).
In the Right Form
Assuming that the problem of determining what the right information is has
been solved, the next challenge is to present the information in the right form.
For any specific information content there is a large variety of information forms. The importance of form (or structure) has been studied in problem-solving psychology at least since Duncker (1945) and is an essential issue in information presentation and display design (Tufte, 2001).
Most of the proposed design principles do in fact combine content and
structure (form), although it is always possible to suggest alternative ways of
structuring a given set of data. This, however, is precisely the problem since
it is very difficult to determine in advance what the best possible
representation format is. A great number of factors must be taken into consideration, such as personal preferences, experience, the norms and standards of the work environment, whether the work is done by a single user or by a team, etc.
Other factors are the temporal and task demands, e.g. whether there are
multiple tasks and severe time constraints or few tasks and a more leisurely
pace. Clearly, in a state of high alert and with a need to respond quickly, it
may be advantageous either to highlight the important piece of information or
to filter out less important information. Both solutions affect the form of the
presentation, but neither is easy to do in a consistent manner.
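The two options, highlighting the important and filtering out the unimportant, can be contrasted in a few lines of illustrative code. The priority values and the threshold below are assumptions; choosing them consistently across situations is exactly the difficulty noted above.

```python
def highlight(items, threshold):
    """Keep everything, but mark the high-priority items."""
    return [("** " + text if priority >= threshold else text)
            for text, priority in items]

def filter_out(items, threshold):
    """Suppress the low-priority items altogether."""
    return [text for text, priority in items if priority >= threshold]

alarms = [("reactor trip", 9), ("printer offline", 1), ("low oil", 5)]
print(highlight(alarms, threshold=5))   # all shown, two marked
print(filter_out(alarms, threshold=5))  # 'printer offline' disappears
```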
Since it is impossible to reduce the real complexity, the alternative
solution is to reduce the perceived complexity of the system by simplifying
the information presentation. The reasoning is that if the system can be made
to look simpler, then it will also be simpler to control. The fundamental
problem with this approach is that it shifts the complexity from the exterior to
the interior of the system. Designing for simplicity does not actually reduce
complexity or eliminate demands but only changes their appearance and
focus. That the principle nevertheless has had a considerable degree of
success, for instance as ecological interface design (Vicente & Rasmussen,
1992), is due to the simple fact that the effects of complexity are unevenly
distributed over all possible situations. Good interface or good interaction
design may in many cases produce a local reduction in complexity, for
instance for commonly occurring situations or conditions, which means that
these tasks become easier to accomplish. What should not be forgotten is that this advantage has a price, namely that the very same displays may be suboptimal or downright inconvenient in other situations. Just as there is no
such thing as a universal tool, there is no such thing as a universal display
format.
Technically, the problem is often expressed as a question of identifying
user needs and preferences, with suggestions that the problem is solved by
finding or having the right user model (e.g., Rich, 1983). The concept of a user model is, however, close to being conceptually vacuous, quite apart from the fact that it reiterates the unfortunate separation between the user on one side and the machine on the other (cf. Chapter 3). User models are not a viable solution because the variability within and between situations and users is too large. Focusing on the issue of information content and information structure also tacitly accepts that the problem is one of transmitting information from the interface to the user, rather than one of ensuring that the JCS can maintain control. Even if information were presented so that it could be understood with little effort and no ambiguity, there would be no
guarantee that a person would be able to find the right response and
effectuate it in time. The issue of information presentation puts too much