Chapter 4
Coping with Complexity
The basic issue for CSE is how to maintain control of a process or an
environment. Both processes and environments are dynamic and
therefore complex, and joint cognitive systems are striving to cope
with this complexity. The coping takes place at both the individual and
organisational levels, the latter as in the design of work environments,
of social structures, and of technological artefacts.
INTRODUCTION
Coping with complexity is about the effective management or handling of
situations where there is little slack, where predictability is low, and where
demands and resources change continually. Coping with complexity is thus
essential to maintain control. Since it is clearly desirable that a JCS can do
this for a reasonable range of performance conditions, design must consider
both the variability of system states and the variability of human performance
whether as individuals or teams. The purpose of design can, indeed, be
expressed as enhancing the capability to be in control, thereby ensuring that
the social and technological artefacts function as required and as intended.
About Coping
An early reference to the concept of coping with complexity can be found in
the field of management cybernetics (Beer, 1964). Yet coping with
complexity did not receive much attention until it was brought into the
human-machine research vocabulary. Here it was defined as the ability “to
structure the information at a higher level representation of the states of the
system; to make a choice of intention at that level; and then to plan the
sequence of detailed acts which will suit the higher level intention”
(Rasmussen & Lind, 1981, p. 9).
Almost half a century ago, Donald MacKay (1968; orig. 1956) defined the
minimal requirements for a system capable of goal-guided or
goal-directed activity. As shown in Figure 4.1, only three functional
elements are needed, which MacKay called a receptor, a comparator, and an
effector. (Notice that this is an elementary feedback-controlled system.)
Figure 4.1: Minimal requirements for goal-guided activity (after
MacKay, 1968/1956).
In Figure 4.1, the point X represents the system’s goal and the line F the
space of possible goal states. The active agent is represented by the effector
system E, which is governed by the control system C. The system is said to
be goal-guided if the overall pattern of E’s activity reduces the distance
between Y and X to a minimum, within a given time interval. The distance
between Y and X is expressed as a mismatch signal from the receptor system
R.
We assume that in a typical situation E is capable of a certain range or
variety of modes of activity (including inactivity), so that the function
of C is to select from moment to moment what E shall do next, out of
the range of possibilities open to it. (MacKay, 1968, p. 32)
This simple system works well as long as the ‘movements’ of the target X
can be represented as changes in position along a single dimension and as
long as they are smooth or continuous. As soon as the ‘movements’ become
more irregular, corresponding to a more complex and therefore less
predictable environment, more complicated functions are required to
maintain control. MacKay argued that although a hierarchy of controllers
could in principle extend the same fundamental arrangement to cover
more complex situations, it becomes impractical to describe the
performance of the system in terms of nested feedback loops as soon as
there are more than a few levels.
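MacKay's minimal arrangement can be sketched as a simple discrete-time loop. The sketch below is purely illustrative, assuming a one-dimensional goal position and a fixed repertoire of effector actions; the function names and values are not MacKay's.

```python
def receptor(y, x):
    """R: sense the system output y and the goal x; emit a mismatch signal."""
    return x - y

def comparator_select(mismatch, actions):
    """C: select, from E's range of modes of activity, the action that most
    reduces the mismatch (inactivity, 0.0, is one of the options)."""
    return min(actions, key=lambda a: abs(mismatch - a))

def run_loop(y=0.0, x=10.0, steps=20):
    """E: apply the selected action each step, so that the overall pattern
    of activity reduces the distance between Y and X to a minimum."""
    actions = [-1.0, -0.5, 0.0, 0.5, 1.0]  # E's variety of modes of activity
    for _ in range(steps):
        mismatch = receptor(y, x)
        y += comparator_select(mismatch, actions)
    return y

print(run_loop())  # 10.0
```

Note that once the mismatch reaches zero, the comparator selects inactivity (0.0), so the system rests at the goal; an irregularly moving goal X would defeat this single-loop arrangement, which is exactly MacKay's point.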
SOURCES OF COMPLEXITY
Attempts to define complexity are many and range from the useful to the
useless. Complexity is never easy to define, and the term is therefore often
used without definition. A start is, of course, to consider the dictionary
definitions, according to which something is complex if it consists of (many)
interconnected or related parts or if it has a complicated structure (sic!). A
more substantive treatment can be found in the field of general systems
theory, where complexity is defined by referring to the more fundamental
concept of information. It is here argued that all scientific statements have
two components. One is an a priori or structural aspect, which is associated
with the number of independent parameters to which the statement refers.
The other is an a posteriori or metrical aspect, which is a numerical quantity
measuring the amount of credibility to be associated with each aspect of the
statement. Complexity is now defined as follows:
The amount of this ‘structural’ information represents what is usually
meant by the complexity of a statement about a system; it might
alternatively be defined as the number of parameters needed to define
it fully in space and time. (Pringle, 1951, p. 175)
Pringle goes on to point out that the representation of complexity in the
above sense is epistemological rather than ontological because it refers to the
complexity of the description, i.e., of the statements made about the system,
rather than to the system itself. Ontological complexity, he asserts, has no
scientifically discoverable meaning as it is not possible to refer to the
complexity of a system independently of how it is viewed or described.
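Pringle's point that complexity attaches to the description rather than to the system itself can be made concrete with a small sketch. The pendulum parameters below are invented purely for illustration: the same physical system yields two different structural complexities under two different descriptions.

```python
# Two descriptions of the same pendulum, at different levels of detail.
# (Hypothetical parameter sets, chosen only to illustrate Pringle's measure.)
point_mass_description = {"length": 1.0, "mass": 0.5, "angle0": 0.2}
rigid_body_description = {"length": 1.0, "mass": 0.5, "angle0": 0.2,
                          "moment_of_inertia": 0.04, "pivot_friction": 0.01,
                          "air_drag": 0.002}

def structural_complexity(description):
    """Pringle's epistemological measure: the number of independent
    parameters needed to define the system fully in space and time."""
    return len(description)

print(structural_complexity(point_mass_description))  # 3
print(structural_complexity(rigid_body_description))  # 6
```

The system has not changed between the two calls; only the statements made about it have, which is why the measure is epistemological rather than ontological.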
This important philosophical distinction is usually either taken for granted
or disregarded. In the latter case the epistemological and ontological aspects
of complexity descriptions are mixed, which sooner or later creates problems
for descriptions. The reason is that while the epistemological aspects are
amenable to decomposition and recursive interpretation, the ontological
aspects are not. Indeed, if complexity as an ontological quality of a system
could be decomposed, it would in a sense be dissolved, hence cease to exist.
Some of the important factors that affect complexity are shown in Figure
4.2, superimposed on the basic cyclical model. This also suggests a
convenient way to group the different factors.
In relation to the evaluation and interpretation of events, two important
factors are insufficient training and lack of experience. Of these,
insufficient training is the more specific and also the one that can best
be controlled by an organisation. Shortcomings in the evaluation and
interpretation of events may lead to an incomplete or partial
understanding of the situation.
Other factors are insufficient time and insufficient knowledge. Even if a
condition can be recognised, it may be impossible to maintain a correct
understanding if time or knowledge is in short supply. This is
particularly important for situations that are out of the ordinary, such
as accidents. An incomplete or partial understanding leads to problems in
choosing or selecting actions.
Figure 4.2: Important factors that affect complexity. [The figure
superimposes deficient interface design, insufficient time and knowledge,
insufficient training and experience, unexpected events, inappropriate
plans, and partial understanding on the basic cyclical model of construct,
activity, and event/feedback.]
A third group of factors is associated with the complexity of the interface,
which both provides the information about what happens and the means
by which an intended action can be carried out. If the interface is difficult
to use, the implementation of an action may be incomplete or incorrect,
leading to unexpected results. This problem is often solved by relying on
a standard that is effective across different levels of user experience and
cultures.
As already mentioned, the complexity of a process, hence the difficulties
in coping, depends in the main on two closely coupled issues. One is the
degree of orderliness or predictability of the process, and the other is the time
that is available. The coupling between the two comes about in the following
way. If predictability is low, then more time is needed to make sense of what
is going on and to decide on the proper control actions. Conversely, if time is
short or inadequate, it may not be possible to develop an adequate
understanding of what is going on (develop an adequate construct), and
control actions are therefore more likely to fail in bringing about the desired
change. This will increase rather than decrease the unpredictability of the
process, hence limit the available time even further. This particular kind of
coupled dependency is technically known as a deviation-amplifying loop
(Maruyama, 1963).
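The deviation-amplifying coupling can be illustrated with a toy simulation. The update rules and coefficients below are assumptions chosen only to exhibit the positive-feedback structure; they are not taken from Maruyama.

```python
def simulate(steps=5, unpredictability=0.2, time_available=1.0):
    """Toy deviation-amplifying loop: low predictability consumes time,
    lack of time degrades the construct (understanding), a poor construct
    makes control actions fail, and failed actions raise unpredictability
    while eating into the time that remains."""
    history = []
    for _ in range(steps):
        # Less predictable process -> more time needed to make sense of it.
        time_needed = 1.0 + 2.0 * unpredictability
        # Shortfall of time -> weaker understanding of what is going on.
        understanding = min(1.0, time_available / time_needed)
        # Weaker understanding -> control actions more likely to fail.
        failure_rate = 1.0 - understanding
        # Failed actions amplify unpredictability and consume time.
        unpredictability = min(1.0, unpredictability + 0.3 * failure_rate)
        time_available = max(0.1, time_available - 0.2 * failure_rate)
        history.append((unpredictability, understanding))
    return history

# Unpredictability rises while understanding falls at every step:
for step, (unp, und) in enumerate(simulate()):
    print(f"step {step}: unpredictability={unp:.2f}, understanding={und:.2f}")
```

With these (arbitrary) coefficients the loop never recovers: each pass around it leaves the process less predictable and the controller with less time, which is the signature of a deviation-amplifying rather than deviation-counteracting loop.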
LOSING CONTROL
If we consider joint cognitive systems in general, ranging from single
individuals interacting with simple machines such as a driver in a car, to
groups engaged in complex collaborative undertakings such as a team of
doctors and nurses in the operating room, it soon becomes evident that a
number of common conditions characterise how well they perform, and when
and how they lose control, regardless of domain. These conditions are lack
of time, lack of knowledge, lack of competence, and lack of resources (cf.
Figure 4.3).
Figure 4.3: Determinants of control. [The figure contrasts what causes
loss of control (unexpected events; acute time pressure; not knowing what
to do; not having the necessary resources; not knowing what has happened,
what happens, or what will happen) with what can help maintain or regain
control (sufficient time; anticipation of future events; limited task
load; clear alternatives or procedures; capacity to evaluate and plan;
knowing what has happened and what happens). Being in control of a
process means knowing what has happened and knowing what will happen.]
Of greatest importance is the intimate link between loss of control and
the occurrence of unexpected events, to the extent that unexpected events
may in practice be regarded as a signature of lost control. Unexpected
events also play a role in another, related way, namely as a consequence
of lost control. Loss of control is nevertheless not a necessary condition
for unexpected events to occur: there may be factors, causes and
developments outside the boundaries of the JCS that lead to events which,
for the JCS, are unexpected. These issues will be discussed again in
Chapter 7.