CONDITIONAL PROBABILITY MACHINES AND CONDITIONED REFLEXES
Albert M. Uttley
ABSTRACT
An important characteristic of animal behaviour is that the same motor response can be evoked by a variety of different configurations of the world external to the animal. For the animal these configurations resemble one another in some respect. Similarly there can be variation in the motor response, and different responses resemble one another. In this paper, it is suggested that this "resemblance" is based on two known mathematical relations: the first is the inclusive relation of Set Theory; the second relation is that of conditional probability. From these two relations are deduced the principles of design of a machine whose reactions to stimuli are similar in a number of ways to those of an animal; some similarities are suggested between the structure of the machine and that of a Nervous System.
The machine can assess resemblances between sets of input data,
not between other phenomena from which that data was derived; the nervous
system is limited similarly to assessing resemblances between signals in
sets of fibres, not between sets of physical quantities external to this
system, and from which those fibre signals were derived — between internal
representations, not between external configurations.
In order to make a start on this problem, it has been found necessary to idealise the form of input, abstracting some important factors. The input signals must be all-or-none, active or inactive; the duration of activity is not considered. In a representation, inputs must be active simultaneously, so temporal pattern cannot be discussed. The neighbourhood of input fibres is not considered, so spatial pattern cannot be discussed; temporal and spatial patterns are considered in a separate paper. In consequence, the present paper is limited to considering representations such as taste, smell, and colour; these can be classified without reference to time or direction.
The inclusive relation forms the basis of classification; a machine based on this principle will be called a classification machine. The relation of conditional probability arises out of the inclusive relation; a machine based on the two principles will be called a conditional probability machine; it must have all the properties and structure of a machine based only on the first principle. In consequence classification is discussed first.
The simplest form of classification machine can assess input signals which possess only two states, active and inactive, which may be said to indicate the presence and absence of properties; but it must make no use of the inactive state to distinguish classes defined by the absence of properties; such a machine has been called "unitary".
Because a class of objects is defined by a set of properties, a classification machine must possess one unit for every possible combination of inputs; the unit must operate if they are all active, i.e., if a representation possesses the corresponding set of properties.
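As an illustrative sketch (the enumeration and the function names are assumptions, not the paper's design), the requirement of one unit per possible combination of inputs, each operating only when all of its inputs are active, can be expressed as follows:

```python
from itertools import combinations

def build_units(n_inputs):
    # One two-state unit for every non-empty combination of inputs.
    return [frozenset(s)
            for r in range(1, n_inputs + 1)
            for s in combinations(range(n_inputs), r)]

def operating_units(units, representation):
    # A unit operates only if all of its inputs are active; the
    # inactive state is never used (the "unitary" restriction).
    return [u for u in units if u <= representation]

units = build_units(3)          # 2**3 - 1 = 7 units
print(operating_units(units, frozenset({0, 2})))
```

Class recognition here is instantaneous and correct, as the text states: a unit's output depends only on the current representation, with nothing stored.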
All units are of the same design, being two-state in nature, and they must be connected to inputs in all possible ways; such a machine can be constructed with random connections between units and inputs. For a classification machine, class recognition is instantaneous and correct; it does not grow or decay. Resemblance is limited to the determination that representations are of the same class, i.e., that for the two, there is a common set of active inputs; based only on Set Theory there can be no relation between representations with no such common set, other than that they are different, disjunct.
But a further relation can arise between disjunct representations, which is based on their relative frequency of joint occurrence; this is the relation of conditional probability and it measures a variable degree of resemblance. If the machine is extended so as to embody this principle, it must have two new design features. Each unit must possess a variable state; and there must be interconnections between units. The function of the variable state of a unit is to store the unconditional probability of the corresponding set of properties; this quantity may also be called the mean frequency of joint occurrence, and it can be time weighted in various ways.
A conditional probability is the ratio of two unconditional probabilities;
but in computing machines, division is a more difficult operation than
subtraction, so machine design is simpler if unconditional probabilities
are computed on a logarithmic scale. It can be shown that the unit must then
possess two new properties; the stored quantity must grow in the absence of
events; and a certain amount must be destroyed instantaneously if the set
of properties occurs. From a growth equation describing both these functions
it is possible to calculate the rate of growth and decay of the conditional
probability relation between sets of properties.
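A minimal sketch of such a unit, assuming the stored quantity approximates -log of the time-weighted frequency (the constants and the linear growth law are illustrative, not the paper's growth equation):

```python
import math

def step(stored, occurred, alpha=0.01, beta=0.5):
    # The stored quantity grows steadily in the absence of events,
    # and a fixed amount is destroyed instantaneously when the
    # unit's set of properties occurs (alpha, beta illustrative).
    return stored - beta if occurred else stored + alpha

def log_conditional(stored_k, stored_union):
    # With stored quantities on a -log scale, the ratio
    # P(J u K) / P(K) becomes a subtraction, not a division.
    return stored_k - stored_union

# A frequently occurring set ends with a smaller stored quantity,
# i.e. a larger estimated probability.
print(math.exp(log_conditional(1.0, 1.5)))
```

The design advantage claimed in the text appears in `log_conditional`: no division circuit is needed, only subtraction of two stored quantities.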
For two sets of inputs J and K, whether disjunct or not, there must be connections in a conditional probability machine, from the J unit and the K unit to the (J u K) unit, which stores the unconditional probability of the union of the two sets; these connections mediate a function of supercontrol, from a unit to a superunit. Rules have been deduced which determine which of the J and K units controls the (J u K) unit. There must also be a separate system of interconnections from each (J u K) unit to all subunits such as J and K, mediating subcontrol by the (J u K) unit. Whichever unit is not effecting supercontrol is subcontrolled; there are other necessary rules.
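The required interconnection pattern can be sketched structurally; the paper's rules for deciding which of J and K actually exercises supercontrol are stated to exist but are not given here, so this sketch simply enumerates the paths in both directions:

```python
def control_connections(units):
    # For every pair of units J, K there must be supercontrol paths
    # from J and from K up to the (J u K) unit, and subcontrol paths
    # from (J u K) back down to each subunit.
    supercontrol, subcontrol = set(), set()
    for J in units:
        for K in units:
            if J != K:
                union = J | K
                supercontrol |= {(J, union), (K, union)}
                subcontrol |= {(union, J), (union, K)}
    return supercontrol, subcontrol
```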
Such a system of interconnections provides a physical path between sets of input channels whether disjunct or not; but this path is formed to a certain degree, and depends entirely on the nature of past representations. If a nervous system embodies the principles of a conditional probability machine, disjunct representations, such as "smell of food," "sight of food," or "sound of bell," can evoke a common response, e.g., "salivation". For such a system, the situation must be described this way: if the conditional probability of "salivation," given "smell of food," is unity (deterministic, reflex behaviour), and if "sound of bell" and "smell of food" are presented jointly, then the conditional probability of "smell of food" given "bell" will rise. The unit storing this last conditional probability might be connected to an effector mechanism in different ways. The simplest possible effector mechanism would be no more than a threshold which caused an all-or-none reaction if the probability exceeded a certain value. The conditional probability machine could then be much simpler: a conditional certainty machine.
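The bell/food example can be sketched as follows; the running-average update and the threshold value are illustrative assumptions, not the paper's stored-quantity mechanism:

```python
def condition(trials, alpha=0.1):
    # Running estimate of the conditional probability of "smell of
    # food" given "bell", raised by repeated joint presentations.
    p = 0.0
    for bell, food in trials:
        if bell:
            p += alpha * ((1.0 if food else 0.0) - p)
    return p

def effector(cond_prob, threshold=0.5):
    # Simplest effector: an all-or-none "salivation" reaction once
    # the probability exceeds a threshold; with this addition the
    # machine behaves as a conditional certainty machine.
    return cond_prob > threshold

p = condition([(True, True)] * 10)   # ten joint presentations
print(p, effector(p))
```

After enough joint presentations the estimate crosses the threshold, and "bell" alone produces the all-or-none reaction.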
I. INTRODUCTION
An important characteristic of animal behaviour is that the same motor response can be evoked by a variety of different configurations of the world external to the animal. For the animal these configurations resemble one another in some respect. Similarly there can be variation in the motor response, and different responses resemble one another. In this paper, it is suggested that this "resemblance" consists of two known mathematical relations: the first is the inclusive relation of Set Theory; the second relation is that of conditional probability. From these two relations are deduced the principles of design of a machine, whose reactions to stimuli are similar in a number of ways to those of an animal; also, the structure of the machine has some similarity to that of a Nervous System. [Sholl and Uttley, 1953]

FIGURE 1

FIGURE 2
The features of a Nervous System which are relevant to this discussion are shown in FIGURE 1. Signals enter the nervous system along discrete afferent channels, and after central delay, signals are sent out to motor units; the afferent pattern of signals will be considered first. At any instant, the external world will be said to be in a particular configuration. At the same instant there will be a set of signals in the afferent channels; each signal will be called an input quantity to the central mechanism; the complete set of input quantities will be called a "representation". The central mechanism can be concerned only with representations and their resemblance, not with the resemblance of configurations.
II. CLASSIFICATION
It has been suggested [Hayek, 1952] that when an animal gives the same reaction to two different representations, it is because they are of the same class or set. But the mathematics of Set Theory cannot be applied immediately to the input system described above. Firstly, the input quantities are known to contain a measure of intensity; and Set Theory is concerned with properties which are either possessed or not possessed. Secondly, the input signals are not independent; and in Set Theory no relation is considered between elements of a set other than that of inclusion in it.
II(a) Some Necessary Abstractions
Consider therefore the following simpler situation. Each input quantity has only a binary measure, i.e., it has one of two possible values, which will be called "active" and "inactive". If input j is active the representation will be said to possess property j. There is independence between all pairs of inputs, i.e., in a large ensemble of representations the fraction possessing property j is the same whether the ensemble possesses property k or not. A time record of the contents of three channels j, k, and l might take the form of FIGURE 2a.
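The independence condition just stated can be checked directly on an ensemble of representations, each taken as the set of its active inputs (a sketch; the equality tolerance is an assumption):

```python
def independent(ensemble, j, k):
    # Fraction of representations possessing property j, computed
    # separately over those that do and do not possess property k;
    # independence means the two fractions are equal.
    with_k = [r for r in ensemble if k in r]
    without_k = [r for r in ensemble if k not in r]
    frac = lambda group: sum(j in r for r in group) / len(group)
    return abs(frac(with_k) - frac(without_k)) < 1e-9

ensemble = [set(), {'j'}, {'k'}, {'j', 'k'}]
print(independent(ensemble, 'j', 'k'))   # both fractions are 1/2
```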
Such an input is still too complex for simple classification.
There must be three temporal abstractions. Firstly, there will be no