Feed back loops - Jixuan Hu and Martin Taylor
Posted: Mon Apr 01, 1996 3:00 pm
As a result of Jixuan's posting (in a separate message) I have
amalgamated the exchanges. The advantage is being able to follow
complete specific discussions as they happen; the disadvantage is that
some of the flow has been lost. See what you think.
From: Martin Taylor <mmt@BEN.DCIEM.DND.CA>
and
From: Jixuan Hu <jixuanhu@GWIS2.CIRC.GWU.EDU>
Subject: Re: Feed back loops
To: Multiple recipients of list CYBCOM <CYBCOM@gwuvm.gwu.edu>
On Fri, 31 May 1996, Martin Taylor & Jixuan Hu wrote:
Jixuan (extracted from earlier posting)
A complex system will keep its initial behavior (stable, growth,
oscillation, or chaos) unchanged as long as the structure of the
dominating circular cause-effect loop within the system is kept
unchanged.
Martin
This is ambiguous as stated. In one way of looking at it, it is
tautological, but if one takes "the structure of ... kept unchanged" to
mean that all the causal linkages retain the same weights, timings,
etc., then the statement is false.
Jixuan
Martin, thanks for bringing the discussion into the Einstein age of
cybernetics!
Martin response
Interesting you should use that metaphor. We did try to describe the
action of these networks using the analogy of Feynman diagrams, but the
maths was beyond us and we didn't find a physicist able and willing to
help. The notion was that each stable behaviour was like a stable
particle, describable by an orbit within the transition matrix that
defines the network--but the nonlinearity of the "squashing function"
means that this transition matrix is data-dependent, making matters very
difficult (for us, at least). However, it is very tempting to consider
a "thought" as corresponding to the stable behaviour of some module of a
complex network, and that stable behaviour as corresponding to a
particle capable of interacting with another "particular thought" if the
nodes concerned happen to influence one another. As with particle
interactions, the result might be something entirely different.
But, as I said, we got nowhere with the maths, and at the time the
available compute power (and financial resources for programmers, etc.)
was inadequate to go much beyond 8 or 10 nodes in a network simulation--
far too few to make it easy to develop quasi-independent "particular
thoughts".
Jixuan
As you see, I use "Law of System Inertia" to indicate that my
work was intended to build a framework for cybernetics like what Newton
did for physics. I would be very much pleased if that principle could
be falsified, or better yet, rephrased and improved. But before that,
let's see...
Martin
Some years ago, we conducted a series of experiments (never published)
on small (typically 8-node) neural networks with random internode links.
The nodes were each of the form:
out.j(t+1) = f(sum_i(weight.ij * input.i(t))), where f was a standard
"squashing function" with limits at zero and some positive number.
The weight.ij were assigned values from some random distribution that was
fixed throughout the duration of an experiment. In an experiment, the
"temperature" of one or perhaps two nodes in the network might be
varied, "temperature" being represented by the rate at which f moved
from zero to its positive limit value.
Jixuan
If your f is changing, perhaps what I meant by "structure" is changing
too.
Martin response
Sorry, this was ambiguous. By "rate at which f moved" I mean a scaling
parameter p on the x axis in y = f(p*x); larger values of p mean an
increased "rate of movement of f". Sorry about that. In different
experiments we used different values of p, and in some experiments we
slowly varied p for one node, keeping the rest of the network constant.
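For concreteness, the node update Martin describes might be sketched like
this in Python; the logistic form of the squashing function, the normal
weight distribution, and the way the temperature p enters are assumptions
for illustration, not details taken from the original experiments:

    import numpy as np

    def squash(x, p, limit=1.0):
        # Assumed squashing function: a logistic curve running from 0 to `limit`,
        # with the per-node "temperature" p controlling how steeply it rises.
        return limit / (1.0 + np.exp(-p * x))

    def step(state, weights, p, limit=1.0):
        # out.j(t+1) = f(sum_i(weight.ij * input.i(t))), applied to every node at once.
        return squash(weights @ state, p, limit)

    # An 8-node net with fixed random weights, iterated from a random start.
    rng = np.random.default_rng(0)
    n = 8
    weights = rng.normal(0.0, 1.0, size=(n, n))   # fixed for the whole "experiment"
    p = np.full(n, 4.0)                           # one temperature per node
    state = rng.uniform(0.0, 1.0, size=n)
    for t in range(200):
        state = step(state, weights, p)

Varying p for one node while the rest stays fixed corresponds to the
temperature manipulation described above.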
Martin
All the following results apply to networks in which all the weights and
all the temperatures (i.e. ALL the parameters of the network) were
constant.
Many networks behave as Jixuan describes. Some sets of weights and
temperatures yield behaviour that rapidly approaches a fixed point, some
go into oscillations of longer or shorter periods, and some behave in a
manner we could not distinguish from chaotic.
But there are also many networks (defined as sets of parameter values)
that do NOT behave like this; their behaviour can depend drastically on
the initial values on the node inputs, and can change drastically and
permanently if even a single small impulse is added to a single input of
one node.
Jixuan
Very interesting - can you make a distinction between these two types of
networks? I.e., what is the difference between those that obey Hu's first
law (Law of System Inertia or Law of Circular Causality) and those that
don't?
Martin Response
I know of no distinction other than the following: If we vary the
temperature parameter value p slowly on node A of an 8-node net, and on
node B even more slowly, so that the values of A and B jointly scan a 2-
D space in the same way an electron beam scans a TV screen, then the
network as a whole will sometimes be in a robustly stable condition,
always having, say, an oscillation of period 22 no matter what inputs
you provide; then it may move into a region of variable behaviour,
where an impulse might move it from a period-22 oscillation to, say,
fixed-point behaviour or to (apparently) chaotic behaviour; then, as the
scan continues, the behaviour might stabilize in, say, a period-5
oscillation or a fixed point, or (apparent) chaos, or...
There may well be functional relations among the weights and
temperatures that determine whether the network is stable (i.e. sustains
a behaviour after arbitrary inputs) or moves between behaviours. But I
don't know what those relationships might be.
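A rough sketch of that two-parameter scan, under the same illustrative
assumptions as the sketch above (logistic squashing, fixed random weights),
might look like the following; the period detector is a crude stand-in for
whatever analysis was actually used, and the running state is carried over
from one grid point to the next so that history can matter:

    import numpy as np

    def squash(x, p, limit=1.0):
        return limit / (1.0 + np.exp(-p * x))

    def label_tail(traj, tol=1e-6, max_period=64):
        # Crude behaviour label: smallest period that repeats the trajectory tail.
        tail = traj[-2 * max_period:]
        for period in range(1, max_period + 1):
            if np.allclose(tail[period:], tail[:-period], atol=tol):
                return "fixed point" if period == 1 else f"period {period}"
        return "chaotic-looking"

    rng = np.random.default_rng(0)
    n = 8
    weights = rng.normal(0.0, 1.0, size=(n, n))
    p = np.full(n, 4.0)
    state = rng.uniform(0.0, 1.0, size=n)

    # Node B's temperature steps slowly, node A's steps quickly: together they
    # raster-scan a 2-D parameter plane while the running state is carried over.
    for pB in np.linspace(1.0, 8.0, 8):
        for pA in np.linspace(1.0, 8.0, 32):
            p[0], p[1] = pA, pB
            traj = np.empty((400, n))
            for t in range(400):
                state = squash(weights @ state, p)
                traj[t] = state
            print(f"pA={pA:.2f} pB={pB:.2f}: {label_tail(traj)}")

The labels printed along the scan are the sort of summary from which
regions of robust versus variable behaviour could be read off.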
Jixuan
Are those non-abiding networks more complex? Do they have more levels
of circular loops? Are the dominating loop(s) changing? (Dominating
loops can be found by analyzing the sensitivities of different
parameters.)
Martin response
Structurally they are identical, and they may differ only by some very
tiny variation in a single parameter. But in some sense they must be
more complex. What that sense is, I do not know.
Martin
The SAME set of parameters can yield a network
that at one time goes to a static fixed-point behaviour, at another time
has a regular oscillation, and at another time behaves apparently
chaotically, and can be switched among these behaviours by a single
impulse at a judiciously chosen time.
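The single-impulse switch could be sketched, again under the same
illustrative assumptions, by adding one small pulse to one node's input at
a chosen step and comparing the behaviour before and after; the pulse size,
target node, and timing here are arbitrary placeholders:

    import numpy as np

    def squash(x, p, limit=1.0):
        return limit / (1.0 + np.exp(-p * x))

    rng = np.random.default_rng(2)
    n = 8
    weights = rng.normal(0.0, 1.0, size=(n, n))
    p = np.full(n, 4.0)
    state = rng.uniform(0.0, 1.0, size=n)

    history = []
    for t in range(1000):
        net_input = weights @ state
        if t == 500:
            net_input[3] += 0.05    # a single small impulse on one node's input
        state = squash(net_input, p)
        history.append(state.copy())
    # Compare history[:500] with history[500:]: for some parameter sets the
    # behaviour is unchanged, for others it switches mode permanently.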
Jixuan
May we then say that this set of parameters doesn't form any stable
dominating loop within the system? Please check back through your data
and let us know.
Martin response
I have only summary results, no old data. This was all done some years
ago, sorry. There was an interim report that I might be able to dig up,
but since we really didn't know what to make of it all, we didn't write
it up properly. Perhaps we should have, and perhaps we still should. I
don't know--I guess we kind of assumed that people more into network
simulations would know all about this kind of stuff.
Martin
An interesting property of such parameter sets is that usually if one
slowly changes a parameter (we changed the "temperature" of only one
node) while it is behaving, the characteristic behaviour is sustained
until some boundary value is passed, after which the behaviour switches
abruptly into another mode (e.g. chaotic to fixed-point). If one slowly
reverses the changes, the behaviour eventually switches back, but not
until well past the parameter value at which the original switch
occurred. The behaviour exhibits hysteresis.
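A hysteresis sweep along one node's temperature might be sketched as: ramp
p up slowly while the net keeps running (so the state carries over), note
where the behaviour switches, then ramp it back down and note where it
switches back. As before, the squashing function, weight distribution, and
ramp range are assumptions for illustration:

    import numpy as np

    def squash(x, p, limit=1.0):
        return limit / (1.0 + np.exp(-p * x))

    def settle_and_label(weights, p, state, settle=300, window=64, tol=1e-6):
        # Let the net run on from its current state, then label the tail of the run.
        for _ in range(settle):
            state = squash(weights @ state, p)
        traj = np.empty((2 * window, len(state)))
        for t in range(2 * window):
            state = squash(weights @ state, p)
            traj[t] = state
        for period in range(1, window + 1):
            if np.allclose(traj[period:], traj[:-period], atol=tol):
                return ("fixed point" if period == 1 else f"period {period}"), state
        return "chaotic-looking", state

    rng = np.random.default_rng(3)
    n = 8
    weights = rng.normal(0.0, 1.0, size=(n, n))
    p = np.full(n, 4.0)
    state = rng.uniform(0.0, 1.0, size=n)

    ramp = np.concatenate([np.linspace(1.0, 8.0, 40), np.linspace(8.0, 1.0, 40)])
    for p0 in ramp:                    # up, then back down, state carried over
        p[0] = p0
        label, state = settle_and_label(weights, p, state)
        print(f"p0={p0:.2f}: {label}")
    # If the switch points differ between the upward and downward legs,
    # that is the hysteresis described above.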
Jixuan
Just like a living system that can live, get sick, and die. The
structure of circular causality loops in a living system is very
complex. In my previous message I was only talking about the fundamental,
general, and basic unit of these loops. Also please see René Thom's
theory about catastrophe dynamics in complex systems.
Martin
The point of this comment is to say that in any moderately complex mesh
of feedback loops, the behaviour is likely to be indeterminate, and to
change drastically if the mesh has even minor input from outside, or if
any of the parameter values of the feedback loops changes even
trivially.
Jixuan
Yes, the key term here is "moderately complex." How do we define
complexity and how do we measure it - how much is "moderately complex"?
For my part, I was focusing on the behavior of the "unit-loop," or systems
that are clearly dominated by such loop(s). People in the system dynamics
community call it the "key loop."
Martin
On the other hand, for any particular loop, the behaviour may well be
robust against small disturbances from outside or in its parameter
values.
Jixuan
Sure.
Martin
It has not escaped our notice that these phenomena may have some
relevance to clinical psychiatric problems (to paraphrase Watson and
Crick). But that is an area in which we have no expertise.
Jixuan
I bet they do! But the phenomena may only be found in systems with
higher levels of complexity. Do we have any psychiatrists on CYBCOM?
Anthony Gill
Phrontis Limited
Beacon House, Horn Hill Road
Adderbury, Banbury
OXON. OX17 3EU, U.K.
phone: +44 (0)1295 812262
fax:   +44 (0)1295 812511
email: t2@phrontis.demon.co.uk