definition of dynamic equilibrium

This forum contains all archives from the SD Mailing list (go to http://www.systemdynamics.org/forum/ for more information). This is here as a read-only resource, please post any SD related questions to the SD Discussion forum.
"John Gunkler"
Member
Posts: 31
Joined: Fri Mar 29, 2002 3:39 am

definition of dynamic equilibrium

Post by "John Gunkler" »

Guenter,

Here is a fairly easy-to-understand description of what dynamic equilibrium
is, and is not:

When a (sufficiently complex) process is started in some initial
(non-steady) state, "...we expect to see a brief settling-in period, before
it settles down to an observable behavior. The erratic behavior during the
initial settling-in period is called the start-up transient. The
settled-in, eventual observable behavior is the equilibrium state....
Warning: Equilibrium, as used here, does not imply a static equilibrium,
nor a steady state."

To really understand dynamic equilibrium requires some mathematics, I fear.
But if you want a quick tour, follow me.

A fundamental idea is that of a "trajectory." Think of this as a line on a
graph, but think of yourself as a point on this line that, over time, moves
along a path that creates the line. Then think of yourself approaching a
point, or a straight line, closer and closer. While you never actually get
there (it takes you an infinitely long time to do so), this point or line
you approach is called an "asymptote." If you are approaching a point, it
is sometimes called a "critical point" or "limit point" for your trajectory.
An example is the point at the bottom of a pendulum's swing -- where the
pendulum comes to rest. [This is a static equilibrium point.]

One example of a trajectory approaching a line would be a growing population
(say, of lynx or rabbits) that goes higher and higher until it approaches
the "carrying capacity of the environment." This is an example of a steady
state equilibrium.

The simplest example of dynamic equilibrium is probably some kind of
"periodic equilibrium." For example, take the pendulum of a clock. It gets
a little kick each cycle that keeps it going in (about) the same way. If
you graph the trajectory (distance from the critical, or resting, point over
time) you see a sine curve. This sine curve shows the dynamic (periodic)
equilibrium achieved by such a system.
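The sine-curve trajectory described above can be sketched numerically;
the amplitude and period below are illustrative assumptions, not values
from the post:

```python
import math

# Displacement of an idealized clock pendulum from its resting point,
# sampled over three periods. Amplitude and period are assumed values.
def pendulum_displacement(amplitude=1.0, period=2.0, dt=0.01, t_max=6.0):
    n = int(t_max / dt)
    return [amplitude * math.sin(2 * math.pi * (i * dt) / period)
            for i in range(n + 1)]

trace = pendulum_displacement()
# The trace repeats exactly every `period` seconds: that closed, repeating
# cycle is the periodic (dynamic) equilibrium of the driven pendulum.
```

Comparing any sample with the sample one full period later shows the
repetition that defines a periodic equilibrium.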

More complex (e.g., chaotic) systems are characterized by more complex kinds
of dynamic equilibria. Mathematics here introduces the notion of an
"attractor" which is a trajectory (or point) that "captures" other
trajectories that come near enough to it. "Captures" can mean asymptotic
approach (just like in simpler systems) or it can mean that trajectories,
while not settling into a single path, stay in paths that are fairly close
to the attractor path. A system that settles into an always-changing, but
never far from the attractor, trajectory is also in dynamic equilibrium.
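As a rough illustration of a trajectory that never settles into a single
path yet stays near an attractor, here is a simple Euler integration of
the Lorenz system with its classic parameters (sigma=10, rho=28,
beta=8/3); the step size and initial point are my own assumptions:

```python
# Euler integration of the Lorenz system. The trajectory never repeats,
# yet it remains within a bounded region near the strange attractor --
# an always-changing path in dynamic equilibrium.
def lorenz(x=1.0, y=1.0, z=1.0, sigma=10.0, rho=28.0, beta=8.0 / 3.0,
           dt=0.005, steps=20000):
    points = []
    for _ in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        points.append((x, y, z))
    return points

pts = lorenz()
# Every point stays inside a bounded box around the attractor, even
# though no two loops of the trajectory are ever exactly alike.
```

This is the "captured but never settled" behavior the paragraph above
describes: the path wanders forever, but never far from the attractor.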

I hope this helps a little.
From: "John Gunkler" <jgunkler@sprintmail.com>