definition of dynamic equilibrium

Posted: Thu Apr 16, 1998 5:05 pm
by Guenter Emberger
Hello,
I want to ask whether you know a definition of dynamic equilibrium,
and, related to that, what stable system behaviour is.
Is it possible to say that a system with chaotic behaviour can be seen
as stable or not?
Am I thinking the wrong way when I say that every system which exists over a
long time period without an overshoot-and-collapse is in dynamic equilibrium?

Many thanks

Guenter
-----------------------------------------------------------------
Mag. Guenter Emberger
Institut fuer Verkehrsplanung
TU-Wien
Gusshausstrasse 30/2
A-1040 Vienna
Telefon : ++43 1 58801 23112
Fax : ++43 1 58801 23199
e-mail : guenter.emberger@tuwien.ac.at
www : http://www-ivv.tuwien.ac.at
-----------------------------------------------------------------

definition of dynamic equilibrium

Posted: Thu Jun 18, 1998 10:24 am
by Fabian Szulanski
IMHO, a simple definition of one type of dynamic equilibrium would be
to say that in a very simple inflow-stock-outflow structure, whatever
the mathematical relationships embedded in the flows might be, we
observe dynamic equilibrium in the system (shown by a steady-state
value in the stock) when inflow = outflow.
We say there exists dynamic equilibrium because there are two activities
taking place, those of the two flows, and even in that case the
observable behavior of the stock is steady-state equilibrium.
This is the kind of dynamic equilibrium we can explain very easily.
Adding a little twist: this dynamic equilibrium is often unstable; the
slightest change in the net flow and the dynamic equilibrium
vanishes.
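To make this concrete, here is a minimal sketch (mine, not part of the original post; names and numbers are illustrative) of the inflow-stock-outflow structure just described:

```python
# Minimal sketch of an inflow-stock-outflow structure, stepped forward
# with simple Euler integration. All names and values are illustrative.

def simulate(stock, inflow, outflow, dt=1.0, steps=10):
    """Advance the stock by (inflow - outflow) * dt each step."""
    history = [stock]
    for _ in range(steps):
        stock += (inflow - outflow) * dt
        history.append(stock)
    return history

# Dynamic equilibrium: inflow equals outflow, so the stock holds steady
# even though material is continuously flowing through the system.
balanced = simulate(stock=100.0, inflow=5.0, outflow=5.0)
print(balanced[-1])   # stock remains 100.0

# The slightest imbalance in the net flow and the equilibrium vanishes:
drifting = simulate(stock=100.0, inflow=5.0, outflow=4.9)
print(drifting[-1])   # stock drifts steadily upward
```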

Be well, and have a healthy and prosperous 1999

Fabian Szulanski
From: Fabian Szulanski <fabiansz@consultant.com>

definition of dynamic equilibrium

Posted: Thu Dec 17, 1998 11:07 am
by "John Gunkler"
Guenter,

Here is a fairly easy-to-understand description of what dynamic equilibrium
is, and is not:

When a (sufficiently complex) process is started in some initial
(non-steady) state, "...we expect to see a brief settling-in period, before
it settles down to an observable behavior. The erratic behavior during the
initial settling-in period is called the start-up transient. The
settled-in, eventual observable behavior is the equilibrium state....
Warning: Equilibrium, as used here, does not imply a static equilibrium,
nor a steady state."

To really understand dynamic equilibrium requires some mathematics, I fear.
But if you want a quick tour, follow me.

A fundamental idea is that of a "trajectory." Think of this as a line on a
graph, but think of yourself as a point on this line that, over time, moves
along a path that creates the line. Then think of yourself approaching a
point, or a straight line, closer and closer. While you never actually get
there (it takes you an infinitely long time to do so), this point or line
you approach is called an "asymptote." If you are approaching a point, it
is sometimes called a "critical point" or "limit point" for your trajectory.
An example is the point at the bottom of a pendulum's swing -- where the
pendulum comes to rest. [This is a static equilibrium point.]

One example of a trajectory approaching a line would be a growing population
(say, of lynx or rabbits) that goes higher and higher until it approaches
the "carrying capacity of the environment." This is an example of a steady
state equilibrium.
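Here is a quick sketch of that steady-state approach. The post doesn't specify a growth law, so this assumes logistic growth with illustrative numbers:

```python
# Hypothetical sketch: a population trajectory approaching the carrying
# capacity of the environment via logistic growth. Parameters are made up.

def logistic_trajectory(pop, rate=0.5, capacity=1000.0, steps=50):
    history = [pop]
    for _ in range(steps):
        pop += rate * pop * (1.0 - pop / capacity)  # growth slows near capacity
        history.append(pop)
    return history

traj = logistic_trajectory(pop=10.0)
# The trajectory rises toward the carrying capacity asymptotically:
# it gets ever closer but, in the continuous limit, never overshoots it.
print(traj[0], traj[-1])
```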

The simplest example of dynamic equilibrium is probably some kind of
"periodic equilibrium." For example, take the pendulum of a clock. It gets
a little kick each cycle that keeps it going in (about) the same way. If
you graph the trajectory (distance from the critical, or resting, point over
time) you see a sine curve. This sine curve shows the dynamic (periodic)
equilibrium achieved by such a system.
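A sketch of both ideas at once -- the start-up transient followed by periodic equilibrium -- using a damped, sinusoidally driven oscillator as a stand-in for the kicked clock pendulum (my example, with illustrative parameters):

```python
import math

def driven_oscillator(steps=40000, dt=0.001, damping=0.5, drive_freq=1.0):
    """Semi-implicit Euler integration of x'' = -x - damping*x' + cos(w*t)."""
    x, v = 1.0, 0.0                        # start away from equilibrium
    xs = []
    for i in range(steps):
        a = -x - damping * v + math.cos(drive_freq * i * dt)
        v += a * dt                        # update velocity first (semi-implicit)
        x += v * dt
        xs.append(x)
    return xs

xs = driven_oscillator()
# After the start-up transient dies away, the motion repeats each drive
# period: that repeating sine-like cycle is the periodic equilibrium.
period_steps = int(2 * math.pi / 0.001)    # samples per drive period
print(abs(xs[-1 - period_steps] - xs[-1]))  # small: trajectory has settled
```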

More complex (e.g., chaotic) systems are characterized by more complex kinds
of dynamic equilibria. Mathematics here introduces the notion of an
"attractor" which is a trajectory (or point) that "captures" other
trajectories that come near enough to it. "Captures" can mean asymptotic
approach (just like in simpler systems) or it can mean that trajectories,
while not settling into a single path, stay in paths that are fairly close
to the attractor path. A system that settles into an always-changing, but
never far from the attractor, trajectory is also in dynamic equilibrium.
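An illustrative sketch of that last case (mine, not from the post): the logistic map at r = 3.9 is chaotic, so its trajectory never settles into a single repeating path, yet every point stays confined to a bounded attractor inside (0, 1).

```python
# The chaotic logistic map: an always-changing trajectory that is
# nevertheless "captured" by a bounded attractor. Values are illustrative.

def logistic_map(x, r=3.9, steps=1000):
    xs = []
    for _ in range(steps):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

xs = logistic_map(0.2)
tail = xs[100:]              # discard the start-up transient
# The orbit wanders widely and never repeats, yet it stays strictly
# within (0, 1) -- dynamic equilibrium in the chaotic sense.
print(min(tail), max(tail))
```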

I hope this helps a little.
From: "John Gunkler" <jgunkler@sprintmail.com>