dt/time step in the real world
Posted: Thu Dec 23, 2004 8:23 am
Old Subject: The Derivative in the Real World
I would like to seek thoughts on one issue from this latest discussion -
explaining dT.
The mystical meaning of dT seems to be a major conceptual hurdle for
most regular people [not math genii] getting their heads around system
dynamics, especially when you start having to explain why variables take
a value in a simulation that is 1/dT greater than 'reality'.
I would be interested to learn how others cope with this pedagogically.
I have adopted an approach that - whilst probably anathema in the strict
sense - seems to be 'good enough' for most practical purposes. This is
to focus students' and executives' attention on choosing the appropriate
time *period* instead.
Having identified that the quantity of stocks [and constants or
decisions] at the start of a period determines the rate at which things
happen during that period, I suggest they ask themselves 'for how long a
period is it safe to assume that these rates don't change significantly,
relative to the overall scale of the situation?'. That length of time
becomes the time period of the simulation, and dT is 1.0.
This seems to be reasonably safe, and good enough for practical purposes
- e.g. if things continue more or less at a steady rate over a month at
a time, it is safer to set 'months' as the time period and have dT of
1.0 than it is to set 'years' as the time period and dT of 0.25.
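As a hypothetical illustration of the trade-off being discussed (not from the original post), here is a small Python sketch of Euler integration of a single draining stock, dS/dt = -k*S, run with different step sizes. All names and numbers are invented for the example; the point is only that the per-step change is rate * dt, and that a coarser dt drifts further from the exact solution.

```python
# Illustrative sketch: Euler integration of a draining stock,
# dS/dt = -k * S, to show how the choice of dt affects accuracy.
# All parameter values here are made up for the example.

import math

def euler_stock(s0, k, horizon, dt):
    """Simulate stock S with outflow k*S using Euler steps of size dt."""
    s = s0
    steps = round(horizon / dt)
    for _ in range(steps):
        outflow = k * s          # rate computed from the stock at step start
        s -= outflow * dt        # change over the step = rate * dt
    return s

s0, k, horizon = 100.0, 0.2, 12   # e.g. 12 months, 20% outflow per month
exact = s0 * math.exp(-k * horizon)

for dt in (1.0, 0.25, 0.05):
    approx = euler_stock(s0, k, horizon, dt)
    print(f"dt={dt:<5} S(12)={approx:8.3f}  error={approx - exact:+.3f}")
```

Running this shows the error shrinking as dt shrinks, which is the formal version of the question "for how long is it safe to assume the rates don't change significantly?" — if the answer really is "a month", the dt = 1.0 run stays acceptably close to the exact curve.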
It would be helpful to hear how others might explain the dangers of this
simplifying approach to audiences that don't have a strong grasp [or
interest!] in the math.
Kim Warren
From: ""Kim Warren"" <Kim@strategydynamics.com>