Greg Scholl asks
"Since the methodology [of system dynamics] strives to build formal,
descriptive structures, shouldn't those structures behave at least
similarly to the real-world phenomena they describe?"
Absolutely. Indeed, the reason to build a model of a structure is to
figure out how the real-world structure behaves. Unknown to us, the
real-world structure may behave in a way that does NOT generate some
reference mode. In this case, a model of the real-world structure
can help us understand this important fact (and alert us that we had
best look elsewhere for the source of the reference mode). The model
would be useful because it helped us understand something about how
the real-world structure actually behaves. Even though the mismatch
between reference mode and model output does not indicate the model is
not useful, we still need to know that there is a mismatch. The
mismatch in most (all important?) cases is so gross that it can be
SEEN with the human eye.
I hope that Greg did not interpret my previous remarks to mean that
information about the real world is unimportant. Far from it.
The critical information is, of course, about the real-world structure
(and patterns of behavior). To paraphrase Jay Forrester's famous
funnel diagram: information about structure for the most part is
found in managers' heads; to a far lesser extent in writings; to an
extraordinarily small extent in the numerical data base; and to a
truly unbelievably small extent in the part of the numerical data
base that is time-series.
What I WAS arguing is, first, achieving a close, point-by-point fit to
time series can be misleading or otherwise damaging to generating
insights if achieving the fit results in warped parameters (which it
often does), if it means that resources will be inadequate for
analysis, or if it results in unwarranted confidence in the model on
the part of either the modelers or their audiences. Second, in cases
where all of the preceding problems are avoided, then fitting model
output closely to data is usually simply unimportant -- model-derived
insights are robust.
Regards,
Jim Hines
jimhines@interserv.com
Reliable models
Jim Hines (SD0040) says that one could not check the correspondence of
even detrended economic data against a model of, say, inventory dynamics
"beyond a rough check". This is simply not true. Not only can you do
it, but it has been done. The question is whether one learns anything
meaningful from the quantitative comparison of the model against the
data that couldn't be gleaned from the "rough check". Now, Jim isn't
specific as to the meaning of the rough check, so once again I call on
Jim to get down to cases by specifying particular instances in which he
feels it isn't worthwhile to compare models against data, including the
purpose of the model and the nature of the tests he would do.
To be clear on my part, I will assume the purpose is to investigate the
role of inventory dynamics in business cycles (which is the purpose Jim
originally proposed for his hypothetical), and that the audience one
ultimately wishes to influence is the academic and economic policy
community. I posit this audience because I don't think other potential
audiences particularly care about the role of inventories in the
business cycle. Though individual managers do care about the role of
their own inventories in managing their own businesses, this latter
purpose would not lead to the macroeconomic model Jim proposed, but to a
firm- or industry-specific model. However, Jim should be specific about
the purpose and audience for which he thinks the rough check is
adequate.
I think it is obvious that the academic and policy audience in economics
will not take seriously a model in which the empirical component
consists of a rough check. Quantitative assessment is required: it is
a standard they expect, and work that does not conform to the standard
is not considered professional, nor is it taken seriously.
Let's leave the sociological issue of acceptance by the economics
community aside, however, and consider Jim's case on the merits.
First, the matter of detrending. Nowhere do I suggest that you should
detrend by a simple exponential if the trends or excluded modes in the
data are more complex. Jim should not criticize the feasibility or
the utility of quantitative model assessment by assuming that the
modelers do their analysis badly.
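For concreteness, here is a minimal Python sketch of the simple-exponential case (the series and growth rate below are fabricated for illustration); a modeler facing a more complex trend would swap in a richer trend model in the same spirit:

```python
import numpy as np

def detrend_exponential(t, y):
    """Fit y ~ a*exp(b*t) by linear regression on log(y) and
    return the detrended series y/trend along with the trend."""
    b, log_a = np.polyfit(t, np.log(y), 1)   # slope and intercept of the log-linear fit
    trend = np.exp(log_a + b * t)
    return y / trend, trend

# Synthetic series: 3%/period exponential growth carrying a 20-period cycle.
t = np.arange(100.0)
y = 100 * np.exp(0.03 * t) * (1 + 0.1 * np.sin(2 * np.pi * t / 20))
cycle, trend = detrend_exponential(t, y)
# `cycle` now oscillates around 1.0; the growth trend has been removed,
# leaving the cyclical mode available for comparison with model output.
```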
Second, Jim says the quantitative comparison of amplitude, phase, etc.
in the model and the data cannot yield meaningful results because the
excluded modes would all have to be included in the model so that their
exact impact can be subtracted from the raw data to leave only the
impact of inventories. Nonsense. It seems to me that Jim has an
implicit assumption that the purpose of quantitative assessment of model
against data is to build the case that the model is right. I have
repeatedly stressed that the purpose of the comparison is to find the
ways in which the model is wrong (or inadequate for the purpose). One
wants to do these tests precisely to discover if the exclusion of trade,
or interest rates, or other feedbacks matters. Following Jim's
procedure will certainly leave the modeler older and will likely leave
him no wiser, because he is less likely to discover flaws in his model.
A concrete example is provided by exactly this hypothetical case: in a
simple model of inventory dynamics with exogenous demand, quantitative
comparison of phase relationships showed that although many were within
the range observed in the real data, the phase relationship between
inventory and output was not. The source was tracked to the exogeneity
of aggregate demand. When aggregate demand was made endogenous (by
replacing the exogenous stream of orders with the assumption that the
consumers of durables manage the stock of durables they own by adjusting
the flow of orders), the phase relationship of inventory and output came
much closer to that observed in the real data. The result was a richer
account of the role of inventories in the business cycle by expanding
the dynamic hypothesis beyond the behavior of producers to include the
feedbacks between producers and consumers.
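To make the flavor of such a phase comparison concrete, here is a toy Python version; the stock-management structure, parameter values, and cyclical order stream are my own illustrative inventions, not the model John describes. It drives a production-delay inventory model with cyclical orders and reads the output-inventory timing off the cross-correlation peak:

```python
import numpy as np

dt, horizon = 0.25, 200.0
steps = int(horizon / dt)
desired_inv = 100.0      # inventory target (illustrative units)
inv_adj_time = 2.0       # time to correct inventory gaps
delay = 4.0              # first-order production delay
t = np.arange(steps) * dt
orders = 50 + 5 * np.sin(2 * np.pi * t / 20)     # exogenous cyclical demand

inventory, wip = desired_inv, delay * orders[0]  # start at equilibrium
inv_hist, out_hist = [], []
for k in range(steps):
    completion = wip / delay                     # output: production completed
    start = orders[k] + (desired_inv - inventory) / inv_adj_time
    wip += (start - completion) * dt             # work-in-process stock
    inventory += (completion - orders[k]) * dt   # inventory stock
    inv_hist.append(inventory)
    out_hist.append(completion)

inv = np.array(inv_hist) - np.mean(inv_hist)
out = np.array(out_hist) - np.mean(out_hist)
# Timing of output relative to inventory, from the cross-correlation peak.
lags = np.arange(-40, 41)
xcorr = [np.corrcoef(inv[40:-40], out[40 + l: len(out) - 40 + l])[0, 1]
         for l in lags]
lead_time = lags[int(np.argmax(xcorr))] * dt
```

Comparing `lead_time` and the amplitude ratio against the corresponding statistics of detrended data is exactly the kind of quantitative check at issue; a mismatch points toward a structural exclusion, such as the exogeneity of demand.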
To do this type of work does not require that all possible structures be
built into the model. I see no basis for asserting that those who seek
quantitative tests of models must include all relevant modes, while
those like Jim who are content with the rough check need not. I think
what Jim is trying to say is that a model need not have an R^2 of 99.99%
to be acceptable, and I certainly agree. This is what I meant by
tracking down discrepancies to a satisfactory resolution. Such
tracking down does not mean refining parameters, structure, or inputs
until the fit is perfect. My concern is that the rough check method
makes it harder to find the real flaws in models, and makes it easier to
persuade yourself or your client that your model is good enough, when a
more rigorous assessment would make discrepancies more salient to
yourself and your audience. It may take time to do this, but for the
purpose I hypothesized, the stakes are large and the effort worthwhile
(the stakes being to identify high leverage points to moderate business
cycles). Given that business cycles have been extensively studied in
economics for more than a century, I would expect that the economy would
yield up its secrets slowly, that the efforts of many people would be
required, and that answers would come not from the work of one person
but from the collective, interactive, sometimes contentious process of
proposing and testing theory in a scientific community. When this
process is done well, we do become wiser as we grow older.
Finally, Jim suggests that I advocate models driven by exogenous
variables, while good theory should capture the dynamics endogenously.
Jim should know me better than that. Of course a good theory should
provide an endogenous account of the dynamics.
But every model has exogenous inputs (parameters). Some of these may be
properly represented as time varying (as in the order rate for product).
Others are assumed by the modeler to be sufficiently decoupled from the
system that they may be represented as constants, but of course in the
real world they are not constant.
To take a concrete example, Rogelio Oliva has just completed his PhD
thesis here at Sloan in which he studied a bank lending operation. The
model captures endogenously a wide range of feedbacks that determine the
dynamics of service quality and delivery. He quantitatively and
formally estimated many of the key parameters, and also used soft data
such as interviews and observation. The formal estimation process and
quantitative assessment of the model fit led to substantial and
significant refinement of the dynamic hypothesis. The model provides a
fully endogenous theory accounting for the dynamics of service quality
at the bank, but it still has a few exogenous inputs. One of these is
employee absenteeism. Now of course, absenteeism could be partially
endogenous: poor working conditions, for example, may enhance
absenteeism. However, quantitative assessment (as well as the interview
data) revealed no statistically meaningful evidence that these feedbacks
were significant; for the most part, the absenteeism appeared to be
random or calendar-driven (high near holidays, for example). To test
the model, Rogelio used the exogenous absenteeism data as an input. In
policy tests, he used a time series for absenteeism generated by a
random variable with the same variance and autocorrelation spectrum as
the actual data. These exogenous inputs do not create the dynamics, but
perturb the system enough to elicit the important endogenous dynamics.
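I do not know exactly how Rogelio generated his series, but a common sketch of the idea is an AR(1) surrogate matched to the moments of the data; this matches mean, variance, and lag-1 autocorrelation only (a full spectrum match needs a richer model), and the "absenteeism" series below is fabricated:

```python
import numpy as np

def ar1_surrogate(data, seed=None):
    """Random series with approximately the same mean, variance, and
    lag-1 autocorrelation as `data` (an AR(1) process)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(data, dtype=float)
    mu, var = x.mean(), x.var()
    xc = x - mu
    rho = np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc)   # lag-1 autocorrelation
    eps_std = np.sqrt(var * (1.0 - rho ** 2))        # innovation std for AR(1)
    y = np.empty(len(x))
    y[0] = 0.0
    for i in range(1, len(x)):
        y[i] = rho * y[i - 1] + rng.normal(0.0, eps_std)
    return y + mu

# Hypothetical daily absenteeism fraction, then a statistically similar input.
rng = np.random.default_rng(1)
absent = 0.05 + 0.02 * rng.standard_normal(500)
surrogate = ar1_surrogate(absent, seed=2)
```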
The ability of the model to track the performance of the lending center
when driven by the actual data was extremely helpful in discovering
subtle flaws in the initial formulation. It is a lovely piece of work,
and I encourage all interested in service quality or in the rigorous
testing and calibration of a model to take a look. The thesis is
available from Rogelio at <roliva@mit.edu>.
John Sterman
jsterman@mit.edu
Reliable models
I agree very much with Ed Gallaher that stock and flow diagrams and
causal loop diagrams are easier for people to understand than
differential equations, and I, along with many in the SD field, have
long advocated opening the modeling process to the widest possible
audience by avoiding needless complexity in the presentation of models.
However, when Ed says that "differential equations [are] not SD models," I
have to disagree. We have to distinguish the substantive assumptions of
a model from the format in which it is presented. In substance, a
differential equation model is a system dynamics model. They are
mathematically equivalent. That is, there is an algorithm which can
translate any differential equation model into an equivalent "system
dynamics" model and vice versa. Of course, an SD model, like
differential equation models, can have discrete and stochastic elements
too; but there is still a one-to-one mapping of the DE to the SD
formats: they are two different representations of the same mathematical
structure.
However, the format of presentation of diffeq models and SD models is
often very different: the differential equation models typically
involve lots of Greek letters, while the SD presentation involves lots
of diagrams (but not always: when I seek to publish my SD models in
academic journals where Greek letters and differential equations are the
norm, I do so, while I present the same model using stock and flow
diagrams for other audiences). When in Rome, one must speak as the
Romans do, but one still speaks the truth as one sees it.
I agree completely with Ed that using the diffeq presentation excludes
much of the audience one would like to reach. When notation is used to
exclude people, or to display how smart the author is, then learning
suffers. Einstein said something like "an idea should be presented in
as simple a way as possible, but not simpler." Stock and flow diagrams
are easier for most people to grasp than dx/dt = ax - bxy, but
understanding stock and flow diagrams too requires training, practice,
and discipline (think of it as learning to read music - it requires
similar practice and effort).
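The equation dx/dt = ax - bxy above is the prey half of a Lotka-Volterra model, and the mapping between the two formats is mechanical: x is a stock, ax its inflow, bxy its outflow. A minimal Euler-integration sketch in Python (parameter values arbitrary):

```python
def lotka_volterra(a=1.0, b=0.1, c=1.5, d=0.075,
                   x=10.0, y=5.0, dt=0.001, steps=20000):
    """Stock-and-flow (Euler) integration of the Lotka-Volterra
    equations; identical in substance to the differential equations
    dx/dt = a*x - b*x*y, dy/dt = d*x*y - c*y."""
    xs, ys = [x], [y]
    for _ in range(steps):
        prey_births = a * x      # inflow to the prey stock
        predation = b * x * y    # outflow from prey
        pred_births = d * x * y  # inflow to the predator stock
        pred_deaths = c * y      # outflow from predators
        x += (prey_births - predation) * dt
        y += (pred_births - pred_deaths) * dt
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = lotka_volterra()  # both populations cycle, roughly a quarter cycle apart
```

Run either representation and you get the same trajectories; only the notation for the audience differs.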
I look forward to the day when stock and flow diagrams and causal loop
diagrams are as commonplace in the scientific literature as differential
equations. There are actually a number of disciplines where this is
already becoming true, including biomedicine.
John Sterman
jsterman@mit.edu
Reliable models
irt: jsterman@MIT.EDU (John Sterman), Thu, Apr 25, 1996 7:44 PM EST
Regarding John's comment:
I look forward to the day when stock and flow diagrams and causal loop
diagrams are as commonplace in the scientific literature as differential
equations. There are actually a number of disciplines where this is
already becoming true, including biomedicine.
I would just like to second this and offer support for the emergence of
this environment. For, as John said, "...one still speaks the truth as one
sees it....". Yet speaking truth is not quite enough. We must also seek
understanding of truth, and I find SFDs and CLDs often make truth much
easier to see and understand than DEs.
Gene Belling
CrbnBlu@aol.com
PS It would appear that this list has come to life!!!! I am quite pleased
with the contributions so many of you are making to support this happening.
Reliable models
/ Jim Hines writes:
| What I WAS arguing is, first, achieving a close, point-by-point fit to
| time series can be misleading or otherwise damaging to generating
| insights if achieving the fit results in warped parameters (which it
Please allow me a few comments, even though I am only an epidemiologist.
For most complex models (involving nonlinearity), I wonder if one could
achieve a close fit; only a qualitative similarity can be reached, due to
sensitivity to initial conditions. For example, in my field, modeling the
population biology of infectious processes leads to a close yet merely
qualitative resemblance of the simulation to the true time series.
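Pierre's point about initial conditions is easy to demonstrate with the simplest chaotic system, the logistic map; this is not an epidemic model, but the phenomenon is the same. Two runs of identical structure, differing only in the ninth decimal of the initial condition, soon decorrelate completely:

```python
def logistic(x0, r=4.0, n=60):
    """Iterate the chaotic logistic map x -> r*x*(1-x)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic(0.2)
b = logistic(0.2 + 1e-9)            # perturb by one part in a billion
gap = [abs(u - v) for u, v in zip(a, b)]
# Early on the runs are indistinguishable; within a few dozen iterations
# the gap is of order 1, so a point-by-point fit to either series is
# hopeless even though the generating structure is identical.
```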
Another point I want to stress as far as complex models (defined
as above) are concerned: I believe that the same qualitative
resemblance may be achieved through the use of models that may not favor
understanding; in this case, the model may be a useful predictor of the
real series but a poor understanding device. This might be due to
confounding within the structure.
My two cents...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Pierre Philippe, Ph.D BEWARE!
Social & Preventive Medicine IF NATURE CAN FOOL YOU, BE SURE IT WILL
U of Montreal ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Quebec email:philippp@ere.umontreal.ca
Canada http://alize.ere.umontreal.ca/~philippp
~~~~~~~~~~~~~~~~~~~~~~~Listowner EPIDEMIO-L (Listproc@CC.UMontreal.CA)
Reliable models
On Fri, 26 Apr 1996, philippp@ERE.UMontreal.CA (P. Philippe) wrote in SD066:
>For most complex models (involving nonlinearity), I wonder if one could
>achieve a close fit; only a qualitative similarity can be reached, due to
>sensitivity to initial conditions. For example, in my field, modeling the
>population biology of infectious processes leads to a close yet merely
>qualitative resemblance of the simulation to the true time series.
In social system modeling it is possible to achieve a close fit. It takes time
and some "creativity". In most cases the fit is achieved with exogenous drivers
or with structure added specifically to get the fit. Some people in this
discussion believe that when done properly, achieving the fit in this way is
helpful. Others in this discussion believe that often or usually it is not done
well; and even when done well the effort can consume resources which could be
better applied elsewhere.
Jim Hines
jimhines@interserv.com
Reliable models
>Date: Wed, 1 May 1996 11:04:06 -0700
>To:e1sad@ice.csv.warwick.ac.uk (Adrian Boucher)
>Adrian Boucher wrote:
>> In schools we have a so-called
>>National Curriculum which is deemed to contain all that is needed to
>>become suitably educated for the modern complex world.
>>Unfortunately, it doesn't address transferable analytical skills such
>>as those developed through systems thinking and system dynamics. The
>>curriculum is said to be overcrowded so that there is no room for
>>additional techniques such as SD. This, it seems to me, entirely
>>misses the point. Many of the processes in biology, physics,
>>chemistry, economics, ecology, (I could go on, but I guess you get
>>the point), may be modelled simply by generic SD structures, and the
>>similarity of these could be used as a powerful learning environment
>>which emphasises the TRANSFERABILITY of results across subject
>>boundaries.
This is EXACTLY the point we are trying to illustrate with SyM Bowl.
The model and behavior of drug elimination are -very close- to the model
of dilution of salt in a tank in a pickle factory, and -very close- to the
dilution of pesticide in a reservoir with clear water running in.
This latter example could be a tremendous learning tool re the -economics-
of cleaning pesticides out of the environment. And then it could be
modified slightly to explain why one should take three aspirin for the
first dose, and two aspirin every X hours thereafter.
Very different subject areas. Very similar models.
And the user has a set of tools for future work as well.
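Ed's aspirin point drops out of the same first-order structure. Here is a one-compartment sketch in Python; the half-life, dosing interval, and "tablet" units are invented for illustration, not real aspirin pharmacokinetics:

```python
import math

def trough_levels(loading, maintenance, half_life=4.0, interval=4.0, n_doses=8):
    """Drug level just before each dose, with first-order elimination
    between doses (one-compartment sketch, illustrative units)."""
    k = math.log(2.0) / half_life       # elimination rate constant
    decay = math.exp(-k * interval)     # fraction remaining after one interval
    level = float(loading)
    troughs = []
    for _ in range(n_doses):
        level *= decay                  # exponential decay between doses
        troughs.append(level)
        level += maintenance            # take the next dose
    return troughs

with_loading = trough_levels(loading=3, maintenance=2)
no_loading = trough_levels(loading=2, maintenance=2)
# With interval == half-life, the troughs converge to one maintenance dose;
# the larger first dose starts the patient much nearer that plateau.
```

With these invented numbers, the loading dose that lands on the plateau at once would be maintenance/(1 - decay), i.e. twice the maintenance dose; "three then two" is the coarse practical version of the same idea.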
Despite the complexity of the real world, I expect that a reasonably
approachable number of simple SD models will provide huge increases in
understanding. Then we (society) can go on from there.
ed
(Ed Gallaher)
gallaher@teleport.com
>To:e1sad@ice.csv.warwick.ac.uk (Adrian Boucher)
>Adrian Boucher wrote:
>> In schools we have a so-called
>>National Curriculum which is deemed to contain all that is needed to
>>become suitably educated for the modern complex world.
>>Unfortunately, it doesnt address transferable analytical skills such
>>as those developed through systems thinking and system dynamics. The
>>curriculum is said to be overcrowded so that there is no room for
>>additional techniques such as SD. This, it seems to me, entirely
>>misses the point. Many of the processes in biology, physics,
>>chemistry, economics, ecology, (I could go on, but I guess you get
>>the point), may be modelled simply by generic SD structures, and the
>>similarity of these could be used as a powerful learning environment
>>which emphasises the TRANSFERABILITY of results across subject
>>boundaries.
This is EXACTLY the point we are trying to illustrate with SyM Bowl.
The model (and behavior) of drug elimination is -very close- to the model
of dilution of salt in a tank in a pickle factory, and -very close- to the
model of dilution of pesticide in a reservoir with clear water running in.
This latter example could be a tremendous learning tool re the -economics-
of cleaning pesticides out of the environment. And then it could be
modified slightly to explain why one should take three aspirin for the
first dose, and two aspirin every X hours thereafter.
Very different subject areas. Very similar models.
And the user has a set of tools for future work as well.
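All three examples share the same generic SD structure: a stock drained by a first-order outflow, optionally topped up by discrete doses. A minimal sketch in Python, with illustrative parameter values that are assumptions for demonstration, not real pharmacokinetic or plant data:

```python
# Generic first-order "stock and drain" structure shared by drug
# elimination, salt dilution, and pesticide washout.
# DT, K, and DOSE_INTERVAL are illustrative assumptions, not measured data.

DT = 0.1             # Euler integration step (hours)
K = 0.15             # fractional outflow per hour (elimination/dilution rate)
DOSE_INTERVAL = 4.0  # hours between repeat doses

def simulate(loading_dose, maintenance_dose, hours=24.0):
    """Integrate stock' = -K * stock, adding a dose every DOSE_INTERVAL."""
    steps_per_dose = int(DOSE_INTERVAL / DT)
    stock = loading_dose
    levels = [stock]
    for step in range(1, int(hours / DT) + 1):
        stock -= K * stock * DT            # first-order drain
        if step % steps_per_dose == 0:
            stock += maintenance_dose      # discrete repeat dose
        levels.append(stock)
    return levels

# A larger first dose reaches the plateau sooner; the same structure
# with maintenance_dose = 0 is the pesticide-washout (pure dilution) case.
loaded = simulate(loading_dose=3.0, maintenance_dose=2.0)
washout = simulate(loading_dose=3.0, maintenance_dose=0.0)
```

Relabeling the variables (drug amount becomes salt in the tank, a dose becomes salt added, K becomes the turnover of clean water) leaves the model unchanged, which is exactly the transferability point above.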
Despite the complexity of the real world, I expect that a reasonably
approachable number of simple SD models will provide huge increases in
understanding. Then we (society) can go on from there.
ed
(Ed Gallaher)
gallaher@teleport.com
Reliable models
irt: p.atkins@unsw.EDU.AU (Paul Atkins), Tue, May 7, 1996 5:54 AM EST
Paul asks if there are stages one goes through in developing expertise in the
SD arena. And I guess I want to say yes, and then I guess I'm supposed to
tell what they are --- and this is the hard part.
I think my understanding has progressed from insight to insight, though not
in what I would claim has any sensible order to it. And the things I
considered insights were in fact things other people already knew. Each came
when I made a connection with something that seemed to provide an
understanding beyond what was there before.
The insights I seem to recall are:
1. Finally grasping the concept of "system" and relating it to
everything in the universe.
2. The implications of "system" are things I'm still uncovering on an
ongoing basis. One of the most profound implications (at least for me) was
understanding "emergence."
3. Continuing to deepen my understanding of the principles (or foundational
truths) of systems. Senge provides a number of these, and a longer list is
provided by Draper Kauffman in "Systems 1: An Introduction to Systems
Thinking." You can find a list of these at the end of the paper at:
http://www.radix.net/~crbnblu/columbo.html
4. Along with the perspectives of systems there was also a struggle to learn
to do simulations (and this one is far from over). I could look at other
people's models and run them, play with them, reimplement them, but when it
came to my own I would sit for hours with a blank screen, worried about
where the right place to start was. Nike finally provided me with the answer
to this --- "Just Do It!" So now I don't worry about where I start. It all seems
to end up in the same place as long as I continue to iterate the process.
5. And from here it's figuring out how to conquer one's ego and embrace the
capacity to learn from everything and everyone --- whether they understand
what you're doing or not. I find some of my greatest dilemmas are solved by
talking to people who don't have the foggiest idea what I'm doing --- go
figure!
6. Do a lot of it! It seems to improve with practice....
Hope this helps some...
Gene Bellinger
CrbnBlu@aol.com