Dear SD Community,
I am writing on behalf of an inter-institutional research group in Austria,
which is conducting a research project aimed at modeling the carbon cycle at
a sub-regional (national) level. The (Austrian Carbon Balance) model
encompasses a range of modules, including the biosphere (forestry,
agriculture), energy, industry, and waste management. This is going to be a
complex task, and the resulting model will be quite heterogeneous in the
time resolution and dynamics of each module: some, such as the biosphere,
are highly dynamic, while others, such as the energy modules, are less
dynamic (basically linear).
The results of a pre-study demonstrate the feasibility of the approach
taken, but they also suggest looking into other (commercially and
non-commercially available) modeling support tools.
We found several, mostly commercially available, packages, among them
Vensim, Powersim, and Stella, which all have the "system dynamics" approach
in common. Modeling work on selected modules (forestry, agriculture, energy)
had been carried out using MS Excel and another (non-system-dynamics)
supporting tool developed at TU Vienna.
One important point is that the model should have a modular character, i.e.
modules are created by different groups and are later connected into one
entity. As mentioned above, the modules will most likely have different
characteristic time steps: the time step in the forestry module may be 10
years, while that of the agricultural module may be only 1 year. The model
will also incorporate both continuous and discrete functions.
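To make the mixed-time-resolution idea concrete, here is a minimal Python
sketch of advancing two submodels on their own clocks within one run. All
module names, stocks, and rates are purely hypothetical, not values from the
actual project:

```python
# Hypothetical sketch: two submodels advanced on their own clocks.
# Module names, stocks, and rates are illustrative, not project values.

class Module:
    def __init__(self, name, dt, stock, rate):
        self.name, self.dt = name, dt
        self.stock, self.rate = stock, rate
        self.next_update = 0.0  # simulation time of the next step, years

    def step(self):
        # Simple Euler update of the module's carbon stock over one step.
        self.stock += self.rate * self.dt
        self.next_update += self.dt

def run(modules, horizon):
    """Advance every module to the horizon, each at its own time step."""
    t = 0.0
    while t < horizon:
        t = min(m.next_update for m in modules)  # earliest pending update
        for m in modules:
            if m.next_update <= t:
                m.step()
    return {m.name: m.stock for m in modules}

results = run([Module("forestry", dt=10.0, stock=100.0, rate=0.5),
               Module("agriculture", dt=1.0, stock=50.0, rate=0.2)],
              horizon=50.0)
```

Here the forestry module is stepped once per decade while the agricultural
module is stepped yearly, yet both share one simulation clock.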
Our question is whether SD and tools such as Vensim are suitable for
realizing such a complex task.
Which (other) tool(s) would you suggest using? I would also like to know
whether such a (or a similar) task has already been carried out using SD.
Thanks in advance for your comments.
B.Mayr
--------------------------------------------------------------------------
******* * * Bernd MAYR, Dipl.-Ing.
* * * Forschungsinstitut fuer Chemie und Umwelt
* * * (Research Institute of Chemistry and the Environment)
* **** TECHNISCHE UNIVERSITAET WIEN (VIENNA UNIV. OF TECHNOLOGY)
-------------- Getreidemarkt 9/191, A-1060 VIENNA; AUSTRIA
W I E N Phone: ++43-1-58801-5194 Fax: ++43-1-581-29-52
email: bmayr191@fbch.tuwien.ac.at,
Bernd.Mayr+e191@tuwien.ac.at
WWW of Inst.: http://info.tuwien.ac.at/ficu
--------------------------------------------------------------------------
SD for research study on carbon cycle
With regard to the "chaining" of comodels, I have seen this concept used in a
beta version of the Industrial Base Analysis Model developed by Decision
Dynamics Inc. of Bethesda, MD. Although never tested to its maximum
capability, object-oriented production maps of 30,000+ linked models could
theoretically be constructed, allowing user-controlled what-ifs to be run
within each of the models. The model was developed using iThink and C++.
This concept may or may not suit your needs. Not being a programmer or a
hands-on SD modeler (yet), I suggest you contact Dr. Lou Alfeld for further
details at (301) 657-8500 or lealfeld@decisiondynamics.com.
Ron Cooke, ODUSD(IA&I), Industrial Capabilities and Assessments
Dept of Defense
From: Ron Cooke 57916 <rcooke@acq.osd.mil>
SD for research study on carbon cycle
In response to Bernd Mayr's query about co-(sub-)models and
integrated applications:
Powersim has the capability to link several co-(sub-)models, a feature you
are looking for in simulation software. Powersim is also MS Windows
compliant, which allows the use of DDE and OLE for data transfer between
applications. This way you can even mix a traditional spreadsheet analysis
with the simulated data from a system dynamics model. In fact, Powersim is
currently working with a client to do just what you described: we are
integrating a dynamic model with spreadsheet calculations.
You can even "expand" the software with a custom user interface written in
Visual Basic, Delphi, or, recently, Java for client-server simulation on
the inter/intra/extranets. Please contact us for further details about our
software's capabilities.
--
---------------------------------
Benny Budiman
benbu@powersim.com
703-707-6421
Powersim Corporation
The Business Simulation Company
www.powersim.com
---------------------------------
SD for research study on carbon cycle
Powersim may be suitable for this kind of work: it can handle very large
models, and you can disaggregate the model into modules called "comodels",
which together form your "main" model. It also has tools that make data
exchange with Excel easy, and, very importantly for your purpose, the
comodels (modules) may be run for different time periods (forestry vs.
agriculture) and with different degrees of accuracy: if a linear part calls
for large time steps you can use them there, while a highly nonlinear part
can use very small time steps (biosphere vs. energy). This can save
computing time for large and complex models. For reusability of comodels,
you could define array variables, array parameters, array levels, array
flows, etc. to model, for example, similar but not identical regions.
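The array idea amounts to applying one set of equations across a dimension
such as region. A minimal sketch in Python/NumPy (the region names, stocks,
and growth rates are illustrative, and this is not Powersim syntax):

```python
# Hedged sketch of arrayed variables: one equation applied across regions.
# Region names, stocks, and growth rates are illustrative, not Powersim syntax.
import numpy as np

regions = ["north", "south", "east"]
stocks = np.array([100.0, 80.0, 120.0])  # carbon stock per region, tC
growth = np.array([0.02, 0.03, 0.01])    # region-specific growth rate, 1/yr

def step(stocks, growth, dt):
    """One Euler step of the same flow equation for every region at once."""
    return stocks + dt * growth * stocks

for _ in range(10):  # ten 1-year steps
    stocks = step(stocks, growth, dt=1.0)
```

Adding a region then means adding an array element, not duplicating model
structure.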
The "chaining" of comodels, whereby you link variables in one comodel with
corresponding ones in another, can become quite tricky, cumbersome, and
unmanageable for large models, though. I don't know, however, whether other
tools allow the kind of modular structure and modular management of models
that Powersim does, suitable for what you are doing. Stella/iThink allows
mappings (layers), which are useful for understanding, but I don't think
(or don't know whether) different submodels can be managed in different
ways in Stella, e.g. with different time steps and time horizons. Using
object-oriented technologies (whereby modules may be "encapsulated" as
object classes, with "inheritance" of attributes and services) may be the
best way to do this type of modeling, though at least nowadays you have to
be a C++ expert to do it.
Check their sites www.powersim.com or www.hps-inc.com; I don't know about
Dynamo, Vensim, Madonna, etc. Hope these comments help.
Good luck and regards,
Jaideep
From: Jaideep <jm62004@Jetson.UH.EDU>
====================================================================
Jaideep Mukherjee, Ph. D.
Research Associate
Industrial Engineering
E210, D3, Cullen College of Engineering
University of Houston
Houston, TX 77204-4812
http://www.uh.edu/~jm62004/jm97.html
Ph: 713 743 4181; Fax: 713 743 4190
====================================================================
SD for research study on carbon cycle
Bernd,
I hope these notes on the construction of multiple linked models are
arriving in time to still be of some use to you.
In any SD package, you can construct submodels separately, then cut and
paste to integrate them into a single model. If you agree on the inputs and
outputs for each sector in advance, this is fine, as long as you don't go
through the (work on submodels) -> (integrate) -> (discover bugs in
submodels) -> (work on submodels) cycle very often.
In Vensim, it is also extremely easy to use data from one simulation as an
exogenous driver for another simulation. This is useful for developing a
submodel in isolation or for chaining submodels, but is inefficient for
closing a feedback loop between two submodels. Vensim and other packages
also make it easy to link an SD model with a spreadsheet or other software.
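To make the chaining idea concrete, here is a hedged Python sketch of one
simulation's stored output serving as an exogenous driver for the next. The
model structure, parameters, and units are invented for this illustration:

```python
# Illustrative chaining: the stored output of one simulation drives the next.
# Model structure, parameters, and units are invented for this sketch.

def simulate_emissions(years, growth=0.02, e0=10.0):
    """Upstream model: emissions growing exponentially, Euler-stepped yearly."""
    series, e = [], e0
    for _ in years:
        series.append(e)
        e += growth * e  # dt = 1 year
    return series

def simulate_atmosphere(years, emissions, uptake=0.01, c0=750.0):
    """Downstream model: an atmospheric stock driven by the saved series."""
    trajectory, c = [], c0
    for i, _ in enumerate(years):
        trajectory.append(c)
        c += emissions[i] - uptake * c  # net accumulation per year
    return trajectory

years = range(1990, 2000)
emissions = simulate_emissions(years)           # run model A, store its output
carbon = simulate_atmosphere(years, emissions)  # feed it to model B
```

Because the emissions series is fixed before the second run, any feedback
from atmospheric carbon back to emissions is cut, which is exactly why this
approach is inefficient for closing a loop between two submodels.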
The logistical difficulties of creating a single model rather than a set of
linked modules may be worth bearing. Construction of pieces in isolation
(especially when subsystem boundaries correspond with disciplinary
boundaries) may lead to an impoverished picture of the feedback among
subsystems.
Whatever software you choose, I would argue strongly for a continuous time
implementation. Use of continuous time doesn't preclude representation of
discrete events, and it makes it easy to verify that the results of the
model are not numerical artifacts, by simply reducing the time step until
the results don't change. A continuous time implementation is flexible, in
that you can always shorten a subsystem's time step in order to synchronize
it with another subsystem that has faster dynamics. As long as the models
aren't too big (thousands of variables are quite manageable in Vensim, even
for optimization and sensitivity analysis), the "wasted" time steps in the
slower subsystem aren't really a problem.
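The time-step test described above can be sketched in a few lines of Python.
The toy model here, a first-order decay with an 8-year time constant, is
purely illustrative:

```python
# Sketch of the time-step test: halve dt until successive runs agree.
# The toy model (first-order decay, tau = 8 years) is purely illustrative.

def simulate(dt, horizon=50.0, stock0=100.0, tau=8.0):
    """Euler integration of d(stock)/dt = -stock / tau."""
    stock, t = stock0, 0.0
    while t < horizon - 1e-12:
        stock += dt * (-stock / tau)
        t += dt
    return stock

def converged_dt(dt0, tol=1e-3):
    """Halve the time step until the end-of-run result changes by < tol."""
    dt, prev = dt0, simulate(dt0)
    while True:
        dt /= 2.0
        cur = simulate(dt)
        if abs(cur - prev) < tol:
            return dt, cur
        prev = cur

dt_ok, final_stock = converged_dt(1.0)
```

If the result keeps changing as dt shrinks, the earlier runs were reporting
numerical artifacts rather than model behavior.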
Discrete time models are frequently plagued by fuzzy stock/flow
distinctions, dimensional inconsistencies, and implicit delays or time
constants. There are already several examples of climate policy models with
problems that are attributable to the use of discrete time:
- The DICE model's capital lifetime is misstated because Nordhaus
incorrectly compounds the outflow of depreciation from the capital stock
without regard for the inflow of investment.
- The ICAM model cascades multiple discrete delays in order to break
simultaneous equation loops, but the delays have no explicit basis in
reality and are impossible to subject to sensitivity analysis without major
reconstruction of the model.
- The Connecticut/YOHE model uses insufficient energy and capital in the
economy due to a discrete delay in resource allocation; this biases policy
conclusions in favor of taking no action to reduce CO2 emissions.
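A generic one-stock example (not the actual equations of any of the models
above) shows the kind of distortion a coarse discrete step introduces into a
capital stock's transient, relative to the continuous limit:

```python
# Generic illustration (not DICE/ICAM/YOHE equations): a one-stock capital
# model shows how a coarse discrete step distorts the transient path.

def capital(dt, horizon=20.0, delta=0.05, invest=10.0, k0=50.0):
    """Capital stock with constant investment and fractional depreciation."""
    k, t = k0, 0.0
    while t < horizon - 1e-9:
        k += dt * (invest - delta * k)  # Euler step; dt -> 0 is continuous
        t += dt
    return k

coarse = capital(dt=10.0)  # decadal step, a large fraction of the 20-yr lifetime
fine = capital(dt=0.125)   # near-continuous benchmark
# With dt = 10, each step depreciates only the start-of-step stock, so the
# transient overstates capital relative to the continuous-time path.
```

The bias vanishes as dt becomes small relative to 1/delta, which is the same
time-step test as above.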
(There's more along these lines in my dissertation, available for a short
while longer at http://web.mit.edu/tomfid/www, if you're interested.)
If there's a compelling reason to use discrete time - e.g. the necessity of
including a general equilibrium component or an LP - special caution needs
to be exercised to avoid these types of problems.
Good luck with your project.
Regards,
Tom Fiddaman
****************************************************
Thomas Fiddaman, Ph.D.
Ventana Systems http://www.vensim.com
34025 Mann Road Tel (360) 793-0903
Sultan, WA 98294 Fax (360) 793-2911
http://web.mit.edu/tomfid/www/ tomfid@premier1.net
****************************************************