Risk Work

This forum contains all archives from the SD Mailing list (go to http://www.systemdynamics.org/forum/ for more information). This is here as a read-only resource; please post any SD-related questions to the SD Discussion forum.
Tom Mullen
Newbie
Posts: 1
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by Tom Mullen »

Regarding the comments from Jay Forrest and others on using System
Dynamics for "precise modeling," remember that no forecast from any
source is perfect. The proper question is: "When a precise forecast is
needed, what is the best tool to use?"

1 -- All models are a simplification of reality and cannot be perfect.
This is true of System Dynamics models, discrete event models,
spreadsheets, econometric models, etc.

2 -- System Dynamics models are especially powerful for understanding
dynamic issues, and can be very helpful for building shared
understanding of the behavior of a system.

3 -- There are also times when a precise forecast is demanded for
planning purposes. The most common method of preparing forecasts is to
use a spreadsheet, factoring up or down the results of the spreadsheet
analysis until the forecast is "acceptable" to the principal
stakeholders.

4 -- Can well-constructed, well-tested System Dynamics models improve
upon this typical practice of "fudged spreadsheet analysis"? For
long-term analysis of issues involving causal dynamics, the answer is
clearly YES.

Until the day arrives that precise forecasts are no longer needed, we
should be thinking of a continuum of modeling purposes, with some models
focused on understanding general patterns of behavior, and others
designed to provide specific numerical forecasts. Though forecasts from
System Dynamics models will never be perfect, they can still improve
substantially on the flawed forecasts that we all see prepared using far
cruder and heavily biased methods of analysis.

Sincerely,

Tom Mullen
Pugh-Roberts Associates
Tom.Mullen@PA-Consulting.com
Antonio Barrón Íñigo
Junior Member
Posts: 7
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by Antonio Barrón Íñigo »

RGDICKSO@mailgw.sanders.lmco.com
Junior Member
Posts: 2
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by RGDICKSO@mailgw.sanders.lmco.com »

I am new to the concepts of System Dynamics and was wondering if there
is any work to introduce risk variables or any of the concepts of risk
management into models...


**** Dickson

Lockheed/Martin Company

Richard.G.Dickson@lmco.com
Bob Walker
Junior Member
Posts: 6
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by Bob Walker »

We are also very interested in the assessment of risk. In my view SD
methodologies have several significant advantages in dealing with risk
issues.

1. Sensitivity tests are fast and easy to implement. Several of the
platforms, notably iThink and Vensim, make the mechanics a no-brainer.
What's critical is that you have a well-validated and stress-tested
model to start with and that all the critical relationships are modelled
endogenously.

2. For many applications sensitivities alone are inadequate. Serious
risks often come from discontinuities in your business or "issue"
environment. Here again, however, the SD paradigm really helps in a couple
of ways. One is the straightforward introduction of new conditions, with
adjustable timing.

A second comes from the fundamental soundness of integrations as a
method of capturing performance results. One of the things we are
working on is a measure of "how much money is at risk, for how long" in
planning alternative courses of action. I view this as somewhat
analogous to the pricing of options in financial markets.

3. You can really only test risks that you can perceive (and model).
What about the case where "We don't know what we don't know"? Even
here, however, there may be some help. If you have a significantly
complex model it may be worthwhile to test a wide selection of model
parameters for leverage on your key performance variables (using
sensitivity methods); a minimal sketch of such a screen follows these
points. The main difference between this and the above
methods is that you will increase understanding about how "what really
matters to you" is influenced by changing conditions, whether or not you
can directly influence the parameters in question.
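
As a minimal sketch of what points 1-3 might look like in practice, here is a
toy Python illustration. The two-stock customers-and-cash model, its parameter
values, and the cash floor are all invented for the example, not drawn from any
real study. The sketch runs a Monte Carlo sensitivity sweep, computes a crude
"how much money is at risk, for how long" measure as the time-integral of cash
below a floor, and ranks the parameters with a one-at-a-time leverage screen:

import random

def simulate(churn_rate, acquisition_rate, price, unit_cost,
             dt=0.25, horizon=20.0):
    """Toy two-stock model: customers and cash, integrated by Euler.
    Returns the cash trajectory sampled every dt years."""
    customers, cash = 1000.0, 500.0              # illustrative initial stocks
    trajectory = []
    for _ in range(int(horizon / dt)):
        revenue = customers * price
        costs = customers * unit_cost + 400.0    # fixed overhead (assumed)
        recruits = customers * acquisition_rate
        losses = customers * churn_rate
        cash += (revenue - costs) * dt
        customers += (recruits - losses) * dt
        trajectory.append(cash)
    return trajectory

def money_at_risk(trajectory, floor=0.0, dt=0.25):
    """'How much money is at risk, for how long': the time-integral of the
    shortfall below a cash floor, plus the total time spent below it."""
    shortfall_area = sum((floor - c) * dt for c in trajectory if c < floor)
    time_below = sum(dt for c in trajectory if c < floor)
    return shortfall_area, time_below

# Monte Carlo sensitivity: sample the uncertain parameters within +/-20%.
random.seed(1)
base = dict(churn_rate=0.15, acquisition_rate=0.12, price=1.0, unit_cost=0.7)
runs = []
for _ in range(500):
    sample = {k: v * random.uniform(0.8, 1.2) for k, v in base.items()}
    area, years = money_at_risk(simulate(**sample))
    runs.append((area, years))
print("mean shortfall-area:", sum(a for a, _ in runs) / len(runs))
print("mean years below floor:", sum(y for _, y in runs) / len(runs))

# One-at-a-time leverage screen: which parameter moves final cash the most?
leverage = {}
for name in base:
    hi = dict(base); hi[name] *= 1.1
    lo = dict(base); lo[name] *= 0.9
    leverage[name] = abs(simulate(**hi)[-1] - simulate(**lo)[-1])
for name, swing in sorted(leverage.items(), key=lambda kv: -kv[1]):
    print(f"{name:18s} final-cash swing: {swing:10.1f}")

In a real application the trajectories would come from the validated SD model
itself -- iThink and Vensim offer comparable batch sensitivity runs -- and the
sampling ranges would reflect whatever is actually known about each parameter's
uncertainty.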

Once established, the assessment of real-world risks becomes a creative
exercise, suitable only for the wonderful computers mounted on our
necks. Possibly for the first time, you'll be able to answer the
question, "What could we do to mitigate (or capitalize on) the factors
that have the largest influence on our business (or issue)?"

This technique could be a powerful generator of the "Aha!" factor,
leading to powerful insights >> new ideas >> model testing of new
strategies/policies >> better decisions >> improved or safer results.

These are first thoughts; I look forward to reading the many other replies
that I expect your important query will stimulate.

Bob Walker
Director - Performance
Bell Canada
From: Bob Walker <rjwalker@sympatico.ca>
Jay Forrest
Junior Member
Posts: 6
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by Jay Forrest »

I would like to build on Bob Walker's comments on Risk Work.

My views are based upon several key observations:
1) Organizations and individuals in organizations often evolve into
shockingly narrow paradigms (and visions of their work and business
environment) over time.
2) This limits vision such that they focus on certain "traditional" areas of
concern and are frequently oblivious to, and even in denial of, trends and
issues that will obviously impact their business viability.
3) System dynamics is more appropriate for understanding patterns of
behavior and less appropriate for precise modeling.
4) Over the longer term (which can be quite short) we cannot predict the
future with any great accuracy.

Based upon those four observations I feel that sensitivity analysis is often a
misleading placebo. It should be done, but the value of sensitivity analysis
is closely related to the completeness and accuracy of the model. As
sensitivity analysis hones the behavior of the model to reality, the
credibility of the model grows -- often to unrealistic levels because
pending issues and alternatives (limiting factors and loops, shifting
dominance, etc.) are not properly included. The result is overconfidence in
a flawed representation.

As Bob suggested, "For many applications sensitivities alone are inadequate.
Serious risks often come from discontinuities in your business or issue
environment." In my experience very few system modelers seek to model or
recognize the uncertainties related to their models or to explore the useful
boundaries of their model. Though there is a growing focus on futures
research in selected companies, I would encourage modeling teams to seek the
counsel of a good futures consultant. The fresh input from an outside source
can often provide new perspectives quickly and efficiently and help you
understand what you don't know.

At SDSG we find that working with clients gives us an excellent
understanding of their mental models and of innate assumptions (the things
they are taking for granted). Applying our diverse industry experience and
my background in futures studies has proven very effective in stimulating
the development of more comprehensive and thought provoking models.

I believe that the greatest benefit that a system dynamics model can have is
to prepare the organization for the future. I would suggest that this is
best accomplished by growing the mental model(s) of the organization, not by
emulating history. Focusing on sensitivities tends to divert attention from
the more valuable exploration of the boundaries of the mental model.

All that said, thanks Bob and Barry for building sensitivity analysis into
Vensim and iThink/Stella. It is very helpful when refining a model! I
simply don't feel that it is very helpful in considering RISK.

Should any of you wish to know more about futures studies, please email me
directly and I will gladly refer you to others who can help you build a more
futures oriented mentality in your organization.

The Strategic Decision Simulation Group (SDSG, L.L.C.)
jayf@sdsg.com
http://www.sdsg.com
(281)493-5022 (voice) (281)558-3228 (fax)
Tom Fiddaman
Junior Member
Posts: 10
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by Tom Fiddaman »

I liked Jay Forrest's comments on risk, but found one item puzzling:

>3) System dynamics is more appropriate for understanding patterns of
>behavior and less appropriate for precise modeling.

This is not much different from the statement, "differential equations are
more appropriate for understanding patterns of behavior and less appropriate
for precise modeling."

This statement implies to me that there must be other tools that are more
appropriate for "precise" modeling. I tried to think a little about what
these might be. It seems to me that in each case the choice of tool is
dictated mostly by the ease of creating an appropriate representation of the problem:

-For laying out and scheduling machines on a shop floor, you might want an
integer program. This lets you exploit characteristics of the problem to get
an optimal solution much faster.
-For testing a chip design, you might want a dedicated parallel mini-super
logic simulator. Here, it's inefficient to use the floating point numbers SD
tools use to represent binary states, discrete time is an appropriate
choice, and there's a big payoff to problem-specific hardware.
-For evolutionary simulations, you might want SWARM, SmallTalk, LISP, etc.
If you're simulating a market with firms entering and exiting, current SD
tools require you to create all the firms in advance, whether they're active
or not. This is computationally inefficient. With OO tools, you can
instantiate firms as they're needed. Similarly, if you're trying to evolve a
good controller, you can manipulate the controller design symbolically,
rather than having to create every possible feedback structure in advance.

It seems to me the potential problem with precision in SD tools arises when
the software constrains you to use an unnatural representation of the
problem. This might happen when, for example, we just lump together a bunch
of loans into a single stock, because it's hard to disaggregate them by
payment period and track principal and interest using an amortization schedule.
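
To make that concrete, here is a small hypothetical comparison in Python (loan
size, rate, and term invented for illustration): the same pool of loans treated
as a single first-order stock with an average life, versus tracked with a
level-payment amortization schedule. The two balances diverge badly toward the
end of the term, which is exactly the kind of imprecision the lumping can
introduce:

def aggregate_balance(principal=100_000.0, avg_life_years=15.0,
                      dt=0.25, horizon=30.0):
    """Lump all loans into one stock drained as a first-order outflow."""
    balance, path = principal, []
    for _ in range(int(horizon / dt)):
        balance += (-balance / avg_life_years) * dt
        path.append(balance)
    return path

def amortized_balance(principal=100_000.0, annual_rate=0.07,
                      term_years=15, dt=0.25, horizon=30.0):
    """Track the same principal with a level-payment amortization schedule."""
    r = annual_rate * dt                      # per-step interest rate
    n = int(term_years / dt)                  # number of payments
    payment = principal * r / (1 - (1 + r) ** -n)
    balance, path = principal, []
    for step in range(int(horizon / dt)):
        if step < n:
            interest = balance * r
            balance -= (payment - interest)   # principal portion of payment
        path.append(max(balance, 0.0))
    return path

agg, amort = aggregate_balance(), amortized_balance()
for year in (1, 5, 10, 15):
    i = int(year / 0.25) - 1
    print(f"year {year:2d}:  aggregate {agg[i]:9.0f}   amortized {amort[i]:9.0f}")

Whether that imprecision matters depends, as always, on the purpose of the model.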

It's actually quite common to treat things this way - inventories and
production processes are often modeled with one or two stocks, where a
modeler with an OR background might instead choose a discrete event
simulation with disaggregation by SKU number. While the aggregation in the
SD model might be defensible given the right assumptions about underlying
distributions, it might also be a source of imprecision. However, the extra
effort required to construct the discrete event simulation might prevent a
deeper look into other areas (e.g. soft variables). In that case, the more
"precise" model might give inaccurate policy recommendations due to omitted
feedback.

Highly aggregated representations in SD models are often the right choice.
However, I think it's also sometimes a matter of habit to plunk in simple
structures like first-order smooths, even where it might be easy to use a
higher-order structure. Since CPU time is seldom a constraint, and most
software provides arrays and related functionality that makes disaggregation
easier, it's a habit we should perhaps be critical of.
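
As a small illustration of the difference (the 4-year delay, step input, and
time step are arbitrary): a first-order smooth starts responding to a step
immediately, while a third-order smooth -- the usual cascade of three
first-order stocks, each carrying a third of the delay -- produces the S-shaped
response that often matches real perception and reporting delays better:

def smooth_step_response(order, delay=4.0, dt=0.05, horizon=12.0):
    """Response of an n-th order exponential smooth (a cascade of n
    first-order stocks, each with delay/n) to a unit step input."""
    stages = [0.0] * order
    per_stage = delay / order
    path = []
    for _ in range(int(horizon / dt)):
        inflow = 1.0                       # unit step input
        for i in range(order):
            stages[i] += (inflow - stages[i]) / per_stage * dt
            inflow = stages[i]             # feed each stage into the next
        path.append(stages[-1])
    return path

first = smooth_step_response(order=1)
third = smooth_step_response(order=3)
for year in (1, 2, 4, 8):
    i = int(year / 0.05) - 1
    print(f"t={year}:  1st-order {first[i]:.2f}   3rd-order {third[i]:.2f}")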

For a broad class of tasks, I think SD tools are ideal. I'll take any SD
package over a spreadsheet.

For numerical solution of differential equations, SD tools are quite
precise. If you're simulating some weird, stiff, chaotic system, you might
want to roll your own code, but I doubt integration problems ever affect the
outcome of real business or public policy models (unless they're introduced
by bad formulations).
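
A minimal sketch of that point in Python (an undamped oscillator whose exact
amplitude should stay at 1.0; parameters invented): Euler integration visibly
inflates the oscillation at a coarse time step, while fourth-order Runge-Kutta
holds it, and shrinking the step shrinks the Euler error -- the kind of quick
check that usually shows integration error to be negligible next to
formulation error:

import math

def oscillator_amplitude(method, dt=0.25, horizon=50.0, period=10.0):
    """Undamped stock-and-flow oscillator; exact solution keeps amplitude 1."""
    w = 2 * math.pi / period
    def deriv(state):
        x, v = state
        return (v, -w * w * x)
    x, v = 1.0, 0.0
    for _ in range(int(horizon / dt)):
        if method == "euler":
            dx, dv = deriv((x, v))
            x, v = x + dx * dt, v + dv * dt
        else:  # classic fourth-order Runge-Kutta
            k1 = deriv((x, v))
            k2 = deriv((x + k1[0] * dt / 2, v + k1[1] * dt / 2))
            k3 = deriv((x + k2[0] * dt / 2, v + k2[1] * dt / 2))
            k4 = deriv((x + k3[0] * dt, v + k3[1] * dt))
            x += dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
            v += dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return math.hypot(x, v / w)            # should stay at 1.0 exactly

for dt in (0.25, 0.0625):
    print(f"dt={dt}:  Euler {oscillator_amplitude('euler', dt):.2f}"
          f"   RK4 {oscillator_amplitude('rk4', dt):.2f}")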

For parameter estimation and time series forecasting, you won't do a lot
better than the optimization and Kalman filtering capabilities of Vensim.
Most forecasting problems arise from estimating parameters for garbage
models, not from faulty estimation methods. It's much easier to build and
test a good model and to model data measurement processes with SD software
than with some rotten discrete-time stats tool.
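
This is not Vensim's optimizer or Kalman filter, but as a toy illustration of
what model-based estimation means, here is a Python sketch (all numbers
invented) that recovers the time constant of a first-order smooth from noisy
synthetic data by simple least squares over a parameter grid:

import random

def first_order_response(tau, dt=0.5, horizon=20.0):
    """Output of a first-order smooth of a unit step, time constant tau."""
    y, path = 0.0, []
    for _ in range(int(horizon / dt)):
        y += (1.0 - y) / tau * dt
        path.append(y)
    return path

# Synthetic "measured" data: the true time constant is 6, plus noise.
random.seed(3)
data = [y + random.gauss(0, 0.03) for y in first_order_response(tau=6.0)]

# Estimate tau by minimizing the sum of squared errors over a grid.
def sse(tau):
    return sum((m - y) ** 2 for m, y in zip(data, first_order_response(tau)))

best_tau = min((t / 10 for t in range(20, 150)), key=sse)
print("estimated time constant:", best_tau)   # should land near 6.0

The same idea scales up: simulate the model, compare it against the data
series, and adjust the parameters to reduce the error -- which a tool like
Vensim automates far more capably.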

Of course, maybe my only tool is a hammer, so everything looks like a nail ...

- Tom Fiddaman

****************************************************
Thomas Fiddaman, Ph.D.
Ventana Systems
http://www.vensim.com
34025 Mann Road Tel (360) 793-0903
Sultan, WA 98294 Fax (360) 793-2911
http://home1.gte.net/tomfid/ tomfid@premier1.net
****************************************************
Bill Braun
Senior Member
Posts: 73
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by Bill Braun »

Jay Forrest's comment about precise modeling is an acknowledgement that,
as systems become more complex, a model that 1) accounts for every single
relevant variable and 2) describes the dynamic relationship between every
variable with absolute accuracy is likely never to be built.

Modeling is better suited to gaining insight and being approximately
correct than to being a plug-and-chug answer machine. A model that's pretty
close is also better than one that is dead wrong.

Best regards,

Bill Braun
From: Bill Braun <medprac@hlthsys.com>

---------------------------
Medical Practice Systems Inc. (216) 382-7111 (Voice)
and The Health Systems Group http://www.hlthsys.com
Mergers - Planning - Management Services
Marketing - Managed Care - Education & Training
Jay Forrest
Junior Member
Posts: 6
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by Jay Forrest »

At 03:49 PM 10/16/97 -0400, you wrote:
>I liked Jay Forrest's comments on risk, but found one item puzzling:
>
>>3) System dynamics is more appropriate for understanding patterns of
>>behavior and less appropriate for precise modeling.

Hi Tom!

Sorry to be so slow. I was in transit when it was received and somehow it
got lost temporarily!

I apologize; I should have been more precise. I would start by
suggesting that systems thinking and system dynamics are collectively
beneficial to any modeling effort I can imagine (so long as its goal
includes some quality factor).

I should have said that I feel software designed for SD is
generally better for looking at problems quickly and easily. As the size of
the model grows, the available SD-related software begins to lose appeal. The
features that make these tools quick and easy do not necessarily facilitate
maintaining huge models. Simpler models tend to be more behavioral
(capturing the essence of behavior more than rigorous simulation). As the
need for literal simulation elevates the size of the model, I tend to feel
other tools such as G2 become more appropriate, but software alone doesn't
make better models.

Overlying all of this, my personal view is that rigorous emulation of
history creates a perverted false credibility for models. It is frequently
better to look at the assumptions underlying the model than to look at the
output. Overlooked, currently unimportant alternative loops will often play
a more important role in future behavior than the variables and sensitivities
which move a model from 90 percent accurate to 95 percent on a historical
basis. I feel this role for SD is ultimately stronger than that of
emulation. However, I will not argue that SD and SD software can emulate
many situations quite adequately.

As I type this I realize I am suffering from a bit of jet lag and hope it
makes more sense than my previous comment!

Your points are well taken!
Jay Forrest

The Strategic Decision Simulation Group (SDSG, L.L.C.)
jayf@sdsg.com
http://www.sdsg.com
(281)493-5022 (voice) (281)558-3228 (fax)
Jim Hines
Senior Member
Posts: 80
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by Jim Hines »

I liked Jay Forrest's statement and Tom's response.

Jay had said that System Dynamics was "more appropriate for
understanding patterns of behavior and less appropriate for precise
modeling"

Tom said this caused him to think about approaches that were more
precise than system dynamics.

In an absolute sense, system dynamics IS better for understanding
patterns of behavior than for making precise point predictions about the
future. What this means is that it is easier to say something accurate
about the pattern of behavior than about where the system will be at a
particular point in time. It also means that it is often more POWERFUL
to be able to change the entire pattern of behavior than to change one
point.

On the other hand, if you must come up with as good a point prediction
as possible for a dynamic system, system dynamics is usually your
approach of choice.

So: System dynamics is often better at making point predictions than
other approaches. At the same time system dynamics is even better for
understanding behavior patterns than it is at making point predictions.

One final quibble: I think that Tom was a bit loose in distinguishing
between tools and approaches. Tom, I am sure, would agree that System
dynamics is an approach (well, o.k., it's a way of life) while Vensim,
iThink, Powersim, Smalltalk, and C++ are tools. Some tools are more
flexible than others, some easier to use than others, but any of these
tools might be used in a system dynamics study. One more: System
dynamics is broader than differential equations, too. Some of what we
represent cannot be represented as differential equations. A discrete
event modeling tool could be used to investigate system dynamics.
(Admittedly it would be unusual, and in most situations not as efficient
as a continuous-time modeling tool like Vensim.) Tom, don't you agree?

Regards,
Jim
JimHines@interserv.com
Jay Forrest
Junior Member
Posts: 6
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by Jay Forrest »

I reread my recent message when I received it today and determined that I
owe everyone one more piece of information. When I say "large model" I am
particularly referring to models of 200-400 or more variables. Some may feel
existing SD software handles that number reasonably, but I believe a better
variable repository would be beneficial - with multiple filtering options,
archiving, global search of documentation, etc., etc., etc.

I hope this helps clarify my comments. I appreciate yours!

Thanks!
Jay Forrest

The Strategic Decision Simulation Group (SDSG, L.L.C.)
jayf@sdsg.com
http://www.sdsg.com
(281)493-5022 (voice) (281)558-3228 (fax)
"Payne, Robert E"
Newbie
Posts: 1
Joined: Fri Mar 29, 2002 3:39 am

Risk Work

Post by "Payne, Robert E" »

Jay Forrest said:
> Overlying all of this, my personal view is that rigorous emulation of
> history creates a perverted false credibility for models. It is

I agree and share the unease with the belief that SD models which
emulate historical performance will accurately model future performance,
for the same reasons that Jay so eloquently articulated above. The
phrase from countless prospectuses, "Historical performance does not
necessarily guarantee future results," rings through my mind all too
frequently. As in investing, the historical performance of SD models
provides an imperfect predictor of a "pattern of behavior" that some of
us bet money will be repeated or continued. Unlike investing, a
"validated" SD model representing the combined wisdom of a group
modeling team should represent an "explanation" of past performance
(emulation) and a consistent "prediction" of future performance. By a
consistent prediction, I mean the model-building team's shared
understanding of how the subject system will behave under various
scenarios of the future. The expected (and unexpected) changes in the
future environment will stimulate the "overlooked and currently
unimportant alternative loops" in the model to emerge - which may, or
may not, be approximated by the model-building team's "prediction."

I have probably over-complicated a relatively basic point. In any case,
I appreciate any comments on the subject.


Bob Payne
Project Engineer
Northrop Grumman, MASD
paynero2@mail.northgrum.com
(805) 272-8561