optimizing with stochastic data input

Use this forum to post Vensim related questions.
LAUJJL
Senior Member
Posts: 1426
Joined: Fri May 23, 2003 10:09 am
Vensim version: DSS

optimizing with stochastic data input

Post by LAUJJL »

Hi

I am currently working on a model where the input data are highly stochastic.
I generate these data with random normal and random binomial functions.
If I try to find an optimal solution, I get odd results, driven by extreme sets of stochastically generated values.
To avoid this, I run multiple parallel simulations in the same model by adding a supplementary subscript to most of the model's data, and I optimize the average of the target value across its values in the parallel simulations. I can also add a supplementary payoff term, the standard deviation of this value around the average, if I want a less risky policy.
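
To make the payoff concrete, here is a rough Python sketch of the idea (outside Vensim, purely for illustration; the function names and the weight on the standard deviation are only examples, not how the model is actually written):

    import numpy as np

    def payoff(target_per_sim, risk_weight=0.0):
        """Payoff over parallel replications: average of the target value,
        optionally penalized by its spread across replications."""
        values = np.asarray(target_per_sim, dtype=float)  # one value per parallel simulation
        mean = values.mean()
        spread = values.std()
        # risk_weight = 0 optimizes the plain average;
        # risk_weight > 0 prefers less risky (less variable) policies.
        return mean - risk_weight * spread

    # e.g. the quantity to optimize from 100 parallel replications
    # values = run_model(policy)                # hypothetical model run, one value per replication
    # score = payoff(values, risk_weight=0.5)   # risk_weight is an illustrative choice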

I have tested how many simulations I need, since changing the seed of the random functions can produce substantial differences in results.

I currently need about 100 parallel simulations to get results that are sufficiently stable and independent of the seed values.
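
One way to check this, sketched in Python rather than Vensim (the normal data below is just a dummy stand-in for the model output): rerun the whole set of replications under several different seeds and watch how much the averaged payoff still moves as the number of replications grows.

    import numpy as np

    def mean_payoff(n_sims, seed):
        """Stand-in for one payoff evaluation: the target value averaged
        over n_sims stochastic replications (dummy normal data here)."""
        rng = np.random.default_rng(seed)
        values = rng.normal(loc=100.0, scale=30.0, size=n_sims)
        return values.mean()

    for n_sims in (10, 50, 100, 500):
        estimates = [mean_payoff(n_sims, seed) for seed in range(20)]
        # spread of the averaged payoff across seeds shrinks roughly as 1/sqrt(n_sims)
        print(n_sims, round(float(np.std(estimates)), 2))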

The problem is that as the model grows in complexity, it takes more and more time to optimize.

I could turn the model into a deterministic one, but it would then be very far from reality, where most of the problems come from the stochastic character of the demand and the need to optimize counterbalancing policies.

Another solution is to increase the time step to reduce the number of simulation steps, or to simplify the model.

Another way is to look for an optimum using sensitivity analysis and intuition.
This is only practical when the number of parameters is small, but it is an option too.

Any other ideas?

Regards.

J.J. Laublé
bob@vensim.com
Senior Member
Posts: 1107
Joined: Wed Mar 12, 2003 2:46 pm

Post by bob@vensim.com »

In general the approach you are using seems to make the most sense. I am actually surprised that it requires only 100 simulations to get stability.

One thing you might want to consider would be using exogenous data streams instead of random numbers. If you did this you could tailor the streams to give more variation with fewer replications (kind of like Latin Hypercube sampling).

Unfortunately, I don't know of any solid methodology for defining the streams to pick, but it may be that by studying the output of your 100 parallel replications you can just pick 10 or so that seem to give good coverage.
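
One ad hoc way to do that selection, sketched in Python purely as an illustration (the summary statistic and the dummy demand data are only examples): rank the 100 replications by some summary of the stream, then keep the ones closest to evenly spaced positions in that ranking, so the kept streams cover the range of behaviour from low to high.

    import numpy as np

    def pick_representative(streams, n_keep=10):
        """streams: array of shape (n_replications, n_timesteps).
        Rank replications by a summary statistic (total demand here) and
        keep the ones closest to evenly spaced quantiles of that ranking."""
        summary = streams.sum(axis=1)
        order = np.argsort(summary)
        positions = np.linspace(0, len(order) - 1, n_keep).round().astype(int)
        return order[positions]   # indices of the streams to reuse as exogenous data

    # example: 100 random demand streams of 52 steps, keep 10 spanning low to high demand
    rng = np.random.default_rng(0)
    streams = rng.normal(50.0, 15.0, size=(100, 52))
    kept = pick_representative(streams, n_keep=10)
    print(kept)
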
LAUJJL
Senior Member
Posts: 1426
Joined: Fri May 23, 2003 10:09 am
Vensim version: DSS

optimization with stochastic data

Post by LAUJJL »

Hi Bob

If I use 100 parallel simulations generating stochastic data at every time step around a fixed average, I generally get results that are stable enough for my purpose, although they still vary.

Using exogenous data is a good idea, but it is not straightforward either: what rule should I follow to construct or select the streams?
The results would then depend not on the seeds but on the rules I have chosen to follow.

My model tries to handle simultaneously a deterministic part (generally longer time steps and larger, more continuous values) and a stochastic part (short time steps and discrete integer values, which rule out real values of less than 1 and thus require stochastic value generation).
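
As a rough illustration of what I mean by stochastic integer generation (sketched in Python rather than Vensim, and only one possible way of doing it): a fractional expected demand such as 0.3 units per time step cannot be used directly as an integer, so it is turned into a random integer draw whose long-run average equals the expected value.

    import numpy as np

    rng = np.random.default_rng()

    def integer_demand(expected):
        """Turn a (possibly fractional) expected demand into an integer draw
        whose long-run average equals the expected value."""
        whole = int(expected)            # guaranteed whole units
        frac = expected - whole          # e.g. 0.3 -> one extra unit about 30% of the time
        return whole + rng.binomial(1, frac)

    # an expected demand of 0.3 per step gives mostly 0, sometimes 1, averaging about 0.3
    draws = [integer_demand(0.3) for _ in range(10000)]
    print(sum(draws) / len(draws))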

I will try to separate the two problems and build a deterministic model, which is much easier to handle, coupled later on with a smaller stochastic model addressing only the stochastic part of the problem.

Regards and thanks for the answer.

J.J. Laublé.