Issues with sensitivity testing

mikdale
Member
Posts: 28
Joined: Thu Jun 11, 2009 2:51 am

Issues with sensitivity testing

Post by mikdale »

Hi guys

I have an issue with sensitivity testing.

The results I obtain from my Monte Carlo simulations are sensitive to the values that the constants being varied by the sensitivity test have in the model itself.

In case that doesn't make sense: I am testing my model output over a set of constants 'a', sampling each from a random normal distribution with mean 'm' and standard deviation 's'. The results of the Monte Carlo simulation differ when the model has different values for 'a' in the equation editor.

Can anyone explain this to me? Surely the point of the sensitivity function is to override the usual model values?

I'd be grateful for any help.

Cheers

Mik

Post by mikdale »

Hi all

I think I've sussed it. I had the sensitivity type set to 'univariate', which means the constants are varied one at a time: while one constant is being sampled, the others keep their model values, so the results still depend on what is typed into the equation editor. Switching to 'multivariate' samples all the listed constants together, which overrides the model values the way I expected.
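For anyone who hits the same thing, here is a minimal Python sketch of the difference (not Vensim itself; run_model, the constants and the distributions are all invented for illustration):

```python
import random

# Toy stand-in for the actual model: output depends on two constants.
def run_model(a, b):
    return 2 * a + b ** 2

MODEL_A, MODEL_B = 5.0, 3.0   # the values typed into the equation editor
N = 1000

# Multivariate: every run draws ALL listed constants from their
# distributions, so the equation-editor values never enter the results.
multi = [run_model(random.gauss(5, 1), random.gauss(3, 0.5))
         for _ in range(N)]

# Univariate: each run varies ONE constant and holds the rest at their
# model values, so changing MODEL_A or MODEL_B shifts the results.
uni = ([run_model(random.gauss(5, 1), MODEL_B) for _ in range(N // 2)] +
       [run_model(MODEL_A, random.gauss(3, 0.5)) for _ in range(N // 2)])

print(sum(multi) / N, sum(uni) / N)
```

Change MODEL_A or MODEL_B and rerun: the multivariate mean stays put, while the univariate mean moves, which is exactly the sensitivity I was seeing.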

Cheers

Mik ;)

[Edited on 1-25-2010 by mikdale]