Hi guys
I have an issue with sensitivity testing.
The results I obtain from my Monte Carlo simulations are sensitive to the values that the constants being varied by the sensitivity test have in the model itself.
In case that doesn't make sense: I am testing my model output over a constant 'a', sampling it from a normal distribution with mean 'm' and standard deviation 's'. The results of the Monte Carlo simulation come out different when the model has different values for 'a' in the equation editor.
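To show what I expect the sensitivity run to be doing, here is a minimal sketch in Python (numpy only). 'run_model', 'm', 's' and the numbers are just placeholders standing in for my model, not anything taken from the actual tool:

import numpy as np

rng = np.random.default_rng(0)

def run_model(a):
    # Stand-in for the real model: the output depends only on the value
    # of 'a' passed in, not on any baseline stored in the equation editor.
    return 2.0 * a + 1.0

m, s = 10.0, 2.0          # mean and std dev used by the sensitivity test
n_runs = 1000

samples = rng.normal(m, s, size=n_runs)   # draws that should replace 'a'
outputs = np.array([run_model(a) for a in samples])

# If the sampled value fully overrides the constant, these statistics should
# be the same no matter what 'a' is set to in the equation editor.
print(outputs.mean(), outputs.std())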
Can anyone explain this to me? Surely the point of the sensitivity function is to override the usual model values?
I'd be grateful for any help.
Cheers
Mik