Calibration

claudia
Junior Member
Posts: 13
Joined: Tue Jul 06, 2010 6:55 pm

Calibration

Post by claudia »

Hi, good morning.
I have some basic questions about calibration and would like to know how to proceed:
I'm simulating land-use change in a territory. I have the real data observed between 2000 and 2009 (land-change maps), and I'm doing a simulation over a period of 30 years.
I only have data for a few parameters for the historical period (before 2009), and I treat many parameters as fixed in both the historical and projected periods.
I'm currently at the calibration and parameter-testing step. I built the whole land-use model (from 2009 to 2038) in the PLE version, but now I'm stuck on the sensitivity analysis of the input parameters and on the calibration step. I need to apply a Monte Carlo sensitivity analysis to test the parameters.

What do you advise?
Should I first adapt the model to the historical period (2005-2009), then test the sensitivity of the parameters and compare the projected and real periods? How would I do that? What functions can I use for this calibration and parameter adjustment in Vensim PLE or PLE Plus?
Thanks.
Claudia
tomfid
Administrator
Posts: 3994
Joined: Wed May 24, 2006 4:54 am

Re: Calibration

Post by tomfid »

You need two things to calibrate:

1. a payoff or objective function

For calibration this is typically something like the weighted sum of squared residuals between model and data, or (for robust estimation) the absolute values of the residuals. You can write equations in your model to calculate this.

2. a list of parameters to vary to improve the fit, and an algorithm for doing so.

If you have only a few parameters, you can use sensitivity analysis for this, though it may take several iterations. Your sensitivity file must perform a multivariate sampling over plausible ranges for each of your parameters. Then you can export the results to a tab-delimited (.tab) file and look at which values return the best payoff (see the sketch at the end of this post).

If you have more than a few parameters, you are unlikely to find a good solution via sensitivity analysis, because the search space is simply too large to explore randomly - you need the optimizer.
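
As a rough illustration of the "export and inspect" step (outside Vensim, in Python), something like the following could rank the exported sensitivity runs by payoff. The file name and column names here are hypothetical; they depend on what you actually export, and the sort direction depends on how your payoff is defined.

import csv

# Hypothetical export of a multivariate sensitivity run:
# one row per simulation, tab-delimited, with a column for each
# varied parameter plus the payoff variable at the final time.
RESULTS_FILE = "sensitivity_results.tab"
PAYOFF_COLUMN = "payoff"

rows = []
with open(RESULTS_FILE, newline="") as f:
    reader = csv.DictReader(f, delimiter="\t")
    for row in reader:
        # Keep only rows whose fields all parse as numbers.
        try:
            rows.append({k: float(v) for k, v in row.items()})
        except ValueError:
            continue

# A sum-of-squared-residuals payoff is best when smallest;
# use reverse=True instead if your payoff is better when larger.
rows.sort(key=lambda r: r[PAYOFF_COLUMN])
for r in rows[:10]:
    print(r)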
claudia
Junior Member
Posts: 13
Joined: Tue Jul 06, 2010 6:55 pm

Re: Calibration

Post by claudia »

Tom,
Thanks for your answer.
OK on the calibration, but what do you consider to be "a few parameters"? I have about 6 parameters.
Just to know: is the optimizer only available in the Professional version?
Thank you.
claudia
Junior Member
Posts: 13
Joined: Tue Jul 06, 2010 6:55 pm

Re: Calibration

Post by claudia »

Tom, another question: where can I find the equations for the "weighted sum of squared residuals between model and data, or (for robust estimation) absolute values of residuals" to enter in my model?
Claudia
tomfid
Administrator
Posts: 3994
Joined: Wed May 24, 2006 4:54 am

Re: Calibration

Post by tomfid »

The definition of 'few' depends a bit on how fast your model is - you might get away with 6. You could certainly try it to see.

You need Vensim Pro or DSS for optimization. A public research or academic license is cheaper, if your work qualifies.

Your payoff might look like the following:

residual = modelvar - datavar
weighted residual = weight*residual
squared error = weighted residual^2
payoff = INTEG(squared error,0)

The key question is how to choose the weight. Normally an appropriate choice is 1/std_error_of_measurement for your data. If you don't know this, you can estimate it iteratively, though an educated guess will usually suffice. If all your data has similar scale and error properties, you can ignore this and use a weight of 1.
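
As a rough sketch of the iterative estimate (in Python, outside Vensim, with made-up numbers standing in for your model output and data): after a calibration run, take the standard deviation of the residuals as a stand-in for the unknown standard error of measurement, and use its reciprocal as the weight for the next pass.

import statistics

# Hypothetical model output and observed data on matching dates.
model_values = [10.2, 11.0, 12.3, 13.1, 14.8]
data_values  = [10.0, 11.4, 12.0, 13.5, 14.5]

# One pass of the iterative weight estimate.
residuals = [m - d for m, d in zip(model_values, data_values)]
std_err_estimate = statistics.stdev(residuals)
weight = 1.0 / std_err_estimate

# The weighted sum of squared residuals that the payoff accumulates.
payoff = sum((weight * r) ** 2 for r in residuals)
print("weight:", weight, "payoff:", payoff)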

You might take a look at Chapter 18 of the User Guide for inspiration.

Tom
claudia
Junior Member
Posts: 13
Joined: Tue Jul 06, 2010 6:55 pm

Re: Calibration

Post by claudia »

OK, thanks. I built these equations in my model to compare the simulated and original data and obtain a payoff.
Thanks
Claudia
naserprs87
Junior Member
Posts: 13
Joined: Wed Jun 24, 2015 3:15 pm
Vensim version: DSS

Re: Calibration

Post by naserprs87 »

I have 2 data series, but the number of parameters I am trying to calibrate is more than 7. Is it practical to calibrate with this many parameters?
Is Vensim able to obtain optimal parameter values by calibration?
Administrator
Super Administrator
Posts: 4838
Joined: Wed Mar 05, 2003 3:10 am

Re: Calibration

Post by Administrator »

naserprs87 wrote: I have 2 data series, but the number of parameters I am trying to calibrate is more than 7. Is it practical to calibrate with this many parameters?
Is Vensim able to obtain optimal parameter values by calibration?
You can calibrate using seven parameters. Are you having trouble doing this?
Advice to posters seeking help (it really helps us to help you)
http://www.ventanasystems.co.uk/forum/v ... f=2&t=4391

Units are important!
http://www.bbc.co.uk/news/magazine-27509559
naserprs87
Junior Member
Posts: 13
Joined: Wed Jun 24, 2015 3:15 pm
Vensim version: DSS

Re: Calibration

Post by naserprs87 »

Administrator wrote:
naserprs87 wrote: I have 2 data series, but the number of parameters I am trying to calibrate is more than 7. Is it practical to calibrate with this many parameters?
Is Vensim able to obtain optimal parameter values by calibration?
You can calibrate using seven parameters. Are you having trouble doing this?
No, I don't have any trouble; I am just wondering how many parameters I can calibrate at most. I am thinking that if I increase the number of parameters, calibration may not be able to find the best value for each of them. Also, sometimes it just chooses the lower or upper bound of the interval that I have specified.
tomfid
Administrator
Posts: 3994
Joined: Wed May 24, 2006 4:54 am

Re: Calibration

Post by tomfid »

The upper limit depends on your model - nonlinearity of parameter interactions determines the number of iterations needed, and model size determines the speed of a given number of iterations.

Getting solutions that are at the limits is usually an issue with the model's payoff surface, not the algorithm.
tomfid
Administrator
Posts: 3994
Joined: Wed May 24, 2006 4:54 am

Re: Calibration

Post by tomfid »

Using random multistart is generally a good way to test/improve the solution quality.
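
As a conceptual sketch of random multistart (in Python with scipy, not Vensim's built-in optimizer, and with a toy objective standing in for the calibration payoff): draw several random starting points within the parameter bounds, run a local optimization from each, and keep the best result.

import numpy as np
from scipy.optimize import minimize

# Toy objective standing in for a model-vs-data payoff
# (minimized here, as with a sum of squared residuals).
def objective(params):
    x, y = params
    return (x - 1.2) ** 2 + (y + 0.7) ** 2 + 0.1 * np.sin(5 * x)

bounds = [(-5.0, 5.0), (-5.0, 5.0)]   # plausible ranges per parameter
rng = np.random.default_rng(1234)

best = None
for _ in range(20):                   # number of random restarts
    start = [rng.uniform(lo, hi) for lo, hi in bounds]
    result = minimize(objective, start, bounds=bounds, method="L-BFGS-B")
    if best is None or result.fun < best.fun:
        best = result

print("best parameters:", best.x, "payoff:", best.fun)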