Run Many Simulations with Varying Parameters: memory limit and output analysis

nzligul
Junior Member
Posts: 8
Joined: Thu Oct 27, 2022 12:10 pm
Vensim version: DSS

Run Many Simulations with Varying Parameters: memory limit and output analysis

Post by nzligul »

Hi,

I have two questions related to the Run Many Simulations with Varying Parameters feature.

1. What is the memory limit for the save list? I understand that this limit depends on the number of parameters, their ranges, and the selected outputs, but does anyone have more concrete information about the actual constraints?
2. Additionally, I plan to run thousands of simulations and analyze the time series results. What would be the best way to handle and process such a large number of runs? More specifically, I am interested in extracting time series outputs from these sensitivity runs and analyzing them in R (for time series analysis and clustering). While I know that Vensim models can be converted to run in Python for further processing, I am looking for a straightforward approach to extract the results directly from Vensim and analyze them in R.

Any suggestions or best practices would be greatly appreciated!
Kind regards,
Nazli
Administrator
Super Administrator
Posts: 4826
Joined: Wed Mar 05, 2003 3:10 am

Re: Run Many Simulations with Varying Parameters: memory limit and output analysis

Post by Administrator »

nzligul wrote: Mon Mar 03, 2025 1:14 pm
1. What is the memory limit for the save list? I understand that this limit depends on the number of parameters, their ranges, and the selected outputs, but does anyone have more concrete information about the actual constraints?
No limits other than your hard drive space or machine RAM.
nzligul wrote: Mon Mar 03, 2025 1:14 pm
2. Additionally, I plan to run thousands of simulations and analyze the time series results. What would be the best way to handle and process such a large number of runs? More specifically, I am interested in extracting time series outputs from these sensitivity runs and analyzing them in R (for time series analysis and clustering). While I know that Vensim models can be converted to run in Python for further processing, I am looking for a straightforward approach to extract the results directly from Vensim and analyze them in R.
Make sure you only save the number of variables you actually need. Running thousands of times is fine, but your drive might fill up quickly if you save everything.
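
One way to limit what gets saved is a savelist: a plain-text .lst file listing the variables to store, one per line, which you point the sensitivity setup at. A minimal sketch only; the variable names below are placeholders for your own model's variables.

Code:
Population
Birth Rate
average lifetime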

I've run a model over 64 million times in the past with no problems. I only stopped it because the sample size was deemed large enough.
Advice to posters seeking help (it really helps us to help you)
http://www.ventanasystems.co.uk/forum/v ... f=2&t=4391

Units are important!
http://www.bbc.co.uk/news/magazine-27509559
nzligul
Junior Member
Posts: 8
Joined: Thu Oct 27, 2022 12:10 pm
Vensim version: DSS

Re: Run Many Simulations with Varying Parameters: memory limit and output analysis

Post by nzligul »

Hi,

Thank you for your reply. How can I check or analyze the files generated by the sensitivity runs? I can view the results in the sensitivity graph in Vensim, but I am specifically interested in analyzing the output and identifying which parameter sets led to specific behavior patterns.
Kind regards,
Nazli
Administrator
Super Administrator
Posts: 4826
Joined: Wed Mar 05, 2003 3:10 am

Re: Run Many Simulations with Varying Parameters: memory limit and output analysis

Post by Administrator »

You can export the results and do what you need to in Excel/Python/R.
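
For the R side, here is a minimal sketch of reading an export and clustering the trajectories. It assumes you have exported the sensitivity results to a tab-delimited file (the file name sensitivity_export.tab, the variable name "Population", and the layout, with one row per variable/run and one numeric column per saved time, are all assumptions to adapt to your own export).

Code:
# Read the exported sensitivity results (tab-delimited; layout assumed).
raw <- read.delim("sensitivity_export.tab", check.names = FALSE,
                  stringsAsFactors = FALSE)

# Keep the rows for one output variable of interest ("Population" is a placeholder).
series <- raw[grepl("Population", raw[[1]]), ]

# One row per run, one column per saved time point (assumed numeric).
ts_matrix <- as.matrix(series[, -1])

# Group the trajectories into a few behaviour patterns with k-means.
set.seed(1)
km <- kmeans(ts_matrix, centers = 4)

# Map each run back to its cluster label so you can see which runs
# produced which behaviour pattern.
result <- data.frame(run = series[[1]], cluster = km$cluster)
head(result)

To connect clusters back to parameter values, read in the sampled parameter values for the runs (if you exported them) and join them to result by run identifier.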

In version 10.3 (due to be released any time now), you can right-click on an individual trace and simulate just that particular run.
Advice to posters seeking help (it really helps us to help you)
http://www.ventanasystems.co.uk/forum/v ... f=2&t=4391

Units are important!
http://www.bbc.co.uk/news/magazine-27509559
tomfid
Administrator
Posts: 3986
Joined: Wed May 24, 2006 4:54 am

Re: Run Many Simulations with Varying Parameters: memory limit and output analysis

Post by tomfid »

A very rough estimate of the variable part of storage, not including some overhead:

The number of stored values will be:
(# vars, including subscripts) x (# times)
with
# times = (FINAL TIME - INITIAL TIME) / SAVEPER
Each value is a double, or 8 bytes, so a million variables saved at 100 times is about 800 megabytes.
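
As a quick sanity check on that arithmetic, an illustrative R snippet (numbers are just the example above, not part of Vensim):

Code:
n_vars  <- 1e6               # saved variables, including subscript elements
n_times <- 100               # (FINAL TIME - INITIAL TIME) / SAVEPER
n_vars * n_times * 8 / 1e6   # 8 bytes per double value -> ~800 megabytes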