Run Many Simulations with Varying Parameters memory limit and output analysis

Hi,
I have two questions related to the Run Many Simulations with Varying Parameters feature.
1. What is the memory limit for the save list? I understand that this limit depends on the number of parameters, their ranges, and the selected outputs, but does anyone have more concrete information about the actual constraints?
2. Additionally, I plan to run thousands of simulations and analyze the time series results. What would be the best way to handle and process such a large number of runs? More specifically, I am interested in extracting time series outputs from these sensitivity runs and analyzing them in R (for time series analysis and clustering). While I know that Vensim models can be converted to run in Python for further processing, I am looking for a straightforward approach to extract the results directly from Vensim and analyze them in R.
Any suggestions or best practices would be greatly appreciated!
Kind regards,
Nazli
Super Administrator
Posts: 4826
Joined: Wed Mar 05, 2003 3:10 am
Re: Run Many Simulations with Varying Parameters memory limit and output analysis
No limits other than your hard drive space or machine RAM.
Make sure you only save the variables you actually need. Running thousands of times is fine, but your drive might fill up quickly if you save everything.

nzligul wrote: ↑ Mon Mar 03, 2025 1:14 pm
2. Additionally, I plan to run thousands of simulations and analyze the time series results. What would be the best way to handle and process such a large number of runs? [...]
I've run a model over 64 million times in the past with no problems. I only stopped it because the sample size was deemed large enough.
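(On the save-list question: a savelist is just a plain-text .lst file with one variable name per line, so it is easy to trim down to only the outputs you need. The variable names below are made up for illustration.)

    Population
    Birth Rate
    Average Lifetime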
Advice to posters seeking help (it really helps us to help you)
http://www.ventanasystems.co.uk/forum/v ... f=2&t=4391
Units are important!
http://www.bbc.co.uk/news/magazine-27509559
Re: Run Many Simulations with Varying Parameters memory limit and output analysis
Hi,
Thank you for your reply. How can I check or analyze the files generated by the sensitivity runs? I can view the results in the sensitivity graph in Vensim, but I am specifically interested in analyzing the output and identifying which parameter sets led to specific behavior patterns.
Kind regards,
Nazli
Super Administrator
Posts: 4826
Joined: Wed Mar 05, 2003 3:10 am
Re: Run Many Simulations with Varying Parameters memory limit and output analysis
You can export the results and do what you need to in Excel/Python/R.
In Version 10.3 (due to be released any time now), you can right-click on an individual trace and simulate just that particular run.
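As a minimal sketch of the R side, assuming the sensitivity results have been exported to a tab-delimited file (the file name sensruns.tab and the layout, one trace per row with the save times across the columns, are assumptions; adjust to whatever your export actually looks like):

    # Read a tab-delimited export of the sensitivity runs: first column is the
    # trace name (run/variable), remaining columns are the save times.
    runs <- read.delim("sensruns.tab", check.names = FALSE)
    traces <- as.matrix(runs[, -1])      # keep only the numeric time series
    rownames(traces) <- runs[[1]]

    # Cluster the raw trajectories to group runs by behavior pattern.
    set.seed(1)
    km <- kmeans(traces, centers = 4)    # 4 clusters is an arbitrary choice
    split(rownames(traces), km$cluster)  # which runs fall into which cluster

Joining the cluster labels back to the sampled parameter values then shows which parameter sets produced which behavior pattern.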
Advice to posters seeking help (it really helps us to help you)
http://www.ventanasystems.co.uk/forum/v ... f=2&t=4391
Units are important!
http://www.bbc.co.uk/news/magazine-27509559
Re: Run Many Simulations with Varying Parameters memory limit and output analysis
A very rough estimate of the variable part of storage, not including some overhead:
The variable count in storage will be:
(# vars, including subscripts) x ( # times )
with
# times = (FINAL TIME-INITIAL TIME)/SAVEPER
Each variable is a double, or 8 bytes, so a million variables at 100 times is about 800 megabytes.
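The same back-of-the-envelope arithmetic in R, using the numbers from the example above:

    # Rough storage estimate per the formula above (file overhead not included)
    n_vars  <- 1e6              # saved variables, counting subscript elements
    n_times <- (100 - 0) / 1    # (FINAL TIME - INITIAL TIME) / SAVEPER
    n_vars * n_times * 8 / 1e6  # 8-byte doubles -> about 800 megabytes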
/*
Advice to posters (it really helps us to help you)
http://www.ventanasystems.co.uk/forum/v ... f=2&t=4391
Blog: http://blog.metasd.com
Model library: http://models.metasd.com
Bookmarks: http://delicious.com/tomfid/SystemDynamics
*/