So far - not having a computer science background - I had the more or less naive impression that computers by now would have surpassed any limitations on building models (that I could think of, at least) regarding performance and memory. Or so I had thought.
Having recently built a bigger model that makes heavy use of Vensim's subscripted variables and vector functions to capture the complexity of a transport modeling effort - where complexity quickly enters due to the quadratic growth in the spatial dimension - I noticed two things. On the one hand, I marvelled once again at the elegant way Vensim's vector functions handle complexity while keeping the model 'simple' at the sketch level; with one equation one can handle what in other programs has to be done in lots of intertwined loops. On the other hand, I unfortunately noticed how quickly Vensim's performance and stability deteriorate when handling a bigger model. Here are some examples of what I have experienced:
- Using the stats tool on a subscripted variable made Vensim DSS crash.
- Trying to run a model that reads in a (25 x 25 x 25 x 25 x 2) exogenous variable not only prints the message 'insufficient memory' but also causes the software to crash.
- Converting a 4 MB external data file to vdf format takes (subjectively felt) "hours", while Vensim will not even show a proper message that it is doing anything if you give it an Excel 2010 file.
(All of these examples were experienced with the newly released Vensim 5.11 running on an i7 2 GHz machine with 8 GB of RAM under Windows 7 64-bit.)
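To make the 'intertwined loops' point concrete, here is a minimal sketch in Python (not Vensim code; the names trips and total_departures and the values are made up for illustration) of what a single Vensim vector equation along the lines of total departures[origin] = SUM(trips[origin, destination!]) spares you from writing:

```python
# A minimal sketch (not Vensim code): the nested loops that one Vensim
# vector equation would replace in a general-purpose language.
# The zone count and the trip matrix below are made-up illustration values.

N_ZONES = 25  # e.g. 25 origins x 25 destinations, as in my model

# placeholder origin-destination trip matrix
trips = [[1.0 for _ in range(N_ZONES)] for _ in range(N_ZONES)]

# what Vensim expresses in one equation takes explicit loops here
total_departures = [0.0] * N_ZONES
for origin in range(N_ZONES):
    for destination in range(N_ZONES):
        total_departures[origin] += trips[origin][destination]
```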
My strong impression is that while the rest of software development embraces 64-bit operating systems, the humble friends of Vensim seemingly need to live with its 32-bit restrictions (limiting the addressable RAM to 2-3 GB) a bit longer. Since the appetite to build bigger and more detailed models does arise for some of us who are not content with building smaller-scale insight models, the question of ways around these restrictions comes up. I am therefore curious about some information and 'tricks of the trade' for building and handling bigger models:
- How can one determine the memory that is used and needed by Vensim to run a model, and where is the limit? (One certainly does not want to find out only after the model has been conceived and built; see my rough estimate after this list.)
- What can be done to work around this limitation when exogenous data is used? E.g. how would a database (or something similar) help in getting around Vensim's memory restrictions?
- Is there a way to make Vensim 'forget' things during the simulation, so that only the values at t and (t - dt) are kept, via some kind of save list?
- More generally: What else can be done to optimize and efficiently handle big models - and how is this achieved internally?
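To give an idea of the order of magnitude I am worried about in the first question, here is my own back-of-envelope estimate (a rough sketch only; the 4 bytes per single-precision value and the number of save points are my assumptions, not documented Vensim internals):

```python
# Back-of-envelope memory estimate for one subscripted variable
# (a rough sketch only; bytes per value and save points are assumptions,
# not documented Vensim internals).

def estimate_bytes(subscript_sizes, save_points, bytes_per_value=4):
    """Rough memory needed to store one variable over a whole run."""
    elements = 1
    for size in subscript_sizes:
        elements *= size
    return elements * save_points * bytes_per_value

# my exogenous (25 x 25 x 25 x 25 x 2) variable, saved at, say, 100 time points
print(estimate_bytes([25, 25, 25, 25, 2], save_points=100) / 2**20, "MB")
# -> roughly 300 MB in single precision, twice that in double precision
```

If that arithmetic is roughly right, a handful of such variables alone would get close to what a 32-bit process can address, which would at least be consistent with the 'insufficient memory' crashes I am seeing.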
PS: I have also put the 'pledge' for a 64-bit version of Vensim on the improvement wishlist and would like to see my 'favorite SD tool' run more stably when handling big models.