Model size (cannot allocate memory)

Hi,
I have a question about a problem with model size. I've been working on a huge Vensim model: Settings/Info reports about 2500K for equations. Now I'm trying to create a new model from this one. It includes only a share of the original equations, but for some of them I have created new subscripts with more elements, so the model is still quite complex.
For this new model Vensim complains that it cannot allocate memory, even though Settings/Info reports only about 550K for equations. So I'm puzzled: why does the original model, apparently larger, not run into memory problems while the new one does? Is there any specific element that can bring about the memory allocation problem?
I've read the Vensim manual page about memory management, but honestly I could not understand much of it. Thanks in advance for any help.
Davide
Vensim needs to allocate memory based on the total number of elements in each variable, which is the product of the sizes of its subscript dimensions. Thus if you have
population[country,age,gender] with 100 countries, 100 ages and 2 genders, there will be 100 x 100 x 2 = 20,000 elements. Such things get very big very fast. If you have equations for everything, look at Model Info to see the total number of variables. How big is that number?
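To make the arithmetic concrete, here is a minimal sketch of what such declarations could look like (the element names are invented, and the parenthesised form is Vensim's shorthand for a numbered range of elements):

country : (c001-c100) ~~|
age : (a001-a100) ~~|
gender : male, female ~~|
population[country,age,gender] ~~|

That one variable reserves 100 x 100 x 2 = 20,000 elements, whether or not every cell ever holds a meaningful value.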
Thanks Bob,
Unfortunately I can't read the number of variables, because the model cannot be simulated given the fatal Vensim error, so this information is not shown in the Info dialog.
However, about the number of combinations: the maximum I have now is about 26,900 (41x41x4x4), while in the original model there are some variables with 57,600 combinations (15x4x15x4x4x4). I also dropped most of the equations from the original model (admittedly the simpler equations with fewer combinations), so I would expect the total number of variables to be significantly lower in the new model.
I'll investigate further.
The model does not actually crash, but only a subset of the information is provided. What I get is:
Model has 2712 symbols and cannot be simulated -- Sizes: Symbols 42K, Equations 558K, Comments 13K
In the original (working) model I have:
Model has 9288 symbols, 7553 Lookups, Time + 1068889 levels, 25059921 Auxiliaries, 862865 Data, 114474 initial and 1348857 Constants (Total 28445007). -- Sizes: Symbols 144K, Equations 2450K, Comments 73K
The comparable figures show that the original model is much bigger (furthermore, the original model also uses an external DLL where other equations are managed). So I suspect there is some specific structure in the new model that affects memory allocation, but I can't work out what it is.
Tony forwarded your model to me, and it has something over 600 million variables, which is the reason it just won't go. I have not looked at it in detail, but you will need to decrease that by at least a factor of 10 to make using the model possible, 100 to make using it practical, and 1000 to make using it fun.
Hi Davide,
You might open the model in a text editor and start copying and pasting pieces of it into Vensim. The subscripts seem to map out OK, and when I did this with matrix_eucoun_nuts2_POP_segment I got just over 90,000 variables. Clearly there aren't that many real entries, so I am guessing that you have lots of empty cells, and these still take space. For example, if you have
sub : s1, s2, s3 ~~|
tub : t1, t2, t3 ~~|
one entry[s2,t2] ~~|
this actually gives 6 values even though only 1 is used. This might be the source of the problem.
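If a cell like that really is a one-off, one hedged alternative (not something proposed in the reply above, just an illustration of the trade-off) is to store it without any subscripts at all, so that only a single value is reserved:

one entry s2 t2 ~~|

The cost is that you lose the subscripted form, which is essentially the variable splitting described in the next reply.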
Hi Bob,
Thanks for your help.
I think you are right and the empty cells are the cause of the error. We used subranges with the aim of writing only the existing combinations, but I suppose Vensim tries to create space for all possible combinations. So the most complex variables actually have 28x250x250x5x3 (about 26 million) combinations, which is a huge number even though over 90% of them do not exist, because for each of the 28 EUCoun only a few NUTS2 out of 250 are relevant (NUTS2 are regions, EUCoun are the EU countries of course).
Now we are trying to use subscripts instead of subranges. This should dramatically decrease the number of variables. Unfortunately, we also need to split a lot of variables to do so.
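As a rough sketch of the two approaches described above (all range and variable names are invented, and the allocation behaviour is as reported in this thread rather than taken from the Vensim documentation), the subrange version looks like:

EUCoun : (c01-c28) ~~|
NUTS2 : (r001-r250) ~~|
DEnuts : r101, r102, r103 ~~|
pop matrix[EUCoun,DEnuts,DEnuts] ~~|

Here DEnuts is a subrange of NUTS2, and even though the equation is written only over DEnuts, space is reportedly reserved against the full parent ranges, on the order of 28 x 250 x 250 cells, most of them empty. With stand-alone ranges and split variables instead:

DEregion : (d01-d40) ~~|
pop matrix DE[DEregion,DEregion] ~~|

only 40 x 40 cells are reserved for that country, at the cost of writing one variable per country.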
