Dealing with bigger models

gwr
Senior Member
Posts: 209
Joined: Sun Oct 04, 2009 8:40 pm
Vensim version: DSS

Dealing with bigger models

Post by gwr »

Dear all:

So far - not having a computer science background - I had the more or less naive impression that, by now, computers would have surpassed any limitations previously imposed on building models (any that I could think of, at least) regarding performance and memory. Or so I had thought.

Having recently built a bigger model that makes heavy use of Vensim's subscripted variables and vector functions - to capture the complexity of a transport modeling effort, where complexity quickly enters due to the quadratic growth of the spatial dimension - I noticed two things. On the one hand, I marveled once again at the elegant way Vensim's vector functions handle complexity while keeping the model 'simple' at the sketch level; with one equation you can handle what in other programs has to be done with lots of intertwined loops. On the other hand, unfortunately, I noticed how quickly Vensim's performance and stability deteriorate when handling a bigger model.

Here are some examples of what I have experienced: using the stats tool on a subscripted variable made Vensim DSS crash; trying to run a model that reads in a (25 x 25 x 25 x 25 x 2) exogenous variable not only prints the message 'insufficient memory' but also crashes the software; converting a 4 MB external data file to .vdf format takes (subjectively felt) "hours", while Vensim will not even show a proper message that it is doing anything if you give it an Excel 2010 file. (All of these examples were experienced with the newly released Vensim 5.11 running on an i7 2 GHz machine with 8 GB of RAM under Windows 7 64 bit.)

My strong impression is that while the rest of the software world embraces 64 bit operating systems, we humble friends of Vensim seemingly need to live with its 32 bit restrictions (limiting addressable RAM to 2-3 GB) a bit longer? Since the hunger to build bigger and more detailed models does arise for some of us who are not content with building only smaller-scale insight models, the question of ways around these restrictions comes up. I am therefore curious about some information and 'tricks of the trade' for building and handling bigger models:
  • How can one determine the memory that is used and needed by Vensim to run a model, and where is the limit? (One certainly does not want to find out only after the model has been conceived and built.)
  • What can be done to work around this limitation when exogenous data is used? E.g., how would a database (or the like) help in reducing Vensim's memory requirements?
  • Is there a way to make Vensim 'forget' things during the simulation so that only t and (t-dt) values are stored in some kind of save list?
  • More generally: What else can be done to optimize and efficiently handle big models - and how is this achieved internally?
Kind regards,

Guido

PS: I have also put the 'plea' for a 64 bit version of Vensim on the improvement wish list and would like to see my 'favorite SD tool' run more stably when handling big models.
Administrator
Super Administrator
Posts: 4573
Joined: Wed Mar 05, 2003 3:10 am

Re: Dealing with bigger models

Post by Administrator »

How can one determine the memory that is used and needed by Vensim running a model and where is the limit here? (One certainly does not want to find out only when the model is conceived of and built.)
There isn't really a formal way to do this. Experience helps, though: once you have hit a problem like this, it forces you to design the model in a way that fits everything in.
What efforts can be taken to work your way around this limitation when exogenous data is used? e.g. how will a database (or the like?) help in reducing Vensim's memory restrictions
Use the database to combine subscripts where possible. Then use GET DIRECT SUBSCRIPTS to actually configure the model based on what is in the database.
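For illustration, a minimal sketch of that idea (the file name, sheet name and cell references are made up; the GET DIRECT variant works analogously for text exports from a database):

Zone : GET XLS SUBSCRIPT( 'zones.xlsx' , 'zones' , 'A2' , 'A26' , '' )
{ the range Zone then contains whatever the export lists in cells A2:A26, so adding or removing zones in the database reconfigures the model without editing equations }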
Is there a way to make Vensim 'forget' things during the simulation so that only t and (t-dt) values are stored in some kind of save list?
You can try using a savelist, or setting SAVEPER to be large. But you will lose numbers from the output files, making debugging difficult.
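As a concrete (hypothetical) illustration of the two options - variable and file names are invented. Coarser storage:

SAVEPER = 8 * TIME STEP { keep only every 8th computed value in the .vdf }

And a savelist (e.g. myrun.lst) is just a plain text file with one variable name per line, selected for the run, so only these variables are written out:

total trips
network load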
More generally: What else can be done to optimize and efficiently handle big models - and how is this achieved internally?
Personally I just try to make sure I combine ranges where possible, or split things into 2 or 3 models if possible. In the past I've also used external functions to pull in just the data I needed from a database, and I've also developed distributed models (but this is not always practical).

My suggestion would be to combine subscripts where possible, and also investigate the use of sub-ranges (e.g., routes 1-10 can only be used by bus, routes 11-20 only by car, etc.).
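A sketch of the sub-range idea with made-up names - the full range is defined once and the mode-specific subranges reuse a subset of its elements, so variables that only apply to one mode can be dimensioned over the smaller range:

route : (r1-r20)
bus route : (r1-r10)
car route : (r11-r20)
bus capacity[bus route] = 60 { allocated over 10 elements instead of 20 }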
tomfid
Administrator
Posts: 3806
Joined: Wed May 24, 2006 4:54 am

Re: Dealing with bigger models

Post by tomfid »

Memory has certainly surpassed a lot of the practical limitations for dynamic models, but computation has not made nearly as much progress. It's still quite easy to overwhelm processing power with combinatorial explosion, and probably always will be. Tools for understanding models with a lot of detail complexity have lagged even more. I've actually never run into memory limits in 15 years of use; computation was always the issue.

Without knowing the exact details of the data, it's hard to say why 4MB is a limit, but here's my guess. The data architecture is optimized for time series, which potentially have a lot of points over time for each data element. So, if you are feeding in data that has high subscript dimension but small time dimension (e.g. 25x25x25x25x2 for 1 point in time), there's potentially a lot of allocation overhead that causes the failure. If, on the other hand, you were importing data with 25x25 subscript dimension and 25x25x2 time dimension, it would probably be fine.

Similarly, Vensim is really optimized for running a model with some detail complexity over many times and (often) many model iterations. So, 100 variables x 1000 array elements each x 1000 simulations is easy. 1000 variables x 1000 array elements for 10 iterations also works. 100 variables x 1,000,000 array elements for 1 iteration is moving out of Vensim's sweet spot.

Re your specific questions,
- There isn't an obvious way to determine memory requirements other than experimentation, though we might be able to run some tests to say more. I'll look into it. It should be fairly straightforward to prototype a model by building some structure with variables of the requisite complexity, but no feedback.
- The biggest possible improvement is to make sure that any data without a time dimension is treated as constants, not data, and imported via a .cin file or GET XLS CONSTANTS (a sketch follows after this list). For real time series data, it helps to offload any processing (unit conversions and other simple non-feedback operations) to a separate model, which produces a .vdf to be read by your main model.
- Save lists help, but mainly with disk write time and .vdf file size I think. There's no way to change internal storage during simulation.
- There are some optimization hints at http://blog.metasd.com/2011/01/optimizi ... im-models/. For models with a lot of array detail, probably the biggest gain is from using VECTOR SELECT for sparse operations (second sketch after this list).
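A minimal sketch of the constants point above; the file, sheet and variable names are assumptions:

unit cost[origin,destination] = GET XLS CONSTANTS( 'inputs.xlsx' , 'costs' , 'B2' )
{ read once at the start of the run and held constant, so no time series storage is allocated for it }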
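And a sketch of VECTOR SELECT for a sparse operation, summing trips only over origin-destination pairs that actually have a link. The names are invented, and the last two arguments are action codes; 0/0 (sum the selected elements, no error reporting) follows the convention used in Vensim's vector-function examples, so check them against the documentation for your version:

arrivals[destination] = VECTOR SELECT( link exists[origin!,destination] , trips[origin!,destination] , 0 , 0 , 0 )
{ link exists is 1 where a connection exists and 0 elsewhere; the third argument is the value returned when nothing is selected }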

The trajectory for the next major Vensim release will improve scalability, though practical limits will remain, particularly with respect to understanding complex models, I suspect.
gwr
Senior Member
Posts: 209
Joined: Sun Oct 04, 2009 8:40 pm
Vensim version: DSS

Re: Dealing with bigger models

Post by gwr »

Thank you both for the valuable advice,

I just went ahead and experimented with a model containing only a single subscripted variable, big var, with two subscript ranges (750 x 750). If I fill this matrix with '1' - big var being a constant - the .vdf for the run has a size of about 5 MB. If I instead make it a selection matrix for a sparse array, by setting only the diagonal elements to 1 and everything else to 0, the use of INITIAL as described in the link Tom provided proves valuable:

e.g. big var = INITIAL(IF THEN ELSE(Sub1=Sub2,1,0))

also takes up about 5 MB, while not using INITIAL spoils this: big var is then recalculated every time step and stored according to SAVEPER.
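For reference, a minimal sketch of the test model just described (range sizes and element names are assumptions); the second, non-INITIAL form is the one that blows up the run file:

Sub1 : (o1-o750)
Sub2 : (d1-d750)
big var[Sub1,Sub2] = INITIAL( IF THEN ELSE( Sub1 = Sub2 , 1 , 0 ) )
{ computed once at initialization: the diagonal selection matrix }
big var recomputed[Sub1,Sub2] = IF THEN ELSE( Sub1 = Sub2 , 1 , 0 )
{ same values, but recomputed every TIME STEP and stored every SAVEPER }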

What strikes me, though, is that if I want to use this matrix as a selection array for a sparse matrix, I essentially only need one bit per entry (0 or 1). But I can't see that Vensim knows such a type; big var seemingly takes up as much space containing large numbers as it does containing only 0s and 1s?

Wouldn't it make sense to have something like a selection array as a data type?

Kind regards,

Guido
tomfid
Administrator
Posts: 3806
Joined: Wed May 24, 2006 4:54 am

Re: Dealing with bigger models

Post by tomfid »

Correct - everything in Vensim is stored as a float (or a double in DP DSS). In the future there will probably be at least a boolean flag type. Integers are much more problematic, because so many things can go wrong in truncation if a model mixes continuous- and discrete-valued quantities.
gwr
Senior Member
Posts: 209
Joined: Sun Oct 04, 2009 8:40 pm
Vensim version: DSS

Re: Dealing with bigger models

Post by gwr »

Tom,

thank you for making this clear. The fog has cleared a great deal by now and just a couple of questions remain:
  • In the Simulation Control there is a tab called 'Advanced' with a button for 'Minimal Memory'. How does that actually help?
  • For reading in external data with more than 2 subscript ranges I have usually used the .dat or spreadsheet format and converted that to .vdf (I had the impression that the .dat format converts faster - even though I have been told that it does not matter?). Can I read in a constant by having just one value for time - or, to put it differently: will a .dat file with just one value for time use up the same amount of memory during the simulation as a .cin file for a constant will?

    I do find the declaration of a variable as an exogenous data variable more logical than putting in a pseudo value for a constant and then having it read in from a .cin file, as it makes clear that here is a value that needs external input.
Kind regards

Guido
LAUJJL
Senior Member
Posts: 1421
Joined: Fri May 23, 2003 10:09 am
Vensim version: DSS

Re: Dealing with bigger models

Post by LAUJJL »

Hi Guido

In my experience, it is more productive to try to think better than to use more computer power.
Regards.
JJ
gwr
Senior Member
Posts: 209
Joined: Sun Oct 04, 2009 8:40 pm
Vensim version: DSS

Re: Dealing with bigger models

Post by gwr »

Hi JJ,

that may be your experience, and it does sound plausible, but it is nevertheless not a logical necessity (for the rhetorically inclined readers: your statement is a bit like Pascal's wager 8) ): you may do good thinking and use more computer power at the same time. Another way to put it: why shouldn't I use 8 or more GB of RAM if I have it (and maybe need it)?

Behind your statement stands the reasonable assumption that good models are often small and compact. I agree to some degree, and would especially agree if SD were applied to public policy consulting (where it belongs...). Nevertheless I will once again put forward the link to the Keenan paper regarding SD in management consulting:

http://www.systemdynamics.org/conferenc ... 4KEENA.pdf

There are some tasks out there in business consulting IMHO that -- at least from the customer perspective -- call for some detail. Since I am doing transport modeling with SD I am in competition with a lot of people who go around and sell agent based models. Now, here is your point in reverse:
  • They have not had to do the same amount of thinking I had to go through, because the aggregate view is (often) harder to get right. Everybody knows how an individual behaves...
  • They never experience systematic aggregation errors, they capture all the detail out there, and they do not have to build boxes (stocks) and assign memory for them if there is no content.
So maybe it is time to bury SD -- which, mathematically and computationally, as far as I am told, is only a subset of the agent-based models out there (only partial differential modelling seems to be computationally complete)?

I believe not. I marvel at the power of the aggregate view because it tells a story about why things come to be the way they are. But I am -- as Keenan says -- aware that people in business ask very detailed questions. I am also aware that, as with writing a good short story, one most often has to start as a novelist - it's easier.

Kind regards,
Guido
LAUJJL
Senior Member
Posts: 1421
Joined: Fri May 23, 2003 10:09 am
Vensim version: DSS

Re: Dealing with bigger models

Post by LAUJJL »

Hi Guido

You presumably mean Pascal's wager about believing in God.
What you are describing is at the core of understanding 'complexity' and how to use SD.
You are right that clients ask for a lot of detail.

I once worked with a consultant on a problem and asked for a simple model when I saw the first prototypes. He answered that clients always ask for detailed models and that in the end, if there were not enough details, I would be unsatisfied. The result was an overly complicated model that tried to be close to reality, ended up as complicated as the reality itself, and was totally useless.
The consultant was so accustomed to building complex, detailed models that he did not know how to build simple ones. The reason is that it is easier to build reality-driven models and hope that something comes out of them than to fully understand the client's problem and try to amend it. I do not blame him. He works the way most SD'ers work.

I build models on my own and work totally differently from what I commonly see and from how I was taught.
Very succinctly, I concentrate my efforts on the specifications of the model, which I define in a behavioral way that is totally independent of any paradigm, and I express these specifications with reality check equations that, once built, generate the model equations. It is like the way taught in the User Guide. This way a model is never wrong, and logically one does not have to run any tests.
The only things that may be wrong are the specifications.

To control the quality of the specifications I build the model in very small steps, writing a minimum of RCs that generate a very small model (never more than 10 equations, generally an average of 5).
I study the generated model intensively until I understand its behavior as well as I understand 1 + 1 = 2 and I am sure that the specifications are correct, which is much trickier than one may think.
I then repeat the operation, adding on average another 5 equations, and so on, until I consider that the effort necessary to continue is no longer worth the benefit.

This way of working focuses on the what (the specifications) before the how (diagramming), which is what is usually taught. I do not use diagramming any more. Of course I can still study the diagram of the model once it is built, to study loops and so on, but not as a building tool.

About the specifications (what one wants): I do not believe that it is possible to model social reality the way physicists model physical reality. But I think it is possible to understand some characteristics of reality by building metaphorical models whose behavior looks a bit like reality, even if they do not represent reality in its whole complexity. It is then possible to build models small enough to be understood, built with a method like the one described. Choosing the right metaphor is not easy and needs a lot of creativity.

Working this way is unfortunately not easy, because it is hard to change the way one works, and it takes time to get used to it.
But it is a rewarding experience.

I do not know if such a method can be applied by a consultant, because it needs close and permanent participation of the client, both when choosing the specifications and when studying each version of the generated model. It takes longer, uses much more of the consultant's and client's time, and generates a much smaller final model, which may make the client think that not a lot of work has been done. It also requires that the client(s) one works with are the ones who take the decisions and implement the policies, and preferably pay too if some policies are generated (not an obligation).
The most difficult part may be that the client must not believe that a model that is big and close to reality is necessarily better (I think the contrary is true).
All these constraints may make my way of working impossible to apply in ordinary situations
(consultancy, internal consultancy, etc.).

I am presently in Washington, where I will attend the SDS conference.
I may meet people who work a bit this way, but I do not expect to find any.
There is a possibly interesting workshop on metaphorical models.
On the difficulty of justifying aggregation with heterogeneous populations, there is an interesting paper by Osgood attached.

Regards.
JJ
Attachments
heterogenéité.pdf
(201.25 KiB) Downloaded 297 times
gwr
Senior Member
Posts: 209
Joined: Sun Oct 04, 2009 8:40 pm
Vensim version: DSS

Re: Dealing with bigger models

Post by gwr »

Hi JJ,

it is too bad that I will not be able to attend the conference this year, but I certainly hope to meet you and other modelers in 2012. Your approach to modeling - using RC statements to enforce a very concise top-down approach - is very attractive and certainly true to modeling for understanding. There are some comments I would like to make:
  • While sticking to a top-down approach is often very advisable, it nevertheless requires great experience and may take some time. Regarding computational restrictions, and sticking with the metaphor I have used: you cannot usually make someone a good writer of short stories by providing only a couple of sheets of paper. More often it takes a more or less 'unrestricted' amount of paper, practice and adaptation (learning) until the shorter length is reached. So if paper (i.e. computer memory) is available nowadays - provide it; the modeler will find the best way by himself.
  • Regarding confidence in a model: I certainly marvel at those small insight models with three or five stocks that show (almost) the same type of behavior as the much larger Urban Dynamics model or the World3 model, respectively. They are 'unbeatable' in transparency and explanatory value. Yet, would we have the same confidence in what these small models have to tell us had they not been derived by aggregating the larger models that preceded them? (This question especially arises for the inexperienced SD practitioner or a client not well acquainted with the method.)
  • Since the (excellent) paper by Osgood you have provided shows the trade-offs in aggregation, we need to be careful about relying on RC checks alone for validation, since the emergent behavior may yet be unknown. So the call is for hybrid modeling and a careful combination of both methods, IMHO - especially in transport modeling with its spatial dimension and the well-known problem of aggregation error.
    I will add the link to another paper by Osgood from 2007 in the thread about what kind of improvements are needed for Vensim, since dynamic subscripting, breaking through the 32 bit restrictions, and tools for combining individual and aggregate modeling all seem quite reasonable and in line with what that paper foresees as the future development of SD and ABM tools.
  • Regarding RC checks, I believe one should be aware of their restrictions. If you have a model f(x,e,t) and its behavior g(x,e,t), I am sceptical about the way behavior-behavior statements are checked by Vensim. As far as I have understood it, the behavior of an endogenous state variable x_i may be 'forced' to meet the Test Condition, and the resulting behavior is then checked. But for an endogenous variable you have then changed the model, i.e. f_RC <> f_Original. The 'true' reality check would have to include the structural knowledge of what is in your model, in that only the free parameters are varied to check whether the Test Condition can be reached (first part) and whether the Implied Behavior then follows (second part). So RCs involve changes to the structure of the model that are not made explicit (e.g. in a world model, a disease that all of a sudden drains a stock should be modeled as another outflow triggered by an event with set rules, which will also make you think about other feedbacks: will there be counter-measures, e.g. a feedback to the infection rate, and so on?). In a way, then, the simplicity of RC checks may be misleading when applied to endogenous variables. If you make a behavior-behavior statement, you should make sure that it can be reached by the explicit model structure, not by an exogenous shock, shouldn't you? (A minimal sketch of the RC syntax in question follows below.)
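For readers who have not used the feature, a minimal and entirely hypothetical sketch of the Reality Check equation forms discussed above (a test input plus a constraint; the variable names are invented):

no demand :TEST INPUT: total trips = 0
zero demand means zero load :THE CONDITION: total trips = 0 :IMPLIES: network load = 0
{ when checking the constraint, Vensim forces the condition - overriding the normal equation for total trips - and then verifies that network load indeed goes to zero, which is exactly the overriding of endogenous structure discussed above }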
Kind regards,

Guido
LAUJJL
Senior Member
Posts: 1421
Joined: Fri May 23, 2003 10:09 am
Vensim version: DSS

Re: Dealing with bigger models

Post by LAUJJL »

Hi Guido
When I started using SD nearly 10 years ago, I also studied the Reality Check feature and thought it looked like an interesting idea. But I did not use it, for two reasons.
I never saw any public model written in Vensim using it, so I thought it was probably not useful.
The second reason was the same as the one you have explained. I was bothered by forcing an endogenous variable to a predetermined behavior that could be in total contradiction with the original model structure. In fact RC statements modify the model structure, and logically one could say that the RC statements do not check the original model but a derived model whose structure has been modified by the RC statement.
I believed it was possible to force parameters to new values, but forcing a variable that is driven by other values, which have not been changed accordingly, to a new behavior seemed to me nonsense.
In fact I think that most people using Vensim have not understood what RC is about.
Your remark is right when one considers that the model structure acts as a definition of the model, like diagramming does.
In this case, when using RC you modify the model structure and the model definition with it.
But in my case, the model structure is not the model definition. The model definition lies in the reality check equations. Any model structure that satisfies the RCs is correct. I stick to a strict behavioral definition of the problem, and modifying the model structure is no longer a problem.
But I still had a problem: is it possible to define completely the behavior of all the variables of a model, under any circumstances, with reality check equations? When I started, I had serious doubts.
Since I started some months ago, I have never found any problem defining the specifics of a model with RC statements. The only problem is that it deeply changes the way you work, and it requires an extremely rigorous mind to write these statements. In fact all the difficulty is concentrated on the what. It also requires step-by-step modeling, because it is impossible to write the RC statements for a model of a certain size in one step. It is therefore impossible to write a model of a certain size quickly,
as you can do with diagramming. That is one of the main drawbacks of the method. It is a slow, step-by-step method requiring a lot of time, but it generates models that are much better understood as a whole and that adapt much better to real-life conditions and implementation problems, which require much more creativity than the diagramming approach, where you get stuck in the apparent workings of reality.
One of the reasons for using this approach is that I am concerned with the use of the models I build, and I want to verify first the range of benefits of the modeling process and whether it is practically implementable. I have used this method strictly for only two or three months, and it still needs a lot of experimentation. Before working this way, I had used RC for a much longer time, not as a definition tool but to validate my models. But I had a serious problem: were the RC statements sufficient to ensure the model's validity? See the attached pdf by Peterson and Eberlein. According to this paper, one would have to build an enormous number of RCs, impossible to achieve in practice. The rest of the paper is OK.
I then used the RCs as the problem definition, and the problem vanished!
But I think the issue is too difficult to explain in a forum; if you come to the next SDS conference in St. Gallen it will be easier to explain.
I also do not think it is necessarily a good idea for everybody to use such a method, which I prefer for personal reasons: I use the models I build and I am extremely demanding about their validity and usability.
Best regards.
JJ
Attachments
RC_justification.pdf
(792.76 KiB) Downloaded 299 times
gwr
Senior Member
Posts: 209
Joined: Sun Oct 04, 2009 8:40 pm
Vensim version: DSS

Re: Dealing with bigger models

Post by gwr »

JJ,

yes, I can see what you are saying. You are using the RC (behavior-behavior) statements much the same way a reference mode is used: they define the model boundary and at the same time serve as model verification. In effect you are collecting 'all' possible modes of behavior that you can think of and thus guiding the development of the model. Where I would disagree a bit is whether this rigor is necessary - in my opinion it might also hinder creativity in modeling by eliminating the structural element of modeling, or the topology.

Let's say that you want to build a model of a system that you have never seen in action (let's use Bossel's example of a grandfather clock). The structural component of modeling is that, by looking at the clock, you can readily identify known subsystems that you recognize: a pendulum, hands moving on a circle, a gearbox... By rebuilding the structure of the system, e.g. by modeling the known subsystems and their interconnections, you might build a valid model without having given one valid behavior-behavior statement. That part of SD modeling is missed when focusing too much on the behavior side. From new interconnections of known systems, unknown behavior 'emerges'.

Given the way you are modeling, I would think you might want to take a closer look at tools like Mathematica if you have not done so before. I believe almost anything could be modeled with it, and it leaves you with very few limits. I personally want to be flexible in modeling and do not really like to be bound by some school of thought or by the limits of the tool I am using.

Best regards,

Guido
LAUJJL
Senior Member
Posts: 1421
Joined: Fri May 23, 2003 10:09 am
Vensim version: DSS

Re: Dealing with bigger models

Post by LAUJJL »

Guido
About your first paragraph.
You are approximately right that the behavior-behavior specifications are a bit like collecting all possible reference modes. About rigor: I, too, have been tempted by shortcuts when things looked evident. That may be OK at the beginning, but it is often difficult to define the frontier between what must be put into RCs and what need not be. If you develop the model this way, sooner or later you will doubt whether the model is rigorously built as it grows.

I realized that one has only one choice: absolute rigor. There is no point in accepting all the inconveniences of a rigorous approach if one takes hazardous shortcuts and destroys the whole process.
Rather paradoxically this rigor develops creativity, because it asks the right questions about what one wants exactly. Lots of models have a very vague definition or no definition at all.
One must remember that you are totally free in your specifications and it is by being obliged to make these specifications that you become conscious of the freedom you have in the specifications.
About the structural elements, or what you call the topology: they are a consequence of known or supposed causal relationships. If you are sure of a causal relationship that must be respected whatever model you build, you simply build this causal relationship into the reality checks before you build the same equation into the model. One could say that this is mere duplication and not necessary. But here again, where is the limit beyond which it is not necessary to duplicate?

Another point is that when you think about the specifications of the model prior to building the model, you realize that some apparently evident causal relationships are not so evident at all.
And if it is really evident, then I will still duplicate exactly what the RC says into a model equation.
To summarize: some RCs are evident, some are not so evident, and some are not evident at all.
But one must list all of them, whatever their level of obviousness.

About your second paragraph.
You are taking a physical example where everything is clear and the workings of the system are transparent. If all systems were like this, leaving no freedom of interpretation, then diagramming would do, and building RCs would be a mere duplication of evident facts that would not add any value. But even with physical systems, if they are complex, you can choose to build a metaphorical model, for instance to explore some characteristics of reality. You then have the freedom to build the model you want, as long as it has some utility. And it is then necessary to define precisely what you want before building the model. As for the known interconnections, it is very easy to list them as known behavior to be respected, even if that looks like duplication.
You do not miss anything in the behavior-behavior definition. Any causal knowledge, or any knowledge at all, can be represented with reality checks, even if it looks trivial.
And new, unknown behavior emerges from the behavior-behavior definition too.
The great advantage of the behavior-behavior specification over the diagrammatic one is that once you are sure of the definition, you do not need to run any tests, as long as the model satisfies the definition on which it is built.

About your last paragraph: I too like to be free and to work without a preconceived 'method'.
My method is not a school of thought at all. It just requires you to specify precisely what you want to do prior to building the model, and it explains how to do that; apart from that it leaves you total freedom to define anything you want. With diagramming you do not have this liberty, and the definition is not separate from the model. But the method I use presently (I may amend it in the future), by giving you total liberty, is probably also more difficult to use. This is why it is necessary to proceed in small steps to verify the path one is taking. But this is essentially a method that must be practiced to appreciate its usefulness and to understand it. No exposition will replace experimentation.

Regards.
JJ
gwr
Senior Member
Posts: 209
Joined: Sun Oct 04, 2009 8:40 pm
Vensim version: DSS

Last questions

Post by gwr »

A couple of open questions (cf. the thread so far) have remained unanswered and I would appreciate some remarks.
  • I have tried to answer the second one by experimentation. It seems that if I have an exogenous data file which only has data for one point in time (e.g. from converting a .dat file or a spreadsheet file), reading this file into a simulation will take up more memory than if the values are read in using a .cin file. Is this correct?
  • There still remains the question about what 'Use Minimal Memory' in the Advanced tab of the Simulation Control will achieve - any answers here?
Kind regards,

Guido
tomfid
Administrator
Posts: 3806
Joined: Wed May 24, 2006 4:54 am

Re: Dealing with bigger models

Post by tomfid »

1. If you have only one point in time, a .cin file, GET XLS CONSTANTS, or GET DIRECT CONSTANTS with a tab file will be much more efficient.
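For concreteness, a tiny hypothetical example of the .cin route: a changes file (say, constants.cin, loaded via the Changes box in the simulation control) is plain text with one constant assignment per line, e.g.:

zone population[z1] = 12000
zone population[z2] = 9500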

2. I'm not sure anyone's ever tested the limits of the Minimal Memory feature, but it will help to some extent (at the price of larger .vdf and longer disk write times, though that can be minimized by using a savelist).

Tom