About constancy of parameters in highly aggregated system models.
I would like to hear the opinions of other system dynamicists on the
following problem:
Let the relationship between demand and price for one individual in an
economic dynamic feedback model be:
demand = constant * (price ** pelasticity) * (income ** ielasticity)   [1]
in which pelasticity has a negative and ielasticity a positive value.
The relation for a population of 1000 individuals will then be:
demand = 1000 * constant * (price ** pelasticity) * (income ** ielasticity)   [2]
This, however, is only true if all 1000 individuals react in the same way.
If their incomes differ (perhaps they are distributed log-normally over
income; are there arguments for particular types of distribution, and for
the constancy of a particular type?), then the actual dependency of demand
on price is smaller than the parameter pelasticity in relation [2]
suggests, because the "same" price works out differently for individuals
with a low income than for individuals with a high income. The parameter
pelasticity is in fact dependent on the income distribution of the
population. And since the income distribution is itself dependent on demand
(because of self-selection processes within the population), the parameter
pelasticity must be a variable. (The same will be true for all other
parameters that depend on distributions of variables around their mean
values.)
The question is: are there general laws (a priori principles) describing
the relations between the "dynamics over time" and the "distribution around
the mean" in a dynamic system? Or are those relations different and unique
for every system, so that the only option is to simulate the relation
between diachronic and synchronic variation by means of models at the
lowest levels of aggregation? Can highly aggregated models of complex
systems be theoretically, empirically or pragmatically valid?
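To make the effect concrete, here is a minimal simulation sketch. The
income distribution parameters, and the assumption that pelasticity itself
varies with income, are illustrative choices of mine, not part of the
argument above:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    income = rng.lognormal(mean=10.0, sigma=0.8, size=n)  # log-normal incomes
    ielasticity = 0.6                                     # same for everyone here
    pelasticity = -1.5 + 0.1 * np.log(income)             # assumed: the poor react more strongly

    def total_demand(price):
        # Sum of the individual demands of equation [1].
        return np.sum(price ** pelasticity * income ** ielasticity)

    # Effective aggregate price elasticity near price = 1, measured by a
    # small log-log finite difference.
    p0, dp = 1.0, 1e-4
    effective = (np.log(total_demand(p0 + dp)) - np.log(total_demand(p0))) \
                / (np.log(p0 + dp) - np.log(p0))

    print(f"mean individual pelasticity: {pelasticity.mean():+.4f}")
    print(f"effective aggregate value  : {effective:+.4f}")

The two numbers differ because the aggregate figure is a demand-weighted
average of the individual elasticities, so it shifts whenever the income
distribution shifts.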
To make a reliable model of (a problem in) a system, the suggestion is that
one has two options:
1. Make a very big, detailed model at the lowest possible aggregation level
(the individual level).
2. Make a smaller summary model for aggregated individuals, in which the
constant parameter values are replaced by variables. These "variable
parameters" depend on the distributions of variables, which change over
time, so those distributions have to be included as level variables
(state variables) in the model; see the sketch after this list.
However, to determine how the magnitudes of the relationships in a model of
type 2 change over time, one needs the simulation results of a model of
type 1. How can one get confidence in a highly aggregated model without the
help of a low-aggregation model?
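As a toy illustration of option 2, the "variable parameter" can be
recomputed each time step from an income distribution carried as state.
Every number here, including the assumed dependence of pelasticity on
income (the same rule as in the sketch above), is hypothetical and only
meant to show the mechanics:

    import numpy as np

    def effective_pelasticity(mu, sigma, ielasticity=0.6, n=100_000):
        # Demand-weighted mean price elasticity implied by a log-normal
        # income distribution, under the assumed rule
        # pelasticity_i = -1.5 + 0.1 * ln(income_i).
        rng = np.random.default_rng(1)
        income = rng.lognormal(mu, sigma, n)
        weights = income ** ielasticity          # demand weights at price = 1
        pelasticity = -1.5 + 0.1 * np.log(income)
        return np.average(pelasticity, weights=weights)

    # The log-mean and log-sd of income are the "level variables" here.
    mu, sigma = 10.0, 0.8
    for t in range(5):
        print(f"t={t}  effective pelasticity = {effective_pelasticity(mu, sigma):+.4f}")
        mu += 0.05      # assumed drift: incomes grow ...
        sigma += 0.02   # ... and spread widens, so the parameter moves

The point is only that the summary model's parameter becomes a function of
the distributional state variables, not a constant.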
Are summary models valid only for physical and chemical systems, a little
bit for population-dynamical systems, still less for ecological systems,
and not at all for social systems?
Geert Nijland, Wageningen University, Dep. Ecological Agriculture.
Mail: Geert Nijland@users@ECO.WAU.NL
Constancy of parameter values in highly aggregated systems
About parameters and validity of aggregate models:
Geert Nijland raised an important question about the validity of
aggregate models. Having dug into this area pretty deeply, I have
concluded that
1) aggregate models of social systems are often valid, and
2) disaggregation can lead to less realism, rather than more.
To understand how disaggregation can degrade model fidelity, it
helps to deal explicitly with the fact that all models of social
systems are approximate and, therefore, stochastic. Even though
typical system dynamics models are in the form of deterministic
equations, the construction and interpretation of these models
relies on more or less informal acknowledgement of uncertainty.
Explicit consideration of uncertainties makes clear the dangers
of disaggregation.
Geert Nijland's example equations may be augmented to include
uncertainty terms, to represent approximation, chaos and other
forms of noise:
demand[i] = constant[i] * (price ** pelasticity[i]) * (income[i] ** ielasticity[i]) + e[i]
where I have added the subscript [i] to several of Nijland's terms
to indicate that these terms vary from one individual to the next,
and the uncertainty term e[i] has been added to acknowledge both
that our model of each individual is imperfect, and also that
the individual may be inherently noisy in his or her behavior, due
to chaotic brain circuits, unmodeled complexity, etc.
Suppose we (at GIGANTIC expense) manage to get good parameters
for each of the 1000 individuals in Nijland's population. That is,
we have 1000 valid equations of the above form. Adding these
equations together results in a disaggregate, but invalid,
model of the population:
total demand = SUM(demand[i])
Even if the demand[i] models are all valid, the total demand
(the disaggregate model of the whole population) is invalid, because
when we add together the e[i] error terms, we need information about
their covariance (cross-correlation) properties. For a population of
1000 individuals, even if the error terms are simple Gaussian
(normal) random variables, we have 1000*999/2 = 499,500 covariance
terms to estimate or measure. Most modelers don't even try to cope with
the covariance terms, and assume the error terms are independent
of each other. The result is that the errors are falsely assumed
to cancel each other out somewhat, so that the percentage error
in the disaggregate model is underestimated by one or two orders
of magnitude. Even if the error terms are omitted from all of the
equations, the disaggregate model will still be a bad predictor,
because the errors are still implicitly there.
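The size of this effect is easy to compute. In the sketch below, the
per-individual error size and the common pairwise correlation are assumed
numbers, chosen only for illustration:

    n = 1000        # individuals, as in Nijland's example
    sigma = 1.0     # assumed standard deviation of each e[i]
    rho = 0.3       # assumed common correlation between every pair e[i], e[j]

    # n*(n-1)/2 distinct covariance terms, the 499,500 mentioned above.
    print("covariance terms:", n * (n - 1) // 2)

    # Variance of SUM(e[i]) under the naive independence assumption ...
    var_naive = n * sigma ** 2
    # ... and with the common pairwise correlation included.
    var_corr = n * sigma ** 2 + n * (n - 1) * rho * sigma ** 2

    print(f"std of summed error, independence assumed: {var_naive ** 0.5:6.1f}")
    print(f"std of summed error, rho = 0.3:            {var_corr ** 0.5:6.1f}")

With these assumed numbers the naive standard deviation is about 32 and
the correlated one about 548: an understatement of more than a factor
of ten.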
This distortion from disaggregation happens all the time in
large-project risk analysis. To make a big technical project appear
risk-free, all you have to do is model it at a very disaggregate level,
and neglect the (unknown) correlations of the error terms by naively
assuming independent errors.
It is often better to work directly with an aggregate model.
Not only does the aggregate uncertainty, directly estimated,
come out in a plausible range, but
you have much more time and money to be careful about equation
form, data integrity, clarity of purpose, and a lot of other
important things, rather than getting bogged down in a lot of
tiny submodels that don't really add up to anything realistic.
In my experience, the hierarchy suggested by Nijland should be
reversed: Given our relative ignorance of the details of
social systems, we should expect summary models to be of maximum
value there. In physics, where we know a lot about the statistics
of the interactions of the small pieces, we may with greater
confidence indulge in disaggregate, detailed models.
David Peterson, Ventana Systems <dwp@world.std.com>
Constancy of parameter values in highly aggregated systems
> Sloan School's Tom Stoker is a leading expert on this issue and recently
> (1993? 1994?) published a comprehensive review article on aggregation
>
Here is the complete citation for the above article:
Stoker, Thomas M. 1993. Empirical Approaches to the Problem of
Aggregation Over Individuals. Journal of Economic Literature, Vol. 31,
No. 4, pp. 1827-1874.
--
Phares P. Parayno
Energy Management and Envt Policy, University of Pennsylvania
210 South 34th Street, Philadelphia, PA 19104-6311
205 South 42nd St, Apt B-1, Philadelphia, PA 19104
tel: (215) 387-5031   fax: (215) 898-9215
email: phares@dolphin.upenn.edu
Constancy of parameter values in highly aggregated systems
Re: Geert's questions on aggregation:
The issue of aggregation, and especially parameter estimation in
aggregated systems, has received much attention in econometrics. The MIT
Sloan School's Tom Stoker is a leading expert on this issue and recently
(1993? 1994?) published a comprehensive review article on aggregation
theory in the Journal of Economic Literature. There is only a small
amount there on aggregation in dynamic systems, but it is an excellent
piece.
Also see Bob Eberlein's PhD thesis (a short version was published in the
SD Review), and Joel Rahn's SDR piece on aggregation in Volume 1,
Number 1.
My feeling is that while one often cannot provide consistent
aggregation over individuals in a dynamic model, the errors introduced
by aggregation are often small relative to other sources of error, such
as knowledge of the individuals' income and price elasticities,
knowledge of the system structure, and knowledge of other parameter
values. As always, the utility of a model depends on its purpose, and
there is no doubt that aggregate models of social systems can be, and
often have been, highly useful for many purposes, despite the issues
involved in creating an aggregate representation of a population of
individuals. It is an important theoretical issue to determine the
limits of aggregation, to develop workable formulations at both the
aggregate and disaggregate levels for the various situations in which
heterogeneity matters, and to nail down better when, and for which
purposes, it matters.
John Sterman
jsterman@mit.edu