As US oil imports reach and exceed 50% of consumption, and as volatility
returns to the oil market, the topic of petroleum depletion is being
rediscovered.
The history of attempts to gauge the size of the petroleum resource base
is an interesting and contentious one. A few years ago, George
Richardson, Pål Davidsen, and I developed system dynamics models of the
estimation of world and US petroleum resources. The focus was the
accuracy over time of the estimation procedures used by various groups
(the USGS, oil companies, etc.) to guess at how much oil there is and
how much is likely to be recoverable.
It is obvious that nature's initial endowment of oil to humanity (known
as initial oil-in-place) is fixed relative to the human time scale (that
is, petroleum is formed over geologic time while consumed over only a
few hundred years, so relative to our time horizon, the petroleum
creation rate is essentially zero). It is also obvious that the amount
of oil remaining in the ground to be consumed in the future is
monotonically declining.
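(In stock-and-flow terms - my notation, not taken from the papers cited
below - if R(t) is the oil remaining and Q(t) is cumulative production,
then R(t) = initial oil-in-place - Q(t), and dR/dt = -production rate
<= 0, since the inflow from petroleum formation is effectively zero on
a human time scale.)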
*Estimates* of initial oil in place, however, can rise or fall as
knowledge, opinions, methods, and political pressures change. There are
a variety of estimation methods. One of the most popular is the
"geologic analogy method" in which the abundance of oil in unexplored
regions is estimated by assuming it will be similar to that in known
regions of similar geology. Historically, as each region is opened to
exploration, the estimates start low, then rise rapidly as greater
exploration and improved geological knowledge lead to better estimates.
But instead of flattening out close to or asymptotically approaching the
"true" level, estimates in the US dramatically overshot and then
collapsed. In the 1960s the USGS estimated the ultimate recoverable
petroleum resource base to be nearly three times as great as the value
now widely accepted and endorsed by the USGS. Why the overshoot? The
models we developed explain the overshoot as a systemic phenomenon
created by the limited information available and the bounded rationality
of the petroleum estimation experts. For example, the USGS assumed the
yield to exploration effort in unexplored areas of a given stratigraphy
would be the same as had historically been observed in the explored
regions of similar type. Sounds reasonable - if drilling activity were
randomly distributed. But drilling activity is far from random - oil
companies and wildcatters are several times better than random. This
means they found the easy stuff first, and the yield to exploration in
the places they haven't yet looked must be lower on average than in the
places they have. Compound this error with long delays in getting data
on the yield to exploration and in adjusting beliefs, plus naive
extrapolation of progress in discovery and recovery technology, and
you've got the basis for the overshoot.
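To see the bias concretely, here is a minimal Monte Carlo sketch (mine,
not the model in the papers cited below; all numbers are made up):
drill a fixed number of prospects either at random or in a noisy
biggest-first order, then form the geologic analogy estimate of
unexplored yield from the average yield of the prospects already
drilled.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical basin: 1000 prospects with lognormally distributed
    # oil content (heavy right tail, as in real field-size data).
    sizes = rng.lognormal(mean=0.0, sigma=1.5, size=1000)
    n_drilled = 200

    # Random drilling vs. "skilled" drilling that tends to hit the
    # big, easy prospects first (sort by size plus noise).
    random_order = rng.permutation(sizes.size)
    skilled_order = np.argsort(-(sizes + rng.normal(0.0, sizes.std(), sizes.size)))

    for label, order in [("random", random_order), ("skilled", skilled_order)]:
        found = sizes[order[:n_drilled]]
        remaining = sizes[order[n_drilled:]]
        # Geologic analogy: assume unexplored yield equals explored yield.
        print(f"{label:7s}: analogy estimate = {found.mean():5.2f}, "
              f"true remaining yield = {remaining.mean():5.2f}")

Under random drilling the analogy estimate and the true remaining yield
agree; under skilled drilling the estimate exceeds the truth, and an
estimator who extrapolates it will overshoot.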
Only one estimation method avoids the overshoot, and it is the method
developed by the late geologist M. King Hubbert. Hubbert's method,
based on the basic stock-flow structure of the resource base, has
consistently proved to be the most accurate for the US. Indeed, in 1956
Hubbert predicted US lower 48 production would peak between 1968 and
1972 - at a time when the official and consensus view was that no limits
were in sight. Widely criticized at the time, Hubbert had the last
laugh when production peaked in 1970 (it is now about half that level).
Hubbert's model is certainly one of the most accurate long-range
forecasts of all time.
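For readers who want the mechanics, here is a stripped-down sketch of
the logistic ("Hubbert curve") fit - my simplification of the idea, not
the full model in the references, applied to synthetic data: cumulative
production is fit to a logistic whose asymptote estimates ultimate
recovery and whose midpoint marks the peak year.

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, q_ult, k, t_peak):
        # Cumulative production: approaches ultimate recovery q_ult;
        # annual production (its derivative) peaks at t_peak.
        return q_ult / (1.0 + np.exp(-k * (t - t_peak)))

    # Synthetic cumulative production series peaking in 1970.
    years = np.arange(1900, 1986, dtype=float)
    cum = (logistic(years, 200.0, 0.08, 1970.0)
           + np.random.default_rng(1).normal(0.0, 2.0, years.size))

    # Fit ultimate recovery, steepness, and peak year from the data.
    (q_ult, k, t_peak), _ = curve_fit(logistic, years, cum,
                                      p0=(150.0, 0.05, 1960.0))
    print(f"ultimate recovery ~ {q_ult:.0f}, peak year ~ {t_peak:.0f}")

Because the fit respects the stock-flow constraint - cumulative
production can only approach, never exceed, the ultimate - the estimate
is anchored in a way the analogy method's is not.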
Our models explain the accuracy of Hubbert's method and the tendency of
the geologic analogy method to overshoot. The version of the model
calibrated for the world oil resource situation suggests an overshoot
for estimates of world oil in place as well.
The main point of our work was to show how system dynamics could be used
to calibrate methods to estimate an unknown resource base when the usual
method - repeated comparison of forecasts to actual outcomes - is
impossible. In the case of oil and other nonrenewable resources, the
true answer won't be known until the answer itself is moot - that is,
until after we have consumed the oil and have no chance to correct any
errors. There are policy implications, however. Overoptimistic
estimates of resource abundance can lead to complacency about how much
time remains to develop renewable substitutes for petroleum, dampen
conservation efforts, and delay development of the technologies,
institutions, and values needed to create a sustainable energy system
and sustainable society. I wonder what biases (optimistic or
pessimistic) remain to be discovered in the methods currently being
used to assess other, even more ambiguous resources such as the
climate, soil fertility, and biodiversity.
Interested readers will find this work in:
Davidsen, P., Sterman, J. D., & Richardson, G. P. (1990). A Petroleum
Life Cycle Model for the United States with Endogenous Technology,
Exploration, Recovery, and Demand. System Dynamics Review, 6(1), 66-93.
- Describes the model we developed.
Sterman, J. D., & Richardson, G. P. (1985). An Experiment to Evaluate
Methods for Estimating Fossil Fuel Resources. Journal of Forecasting,
4(2), 197-226.
- Uses the model to examine the dynamics of global petroleum estimates.
Sterman, J. D., Richardson, G. P., & Davidsen, P. (1988). Modeling the
Estimation of Petroleum Resources in the United States. Technological
Forecasting and Social Change, 33(3), 219-249.
- Uses the model to examine the dynamics of petroleum estimates in the
USA.
John Sterman
jsterman@mit.edu