how to weigh newer data more heavily in calibration?
Posted: Wed May 01, 2019 8:00 pm
It's not rare that older data is less reliable, especially in my work in developing countries. Recent data may be actual measurements, whereas older data may be estimates only, or the measurement frequency may have increased recently.
Hence, when calibrating manually and facing a trade-off between a good fit to old data vs. recent data, I tend to choose the latter. I mean data for the same variable, in one and the same .vdf file.
I wonder if there is a way to do the same in automatic calibration?
In case the answer is "not yet":
I've been thinking that delays actually do something like that: they weight older information less and recent information more. So maybe we could use SD models to optimize other SD models in the way I'm asking about above?
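To make concrete what I mean: the payoff could discount errors by the age of each data point, much like a first-order delay discounts old inputs. Here is a minimal Python sketch of such a time-weighted sum of squared errors; `time_weighted_sse` and `half_life` are made-up names for illustration, not anything Vensim provides:

```python
import numpy as np

def time_weighted_sse(model_series, data_series, times, half_life=5.0):
    """Sum of squared errors where newer observations count more.

    Weights decay exponentially going back in time, analogous to how
    exponential smoothing (a first-order delay) discounts old data.
    half_life: number of time units after which a point's weight halves
    (a hypothetical tuning parameter).
    """
    times = np.asarray(times, dtype=float)
    latest = times.max()
    # newest point gets weight 1; a point half_life older gets 0.5, etc.
    weights = 0.5 ** ((latest - times) / half_life)
    errors = np.asarray(model_series, dtype=float) - np.asarray(data_series, dtype=float)
    return float(np.sum(weights * errors ** 2))

# Example: the model misses every point by the same amount,
# but the older misses contribute less to the payoff.
times = np.array([2000.0, 2005.0, 2010.0, 2015.0])
data  = np.array([1.0, 1.0, 1.0, 1.0])
model = np.array([2.0, 2.0, 2.0, 2.0])
payoff = time_weighted_sse(model, data, times, half_life=5.0)
# weights are 0.125, 0.25, 0.5, 1.0, so payoff = 1.875
```

In manual terms this is exactly my habit of "sacrificing" the fit to old data: the optimizer would do the same automatically, with `half_life` controlling how quickly old data stops mattering.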