Kevin Cowtan
Updates:
One question not answered by the report was why the simple fragment homogenization algorithm (FHA) failed to recover all of the required trend in the adjustments. For the benchmark data, the primary reason is a problem called 'confounding': comparing a pair of stations yields a list of all the breaks occurring in either record, but not which break belongs to which record. If the breaks are correctly allocated to the individual records, the rest of the trend in the adjustments is recovered. Existing homogenization packages include unconfounding steps.
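To make the ambiguity concrete, here is a minimal sketch (not the method from the report; the function name `largest_break` and the window choice are my own) showing that a step detected in the difference series of two stations could equally belong to either one:

```python
import numpy as np

def largest_break(series_a, series_b, window=12):
    """Locate the largest step change in the difference series of two
    stations. Illustrative sketch only: a break in either station
    produces the same signature in the difference, so the result is
    confounded between the pair."""
    diff = np.asarray(series_a) - np.asarray(series_b)
    best_t, best_step = None, 0.0
    for t in range(window, len(diff) - window):
        # Step size: mean after the candidate break minus mean before it.
        step = diff[t:t + window].mean() - diff[t - window:t].mean()
        if abs(step) > abs(best_step):
            best_t, best_step = t, step
    return best_t, best_step

# Synthetic example: station A has a +1.5 C inhomogeneity at month 120,
# station B is homogeneous. By symmetry, a -1.5 C step in B at the same
# time would produce an identical difference series.
rng = np.random.default_rng(0)
a = rng.normal(0.0, 0.3, 240)
a[120:] += 1.5
b = rng.normal(0.0, 0.3, 240)
print(largest_break(a, b))  # break found near t=120; attribution unknown
```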
I have now implemented a crude iterative unconfounding algorithm as the final stage of the homogenization calculation. With this additional stage, the calculation recovers the trend for all but the hardest of the eight US benchmarks. It also reproduces the GHCN adjustments for the real US observations. However, the iterative unconfounding method requires a densely sampled station network (e.g. the benchmark or the real US data) and does not address the issue for the rest of the world.
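A dense network is what makes the attribution possible: a genuine break in one station appears in its difference series against every neighbour, while a break in any single neighbour appears in only one pairing. The one-pass voting sketch below (hypothetical `vote_breaks`; the actual stage is iterative and more involved) captures that idea, reusing the same assumed window and a step threshold:

```python
import numpy as np
from collections import Counter

def vote_breaks(stations, window=12, threshold=1.0):
    """Crudely attribute pairwise breaks to individual stations by
    majority voting over neighbour pairs. One-pass sketch of the
    unconfounding idea; the real algorithm iterates."""
    n = len(stations)
    votes = [Counter() for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            diff = stations[i] - stations[j]
            for t in range(window, len(diff) - window):
                step = diff[t:t + window].mean() - diff[t - window:t].mean()
                if abs(step) > threshold:
                    votes[i][t] += 1  # charge the candidate break to station i
    # A break time seen against a majority of neighbours belongs to station i;
    # a break in a single neighbour contributes only one vote and is rejected.
    return [sorted(t for t, c in v.items() if c > (n - 1) / 2) for v in votes]

# Six-station network in which only station 0 has a break: the voting
# assigns the break times to station 0 and to no one else.
rng = np.random.default_rng(1)
net = [rng.normal(0.0, 0.3, 240) for _ in range(6)]
net[0][120:] += 1.5
print(vote_breaks(net))
```

With a sparse network the vote degenerates, since too few pairings exist to form a reliable majority, which is one way to see why the approach works for the densely sampled US data but not elsewhere.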
Errata: