It would seem that NCDC has made a nice set of graphs showing the adjustments done on each and every station in the GHCN (Global Historical Climatology Network) temperature history.
A brief look suggests more cooling of the past and warming of the present, via adjustment, than from any asserted “Global Warming” in the actual data. It would take a lot more work, though, to demonstrate that by looking at every station and the net impact on the final ‘warming’.
But for now, there’s quite a set of useful images here:
For each station. So you can just wander around and find things of interest.
I’m going to upload a couple here, for purposes of illustration. But really, this is one giant “Dig Here!” that would benefit from many hands (and eyes) looking at many graphs.
The graphs are in folders with a single digit number. That number is the first digit of the station ID (so also the continent / cluster).
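So finding a given station's plot is mechanical. A minimal sketch, assuming an 11-digit GHCN v3 station ID and a “gif per station” file naming; the base URL is a placeholder since the actual plot-directory link is not reproduced here:

```python
# Placeholder for the NCDC plot directory URL (not reproduced in this post).
BASE = "<plot-directory-url>"

def plot_path(station_id: str) -> str:
    """The sub-directory is the first digit of the station ID
    (which is also the continent / cluster number)."""
    return f"{BASE}/{station_id[0]}/{station_id}.gif"

# Hypothetical 11-digit station ID, just to show the folder lookup:
print(plot_path("40371020000"))  # lands in folder "4"
```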
Here’s some info from other locations at that site:
The “ReadMe” file:
Last Updated: 09/29/2010
The following directory:
is comprised of sub-directories (that are named by the first digit of a station
ID) that contain individual station plot files (in “gif” format).
The plot files contain 9 individual graphs, arranged in a 3×3 matrix. The first
column of graphs, contain 2-D colored symbol graphs of the actual monthly data
for the entire period of record for A) the (Q)uality (C)ontrolled (U)nadjusted
(QCU) data, B) the (Q)uality (C)ontrolled (A)djusted (QCA) data, and C) the
differences between QCA and QCU monthly data. The second column of graphs
contain histograms of the monthly data for QCU, QCA, and (QCA-QCU) respectively.
Finally, the third column of graphs depict annual anomalies and their associated
trend line for QCU and QCA, and the differences in the annual anomalies for QCA
and QCU. Detailed axis titles and units are displayed in the title of each
graph.
So you can see that there’s lots of good info here on unadjusted vs adjusted. I find the trend line and the difference graphs the most interesting.
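The third-column comparison the ReadMe describes can be reproduced for any one station along these lines. This is only a sketch: the monthly series below are made-up placeholders standing in for real QCU and QCA data, and the helper names are my own, not NCDC’s:

```python
def annual_anomalies(monthly, baseline):
    """Average the 12 monthly values for each year, minus a baseline mean."""
    return {y: sum(vals) / 12.0 - baseline for y, vals in monthly.items()}

def ols_slope(series):
    """Least-squares trend (units per year) of a {year: value} series."""
    xs, ys = zip(*sorted(series.items()))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Hypothetical station: unadjusted (QCU) flat, adjusted (QCA) with the
# past cooled and the present warmed by 0.01 C per year.
qcu = {y: [10.0] * 12 for y in range(1950, 2000)}
qca = {y: [10.0 + 0.01 * (y - 1950)] * 12 for y in range(1950, 2000)}

trend_u = ols_slope(annual_anomalies(qcu, 10.0))
trend_a = ols_slope(annual_anomalies(qca, 10.0))
print(f"QCU trend: {trend_u * 100:.2f} C/century")        # 0.00
print(f"QCA trend: {trend_a * 100:.2f} C/century")        # 1.00
print(f"From adjustments: {(trend_a - trend_u) * 100:.2f} C/century")
```

The last line is the quantity of interest here: how much of the station’s trend exists only in the difference between adjusted and unadjusted series.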
Here’s an example from Tatlayoko Lake, BC:
On the right, notice that the original dropping trend line has been turned into a generally flat one. The graph at the bottom right shows that the past was cooled, and the present warmed. Clearly and obviously.
Now, to me, it isn’t so much the warming of the present and cooling of the past, as that pretty much every graph shows more change from adjustments than it does from the actual trend. Those that are not changed are generally so short of data that there isn’t much point. (Though there are graphs that are unchanged.)
What’s the net-net of it? Hard to say, but I’d say mostly a “Global Warming” signal that comes out of the adjustments, not out of the data.
They have a paper describing their latest changes here:
It has some interesting bits buried in it, like their new method finding more step-change points to prune out and inducing even more change than the prior version. The “homogenizing” looks to be the magic sauce. It looks similar to the B.E.S.T. splice-and-dice method of taking slow changes (like aging paint) and keeping that warming in, while taking out the step function when it is repainted to the proper white. Version 3.2.0 finds 1.07 C / Century while Version 3.1.0 had 0.94 C / Century, so we get 0.13 C of added warming from this one update to the code. Now, do that 5 times and you have all of Global Warming. How many updates have there been? Well, since this was from 3.1 to 3.2, I’d wonder about 1.x to 2.x to 3.x… Looks like about a dozen or three to me…
Yes, just a first approximation. But I’d like to know just how many salami slices of warming have been added just this way.
Here is an example station that gets no change:
So why is it left alone, while others are changed? Who knows…
Again, if it is so important to change the data, dramatically, for other stations, then why is it not just as important for THIS station? Which is the error: changing the other one, or not changing this one? They BOTH cannot be error-free decisions…
While Faraday gets its rather high trend cooled down:
Mawson station in the same major number cluster gets a bit of warming:
In a general ‘look over’ it looks to me like the added warming makes up all of the “AGW” signal. It needs a full-on analysis / proof to show that. But what gets me more is that there is no rhyme or reason to it. Some stations up, some down, some flat. Is the whole thing just an artifact of an algorithmic adjustment gone mad? The average warming signal being the leftovers in the error band of all those seemingly senseless adjustments?
Looking at the raw data for many locations does not show much “warming” at all. This one for example:
So why do they end up getting a warming trend? And why is the trend from those adjustments so much more than any trend in the actual data?
IMHO, the folks doing the adjusting are in love with their intellectual creations and have not bothered to actually look at what it does to the data. (The alternative requiring malice… and “never attribute to malice that which is adequately explained by stupidity”… )
This takes a whole lot more eyes looking at a whole lot more of these graphs. Sorting them by type of adjustment. Assessing each one for sanity. Calling “BS” on the ones that are just not justified by the known facts. Calling “BS” on the ones with no known facts to justify them. Calling “BS” on the ones where natural cycles and processes have been ironed out in the name of “homogeneity”.
But at least the graphs are now produced, and sitting there for everyone to have a look.
Station data and other info is available too. They have a FAQ “Frequently Asked Questions” file here:
and it claims to link to other documents that:
global temperature trends?
NCDC Technical Report No. GHCNM‐12‐02 provides a detailed summary of each software modification
and the resulting impacts to global temperatures. This report is available at
ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v3/techreports/Technical Report NCDC No12‐02‐
With software available for inspection:
Is it possible to obtain the computer software code that NCDC uses for making homogeneity
adjustments?
Yes. The Pairwise Homogeneity Adjustment algorithm software is available online at
So plenty to keep a lot of folks busy, if they have the time to dig in and help.
What is very clear is that there is an awful lot of room for fudge in those adjustments, and a lot of room for error that does not show up as error bars, and ought to.
If it is at all like what they do to the USHCN (about 1/2 F), it accounts for roughly all of the “Global Warming” signal with nothing left over for nature:
The cumulative effect of all adjustments is approximately a one-half degree Fahrenheit warming in the annual time series over a 50-year period from the 1940’s until the last decade of the century.
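For a rough comparison with the GHCN trend quoted earlier, the quoted USHCN adjustment can be put into the same units. A back-of-envelope sketch, using only the figures already cited in this post (0.5 °F over 50 years, and the 1.07 C/century v3.2.0 trend):

```python
# Convert ~0.5 F of cumulative adjustment over ~50 years into C per century.
adj_f, years = 0.5, 50.0
adj_c_per_century = adj_f * (5.0 / 9.0) / years * 100.0
print(f"USHCN adjustment rate: {adj_c_per_century:.2f} C/century")  # 0.56

# For scale, against the GHCN v3.2.0 global trend quoted above:
ghcn_trend = 1.07  # C/century
print(f"Fraction of GHCN trend: {adj_c_per_century / ghcn_trend:.0%}")
```

That is, adjustments of that size would be on the order of half of the reported century-scale trend, which is why the question of how they are justified matters.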