Over on Icecap ( http://icecap.us/ ) for January 12th, Joseph D’Aleo has a very interesting article looking at Central Park. I’m going to focus in on just one little graph from his posting, the one above.
UPDATE: (18 Jan 2010) In talking with Joseph, he has confirmed he used GISS data, not the actual “unadjusted” data from NOAA / NCDC. I’ve sent him the “unadjusted” GHCN data and he reports that it is not significantly divergent from the actual “raw” data he got from NY directly. This article has been re-edited to reflect that change in the interpretation of the graph.
Joseph did the leg work to find the real raw data and compare it to the NOAA / NCDC GHCN “unadjusted” data as merged with the NOAA / NCDC USHCN “corrected” data in GIStemp. What he finds is that the “unadjusted + corrected” data are very much adjusted (and some of us would say very much “maladjusted” ;-) as they flow part way through GIStemp.
Exactly which step is responsible is something I’m still looking into. It is at least the “as combined” step, and perhaps the “homogenized” step that includes an Urban Heat Island Effect correction. The original version of this posting called it the “unadjusted” data based on the name used on the chart (the GISS web site labels it “Raw” as the first selection on the dropdown menu).
The data he used are not the GHCN Unadjusted data directly; the data set used is the output of GIStemp processing. (The link in the paper at Icecap connects to the GISS web site, not to NOAA / NCDC. The option to download the STEP0 data is labeled “Raw GHCN + USHCN corrections” at GISS.) If that was, in fact, the data set used, then the graph reflects the merger process in GIStemp STEP0.
That process looks for the existence of both sets of data (GHCN “unadjusted” and USHCN – version one prior to November 15th 2009, and version 2 with added “adjustments” thereafter). If only one exists, that one is used. If both exist, they are averaged, in an odd sort of way. To the extent the heading on this graph ought to have been “GHCN Unadjusted AND USHCN”, there will be some USHCN-derived adjustments making up part of that “unadjusted” line. To the extent that the “as combined” data were used, the chart does not change much (it is mostly an ‘in fill’ process). And to the extent that the “homogenized” data were used, this chart shows what the “homogenization” process does to the data. (And potentially, in all cases, what “adjustments” are in the USHCN version 2 set.)
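The combining rule described above can be sketched in a few lines. This is my reading of the described behavior, not GIStemp’s actual code: a plain average stands in for whatever “odd sort of way” GIStemp really uses to blend overlapping years, and the `{year: temp}` dictionaries are my own simplification of the station record format.

```python
# Sketch of the STEP0-style merge described above (an illustration of
# the described rule, NOT GIStemp's actual code). A plain mean stands
# in for GIStemp's actual combining method on overlapping years.

def merge_step0(ghcn, ushcn):
    """Combine two {year: temp} series per the described rule:
    a year present in only one source is used as-is;
    a year present in both is averaged."""
    merged = {}
    for year in sorted(set(ghcn) | set(ushcn)):
        if year in ghcn and year in ushcn:
            merged[year] = (ghcn[year] + ushcn[year]) / 2.0
        elif year in ghcn:
            merged[year] = ghcn[year]
        else:
            merged[year] = ushcn[year]
    return merged

# Invented example values, in degrees F:
ghcn  = {1990: 53.0, 1991: 53.5}
ushcn = {1991: 54.0, 1992: 54.5}
print(merge_step0(ghcn, ushcn))
# {1990: 53.0, 1991: 53.75, 1992: 54.5}
```

The point of the sketch is simply that any year where both sources exist picks up whatever “adjustments” live in the USHCN values, so the merged line is no longer purely “unadjusted.”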
In either case, that 3 F increase in the warming trend is introduced somewhere in the GIStemp process between STEP0 and STEP2. (STEP0 is the very first ‘glue data sets together’ part of GIStemp. GIStemp is the program from NASA / GISS that creates those “anomaly maps” showing we have warmed by some fractions of a degree and so ought to panic.) One way or another, that 3 F warming trend “bump” is in the making of that product…
Just look at that. Up to 3 whole degrees F (over 3 in a couple of places) of added “warming trend” via the NOAA / NCDC “corrected” adjustments and GIStemp processing. Heck, even the language you must use to describe what is going on is painful to the ear. But what else to call it? The labels NCDC applies are “unadjusted” and “corrected”, so those have to be used to know which data sets I’m talking about. The data are clearly changed, so they are adjusted. And we are left with lumpy terms in quotations like “unadjusted + corrected” data, “raw + corrected” data that isn’t raw and is adjusted, and “homogenized” data that is both truncated and UHI corrected as well.
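What an “added warming trend” means in practice can be shown with a small worked example: fit a least-squares line to each version of a station record and compare slopes. The data values below are invented purely for illustration; no real Central Park numbers are used.

```python
# Illustration (with INVENTED data) of how an "added warming trend"
# between two versions of a record can be measured: ordinary
# least-squares slope of each series, scaled to degrees F per century.

def trend_per_century(years, temps_f):
    """OLS slope of temps_f against years, in degrees F per 100 years."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps_f) / n
    slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps_f))
             / sum((y - my) ** 2 for y in years))
    return slope * 100.0

years    = list(range(1900, 2000))
raw      = [53.0 + 0.001 * (y - 1900) for y in years]  # nearly flat record
adjusted = [52.0 + 0.031 * (y - 1900) for y in years]  # steeper record

added = trend_per_century(years, adjusted) - trend_per_century(years, raw)
print(round(added, 1))
# 3.0
```

A 3 F per century gap between two versions of the same station is the kind of difference the graph is showing, and it dwarfs the fractional-degree signals being argued over.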
You can take nothing for granted when reading the NAME of a data set used in climate “research”.
In the first version of this posting, I had written:
It looks to me like we will need to go all the way back to “first sources” to have any hope of finding out what is really going on in the temperature history of the planet. GHCN “Unadjusted” clearly is too adjusted to be suitable to the task.
Given the new comparison of NOAA / NCDC GHCN “unadjusted” without the GIStemp processing to the actual raw data from NY, I now say instead:
It looks to me like we will need to go ahead of GIStemp and use GHCN “unadjusted” to find out what is going on in the temperature history of the planet. We don’t need to go all the way back to “first sources”, but ought to do so for some QA checks along the way. USHCN Version 2 is too “corrected”, and GIStemp “homogenized” is clearly too adjusted to be suitable to the task.
This is very good news to me, since it means that all the analysis I’ve done using GHCN “unadjusted” is using valid data and I don’t have to do it all over again! Be advised that NOAA / NCDC is rumored to be making a new version of GHCN that uses the same adjustment method as USHCN Version 2; so we may be right back at this “know your adjustments” issue again in a month. One hopes they continue to make available an “unadjusted” version…
I will be digging through the various “versions” of the data made available by NOAA / NCDC (GHCN – the Global Historical Climate Network, both “unadjusted” and “adjusted”; and USHCN – the U. S. Historical Climate Network, both “version one” and “version two”. USHCN claims to be “corrected” but it is unclear which “corrections” are adjustments… some documents describe the USHCN as unadjusted, but corrected.) and the data from GISS in GIStemp (both STEP0 “Raw GHCN + USHCN corrected” and the newer version that uses USHCN Version 2, which is known to have some adjustments in it. GIStemp only began using USHCN V2 a couple of months ago.) I’ll make a follow-up posting here when I’ve got something to show.
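For that kind of version-against-version digging, the basic tool is just a diff over the same station’s annual means. Here is a minimal sketch under my own simplifying assumptions: each version is already reduced to a `{year: temp}` dictionary, and the `tol` threshold is my own arbitrary choice, not anything from NCDC.

```python
# Minimal helper for comparing two "versions" of the same station's
# record. The {year: temp} layout and the tolerance are my own
# simplifying assumptions for illustration; values are invented.

def diff_versions(a, b, tol=0.05):
    """Yield (year, a_value, b_value, delta) for years present in both
    versions whose values differ by more than `tol` degrees."""
    for year in sorted(set(a) & set(b)):
        delta = b[year] - a[year]
        if abs(delta) > tol:
            yield (year, a[year], b[year], delta)

v1 = {1930: 54.2, 1931: 53.9, 1932: 54.0}  # invented "version one" values
v2 = {1930: 54.2, 1931: 53.4, 1932: 54.6}  # invented "version two" values

for year, old, new, d in diff_versions(v1, v2):
    print(year, old, "->", new, f"({d:+.1f})")
```

Run over a real station, a listing like this makes it immediately visible which years the “corrections” touched and in which direction.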
If all this talk about 4 different versions of the same data for the same location (Central Park) has your head swimming, just think on this: They are all held out as valid and correct by NOAA / NCDC. The same organization produces all of: GHCN “unadjusted”, GHCN “adjusted”, USHCN “corrected”, and USHCN Version2. They all are available for download now.
GISS, via GIStemp, makes available a further 3 variations plus anomaly maps, taking the NOAA / NCDC data and reworking it into yet more variations.
So exactly which “input data” are the right ones? You get to choose based on what ‘adjustments’ and ‘corrections’ you would like to have. And they differ from each other, often by several degrees. From this we are supposed to be excited about fractional degrees of change? There is much more than that in the adjustments…