GIStemp STEP1 data change profile

This STEP has already been looked at in fair depth by John Goetz over on WUWT, so I’m going to put a link here to that article rather than rehashing a bunch of the same stuff. Here I’ll just put the benchmark data and a brief discussion.

http://wattsupwiththat.com/2009/07/22/giss-step-1-does-it-influence-the-trend/

Now we could compare to the result of the STEP0 run, to see ONLY what STEP1 does, or we could compare to the "Base Case". Since STEP0 made only minor changes to the benchmark, I would rather keep this a simple "compare to the base case" analysis. The data are all here if you wish to work out exactly what STEP1 alone does, but I'd rather keep my eye on the total change relative to the raw data (and knowing that STEP0 didn't do much, it's kind of silly to worry a lot about it).

The Base Case

This is the base case of the benchmark. The benchmark code is run on the input dataset: v2.mean from NOAA. Each "DecadeAV" line shows the twelve monthly averages (January through December) for the decade ending in the year given, followed by the annual average.

DecadeAV: 1889  
 0.4  1.9  5.2 10.6 15.3 18.8 21.0 20.3 17.2 12.0  6.5  2.7 11.0
DecadeAV: 1899 
-0.2  0.9  4.7 10.6 15.3 19.4 21.4 20.8 17.5 12.0  6.0  1.7 10.8
DecadeAV: 1909  
 0.9  1.3  6.0 10.6 15.2 18.9 21.0 20.6 17.3 12.4  6.8  2.0 11.1
DecadeAV: 1919  
 0.9  1.9  6.1 11.0 14.9 18.6 20.8 20.1 16.9 12.3  6.9  2.2 11.0
DecadeAV: 1929   
 1.0  2.5  6.3 10.7 15.0 18.6 20.7 20.1 17.2 12.5  6.9  2.4 11.2
DecadeAV: 1939  
 1.1  2.2  5.9 10.9 15.4 18.9 21.2 20.6 17.4 12.4  6.8  2.5 11.3
DecadeAV: 1949  
 0.6  1.9  5.8 10.9 15.0 18.4 20.5 20.0 16.9 12.4  6.6  2.2 10.9
DecadeAV: 1959  
 3.1  4.4  7.5 12.2 16.1 19.2 20.9 20.4 17.7 13.5  8.2  4.7 12.3
DecadeAV: 1969  
 3.7  5.1  8.3 12.6 16.2 19.0 20.6 20.2 17.6 13.9  9.2  5.2 12.6
DecadeAV: 1979  
 3.5  5.1  8.4 12.4 15.9 18.8 20.3 19.8 17.3 13.3  8.7  5.0 12.4
DecadeAV: 1989  
 2.7  4.0  7.5 11.9 15.7 18.5 20.4 20.0 17.0 12.8  7.8  3.8 11.8
DecadeAV: 1999  
 4.8  6.4  9.3 13.1 17.0 20.1 21.9 21.5 18.7 14.4  9.2  5.8 13.5
DecadeAV: 2009  
 5.1  6.2  9.7 13.8 17.2 20.3 22.0 21.6 18.9 14.8 10.1  6.2 13.8
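
For anyone who wants to reproduce this style of decade averaging from v2.mean, here is a minimal sketch in Python. It is not my benchmark code; it assumes the standard GHCN v2.mean record layout (a 12-character station ID plus duplicate digit, a 4-digit year, then twelve 5-character monthly means in tenths of a degree C, with -9999 marking missing data) and treats each "DecadeAV" row as the ten years ending in the labeled year.

# Sketch only, NOT the actual benchmark code. Assumes the standard GHCN
# v2.mean layout: columns 1-12 station ID + duplicate digit, 13-16 year,
# then twelve 5-character monthly means in tenths of a degree C (-9999 = missing).
from collections import defaultdict

def decade_averages(path):
    sums = defaultdict(lambda: [0.0] * 12)      # keyed by decade-ending year
    counts = defaultdict(lambda: [0] * 12)
    with open(path) as f:
        for line in f:
            year = int(line[12:16])
            decade_end = year - (year % 10) + 9  # e.g. 1883 -> 1889
            for m in range(12):
                value = int(line[16 + 5 * m : 21 + 5 * m])
                if value == -9999:               # skip missing months
                    continue
                sums[decade_end][m] += value / 10.0   # tenths of C -> C
                counts[decade_end][m] += 1
    for dec in sorted(sums):
        monthly = [sums[dec][m] / counts[dec][m] if counts[dec][m] else float("nan")
                   for m in range(12)]
        annual = sum(monthly) / 12.0
        print(f"DecadeAV: {dec}")
        print(" ".join(f"{v:5.1f}" for v in monthly + [annual]))

decade_averages("v2.mean")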

STEP1 Benchmark Data

Here we look at the result of running the product of STEP1, the Ts.txt file, through my benchmark averaging code. The columns are the same as above, with one addition: the final number on each line is the record (thermometer) count for that decade.

DecadeAV: 1889  
1.0  2.3  5.3 10.3 15.0 18.4 20.5 19.9 16.9 11.8  6.8  3.0 10.9 1047
DecadeAV: 1899  
0.4  1.4  5.2 10.9 15.4 19.5 21.4 20.8 17.7 12.3  6.5  2.3 11.2 1739
DecadeAV: 1909  
1.6  1.8  6.5 10.9 15.4 18.9 21.0 20.6 17.5 12.7  7.4  2.6 11.4 2278
DecadeAV: 1919  
1.6  2.5  6.6 11.3 15.1 18.6 20.7 20.1 17.1 12.6  7.4  2.8 11.4 2641
DecadeAV: 1929  
1.6  3.1  6.8 10.9 15.1 18.5 20.7 20.1 17.3 12.7  7.3  3.0 11.4 2878
DecadeAV: 1939  
2.2  3.2  6.8 11.3 15.6 19.0 21.2 20.7 17.7 12.9  7.6  3.6 11.8 3308
DecadeAV: 1949  
2.1  3.4  7.0 11.7 15.4 18.6 20.6 20.3 17.4 13.3  7.8  3.7 11.8 4188
DecadeAV: 1959  
4.1  5.5  8.2 12.7 16.4 19.4 21.1 20.7 18.1 14.1  8.9  5.6 12.9 5487
DecadeAV: 1969  
4.0  5.4  8.6 12.9 16.4 19.3 20.9 20.6 18.0 14.3  9.6  5.4 12.9 6156
DecadeAV: 1979  
3.8  5.5  8.9 12.7 16.2 19.1 20.7 20.3 17.7 13.9  9.2  5.6 12.8 5989
DecadeAV: 1989  
3.3  4.6  8.0 12.2 15.8 18.6 20.5 20.1 17.3 13.2  8.3  4.3 12.2 5234
DecadeAV: 1999  
4.5  5.9  8.7 12.3 16.1 19.2 21.1 20.7 18.0 13.8  8.7  5.5 12.9 3190
DecadeAV: 2009  
4.6  5.6  9.2 13.1 16.7 19.8 21.6 21.2 18.5 14.3  9.7  5.8 13.3 1565

So What Does This Bucket of Numbers Say?

I really need to get some graphics on this, but learning to do blog graphics is kind of low on my "must do" list… so we're doing this "long hand" by looking at the numbers. That wasn't so bad in the earlier series, where the changes were minor, but here there are a lot of moving parts, so you must hold more numbers in your head to "see the pattern".

OK, let's start with what is still the same. There still isn't much happening in the summer. Take August.

Decade  Before  After  Delta
1889     20.3    19.9   -0.4
1899     20.8    20.8    0.0
1909     20.6    20.6    0.0
1919     20.1    20.1    0.0
1929     20.1    20.1    0.0
1939     20.6    20.7    0.1
1949     20.0    20.3    0.3
1959     20.4    20.7    0.3
1969     20.2    20.6    0.4
1979     19.8    20.3    0.5
1989     20.0    20.1    0.1
1999     21.5    20.7   -0.8
2009     21.6    21.2   -0.4

There is still not much "warming over time" in the August data. Both series still show the same +1.3 C rise from the first decade to the last. But there is a modestly large "bolus" of added warmth in the data that matches the pattern of the arrival of thermometers in the data series. (And the drop in the 2009 decade-ending average matches a drop in total thermometer count…) The GIStemp process has added as much as 1/2 C of warming to the data in the decades ending 1969 / 1979, exactly in sync with the largest count of thermometers. Realize that this is in addition to any warming signal already in the data. Up to this point, GIStemp is acting as an amplifier, not a filter.
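
For clarity, here is the same arithmetic as the table above, spelled out as a small Python sketch (the numbers are simply the August columns copied from the two benchmark listings):

# August decade averages copied from the base case and STEP1 listings above.
before = [20.3, 20.8, 20.6, 20.1, 20.1, 20.6, 20.0, 20.4, 20.2, 19.8, 20.0, 21.5, 21.6]
after  = [19.9, 20.8, 20.6, 20.1, 20.1, 20.7, 20.3, 20.7, 20.6, 20.3, 20.1, 20.7, 21.2]

# Per-decade delta: what STEP1 added to (or removed from) each decade average.
deltas = [round(a - b, 1) for a, b in zip(after, before)]
print("deltas:", deltas)

# End-to-end change within each series (1889 decade to 2009 decade).
print("before trend:", round(before[-1] - before[0], 1))   # 1.3
print("after trend: ", round(after[-1] - after[0], 1))     # 1.3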

It sure looks to me like the GIStemp process has a small, but significant, sensitivity to “thermometer count”.

So how about those winter months? They carry the bulk of the "warming signal". What happens with them? GIStemp does an odd thing where it uses last year's December to make this year's "winter", so I'm going to avoid that complication by looking at November. You can pick other months, but then you must start spanning years to figure out what's going on with the data. March might also be a good month for staying inside a single year's data.
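
To illustrate the complication being avoided, here is a small sketch (my own illustration, not GIStemp code) of how a December–January–February "winter" for a given year reaches back into the prior year's data:

# Illustration only, not GIStemp code: a DJF "winter" for year Y uses
# December of year Y-1 plus January and February of year Y, so any winter
# comparison has to span two calendar years of data.
def winter_mean(monthly_by_year, year):
    # monthly_by_year: {year: [Jan, Feb, ..., Dec]} in degrees C
    dec_prev = monthly_by_year[year - 1][11]
    jan = monthly_by_year[year][0]
    feb = monthly_by_year[year][1]
    return (dec_prev + jan + feb) / 3.0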

Decade  Before  After  Delta
1889      6.5     6.8    0.3
1899      6.0     6.5    0.5
1909      6.8     7.4    0.6
1919      6.9     7.4    0.5
1929      6.9     7.3    0.4
1939      6.8     7.6    0.8
1949      6.6     7.8    1.2
1959      8.2     8.9    0.7
1969      9.2     9.6    0.4
1979      8.7     9.2    0.5
1989      7.8     8.3    0.5
1999      9.2     8.7   -0.5
2009     10.1     9.2   -0.9

Again we have the stronger warming signal in winter months (2.4C over the life of the GIStemp data). We have a large jump with the addition of many thermometers, but it is moderated somewhat over time after that and we again have a drop in the latest decade when the thermometer count drops. The delta column is the change in the data relative to the baseline, so it is the degree to which GIStemp is acting as a damper, a filter, or as an amplifier. The data seem to show that GIStemp is acting as an amplifier up to this point. More thermometers increase the warming signal, fewer reduce it.

But quite clearly, and quite strongly, the “warming signal” is still being carried in the winter months (though now with a tiny boost from the summers when there are more thermometers in the record).

Conclusions

Since STEP2 is the “split into zonal sections and homogenize” process, this is the last time we can do this particular kind of analysis on the data. From here on out we can only look at what happens inside boxes, grids, zones, and anomaly maps. The raw data, and the slightly cooked data, are now saying all they can say about the GIStemp processing of thermometer records.

What they say to me is that GIStemp warms the record. In summer, it tilts the slope upward as more thermometers are added over time, peaking at 1/2 C or so of added tilt (or 0.9 C of tilt if you start from the -0.4 base).

In winter, there is much more "warming" in the whole series. From bottom to peak the 'tilt' is still 0.9 C, but only 0.6 C in the decade prior to now (and negative now). The degree of 'tilt' has a correlation with the thermometer count. Also of note: the entire winter record has been "lifted". When averaged with the summer to get an Annual Average, this will bias the average upward. The warming still comes largely from the winters, and the signal appears to be carried by the added thermometers.

Given the roll-off in the last years of data, it will be interesting to see what happens to the GIStemp numbers as they begin reporting the present cool-down of the planet and the present drop in thermometer count.

If reports start to be delayed, or processes changed, I would count that as a Red Flag. If the anomaly maps do not reflect the “roll off” seen here in the raw data, then there is a major issue in the “mapping” process. (Why? For the simple reason that you must respect the data. If it isn’t in the raw data, or even the slightly processed data, it isn’t real… )

One sidebar: the Ts.txt file format has the "Station ID" on a line by itself that starts with a blank, so you can get a "thermometer count" fairly easily with a "grep" (Global Regular Expression Print – it's a Unix command…):

[chiefio@tubularbells vetted]$ grep "^ " Ts.txt | wc -l
   7630

Which shows that, after all the overlapping, combining, et al. that it does, most of the station IDs that we started with are still in the pot. A future bit of work would be to figure out how many stations (and which ones) were dropped and how many were simply combined records. Also, how many of those were dropped when the 1880 cutoff was put in place. I'll add that bit a little later…
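
If grep isn't handy, a small Python sketch does the same count and keeps the IDs around for that later comparison (assuming only, as above, that the station-ID lines are the ones beginning with a blank):

# Equivalent of: grep "^ " Ts.txt | wc -l
# Counts the station-ID lines (those beginning with a blank) and keeps the
# IDs so they could later be compared against the station IDs in v2.mean.
ids = []
with open("Ts.txt") as f:
    for line in f:
        if line.startswith(" "):
            ids.append(line.strip())
print(len(ids))   # 7630 for this run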
