GIStemp Quartiles of Age Bolus of Heat

Following the Temperatures

Having discovered that the average of thermometers shows warming, but concentrated in the winter months, and that the added temperature showed up coincident with the arrival of a large number of short lived thermometers, I decided to explore this a bit more, to the next reasonable level: divide the record into quartiles by thermometer persistence.

We’ve already seen the numbers for the top 3000 thermometers under the “long lived” example earlier. The total Station ID count is 13472, so a quartile is 3368 (not different enough from 3000 to make any real difference in the results). I’ll include the top quartile here for documentation purposes only, since we already know there is no heating to speak of for most of the history in the best, longest lived thermometers. The shortest lifetime in these stations is 58 years (a bit shorter than our top 3000, which had a 64 year shortest life).
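For anyone who wants to reproduce the split, here is a minimal sketch (my own Python, not the scripts actually used to make these tables) of how record lifespans can be pulled from a GHCN v2.mean style file and cut into persistence quartiles. It assumes the v2.mean fixed-width layout (12-character Station ID including the modification flag, 4-character year, then twelve monthly values); the file name and helper names are mine.

# Hypothetical sketch: rank GHCN v2.mean "thermometer records" by lifespan
# and cut them into four persistence quartiles. Not the code used for this post.
from collections import defaultdict

def record_lifespans(path="v2.mean"):
    """Return {station_id: (first_year, last_year, span_in_years)}."""
    years = defaultdict(list)
    with open(path) as f:
        for line in f:
            sid = line[0:12]        # country(3) + location(5) + sub-station(3) + mod flag(1)
            year = int(line[12:16])
            years[sid].append(year)
    return {sid: (min(y), max(y), max(y) - min(y) + 1) for sid, y in years.items()}

def quartiles_by_persistence(spans):
    """Sort records longest lived first and cut into four roughly equal groups."""
    ordered = sorted(spans.items(), key=lambda kv: kv[1][2], reverse=True)
    q = (len(ordered) + 3) // 4
    return [ordered[i * q:(i + 1) * q] for i in range(4)]

if __name__ == "__main__":
    groups = quartiles_by_persistence(record_lifespans())
    for i, grp in enumerate(groups, start=1):
        print(f"Quartile {i}: {len(grp)} records, "
              f"shortest life {min(s[2] for _, s in grp)} years")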

Update: Added listings of Site Data

I’ve added listings of the site data (location, name, land use) by quartile by Station ID with number of years for that station. These are just long lists of data (some fields shortened a bit to fit, like name); but if you want to know exactly what thermometer is in which quartile and where it is, you can take a look here:

https://chiefio.wordpress.com/2009/08/15/best-3000-thermometer-records/

https://chiefio.wordpress.com/2009/08/15/quartile-2-middle-top-thermometers/

https://chiefio.wordpress.com/2009/08/15/quartile-3-middle-bottom-thermometers/

https://chiefio.wordpress.com/2009/08/15/bottom-quartile-of-thermometers

The Top Quartile

This represents 25 MB out of 45.8 MB of total data. Just a reminder, we have 12 months of “monthly averages of temperatures in a decade” followed by the average for the whole year, then in the final field, the count of active thermometers in those averages.
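For the mechanics-minded, here is a hedged sketch of how one of the rows below could be built. It assumes monthly values arrive in tenths of a degree C with missing readings flagged, and that the station count is the number of distinct records contributing to the decade; the exact conventions of the actual report generator may differ.

# Hedged sketch only: build one "DecadeAV" style row for a cohort.
# rows: iterable of (station_id, year, monthly), where monthly is a list of 12
# values in tenths of C, or None where a month is missing.
def decade_average(rows, decade_end):
    """Average each month over the decade ending in decade_end, then take the
    mean of those 12 means, then count the stations that contributed."""
    sums, counts, stations = [0.0] * 12, [0] * 12, set()
    for sid, year, monthly in rows:
        if decade_end - 9 <= year <= decade_end:
            stations.add(sid)
            for m, v in enumerate(monthly):
                if v is not None:
                    sums[m] += v / 10.0      # tenths of C -> degrees C
                    counts[m] += 1
    months = [s / c if c else float("nan") for s, c in zip(sums, counts)]
    return months, sum(months) / 12.0, len(stations)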

DecadeAV: 1879  
 1.8  2.7  5.8 10.4 14.7 18.8 20.8 20.1 16.7 12.1  6.6  2.6 11.1  575
DecadeAV: 1889  
 0.4  2.0  5.3 10.7 15.4 19.0 21.1 20.3 17.2 12.0  6.5  2.7 11.1 1137
DecadeAV: 1899 
-0.4  0.7  4.6 10.6 15.3 19.5 21.5 20.9 17.5 12.0  5.9  1.5 10.8 1843
DecadeAV: 1909  
 0.6  1.0  5.8 10.5 15.2 18.9 21.2 20.7 17.3 12.3  6.7  1.8 11.0 2343
DecadeAV: 1919  
 0.6  1.5  5.8 10.9 14.9 18.7 20.9 20.2 17.0 12.1  6.7  1.8 10.9 2652
DecadeAV: 1929  
 0.8  2.2  6.1 10.5 15.0 18.6 20.9 20.2 17.2 12.3  6.7  2.1 11.0 2995
DecadeAV: 1939  
 0.6  1.6  5.4 10.4 15.1 18.9 21.3 20.6 17.2 12.1  6.3  2.0 11.0 3094
DecadeAV: 1949 
-0.1  1.3  5.2 10.6 14.8 18.4 20.7 20.2 16.9 12.1  6.1  1.5 10.6 3198
DecadeAV: 1959  
 0.2  1.7  4.8 10.4 15.0 18.7 20.9 20.3 17.0 12.0  5.7  2.0 10.7 3179
DecadeAV: 1969 
-0.6  1.0  4.9 10.4 14.8 18.6 20.8 20.1 16.7 12.0  6.1  1.0 10.5 3207
DecadeAV: 1979 
-0.9  1.1  5.4 10.4 14.9 18.7 20.8 20.1 16.8 11.6  5.8  1.3 10.5 3021
DecadeAV: 1989 
-0.5  1.1  5.3 10.6 15.2 19.0 21.3 20.6 16.9 11.6  5.7  0.9 10.6 2641
DecadeAV: 1999  
 1.3  3.5  7.0 11.5 16.4 20.5 22.8 22.2 18.7 13.1  7.0  2.7 12.2 1378
DecadeAV: 2009  
 1.8  2.8  7.0 12.0 16.3 20.2 22.6 22.1 18.3 12.7  7.0  2.1 12.1  304

There is a bit of warming that shows up at the end, when for unknown reasons we start pruning thermometers from the record. But clearly the pattern of “warming a bit and only in the last 20 years” does not match the arrival profile of CO2.

The Bottom Quartile

This group represents only 3 MB of data, well less than 10% of the total (7.3%), so I doubt that it really has much influence on the temperature history of the planet. These stations flit into existence, then evaporate. The longest lived record is 20 years, the shortest only 2.

DecadeAV: 1879 
-13.9 -13.0 -5.1  5.3 13.2 18.5 20.8 19.8 14.0  6.7 -1.6 -7.2  4.8    7
DecadeAV: 1889 
 -5.3  -4.4 -0.8  5.2 11.7 16.0 18.0 17.6 14.3  9.1  3.0 -2.1  6.9    7
DecadeAV: 1899 
 -4.8  -4.0 -0.7  4.8  9.7 13.9 16.6 16.2 12.5  7.4  0.9 -2.9  5.8   17
DecadeAV: 1909 
 -1.0  -0.3  2.8  7.5 11.1 13.9 16.0 15.6 13.1  9.7  5.0  1.4  7.9   12
DecadeAV: 1919  
  2.8   4.0  5.9  8.2  9.2 10.5 12.0 11.8 10.8  8.7  6.0  3.3  7.8   17
DecadeAV: 1929 
 -7.0  -4.3 -0.7  4.5  9.8 13.7 16.0 14.7 12.1  8.0  0.8 -5.1  5.2   18
DecadeAV: 1939 
 -4.5  -3.8  0.1  7.2 12.9 16.6 18.4 18.0 14.8  9.9  3.9 -1.0  7.7   24
DecadeAV: 1949  
  0.9   2.5  5.9 10.6 14.2 17.3 19.3 18.9 16.1 12.1  6.7  2.5 10.6  138
DecadeAV: 1959  
  7.1   7.9 10.2 13.8 17.1 19.9 21.3 21.2 18.9 15.3 11.0  8.2 14.3  418
DecadeAV: 1969  
  8.9   9.9 12.2 15.1 17.9 20.0 21.0 20.8 19.1 16.5 13.0 10.0 15.4 1094
DecadeAV: 1979  
  8.3   9.3 11.7 14.6 17.3 19.4 20.4 20.2 18.3 15.5 12.1  9.3 14.7 1309
DecadeAV: 1989  
  5.4   6.6  9.6 13.0 16.1 18.5 20.0 19.7 17.5 14.1  9.9  6.7 13.1  888
DecadeAV: 1999  
  6.9   8.2 10.4 13.7 16.6 18.8 20.4 20.3 17.8 14.5 10.1  7.6 13.8  288
DecadeAV: 2009  
  4.7   5.6  9.3 12.7 16.1 19.1 20.6 20.5 17.8 14.2  9.4  5.5 13.0  233

The most interesting thing about these thermometers is the bolus that starts arriving in the 1950’s and is largely leaving by the 1980’s (and by now is no longer in the record, the maximum lifetime of the set being 20 years…). It is also worth noting that the earlier stations (though so few in number that they will have little impact) are all fairly cold stations. They have all left the recent records, but they will have pulled the past down to a slightly colder temperature.

When we inspect the averages from those more recent years, we see much warmer January, February, November, December… The 1969 and 1979 decades have a particularly flat seasonal profile. This strongly argues for short lived thermometers from regions with warm N. Hemisphere winter months. The mid-summer peak temperature averages are still in the 20 C range and will not act to pull up the “average temperature” in the rest of the data, but those warm winter temperatures certainly will.
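A toy bit of arithmetic (the numbers here are invented for illustration, only loosely echoing the tables) shows how much leverage a warm-winter cohort has on a simple combined mean:

# Toy numbers only, not taken from GHCN: fold a warm-winter cohort into a
# simple mean and watch the January average rise with no station changing.
long_lived = {"count": 3000, "jan_avg": 0.5}   # roughly like the "top" cohort's January
warm_new   = {"count": 1000, "jan_avg": 9.0}   # roughly like the late-arriving bolus

combined = (long_lived["count"] * long_lived["jan_avg"] +
            warm_new["count"] * warm_new["jan_avg"]) / (long_lived["count"] + warm_new["count"])
print(f"Combined January average: {combined:.2f} C")   # about 2.6 C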

One other thing that is certain: Those numbers in the winter months are not the result of CO2 warming the planet only in winter.

How about the Second Quartile?

These data are 11 MB out of 45.8 MB, so they represent about 1/4 of the total records. The longest lived of them is 58 years, while the shortest lived is just 33. Long enough to have an impact, but not to show a real climate trend in a system with known 60+ year cycles. What do they say?

DecadeAV: 1879  
1.3  1.5  4.5  8.8 12.8 17.1 19.6 19.2 15.6 11.0  5.9  1.3  9.9   52
DecadeAV: 1889  
1.2  2.6  5.0  9.0 13.7 17.3 20.1 19.6 16.5 11.3  7.2  3.0 10.5   65
DecadeAV: 1899  
2.4  3.4  6.4 10.7 14.4 18.1 20.3 19.9 17.3 12.9  8.0  4.3 11.5   87
DecadeAV: 1909  
4.3  4.9  8.0 11.9 15.2 18.0 19.8 19.6 17.3 14.0  9.5  5.8 12.4  141
DecadeAV: 1919  
5.2  6.4  9.2 13.1 15.5 17.7 19.3 19.1 17.1 14.2 10.4  6.5 12.8  189
DecadeAV: 1929  
4.2  5.6  8.7 12.2 15.6 18.1 19.5 19.3 17.2 14.2  9.8  5.7 12.5  261
DecadeAV: 1939  
3.4  4.7  8.1 12.9 16.9 19.6 21.2 20.7 18.1 14.1  9.0  4.8 12.8  751
DecadeAV: 1949  
1.4  2.5  6.4 11.4 15.3 18.3 20.0 19.5 16.7 12.6  7.2  2.7 11.2 1599
DecadeAV: 1959  
4.0  5.3  8.5 13.0 16.5 19.2 20.6 20.1 17.7 13.8  9.0  5.5 12.8 2970
DecadeAV: 1969  
3.3  4.7  8.2 12.5 16.2 18.9 20.3 19.8 17.2 13.5  8.7  4.7 12.3 3056
DecadeAV: 1979  
3.0  4.4  7.9 12.3 15.9 18.7 20.1 19.6 16.9 12.9  8.1  4.2 12.0 2920
DecadeAV: 1989  
2.7  3.9  7.3 12.0 15.8 18.4 20.1 19.6 16.8 12.7  7.6  3.7 11.7 2394
DecadeAV: 1999  
6.4  8.0 10.5 13.7 17.4 20.3 22.0 21.6 19.2 15.3 10.7  7.3 14.4  214
DecadeAV: 2009  
8.4  9.3 12.4 14.6 17.0 18.9 20.1 19.9 18.6 15.8 13.1 10.3 14.9  161

Compared to the “top” group, we still have mid-summer temperatures of about 20C (or perhaps a bit cooler) but the winters are warmer. We are not in the 0.x to 1.x range, we’re up in the 3.x to 4.x range (and even 8 and 10 in the last couple of decades).

To me it looks like we gradually accumulated stations of modest life span in slightly warmer places than the long lived set; and especially in the last few years they are biased toward a nearly flat seasonal curve, far different from the strong seasonal curve of the early thermometers and the “top” cohort.

And the 3rd Quartile

In some ways my favorite of the 4 quartiles, saved for the end. These readings represent 6.5 MB out of the total 45.8 MB. Somewhat less than the quartile just above them, but enough to have an impact. The longest lived record is 33 years, the shortest is 20. Again, long enough to have an impact, but not long enough for a persistent impact.

DecadeAV: 1879 
-5.7 -7.1 -2.5  5.4 11.7 17.4 20.2 19.4 13.8  7.6  0.9 -5.5  6.3    5
DecadeAV: 1889 
-0.8  0.8  4.1  9.6 14.5 19.4 22.3 21.6 18.1 13.1  7.8  2.7 11.1   15
DecadeAV: 1899  
 6.6  7.4 10.1 14.2 17.6 21.0 23.3 22.7 20.0 16.6 11.1  7.2 14.8   25
DecadeAV: 1909  
 6.6  6.8  9.7 13.5 16.5 18.7 20.6 20.6 18.6 15.6 11.7  8.1 13.9   42
DecadeAV: 1919  
 5.6  7.0 10.1 13.8 15.8 17.9 19.2 19.2 17.4 14.3 10.6  7.0 13.2   44
DecadeAV: 1929  
 4.6  6.3  9.4 13.2 16.2 18.3 19.6 19.3 17.4 14.7 10.5  6.5 13.0   63
DecadeAV: 1939 
10.1 11.1 13.5 15.9 17.6 18.8 19.7 19.9 18.9 17.0 14.0 11.3 15.6  110
DecadeAV: 1949 
10.2 10.9 13.1 15.6 17.7 19.2 20.2 20.1 19.1 17.1 13.7 10.8 15.6  340
DecadeAV: 1959 
11.1 11.9 14.0 16.7 19.2 21.0 21.9 21.7 20.2 17.6 14.4 12.0 16.8  996
DecadeAV: 1969 
10.1 11.1 13.1 15.6 17.9 19.7 20.6 20.5 19.0 16.7 13.7 11.1 15.7 2029
DecadeAV: 1979  
 9.0 10.1 12.2 14.6 16.8 18.5 19.6 19.5 18.0 15.5 12.4  9.8 14.7 1721
DecadeAV: 1989  
 7.6  8.3 10.6 13.6 15.9 17.7 19.2 19.3 17.4 14.6 11.0  8.3 13.6 1997
DecadeAV: 1999  
 9.5 10.2 12.6 15.3 18.0 19.8 21.1 20.8 19.0 16.3 12.4 10.0 15.4  871
DecadeAV: 2009  
 9.0 10.2 12.9 15.8 18.3 20.4 21.4 21.2 19.3 16.6 13.0 10.2 15.7  872

Look at the decade ending in 1969: over 2000 stations with a January temperature average of 10 C. Those stations are not in Canada.

While this whole table of data shows a much warmer average winter than the long lived thermometers, summers are still in the 19 to 20 C range. These must be more southerly thermometers, on average.

Further, the decades from 1959 to 1979 jump off the page as a large cohort of winter warming. The same winter warming that we saw in the bulk data as carrying the “warming signal”. Notice, too, that though the earliest stations must have left the record (1950 + 33 = 1983), even stations arriving as late as 1989 would still be in the record (1989 + 20 = 2009), and both of the most recent decades show a considerably warmer winter profile than the long lived stations. This whole bolus of stations is between about 3 C and 5 C warmer in the winter months in the later decades as compared to the earlier decades, and up to 10 C warmer than the stable thermometer cohort during Northern Hemisphere winters.

It would be very hard to remove that much “winter warming bias” from this base data via grids, zones, etc.

In Conclusion

We have seen that there is a clear set of stations that enter the record with very warm winter temperatures. This happens relatively recently in time, and constitutes the bulk of the “winter warming signal”. The seasonal profile of these data strongly indicates a non-Northern Hemisphere location for many of them (something that can be accurately identified from the station records in a future investigation). Furthermore, the arrival (and departure) of these stations roughly matches the “Global Warming” pattern.

Finally, and in some ways most significant: Summers do not warm.

In all of these quartiles we have summer averages that run about 20 C and hold consistently at that temperature over the entire time.

If there were a CO2 (or other “greenhouse gas”) induced general warming of the record, and especially if there were a temperature driven positive feedback mechanism of any sort, we would not have such dramatically stable summer averages spanning a couple of hundred years, a dozen thousand thermometers, in all quartiles of the data.

Bonus Round! Daily Double!

Once you have made a hammer, it’s hard to not run around pounding on things with it 8-)

So I thought I was all done, then I started thinking (always a dangerous thing!)… What if I just took the top 10% of stable thermometers? The best of the best? Those thermometers that have been tended for a hundred years plus by dedicated folks of great passion (or it would not have been done for the last 100+ years…)?


Number:  1348
Size:  12.5 MB (27% of the records)
Shortest life:  103 years
Longest life:   286 years
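In code terms (my own illustration, using the same v2.mean layout assumed in the earlier sketch; file and names are mine), the selection is just a sort and a slice:

# Hypothetical: the "best of the best" is the top 10% of records by lifespan.
from collections import defaultdict

years = defaultdict(list)
with open("v2.mean") as f:
    for line in f:
        years[line[0:12]].append(int(line[12:16]))   # 12-char ID, 4-char year

spans = {sid: max(y) - min(y) + 1 for sid, y in years.items()}
ordered = sorted(spans.items(), key=lambda kv: kv[1], reverse=True)
top_decile = ordered[:len(ordered) // 10]
print(len(top_decile), "records; shortest life", min(s for _, s in top_decile), "years")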

Well, here is the result:

DecadeAV: 1879  
 1.8  2.9  6.2 10.9 15.2 19.3 21.3 20.5 17.0 12.4  6.8  2.7 11.4  445
DecadeAV: 1889  
 1.0  2.6  5.9 11.5 16.1 19.6 21.8 21.0 17.9 12.6  7.0  3.3 11.7  835
DecadeAV: 1899  
 0.4  1.5  5.4 11.4 16.0 20.3 22.2 21.6 18.4 12.6  6.6  2.2 11.5 1207
DecadeAV: 1909  
 0.8  1.0  6.4 11.0 15.9 19.9 22.2 21.8 18.3 12.9  7.0  1.7 11.6 1315
DecadeAV: 1919  
 0.5  1.4  6.2 11.4 16.0 20.1 22.5 21.7 18.2 12.9  6.9  1.7 11.6 1315
DecadeAV: 1929  
 0.7  2.4  6.6 11.3 16.0 20.2 22.5 21.8 18.7 13.1  6.9  2.2 11.9 1338
DecadeAV: 1939  
 1.3  2.3  6.4 11.5 16.6 20.9 23.5 22.7 19.1 13.3  7.1  2.6 12.3 1320
DecadeAV: 1949  
 0.3  2.0  6.2 11.9 16.3 20.4 22.8 22.3 18.6 13.5  7.1  2.3 12.0 1321
DecadeAV: 1959  
 0.9  2.6  5.7 11.5 16.4 20.5 22.8 22.2 18.6 13.3  6.6  2.6 12.0 1337
DecadeAV: 1969 
-0.1  1.6  5.7 11.6 16.1 20.2 22.5 21.7 18.2 13.2  7.2  1.6 11.6 1338
DecadeAV: 1979 
-0.7  1.6  6.4 11.4 16.2 20.4 22.6 21.9 18.4 12.8  6.8  2.1 11.7 1293
DecadeAV: 1989  
 0.1  1.8  6.4 11.7 16.5 20.5 23.0 22.3 18.4 12.7  6.9  1.5 11.8 1198
DecadeAV: 1999  
 0.6  2.9  6.6 11.5 16.6 21.1 23.5 22.8 19.0 13.1  6.6  2.2 12.2  898
DecadeAV: 2009  
 1.0  2.0  6.8 12.1 16.9 21.2 23.6 23.1 18.9 13.0  6.9  1.7 12.3   80

Remarkably devoid of trend. Within a few tenths of a degree C decade to decade in all month columns and in the average for each year. If you had told me that the average thermometer reading for a given month for the planet would not change by more than a couple of tenths of a degree C over 150 years, I would not have believed it possible.
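If you want to put a number on “devoid of trend”, a least-squares slope over the decadal annual means (the 13th column of the table above, as printed, rounded to 0.1 C) is one simple check. This little fit is mine, and it can claim no more precision than those rounded cohort averages allow:

# Fit a straight line to the decadal annual means from the table above.
annual = [11.4, 11.7, 11.5, 11.6, 11.6, 11.9, 12.3,
          12.0, 12.0, 11.6, 11.7, 11.8, 12.2, 12.3]    # decades ending 1879..2009
x = list(range(len(annual)))                           # decade index
xbar, ybar = sum(x) / len(x), sum(annual) / len(annual)
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, annual))
         / sum((xi - xbar) ** 2 for xi in x))
print(f"Least-squares slope: {slope:.3f} C per decade")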

I do find it distressing that the number of stations drops to 80 in the last decade. I would hope that there was some kind of station ID renumbering and that these stations still exist, but with a different number these days. Unfortunately, since GHCN in all cohorts shows a dramatic drop in stations in the last decade, I can see little way for that to be the case. If the stations do still exist, but have just been dropped from GHCN, then I am left to wonder why. Given that the temperature series (and the GIStemp calculation of “anomalies”) seems sensitive to thermometer changes, I can think of no reason to have removed those long lived thermometers from service, or from the data series.

About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present "hot buttons" are the mythology of Climate Change and ancient metrology; but things change...
This entry was posted in AGW Science and Background, Favorites, Metrology. Bookmark the permalink.

19 Responses to GIStemp Quartiles of Age Bolus of Heat

  1. Ellie in Belfast says:

    Several things come to mind on reading your pages on this:
    1. Garbage in / garbage out
    2. They can’t see the wood for the trees – so wrapped up in torturing the data to give up its story (the legacy crime of CO2) that they don’t stand back and actually look at it rationally.
    3. Sometimes it takes an outsider to say “Why are you doing it that way?” (Answer: “Well, we’ve always done it that way”)

    It would not be the first time that an outsider or newcomer to a scientific field shows a flaw in method or thinking, or comes up with a lateral method that has everyone going – of course! or wow!

    Some of my early comments on WUWT were stupid and badly received (or ignored at best) because I failed to understand how the temperature data was calculated. All this average and anomaly stuff seemed intrinsically wrong to me. Still does.

  2. H.R. says:

    Hey, hey, hey! VERY interesting, E.M.

    You wrote:
    “So I thought I was all done, then I started thinking (always a dangerous thing!)… What if I just took the top 10% of stable thermometers? The best of the best? Those thermometers that have been tended for a hundred years plus by dedicated folks of great passion (or it would not have been done for the last 100+ years…)?”

    I’d think you’d find some UHI effect in those ‘long-timer’ stations instead of no apparent trend. Some of those stations had to have significant urbanization crop up around them, eh?

  3. E.M.Smith says:

    @Ellie:

    If you torture the data enough, they will tell you anything you want to hear. If you want to know the truth, you must respect the data, and ask them politely what they have to say; then shut up and listen…

    @H.R.:

    There are a few tenths of rise in some of the columns that might well be UHI. It’s also possible that the folks who run the longest lived stations are just a bit more careful about making sure they are well sited… Basically, I can’t explain the data, I just ask them to tell me what they know…

    I have not looked at exactly where these stations are located. They may, or may not, be in urban cores. I could easily see, for example, many long lived stations being at astronomical observatories out in the boonies. Mount Hamilton comes to mind. I think they have had a reliable temperature station there for a very long time (don’t know if they are in the GHCN though). There has been no development around such sites… On the flip side, many airports have moved and changed locations over the years. I could easily see a (hypothetical) case where the shorter lived locations were the most volatile and most impacted by urban change and location changes.

  4. C-H says:

    Simply brilliant idea! This is something nobody seems to have thought of before. Now, if the longest lived stations have a near flat temperature record, does it also work the other way? If we take the “flattest” record, do we end up with the oldest stations?

  5. JLKrueger says:

    Bingo!

    One of the first lessons I learned as an ORSA (Operations Research Systems Analyst) was that sometimes all the fancy tools in the toolbox can wind up misleading us. Sometimes just sitting back, and as you say, “listening to the data” can be the best approach.

    We can get so enamored with our computers and statistics that we miss the obvious. Great work!

  6. Roger Sowell says:

    E.M., I agree with your approach of surveying the data before analyzing it. At one time, we referred to this as data reconciliation, others referred to this as data screening for gross errors, or by other names. One can learn a lot by simply tabulating the data, as you have here, or by appropriate graphing, also by data distribution graphs (bell curve).

    I wrote a paper in 1998, published in Hydrocarbon Processing, that addresses data reconciliation, among other things. This was prior to my law school days.

    http://www.resowell-law.com/home53

  7. E.M.Smith says:

    @C-H
    I don’t know, but it’s an interesting question. I’m looking at bulk averages, so it would require a bit of coding to pick out individually flat records and then add them together… It also isn’t clear to me how to handle the question of “how flat is the trend for a 2 year history?”. I’ll think about this a bit and see where I end up…
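    A sketch of the sort of scoring I have in mind (illustrative only, not run against the data): rate each record by the absolute slope of its annual means, and refuse to rate anything too short to have a meaningful trend.

    # Illustrative only: "flatness" as the absolute least-squares slope of a
    # record's annual means, with short records excluded rather than scored.
    def flatness(annual_means, min_years=30):
        """Return |slope| in C per year, or None if the record is too short to judge."""
        n = len(annual_means)
        if n < min_years:
            return None
        x = range(n)
        xbar, ybar = (n - 1) / 2, sum(annual_means) / n
        num = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, annual_means))
        den = sum((xi - xbar) ** 2 for xi in x)
        return abs(num / den)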

    @JL Krueger

    Thanks for the support! Back when I was learning FORTRAN (my first computer language… circa 1972 as FORTRAN IV) they spent a great deal of time in class showing us how you could get garbage out of what looked like rational decisions in coding. Now we focus on the “whiz bangs” of object oriented coding and graphics display packages.

    But under it all there is still a need to ask “what precision does this data type support?”, “What happens at overflow of the variables?”, “Is this conclusion really supported by the data, or am I creating the trend in the result?”, “Am I assuming facts not in evidence?”. Unfortunately, fewer and fewer folks ask those questions these days.

    @Roger

    Well, if you’d like to take some of the “blocks of numbers” here and turn them into graphs on your site, I’d be more than happy to link to them! Looks like you have some experience at that sort of thing!

    We just called it a “sanity check”. You would always pick some sample of the data and run it through where it ought to end up in the result. First was to characterize the data (look for things like missing data, edge cases, strange groupings, etc. Learn the patterns of your data.), then run a subset of those cases through the process and see where they ended up (and ask if that was the right place to end up). This tended to catch things like not handling missing data well (averaging a zero into an average when it really was “not there” rather than “zero”) and having edge cases blow up (you pick up 99 well, but 100 comes in as 00 because you only read 2 digits… but only 1/10 of a percent of the data is over 99, so nobody noticed it in the test set), that kind of thing.
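    A couple of toy lines (invented data, just to make those two failure modes concrete):

    # Toy illustration of the two pitfalls above: "missing" treated as zero
    # drags an average down, and a 2-digit field wraps 100 around to "00".
    readings = [21.3, None, 20.8, None, 22.1]        # None means "not there", not zero
    wrong = sum(r or 0 for r in readings) / len(readings)
    right = sum(r for r in readings if r is not None) / sum(1 for r in readings if r is not None)
    print(wrong, right)                              # 12.84 vs 21.4

    count = 100
    print(f"{count % 100:02d}")                      # the 2-digit field reads back "00"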

    As near as I can tell, that kind of “rational paranoia” in coding is no longer taught in most programming classes. We’re all in love with the very complex tools we have built and spend too much time admiring and polishing them and not enough time asking what happens when the gas tank gets dirt in it…

  8. Ellie in Belfast says:

    E.M.,
    I sent you a file last night graphing your top quartile. If you wish to post any please do so (if you can open MS Excel 97-2003). I can send the file on to Roger, or you may do so if it is useful. Of course it may be preferable to start again if my templates aren’t suitable. If they are, a new data set dropped into the same tables will automatically redraw the graphs with the new data.

    There is so much more that can be done with this data. I was just playing to start with and you as the data owner probably need to give direction on how you would like to see the data represented graphically.

    It will be late next week before I can guarantee to spend much time at this again.

  9. E.M.Smith says:

    @ Ellie:

    I got it. I sent you email a bit ago.

    FWIW, I don’t think of myself as “data owner”. I’ve just tabulated the data in an interesting way (and made the code to do so public so anyone can do it).

    Feel free to do whatever you want with the data presented here. Make graphs of it. Give it to your friends. Post it. Publish articles based on it.

    I’m from the “free software” and “communal barn raising” traditions. We all do what we can to make the world better.

    At this point, I’ve looked at the graphs and like what they say. Now I just need to figure out how to put them on a blog page. (Probably some easy “save as gif” or similar, but I still need to figure out what it is…)

    So I guess my “direction” would be: Figure out what presentation lets the data speak most clearly, and make it; then set it free to talk to the world.

  10. Bill Illis says:

    Good stuff E.M.

    You should keep working on this. You might have discovered a method to measure the bias (or it might not lead anywhere but one should at least see if it does).

    Start with the 304 long-lived thermometers that are still around as of 2009. If they are sufficiently distributed around the globe, you might have a better temperature index with just these stations than any other index (if indeed bias has crept into the other indices).

    If they are concentrated regionally (maybe the oceans, or the southern hemisphere isn’t sufficiently covered), then you could just compare this series to the comparable region in the other indices.

    That would be one way to start.

  11. Ellie in Belfast says:

    Graphs as pictures – easy (at least on a PC).

    Copy. Paste using Paste special… (below Paste menu) options include pictures – gif and jpg files. You can paste them back into MS Excel as pictures and lift them from there.

  12. Pingback: Interesting Post « the Air Vent

  13. _Jim says:

    Great work! Doing the work that mainstream science (and science students) ought to be doing!

    _Jim

  14. Bill Flastic says:

    Your definition of quartile isn’t clear to me.

    Do you mean the age of the thermometer at its disappearance, whether that is in 1929 or 2009? Or do you mean the number of years before the present (the present being 2009)?

    It seems reasonable that you are assuming the birth of the thermometer at its first recording, but there are, as you say, some re-namings possible. Could a thermometer appear twice in your list via re-naming?

  15. Marc Shaw says:

    Hey, I read a lot of blogs on a daily basis and for the most part, people lack substance but, I just wanted to make a quick comment to say GREAT blog!…..I’ll be checking in regularly now….Keep up the good work! :)

    – Marc Shaw

  16. E.M.Smith says:

    These are quartiles of thermometer record “life span” not calendar years of recent vs past. So some records come into existence in, oh, 1830, and are recorded in the same record for 100 years. This is a “long lived” record. Another might come into existence in 1830, only to end in 1850 and it is “short lived”. The same short life as one that began in 1988 and ended in 2008.

    And yes, the same thermometer WILL be in the record more than once. The life span here is about “thermometer records” and a single thermometer may (and often does) have more than one record. These are distinguished by the “modification flag” that is the final digit of the station ID. (The first 3 are “country code”, then there are 5 for location, then 3 for sub-station, then one for the modification flag).

    So a site in, for example, Quantico Marine Air Station would have a USA country code, then a 5 digit code for Quantico, then the 3 digits would code for things like “the station near the runway” vs. “the new station on the roof of the tower” or “The automated replacement for the old Stevenson Screen, that is still being recorded for the next 5 years of overlap”.

    Finally, the modification flag reflects processing done to the record after collection. “Corrections” and related changes. So a site might have the exact same thermometer that has been read for 100 years, but the time of observation was not adjusted in the first 90 years, and in the last 10 we have started to apply a TOBS “correction”. Those two “records” have a different “modification history”, so they get two different “modification flags” and I treat them as two different “thermometer records”.
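    In code-ish terms (my own illustration, with field names of my choosing, and only as accurate as the breakdown just described), splitting an ID looks like this:

    # Illustration only: split a 12-character record ID along the 3+5+3+1
    # breakdown described above; the names and the example ID are made up.
    def split_station_id(sid):
        return {
            "country":     sid[0:3],    # country code
            "location":    sid[3:8],    # 5 digits for the location
            "sub_station": sid[8:11],   # 3 digits for which site/instrument there
            "mod_flag":    sid[11:12],  # final digit: the modification flag
        }

    print(split_station_id("425724030001"))   # format demonstration only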

    On my “do someday” list is to cook up a non-GIStemp way to splice some of these “record fragments from the same place” together and see what happens then. But since that process is part of how GIStemp is flawed, I need to be very careful in how I approach it. (i.e. it is a mine field of issues and, IMHO, GIStemp steps on several of them).

    This was an early stage “benchmark” just to see what the data “looked like” and what the “raw as I can get it” data said before they were molested. Then you can compare it to the results of your program and see what is in the data, vs what came from the program code applied.

    But when I looked at this data in this way, it had some ‘surprising’ things to say… thus this posting.

    OK, does it make sense to exclude the recent 10 years of a constantly existing old thermometer from the “long lived” group just based on modification flag? Well, that depends on just what that modification was, doesn’t it? It would be very interesting to take the “long old records” and splice on the “new different modification flag” and compare that to these results AND to the “just a short lived thermometer” cohort. Haven’t done it yet, but I want to. It would illustrate if the “modification” done was ‘cooking the books’ too, or not.

    For now, the “base case” analysis ought to look at the simple, un-spliced records. This eliminates all the issues attendant on splicing, modifying, etc. It just says “give me those records that are long lived and recorded in the same way for the whole time of their life span” vs those records that say “Well, I’m either a short lived here and gone thermometer, or one newly added at a tropical airport, or one where they keep jerking around the modification system, so I’m a bit dodgy”.

    And that sorting stable long lived from flighty thermometer records shows the best most stable records have no warming signal.

    It’s all in the flighty records.

    And that is rather interesting.

    “Why” will take a while…

  17. Simon says:

    Since most of the “warming” is, I believe, represented by an increase in minimum temperatures, and most data is in the form min/max, is the raw data of min/max available anywhere, or is it just the meaningless average?
    Plotting min and max separately would then make this feature blindingly obvious.

  18. E.M.Smith says:

    You can download any of MIN, MAX, or Average from NOAA. The details are under the GIStemp tab up top, in the STEP0 process.

  19. turkeylurkey says:

    BONUS ROUND:
    Hey Chief,

    Am trying to explain the tower of Jello to someone who thinks I’m nuts.

    I think that plots of the Top 10% are potentially powerfully compelling.
    I find myself wishing that I had more than 14 data points for the ‘decadal_annual’ averages.

    If you can find the card deck for that Top 10% thing,
    would you please consider re-invoking it but with a Rolling decadal average?

    There’s a ton of persuasion in those top ten, but I’m wishing I could see a bit more detail when I plot the annual average.
    Probably these are predominantly NH, but at least they aren’t moving around a lot.

    Also, regarding barfulous analyses, I linked to a classic image.
    With High regard,
    TL

Comments are closed.