illudium

Illudium, the answer for Nasa GISS.

Nasa’s Marvin The Martian


There are links to sound .wav files in this text, from http://www.gargaro.com/marvinsounds.html . If you click on a link and get nothing, it may be because you have your sound turned off.

Some “Defenders of GISS” have stated that GIStemp is just fine because “it uses satellite data”. Well, yes and no, but more no than yes… There are some data used in GIStemp that derive ultimately from satellite input, but I would not consider them basic input data; they are more like a processed product. But this distinction between “data” and “product” seems to elude some folks… One could just as easily say that a Diesel truck runs on “Rocket Fuel” because you put 20 percent kerosene in it in winter to thin the fuel, and some rockets used kerosene! It’s technically true, but more misleading than enlightening.

It’s not that I think Hansen, like Marvin the Martian, is going to accidentally destroy the planet due to his research, but there is a certain megalomaniacal similarity with the incidental destruction of our economy due to what eludes him. We can only hope he is disappointed.

Here is a bit more detail on the illudium space modulator “satellite” component of the GISS stew. First, notice that this all talks about SST, for Sea Surface Temperature. It’s not about satellite data coverage for land. The land data come from land thermometers. From:

http://www.emc.ncep.noaa.gov/research/cmb/sst_analysis/

Analysis Description and Recent Reanalysis

The optimum interpolation (OI) sea surface temperature (SST) analysis is produced weekly on a one-degree grid. The analysis uses in situ and satellite SSTs plus SSTs simulated by sea ice cover.

So here are your first clues. It’s an “analysis”, not a reporting of satellite data. It uses “in situ” data, that is, surface reports from ships, buoys, etc.; along with satellite Sea Surface Temperatures and, my favorite, SSTs simulated by sea ice cover. Given the recent “issues” with sea ice reporting it kinda makes you wonder…

So, ok, a stew of ships, buoys, whatever, a dash of satellite data, and some simulations (based on a broken ice cover satellite?) are used to create this analysis product (that some folks want to call “satellite data”…)
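To make the “stew” concrete, here is a toy sketch (emphatically not NOAA’s code, and much simpler than a real optimum interpolation) of the core idea: each source’s estimate for a grid cell gets weighted by the inverse of its assumed error variance, so the “precise” instruments dominate and the noisy ones get a small say. All values and variances below are made up for illustration.

```python
# Toy sketch of inverse-error-variance blending for one ocean grid cell.
# This is the flavor of idea behind optimum interpolation, not the real
# OI algorithm, which also models spatial covariance between cells.

def blend_sst(obs):
    """obs: list of (sst_degC, error_variance) pairs from different sources."""
    weights = [1.0 / var for _, var in obs]
    total = sum(weights)
    return sum(w * sst for (sst, _), w in zip(obs, weights)) / total

# Hypothetical cell: a noisy ship report, a precise buoy, a satellite
# retrieval with moderate error. Numbers are illustrative only.
cell = [(15.2, 1.30),   # ship
        (15.0, 0.04),   # buoy
        (15.6, 0.25)]   # satellite
print(round(blend_sst(cell), 2))   # the buoy dominates the blend
```

Note how the buoy, with the smallest assumed error, pulls the blended value close to its own reading; that is the whole point of the weighting, and it is also why the answer is a computed product rather than any one instrument’s data.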

Before the analysis is computed, the satellite data is adjusted for biases using the method of Reynolds (1988) and Reynolds and Marsico (1993). A description of the OI analysis can be found in Reynolds and Smith (1994). The bias correction improves the large scale accuracy of the OI.

Oh, and the satellite data are adjusted for biases before the optimum interpolation is even computed. We’re getting even further away from “data” and into the land of processed data food product…
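For a feel of what a bias correction does, here is a minimal, hypothetical sketch (not the Reynolds 1988 method, which is considerably more sophisticated): where satellite and in-situ observations overlap, estimate the mean satellite offset and subtract it from every satellite retrieval, including the ones in cells with no in-situ check at all.

```python
# Hypothetical mean-offset bias correction, for illustration only.
# Real schemes (Reynolds 1988) estimate spatially varying corrections.

def debias(satellite, in_situ):
    """satellite, in_situ: dicts mapping grid-cell id -> SST in degrees C."""
    overlap = satellite.keys() & in_situ.keys()
    if not overlap:
        return dict(satellite)  # no in-situ data, no basis for a correction
    bias = sum(satellite[c] - in_situ[c] for c in overlap) / len(overlap)
    return {c: sst - bias for c, sst in satellite.items()}

sat = {"A": 15.5, "B": 16.1, "C": 14.9}   # illustrative retrievals
buoy = {"A": 15.0, "B": 15.6}             # satellite reads ~0.5 C warm here
print(debias(sat, buoy))                  # cell "C" is shifted down too
```

Notice that cell “C” gets corrected purely on the strength of what happened in cells “A” and “B”; the fewer in-situ reports there are, the more the “correction” is an extrapolated guess.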

In November 2001, the OI fields were recomputed for late 1981 onward. The new version will be referred to as OI.v2.
The most significant change for the OI.v2 is the improved simulation of SST obs from sea ice data following a technique developed at the UK Met Office. This change has reduced biases in the OI SST at higher latitudes. Also, the update and extension of COADS has provided us with improved ship data coverage through 1997, reducing the residual satellite biases in otherwise data sparse regions. For more details, see Reynolds, et al (2002).

And they have had a change of method lately with “improved simulation”. Frankly, I’m not real fond of having my data be a simulation… especially when based on the sea ice data that are, er, questionable. Even if they do say they think it may have reduced the “biases in” the optimal interpolation at higher latitudes (which I presume means in the arctic where the ice was, er is, er, ought to be…)

But these “data” are just fine for calling “satellite data”… at least as long as you don’t mind your data simulated, interpolated, averaged, homogenized, etc. etc. etc. Me? I like my data to be from instruments, natural, whole, and minimally processed. Certainly not synthetic, er, simulated…

Then these computed anomalies (they are not temperatures, they are computed anomalies in 1 degree latitude / longitude grid cells over oceans) are used to adjust the anomalies computed in GIStemp STEP4 near the very end (again, we left temperatures behind long ago.) Via the reference station method, these computed anomaly products can change the land anomalies hundreds or thousands of kilometers away. The best answer to this ought to be: I don’t think so, Jim.
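The distance-weighting idea behind the reference station method can be sketched like this: an anomaly at one location influences a target cell with a weight that falls off linearly to zero at some cutoff radius. The 1500 km cutoff and the anomaly values below are illustrative choices for this sketch, not GIStemp’s actual code or data.

```python
# Hedged sketch of reference-station-style distance weighting.
# CUTOFF_KM and the example anomalies are illustrative assumptions.

CUTOFF_KM = 1500.0

def weight(distance_km):
    """Linear taper: 1 at zero distance, 0 at or beyond the cutoff."""
    return max(0.0, 1.0 - distance_km / CUTOFF_KM)

def combined_anomaly(neighbors):
    """neighbors: list of (anomaly_degC, distance_km) to the target cell."""
    pairs = [(a, weight(d)) for a, d in neighbors if weight(d) > 0]
    if not pairs:
        return None
    return sum(a * w for a, w in pairs) / sum(w for _, w in pairs)

# A strong anomaly 100 km away blended with one 1200 km away:
print(round(combined_anomaly([(0.8, 100), (0.2, 1200)]), 3))
```

The point of the sketch is simply that anything inside the cutoff radius, however distant, contributes to the target cell’s anomaly; whether that is physically justified at these distances is exactly the question raised above.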


About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present "hot buttons" are the mythology of Climate Change and ancient metrology; but things change...
This entry was posted in AGW Science and Background, Favorites.

3 Responses to illudium

  1. H.R. says:

    I’m beginning to think you’ve gone “where no man has gone before.”

    Just a wild guess, but I think just about everyone else who has tried to dissect GIStemp kinda’ glazed over and zoned out after 1 or 2 steps (their next of kin being properly concerned, if not alarmed, of course).

    I keep following along. You’re striking a nice balance between the details and the plain English translation of each step.

    As Red Green would say, “Remember… keep your stick on the ice. We’re all pullin’ for ya.”

  2. E.M.Smith says:

    H.R.,

    I almost gave up 1/2 way through my first pass, then I did what I always do when it’s a difficult problem and I’m getting that overwhelmed feeling. Pick a piece, maybe even just a single line, and whack at it. Pretty soon it’s a whole script, and then a section. Then you know you can do another one.

My first whack was just to take the “readme” gistemp.txt and document where the data came from. Then you just put your head down and push… I look at that first page now, “GIStemp a basic Intro,” and it’s sad. It desperately needs a re-write (that I started on today, then Marvin came up!) to put more meat in it. But it did its part. It got me started.

    I do as much as I can stomach, then post it in tech detail with human commentary (hoping to pave the way for other programmer types, yet still have something where a non-programmer can “get the gist of it” and see where there is an understandable broken bit). Then I take a break.

    Sometimes the break is a humor posting, like this one, or a clean direct “plain talk” explanation of something that’s galling me about GIStemp (like the Mr. McGuire piece), and sometimes it’s a six pack 8-| if the part was bad enough to want to forget the experience :-]

    My overall impression is that GIStemp “just growed” over the years and has never had a “clean up”. And I think that is part of the problem. Parts that were glued on again and again without questioning the overall result. Like “the reference station method”. I think it was shown OK once for real temperatures and really nearby stations; now it’s done at least 4 times in a row with stations from all over and even with “nearby” anomalies 1500 km away where it has never been justified as an idea. So the edges were stretched until things are broken.

    And no one goes back to say “You do exactly what, again?”

    So I’ve elected myself, and I’ll stay at it until I’m done; I create a movement of like minded helpers; or my liver gives out, whichever comes first ;-)

    Thanks for the support!

  3. From one terrier to another, thanks! Your quarry is different to mine but if you were not doing it maybe I would, ugh!

    Don’t lose your liver on it, though, you’re much too precious with the contributions you bring.

    (ha! the image! a skywalker terrier!)
