GIStemp – Witness this Fully Armed and Operational Anomaly Station

An Anomalous Death Star

Original Image. I believe that non-commercial use of this image for purposes of education and parody constitutes fair use.

What is an Anomaly Map?

As one of its major end products, GIStemp produces an “anomaly map”. This purports to show whether the planet is getting warmer or cooler, where, and by how much. It does this by comparing the present computed temperature averages to the past computed temperature averages. If the present averages are higher, the assumption is that the planet is getting warmer. Folks then leap to the further conclusion that this must be due to something we people have done, in particular CO2 generation.
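
(For the code minded: here is a minimal sketch of the arithmetic being described, in Python rather than GIStemp’s actual Fortran. The 1951 to 1980 base period is the GISS choice; the data structure and function name are just made up for illustration.)

# Minimal sketch of the "anomaly" arithmetic described above -- not GIStemp's
# Fortran, just the idea: a present average minus a baseline-period average
# for the same month and place.

def monthly_anomaly(temps_by_year, year, month, base_start=1951, base_end=1980):
    """temps_by_year: {year: [12 monthly mean temps in C]} for one station or grid box."""
    baseline = [temps_by_year[y][month] for y in range(base_start, base_end + 1)
                if y in temps_by_year]
    base_mean = sum(baseline) / len(baseline)
    return temps_by_year[year][month] - base_mean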

There are several issues with this chain of connection.

The world has some known cyclical weather and climate patterns. There is a 1500 year warming and cooling cycle called a “Bond Event”. There is a 176 to 200 year cycle, and a ‘double that’ of 300 to 400 years, that may be related to patterns of solar activity. There are even warming and cooling cycles in the oceans, with periods in the 20 to 60 year range (such as the PDO and AMO, the Pacific Decadal Oscillation and the Atlantic Multidecadal Oscillation), that may simply be a ‘ringing’ of the flow of the water, just at a very low frequency. So the first issue is that an “anomaly” compares two different periods in time, and any normal cyclical pattern can show up as an “anomaly” even though it is not abnormal. Only cycles shorter than the baseline are immune from this effect. The 30 year baseline in GIStemp can be fooled into thinking that a PDO flip, for example, is an “anomaly”.

So if the start of your “baseline” is positioned at the bottom of a cyclical low point, you are not measuring anything abnormal, you are measuring where you chose to place your baseline. If the baseline is shorter than a cycle, you might just be measuring that natural cycle. GIStemp uses a 30 year baseline from 1951 to 1980. By definition, any cycle longer than 30 years will show up as an anomaly, even though it is absolutely natural.
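
(And a toy demonstration of that point, again in Python. The 60 year period, the 0.3 C amplitude, and the cold point near 1965 are numbers picked purely for illustration, not real climate data.)

import math

# Toy demo: a purely natural cycle, longer than the baseline, with a 30 year
# baseline parked on its cold half, reads as a warm "anomaly" ever after,
# even though nothing abnormal happened.

def natural_cycle(year, period=60.0, amplitude=0.3):
    # coldest around 1965, warmest around 1995 -- assumed for the demo only
    return -amplitude * math.cos(2 * math.pi * (year - 1965) / period)

baseline_years = range(1951, 1981)      # a 30 year baseline
base_mean = sum(natural_cycle(y) for y in baseline_years) / len(baseline_years)

for y in (1990, 2000, 2009):
    print(y, round(natural_cycle(y) - base_mean, 2))   # positive "anomalies", all natural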

Further, if you select a low point of one of those cycles for the start of your baseline, your ‘anomalies’ will, again by definition, show a lot of “warming” that is not really warming. It is just the placement of your baseline. So is the placement of the baseline ‘special’? From personal experience, and second hand from the reports of my parents, the 1930s and early ’40s were warm. The 1960s to 1970s were particularly cold. (It snowed in my home town for the first time in dozens of years.) So what I see is a simple “cherry pick” of a cold point for the baseline, straddling that cold time. Does the data support this?

PDO Pacific Decadal Oscillation

Original image.

This picture shows the GIStemp baseline set exactly on top of the cold phase of the PDO, one of the major ocean cycles, and among those with the largest impact.

So we know that the baseline is in a ‘special’ place in terms of cold, and we know that it is too short to avoid being misled by cycles. Is there any evidence for a cycle of longer duration? And where is the baseline in that cycle (or those cycles)? Yes. We have the Little Ice Age and our recovery from it, a “several hundred year” event. On this picture we can see that the entire GIStemp time period, from 1880 to present, is in fact on the upslope out of the cold end of the Little Ice Age: the bottom end of a 300+ year cycle.

Sweden had a warming long ago.

Clearly there is room to improve the baseline placement and duration in GIStemp. (As a future “what if” benchmark, I will be moving that baseline and measuring the impact on the anomaly product, answering the question: “Exactly how much of the ‘anomaly’ comes from baseline placement?”)

Why Do We Care?

Well, there are a bunch of folks who are asserting the world is having runaway global warming because “the anomaly” is too warm. But we could just as easily say the anomaly had been too cold until now by a different selection of baseline period… So making a policy decision based on where a programmer decided to put the start of a baseline seems a bit dim…

The Anomaly Map Will Save Us From Thermometer Change

Further, in examining the workings of GIStemp, we find things that “look like they will break it” and make the output invalid. When these are pointed out, those folks who support the Global Warming hypothesis point at the anomaly map creation of GIStemp and assert, roughly: “The Anomaly prevents {whatever problem was found} from invalidating the product or changing the anomaly map”.

Magical things, these anomaly maps. You can feed them any data and they will make a pristine and correct map, somehow… usable even down to 1/100 of a degree C. I have trouble buying that claim (in part due to the math of precision, and in part from knowing that computer programs can easily have the low order bits wrong).

But one must bow before the command to: “Witness the power of this fully armed and operational Anomaly Station”. It can squash dissent with the mere presence of a map. It can vaporize any complaint about thermometer quality, placement, and change over time. Reducing thermometer issues to rubble with a single blast of data mapping. Never mind your concerns about bit shifts, low order bits, serial averaging blurring precision, thermometer deletions, biased thermometer placements, bad baselines: “Nothing can resist the power of the Anomaly!”.

Use The Benchmark, Luke!

Well, GIStemp makes the anomaly maps toward the end, in the last “Land Step”: STEP3. That is, the last step that is not just averaging some Hadley CRUT sea surface temperature anomaly maps into the ocean bit.

It would be very nice to run one set of data through STEP3, then another set of data from the same place but with some of the thermometers missing, and see if it really DOES find the same anomalies. If the anomaly changes, even by a single 1/10 C and in either direction, we know that it is sensitive to the particular thermometers used: it becomes a problem of degree, not one of kind. Or put more bluntly: a prostitute who only sleeps with rich men and only for $10,000 per night is still a prostitute, just like the $5 one; we’re only haggling over the price.

So I have this benchmark. Not a great one, an accidental one. But it is enough.

It seems that in 2007 NOAA changed the format of a data file (the US land thermometer set, USHCN) to a new format named USHCN version 2, or USHCN.v2, and GIStemp did not change with the times. It still uses the old USHCN copy, but that one ends in 2007.

At about the same time, GHCN (the global land thermometer series, which also has the US data in it, but converted to degrees C instead of F) decided to chuck a bunch of thermometers into the dust bin in terms of new data. (Oddly, both still report the OLD data, which goes into the baseline, but new records need not apply… That this might bias the past relative to the “now” is self evident, but some folks can’t see things in front of them… so we benchmark…)

The Forbidden Experiment is often one where you allow a patient to die to see what a particular drug or treatment does, or does not, do. In many ways, NOAA and NASA have conducted The Forbidden Experiment. They have allowed 93% of the land thermometer data for the USA to die from GHCN (and, via GIStemp not updating to use USHCN.v2, from GIStemp). The values you see publicly touted are the ones that come from The Forbidden Experiment version of the land history. (GHCN has also had a Great Dying of Thermometers world wide. It is a Pandemic Thermometer Death…).

If we could put them back in, we could see the results of this: The Forbidden Experiment. And while I cannot do that for the whole world (not having the input data from places like Canada, Australia, China, etc.), we do have a way to do it for the U.S.A.: that USHCN.v2 file.

I’ve written two different bits of code. One converts the USHCN.v2 file into a USHCN format file and just shoves it into GIStemp the same old way. The other takes the first step of GIStemp, STEP0, and teaches the program that reads USHCN data to read the USHCN.v2 file instead. I’ve run both, and both give the same results. (There are about 1100 stations whose data end in mid 2007 that get added back in, but there are also about 63 stations that end up in a ‘need new station inventory description’ log file, and all of their data is removed – some going back to 1880.) So this is not a perfect “2007 changes only” benchmark. But it does let us see if thermometer change changes the anomaly map.
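
(For the curious, the first of those two bits of code is basically a fixed column reshuffle. The sketch below shows only the shape of the idea; the column slices, field widths, and missing data flag are assumptions for illustration, not the actual USHCN or USHCN.v2 record layouts, which the real converter takes from the NOAA documentation and from what STEP0 expects to read.)

# Sketch of the USHCN.v2 -> old USHCN reformat idea ONLY. The slice positions,
# field widths, and -9999 missing flag are ASSUMED for illustration; the real
# converter follows the documented NOAA layouts and what STEP0 actually reads.

def convert_v2_line(line):
    station = line[0:11]              # assumed: 11 character station id
    year    = line[12:16]             # assumed: 4 digit year
    months  = line[16:].split()[:12]  # assumed: 12 monthly values (missing = -9999)
    # re-emit in the (assumed) fixed width layout the old USHCN reader expects
    return station + ' ' + year + ' ' + ' '.join('{:>6}'.format(v) for v in months)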

Also, we still have the Rest Of World (ROW) changes going on. In the ROW we still have deletions and changes, so we are looking for the impact of restoring only the USA data against the tide of global change. ANY visible impact is important.

So I ran the code. And the anomaly: changes.

Not much, but it does. Now we are just haggling over the price…

The Northern Hemisphere Anomaly Report

In STEP3, where the anomaly maps are produced, some anomaly reports are also produced. I’m not going to paste the whole thing in here (most of it is substantially the same when the data are substantially the same). The interesting bit is the last couple of years.

And what do we get? Here, first, is the “With USHCN.v2 data added” report. Then the older version. Some anomalies go up, some go down. Clearly the 1/10 C place IS sensitive to thermometer locations.

(Oh, this is a current USHCN.v2 file, but the GHCN is a couple of months old, so don’t trust the July 2009 data in the new run to mean anything… it is only the USA data in that month.)

One other minor point: even before The Great Dying Of Thermometers there is some jitter in the 1/10 C place. Clearly the 1/10 C place is also sensitive to exactly which copy of the same data you get from NOAA… (The new file is in 1/10 F while the old file is in 1/100 F – despite the raw data being in whole degrees F, and both of those being False Precision numbers…)
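
(A small worked example of that False Precision point, with a made up 32.96 F monthly mean: carrying the same number in hundredths of F in one file and tenths of F in the other can move the converted value by a whole 1/10 C.)

# Worked example of the false precision point above. The 32.96 F value is made
# up; the same monthly mean carried at 1/100 F vs 1/10 F precision can land in
# different tenths of a degree C after conversion.

def f_to_c_tenths(f):
    return round((f - 32.0) * 5.0 / 9.0, 1)

mean_hundredths_f = 32.96                    # as a 1/100 F file might carry it
mean_tenths_f = round(mean_hundredths_f, 1)  # 33.0, as a 1/10 F file would

print(f_to_c_tenths(mean_hundredths_f))      # 0.5 C
print(f_to_c_tenths(mean_tenths_f))          # 0.6 C -- a full 1/10 C of jitter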

As I have said many times: The 1/10 C place in GIStemp is not usable for any significant decisions. It is as much an artifact of processing as it is of any temperature readings.

The New Northern Hemisphere Anomaly Report

[chiefio@tubularbells results]$ pwd
/gnuit/GIStemp/STEP3/results
[chiefio@tubularbells results]$ ls -l NH.Ts.GHCN.CL.PA.txt 
-rw-rw-r--    1 chiefio  chiefio     14647 Nov 12 11:30 NH.Ts.GHCN.CL.PA.txt
[chiefio@tubularbells results]$ tail NH.Ts.GHCN.CL.PA.txt 
2001    74   66   98   70   73   62   63   80   54   66  107   85     75  72     62   80   68   76  2001
2002   128  129  142   74   63   76   86   58   68   56   90   62     86  88    114   93   73   71  2002
2003   110   70   76   80   90   62   72   93   81  106   84  108     86  82     80   82   75   90  2003
2004    73  109  122   97   74   60   58   67   65   78  107   63     81  85     96   98   62   84  2004
2005   109   85  112  112   91   97   91   81  107  110  115   97    101  98     86  105   90  111  2005
2006    89  112   96   67   85   95   79   72   79  106  105  132     93  90     99   83   82   97  2006
2007   163  119  115  126   80   76   77   87   73   99   96   97    101 104    138  107   80   89  2007
2008    37   53  132   70   66   63   60   54   58   84   94   78     71  72     63   89   59   79  2008
2009    85   86   68   79   79   73   36*************************  *********     83   75**********  2009
Year   Jan  Feb  Mar  Apr  May  Jun  Jul  Aug  Sep  Oct  Nov  Dec    J-D D-N    DJF  MAM  JJA  SON  Year
[chiefio@tubularbells results]$ 

The Old Northern Hemisphere Anomaly Report

[chiefio@tubularbells Aug24.save]$ pwd 
/gnuit/GIStemp/STEP3/results/Aug24.save
[chiefio@tubularbells Aug24.save]$ ls -l NH.Ts.GHCN.CL.PA.txt 
-rw-rw-r--    1 chiefio  chiefio     14647 Aug 24 13:36 NH.Ts.GHCN.CL.PA.txt
[chiefio@tubularbells Aug24.save]$ tail NH.Ts.GHCN.CL.PA.txt 
2001    73   64   97   69   72   61   63   79   53   65  106   84     74  71     61   79   67   75  2001
2002   126  127  141   74   62   75   85   58   67   55   89   60     85  87    112   92   73   70  2002
2003   108   68   75   80   89   61   71   92   80  104   82  106     85  81     79   81   74   89  2003
2004    71  107  121   96   73   59   57   65   64   77  105   60     80  83     95   97   60   82  2004
2005   107   83  110  111   90   96   90   80  106  109  113   95     99  96     84  104   89  109  2005
2006    86  110   95   66   83   93   78   71   77  105  103  130     91  88     97   81   81   95  2006
2007   161  117  113  124   79   76   79   86   73  100   94   95    100 103    136  105   80   89  2007
2008    37   51  128   72   69   64   60   56   60   88   95   75     71  73     61   90   60   81  2008
2009    85   84   66   80   78   74******************************  *********     81   75**********  2009
Year   Jan  Feb  Mar  Apr  May  Jun  Jul  Aug  Sep  Oct  Nov  Dec    J-D D-N    DJF  MAM  JJA  SON  Year
[chiefio@tubularbells Aug24.save]$ 
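
(For those who would rather compare the two tables cell by cell than by eyeball, a quick and dirty diff along these lines does the job. It assumes the whitespace separated layout shown above, and the paths are simply where the two copies sit in my STEP3 directory.)

# Quick and dirty cell-by-cell diff of the two report tails above. Assumes the
# whitespace separated layout shown; paths are local to this particular run.

YEARS = {str(y) for y in range(2001, 2010)}

def read_rows(path):
    rows = {}
    for line in open(path):
        parts = line.split()
        if parts and parts[0] in YEARS:
            rows[parts[0]] = parts[1:]
    return rows

new = read_rows('results/NH.Ts.GHCN.CL.PA.txt')
old = read_rows('results/Aug24.save/NH.Ts.GHCN.CL.PA.txt')

for year in sorted(new):
    for i, (a, b) in enumerate(zip(old.get(year, []), new[year])):
        if a != b:
            print(year, 'column', i + 1, ':', a, '->', b)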

What now?

Now I need to put the stations that were dropped into the inventory file, so that they can be carried all the way through. And it would be nice to do specific benchmarks at each step of the process, to find out exactly how much impact the change has, and where. It would also be nice to “stabilize the world” and repeat the test without the ROW changing too. Oh, and I’m going to re-run this end to end (both “USHCN only” and “USHCN.v2”) with completely new downloads of all the data files (though I’m going to need to find a new disk to squirrel away some of my archive sets…).

All very important and all very useful. And necessary to fend off the inevitable squeals of “But some records were not processed in the USHCN.v2 run!” and “But GHCN was a different time stamp than USHCN.v2!” and all the rest. And yes, those are very important concerns. (Rather like a preference for “short skirts and high heels” vs. “hot pants and leather with studs”. Oh, and negotiating the price for this particular… um, anomaly product…)

But at the end of the day, this (admittedly limited) benchmark shows one thing rather clearly: the Anomaly Maps are sensitive to thermometer count and location issues in the 1/100 C position, and by more than a single 1/100th in some cases. For a few, the 1/10 C place changes as digits roll. We have a change in half the globe, in these N. Hemisphere reports, from thermometer changes in the USA only. Something we were told could not happen…

All that is left is finding the exact limits to the numbers (as we haggle over the small change …)

Technobabble follows for those who like to see the homework:

The Program Run Logs

Just as documentation of the creation of this run, I’ve captured the “run log” below: the console output. Basically, it just shows this was a new run, plus some details on sizes and times. Messages like “cannot remove file” just mean that a work space was already clean when the cleaning step was run.

Snow-Book:~ chiefio$ cat v2.RunLog.12Nov 

[chiefio@tubularbells STEP0]$ do_comb_step0.sh v2.mean
Clear work_files directory? (Y/N) y
rm: cannot remove `work_files/*': No such file or directory
Bringing Antarctic tables closer to input_files/v2.mean format
collecting surface station data
... and autom. weather stn data
... and australian data
replacing '-' by -999.9, blanks are left alone at this stage
adding extra Antarctica station data to input_files/v2.mean
created v2.meanx from v2_antarct.dat and input_files/v2.mean
GHCN data:
 removing data before year 1880.
created v2.meany from v2.meanx
replacing USHCN station data in v2.mean by USHCN_noFIL data (Tobs+maxmin adj+SHAPadj+noFIL)
  reformat USHCN to v2.mean format
extracting FILIN data
getting inventory data for v2-IDs
After the sort of ushcn.tbl into ID_US_G
 
Doing ../bin/v2USHCN2v2.exe the New USHCN.v2 Version
 USHCN data end in  2009
 
Check work_files/ushcn.tbl.updates for station changes
 
-rw-rw-r--    1 chiefio  chiefio    365930 Nov 12 10:24 ushcn.tbl.updates
 
finding offset caused by adjustments
extracting US data from GHCN set
 removing data before year 1980.
getting USHCN data:
-rw-rw-r--    1 chiefio  chiefio  10255476 Nov 12 10:24 USHCN.v2.mean_noFIL
-rw-rw-r--    1 chiefio  chiefio   9594277 Nov 12 10:25 xxx
doing dump_old.exe
 removing data before year 1880.
-rw-rw-r--    1 chiefio  chiefio   9594277 Nov 12 10:25 yyy
Sorting into USHCN.v2.mean_noFIL
-rw-rw-r--    1 chiefio  chiefio   9594277 Nov 12 10:25 USHCN.v2.mean_noFIL
 done with ushcn
created ushcn-ghcn_offset_noFIL 
Doing cmb2.ushcn.v2.exe
created  v2.meanz
replacing Hohenspeissenberg data in v2.mean by more complete data (priv.comm.)
disregard pre-1880 data:
At Cleanup
created v2.mean_comb
move this file from to_next_step/. to ../STEP1/to_next_step/. 
Copy the file to_next_step/v2.mean_comb to ../STEP1/to_next_step/v2.mean_comb? (Y/N) y
 
and execute in the STEP1 directory the command:
   do_comb_step1.sh v2.mean_comb
[chiefio@tubularbells STEP0]$ 

[chiefio@tubularbells STEP0]$ 
[chiefio@tubularbells STEP0]$ ls -l ../STEP1/to_next_step/*
-rw-rw-r--    1 chiefio  chiefio  44695497 Nov 12 10:28 ../STEP1/to_next_step/v2.mean_comb

../STEP1/to_next_step/save:
total 73186
-rw-rw-r--    1 chiefio  chiefio  29872975 Aug 24 13:05 Ts.txt
-rw-rw-r--    1 chiefio  chiefio  44775115 Aug 24 12:37 v2.mean_comb
[chiefio@tubularbells STEP0]$ cd ../STEP1
[chiefio@tubularbells STEP1]$ do_comb_step1.sh v2.mean_comb
Copy input files from STEP0/input_files ? (Y/N) n
Clear work_files directory? (Y/N) y
Creating v2.mean_comb.bdb
reading v2.mean_comb
reading v2.inv
writing v2.mean_comb.bdb
Combining overlapping records for the same location:
Fixing St.Helena & Combining non-overlapping records for the same location:
Dropping strange data - then altering Lihue,Hawaii
reading Ts.strange.RSU.list.IN
reading v2.mean_comb.combined.pieces.bdb
writing v2.mean_comb.combined.pieces.strange.bdb
reading v2.mean_comb.combined.pieces.strange.bdb
writing v2.mean_comb.combined.pieces.strange.alter.bdb
reading Ts.discont.RS.alter.IN
reading v2.mean_comb.combined.pieces.strange.alter.bdb
creating v2.mean_comb.combined.pieces.strange.alter.txt
1000
2000
3000
4000
5000
6000
7000
7630

created Ts.txt
move this file from STEP1/to_next_step to STEP2/to_next_step 
and execute in the STEP2 directory the command:
   do_comb_step2.sh last_year_with_data
[chiefio@tubularbells STEP1]$ ls -l to_next_step/Ts*
-rw-rw-r--    1 chiefio  chiefio  29799460 Nov 12 10:55 to_next_step/Ts.txt

[chiefio@tubularbells STEP1]$ cp to_next_step/Ts.txt ../STEP2/to_next_step/

[chiefio@tubularbells STEP1]$ cd ../STEP2

[chiefio@tubularbells STEP2]$ do_comb_step2.sh 2009
Clear work_files directory? (Y/N) y
converting text to binary file
 last year with data: 2009
 1000 processed so far
 2000 processed so far
 3000 processed so far
 4000 processed so far
 5000 processed so far
 6000 processed so far
 7000 processed so far
 number of station ids: 7630
STOP 0
breaking up Ts.bin into 6 zonal files
STOP 0
trimming Ts.bin1-6
STOP 0
preparations for urban adjustment
Creating annual anomalies ANN.d2009.[1-6]
 GHCN V2 Temperatures (.1 C)                                                     
 935 1 6 1560 1575 1880 9999 -9999 1552
STOP 0 statement executed
 GHCN V2 Temperatures (.1 C)                                                     
 1 1 6 1560 1575 1880 9999 -9999 1549
STOP 0 statement executed
 GHCN V2 Temperatures (.1 C)                                                     
 1069 1 6 1560 1575 1880 9999 -9999 1554
STOP 0 statement executed
 GHCN V2 Temperatures (.1 C)                                                     
 973 1 6 1560 1575 1880 9999 -9999 1212
STOP 0 statement executed
 GHCN V2 Temperatures (.1 C)                                                     
 722 1 6 1560 1575 1880 9999 -9999 1554
STOP 0 statement executed
 GHCN V2 Temperatures (.1 C)                                                     
 805 1 6 1560 1575 1880 9999 -9999 1392
STOP 0 statement executed
inputfiles: ./ANN.dTs.GHCN.CL.1-6 rural neighborhood radius:1000 km overlap_cond:20
The following files were created:
PApars.list
 and for diagnostic purposes:
PApars.noadj.stations.list
PApars.GHCN.CL.1000.20.log
PApars.statn.use.GHCN.CL.1000.20
PApars.statn.log.GHCN.CL.1000.20
STOP 0
 
created Ts.GHCN.CL.* files
 
move them from STEP2/to_next_step to ../STEP3/to_next_step
and execute in ../STEP3/do_comb_step3.sh 
[chiefio@tubularbells STEP2]$ 


[chiefio@tubularbells STEP2]$ ls to_next_step/
save          Ts.GHCN.CL.3  Ts.GHCN.CL.6     Ts.GHCN.CL.PA.3  Ts.GHCN.CL.PA.6             Ts.txt
Ts.GHCN.CL.1  Ts.GHCN.CL.4  Ts.GHCN.CL.PA.1  Ts.GHCN.CL.PA.4  Ts.GHCN.CL.PA.station.list
Ts.GHCN.CL.2  Ts.GHCN.CL.5  Ts.GHCN.CL.PA.2  Ts.GHCN.CL.PA.5  Ts.GHCN.CL.station.list
[chiefio@tubularbells STEP2]$ ls -l to_next_step/Ts.GHCN.CL.*
-rw-rw-r--    1 chiefio  chiefio    847556 Nov 12 11:09 to_next_step/Ts.GHCN.CL.1
-rw-rw-r--    1 chiefio  chiefio  14175108 Nov 12 11:10 to_next_step/Ts.GHCN.CL.2
-rw-rw-r--    1 chiefio  chiefio   2784852 Nov 12 11:10 to_next_step/Ts.GHCN.CL.3
-rw-rw-r--    1 chiefio  chiefio   1555268 Nov 12 11:10 to_next_step/Ts.GHCN.CL.4
-rw-rw-r--    1 chiefio  chiefio   1364296 Nov 12 11:10 to_next_step/Ts.GHCN.CL.5
-rw-rw-r--    1 chiefio  chiefio     83960 Nov 12 11:10 to_next_step/Ts.GHCN.CL.6
-rw-rw-r--    1 chiefio  chiefio    824152 Nov 12 11:11 to_next_step/Ts.GHCN.CL.PA.1
-rw-rw-r--    1 chiefio  chiefio  13386164 Nov 12 11:11 to_next_step/Ts.GHCN.CL.PA.2
-rw-rw-r--    1 chiefio  chiefio   2275360 Nov 12 11:11 to_next_step/Ts.GHCN.CL.PA.3
-rw-rw-r--    1 chiefio  chiefio   1314048 Nov 12 11:11 to_next_step/Ts.GHCN.CL.PA.4
-rw-rw-r--    1 chiefio  chiefio   1284896 Nov 12 11:11 to_next_step/Ts.GHCN.CL.PA.5
-rw-rw-r--    1 chiefio  chiefio     83960 Nov 12 11:11 to_next_step/Ts.GHCN.CL.PA.6
-rw-rw-r--    1 chiefio  chiefio    571140 Nov 12 11:11 to_next_step/Ts.GHCN.CL.PA.station.list
-rw-rw-r--    1 chiefio  chiefio    578588 Nov 12 11:10 to_next_step/Ts.GHCN.CL.station.list
[chiefio@tubularbells STEP2]$ cp to_next_step/Ts.GHCN.CL.* ../STEP3/to_next_step/
[chiefio@tubularbells STEP2]$ cd ../STEP3
[chiefio@tubularbells STEP3]$ do_comb_step3.sh 
Clear work_files directory? (Y/N) y
rm: cannot remove `work_files/*': No such file or directory
Doing ../bin/toSBBXgrid.exe 1880 1200 > to.SBBXgrid.1880.GHCN.CL.PA.1200.log 


SideBar:  Machine usage while I wait for STEP3 to complete:

 11:29am  up  2:58,  2 users,  load average: 0.99, 0.74, 0.47
43 processes: 40 sleeping, 3 running, 0 zombie, 0 stopped
CPU states: 97.2% user,  2.7% system,  0.0% nice,  0.0% idle
Mem:   126724K av,  123924K used,    2800K free,       0K shrd,     536K buff
Swap:  326292K av,       0K used,  326292K free                   77380K cached

  PID USER     PRI  NI  SIZE  RSS SHARE STAT %CPU %MEM   TIME COMMAND
 1831 chiefio   17   0 18816  18M   464 R    99.6 14.8   6:40 toSBBXgrid.exe
 1833 chiefio   10   0  1036 1036   836 R     0.3  0.8   0:00 top
    1 root       8   0   512  512   444 S     0.0  0.4   0:04 init
    2 root       9   0     0    0     0 SW    0.0  0.0   0:00 keventd
    3 root       9   0     0    0     0 SW    0.0  0.0   0:00 kapm-idled
    4 root      19  19     0    0     0 SWN   0.0  0.0   0:00 ksoftirqd_CPU0
    5 root       9   0     0    0     0 SW    0.0  0.0   0:02 kswapd
    6 root       9   0     0    0     0 SW    0.0  0.0   0:00 kreclaimd

Pegged at 100% for a little while...
Meanwhile, back at the script output:

The following files were created:
SBBX1880.Ts.GHCN.CL.PA.1200    BX.Ts.GHCN.CL.PA.1200  
At Clean Up

If you don't want to use ocean data, you may stop at this point
You may use the utilities provided on our web site to create maps etc
using to_next_step/SBBX1880.Ts.GHCN.CL.PA.1200 as input file

In order to combine this with ocean data, proceed as follows:
move SBBX1880.Ts.GHCN.CL.PA.1200 from STEP3/to_next_step to STEP4_5/input_files/.
create/update the SST-file SBBX.HadR2 and move it to STEP4_5/input_files/.
You may use do_comb_step4.sh to update an existing SBBX.HadR2 file
You may use do_comb_step5.sh to create the temperature anomaly tables
that are based on land and ocean data
While left unsaid by GISS, an SBBX.SSTHadR2 is available via: 
 ftp://data.giss.nasa.gov/pub/gistemp
and the oiv2monthly files are available at: 
ftp://ftp.emc.ncep.noaa.gov/cmb/sst/oimonth_v2

[chiefio@tubularbells STEP3]$ ls -l results/
total 64
drwxrwxr-x    2 chiefio  chiefio      1024 Nov 12 10:11 Aug24.save
-rw-rw-r--    1 chiefio  chiefio     14647 Nov 12 11:30 GLB.Ts.GHCN.CL.PA.txt
-rw-rw-r--    1 chiefio  chiefio     14647 Nov 12 11:30 NH.Ts.GHCN.CL.PA.txt
-rw-rw-r--    1 chiefio  chiefio     14647 Nov 12 11:30 SH.Ts.GHCN.CL.PA.txt
-rw-rw-r--    1 chiefio  chiefio     14271 Nov 12 11:30 ZonAnn.Ts.GHCN.CL.PA.txt

[chiefio@tubularbells STEP3]$ ls -l to_next_step/SBBX1880.Ts.GHCN.CL.PA.1200 
-rw-rw-r--    1 chiefio  chiefio  43841984 Nov 12 11:30 to_next_step/SBBX1880.Ts.GHCN.CL.PA.1200
[chiefio@tubularbells STEP3]$ 
