USHCN vs USHCN Version 2 – more induced warmth

USHCN – U.S. Historical Climatology Network.

I had intended to do a study comparing USHCN (what GIStemp used until just a couple of months ago, and which “cut off” in May of 2007, leaving only the limited set of GHCN – Global Historical Climatology Network – stations) with the newer USHCN Version 2 (which I will shorten to USHCN.v2). The question I was going to answer was “Did my ‘eyeball’ inspection of the data, which looked like it had an induced warming trend, stand up to analysis?” Well, in comments over on Wattsupwiththat, it looks like another person is already doing that work, and finding an induced warming trend from the “update”.

Mike McMillan (17:28:30) :

I’ve completed USHCN vs USHCN version 2 blink comparison charts for Wisconsin. As with the Illinois charts, the majority of stations had their raw data adjusted to show more warming by lowering the temperatures in the first half of the 20th century.
That brings the raw data more in line with the GISS homogenized versions. I haven’t blinked the original GISS with the new homogenized charts yet, but I’d bet a nickel they’ll show even more warming.

Wisconsin original USHCN raw / revised raw data –
http://www.rockyhigh66.org/stuff/USHCN_revisions_wisconsin.htm

Illinois original raw / revised raw –
http://www.rockyhigh66.org/stuff/USHCN_revisions.htm

Revised raw data. Oxymoron?

I’ll still do my comparison study (as confirmation of his findings, if nothing else; independent confirmation is always a good thing) but with a bit less urgency. The charts in his link have a caption implying this is the GIStemp STEP0 output (which merges GHCN with USHCN or USHCN.v2), so these would be the “blended” data for any stations present in both. Luckily, with only 136 USA stations remaining in GHCN, the odds that any one of these charts is for a station also in GHCN are quite small. For all practical purposes, you can treat these graphs as a fairly clean USHCN vs USHCN.v2 comparison, even if they are not completely pristine comparisons of the data sets directly.
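For anyone who wants to replicate that kind of check, the arithmetic at its core is tiny. A minimal Python sketch (the station values below are invented placeholders, and parsing the real USHCN fixed-width files is left out entirely):

```python
# Sketch of the "did the revision cool the early years?" test for one
# station's annual means (degrees F). All numbers here are invented
# placeholders, NOT real USHCN values.

def early_late_shift(series, split_year=1950):
    """Mean of years at/after split_year minus mean of years before it."""
    early = [t for y, t in series.items() if y < split_year]
    late = [t for y, t in series.items() if y >= split_year]
    return sum(late) / len(late) - sum(early) / len(early)

# Invented example: "version 2" lowers the pre-1950 values by 0.3 F.
v1 = {1910: 50.0, 1930: 50.2, 1960: 50.4, 1990: 50.6}
v2 = {1910: 49.7, 1930: 49.9, 1960: 50.4, 1990: 50.6}

trend_v1 = early_late_shift(v1)  # 0.4 F of warming in "version 1"
trend_v2 = early_late_shift(v2)  # 0.7 F of warming in "version 2"
print(round(trend_v2 - trend_v1, 2))  # prints 0.3 -- the induced extra warming
```

Run that over every station in a state and you have the same comparison Mike made with blink charts, as numbers.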

At least now we know that the decision by NASA / GISS to “put back in the USA thermometers” was not exactly a benevolent act… Until just a while ago, GIStemp ran on USHCN, which had a data cut-off in May of 2007. That left only GHCN to cover the last 3 years, with such effects as leaving only 4 thermometers in California, mostly on the beach and near L.A. Well, OK, they put the USA back in, but it sure looks like they had to re-cook the data first.

So now instead of only 1176 thermometers surviving into 2009 in GIStemp, all from GHCN, we will have them plus a couple of thousand in the USA, but with those USA thermometers having been “adjusted” to show sufficient warming…

I need to re-run some of the thermometer count reports, but with USHCN.v2 data in the system, and see what it does to the USA numbers. So now it looks like we’ll have more than 4 thermometers in California, and though they won’t be “on the beach”, they will be near a sun lamp … ;-)
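The count reports themselves are mostly bookkeeping. A sketch of the idea, with invented station records standing in for the real GHCN / USHCN inventory and data files:

```python
# Sketch: count how many distinct stations report in each year.
# The (station, year) records are invented placeholders.
from collections import defaultdict

records = [
    ("station_A", 2005), ("station_A", 2009),
    ("station_B", 2005), ("station_B", 2009),
    ("station_C", 2005),              # drops out after 2005
]

stations_by_year = defaultdict(set)
for station, year in records:
    stations_by_year[year].add(station)

for year in sorted(stations_by_year):
    print(year, len(stations_by_year[year]))  # 2005 -> 3, 2009 -> 2
```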

BTW, I’d heard a press release discussed where NOAA / NCDC say that a new revision of GHCN is due out around February that uses the same “enhanced” “corrections” applied in USHCN.v2. So watch out for a New Improved, and much Warmer, Global Temperature History coming to a computer model near you!

Call me old fashioned, but I really liked it better when my history did not keep changing and past temperatures did not require frequent re-writing…


About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present "hot buttons" are the mythology of Climate Change and ancient metrology; but things change...
This entry was posted in AGW GIStemp Specific, AGW Science and Background, Favorites, NCDC - GHCN Issues. Bookmark the permalink.

57 Responses to USHCN vs USHCN Version 2 – more induced warmth

  1. Tony Hansen says:

    Looking at the raw(?) data comparisons suggests that there has been some raw work going on somewhere.
    Is there any way short of going back to the B1(?) forms to get real raw data?

  2. E.M.Smith says:

    @Tony Hansen:

    Well, that’s the 64 Billion Dollar Question…

    Every time I’ve thought I had the “raw” data I’ve been wrong. GIStemp called their input “raw” but it was just the output of NCDC… Repeat…

    So I’d suggest going to the paper for a couple of stations, then following it forward and finding where it changed. Then back up one step and you probably have “raw” (but should do checking!)

    For things like the Australian data, you get to back up through NOAA / NCDC and end up at the Australian BOM and have to work your way back through THEIR changes too…

    Sad, really. One of the first lessons banged into my brain was that you always had the original data preserved in your lab book and that erasures or a missing page got you an F, no questions asked and no answers accepted… If you wrote it down wrong, you put a single line through it, wrote the correct value next to it, and put a footnote explaining why you were so clumsy…

    This wholesale mutating of the data just makes me want to vom…

  3. Malaga View says:

    Call me old fashioned, but I really liked it better when my history did not keep changing and past temperatures did not require frequent re-writing…

    You can call me old fashioned as well… and please keep on rocking the numbers in the good old fashioned way… it is music to my ears…. Thank you.

  4. pyromancer76 says:

    Fine presentation on John Coleman’s TV program. Anthony Watts gives deserved praise — http://wattsupwiththat.com/. You have an excellent way of speaking to both the mind and the heart, and the entire production should be understood by a very large public, no small task. It is like a masterstroke, at exactly the right time. I wish it could be sent to every high school in the U.S. as required viewing in science classes.

    What are we going to do about “the raw data”, the wonderful temperature record that humans have kept for hundreds of years, honestly, with a desire to know more about Nature? Charlatans should not be able to mess with these truths.

    REPLY: [ Thanks! I learned to turn hard technical stuff into normal human stuff in about 9th grade. That was when I was asked to tutor several local farm kids in things like algebra… I was “the math whiz” and they needed help. Nothing like trying to get the concepts of factoring a quadratic equation into the head of a person who really just wanted to make Prom Queen to sharpen your teaching / explaining skills ;-) For one brief shining moment being “the science and math geek” had a bit of status (3 of the kids were children of the landed elite; and in a farm town those with the biggest farms can be VIPs of importance. Being their tutor had some cachet ;-)

    All it really takes is actually knowing how the topic material works, translate it OUT of the jargon of the field, demonstrate the process and the value in a very few words, do it in a way that is interesting or entertaining, encourage them that anyone can do it if they get past the notion that it’s hard, and never talk down with that “I know it you don’t” teacher superiority voice used in classrooms. Oh, and when possible, make it a fun game. Other than that, no problem ;-) Well, that, and maybe the hardest part: Open yourself to them. Show that you are just a regular person like them, but maybe who’s looked at something a bit longer than they have. Just be an accessible human being. You cannot hide your soul and be believed and trusted. And a teacher must be believed and trusted to teach effectively.

    Humor sidebar: I’ve tried to purge tech jargon from my general speech. This works well in dealing with CFO, CEO, VP, etc. levels of companies on various contracts. i.e. No “technobabble” and no “talking down” that a lot of computer guys do sometimes. Well, the flip side was that some of the tech guys would think me technically clueless (because I was not speaking in the code words…). On most contracts I would pick off some hard technical bit and do it myself. Two reasons: I would demonstrate technical competence, and I would keep my skills up. It was great fun to watch folks go from “Does he know anything tech?” to “Gosh, he’s good.” I learned early to spot the guys who were trying to pigeonhole me as ‘non-tech’ and pick a moment to shift to full-on technobabble in a staff meeting and watch them suddenly sit up, eyes getting big. Great fun ;-) and it would solidify my role as technical lead. Sad, but true, you have to demonstrate Alpha Dominance sometimes to be a good “cat herder”, especially if you don’t look like much of a cat… -E.M.Smith ]

  5. H.R. says:

    Awriiiiiight E.M.!!!

    I did get to catch your segment on John Coleman’s special.

    Not much time was allowed to discuss all of the fine points of what you have found and posted here, but the California reporting temperature example was an excellent choice to communicate the problems with gum’mint temperature data products.

    Considering that the show aired to a California audience, I think you were probably successful in getting a lot of people who have not really looked into “global warming” to say, “Whoa! Only 4 stations in California to measure temps; and they’re all on the beach? That ain’t right.”

    You done real good.

    P.S. I’ll have to start keeping an eye on the hit counter. It should jump up a good bit.

    P.P.S. Aaaand… I see you didn’t get the girl in this episode. However you did make a good start on getting the townspeople riled up to run the corrupt sheriff out of town.

  6. Viv Evans says:

    Seems that praising your performance on John Coleman’s show isn’t that OT!
    I’ve only watched it today, after all the video links were up at WUWT.
    Well done – your great work, which we knew from your blog, came over very well for a lay audience. I hope they were suitably outraged.

    Regarding the cooked and overcooked data – it’s so sad that we’re now expecting just that from The Team, and are no longer astonished that scientists would dare do such things.

    Still – how they did it needs thorough examination.

  7. Harold Vance says:

    Nice interview, Chief! I thought that you did a fine job of presenting the subject matter in a way that laypersons can understand it.

    Does the actual raw data — meaning the thermometer readings as recorded by the observers in the field — exist in computer format? Has it been digitized and is it available for John Q. Public to peruse?

    I guess that I would find it hard to believe that the true raw data isn’t available and that the climate centers only publish a scrubbed “raw” version for use by the public. Am I correct in assuming that this is the case, that the raw data has been scrubbed, for both USHCN and GHCN?

    REPLY: [ As I understand it, the original paper forms for the USA are on line. I do not know of any extract of the data into a data set that is “really really raw” and online (though Joseph D. found NYC at a site). ALL the stuff from NOAA / NCDC that I’ve been able to find is “adjusted” or scrubbed in some way. From foreign countries, you can get “data” if you pay and promise not to pass it on to anyone else (as they wish to continue collecting rent for the data…) so I’d surmise that the ROW (rest of world) is also not available (at least for many / most countries) and even then you have to find out what “corrections” they have done. For example, the Australian BOM was “recomputing” their “data”… And as we have seen, the “adjustments” are far far greater than any implied “global warming”. So the place to “dig here” is in all the changes done to the data.

    Basically the Global Warming thesis rests on the necessary assumption that every adjustment and change to the data is perfect and 100% accurate. Otherwise AGW disappears into the “error band” of the adjustments…

    -E.M.Smith ]

  8. DirkH says:

    Just watched Coleman here in Germany. Great Job by Coleman and by you! I hope it makes an impact.

  9. Tim Clark says:

    on January 15, 2010 at 3:45 am E.M.Smith
    Every time I’ve thought I had the “raw” data I’ve been wrong. GIStemp called their input “raw” but it was just the output of NCDC… Repeat…
    So I’d suggest going to the paper for a couple of stations, then following it forward and finding where it changed. Then back up one step and you probably have “raw” (but should do checking!)

    But isn’t there a paywall for the paper data? I can’t find it free.

    BTW, I commended your performance on WUWT, but repeat:

    Very well done–Now for peer-review?

    Lots of questions on WUWT for you. Stocks are down today anyway.

    REPLY: [ You can look up individual sites for free on a web site (don’t have the url at hand just now, though) so picking one or two works; it’s the whole set you pay for… I had stayed up late, after “showtime”, so that folks’ comments would get through ‘moderation’ and so that the inevitable rude and vile language would not. Eventually sleep comes. So I’m just now “up”. As soon as I’ve caught up here, I’m off to WUWT for Q/A. Oh, and also now you know why sinking time into “peers” was low on my list of “to do” ;-) especially given that the “peers” were the ones suppressing dissenting papers. “When you know the game is rigged, play a different game. -E.M.Smith” so I did a “change up” on them. Public Review. They had (and have) no control of that. Maybe now that the “peer” review process is getting a bit of a clean up post ClimateGate it could be valid again…

    Maybe now it would be worth formal publishing, though… We’ll see what tomorrow brings. It may well end up being a trade off between “trading for a living”, “a ‘real job’ somewhere”, and “climate research for free with no money for rent”. (They are talking layoffs where my spouse works so we might become a “no paycheck” family instead of a “one paycheck and trading”… and as you noticed, the markets are flat and listless. Very hard to make any money out of them without risky options trades (sell calls and call spreads, bet on ‘not much movement’). Welcome to California Cap and Tax — and economic implosion.)

    At any rate, IMHO, the “heavy lifting” part of the investigation is mostly done. There is a clear path toward where there are “issues”, and with some visibility lots of folks can now explore it. If it is me, that would be great ( I seem to have a knack for investigation ;-) but if it is someone else, that’s fine too. In either case, it gets done. -E.M.Smith ]

  10. Pingback: Climate change - anthropogenic or not? - Page 13 - PriusChat Forums

  11. vjones says:

    Very Well Done!!

  12. kuhnkat says:

    Just watched the 4th part of the KUSI report.

    THANK YOU!!!!!

    All 5 parts of the show here:

    http://www.kusi.com/weather/colemanscorner/81583352.html

  13. jazznick says:

    Excellent work – moves things along a bit faster.
    CRU investigation was going to take 3 years here in UK!!

    Hansen has already denied anything amiss.

    What’s next !!?? – see you in court ??

    =============================
    Looks like Roger Revelle may have a lot to answer for;
    sounding off about Global Warming without figures to back it up, that just wouldn’t happen today would it Mr Mann ;-)

  14. Pingback: “Revised raw data” at Heliogenic Climate Change

  15. As per the above, an outstanding job on the KUSI report, sir! If there’s anything we can do over at climategate.com just say the word!

  16. Tim Clark says:

    As soon as I’ve caught up here, I’m off to WUWT for Q/A. Oh, and also now you know why sinking time into “peers” was low on my list of “to do” ;-) especially given that the “peers” were the ones supressing dissenting papers.

    Understood.

  17. ruhroh says:

    Hey Chief;

    Simply Outstanding!

    I’m clearly old school, but I am having a lot of trouble keeping track of which begat what;

    Maybe a text ‘flow-chart’ of the various FLA and FLA,
    hmm, I guess that would be 4LA and 5LA, “”data”” sets,
    would help those of us with intermittent and at-best-peripheral engagement.

    Who’s on first?

    ABCDN + EFGDN => HIJDN etc.

    I’m noticing that symbols like + and = have meanings that don’t capture the sausage-grinder aspect of the process betwixt input and output.

    I still think that Flowcharts would help folks like me anyway.
    Maybe some genius knows of a powerful free data-flow-diagram tool on the www somewhere, like that DIYMAPS thing.

    Maybe flowcharts are like sliderules… another lost art…

    RR

  18. Pingback: Nude Scientist – Issue 2 « TWAWKI

  19. Rod Smith says:

    Congratulations on the great job for the KUSI thing.

    Short — maybe irrelevant — comment. Back in the 70’s Asheville had 23 warehouses full of what I’ll call “original paper,” from US sites. The last I heard (some time ago) is that they now only keep the original stuff for five years. Whether it is/was ever put on microfilm is the question.

    I suspect some of this will come out in the near future. Hang in there, and don’t let them “grind you down.”

  20. Tony Hansen says:

    Just wondering about the base period.
    Has the value of the GISS base period (1951-1980) drifted over time?
    Is it a fixed base or a floating base?

    REPLY: [ Fixed in the public software. On the GIS website there is an option to play with the baseline that I’ve seen in a posting somewhere, but that is not what is used for all the official pronouncements of Doom in Our Time… -E.M.Smith]

  21. juanslayton says:

    Maybe you can give me a pointer or two. I have compiled a list of replacement stations for USHCN (stations that have been changed since the inception of surfacestations.org and do not appear in the gallery):
    http://members.dslextreme.com/users/juanslayton/changes.txt
    Although the original stations appear in the total giss list at
    http://data.giss.nasa.gov/gistemp/station_data/v2.temperature.inv.txt ,
    none of the replacement stations do. Has GISS assigned a station number to any of these, and if so, where can I find it? (I notice that the one replacement station for which Anthony has put up a gallery album [Cushman Powerhouse] does not carry the usual GISS id number.)

    REPLY: [ Probably a bit more complicated than you are hoping for, but here goes. Under the GIStemp tab up top, down in the “Geek Corner” you will find links to get to the ‘data download’ portion. The USHCN and GHCN (and now USHCN.v2) data sets all have “inventory” files that list the stations. If a change happened that was only in a station in the USA and after May 2007, and not in GHCN: Then it would not have been in GIStemp until December (when they started using USHCN.v2). So you can look in the base data (GHCN, USHCN, USHCN version 2) and compare, or go back to GISS now (since they ought to have the USHCN.v2 sites showing up now.) BTW, GHCN and USHCN use different numbering schemes for the same sites. GIStemp has a translation table in STEP0/input_files. So I suspect that Anthony is using the WMO number from USHCN while GHCN and GIStemp use the GHCN StationID. -E.M.Smith ]

  22. Tony Hansen says:

    With the ‘raw’ numbers being so fluid, would it even be possible to know or be sure about anything?

    From the email Makiko Sato sent Hansen and Ruedy –
    14/8/2007 – the gap between 1934 and 1998 has shifted by 0.556 degrees (how much warming has there been again?). Plus there are multiple runs.
    Are fresh adjustments being brought in for each/most/some new run/s? Is the released code only valid for the month of release?
    Plus previous run data may not have been kept. Jim says he thinks it might be good to save a copy each year – when it looks like the value for 1934 changes each run – I guess that means each month – why not save a copy of each run?
    Mumble, mumble, bloody grumble… :(

    REPLY: [ IMHO, between the “fluidity” in the input data (I have trouble calling it “raw” anymore) and the gigantic error bands (that are never stated…) and the complete lack of industry standard things like benchmark results and QA test suite results… and especially the constantly changing “adjustment” suite applied (a new one is coming for GHCN in just a month or so…) along with the gigantic deletion of thermometers from cold places and… The only thing I can say with certainty is that we can say absolutely nothing about our temperature history with certainty. Sad, but true. -E.M.Smith ]

  23. E.M.Smith says:

    @Tony Hansen

    Since GIStemp uses dynamic selection of records for various things, each time you run the program with even minor data changes you can get different results, and those results include data adjustments.

    That’s the problem you get into when you keep rewriting the past. Keeping track of all the re-writes. And yes, every month when new data are added you will get a different past created.

    One example: In STEP2 there is a program that throws away any record shorter than 20 years. So each month as stations age and get longer records, some ‘short’ record will age enough to be kept. This record may now be used as part of the UHI adjustment for any / all stations in a 1200 km radius.

    It’s almost like a random number generator for the low order temperature bits. Each run a different result, step right up and place your bets! Hurrry Hurry Hurrry, a new run is about to begin!!!
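    As a toy illustration of that threshold effect (the real STEP2 code is Fortran, and whether it counts calendar span or years actually containing data is a detail glossed over here; the station years are invented):

```python
# Toy version of a "drop records shorter than 20 years" filter, showing
# how a station flips from discarded to kept as its record ages.
# The 20-year cutoff matches the STEP2 description; the years are invented.

MIN_YEARS = 20

def kept(first_year, last_year):
    """Keep a station only if its record spans at least MIN_YEARS."""
    return (last_year - first_year + 1) >= MIN_YEARS

print(kept(1991, 2009))  # False: 19 years, dropped this run
print(kept(1991, 2010))  # True: 20 years, kept in next year's run
```

    Once kept, that newly eligible record can start feeding the UHI adjustment of every station within 1200 km, which is exactly how each monthly run rewrites the past a little.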

  24. boballab says:

    Tony Hansen
    Just wondering about the base period.
    Has the value of the GISS base period (1951-1980) drifted over time?
    Is it a fixed base or a floating base?

    REPLY: [ Fixed in the public software. On the GIS website there is an option to play with the baseline that I’ve seen in a posting somewhere, but that is not what is used for all the official pronouncements of Doom in Our Time… -E.M.Smith]

    This is what I have gathered from the CRU emails about baselines. Right now the WMO has set 1961 – 1990 as the baseline for the IPCC. When they did this I don’t remember offhand; I just remember that from the emails. Also from the emails, at that time CRU shifted their baseline away from the 1951 – 1980 one that GISS uses. Changing baselines can make a huge visual difference in presentation if the map is presenting anomalies and not trends. Here is a link to the NASA site for the map display:
    http://data.giss.nasa.gov/gistemp/maps/

    Also I have started a little yahoo group that has a few files in it. I’m trying to gear it towards people that are just now starting to look for answers about this type of stuff. One of the things I did was make a primer that has links to where the databases are, who compiles them, and who uses them for what. Also it can be used to help parents teach their kids, including a simple project concept for a kid to do.
    http://tech.groups.yahoo.com/group/LaymansGuide/
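    The visual effect is easy to demonstrate: an anomaly is just a temperature minus the mean over the baseline years, so switching baselines shifts every anomaly by a constant while leaving year-to-year differences untouched. A toy sketch with invented temperatures (and two-year stand-ins for the real 30-year baselines):

```python
# Toy demo: a different baseline period shifts all anomalies by a
# constant but does not change the trend. Temperatures are invented.

temps = {1955: 14.0, 1975: 14.2, 1985: 14.4, 2005: 14.8}

def anomalies(temps, base_years):
    """Anomaly for each year = temperature minus the baseline-period mean."""
    base = sum(temps[y] for y in base_years) / len(base_years)
    return {y: round(t - base, 2) for y, t in temps.items()}

a_5180 = anomalies(temps, [1955, 1975])  # stand-in for a 1951-1980 baseline
a_6190 = anomalies(temps, [1975, 1985])  # stand-in for a 1961-1990 baseline

print(a_5180[2005], a_6190[2005])  # prints 0.7 0.5 -- same data, different look
```

    Same data, same warming, but the map painted against the later baseline looks half as alarming for that year, which is why the choice of baseline matters for presentation.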

  25. juanslayton says:

    Thanks. I’ll give it a try.

  26. M. Simon says:

    Sad, but true, you have to demonstrate Alpha Dominance sometimes to be a good “cat herder”, especially if you don’t look like much of a cat… -E.M.Smith

    War story:

    I was taking a class in Heat Transfer and Fluid Flow in the US Navy Nuke Power School at Mare Island taught by a UC Berkeley guy (he was an officer in the geek Navy). Sharp guy. Good at explanations to Navy guys about 20 points in IQ lower than him. And they were no dummies.

    I was sitting in the back of the class reading motorcycle magazines (I had a Triumph 650 at the time – sweetest handling bike I ever owned) and keeping half an eye on his three blackboards of work (you know physics geeks) when he thought he had me. And he did the usual, “Simon, what is the next term in the equation?” He was going to put a stop to my goofing off in class and setting a bad example for the rest of the sailors.

    So I said, “I can’t give you the next term” and you could tell he thought he had me, “because two boards back you made an error and all the subsequent equations are wrong.” He looked and by golly I was right.

    From then on he let me read in peace.

  27. Chiefio, I wanted to touch base with you to ensure that the work we do at climategate.com dovetails efficiently and productively into your highly important contribution, so we can help get your message to the widest possible audience. I was pleased that my recent article, based on your fine analysis, drew in new readership, and questions have been raised as to how to mobilise enthusiastic readers keen to offer their services voluntarily.

    Our site is growing quickly and covers a broad spectrum from diehard climate followers to newbies looking for reliable climate info. We want to give both a ready primer to the issue as well as a regularly updated page for well-informed readers. In this regard I would be most grateful if you would, when not inundated, point me, a jobbing writer with far, far more legal than science expertise, in the right direction.

    I’ve just posted a response on TWAWKI regarding those global dropped sites that aren’t included in Anthony Watts’s United States study and, as you highlight, the issues are far graver globally where reliable info is scant. I’m hesitant to give further advice until I’ve discussed this with you.

    Therefore, I’d be most grateful if you could advise me where to focus my efforts so I can help support your work to the best of my ability. I’d also like to write further articles on our site as we follow your investigations.

    Many thanks and keep up the great work.
    John

  28. juanslayton says:

    “One example: In STEP2 there is a program that throws away any record shorter than 20 years. So each month as stations age and get longer records, some ’short’ record will age enough to be kept. This record may now be used as part of the UHI adjustment for any / all stations in a 1200 km radius.”

    Am I correct in assuming that when they replace one station with another to form a single continuous record, the 20 year standard will be applied to the sum of the two service periods?

    Is there any accessible record of adjustments for specific stations? For example, I would very much like to see what adjustments were made when LOA, UT (see photo at http://gallery.surfacestations.org/main.php?g2_itemId=69917 ) was replaced by SALINA (see http://members.dslextreme.com/users/juanslayton/salina.JPG and note the air conditioners.)

  29. juanslayton says:

    Hmmm. Not sure why I didn’t get a link on that last url. Let’s try this:

  30. E.M.Smith says:

    The “homogenize” and “infill” process is a bit complex. It may or may not stitch two short records together to make one “long enough”, and if a station or equipment is replaced, merging it with a ‘nearby’ station to make “one long record” will depend to some extent on how the stations are numbered. For example, there is “the curious case of Calcutta” where two records for the same place have the older segment tossed out (for reasons only a detailed following of the code will reveal, but probably that ‘less than 20 years’ filter) yet others get “homogenized” into one record that is then treated as a single long record. It is somewhat “haphazard”…

    https://chiefio.wordpress.com/2009/10/23/gistemp-the-curious-case-of-calcutta/

    One of the consistent failings I see in the way thermometers are handled is that there is no “audit trail” kept. Each month when new data are added, it is a new run and with new results… so “what happened” will depend on which month you are talking about ….

    Yes, the “thermometer sands of time” shift that much in GIStemp land. Whole decades of time for individual thermometers just come and go, and often with no reason why …

  31. E.M.Smith says:

    John O’Sullivan

    Chiefio, I wanted to touch base with you to ensure that the work we do at climategate.com dovetails efficiently and productively into your highly important contribution, so we can help get your message to the widest possible audience. I was pleased that my recent article, based on your fine analysis, drew in new readership, and questions have been raised as to how to mobilise enthusiastic readers keen to offer their services voluntarily.

    Well, one really good thing for new folks would be to use their fresh point of view to clarify where a “primer” is weak, has gaps, or needs the jargon squashed out of it.

    I have a ‘basics of what is wrong with AGW’ posting, but it is really just a rough check list at this point:

    https://chiefio.wordpress.com/2009/07/30/agw-basics-of-whats-wrong/

    I would envision taking that checklist, adding any missing bits for completion, and making sure there is adequate “bridge” material to connect one “issue” to the next (like, oh, “NOAA / NCDC cook GHCN, then it goes to GIStemp, who further cooks it”. It would need the buzz words worked out of it, the connections cleared up, and an explanation of who NOAA are and what kind of changes are in their adjustments. Then a decent bridge to GISS and GIStemp and what it does…)

    But I’m just swamped now. So the “primer” sits.

    Taking someone with good writing skills and having them ‘flesh it out’ and add examples, then using an “editorial board” of newly interested folks to test read it and suggest “What WERE you saying there?” places would probably be a great thing to do. (And a wonderful example of a communal “barn raising” in the computer age, if I do say so myself ;-)

    (SIDEBAR: I have Amish grandparents on one side, so the whole barn raising thing is a cultural touchstone for me… it is typically done by the whole community for the young family just starting out on a new farm. A community gift to the next generation… )

    Eventually it could work up to things like explaining why thermometer deletions in GHCN cause GIStemp to do a poorer UHI adjustment, which induces warming into the anomaly maps… but that would be a ways off ;-)

    What do you think?

    Oh, and feel free to use any of the data, code, and analysis here in any effort to show that AGW is a broken concept. If you find an article of interest, feel free to write your own article using the information found there. A footnote of attribution would be nice.

    Basically, I don’t own the data, and the code is published with “copy left” permissions. So the only bit here that is really “mine” is my own text wrapped around the data and code. If there is any bit where the code is not yet published, it is most likely just because I got pulled off to something else. ALL code I wrote and use in these analysis postings is available. And frankly, liberal quoting of my text is also fine.

    If someday I ever get around to writing a book out of all this or publishing a paper for a journal, it will all be new words anyway.

    Awfully long winded way to say I don’t expect to ever make a dime of money out of this ;-)

    E.M.Smith

  32. mandolinjon says:

    I am conducting a study on climate in New Mexico, specifically for a region east of Albuquerque in the Estancia Basin. From 1900 to 1950 there was sufficient precipitation to grow crops (pinto beans) over a wide area. Today the precipitation is extremely low. My interest is in temperature data for the time period in question. Do any of the temperature databases, like those you showed for Illinois, cover this specific region? If not, can you point me in a direction to get data? Thank you for your contributions with respect to AGW and the apparent fraud.

    REPLY: [ The data are available in 3 places. USHCN (that ends in May 2007, but is not too heavily adjusted) and USHCN Version 2 (that is more heavily adjusted but covers ‘to date’) from NOAA. There is also a subset of the stations kept in GHCN. See here:

    https://chiefio.wordpress.com/2009/02/24/ghcn-global-historical-climate-network/

    where there is a description and links to the NOAA site. -E.M.Smith ]

  33. magicjava says:

    On a related note, I’ve found where the UAH and RSS raw data is stored. It’s at the National Snow and Ice Data Center in the “Data Pool” section of this page:
    http://nsidc.org/data/amsre/order_data.html
    You’ll want the AE_L2A.2 brightness temperatures data.
    http://n4eil01u.ecs.nasa.gov:22000/WebAccess/drill?attrib=esdt&esdt=AE_L2A.2&group=AMSA

    I’ve not yet come across any UAH or RSS source code that processes this raw data. However, there is some general purpose source code for working with this data here:
    http://nsidc.org/data/amsre/tools.html

    The raw data has the following noteworthy properties:

    *) It’s huge. I’d estimate a single day of temperatures will be about 2.5 gigabytes of data.
    *) It mixes binary and text data in a single file. The text data is in hierarchical format, but it’s not XML, JSON, or any other standard format. It uses custom tags to define the hierarchy.
    *) It includes all levels of the atmosphere. Usually you’ll just want the troposphere, so you’ll have to extract that information yourself.
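    If it helps anyone, the hierarchical text portion can be walked with a short parser. This is only a sketch: the GROUP= / OBJECT= tag names below are an assumption based on HDF-EOS style metadata, and the actual AE_L2A headers may use different tags.

```python
# Sketch: walk a GROUP=... / END_GROUP=... style metadata block into a
# nested dict. Tag names are an ASSUMPTION (HDF-EOS ODL style); the real
# AE_L2A files may differ.
def parse_odl(text):
    root, stack = {}, []
    current = root
    for raw in text.splitlines():
        line = raw.strip()
        if not line or "=" not in line:
            continue
        key, value = (part.strip() for part in line.split("=", 1))
        if key in ("GROUP", "OBJECT"):
            child = {}               # open a new nesting level
            current[value] = child
            stack.append(current)
            current = child
        elif key in ("END_GROUP", "END_OBJECT"):
            current = stack.pop()    # close the current level
        else:
            current[key] = value.strip('"')
    return root

sample = """
GROUP=SwathStructure
  OBJECT=Swath1
    SwathName="Low_Res_Swath"
  END_OBJECT=Swath1
END_GROUP=SwathStructure
"""
print(parse_odl(sample)["SwathStructure"]["Swath1"]["SwathName"])  # Low_Res_Swath
```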

    For more information on this data and how it’s collected, see the WUWT post by Dr. Spencer, here:
    http://wattsupwiththat.com/2010/01/12/how-the-uah-global-temperatures-are-produced/#more-15191

  34. Thanks Chiefio. It’s great to have your thoughts on this. We’ve given it the due consideration such a large project deserves and come up with an idea.

    We really loved the analogy you drew with communal “barn raising” in the computer age. Mark, the site owner sees great potential to set up a ‘wiki’ style online resource which encapsulates your sentiments for group involvement. I haven’t seen another climategate wiki type site anywhere (?) so it has real niche potential.

    Mark and I agree a stand-alone site on a new domain he’s bought may be the way to go, e.g. Climategatewiki.com, or a subdomain, e.g. wiki.climategate.com or climategate.com/wiki.

    To avoid the extreme headache of having the warmers mess with it, we feel it needs to be an invite-only wiki. There are many smart people we know who are willing, including the “stars” such as yourself, the ‘great and the good’ so to speak, so we get a melting pot from the science experts, writers, proof readers, editors etc. all making their own specialist contributions.

    Of course, we shall abide by the principles of fair acknowledgement to all concerned and as source for what we draw on from your work we shall always endeavour to appropriately cite and link back to you and all contributors. It’s a key principle we stand by.

    One day you will be acknowledged as a key figure in progressing REAL climate science, just as Steve McIntyre is, to name but one other. Money is always a gratifying reward for great work, but the honour and accolades themselves will draw attention to anyone with great skill and integrity, and from there opportunities will arise.

    regards,
    John

    REPLY: [ Sounds good to me. BTW, not just anybody walks onto an Amish farm and starts sawing on the timbers for the barn. It is a directed process by those ‘in the community’. The analogy holds, with a director or moderator keeping the barn design on track. -E.M.Smith ]

  35. Pingback: UNIPCC WRONG AGAIN! « TWAWKI

  36. DirkH says:

    “on January 17, 2010 at 1:07 pm John O’Sullivan
    […]
    I haven’t seen another climategate wiki type site anywhere (?) so it has real niche potential. ”

    Did you look at Lucy Skywalker’s attempt?

    http://www.neutralpedia.com/wiki/Main_Page

  37. Josh Cryer says:

    data.giss.nasa.gov does not have raw data, because it includes USHCN adjusted (which includes TOD adjustments and possibly UHI). I’m not aware of when USHCN v2 was incorporated into GISS; however, USHCN v2 points out that it lowered the warming by, I believe, 0.10 degrees.

    You can get raw USHCN data here: ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/

    You can get raw GHCN data here: ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2

    You can get raw data in general from the NCDC: http://www.ncdc.noaa.gov/
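    For anyone who wants to poke at those files, the v2.mean format is fixed-width and easy to read with a few lines of code. A sketch, assuming the layout in the v2 README (11-char station ID, a 1-char duplicate flag, a 4-digit year, then twelve 5-char monthly means in tenths of a degree C, with -9999 for missing):

```python
# Parse one GHCN v2.mean record. Layout assumed from the v2 README:
# cols 1-11 station ID, col 12 duplicate flag, cols 13-16 year, then
# twelve 5-char monthly means in tenths of a degree C (-9999 = missing).
def parse_v2_mean_line(line):
    station = line[0:11]
    duplicate = line[11]
    year = int(line[12:16])
    temps = []
    for i in range(12):
        raw = int(line[16 + 5 * i : 21 + 5 * i])
        temps.append(None if raw == -9999 else raw / 10.0)
    return station, duplicate, year, temps

# Made-up sample record in the same layout (May is missing):
line = ("42572289001" + "0" + "1950"
        + "  105  112  134  150-9999  210  245  240  200  160  120   90")
station, dup, year, temps = parse_v2_mean_line(line)
print(station, year, temps[6])  # 42572289001 1950 24.5
```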

    Again, data.giss.nasa.gov has no raw data, as it clearly says on the website.

    “Raw data adjustments” is a very strong claim, guys. I will compare USHCN and GHCN to NCDC and see if they are adjusting their raw data as is claimed here.

  38. Josh Cryer says:

    It came to my attention that data.giss.nasa.gov doesn’t have ‘adjusted’ in the data list, but instead ‘corrected.’ I don’t see GISS using raw unadjusted data, because USHCN comes in three flavors: raw (I believe straight from NCDC), raw + TOD correction, and fully adjusted. GISS probably uses raw + TOD. TOD adjustments really hurt the temperature record over long-term trends. (Hot boxes sitting under the sun in the middle of the day, changing to cool boxes sitting on a grassy area in the middle of the night.)

  39. Josh Cryer says:

    USHCN v2 raw matches NCDC exactly for their Central Park station (I had the data files, so that’s the one I checked). In time I will write a program to automate the checking for me.

    GHCN is a complete failure, though; all of its data for Central Park is in the 20s (20 degrees in August?!). This may be why McIntyre’s analysis of USHCN vs GHCN is so scary. But if you actually look at the number fields it is a nightmare and no one would believe the numbers. GHCN quality control removes Central Park from their data.

    You can read more rambling here: http://www.talk-polywell.org/bb/viewtopic.php?p=32568#32568

    BTW, I got the raw NCDC product from here: http://www7.ncdc.noaa.gov/CDO/cdo

    country -> dataset: surface hourly -> state: NY -> central park

  40. Josh Cryer says:

    I wrote “data.giss.nasa.gov does not have raw data, because it includes USHCN adjusted (which includes TOD adjustments and possibly UHI).”

    No, it includes, well, every adjustment.

    9641C_200907_F52.avg.gz (from the USHCN v2 ftp) matches “raw USHCN data + corrections” on data.giss.nasa.gov ( http://data.giss.nasa.gov/gistemp/station_data/ ).

    It isn’t raw USHCN and it isn’t even raw USHCN + TOB. It’s the whole shebang, missing values added in and biases corrected for.

    Sorry for posting yet again, I’m making some posts to a blog I’m creating, to show how to go step by step to reach the conclusions I’m making. :)

  41. xe136 says:

    Congrats E.M.

    John Coleman has posted on WUWT singing your praises.

    John Coleman: (2010/01/17 at 11:03pm)

    E. Michael Smith is one of the brightest, most communicative persons I have ever interviewed. (I have been doing TV for 55 years and have done thousands of interviews.)
    I wish I could put him on TV for an hour so he could really help us understand the real inner workings of GIStemp and the interrelated layers of NASA/GISS and NOAA/NCDC programs. He has spent a thousand hours or more finding his way through the layers and getting into the detailed systems. I want to know more.

    Meanwhile, I apologize for the honorary Ph.D. for Lord Monckton. It was an error by the Producer that I failed to notice before the program aired. It was not the only error.

    I accept the critical remarks of the writers on this website. I am simply an old-school TV Meteorologist, 75 years old, with failing eyesight that makes reading the prompter tough. I wrote the entire script and accept blame for content errors.

    On the other hand, the program was a huge success as measured by the rules of my world. The 4.6 rating beat 30 Rock and was the highest rating of the ENTIRE WEEK for KUSI. Our website has had over a million hits by people viewing the video. Numerous stations in other markets have asked for re-transmission rights, which are being granted.

    After all these years my skin is hippo thick. Let me have it. I appreciate the points of complaint, will use them to guide me next time, and am not offended by the cuts.

    [Reply: Many thanks for all your hard work on creating the program, John. Congratulations on its success from all at wattsupwiththat]

  42. Nica in Houston says:

    Dear all:
    I want to express my gratitude for your tremendous, uncompensated work. I have a long-standing suspicion of this field, based primarily on my experience in other, non-engineering disciplines, mainly epidemiology and public health.

    I completely lost confidence in the underlying “raw” data this past summer. Short story about how this happened below:

    I started hearing newscasters and weather people describe the summer of 2009 in Houston as historically hot. The NWS also started reporting many record high temperatures during those months. However, I vividly remember the summer of 1980 in Houston, when the daytime highs went well above 100 F in the last days of May, and reached 101 F or higher for more than 70 consecutive days! Into mid-August!

    I said to myself, if I can look at the “raw” data from NWS, I can write a note to the weather people and correct them. I searched on-line data for 3 weather stations for the Houston metro area, and was shocked to find only a handful of days over 100 F for the summer of 1980.

    I proceeded to interview long-time Houston residents, and I asked the following questions:
    1. Were you in Houston the summer of 1980?
    2. Is there anything about that summer that stands out in your mind?

    13 of my informal survey respondents explicitly commented that it was devilishly hot. 3 mentioned the uninterrupted string of 100+ F days (remember, this was in the days BEFORE wind chills and RealFeels), and a typical statement was: “I was 18 years old working construction, and I remember it was so hot we were drinking gallons of water and still were thirsty.”

    I don’t have the time or the skill to go to the Houston Chronicle microfiche archives and dig up the daily weather reports, or to go to the NWS stations to find the paper forms. Therefore, I do not have definitive proof that they’ve cooked the data. Nevertheless, I no longer trust NOAA/NCDC/GISS to provide us with accurate “raw” data. So, thereby hangs a tail, or a tale…

  43. Mike McMillan says:

    If you want the really raw data, scans of the original B-1 forms are online at
    http://www7.ncdc.noaa.gov/IPS/coop/coop.html

    The scans aren’t really up to standards, but it’s fer shure unadjusted.

    I have USHCN version 1 charts for only Iowa, Illinois, and Wisconsin, so when I do Iowa, that will be the last state.

  44. @DirkH
    “ Did you look at Lucy Skywalker’s attempt?”
    Duly noted, thanks. We are feverishly working away to bring out our climategate.com ‘wiki’ and will dovetail it to LS’s work wherever possible. We are not in competition so we will do all we can to encourage the promotion and utilisation of the work of others in this area.

    @Nica in Houston: it’s a valid point that to go through the historical record of all newspaper archives is, indeed, a monumental task. This is something that is best done within a community resource such as a ‘wiki’ where a group of volunteers can add piecemeal data as and when time permits. The value of referring back to the historical record is that it is incontrovertible once it is in place. The effort will be well rewarded when it becomes robust and extensive enough to debunk the ‘homogenized’ falsities relied upon by discredited self-serving organisations such as the IPCC.

  45. DirkH says:

    Hi ChiefIO,
    i found this in Anthony’s inbox:
    “Zeke Hausfather (08:21:27) :

    It’s not normally the side of the debate you cover, but I have a refutation of the whole “dropped stations” in GHCN argument up: http://www.yaleclimatemediaforum.org/2010/01/kusi-noaa-nasa/ ”

    They try to justify the “death of the thermometers” with a kind of reverse logic. I think it’s easily debunked:
    - Even if it’s true (that they will add more temp records, but it takes time), it invalidates the kind of direct comparisons with the past that they make.
    - Even if on average the adjustments are symmetric, the question still arises why their only polar station is the warmest one, the Bolivia effect still exists, etc…

  46. DirkH says:

    Hey, we can even use Zeke Hausfather’s argument! Inadvertently he gave us the blueprint for Hansen’s history rewriting! All they have to do is scan the warmest stations first, then scan data from the colder stations and add it to the temperature record later. This will automatically lead to the “rewriting history” effect, can easily be excused with “we don’t have enough grad students to update the records faster,” and will always “prove” that the latest decade is the warmest.

    Catch 22. This is where the real genius of Hansen lies.

  47. Josh Cryer says:

    Mike McMillan, someone is suggesting that we can’t trust NCDC digitized records, and implying that we should OCR all of those scans again. Sounds like fun.

    DirkH, they write, “If stations had intentionally been dropped to maximize the warming trend, one would expect to see more divergence between surface and satellite records over time as the number of stations used by GHCN decreases.”

    This would be very easy to spot.

    But we know that the stations were dropped because they did not meet GISS standards (20 monthly reports or more, 10 year trend or more). Not because GISS selectively removed them.

    REPLY: [ I don’t see where we know WHY any stations were dropped at all. Only that they were dropped from GHCN. Motivation is a very difficult thing to see… BTW, these are dropped by NOAA / NCDC and NOT by NASA GISS. GIStemp does toss out station records shorter than 20 years, but that happens to the combined GHCN / USHCN data sets and long AFTER NOAA / NCDC have “done the deed” of deletion of recent cold records in GHCN. Also, this article is pointing out that the “adjustments” to USHCN in making USHCN.v2 induce a warming trend. This is not from deletions but from rewriting the past of individual stations. -E.M.Smith ]

  48. wolfwalker says:

    Hi Chiefio,

    I’ve been following this whole AGW/data inconsistencies controversy for quite a while now. Being a skeptic in all ways, I try to question the claims and arguments of both sides. That skepticism has brought up two questions about two of your recent posts on USHCN and GHCN.

    1) You said, in a couple of blogposts and again in your recent interview, that California has only 4 ‘official’ thermometers left, all in lowland areas. However, when I went hunting for a list of USHCN stations, I found this, which in turn linked to this file, which is apparently complete as of May 2008. It shows more than fifty stations in California. What station list did you use to get your count of only four?

    2) On the “Bolivia effect,” you demonstrated clearly that Bolivia has no GHCN stations left at all, so all the data from that cold mountain climate is gone, replaced by extrapolations from surrounding stations. However, I think there still may be one gap in your reasoning. Do you know exactly what algorithm HCN uses for its extrapolations? In particular, do you know if it contains a corrective factor for altitude?

    REPLY: [ As boballab noted, I was talking about GHCN. BTW, until November 15, 2009, GIStemp used USHCN along with GHCN, but USHCN ‘cut off’ in May 2007. So from May 2007 until November 2009 the only data used for California WAS from GHCN. At that time, GIStemp swapped to USHCN.v2, which includes more thermometers, but as noted in another posting, the data are “adjusted” and have more warming slope in them. So they went from “biased from deletions” to “biased by adjustments”… have a nice day… and keep your eye on that walnut shell…

    Yes, GIStemp tries to do an adjustment to the ‘in-fill’ data. It does this via an “offset” calculation from looking at ‘nearby’ stations. I don’t remember seeing a specific altitude adjustment in the code, but frankly, it was a while ago that I last looked at that step. The code is all “up” under the “GIStemp technical and code” category on the right margin. STEP3 does the anomaly and fill-in for the Grid / Box process (and I’m fairly certain there is no altitude adjustment in it). STEP1 does the homogenizing and some fill-in (and is written in Python, which I’m not that great at reading, so IFF I missed a lapse rate calculation I’d expect it to be hiding there), while STEP0 does the “glue together” and merges GHCN with USHCN (and thus, some more blend / fill-in in the USA for different data from the same thermometer) with no altitude adjustment. STEP2 does the UHI (and tosses out shorter-than-20-year records), and it does some more “in-fill” via ‘UHI correction’. It’s a very complicated process. It “corrects” to keep the ‘offset of a mean of averaged data’ constant, not via altitude per se, IIRC.
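    For what it’s worth, the flavor of that “offset” combine can be shown with a toy example (my reading of the reference-station idea, NOT the actual GIStemp code): shift the second record by its mean difference from the first over their overlap years, then use the shifted values to extend the record:

```python
# Toy version of an "offset" combine. Records are {year: annual mean degC}.
# All numbers are made up; this is a sketch of the idea, not GIStemp.
def combine(long_rec, short_rec):
    overlap = sorted(set(long_rec) & set(short_rec))
    # Mean difference over the overlap becomes the offset applied to the
    # second record before it extends the first.
    offset = sum(long_rec[y] - short_rec[y] for y in overlap) / len(overlap)
    merged = dict(long_rec)
    for year, value in short_rec.items():
        merged.setdefault(year, value + offset)
    return merged

a = {1950: 10.0, 1951: 10.2, 1952: 10.1}   # longer, cooler record
b = {1951: 12.2, 1952: 12.1, 1953: 12.4}   # runs about 2 degC warmer
m = combine(a, b)
print(round(m[1953], 1))  # 10.4: 1953 filled in from b, shifted down
```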

    The key thing for me was a rough benchmark I did on the USHCN to USHCN.v2 transition that shows a change in the anomaly from deleting the thermometers from 5/2007 to 11/2009. The fact is that deleting thermometers and relying on fill-in gave artificial warming in that benchmark. It’s diluted (since it’s only the 2% of boxes that are USA that get warmed, and the anomaly is reported for the whole N. Hemisphere in the report), but it is real.

    So we know it has an impact, now it’s just a matter of ‘how big’. And yes, I do need to do a fuller benchmark with the whole hemisphere, but it’s been a bit hard to get GIStemp to run on reduced data sets or selected benchmarks…

    BTW, since I filled a 10 GB disk with various runs and “stuff”, I’m currently doing an upgrade to my “rig”. I’ve got a new Linux up with a 40 GB disk and I’m looking to do a reinstall and validate; then I can get back to doing more ‘research’. This machine is also about 2 x the memory and processor speed of the old box. The downside is that my ability to take a quick look or do a quick report is limited as I’ve got things in pieces and doing admin work 8-{ but in the long run, it will be better and faster.

    -E.M.Smith ]

  49. boballab says:

    Wolfwalker, on your first question you made a simple mistake a lot of people make about datasets. Chiefio’s analysis on California was on the GHCN dataset only, not USHCN. Even though NCDC makes both the GHCN and USHCN datasets, they are not the same. GISS uses both datasets to try and overcome the deficiency in the GHCN dataset in regards to the US.


    GISS Temperature Analysis
    =========================
    Sources
    ——-

    GHCN = Global Historical Climate Network (NOAA)
    USHCN = US Historical Climate Network (NOAA)
    SCAR = Scientific Committee on Antarctic Research

    Basic data set: GHCN – ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2
    v2.mean.Z (data file)
    v2.temperature.inv.Z (station information file)

    For US: USHCN – ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly
    9641C_200907_F52.avg.gz
    ushcn-v2-stations.txt

    For Antarctica: SCAR – http://www.antarctica.ac.uk/met/READER/surface/stationpt.html
    http://www.antarctica.ac.uk/met/READER/temperature.html
    http://www.antarctica.ac.uk/met/READER/aws/awspt.html

    For Hohenpeissenberg – http://members.lycos.nl/ErrenWijlens/co2/t_hohenpeissenberg_200306.txt
    complete record for this rural station
    (thanks to Hans Erren who reported it to GISS on July 16, 2003)

    USHCN stations are part of GHCN; but the data are adjusted for various recording and protocol errors and discontinuities; this set is particularly relevant if studies of US temperatures are made, whereas the corrections have little impact on the GLOBAL temperature trend, the US covering less than 2% of the globe.

    http://data.giss.nasa.gov/gistemp/sources/gistemp.html

    Now here is an example: I just opened my copy of the GHCN station list and randomly pulled out a California station: 42572289001 PASADENA

    Now I open my copy of GHCN mean raw and look at the data for it, and I find that it has a temp record that runs from 1893 to 2006. (My copy of the file is from Dec 2009.)

    Let’s look at another one: 42572290002 CHULA VISTA
    Its data runs from 1918 to 2006.

    The neat thing about this is that you don’t need to take my word or Chiefio’s word for all this; it’s actually simple to check yourself. All you need is a text editor to look at the data files and the station list. If you want to plot the data, a spreadsheet program is needed. If you visit my Yahoo group, I have a Dataset primer with all the links to NOAA, NASA, and the WMO, which is a way to get to various datasets from around the world. It also explains who makes the datasets and who only does analysis.
    http://tech.groups.yahoo.com/group/LaymansGuide/
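    To make that “check it yourself” step concrete, here is a small sketch that scans v2.mean-style lines for a station ID and reports the first and last year with data. It assumes the fixed-width layout from the v2 README (station ID in columns 1-11, year in 13-16); the sample lines below are made up, with IDs matching the Pasadena and Chula Vista examples:

```python
# Report the first and last year present for a station in GHCN v2.mean
# style lines (assumed layout: station ID cols 1-11, year cols 13-16).
def station_span(lines, station_id):
    years = [int(l[12:16]) for l in lines if l.startswith(station_id)]
    return (min(years), max(years)) if years else None

# Made-up sample records: ID + duplicate flag + year + 12 monthly fields.
sample = [
    "42572289001" + "0" + "1893" + "  105" * 12,
    "42572289001" + "0" + "2006" + "  110" * 12,
    "42572290002" + "0" + "1918" + "  150" * 12,
]
print(station_span(sample, "42572289001"))  # (1893, 2006)
```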

    For example, NASA only does analysis, and if you read Chiefio’s posts on their GIStemp program, one of the first things they try to do is pull out any adjustments others made to the data. Yes, they put their own in, but at least they are trying to get back to the original data first.

    As to your second question, Chiefio would have to address it since something like that would show up in the code; however, from what I read on the GISS site about their code, it looks like the answer is no. You can read each step’s explanation from GISS at the link I provided earlier on GIStemp sources.

  50. davidc says:

    The Australian BOM site:

    http://www.bom.gov.au/climate/data/index.shtml

    has monthly data that seems to be untouched. To get station numbers click on “Long record temperature stations” on the right hand side bar.

    When they have no data they indicate that and just leave it out. When a station moves a short distance they might give it a new number, but maybe not. Seems that they might decide by actually looking at the old and new sites. It could be this that upset HARRY_README. The thing that convinces me that this is probably untouched data is that for most places I have looked at, for most months, there is no sign of warming. In the last 20 or so years most show cooling. That’s enough evidence for me that CO2 variations have no major impact.

    To see charts at the bom site click the icon at the top of the columns.

    REPLY: [ Above where I said “Australia BOM was recomputing” I think I ought to have said New Zealand BOM… I’ve since found the story about N.Z. that I THINK was what I was remembering. At any rate, we need to sort out which BOMs have good clean data and work from them, prior to any ‘recomputing’… -E.M.Smith ]

  51. davidc says:

    EM, yes, there were recent reports about adjustments in NZ making up the entire extent of claimed warming. The Australian BOM might do it too in some of their aggregates, but it seems to me that the posted data for individual stations are unmolested.

    I agree it would be good to have a list of safe sites. Then I think people can draw their own conclusions. If you take the time to look at a reasonable sample of sites – say 20 or 30 – and see not much happening, I think that’s more convincing than a trend of 0.16C/decade after all the manipulation required to get a global “mean”. Of course, a few individual sites might have significant local issues, so you need a reasonable sample.

    BTW, thanks for your great work.

  52. Richard Wakefield says:

    You may be interested in reading this analysis I did for temperatures in one location in Ontario.

    I’m currently merging other stations to get Southern Ontario. Then I will do the rest of Canada.

  53. Roger Sowell says:

    test
    REPLY: [ I don’t know why, but this ended up in the SPAM queue. I don’t have anything much set, so I can’t explain it… -E.M.Smith ]

  54. John in L du B says:

    Thanks for all your work, Chiefio.

    As reported yesterday at ICECAP, an explanation for station dropout:

    …as Thomas Peterson and Russell Vose, the researchers who assembled much of GHCN, have explained:

    “The reasons why the number of stations in GHCN drop off in recent years are because some of GHCN’s source datasets are retroactive data compilations (e.g., World Weather Records) and other data sources were created or exchanged years ago. Only three data sources are available in near-real time.

    It’s common to think of temperature stations as modern Internet-linked operations that instantly report temperature readings to readily accessible databases, but that is not particularly accurate for stations outside of the United States and Western Europe. For many of the world’s stations, observations are still taken and recorded by hand, and assembling and digitizing records from thousands of stations worldwide is burdensome.

    During that spike in station counts in the 1970s, those stations were not actively reporting to some central repository. Rather, those records were collected years and decades later through painstaking work by researchers. It is quite likely that, a decade or two from now, the number of stations available for the 1990s and 2000s will exceed the 6,000-station peak reached in the 1970s.”

    Is this at all plausible?

    REPLY: [ I would find it “plausible” that that is what they would like to believe. There ‘are issues’ with the thesis, though. The simplest is the USHCN problem. How can it be a “not actively reporting” problem when NOAA / NCDC make both the USHCN and the GHCN, yet selectively filter the data so only a subset now make it from their right pocket into their left pocket? Then we have Canada saying that they DO report all the stations in Canada and it isn’t THEM dropping the data on the floor. And Russia has accused NCDC of selective listening skills too… There are also the ongoing changes in composition of the data. If it was all a “done in 1990 one time” you would expect the curve to show a one step change. There is a large step change, but it is bounded by curves on each side. Ongoing changes are happening.

    BTW, the notion that the data set was created in 1990 therefore any changes in composition are only related to that date (such as the peak in 1970 being before it was created) completely ignores that there are temperature data in it from the 1800’s that NOAA / NCDC don’t mind changing as their methods change. They have with some frequency ‘diddled the past’ as their data set changes over time demonstrate.

    Finally, I find it ludicrous on the face of it that they talk about it as ‘not real time’ or not “instantly report temperature readings”. Folks, it’s been twenty years since 1990. Even a mule train from Bolivia or a rowboat from Madagascar could have gotten the data here by now… And speaking of Madagascar, their thermometers survive until about 2005, then start dying off. SOMETHING is happening after 1990, it’s just not Temperature Truth that’s happening. And oh, BTW, http://www.wunderground.com/ has no problem finding Bolivia … so the data are, in fact, being reported “real time”; it’s just that in some cases it is falling on deaf ears.

    But hey, if they want to use that excuse, I’m “good with that”. The necessary consequence of that line of reasoning is this: “The GHCN data set is obsolete by 20 years. The ongoing maintenance of the data have been botched. The result is a structural deficit that makes it wholly unsuited to use in climate analysis and that makes any statements about the ‘present’ vs. the ‘baseline’ useless due to lack of recent data comparable to that baseline. All that research based on the GHCN must be discarded as tainted by a broken unmaintained data set.” And the corollary is that we ought to fire NCDC and just contract the data set out to Wunderground. They have fast and complete access to the data…

    So if they want “to go there”, then I’m “good with that”… but I don’t think they will like the result… All their excuse does is change the “issue” from a sin of commission to a sin of omission. Not exactly a big advantage. They still “own” the data set and they still “own” the brokenness… -E.M.Smith ]

  55. boballab says:

    Amazing we spent all that money on “climate change” and the NCDC’s excuse is scientists in Canada either:

    1. Never heard of Laptop Computers
    2. Never knew you can buy a cheap Laptop for $500
    3. Never learned how to operate a laptop.
    4. Never heard of a Scanner before.

    I think I made my point; even in remote places in Africa they know what a laptop is. So that whole line about the time it takes to scan in a paper copy and make a message in the computer age is something you see come out the south end of a northbound steer.

    The UN and the WMO have a special network for reporting this type of information. It’s called the Global Telecommunications System, and reporting stations send monthly CLIMAT reports over it. Here are some excerpts from the WMO:

    2.6.2 Logging and reporting of observations
    Immediately after taking an observation at a manual station, the observer must enter the data into a logbook, journal, or register that is kept at the station for this purpose. Alternatively, the observation may be entered or transcribed immediately into a computer or transmission terminal and a database. Legislation or legal entities (such as courts of law) in some countries may require a paper record or a printout of the original entry to be retained for use as evidence in legal cases, or may have difficulty accepting database generated information. The observer must ensure that a complete and accurate record has been made of the observation. At a specified frequency (ranging from immediately to once a month), depending on the requirements of the NMHS, data must be transferred from the station record (including a computer database) to a specific report form for transmittal, either by mail or electronically, to a central office.

    Does the NCDC seriously expect us to believe that Canada doesn’t use a computer network? I mean, I think they know what the Internet is up there.

    Some national climate centers will require the station personnel to calculate and insert monthly totals and means of precipitation and temperature so that the data may be more easily checked at the section or central office. In addition, either the climate center or observer should encode data for the CLIMAT messages (WMO/TD‐No.1188), if appropriate. WMO has software to encode the data. The observer should note in the station logbook and on the report forms the nature and times of occurrence of any damage to or failure of instruments, maintenance activities, and any change in equipment or exposure of the station, since such events might significantly affect the observed data and thus the climatological record. Where appropriate, instructions should be provided for transmitting observations electronically. If mail is the method of transmission, instructions for mailing should be provided to the station as well as preaddressed, stamped envelopes for sending the report forms to the central climate office.

    Now this next quote is illuminating

    A major step forward in climate database management occurred with the World Climate Data and Monitoring Programme (WCDMP) Climate Computing (CLICOM) project in 1985. This project led to the installation of climate database software on personal computers, thus providing NMHS in even the smallest of countries with the capability of efficiently managing their climate records. The project also provided the foundation for demonstrable improvements in climate services, applications, and research. In the late 1990s, the WCDMP initiated a CDMS project to take advantage of the latest technologies to meet the varied and growing data management needs of WMO Members. Aside from advances in database technologies such as relational databases, query languages, and links with Geographical Information Systems, more efficient data capture was made possible with the increase in AWS, electronic field books, the Internet, and other advances in technology.

    http://www.wmo.int/pages/prog/wcp/documents/Guide2.pdf
    Looks like that doesn’t square with what NOAA is peddling.

  56. Pingback: TWAWKI » Nude Scientist – Issue 2

  57. Ruhroh says:

    Hey Chief;

    A commenter, Troyca, over at CA elicited an interesting revelation from Menne regarding the Pairwise Homogenization Algorithm (PHA) used to produce USHCN v2:

    “1) The PHA is run separately on Tmax & Tmin series to produce the USHCNv2 dataset. We then compute Tavg as (Tmax+Tmin)/2.”

    Wow, what’s OK about that?
    From
    http://troyca.wordpress.com/2011/01/03/more-on-running-the-ushcnv2-pha-response-from-dr-menne/
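    A toy example of why that matters: whatever separate net shifts the PHA puts into the Tmax and Tmin series flow straight through into the derived Tavg (the shift sizes here are made up, purely for illustration):

```python
# Toy numbers, NOT real PHA output: shift Tmax and Tmin "homogenized"
# series separately, then form Tavg = (Tmax + Tmin) / 2 and compare
# against the Tavg of the raw values.
tmax_raw, tmin_raw = 25.0, 12.0
tmax_adj = tmax_raw + 0.3   # hypothetical shift applied to the Tmax series
tmin_adj = tmin_raw - 0.1   # hypothetical shift applied to the Tmin series

tavg_raw = (tmax_raw + tmin_raw) / 2
tavg_adj = (tmax_adj + tmin_adj) / 2
print(round(tavg_adj - tavg_raw, 2))  # 0.1: the net shift lands in Tavg
```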

    He also describes what it took to compile and run the Fortran under his Linux, I think.

    RR

Comments are closed.