GIStemp – No new source code?

It’s been a while since I did a GIStemp posting. Partly as Hansen had become such a bad joke it wasn’t worth it. Partly as the focus had moved to the upstream data diddling, what with GHCN V3 being significantly “warmer” than Version 1 had been.

But I have a new toy, the Raspberry Pi, and I wanted to play with it. See what it could do… Could it, in fact, compile and run GIStemp?

Now if planning to do a new port, it is worth starting with a fresh copy of the code, so I went to NASA and downloaded the source code from the approved link:

http://data.giss.nasa.gov/gistemp/sources/

It all looked rather familiar, and I didn’t think much of it. The unpacking went well, as did the set-up of a file system for it (it fits nicely on a 2 GB SD card ;-). I looked up my “Make File” posting and proceeded to type it in again, by hand. (Why? Well, to check for any “differences” with the “new” GIStemp that runs on GHCN Version 3, of course…)

It went fairly well. Nothing seemed changed, so I figured it must be the same names, but maybe different ‘insides’ for V3…

A couple of test compiles showed the FORTRAN compiler (gfortran) was much more accepting. The prior issue of making assignments during the type declaration didn’t present a problem, and all the modules compiled without needing to pay attention to F77 vs F95 nearly so much.
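
As a point of reference, here is a minimal free-form illustration of the kind of declaration-time assignment at issue. It is an assumed example, not a line from GIStemp: strict F77 compilers reject the syntax outright, while gfortran’s defaults take it in stride.

! Assumed illustration (not GIStemp code): a value assigned right in the
! type declaration, legal from Fortran 90 on but foreign to strict F77.
program decl_init_demo
  implicit none
  real :: tmiss = -9999.0   ! initialized in the declaration itself
  print *, tmiss
end program decl_init_demo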

All good.

Fortran is quite fast on the Raspberry Pi with a 30 Mb/second SD chip!

I managed to have “make” compile all the FORTRAN to binaries. I even test ran a few of them and they worked fine. So next I put the GHCN data where it belonged (but noted that the code still talked about V2… but, I thought, maybe they just didn’t want to change all the file names from V2 to V3 and change the program names and change…) So I put a copy of the V3 GHCN data in the “v2.mean” input file (where it always had been put), and told it to run…

Several steps using USHCN ran fine. Then, on the first step that reads in GHCN, it tossed its cookies with a Format Error in the data stream.

Looking more closely, the code looked VERY familiar. It looked like it was just the old V2 code still. So I “checked it out”, and no, I didn’t have an old copy; it was a fresh download. I’ve only had the cards a couple of months, and the data and software were direct downloads to it. The date stamp on the tar archive on my card is May 16, 2013, so the download was just 3 days ago. Just to be sure, I did a ‘tar -xvf GISTEMP.sources.tar’ and captured the date stamps on the files IN that archive. (The command ‘tar’ is the Tape ARchiver; ‘x’ means extract, ‘v’ says be verbose about it, and ‘f’ says read the archive from the named file.)

Here’s a screen capture. This is Step0 where the initial data unload and mix is done. If any V3 changes were made, it ought to show here:

GIStemp source code 16 May 2013 still not GHCN V3

You ought to be able to click on that to ‘embiggen’ it and read the dates. It’s mostly 2009 and 2010. The newest date stamp in it is 2011, for one bit of code not related to GHCN.

What’s The Deal?

So I’m realizing this is the same old Version 2 software. But if you go to NCDC to download the Version 2 data to use with it (as I was mostly interested in testing the Raspberry Pi, so figured I’d give up on V3) you find out that the GHCN Version 2 temperature data is now gone.

ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2/

grid 5/15/2013 4:00:00 PM
source 12/23/2010 12:00:00 AM
File:v2.country.codes 11 KB 1/25/2002 12:00:00 AM
File:v2.max.Z 6010 KB 2/7/2012 12:00:00 AM
File:v2.prcp.Z 25445 KB 5/18/2013 5:00:00 AM
File:v2.prcp.failed.qc.Z 163 KB 5/18/2013 5:00:00 AM
File:v2.prcp.inv 1267 KB 1/25/2002 12:00:00 AM
File:v2.prcp.readme 4 KB 9/27/2011 12:00:00 AM
File:v2.prcp_adj.Z 5357 KB 5/18/2013 5:00:00 AM
File:v2.read.data.f 11 KB 12/14/2006 12:00:00 AM
File:v2.read.inv.f 3 KB 12/14/2006 12:00:00 AM
File:v2.slp.country.codes 11 KB 10/3/2002 12:00:00 AM
zipd

The precipitation data is still there, along with the MAX temperatures, but MIN and average are gone.

OK, I’ve got a few copies of the old V2 “squirreled away” and can get them unpacked in a few days to finish my Raspberry Pi testing. But that’s not the point… (Well, it is for me, but there’s a bigger point…)

Are we not supposed to have access to both the data AND the software being used to pronounce this yet again “The Hottest Ever!!!”
(even as snow covers the UK and Canada… )?

This is just so broken.

They claim to be running GIStemp with Version 3 GHCN data at their web site.
But are they? Is “Trust me.” good enough from the government?

http://data.giss.nasa.gov/gistemp/

Record high global temperature during the period with instrumental data was reached in 2010. After that paper appeared, version 3 of the GHCN data became available. The current analysis is now based on the adjusted GHCN v3 data for the data over land.

Are they hiding what they are doing? Preventing a proper software audit? Normally I’d attribute it to just being busy with other stuff and, what with Hansen being on the skids, nobody really caring about “his software” anymore. But in light of nefarious things with the Associated Press, State Department / Benghazi, EPA, and even the IRS being politicized; well, let’s just say that it sure looks like “assume the worst” and you get closer…

At this point, I don’t know if I ought to be upset that there is no GHCN Version 3 code to validate / check; or be happy that now with Hansen out of the picture his “products” are being allowed to die on the vine. I’m all for just ’round filing’ GIStemp and calling it broken history. But I’m not willing to accept having it run, claiming “Hottest Ever!!!” and no way to audit the thing to see what was changed.

As of now, GIStemp is a black box with no utility as it is unverifiable.

About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present "hot buttons" are the mythology of Climate Change and ancient metrology; but things change...

29 Responses to GIStemp – No new source code?

  1. Richard Ilfeld says:

    For some folks, EVERYTHING is political. This seems to be true of our current government, and, more troubling, much of our current civil service.
    When the narrative and the agenda are more important than the civic and social responsibility, it is time to clean house. Suggest replacement is more likely to succeed than “reform”, which is usually the same scoundrels switching chairs whilst canning a couple of unfortunate underlings.

  2. Petrossa says:

    Now that Hansen is gone I guess they are busy cleaning up after him. Since nobody understands the mess he left behind, they must have made an even greater mess of it. AssumeStupidityNotMalice

  3. BobN says:

    This is SAD. If it’s a black box it’s worthless. Time to ignore their data; I think it’s meaningless.

  4. R. de Haan says:

    This article is worth a repost at WUWT.

  5. R. de Haan says:

    Richard Ilfeld says:
    19 May 2013 at 12:48 pm
    “For some folks, EVERYTHING is political. ”

    The term “political” no longer covers it.
    This is criminal. Period.

  6. E.M.Smith says:

    Well, I got STEP0 to run to completion. The only really “odd bit” is that the “tail” command on the Raspberry Pi doesn’t understand the “+100” option (telling it to skip 100 lines, then print the rest), so since the Hohenpeissenberg file had 224 lines in it, I just changed it to “tail -124” and moved on. At some point that kludge needs a fix.

    So at this point, unless the Python in Step1 is a pain somehow, it looks relatively easy to run GIStemp on a Raspberry Pi. Since it ships with Python based games on the desktop, I’m pretty sure the Python will not be a problem…

    I’m presently running on a copy of GHCN.v2 from Dec 2009 that was in my archive…

  7. Bob Koss says:

    The Giss methodology is substantially different from how they handled the data from Ghcn v2. Instead of combining all the raw datasets from a specific station into one dataset (averaging where there are multiple values), they allow that to be done by Ghcn. Ghcn then further adjusts the combined dataset into an adjusted dataset. This becomes the dataset Giss uses when making their homogeneity adjustments.

    The pathetic method Ghcn uses when combining datasets from one location I find astonishing. Here is how they say they do it.

    A simple algorithm was applied to perform the merge. The algorithm
    consisted of first finding the length (based upon number of non
    missing observations) for each of the time series and then
    combining all of the series into one based upon a priority scheme
    that would “write” data to the series for the longest series last.

    Therefore, if station A had 3 time series of TAVG data, as follows:

    1900 to 1978 (79 years of data) [series 1]
    1950 to 1985 (36 years of data) [series 2]
    1990 to 2007 (18 years of data) [series 3]

    The final series would consist of:

    1900 to 1978 [series 1]
    1979 to 1985 [series 2]
    1990 to 2007 [series 3]

    What this ends up doing in many cases is over-writing valid values from a shorter dataset with whatever is present in the longer one. This even includes over-writing actually valid values with -9999. I’ve seen some cases where entire years were present in a shorter dataset and entirely wiped out with the value -9999 simply because the longer dataset didn’t record the data.

    Ghcn then takes this atrociously combined data and adjusts it according to some algorithm and this is what Giss now uses when performing their homogeneity adjustment.
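
    To make the overwrite concrete, here is a minimal free-form Fortran sketch of that “longest series written last” merge. It is an illustration only, with invented station values and the -9999 missing flag; it is not NOAA’s code.

    ! Illustration only (not NOAA's code): the "longest series written last"
    ! merge described above, with -9999 as the missing-data flag and made-up
    ! station values in tenths of a degree.
    program merge_demo
      implicit none
      integer, parameter :: MISS = -9999
      integer :: a(1898:1905), b(1898:1905), merged(1898:1905), yr

      a = MISS
      a(1898:1901) = (/ 120, 118, 121, 119 /)             ! shorter series: 4 valid years
      b = MISS
      b(1900:1905) = (/ 122, MISS, 117, 116, 118, 115 /)  ! longer series: 5 valid years, 1901 never recorded

      ! Shorter series written first, longer series written last over its whole
      ! span, so b's -9999 for 1901 stomps on a's perfectly valid 119.
      merged = MISS
      merged(1898:1901) = a(1898:1901)
      merged(1900:1905) = b(1900:1905)

      do yr = 1898, 1905
        print '(i4,1x,i6)', yr, merged(yr)                ! 1901 prints as -9999
      end do
    end program merge_demo

    Run as-is it reports -9999 for 1901 even though the first series measured that year, which is exactly the complaint above.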

    Then there is the fact that Giss used to end up with a single homogeneity adjusted dataset for each location. They accomplished this by doing the combining before homogeneity adjustment. If you think that is still true, I have a bridge for sale if you are interested. There are 51 stations, all in the southern hemisphere, which now have multiple homogeneity adjusted datasets. This comes about due to Ghcn apparently not combining the highest dataset number into its combined dataset in those instances. Giss makes a separate homogeneity adjusted dataset out of it. Most of these stations appear to be rural stations where running their homogeneity adjustment doesn’t alter the data.

    Unsurprisingly, the multiple homogenized datasets at a location end up with different trend slopes. If, as I suspect, they are not being further combined into one dataset before use in sub-box creation, those locations exert out-sized influence compared to those stations with only one dataset in use. Call me cynical, but frankly I doubt this change was done to improve accuracy. More likely it tends to increase SH warming. I haven’t checked though. It’s beyond my declining abilities and interest in my advanced years.

    In a while I’ll put up a list of the stations with multiple homogeneity datasets in another comment.

  8. Bob Koss says:

    These are the 51 stations with multiple homogeneity datasets. The last digit (dataset number) has been removed.
    14168816000
    14168906000
    14168994000
    14361997000
    14361998000
    14789042000
    15761996000
    30188963000
    30485469000
    30485585000
    30485934000
    30489056000
    30489057000
    31688890001
    31688890002
    31788903000
    50194600000
    50194601000
    50194962000
    50194998000
    50793945000
    50793987000
    50793994000
    52591843000
    52891938000
    52891958000
    53791960000
    70089001000
    70089002000
    70089009000
    70089022000
    70089034000
    70089050000
    70089055000
    70089059001
    70089062000
    70089063000
    70089066000
    70089512000
    70089532000
    70089542000
    70089564000
    70089571000
    70089592000
    70089606000
    70089611000
    70089642000
    70089662000
    70089664000
    70089665001
    70188968000

    If you are really interested in further investigation, this might be a ‘dig here’ place to start for a quick analysis of the effect. There are four stations with triple datasets. I have included their neighbors within 1200 km.

    0 km (*) Gough Island 40.4 S 9.9 W 141689060000 < 10,000 1955 – 2013
    0 km (*) Gough Island 40.4 S 9.9 W 141689060007 < 10,000 1956 – 2006
    0 km (*) Gough Island 40.4 S 9.9 W 141689060008 < 10,000 1956 – 2013
    424 km (*) Tristan Da Cunha 37.0 S 12.3 W 141689020000 < 10,000 1942 – 1987
    ————-
    0 km (*) Marion Island 46.9 S 37.9 E 141689940000 < 10,000 1948 – 2013
    0 km (*) Marion Island 46.9 S 37.9 E 141689940007 < 10,000 1948 – 2006
    0 km (*) Marion Island 46.9 S 37.9 E 141689940008 < 10,000 1948 – 2013
    1069 km (*) Crozet 46.4 S 51.9 E 143619970000 < 10,000 1970 – 2013
    1069 km (*) Crozet 46.4 S 51.9 E 143619970007 < 10,000 1965 – 1998
    ————-
    0 km (*) Macquarie Isl 54.5 S 158.9 E 501949980000 < 10,000 1948 – 2013
    0 km (*) Macquarie Isl 54.5 S 158.9 E 501949980007 < 10,000 1948 – 2002
    0 km (*) Macquarie Isl 54.5 S 158.9 E 501949980008 < 10,000 1948 – 2013
    707 km (*) Campbell Isla 52.5 S 169.2 E 507939450000 < 10,000 1941 – 2002
    707 km (*) Campbell Isla 52.5 S 169.2 E 507939450007 < 10,000 1941 – 1996
    723 km (*) Campbell 52.0 S 169.0 E 507939470008 < 10,000 1961 – 2013
    1098 km (*) Invercargill 46.7 S 168.6 E 507938440000 49,000 1950 – 2010
    ————-
    0 km (*) Grytviken, 54.3 S 36.5 W 317889030000 < 10,000 1905 – 2013
    0 km (*) Grytviken, 54.3 S 36.5 W 317889030007 < 10,000 1905 – 1982
    0 km (*) Grytviken, 54.3 S 36.5 W 317889030008 < 10,000 1905 – 2013
    871 km (*) Base Orcadas 60.8 S 44.7 W 701889680000 < 10,000 1903 – 2011
    871 km (*) Base Orcadas 60.8 S 44.7 W 701889680008 < 10,000 1903 – 2013
    899 km (*) Signy Island 60.7 S 45.6 W 147890420000 < 10,000 1947 – 1994
    899 km (*) Signy Island 60.7 S 45.6 W 147890420008 < 10,000 1947 – 1995
    ————-

  9. E.M.Smith says:

    @Bob Koss:

    Nice to know… I’d heard something like that, which was why I decided I ought to do a re-port of GIStemp and see how different it was when the results were compared with my archived V2 version. Essentially, run both of them through Dec 2009 and compare, and / or splice on v3 data from 2010-present to the V2 set and compare both ‘end to end’.

    Instead I found out that “whatever they are doing” it isn’t in the source code available to download.

    I’m quite certain that any ‘work product’ produced by GIStemp is worthless (as it was worthless in V2 and will be worse now); but thought it might be fun to have it running on a R.Pi card… (Just to poke fun at the idea you need a supercomputer for “climate science” if nothing else ;-)

    As it stands, I’m finding myself having ever more “cringe” response as I try to make it “go” again…

    FWIW, making the FORTRAN bits compile and run has been easy (for the v2 source code they have available); but the Python has presented some “issues” in the first attempt. One installs to custom libraries made from C code via a multi-layer “Make” process (a make that makes the Makefile to make the… ) and it seems to expect things based on a user-installed Python 1.5 in a /usr/local type directory, not the Python 2.7 in /usr/bin already installed. So I’m packing it in for a day or two (as I have more interesting things to do…). But at this point, the only bit where I’m “stuck” is on working out the ‘recipe’ to get the STEP1 python install / build library bit done on the R.Pi. (In some ways, putting it on the old RedHat 7.x box was easier as it didn’t have Python on it already, so I just did the “install 1.5 like they expect” and go…)

    At any rate, doing it all for the V2 port, when that whole homogenizing step is likely very different in whatever the v3 code might be; seems, somehow, not my “highest and best use” right now… I’ve already shown that the R.Pi handles the FORTRAN well and easily. (Had to install FORTRAN and the mksh packages to get the Korn Shell and gfortran, but that was all of 5 minutes and two simple commands “apt-get install mksh” and “apt-get install gfortran” as I recall it…)

    What I’d really like to do is get the V3 code and see what it does, but that seems to be “unavailable” at the moment…

  10. Gail Combs says:

    I would say political especially if you follow the dots.

    Start with Royal Dutch Shell:
    The Dutch royal family (The House of Orange) is still reportedly the biggest shareholder in the Dutch part of the group, although the size of its stake has long been a source of debate. The Queen of England is also a major stockholder link and Scuttlebutt and more Scuttlebutt.

    Prince Bernhard of the Dutch Royal Family is the Founding President of World Wildlife Fund (WWF)

    HRH The Duke of Edinburgh served as International President of WWF for 16 years until his retirement at the end of 1996…

    John H. Loudon, Better known as “the Grand Old Man of Shell”, John H. Loudon, a Dutchman, headed Royal Dutch Shell from 1951 to 1965…. He was President of WWF from 1976 to 1981, and also a member of The 1001.

    Ruud Lubbers served three terms as Prime Minister of the Netherlands between 1982 and 1994, thus becoming the longest serving Dutch Prime Minister…. He continued in Parliament as Senior Deputy Leader, and later Parliamentary Leader of the Christian Democratic Alliance. He became President of WWF International on 1 January 2000, but only served for one year as he was appointed United Nations High Commissioner for Refugees from 2001-2005.

    World Wildlife Fund Presidents – past and present

    Another major stockholder is the Rothschilds. The Rothschild Investment Trust was formed in 1988 => RIT Capital Partners. Rockefellers and Rothschilds Unite

    Then we look at the Shell Board of Directors.

    Peter Voser
    Chief Executive Officer
    … a member of the Swiss Federal Auditor Oversight Authority from 2006 to December 2010. In 2011…

    Josef Ackermann
    Non-executive Director
    … He is Chairman of the Board of Directors of Zurich Insurance Group Limited and of Zurich Insurance Company Limited, positions he has held since March 2012.

    … he held a variety of positions in corporate banking, foreign exchange/money markets, treasury and investment banking. In 1990, he was appointed to SKA’s Executive Board, on which he served as President between 1993 and 1996. He joined Deutsche Bank’s Management Board in 1996 with responsibility for the investment banking division and, from 2006 and 2002 respectively until May 2012, he was Chairman of the Management Board and of the Group Executive Committee of Deutsche Bank AG. He is a member of the Supervisory Board of Siemens AG, the Board of Directors of Investor AB and a number of advisory boards. He also has various roles in several foundations and academic institutions….

    Charles O. Holliday
    Non-executive Director
    … He served as Chief Executive Officer of DuPont from 1998 to January 2009, and as Chairman from 1999 to December 2009…. He previously served as Chairman of the World Business Council for Sustainable Development, Chairman of The Business Council, Chairman of Catalyst, Chairman of the Society of Chemical Industry – American Section, and is a founding member of the International Business Council. He is Chairman of the Board of Directors of Bank of America Corporation and a Director of Deere & Company.

    Gerard Kleisterlee
    Non-executive Director
    …He is Chairman of Vodafone Group plc, a member of the Supervisory Board of Daimler AG, and a Director of Dell Inc.

    Christine Morin-Postel
    Non-executive Director
    …. she was Chief Executive of Société Générale de Belgique, Executive Vice-President and a member of the Executive Committee of Suez S.A., Chairman and Chief Executive Officer of Crédisuez S.A. and a Non-executive Director of Pilkington plc, Alcan Inc. and EXOR S.p.A. She is a Non-executive Director of British American Tobacco plc.

    Sir Nigel Sheinwald GCMG
    Non-executive Director
    He was a senior British diplomat who served as British Ambassador to the USA from 2007 to 2012. He joined the Diplomatic Service in 1976 and served in Brussels (twice), Washington and Moscow and in a wide range of policy roles in London. He served as British Ambassador and Permanent Representative to the European Union in Brussels from 2000 to 2003. Prior to his appointment as British Ambassador to the USA, he served as Foreign Policy and Defence Adviser to the Prime Minister and Head of the Cabinet Office Defence and Overseas Secretariat. He retired from the Diplomatic Service in March 2012….

    Linda G. Stuntz
    Non-executive Director
    She is a founding partner of the law firm of Stuntz, Davis & Staffier, P.C., based in Washington, D.C. Her law practice includes energy and environmental regulation as well as matters relating to government support of technology development and transfer. From 1989 to 1993, she held senior policy positions at the U.S. Department of Energy, including Deputy Secretary. She played a principal role in the development and enactment of the Energy Policy Act of 1992.

    From 1981 to 1987, she was an Associate Minority Counsel and Minority Counsel to the Energy and Commerce Committee of the U.S. House of Representatives. She chaired the Electricity Advisory Committee to the U.S. Department of Energy from 2008 to 2009, and was a member of the Board of Directors of Schlumberger Limited from 1993 to 2010. She is a member of the Board of Directors of Raytheon Company. [Raytheon does mostly government contracts G.C.]

    Jeroen van der Veer
    Non-executive Director
    ….He was Vice-Chairman and Senior Independent Director of Unilever (which includes Unilever N.V. and Unilever plc) until May 2011 and is Chairman of the Supervisory Boards of Koninklijke Philips Electronics N.V. and of ING Group. He also has various roles in several foundations and charities.

    Gerrit Zalm
    Non-executive Director
    He is Chairman of the Board of Management of ABN AMRO Bank N.V., a position he has held since February 2009. Before joining ABN AMRO, he was the Minister of Finance of the Netherlands from 1994 until 2002, Chairman of the VVD Liberal Party in the Lower House (2002) and Minister of Finance from 2003 until 2007. During 2007 until 2009 he was an adviser to PricewaterhouseCoopers (2007), Chairman of the trustees of the International Accounting Standards Board (2007-2010), an adviser to Permira (private equity fund) (2007-2008) and Chief Financial Officer of DSB Bank (2008). Prior to 1994, he was head of the Netherlands Bureau for Economic Policy Analysis, a professor at Vrije Universiteit Amsterdam and held various positions at the Ministry of Finance and at the Ministry of Economic Affairs. ….

    Rather well connected to governments, NGOs and various banks are they not?

    Then you have Shell Oil funding the Climate Research Unit at East Anglia (WIKI). A Shell Oil President, Marlan Downey, “Former President of the international subsidiary of Shell Oil, founder of Roxanna Oil; former President of Arco International”, is on the Advisory Board of Richard The Liar* Muller’s consulting firm, Muller & Assoc.

    Finally we come to Ged Davis, a Shell Oil VP with IPCC connections, who was in the Climategate E-mails. See: ClimateGate (1) email 0889554019. The e-mail attachment includes his Sustainable Development (B1) scenario aka UN Agenda 21.

    Scenarios Come to Davos
    A GBN Conversation with Ged Davis

    As one of the masterminds behind the World Economic
    Forum’s annual Davos gathering, this seasoned Shell
    scenarist is once again poised to help global business
    tackle the world’s problems.

    …Ged Davis is one of those rare people who has a mastery of both craft and content. His craft is
    scenario planning, which he has been practicing nearly since its inception. In 1972, Ged joined
    Royal Dutch/Shell, where he worked alongside Pierre Wack, the godfather of scenarios, as well
    as GBN’s own Napier Collyns and Peter Schwartz. He later became the head of scenario
    planning at Shell, and over the course of his career spearheaded many innovations in the form.
    At the same time, he developed deep knowledge about energy, climate change, sustainability,
    and global social problems like the spread of HIV/AIDS. In many ways, Ged epitomizes the
    wise, socially responsible European business perspective—which is one reason he was
    recently chosen to head the World Economic Forum’s new Centre for Strategic Insight, the
    entity now responsible for setting the agenda for the Forum’s prestigious annual gathering in
    Davos….

    …After completing my studies in London, I got an offer to go to Stanford University. I found
    California rather liberating, with many interesting ideas developing. I was very interested in
    economics, and soon got an offer to go to the London School of Economics

    After LSE, I was offered a job at Stanford Industrial Park, and while I was working there got
    heavily involved in studying engineering economic systems at Stanford. It was also there that I met
    and took courses from Willis Harmon. In 1969, I remember well a course, which he ran, in which
    we looked at the world’s problems as a whole —that was my first real experience in handling global
    issues from a holistic perspective, in a way I would later do with scenarios….

    then I headed a new unit working on new scenario processes and applications. I had started to
    develop something different from what Shell was doing—namely, techniques that would allow
    groups of 60 or so people to develop scenarios. I tried these methods out with a project for the
    World Business Council for Sustainable Development on the future of sustainable development,
    which turned out to be a surprisingly successful project. The scenarios had considerable impact,
    largely, I think, because the techniques we developed almost forced ownership. So I got very much
    involved with the question of how you build scenarios that are owned by larger groups of people
    with a capacity to act. For me, that was an important step in my own thinking.
    ….
    http://www.weforum.org/pdf/CSI/GBN_Davis_interview.pdf

    Ged Davis is co-president of the Global Energy Assessment. Previously he was managing director of the World Economic Forum, responsible for global research, scenario projects, and the design of the annual Forum meeting at Davos, which brings together 2,400 corporate, government, and non-profit leaders to shape the global agenda.
    Ged is a member of the InterAcademy Council Panel on Transitions to Sustainable Energy, a director of Low Carbon Accelerator Limited, a governor of the International Development Research Centre in Ottawa and a member of the INDEX Design Awards Jury. He was the director of the UNAIDS “AIDS in Africa” scenario project from 2002 to 2003. Ged has led a large number of scenario projects during his career, including the multi-year, multi-stakeholder scenarios on the future of sustainability for the World Business Council for Sustainable Development and was facilitator of the last IPCC emissions scenarios. Ged first graduated with a degree in Mining Engineering from Imperial, College London. He holds postgraduate degrees in Economics and Engineering from the London School of Economics and Stanford University
    http://gbn.com/people/peopledetail.php?id=18

    Like Maurice Strong, Ged Davis is one of those people you don’t hear much about who has a big impact on your life.

    * “Let me be clear. My own reading of the literature and study of paleoclimate suggests strongly that carbon dioxide from burning of fossil fuels will prove to be the greatest pollutant of human history. It is likely to have severe and detrimental effects on global climate.” – Richard Muller, 2003″ link

  11. Gail Combs says:

    BobN – ….Time to ignore their data, I think its meaningless.
    >>>>>>>>>>>>>>>>>>>>>>>>>..
    It always was meaningless. You have to take into account the energy contained in water vapor to actually measure the energy, since that is what they should be looking at. (Trenberth’s missing energy is not hiding in the ocean, it is hiding in the clouds, snicker)

  12. Chiefio,
    I am awestruck by what you do. My ability with words is totally inadequate, so I have to resort to poetry:

    Beside yon straggling fence that skirts the way
    With blossom’d furze unprofitably gay,
    There, in his noisy mansion, skill’d to rule,
    The village master taught his little school;
    A man severe he was, and stern to view,
    I knew him well, and every truant knew;
    Well had the boding tremblers learn’d to trace
    The days disasters in his morning face;
    Full well they laugh’d with counterfeited glee,
    At all his jokes, for many a joke had he:
    Full well the busy whisper, circling round,
    Convey’d the dismal tidings when he frown’d:
    Yet he was kind; or if severe in aught,
    The love he bore to learning was in fault.
    The village all declar’d how much he knew;
    ‘Twas certain he could write, and cipher too:
    Lands he could measure, terms and tides presage,
    And e’en the story ran that he could gauge.
    In arguing too, the parson own’d his skill,
    For e’en though vanquish’d he could argue still;
    While words of learned length and thund’ring sound
    Amazed the gazing rustics rang’d around;
    And still they gaz’d and still the wonder grew,
    That one small head could carry all he knew.

    Oliver Goldsmith, Deserted Village

  13. BobN says:

    @Gail Combs – About 10 years ago, when I first started hearing about global warming, I took a few Saturdays to track down the locations of the meters in my area used for the data collection. About half had a warm bias from where they were located. I knew then that it wasn’t science, it was an agenda.

  14. Gail Combs,
    The CAGW folks are getting really desperate with their “Missing Heat” and global cooling caused by aerosols from China.

    So what will the AR5 say in September? Similar to what AR4 said but with a lower sea level rise predicted for 2100.

    It was never about science.

  15. Bob Koss says:

    E.M.Smith,

    Good luck with it, if you do decide to get it running. Just thought I’d point out a couple areas where I noticed they have made changes. As far as I’m concerned, their product has always been shoddy and it seems to be getting more so.

    One other thing they are now doing. They are using reanalysis values for Byrd station. Previously they only used Byrd starting in 1980; now they have infilled with many years of reanalysis values so they can go back to 1957. At least they did mention that in one of their updates a couple months ago. Maybe they are using that station as a stalking horse, with reanalysis to be expanded to other stations if people don’t complain about it.

    They seem addicted to fudge.

  16. Gail Combs says:

    Here are the actual ‘reanalysis values for the Byrd Station’ and yes they are addicted to fudge.

    Reconstructed temperature record from Byrd Station (1957-2012)

  17. E.M.Smith says:

    @Bob Koss & Gail:

    Oh, I like that! Remind me to send boxes of fudge as gifts if I’m ever in a position of needing to present a ‘social gift’ to one of The Team! (Say, at a presentation somewhere… having a small ‘thank you gift’ for participating ;-)

    FWIW, I’m not so much interested in running GIStemp to USE it, as to COMPARE it. To do “compare and contrast” between the basic data (the V1 I’ve got saved vs V2 vs V3) and show that the “increasing warming” comes as much from changes in the processing of the actual data as anything else… Also I’d like to have it on a $25 Raspberry Pi just for the humor value….

    For reasons beyond my ken, folks love to attribute more importance to codes that need expensive hardware to run. A simple formula that is accurate and can be solved on a slide rule is more impressive to me; but say “our model runs on the world’s fastest supercomputer” and folks are impressed with it… So having a photo of a R.Pi captioned with ‘running GIStemp’ would make for an interesting “funny”, IMHO… it puts “strain” between expectations of importance and reality of hardware.

    It’s also pretty crappy code, so if it runs correctly, I can be pretty sure the compiler is a good one. Something of a test suite for the “maturity” of the R.Pi port of the compiler and related tools.

    Maybe I’ll try porting one of the “Climate Models” instead ;-)

    @GallopingCamel:

    Thanks for the poetry! Don’t know that I’m in that category, but that is for others to decide.

    I have my “empty spots” too, so don’t be too impressed. There’s a ‘selection bias’ here in that I avoid topics where I’m a complete idiot. I know nearly nothing about: Opera, ballet, central Asian and amerindian language, power politics from an operational aspect – i.e. I can’t do it well, Indian cooking – though I’d love to be better at it, music history nor playing any instrument, etc. etc….

    So I “know some stuff” in some depth in a few particular areas that are diverse from each other. OK, it’s a nice trick. But there is a much much larger area where I’m functionally ignorant… (ask me about ANY pop star, movie star, name singer or dancer or… and you will get a blank stare… Similarly for any African tribe or their history / political dynamics. I know there have been conflicts, but can’t say how to spell Hutu? or Tutsi? or why they do what they do…) So I’m not at risk of becoming too ‘full of myself’. I’ve got my moments, and I appreciate you’re enjoying them; but I’m going to remain ‘in touch’ with my core ignorance of far too much… Which is likely why I’m still so interested in learning things ;-)

    I’ll likely get back to finishing the v2 port in a week or two. I’ve got some things that have “popped up” that are going to suck up a lot of the next 2 weeks. Then I’ll have more “idle time” to work on things like porting misc. codes to the R.Pi just to see them run there…

    (Anyone else with a R. Pi who wants to work out how to get the STEP1 Python libraries installed and that step running, feel free. It will save me a couple of days work… hint hint ;-)

  18. D Cotton says:

    Without gravity acting to restore the thermodynamic equilibrium which is stipulated in the Second Law of Thermodynamics (which says: “An isolated system, if not already in its state of thermodynamic equilibrium, spontaneously evolves towards it. Thermodynamic equilibrium has the greatest entropy amongst the states accessible to the system”) and thus, as a direct corollary of that Law, supporting (at the molecular level) an autonomous thermal gradient, then …

    (1) The temperature at the base of the troposphere on Uranus would be nowhere near as hot as 320K because virtually no direct Solar radiation gets down there, and there is no surface at that altitude. The planet’s radiating temperature is under 60K because it receives less than 3W/m^2.

    (2) The temperature of the Venus surface would be nowhere near as hot as 730K (even at the poles) because it receives only about 10% as much direct Solar radiation at its surface as does Earth at its surface.

    (3) Jupiter would be nowhere near as hot, even in its core, which receives extra kinetic energy which was converted by gravity from gravitational potential energy due to the continual collapsing of this gaseous planet. This is why Jupiter emits more radiation than it receives.

    (4) The core of our Moon would be nowhere near as hot as it is thought to be, probably over 1000K.

    (5) Earth’s surface would indeed be perhaps 20 to 40 degrees colder, and the core, mantle and crust nowhere near as hot, maybe no molten material at all.

    Think about it! If you’re not sure why, it’s explained in Sections 4 to 9 and Section 15 here.

  19. gallopingcamel says:

    Chiefio,
    I think I used that verse once before. My memory plays tricks on me!

    Doug Cotton,
    From where I stand your theory looks a whole lot better than Arrhenius’ (which is totally false). However to get me on board you will have to make some predictions that can be falsified by experiments as N&K have done.

    When it comes to gas giants it does not matter where the heat comes from. Internal heat is as good as external heat thanks to thermodynamics.

    While I like N&K much more than Arrhenius there are some huge holes in their theory.

  20. KevinM says:

    Did Hansen’s efforts to lower past data go from incidental, to subtle, to apparent, to obvious, to egregious? Even a sympathetic successor who “walks in on” the secret recipe will be confronted with what’s probably much harder to forgive when seen “all at once” than when learned “a little at a time”.
    In volume industrial test automation, the accumulation of small fudge factors and hard coded quick fixes can lead to periodic confidence crashes, especially at smaller companies where software control is a one man show.
    Nobody knows what lurks in his conscience but himself, but the past decade may have weighed heavily on him, like a stock gambler trapped “all in” on his worst knife catch, the outer world refusing to accommodate.

  21. R. de Haan says:

    They can fiddle around with temp data to cheat the public into the warming scam as much as they like. Especially if Mother Nature is giving the opposite message: http://newyork.cbslocal.com/2013/05/25/a-wet-and-miserable-start-to-memorial-day-weekend/

  22. CompuGator says:

    Over at https://chiefio.wordpress.com/gistemp/ (comments now closed),
    E.M.Smith says (23 July 2009 at 12:31 am):

    I’ve looked in USHCN2V2.f and the section in question looks like this one to me:

    if (temp .gt. -99.00) itemp(m)=nint( 50.*(temp-32.)/9 )

    A number of things, just based on this statement alone, concern me:

    Has input-sanity verification been done in some prior step, so at least some misaligned input data can be detected, instead of being abandoned or ignored?

    It looks as if ” 45100″ (inspired by the classic Fahrenheit 451) would be read, converted to degrees Celsius, and stored in itemp(m) without complaint. Oh, that’s right! It’s greater than -99.00, and it’s running on a supercomputer, so what could possibly be wrong with it?

    Is itemp(1:12) explicitly filled with some out-of-range marker value (not zero, which is a valid value) before each execution of the immediately-enclosing loop:

    do m=1,12

    so that values not stored (or maybe not even read) into itemp(m) can be recognized and not corrupt any calculated “average”? It’s not done element-by-element inside the loop. Not every computer operating system automatically zeroes data segments when it allocates or creates them (e.g.: IBM OS/360 didn’t).

    The expression 50. * T / 9 worries me. IBM shows evaluation as left-to-right, thus (50. * T) / 9, but is that required by the FORTRAN standards, or is it simply IBM’s statement of an implementation-defined behavior for precedence-2 operators that IBM chose? By coïncidence, it’s also the behavior for C, in which both operators are at precedence 13! It’s perilous, thus amateurish (never mind possession of any advanced degrees), to trust that every single compiler used, on differing computers, will calculate that expression as intended. It may not make a significant difference in this particular expression, but I get the impression that quite a few people have had their hands in developing or maintaining GISS/GCN code. Leaning heavily on fallible human memory of precedence & associativity, instead of just coding explicit parentheses, is a bad habit to get into. Especially when nailing down one’s intent with additional parentheses is so easy to do: (50.*(temp-32.))/9.
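
    Pulling those three concerns together, here is a minimal free-form sketch of what a guarded version of that statement might look like. It is an illustration, not the actual USHCN2V2.f logic; the 150 °F ceiling, the -9999 marker, and the list-directed read are all assumptions made for the example.

    ! Sketch only (not GIStemp's code): an explicit out-of-range marker, a sanity
    ! window on the input, and fully parenthesized arithmetic.
    program ushcn_guard_demo
      implicit none
      integer, parameter :: IMISS = -9999          ! marker value; zero is a valid temperature
      integer :: itemp(12), m, ios
      real    :: temp

      itemp = IMISS                                ! filled before the loop, every pass
      do m = 1, 12
        read (*, *, iostat=ios) temp               ! read one Fahrenheit value
        if (ios /= 0) exit
        ! reject the "Fahrenheit 451" cases as well as the -99 flags
        if (temp > -99.00 .and. temp < 150.00) then
          itemp(m) = nint( (50.*(temp-32.)) / 9. ) ! tenths of a degree C, parentheses explicit
        end if
      end do
      print '(12(1x,i6))', itemp
    end program ushcn_guard_demo

    Whatever the standards say about left-to-right evaluation of 50. * T / 9, the extra parentheses make the question moot.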

  23. CompuGator says:

    Disclaimer: I recognize that the 7 lines displayed by Chiefio from USHCN2V2.f, at the link I cited immediately above, are only an excerpt from what I assume is a much longer FORTRAN xx>’77 program. And I suppose some intervening lines, deemed boring or irrelevant, might’ve been deleted for brevity’s sake.

    My invented variable T, which does not appear in the 7-line excerpt, is a stand-in for the expression (temp - 32.), intended to simplify my discussion of FORTRAN-expression operator precedence.

    Perhaps I might be allowed to close with a purely emotional remark: It seems, from the code shown, that FORTRAN stooped to adopt C’s == equality-op (sometimes pronounced, mostly tongue-in-cheek, as ‘very equal’, but absolutely, positively, the wrong concept to apply to floating-point comparisons). Bleccch!

  24. P.G.Sharrow says:

    @CompuGator; Interesting that you took the trouble to examine the code. While I have very little experience in such things, it has always appeared to me that the code was slapped together to get a wanted result rather than to help with evaluating the data. In the early days of this field the rising stars claimed that the new supercomputers could solve everything if they could just get enough time on the computers and plug in all the data. Too bad that they had little understanding that computers are very stupid and do only what you tell them to do, very fast, exactly! Nothing intelligent there. Creating the code is the intelligent magic. The path to the solution must be known first, and then the code can be created to tell the computer how to get there. These guys were the blind leading the ignorant in the dark. So GIGO.
    Those “rising stars” have used their coding results to enjoy a wonderful career. They don’t want anyone to examine the thing too closely. E.M.Smith seems to have a good grasp on climate/weather cause and effect as do several others. He even has a good grasp on computer coding and needs a job! All that is needed is a very large Grant to accomplish the job of real climate projection computer coding. 8-) pg

  25. CompuGator says:

    Beware that some more-or-less modern FORTRAN compilers sometimes implement, um, expression-parsingsurprises.
    These issues are not limited to FORTRAN code. To get a view of how arithmetic expressions are handled–or mishandled in the translators for various programming languages (thus parsed according to differing rules or standards), see http://stackoverflow.com/questions/tagged/operator-precedence.

    Returning to the FORTRAN issue raised at the first link above, it’s customary [*] for a unary arithmetic-operator to have higher precedence (i.e.: causing it to be performed sooner during execution) than any binary arithmetic-operator. And built-in exponentiation (i.e.: ** operator) is privileged as the highest-precedence binary arithmetic-operator. So in the linked surprise, I’d expect to see, e.g.: x**-2*y parsed and compiled as ( x**(-2) )*y. But that’s not what Intel’s compiler did: Their parse was embarrassingly uncustomary & unreasonable, as ( x**( -2*y ) ).h
    So it was their competitors’ parse that was the reasonable one. I infer from on-line documentation for IBM’s XL compiler that it takes the legalistic route, rejecting adjacent numerical operators as illegal (but not having access to that compiler, I don’t know whether it follows through by flagging it as an error that can’t be overridden, or can be compelled to generate code as an undocumented “extension”). I get the impression that it’s a restriction from a more-recent FORTRAN standard.[$]

    Note *: In my opinion: A condition that in most ways, ought to go without saying. Be that as it may, my programming perspective reaches a few decades back into the 20th century. It’d be fun, in an historical sense, to still have my old GE-Timesharing System manuals for BASIC and FORTRAN.

    Note $: I don’t own (or possess) any of the FORTRAN standards after 1978 (dubbed “77”). Before I dipped a toe into the programming side of climate-warmist controversy,
    their absence was a feature of my technical library, not a deficiency. Back when I owned such documents, both ANSI and ISO charged fees for them that were equivalent to the combined monthly total of my electricity and phone bills. Maybe there are usable references on line nowadays, but I’ve not previously had any need to look.
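
    For concreteness, here is a tiny free-form sketch of the two candidate readings of x**-2*y, each written with the parentheses spelled out so no compiler has to guess. The values are invented and no particular compiler is assumed.

    ! The two readings discussed above, disambiguated by explicit parentheses.
    program parse_demo
      implicit none
      real :: x = 3.0, y = 5.0
      print *, ( x**(-2) )*y      ! customary reading: about 0.556
      print *, x**( -(2.0*y) )    ! the reported Intel reading: about 1.7e-5
    end program parse_demo

    The two readings differ by four orders of magnitude here, which is why leaning on implicit precedence for adjacent operators is such a trap.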

  26. CompuGator says:

    CompuGator says (3 August 2013 at 3:32 pm):

    Beware that some more-or-less modern FORTRAN compilers sometimes implement, um, expression-parsingsurprises.

    The text emphasized above should have appeared as 2 separate words: “parsing surprises”. Please note that I use “parsing” in the formal computer-science sense of the word; I have always rejected the technically ignorant misuse of the word by the mainstream news media (e.g.: the scandal of Pres. Clinton and the White-House intern).

    CompuGator says (per link above)

    Their parse was embarrassingly uncustomary & unreasonable, as ( x**( -2*y ) ).h

    There should’ve been no “.h” (suggestive of the file-extension for C source-code header files), at the end of that excerpt above, but simply a ‘.‘ to end the sentence.

    Is there really no way to provide some kind of comment-preview for WordPress blogs? For the comment I previously submitted, I physically inserted it into an XHTML file off line, and used an old version of a browser to verify that I properly closed all my opening tags. I’d made unusually extensive use of tags to emphasize specific single characters, typically arithmetic operators and parentheses, and it would’ve been too easy to foul up the nesting. Any single error would’ve completely ruined my intended effect, and using the XHTML skeleton file did enable me to catch and fix one.

  27. p.g.sharrow says:

    @CompuGator: A computer can’t tell the difference between human speak and computer instructions. Lol, very simple minded machine, only does exactly what it is told. I try to keep my comments simple with no imbedding of fancy characters or format. Easy for a simple minded person to do. ;-) but it requires the recipient to think a bit more.
    Still, I am delighted to read your comments on the gist of the problem. pg

  28. CompuGator says:

    Re: [FORTRAN Expression-Parsing Surprises]

    This past Saturday, I tried to post all of the presentation, to which I’ve now assigned
    the previously merely-implicit Subject shown above, to this topic (https://chiefio.wordpress.com/2013/05/19/gistemp-no-new-source-code/):

    WordPress Error:
    Sorry, this comment could not be posted.

    [2013-08-03 ~11:08 EDT .: ~7:04 GMT] Well, why the (expletives deleted! ) not!? Grrr!

    After much gnashing of teeth, I finally chopped it down to something WordPress would accept (3 August 2013 at 3:32 pm).

    It’s quite possible that Chiefio’s regular readers aren’t an audience well matched to the details of wrestling with compilers. Be that as it may, here below, I hope, will appear The Rest of the Story:

    Considering that x**(-2) is formally the same as 1/( x**2 ), my example expression (3 August 2013 at 3:32 pm) might best be rewritten as (1/( x**2 ))*y, which would preserve left-to-right (ordering of) evaluation, or ( y / (x**2)), which would not (while saving the time that’d be needed for 1 multiplication), in case order doesn’t matter for the values being calculated. Did anyone hear a hushed “Uh, oh”, evoked by an unspoken doubt: Are you really sure that it “doesn’t matter” to your code?

    The discussion on expression-evaluation in the preceding paragraph doesn’t depend on the value of the exponent, except that it’s limited to a negative-constant exponent.

    Of course, if the absolute value of the exponent were really 2 (instead of some greater integer), generating the multiplication ( x*x ) would almost always be preferable to the exponentiation operation. Depending on instruction timing, for greater exponents, e.g.: ( x**3 ), the analogous generation of the multiplication ( x*x*x ) might also make good sense.
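
    A short sketch of those rewrites side by side (free-form, invented values), just to show the first three agree in value while generating different operations:

    ! The rewrites discussed above; the first three lines print the same value.
    program rewrite_demo
      implicit none
      real :: x = 3.0, y = 5.0
      print *, ( x**(-2) )*y       ! original form
      print *, ( 1.0/(x**2) )*y    ! preserves left-to-right evaluation
      print *, y/( x**2 )          ! same value, one multiplication saved
      print *, x**3, x*x*x         ! strength-reduced exponentiation, both 27.0
    end program rewrite_demo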

    But at some point, the compiler writer really ought to consult the local numerical-analysis guru, to guard against fouling up an application-programmer’s intended calculation via excess compiler cleverness. Perhaps there even ought to be a compilation option to rein in that compiler-writing tendency: “noclever“.

  29. CompuGator says:

    P.G.Sharrow says (1 August 2013 at 2:50 am):

    In the early days of this field the rising stars claimed that the new supercomputers could solve everything if they could just get enough time on the computers and plug in all the data. Too bad that they had little understanding [….]  Creating the code is the intelligent magic. The path to the solution must be known first, and then the code can be created to tell the computer how to get there.

    The rationale of the climatologist “rising stars” that you’ve summarized for pursuing support from supercomputers might, I suppose, have allowed them to get lucky, by revealing patterns that were too grand in scope for any single brilliant mind to recognize. But that would’ve been only if brilliant–and perceptive–minds had been able to identify practically all the crucial influences on climate, and accurately assess the importance of each.

    Indeed, for computers to be useful in producing valid conclusions or guidance, they or brilliant-minded colleagues must be able to encode the quantitative influences & importance numerically. In this case, not only applying accepted principles of physics and chemistry, but also enough numerical analysis to avoid numerically ruined computations.

    As you’ve already written, they need to understand the problems and solution first, then design & code second. Validate their input data; give special early attention to formatting errors. Validate their output data; figure out in advance how they could test whether their computations–including the whole model–have gone wrong, and put those tests into their software. So test, and keep testing.

    Yet the politically-correct climatologists seem not to have the necessary software-applications skills, nor to’ve had assistance from colleagues who do. From what I remember of the “HARRY README” fiasco, I got the impression that much of their software was grad-student quality coding. I’ve routinely staffed a university help-desk for grad students who were “early adopters” of programming–usually FORTRAN–and they typically fit a profile:

    Arrogance that anyone who isn’t brain-dead can write FORTRAN programs, and certainly they can, because they are graduate students; indifference to code quality, accepting computations that “seem to work” well enough to fit the student’s draft thesis or dissertation; flimsy, um, design favoring–if not reliant upon–preconceived situations, neglecting logic paths that’ll make code go BOOM! when reality refuses to coöperate; severe oversights or shortcomings in organization, error-handling, coding style; and of course, nearly no documentation of a kind that’d be necessary to give a fighting chance to any poor soul assigned to correct it or “build on it” (AIYEEE! ) after its original author has departed.

    I’ll leave opinions about the politically-correct climatologists’ skills in physics and chemistry to others whose academic performance in those fields was more, um, exemplary than mine. Could they be as callous about the standards of those fields as they appear to be about software?

    Absent the complete set of crucial skills, they’ll only get reams & reams of print-outs, or cylinders of hard disks, full of floating-point numbers, and purty-colored movin’ pitchers, quite possibly totally unrelated to reality, altho’ produced with blinding computational speed at great public cost.[#] Before anyone misuses them to make public policy, that is.

    Note #: Whoo-whee! More dollars or euros of fundin’ filling climatologists’ purses than the Okefenokee has (mo)skeeters!
