It’s been a while since I did a GIStemp posting. Partly because Hansen had become such a bad joke it wasn’t worth it; partly because the focus had moved to the upstream data diddling, what with GHCN V3 being significantly “warmer” than Version 1 had been.
But I have a new toy, the Raspberry Pi, and I wanted to play with it. See what it could do… Could it, in fact, compile and run GIStemp?
Now if you are planning a new port, it is worth starting with a fresh copy of the code, so I went to NASA and downloaded the source code from the approved link:
It all looked rather familiar, and I didn’t think much of it. The unpacking went well, as did the set-up of a file system for it (it fits nicely on a 2 GB SD card ;-) I looked up my “Make File” posting and proceeded to type it in again, by hand. (Why? Well, to check for any “differences” with the “new” GIStemp that runs on GHCN Version 3, of course…)
It went fairly well. Nothing seemed changed, so I figured it must be the same names, but maybe different ‘insides’ for V3…
A couple of test compiles showed the FORTRAN compiler (gfortran) was much more accepting. The prior issue of making assignments during the type declaration didn’t present a problem, and all the modules compiled without needing to pay attention to F77 vs F95 nearly so much.
Fortran is quite fast on the Raspberry Pi with a 30 Mb/second SD chip!
I managed to have “make” compile all the FORTRAN to binaries. I even test ran a few of them and they worked fine. So next I put the GHCN data where it belonged (but noted that the code still talked about V2… but, I thought, maybe they just didn’t want to change all the file names from V2 to V3 and change the program names and change…) So I put a copy of the V3 GHCN data in the “v2.mean” input file (where it always had been put), and told it to run…
Several steps using USHCN ran fine. Then, on the first step that reads in GHCN, it tossed its cookies with a Format Error in the data stream.
Looking more closely, the code looked VERY familiar. It looked like it was just the old V2 code still. So I “checked it out”, and no, I didn’t have an old copy; it was a fresh download. I’ve only had the cards a couple of months, and the data and software were direct downloads to it. The date stamp on the tar archive on my card is May 16, 2013. So the download was just 3 days ago. Just to be sure, I did a ‘tar -xvf GISTEMP.sources.tar’ and captured the date stamps on the files IN that archive. (The command ‘tar’ is the Tape ARchiver: ‘x’ means extract, ‘v’ says be verbose about it, and ‘f’ says read the archive from the named file.)
Here’s a screen capture. This is Step0 where the initial data unload and mix is done. If any V3 changes were made, it ought to show here:
You ought to be able to click on that to ‘embiggen’ it and read the dates. They are mostly 2009 and 2010. The newest date stamp in it is 2011, for one bit of code not related to GHCN.
What’s The Deal?
So I’m realizing this is the same old Version 2 software. But if you go to NCDC to download the Version 2 data to use with it (as I was mostly interested in testing the Raspberry Pi, so figured I’d give up on V3), you find out that the GHCN Version 2 temperature data is now gone.
grid 5/15/2013 4:00:00 PM
source 12/23/2010 12:00:00 AM
File:v2.country.codes 11 KB 1/25/2002 12:00:00 AM
File:v2.max.Z 6010 KB 2/7/2012 12:00:00 AM
File:v2.prcp.Z 25445 KB 5/18/2013 5:00:00 AM
File:v2.prcp.failed.qc.Z 163 KB 5/18/2013 5:00:00 AM
File:v2.prcp.inv 1267 KB 1/25/2002 12:00:00 AM
File:v2.prcp.readme 4 KB 9/27/2011 12:00:00 AM
File:v2.prcp_adj.Z 5357 KB 5/18/2013 5:00:00 AM
File:v2.read.data.f 11 KB 12/14/2006 12:00:00 AM
File:v2.read.inv.f 3 KB 12/14/2006 12:00:00 AM
File:v2.slp.country.codes 11 KB 10/3/2002 12:00:00 AM
The precipitation data is still there, along with the MAX temperatures, but MIN and average are gone.
OK, I’ve got a few copies of the old V2 “squirreled away” and can get them unpacked in a few days to finish my Raspberry Pi testing. But that’s not the point… (Well, it is for me, but there’s a bigger point…)
Are we not supposed to have access to both the data AND the software being used to pronounce this yet again “The Hottest Ever!!!” (even as snow covers the UK and Canada…)?
This is just so broken.
They claim to be running GIStemp with Version 3 GHCN data at their web site.
But are they? Is “Trust me.” good enough from the government?
Record high global temperature during the period with instrumental data was reached in 2010. After that paper appeared, version 3 of the GHCN data became available. The current analysis is now based on the adjusted GHCN v3 data for the data over land.
Are they hiding what they are doing? Preventing a proper software audit? Normally I’d attribute it to their just being busy with other stuff, and, what with Hansen being on the skids, nobody really caring about “his software” anymore. But in light of nefarious things with the Associated Press, State Department / Benghazi, EPA, and even the IRS being politicized; well, let’s just say that it sure looks like “assume the worst” and you get closer…
At this point, I don’t know if I ought to be upset that there is no GHCN Version 3 code to validate / check; or be happy that now with Hansen out of the picture his “products” are being allowed to die on the vine. I’m all for just ’round filing’ GIStemp and calling it broken history. But I’m not willing to accept having it run, claiming “Hottest Ever!!!” and no way to audit the thing to see what was changed.
As of now, GIStemp is a black box with no utility as it is unverifiable.
For some folks, EVERYTHING is political. This seems to be true of our current government, and, more troubling, much of our current civil service.
When the narrative and the agenda are more important than civic and social responsibility, it is time to clean house. I’d suggest replacement is more likely to succeed than “reform”, which is usually the same scoundrels switching chairs whilst canning a couple of unfortunate underlings.
Now that Hansen is gone I guess they are busy cleaning up after him. Since nobody understands the mess he left behind, they must have made an even greater mess of it. AssumeStupidityNotMalice.
This is SAD. If it’s a black box, it’s worthless. Time to ignore their data; I think it’s meaningless.
This article is worth a repost at WUWT.
Richard Ilfeld says:
19 May 2013 at 12:48 pm
“For some folks, EVERYTHING is political. ”
The term political no longer covers the load.
This is criminal. Period.
Well, I got STEP0 to run to completion. The only really “odd bit” is that the “tail” command on the Raspberry Pi doesn’t understand the old “+100” option (telling it to skip 100 lines, then print the rest; modern GNU tail spells that “tail -n +101”), so since the Hohenpeissenberg file had 224 lines in it, I just changed it to “tail -124” and moved on. At some point that kludge needs a fix.
So at this point, unless the Python in Step1 is a pain somehow, it looks relatively easy to run GIStemp on a Raspberry Pi. Since it ships with Python based games on the desktop, I’m pretty sure the Python will not be a problem…
I’m presently running on a copy of GHCN.v2 from Dec 2009 that was in my archive…
The Giss methodology is substantially different from how they handled the data from Ghcn v2. Instead of combining all the raw datasets from a specific station into one dataset (averaging where there are multiple values), they allow that to be done by Ghcn. Ghcn then further adjusts the combined dataset into an adjusted dataset. This becomes the dataset Giss uses when making their homogeneity adjustments.
I find the method Ghcn uses when combining datasets from one location astonishingly pathetic. Here is how they say they do it.
What this ends up doing in many cases is over-writing valid values from a shorter dataset with whatever is present in the longer one. This even includes over-writing actually valid values with -9999. I’ve seen some cases where entire years were present in a shorter dataset and entirely wiped out with the value -9999 simply because the longer dataset didn’t record the data.
Ghcn then takes this atrociously combined data and adjusts it according to some algorithm and this is what Giss now uses when performing their homogeneity adjustment.
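To make the failure mode concrete, here is a minimal sketch of a “longest record wins” combine in Python. This is my own illustration of the behavior described above, not GHCN’s actual code; the function name and sample values are invented:

```python
# Hedged sketch: a "longest record wins" merge, where the longer dataset's
# entries overwrite the shorter one's -- even when the longer entry is the
# missing-data marker and the shorter entry was a valid measurement.
# This illustrates the described behavior; it is NOT GHCN's actual code.

MISSING = -9999  # GHCN-style missing-data marker

def combine_longest_wins(longer, shorter):
    """Merge two {year: value} records, letting the longer record's
    entries (MISSING included) overwrite the shorter record's."""
    merged = dict(shorter)
    merged.update(longer)  # longer record overwrites, -9999 and all
    return merged

longer  = {1951: 112, 1952: MISSING, 1953: MISSING}
shorter = {1952: 98, 1953: 101, 1954: 95}
# 1952 and 1953 had valid values in the shorter record, but the longer
# record's -9999 entries stomp them in the merged result.
print(combine_longest_wins(longer, shorter))
```

A safer combine would prefer whichever record actually has data for a given year, falling back to MISSING only when neither does.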
Then there is the fact that Giss used to end up with a single homogeneity-adjusted dataset for each location. They accomplished this by doing the combining before homogeneity adjustment. If you think that is still true, I have a bridge for sale if you are interested. There are 51 stations, all in the southern hemisphere, which now have multiple homogeneity-adjusted datasets. This comes about due to Ghcn apparently not combining the highest dataset number into its combined dataset in those instances. Giss makes a separate homogeneity-adjusted dataset out of it. Most of these stations appear to be rural stations where running their homogeneity adjustment doesn’t alter the data.
Unsurprisingly, the multiple homogenized datasets at a location end up with different trend slopes. If, as I suspect, they are not being further combined into one dataset before use in sub-box creation, those locations exert out-sized influence compared to those stations with only one dataset in use. Call me cynical, but frankly I doubt this change was done to improve accuracy. More likely it tends to increase SH warming. I haven’t checked though. It’s beyond my declining abilities and interest in my advanced years.
In a while I’ll put up a list of the stations with multiple homogeneity datasets in another comment.
These are the 51 stations with multiple homogeneity datasets. Last digit(dataset number) has been removed.
If you are really interested in further investigation this might be a ‘dig here’ place to start if you wish to do a quick analysis as to effect. There are four stations with triple datasets. I have included their neighbors within 1200 km. (Columns: distance from the multi-dataset station, station name, latitude, longitude, station ID, population, record span.)
0 km (*) Gough Island 40.4 S 9.9 W 141689060000 < 10,000 1955 – 2013
0 km (*) Gough Island 40.4 S 9.9 W 141689060007 < 10,000 1956 – 2006
0 km (*) Gough Island 40.4 S 9.9 W 141689060008 < 10,000 1956 – 2013
424 km (*) Tristan Da Cunha 37.0 S 12.3 W 141689020000 < 10,000 1942 – 1987
0 km (*) Marion Island 46.9 S 37.9 E 141689940000 < 10,000 1948 – 2013
0 km (*) Marion Island 46.9 S 37.9 E 141689940007 < 10,000 1948 – 2006
0 km (*) Marion Island 46.9 S 37.9 E 141689940008 < 10,000 1948 – 2013
1069 km (*) Crozet 46.4 S 51.9 E 143619970000 < 10,000 1970 – 2013
1069 km (*) Crozet 46.4 S 51.9 E 143619970007 < 10,000 1965 – 1998
0 km (*) Macquarie Isl 54.5 S 158.9 E 501949980000 < 10,000 1948 – 2013
0 km (*) Macquarie Isl 54.5 S 158.9 E 501949980007 < 10,000 1948 – 2002
0 km (*) Macquarie Isl 54.5 S 158.9 E 501949980008 < 10,000 1948 – 2013
707 km (*) Campbell Isla 52.5 S 169.2 E 507939450000 < 10,000 1941 – 2002
707 km (*) Campbell Isla 52.5 S 169.2 E 507939450007 < 10,000 1941 – 1996
723 km (*) Campbell 52.0 S 169.0 E 507939470008 < 10,000 1961 – 2013
1098 km (*) Invercargill 46.7 S 168.6 E 507938440000 49,000 1950 – 2010
0 km (*) Grytviken, 54.3 S 36.5 W 317889030000 < 10,000 1905 – 2013
0 km (*) Grytviken, 54.3 S 36.5 W 317889030007 < 10,000 1905 – 1982
0 km (*) Grytviken, 54.3 S 36.5 W 317889030008 < 10,000 1905 – 2013
871 km (*) Base Orcadas 60.8 S 44.7 W 701889680000 < 10,000 1903 – 2011
871 km (*) Base Orcadas 60.8 S 44.7 W 701889680008 < 10,000 1903 – 2013
899 km (*) Signy Island 60.7 S 45.6 W 147890420000 < 10,000 1947 – 1994
899 km (*) Signy Island 60.7 S 45.6 W 147890420008 < 10,000 1947 – 1995
Nice to know… I’d heard something like that, which was why I decided I ought to do a re-port of GIStemp and see how different it was when the results were compared with my archived V2 version. Essentially, run both of them through Dec 2009 and compare, and / or splice on v3 data from 2010-present to the V2 set and compare both ‘end to end’.
Instead I found out that “whatever they are doing” it isn’t in the source code available to download.
I’m quite certain that any ‘work product’ produced by GIStemp is worthless (as it was worthless in V2 and will be worse now); but thought it might be fun to have it running on a R.Pi card… (Just to poke fun at the idea you need a supercomputer for “climate science” if nothing else ;-)
As it stands, I’m finding myself having ever more “cringe” response as I try to make it “go” again…
FWIW, making the FORTRAN bits compile and run has been easy (for the v2 source code they have available); but the Python has presented some “issues” in the first attempt. One installs to custom libraries made from C code via a multi-layer “Make” process (a make that makes the Makefile to make the…) and it seems to expect things based on a user-installed Python 1.5 in a /usr/local type directory, not the Python 2.7 already installed in /usr/bin. So I’m packing it in for a day or two (as I have more interesting things to do…). But at this point, the only bit where I’m “stuck” is on working out the ‘recipe’ to get the STEP1 python install / build library bit done on the R.Pi. (In some ways, putting it on the old RedHat 7.x box was easier as it didn’t have Python on it already, so I just did the “install 1.5 like they expect” and go…)
At any rate, doing it all for the V2 port, when that whole homogenizing step is likely very different in whatever the v3 code might be; seems, somehow, not my “highest and best use” right now… I’ve already shown that the R.Pi handles the FORTRAN well and easily. (Had to install FORTRAN and the mksh packages to get the Korn Shell and gfortran, but that was all of 5 minutes and two simple commands “apt-get install mksh” and “apt-get install gfortran” as I recall it…)
What I’d really like to do is get the V3 code and see what it does, but that seems to be “unavailable” at the moment…
I would say political especially if you follow the dots.
Start with Royal Dutch Shell:
The Dutch royal family (The House of Orange) is still reportedly the biggest shareholder in the Dutch part of the group, although the size of its stake has long been a source of debate. The Queen of England is also a major stockholder link and Scuttlebutt and more Scuttlebutt.
Prince Bernhard of the Dutch Royal Family is the Founding President of World Wildlife Fund (WWF)
Another major stockholder is the Rothschilds. The Rothschild Investment Trust was formed in 1988 => RIT Capital Partners. Rockefellers and Rothschilds Unite
Then we look at the Shell Board of Directors.
Rather well connected to governments, NGOs and various banks are they not?
Then you have Shell Oil funding the Climate Research Unit at East Anglia. WIKI A Shell Oil President, Marlan Downey, “Former President of the international subsidiary of Shell Oil, founder of Roxanna Oil; former President of Arco International” on the Advisory Board of Richard The Liar* Muller’s consulting firm, Muller & Assoc.
Finally we come to Ged Davis a Shell Oil VP with IPCC connections who was in the Climategate E-mails. See: ClimateGate (1) email 0889554019 The e-mail attachment includes his Sustainable Development (B1) scenario aka UN Agenda 21.
Like Maurice Strong, Ged Davis is one of those people you don’t hear much about who has a big impact on your life.
* “Let me be clear. My own reading of the literature and study of paleoclimate suggests strongly that carbon dioxide from burning of fossil fuels will prove to be the greatest pollutant of human history. It is likely to have severe and detrimental effects on global climate.” – Richard Muller, 2003 (link)
BobN – ….Time to ignore their data, I think its meaningless.
It always was meaningless. You have to take into account the energy contained in water vapor to actually measure the energy, since that is what they should be looking at. (Trenberth’s missing energy is not hiding in the ocean; it is hiding in the clouds, snicker.)
I am awestruck by what you do. My ability with words is totally inadequate, so I have to resort to poetry:
Beside yon straggling fence that skirts the way
With blossom’d furze unprofitably gay,
There, in his noisy mansion, skill’d to rule,
The village master taught his little school;
A man severe he was, and stern to view,
I knew him well, and every truant knew;
Well had the boding tremblers learn’d to trace
The day’s disasters in his morning face;
Full well they laugh’d with counterfeited glee,
At all his jokes, for many a joke had he:
Full well the busy whisper, circling round,
Convey’d the dismal tidings when he frown’d:
Yet he was kind; or if severe in aught,
The love he bore to learning was in fault.
The village all declar’d how much he knew;
‘Twas certain he could write, and cipher too:
Lands he could measure, terms and tides presage,
And e’en the story ran that he could gauge.
In arguing too, the parson own’d his skill,
For e’en though vanquish’d he could argue still;
While words of learned length and thund’ring sound
Amazed the gazing rustics rang’d around;
And still they gaz’d and still the wonder grew,
That one small head could carry all he knew.
Oliver Goldsmith, Deserted Village
@Gail Combs – About 10 years ago when I first started hearing about global warming, I took a few Saturdays to track down the location of the meters in my area used for the data collection. About half had a bias to warm in where they were located. I knew then that it wasn’t science; it was an agenda.
The CAGW folks are getting really desperate with their “Missing Heat” and global cooling caused by aerosols from China.
So what will the AR5 say in September? Similar to what AR4 said but with a lower sea level rise predicted for 2100.
It was never about science.
Good luck with it, if you do decide to get it running. Just thought I’d point out a couple areas where I noticed they have made changes. As far as I’m concerned, their product has always been shoddy and it seems to be getting more so.
One other thing they are now doing. They are using reanalysis values for Byrd station. Previously they only used Byrd starting in 1980; now they have infilled with many years of reanalysis values so they can go back to 1957. At least they did mention that in one of their updates a couple months ago. Maybe they are using that station as a stalking horse, with reanalysis to be expanded to other stations if people don’t complain about it.
They seem addicted to fudge.
Here are the actual ‘reanalysis values for the Byrd Station’ and yes they are addicted to fudge.
Reconstructed temperature record from Byrd Station (1957-2012)
@Bob Koss & Gail:
Oh, I like that! Remind me to send boxes of fudge as gifts if I’m ever in a position of needing to present a ‘social gift’ to one of The Team! (Say, at a presentation somewhere… having a small ‘thank you gift’ for participating ;-)
FWIW, I’m not so much interested in running GIStemp to USE it, as to COMPARE it. To do “compare and contrast” between the basic data (the V1 I’ve got saved vs V2 vs V3) and show that the “increasing warming” comes as much from changes in the processing of the actual data as anything else… Also I’d like to have it on a $25 Raspberry Pi just for the humor value….
For reasons beyond my ken, folks love to attribute more importance to codes that need expensive hardware to run. A simple formula that is accurate and can be solved on a slide rule is more impressive to me; but say “our model runs on the world’s fastest supercomputer” and folks are impressed with it… So having a photo of a R.Pi captioned with ‘running GIStemp’ would make for an interesting “funny”, IMHO… it puts “strain” between expectations of importance and the reality of the hardware.
It’s also pretty crappy code, so if it runs correctly, I can be pretty sure the compiler is a good one. Something of a test suite for the “maturity” of the R.Pi port of the compiler and related tools.
Maybe I’ll try porting one of the “Climate Models” instead ;-)
Thanks for the poetry! Don’t know that I’m in that category, but that is for others to decide.
I have my “empty spots” too, so don’t be too impressed. There’s a ‘selection bias’ here in that I avoid topics where I’m a complete idiot. I know nearly nothing about: Opera, ballet, central Asian and amerindian language, power politics from an operational aspect – i.e. I can’t do it well, Indian cooking – though I’d love to be better at it, music history nor playing any instrument, etc. etc….
So I “know some stuff” in some depth in a few particular areas that are diverse from each other. OK, it’s a nice trick. But there is a much much larger area where I’m functionally ignorant… (ask me about ANY pop star, movie star, name singer or dancer or… and you will get a blank stare… Similarly for any African tribe or their history / political dynamics. I know there have been conflicts, but can’t say how to spell Hutu? or Tutsi? or why they do what they do…) So I’m not at risk of becoming too ‘full of myself’. I’ve got my moments, and I appreciate you’re enjoying them; but I’m going to remain ‘in touch’ with my core ignorance of far too much… Which is likely why I’m still so interested in learning things ;-)
I’ll likely get back to finishing the v2 port in a week or two. I’ve got some things that have “popped up” that are going to suck up a lot of the next 2 weeks. Then I’ll have more “idle time” to work on things like porting misc. codes to the R.Pi just to see them run there…
(Anyone else with a R. Pi who wants to work out how to get the STEP1 Python libraries installed and that step running, feel free. It will save me a couple of days work… hint hint ;-)
Without gravity acting to restore the thermodynamic equilibrium which is stipulated in the Second Law of Thermodynamics (which says: “An isolated system, if not already in its state of thermodynamic equilibrium, spontaneously evolves towards it. Thermodynamic equilibrium has the greatest entropy amongst the states accessible to the system”) and thus, as a direct corollary of that Law, supporting (at the molecular level) an autonomous thermal gradient, then …
(1) The temperature at the base of the troposphere on Uranus would be nowhere near as hot as 320K because virtually no direct Solar radiation gets down there, and there is no surface at that altitude. The planet’s radiating temperature is under 60K because it receives less than 3W/m^2.
(2) The temperature of the Venus surface would be nowhere near as hot as 730K (even at the poles) because it receives only about 10% as much direct Solar radiation at its surface as does Earth at its surface.
(3) Jupiter would be nowhere near as hot, even in its core, which receives extra kinetic energy which was converted by gravity from gravitational potential energy due to the continual collapsing of this gaseous planet. This is why Jupiter emits more radiation than it receives.
(4) The core of our Moon would be nowhere near as hot as it is thought to be, probably over 1000K.
(5) Earth’s surface would indeed be perhaps 20 to 40 degrees colder, and the core, mantle and crust nowhere near as hot, maybe no molten material at all.
Think about it! If you’re not sure why, it’s explained in Sections 4 to 9 and Section 15 here.
I think I used that verse once before. My memory plays tricks on me!
From where I stand your theory looks a whole lot better than Arrhenius’ (which is totally false). However to get me on board you will have to make some predictions that can be falsified by experiments as N&K have done.
When it comes to gas giants it does not matter where the heat comes from. Internal heat is as good as external heat thanks to thermodynamics.
While I like N&K much more than Arrhenius there are some huge holes in their theory.
Did Hansen’s efforts to lower past data go from incidental, to subtle, to apparent, to obvious, to egregious? Even a sympathetic successor who “walks in on” the secret recipe will be confronted with what’s probably much harder to forgive when seen “all at once” than when learned “a little at a time”.
In volume industrial test automation, the accumulation of small fudge factors and hard coded quick fixes can lead to periodic confidence crashes, especially at smaller companies where software control is a one man show.
Nobody knows what lurks in his conscience but himself, but the past decade may have weighed heavily on him, like a stock gambler trapped “all in” on his worst knife catch, the outer world refusing to accommodate.
They can fiddle around with temp data to cheat the public into the warming scam as much as they like. Especially if Mother Nature is giving the opposite message: http://newyork.cbslocal.com/2013/05/25/a-wet-and-miserable-start-to-memorial-day-weekend/
Over at https://chiefio.wordpress.com/gistemp/ (comments now closed),
E.M.Smith says (23 July 2009 at 12:31 am):
A number of things, just based on this statement alone, concern me:
Has input-sanity verification been done in some prior step, so at least some misaligned input data can be detected, instead of being abandoned or ignored?
It looks as if “ 45100” (inspired by the classic Fahrenheit 451) would be read, converted to degrees Celsius, and stored in itemp(m) without complaint. Oh, that’s right! It’s greater than -99.00, and it’s running on a supercomputer, so what could possibly be wrong with it?
Shouldn’t itemp(1:12) be explicitly filled with some out-of-range marker value (not zero, which is a valid value) before each execution of the immediately-enclosing loop, so that values not stored (or maybe not even read) into itemp(m) can be recognized and not corrupt any calculated “average”? It’s not done element-by-element inside the loop. And not every computer operating system automatically zeroes data segments when it allocates or creates them (e.g.: IBM OS/360 didn’t).
The expression 50. * T / 9 worries me. IBM shows evaluation as left-to-right, thus (50. * T) / 9, but is that required by the FORTRAN standards, or is it simply IBM’s statement of an implementation-defined behavior for precedence-2 operators that IBM chose? By coïncidence, it’s also the behavior for C, in which both operators are at precedence 13! It’s perilous, thus amateurish (never mind possession of any advanced degrees), to trust that every single compiler used, on differing computers, will calculate that expression as intended. It may not make a significant difference in this particular expression, but I get the impression that quite a few people have had their hands in developing or maintaining GISS/GHCN code. Leaning heavily on fallible human memory of precedence & associativity, instead of just coding explicit parentheses, is a bad habit to get into. Especially when nailing down one’s intent with additional parentheses is so easy to do: (50.*(temp-32.))/9.
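The fully-parenthesized conversion, together with an explicit missing-value guard, can be sketched in Python as follows. This is my own illustration of the point about explicit parentheses and sentinel values, not GISS’s actual code; the MISSING marker and function name are assumptions:

```python
# Hedged sketch: Fahrenheit -> tenths-of-degree-Celsius with explicit
# parentheses and an explicit out-of-range marker, rather than relying
# on operator precedence or on uninitialized storage happening to be 0.
# The marker value and names are mine, not from the GISS source.

MISSING = -9999  # out-of-range marker (assumption; zero is a valid temp)

def f_to_tenths_c(temp_f):
    """Convert deg F to tenths of deg C, preserving the missing marker."""
    if temp_f == MISSING:
        return MISSING
    # Explicit parentheses: (50 * (F - 32)) / 9 == 10 * (5/9) * (F - 32)
    return round((50.0 * (temp_f - 32.0)) / 9.0)

print(f_to_tenths_c(32.0))    # freezing point -> 0 tenths
print(f_to_tenths_c(212.0))   # boiling point -> 1000 tenths (100.0 C)
print(f_to_tenths_c(MISSING)) # marker passes through untouched
```

Guarding the sentinel before converting matters: run -9999 through the arithmetic and it silently becomes just another very cold “temperature”.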
Disclaimer: I recognize that the 7 lines displayed by Chiefio from USHCN2V2.f, at the link I cited immediately above, are only an excerpt from what I assume is a much longer FORTRAN ’xx>’77 program. And I suppose some intervening lines, deemed boring or irrelevant, might’ve been deleted for brevity’s sake.
My invented variable T, which does not appear in the 7-line excerpt, is a stand-in for the expression (temp - 32.), intended to simplify my discussion of FORTRAN-expression operator precedence.
Perhaps I might be allowed to close with a purely emotional remark: It seems, from the code shown, that FORTRAN stooped to adopt C’s “==” equality-op (sometimes pronounced, mostly tongue-in-cheek, as ‘very equal’–but absolutely, positively, the wrong concept to apply to floating-point comparisons). Bleccch!
@CompuGator; Interesting that you took the trouble to examine the code. While I have very little experience in such things, it has always appeared to me that the code was slapped together to get a wanted result rather than to help with evaluating the data. In the early days of this field the rising stars claimed that the new supercomputers could solve everything if they could just get enough time on the computers and plug in all the data. Too bad that they had little understanding that computers are very stupid and do only what you tell them to do. Very fast, exactly! Nothing intelligent there. Creating the code is the intelligent magic. The path to the solution must be known first, and then the code can be created to tell the computer how to get there. These guys were the blind leading the ignorant in the dark. So GIGO.
Those “rising stars” have used their coding results to enjoy a wonderful career. They don’t want anyone to examine the thing too closely. E.M.Smith seems to have a good grasp on climate/weather cause and effect as do several others. He even has a good grasp on computer coding and needs a job! All that is needed is a very large Grant to accomplish the job of real climate projection computer coding. 8-) pg
Beware that some more-or-less modern FORTRAN compilers sometimes implement, um, expression-parsing surprises.
These issues are not limited to FORTRAN code. To get a view of how arithmetic expressions are handled–or mishandled–in the translators for various programming languages (thus parsed according to differing rules or standards), see http://stackoverflow.com/questions/tagged/operator-precedence.
Returning to the FORTRAN issue raised at the first link above, it’s customary [*] for a unary arithmetic-operator to have higher precedence (i.e.: causing it to be performed sooner during execution) than any binary arithmetic-operator. And built-in exponentiation (i.e.: the ** operator) is privileged as the highest-precedence binary arithmetic-operator. So in the linked surprise, I’d expect to see, e.g.: x**-2*y parsed and compiled as ( x**(-2) )*y. But that’s not what Intel’s compiler did: Their parse was embarrassingly uncustomary & unreasonable, as ( x**( -2*y ) ).
So it was their competitors’ parse that was the reasonable one. I infer from on-line documentation for IBM’s XL compiler that it takes the legalistic route, rejecting adjacent numerical operators as illegal (but not having access to that compiler, I don’t know whether it follows through by flagging it as an error that can’t be overridden, or can be compelled to generate code as an undocumented “extension”). I get the impression that it’s a restriction from a more-recent FORTRAN standard.[$]
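For what it’s worth, Python’s grammar follows the “customary” reading described above: the power operator binds more tightly than a unary minus on its left operand, while the exponent itself may be signed. A quick sanity check (example values are my own):

```python
# Hedged sketch: Python's parse of the disputed expression x**-2*y.
# In Python, x**-2*y groups as (x**(-2)) * y -- the "customary" reading --
# not as x**(-(2*y)) or x**(-2*y).
x, y = 2.0, 4.0

customary = (x**(-2)) * y   # (2**-2) * 4 = 0.25 * 4 = 1.0
bare      = x**-2*y         # what the unparenthesized form actually means

print(bare)                 # 1.0
print(bare == customary)    # True: Python agrees with the customary parse
print(x**(-2*y))            # the "surprise" parse would give 2**-8 instead
```

As the comment argues, though, the safe habit is to write the parentheses and never depend on any one compiler’s reading.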
Note *: In my opinion: A condition that in most ways, ought to go without saying. Be that as it may, my programming perspective reaches a few decades back into the 20th century. It’d be fun, in an historical sense, to still have my old GE-Timesharing System manuals for BASIC and FORTRAN.
Note $: I don’t own (or possess) any of the FORTRAN standards after 1978 (dubbed “77”). Before I dipped a toe into the programming side of climate-warmist controversy, their absence was a feature of my technical library, not a deficiency. Back when I owned such documents, both ANSI and ISO charged fees for them that were equivalent to the combined monthly total of my electricity and phone bills. Maybe there are usable references on line nowadays, but I’ve not previously had any need to look.
CompuGator says (3 August 2013 at 3:32 pm):
Please note that I use “parsing” in the formal computer-science sense of the word; I have always rejected the technically ignorant misuse of the word by the mainstream news media (e.g.: the scandal of Pres. Clinton and the White-House intern).
Is there really no way to provide some kind of comment-preview for WordPress blogs? For the comment I previously submitted, I physically inserted it into an XHTML file off line, and used an old version of a browser to verify that I properly closed all my opening tags. I’d made unusually extensive use of tags to emphasize specific single characters, typically arithmetic operators and parentheses, and it would’ve been too easy to foul up the nesting. Any single error would’ve completely ruined my intended effect, and using the XHTML skeleton file did enable me to catch and fix one.
@CompuGator: Computers can't tell the difference between human speak and computer instructions. Lol, a very simple-minded machine; it only does exactly what it is told. I try to keep my comments simple, with no embedding of fancy characters or formatting. Easy for a simple-minded person to do. ;-) but it requires the recipient to think a bit more.
Still, I am delighted to read your comments on the gist of the problem. pg
Re: [FORTRAN Expression-Parsing Surprises]
This past Saturday, I tried to post all of the presentation, to which I've now assigned the previously merely-implicit subject shown above, to this topic (https://chiefio.wordpress.com/2013/05/19/gistemp-no-new-source-code/):
[2013-08-03 ~11:08 EDT; ~7:04 GMT] Well, why the (expletives deleted!) not!? Grrr!
After much gnashing of teeth, I finally chopped it down to something WordPress would accept (3 August 2013 at 3:32 pm).
It’s quite possible that Chiefio’s regular readers aren’t an audience well matched to the details of wrestling with compilers. Be that as it may, here below, I hope, will appear The Rest of the Story:
Because x**(-2) is formally the same as 1/( x**2 ), my example expression (3 August 2013 at 3:32 pm) might best be rewritten as (1/( x**2 ))*y, which would preserve left-to-right (ordering of) evaluation, or as ( y/( x**2 ) ), which would not (while saving the time that'd be needed for 1 multiplication), in case order doesn't matter for the values being calculated. Did anyone hear a hushed "Uh, oh", evoked by an unspoken doubt: Are you really sure that it "doesn't matter" to your code?
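To make that "are you sure it doesn't matter" doubt concrete, here is a sketch in Python standing in for Fortran (Python's ** behaves the same way for these cases; the function names are my own) of the example expression and its two rewrites:

```python
def original(x, y):
    return x ** (-2) * y         # exponentiation binds tighter than *

def rewrite_keep_order(x, y):
    return (1.0 / (x ** 2)) * y  # preserves left-to-right evaluation

def rewrite_reorder(x, y):
    return y / (x ** 2)          # saves one multiplication

# With powers of two every intermediate value is exact, so all three agree:
print(original(2.0, 8.0), rewrite_keep_order(2.0, 8.0), rewrite_reorder(2.0, 8.0))

# For arbitrary values the two rewrites round differently: (1/x**2)*y rounds
# a division and then a multiplication, while y/x**2 rounds a single division,
# so the two results can differ in the final bit.
print(rewrite_keep_order(3.0, 7.0) == rewrite_reorder(3.0, 7.0))
```

Whether that final comparison prints True or False depends on how the two roundings happen to fall, and that sensitivity is exactly the point of the hushed "Uh, oh".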
The discussion on expression-evaluation in the preceding paragraph doesn’t depend on the value of the exponent, except that it’s limited to a negative-constant exponent.
Of course, if the absolute value of the exponent were really 2 (instead of some greater integer), generating the multiplication ( x*x ) would almost always be preferable to the exponentiation operation. Depending on instruction timing, for greater exponents, e.g. ( x**3 ), the analogous generation of the multiplication ( x*x*x ) might also make good sense.
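As a sketch of that strength-reduction idea (again in Python rather than Fortran; the helper names are mine):

```python
import math

def square(x):
    return x * x        # one multiply in place of a general x**2

def cube(x):
    return x * x * x    # two multiplies in place of x**3

# For these exactly representable values the multiply chains match the
# exponentiations; in general the cube chain rounds twice where a good
# pow() rounds once, which is where the numerical-analysis guru comes in.
for x in (1.5, 3.0, 10.0):
    assert square(x) == x ** 2
    assert math.isclose(cube(x), x ** 3, rel_tol=1e-15)
print("ok")
```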
But at some point, the compiler writer really ought to consult the local numerical-analysis guru, to guard against fouling up an application-programmer’s intended calculation via excess compiler cleverness. Perhaps there even ought to be a compilation option to rein in that compiler-writing tendency: “
P.G.Sharrow says (1 August 2013 at 2:50 am):
The rationale of the climatologist “rising stars” that you’ve summarized for pursuing support from supercomputers might, I suppose, have allowed them to get lucky, by revealing patterns that were too grand in scope for any single brilliant mind to recognize. But that would’ve been only if brilliant–and perceptive–minds had been able to identify practically all the crucial influences on climate, and accurately assess the importance of each.
Indeed, for computers to be useful in producing valid conclusions or guidance, they or brilliant-minded colleagues must be able to encode the quantitative influences & importance numerically. In this case, not only applying accepted principles of physics and chemistry, but also enough numerical analysis to avoid numerically ruined computations.
As you’ve already written, they need to understand the problems and solution first, then design & code second. Validate their input data; give special early attention to formatting errors. Validate their output data; figure out in advance how they could test whether their computations–including the whole model–have gone wrong, and put those tests into their software. So test, and keep testing.
Yet the politically-correct climatologists seem not to have the necessary software-applications skills, nor to’ve had assistance from colleagues who do. From what I remember of the “HARRY README” fiasco, I got the impression that much of their software was grad-student quality coding. I’ve routinely staffed a university help-desk for grad students who were “early adopters” of programming–usually FORTRAN–and they typically fit a profile:
- Arrogance that anyone who isn't brain-dead can write FORTRAN programs, and certainly they can, because they are graduate students;
- indifference to code quality, accepting computations that "seem to work" well enough to fit the student's draft thesis or dissertation;
- flimsy, um, design favoring–if not reliant upon–preconceived situations, neglecting logic paths that'll make code go BOOM! when reality refuses to coöperate;
- severe oversights or shortcomings in organization, error-handling, and coding style;
- and of course, nearly no documentation of a kind that'd be necessary to give a fighting chance to any poor soul assigned to correct it or "build on it" (AIYEEE!) after its original author has departed.
I’ll leave opinions about the politically-correct climatologists’ skills in physics and chemistry to others whose academic performance in those fields was more, um, exemplary than mine. Could they be as callous about the standards of those fields as they appear to be about software?
Absent the complete set of crucial skills, they’ll only get reams & reams of print-outs, or cylinders of hard disks, full of floating-point numbers, and purty-colored movin’ pitchers, quite possibly totally unrelated to reality, altho’ produced with blinding computational speed at great public cost.[#] Before anyone misuses them to make public policy, that is.
Note #: Whoo-whee! More dollars or euros of fundin’ filling climatologists’ purses than the Okefenokee has (mo)skeeters!