Here are a few graphs that look at stations in “High Cold Places”, both in the GISS baseline and in the last year of data in GHCN v3.3, for whatever it may mean.
First the GISS baseline of 1950-1980:
Then the final year of data in GHCN v3.3:
Then comparing the two:
It looks like the South American stations moved farther south in the Andes, while those near China moved farther north.
They essentially lost New Zealand and a large chunk of Africa. Overall, they just reduced the number of stations in “High Cold Places”, which have higher volatility (a greater range of temperatures) than low places near the beach…
The Midrange
Just for grins, I’m adding a graph of the changes in the Midrange altitudes: over 100 meters and under 1000 meters. That’s below most high cold places (1000 meters is roughly 3,300 feet) and above the warm coastal areas where most people live.
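For anyone who wants to replicate the banding, here is a minimal Python sketch. The column positions are what I recall from the GHCN v3 README (elevation in columns 32-37, -999.0 for missing), and the file name is just an example, so check both against your own copy of the .inv file:

```python
# Bin GHCN v3 stations into the three elevation bands used in these graphs.
# Column positions follow my recollection of the GHCN v3 README -- verify
# against your own copy before trusting the output.

def bin_stations(inv_path):
    bands = {"beach (<100 m)": [], "midrange (100-1000 m)": [], "high (>1000 m)": []}
    with open(inv_path) as f:
        for line in f:
            elev = float(line[31:37])      # STNELEV in meters (columns 32-37)
            if elev == -999.0:             # missing-elevation flag
                continue
            stn_id = line[0:11]            # station ID (columns 1-11)
            if elev < 100:
                bands["beach (<100 m)"].append(stn_id)
            elif elev < 1000:
                bands["midrange (100-1000 m)"].append(stn_id)
            else:
                bands["high (>1000 m)"].append(stn_id)
    return bands

# Example run (the file name is hypothetical):
for band, ids in bin_stations("ghcnm.tavg.v3.3.0.qcu.inv").items():
    print(band, len(ids))
```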
It does look to me like there’s a general move inland and toward drier areas. South America looks like it is moving downslope of the Andes into the rain forest, while Africa snuggles up closer to the southern edge of the Sahara. North America is a bit too dense to see through all the blue baseline overlay, but it looks like a general move southward and into “flyover country” – the hotter, drier places.
Then Europe picks up more Spain and a bit of what looks like Poland. Russia is a bit of a mess with things just moving around. India looks to be moving out of the cooler mountain slopes and toward the warmer south. Then there’s nearly nothing Out To Sea… Even New Zealand is gone.
The Beach
Well, might as well toss in a “below 100 m” graph. That’s where most of the people live, most of the urbanization is happening, and where water does the most to moderate temperature swings. Just for comparison with the other graphs, if nothing else.
We do get the islands and oceans back again. New Zealand shows up. There’s a lot of baseline blue stations along all the coasts. Then lots of change in the high northern hemisphere and low southern hemisphere.
I’m not sure exactly what all it means, but what is very clear is that there’s a whole lot of screwing around with the instruments going on.
It is my opinion that there is simply no way (anomalies or not) that you can do valid calorimetry with that much instrument change. Asserting it is accurate to the 1/10 C range when the measurements are made in whole degrees is just daft. The error bars exceed the adjustments, and the adjustments exceed the actual trend in the data.
In Australian BOM temperature data for the last century, there is a marked difference between coastal and inland stations. The variance of the inland sites is much greater. This is best shown with maximum temperatures. I have done first differences of the annual data for 57 of the longest, more complete stations. I have defined inland as greater than 100 km from the ocean, broadly measured on Google Earth. The standard deviation of the first difference is 2 to 3 times larger for inland sites than coastal. Among other effects, this should cause thought about using the anomaly method.
Investigations continue. At this stage I do not know if the cause is instrumental or weather related and it is proving hard to find good ways to dissect this out. Geoff.
@Geoff:
I have a thesis that it is the suppression of cold volatile stations (like those up the mountain) and the addition of hot volatile stations (like inland deserts) that accounts for all the bogus “trend” found when they are used to back-fill cold times (in the baseline) and hot times (now). I’m slowly working up ways to demonstrate that (this stuff is the baby steps at the start…)
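Something like this minimal sketch is the sort of comparison I have in mind, using Geoff’s metric from above (the standard deviation of year-to-year first differences of annual Tmax). The series here are made-up placeholders, not real station data:

```python
# Volatility comparison via the SD of first differences of annual Tmax.
import statistics

def first_diff_sd(series):
    """Standard deviation of the year-to-year first differences."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    return statistics.stdev(diffs)

# Placeholder annual Tmax values in deg C -- not real data.
coastal = [24.1, 24.3, 24.0, 24.4, 24.2, 24.5, 24.1]
inland  = [31.2, 33.8, 30.1, 34.0, 29.5, 33.1, 30.8]

print("coastal SD of first differences:", round(first_diff_sd(coastal), 2))
print("inland  SD of first differences:", round(first_diff_sd(inland), 2))
```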
FWIW, it is well known that low humidity places are more volatile (the classic burning hot desert with freezing nights…) while very humid (e.g. coastal) areas are range suppressed. Like Florida, where at about 85 F the thunderstorms sprout and all that water takes the heat to the tropopause, returning hail to the ground and cooling us off.
I intend to do an A/B comparison of some places at the same latitude but with different humidities, to demonstrate that it isn’t just sunlight levels, and certainly not CO2 (being a “well mixed gas”, it can’t account for the difference between Phoenix and Atlanta temperatures). (I did some charts on this some years back, but need to update them with more data…)
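As a first cut at finding candidate A/B pairs, something like the query below would do. It is only a sketch against a hypothetical SQLite copy of the inventory (the database, table, and column names are placeholders, not my real schema), using east-west separation at the same latitude as a rough stand-in for the humidity difference:

```python
# Sketch: find station pairs at nearly the same latitude but well separated
# in longitude, as candidates for a humid-vs-dry A/B comparison.
# "ghcn.db" and the "inventory" table/columns are hypothetical placeholders.
import sqlite3

conn = sqlite3.connect("ghcn.db")
pairs = conn.execute("""
    SELECT a.stn_id, a.name, b.stn_id, b.name, a.latitude
    FROM inventory a
    JOIN inventory b
      ON ABS(a.latitude  - b.latitude)  < 0.5    -- same latitude band
     AND ABS(a.longitude - b.longitude) > 10.0   -- well separated east-west
    WHERE a.stn_id < b.stn_id                    -- list each pair once
    LIMIT 20
""").fetchall()

for row in pairs:
    print(row)
```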
There is a very real possibility that “Global Warming” is just finding changes in global humidity…
My efforts to understand CAGW (Catastrophic Anthropogenic Global Warming) were based on GHCN v2.
https://diggingintheclay.wordpress.com/2010/12/28/dorothy-behind-the-curtain-part-1/
During my visit to Asheville Tom Peterson gave me access to GHCN v3 and some data from automated weather stations close to the GISP drilling site.
https://diggingintheclay.wordpress.com/2010/12/30/dorothy-behind-the-curtain-part-2/
I had great difficulty processing the files obtained (TMI). My spreadsheets simply crashed whenever I loaded the complete data. I decided to use a database program to select only high latitude stations, and this resulted in a file I could handle on my laptop.
Something similar happened when I tried to load the “Level 3” data from the Diviner Lunar Radiation Experiment. This time my computer skills were useless but I was rescued by Tim Channon who sent me a copy of a file published by Ashwin Vasavada:
https://tallbloke.wordpress.com/2014/04/18/a-new-lunar-thermal-model-based-on-finite-element-analysis-of-regolith-physical-properties/
The common theme here is that I am awestruck by Chiefio’s ability to handle files that crash my little laptop. I have the data but my ability to process it is severely limited!
@GallopingCamel:
Not to mention I’m doing it on a Raspberry Pi that costs $35 and is much less powerful than any Intel based modern laptop … Just by not using Microsoft and their spreadsheets… ;-)
The only real “magic” is using Linux and a decently efficient SQL database system. You can do it too, as I’ve published what I’ve done as I do it. I’m also happy to answer questions if you get stuck trying it.
Given you have a laptop, just booting a USB based Linux ought to be enough. Do note that I have a couple of GB of “swap” on a real USB disk. At present 150 MB of it have been used while making the graphs for my latest posting. With both a browser and MariaDB running, swap ran up to 250 MB.
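If you want to watch swap while a big query runs, a few lines of Python against /proc/meminfo will do it (Linux only; the numbers there are in kB):

```python
# Report current swap usage by reading /proc/meminfo (standard on Linux).

def swap_used_mb():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":")
            info[key] = int(value.split()[0])   # value is like "  2097148 kB"
    return (info["SwapTotal"] - info["SwapFree"]) / 1024.0

print("Swap in use: %.0f MB" % swap_used_mb())
```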
Other than that, it doesn’t crash but does take a few minutes to run through the whole database.
Now you know why I run Linux or BSD Unix…
Dr Smith,
Thank you for your musings on the observation of greater variance of inland temperatures.
In the case I presented, the variance is about year to year differences of annual Tmax. This eliminates many first impression responses. It is not an easy problem, at least until somebody has the correct Eureka moment that is certainly eluding me. It almost seems as if coastal and inland observations are not measuring the same physical parameters and that care should be taken in their comparison. Geoff
@Geoff:
They are not measuring the same phenomena.
Compare San Diego (about 72 F +/- 5 F year round, it seems) to Phoenix inland. Phoenix hit a max of 126 F when I was there one August… and in winter it can even snow (rare, but it snowed the last time I drove east through there…)
Or San Francisco with Sacramento (about 60 miles due east). In the middle of summer it gets HOT in the Central Valley. I grew up there. When it hits over about 100 F the hot air REALLY wants to rise. This pulls in fresh air from the coast and pulls a fog blanket over San Francisco that can end up below 50 F. I once drove from San Jose (55 miles south of San Francisco) up to The City in August. It was 105 F when I left San Jose (mountains between it and the ocean) and it was about 59 F when I got to San Francisco. Just about froze my butt off as I’d only brought a small windbreaker… It was still in the 90s F that evening when I got back to San Jose…
In the case of SFO, not only was the temperature much lower, but the movement is the inverse of the inland temperatures. SF can be 75 F and pleasant if it isn’t too hot inland and the “rise” has not set in. FWIW, I think I’ve observed a 3 day cyclical pulse to it. It takes 3 days to cool off the inland area enough to stop the air movement, then it heats up again and repeats…
@Chiefio,
“Not to mention I’m doing it on a Raspberry Pi that costs $35 and is much less powerful than any Intel based modern laptop … Just by not using Microsoft and their spreadsheets… ;-) ”
That blows my mind. I have been running Excel on a “Quad Core i5” with an SSD, which should have much more CPU power than a Raspberry Pi, to capture (and fix) Roy Spencer’s lunar model and the Robinson & Catling 2012 & 2014 models. I have run into instability problems that I don’t understand. Could Excel be causing the instability?
I will send you the relevant files “off line” in the hope that you can offer some insights.
@Chiefio,
“Now you know why I run Linux or BSD Unix…”
Currently I use Linux Mint 19 with the “MATE” GUI. A simpleton like me needs a GUI. Does BSD Unix have one?
BSD has most of the same GUIs available as Linux does, plus a few that never made the jump to Linux. It is mostly X based. LXDE and XFCE are common. GNOME will probably be a loss as it is getting tightly tied to systemd, and no, BSD will not go to systemd.
The real problem is that the BSDs typically come as a DIY build without a pre-installed X or GUI, so “some assembly required”. There are some builds for PCs that come with a GUI pre-installed (but that doesn’t help me on an ARM chipset).
Most BSD builds are used as headless servers, so it kind of makes sense (in a self-fulfilling prophecy sort of way…), and the desktop users tend to be Unix Engineers who will rip out whatever you ship with and install what they like more… But slowly it is dawning on some of the BSD folks that this is not good for their long-term market share…
Linux Mint is just fine…
Per model instabilities in spreadsheets:
They were never designed to run models. They were designed to do accounting. I’m impressed folks can get them to do model-like stuff at all. Microsoft’s, especially.
I do know how it works (I created “the spreadsheet from hell” to track a few hundred projects and their status with automatically updating fields, but that was a few years ago…). At some point the code that traverses all the variables, links, references, etc. will start to have memory- or precision-induced data instabilities. (In computers, once a value gets too large or too small “bad things happen”, and I’d wager good money nobody checks for that in their spreadsheet models. Heck, they mostly don’t check for it in their “real” models…)
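Here’s a tiny Python demonstration of the sort of thing I mean; spreadsheets sit on the same IEEE double-precision floats underneath:

```python
# Three ways double-precision arithmetic quietly misbehaves.

big = 1.0e16
small = 1.0
print(big + small == big)   # True -- the 1.0 vanished entirely

# Catastrophic cancellation: subtracting nearly equal numbers
# throws away most of the significant digits.
print(1.0000001 - 1.0000000)   # not exactly 1e-7

# Overflow just becomes infinity, with no error raised by default.
print(1.0e308 * 10)            # inf
```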
Not sure I can be of much help with your problem as at present I don’t have a working Windoz box set up anywhere. I can probably cobble something together, but any Excel will be ancient…
@GC, you can send me your files and I will see if I can load them into MS stuff. My name in the link at gmail dot com. (I don’t care if spam goes there ;p)
@Chiefio,
“Not sure I can be of much help with your problem as at present I don’t have a working Windoz box set up anywhere. I can probably cobble something together, but any Excel will be ancient…”
I don’t have a Windoze box either. LibreOffice runs nicely with my Mint 19 OS and it can handle files from the 2013 version of Excel that includes a number of amazing features.
So what could possibly be “Amazing” about Excel? You may not think it amazing, so perhaps it would be better to say how surprising it is to find the “Gamma” function and the “Delta” function along with humdrum functions like ERF(x). All of these things are needed for capturing Robinson & Catling’s atmospheric models.
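For what it’s worth, the same functions exist outside of Excel too. Python’s standard math module has gamma and erf built in, and Excel’s DELTA (a Kronecker delta equality test) is a one-liner; a minimal sketch:

```python
# Python equivalents of the Excel functions mentioned above.
import math

def delta(x, y=0):
    """Kronecker delta, as in Excel's DELTA(number1, [number2])."""
    return 1 if x == y else 0

print(math.gamma(5))   # 24.0, since gamma(n) = (n-1)!
print(math.erf(1.0))   # about 0.8427
print(delta(3, 3))     # 1
```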
@cdquarles,
The Excel model files for the Moon and seven worlds with significant atmospheres were sent a few minutes ago. The files do NOT include Mars.
Ok