Eureka Canada Graphs

As a recent “issue” has broken out about Eureka, Canada having bogus data in some recent records, I’ve decided to make a set of graphs for the data: temperatures directly; monthly anomalies with the yearly running total; and finally, the running total of monthly anomalies.

The “anomaly” graphs are made from early 2010 data, while the temperature series was from late December 2009 (simply because those are the sets I was working on in each development process, so that’s what’s set up to run). In any case, one or two months of data at the end of the series will not be significant. For the ‘anomaly’ graphs, I’ve used the “Blended Duplicate Number” data to give the most accurate anomalies possible.
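
For anyone who wants to reproduce the basic arithmetic at home, here is a minimal sketch (in Python, with made-up numbers standing in for the real GHCN records; the actual graphs came from my own scripts against the blended duplicate data) of anomalies against a 1951-1980 per-month baseline, followed by the per-month running total:

    import numpy as np

    # Toy stand-in for a station record: rows are years, columns are months.
    years = np.arange(1948, 2010)
    rng = np.random.default_rng(42)
    temps = rng.normal(loc=-19.0, scale=2.0, size=(len(years), 12))

    # Per-month baseline means over the 1951-1980 interval.
    in_base = (years >= 1951) & (years <= 1980)
    baseline = temps[in_base].mean(axis=0)    # one mean per calendar month

    # Monthly anomalies: each month compared to its own baseline mean.
    anomalies = temps - baseline

    # Cumulative ("running total") anomalies, kept separate per calendar month.
    cumulative = anomalies.cumsum(axis=0)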

As usual, you can click on a graph to get a larger version.

Eureka Temperatures

[Graph: Eureka Temperatures]

Some months rising, some falling. Notable is how the recent end of some months rises above the trend line. That is an indication of recent higher temps (and it potentially flags those months and years as having a high risk of dodgy data).

Generally it looks like summer is flat. The months with the most ‘rise’ are late fall and early winter. I’m also intrigued by the data dropout in 1996 (that results in the annual average taking a dive).

Eureka Monthly Anomalies

[Graph: Eureka Blended Duplicates Monthly Anomalies]

Not as much to see as I would have liked. There are some oddities in the middle of the ‘baseline’ interval of 1951-1980, with some odd spikes and dropouts. The biggest thing I see is that the ‘baseline’ time period was oddly cold and volatile. Recent data have the typical compressed volatility. The bulk of the ‘rise’ of the dT line comes after the 1990 changes of process; prior to that it’s a flat series with some volatility. All I see in this graph is some crappy data artifacts and some recent warming via process changes.

Eureka Cumulative Monthly Anomalies

[Graph: Eureka Blended Duplicates Cumulative Anomalies]

March and May falling. April with a gentle rise. January and February nearly flat, but with some rise. June and July rising about 1 C, but August dead flat. Then comes September. We start ‘warming’. October, November, and December too. For December it’s a bit over 3 C of “warming”.

So what happens in Sept to Dec? Oh yeah. Winter comes and folks are not acclimated yet. Wonder what the fuel burn profile is for Eureka… perhaps highest at the onset of winter?

That would also be the time folks were more likely to drop an “M” for minus from the temperature report. After the core of winter, they would be more likely to remember to include it (having done nothing else for months…).
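
To put a rough number on that hypothesis, here is a quick back-of-the-envelope sketch (toy values only, not actual Eureka reports) of what a single dropped minus sign does to a monthly mean:

    # One "M" (minus) dropped from a single daily report in a 30 day month:
    true_daily = [-20.0] * 30        # a plausible early-winter month, in C
    bad_daily = true_daily.copy()
    bad_daily[0] = +20.0             # "M20" keyed in as "20": sign flipped

    true_mean = sum(true_daily) / len(true_daily)
    bad_mean = sum(bad_daily) / len(bad_daily)
    print(true_mean, bad_mean, bad_mean - true_mean)
    # -> -20.0, about -18.67, about +1.33
    # One flipped sign warms the whole month by roughly 1.3 C.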


11 Responses to Eureka Canada Graphs

  1. pyromancer76 says:

    Thanks for all your work — looks “easy” for you — to keep ’em honest. Wish you would add your 04:24:16 4/23 post at WUWT here, too. The one that states: “Look, if you’re going to play ‘climate scientist’ you really must learn all the tricks of the trade.” Excellent summary.

    Have you changed your “tip jar – buy me a beer” maximum contribution yet? Your site is ready for “subscriptions”, don’t you think?

  2. E.M.Smith says:

    @Pyromancer76:

    Well, you’ve been reading here long enough to know how hard it was for me to come up to speed on making graphics ;-)

    (At least, making them without the tools I’d learned and used a couple of decades ago. I do wish folks would stop moving my cheese ;-)

    Yeah, I kind of liked that comment at WUWT too… I think with a bit more ‘sprucing up’ it could become a posting.

    On the Tip Jar: I have a list of things to do with it, but greed has never been my strong suit, so it ends up at the bottom of the list. So before I “do it right” I need to get back to putting some trades on and making another trading / finance posting. So much to do, so little motivation 8-|

    Oddly, while there are times that trading is energizing and full of adrenaline and just makes the passion flare up, there are also long boring stretches where it’s just dreadful drudgery. So I’d chucked some money into oils and shipping (with emphasis on high dividend payers), more into retail (with emphasis on high end), and some other odds and ends (like insurance companies and the BRKA / BRKB conglomerate), and then what do you do? Oh, and the overseas and emerging market bond funds and mixed bond / stock funds.

    So I could go stir the pot a lot, or I can just let it sit there cranking out modest gains. So I let it sit, and it’s done “ok”. And you just end up torn between a small desire to optimize for more gains and knowing that you are as likely to just lose that “gain” to churn as anything else.

    Sometimes it’s better to come in a solid 2nd place than to try to get the extra 1/10000 second to come in first (and die trying…)

    And it’s very hard to be motivated about working harder to achieve a more solid 2nd place finish 8-}

    But things change. We’ve had some “issues” floating to the top lately (like Obama and the Dems deciding to go whack their major contributors, the finance industry, again). And I really need to address them. A review of positions yesterday showed some clearly getting ‘long in the tooth’. So before I do maintenance work on the tip jar I need to do my financial homework. (And then I need to write a paper I’m presenting next month… and then… )

    So “Thank you for your support”, and I’ll try to get my motivation up for doing “money maintenance”.

  3. E.M.Smith says:

    For anyone not aware of that comment:

    E.M.Smith (04:24:16) :
    Zeke Hausfather (21:44:01) : To me at least the results appear indistinguishable:

    That’s because you didn’t do the “homogenize” and “Grid / Box” steps as GIStemp does. So first take your Eureka temps and spread them 1000 km in all directions as ‘fill in’ and “homogenizing” to any stations missing data or that were discontinued after the baseline. THEN take those and spread them another 1200 km into “empty” grid boxes. I make that about 2200 km RADIUS of influence. That’s how GISS does it. And that’s why the GISS graph has a small box for Eureka (the first image up top with a mostly grey arctic Canada) but then the whole thing turns blood red when you smear the data around ala GIStemp.

    Look, if you’re going to play ‘climate scientist’ you really must learn all the tricks of the trade. Try reading Hansen’s papers for starters. “The Reference Station Method” and “Optimal Interpolation” would be good search terms to start with. For advanced study, read the GIStemp source code. I know where you can read it on line…

    So, go back to your map, and draw a 1200 km radius circle around Eureka. That is the MINIMUM area over which it will directly be used to fabricate the Grid Box anomaly. Now draw a 1000 km radius circle. Any OLD stations in that radius will be homogenized with Eureka (now, we don’t know how many or how much). Then put a 1200 km radius around each of THEM. That’s the ultimate “reach” of the data. Well, maybe not the ultimate ultimate… I did leave out one additional ‘reach’ step… After the 1000 km ‘homogenize’ there is an added 1000 km ‘UHI’ adjust. To the extent it’s ‘backwards’ you could get bogus warming from it. Not that that ever happens. Well, not more than 1/2 the time…. Then after the UHI ‘correction’ it goes to the Grid / Box step. So in theory you could chain this out to 3600 km radius. But I’m sure that rarely happens. After all, you would need to have nearly no other stations nearby, since the code uses the closest stations first. And I’m sure there must be dozens of stations ‘up there’… What, only one you say? Who knew?…
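
    To give a feel for the arithmetic of that ‘reach’, here is a minimal sketch (my own illustration in Python, not GIStemp’s actual FORTRAN) of the linear distance weighting the Reference Station Method describes: full weight at the station, tapering to zero at 1200 km:

        R_KM = 1200.0   # influence radius used in the grid / box step

        def weight(distance_km):
            # Hansen-style linear taper: weight 1 at 0 km, 0 beyond 1200 km.
            return max(0.0, 1.0 - distance_km / R_KM)

        def box_anomaly(stations):
            # stations: list of (anomaly_C, distance_km) pairs for one grid box.
            pairs = [(a, weight(d)) for a, d in stations if weight(d) > 0]
            total = sum(w for _, w in pairs)
            return sum(a * w for a, w in pairs) / total if total else None

        # With only one station 'up there', its anomaly IS the box value,
        # no matter how far away it sits (as long as it is within reach):
        print(box_anomaly([(2.5, 900.0)]))   # -> 2.5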

    So just remember, this is ‘climate science’. You can’t expect to apply simple mathematics to it and find the warming influence. So your ‘toy world’ experiment was doomed from the beginning.

  4. Rod Smith says:

    @E.M. I think Thule AFB is less than 500km from Eureka. Is Thule ignored by the “Climate Scientists”? Is that why Eureka has such a wide influence? I’m confused! (There are also several automatic stations fairly close to Eureka, one on the same island.)

    REPLY: [ There are thousands of stations world wide, in all countries, that are not in the GHCN at all, or are only in it during a past part of history, yet are reporting perfectly fine data every day. There are a variety of excuses given as to why they are ignored or dropped, but, IMHO, none of them is satisfactory. For example, “They don’t issue CLIMAT reports”; yet Bolivia (for whom I got this excuse) and Papua New Guinea are both missing from the GHCN and both issue CLIMATs that are available at Ogimet. Basically, IMHO, NOAA / NCDC, who create the GHCN, choose which stations to use and which to ignore. And they ignore a lot. So they are not in the GHCN. Most “climate scientists” use the GHCN. Connect the dots… -E.M.Smith ]

  5. Dave McK says:

    You are amazingly productive. I love the separated months. Whenever you get a herd of computers at your command, would you try some for particular hours?
    I actually did one of individual solar-day temps for Sweden (but it was animated), and you can really see just how chaotic things can be – and how regular, too.
    The low-lying areas all tracked together quite well, usually, except those on the sea, which don’t fluctuate so much. The higher elevations could be very extreme.
    You could see a cold front sweep through.

    At WUWT, Michael D Smith (04:52:13) :

    “I did a very similar analysis in 2008 by using sine waves and solved for best RMS error”

    He said he’d try to find his work to post – I’d love to see it.
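
    For anyone curious what that kind of fit looks like, a minimal sketch (my guess at the general approach, not Michael D Smith’s actual code): fit an annual sine plus a linear trend by least squares, which is exactly the ‘best RMS error’ criterion:

        import numpy as np

        def fit_annual_sine(t_years, temps):
            # Columns: offset, linear trend, annual sine and cosine terms.
            w = 2.0 * np.pi                     # one cycle per year
            A = np.column_stack([np.ones_like(t_years), t_years,
                                 np.sin(w * t_years), np.cos(w * t_years)])
            coeffs, *_ = np.linalg.lstsq(A, temps, rcond=None)
            return coeffs   # offset, trend (C/yr), sine & cosine amplitudes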

  6. Rod Smith says:

    Thank you E.M., I suspected as much, but I thought Thule was worth mentioning. If you select the contents of a bouquet correctly, you can make it smell as sweet as you want!

    When I was collecting weather data for the USAF, almost every scrap of data, no matter what the source, was fair game. But this was for everyday, real, working forecasts and analyses. (We did not collect data from the GHCN, but as far as I know that data was not available in real / near-real time.)

    OT, but a chuckle. I remember an exercise involving a Thule deployment with the clever name of “Well Digger.”

  7. E.M.Smith says:

    @Dave McK:

    Thanks! I try to be productive. If I was able to do this full time I’d get even more done ;-)

    But “gotta eat” so some of my time goes elsewhere and chases money…

    I started this work using an old “white box PC” (that I’ve made fun of on many occasions). It started life as an x486 box and got a motherboard upgrade about 15 years ago to a 400 MHz AMD chip (and all of 132 MB of memory). I like to say it was to be “period correct” for when GIStemp was being written.

    That is true, but the major reason I’d chosen it was that it was running Red Hat Linux already, so had the right tools on it AND it was part of a Beowulf Cluster I’d built once “just for fun”. I’d expected it to take some scale of minor “supercomputer” to run this “climate code” folks were always talking about. So I dug one “node” out of the garage and set it up.

    Fully expecting that I’d need to get at least the original 1/2 dozen nodes going… Turns out that just one old slow node ran the whole thing in about 20 minutes, so I didn’t need any more than that…

    For an interesting similar story of a ‘do it yourself supercomputer’ see:

    http://www.extremelinux.info/stonesoup/

    So I figured if it was good enough for Oak Ridge National Laboratory it was good enough for me ;-)

    But back to your point…

    It’s not the compute power that’s the issue, it’s the time and focus to do it.

    You see, I need to make money, so I trade my stock account. It’s a living, but it takes a fairly large amount of time and mental energy. If I had a salary (like all that money the oil companies are supposed to be tossing around in the dreams of The Warmers) then I could have an office set up and work on this 8 hours a day instead of the ‘couple a day’ I can devote now. And that is what “rate limits” and prevents things like exploring the hourly and daily data. Not the computes.

    In fact, thanks to some greatly appreciated donations, I’ve got 3 more machines now and the “White Box” is idle. (A “retired white box posting” is a ‘someday project’… but I can’t quite bring myself to make the posting and admit that the Old White Box is destined for the garage again…)

    So I’m typing this on a much nicer and much faster Windows machine used for composition and connectivity, courtesy of WUWT, and I’ve got an HP Vectra with Red Hat doing the GIStemp and analysis runs (thanks all!), then there is a Mac with a BigEndian processor in it that does Open Office / graphics / print spooling and is being prepped for the “someday” attempt to run the BigEndian Step 4_5 of GIStemp. (Courtesy of Ruhroh)

    I still like to think of the White Box as a ‘backup archive’… even though the truth is that it’s “shut off and left as it was when shutdown”… which is an archive of a sort, I suppose…

    @Rod Smith: You are most welcome. Yeah, one of the great “someday projects” is to find ‘nearby’ stations not used by the GHCN and do A/B and QA checks with them… Thule would be good for that (and if we’re lucky, some enterprising reader will be inspired and follow up on it ;-)

    Like the imagery of selecting a bouquet ;-)

    Yeah, lots of flower picking going on in Canada…

  8. Dave McK says:

    I love a 486. A 486 runs Hubble and is the last space-certified CPU I know of.
    For a little box in a dirty shop- they don’t need a fan- that’s huge. I have at least 20 handheld thingies to patch up and use for running CNC machines, whenever I’m extracted from my purgatory in the north.

    Well, when I did the animations it took several days to strip and sort the data files on a 3 GHz Core Duo. It took much less time to code the whole mess, actually…

    They look ok on full screen, but YouTube compressed them and they are hardly worth watching that way. The actual size of the bitmap of Sweden was about six screenfuls or so, which really suffered being crunched to 640*480…
    Well, I only did it cuz I have no real work and Jones had just said Sweden wouldn’t release the data, you know… and it was an excuse to do coding, which is cheap entertainment… lol

    One hopes for things to pick up when the retirement tsunami starts. It’ll be a migration unlike any other and should churn up a lot of inflated USD, eh?

    one of the Sweden Temp animations is here:

  9. Dave McK says:

    OH- with respect to that vid- one thing that is very plain is that if grids were to be made, they should be made based on the geography, because lat/long boxes are arbitrary and span coastline and mountaintop – which contradicts the actual relationship of one site to another.
    Sweden, for instance, has several stations on the coast or on tiny islands that are very stable, while the mountaintop ones – you could possibly throw a rock from one station on top of the mountain to the one below the cliff by the shore – do not reflect similar influences. They don’t track; they are completely unrelated and never exposed to the same conditions.
    On the plains, you can watch a cold wave sweep through as the stations go blue, while on the mountains they flicker with every gust and the coastal ones have the inertia of the sea.

    Facts are always black and white when you look close enough to see the dots.

  10. E.M.Smith says:

    @Dave McK: Nicely done. Yeah, that elevation and proximity to water thing was one of the earlier lines of investigation that showed lots of locational bias in the data over time. The warmers proceeded to chant that it didn’t matter because it was all done with anomalies… But as we saw with the Marvelous Marble Bar posting, “a splice is still a splice” and you can make a steeper trend by splicing together disjoint anomaly trends just fine…
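
    A toy illustration of that splice effect (made-up numbers, just to show the mechanism): two perfectly flat station records at different absolute levels produce an apparent warming trend once the anomaly segments are joined end to end:

        import numpy as np

        # Two flat series at different levels (think valley station vs coast).
        early = np.full(30, 10.0)    # station A, 1951-1980: constant 10 C
        late = np.full(30, 12.0)     # station B, 1981-2010: constant 12 C

        # Anomalies against a common baseline taken from station A's era:
        spliced = np.concatenate([early, late]) - early.mean()

        # Each piece is dead flat, yet a straight-line fit through the
        # spliced anomalies shows a spurious warming trend:
        slope = np.polyfit(np.arange(60), spliced, 1)[0]
        print(round(slope * 60, 1), "C of apparent 'warming' across the record")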

    The notion that you can replace the station at Truckee, California (presently having 100 mph+ winds and a winter storm) with one near the beach in Los Angeles, Santa Maria, San Diego, or even the only other surviving station in California, the San Francisco Airport, is just crazy.

    (FWIW, it is cool and wet today in the S.F. bay area, but not at all like the Truckee / Reno winter storm conditions…)

  11. Dave McK says:

    Well, if the world gets through its metaphorical Donner pass, it’s all downhill to the beach after that, eh?
    I miss Marin… lots.
