A Curiosity In Australia

So there was / is a cold summer in Australia (especially up around Brisbane).

When you look at a map of Australia, Brisbane is just about in the middle of the country on the coast on the right (east) side.

What’s a bit curious is that while the locals are saying it is Way Cold, GIStemp is just not seeing it. In fact, even looking back into the PAST cold, it isn’t there. It is as though the extremes are simply erased. (I’ve suspected this from what we’ve seen in the “QA Process”, where stations must be ‘close enough’ to nearby airports or their data are rejected.)

What brought this on? Bikinis.

No, honest! With a h/t to George on the Open Talk page where he noted:


Coldest summer in Australia in 50 years.

Well, I thought, how can I not follow up on a Bikini story… if retail sales are falling in swimwear, there are ways to make money off of shorting specialty retailers of swimwear and going long things like Burlington Coat Factory. Yes, short bikinis, long coats…

Perhaps even “short shorts” and “long long johns”…

So a bit of searching turned up this page claiming that Brisbane had a cold winter too:


Today in Brisbane temperatures have barely got above 12° C. The highest temperature I’ve seen on my outdoor thermometer is 12.4° C.

It’s the coldest daytime maximum temperature I’ve ever known here and I remember reading that the previous lowest maximum daytime temperature in recent years was around 15° C.

So I decided to check the records.

The previous and all time recorded lowest daytime temperature in Brisbane was 11.7°C in 1916. So today is the coldest day for 95 years!

They quote another article and link to it:

“Bureau of Meteorology senior forecaster Chris Joseph said the record low daytime temperature for Brisbane was 11.7 degrees, set in 1916.”


The gloves are on, as cold snap continues
09 Jun, 2011 11:51 AM

Queensland’s cold snap has continued, with Brisbane recording its coolest overnight temperature of the year while the mercury plummeted below zero further west.

The mercury dropped to 6.9 degrees at 6am in the capital, five below average and the coldest it has been since August 14 last year, when the temperature fell to 6 degrees.

Amberley was near-freezing at 1.8 degrees at 6am, while the temperature at Brisbane Airport fell to 5.7 degrees.

Charleville and Roma shared the title of being the state’s coldest, with the mercury falling to minus 3 degrees, while Dalby reached minus 2.1, Oakey minus 1, Toowoomba zero degrees and Kingaroy 0.6 degrees.

The cold front moving across southeast Queensland from the Victorian snow fields will make for more frosty conditions today with a maximum temperature of 16 degrees, 6 degrees below average.

But that is still almost four degrees above yesterday’s maximum temperature of 12.6 degrees.

So, I’m thinking, if we’ve had Way Cold in mid winter, and the summer is so cold folks are avoiding the beach and not buying a lot of swimwear, that ought to show up in the temperature charts, right?

We’re Off To See the Wizard…

So what does that man behind the curtain say when we ask GIStemp for a map of Australia with temperatures? Well, strange things happen.

Toward the end of looking at things, I decided to just home in on June 1916 relative to the hot period of 1980 to 2010. That was, per the Warmistas, the Hottest Ever and represents most of the ‘ramp up’. So if 1916 was a record cold, how does it stack up against the warmest? I expected to see a very cold Brisbane area. I set the ‘smearing’ to “only” 250 km so we could see where stations were recording just a bit better.

GIStemp for June 1916 compared to 1980-2010 baseline

So look at Australia. A “hot pixel” in the dead center, and then generally warm all over. Brisbane, at the coast, roughly ‘normal’. For a coldest ever? What? Something is just very wrong… And it is not just the complete lack of temperatures at either pole or in the interior of Africa, South America, and large swaths of China and Russia. There are “hot pixels” in Japan, Greenland, Kashmir, and a warm Texas too (among others). There are also ‘way cold’ areas in odd places. A bit of history checking says 1916-17 was significantly cold in parts of Europe (some folks speculate W.W.I caused it). But what’s the deal with Australia?

Looking at June 2011 vs the same recent hot baseline (so it ought to be ‘way cold’ if it’s a ‘coldest in 50 years’ or more) we get:

GIStemp June 2011 vs 1980-2010

Brisbane is marked as a ‘neutral’ white to light green. Nothing significantly different from that hot period. Nope, no cold there…

How odd…

I do note a lot of bright red in areas that are lacking many people and data points… Cutting it back to a 250 km ‘smear’ we see that a limited number of ‘hot pixels’ are getting smeared way too far and much of the map shows cold pixels.

GIStemp 250 km June 2011 vs 1980-2010

At last we can see a bit of cool showing up in Brisbane…
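For the curious, here is a rough sketch of how that kind of distance ‘smearing’ works – not GIStemp’s actual code, just the Hansen & Lebedeff style weighting where a station’s influence tapers linearly to zero at the chosen radius. Station positions and anomaly values here are made up for illustration:

```python
# A rough sketch of radius-based "smearing" (not GIStemp's actual code):
# each station's weight tapers linearly to zero at the chosen radius,
# a la Hansen & Lebedeff. Coordinates and anomalies are made up.

def cell_anomaly(cell_xy, stations, radius_km):
    """Distance-weighted mean of station anomalies within radius_km.

    stations: list of (x_km, y_km, anomaly) on a flat-plane approximation.
    Returns None when no station is in range (a gray "no data" cell).
    """
    cx, cy = cell_xy
    num = den = 0.0
    for sx, sy, anom in stations:
        d = ((sx - cx) ** 2 + (sy - cy) ** 2) ** 0.5
        if d < radius_km:
            w = 1.0 - d / radius_km  # linear taper to zero at the radius
            num += w * anom
            den += w
    return num / den if den > 0.0 else None

# One lone "+3" anomaly station; a grid cell 1000 km away.
lone_hot = [(0.0, 0.0, 3.0)]
far_cell = (1000.0, 0.0)

print(cell_anomaly(far_cell, lone_hot, 250))   # None: gray "no data"
print(cell_anomaly(far_cell, lone_hot, 1200))  # ~3.0: painted hot
```

Since a weighted mean of one station is just that station’s value, at 1200 km the lone hot station paints cells a thousand kilometers away with its full anomaly, while at 250 km those same cells drop out to ‘no data’ gray – which is why the 250 km maps show so much gray and the 1200 km maps look so uniformly colored.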

So if we look at the maps with the default settings, what do we see?

GIStemp defaults June 2011

Did Brisbane really have a normal winter? Or was it exceptionally cold? Was Europe really a couple of degrees warm this summer? Did England have a nice slightly warm summer? Plenty of tomatoes, eggplant, and Green beans from the garden?

The November cold summer down under? Winter up north?

GIStemp defaults November 2011

Is Brisbane REALLY way above normal this summer, selling swimsuits like crazy to the vacationers trying to get a spot on the beach?

Are the UK and Scandinavia REALLY having a sweltering winter?

Last I looked, the Arctic was making basically normal sea ice in rather colder than usual temperatures. It is REALLY +5 C to +6 C HOT HOT HOT!!!

I know, we’ve seen this game many times before, but doesn’t there come a time when someone just stands up and says this is too embarrassing to NASA and they need to toss the crap out?

Look at the north pole! Darned near blood red on fire. Yet we’re having massive polar cold causing blizzards and early snows. This is just broken.

I know, I ought to go into the code and figure out where. I’ve ‘been there’ and don’t look forward to going back. IMHO it is the result of a very distributed approach: EVERYTHING is ‘snugged up’ just a little, resulting in a very large bias. (I think the graphing parts even add their bit. As we saw before they blocked it (but didn’t FIX it), zero values were mapped to blood red. I suspect that the sporadic null station adds an excess cold value, giving excess cold on missing data items, but that code was not in the last download I did – and I don’t know if it is even provided now.)

So here we are, with reality telling us that all over the world we are having cold anomaly events. With rising heating bills and blizzard warnings. With swimwear sales off (and I expect coat sales up, along with road salt…). What does GIStemp say? Oh, Hot Hot HOT!!!

How long can such {pick one: stupidity, fraud, buggy code, idiocy, embarrassment} continue with no Adult Supervision?


About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present “hot buttons” are the mythology of Climate Change and ancient metrology; but things change...
This entry was posted in AGW and Weather News Events, AGW GIStemp Specific. Bookmark the permalink.

40 Responses to A Curiosity In Australia

  1. Pingback: A Curiousity In Australia | Cranky Old Crow

  2. George says:

    Don’t forget, now is the start of the Australian summer. I also heard someone say that Perth (western Australia) was looking at the warmest year ever for 2011.

  3. George says:

    I don’t trust those GISS maps. They seem like total junk to me. At this point the database I trust most for global temps is HadCRUT, but I haven’t found a gridded product, though they must offer one.

  4. boballab says:


    I have been “talking” to someone on another site (non climate related) about this winter; they are from Finland and they say it is warm there compared to normal. However, that is to be expected due to the AO being in its positive phase. NSIDC and NOAA actually do a decent job of showing the what, why and where of the effect. As to the sea ice, NSIDC has a pretty good explanation in their Nov news releases:

    Atmospheric conditions
    In recent years, low sea ice extent in the summer has been linked to unusually warm temperatures at the surface of the Arctic Ocean in the fall. This pattern appeared yet again this fall.

    Air temperatures over most of the Arctic Ocean for October 2011 ranged from 1 to 4 degrees Celsius (1.8 to 7.2 degrees Fahrenheit) above average, measured at the 925 millibar level, about 1,000 meters or 3,000 feet above the surface. However, over the eastern Canadian Arctic and Greenland, temperatures were as much as 3 degrees Celsius (5.4 degrees Fahrenheit) below average.

    These temperature anomalies in part reflect a pattern of above-average sea level pressure centered over the northern Beaufort Sea, and lower than average sea level pressure extending across northern Eurasia. This pattern is linked to persistence of the positive phase of the Arctic Oscillation through most of the month. These pressure and temperature anomalies tend to bring in heat from the south, warming the Eurasian coast, but they also lead to cold northerly winds over the eastern Canadian Arctic Archipelago. However, along the Siberian coast and in the Beaufort and Chukchi seas, warmer temperatures came primarily from the remaining areas of open water in the region, as heat escaped from the water. These effects are more strongly apparent in the surface air temperatures: average October temperatures in the region were 5 to 8 degrees Celsius (9.0 to 14.4 degrees Fahrenheit) above average.


    Northern Hemisphere snow cover
    The positive phase of the Arctic Oscillation also tends to be associated with unusually warm conditions over Scandinavia. According to Sweden’s meteorological office, the country’s average temperature for the month of November so far was 7 degrees Celsius (12.6 degrees Fahrenheit) above average. Typically by November, much of Scandinavia is already covered with snow, but maps from the Rutgers University Global Snow Lab show that snow cover levels were anomalously low over Scandinavia and northwestern Europe during November. Below-normal snow conditions were also evident over most of the continental United States, except for the northern Rockies.

    However, overall Northern Hemisphere snow cover was more extensive than normal this November, with most of the extra snow cover found in Canada and Russia. Snow covered an average of 36.2 million square kilometers (14.0 million square miles) of Northern Hemisphere land. This is 2.79 million square kilometers (1.08 million square miles) above the 1971 to 2000 mean, and ranks as the fourth most extensive cover in the past 46 years of satellite-derived snow cover records.


  5. gallopingcamel says:

    The last two winters in Florida were downright cold with ice and even some snow. I swore to move another 500 miles south if this year continued the trend.

    I am happy to report that things are toasty here at the moment so I have no immediate relocation plans. I guess we need to blame El Nino, the PDO or some other theory I dimly comprehend.

  6. boballab says:


    Here are the links to the gridded maps for HadCrut 3:

    You can also download the data from there or you can get the data from CRU here:

    And as long as I’m giving the links to temp datasets, might as well go whole hog and give them all (if EM has the allowed links set high enough!):

    NCDC: http://www.ncdc.noaa.gov/cmb-faq/anomalies.php#grid

    GISS: http://data.giss.nasa.gov/gistemp/_tabledata3/GLB.Ts+dSST.txt

    UAH: http://www.nsstc.uah.edu/public/msu/t2lt/uahncdc.lt

    RSS: http://www.remss.com/data/msu/monthly_time_series/RSS_Monthly_MSU_AMSU_Channel_TLT_Anomalies_Land_and_Ocean_v03_3.txt

  7. Ripper says:

    It has been below average over our side of the country as well E.M.

    We had our first 40 degree day of the summer last Sunday, and our second one today.


  8. George says:

    I don’t trust NCDC; they’ve been caught adjusting their data EVERY SINGLE MONTH this year, making years before 1950 colder and years after 1950 warmer, with the greatest adjustment applied to the most recent years.

  9. George says:

    Scary, huh?

  10. Another Ian says:


    Being inland from Brisbane I can add

    The winter here got down to around -3 on the verandah. Has been known to get to around -10 at times, so wasn’t the worst winter. But it needed a lot of firewood.

    Summer so far fits your picture. A few days bordering on hot, but we still have blankets (note plural!) on our bed. And have had around 480mm of rain spread since half past November. With a lot of cloudy days, so not hot.

    I’ve had more computer time than a usual day today courtesy of 48.6mm of AGW permanent drought.

  11. boballab says:


    Well then you are going to have a problem with HadCrut from now on, since the majority of it is made up of GHCN (HadCrut is over 80% GHCN, according to Phil Jones in the Climategate 1 emails), and with the newest version (GHCN v3) there is no longer any version that isn’t NCDC adjusted.

    With GHCN v3 they no longer have a “raw” file, a Time of Observation Bias (TOBs) adjusted file, and a fully adjusted file. They have two: Quality Controlled Unadjusted (QCU) and Quality Controlled Adjusted (QCA). Now you might assume that the QCU file would be like the old so-called “raw” file (which is what GISS and HadCrut both used), but it is not. As part of the “Quality Control” process they have already done the multiple station adjustment procedure, where they stitch multiple shorter term thermometer records into one long term record.

    As an example, for someplace like Washington DC there might be one thermometer that existed from 1875 to 1900, a second from 1895 to 1930, a third from 1927 to 1965, and a fourth from 1953 to 2010. (This is not really how the GHCN v2 raw file for Washington DC was; it is just an example.) Back in the old GHCN v2 days, in the raw file you would find a station number for each of those thermometers and their own separate readings. That was the file GISS and HadCrut both used, and they did their own stitching together (this is an adjustment), as did NCDC. In GHCN v3, however, you no longer get to see the thermometer breakdown; they now give just one station number and have it run from 1875 all the way to 2010. Matter of fact, GISS has even stated they will be using NCDC’s process instead of trying to undo it and then doing their own. (They also stated that they will not be using USHCN in GISTEMP anymore either, just GHCN v3.)

    2011-12-15: GHCN v2 is no longer being updated, hence the GISS analysis is now based on the adjusted GHCN version 3 data. Graphs comparing results of the GISS analysis using GHCN v2 and v3 are available here. Discussion of the impact of this change will be included with the GISS analysis of 2011 global temperature.


    As you can see from the bolded bit, GISS will be using GHCN adjusted data. Further confirmation is here:

    December 14, 2011: GHCN v2 and USHCN data were replaced by the adjusted GHCN v3 data. This simplified the combination procedure since some steps became redundant (combining different station records for the same location, adjusting for the station move in the St. Helena record, etc). See related figures.


    So the only two datasets that will not be contaminated by NCDC adjustment procedures will be the two satellite ones: UAH and RSS.

  12. George says:

    @ Another Ian

    Didn’t they tell you folks that you wouldn’t need to complete your flood control projects because you would never again see the sort of rain you had in the 1970’s and were going to be in drought forever?

    Seems at the moment the LNP are polling well for the elections next year in Queensland.

  13. George says:

    HadCRUT is GHCN raw, not GHCN adjusted which is what NCDC is.

    “With GHCN v3 they do not have a “raw” file”

    They do have a raw product, but my understanding is that it is not generally available. They do share the raw station data with other databases but do not provide a “raw” product to the public.

    Also, GISS is based on PARTIALLY adjusted data. I think it only has the TOD adjustment, not the UHI adjustment. GISS does their own UHI adjustment.

  14. R. de Haan says:

    If the real world data no longer add up with what the Government is telling you, the only obvious conclusion is that you no longer live in a free country and the entire population is going to be subject to abuse.

    So I wondered why a Dutch scientist developed a killer flu virus.
    His research was paid for by the NIH (National Institutes of Health), a US Government institution.

    And this happens with John Holdren, better known from the Population Bomb, in the position of science czar in the Obama Administration.


    I don’t believe in coincidence.

  15. xyzlatin says:

    I’ve just come from Brisbane and everyone I spoke to there complained about the cold weather.

  16. boballab says:


    George, that “raw” file you are thinking about was with GHCN v2, which they did away with. The data files that HadCrut uses are the same ones that you and I have access to: QCU and QCA.

    From the Readme file off NCDC’s ftp site:

    1.1 OVERVIEW

    The GHCNM v3 has been released. This version currently contains
    monthly mean temperature, monthly maximum temperature and
    monthly minimum temperature. The station network for the time being,
    is the same as GHCN-Monthly version 2 (7280 stations). A new
    software processing system is now responsible for daily reprocessing
    of the dataset. This reprocessing consists of a construction process
    that assembles the data in a specific source priority order, quality
    controls the data, identifies inhomogeneities and performs adjustments
    where possible.
    In addition, graphical products, including individual
    station time series plots are produced daily.

    V3 contains two different dataset files per each of the three elements.
    “QCU” files represent the quality controlled unadjusted data, and
    “QCA” files represent the quality controlled adjusted data. The unadjusted
    data are often referred to as the “raw” data.
    It is important to note that
    the term “unadjusted” means that the developers of GHCNM have not made any
    adjustments to these received and/or collected data, but it is entirely
    possible that the source of these data (generally National Meteorological
    Services) may have made adjustments to these data prior to their inclusion
    within the GHCNM. Often it is difficult or impossible to know for sure,
    if these original sources have made adjustments, so users who desire
    truly “raw” data would need to directly contact the data source.
    The “adjusted” data contain bias corrected data (e.g. adjustments made
    by the developers of GHCNM), and so these data can differ from the
    “unadjusted” data.

    Note that they state they have a computer QC the data every day; it is after that step that you get the first output from the program: QCU. Note also that it is this “unadjusted” data that is called “raw”, and that they are not even getting “raw” data from their sources: if you want the truly “raw” data, you personally have to track it down from the source (such as Australia’s BOM or New Zealand’s NIWA). So right there they are telling you there is no “raw” material being given out to GISS and CRU (who make the CRUTEM 3 portion of HadCRUT 3).

    The Readme also covers what I was telling you about how they splice the thermometers together:


    The GHCNM v2 contained several thousand stations that had multiple
    time series of monthly mean temperature data. The 12th digit of
    each data record, indicated the time series number, and thus there
    was a potential maximum of 10 time series (e.g. 0 through 9). These
    same stations in v3 have undergone a merge process, to reduce
    the station time series to one single series, based upon these
    original and at most 10 time series.

    A simple algorithm was applied to perform the merge. The algorithm
    consisted of first finding the length (based upon number of non
    missing observations) for each of the time series and then
    combining all of the series into one based upon a priority scheme
    that would “write” data to the series for the longest series last.

    Therefore, if station A, had 3 time series of TAVG data, as follows:

    1900 to 1978 (79 years of data) [series 1]
    1950 to 1985 (36 years of data) [series 2]
    1990 to 2007 (18 years of data) [series 3]

    The final series would consist of:

    1900 to 1978 [series 1]
    1979 to 1985 [series 2]
    1990 to 2007 [series 3]

    The original series number in GHCNM v2, is retained in the GHCNM v3
    data source flag.

    One caveat to this merge process, is that in the final GHCNM v3
    processing there is still a master level construction process
    performed daily, where the entire dataset is construction according
    to a source order overwrite hiearchy (section 2.3), and it is
    possible that higher order data sources may be interspersed within
    the 3 series listed above.
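    That merge rule can be sketched like this (my own reading of the Readme’s priority scheme, not NCDC’s actual code): sort the series by length, write them onto one timeline shortest first, and the longest series, written last, wins any overlap:

```python
# Sketch of the GHCN v3 merge rule described in the Readme above
# (an illustration of my reading of it, not NCDC's code).

def merge_series(series_list):
    """Merge several year -> value dicts into one.

    Written shortest-first so the series with the most non-missing
    observations is written last and overwrites any overlap.
    """
    merged = {}
    for s in sorted(series_list, key=len):
        merged.update(s)
    return merged

# The Readme's worked example: three TAVG series for one station.
s1 = {y: "series 1" for y in range(1900, 1979)}  # 79 years of data
s2 = {y: "series 2" for y in range(1950, 1986)}  # 36 years of data
s3 = {y: "series 3" for y in range(1990, 2008)}  # 18 years of data

final = merge_series([s1, s2, s3])
# 1900-1978 from series 1, 1979-1985 from series 2,
# 1990-2007 from series 3 -- matching the Readme's worked example.
```

    Note how the 1950-1978 overlap goes to series 1 (the longest), so series 2 only contributes 1979-1985, exactly as in the Readme’s worked example.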


    You can even look at the “Quality Control” order here:

    Again, GHCN v3 does not work in any way, shape, or form like GHCN v2 worked; it is completely different. I have copies of GHCN v2 from earlier this year sitting on my hard drive, so that when they memory-holed it (which they did; you can’t download the last v2 temp data from the ftp site anymore) I would still have it to compare to v3 in the future.

    Also, what you are thinking of for GISS was the old method, again with GHCN v2; with GHCN v3 that all changed. When you now go to the GISS station page you get these options:
    1. Adjusted GHCN v3 + SCAR data
    2. After removing suspicious records
    3. After GISS homogeneity adjustment

    Back when they used GHCN v2 the options were:
    1. GHCN v2 raw + USHCN
    2. After combining station records
    3. After GISS homogeneity adjustment

    Everything about how GHCN, GISS and HadCRUT worked got thrown out the window this month, when NCDC discontinued GHCN v2 and now gives out only GHCN v3.


    It also appears that GISTEMP has been retooled so you will have to re-download it.

  17. George says:

    You can also note that they state that they are not even getting “raw” data from their sources

    No, that is not what they are saying. They are not saying the source data ARE adjusted by the source; they are saying they aren’t adjusting the data in the U file, and the source MAY be adjusting it – they have no way of knowing.

    Big difference. The only adjustment I know of that sources often apply is a TOD adjustment. Sources generally do not apply UHI adjustments but if the source is the national meteorological department of a country, sure, there might be other adjustments without their knowledge.

    Doesn’t really matter for climate as long as the adjustments are consistent. We are only looking for a trend.

  18. E.M.Smith says:

    FWIW, I suspect some of this was due to my dT/dt method showing some of how the ‘splice artifacts’ worked. That “mysterious Marble Bar” posting, for example.

    I’d had an inquiry about source code for it, and pointed folks at where I’d already posted it. The major “feature” of it was a method that allowed you to combine segments WITHOUT the splice artifacts (that then showed lower artificial trends).
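    To make the splice-artifact point concrete, here is a toy sketch of the first-differences idea (an illustration, not my actual dT/dt code): work in year-over-year differences within each segment, treat the boundary between segments as zero information, then accumulate. A baseline offset between segments – a station move, say – then adds no false trend, where a naive splice turns it into a step:

```python
# Toy illustration of why first differences avoid splice artifacts
# (a sketch of the idea, not the actual dT/dt code).

def splice(seg_a, seg_b):
    """Naive splice: just concatenate the two segments' values."""
    return seg_a + seg_b

def first_diff_combine(segments):
    """Combine segments via within-segment first differences.

    The boundary between segments contributes a difference of 0, so
    an offset between the segments' baselines adds no trend at all.
    """
    diffs = []
    for seg in segments:
        diffs.append(0.0)  # no information across the segment boundary
        diffs.extend(b - a for a, b in zip(seg, seg[1:]))
    # integrate the differences back into a relative series
    out, level = [], 0.0
    for d in diffs:
        level += d
        out.append(level)
    return out

# Two perfectly flat segments at different baselines: a station move, say.
seg1 = [10.0, 10.0, 10.0]
seg2 = [12.0, 12.0, 12.0]

spliced = splice(seg1, seg2)                 # contains a 2-degree step
combined = first_diff_combine([seg1, seg2])  # stays flat: no false trend
```

    The naive splice shows a spurious 2 degree ‘warming’ from nothing but the baseline offset; the first-difference combination stays flat.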

    At the time I’d thought maybe someone was going to look for ways to try to defeat it, but figured it would be hard to do as long as the station data existed. I think I’m seeing the answer now. HIDE the station data and only give out the spliced, adjusted, homogenized data-food-product.

    Maybe just conceit on my part, but it is an odd “coincidence” (and I don’t believe in such ‘coincidences’ either…)

    Doesn’t really matter, though. Comparison of the GHCN v2 to the new data-food-product will be just as effective at showing ‘cookage’ for at least a decade. By then this thing will be wrapped up one way or the other.

    I do have to wonder, though, if the NCDC are setting themselves up for a “Jones Moment”…. A F.O.I.A. request for their station input data BY station BY segment… and methods. Oh, and emails related to it…

    They can then either hand over the station data (which is no different from what was made available in the past via CLIMAT) or have F.O.I.A. issues… Knight Fork…

    The other fairly simple approach is that there are other folks with data series ( I know of at least one commercial operation) so a direct comparison is possible between them. It would take funding, but that’s fine.

    Frankly, that this action makes my prior approach somewhat less useful just gives me more time to spend on more interesting things.

    The arrogance of it, though, is a bit stunning…

    Mostly it means focus moves off of Hansen and onto NCDC…

  19. So, if I, or anyone, were to say that their published temperature data is misleading, incorrect, fudged, or even dishonest, would I/we be closer to the truth than their results?

  20. R. de Haan says:

    You get it Ken McMurtrie.
    Our scientific institutions are lying to us and our leadership is turning radical.
    Coincidence, I don’t think so.

    @ E. M, “The arrogance is a bit stunning”.
    It’s flabbergasting and a bloody disgrace.

    Why don’t you put your article up at WUWT and let the wolf pack shine a light on the subject.

    I really think you have a good case here, with the potential to break down the front door of GIStemp.

    I am really fed up with all the manipulations and can’t wait to confront the guys responsible for cooking the data.

    Anthony Watts once floated an idea to involve amateur meteorologists in creating a shadow data network. I don’t know if he went through with it.
    In Australia, the USA and Europe we have many such amateurs, who are well organized and really could do a nice job.
    In the Netherlands, for example, they work to the same standards as the KNMI (use Google Translate to read the site in English):

    I couldn’t find an Australian organization for amateur meteorologists on the fly but I did find this article: http://www.dailyrecord.co.uk/news/real-life/2010/08/18/meet-the-amateur-meteorologists-who-are-taking-forecasting-the-weather-into-their-own-hands-86908-22495536/

    Anyhow, there must be a way to get an alternative data source, or at least a number of contacts, that would let you verify data reliably enough to make a case.
    Just think about it.

  21. adolfogiurfa says:

    @E.M.: Same thing on the other side of the Pacific, at 12°S, 75°W.
    Sorry kids! Your time is due; it won’t help to keep on lying. See:

  22. R. de Haan says:

    Yes Adolfo, Unisys shows the real thing.
    The kids are from another planet.

  23. Raymond says:

    I did a quick area evaluation using pixels in Photoshop.

    Anomaly band     250 km    1200 km
    4 to 5.7          0.48%     0.47%
    2 to 4            2.86%     8.31%
    1 to 2            8.56%    21.33%
    0.5 to 1         11.19%    21.51%
    0.2 to 0.5       12.32%    13.48%
    -0.2 to +0.2     15.49%    11.19%
    -0.5 to -0.2      9.67%     7.75%
    -1 to -0.5        8.68%     5.85%
    -2 to -1          4.32%     2.76%
    -4 to -2          6.38%     5.58%
    NO DATA          20.06%     1.79%

  24. Yup!

    Australian official temperatures are something else.
    Fig 47 shows BOM’s long term trend; fig 42 shows the results from 2-300 stations, a robust and significant East/West split in Australian temperatures. There is NO sign of this East/West split in the BOM temperatures of fig 47.
    Did they use numerology, or?

    K.R Frank

  25. Yes, Australian official temperature trends are something else.

    Fig 42: 2-300 stations show that East Australia has a significantly different temperature trend than West Australia.
    Fig 47 shows that BOM has absolutely no clue about this…

    K.R. Frank

  26. E.M.Smith says:

    @R. deHaan:

    There is a commercial service, whose name escapes me at the moment, that has an independent series with independent correction methods. They sell their service to companies, so it’s a paid product. BUT, they have the NMS provided ‘raw – meaning slightly adjusted’ data archived… it’s not lost.

    I don’t get to decide what goes up at WUWT; Anthony does. What would probably be best is if Willis took my ‘raw stuff’ and spiffed it up to WUWT quality… I’ve given Anthony ‘carte blanche’ to use anything he finds here that is of interest (on the topic of weather and climate).

    BTW, Anthony knows of that commercial data source. May explain some of his lower interest in an amateur reconstruction.

  27. adolfogiurfa says:

    Hansen’s GISS has no credibility at all.

  28. Nick Stokes says:

    You can find any amount of weather station records for SE Qld at this BoM site. I’ve shown Brisbane June 2011; ave min was 10.3C, max 20.9. Long term ave is 10.9C and 20.9. Nothing exceptional there. November was well above average: 19.9/28.8 vs 18.1/27.8.

    If you don’t like the GISS plots, I have a gadget here for viewing the GHCN station temps directly. Each station (you can show them) is colored according to temp for that month – no grid averaging. You can navigate with buttons and clicking on the small map.

  29. Andy Krause says:

    I believe the US Air Force keeps its own dataset of surface temperatures (AFCCC?). I think it is for combat purposes.

  30. Pingback: pindanpost

  31. Pingback: Here is the weather…or not | pindanpost

  32. Nick Stokes,

    BOM adjusts their data. Bye.

  33. Nick Stokes says:

    These are the readings that were reported on the web every 30 mins. They go straight to that website. When do you think they adjust them?

  34. George says:

    I like BOM’s data

  35. Aussie says:

    Without checking the data, Canberra has been very cold for the start of summer. There have been very few hot days where it has stayed warm enough for short sleeves!!

    The West Australian temperatures are high, and yes Pilbara had a record temperature, but truly in Australia with the variations that we get, this is not evidence of climate change!!

  36. Pingback: AGW – A Curiousity In Australia | The GOLDEN RULE

  37. E.M.Smith says:

    @Frank Lansner:

    Don’t know why but you had two comments stuck in the spam filter. Sorry I didn’t notice them until now, they are good ones!

    For anyone who missed them, they are ‘up thread’ and say:


    Australian official temperatures are something else.
    Fig 47 shows BOM’s long term trend; fig 42 shows the results from 2-300 stations, a robust and significant East/West split in Australian temperatures. There is NO sign of this East/West split in the BOM temperatures of fig 47.
    Did they use numerology, or?

    K.R Frank

    Yes, Australian official temperature trends are something else.

    Fig 42: 2-300 stations show that East Australia has a significantly different temperature trend than West Australia.
    Fig 47 shows that BOM has absolutely no clue about this…

    K.R. Frank

    There were also three comments from Adolfo, but on other threads.

  38. R. de Haan says:

    Train Derailed in Australian flood

  39. adolfogiurfa says:

    @R.de Haan: Funny indeed. We’ll see the same in the next weeks, this time in South America. This is part of the cooling we are in (a changing of phase, from vapor to water).

  40. E.M.Smith says:

    There is something endemic to the Anglo approach to news…. Like when they say that some of the carriages came off the track ‘as the train tried to cross that bridge’ while the panning shot shows a completely washed out section of track about 3 cars long with flood waters eroding away any base / bed / raised dirt in a torrent of water… How about “Bridge Gone!” guys? ;-)

    But yes, the lower air volume (from lower UV level) has made things colder, so the water is being squeezed out and we’ve got cold floods in some areas, drought in others…

Comments are closed.