More “data” than the Data – The Stupid, It Burns…

So in “Tips” there was a line from Larry (h/t Larry) that I actually ended up following, via a cross post by Another Ian (h/t Ian) at Jo’s place:

http://joannenova.com.au/2017/08/bom-had-smart-cards-to-filter-out-coldest-temperatures-full-audit-needed-asap/#comment-1929947

pointed me back at, well, me… and this:

https://chiefio.wordpress.com/2017/07/29/tips-august-2017/#comment-85462

That sent me off to:

http://dailycaller.com/2017/08/03/report-127-million-climate-supercomputer-no-better-than-using-a-piece-of-paper/

which is well worth reading, though I had a bit of an “upchuck moment” at:

Scientists said supercomputer modeling could have predicted the flooding. Thompson said the supercomputer “simulations provided one hundred times more data than is available from observed records.”

Oh, the Stupid, it is strong in them…

100 times more “data” than in the actual Data.

I fear there is nothing that can be done to cure that level of Stupid. Perhaps a whole generation will need to be assigned to “doggy dooly patrol” pending their recovery.

Let me make it perfectly clear:

Computer Model OUTPUT IS NOT DATA. It is not, never was, and CAN NOT BE DATA!!! EVER.

It is a computer generated fantasy product. It is “data food product”. It is Phantasy Football Crap.

Computer fantasy IS NOT REAL.

Oh God, I feel the need for more Tequila coming on… I know of no other way to dampen the burn from this much Stupid On Parade…

To quote someone or other:
“We give these people computers and expect them to know how to use them”…

I’d also add Smith’s Math Corollary:
“We give these people math and statistics and expect them to know how to use them!”.

Is there no intelligent life in Academia?

It would seem not…


About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present “hot buttons” are the mythology of Climate Change and ancient metrology; but things change…
This entry was posted in AGW Science and Background, Global Warming General. Bookmark the permalink.

25 Responses to More “data” than the Data – The Stupid, It Burns…

  1. Terry Jay says:

    Garbage In, Gospel Out.

  2. Larry Ledwick says:

    It is sort of like the stock player who “has a system”, isn’t it? They are so sure that the computer output is right and valid that they never stop to think about massive disconnects like that.

    News flash!
    Your magic computer is just making up numbers based on a rule set you created.
    Nature is under no obligation to play by your rules.
    There is no physical analog to your made up numbers!

  3. p.g.sharrow says:

    “Is there no intelligent life in Academia? ”

    NO! Those that can will learn. Those that can’t must be taught.
    Academia is teachers, teaching teachers teaching teachers. And the answers are in the back of the “Book”, written by a teacher taught by teachers of Academia. All striving to be “Ex-sperts” in their fields by reading books written by Academics.
    A well-programmed computer will give you all the data you could want or need. “GIGO”: it will only tell you what you tell it to tell you. No intelligence there either; just the thing for Academia…pg

  4. beththeserf says:

    Incestuous closed societies,
    no fizz, no real life messiness,
    departments of Innovation by
    them who never innovated anything.

  5. gregole says:

    Supercomputer simulations of exotic sexual fantasies provide hundreds of times more sex than actually performing the sex act…

    Not a perfect analogy, but an attempt to capture the silliness these professorial goons display.

  6. jim2 says:

    I love it when they call model output “data.” It makes them look so very stupid.

  7. jerry l krause says:

    Hi E.M.,

    You see the problem, do you see the solution? Or, do you really see the problem?

    The problem, as I see it, is that first there was a lack of actual observed quantitative data. Then suddenly, because of technology which allowed the needed actual data to be generated and transmitted to a central data bank from which it could be accessed, there was too much observed quantitative data. So much that the computer guys did not know what to do with it except to average it. This they could have their computers do without working up much of a sweat.

    The problem which you have identified is the extreme variability whose principal cause is cloud. However, it seems few have noticed there are two actual temperatures. The temperature of the earth’s radiating surfaces and the temperature of air which we breathe. These two temperatures are seldom the same at a given time of the day. It seems I remember that you have acknowledged this.

    One of Louis Agassiz’s students wrote the following, and I have tried to bring it to the attention of others, so I might be repeating myself: “Agassiz’s training in the method of observing facts and their orderly arrangement was ever accompanied by the urgent exhortation not to be content with them. ‘Facts are stupid things,’ he would say, ‘until brought into connection with some general law.’”

    I expect you know how to select certain data from a data set and make a graphical figure of this selected data for a month of days, so one can possibly see relationships between certain observed facts. This is what you, a computer guy, can do to help me see some order in the midst of the very variable world of weather.

    Have a good day, Jerry

  8. jerry l krause says:

    HI E.M.,

    I wasn’t going to acknowledge my failure to proofread, but I had another thought to challenge you.

    Climatologists try to escape from the messy world of meteorology. You have written that we must consider a period of a thousand years. Of course, there are climatologists who claim to have observed climatic data that is a thousand years old. And, of course, they pretend to be able to take a tree ring or two and extrapolate that to describe the entire world. This at the same time we know the present climate of the Oregon coast is quite different from that of the Willamette Valley, which in turn is quite different from the extensive area east of the Cascade Mountains.

    I consider the reasoning of the climatologists to be little different from that of the computer ‘scientists’ who believe that the computer can generate data.

    Have a good day, Jerry

  9. tom0mason says:

    In all of this the over-arching perception is that humans are separate from nature and are a nature-destroying entity. Little do so many realize that humans are part of nature, and we have only a hazy understanding of how our actions today will play out in the future. Life and nature hold the keys to this knowledge.
    Our feeble attempts at modeling future effects are bound to fail, as this world is not a closed system and is not controlled in a totally understood and apparent way; nature holds all the surprises.
    The best we can do is to learn from nature as we plan for worst times while living well in the better periods.

  10. E.M.Smith says:

    @Jerry:

    The problem isn’t with computer professionals. We generally “get it” about the way computers can fail (since we clean up after them every day). The problem is with the researchers who refuse to see the limits of their bright ideas and have only a modest grasp of programming (and less of the subtle failure modes). Generally, professional programmer types are just handed a specification and told to build it. Rarely is feedback about the request being a bit stupid greeted with thanks.

    The core problem is NOT too much data, it is too little with too much manipulation in it.

    So first off, it violates Nyquist. We need a lot more thermometers, with better geographic distribution, to get a valid spatial sample. Now the really hard part is we need most of them added in the past… kind of hard to do. We need a much longer record to really speak to climate. We also need a more stable count over time, instead of one thermometer in the 1700s, to 6000 at peak, down to about 1200 recently, IIRC.
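    A toy sketch of that Nyquist point (my own made-up wave, nothing to do with actual station data): sample a spatial variation too sparsely and you don’t just lose detail, you get a different, slower variation that fits the same samples exactly.

```python
import math

# Toy aliasing sketch: a spatial "temperature" wave with a 30-degree
# wavelength, sampled every 24 degrees (fewer than two samples per
# cycle), is indistinguishable from a slow 120-degree wave.
# This is the classic Nyquist failure: undersampling doesn't blur the
# signal, it replaces it with a phantom.
fast_wave = [math.sin(2 * math.pi * x / 30.0) for x in range(0, 360, 24)]
slow_wave = [math.sin(-2 * math.pi * x / 120.0) for x in range(0, 360, 24)]

# Sample for sample, the two waves are identical (to float tolerance):
print(all(abs(a - b) < 1e-9 for a, b in zip(fast_wave, slow_wave)))  # True
```

    Any analysis done on the sparse samples is analyzing the phantom slow wave, not the real field.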

    The data we do have has huge dropouts. Countries collapse and leave the record for years or decades, some covering vast land areas (USSR). Antarctica has short bits of data as countries staff, or abandon, stations. World wars and other wars punch decade-scale holes in continent-sized areas: the whole Pacific theatre, and more.

    So to “fix” this, various kinds of dubious practices are used to fabricate the missing bits. IMHO, these fail. See the work of Hansen (IIRC a 1984 or 87 paper). More detail than anyone wants available here: https://chiefio.wordpress.com/gistemp/
    which covers several years of my poking at it.

    But you asked :

    I expect you know how to select certain data from a data set and make a graphical figure of this selected data for a month of days, so one can possibly see relationships between certain observed facts. This is what you, a computer guy, can do to help me see some order in the midst of the very variable world of weather.

    I’ve done a lot of that. See the postings here:

    https://chiefio.wordpress.com/category/dtdt/

    https://chiefio.wordpress.com/category/ncdc-ghcn-issues/

    As a starting place. Then explore the other “AGW” , GCM, and GIStemp categories listed on the right.

    I’ve done breakouts by country, altitude, latitude, month, etc. (My favorite being the ‘by month by location’ ones, where a given place can be warming across fall months but cooling in spring, as a hypothetical example.)

    Or put “hair graph” in the search box to the right…

    Per ground vs air temps:

    I’ve frequently pointed that out. It is part of what it means to be an intensive property.

    https://chiefio.wordpress.com/2011/07/01/intrinsic-extrinsic-intensive-extensive/

    My preferred reality story on that being a camping trip in spring to the high mountains. In the sun, we were warm with about 80 F skin temps, so decided to hop in the creek. The rocks were warmed, the dirt in shadier spots cool. Jumping into the water caused an instant headache…. around the corner in the deep shade snow was melting, slowly feeding the creek. The air temp in the shade near the tents was about 70 F. So what was THE temperature there? 32 F snow, 33 F water, 70 F air, 75 F stones, 80 F skin? How does averaging any of that fix the nature of intrinsic properties?
    (It can’t….)
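    A toy numeric sketch of that point (the masses and specific heats below are made up for illustration, not measurements): the five readings from the camping story average to 58 F, which nothing at the scene actually was, and that same average is compatible with wildly different heat contents.

```python
# The five readings from the camping story: snow, creek, air, rock, skin.
temps_f = [32.0, 33.0, 70.0, 75.0, 80.0]
specific_heat = [2100, 4186, 1005, 800, 3500]  # J/(kg*K), rough values

mean_f = sum(temps_f) / len(temps_f)  # 58.0 -- nothing there was 58 F

def heat_rel_freezing(masses_kg):
    """Sensible heat relative to 32 F, ignoring latent heat entirely.
    Heat content is extensive: it needs masses, which the bare
    temperature average throws away."""
    return sum(m * c * (t - 32.0) * 5.0 / 9.0
               for m, c, t in zip(masses_kg, specific_heat, temps_f))

scene_a = [10, 50, 1200, 2000, 70]  # kg of each component, guess A
scene_b = [2000, 5, 10, 50, 70]     # same five temperatures, guess B

# Same mean temperature, very different energy content:
print(mean_f)                                                 # 58.0
print(heat_rel_freezing(scene_a) != heat_rel_freezing(scene_b))  # True
```

    Two scenes with the identical “average temperature” can hold energy differing by nearly an order of magnitude, which is why averaging an intensive property tells you nothing about heat.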

  11. Larry Ledwick says:

    I think another part of the problem is poor definition of the problem. Both scientifically and in the public mind.

    Given their statement of the problem, “the world is warming due to X”, what they are really hypothesizing is that the world is gaining thermal energy – ie heat energy – which as a secondary effect will cause average air temps to rise.

    That is NOT the same as air temperature!

    Like a glass of water with ice cubes in it, you can change how heat energy is stored from temperature to a physical change of state. You can absorb or release large amounts of heat energy without changing temperature.

    They need to account for both physical temperature of the air but also the humidity as that changes the total heat content of a parcel of air through latent heat of the water vapor, and whether it is freezing or evaporating.

    The fact is that they don’t have anywhere near the information they need to calculate the world’s heat content, even with today’s instrumentation and technology, let alone historical values, because they cannot account for total ice volume and its mass temperature, the total heat content of the oceans, and so many other huge heat sinks and sources. Heat flux into the deep oceans from thermal springs is only guessed at, and we are just beginning to have the ability to measure the total heat content of the water column in a tiny fraction of the ocean.

    The local air temperature is a trivial fraction of the total heat content and tells us very little without fully accounting for changes of state of water, ice and water vapor, and changes in other heat sinks like the deep ocean and total ice volume and its mass temperature (energy stored as heat of fusion, and thermal energy as heat). For example, suppose you have a million cubic meters of ice at a mass temperature of -30 deg C and it changes to a mass temperature of -31 deg C. You have moved a lot of heat energy out of the ice, with no change in ice volume, regardless of what the air temperature over the ice is a week or two later.

    Anyone who has experienced a 30 deg F temperature drop over a period of 10 minutes due to a thunderstorm accompanied by cold rain and hail understands that air temp at the surface tells you very little about total heat content. Never mind issues like implied precision and accuracy of measurement that is completely absurd given the possible error sources and incomplete data.
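    The humidity point can be sketched with the standard textbook approximation for moist air enthalpy (a rough formula; the parcel numbers are my own examples): two parcels with identical thermometer readings can differ in heat content by more than a factor of two.

```python
# Standard psychrometric approximation: enthalpy of moist air per kg of
# dry air, in kJ/kg, with t_c in deg C and w = kg water vapor per kg
# dry air. 1.006 is the specific heat of dry air, 2501 the latent heat
# of vaporization at 0 C, 1.86 the specific heat of water vapor.
def moist_air_enthalpy(t_c, w):
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

dry_parcel = moist_air_enthalpy(30.0, 0.005)    # 30 C, fairly dry air
humid_parcel = moist_air_enthalpy(30.0, 0.025)  # 30 C, very humid air

# Identical thermometer readings, over twice the heat content:
print(humid_parcel > 2 * dry_parcel)  # True
```

    The extra energy rides in the latent heat of the water vapor, exactly the term a bare air temperature cannot see.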

  12. jerry l krause says:

    Hi E.M.,

    “My preferred reality story on that being a camping trip in spring to the high mountains. In the sun, we were warm with about 80 F skin temps, so decided to hop in the creek. The rocks were warmed, the dirt in shadier spots cool. Jumping into the water caused an instant headache…. around the corner in the deep shade snow was melting, slowly feeding the creek. The air temp in the shade near the tents was about 70 F. So what was THE temperature there? 32 F snow, 33 F water, 70 F air, 75 F stones, 80 F skin? How does averaging any of that fix the nature of intrinsic properties?
    (It can’t….)”

    Excellent!! You simply used qualitative observation reasoning because I am sure you never looked at a thermometer. So while your temperatures might be generally what you propose them to be, the only one which is certain is the surface of the melting snow had to be 32 F. For the liquid water which first drained out of the melting snow to form the stream had to also be 32 F, not 33 F. And in the sun I am sure your exposed skin temp was greater than 98.6 F. But your conclusion, that given these localized, quite different, temperatures is right on.

    The ‘skin temperature’ of the dirt and of the rocks cannot be measured with a thermometer (temperature sensor), because what is being measured is the temperature of the instrument. They can be measured with a radiometer, but we have to know the emissivity of the radiating surface. But assuming an emissivity of one places a minimum limit on the possible surface temperature.

    There are locations where the climate is such that there is minimal cloud during certain seasons. The SCAN data includes hour-by-hour recorded measurements of the solar intensity reaching the earth’s surface at that location. Hence, graphical plots of this data can be produced for a month of days as well as for a given day. And it can easily be seen that the values of the solar intensity vary even though all the other evidence (smooth uniform curves) suggests no influence of cloud.

    The SURFRAD data is recorded each minute, so this radiation data, as graphically presented, is never presented as a month of days. But I can imagine, if one knew how (which I do not), one could select one measured value each hour and then produce a graphical display for a month of days, as is done with the SCAN data.

    Have to leave now, but thank you very much for responding to my comment.

    Have a good day, Jerry

  13. E.M.Smith says:

    The stream was a bit above 32 F as it was in sun and splashing into warmer air at the surface.

    Human CORE temperature is about 98.6 F (though it ranges – mine is typically about 97.5 at lowest, 99.4 mid afternoon); but SKIN temperatures are lower. Typically around 86 F. I allowed 1 F for the cool air, lack of shirt, and slight breeze.

    Please, don’t try to “second guess” or “correct” my personal history and experiences. You lack the data to do it. I was there.

    The “instrument” I used to measure rocks, dirt, grass, etc. was my bare feet. Calibrated over a few decades by spending much of my time barefoot. (One entire year when in college… I set it as a goal. That year it snowed for the first time in decades…)

    As feet in contact with rocks rapidly equalize temperature with them, it’s fairly accurate. I’d say within about 5 F even on a bad day.

    32 F has a numbing hurting feeling. 40 F is cold and uncomfortable, but livable. 50 F isn’t bad at all. 60 F is cool and almost pleasant. 70 F is fine. 80 F a touch warm, 85 F warmer, 90 F significantly warm, 95 F starting to be hot, 100 F, it’s hot, 105 F you are sweating significantly while it’s hot, and at 110 F to 115 F you are “dancing in the street”. Even at 100 F to 105 F on white paint, the black asphalt (at about 130 F) makes for a LOT of dancing and running for the white stripes.

    No, not as accurate as a lab thermometer, but close enough to know about what the relative temperatures of those surfaces actually were.

    I’m glad you found it “excellent”, but wish you could appreciate a bit more that I pay attention to details and don’t “make stuff up”. Those were my estimates of temperatures there, at the time. Based on decades of running around My Home Town barefoot and shirtless in the summers (and often much of spring and fall too…) while paying attention to temperatures. (Family of cooks so temperature importance set early. Farm town, so “weather talk” constant as was the impact on crops. Hooked on TV Weather Reports so aware of day to day what temperature it was. Thermometers mounted on walls both at home and in town so constant feedback. Radio reporting how hot it was all summer long. Etc. etc. etc.)

  14. cdquarles says:

    So, once again, we have to make clear the definition of terms. The thermodynamic temperature is a measure of the mean kinetic energy of a given, specified sample of matter’s constituents, and ONLY its kinetic energy. There is no heat energy properly speaking. There is KINETIC energy. To heat something, you MUST INCREASE that kinetic energy. Energy that goes elsewhere, such as into phase changes, chemical reactions such as phosphorescence, or other forms of potential energy, does not raise the kinetic energy and thus does not increase the temperature. These may, however, lower the kinetic energy, cooling that defined sample of matter, which will be revealed by a lower thermodynamic temperature.

    Next, I say that the wrong question is being asked. Climate is not of concern, since it is a statistical statement of the past. It is the weather that is important, and what the weather is today is what we and all of the other biological organisms must survive in order to deal with tomorrow’s weather. While said statement of the past is a rough guide, given the damped-driven nature of the system that possesses inertia (aka autocorrelation), it is only a rough guide. The weather is modified minute to minute, hour to hour, day to day, every day by the sun, by water, and said biological organisms, including us, but not primarily by us, except locally.

    (Please pardon the caps, they are for emphasis.)

  15. cdquarles says:

    EM, the human core temperature is about 99.5 F, on average (remember the rectal thermometer?). The 98.6 is the average oral temperature and since the mouth is often open to the external environment, the temperature is a bit lower. You can see this with an esophageal probe. Leave the probe in the mouth long enough for it to equilibrate. Then have the person swallow it. Let it equilibrate again. The stomach temp is going to be warmer. Oh, don’t do this if a person has eaten recently, for the food and its digestion will give a different reading. Skin temperatures are lower still and the ear temperature is not a core one, either. The ear temp may or may not resemble the core one more or the oral one more, depending. The neat thing about the ear temp is that if there is an ear infection, the local inflammation often gives a high relative reading. Of course, looking in the ear will be more informative.

  16. jerry l krause says:

    Hi E.M.,
    I almost missed that you were first referring to my comment of 8/5 at 7:49 pm, which was in reference to your posting. And I suspect it was my statement – “So much that the computer guys did not know what to do except to average it.” – which seemed to offend you. I can now see it should have been: “So much that the computer scientists did not know what to do except to average it.” Would this have made a difference?

    But I also now admit to being confused. For you continued: “The core problem is NOT too much data, it is too little with too much manipulation in it.

    “So first off, it violates Nyquist. We need a lot more thermometers with better geographic distribution to get a valid spatial sample. Now the really hard part is we need most of them added in the past… kind of hard to do. We need a much longer record to really speak to climate. We also need to get a more stable count over time, instead of one in the 1700s to 6000 at peak down to about 1200 recently, IIRC.”

    I must ask: What would you do with the temperatures measured if there was one thermometer to every square mile of the earth’s surface? And if you had such temperature measurements for every hour of every day for a couple hundred years? And I ask: What temperatures would you need to measure?

    I ask this because it seems you missed the point of: “Excellent!! You simply used qualitative observation reasoning because I am sure you never looked at a thermometer.” The last part of your reply confirmed that you never looked at a thermometer. Hence you used qualitative observation reasoning, which is entirely satisfactory with this method of science which I consistently use, even when I use measurements made with good precision.

    MY comments were not intended to be critical of what you wrote. In closing you asked: “How does averaging any of that fix the nature of intrinsic properties?” I was, perhaps poorly, trying to reinforce the point made with your camping experiences.

    I have been trying to convince you that there are valuable observations (measurements) which have been made by two government projects for the past 20+ years, and which I find no one referencing with respect to the specific data which has been reported. I have seen no evidence that you have gone to either of these sites to look around at the graphical displays of this data. You have asked me to tell you what you will see. I have to ask: Do you believe a picture can be worth a thousand words?
    I have experiences you have not had and you clearly have had experiences which I have not had.

    “Per ground vs air temps:
    I’ve frequently pointed that out. It is part of what it means to be an intensive property.
    https://chiefio.wordpress.com/2011/07/01/intrinsic-extrinsic-intensive-extensive/”

    I went to your reference and could find no specific ground temperature being compared with the specific air temperature above the ground at the time of comparison. I checked your reference to see if I had missed where you found measured ground temperatures to compare with air temperatures as they are conventionally measured. And I have noted how it is considered impossible to actually measure the temperature of the ground’s surface with a temperature sensor. For qualitative reasoning it is sufficient to measure the ground temperatures at depths of 2, 4, 8, 20, and 40 inches. And for a comparison between air temperature and ground temperature, it would seem the most important of these 5 temperatures would be that at 2 in depth, because it is closer to the surface than the others. And the SCAN project’s measurements are the only reference I can give which provides data to begin to compare actual air temperature versus ground temperature (albeit actually the temperature at a 2 in depth).

    If my intelligence (or abilities) is not up to your standards, please tell me and I will bug off.

    Have a good day, Jerry

  17. jerry l krause says:

    Hi cdquarles,
    Agree totally with your two last comments; but especially so the first.

    The publisher of Galileo’s Two New Sciences wrote a preface to the reader and in it they wrote: “Intuitive knowledge keeps pace with accurate definition.” as translated by Crew and de Salvio.

    I certainly would like to have a private conversation with you. jerry.semivision@gmail.com

    Have a good day, Jerry

  18. Larry Ledwick says:

    The whole problem is that the “earth’s temperature” is an undefined quantity.
    Everyone assumes it is so intuitively obvious that no discussion is needed, but if it is undefined it has no meaning.

    Until there is an RFC that defines specifically what the earth’s temperature is and how it is to be measured, we are talking nonsense.

    If I tell you the snow depth at my home is 17 inches does that really mean anything if you have no clue how, when or where I made the measurement?

    Same question applies to temperature, if I tell you it is 73 deg F in my house, how does that compare to the temperature in your house if neither of us is using the same method to measure the temperature.

    A few years ago, while trying to sort out how to set my heat registers for uniform temperatures in the apartment, I bought 6 digital thermometers of the same type and manufacture. I calibrated them by checking the remote sensor with a glass of ice water, and they all read the same value to their resolution of 0.1 deg F.

    I scattered them around the house and found that no two gave the same reading. It soon became clear that even trivial things made a significant difference in their temperature read out.

    It mattered if two of them in the same room were at different elevations above the floor.
    It mattered if a sensor was taped on an exterior wall or an interior wall (especially when it was very cold or very warm outside – ie the bias changed depending on how wide the difference was between the interior set point temperature and the outside temp).

    It mattered what was going on in the room due to waste heat; the computer room, kitchen, laundry room, and bedroom all had very different temperatures depending on what electrical equipment was on in those rooms.

    The computer room is consistently about 2 degrees warmer than the hallway just outside the door because of waste heat from 3 computers which run more or less 24 hours a day.
    Kitchen and laundry room obviously warmer when cooking or doing laundry, bedroom consistently cooler in cold weather in the morning than the rest of the house because the door is closed at night limiting free air exchange with the rest of the house.

    Without consistent measurement specifications it is completely idiotic to average temperatures together to “improve accuracy” across multiple instruments, no two of which have the same local conditions, or even, in most cases, the same instrumentation.

    It is like averaging the height of boys in 7th grade with the height of girls in 11th grade and saying it tells you something useful.

  19. tom0mason says:

    Larry,

    The whole problem is that the “earth’s temperature” is an undefined quantity.
    Everyone assumes it is so intuitively obvious that no discussion is needed, but if it is undefined it has no meaning.

    You’ve hit the nail squarely on the head.
    If scientists prior to the last LIA had known the global temperature, would it have bothered them? No! Not if they saw, as some no doubt did, great zonal weather changes taking place.
    Climate is NOT global, it IS regional, zonal.

    All this twaddle about global temperatures is just climate onanism.
    When climate science knows and understands how the dynamic energy exchanges within clouds operate, then we might be a step closer to understanding just the BASICS of how the climate operates.
    Until then ALL of them are just guessing.

  20. E.M.Smith says:

    @Jerry:

    You use English a bit oddly. Is it perhaps a 2nd language?

    I suspect that is the core of the disconnect between your words and my words.

    First off, you can use a great many words, and not say very much specific in them. While I can do that too, I generally try to use the minimum words to carry the thought. (Though complex or very precise thoughts take more words). That difference seems to be an issue for both of us. So I read a long chunk of things you wrote and wonder “What the heck is THE point?”, where you will read a short chunk of mine and seem to wonder “What did he mean?” or take it to mean something it doesn’t.

    It makes exchanges with you a minor challenge (but one I’m “up for”).

    First, per “why I’ve not looked at your data sets and provided feedback as to my progress.”:

    I live a full and complicated life schedule. I have about 48 hours of projects and “life obligations” in any 12 hour work time. Some projects take YEARS to get started, or finished. I did a brief “peek” at the data sets, thought “Maybe something there, mark for future exploring, rank about B or C+ priority” (as a rough paraphrase). I pondered it while flying cross country last week, inspecting cloud structure along the way (layers of cumulus below, wispy cirrus at FL 37 (37,000 feet more or less), and some distant thunderheads being smeared out to wispy trails at FL 40+).

    Now, a couple of things about me. (Generally speaking “It isn’t about me”, but in this comment it is). First off, I take expediting very poorly. My innate and immediate response is to downgrade the interest and effort on the thing expedited. This is likely due to several older sisters who felt it their duty to “expedite” me to do what was on their agenda; so not your fault. Just be advised that “pushing to get your point looked at” achieves the exact opposite. Second, I get to a question “when it is ripe”. I have a tendency to mull over, contemplate, let it simmer in the mind, then, and only then am I motivated to act rapidly on it. Your “peak interest topic” is only in the “mulling” stage for me ATM. (Thus the cloud contemplation). Now mix those two: Pushing me on it is most likely to cause me to lose interest entirely. Kind of like a cat. Little flickers of motion gets the “pounce reflex” going. Push food in their mouth, they spit it out.

    To specifics:

    You said “To computer guys”. From your clarification, that was an error. “Computer Guys” are guys who know and work with computers. “Climate Scientists” are generally NOT “Computer Guys”. (Though some of them have a few programming classes and think they can write models. Having read their code, you can tell where a Computer Guy was contracted to write it vs. the “Ph.D. with one computer language class”.) Do not be surprised if I respond to the words you wrote and not to what you were thinking, since I can’t see what you are thinking.

    Words on a screen are cold dead things. Read them that way. ANY and ALL emotion in the words is entirely a product of the reader; and after about 40 years of such electronic text exchanges, I can assure you whatever emotion the reader projects has a better than 85% chance of being wrong and not what the writer was experiencing. This approaches 100% with emotion projected onto MY writing, as I’m a “dead fish” emotionally (as has been explained to me 100,000 times by folks expecting me to be emotional).

    So if you find yourself thinking things like “anger”, “hate”, “grumpy”, “upset” whatever about my emotional state, you WILL be wrong. Usually my strongest emotion toward any text is “mild irritation that it is ambiguous”. Heck, even when I write “I’m angry about” something, I’m usually not, just following social convention when I’m really “mildly annoyed”, since when I say “I’m mildly annoyed my friend died” people think (and sometimes say) I’m an “emotional dead fish” and that makes me mildly annoyed…

    OK, enough about me.

    What would I do with 1 thermometer / sq. mi. for 1000 years? With NO dropouts in the data? Simple. I’d NOT average them together and I’d NOT try to find a Global Average Temperature, since both of those are broken physics and based on a failure to understand intensive properties.

    For each thermometer, for each time window (sample space, day, month, however often they are read) I’d compute a slope for THAT time for THAT device for the entire data series. (Read the articles under the dt/dt category where I detail that method. While I used “first differences”, in comparing GHCN v1 vs v3 I also used a baseline and found no significant effect from that.) Now the GHCN has only monthlies in it, so that was all I could use. Had it contained daily Min/Max data of sufficient time depth, I’d have used that instead.

    Now the Earth has about 197,000,000 sq.mi. of surface area, so I’d have about that many trend lines x sample rate, one for EACH thermometer x period sampled. Say it was once / day for MAX for 1000 years. That would be 1.97 x 10^8 x 365 trend lines, one representing each sq.mi. of the earth for each individual day, each of 1000 years length. Then and only then, I’d look for patterns in the change of those trend lines. Are they MOSTLY warming (sloping up), MOSTLY falling, rising in the N. Hemisphere and falling in the S. Hemisphere, Spring warming but Fall cooling? I would NOT start with a preconception of what they were expected to show me. Then I’d ponder what I found and try to figure out what it meant. Finally I’d ask why it might be happening.
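    To make the dt/dt idea concrete, here is a minimal Python sketch of a per-thermometer “first differences” trend. This is NOT the actual dt/dt code from those articles; the station names and readings below are made up purely for illustration.

```python
# Minimal sketch of the per-thermometer "first differences" trend idea.
# NOT the actual dt/dt code -- station names and readings are hypothetical.

def first_difference_trend(readings):
    """Average of the step-to-step differences for ONE device's series
    (oldest first): a crude slope per time step for that device only."""
    if len(readings) < 2:
        return 0.0
    diffs = [b - a for a, b in zip(readings, readings[1:])]
    return sum(diffs) / len(diffs)

# One trend per device -- the temperatures themselves are never
# averaged across devices (they are an intensive property).
stations = {
    "station_A": [10.0, 10.2, 10.1, 10.5],   # hypothetical deg C series
    "station_B": [25.0, 24.8, 24.9, 24.4],
}
trends = {sid: first_difference_trend(vals) for sid, vals in stations.items()}

# Then look for PATTERNS across the trends: mostly rising? mostly falling?
rising = sum(1 for t in trends.values() if t > 0)
falling = sum(1 for t in trends.values() if t < 0)
```

    The point of the sketch is the shape of the analysis: each device keeps its own trend line, and only the pattern across trend lines is examined, never an average of the temperatures themselves.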

    The problem with our present data is that first, it is FULL of holes. Often giant ones. This causes lots of problems with finding trends. Second, it starts with ONE thermometer not that long ago, so most “Climate Scientists” cut off the past at about 1850; far too short a time series to get trends in a system with known 1500 year cycles. Next up, it peaks at 6000 thermometers (briefly) and then plunges to about 1200. From that, 16000 “grid boxes” are assigned a “temperature” – utter farce. The whole thing is garbage physics end to end. It is “too little data”, in time, in space, in consistency of coverage. Then all the manipulations applied to it trying to “fix” that just make it worse instead, as they start with averages, then average them, then average them again (MIN/MAX daily average, averaged to monthly, averaged across space to infill missing bits…) when we know averaging temperatures gives you crap.
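    As a small worked illustration of why that chained averaging is suspect (synthetic numbers, not real station data): the daily (MIN+MAX)/2 figure that starts the whole chain is a midrange, not a mean, and for an asymmetric day the two can differ wildly.

```python
# Synthetic example: a day that sits at 10 C for 20 hours with a brief
# afternoon spike. Hypothetical hourly temperatures in deg C, NOT real data.
hourly = [10.0] * 20 + [15.0, 30.0, 15.0, 10.0]

true_mean = sum(hourly) / len(hourly)        # 11.25 C
midrange = (min(hourly) + max(hourly)) / 2   # 20.0 C -- what MIN/MAX records give

# The "daily average" that feeds the monthly and gridded products is
# already off by 8.75 C here, before any further averaging happens.
```

    Every later step (monthly averages, spatial infill) is built on top of that first biased number.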

    My point about sensing was that senses ARE thermometers. You left out the point that YOU stated what YOU believed to be the ‘right’ temperatures. Saying, in essence, you must have been wrong and it was 98.6 F skin is clearly saying “you were wrong”. JUST DON’T DO IT. Accept that people say things that are what they experienced. So I demonstrated the basis for my knowing what I EXPERIENCED. Just accept my testimony and move on. Challenge “the witness” and you will get the “witness qualifications” in defense. Don’t like that, don’t challenge the witness.

    Now I know most folks don’t have “Calibrated Feet”, but I’m not most people, so I showed my qualifications for knowing what temperature my feet are at to about 5 F. (FWIW, most of the time I can tell you the time to within 10 minutes. Less accurate now that I’m retired, but for years I kept my internal time calibrated to that. So when I say “It was about 10:30 A.M.”, even if I don’t have a watch or clock it is pretty solidly between 10:20 and 10:40 A.M.) Early in life I decided calibrating my senses was a “good thing”…

    “MY comments were not intended to be critical of what you wrote”: See, now I get “cognitive dissonance” out of that when a good chunk of it was telling me I can’t have a numerical value for temperatures I sensed and that I know and then assigning other values to those observations as replacements that you think are more right. That is, de facto, being “critical of what you wrote”. Then you say you weren’t, what clearly was. That doesn’t work for me. Pick ONE of two mutually exclusive cases…

    Per intensive vs extensive and ground temps: BY DEFINITION the very property of being intensive (or intrinsic) means that ground (as A THING) will be disjoint from air (as a different thing). No, I didn’t call it out as a specific example in that specific place, it is a GENERAL TRUTH. Like saying “wet isn’t dry”, I don’t have to then list “lakes are not dry like ovens” and “oceans are not dry like deserts” and “spit is not dry like towels”. The general encompasses ALL of the specifics. That was the whole point.

    “If my intelligence (or abilities) is not up to your standards, please tell me and I will bug off.”

    I don’t have “standards” about others “intelligence (or abilities)”. I learned long long ago that I.Q. 150 people can be incredibly dumb on some (sometimes many…) things; and that folks at below average can “get it” with a bit more time and care on things one would never expect.

    I do have a few simple behaviours that are expected from others. Mostly what you are told in kindergarten. Play well with others. Don’t be critical of others (“If you have nothing good to say, say nothing”). Share. Etc. I do get a bit irritated at folks doing a “j’accuse” of error when there is no error, and I learned long long ago to NEVER let a challenge go unresponded. (Grammar school. Someone tosses rocks at you, deck ’em or you will only get more rocks.) Don’t put words in other folks’ mouths. Expect them to “say what they mean and mean what they say”. Don’t make stuff up and don’t lie.

    Oddly, in my experience, the least educated and “dumber” folks rank higher on those skills and it’s the really really bright highly educated who fail the most. Lawyers and Politicians especially, but M.D.s sometimes and Lord Help You if you run into a Left Wing Professor.

    So no, I have no problem with your “intelligence or ability”. I do find your expedites on YOUR Hot Button a bit edging toward irritation (“IF it really is that important to you, you research it and write it up” pops to mind). I do find your tendency to write a half dozen sentences and not give me one clear point to use as a handle a bit distracting. (i.e. vague as to position or conclusion). I do find your tendency to not read my actual words and accept them as “my statement” but want to “correct it” a mild irritant. And I do find your repeated assertions about my emotional state (am I happy, irritated, or pissed at you? You CAN NOT KNOW via text) as indicative of a limited total time dealing with internet communications.

    So I’ve given you a very long, very prolix response. Hopefully it has enough in it for you to get some “take home messages”. In keeping with my “long format style” (usually only used in formal presentations) of “tell ’em what you are going to tell ’em, then tell ’em, then tell ’em what you told ’em” I’m now going to summarize the key points:

    1) Reading ANY emotion into typed text is prone to extreme and frequent error. Don’t do it, and if you DO do it, assume you were wrong as that is usually the case.

    2) Telling folks their experiences were wrong or wrongly reported is generally an error, and always going to get a bad response from me, in particular.

    3) Expecting someone with a full workload plus some to immediately drop what they are doing and chase your favorite topic will bring you a lifetime of grief. Best get over that now. Just because it may take me a year to make it the “top of the todo list” doesn’t mean I didn’t like it, or that I’m not looking at it.

    4) It is best to think twice, or even 3 times, before hitting “send” to assure the words you type mean what you are thinking. Often they do not. Editing is a key skill. Experience improves this performance.

    5) Intelligence is highly overrated. Persistence, care, consideration of others, skepticism, neatness of thought; all are worth far more to me in terms of accomplishment and “keeping a tidy mind”.

    6) Expediting volunteers, and especially me, is a Very Bad Thing.

    7) When you type “Computer Guys” it means ONLY “Computer Guys”, not “Climate Scientists who use Computers a lot”. That does not change if you are thinking “Climate Scientists who use Computers a lot” when you type “Computer Guys”. Only the typed words come across, not what you think, and certainly not what a person is feeling.

    8) I’ve already done what I’d do with 197,000,000 thermometers. Make trends for each device for each time step as distinct trends. That’s all in the dt/dt stuff I pointed at. Changing the size of the data doesn’t change what I would do. I’d NOT do what was done by “Climate Scientists” as it is dumb as a bag of rocks, based on broken math, violates Nyquist and Intrinsic Properties, has lousy error bars, and more.

    9) The existing data is full of holes, dropouts, has inadequate coverage, variable precision, and insufficient quality thermometers over enough spatial and time ranges. Basically, it’s too crappy to give any valid answers about climate. What is done to “fix it”, doesn’t.

    10) A general solution / explanation covers all the specifics. Don’t expect a canonical listing of specifics in a general proof.

    11) I do not judge a person’s quality by either their raw intelligence nor by their education. I’ve learned neither of those matter all that much. I do judge them by their care, caring nature, curiosity, politeness, and a few other things. I’ve generally found better people in the “teeming masses” than in Academia… though I do have to admit that the most interesting folks have been the ones who pass the care, caring, polite, etc. screen AND are “sharp cookies”… most due to a keen wit and sense of humor….

    12) I’m not particularly interested in anyone “going away”, just not interested in a large time sink without much gain. Like explaining myself 3 times… or reading the same expedite 4 times.

    OK, that’s likely enough redundancy and variation to get the points clear, one way or another. With luck, it won’t be misconstrued.

    @Larry:

    BINGO! That’s the problem of intrinsic properties. EVERY single spot is a different value. There can be NO “Global Temperature”, only temperatures of a zillion different spots. Now you could define a “Global average OF temperatures”, but you could not do any heat analysis using it or say if the earth was warming or cooling using it… it is ONLY a statistic about the collection of data points process, not a physical state.

    The best example I’ve seen is that telephone numbers are intrinsic to a particular phone, so “an average of the telephone numbers in California” is pretty useless and will NOT be a phone number in California (as the area code average is not an area code in California…)
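    That phone-number point can be shown in a couple of lines. (A toy demonstration using a handful of real California area codes; the list is illustrative, not exhaustive.)

```python
# Averaging an identifier (or any intensive quantity) yields a number
# that is not a member of the set and has no dial-able meaning.
ca_area_codes = [209, 213, 310, 408, 415, 510, 530, 559, 619, 650,
                 707, 760, 805, 818, 831, 858, 909, 916, 925, 949]

avg = sum(ca_area_codes) / len(ca_area_codes)
# avg comes out around 645 -- not a California area code, and useless
# for reaching any phone. A "statistic about the collection", no more.
assert avg not in ca_area_codes
```

    The same failure applies to temperature: the average describes the collection process, not any physical state of any place.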

  21. philjourdan says:

    The use of the term “data” to label computer output is intentional. It is designed to obfuscate reality with more numbers than the person can handle, thus overwhelming them and causing either an indifference, or assent to the preconceived conclusion.

  22. p.g.sharrow says:

    “If you can’t confuse, daze with BS.”
    In this case with computer GIGO and Bad Science…pg

  23. jerry l krause says:

    Hi E.M.,

    “Some projects take YEARS to get started, or finished. I did a brief “peek” at the data sets, thought “Maybe something there, mark for future exploring, rank about B or C+ priority” (as a rough paraphrase). I pondered it while flying cross country last week, inspecting cloud structure along the way (layers of cumulus below, wispy cirrus at FL 37 (37,000 feet more or less) and with some distant thunderheads being smeared out to wispy trails at FL 40+.”

    It takes years. You brush aside the years I have invested in reading about, pondering, considering actual observations (data) about weather. I commended you for successfully focusing the attention of some people upon the influence of clouds upon the temperature of the earth’s atmosphere that we breathe when out of doors. Something at which I have not had near the success that you did. Yes, it is a very obvious fact, but it also has been a seldom considered fact.

    You refer to holes in the data which is available. You saw the clouds which are seldom part of any dataset nowadays, because most automated systems do not detect clouds above 12,000 ft. When there were on-the-ground observers, the wispy cirrus at 37,000 ft would have been noted on the log sheet.

    Since I discovered the SCAN data (being observed for 20+ years) less than a year ago, I have invested hours and hours of time trying to see what might be seen. As I understand it, you stated you would study the trends of each and every thermometer over a period of time. So, it does not matter how many thermometers there are. And it does not matter how long a period of time you have data for; to establish a daily trend, a weekly trend, a monthly trend, a yearly trend, etc., you have to start with, at a minimum, hourly data. The radiation and air temperature data of the SURFRAD project is measured and recorded every minute. This allows one to see how rapidly the values of the measurements can change on this short time scale.

    You refer to your calibrated body sensors with a significant degree of uncertainty. Scientists long ago began to invent instruments capable of measuring whatever with far less uncertainty, because fundamental measurements with a significant degree of uncertainty could never have allowed the scientists to discover the scientific laws which needed to be explained, if possible. Without Tycho Brahe’s careful naked-eye astronomical observations, Kepler could never have discovered the three laws which describe the motion of planets about the sun. And Newton would have had nothing to explain with his universal theory of gravitation. Do you know that Newton stated he did not know the cause of gravity, but from observations he knew that its influence never becomes absolutely zero (that was his theory) and comets were the physical evidence of this?

    Relative to your discovery of trends, the problem of weather is to better understand how unpredictable (not consistent with any trend) weather events occur. And we can only begin to discover this by observing actual data to see what is actually occurring when and where.

    You cannot study the possible influences of wispy cirrus and thunderstorms if you do not know when and where they have existed. And you cannot study the possible influences of cumulus, cirrus, etc. clouds unless you study the data which does exist, instead of making excuses that there are holes.

    Have a good day, Jerry

  24. larrygeiger says:

    They did not read Mr. Briggs book…

  25. E.M.Smith says:

    @Jerry:

    Now here’s a good example of the “disconnect” I get when exchanging text with you. First, you have a minor complaint that I haven’t done anything about your Favorite Issue (2 data sets) in only a few weeks. So I responded. I was talking fairly clearly about ME and how I set MY SCHEDULE. I said:

    “I live in a full and complicated life schedule. I have about 48 hours of projects and “life obligations” in any 12 hour work time. Some projects take YEARS to get started, or finished.”

    Clearly about my “life schedule” and “I have…projects” and how soon I start MY projects, and that can take years for lower priority things. In NONE of this were YOU mentioned nor YOUR YEARS. So your response? All about YOU.

    “You brush aside the years I have invested in reading about, pondering, considering actual observations (data) about weather. ”

    Let me make it very very clear, hopefully:

    YOU and YOUR YEARS are not relevant to SETTING MY SCHEDULE. It has zero bearing.

    So when I say “I might get to it next week, or next year, or in 5 years”, that has exactly zero to do with YOU, your years, your schedule.

    Conflating them just causes me to desire to stop wasting time on “engaging” over it. If you can’t keep ME and MY SCHEDULING PROCESS separate from YOU and YOUR YEARS when my paragraph topic sentence starts with “I live in a full and complicated life schedule” (just drop out the modifiers and the core is “I…and…schedule”), what hope is there of ever having a useful conversation, as pretty much every other possible sentence on science topics will be more complicated and highly likely to be even more misconstrued.

    So, want to know what “problem” I see in communicating with you? Start there. Don’t conflate or mutate topics. Learn to connect the topic sentence of a paragraph with the detail in it AND NOT TO EXTERNAL THINGS ONLY IN YOUR HEAD AND NOT ON THE PAGE.

    Another example:

    You ask what I’d do with thermometers, and about holes in thermometer data. I respond with both reference to the dt/dt examples/code, a hypothetical example, and a description. You then run off to Your Favorite Topic (2 data sets) ignoring the context and ignoring that I’ve clearly ONLY talked about TEMPERATURES. Fine, you have a Hobby Horse and just can’t stand not to ride it. Yet it’s another example of “not sticking to topic”, mutating context. Shifting the discussion. i.e. entirely irrelevant to me, what I wrote, or what I’m going to be thinking about. Good luck with that as a conversational technique; it will fail utterly with me. I stay (sometimes ruthlessly) on topic on science things.

    Third, and final, example:

    I give a story with illustrative temperatures to show how you can not average an intensive property. I was there. I experienced it. I am THE best positioned source for the best idea what actual temperatures were THERE and THEN. You then choose to “adjust my data” with speculative flair. I point out that’s not going to cut it; and since the “witness” was challenged, present my credentials as having better calibration AT THAT PLACE and AT THAT TIME than YOU can possibly have on mere speculation and zero actual input. Now the really weird part happens: Somehow you see that as a “jumping off point” for asserting the superiority of actual instrumentation AS THOUGH I WAS SAYING THE OPPOSITE WHEN I WAS NOT. I have zero claim to any precision better than 5 F for “me and my feet”; yet you launch into some mini-rant about how much better instrumentation is. Guess what: I completely agree instrumentation is better than my feet BUT it is NOT BETTER IF IT ISN’T THERE. So your point is entirely bogus and irrelevant to the TOPIC of my having “good enough” estimates of temperatures in MY story about MY experience for illustrating MY topic of averages of intrinsic properties being BOGUS.

    So what you are doing is commonly called “Talking past the other party”. I don’t DO talking past the other party. If that is going to continue to be your “style”, I’m not engaging with it.

    Basically, If you can’t see the topic, stick to the topic, and stay ON TOPIC for things I’ve said, I’m going to leave you shouting into the empty ’cause it just doesn’t interest me to try to clear up what wild hare path you might have dashed off to or what rabbit hole you have fallen down.

    (BTW, that “wander off” and “talking past” is a common failure mode for Chat Bots, so I’m particularly sensitive to that; as the LAST thing I care to do is sink time into playing footsy with Chat Bots. They’ve gotten fairly good now and can ‘pass’ on simple topics. Another flag is failure to address specific questions put to them that are outside the training scope. You manifest that one as well. I have no position on the question of does that point to you being a Chat Bot vs you having the same failure mode by other means. My only position is that I don’t engage such behaviours.)

    So, in closing: I’ve heard your interest in your Favorite Cloud Topic and your 2 Favorite Datasets. I’ve already said I’m going to look into it, and I will. You have zero impact on the schedule other than possibly causing me to lose interest by over expediting. I expect to become interested in it after a few weeks of studiously ignoring it now since, at the moment, it is “food stuffed involuntarily into my gullet” and I’m going to gag a while. So give it a year, maybe two now, and I’ll have something. Maybe. Until then, I’m not going to engage on any Off Topic comments from you, nor acknowledge anything about Your Hobby Horse. You want it done, you ride it.

    Gotta go, I’ve got a dozen postings I want to do backed up en queue and this isn’t getting them done. I’ve also got shopping to do, a garage to clean, and a car with a dead alternator to fix. Then there’s that set up a scrape for the list of sites from a few postings back, build a GHCN all versions database and write comparison codes, go through my 9 TB set of “misc. disks” and prune the dead data out (as much was consolidated onto a 4 TB new archive) and I’ve got about a half dozen “old PCs” to scrub and disposition. (Am I really ever going to need to run Windows 95 and Word from that era?…) and that’s not even 10% of what’s at the top of the ToDo list for today (most of which will not get done…)

Comments are closed.