WOOD3

This is another of the W.O.O.D. series of semi-regular Weekly Occasional (i.e. if I forget and skip one, no big) Open Discussions.

Immediate prior one here:
https://chiefio.wordpress.com/2017/08/23/wood2/
and remains open for threads running there
(at least until the ‘several month’ auto-close of comments on stale threads).

Canonical list of old ones here: https://chiefio.wordpress.com/category/w-o-o-d/

So use “Tips” for “Oooh, look at the old shiny thing!”
and “W.O.O.D.” for “Did you see what just happened?! What did you think about it?”

For this week, I’m going to toss two topics in the hopper. Do note that Hurricanes have their own open thread:

1) Trump cut a deal with the Democrats, now the RINOs are pissed… Swamp is full of swamp creatures of all kinds.

2) North Korea: To bomb them into the stone age would be a Very Bad Idea. Don’t want to improve their society after all… But one little drone on a Few Key Leadership and Nuclear Developers could work wonders… As there was no end to the Korean Police Action, just an armistice (i.e. pause), and even that states no new weapons are to be introduced, we can just pick up where the old UN Resolutions left things…


114 Responses to WOOD3

  1. Larry Ledwick says:

    An article you need to show the folks that advocate electric cars.

    http://www.dailymail.co.uk/news/article-4764208/Child-miners-aged-four-living-hell-Earth.html

  2. E.M.Smith says:

    @Larry:

    Having just driven from Chicago to Silicon Valley in about 32 drive hours (44 elapsed I think…) just ask them how they would do that with frequent stops of hours duration to recharge the car…

    Or just ask them where in the middle of nowhere Wyoming they are going to do that charging? Some city spacings made my Diesel range a bit less than desired. Nevada comes to mind…

    (Why do it? Because I had to transport 2 dogs and didn’t want to either sleep in the car with them any more than absolutely necessary nor deal with extensive hotel costs and negotiations for pets in the room… Oh, and wanted to be home a few days early… that whole weekend fun thing…)

  3. jim2 says:

    E.M.Smith says:
    9 September 2017 at 5:22 am
    @Jim2:

    While I appreciate your dogged determination to “stick to your guns”, doing so when it is absolutely wrong is not particularly a valuable skill.
    **********************************************
    This is what I said:

    “The “temperature” we measure with a mercury bulb, thermocouple, or thermistor is the average of the “temperature” (energy) of the particles impinging upon it. I take issue with the notion that one can’t calculate an average of temperatures. It’s certainly mathematically possible. I contend it even has physical meaning. The global average temperature (during Snowball Earth) compared to the global average today would convey meaningful information about the state of the Earth at those times.

    None of the above implies temperature and heat are the same thing.”
    **********************************************************************************************
    I said nothing there about global warming, determining heat flows, the difficulties of getting surface air data, or any of the myriad of topics you have introduced via my very simple and accurate statement.

    I’m not sure if this is a debate ploy or if you simply assumed I was addressing all these other issues.

    Whatever the case, my simple statement is correct and backed up by mathematics and the kinetic theory of gases. My Snowball Earth example backs up the case that a global temperature average, in principle, would convey meaningful information.

  4. p.g.sharrow says:

    Data gathering devices are what they are. As a refrigeration tech I often use my hands and nose to do an initial evaluation of a system’s condition and temperature. The “gathered” information is valuable to me, but not so much for others.

    In much of science the data gathered is as much an artifact of the detector as of the actual thing being measured. Facts are inferred based on assumptions and logic.

    Sloppy thinking can yield poor conclusions that feed ego and not understanding. Add to this imprecise definitions of the terms used, and the result is confusion in communicating with others.
    Back in the old days of “slip stick” computing, one needed to keep a running mental total of the calculation to reach a usable result. Today they just plug in the figures and expect their computer to give the correct answer. If garbage goes in, garbage is the result. There is no logical correction factor built in to shout, ERROR!…pg

  5. Larry Ledwick says:

    One of the first things they taught us when we got introduced to slide rules was to make a mental ballpark calculation of the probable correct result, so you could tell if you were a factor of 10 or 100 off in your slide rule calculations.

    Those reasonableness checks have caught problems more than once with computers too.

  6. pouncer says:

    Dr Pournelle contributed to my (confessedly very limited) ability with Personal Computers via his columns at the old dead tree magazines like “Byte”, “Compute”, “PC Magazine” and similar. He, and I, attempted to get real work done on Personal Computers before the abbreviation “PC” was captured by IBM-compatibles, when such devices might include Commodores, Sinclairs, and (anybody remember?) the Coleco Adam. It was his attitude more than his expertise that carried him, and his readers, along. Our current host’s posts about the Raspberry PI are very much in the same tradition. Thank you. I could never have succeeded in my various jobs without the informational gifts in essays provided by generous souls like these.

    As my own grateful reciprocal offering to Jerry’s ghost, I follow up on an earlier comment, here, about the very cheap “Windows 10” tablets being clearance-priced this summer. Mine is the 8.5 inch NAXA-9003 touch screen tablet but there are half a dozen similar makers and devices offered by a big-box electronic supermarket. I expect the advantages, problems, and work-arounds are similar for my widget and the others.

    The advantage is that this mini (32 bit) Windows 10 as the OS (not actually doing anything except OS-ing) does the Windows things exactly as I see on bigger and more powerful machines. As a Win10 training device it works: a Barbie-themed battery-operated plastic jeep teaches six-year-old girls on the sidewalk how to work the accelerator and brake pedals, the steering wheel and horn, and how to use the mirrors for more than checking make-up. To call a device a “toy” is not necessarily pejorative: an adult seeking Windows practice is better fit with this tablet than a similarly sized person learning to drive on the sidewalks in a Mattel vehicle. I offer faint praise, but it is sincere.

    There is an update from Microsoft numbered 1703 that despite many attempts I CAN NEVER INSTALL because the onboard memory is too small. It is designed by Murphy, in that this particular update promises me the option to delete certain “built-in” apps from Win10 that can’t be removed by older methods and tools. Which would free space to download the tool, which tool I need to free the space, and there’s a hole in my bucket, Eliza. When I do use the old tools to clear out just about everything I can from the factory installed suite, then the other smaller weekly downloads (notably the security patches) seem to install correctly.

    One of the installed packages I have un-installed was Skype. Turns out there is a version of Skype that runs on the SD Card as a “Portable App” which seems to do all that I need Skype to do. It requires that I have it running before allowing a contact to “ring” in to me. So it’s a work around rather than the live tile POTS-like communication service Skype might be on a desktop system. It’s sufficient for a small circle of contacts with infrequent but scheduled needs.

    The “Portable Apps” (PA) solution generalizes to several other functions: web browsing, simple spreadsheets, and (as mentioned) various versions of un-installers and security tools. This approach to software makes the investment in a large (64 GB) micro SD Card justifiable. It also appears to me that many of the PA tools are open source, compiled for Windows; they could be compiled instead for Linux, Android, or Mac. So once I become familiar with the app on my Windows tablet I could, theoretically, carry over the training and experience to a different device. This is an attractive prospect given that the effort to make myself familiar with Microsoft apps has historically provided frustration — features go away in upgrades.

    I’d mentioned previously that the tablet will run the PA “DOSBOX” and, if supported by a USB mouse, allow me to resurrect old experience and run software that my fingers understand without contributions from my brain. This is still true but … there is a hardware problem.

    NAXA like most cheap tablets has only one USB port, a cloacal opening that attempts three functions but, most conspicuously, lays an egg. The OTG function will not allow electrical recharging at the same time as Input / Output as either a host or slave. A fully charged battery only lasts a few hours if doing anything halfway serious. So if a powered hub is plugged into the micro-USB and a mouse, a full sized keyboard, and a USB memory stick (rather than SD Card) are plugged into the hub, even though the peripherals draw power from charger rather than tablet battery, the task had better be limited.

    The worse problem is that the single micro USB slot has quickly worn itself loose with the variety of invisibly-differently-sized plugs being changed in and out. (There’s a slot / slut pun apparent in the discussion we need not elaborate upon.) The original small USB keypad that sells with the NAXA kit was the first device to suffer, with the plug dropping from the loose slot when the tablet was moved. With a needle I can pry up the little spring pins on one side of the plug to enhance its apparent size and keep it in, slightly more securely. But it seems a very temporary patch rather than a fix, and more recently the plug from my powered hub is breaking a signal connection at unpredictable moments — even when power appears to continue. So, loose socket on the tablet — which breaks DOSBOX.

    A broken mouse or keyboard connection when using the PA Windows specific tools usually still leaves me the option of controlling work with the touch screen. It’s annoying for anything but Skype, but not a hard dead stop on the work.

    Given the problems with the USB slot I have only once experimented with an external WiFi adapter. During that experiment I noticed no improvement in signal strength or speed compared to the on-board WiFi. The internal adapter is spec’d as b/g/n and connects to my router, as far as I can tell, in the “G” mode – but that solidly and in any room in the large two-story house.

    The camera and speakers aren’t even as good as on my Tracfone. They work, but are no reason to buy this device. Consequently the screen resolution and image displayed are incongruously GOOD compared to the image captured, or to the (crummy, mono) sound accompanying a good-resolution YouTube video.

    I’ve had a comparably-sized, similarly-priced, under-powered Android tablet that crashes with most Android apps, but offers a comparable camera, better speakers, a slightly worse display, but (oh, joy!) a “Type C” mini-HDMI port. If I could make Skype-for-Android work on this cheap Android tablet (porting the image up to a large and “dumb” TV), it would be my go-to device for that need. As is, it mostly leaves me regretting that the Windows cheap tablets examined so far exclude HDMI ports from the hardware configuration.

    Anyhow, I am obviously not Jerry Pournelle, nor Mr Smith. But for anybody considering a very low priced solution to various limited problems, I hope this info is helpful.

  7. cdquarles says:

    I also cut my teeth on slide rules. My grandfather was good with one. Before the latest ‘leftist’ ‘education’ fads took over back in the day, we were taught how to estimate answers and, in hard science classes, how to use units to sanity check calculations. Oh, back in those days, you were expected to memorize basic facts and algorithms. There is no learning without memorization followed by verification.

  8. jim2 says:

    I also started out using a slide rule. I was a chemist for 15 years, an electronics technician, and work currently as a programmer. In spite of all my training and experience with measurements, if my electronic freezer thermometer says the temperature is above zero, I look into it. Most of the time I find soft ice cream. It is a trust builder – as if I needed it in the first place!!!

  9. E.M.Smith says:

    @Jim2:

    I think you replied to “wood2” when that discussion is over on “wood3”. A risk I expected in leaving similar name threads open. My response is there.

    https://chiefio.wordpress.com/2017/08/23/wood2/#comment-86368

    FWIW, I never asserted YOU made a conflation of heat and temperature, nor did I assert that YOU said anything about getting surface data nor determining heat flows. I only discussed those things as illustration and to show the relevance to the whole Global Warming thing.

    Near as I can tell, the only thing you have (doggedly) wrong is the idea that you can average intensive intrinsic properties and preserve meaning. It just is not possible.

    It will SOMETIMES be ACCIDENTALLY close enough to a proper result from the extensive properties, but then other times can be horridly wrong. Like a stopped clock is right 2 times a day… but you don’t want to use it to tell time.

    @P.G.:

    On the road I had bought some stuff at a gas stop. $x.45. I put the bills on the counter then plop down a quarter and 4 nickels. The clerk picks up the quarter and one nickel…. looks at it. Picks up another, has a bit of a think. Picks up the third, longer think. Finally picks up the fourth… Clearly didn’t do 4 nickels = 20¢, plus a quarter is 45¢.

    I still travel with a circular slide rule…

    @Larry:

    I always have that ‘reasonableness check’ running. Great habit…

    I learned to do an ‘order of magnitude’ check with the exponents and to do a ‘units equation’ with just the units. Between the two, you are within a factor of 10 of the exact answer and know if your problem set-up gives the right units. Then it’s just the little bits to do ;-)
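
    To make those two habits concrete, here is a minimal Python sketch (the problem and numbers are invented):

    ```
    # "Order of magnitude" check: keep only the powers of ten.
    # Example problem: (6.2e5 * 3.1e-3) / 4.9e2
    exponents = 5 + (-3) - 2              # multiplying adds exponents, dividing subtracts
    print("rough size: 10 ^", exponents)  # ~10^0, so expect an answer near 1

    exact = (6.2e5 * 3.1e-3) / 4.9e2
    print("exact:", exact)                # 3.92..., within a factor of 10 of the estimate

    # "Units equation" check: carry just the units, as exponent dictionaries.
    # distance = speed * time  ->  (m/s) * s should cancel to plain meters
    speed_units = {"m": 1, "s": -1}
    time_units = {"s": 1}
    product = {u: speed_units.get(u, 0) + time_units.get(u, 0)
               for u in set(speed_units) | set(time_units)}
    print({u: p for u, p in product.items() if p != 0})  # {'m': 1} -- meters, as hoped
    ```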

    @Pouncer:

    Golly! (Blush) being compared to Jerry Pournelle… I spent many hours reading his stuff… maybe a little of him wore off on me… but not enough.

    Per W10 tablets: Every device has something it can do well. You describe what it does well. FWIW, I love my toys and learned more from them than most work things. When I really love a machine at work, I’ll call it “My Toy”, so definitely not a pejorative! ;-)

    FWIW, I’d buy a tablet or two were it not for the fact that my office / junk room are stuffed. I absolutely MUST remove an object to bring in any new. So I’ve set a rule for myself that until the space is preened and at least 40% volume reduced; no new toys without disposing an equal volume FIRST…

    I may miss some opportunities, but it’s my only option right now. Even the R.Pi hardware required something else “go”… (but I had a larger volume ’empty packages’ stash that was acceptable ;-)

    @C.D.Quarles:

    It was in High School Chemistry where I learned the slide rule (a 6 foot giant WORKING one hanging in the front of the class!) and where we were required to set up the “units problem” prior to doing anything with the numbers. Stellar approach!

  10. jim2 says:

    EMS: “Near as I can tell, the only thing you have (doggedly) wrong is the idea that you can average intensive intrinsic properties and preserve meaning. It just is not possible.”

    That’s just wrong. We disagree. And that’s OK.

  11. Power Grab says:

    My dad had a slide rule. I never learned to use one, though. Maybe I should tackle that. I’ve been getting on a kick of learning how to do things without computers.

    My dad did give me his homemade abacus. He made it after watching the locals use them when he was in Japan during the Korean War.

  12. E.M.Smith says:

    @Power Grab:

    It isn’t hard at all, really. You can layer a lot of complications on top of it, but at the core it is just adding logarithms. Two log scales are moved relative to each other to effectively add two logs together, then the result read off under the cursor. Start with the C and D scales and just do that. The rest is elaboration…
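
    What the C and D scales do mechanically, in a few lines of Python (a sketch of the principle only; a real rule reads mantissas off the scales and leaves the decimal point to the operator):

    ```
    import math

    def slide_rule_multiply(a, b):
        # Sliding the C scale along the D scale adds the two log distances;
        # reading under the cursor takes the antilog of that sum.
        log_sum = math.log10(a) + math.log10(b)
        return 10 ** log_sum

    print(slide_rule_multiply(2, 3))      # 6.0 (to rule-reading precision)
    print(slide_rule_multiply(2.5, 3.2))  # 8.0 (to rule-reading precision)
    ```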

    @Jim2:

    Asserting a thing is wrong does not make it wrong.

    http://research.omicsgroup.org/index.php/Intensive_and_extensive_properties

    An intensive property is a bulk property, meaning that it is a physical property of a system that does not depend on the system size or the amount of material in the system. Examples of intensive properties include temperature, refractive index, density, and hardness of an object. When a diamond is cut, the pieces maintain their intrinsic hardness (until their size reaches a few atoms thick).
    […]
    An intensive property is a physical quantity whose value does not depend on the amount of the substance for which it is measured. For example, the temperature of a system in thermal equilibrium is the same as the temperature of any part of it. If the system is divided the temperature of each subsystem is identical. The same applies to the density of a homogeneous system; if the system is divided in half, the mass and the volume change in the identical ratio and the density remains unchanged. Additionally, the boiling point of a substance is another example of an intensive property. For example, the boiling point for water is 100 °C at a pressure of one atmosphere, a fact which remains true regardless of quantity.

    According to the state postulate, for a sufficiently simple thermodynamic system, only two independent intensive variables are needed to fully specify the entire state of a system. Other intensive properties can be derived from the two known values.

    Some intensive properties, such as viscosity, are empirical macroscopic quantities and are not relevant to extremely small systems.

    Now take two such systems, one at 10 C the other at 30 C. Average those two temperatures and you get 20 C. Yet no such system in thermodynamic equilibrium exists and no such temperature property exists in your test space. It most absolutely IS NOT A TEMPERATURE. By definition, a single temperature is the intensive property of a system in equilibrium. The 20 C is a hypothetical construct of ill defined nature and NOT a temperature.

    Physical example:

    Mix two pots of water, one at 0C the other at 20 C. The final temperature of the combined system now depends on the relative sizes of the two pots and the melted vs frozen status of the first one. It is not possible to say the final system temperature will be 10 C, yet that is what the average of those two numbers would be.

    The average of the two temperatures is NOT the same as the final actual temperature in all but one carefully controlled case, where you hold the other extensive properties constant at one value.

    Averaging temperatures without those other extensive properties is invalid thermodynamics.
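
    A back-of-the-envelope Python sketch of why the extensive properties matter (masses invented; constants rounded):

    ```
    # Water mixed with water: the final temperature is the MASS-weighted mean,
    # not a plain average of the two thermometer readings.
    def mix(m1_kg, t1_c, m2_kg, t2_c):
        return (m1_kg * t1_c + m2_kg * t2_c) / (m1_kg + m2_kg)

    print(mix(1.0, 0.0, 1.0, 20.0))  # 10.0 C -- only the equal-mass case matches the average
    print(mix(4.0, 0.0, 1.0, 20.0))  # 4.0 C  -- same two readings, different answer

    # Now let pot 1 be ICE at 0 C. Melting 1 kg of ice takes ~334 kJ, but cooling
    # 1 kg of water from 20 C to 0 C only releases about 4.186 * 20 = ~84 kJ.
    heat_available_kj = 1.0 * 4.186 * 20.0  # mass * specific heat * delta T
    heat_to_melt_kj = 1.0 * 334.0           # mass * heat of fusion
    print(heat_available_kj < heat_to_melt_kj)  # True: ice remains, final temp 0 C, not 10 C
    ```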

    Extensive properties

    An extensive property is defined by the IUPAC Green Book as a physical quantity which is the sum of the properties of separate noninteracting subsystems that compose the entire system. The value of such an additive property is proportional to the size of the system it describes, or to the quantity of matter in the system. Taking on the example of melting ice, the amount of heat required to melt ice is an extensive property. The amount of heat required to melt one ice cube would be much less than the amount of heat required to melt an iceberg, so it is dependent on the quantity.

    Extensive properties are the counterparts of intensive properties, which are intrinsic to a particular subsystem. Dividing one type of extensive property by a different type of extensive property will in general give an intensive value.
    For example, mass (extensive) divided by volume (extensive) gives density (intensive).

    To get the intensive property of the actual combined systems (the end temperature) you need to use the EXTENSIVE properties.

    If you do not believe that, go argue with IUPAC.

    And no, it is NOT just “we disagree”. It is that you are asserting something that is entirely unacceptable to IUPAC, thermodynamics, and more. I’m just reporting their findings and rules. (i.e. not an ‘opinion’ so not subject to agreement or disagreement; a reported state of things.) And NO, it is not OK to assert being wrong is acceptable. To keep a tidy mind, things that are clearly wrong need to be kept marked as wrong and not allowed to propagate and dirty up more tidy places. The whole GAT fraud exists because folks are unwilling to keep a tidy mind on exactly this point. You can not get an intensive property of a temperature by averaging other intensive properties of temperatures. Thermo doesn’t work that way.

  13. jim2 says:

    “And no, it is NOT just “we disagree”. It is that you are asserting something that is entirely unacceptable to IUPAC, thermodynamics, and more.”

    You have picked a specific example to attempt to prove your point.

    If you have an oven, it probably has one thermostat which has a thermometer incorporated. You probably rely upon it to measure the temperature of the oven and adjust it accordingly. I assume you consider it to measure a physical property. Now, if you add a second thermometer, does the first suddenly become non-physical? What about its measurement? If one is at the bottom and one at the top, and you use the average of the two, then you will have a more accurate measure of the temperature of the oven, which IS NOT in equilibrium throughout.

    In fact, if you know the heat capacity of air, then the more points you measure the temperature, the more accurate will be the computation of HEAT contained in the oven.

    This illustrates my point. And I am correct.

    You have chosen a case that can’t be realistically averaged, so you got the result you desired.

  14. jim2 says:

    So it appears you have demonstrated a case where a simple average of temperatures will not be physically meaningful, and I have put forward a case where it is.

  15. kneel63 says:

    “… $x.45 I put the bills on the counter then plop down a quarter and 4 x nickles. …”
    I have several times tendered exact change when buying half a dozen items at the supermarket – notes and coins at the ready before even being told the total by the assistant. The vast majority (always <20 y.o.) seem stunned that someone could actually do this in their head, without resorting to a phone app or something.

    There's lots of little tricks to doing "close enough" math in your head, and it's still a valuable thing to know. One boss I had was getting the calculator out to figure out how much he would save on the discount on power for his new data centre that he'd organised (we were in a taxi). When I said "that's more than half a million bucks", there was a pause while he hit the equals button, then a "yeah – $523,918 actually. Did you do that in your head?" He always asked me to ballpark stuff for him afterwards, which always came with "that's a little on the low (or high, as the case may be) side, maybe 10% or so". If anyone bothered to check and came back with an exact number, his response was always "yeah, whatever. I just wanted a rough idea – 1,000 or 10,000, that kinda thing"

  16. Larry Ledwick says:

    Interesting bit on Hurricane Irma and how the winds can blow the ocean out to sea leaving the beach area dry. (opposite of the surge)

    https://www.washingtonpost.com/news/capital-weather-gang/wp/2017/09/09/hurricane-irma-is-literally-sucking-the-water-away-from-shorelines/

  17. kneel63 says:

    Jim2:
    “Look! I made fish sticks – they’re burnt on the outside and still frozen in the middle, so on average, they’re cooked perfectly!” – Lisa Simpson.

    You can average whatever you want to, but it doesn’t always provide helpful information. I do not want to eat fish sticks that are perfectly cooked “on average” if they are as Lisa describes. Nor would I be prepared to eat them if half were burnt all the way through and the other half frozen all the way through. So an average here doesn’t tell me what I want to know to make my decision.

    Your oven example does help with the decision required.

    The average phone number of my friends gives me no useful information, other than that whoever calculated it has mental issues or a very twisted sense of humor.

    So the questions, then, are: how, and under what circumstances, does calculating a global average temperature help? Which average is best to use for which decisions, and why? If you can show a useful purpose to calculating global average temperature that is not otherwise apparent without averaging, and is other than political, it would go a long way towards making your point – alas, neither you nor anyone else seems to have made such a purpose well known, if any even exists.

  18. E.M.Smith says:

    @Jim2:

    I’ve fought with more oven thermometers than I care to count. The latest at the Florida Friend’s home. I added the second thermometer… it didn’t help much. Putting one at the top and one in the bottom makes things worse, not better.

    In his case, near as I can work it out, the oven is overall too “slow” (meaning a chicken or potato that ought to cook in 1 hr to 1 hr 15 min. at indicated 350 F takes closer to 2 hours). Yet things that take 3 hours are not off too much… watching the added thermometer, the eventual temperature reaches the indicated ( over time a 350F setting gives a 350 F stable temperature) yet short term reads lower than goal. Similarly, the top seems significantly hotter than lower down… for the first 1/2 hour… BUT dark pans run hot as they absorb more IR from the burner down low. In short, it is too non-uniform in heating rates high vs low, warms too slowly overall, and tends to stratification.

    Averaging the lower vs top readings does nothing to inform about the stratification, the initial lag time, and the IR absorption issues. Pre-heating for at least 1/2 hour helps, as does adding 50 F at the start and dropping to goal at 45 min. The thing is just too out of equilibrium for the first 1/2 hour to an hour to make any use of averages (the information needed is in the difference, not the average).

    The sister-in-law has the same model oven we have… but the color is different. Mine is black speckled enamel inside, hers grey speckle. I need to bake things longer in hers or at hotter indicated setting as the IR transfer is different. Averaging the two yields nothing useful. Using two thermometers in one mostly tells you the calibration difference (but not which thermometer is wrong) but also shows some information about relative color of the thermometer in the IR. In no case does the average yield a more correct temperature.

    What I find most helpful is to bake a medium sized potato at 350 F indicated. When it is done, it becomes soft to a firm squeeze. It ought to take about 1 hour to 1 hour 15 minutes. If longer than that, the oven is slow. If done faster (and a bit crisp…) the oven is fast. It doesn’t matter what the thermometer / thermostat says then, it is inaccurate. It is no accident the spud adds mass and specific heat to the measuring process…

    These are not hypotheticals, they are observations made while doing the cooking. Every experienced cook knows some ovens are slow and some fast and the thermostat tells lies. Adding better calibrated oven thermometers helps, but only due to the calibration. Nobody averages the two. Cooking directions (especially baking) include statements about rack height and even pan color and composition, but never say to average temperature readings. This is because rack height and pan color are useful to control. Averaging readings is a waste of time that would give worse results.

    So what I offered were examples that illustrate a property and how to use it in conformance with thermo calculation standards. The example is not the argument, it is a learning aid. What you offered was an example arguing from the hypothetical as support for your assertion. But the reality of ovens does not support your hypothetical… which certainly does not displace the proper guidance on thermo from IUPAC.

    It isn’t a duel of hypothetical examples.

    Per your iceball earth hypothetical:

    First off, the major fault comes with the first word. Ice. By definition you are bringing in phase change and mass when you compare with ice vs without. I readily concede a gigantic mass of ice is colder than a tropical swamp. That proves nothing.

    The second fault is that you run causality backward. That ice is colder than swamp does not imply an average of cold thermometers will mean ice exists. The problem comes from the cases where the average is telling lies, not the one where a bucket of ice is known to exist. The second case has specific heat, heat of fusion, mass all incorporated (even if in horrid precision) by way of the existence of the ice. But put 1000 thermometers in Antarctica, and one in Brazil, then average them; and that average claims an iceball Earth where none exists. That’s the problem. Working from temperature to ice; not from extant ice to relative temperatures. Or put another way: You can find the statistic of the average from the data but you can not find the data from the statistic of the average.
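
    The 1000-and-1 thermometer placement, as a trivial Python sketch (readings invented to make the point):

    ```
    from statistics import mean

    # 1000 thermometers on the Antarctic plateau plus one in Brazil:
    readings = [-40.0] * 1000 + [28.0]
    print(round(mean(readings), 2))  # -39.93 -- "iceball Earth" from a mostly-warm planet

    # Thermometers at volcanoes / tropics / warm ocean on an actual iceball Earth:
    readings = [15.0, 22.0, 28.0, 4.0]
    print(mean(readings))            # 17.25 -- no hint of the ice at all
    ```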

    Do note: I have never asserted an average of thermometers can not be useful or interesting. Only that it will be wrong (to some unknown degree but often large) and is not usable for thermodynamics calculations of warming claims (or anything else that needs provable temperature accuracy)

    In particular, you can remove random error by averaging thermometers measuring the same system or object at the same time. It is finding that error term, though, and not finding an actual temperature. (Systemic error is not improved by averaging, so stays and could still make your recorded temperature quite wrong).
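
    That distinction in a quick simulation (the bias and scatter values are arbitrary):

    ```
    import random
    from statistics import mean

    random.seed(1)
    TRUE_TEMP = 350.0  # what the measured object actually is
    BIAS = 5.0         # systematic error: every reading runs 5 degrees hot

    def reading():
        return TRUE_TEMP + BIAS + random.gauss(0, 2.0)  # 2-degree random scatter

    for n in (1, 100, 10000):
        print(n, round(mean(reading() for _ in range(n)), 2))
    # The average settles toward 355, not 350: the random scatter shrinks
    # roughly as sigma/sqrt(n); the systematic 5 degrees never goes away.
    ```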

    So in the iceball Earth case, there were no thermometers present. Any temperature assigned to it was not derived from thermometers (averaged or not). Asserting it says anything about averaging thermometers is by definition a hypothetical. It is known to be covered in ice, so colder than now without ice. Yet I could place thermometers on an iceball Earth such that their average was well above 0 C. (Volcanoes, in the ocean, in equatorial tropical forests that did exist, etc.) Averaging those readings would not show ice. Would a well spaced global grid of well calibrated thermometers have an average lower than that same grid now? Most likely. But then we know the rough mass and specific heat of the measured surface (sort of, with high error) and we know ice is much much colder than the Sahara and Los Angeles… so we know the relative ranges are far far apart. Error has much room in which to hide. But what we can’t do is say that average is a temperature or representative of temperatures.

    It is a statistic about the population of temperature readings in the sample. It is no great leap to say a population of ice samples will have a lower average of temperatures. It is a great leap to say a low average of temperatures in a random sample means a colder Earth. It is insanity to say that statistic is a temperature to 1/10 C precision.

  19. Larry Ledwick says:

    The issue that bothers me most about “Global Average Temperature” is the method of creating it is undefined. It therefore literally has no defined meaning.

    It would be one thing to arbitrarily define an index called “Global Average Temperature” with a discrete and precise definition, but as it stands, we really have no clue how they are doing it (or have done it in various locations and institutions) and no historical record of how it was done by others.

    Let’s take another example of a completely arbitrary calculated index that is used as an indicator of the general economic health of the nation – the misery index.

    It is not a “real value” like the price of gold, it is a computed value that gives some intuitive sense of economic conditions.

    DEFINITION of ‘Misery Index’

    A measure of economic well-being for a specified economy, computed by taking the sum of the unemployment rate and the inflation rate for a given period. An increasing index means a worsening economic climate for the economy in question, and vice versa.

    Misery Index = unemployment rate + inflation rate
    (which unemployment rate? which inflation rate? Computed to one decimal place or two decimal places? )

    Read more: Misery Index http://www.investopedia.com/terms/m/miseryindex.asp#ixzz4sFVYAwSC

    But it has a definition and you can use that definition to go back and compute the misery index for any time in history when you can derive the unemployment rate and inflation rate, which allows meaningful comparisons (at least as meaningful as the inputs are).
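
    That is the whole appeal of a defined index: the definition IS the computation, so anyone can reproduce it for any period (a trivial sketch; the figures below are examples, not official statistics):

    ```
    def misery_index(unemployment_rate_pct, inflation_rate_pct):
        # Per the definition quoted above: simple sum of the two rates.
        return unemployment_rate_pct + inflation_rate_pct

    print(round(misery_index(7.1, 13.5), 1))  # 20.6 -- a rough late-1970s-style reading
    print(round(misery_index(4.4, 1.9), 1))   # 6.3  -- a rough 2017-style reading
    ```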

    Global average temperature however has no specific agreed upon standard definition.
    There are lots of ways to average data, there are lots of ways to gather the data you average. Without defining how both those steps are done, it is even more meaningless than not having any real physical existence and being only a statistic.

    That puts you in the place of asking what is it a statistic of?

    Is it the arithmetic mean of the high temperatures of the day taken in properly built and located Stevenson screens where the temperature is a 7 minute running average of the sensor values? (the approximate description of what a mercury column thermometer would provide).

    Or is it the median of the instantaneous high and low temperatures of the day?

    Both could be referred to as the average temperature, but they would yield very different values.
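
    How far apart those two “average temperatures” can land for the same day, sketched with an invented minute-by-minute trace:

    ```
    import math
    from statistics import mean

    # One day of minute readings: smooth warm afternoon, plus one brief spike
    # that an instantaneous sensor would catch but a running average would not.
    temps = [10 + 8 * math.sin(math.pi * m / 1440) for m in range(1440)]
    temps[900] = 28.0

    midrange = (max(temps) + min(temps)) / 2  # median of instantaneous high and low
    full_mean = mean(temps)                   # arithmetic mean of every reading

    print(round(midrange, 2))   # 19.0
    print(round(full_mean, 2))  # ~15.1 -- both get called "the average temperature"
    ```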

    Do you compute a mean for all electronic instruments and a mean for all mercury column thermometers and somehow “correct them” so they give equivalent values?

    Is there a certification standard you can apply to your sensors that gives some minor assurance that they are giving you the information you think they are?

    Do you correct for time of observation changes due to daylight saving time and make the observations only at specific times of the day, or log all readings and pick the high and low points in the plot for that calendar day, or from actual sunrise to sunset, or some other time reference, etc.?

    By making small changes in how that temperature data is gathered (cough toss out high altitude and high latitude stations cough), you can shift the final values to just about any number you want or intentionally introduce a bias that would be hidden to the average person looking at the data.

    It is meaningless data made more meaningless by having no standard method for compilation.

  20. cdquarles says:

    @ Larry, indeed.

    Again, the thermodynamic temperature is itself an average. It is the mean of a defined sample of matter’s internal kinetic energy and *only* its kinetic energy. Key term: defined sample of matter.

    @ jim2
    As stated previously, a liquid-in-glass thermometer measures the volume of the liquid inside it and is a proxy for temperature via thermal expansion. Thermal expansion isn’t linear; it is a polynomial function that may be complicated. (Remember, water’s properties are anomalous due to packing and hydrogen bonding, such that the density of water increases as it initially melts from the solid, up to about 4C/39F at standard pressure. That means that water does not show thermal expansion at first. Only after having heated water (pure, good luck actually having pure water) past 4C/39F will it show thermal expansion. Of course, the phase change to a gas (strictly vapor at usual temperatures and pressures) is the ultimate in thermal expansion.)

    You say you worked as a chemist, so you surely must know this. Averaging averages will not necessarily have meaning. Given your example of a ‘snowball’ Earth, we wouldn’t need the contrived GMST to tell us something; the ice tells us what we need to know. Interestingly enough, if the tropics remained ice free (as I’ve seen reports that during the last glacial maximum tropical rain forests were not wiped out, just limited to a smaller range than now), a calculated GMST may not be as low as you might think.

    Besides, what is the GMST supposed to be? An average of averages of the skin? Of the air at 2m? Of both? Of the whole Earth, including its core?

  21. E.M.Smith says:

    @Larry:

    Well, yeah, that too…

    ;-)

  22. E.M.Smith says:

    @Kneel63:

    Another nice example of an intrinsic / intensive property. Cooked-ness. A fishstick has a given known amount of cooked-ness. The average is quite meaningless. It might mean done perfectly. It might mean burnt outside, frozen inside. It might mean I burnt 1/2 of them and stopped… It is NOT the degree of cooked-ness of the fishsticks on my plate (other than by accident…)

    It MAY be useful information in some other mode. If, for example, my average cooked-ness is 60% frozen, it tells me there are some cooking problems, but not how cooked they are (it might just mean a slow night and people ordered fries instead…)

    It is very important to remember to keep things sorted into Just What They Are and no more. A statistic about average cooked-ness is not cooked-ness. But it can still be useful for some other purpose. Just don’t ever confuse it with actual cooked-ness. Nor confuse an average of temperature readings with an actual temperature.

    Statistics about data are your friends, just don’t call them actual data…

  23. jim2 says:

    OK, now I see you guys are considering difficulties of measurement presented by the real world. Up until now, in my mind, we were considering the binary case: physically meaningful or not. I see I am making progress :)

    While two thermometers in the oven won’t necessarily provide a physically more representative temperature, given real-world consideration of placement, if you put a thermocouple centered in every cubic inch of the oven and average the results, then you would have a much better idea of the temperature than with just one. (Trivially, one thermometer is an average of one thermometer.)

    Even for Snowball Earth vs Hot House, an average of 100 thermometers distributed evenly around the globe would, or would have, presented physically meaningful information.

    That was my simple point, and judging from the most recent comments, I believe I have made it.

    No real-life measurements are simple, and I have never said they were, so I’m not, for the sake of this conversation, interested in the thermal expansion of glass, irregularities in the capillary, etc.

    This has been fun and I appreciate arguments made from what we know or believe we know rather than name calling, grandstanding, etc.

    One parting thought. Courtesy of Einstein, we know Newton’s Laws are approximate. Nevertheless, we have used them to guide artillery shells successfully for some time now. I will leave it to the soldiers on the receiving end of this approximation as to the meaningfulness of physicality.

  24. E.M.Smith says:

    @Jim2:

    To distill my point:

    The average is a statistic. It is ONLY and always just a statistic. It can never be an actual temperature. To try to represent it as a temperature is fundamentally wrong.

    Yet statistics can be useful, if very tricky and non-physical things.

    So an oven with 1000 thermometers and a std. dev. of 0.001 F among them will have an average that does inform about the likely temperature experienced by the food. The statistic tends to bound the range of the actual temperature possibilities. (But it is still not a temperature…)

    Yet you can not, from an average of 2 or even 20 or even 20,000 thermometers, say what the actual temperatures were, nor even if the mystery object was mostly hot or cold. You lack critical information to run that direction. What is the object made of? (mass, specific heat, etc.) What is the temperature gradient(s) in the object and space? What is the distribution of the thermometers?

    The earth is not in temperature equilibrium. It is not a uniform substance. Heat capacities and specific heats and heats of vaporization and fusion and quantities of substance involved in phase changes vary over time and space. It is simply a fraud to claim that the average of even 6000 thermometers will be in any way representative of the “temperature of the Earth”. It is only a statistic about the data distribution of the data collected, nothing more.

  25. M Simon says:

    pouncer says:
    9 September 2017 at 2:56 pm

    I go back to the IMSAI days (S-100) when you had to roll your own. Have Solder. Will Computer. I designed the I/O board that went into the world’s first BBS.

    =====

    Did the budget deal with the Republicans founder on the Rohrabacher Amendment? I think so.

    http://classicalvalues.com/2017/09/more-thoughts-on-the-debt-deal/

  26. E.M.Smith says:

    @M.Simon:

    In about 1975? a friend bought one of the first MITS Altair 8800 kits. We assembled it in the dorm lounge.
    http://www.oldcomputers.net/altair-8800.html

    I wrote the first program we ran on it, toggled in via the front panel. Hand assembled, it started at mem 0 and copied itself to top mem, then halted… lots of blinky lights in the process… I think that was the start of my preference for blinky lights on equipment :-)

    Per Repubs and deals:

    They just forgot Trump has things to get done and does not give a crap about political games. Democrats stepped up to cut a deal, and he closed with them. Now they know. Present an acceptable “get ‘er done!” deal, be the hero, we can work with that. Period.

    Dad sold real estate. I’ve managed residential rentals and commercial office space. It is a cutthroat business and you must be fast, sharp, and flexible to make it. Trump thrived in it. Washington is slowly learning what that means….

    Did it have to do with M.J. enforcement? I don’t know. But whatever was holding up the Republicrims from getting good done, it just got a slap in the face. If they are slow learners, the kick to the groin is available… “Get ‘er done for We The People or get the hell out.”…

    Trump knows this, and knows how to smile while kicking…

  27. Larry Ledwick says:

    jim2 says:
    10 September 2017 at 1:05 pm

    OK, now I see you guys are considering difficulties of measurement presented by the real world. Up until now, in my mind, we were considering the binary case: physically meaningful or not. I see I am making progress :)

    While two thermometers in the oven won’t necessarily provide a physically more representative temperature, given real-world consideration of placement, if you put a thermocouple centered in every cubic inch of the oven and average the results, then you would have a much better idea of the temperature than with just one. (Trivially, one thermometer is an average of one thermometer.)

    Seems to me jim that you are including an unspoken assumption in your above statements.
    That assumption is that there is a meaningful single temperature that is “the temperature” of the oven.

    What people generally mean (even if they don’t state it) when they state something like “the temperature of the oven” is actually the median temperature of some confidence interval of all the temperatures in the oven (using your 1 sensor per cubic inch example).

    A cook wants to know the typical temperature the food product will experience in the oven, and when they say the oven is 375 deg F, what they actually mean is that, within acceptable error limits of (for example) +/- 20 degrees (or some other acceptable error), the median of all the cubic inches the food will occupy in the oven is at 375 deg, or close enough to be within those error bounds. For some foods those error bounds are narrower than others to get a good outcome, but we state a single number for simplicity, and the experienced cook learns what the temperature bias is of their particular oven. For example, to bake gingerbread in my oven I either have to set the temperature 25 deg hotter than the recipe or increase the time it bakes by about 10 minutes.

    There really is no “temperature of the oven” only a typical temperature between acceptable error bounds.

    Saying that there is “a temperature” of the oven is like saying there is one number that accurately represents the altitude of Colorado. The south east corner of Colorado is around 3000 ft altitude, along the front range near Denver it varies from the mid 4000 ft to low 6000 ft bounds. Just a few miles west it is 9000 ft and 30 some miles west it is between 10000 and low 14,000 ft elevation.

    There really is no single number that properly describes the elevation of Colorado and there is not a single number that represents the temperature of the oven. In both cases we mentally insert a reasonable error bound in our thought process. When we say the city of Denver is a mile high, we are only approximating the altitude of any given location, and that description really only applies to the 13th, 15th, or 18th step on the state capitol building where the markers are. The variation is due to changes in survey technology and the reference datum used to measure mean altitude above sea level.

    http://www.atlasobscura.com/places/mile-high-steps-at-the-colorado-state-capitol

    In the case of the oven you also have to add time as a qualification of the stated temperature because in the oven the temperature is constantly varying around the thermostat set point as the heating element or burner is turned on and off to maintain that acceptable error limit on the typical temperature of the volume of the oven.

    To state an explicit discrete temperature in the oven you would have to define a specific three dimensional grid coordinate inside the oven and a volume small enough to be assumed homogeneous in temperature and the time the measurement was taken. Only with those qualifications can you actually have “data” about the temperature in the oven.

    Without those qualifications you are only stating a typical temperature expected within the volume of the oven (which implies you are in the back ground, arbitrarily assigning some acceptable bound to what you consider allowable temperature error for what you are cooking).
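
    The point in a few lines of Python: simulate a grid of oven readings and ask what single number describes them (all values fabricated):

    ```
    import random
    from statistics import median

    random.seed(42)
    SETPOINT = 375.0

    # One reading per "cubic inch": top layers run hot, bottom cool, noise everywhere.
    grid = []
    for layer in range(10):             # 10 vertical layers, 100 cells each
        layer_bias = (layer - 5) * 4.0
        grid += [SETPOINT + layer_bias + random.gauss(0, 5) for _ in range(100)]

    typical = median(grid)
    within_bounds = sum(1 for t in grid if abs(t - SETPOINT) <= 20) / len(grid)

    print(round(typical, 1))                   # a "typical" oven temperature...
    print(round(100 * within_bounds, 1), "%")  # ...honest only with an error bound attached
    ```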

  28. philjourdan says:

    Re: Trump and dems deal. People still want to think of him as a “politician” or even a “Republican”. Trump truly is a RINO (I do not use the term for the clowns in DC who renege on campaign promises). He will deal with anyone who will further his own agenda.

    And the republicans have not been doing a very good job of it.

    p.s. My reference to Trump as a RINO is not a slam. It is merely observation.

  29. Larry Ledwick says:

    Right, a lot of people are seeing that deal with the Dems as a sell-out of the Republican Establishment – no, it is a “slap upside the head” telling them that if they want to play in this game they have to be on his team. If they want to keep blocking his agenda he will simply go around them and leave them standing there with egg on their faces.

    Pretty much a “lead, follow, or get out of the way” message.

  30. E.M.Smith says:

    @Larry:

    Exactly so. Both on the temperature point and on Trump (slap!) ;-)

  31. Larry Ledwick says:

    I hate it when my wind turbines catch fire and start brush fires that rapidly spread to 700 acre fires.

    http://laramielive.com/windmill-fire-turns-into-700-acre-blaze-in-southwest-wyoming-video/

  32. Another Ian says:

    E.M.

    I see that hurricane Irma is sucking water out of Tampa harbour and it will come back as a storm surge as the eye passes over.

    This leads to a question – was Moses saved by a hurricane? And if so what was the name of that hurricane?

  33. David A says:

    Well Ian, there must have been two hurricanes, to part the water you see. (-;

  34. With reference to temperature, it’s actually a little more complex. Temperature is not actually an intrinsic property either once we get down to atomic/molecular level. What we have instead is the kinetic energy of the atoms/molecules, and the temperature we measure is the average kinetic energy of those particles over a non-zero timespan. At any particular temperature, we have a probability distribution of the kinetic energy of the particles involved. For any individual particle, the actual energy may theoretically be anything from zero to infinite, and we can calculate the probabilities for any range we want to know. This may look like nit-picking, but is actually very important when considering how we could possibly change that kinetic energy back into a format we can use to do work (that is, it’s all in the same direction rather than random directions). I wrote an article relevant to this at http://revolution-green.com/heat-move-hotter-colder/ which explores some of the consequences of applying standard theories at the particle level. The interesting result (to me at least) is that the temperature at any point in a volume of material has absolutely no effect on the direction of the results of any collision of those particles. This is counter to the intuitive understanding (and daily experience) that heat only flows from hotter to colder. The diffusion of energy is a random walk that is not affected by the temperature distribution.

    The temperature at any point in the oven is already an average over time of the kinetic energy of the air molecules in that oven that impinge on the sensor. The average temperature of the oven is thus an average of an average. It may have some relationship to the probability of your soufflé rising properly, though it seems in that case the distribution of temperature has a large effect. Opening the door to check on it can have disastrous effects….
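
    The distribution hiding under that single “temperature”, sketched with a Maxwell-Boltzmann sample (nitrogen at room conditions; constants rounded):

    ```
    import random, math

    random.seed(0)
    K_B = 1.381e-23  # Boltzmann constant, J/K
    M_N2 = 4.65e-26  # mass of one N2 molecule, kg
    T = 293.0        # room temperature, K
    sigma = math.sqrt(K_B * T / M_N2)  # std dev of each velocity component

    # Sample molecules: each velocity component is Gaussian at temperature T.
    kes = []
    for _ in range(100000):
        vx, vy, vz = (random.gauss(0, sigma) for _ in range(3))
        kes.append(0.5 * M_N2 * (vx * vx + vy * vy + vz * vz))

    mean_ke = sum(kes) / len(kes)
    print(round(mean_ke / (1.5 * K_B)))  # ~293: the "temperature" IS the mean KE
    print(round(max(kes) / mean_ke, 1))  # individual molecules run many times the mean
    ```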

  35. E.M.Smith says:

    @Another Ian:

    The parting of the Reed Sea (it most likely was not the Red Sea… but a marshy area called the Reed Sea at one end) was most likely caused by sustained high winds driving the water out. A known effect and there was some record (I don’t remember what / where) from the time saying the wind had blown for a long time prior to the crossing.

    When the wind stops, the water slops back in…

    @Simon:

    I love harvesting nits ;-)

    Yes, at the core of it, “temperature does not exist”. It is a hypothetical number assigned to the mass motion of atoms (and molecules) to attempt to capture the general state of that motion. How energetic the “system” is. Thus all the “weasel words” stuck into thermo discussions like “homogeneous system” and “at equilibrium”. Things that simply do not exist in things like climate and weather and global scale systems and processes.

    I’d kind of dodged that point as it wasn’t germane to the “don’t average temperatures” main point and, frankly, most folks glaze over if you ‘go there’… But the reality is that you can’t do a thermodynamic analysis on the globe to find if it is gaining or losing heat. At best you can observe bulk properties and infer something is happening. (i.e. you can’t use thermometers to tell you it is getting damn cold, but you can see the ice ball Earth forming as the ice spreads). We simply do not have enough uniformity of composition and never are near any equilibrium, so can’t really use normal calorimetry methods or theory. Yet folks try…

    That is also why there is no real “thermometer” that directly measures temperature. It simply does not exist to be measured. What we can do, and what we really do, is arrange for some other material to get close to equilibrium with the target and then to measure some bulk property of that stuff. Expanded mercury, resistance of platinum wires, etc. Most recently we’ve gone to IR sensors that pick up the general emissions of infrared from relatively hot things and from that impute a general temperature (though we assume the surface is representative of the bulk and that interfering gasses don’t do too much to the IR…)

    I find it interesting that folks call it an “average kinetic energy of the atoms and molecules” when in fact we do not measure a bunch of them and make a statistical average. We look for properties that themselves vary with the bulk KE of the substance and let nature sort of make an average-by-effects… that isn’t really a mathematical average. Yet we skip that detail. Left unexamined is how those effects may diverge from an actual average.

    Temperature is far more tricky a concept than just about everyone thinks. An average of it is even more tricky. For something we interact with every day, temperature is very far from what it seems.

  36. EM – because of the way we measure things, it seems that a lot of people also think that when thermal equilibrium is achieved, then there’s no more activity. The movements however continue at the same rate, but on average (over time and space) as much energy passes in one direction as the other. Because the energy is still moving around, it is possible to extract usable energy from a system in thermal equilibrium, even though that is stated to be impossible. Just because we can’t measure a change in the average energy-level using the normal instruments such as thermometers does not mean that there are not changes happening below the time/spatial resolution of the device used to measure it. Using a Doppler radar on the molecules would in fact resolve their actual velocities, though. An average speed of around 500 m/s at room temperature/pressure is definitely not stopped.
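
    A quick check on that 500 m/s figure from the standard mean-speed formula (air approximated as nitrogen; constants rounded):

    ```
    import math

    K_B = 1.381e-23  # Boltzmann constant, J/K
    M_N2 = 4.65e-26  # kg per N2 molecule
    T = 293.0        # K

    # Mean speed of a Maxwell-Boltzmann gas: sqrt(8 k T / (pi m))
    v_mean = math.sqrt(8 * K_B * T / (math.pi * M_N2))
    print(round(v_mean), "m/s")  # ~470 m/s -- "around 500 m/s" checks out
    ```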

    Temperature and thermodynamics are indeed a lot trickier than is generally taught. It is worth thinking about what those averages hide. With temperature being a natural sense, though, and it being so obvious that heat will always move from hotter to colder, it’s hard to go beyond it to what’s actually happening. It took me a long time to solve that paradox, and even to realise that there was a paradox there.

    Maybe also worth noting that pressure has a similar problem. It can also only be expressed as an average over a non-zero time-span and a non-zero area. At a scale of around the mean free path, it resolves into individual collisions and momentum-transfers, with energy-exchanges that average to zero over time in an equilibrium situation. If we can get a structure of the right dimensions and properties, then pressure can also be converted to usable energy. The pressure/temperature would reduce, and we’d get unidirectional electricity out of the device. Changing the direction of kinetic energy only requires a momentum transfer (any work done in the process is effectively equal and opposite, and therefore cancels out) and so can logically be done without any external input of work/energy.
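
    The same averaging for pressure, sketched: summing the per-molecule momentum transfers on one wall recovers the ideal-gas value (box dimensions and sample count invented; this illustrates only the averaging, not any energy harvesting):

    ```
    import random, math

    random.seed(3)
    K_B, M, T = 1.381e-23, 4.65e-26, 293.0  # J/K, kg (N2), K
    L, A = 0.1, 0.01      # box length toward the wall (m), wall area (m^2)
    N = 100000            # molecules sampled
    sigma = math.sqrt(K_B * T / M)

    # Each molecule hits the wall |vx| / (2L) times per second and hands it
    # 2 m |vx| of momentum per hit, so its average force is m * vx^2 / L.
    force = 0.0
    for _ in range(N):
        vx = random.gauss(0, sigma)
        force += M * vx * vx / L

    scale = (101325 / (K_B * T)) * (L * A) / N  # scale N samples up to a 1 atm box
    print(round(force * scale / A))             # ~101325 Pa: pressure is that average
    ```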

    Whereas the tendency to disorder by random processes is well-known, the tendency of force fields (magnetism, electric, gravity and nuclear forces) to produce order is not generally noticed. Generally our designs don’t use large-enough fields to overcome the tendency to disorder, but in the case of solar cells the inbuilt electrical field is large enough to produce order from disorder. In fact that is why they work.

    It’s fun harvesting nits!

  37. E.M.Smith says:

    @Simon:

    Your discussion of pressure got me thinking… We’re now seeing nano-motors built at the molecular level. It ought to be possible to make a nano-pump arrangement of atoms where the impinging of an atom at speed on one end causes it to rotate and put mechanical energy “somewhere else”. I.e. harvest some of that velocity. Just need to think of how to shape that end, and how to remove the rotational energy to do something useful… Hmmm…..

  38. p.g.sharrow says:

    Gentlemen, you are getting very close to understanding the secrets of mass/inertia and gravity. Harvesting nits indeed!…pg

  39. E.M.Smith says:

    @P.G.:

    A nit is just a tiny egg. A tiny egg of an idea can grow to large proportions… many of them can give you a forest of ideas…

    IMHO, the way to harvest energy from the universe is to realize that at the QM level, it is very different from at our scale. Use those properties. Then you get things like the Vortex Cold Gun https://www.amazon.com/Vortex-Cold-Cooling-Flexible-130mm/dp/B06WGR7CPH

    making both hot and cold air by sorting atoms according to their speed…

    (Which makes me wonder what happens at the Polar Vortex….
    http://www.foxnews.com/weather/2014/01/04/polar-vortex-to-blast-frigid-air-over-much-us/
    )

  40. philjourdan says:

    @Larry

    I hate it when my wind turbines catch fire and start brush fires that rapidly spread to 700 acre fires.

    Mine does it all the time, so I moved it offshore. :-)

  41. cdquarles says:

    Speaking of cold and ultracentrifugation ;p. Today is a remarkable weather day where I am. It is downright cold. It hasn’t cracked 60F today and likely won’t now. The remnants of Irma are going to pass near me. I’ve had fairly strong, for where I am, north to northeast wind fetch for about 36 hours. The dew points have been in the upper 40s and have slowly risen to the mid 50s. As the remnants of Irma approach and the barometric pressure keeps falling, this is downright nasty weather. Notably, no spin up tornadoes are expected. Unlike Harvey, whose remnants did spin up some tornadoes west of me. Then again, when Harvey’s remnants came through, it was in the 70s and 80s with a southwesterly fetch.

  42. EM – at that molecular scale, 2LoT does not necessarily apply, and instead you need to use Newton’s laws. If you can bias each individual energy transaction one way, then the net result of a lot of them will not average out to zero. I suspect the easiest way to get that bias is to use an electric field or magnetic field, but those can be built-in to the structure. Piezos after all work that way (electric field built-in) and convert between electrical energy and mechanical energy in a bidirectional fashion. If you can take the electricity away immediately it is produced, then the bidirectional symmetry is broken and you get electrical energy out. It’s a good idea to look for symmetries we can easily break – most energy transactions are naturally bidirectional and going in one direction increases the chances of it simply doing the reverse, so if you try to use a lot of them at once (or try to accumulate directly) and hope to get a bigger signal out you’ll be disappointed. A diode (if good enough) provides the broken symmetry we need, and of course there are other ways too, but I tend to think in terms of electronics. We can also build electronics in the right scale needed, too. Electricity is such a useful form of energy.

    QM scale is indeed very different, and allows us to do things that are non-intuitive and also violate classical-scale rules. It looks like I’ve solved the deposition problems (as of today) so fairly soon I should have some physical devices that demonstrate that we can turn environmental heat directly into electricity without needing a cold sink to reject waste heat to.

    pg – I’m certainly not understanding mass, gravity, and inertia yet, but Mike McCulloch keeps on finding his predictions proving out. I found the Higgs difficult to swallow, but Mike’s ideas may just be crazy enough to be largely true.

  43. jim2 says:

    “Seems to me jim that you are including an unspoken assumption in your above statements.
    That assumption is that there is a meaningful single temperature that is “the temperature” of the oven.”

    Wrong. Not any assumption of mine, but you are making one.

    “So an oven with 1000 thermometers and a std. dev. of 0.001 F among them will have an average that does inform about the likely temperature experienced by the food. ”

    True, but that isn’t something I claimed.

    I don’t see anything here that proves my simple statement wrong. The fact that we can’t measure instantaneously, or measure every point in the oven is beside the point. An average of plain ole thermometers stuck up in the air does convey physical information – imperfect to be sure, but the information is there. Like it or not.

    We can do better, of course …

    https://www.researchgate.net/publication/302919205_Nanosecond-resolved_temperature_measurements_using_magnetic_nanoparticles

  44. jim2 says:

    Also, know that I am NOT defending the various global temperature constructions. I can see that one coming a mile away :)

  45. p.g.sharrow says:

    The only IMPORTANT thing about the temperature in MY oven is that it doesn’t burn the cookies!…:-p…

  46. E.M.Smith says:

    @Jim2:

    You seem rather defensive about all this. Statements of fact need not be attributed to you to be germane and placed into an exposition. Use of words like “seem” are meant to convey that “it looks like to the reader” not that “it looked like to you when you wrote it”. Try to lighten up a little…

    “Seems to me jim that you are including an unspoken assumption”

    Seems – i.e. isn’t stated but is sub-rosa

    to me – i.e. isn’t your statement, is my interpretation

    unspoken – you did not say it

    assumption – seems to be included in the argument by necessity to reach your conclusions

    “meaningful single temperature”

    meaningful – some number that has meaning and works for the purposes stated

    single temperature – your average is the imputed “single temperature field” experienced by the lump of food or else there is no need nor use at all for said average. Otherwise we’re back at there being a few dozen temperatures and a useless average, which was my position…

    “True, but that isn’t something I claimed.” – never said it was. Showing a case where an average is useful to illuminate MY position. With small standard deviation it can be used to BOUND a temperature range probable. Beyond that, not much use. Please accept when I point out a place your position has some merit… that the average DOES have some (limited) uses.

    Where your statement is wrong is in saying the average is useful as a TEMPERATURE. It is not, and can not be, a TEMPERATURE. An average is still useful, but as a STATISTIC about a collection of data. That does not mean it is useless to BOUND a temperature probability. It does mean it is WRONG to call it a temperature or use it AS a temperature.

    The “missing bits” are the necessity to move from the statistic to the bounds with central tendency and range. THAT is where the error shows up. So I used the example of 10000 (or whatever) thermometers AND an extremely small standard deviation to show where you could put narrow bounds on a temperature and get to a range that was narrow enough to use a central point of the range for practical purposes. Without that probable range and standard deviation, you can’t do that. That statistical step is necessary to avoid the errors (that can be gross). It is the skipping of that mandatory application of statistics to find the range of probable temperature and standard deviation that makes using a simple average AS a temperature de novo a broken process.

    It really is a trivially obvious point, if you would just choose to observe it. An average is always a statistic. An average of an intensive property is only a statistic. A temperature is only one reading of one sample with one instrument under one condition in one system (material). IUPAC, formal statistics, physics, they all say that. You want to assert your way to the opposite.
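
    To make the oven example concrete – a minimal sketch, with made-up numbers, of using the average of many readings as a statistic that BOUNDS a probable temperature rather than being one:

    # One system (an oven) at an unknown true temperature, read by many
    # thermometers with independent random error. The mean plus its spread
    # bounds a probable temperature; the mean alone is just a statistic.
    import random, statistics

    true_temp = 350.2   # hypothetical oven temperature, F
    readings = [true_temp + random.gauss(0, 0.5) for _ in range(1000)]

    mean = statistics.mean(readings)
    sdev = statistics.stdev(readings)
    print(f"mean {mean:.2f} F, std dev {sdev:.2f} F")
    print(f"probable range: {mean - 2*sdev:.2f} .. {mean + 2*sdev:.2f} F")

    The bounding works only because it is one system with (assumed) independent instrument error; across multiple different systems, instrument error and real inter-system variation can no longer be separated, and the bound evaporates.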

  47. jim2 says:

    OK, so responses to the conversation I started about this contain some elements that I took to be aimed in my direction. If I was wrong, I apologize to whomever didn’t mean it that way.

    You are technically correct that an average is a statistic. Excuse me for using conversational idioms. But nevertheless, an average of temperatures can be physically meaningful. It’s not “a temperature,” but it is nevertheless useful in some circumstances.

    Chemical engineers don’t have the luxury of letting everything (there’s that conversational thing again) come to equilibrium, chemical, thermal or otherwise, but they nevertheless do use thermometers of various sorts and types in non-equilibrium environments. More than one thermometer in a given environment, even.

    I agree with your point that a single, simple average does not convey the temperature of a certain point in a temperature field, nor does it convey the distribution of temperature. We agree!

    But my proposition was simple, not complicated.

  48. Another Ian says:


    antelope | September 12, 2017 2:32 AM | Reply

    More settled science overthrown: “For a century, researchers thought the specimen was a man because it was buried in an ‘ultimate warrior Viking grave'”

    http://nationalpost.com/news/world/plot-twist-viking-warrior-remains-assumed-to-be-a-mans-actually-belonged-to-a-female-military-leader

    http://www.smalldeadanimals.com/2017/09/reader-tips-3956.html#comment-1124067

  49. Jim2 – nice article about being able to measure temperatures down to a 14ns resolution. At that point, though, the actual concept of temperature is starting to break down. In order to resolve individual energies of molecules in standard air, you need only to go another couple of orders of magnitude better than that, and of course then the actual readings would be random and you’d need to accumulate measurements and average them over enough time to calculate the effective temperature.

    Down at the atomic level and individual interactions, what we’re really looking at is the probability that a collision will be within a certain range of combined energies, and often reactions will require a certain threshold energy in order to work. This also applies to cooking bread in the oven, and the temperature (as measured using a normal thermometer) is thus useful if you want bread and not toast.

    Temperature as measured by liquid-in-glass thermometers tends to have a long-enough integration time that it’s useful for human-related uses. Do I need a jumper on to go out? How well will the crops grow? Fast-reacting thermometers have their uses in checking for things that also react quickly, but can give misleading answers if used in the wrong places. Using them on an airport runway will give a useful answer to the question “can I take off” (providing the output is correctly smoothed) but a quick blast of that jet exhaust can give an anomalously-high reading because they react too quickly. With the tendency now to record temperatures each second (because they can), the placement of the thermometer and the natural fluctuations around the rolling average over time (that was automatically done by LIG thermometers) can give misleading results. Hottest Evah!

    Any instantaneous reading of the thermometer is insufficient without knowing what the response time of the thermometer is. Any temperature reading, by its nature, is already an average. Though the average temperature has a use in human terms, it’s not sufficient in itself when talking about climate, and that is really the rub. AGW supporters quote the average temperature to a precision of 0.01°C, which is far more precise than the accuracy (or precision) of the devices used to measure it. Using a fast-reacting thermometer will automatically produce higher measurements of the high temperatures, and similarly lower measurements of the low temperatures, in a situation where the rate of change of temperature is greater than the slower thermometer can react to (a minimal sketch of this lag follows at the end of this comment). It’s a can of worms, really.

    In industry, we use temperatures (as measured) and average temperatures to control a process. Average temperatures definitely have a use, as does the temperature range in an oven/kiln within the volume of interest. I get your point, therefore. There’s only really a problem if you forget what temperature actually means or use the wrong methods of measurement, and when it comes to climate science that does seem to be a problem. Local temperatures are just that – local. A few metres away it could be different, depending on the local conditions there.

    Providing you apply the concept of average temperatures where it’s useful (as in getting bread and not toast) I don’t see a problem. If the average temperature of the bath is correct, but it’s hot at one end and cool at the other, a bit of stirring will fix the problem. The mathematical tricks used in climate science, though, seem to lose track of what temperature actually is and how locally the measured temperature actually applies.
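
    A minimal sketch of the response-time point above, modelling a liquid-in-glass thermometer as a simple first-order lag (the time constant and the gust are made-up numbers):

    # A brief 35 C gust (think jet exhaust) passes a sensor reporting once a
    # second. The fast sensor catches the spike; the slow LIG thermometer,
    # modelled as a first-order lag, barely moves.
    tau_lig = 60.0   # assumed LIG time constant, seconds
    dt = 1.0         # one reading per second

    air = [20.0] * 300
    for t in range(100, 110):   # a 10-second spike to 35 C
        air[t] = 35.0

    lig = air[0]
    lig_max, fast_max = lig, max(air)
    for temp in air:
        lig += (temp - lig) * dt / tau_lig   # first-order lag update
        lig_max = max(lig_max, lig)

    print(f"fast sensor max: {fast_max:.1f} C, LIG max: {lig_max:.1f} C")

    The fast sensor dutifully reports 35.0; the LIG peaks near 22. Neither is “wrong”, but they are not measuring the same thing, and only the fast one makes the record books.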

  50. E.M.Smith says:

    @Simon:

    “Any temperature reading, by its nature, is already an average.”

    But NOT an average of temperatures. As noted above, the things we call thermometers actually measure and indicate some other property, such as volume or resistance. Even the description of temperature as a pseudo-average of the kinetic energy of atoms and molecules concedes this; note in passing that it is KE being averaged, not “temperature”.

    Having a device that averages absorbed energy over time and displays it as a change of volume is NOT the same as averaging a bunch of numbers correlated to that average, and especially not so when those numbers come from different things (in thermo terms, different “systems” that are inhomogeneous with each other).

    You have correctly “got it” that sometimes this will work and sometimes it will bite you on the ass. The next step is to realize that treating a group of measured temperatures of different “systems” as a set of averagable objects is when you get bit on the ass. Calling that result a “temperature” is hideously broken thinking.

    So: One “system” (like an oven or flue gas) with several measuring devices. You can average the results to reduce random error (one tube with a slightly thinner capillary, or one thermometer placed in an anomalous spot of KE). You can not reduce systematic error that way (a short person always reads the meniscus differently from the tall guy on night shift, for example). In effect, being a single system, you can presume some degree of expected uniformity and that gross deviance between readings is some sort of error.

    Multiple “systems”, like divergent locations, different ovens, ice vs water in the lake. You can not average the results to do anything useful. There is no relationship between the systems and thus no way to distinguish the random error in the instruments from legitimate variation between the systems. There is no way to determine what is random error, what is systematic error, and what is expected variation. Furthermore, since the multiple systems have divergent properties (mass, phase state, chemical composition, specific heats, etc.) any averaging of the reported temperatures is confounding more than improving understanding of the fundamental KE that we thought we were measuring.

    Simply taking two temperatures, even in one system, and averaging them, is a fundamental error. The only way it “works” is by an implicit assumption about the rest of the formal problem space (those extrinsic properties like specific heats and mass). That is where your cooking oven diverges from a Calorimeter. In your cooking oven, you don’t “notice” that an ounce of water evaporated from your roast and changed the heat equation. You look at it and let it run another 5 minutes. In a Calorimeter it glares at you. That difference is just exactly the place where averaging the intensive property of temperature was WRONG and doing the heat calculations in a Calorimeter is RIGHT. That you can “cheat it” sometimes and get a “good enough” result does not make it proper, it makes it a usable cheat.

    (Please note: I LOVE usable cheats. But in order to keep a tidy mind it is important to know they are cheats, and know when they fail, and know what the proper way to do it really might be. So Smiths, for generations, used all sorts of “rules of thumb” to temper metal and make different hardnesses and finishes. It works much much better now with proper metallurgical understanding. The problem with cheats is when you try to generalize them and they fail tragically. Like using iron color temperature to temper exotic steels or worse, non-iron alloys. Or use what worked to bake bread to try to claim the history of temperatures is warming globally.)

  51. Another Ian says:

    Simon D

    More temperatures

    “The Australian Bureau of Meteorology may not be meeting WMO, UK, US standards

    Since the Australian BOM allows for one second “records”, it’s not clear it is even meeting guidelines recommended for amateurs.”

    http://joannenova.com.au/2017/09/australian-bureau-of-met-uses-1-second-noise-not-like-wmo-uk-and-us-standards/

  52. Larry Ledwick says:

    Interesting reading about a classified air crash recently in Nevada – sort of an “I wonder” article.

    http://www.popularmechanics.com/military/aviation/news/a28146/mystery-aircraft-crashes-in-nevada-desert/

  53. jim2 says:

    Simon Derricutt says:
    12 September 2017 at 1:14 pm

    That’s a reasonable view, IMO. No thermometers are correct, but some are useful ;)

  54. Larry Ledwick says:

    How did climate “science” screw up, let me count the ways:

    CO2 is the cause of global warming (there is no correlation between CO2 concentrations and temperatures in the historical record)

    Started with a conclusion and looked only for support of that conclusion (scientific method requires the problem be stated in a manner which can be falsified if the theory is wrong)

    Just give us a bigger computer and we will tell you the climate in 100 years (closely coupled, non-linear chaotic systems cannot be solved no matter how big and fast the computer is; we will never have sufficiently perfect data about initial conditions to predict future climate). This was admitted by the IPCC in their executive summary in 2001:

    http://www.ipcc.ch/ipccreports/tar/vol4/index.php?idp=106
    “The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future exact climate states is not possible. Rather the focus must be upon the prediction of the probability distribution of the system’s future possible states by the generation of ensembles of model solutions.”

    The computer models generate data that shows what the climate will likely be 100 years from now.
    (Computer model output is not data, it is only the result of assumptions fed into the computer code by the model’s creator. It is a “guess”, not a prediction, and certainly not data.)

    Due to the complex mathematics involved in resolving equations of multiple coupled non-linear chaotic systems, subtle systemic issues in the computer code will introduce errors into the calculation that explode with time as the projection is repeatedly recalculated with each successive iteration into the future.
    Issues like machine zero (the smallest number a specific computer and operating system can resolve as not zero), rounding errors, and the maximum size and precision of numbers the system can work with, make it impossible to have sufficient accuracy and precision of calculation to push the projection far into the future. Those systemic errors are built into the operating system, the computer code and the compilers which produce the executable code, and cannot be eliminated. As a result, even if we had perfect initial state conditions it is impossible for the computer models to give correct results far into the future. The same exact source code compiled for different physical computers and operating systems will give projections which drift apart with time (a minimal demonstration of this drift follows after this list).

    The source data is not anywhere near perfect, it is horribly corrupted and manipulated, and due to poor scientific procedure and bad statistical practices is not usable for the purpose it is intended to fill.

    Current practice routinely changes historical climate data with undocumented methods based on highly questionable assumptions. Implicit in this process is an assumption that if the model and the data do not match expectations, the data are wrong, not the model which produced the projection.
    This standing assumption is compounded by making up missing data (infilling), poor data hygiene (bad coordinates for stations, doing calculations with data which includes values originally intended to show no data was available, lack of audit of station compliance with standards), and improper use of thermometers (failure to account for differing time constants for physical capillary tube thermometers vs modern fast acting electronic thermometers, changes in site conditions over time, etc.).

    Just a starting point for a bullet point list of the problems with “climate science” to aid in refuting the assertion that those who doubt the theory of global warming do not “believe in science”.
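
    As a minimal demonstration of the precision point – the logistic map standing in for “a chaotic system”, not for any actual climate model – iterate the same equation from the same seed in 32-bit and 64-bit floating point:

    # Tiny rounding differences between word sizes grow exponentially in a
    # chaotic iteration until the two "projections" are unrelated.
    import numpy as np

    x32, x64 = np.float32(0.4), np.float64(0.4)
    r32, r64 = np.float32(3.9), np.float64(3.9)

    for i in range(1, 61):
        x32 = r32 * x32 * (np.float32(1.0) - x32)
        x64 = r64 * x64 * (np.float64(1.0) - x64)
        if i % 10 == 0:
            print(f"step {i:2d}: float32={x32:.6f} float64={x64:.6f}")

    The two columns track at first and bear no resemblance to each other by around step 50 – same equation, same starting value, different hardware representation.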

    Discuss.

  55. Larry Ledwick says:

    Regarding this item above, “doing calculations with data which includes values originally intended to show no data was available”: I recall a discussion on wattsupwiththat about old weather records which used a -99 value to indicate no data. That “bad data” is still in some of those records and has not been accounted for. I believe this discussion was around the time we were going round and round about BEST, but so far I cannot find the original discussion. If anyone has a link I would very much appreciate it.

  56. jim2 says:

    Yep, analog computers, like the atmosphere, are much better.

  57. EM – agreed that how you produce that figure of temperature is important. When the thermometer-device is in thermal equilibrium with its environment (which means that as much KE is entering it as is leaving it) some property of it (volume difference, resistance difference, voltage difference etc.) will correspond fairly closely to the total KE that the device currently contains. There are always however deviations from the ideal (or calculated) correlation and always small fluctuations, and the smaller the device and the faster it responds then the bigger those fluctuations will be. Something as mundane as measuring a temperature is quite interesting when you really nitpick.

    It’s also critical that we compare apples with apples, and so the other properties of the systems need to be taken into consideration when looking at that temperature measurement and trying to calculate what we want to know, such as what is the total energy contained and what’s going to happen as a result. This even applies in baking bread – it’s not just the temperature of the oven that affects the crustiness of the bread and how long it takes, but also the humidity of the air in the oven (why putting a tray of water in the oven can be a good idea).

    The statistics of temperature variations in a certain place don’t tell you a lot about the total energy in the atmosphere, so without extra information aren’t much use in climate calculations. On the other hand, such statistics may be useful in deciding what range of clothing you keep in the wardrobe or take with you on holiday there. Do you pack a puffer-jacket or T-shirts? Precipitation statistics are also useful in deciding what to pack….

    With the multiple systems you’re talking about, therefore, averaging the temperatures can easily lead to a wrong conclusion on the science and bite you on the bum. Sure, I also use the rules of thumb when I can get away with it, too, but you do need to know where they can be applied and where they’ll give a bad answer.

    Another Ian – yes, I had the Australia BOM records in mind when talking about measurements over the previous second. I think the UK Met Office is doing this too, now, since the current devices allow it. We could have fun visiting the weather sensors with a battery-powered hair-dryer….

    Jim2 – I aim to be reasonable :) We’re seeing unreasonable assumptions in the calculations made by the climate scientists, though, and this discussion strayed into an area that I’ve found intensely interesting over the last few years in relation to what we can do with heat. Some bits of thermodynamics are actually wrong, since with radiated energy (photons) the temperature of the source and destination make absolutely no difference to the direction of the photon. It’s only with an accumulation over time that the net energy flow is seen to go from hotter to colder, and for an individual photon that hotter-to-colder rule simply does not apply (if you posit that it does, then that breaks causality). Logically, the temperatures of the source and destination make no difference to the random walk of a molecule as it diffuses in a volume of gas, too. That simple observation has quite amazing ramifications, in that it makes an effective perpetual motion device theoretically possible (using environmental heat to produce energy). Thermometers are practically useful, even if the one in my oven is very far from accurate – providing it’s repeatable and is wrong by the same amount each time, it can still give good bread.

    Larry – I think you’ve covered the main ways in which the data and how it is used are screwed. Getting all that put right is however a massive task. Even removing the “corrections” to historical weather records would be difficult unless we can find some repository for the original data that we can be sure hasn’t been fudged.

  58. David A says:

    Larry, interesting and good summary. Add to that a fundamental, and sometimes acknowledged, lack of understanding of major physics processes within the system – cloud formation and feedback, jet stream strength and location, ozone formation and effect, solar influences and flux effects from the top of the atmosphere to the depth of the oceans, and the non-linearity of phase transitions and conduction/convection responses – and I fail to see how we can avoid admitting that our climate science is not fit for purpose.

  59. jim2 says:

    D – good discourse.
    Yes, photons don’t care which way they fly, generally speaking. But as you point out, for a hotter and a colder object, higher frequency photons will fly from the hotter and lower frequency photons from the cooler. As the temperatures of the two objects change, they will move closer together and the frequency of their photons (on average) will change.

    One bone to pick. Let’s say we have a shallow pool of water, with a heat source at one end. Since the molecules at the hot end will (on average) possess a higher kinetic energy, a particle will experience more collisions from the hot side, and I’m thinking there will be a linear motion superimposed upon its path.

    Finally, I may have a way to test the proposition that an average of temperatures is physically meaningful, but don’t have the bandwidth right now. Find the temperature profile of some gravity oven on the internet. This should include a convective feature that is much hotter than other parts of the oven. Take a 1 cm slice perpendicular to some axis of symmetry. The quest will be to calculate the heat contained in that volume. (I already know that using heat in that manner is not strictly correct, but it’s more natural, so I don’t care.) We can make the heat determination relative to 0 C as a baseline.

    First, use the heat capacity of air to determine the heat transferred to each point. Since I’m imagining a color-coded cross-section, this might be accomplished as a practical matter one pixel at a time, using a function that converts the RGB values to a temperature. Each pixel would represent a small volume of our slice. Now we have the “heat” contained in the oven.

    Now for the test of temperature averages. We would have to assign a random error to each virtual thermometer. For simplicity, I will assume air motion in the oven is smooth rather than turbulent, and that each thermometer is in equilibrium with its immediate environment. Now write a program that places a thermometer in a random cell as defined by the pixels, as explained earlier. This temperature will be used to calculate the heat of the entire volume, and the total heat plotted on a graph. Do that a thousand times. Then the program will place two thermometers in random locations and use the average to determine the total heat. Plot, do it a thousand times. Continue in like manner with 3, 4, 5 … thermometers.

    I think as the average is calculated with more thermometers, the “sampled” heat will approach the baseline value calculated initially. If it does, then I posit the average of temperature measurements is physically meaningful.
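
    That test is straightforward to sketch. A minimal version, with a made-up temperature field standing in for the colour-coded cross-section (all numbers hypothetical):

    # Estimate a slice's total heat (relative to 0 C) from n randomly placed
    # thermometers, and watch the estimate converge as n grows.
    import random

    random.seed(1)
    # Hypothetical 100-cell slice: mostly 180 C, with a 250 C convective plume.
    field = [250.0 if i < 10 else 180.0 for i in range(100)]
    cp_air = 1.005       # specific heat of air, kJ/(kg*K)
    cell_mass = 0.0001   # kg of air per cell, assumed uniform

    true_heat = sum(t * cp_air * cell_mass for t in field)

    for n in (1, 2, 5, 10, 50, 200):
        trials = []
        for _ in range(1000):
            # n thermometers in random cells, each with 0.5 C random error
            avg = sum(field[random.randrange(100)] + random.gauss(0, 0.5)
                      for _ in range(n)) / n
            trials.append(avg * cp_air * cell_mass * 100)  # scale to 100 cells
        err = sum(abs(h - true_heat) for h in trials) / len(trials)
        print(f"{n:3d} thermometers: mean abs heat error {err:.4f} kJ")

    Note the estimate only converges on the true heat because cell mass and specific heat are held uniform by construction – which is exactly the assumption under dispute.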

  60. Jim2 – note that if you speculate that a linear motion may be imposed on the particles, that implies that there will be a measurable difference in the height of the water in the bathtub, with the hot side being lower (since there’s net movement away from hot). The energy movement will have such a linear shift (the diffusion rate from hotter to colder), but the particles themselves won’t to a great degree (and it will be opposite to the energy movement, since above 4°C the water expands when hotter). As we nitpick deeper we find small differences from the “intuitive” understanding, since the hot side should actually be slightly elevated relative to the cold side since it is less dense. For individual collisions of molecules, though, the KE of each will be random as will the directions they go after collision. It’s only when you look over a sufficient period of time that a trend can be discerned, and an individual particle follows a random walk whilst the KE spreads from an initial concentration.

    When talking about heat and thermodynamics, averages tend to creep in unnoticed unless you keep aware.

    In an oven, any number of thermometers will give useful information. If there’s a certain range of temperatures in the oven, it helps in deciding where to place the soufflé for best results. Comparing that oven with, say, Death Valley may not be meaningful.

    What it boils down to is that some averages are useful, and some are misleading. Where this started really was in averages that are being misused.

  61. jim2 says:

    “What it boils down to is that some averages are useful, and some are misleading. Where this started really was in averages that are being misused.”

    Could you quote the words where averages are being misused?

  62. Jim2 – the explanation is at https://chiefio.wordpress.com/2017/09/08/wood3/#comment-86393 and before that your comment at https://chiefio.wordpress.com/2017/08/23/wood2/#comment-86355. Maybe earlier ones, but basically it hit on the Global Average Temperature (hot button). The AGW enthusiasts misuse the averages there, and miss the total energy movements from phase changes etc. To be sure, there’s some use in knowing that the snowball Earth was generally colder – we’d need to pack different clothes to go visit it (assuming time-travel of course).

    For a long time, the temperatures at 2pm and 2am were averaged to give the “average daily temperature”. It’s assumed that those two times will be hottest and coldest – but that isn’t true all that often. That really loses a lot of information, and in these days of 1-second measurements to get the absolute highest and absolute lowest temperatures at any time in the 24 hours, it becomes more misleading to give the mean of the two as an average temperature (a sketch at the end of this comment puts numbers on this). It may have been a foggy morning and the sun only broke through pretty late – you don’t know how nice a day it was to go walking.

    The problem is about the way the climate scientists present their disinformation, and thus EM explained why it was wrong. We can find situations where an average temperature is useful (normally in process control) but doing this for climate tends to be a mistake. Since GAT is so widely publicised, though, it’s useful to know why it’s a mistake.
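
    To put numbers on that foggy-morning point – a sketch with a made-up 24-hour trace:

    # A day that sits at 10 C under fog until 14:00, spikes briefly to 22 C,
    # then cools off. Compare (max+min)/2 with the hour-by-hour mean.
    temps = [10.0] * 14 + [22.0, 20.0, 18.0, 16.0, 14.0, 12.0] + [10.0] * 4

    midpoint = (max(temps) + min(temps)) / 2
    true_mean = sum(temps) / len(temps)
    print(f"(max+min)/2 = {midpoint:.1f} C, hourly mean = {true_mean:.1f} C")

    The midpoint says 16 C; the hour-by-hour mean says about 11.8 C. Same day, same data, different statistic.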

  63. jim2 says:

    I’m pretty familiar with the problems with the global temperature data sets. The only words of mine I see in your links is this:

    “The “temperature” we measure with a mercury bulb, thermocouple, or thermistor is the average of the “temperature” (energy) of the particles impinging upon it. I take issue with the notion that one can’t calculate an average of temperatures. It’s certainly mathematically possible. I contend it even has physical meaning. The global average temperature compared to the global average today would convey meaningful information about the state of the Earth at those times.”

    I stand by that. (A correction made clear I would be comparing Snowball Earth to current Earth). I feel sure you would see a huge difference and that difference carries with it physical meaning. Even the crappy thermometers used in the early 1900’s would show a huge temperature difference. And you can’t rely on natural ice as a thermometer because it may contain salt or other impurities that suppress the freeze point. It’s an imperfect world we live in. Also, I’m not a CAGW’er.

  64. E.M.Smith says:

    From Simon (bold mine):


    The statistics of temperature variations in a certain place don’t tell you a lot about the total energy in the atmosphere, so without extra information aren’t much use in climate calculations.
    On the other hand, such statistics may be useful in deciding what range of clothing you keep in the wardrobe or take with you on holiday there. Do you pack a puffer-jacket or T-shirts? Precipitation statistics are also useful in deciding what to pack….

    With the multiple systems you’re talking about, therefore, averaging the temperatures can easily lead to a wrong conclusion on the science and bite you on the bum. Sure, I also use the rules of thumb when I can get away with it, too, but you do need to know where they can be applied and where they’ll give a bad answer.

    Exactly.

    So, a necessary addition to Larry’s list is that the METHODOLOGY used by “Climate Scientists” is fundamentally flawed, as it attempts to do calorimetry on the planet using a statistic about temperatures that fundamentally can never work. Calorimetry MUST have those other values (specific heats, mass, phase changes, etc.) and yet those are just assumed to be constant or substantially irrelevant over time. Just averaging the thermometers is worse than wrong, yet that is what they do. Several times in succession. Daily Min / Max get averaged to give a daily Mean. Those get averaged to give a monthly Mean. Those get averaged to make grid / box values. Only AFTER that step does GIStemp make “anomalies” out of the Grid/Box Fiction. (They, last I looked, had 16,000 grid boxes but only about 6000 thermometers at max and only 1200 recently… so by definition most of those grid boxes are a complete fabrication.)
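
    A minimal sketch of what that cascade of averages throws away, using two hypothetical stations:

    # Two made-up stations: one with mild days, one with wild swings.
    # The min/max -> daily mean -> monthly mean cascade reports the same
    # figure for both, discarding everything calorimetry would need.
    mild = [(8.0, 12.0)] * 30    # (min, max) pairs for 30 days
    wild = [(-5.0, 25.0)] * 30

    def monthly_mean(days):
        daily = [(lo + hi) / 2 for lo, hi in days]   # daily "mean"
        return sum(daily) / len(daily)               # monthly "mean"

    print(monthly_mean(mild), monthly_mean(wild))    # both print 10.0

    Both stations report a monthly “mean” of 10.0, yet one of them freezes hard every night, with whatever phase-change accounting that implies.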

  65. jim2 says:

    I’m not comfortable with the small error bars either. The lack of measurement on so many points over such a large surface has to make the error bars larger. I can see how the jackknife or bootstrap procedures can show self-consistency, but only the limited number of measurements are used. Those procedures don’t take into account the uncertainty introduced by the majority of points not sampled.

  66. E.M.Smith says:

    @Simon:

    I’d add that the ‘random walk of molecules’ does have a hot to cold bias. Entirely due to velocity. A molecule picking up energy from a hot surface will depart with more KE than one departing a cold surface (where some even condense and stay…). IMHO, it is that differential velocity that gets more particles headed faster toward the cold plate than the hot plate. Then, with impacts and scattering, there ought to be a differential tendency for momentum to be imparted from hot to cold as well. With distance, this eventually runs out of effect, but in modest proximity ought to be significant. Just where the motion goes random is an interesting question… nanometers maybe?

    Note that photons don’t have impact / scattering they have absorb and re-emit and note that photons do not change velocity with energy, they change color. I think something can be made of that… just not sure what ;-)

    @David A:

    BINGO! Give that man a Rubber Ducky!

    @Jim2:

    “I already know that using heat in that manner is not strictly correct, but it’s more natural, so I don’t care.”

    As soon as you depart from strictly correct you are shaking hands with wrong.
    As soon as you no longer care about keeping a tidy mind, you will not have one.

    Good luck with that.

    ” If it does, then I posit the average of temperature measurements is physically meaningful.”

    Stated more correctly: “If it does, then one could posit that the average of multiple temperature measurements is a statistic useful to bound the actual physical temperature that is physically meaningful”.

    “Could you quote the words where averages are being misused?”

    Any time someone says “The average temperature is hotter”.

    “The statistic of the average of many temperatures is higher” is valid, as it does not assert the average is a temperature.

    “We can find situations where an average temperature is useful (normally in process control) ”

    Mostly because the material being measured is in one ‘system’. Same stuff, same specific heat, same volume and pressure in the pipes (or close enough that the error term isn’t significant to your process), fairly homogeneous temperature field or known / desired gradients; and when things go pear shaped (like at boiling point) the process often has special modes of detection or operation. It’s forgetting all those assumed constants that gets folks in trouble even in process control, as they think an average of temperatures is a temperature….

    “I take issue with the notion that one can’t calculate an average of temperatures. It’s certainly mathematically possible.”

    I have never said you can’t calculate it, nor to the best of my recollection has anyone else. That’s a strawman.

    “I contend it even has physical meaning.”

    That’s where you go off the rails (and what I protest) in a way that is sometimes insignificant and at others highly significant. Please be less worried about defending your prior position and take just a moment to consider the reality of the math and physics.

    An average of temperatures is not physical. In any way. Ever. It is ONLY a statistic about a group of numbers.

    Depending on context (external considerations of extensive sort – mass, physical composition, specific heat, phase change and more) that statistic can be useful to BOUND the probable actual temperature, enough to be used in decisions. (The oven with a few good thermometers – if they range from 348 F to 352 F you can easily use 350F as your ASSUMED temperature for baking). When you lack that context, any average of temperatures is at best wrong and at worst catastrophically wrong.

    It may, purely by accident of random chance, have a value close to something useful, sometimes… So in terms of the Globe, since the depth to which temperatures propagate into the soil and water changes (so mass is variable) and the phase of water changes A Lot (ice, snow, rain, vapor, steam from volcanoes) along with the chemical composition (ocean salt changes, CO2 turns into trees, distribution of water changes as the Sahara goes green, etc.) you simply can not assume a single system or even close enough to one to ignore those things. Measure in the wrong places and your average can tell you anything. Change your methods and your answer changes. That’s not science.

    So yes, in terms of an industrial process control, it’s a nit-pick to say the average of temperatures isn’t a temperature and is un-physical. It is also the truth to say that. It usually doesn’t get you into too much trouble (but can… in making explosives, for example, it doesn’t help you if the average is fine but one spot goes above auto-ignition temperature…)

    So why do I harp on that nit-pick? Because it leads to bad habits and the sloppy thought that an “average temperature” exists. It does not. An Average OF Temperature Data exists, and it is only a statistic. Not a physical thing. But say “average temperature” often enough, and eventually you start to think the GAT exists, is real, or has some physical meaning; when it does NOT exist, is NOT real, and has NO physical meaning. It is only a statistic about a sloppy set of data of poor history with Nyquist violation and more. I.e. essentially useless and fundamentally wrong.

    Again, per your snowball Earth example: I’ve already pointed out that you are running causality backwards in that reasoning. By positing a snowball, you implicitly include lots of frozen water. i.e. those extrinsic properties are being used. The correct way to test your position is to compare many sets of averages of temperatures and then show they regularly detect when it is snowball earth or not. As I pointed out, as soon as that average is only Antarctica (and maybe toss in Greenland for fun) the average will “find” a Snowball Earth when none exists. To do otherwise would require a very special array of temperature measurements (probably better than we have today…) sufficient to assure no net significant errors were in the data from distribution, water location changes, etc. etc.

    One counter example:

    When Africa / Eurasia area gets more net heat flow in, the air over the Sahara rises a lot and sucks in more moisture laden air, causing lots of rain. This eventually leads to a wet savanna populated with lots of animals and fish. And people. Your “average temperature” of the Sahara has gone down due to going up… Similarly, when it was a vast inland sea (where whales look to have evolved from a hippo like animal) that sea will be much cooler than the Sahara of today. Now just where do you place your thermometers so those changes don’t bias your local, regional, hemispheric, and global “average temperature”?

    I have no idea if in an ice age glacial the stripe around the middle near the Sahara is net hotter or colder than today, nor if it is enough hotter that those measurements will overwhelm the polar ones and show the “Global Average Temperature” is hotter, or not. I do know the poles are physically smaller so bias the ‘grid box average’ less, and that the Sahara can have a roughly 35 F range between wet and dry states.

    That’s where you get into trouble. When you try to go from averages of temperatures to physicality. It’s easy to go from physicality to average of temperatures. “What’s the average temperature of a bucket of molten lead?” I can guarantee it will be higher than a bucket of solid water… Now give me 5 thermometers reading an average of 900 F. Which ones are in lead and which in water? Well, that depends on the state of the lead and water…

  67. E.M.Smith says:

    As a side note, something other than “GAT Wars, Part 58”, I’m posting this from my PiM3 on the TV in the bedroom. Just testing out using one as a personal computer without a monitor. Works pretty good. It’s only a 720p TV and I’m seeing where a 1080p or 4K set would be valuable as a monitor even when as a TV it is below visual value. Sitting 3 feet from the TV changes that equation…

    Biggest “gripe” I’ve got is that the Logitech integrated keyboard / trackpad that works OK on the Chromebox for launching YouTubes is a bit prone to typos and has a difficult time doing precision scrolling on long postings (no thumb wheel, only a ‘grab the slider and drag it’ with the track pad…) But with nice keyboard and real mouse (with pad to run on…) it would be a quite reasonable computer. (Need a chair with a back too… sitting on the edge of the bed is getting old…)

    I can see this as a reasonable bit of kit to use in a hotel room. Pack the identity / OS chip separate from the Pi M3 / Odroid board. Folks at the inspection station can look at your board as much as they want and get nothing. At destination, reintegrate, plug into the TV, and go. Could even leave any sensitive data on an encrypted USB dongle and / or as an encrypted “blob in the cloud”; depending on destination.

    (Yes, the UK demanding passwords at customs has got me working on bypasses… )

    So this is just a Pi M3, the mini-dongle that connects it to the K400+ travel keyboard / pad, and a USB stick in it. I could even skip the USB stick if I put my home directory on the chip… Oh, and an HDMI cable to the TV and power adapter…

    I tried to get the Odroid C2 to work, and it did, minus the WiFi dongle… I suspect the 64bit WiFi software is still a bit dodgy. It would connect, then drop. Doing a well timed “ifconfig” caught it as wlan0 up with IP address 10.x.x.x and working, for a second, before it disconnected… So “some assembly required”. I may try the Odroid C1 and XU4 later, just to see what works and what doesn’t. IIRC I already used the C1 and it was fine, but need to check on “did I use the WiFi?” or not…

    So after I’ve run them all through the mix, I’ll pick one and configure an “on the road” chip for it. That way I don’t have to live on the tablet when on the road… or accept its crappy not-secure encryption method… (remember it encrypted the contents but not the file names nor hierarchy… then brain farted and would not decrypt…) I’ve already shown using encrypted blobs as file systems inside Linux, so anything I want “private” can easily be hidden on the chip in a secure way.

    Well, proof of concept done, time for a coffee break ;-)

  68. pouncer says:

    Mr Smith,

    Based on your investing experiences and posts here, you may be in a better position than most of us to explicate the analogy between climate and economics offered by Steven Mosher. Mr Mosher has gone a bit far lately, but I’ll come back to that. The analogy, though, is that the various GISS or HADCRU data sets function with regard to atmospheric temperatures in the way that the Dow Jones Industrials or Standard & Poor indexes of stock market prices inform us of the state of the economy.

    If a publicly traded company that happens to be included in the sampled index happens to have a good quarter – or if a weather station that happens to represent a particularly broad set of grid cells happens to have had a month of warm weather – the average of the numbers being calculated in the index goes up. Bad financials and cold weather for those spots drive the index number down. Often the index and the system as a whole are in sync. Not quite as often, they are not.

    It is a mistake to put too much money into any one decision on the basis of what’s happening with the index rather than the local and specific measure.

  69. Larry Ledwick says:

    Any time someone says “The average temperature is hotter”.

    “The statistic of the average of many temperatures is higher” is valid, as it does not assert the average is a temperature.

    You have the following data set:

    60 + 58 + 58 + 58 + 58 + 58 + 58 + 20 = 428; 428/8 = 53.5

    An hour later your data is:
    58 + 58 + 58 + 58 + 58 + 58 + 58 + 22 = 428; 428/8 = 53.5

    6 hours later your data is:
    54 + 54 + 54 + 54 + 54 + 54 + 54 + 54 = 432; 432/8 = 54

    In the last case the average temperature increased but the majority of the samples lost heat.
    If you include phase changes in the above – say element 8 is a cubic foot of water that undergoes the transition from ice to water as it heats up from 22 to 54 degrees – the total heat content of the system is very different than for the same change in the average happening entirely above freezing.

    Averaging temperatures and pretending the result is a temperature that indicates changes in heat content can hide all sorts of mischief, and literally conveys no useful information about the heat content of a system if you do not know the mass, materials (and their phases), and heat capacities of the system components.
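
    Putting rough numbers on element 8 – a sketch using approximate handbook values in BTU:

    # A cubic foot of water going 22 F -> 54 F crosses the ice/water phase
    # change, so its heat gain dwarfs the same delta-T without a melt.
    LB_PER_FT3 = 62.4   # approx. mass of a cubic foot of water, lb
    CP_ICE = 0.5        # BTU/(lb*F), approx.
    CP_WATER = 1.0      # BTU/(lb*F), by definition of the BTU
    FUSION = 144.0      # latent heat of fusion, BTU/lb

    heat = LB_PER_FT3 * (CP_ICE * (32 - 22)       # warm the ice 22 -> 32 F
                         + FUSION                 # melt it at 32 F
                         + CP_WATER * (54 - 32))  # warm the water 32 -> 54 F
    no_melt = LB_PER_FT3 * CP_WATER * (54 - 22)   # same delta-T, no phase change
    print(f"with phase change: {heat:.0f} BTU; without: {no_melt:.0f} BTU")

    Roughly 10,700 BTU with the melt versus about 2,000 BTU without – a five-fold difference that an identical 32 degree rise in the “average” completely hides.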

  70. jim2 says:

    “Note that photons don’t have impact / scattering they have absorb and re-emit and note that photons do not change velocity with energy, they change color. I think something can be made of that… just not sure what ;-)”

    higher frequency => shorter wavelength => higher energy @ c
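
    With numbers (standard constants; the two wavelengths are just examples):

    # E = h*f = h*c/lambda for a photon.
    h = 6.62607015e-34   # Planck constant, J*s
    c = 2.99792458e8     # speed of light, m/s

    for name, lam in [("10 um thermal IR", 10e-6), ("500 nm green", 500e-9)]:
        f = c / lam
        print(f"{name}: f = {f:.2e} Hz, E = {h * f:.2e} J")

    The green photon carries about twenty times the energy of the thermal IR one – same speed c, different colour.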

  71. EM – did I mention that it’s hard to avoid averages creeping in when talking about thermodynamics? Given two random walks of gas molecules, basically you can’t tell which one started off hot and which started cold. Given enough of the sequence (and the mass of the molecules in question), you can make a good guess about the overall temperature and pressure of the gas, but each collision is random even at the hot and cold surface. You need to know the data of momentum and energy-changes of that molecule along its path to build up the histogram that can then be matched to the temperature/pressure curves. The natural tendency is to say that the energy will move from a hotter to a colder place, and of course this is what we measure to happen using thermometers (or a toe in the bathtub) but in fact it is a random process, and what is happening over time is that each bit of energy tends towards an equal probability of being anywhere it can physically get to. The hot spreads out to fill that space, but the cold does too, and these movements are independent of each other, if you want to look at it that way instead. This is certainly counter-intuitive initially, but if you accept the kinetic theory of gases and that the individual collisions are random, then the logic follows on.

    The counter-intuitiveness comes from regarding the temperature as a single figure, whereas it is in fact one probability curve of a group where each theoretically stretches from zero velocity to infinite velocity, and given enough samples of the actual velocities you can say which curve it is on and thus assign a temperature. You can’t tell the temperature of a single particle, but only of a collection of them that are interacting. You can measure its velocity relative to something else (at least to within the HUP), but that does not mean that it has a certain temperature because of that velocity. I’ve however seen papers that treat the kinetic energy of a particle as if it were a temperature. Again, a mistake in clear thinking.

    This is of course a simplification that ignores any winds or convection currents, but they can be added into the model later on. Such things are non-random movements. At the individual particle level, though, the collisions are still random, but since both energy and momentum are conserved in each collision and the net momentum is in a particular direction, then it will remain in that direction after the collision too. If you look specifically at the collision of two molecules in a quantity of gas, you haven’t enough information to calculate either the temperature of the gas or whether there’s a wind. Once you have enough collisions to produce an average, you can then show the density and any winds or convection currents. An interesting point that is generally not noticed is that the path of a molecule between collisions must in a gravitational field be a parabola. Heavier molecules sink to the bottom because their paths have a greater curvature, and that gravitational acceleration must be countered by a force from molecules below which are at a higher pressure (which itself is an average).

    Interestingly, since momentum is conserved (as far as we know) then the initial momentum imparted by the hot plate in the gas will not be dissipated the further it gets away from the hot plate. It may well be absorbed in effect when it hits a wall, but will actually still be there (and the hot plate will have a recoil), but normally we’d be looking at a container of gas with a hot plate at one side so the momenta cancel out. It looks possible to use that to provide lift, if you have a plate that is hot on the underside and cold on the topside, and put it in free air. It may not however be very efficient and you’d need to clear the hot and cold boundary layers away from the surfaces so that they can be replaced by average temperature molecules. Maybe more useful in a heavy atmosphere such as Argon. Note that “average” crept in there, too….

    With normal atmospheric conditions, the frequency of collisions is around 7 GHz (on average…) and the mean free path is of the order of 70 nm (a quick kinetic-theory check of those figures follows at the end of this comment). It’s hard to get measurements that aren’t an average.

    Damned averages….

    We can probably frequency-shift photons downwards by using an almost-transparent low band-gap PV at the focus of a lens, where the re-emitted photons (in random directions) are re-focussed by another lens to give a colour-shifted image. I can’t think of a lot of use for that yet. Using a doubler is also possible to upshift the frequency so we can get a visible-light image of an IR-lit scene, though doublers tend to be narrow-band so it would be monochrome. We can probably do quite a lot more interesting stuff with photons than we’ve already done. On the causality problem, though, at the point/time when the photon is emitted, the location/time of its absorption cannot be known either to the object or the photon, since it will necessarily be at the event horizon. It can’t tell, therefore, whether it will be going from colder to hotter or hotter to colder.

    Overall, though, regarding the temperature as a single number is useful in a lot of situations, but it does lead us to treat it as if it were really a single number and that in turn leads to some logical fallacies. In some ways, the meteorologists ways of getting an average temperature (and the GAT problem) are not as critical as the thinking that dictates that we can only use energy once on its way from hot to cold and then it’s waste and no longer usable. Though it may seem that environmental energy is somewhat diffuse, there’s around a kilowatt per square metre of it available and even a small bit of that will be useful.
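
    The promised kinetic-theory check of the 7 GHz / 70 nm figures – a sketch assuming pure N2 with an effective molecular diameter of about 0.37 nm:

    # mean free path: lambda = k*T / (sqrt(2) * pi * d^2 * P)
    # collision rate: v_mean / lambda, with v_mean from Maxwell-Boltzmann.
    import math

    k, T, P = 1.380649e-23, 293.0, 101325.0   # J/K, K, Pa
    d = 3.7e-10                               # effective N2 diameter, m
    m = 28.0 * 1.66054e-27                    # N2 mass, kg

    lam = k * T / (math.sqrt(2) * math.pi * d**2 * P)
    v_mean = math.sqrt(8 * k * T / (math.pi * m))
    print(f"mean free path ~ {lam * 1e9:.0f} nm")            # ~66 nm
    print(f"collision rate ~ {v_mean / lam / 1e9:.1f} GHz")  # ~7 GHz

    Both land in the right neighbourhood – averages and all.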

  72. E.M.Smith says:

    The XU4 has a sporadic issue with the USB 3.0 ports being dodgy. It will likely be resolved pretty fast under Debian / Devuan; but for now, for the release I’m running, it is still there.

    I was reminded of this when I tried using it as the TV Traveler… With the WiFi Dongle in one of them, it had dodgy WiFi. When I put that in the USB 2.0 it was fine, but the keyboard dongle was moved to 3.0 and keystrokes would ‘repeat’ often… I finally got logged in via the expedient of a fast tap and long wait of each key of username and password… so I could shut it down….

    I’d put a USB 2.0 Hub on the thing in the office, then forgot about it… As a traveler without a hub I was reminded…. So I’ve either got to use the Odroid C1 (that as far as I know loves the HDMI wire, has only USB 2.0 and has working WiFi) as the traveler, or use the Pi M3 (that’s just a touch slow but OK).

    Or I supposed I could just wait for fixes to the USB 3.0 and 64 bit WiFi stuff…. Ah, the “joys” of developer land…

    OK, for now, the Pi M3 is the target. (Gee… isn’t it interesting how often that same refrain comes around…
    “FOO and BAR have this little Foobar, so I’m sticking with the Pi M3”)

    Oh Well. That’s what exploration is all about. Finding where there are “Here they be Dragons” signs…

    @Larry:

    Exactly so.

    @Pouncer:

    Well there are good points and bad points about the Dow analogy…

    “The analogy though is that the various GISS or HADCRU data sets function with regard to atmospheric temperatures in the way that the Dow Jones Industrials or Standard & Poor indexes of stock market prices inform us of the state of the economy. ”

    It is quite true that as temperatures in one place or another change, they cause changes in the average of the temperatures. It is quite true that as a company share price rises and falls it causes changes in the average of share prices. It is also true that both are largely useless for predicting much.

    The Dow does not PREDICT the state of the economy, it reflects it. It reflects more strongly the actions of The Fed and the emotional state of investors and traders… The Dow is highly manipulated, much like temperature data, with frequent “station cherry picking”. This then requires “adjustments” to how the Dow is calculated. Notice it is the “Dow 30 INDUSTRIALS”… So think Bank of America is an “industrial”? Yeah… that bad. How about Microsoft? Coca Cola? Apple? American Express? Goldman Sachs?

    http://money.cnn.com/data/dow30/

    Companies in the Dow Jones Industrial Average
    Company 	Price 	Change 	% Change 	Volume 	YTD change
    MMM 3M 	209.17 	-0.47 	-0.22% 	1,413,014 	+13.99%
    AXP American Express 	86.46 	-0.09 	-0.10% 	2,262,735 	+16.28%
    AAPL Apple 	158.91 	-1.95 	-1.21% 	16,591,051 	+41.64%
    BA Boeing 	242.01 	+1.43 	+0.59% 	3,271,270 	+54.37%
    CAT Caterpillar 	119.49 	-1.45 	-1.20% 	2,992,226 	+27.54%
    CVX Chevron 	108.76 	+1.14 	+1.06% 	3,660,250 	-7.60%
    CSCO Cisco 	32.05 	-0.36 	-1.10% 	14,686,578 	+6.88%
    KO Coca-Cola 	46.91 	+0.19 	+0.41% 	7,391,249 	+10.42%
    DIS Disney 	98.73 	+0.84 	+0.86% 	7,827,433 	-2.61%
    XOM Exxon Mobil 	79.69 	+0.19 	+0.24% 	7,361,092 	-15.17%
    GE General Electric 	24.12 	+0.21 	+0.88% 	58,848,107 	-20.44%
    GS Goldman Sachs 	225.88 	+2.14 	+0.96% 	2,346,855 	-5.67%
    HD Home Depot 	160.08 	+0.17 	+0.11% 	3,383,036 	+12.46%
    IBM IBM 	145.92 	+0.16 	+0.11% 	3,351,733 	-13.20%
    INTC Intel 	36.25 	+0.16 	+0.44% 	12,821,972 	-3.25%
    JNJ Johnson & Johnson 	131.97 	-0.66 	-0.50% 	3,820,814 	+13.73%
    JPM JPMorgan Chase 	91.13 	+0.23 	+0.26% 	9,815,834 	+6.27%
    MCD McDonald's 	157.09 	+0.76 	+0.49% 	1,878,291 	+31.29%
    MRK Merck 	65.45 	-0.01 	-0.02% 	6,185,056 	+8.43%
    MSFT Microsoft 	75.13 	+0.45 	+0.60% 	21,736,161 	+18.99%
    NKE Nike 	53.46 	+0.06 	+0.12% 	5,509,805 	+4.98%
    PFE Pfizer 	35.05 	-0.32 	-0.90% 	18,179,055 	+4.56%
    PG Procter & Gamble 	93.51 	-0.0011 	-0.00% 	4,960,653 	+10.05%
    TRV Travelers Companies Inc 	121.43 	-1.05 	-0.86% 	1,547,044 	-2.06%
    UTX United Technologies 	109.95 	+0.09 	+0.08% 	2,644,747 	+7.57%
    UNH UnitedHealth 	198.26 	+0.79 	+0.40% 	2,242,195 	+24.81%
    VZ Verizon 	46.88 	+0.09 	+0.18% 	11,455,202 	-10.23%
    V Visa 	105.63 	-0.58 	-0.55% 	4,466,772 	+33.17%
    WMT Wal-Mart 	80.08 	+0.47 	+0.59% 	6,474,360 	+13.38%
    

    So yeah, the GAT from Hadley or GIStemp is much like the Dow INDUSTRIALS. Basically misleading from the start. Cherry picked by a committee deciding what gets put in and taken out. HIGHLY adjusted, and so grossly changed over time that it cannot be used for its original purpose. Manicured via selection bias and other means to have a long history of rising, to the point where the “average” is over 20,000 while every stock in it is priced below $250. A true simple average could be at most $250 x 30 / 30 = $250, so the published index sits roughly two orders of magnitude above any real average price. It isn’t a mean at all, but a sum of prices divided by a small, much-adjusted divisor, and is so divorced from reality as to be useless for anything historical…
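
    To make the divisor point concrete, here’s a quick Python sketch. The prices are the 29 tickers from the table above; the divisor value is my rough approximation for September 2017 (the real one is tweaked over time for splits and substitutions), so treat the output as illustrative only:

        # The Dow is NOT a simple average of its component share prices; it is
        # their sum divided by a small, much-adjusted "Dow divisor".
        prices = [209.17, 86.46, 158.91, 242.01, 119.49, 108.76, 32.05, 46.91,
                  98.73, 79.69, 24.12, 225.88, 160.08, 145.92, 36.25, 131.97,
                  91.13, 157.09, 65.45, 75.13, 53.46, 35.05, 93.51, 121.43,
                  109.95, 198.26, 46.88, 105.63, 80.08]

        simple_average = sum(prices) / len(prices)  # what "average" would suggest: ~108
        DOW_DIVISOR = 0.1458                        # assumed value; adjusted over the years
        index_value = sum(prices) / DOW_DIVISOR     # how the index is really computed: ~21,500

        print(f"simple average of prices: {simple_average:10.2f}")
        print(f"index via divisor:        {index_value:10.2f}")

    Same 29 numbers in, answers two orders of magnitude apart, purely from the divisor bookkeeping.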

    Trying to predict using the indexes is notoriously unreliable. The best I can do is pick up inflection points in the local trend, usually on the order of months, though a very long term decadal mode can confirm major changes of Bull vs Bear markets. IFF you get enough confirming indications, that is, and realizing it can be suddenly made very wrong by someone at The Fed grasping a microphone and speaking…

    Now, as to them “informing about the state of the Economy”, that too “has issues”. They said the economy was absolutely peachy keen just before the Dot.Com Bubble. And before the Housing/CRA Financial Crisis Bubble. And before essentially every other bubble we’ve had. In short, they don’t tell you shit about the state of the Economy. They tell you about the state of the psychology of the market players. So yeah, I guess GIStemp and Hadcrut are like that. They tell us about the psychology of the Climate “scientists”…

    I could do more on that, but that’s enough for now. Thumbnail is: the analogy is mostly crap, but where it is accurate, it says both indexes are crap for what he claims they do.

  73. E.M.Smith says:

    @Simon:

    Yup. The one downside to keeping a tidy mind is it can take a lot of words to be correct. But worth it.

    An interesting point that is generally not noticed is that the path of a molecule between collisions must, in a gravitational field, be a parabola. Heavier molecules sink to the bottom because their paths have greater curvature, and that gravitational acceleration must be countered by a force from the molecules below, which are at a higher pressure (itself an average).

    Hmmmm… PV=nRT issues… gravity causing more P via that parabola, and perhaps also delivering more T as more collisions end up lower down… could that be supportive of the “temperature curve of planets” idea that says T is mostly a function of P? Just due to those parabolic deliveries of KE…

  74. jim2 says:

    @Simon Derricutt says:
    13 September 2017 at 6:53 pm

    Fluorescent molecules are frequency down converters. Not real efficient, but still …

  75. Simon Derricutt says:

    EM – yep, sometimes a lot of words.

    Between (and during) collisions, those molecules are being accelerated by gravity, so they must acquire downward momentum from the gravitational field. The KE of molecules going upwards is reduced, and the KE of those going downwards increased. The gas molecules below must therefore have a higher average KE. Again, equilibrium implies an average. I haven’t done the full maths here, but it ought to produce a full description of the pressure/temperature correlation. This was also measured by Graeff, in an exquisitely precise measurement of the temperature difference in air over a metre of height. I’ve linked to that one before.
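
    A minimal version of that maths (my sketch, not a full derivation): equate the kinetic energy a parcel loses climbing a height dh (g·dh per unit mass) to its enthalpy change (c_p·dT), and out falls the textbook dry adiabatic lapse rate:

        # Energy balance per unit mass:  g*dh = -c_p*dT  =>  dT/dh = -g/c_p
        g   = 9.81    # m/s^2, surface gravity
        c_p = 1005.0  # J/(kg*K), specific heat of dry air at constant pressure

        lapse_rate = g / c_p                         # K per metre of height
        print(f"{lapse_rate * 1000:.2f} K per km")   # ~9.76 K/km, the dry adiabatic value
        print(f"{lapse_rate:.5f} K over one metre")  # ~0.01 K: the scale a Graeff-type measurement must resolve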

    Jim2 – fluorescents are efficient enough for white LEDs and fluorescent tubes. Not too bad. Still, I was thinking about something that shifted the whole spectrum down by a fixed energy gap, so that you get a picture in colour rather than monochrome. It’ll maybe get a use sometime.

  76. jim2 says:

    @Simon Derricutt says:
    13 September 2017 at 7:56 pm

    To what monochrome picture do you refer?

  77. Simon Derricutt says:

    Jim2 – a fluorescent material takes a photon above the energy it needs and outputs a photon of a single frequency/wavelength/energy (though there is some spreading because of thermal effects, the output wavelength bears no relationship to the input wavelength except that it is longer). You thus get a monochrome picture out of the system. White LEDs (and tubes) use a mixture of phosphors to get an approximation to white light from UV input. It would be somewhat nice if you could shift a whole spectrum down by a fixed energy, and retain the information of the differences in wavelengths as well as their intensity.
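
    The arithmetic of such a fixed-energy shift is simple enough to sketch (the 1.0 eV gap below is just an illustrative number, not a real material):

        # Shift every photon down by a fixed energy gap: E_out = E_in - E_gap.
        # Photon energy and wavelength relate via E[eV] = 1239.84 / lambda[nm].
        H_C_EV_NM = 1239.84  # h*c expressed in eV*nm

        def shifted_wavelength(lambda_in_nm, gap_ev):
            """Output wavelength after removing a fixed energy gap from a photon."""
            e_out = H_C_EV_NM / lambda_in_nm - gap_ev
            if e_out <= 0:
                raise ValueError("photon energy below the gap: nothing comes out")
            return H_C_EV_NM / e_out

        # A 1.0 eV shift pushes visible light into the near infrared, but blue,
        # green and red remain distinct wavelengths, so the colour info survives:
        for lam in (450, 550, 650):  # nm
            print(lam, "nm ->", round(shifted_wavelength(lam, 1.0)), "nm")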

  78. Larry Ledwick says:

    An interesting point that is generally not noticed is that the path of a molecule between collisions must in a gravitational field be a parabola.

    Actually it is an ellipse, with one focus at the center of the earth. An ellipse of high eccentricity approaches a parabola as its second focus recedes toward infinity; the parabola is the limiting case of a high-eccentricity ellipse.

    That is the basis for the gravitational-field source of lapse rate. Between impacts with other molecules, each gas molecule is in a free orbit around the earth. As such, the higher its altitude above the earth, the lower its kinetic energy of motion and the higher its gravitational potential energy of position.

    At lower altitudes the atoms have more of their total energy in the form of kinetic energy, and thus higher temperature. It really is very, very simple if you can visualize a single atom as an orbital body around the earth. Conservation of energy demands that the atom’s kinetic energy increase as it gets closer to the surface of the earth (falls from apogee), and that it “cool” and slow down as it moves farther above the surface (rises toward apogee), swapping kinetic energy (temperature) for gravitational potential energy.
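
    Putting rough numbers on that single-molecule picture (my sketch; nitrogen assumed, and this toy view ignores the rotational modes and pressure work the full treatment includes):

        # One N2 molecule falling 1 m gains KE = m*g*dh; express that as a
        # temperature change via translational KE = (3/2)*k*T.
        k  = 1.380649e-23  # Boltzmann constant, J/K
        m  = 4.65e-26      # mass of one N2 molecule, kg
        g  = 9.81          # m/s^2
        dh = 1.0           # metres fallen

        d_ke = m * g * dh        # kinetic energy gained, J
        d_T  = d_ke / (1.5 * k)  # equivalent temperature change, K
        print(f"{d_T * 1000:.1f} mK per metre of fall")  # ~22 mK: tiny, but not zero

    Note the toy number is roughly twice the bulk dry lapse rate of ~10 mK per metre, because a real gas also does pressure work and stores energy in rotation; the order of magnitude is the point.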

  79. Simon Derricutt says:

    Thanks Larry. The gravitational field is not parallel but radiates from a point, so it’s an ellipse. A bit more mind-tidying; I was taught so long ago that it was a parabola that I didn’t think to question it.

  80. Larry Ledwick says:

    For very large eccentricities a parabola is essentially identical to one end of an ellipse, so for all practical purposes parabolic trajectories work for artillery etc. until the flight distance is large in comparison to the radius of the earth (ICBMs). We were all taught in physics that ballistic trajectories are parabolic over typical short flight times, like a tossed baseball, an arrow, or even most artillery / mortars.

    They never mention that they are assuming a special case where the gravitational field is normal to a flat surface rather than normal to a spherical one. That is sufficiently accurate for an introduction to parabolic functions and simple examples of ballistic flight, but it fails for long flight times and distances over ground.

    Thomas Harriot, Galileo, and Tartaglia worked out that ballistic flight was curved (parabolic); Tartaglia was the first to work out a mathematical description of a cannon ball’s flight.
    In the mid 1500s Tartaglia was using black powder cannon and developed ballistics tables for accurate, predictable use of artillery, so the maximum range he had to deal with was just a few hundred yards. 1000 yards was considered very long range, with 80–300 yards more typical of cannon usage in the 1500s and later for battering down castle walls. At those distances the parabola and ellipse are functionally the same.
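
    A quick numerical check on that last claim (my sketch; the 1500 m/s muzzle speed and 45° elevation are just assumed round numbers): integrate a trajectory under true inverse-square gravity over a spherical earth and compare against the flat-earth parabola formula.

        import math

        G_M = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
        R_E = 6.371e6         # mean Earth radius, m
        g0  = G_M / R_E**2    # surface gravity, ~9.82 m/s^2

        v0   = 1500.0            # assumed muzzle speed, m/s
        elev = math.radians(45)  # assumed elevation angle

        # Flat-earth parabola: the classic range formula v^2 * sin(2a) / g
        flat_range = v0**2 * math.sin(2 * elev) / g0

        # Spherical earth: step the motion under inverse-square central gravity
        x, y = 0.0, R_E  # launch from the surface
        vx, vy = v0 * math.cos(elev), v0 * math.sin(elev)
        dt = 0.01
        while math.hypot(x, y) >= R_E:
            r = math.hypot(x, y)
            ax, ay = -G_M * x / r**3, -G_M * y / r**3
            vx += ax * dt; vy += ay * dt  # semi-implicit Euler step
            x  += vx * dt; y  += vy * dt

        ground_range = R_E * math.atan2(x, y)  # arc length along the surface

        print(f"parabola (flat earth): {flat_range / 1000:7.1f} km")
        print(f"ellipse  (sphere):     {ground_range / 1000:7.1f} km")

    At a couple of hundred km range the two differ by only a percent or two; drop v0 to musket or mortar speeds and the difference vanishes into the noise, while at ICBM speeds the flat-earth formula falls apart entirely.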

  81. Larry Ledwick says:

    Very interesting decision by Colorado’s Governor regarding immigration.
    I frankly did not expect this to be the choice selected.

  82. Larry Ledwick says:

    Meanwhile, back at North Korea: they have now unmistakably demonstrated that they have both a suitable nuclear device and a missile with enough range to strike Guam.

    The only missing requirements are some demonstration that the missile has enough throw weight for the latest 160-250 kt weapon and that they can achieve sufficient accuracy at range to make effective use of the device.

    This flight was only 9 minutes in duration (very short warning time for Guam) and, at 2294 miles, demonstrated sufficient range to reach the island.

    Guam houses Andersen Air Force Base (B-52, B-1, and B-2 strategic bombers), our nuclear sub tender servicing SubPac nuclear boats (both fast attack and boomers), SRF Guam (Ship Repair Facility, i.e. dry docks etc.), Naval Hospital Guam, Naval Base Guam (soon to include Marines being shifted out of Okinawa), and the communications station on Guam. In short, the most significant US naval asset in the Pacific outside of Hawaii.

    http://www.sbs.com.au/news/article/2017/09/15/tillerson-urges-china-and-russia-take-action-after-north-korea-missile-launch

  83. Another Ian says:

    E.M.

    Would you believe?

    “Equifax had ‘admin’ as login and password in Argentina”

    http://www.smalldeadanimals.com/2017/09/what-would-we-d-83.html

  84. Larry Ledwick says:

    That is scary!
    How do you find employees that stupid?

  85. E.M.Smith says:

    @Another Ian:

    There IS hope!

    @Larry L.:

    Also missing is evidence the gadget will function after the G’s and vibration… and reentry heat…

    @All:

    D’oh! I never questioned the parabola. It was inserted too early…. Yes. Of course. Ellipse…

  86. Larry Ledwick says:

    Chronology of North Korean missile and nuclear development

    https://au.news.yahoo.com/world/a/37102634/chronology-of-north-korean-missile-development/

  87. jim2 says:

    Hmmmm … nmap says I have 445/tcp open (microsoft-ds) on my Linux box. Does anyone else see that port open, if you are running Linux?

  88. jim2 says:

    jim2 says:
    15 September 2017 at 12:32 pm

    Never mind, it’s Samba. It’s gone now!
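
    For anyone wanting the same quick check without reaching for nmap, a minimal Python sketch (localhost only; nmap does far more):

        import socket

        def port_open(host, port, timeout=1.0):
            """True if a TCP connect to host:port succeeds."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        # 445/tcp is the SMB port Samba (and Windows file sharing) listens on
        print("445 open on localhost:", port_open("127.0.0.1", 445))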

  89. Larry Ledwick says:

    Hmmm, looks like GAB is going after Google for restraint of trade.

    View story at Medium.com

  90. E.M.Smith says:

    So what is “Gab” and how is Google restraining their trade?

  91. Pingback: W.O.O.D. – 15 Sept 2017 | Musings from the Chiefio

  92. cdquarles says:

    Gab.ai is an alternative to Twitter. I’d guess that Google is down-rating them in search results involving social media services.

  93. Larry Ledwick says:

    As noted above, it is a Twitter clone but is “conservative safe”, in that they are not pushing a leftist agenda and summarily suspending conservatives for trivial issues while letting the most egregious abusive behavior of liberals go unchallenged.

    Stefan Molyneux has posted tons of his videos over there to avoid having them yanked by Google’s YouTube and by Twitter.

    A lot of folks (myself included) have signed up there as a fallback to Twitter and Facebook, in case those pull the plug entirely on conservative voices.

    They are not quite at critical mass yet, as some of the big organizations have not set up accounts over there, but it is moving in that direction.

  94. Larry Ledwick says:

    By the way, WordPress is being really obnoxious today: while I typed the last posting (in Firefox), it kept jumping to the top of the page and a video ad it was trying to run. Probably nothing you can do about it, but just so folks are aware.

  95. philjourdan says:

    Gab does not restrict hate groups, so Google banned them from Android. Gab is actually the epitome of free speech.

    I hope they win.

  96. E.M.Smith says:

    I’ll see if there are any “controls” I can set on adverts. I don’t think there are, but if “no video” is an option, I’ll set it. FWIW, I’ve set “no JavaScript” on my Daily Driver so the video crap is suppressed. I turn it on if I want to look at a video… or just grab the tablet. I’m likely going to move to a “tablet for videos” mode in general, and just block it everywhere else. Makes the world more livable on small hardware, and is more secure as scripts and Flash get blocked.

    I’m finally getting the hang of this “too cheap to care” hardware world: having dedicated systems / chips for different uses.
    Dedicated media station (Chromebox on the TV) for fun videos.
    Dedicated “on the road” and some videos with the Android Tablet.
    Dedicated Financial Processing with the C2.
    Dedicated Internal Workstation with the Pi M3 / Pi Stack for tech stuff.
    Dedicated Daily Driver for postings and general browsing via the XU4 (soon likely to split posting to a separate chip from general browsing, but maybe not; the two are related in that postings take links that take browsing…).
    Still to do: move email & such to a separate chip (it is still on the Pi M3 internal desktop system), and likely a dedicated file server (right now it’s leave disks off and unplugged, plug into ‘whatever’ when needed – would be better if ‘whatever’ were a fixed system).
    Dedicated DNS / router server.
    Dedicated scraper / public data archive.

    It can be very comforting to know, for example, when I pay a bill in an hour or so that particular chip has NEVER been exposed to anywhere but that ONE financial site. (Soon to be two sites – but still, both Name organizations and very clean). No click bait or phish exposure. No hot pixel exposure. etc. etc.

    Given that it takes about 30 seconds to shut down, replug wires, and reboot (less if you go the chip-swap route, which would be even cheaper and easier), it isn’t exactly a burden. Where before I’d done this with Berry Boot on the Pi as different system images, I’m now doing it with dedicated systems and chips (mostly as they are faster and don’t have Berry Boot, and second because I bought them to play with, after all…). I could have two of them (or maybe even three) up at the same time IFF I wanted to set up another HDMI monitor or use a terminal server to get to them headless (but that sort of weakens the “only one connection type” use and provides a leakage path…)

    So, I’m off to look at settings on WordPress. Then I’ll try the tablet on the web pages and see what the video is. I think one of the browsers on it has JavaScript turned on… and / or Flash enabled…

  97. E.M.Smith says:

    Well, under “ad control” all I have is a pitch to buy ads:

    To keep your blog free, we sometimes run advertisements on your blog, and you can remove these with the purchase of a WordPress.com Plan.

    If you’re interested in earning money from your WordPress.com site by showing high-quality advertisements, please request an invitation to WordAds today!

  98. jim2 says:

    I wonder if WordPress censors conservative sites’ ads.

  99. Larry Ledwick says:

    If you view your blog with the Brave browser it’s not a problem, as its built-in ad blocker squelches the obnoxious behavior.

  100. Larry Ledwick says:

    Interesting read on terrorism and strange allies.

    https://www.saulmontes-bradley.com/latin-ballat/

  101. E.M.Smith says:

    I think it is my choice of processor….

    I’ve had a “problem” from time to time that some sites assume I’m on a “telephone” due to the use of ARM processors in the various boards like the Pi or Odroid. Sometimes this gives me web pages that are hobbled to fit a 4 inch screen.

    I suspect that WordPress is detecting the ARM processor and assuming it can’t handle full on video, so I don’t see it.

    It’s an interesting “maybe a feature” ;-)

    I’ll try firing up an anonymous Windows box tomorrow and see what it sees….

    (So much work to undo what I’ve done to isolate myself from all the crap… yet curiously interesting to note in passing what has been done and the effects of it…)

  102. llanfar says:

    @Another Ian

    Pointman has a wonderful command of language…

    The media is safely ensconced in their own self-wanking echo chamber with an occasional cri de coeur escaping out of it into the void to die of loneliness like a motherless child nobody actually cares for.

  103. jim2 says:

    EM – will the Pi run Wine? I don’t know how secure that would be, but I used to use it to log into GoToMyPC at work. That way I could use Linux and not have the fuss and bother of MS.

  104. E.M.Smith says:

    @jim2:

    It ought to run anything Debian can run, with one caveat: Wine is not an emulator, so on ARM it only runs Windows binaries built for ARM. For the usual x86 Windows programs you’d need an x86 emulator (qemu or the like) underneath Wine as well, and I suspect performance would suck. Windows can suck down a 4-core Intel box, and this would be putting a 4-core ARM chip under an emulator under it…

    I might try it later in the day if I get my “chores” done… Ought to be good for a laugh ;-)

    Were I going to run “wine”, I think I’d do it on an Intel box. You already know you are going to be exposed via the Windows bits, so at that point the added risk from the Intel firmware attack is small, IMHO. Might put it on something like the octo-core XU4 Odroid. It seems very fast and could likely carry the load OK.

  105. jim2 says:

    I’m running AMD. It worked OK with wine.

  106. Larry Ledwick says:

    An interesting story about how industrial manufacturing died in America, and the long-term political fallout we are seeing today in life-long blue-collar Democrats voting for Trump:

    https://nypost.com/2017/09/16/the-day-that-destroyed-the-working-class-and-sowed-the-seeds-for-trump/

    I remember that period, as it was a time of struggle for me. The economy was beginning its Carter Crash in 1977, and over the next 4 years or so lots of people would change their lives forever as the wheels came off their plans through no fault of their own. I had gotten out of the Navy a few years earlier and was struggling to find a job. I ended up changing jobs several times in 4-5 years trying to get on my feet. It was a time when I was picking up aluminum cans in car wash dumpsters and taking them to recycling so I had enough money to make phone calls about job openings in the newspapers, and gas enough to get to the interviews. I was discharged from the Navy in the fall of 1972 and stepped into the worst economic downturn of my lifetime to that date.

    US inflation rate during that period
    1972 = 3.27%
    1973 = 6.16%
    1974 = 11.03%
    1975 = 9.20%
    1976 = 5.75%
    1977 = 6.50%
    1978 = 7.62%
    1979 = 11.22%
    1980 = 13.58%
    1981 = 10.35%
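
    Compounding those yearly rates (a quick sketch using exactly the figures above) shows how brutal the decade was:

        # Price level growth, compounding the 1972-1981 annual inflation rates
        rates = [3.27, 6.16, 11.03, 9.20, 5.75, 6.50, 7.62, 11.22, 13.58, 10.35]

        level = 1.0
        for r in rates:
            level *= 1 + r / 100

        print(f"price level multiplier:          {level:.2f}x")     # ~2.25x over the decade
        print(f"1972 dollar's 1981 buying power: {1 / level:.2f}")  # ~44 cents

    A dollar saved in 1972 bought well under half as much by 1981.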

  107. Larry Ledwick says:

    Hmmm, if true this would be a big deal: yet another secret, off-the-books server scandal.

    http://www.breitbart.com/california/2017/09/16/discovery-of-a-another-democrat-secret-server-puts-californias-ag-becerra-in-hot-seat/
