I got a request to turn a comment into a more complete posting. This is that effort.
There is a standard model of heat gain / loss used by the Global Warming folks. It goes something along the lines of “sunlight shines in. Some turns to infrared, and can’t get out due to CO2 absorbing it. That is then ‘re-radiated’ toward the ground. In the end, the ground and the air get warmer due to that CO2 preventing the free radiation of that heat as infrared radiation”. It is usually accompanied by a chart to illustrate:
From this paper: http://www.cgd.ucar.edu/cas/abstracts/files/kevin1997_1.html we get a picture like the one below.
There are color versions of this chart, but this is the one from the paper.
The most fundamental problem with the present theory, the place where the reasoning is broken, comes from this picture of the world: it is a static vision. Nice fixed numbers for cloud cover, evaporation, thermals, CO2. Then the future is predicted (or ‘projected’ or ‘guessed’ or whatever they are calling hand waving these days) based on changing CO2, while leaving the rest of the factors unchanged. A static scoring model, if you will, instead of one recognizing the dynamic nature of the real world.
So, just as one example, a surface radiates heat depending on the temperature of that surface. It is a 4th power function of the absolute temperature. If the surface doubles in absolute temperature, the heat leaving increases by 2 x 2 x 2 x 2, or 16 times as much. Do you see a 16 x increase for hot spots on this chart?
So, for example, if an equatorial area had a cool rain forest turn into a hot desert, there would be a 4th power increase in radiation from that increased temperature. That matters.
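That 4th power relation is the Stefan-Boltzmann law. Here is a minimal sketch of the “double the temperature, 16 times the radiation” point; the emissivity of 1 and the example temperature are my illustrative assumptions, not figures from the chart:

```python
# Stefan-Boltzmann law: radiated power per unit area P = e * sigma * T^4
# (T in kelvin; emissivity e assumed to be 1 for simplicity)
SIGMA = 5.670374419e-8  # W / (m^2 K^4)

def radiated_power(temp_kelvin, emissivity=1.0):
    """Radiated power per square meter for a surface at temp_kelvin."""
    return emissivity * SIGMA * temp_kelvin ** 4

t = 300.0                   # a roughly Earth-surface-like temperature, K
p1 = radiated_power(t)
p2 = radiated_power(2 * t)  # double the absolute temperature
print(p2 / p1)              # -> 16.0, the 2 x 2 x 2 x 2 factor
```

Note that the doubling is in kelvin, not degrees F or C; a patch of ground going from a cool forest to hot desert is a smaller absolute-temperature change, but the 4th power still amplifies it strongly.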
But it gets worse.
Notice that evaporation and convection / thermals are constants? (And shown as relatively small ones at that). What happened to weather?
When the sun rises, the surface warms. (Remember that 4th power function… it has strong effects even during the day / night cycle. Notice there is no day / night cycle in the picture?) Anyone who flies, especially in hot air balloons, knows that the best flying is just at sunrise. Inside a couple of hours, the warmth starts causing thermals and the wind picks up, making it too rough to launch. The amount of evaporation, thermals / convection, and wind changes as the energy input changes.
In the tropics, it is common knowledge that in the afternoon there will frequently be rain. The sun rises, the surface warms, and a lot of water evaporates. From the sea. From the forest canopy. From the dirt itself. That water rises high in the sky and condenses to fall as rain. As it condenses, it releases a very large quantity of heat. It does not depend on ‘radiation’ to move that heat to the tops of the clouds. It depends on evaporation, convection, and condensation.
The amount of rain varies fairly directly with the amount of energy in, provided there is water available. (For most of the earth there is plenty of water. Only in a few places do we have large deserts. 70% of the planet surface is water. Water dominates.) The amount of energy being moved in a single hurricane is measured in nuclear bomb equivalents. It is just massive. Note, too, that the image shows those clouds at the bottom of the greenhouse gas layer. In fact, the CO2 et al. is evenly distributed through the air column, and the ultimate height that clouds can reach is the base of the stratosphere. How high, and how strong, depends on how much heat is delivered to the moist surface. Again, we have a heat transport system that changes with changes of temperature or heat flow. More heat in gives more heat transport out.
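The “nuclear bomb equivalents” claim can be sanity-checked from latent heat alone. A rough sketch, where the rainfall rate, rain-field radius, and bomb yield are all assumed round numbers of mine, not figures from the post:

```python
import math

# Assumed round numbers (mine, for illustration): a mature hurricane
# raining ~1.5 cm/day over a disc of ~500 km radius.
RAIN_M_PER_DAY = 0.015     # m of rain per day
RADIUS_M = 500e3           # m
LATENT_HEAT = 2.26e6       # J released per kg of water vapor condensed
WATER_DENSITY = 1000.0     # kg / m^3
HIROSHIMA_J = 6.3e13       # J, roughly the Hiroshima bomb yield

area = math.pi * RADIUS_M ** 2                        # m^2 of rain field
mass_per_day = area * RAIN_M_PER_DAY * WATER_DENSITY  # kg of rain per day
heat_per_day = mass_per_day * LATENT_HEAT             # J of condensation heat

bombs_per_day = heat_per_day / HIROSHIMA_J
print(f"{heat_per_day:.1e} J/day, ~{bombs_per_day:,.0f} bomb-equivalents/day")
```

Even with conservative inputs, the condensation heat comes out on the order of 10^19 joules per day, i.e. hundreds of thousands of Hiroshima-scale bombs. That is heat physically hauled to cloud-top by convection, not by radiation from the surface.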
We have a collection of systems here, with non-linear changes, modeled as fixed constants. No day and night. No seasons. No polar vortex with downflowing frigid air. No equatorial zone with tropical hurricanes moving gigatons of water vapor to the stratosphere, and back again as torrential rains. Each of those systems is known to have dramatic changes, sometimes even order of magnitude scale, over time and over the surface of the globe. Yet all of it is just wished away with “an average will do”.
But an average won’t do. Take a surface at some average temperature and instead make it of two halves, at 1/2 and at 2 x that (absolute) temperature: the average temperature is unchanged, but the surface now radiates over 8 times as much heat, due to that hot side and the 4th power function. You can’t average away non-linear properties and effects.
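That “over 8 times” figure falls straight out of the 4th power law. A quick check (temperatures in kelvin, emissivity taken as 1, and the reference temperature is an arbitrary choice of mine):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated(t_kelvin):
    """Radiated power per unit area at absolute temperature t_kelvin."""
    return SIGMA * t_kelvin ** 4

T = 288.0  # arbitrary reference temperature, K

# One uniform surface at T:
uniform = radiated(T)

# Two halves at T/2 and 2T -- the *average* temperature is still T:
split = 0.5 * radiated(T / 2) + 0.5 * radiated(2 * T)

print(split / uniform)  # -> 8.03125: over 8 times the radiated heat
```

The ratio is (0.5^4 + 2^4) / 2 = 8.03125 regardless of the reference temperature chosen, which is the whole point: averaging the temperature first throws away the non-linearity.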
This need not be anything so grand as a major desert or ice field. Surfaces have very uneven heating. One spring day I was camping in the mountains. The air temperature was about 80 F / 26 C. We were a bit warm. So we decided to take a swim in the creek. A quick plunge in, and back out, was ‘enlightening’. After the cold shock headache quit, we walked around a bend to find snow in the shadows melting to feed the creek. Inside a small campground, the temperatures ranged from freezing (the snow) to just melted (the creek) to 80 F / 26 C for the air, and on up to about 120 F / 50 C for the black tarmac / asphalt road and the metal on the truck. Those surfaces simply do not radiate at the “average” temperature. We have no idea what surface temperatures are, in detail, and yet they matter dramatically.
Trees even self regulate their leaf temperature, increasing moisture loss to keep the temperature moderated. This points up another problem: “Heat of Fusion” and “Heat of Evaporation”. Collectively, enthalpy. We have heat being added to the system, yet the temperature of the leaves does not change. There is a lot of confounding of temperature with heat in “Global Warming” theory. It doesn’t work.
So we measure temperature and say “Look, it did not warm”, then ignore all the heat that was stored in water vapor by those trees. Cut down the trees, put in a parking lot or airport, and that self regulation ends. Worse, the asphalt becomes quite hot in the sun. The same energy arrives as a solar heat flow, but instead of evaporating water at a constant temperature, it becomes much hotter air over the runway. We say “Look! It is MUCH hotter! The world is warming!”. Yet “warming” implies heat storage or gain, not just higher temperatures. In fact, that hotter surface is radiating heat better and even conducting heat into the air, causing stronger convection. (Any glider pilot can tell you the thermals over asphalt in the sun are quite strong, compared to the nearby grass or trees.) It’s the same heat flow, just a different percentage going into enthalpy vs temperature.
Yet when that water vapor rises, condenses to clouds, and falls as rain, the heat is still dumped to the sky.
We didn’t “warm”, we just have drier air at a higher temperature where we changed the land use. We shifted enthalpy change to temperature change at no net heat gain.
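The forest-versus-asphalt point is just a partitioning of the same energy flux. A sketch with assumed round numbers (the absorbed energy, air mixing depth, and latent heat value are mine, for illustration only):

```python
ENERGY_J = 3.6e6        # assumed: ~1 kW absorbed on 1 m^2 for one hour
LATENT_HEAT = 2.26e6    # J / kg, heat of evaporation of water
AIR_CP = 1005.0         # J / (kg K), specific heat of air
AIR_DENSITY = 1.2       # kg / m^3, near-surface air
MIX_DEPTH_M = 500.0     # assumed mixing depth of the heated air column

# Forest / wet surface: the energy goes into enthalpy (evaporation),
# so the local temperature does not rise at all.
water_evaporated_kg = ENERGY_J / LATENT_HEAT

# Dry asphalt: the same energy heats the air column instead.
air_mass_kg = AIR_DENSITY * MIX_DEPTH_M        # kg of air per m^2 of surface
temp_rise_k = ENERGY_J / (air_mass_kg * AIR_CP)

print(f"wet surface: {water_evaporated_kg:.2f} kg evaporated, ~0 K rise")
print(f"dry surface: 0 kg evaporated, ~{temp_rise_k:.1f} K rise")
```

Same joules in, either roughly a kilogram and a half of water vapor made at constant temperature, or several degrees of hotter air. A thermometer sees only the second path, which is the confounding of heat with temperature the post is describing.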
By the way, most of our thermometers used for land temperature are now located at airports and similar such places. What used to be military parade grounds, then made into grass fields for balloons and eventually wood and cloth airplanes, now is acres of concrete and tarmac. That matters. Yet it is ignored. In a very real sense, our “warming land record” is simply recording the growth of aviation; the cutting down of trees and paving grass fields. The Airport Heat Island is well known to exist. That’s where we put the thermometers. That matters.
So there is a lot of discussion of “Back Radiation”, as though it is the only thing that matters, the only thing that changes. Occasionally there will be mention that cloud cover is poorly understood, or not modeled well. Sometimes you may even get a discussion of the Svensmark Theory that solar wind changes the cosmic ray density, and thus the cloud cover. Yet in hushed tones of doubt. (Even though experimental evidence so far tends to confirm it.) But inevitably the argument of “Back Radiation” returns. Based on the picture above. Such as this example:
This has been discussed here at length. Of course back radiation can not increase the temperature of the earth (direct thermalisation), but back radiation can and does slow the cooling of the surface.
No it doesn’t. CO2 does not “warm” the atmosphere, never ever. The sun does. CO2 just slows cooling, at least theoretically, resulting in an increase of average temperature readings.
On top of that: Even if CO2 slows the cooling of the earth, we simply don’t know exactly what other effects kick in (clouds? change in weather patterns?) to counter that reduced rate of heat loss and might even temporarily overcompensate, thus resulting in global cooling.
But we DO know what warm air does! It rises!
So any ‘slower cooling’ from ‘back radiation’ just means more and ‘faster rising’ to make up for it.
Take your model of more “back radiation” and air having some added warmth near the surface (temporarily). The air will expand, become lighter, and head UP. The more that “back radiation” induced temperature becomes higher than it otherwise would have been, the more and faster it rises to dump that heat at the top of the atmosphere to be radiated away more effectively by those same radiating “greenhouse” gases.
The net result of more CO2 is at most slightly faster convection during the warmest part of the day. (As temperatures cycle strongly over the course of the day, the heat is all dumped before sundown anyway. Ask any pilot when thermals happen. They start just after sun-up and run down after the sun sets.)
What about water? To the extent the surface is water, more “back radiation” makes more evaporation (and not higher temperatures). As water vapor is lighter than air, it, too, rises. In that case to eventually make clouds at altitude, where it condenses, dumps the energy, and radiates the heat away more effectively.
Ultraviolet light penetrates the water to some depth and can deliver heat into the oceans, but the back radiation is supposed to be in the infrared, which stops in the very top layers of the water. Where that water promptly evaporates and does not turn that heat energy into temperature, but into enthalpy.
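The “stops in the very top layers” claim follows from how strongly liquid water absorbs in the thermal infrared. A Beer-Lambert sketch; the absorption coefficient is a representative order-of-magnitude value I am assuming for radiation near 10 microns, not a figure from the post:

```python
import math

ALPHA = 7e4   # 1/m, assumed absorption coefficient of liquid water
              # for thermal infrared near 10 microns (order of magnitude)

def remaining_fraction(depth_m):
    """Beer-Lambert law: fraction of radiation surviving to depth_m."""
    return math.exp(-ALPHA * depth_m)

# Depth at which 99% of the infrared has been absorbed:
depth_99 = math.log(100) / ALPHA
print(f"99% absorbed within {depth_99 * 1e6:.0f} microns")
```

Under that assumption the infrared is essentially all absorbed within a fraction of a millimeter, i.e. exactly the skin layer where evaporation happens, while UV and visible light penetrate meters to tens of meters into the ocean.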
During this downturn of the solar cycle, we’ve had a close look (for the very first time) at how the sun changes. The production of Ultraviolet light has dropped dramatically. The solar spectrum has shifted from UV toward longer wavelengths. Yes, the TSI (Total Solar Irradiance) doesn’t change much, but the part that can get into the oceans drops a lot, while the part that just evaporates water increases. Total energy doesn’t change much, but where it goes and what it does changed a lot. That isn’t even shown in the picture at all (nor is it in the models).
The atmospheric height has shrunk. UV warms the upper layers and expands the height. Less UV, shorter air column. That changes how the wind flows, the degree to which the Jet Stream is zonal (flat) or meridional (‘loopy’). So now it’s ‘gone loopy’ and we’ve got alternating hot and cold spells as the loops slide back and forth over our heads. No different than in the past, but something we’ve not seen for several decades (at least, not to this degree).
That, too, is ignored. Sacrificed on the altar of “back radiation”. Why pay attention to the sun and what it can do? Instead the sun is treated as a static number. Not as the variable star that it really is. Not with a spectrum that can dramatically change on the order of weeks, as it has. No place at the table for UV and a variable star, only IR and a static scored star.
There is a roughly 60 year cycle of the ocean that may well be driven by those solar changes, or perhaps by a lunar tidal cycle. The moon orbits the earth on a regular schedule, but with a slowly changing orbit. This causes changes in tides on a set of longer term cycles. The lunar orbit is ‘in sync’ with solar changes due to something called “orbital resonance”. This is a well established property of things in orbit. They tend to ‘sync up’ with each other.
That makes it hard to tell if “the sun did it” or if it was just that the lunar driven tide really did it, but at the same time the sun changed. Perhaps “the moon did it”. But what is very clear is that ignoring both does not help find “who did it”. The Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO) have long cycles. The ENSO ( La Niña / El Niño ) cycle isn’t quite as regular, but it, too, can cause shifts of long term weather (that is called ‘climate’ by the “climate change” folks, but is really just long term average of weather). So there are these cycles, ranging from sub-decade up to 60 years, 200 years, and perhaps even one as long as 1500 years. Likely driven by tides and orbital mechanics, but with natural oscillations on some time scales. See that in the picture above? No? Oh Dear…
Never Mind that the lunar tide influence is clear and strong (ask any sailor about the importance of accurate tide charts, and how they change over time). Never Mind that ENSO drives our fishing economy and monsoon / crop cycles. Never Mind to anything but “back radiation”.
“It’s just weather”… except that the definition of “climate” used by the Global Warming folks is the 30 year average of weather. I think that is a broken definition. Before Global Warming became a fad we had a perfectly good climate definition system that was largely based on latitude, altitude, distance from water, and land form. Last time I looked, the Mediterranean was still a “Mediterranean Climate” and the Arctic was still an “Arctic Climate” and the Brazilian rain forest was still a “Rain Forest Climate” and the Mojave Desert was still a “Desert Climate”. There has been no “Climate Change” on those terms. But there has been a slow warming over the warm half of the PDO cycle. Does that mean “Climate Change”? Or just that a 30 year average of weather in a 60 year weather cycle will show change? Hmmm? The very definition of “Climate” used by the Climate Change folks is based on broken thinking.
Bond Events are periodic cyclical events of significant cooling (usually preceded by significant warming) that happen every 1500 years. How can you possibly say “climate change” is the 30 year average of weather when weather has 1500 year cycles? Cycles that have nothing to do with carbon dioxide or any ‘greenhouse gases’. We had a dramatic cold event in 535 AD that was the start of The Dark Ages. (It came just after a very warm period – The Roman Optimum). 1500 years + 535 = 2035 AD. Any chance we could be in the warming precursor “Modern Optimum” just before the big drop of the next Bond Event? Ought we to worry that the sun has gone very quiet, that the UV level is plunging, the atmosphere shorter, and the Jet Stream “loopy” so folks expecting snow to be “a fond memory” are now under heavy snow? But now, we are told, do not worry. Look at the Pretty Picture! See, all that matters is ‘back radiation’. The sun is a constant and the air is a standard atmosphere and the oceans never change and the moon isn’t even on the picture, so why talk about changes of tides?…
We are over averaging the data, over simplifying the model / picture, and ignoring history. This will not end well.
There is simply no reason to stop the model (mental or otherwise) at the point the photon hits the dirt or CO2 molecule. We do know what happens. Hot air rises. Hotter air rises faster. The evaporation / precipitation cycle runs faster (if ‘rain is in the air’) too. What’s broken is the idea that heat “builds up”, when in reality it “travels up”…
There is simply no reason to think that CO2 dominates the tides, the sun and moon, hurricanes and jet streams, even polar vortexes (that have changed with the UV / stratospheric temperature changes). There is no reason to make CO2 the control knob on all those other factors. Factors that have caused ice ages to come and go, caused Bond Events and Optimums, and caused whole civilizations to fall. All prior to fossil fuel use.
There is a dramatic history of the fall of empires from Rome to Mesopotamia to Egypt as cyclical droughts happened there. Some lasting nearly a century and collapsing Egypt. Yet the model used, the Pretty Picture, makes no allowance for history. No allowance for records of icebergs on the Black Sea nor for Roman villas built without heat, or even windows that could be closed, in places that are now quite cool. (They knew how to build with central heating, and used it in cold places, so this was a choice. France was warmer then, as was Italy.) No, all of that is to be swept away, as climate is just the last 30 years average weather. Ignore that grapes were grown in southern England for wine, then in later years they had ‘ice fairs’ on the frozen rivers; and swapped to beer since the wine grapes could no longer grow. No, those changes are not to be acknowledged. Just old history. Anecdotes, after all. Not nearly as orderly as the pretty picture of “back radiation” and the CO2 Control Knob.
Yet history doesn’t care. It just is. Similarly, the future doesn’t care. It will be what it will be. The tides will change, the orbits will move, and we WILL plunge into the next Ice Age Glacial. No, we don’t know when. It’s a very slow process. It might not be for another 1500 years, or it might have already started during the Little Ice Age. Yes, it’s that slow.
This ‘warm cycle’ of the 1500 year cycle was not as warm as the last warm peak… The Medieval Warm Period … that was not as warm as The Roman Optimum. Similarly, the Little Ice Age was colder than the Dark Ages, that were colder than the prior Greek Cold Period. We have “lower highs and lower lows”. We just don’t notice them, as each cycle takes 1500 years or so. We ride the roller coaster of natural cyclical changes thinking we matter. We don’t.
So this next cold plunge could be a dramatic one, into a Little Ice Age, or even into the final start of full glaciation and the exit from the “interglacial” we’ve enjoyed for 10,000 years. Or not. We just don’t know. And it likely doesn’t matter. Would a Roman of 600 AD care that in 2000 AD we were paranoid about warming? Or had been paranoid about cooling in 1970? Probably not. Similarly, I’m not particularly worried about Canada being covered in a glacial shield, again, in 1000 vs 2000 years. Not a real problem.
It’s just as broken for me to worry about Canadian Glaciers in a few thousand years as it would be for an ancient Roman to worry about the potential for drought in the Midwest USA in the 1930s. And just as broken to use a time scale of 30 years for a system that needs a thousands-of-years scale to see what nature is doing. We need to stop worrying about things that are irrelevant, while looking at what nature does on the thousands of years scale. We simply do not properly grasp the scale of nature.
It takes 100,000 years for the full glacial coverage of an ice age glacial to build. So that ice, that WILL expand from Greenland to New York City, will be moving about 800 feet per year, in big years. You can out-walk an approaching ice age glacial in one day per year, and with a not all that long a walk, either. So if the “scary scary ice age” is about as exciting as watching paint dry, what are the odds that the other “scary scary” is overblown and imaginary as well? Weather changes fast. Climate not so much. The very way we think about the time scale of “climate change” is broken. It needs a 2000 year baseline at the minimum. With change measured on century scales.
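The out-walking claim is simple unit conversion on the 800 feet per year figure above (the walking pace is my assumed round number):

```python
FEET_PER_YEAR = 800.0         # the post's advance rate for a "big year"
M_PER_FOOT = 0.3048
WALK_SPEED_M_PER_MIN = 80.0   # assumed casual walking pace, ~5 km/h

advance_m_per_year = FEET_PER_YEAR * M_PER_FOOT           # ~244 m/year
walk_minutes = advance_m_per_year / WALK_SPEED_M_PER_MIN  # ~3 minutes
km_per_century = advance_m_per_year * 100 / 1000          # ~24 km/century

print(f"{advance_m_per_year:.0f} m/year; out-walked in ~{walk_minutes:.0f} minutes per year")
print(f"~{km_per_century:.0f} km per century")
```

A few minutes of walking per year, or a couple dozen kilometers per century: that is the pace of change the 30 year "climate" window is being asked to resolve.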
In short, we use static scoring and a ridiculously short time scale to think about a process that simply MUST have very long time scales and dynamic system approaches used instead. We put on the blinders of averaging (that hides things) and only one changing parameter, then are surprised to find that parameter is the only one left to change. We make the time scale very short so we don’t notice that other things changed in the past, and did more.
This particular posting will be a ‘work in progress’ for a while. I need to add pictures and a bunch of links. There’s a half dozen topics I’ve not even put in yet. But I need to put something up now, so that the marker is in the ground and folks who were told to ‘watch for it’ know where to watch.
So think of this as the ‘first cut’, not the last. With that, I need a nice cup of tea and a break…