A Long Journey In Progress

A Modern Adventure Begins

At last, I’ve caught my breath and had a few long naps…

The Story starts a couple of weeks back. On a Monday, I put the spouse on a plane to Chicago. I was to drive out over 5 days or so, taking my time and spending nights in hotels along the way. Didn’t happen.

The Banana Boat (Old Yellow Mercedes Wagon) was in the shop from the last return trip from Florida. Brakes, tune-up, valve seals (it was smoking off the line…), etc. After about $2000 worth it was supposed to be ready to go. I went by to pick it up. It was running rough…

After about a day, it was running OK, but on the drive home that night in the cold air, the rough / sputter returned. 8 AM Tuesday, back at the shop… Eventually they found a couple of vacuum lines that were off / leaking. Fixing that then needed re-timing and resetting the points (it’s a very old gas engine with points and carburettor). OK, finally back home at about 7 PM (4 to 7 spent waiting for Silicon Valley Rush Hour traffic to clear enough to let me go home…). I had a 2 hour nap, then started loading the car.

On The Road Again…

The bottom line is that I was “on the road” at just about midnight, having had about 4 hours sleep the night before (I’d helped a friend home from the airport and picked up the car for that late cold drive before the return the next morning). So, all up, about 6 hours sleep including the nap. The sleep budget becomes ever more important over this trip…

Headed out, snow was “an issue” on the direct route via I-80 and the more indirect route via I-70 to I-80. Open were I-40 and I-10. I chose I-40 despite the 6 hour or so time cost added. Through the long cold night down the Central California Valley, the car was mostly OK, but acted up on some gas that was, IMHO, not quite up to Super Octane spec. Adding octane booster helped, when I could get it. Choosing better gas stations helped too. (Likely the timing is set ‘to spec’ for the 1970s and that was when gas had more octane in the Super… and less alcohol leanness).

Climbing up to Arizona, Flagstaff was cold. Snow along the sides of the road, but the road was clear. It stayed cold across New Mexico and on into Oklahoma. Somewhere around Friday 3 PM I was in Missouri. (Yes, that’s about 36 elapsed hours from departure as a rough guess…) I’d ‘cat napped’ a couple of hours as needed. The cold would wake me up after about an hour+, even with a good cover.

Then the car began to sound like a drag racer.

Inspection showed the exhaust pipe cracked off of the stub out of the header pipes. (There’s a side bar here: I stopped to look, cell phone in hand, then in the car could not find my cell phone. The bluetooth headset still connected, barely, so I thought it was in the car. Later, about 100 miles down the road, I thought maybe I’d left it in the engine compartment. A check showed it nicely held between the air cleaner and the foam liner of the hood… But I’d made a few calls on it by then ;-)

Long story short: Checking with 5 different shops along I-44 in Missouri was no-joy on muffler / pipe repair. One had no welder. One had a welder, but the guy who knew how to use it was off early. 3 more had a welder, and people who could use it, but wanted to go home at 5 PM and were already busy ( 10 minutes for one weld being too much, I guess…)

So I ended up driving into Chicago on a load of noise, earplugs in, and with poor operation due to the wrong backpressure. About 4 hours out I got the call that labor had begun… 45 minutes apart. About midnight I stopped at a ‘park and ride’ in a stiff cold wind and adjusted the points for better operation on that particular backpressure… and somewhere around 2 AM pulled into the garage to unload the stuff I was delivering. Figured that The Kid, being a parent now, ought to have the dozen boxes of his stuff still in ‘his room’… Plus some other bits and gifts.

At this time, you can figure I was not very “focused”… Got to bed about 3:30 AM, and then up about 8:30 AM for “meet and greet”.

In Chicago

Labor turned out to be not quite the real thing. (Don’t know the details). Bottom line is we had a very nice ‘family dinner’ with her folks, his (my son’s) folks (i.e. me and spouse) and some friends. That night I got a decent night’s sleep.

The next day, we went to the Movies. (At least, I think it was the next day, I get a little hazy on the days here ;-) I *think* it was Sunday the 20th. So up about 8 AM, all day doing things, off to The Movies. Star Wars… We watched the whole movie, including credits, then on standing up, her ‘water broke’ and the gals all ran off to the restroom. That was about 9 PM. Off to The Hospital.

Now I was all ready to say “OK, we’ve got time to go home for a nice long nap before anything gets going” as she was 3 cm and “it takes time” from there. But No. Everyone was set to stay “up” as long as it took…

About 24 hours of trying to nap in a hospital waiting room later…

I’d explored their fine cafeteria, had coffee from their in house Starbucks (that closes at 7 pm… unclear on the major use of coffee at night…) and generally found that sleep was not possible in a semi-comfortable chair.

Cutting to the end…

After dilating to 9 cm and starting to push, the fetal heart rate dropped on each push. C-Section time, and they found the umbilical cord around the neck. Fine healthy delivery, but yet more hours in the chairs for us.

At about ‘who knows what’ AM on Dec 22, Miles was born.

Some time after that, I got another full night’s sleep. 2nd in how many days?…

On the 24th, I found a very good exhaust repair place in Chicago (in the Hispanic part, the city is 40% or so Hispanic now) and in about 10 minutes and for $30 had the exhaust welded. Car was once again my friend.

Somewhere around the 24th, they all came home. We had a Christmas event with opening presents and all (and I got to sleep my 3rd? night out of 10 or so…) and all was pretty good.

Saturday was when the spouse was supposed to fly home. We got up at about 5:30 AM (and I think I’d had about 7? hours of sleep) and there was a loud ‘thump’ from the bathroom. The spouse had sat down on the toilet due to feeling dizzy and then, from that low position, fainted.

Another “long story short” after a rapid assessment / differential it was down to ‘something with the pulse’ and both head / stroke issues and any neck issues were ruled out. Doing a heart-attack vs ‘other’ differential on her without an EKG was not possible, and women have variable heart attack symptoms, so it was Ambulance Time.

A nice 4? hours were spent in the ER, where the EKG was normal, the CAT scan showed nothing (ruling out the head injury part of the differential) and the doctor gave her an antihistamine on the presumption it was vertigo related to that (which I didn’t quite follow how…) He did ask if the spouse was ‘anemic’ and she replied that she was, a little. That “a little” ought to have been explored more…

With an official “OK to fly” from the Doctor, it was back to the house and I’m madly carting luggage up and down 3 stories of walk-up stairs. (The other ‘grandma’ had come back from a brief excursion, so I brought her luggage up while taking ours down). At about 3 PM, we headed out for the airport and a bit of city tour.

Spouse was starting to feel a bit dizzy again, and I guessed it might be that the pill she was given was wearing off and we’d not had the prescription filled yet; or that Something Else might be going on. Having Kaiser, it’s a royal pain to be hospitalized out of a Kaiser Area, and it was about a 4 hour car ride back to the hospital, or a 5 hour plane flight to Kaiser Land. We opted for “air transport”.

That was likely a very good thing. At the other end, she was having trouble getting into a wheel chair, so the 2nd ambulance ride of the day followed. (Short form: Gastric or esophageal very tiny bleeder of some sort, that stopped bleeding before they could cauterize it or even find it really.) A couple of days later and with really low red blood count starting to recover, she was out of the hospital and home with friends.

Back On The Road Again, Again

I’d gotten a snack at the airport and pointed the car south.

It was “unclear” if I’d go home or to Florida (to see friends) and phone calls during the trip were to determine that via spouse status, along with weather reports.

Normally I check the Hazards & Warnings map before leaving, but the events of the day had sucked up that time. The prior day had shown Goliath (a storm named by The Weather Channel) heading in, and the prior day’s hazard map showed snow on the northern route, rain in the middle of the mid-west and into the Appalachians, and a sort of clear path due south to I-10 and New Orleans, where a right / left turn decision could be made (though with a distance cost of about 600 miles). I was headed that way, expecting to make a choice after seeing newer weather news at some stop or other (either Truck Stop TV or Starbucks WiFi or…) and a phone status on the spouse.

Well, that didn’t happen.

The Car decided on Florida.

I pointed to the diagonal shortest path through the rain front.

About the end of Illinois, the exhaust noise returned, along with poor operation. Inspection of the pipes showed a different break spot. I likely need to replace the whole pipe, but this one is up near the manifold, so I’ll find out ‘in a few days’ if it is terminal for the car, or just a quick repair. Very few people in Florida work on cars from the 70s. Few of them will work on a Mercedes. I have a “choice of one” at the moment…

After a half dozen ‘tweaks’ at gas stops with mileage as low as 10 MPG, I got the points right and it went back up nearer to 20 something. Smoother operation even with wrong backpressure and a better drive on in to Orlando. It was having marginal / hard starts when cooled down, so letting it go fully cold in a hotel for the night was not an ideal solution, nor waking up everyone IN the hotel when parking / starting, so it was another marathon drive on the cards. Cat naps of very short duration only.

So a quick “run to safety” was the order of the day. Good thing, too, as Goliath not only dumped snow all over the northern routes and dropped temps to the negatives, it also put 10 inches or so down near El Paso, Texas and closed the airport across the freeway in Ciudad Juarez, Mexico (they do get snow in Mexico, just not often enough to close the airport).

So with flooding rain filling in behind me, and snow / freeze / high wind blocking the approach to the West Coast, and a car on the margin, Florida it was to be.

Driving through the night and all the following day had me go through the frontal zone in / near the Appalachians in Kentucky or Tennessee. It was night, and dark and raining. Then it turned to “Rain Of Biblical Proportions”. Now I’m very experienced in rain and fog. I grew up in “tule fog” in the Central Valley. I’m not bashful about doing 30 mph while only able to see the tail lights in front of me (NOT the vehicle) and bits of the line next to me. This was like that for about 2 miles, then we were suddenly crossing a bridge that I could barely see (steel work to the side of me and above) and it got worse…

Crawling at about 10? MPH, an exit sign showed up on the other side of the bridge. The truck in front of me headed that way, to my relief, as I was not looking forward to taking that exit without a pilot car… I curled around and entered a gas station, stopping under the overhang. I stayed in the car for a good 10 minutes waiting for the rain to let up. Yes, I was under the overhang, but there was significant wind spray and an inch or so of ‘runoff’ crossing the pad.

After a nice 1/2 hour at the gas station, breakfast of a homemade-style biscuit with real ham chunks on it (clearly a rural operation ;-) and a full tank, it was back onto the road in much lighter rain as the front was past (and headed to behind me).

After an all-day Sunday drive, I arrived in Florida at just before midnight. For those keeping track, that’s roughly 36 hours (some day I’ll look at receipts and things and adjust for time zones and…)

My third Marathon No-Sleep-Just-Nap in 2 weeks. At most, 4 nights sleep spread over 14 or so days.

The Florida Haven

Needless to say, I’ve been sleeping a lot since then and I’m exceedingly grateful to My Florida Friend for providing a guest room and parking space. The weather here has been great, BTW. He has been fuming a bit about the weather reporter (Weather Channel?) saying folks were “braving the record heat” here. It was about 84 F that day, and perfect. Folks in shorts playing outside, washing cars, having a great time.

Beats the pants off of snow and sleet in Chicago.

Beats the pants off of “below zero” in the Rockies.

Beats the pants off “rain of biblical proportions” in the Midwest Cornbelt and Mississippi Drainage basin.

Beats the pants off “feet of snow and interstate closed” in the passes between east and west and on down into Mexico.

Frankly, it’s been “incredibly perfect” and I’ll happily “brave” it any time ;-)

Though today is back to a more regular winter pattern with about 60s to low 70s and overcast. No pool time for me today ;-) as it is solar heated and not warm on days like this. Not cold either, but I like it warm… Maybe we’ll do the hot tub instead ;-)

Oh, and a bit later I’ll find and put up the link to the article about how when the Gulf Stream slows down and cooling is happening (and Europe going into a Little Ice Age or a real one…) Florida gets more “summer pattern” weather. That it’s been warmer in Florida than the last 30 years is direct evidence for a cold regime starting. (Peer reviewed paper, too… with pollen from lake sediments and such.)

The Future Is Cloudy

After finding out I have a “choice of one” to fix the car and he’s on vacation until next week, I decided not to drive to California just now. The Florida Friend is lending me some garage space, and I’m renting a room here.

Flights back were screwed up by the storm, and they were 100% booked on my airlines of choice for a good 5 days with “get out of town after the holiday” traffic and rebookings post storm and airplane repositions and… I’ve booked a flight for 6 days out…

Some time after I’ve caught things up on that end, it will be a return trip here “for a while” to sort out more things, decide what to do about cars among them, and look for a ‘gig’ here. I’m having increased allergic reactions to “something” on the California end, and it looks like being “bi-coastal” is tilted toward “Florida First”. That’s a very large ‘chunk of work’ to get done, so expect a long slow slog through it. “Someday”. (Right now I’m just too drained to ponder it much…)

In Conclusion / Starting Here Forward

So that’s why postings have been ‘slim at best’. Frankly, I’m a bit surprised / impressed I’m still able to do that kind of “marathon of marathons”, but I’d really rather never do it again.

Next time I’m flying to Chicago and any remaining “stuff” can go UPS ;-)

Either that, or I need a newer car ;-)

Hopefully I’ll get the Florida Stuff done (car needs registration renewal, for one example; it’s the Florida Car and is supposed to be here anyway and has Florida tags) and have enough time left over to both pack for the flight AND do some R&D / Postings.

I do have a lot ‘in queue’, just still feeling a bit ‘thrashed and had’ when thinking about spending 4 to 6 hours at a desk / keyboard instead of with a beer by the pool or a bag of chips dozing on the sofa…

Yet energy levels are almost back up to normal. So it is time to ‘start loading up the processor’ again.

Thank you all for ‘hanging in there’ and keeping yourselves entertained with comments during this time. I’ve looked at some of the ‘tips’ links, for example. Interesting stuff.

With that, I’m going to grab some chips and dip and hit the couch for some Big Screen American Football and ponder more “braving” of the weather here… I need to find my pool shoes and suit…



Tech Bits Grab Bag

In keeping with the prior Global Warming Grab Bag, this is also a list of some accumulated links without too much evaluation / commentary.

They are things of interest, but where I’ve not gotten around to doing a full posting on them, so might as well just put up the links and let folks read them as desired without my nattering in the way.

Parallel Programming ‘Start Here!’

This site has all sorts of interesting stuff across many pages; I’ll just post a link to one of them. This page is a basic introduction to doing parallel computer programming in C.


Why Parallel Programming?

Simply put, because it may speed up your code. Unlike 10 years ago, today your computer (and probably even your smartphone) has one or more CPUs with multiple processing cores (a multi-core processor). This helps with desktop computing tasks like multitasking (running multiple programs, plus the operating system, simultaneously). For scientific computing, this means you have the ability, in principle, to split up your computations into groups and run each group on its own processor.

Most operating systems have a utility that allows you to visualize processor usage in real-time. Mac OSX has “Activity Monitor”, Gnome/GNU Linux has “gnome-system-monitor” and Windows has … well actually I have no idea, you’re on your own with that one. Fire it up, and run a computationally intensive program you have written, and what you will probably see is that you have a lot of computational power that is sleeping. Parallel programming allows you in principle to take advantage of all that dormant power.

Kinds of Parallel Programming

There are many flavours of parallel programming, some that are general and can be run on any hardware, and others that are specific to particular hardware architectures.

Two main paradigms we can talk about here are shared memory versus distributed memory models. In shared memory models, multiple processing units all have access to the same, shared memory space. This is the case on your desktop or laptop with multiple CPU cores. In a distributed memory model, multiple processing units each have their own memory store, and information is passed between them. This is the model that a networked cluster of computers operates with. A computer cluster is a collection of standalone computers that are connected to each other over a network, and are used together as a single system. We won’t be talking about clusters here, but some of the tools we’ll talk about (e.g. MPI) are easily used with clusters.

Types of parallel tasks

Broadly speaking, we can separate computations into two camps depending on how they can be parallelized. A so-called embarrassingly parallel problem is one that is dead easy to separate into some number of independent tasks that may then be run in parallel.

Embarrassingly parallel problems

Embarrassingly parallel computational problems are the easiest to parallelize and you can achieve impressive speedups if you have a computer with many cores. Even if you have just two cores, you can get close to a two-times speedup. An example of an embarrassingly parallel problem is when you need to run a preprocessing pipeline on datasets collected for 15 subjects. Each subject’s data can be processed independently of the others. In other words, the computations involved in processing one subject’s data do not in any way depend on the results of the computations for processing some other subject’s data.

As an example, a grad student in my lab (Heather) figured out how to distribute her FSL preprocessing pipeline for 24 fMRI subjects across multiple cores on her Mac Pro desktop (it has 8) and as a result what used to take about 48 hours to run, now takes “just” over 6 hours.

Tools for Parallel Programming

The threads model of parallel programming is one in which a single process (a single program) can spawn multiple, concurrent “threads” (sub-programs). Each thread runs independently of the others, although they can all access the same shared memory space (and hence they can communicate with each other if necessary). Threads can be spawned and killed as required, by the main program.

A challenge of using threads is the issue of collisions and race conditions, which can be addressed using synchronization. If multiple threads write to (and depend upon) a shared memory variable, then care must be taken to make sure that multiple threads don’t try to write to the same location simultaneously. The Wikipedia page for race condition has a nice description (and an example) of how this can be a problem. There are mechanisms when using threads to implement synchronization, and to implement mutual exclusivity (mutex variables) so that shared variables can be locked by one thread and then released, preventing collisions by other threads. These mechanisms ensure threads must “take turns” when accessing protected data.

POSIX Threads (Pthreads)

POSIX Threads (Pthreads for short) is a standard for programming with threads, and defines a set of C types, functions and constants.

More generally, threads are a way that a program can spawn concurrent units of processing that can then be delegated by the operating system to multiple processing cores. Clearly the advantage of a multithreaded program (one that uses multiple threads that are assigned to multiple processing cores) is that you can achieve big speedups, as all cores of your CPU (and all CPUs if you have more than one) are used at the same time.

OpenMP is an API that implements a multi-threaded, shared memory form of parallelism. It uses a set of compiler directives (statements that you add to your C code) that are incorporated at compile-time to generate a multi-threaded version of your code. You can think of Pthreads (above) as doing multi-threaded programming “by hand”, and OpenMP as a slightly more automated, higher-level API to make your program multithreaded. OpenMP takes care of many of the low-level details that you would normally have to implement yourself, if you were using Pthreads from the ground up.

The Message Passing Interface (MPI) is a standard defining core syntax and semantics of library routines that can be used to implement parallel programming in C (and in other languages as well). There are several implementations of MPI such as Open MPI, MPICH2 and LAM/MPI.

In the context of this tutorial, you can think of MPI, in terms of its complexity, scope and control, as sitting in between programming with Pthreads, and using a high-level API such as OpenMP.

The MPI interface allows you to manage allocation, communication, and synchronization of a set of processes that are mapped onto multiple nodes, where each node can be a core within a single CPU, or CPUs within a single machine, or even across multiple machines (as long as they are networked together).

One context where MPI shines in particular is the ability to easily take advantage not just of multiple cores on a single machine, but to run programs on clusters of several machines. Even if you don’t have a dedicated cluster, you could still write a program using MPI that could run your program in parallel, across any collection of computers, as long as they are networked together. Just make sure to ask permission before you load up your lab-mate’s computer’s CPU(s) with your computational tasks!

And there is even some more on using the GPU (graphics processor) for general-purpose computing.

Looks like a great place for folks who are interested to start down the path of parallel computing. (And / or it explains some of my babbling here ;-)

GISS Good Programming Guide

Who knew… They generally have good advice, but leave out a lot of stuff I learned in my FORTRAN class about ‘defensive programming’ and how to avoid various kinds of insidious errors. Still, it’s nice to see that they are making an effort.

For those not big into FORTRAN programming practices, you can take a look at the general kinds of comments about how things can vary from one machine to another (i.e. not work right if you don’t use a different set of code…) and contemplate just how many ways, in all that complexity, you could have a small error that screws it all up in an insidious, not-so-obvious way… and how much effort it would take to validate everything… and how much of that has NOT been done…


This ‘cut/paste’ will likely lose all the pretty formatting, but it’s just to document what’s at the other end of the link for historical preservation against changes. Hit the link to read it in a nicer format.

Dr. Gavin A. Schmidt
The GISS GCM Good Programming Guide

Version 1.0 February 1999
Gavin Schmidt, Jeff Jonas, Jean Lerner & Ruedy Reto

The GISS GCM (and all attendant offshoots) has developed into a large distributed effort. This guide is an effort to help integrate GISS-wide good coding practices that improve the efficiency of the code, make it more transparent and hopefully (in the long term) lead to some degree of homogenization.

This guide will be split into a number of sections. Firstly, we will highlight some of the common examples of ‘bad’ code and indicate some better alternatives. These should become common sense programming habits. The next section deals with the ways of getting rid of unnecessary GO TO statements, and discusses more structured approaches. We then outline some of the more useful elements of FORTRAN 90 which can be used to enhance the readability of the code. The first appendix describes tools that are under-utilised but can be very helpful. The other appendix contains some of the issues that arise in porting the model code to the SGI machines. Examples are given where we feel the correct behavior is not obvious.

This is not intended to be a comprehensive guide to FORTRAN (or to the GISS GCM). It is intended only to highlight some of the more common problems that occur in the model and should be seen as a useful reference for the programmers and scientists in the building.

Please let us know about any features that could be added to this document. They will appear in the next version (contact gavin@giss.nasa.gov).

What to definitely avoid

* Do not calculate invariant expressions at every time step. Put those calculations whose answer never changes in an IFIRST section and calculate them once only at the start of every run. Note that on the SGI machines, code segments containing an IFIRST section must be compiled with the -static option in order to save the local variables. However, unnecessary compilation with -static can add a lot of overhead, slowing down the code. For instance, QUSBxxx routines do not require any saving of local variables and hence need not be compiled with this option. As a better alternative, all needed local variables should be declared in a SAVE statement and the -static option never used. Care must be taken to make sure that all such variables have been identified before switching over.

* Avoid calculating expressions that do not depend on the loop variable inside the loop. Move such independent code outside the loop.

* Do not divide! Division takes many times longer than multiplication and so if you are dividing by the same denominator more than once, calculate the reciprocal and multiply by that. Sometimes the compiler can do this for you, but this cannot be relied upon.

* Organise your loops and arrays so that the innermost loop is for the first index. For instance, in an I,J,L loop over T(I,J,L). L should be the outermost loop, followed by J, followed by I. Doing it any other way is highly inefficient. If your subroutine processes things by (I,J) grid box, and performs calculations in the vertical (such as MSTCNV, CONDSE etc), write the internal arrays with L as the first index.

The most egregious example of bad looping occurs right at the start of subroutine DYNAM,

      DO 310 J=1,JM
      DO 305 L=1,LM*3+1
      DO 305 I=1,IM
  305 UT(I,J,L)=U(I,J,L)
      DO 310 L=1,LM
      DO 310 I=1,IM
  310 QT(I,J,L)=Q(I,J,L)

On the face of it, there’s nothing wrong. It works fine. However, if rewritten properly the model actually runs noticeably faster. What happened here was that the code probably started out correct, just processing U and UT. Then someone must have added QT and decided that they could save some coding by rearranging. (This is theory; the oldest code that could be found was from 1985, and that has the error. Unfortunately, MB112M9.S can’t be fixed because there are too many .U files to check for conflicts.) Anyhow, a couple of years ago Jean went through all the dynamics routines for the 31-layer model and rewrote all the inside-out loops. It ran 15-18% faster when finished.

As a general rule, common blocks should not be initialised like this since it is not transparent. See the example in the FORTRAN 90 section for the preferred method.

* Avoid exponentiation (i.e. A**5). All polynomials should be calculated using Horner’s Rule (i.e. A*X**3 + B*X**2 + C*X + D should be coded as D + X*(C + X*(B + X*A))). Factorize as much as possible.

* Calls to mathematical functions ( log, sin, cos, etc.) should be minimised. These are expensive. If repeated calculations are needed, calculate them once and then save the result.

* Use constants for DO loops (not variables). Prior to the PARAMETER statement (in FORTRAN 77), IM, JM, LM, and related quantities were in common blocks. Now, IM, JM, and LM are defined as parameters, and replaced by placeholders (IM0, JM0, LM0) in the common block. When loops use constants, the loops are set up by the compiler, instead of on the fly at execution time. Quantities in common blocks are treated as variables, not constants, by the compiler, so there is overhead setting up the loop. So DO loops will run faster with IM and JM as loop limits, rather than IM0, JM0.

However, there are still people using JMM1 and LMM1 (variables). Instead, JM-1 and LM-1 should be used (which could not be done pre-FORTRAN 77). These are constants and are treated as such by the compiler. As a general point, all constants should be declared in a parameter statement.

Further information on optimization is available in the “XL Fortran: Optimization Guide” handbook or the IRIX Fortran manual.

Unstructured habits

Much of the GCM code (and some of the programmers!) dates back to the punch-card days of yore. In contrast to the current fashion for 1960’s nostalgia, we believe that the code really should reflect some of the advances made by FORTRAN 77 (and maybe even FORTRAN 90!). Many of the most confusing examples are driven by a desire to write compact code. While this was an issue with punch-cards, it is no longer so important. However, if the desire for compact code is still present, we recommend you use the new features of FORTRAN 90 which allow extremely compact code to be written in clear and unambiguous ways (see next section).

* GOTO bad, IF (..) THEN good. Prior to the introduction of BLOCK IF structures (and now DO…WHILE loops) huge numbers of GOTO statements were needed. However, there is a tendency for GOTO structures to become very spaghetti-like, and it becomes very difficult to follow the logic of the code. Replacing GOTO with any of the other structures makes it much clearer for the reader (and for the compiler).

* Initialisation across a common block using only one array is a bad habit. If the common block changes or is rearranged, this can cause serious confusion. See the example in the FORTRAN 90 section for a clearer way of doing this efficiently. (Also see last comment in this section)

* Obsolete code should be deleted. Use a new version number for new code (do not just call it ‘new’ or ‘latest’). If you need a record, you can always look at the previous version stored on Ipcc2 or Ra. Commenting it out only leads to confusion. Leaving it in, and not using it, is inefficient and confusing to the next generation of programmers.

* Over-use of equivalence statements. While equivalence statements are useful for writing compact code, their use should really be restricted to operations that are applied uniformly over the equivalenced arrays (such as initialisation, input/output, etc.)
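A sketch of the acceptable usage (array names and dimensions are made up for illustration): the equivalenced work array is touched only by an operation applied uniformly to every element, here zeroing:

```fortran
C Diagnostics arrays aliased onto one work array, which is used
C ONLY for uniform operations such as zeroing or I/O
      REAL*8 AJ(46,80), AIJ(72,46,120)
      REAL*8 DIAG(46*80 + 72*46*120)
      EQUIVALENCE (DIAG(1),AJ), (DIAG(46*80+1),AIJ)
      DO 10 N=1,46*80+72*46*120
        DIAG(N) = 0.
   10 CONTINUE
```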

* IF … GOTO branches out of a DO loop. While this is supported by our compilers and is technically correct, it is not a recommended feature of standard FORTRAN. Problems have occurred with compilers/optimizers that did not treat it correctly. In particular, if a loop variable is used outside the loop subsequent to such a branch, this sometimes gave incorrect results. This practice should be avoided if possible. More generally, these branches seriously inhibit proper optimization (since loop structures cannot be changed). Use the DO..WHILE construction instead (see next section).
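The hazard can be seen in a sketch like the following (names hypothetical): after the branch to 110, the value of the loop variable L is not guaranteed by the standard, and an optimizer that has transformed the loop may leave something unexpected in it.

```fortran
C Risky: branch out of the loop, then rely on the loop index
      DO 100 L=1,LM
        IF (P(L).LT.PTOP) GOTO 110
  100 CONTINUE
  110 LTOP = L     ! value of L here is not well defined
```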

* Do not use active code lines as continue statements. (This makes Jean CRAZY!) Constructs like this

do 110 i=1,im
b = something complicated
110 a = something complicated depending on b

are not actually wrong, but they are very awkward if someone wants to do some debugging or add an additional calculation. Then, instead of inserting a statement, they also have to rewrite:

do 110 i=1,im
b = something complicated
a = something complicated depending on b
c = something depending on a
110 continue

This is a pain in the neck!!! It also makes it harder to see what’s going on when you do a diff on two files, because it looks like two things were changed, not one. Please use CONTINUE or END DO statements. (In fact with DO … END DO, you can dispense with line numbers entirely).
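The same loop in the recommended DO … END DO form needs no statement labels at all, and inserting a new line is trivial:

```fortran
      DO I=1,IM
        B = something complicated
        A = something complicated depending on B
      END DO
```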

* COMMON block cautions. Two rules of clean programming that are frequently violated in the GCM are i) out-of-bounds array referencing (as alluded to above), and ii) declaring a common block to be different sizes in different routines. While neither of these is strictly illegal, they do cause problems with optimisers/compilers. For instance, on the SGI the compiler option -OPT:reorg_common is on by default. This pads out common blocks to make them more efficient (by making sure arrays are on the same page of memory, for instance). This is now turned off in setup. Other problems can also occur. For instance, when the compiler encounters common blocks of different sizes it will select the largest, except when the common block is initialised via a BLOCK DATA section, even if there are larger references elsewhere. If we reduce our dependence on these kinds of violations, future problems are likely to be minimised.

Common blocks that are used in sub-modules (such as the PBL or tracer codes) are sometimes required in MAIN, INPUT, etc. It is much more straightforward to put these in separate (named) files which are then INCLUDE-d in the code. That way revisions and changes can be quickly accommodated.
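For example (the file and block names here are hypothetical), the common block lives in one file and every routine that needs it INCLUDEs that file, so a revision is made in exactly one place:

```fortran
C --- file PBLCOM: the single definition of the block ---
      COMMON /PBLPAR/ USURF(72,46), VSURF(72,46), TGRND(72,46)

C --- in MAIN, INPUT, and the PBL routines ---
      INCLUDE 'PBLCOM'
```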

New structures from FORTRAN 90

Many of the new features in FORTRAN 90 significantly reduce the amount of coding needed and make the code much more readable. There is much more to FORTRAN 90 than we can do justice to here, but we particularly want to highlight the array processing facilities. To use FORTRAN 90 constructs, you need only change f77 to f90 in your compilation scripts. There is one major caveat: the GCM II’ as a whole does not run if compiled with f90. Some parts of it (the radiation?) cannot be compiled, but others have included f90 features with no problems. Gary’s model can be run with f90. Please ensure that any routines you modify will compile with f90 prior to including any new constructs. Please report any problems (and solutions!) you uncover.

* Array arithmetic. In FORTRAN 90, arrays can be used directly in an expression (as long as the arrays are conformable) without having to loop over the indexes. The example discussed above can be compactly written as below, with the compiler deciding the most efficient way to loop over the variables.
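The example referred to did not survive in this copy; as a stand-in, a minimal sketch of whole-array arithmetic (array names and sizes hypothetical):

```fortran
      PARAMETER (IM=72, JM=46)
      REAL*8 P(IM,JM), PB(IM,JM), DP(IM,JM)
C No DO loops needed: the arrays are conformable
      DP = P - PB            ! element-by-element subtraction
      PB = PB + 0.5D0*DP     ! scalars broadcast over the whole array
      P  = 0.                ! set every element at once
```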


Another example is when you wish to set only a limited number of indexes, or ranges of an index. Here : denotes the entire range of an index, while 2:7, for instance, denotes just the range 2 through 7 (inclusive). Commonly, for example, the polar boxes are set to be the same, equal to the value at I=1. This could be written:

P(2:IM,JM) = P(1,JM)

* New constructs: CASE and DO…WHILE. The CASE construct allows you to branch to any number of sub-sections. It is similar to an arithmetic IF or a computed GOTO, but is much more flexible. For instance,

SELECT CASE (ITYPE)
CASE (1) ! ITYPE 1
CASE (2:4) ! ITYPE 2,3 or 4
CASE DEFAULT ! ITYPE anything else
PRINT*,"ITYPE must be 1,2,3 or 4"
END SELECT

The DO…WHILE construct can replace convoluted constructions involving IF and GOTO statements. In particular, it should definitely replace constructs in which IF statements branch out of a DO loop. For example,
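The example that follows is truncated in this copy of the document; a minimal sketch of the intended pattern (the condition and names are guesses, not the original) would be:

```fortran
C Scan down a column until Q falls below a threshold, no GOTOs
      L = 1
      DO WHILE (Q(L).GT.QMIN .AND. L.LT.LM)
        L = L + 1
      END DO
```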

DO WHILE (Q(1) …

compile options:
f77 … -c xyz.f (-> xyz.o) or
f77 -64 -static -O3 -c xyz.f (-> xyz.o)
for more speed:
f77 -64 -mips4 -static -O3 -OPT:fold_arith_limit=1409
-O2 is almost as fast and safer than -O3 (optimization level)
link options:
-mips4 -lfastm (uses library of fast math routines)
5. Differences in FORTRAN (missing/obsolete features etc):
* READ(…,NUM=nbytes) is unavailable
* BLOCK DATA have to be named
* ‘open’ cannot be the name of a COMMON BLOCK
* T,F cannot be used to initialize logical variables, unless …
* Use STOP rather than RETURN in MAIN.
* The function ERFC only takes real*4 args, use DERFC for real*8 args.
* The compiler warns if you use CALL SUBR(A,..) where A is a variable whereas A is an array in the subroutine (using of course only A(1))
* If the arrays and their dimensions are passed to a subroutine, the dimensions have to be passed also in each entry that passes the arrays (the original HNTRP does not work on the SGIs)
* The following construction does NOT work (with or w/o -static)

(no v1=… statement)
ENTRY XYZ (v1 is NOT an argument)
expression involving v1 (result unpredictable if reached after CALL XYZ)

Quick (but ugly) fix:


6. NAMELIST READ(‘string’,NML=…) is unavailable; namelists have to be read from external files: back to the 66-version (using unit 8 instead of 14)
7. System calls to save rsf, acc files did not work reliably. Use simple OPEN instead as in model II’ ( MB112M9.S).
8. Model-specific changes:
1. remove all READ(…,NUM=…) instances (INPUT, readt in UTIL…)
2. replace internal NAMELIST reads by reads from unit 8 ( INPUT,2x)
3. name all BLOCK DATA (diagnostics, rad.routines, input,soils)
4. replace RANDU-subroutine (add RANDSGI)
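As an illustration of item 3 (block and variable names hypothetical), the fix is simply to give each BLOCK DATA unit a name:

```fortran
C SGI f77 rejects unnamed BLOCK DATA; name each one
      BLOCK DATA RADPAR
      COMMON /RADCOM/ S0, PTOP
      DATA S0/1367./, PTOP/150./
      END
```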

Data Sources

A nice collection of places to find data:


Not much to say about it, really. The site “is what it is”, but looks to have collected in one place a nice set of pointers to sources of both data and processing (i.e. model codes). Oh, and it also has links to the “pasteurized homogenized data food products” as well… ;-)

In Conclusion

So between those two you could make a passable start at getting the GCM and other Climate Codes, along with the data, and getting them running on the parallel system of your choice / budget. Be it a COW (Collection Of Workstations) dynamically assembled via boot-from-CD-parallel-cluster-Linux or via an NVIDIA board (and a lot of coding…)

While I personally think the models are complete tosh, based on mistaken assumptions about cause and effect, and ignoring way too much the key drivers of clouds, cosmic rays, solar UV variation, lunar tidal ocean modulated oscillations, and other natural variation, they might be a suitable base for adding in those things (while taking the role of CO2 way, way down in them…)

Something for “another day”…

Subscribe to feed

Posted in AGW Science and Background, Tech Bits | Tagged , , | 13 Comments

Tips – December 2015

Since WordPress has decided that comments on Pages, like the Tips pages, don’t show up in recent comments, it kind of breaks the value of it for me. In response, I’m shifting from a set of “pages” to a set of “postings”. As any given Tips Posting disappears or gets full, I’ll add a new one. That will restore the broken function.

I will be keeping the same general format, with the T page still pointing to both the archive of Tips Pages as well as the series of new Postings. With that, back to the Tips boiler plate:

This is an “overflow” posting from prior Tips pages as they had gotten so large it was taking a long time to load. Same idea, just a new set of space to put pointers to things of interest. The most immediately preceding Tips posting is: https://chiefio.wordpress.com/2015/10/08/tips-october-2015/.

The generic “T” parent page remains up top, where older copies of the various “Tips” pages can be found archived. I have also added a “Tips” category (see list at right) and will be marking Tips postings with that for easy location.

While I’m mostly interested in things having to do with:

Making money, usually via trading
Weather and climate
Quakes, Volcanoes, and other Earth Sciences
Current economic and political events
(often as those last three have impact on the first one…)
And just about any ‘way cool’ interesting science or technology

If something else is interesting to you, put a “tip” here.

You can also look at the list of “Categories” on the right hand side and get an idea of any other broad area of interest.

This ought not to be seen as a “limit” on what is “interesting”, more as a “focus list” with other things that are interesting being fair game as well.


Posted in Tips | Tagged , | 93 Comments

Some Odds and Ends

I tend to “collect links” and find that dozens of them never make it into a posting. So I’m going to just put up a few bits here and now. Not a lot of analysis, just some links worth reading and a few small remarks. There isn’t a strong theme to this posting, just stuff that caught my eye at some point.

D.O. Events and 1470 vs 1800 year periods

First up, on D-O events and the ‘rapid warm then slow cooling’ cycle. It comes around about every 1500 years, be it ice-age glacial or not. That isn’t an ‘internal oscillation’ as it has continued despite dramatic shifts in the structure of the oceans and winds and ice. IMHO, likely orbital mechanics with a large lunar component, but perhaps some solar input as well. As I’ve pointed out many times,

IMHO, I think the following link has an important contribution to sorting out why the cycle is 1470 years and not 1800 years (as simple lunar tidal strength would indicate). The author ‘tosses rocks’ at me for no particular reason in that he asserts I’ve not made the ‘paradigm shift’. I comment back that I think it’s a nice advance, but just don’t see the paradigm shift aspect. Somehow he doesn’t like that… I do think it is a nice advance. Adds “what season” to “lunar tidal” and finds that when the tidal forces x season strength product is maximal is about every 1470 years. ( I still don’t see that as a paradigm shift, just a very very nice advance… but it’s not my ego on the line. Rather like my pointing out that you get a roughly 54 – 60 year cycle out of matching when the 18.x year lunar tidal cycle lines up with the same ocean as the ‘repeat’ is about 1/3 of an Earth rotation each 18.x year cycle around the planet. It’s a ‘nice thing to notice’ but not a paradigm shift.)

So bottom line is that not only does the Moon Matter, but The State Of The Earth matters. It’s the interaction of the two. (AND that is influenced / controlled by the gas giants and orbital resonance so it all arrives in sync with solar changes too… as to which is the biggest and which the smallest, well, nobody knows and it likely is only of academic interest as ‘they all go together when they go’ so for all practical purposes any of them is an indicator of what to expect when).

With that, the link worth reading:


How do the phases of the Moon re-synchronize
with the 177.0 year Perigee-Perihelion Cycle?

§ When the Perigee of the Lunar Orbit is pointing at the Sun at (or very near to) Perihelion it does not necessarily mean that the phase of the Moon is either New or Full (Syzygy).

§ The next slide shows the number of days that the phase of the Moon is from being New or Full, for each of the FMC’s that are at (or near to) Perihelion. The graph starts out with a New Moon at Perigee on January 1st (near to Perihelion on January 3rd) in the year 0.00.

§ New or Full Moons that re-occur for FMC’s at (or near to) Perihelion that are whole multiples of 739 years (i.e. 0.0, 739.0, 1478.0 and 2217.0 years) after the starting date, always occur at lunar Perigee.

§ In contrast, New and Full Moons that re-occur for FMCs at (or near to) Perihelion half way between whole multiple of 739 years (i.e. 370, 1109 and 1848 years) always occur at lunar Apogee.

§ Hence, we end up with the following 739.0 year repetition sequence for the times where FMC’s are at Perihelion:

0.00 Years → New or Full Moon at Perigee
184.75 Years → First or Last Quarter Moon
369.50 Years → New or Full Moon at Apogee
554.25 Years → First or Last Quarter Moon
739.00 Years → New or Full Moon at Perigee

§ Careful study of the New and Full Moons near 739.0 years shows that the strongest alignment between the phases of the Moon and the 177.0 year Perigee-Perihelion cycle occurs at the FULL MOON at 739.001 years. This contrasts with the NEW MOON at 0.000 years.

§ What this is telling us is that it actually takes 1478.00 years (= 2 x 739.00 years) to complete the cycle with a New Moon at Perigee when a FMC is close to Perihelion once again.

§ The FMC cycle is closest to perihelion at ((1447+1478)/2) years = 1462.5 years, while the lunar phases are most closely aligned with the Perigee-Perihelion cycle at 1478 years – producing a best synchronization at roughly (1478+1462.5)/2 = 1470.3 years.

§This is in extremely good agreement with the measured spacing of the D-O climate warming events of 1470 years!

Down in comments on that page you can see the interesting 1/2 argument where he complains about me and I say “I’m not disagreeing”… but for some folks ‘not disagreeing’ is the same as rejection, I guess…

chiefio October 12, 2013 at 3:06 PM


Um, not seeing the need for a ‘paradigm shift’. While I like your analysis, I’m mostly just quoting other folks works and trying to line them up with the known data.

Some of the better stuff points out that the 1470 is just an average and that actual D.O. / Bond Events have spacing offset to either side of that point. Were I to speculate, I’d speculate that often your analysis is what happens, but also that sometimes a cycle is offset toward the 1200 or 1800 year ends of things as some alignments have “slipped” too far from sync at that point. Then it all gets back toward the closer 1470 once that process gets resynced. In essence, that the times when “almost the same” have drifted out of sync, you get a ‘skip beat’ and the event is either closer or further out in time. Then the drift slowly puts them back in sync at 1470 (ish).

FWIW, I especially like your 739 year finding. I’ve found a similar “half Bond event” cycle in historical records. Not a full on Bond event, but “something happens” that’s not very good…

You might want to look at the exact dates of some of the older D.O. and Bond Events and note how they vary from the 1470 average.

Ninderthana October 13, 2013 at 2:32 AM


Thanks for your comments. I think we will have to agree to disagree on this one. I believe that you are discounting a possible explanation before it has properly been tested.

The real world lunar cycles are not precisely 1470 years. The quoted figures in this post are not meant to be taken as representative of what exactly happens in the real world – only what happens as a long term average.

The complex nature of the lunar orbit means that a high quality ephemeris needs to be used to see how specific DO events align with lunar tidal events.

At this stage, I do not have the time to followup this possible avenue of research but others are welcome to give it a try if they feel it is worth it.

chiefio October 12, 2014 at 5:54 AM

Um, I am not ‘disagreeing’. Just suggested a refinement or two (and didn’t see the need to call your observation a paradigm shift… important step forward is likely a better fit).

I have pointed out that the Earth’s rotation matters (which ocean is under the tidal bulge repeats over about 3 of the 18.x year lunar cycles), and adding seasons is likely as or more important.

I see all of it as mattering, with how much not yet known. Seasons could easily be very important, with Arctic ice breakup and flushing as the means by which the tides shift things.

At any rate, I think your observation needs integration to the general understanding, though I still am working on fitting it in (if slowly :-)

I’m also pretty sure the spacing of actual events is not right on 1470 yrs and the difference could lead to further understanding. The variable spacing is visible in the red tick marks on the DO graph above.

So is that variance an error (so what can be learned is that the dating is dodgy) or material (so what can be learned is another minor influence)?

In either case, I don’t see that as a disagreement with this work. More of a next step, perhaps.

I suppose he might have seen it as ‘damning with faint praise’ but it wasn’t. More of a “nice, not earth shaking, but very nice”. I do think he’s onto something, and that it sorts out that 1800 vs 1470 issue. (BUT, I also think there’s an 18.x and maybe a 54-60 ish year ‘jitter’ in exact DO Events that can be tied to that ‘skip beat’ and slippage issue as things ‘cog’ out of alignment and slip back in).

Still, worth a read and a lot of thinking about. Who knows, maybe with some more thinking and fitting to history it WILL be a ‘paradigm shift’…

Solar Status

A nice link to as “State Of The Sun” page:


Not much to say about it other than that there is a lot of good information collected in one place along with some historical cycle data.

Little Ice Age

This link starts out with the Obligatory Kiss The Ring Catechism to Global Warming, but does have some nice history on the L.I.A. in Europe:


Note to general public:

My position on the current global warming is the same as the overwhelming majority of international climate scientists: the current rate of global warming is unprecedented and is being caused by humans. In no way can my summary of the research regarding the impact of regional climate change on the Viking civilization and Europe during the Little Ice Age be used to “prove” the current global warming is due to a natural cycle.

Please view Global Warming: Man or Myth which addresses many of the questions asked about the human impact on the current climate change in a very simple format. The climate change being observed today is unprecedented in modern times and can only be explained by the rapid increase of greenhouse gases by human activities. There are no known natural forces that could have caused the modern climate change.

I find it telling that he finds it necessary to put an up front disclaimer and swearing to The Ring before he can state the simple history of the cold Little Ice Ages… Oh Well, need to hold off the Papal Guards I guess… (Especially now that The Pope is on-board with the whole Weather Guilt Trip thing…)

Weather History

This is a very nice site that collects old weather reported by folks long long ago. Nice for looking up actual ‘how bad it was’ reports:


I’ve given the link to 500 A.D., but you can go forward and back from there as you like it. Here’s a ‘taster’:

~ AD 500 By this time, the storminess of the latter part of the 5th Century (q.v.) had ‘re-arranged’ some coastal alignment in East Anglia. A sea-level rise noted, BUT, Lamb considers that this may have more to do with reporting of increased frequency of inland storm-driven surges, rather than a general world-wide sea level rise. Also note that evidence of significant rise in peat bog deposits by or around this time: therefore implies greater ‘wetness’ (and presumably cyclonicity).

AD 508 Possible severe winter. Rivers frozen for two months. Years also quoted as 507 or 509.

AD 520 Major storm surge in Cardigan Bay.

AD 525 Possible severe winter. Thames frozen for 6 weeks.

Nordic Science

An interesting place where the Science of the Nordic folk can be reported. If anyone cares about ice and cold, it’s them! ;-)


1.4 billion years old forces are causing climate change today
March 23, 2015 – 06:25

Scientists have found evidence that the same natural forces that are causing climate changes today made the climate turbulent 1.4 billion years ago.

Ice ages and warm periods come and go at regular intervals — and that is probably the way it has always been.

This theory is strengthened by new research which demonstrates that climate cycles are by no means a “modern” phenomenon.

“It has been the assumption that these forces existed way back in the Earth’s past, but you don’t really know until you’ve proven it. We’ve done so now, which is pretty spectacular,” says Professor Donald Eugene Canfield, from the Department of Biology at the University of Southern Denmark.

Sheds light on the history of the Earth

According to Canfield, the study can tell us a lot about the climate in the past and help us to understand how historic climate changes have influenced the Earth’s geology and through that, its biology.

The scientist points out that the study is not, on the other hand, much use when it comes to present day man-made climate changes.

“It’s not so much its relevance in relation to the current climate debate as it is in relation to understanding how the climate has developed during the course of the Earth’s geological history,” says Canfield.

One of the big questions regarding the climate of the past is why ice ages occurred in some geological periods and not in others.

Climate concealed in Chinese mountains

In their study the scientists used rock formations in northern China (the Xiamaling Formation) to determine whether Milankovitch cycles (variations in the orbit of the Earth) changed wind and oceanic circulation on Earth 1.4 billion years ago.

The Xiamaling Formations took shape when different minerals and organic material formed a layered sea bed, which over billions of years ended up as the rocks that make up Xiamaling.

By dating the individual strata and studying their content of mineral and organic material the scientists were able to establish that the fluctuations in the climate that occurred as long as 1.4 billion years ago correspond to the Milankovitch cycles we see today.

“We can see that the fluctuations were shorter-lasting in the past than the Milankovitch cycles are today. This is probably because the Moon was closer to the Earth back then,” says Canfield.

More evidence needed

Assistant professor Peter Ditlevsen from the Niels Bohr Institute, University of Copenhagen has read the study and finds it interesting that the scientists have found apparent evidence of Milankovitch cycles 1.4 billion years ago.

Yeah… 1.4 Billion Years of the cycle… but now it doesn’t matter? “I don’t think so Tim!” (A reference to “Tim The Tool Man” show)

A Nice Climate Rant

Here, Dr. Robert Owens has a very nice ‘rant’ about the present state of affairs and how the future is likely to play out in Climate Stuff:


Can you believe it? The world is going to hell in a hand basket and our tone-deaf leaders are worried about man-made global warming. They are so worried that after terrorists turn the streets red with innocent blood more than one hundred of them fly in private jets to ride in long limousines to a summit in Paris vowing to make a statement that will rebuke the terrorists. The way it looks now these leaders of the blind will be sitting in the middle of a snow storm debating how to slow man-made global warming .0001 degree by destroying modern civilization when the mushroom clouds of the Mullahs are rising over American cities.

All the Paris Summit is really about is passing a worldwide carbon tax. This tax is meant to penalize the West for creating the modern world and transfer the money to 3rd world tyrants who loot their own countries and are salivating at the chance to loot ours.

Many people ascribe misguided but still humanitarian motives to this lunacy. I do not. When people we know are at least smart enough to earn advanced degrees or pour water out of a boot and who can plan well enough to maintain a perpetual grasp on power do such obviously dumb things I cannot ascribe their actions to a lack of either intelligence or foresight. I contend that what we are missing in our analysis is an understanding of the true motives.

If we can discern those motives we could make sense of what they are doing. If we could for one moment stand where they stand we would see that everything they say and do make sense.

Just imagine that their guiding light, communism, stands discredited and their social democracy is but a pale imitation on the way to totalitarian control. What is a want-to-be despot to do? Find another cause that can bamboozle the public and allow them to take greater control of everything.

And it continues from there… “hit the link” and enjoy!

In Conclusion

Well, that’s enough for one grab bag. More “in a bit” as events allow.


Posted in AGW and GIStemp Issues, Science Bits | Tagged , , , | 2 Comments

Grandpa In Waiting…

The first grandchild is “on the way” and due in a “day or two”. Consequently, I’m going to be very inattentive to the blog for about a week.

Please feel free to use this as an “open thread” to talk among yourselves ;-)

I’ll mostly be using my tablet to stay in touch, but it doesn’t let me type long articles. When not pacing and waiting I’ll likely be sleeping instead of typing anyway ;-)

So I’m still “around”, just likely rushing to and fro ;-) and tending to the “Honey Do” list…


Posted in Human Interest | Tagged , , , | 34 Comments

Dept. of O.P.M. – Oh Joy [NOT!]

Millions of folks will have gotten one of these. I’d figured that, by now, I was not going to be one.

I was wrong.

It came in the mail today. From the Federal
Office Of Personnel Management. OPM.

Now you might think that having never worked for the Federal Government that I’d not be in their personnel database. You would be wrong. About 20? years ago I did a contract at The Federal Reserve Bank. (Yes, that Fed…) It required security screening. I was screened. So I’m in their database. Apparently forever.

Here’s what you get as a reward:

Federal Personnel OPM Hack notice letter

Federal Personnel OPM Hack notice letter

I’ve blurred my first names and erased my PIN. Don’t know what good it will do me to ‘sign up’ for whatever it is they are giving me. Didn’t see any “this releases our liability” notice on it, so I’ll likely do it.

Not much to say about it. “It is what it is. – Paul the Mercedes Mechanic”…

But for anyone not so endowed with such a letter, you can click on that one to embiggen it and take a look at what millions of your friends and enemies alike around the world have gotten.

Notice To Chinese Hackers

Or whoever did the hack…

I am presently unemployed and looking for a new “Gig”. There isn’t any need to be devious about things nor attempt to use “my” information against me. I don’t have much money to speak of, and I’m available for “reasonable rates”. I also don’t really care what I do so long as it isn’t criminal and I’m not doing evil. So pretty much any contract you want to throw at me “I’m good with that” anyway and no coercion is needed or expected. Basically, I’m a “computer money whore” available to the highest bidder anyway. All client secrets kept. Will travel as long as expenses are paid.

Medical, Dental, and 401k optional.

In Conclusion

Well, looks like I’m in the pot of all those folks likely to be residing in some Chinese Government Database, perused and reviewed by their CIA equivalent. While I’m not expecting anything to come of it, since I’m really not all that interesting a ‘catch’ and my Fed involvement was decades back and indirectly through other employers, it does put an interesting “wrinkle” in my life.

Maybe they can use someone to proof read their threatening extortion letters and clean up the diction…

Or, since I’m fond of Linux and Kylin is based on BSD, maybe they would like to let a “Penetration Testing” contract to me to knock on their computer doors and see what I find. I’ve done that kind of thing on contracts before.

Or, I like travel and exotic foods along with lots of different languages. Maybe they would like me to tour various western nations (other than the USA) taking photos and writing opinions. I’d be good at that.

Oh well. I’m most likely just going to sit in some dusty corner of the Spy Agency Vault Records as “old retired guy – no interest, not usable”. But a fella can hope can’t he?

;-) of course…

Or: 1/2 Humor, maybe…

May I live in interesting times…

Posted in Human Interest, News Related, Tech Bits | Tagged , | 15 Comments

Paris, COP21, Obama Declares Victory, the TPP Likely Gives It To Him.

Obama was just on the news declaring victory for “climate justice”. (ANY adjective in front of “justice” is a red flag to a propaganda ploy. There is only “justice”, not flavors of it. Things are either just, or not… )

It’s his Big Legacy and he’s asserting this was a big win.

Some other folks are quasi celebrating that it’s Yet Another Farce and can’t possibly be enforced anyway as it “isn’t a treaty”.

But there’s a secret back door in the works. The TPP.


December 5, 2015
Obama will use TPP to Enforce his Climate Agreement

By Howard Richman, Jesse Richman and Raymond Richman

Little appreciated in the current debate on the Trans Pacific Partnership (TPP) is the dramatic way the TPP will abrogate legislative authority permanently from the U.S. Congress to the president. TPP creates a commission with full power to amend the agreement, and an arbitration mechanism with the strength to enforce such amendments. The House and Senate gave up their rights to amend TPP, but they can still vote it down when it comes up for up-or-down votes in both chambers next year.

Although many people still labor under the delusion that TPP is a free trade agreement, the 5,544 page TPP regulates trade, the environment, immigration, patents, copyrights, and labor laws among the 12 countries that are participants and the additional countries that are expected to join. Consequently, in a post-TPP world, U.S. presidents could force almost any alteration in U.S. law simply by achieving support in the TPP commission for a U.S. specific modification to the TPP. Case in point today, Obama’s climate ambitions.

Note that “immigration” line too. The Democrats are hell bent on importing a few Million more Muslims to the USA. I’d wondered just why. Well, turns out they vote about 70%+ Democrat. So bring in a few million, declare a “path to citizenship” and have the president preferentially ‘relocate’ them to “swing States” and you get an instant and permanent Democratic President forever.

Add in the TPP, and then you can get loads of “Progressive” folks brought in from places ranging from Brunei, and Mexico to Malaysia.

But back at the Environmental Hook:

It became obvious when the text of TPP was revealed at the beginning of November. Article 20.4 specifies that TPP will implement relevant multilateral environment agreements:

1. The Parties recognise that multilateral environmental agreements to which they are party play an important role, globally and domestically, in protecting the environment and that their respective implementation of these agreements is critical to achieving the environmental objectives of these agreements. Accordingly, each Party affirms its commitment to implement the multilateral environmental agreements to which it is a party.

So if you sign ANY kind of “multilateral environmental agreement”, it comes under the enforcement powers of this treaty and bypasses Congress and the U.S. legal system. BTW, I’ve bolded some bits in the quotes.

Republicans are now beginning to realize that Obama will use TPP to enforce his climate agreement. Manning continues:

There can be little doubt that Obama plans on using the Trans-Pacific Partnership governance as the means to enforce whatever he agrees to in Paris on the U.S., all the while our trade partners will ignore it, with the threat of international trade sanctions imposed against the United States should Congress or a future president roll back his agenda.

Obama’s Strategy

In retrospect, Obama’s strategy was obvious. It first appeared in a January 2014 press release from the U.S. Trade Representative’s office which stated:

The United States’ position on the environment in the Trans-Pacific Partnership negotiations is this: environmental stewardship is a core American value, and we will insist on a robust, fully enforceable environment chapter in the TPP or we will not come to agreement. [emphasis added]

Our proposals in the TPP are centered around the enforcement of environmental laws, including those implementing multilateral environmental agreements (MEAs) in TPP partner countries…

It was explained more fully in a 2014 paper (pdf) by Joshua P. Meltzer, Fellow in Global Economy and Development at the Brookings Institution. Meltzer wrote:

As a twenty-first-century trade agreement, the Trans-Pacific Partnership Agreement (TPP) presents an important opportunity to address a range of environmental issues, from illegal logging to climate change and to craft rules that strike an appropriate balance between supporting open trade and ensuring governments can respond to pressing environmental issues.

BTW, the TPP would also muzzle “fair use” and make all sorts of communications and information sharing subject to draconian legal threats, but I’ll leave that for some other day…

In Conclusion

So there you have the strategy. Obama was declaring victory, since our Congress has been so stupid as to be manipulated into handing off its power to govern to faceless “International Commissions”. The TPP has to be killed, or we get COP21 thrust down our throats.

(Along with loss of liberty to use communications and ‘fair use’, along with outside control of immigration, along with … well, a whole lot of bad crap.)

I would strongly urge Congress to get a firm grip on their one remaining testicle and realize they have handed the knife that will permanently render them impotent over to the TPP and Obama.

It is fairly clear that ANY “international agreement” or “treaty” from this point forward is a “clear and present danger” to the USA.

I’m reminded of what I’d tell my children when they were not behaving well: “Stop it! JUST STOP IT. NOW.”

At this point there is a clear strategy from those who wish to remove the USA from a central place on the international stage:

1) Demographic swamping. Import LOADS of folks from “Progressive” places. Preferably non-Christians so as to neutralize that vote. Only from groups that have a pattern of strongly voting for Democratic / Socialist / Left Wing / Progressive causes.

2) Use “treaties” to neuter Congress and bypass the US Constitution and laws.

3) Parasitize and use all NGOs, Government Agencies, Foundations, and Charitable Organizations so as to bend their funding to Progressive Ideological purposes.

4) Pack Education with true believers. (Straight out of Marx and the Communist Manifesto, BTW). Drive out anyone from a conservative background and POV. Put Education under Federal control (who will take their “guidance” from “international norms and agreements”).

and likely a few more I’ve not sorted out just yet.

It is absolutely essential that Congress (and hopefully our next President) take action to halt and reverse each and all of those.

It will be a very long hard battle, since Socialism / Communism isn’t going to just give up and go home. They didn’t after the USSR collapsed; they just moved into Europe and set up the EU…

Basically, the era when we could just sit back and trust our “elected representatives” to actually represent us is over. Likely it ended some long time ago. Similarly gone is the time when a “charity” could be trusted to do anything reasonable with your “donations”. ALL must be bent to the will of the Progressive Left. So just stop funding them. One can no longer trust the NSF to actually fund Science, nor the EPA to respect citizens’ rights either. Congress is no longer doing anything remotely resembling “oversight” (likely deliberately, so that they can avoid responsibility…) and we have top-to-bottom Agencies Run Amok.

To quote the Microsoft Support web site when I was researching a bug in their software: “This behavior is by design”.

It isn’t an accident. It isn’t just happening or accidental cultural drift. It is a designed process of subversion of the USA. Driven out of the UN, and fully supported by a globalist socialist movement.

Either our elected “representatives” get with it and start fixing this crap, or you can kiss off the Westphalian State and accept that the USA no longer has:

“Westphalian sovereignty is the principle of international law that each nation state has sovereignty over its territory and domestic affairs, to the exclusion of all external powers, on the principle of non-interference in another country’s domestic affairs, and that each state (no matter how large or small) is equal in international law.”

The EU has already achieved that goal in Europe. Now it’s the USA that is to be brought under the yoke. (Middle East to be reduced to rubble so it, too, can be ‘unified’ in some kind of sphere of control… details TBD…) FWIW, there is also an effort to make an EU of Latin America underway. Sometime later to be stitched together with the EU and TPP.

So you have a simple choice, really:

Stop it. Just stop it.

or

Accept life as a serf, reporting up through 3 or 4 levels of non-Westphalian non-Representatives to a faceless, out-of-reach “overlord commission”.

Posted in Political Current Events | 15 Comments