So I’ve got my infrastructure Pi boards all in a dogbone rack, and I’ve got my Desktop system on a Pi Model 3 and find it good enough as a Daily Driver (still tuning the preferred ‘release’ to run, at the moment wandering between Arch, Debian, and Devuan as the mood strikes me or depending on where I’ve left a bit of something…)
So I figured maybe it’s time to set about using all these cores. (I’ve got ‘top’ running in three windows from 3 of the 4 Pi boards and seeing them all at 99% idle is, well, it offends my sense of waste-not want-not…)
I’ve made GIStemp ‘go’ before, so on the docket is “do it again” with the more recent version (AND finish the conversion to Little Endian in Step 4-5 block that I never finished… I got ‘distracted’ by the discovery that the ‘magic sauce’ data molestation was moving upstream into the GHCN, USHCN and all the little National HCNs from around the world… and spent a year or three doing ‘dig here!’ in that…) So that’s to be done “sometime”. But it takes nearly no CPU to run GIStemp. See:
for a photo and description of the workstation I’d done the first port on. Pentium class AMD CPU at 400 MHz with 128 MB of memory. Yes, far less than the modern Raspberry Pi… But I’d already DONE a trial run at the port-to-the-Pi, and the old code compiled fine:
Raspberry Pi GIStemp – It Compiled
Posted on 29 July 2015 by E.M.Smith
I’ve not done any testing yet, just a first cut “unpack and compile”.
The good news is that it did, in fact, compile. And STEP0 ran.
I had the same two “Makefile” issues from the port to Centos (Red Hat for the Enterprise). No surprise as it was an unpack of the same tar file archive with the same two issues in it. A bogus “:q” from a botched exit from the editor in one source file, and the need to remove a trace directive from the compile statement on another. After that, it compiled fine with one warning.
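For what it’s worth, that stray “:q” is easy to hunt down mechanically. A sketch on a scratch file (point grep at the real source tree instead; the demo path is made up):

```shell
# Find editor-exit debris like a lone ":q" at the start of a line in Fortran sources.
# Demo on a scratch directory; substitute the real GIStemp source tree.
mkdir -p /tmp/gistemp_demo
printf 'C a comment line\n:q\nC more code\n' > /tmp/gistemp_demo/bad.f
grep -rn '^:q$' /tmp/gistemp_demo
# -> /tmp/gistemp_demo/bad.f:2::q
```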
That warning might be a significant bug in the GIStemp code. I’ll need to investigate it further. I don’t remember if any of the other compilations found that type mis-match. The importance of it is still TBD and might be nothing.
Here’s the captures on what I did:
So re-doing that on a faster Pi with the newest code seemed a bit thin gruel…
What I really wanted was a GCM to play with. Get into the whole idea of what is it these things do. Maybe put in a “GOSUB Solar_Cycle” in place of CO2, tune a bit, and present an alternative computer fantasy to the AGW one.
The Model E code is the biggest GCM (Global Circulation Model) for which I’ve found easy-to-get public domain source code. But it isn’t promising for a stack of 3 x Pi boards…
GISS GCM ModelE
The current incarnation of the GISS series of coupled atmosphere-ocean models is available here. Called ModelE, it provides the ability to simulate many different configurations of Earth System Models — including interactive atmospheric chemistry, aerosols, carbon cycle and other tracers, as well as the standard atmosphere, ocean, sea ice and land surface components.
Model versions used for the CMIP5 and CMIP6 simulations are available via the nightly snapshots of the current code repository, including the frozen ‘AR5_branch’. However, users should be aware that these snapshots are presented ‘as is’ and are not necessarily suitable for publication-quality experiments.
Please let us know if you intend to use this code by subscribing to our mailing list. We will then keep you (very occasionally) informed about code patches and possible improvements to the configuration.
Guidelines for Use of Model Code
Our code is freely available for download and use. Note however, that this is a work in progress and many people contribute to its development. It’s important to state how much the group effort — in all the different aspects of modeling — has contributed to progress. From the people who conceive of new developments, the people who code them, the people that apply the model to new science questions, the people who find the inevitable bugs, to those who do the data processing and analyse the diagnostics, all are essential to keeping the project viable. This should be reflected on when deciding on co-authorship and acknowledgments.
When you dig down into the ‘configuration’ of computer desired, you find:
The ModelE source code is quite portable and can be installed and used basically on any Unix machine with enough CPU power and memory (from Linux clusters to Linux and Mac OS X desktops). Though one can run the basic serial version of the model with prescribed ocean on a single core with as little as 2 GB of memory, to do any useful simulations in reasonable time one would need a computer with at least 16 cores (Sandy Bridge or faster) with at least 1 GB of memory per core. To do dynamic ocean simulations with full atmospheric chemistry one typically would need 88 cores with at least 1 GB of memory per core.
The source code is written mostly in Fortran 90 language with some elements of Fortran 2003 and can be compiled either with Intel ifort compiler (version 12.0) or with GNU gfortran (version 4.9 or later).
For input/output we use a NetCDF library, so it has to be installed (version 3.6 or later).
For parallel simulations on multiple cores the model needs to be compiled with MPI support, so an MPI library needs to be installed on your computer. The following MPI distributions are currently supported by the model:
For desktops or small servers we would recommend OpenMPI, since it is the easiest one to install and configure, though MPICH2 also works without problems. On a cluster, typically it would be up to support group to make a decision on which MPI distribution is more suitable for a particular platform. Over the last few years we were using Intel MPI with great success.
The compilation process is based on GNU make command and uses perl scripting language and m4 preprocessor (GNU version). Typically these are present by default on any Linux or Mac OS X system, but if you are using other type on Unix you may need to install them.
If instead of latitude-longitude version of the model you want to work with cubed sphere version, then in addition to the requirements mentioned above you will need to install a compatible ESMF library. You will also need to obtain the source code for the cubed sphere dynamical core from the developers since it is not included in the standard modelE distribution.
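Before anything else, it’s worth a quick sanity check that the build-chain tools named above are on the Pi. A minimal sketch (NetCDF and MPI checks left out, since library locations vary by distro):

```shell
# Check for the build tools the ModelE requirements list.
# gfortran may well be missing on a fresh image.
for tool in gfortran make perl m4; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: ok"
    else
        echo "$tool: MISSING"
    fi
done
```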
As I’ve already gotten Fortran 90 and OpenMPI to run on the Pi, that’s not an issue. BUT, that 88 cores and 88 GB of memory is ‘an issue’ for a Pi Cluster. ASSUMING I’m willing to wait longer for results, and that a background process running for a week or three is OK by me, if not by them: that might let me use Pi Model 3 boards with 1 GB of memory each (light on memory per core by 3 GB a board, so add swap…) and then it’s “only” 88/4 = 22 Pi Model 3 boards and the 6 Dogbone Cases to hold them… Er, a bit larger than my kit today… Even dropping back to the 16 cores and 16 GB is 4 cores more than I’ve got, and much, much faster cores too… So I’m about a factor of 10 behind where I’d really need to be. Not where you want to make your first test run at a technology…
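The arithmetic above, as a scratch-pad sketch (my assumptions: 4 cores per Pi 3, 4 boards per dogbone case):

```shell
# Cluster sizing scratch pad (assumes 4 cores per Pi 3, 4 boards per dogbone case)
CORES_NEEDED=88
CORES_PER_PI=4
BOARDS_PER_CASE=4
BOARDS=$(( CORES_NEEDED / CORES_PER_PI ))
CASES=$(( (BOARDS + BOARDS_PER_CASE - 1) / BOARDS_PER_CASE ))   # round up
echo "$BOARDS boards in $CASES dogbone cases"
# -> 22 boards in 6 dogbone cases
```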
I could likely make their minimal run version “go”, and that’s where I’ll start whenever I return to Model E… but not now. Just below Model E on their page, you find a reference to an older model. Off to the side of that page at NASA, one finds a link to it:
GISS GCM Model II
The Goddard Institute for Space Studies General Circulation Model II, described fully by Hansen et al. (1983), is a three-dimensional global climate model that numerically solves the physical conservation equations for energy, mass, momentum and moisture as well as the equation of state.
Hmmm… 1983 isn’t all that long ago. Clearly this model has been used for Global Warming calculations. Looks like a good place to start, to me. Yes, it’s about 30 years back, but it ought to contain the basics. It would also be more approachable from the POV of figuring out how their thinking evolved.
The standard version of this model has a horizontal resolution of 8° latitude by 10° longitude, nine layers in the atmosphere extending to 10 mb, and two ground hydrology layers. The model accounts for both the seasonal and diurnal solar cycles in its temperature calculations.
Cloud particles, aerosols, and radiatively important trace gases (carbon dioxide, methane, nitrous oxides, and chlorofluorocarbons) are explicitly incorporated into the radiation scheme. Large-scale and convective cloud cover are predicted, and precipitation is generated whenever supersaturated conditions occur.
Snow depth is based on a balance between snowfall, melting and sublimation. The albedo of snow is a function of both depth and age. Fresh snow has an albedo of 0.85 and ages within 50 days to a lower limit of 0.5. The sea ice parameterization is thermodynamic with no relation to wind stress or ocean currents. Below -1.6°C ice of 0.5 m thickness forms over a fractional area of the grid box and henceforth grows horizontally as needed to maintain energy balance. Surface fluxes change the ocean water and sea ice temperature in proportion to the area of a grid cell they cover. Conductive cooling occurs at the ocean/ice interface, thickening the ice if the water temperature remains at -1.6°. Sea ice melts when the ocean warms to 0°C and the SST in a grid box remains at 0°C until all ice has melted in that cell. The albedo of sea ice (snow-free) is independent of thickness and is assigned a value of 0.55 in the visible and 0.3 in the near infrared, for a spectrally weighted value of 0.45.
Vegetation in the model plays a role in determining several surface and ground hydrology characteristics. Probably the most important of these is the surface albedo, which is divided into visible and near infrared components and is seasonally adjusted based on vegetation types. Furthermore, the assigned vegetation type determines the depth to which snow reflectivity can be masked. Hydrological characteristics of the soil are also based upon the prescribed vegetation types; the water holding capacity of the model’s two ground layers is determined by the vegetation type as is the ability of those layers to transfer water back to the atmosphere via transpiration. Nine different vegetation classes, developed by Matthews (1984) for the GISS GCM, represent major vegetation categories and the ecological/hydrological parameters which are calculated from the vegetation. Since the GISS GCM is a fractional grid model, more than one vegetation type can be assigned to each grid box.
Sea surface temperatures (SST) are either specified from climatological input files or may be calculated using model-derived surface energy fluxes and specified ocean heat transports. The ocean heat transports vary both seasonally and regionally, but are otherwise fixed, and do not adjust to forcing changes. This mixed-layer ocean model was developed for use with the GISS GCM and is often referred to as the “Q-flux” parameterization. Full details of the Q-flux scheme are described in Russell et al. (1985), and in appendix A of Hansen et al. (1997). In brief, the convergence (divergence) at each grid cell is calculated based on the heat storage capacity of the surface ocean and the vertical energy fluxes at the air/sea interface. The annual maximum mixed-layer depth, which varies by region and season, has a global, area-weighted value of 127 meters. Vertical fluxes are derived from specified SST control runs where the specified SSTs are from climatological observations and have geographically and seasonally changing values. In the early 1990s Russell’s technique was modified slightly to use five harmonics, instead of two, in defining the seasonally-varying energy flux and upper ocean energy storage. This change improved the accuracy of the approximations in regions of seasonal sea ice formation. The technique reproduces modern ocean heat transports that are similar to those obtained by observational methods (Miller et al. 1983). By deriving vertical fluxes and upper ocean heat storage from a run with appropriate paleogeography and using SSTs based on paleotemperature proxies, the q-flux technique provides a more self-consistent method for obtaining ocean heat transports from paleoclimate scenarios that use altered ocean basin configurations.
Present-day maintenance and some development of Model II is performed within the context of the Columbia University EdGCM project. See the links at right for source code downloads and other resources provided by that project. Historical versions of Model II (e.g., the computer code used in the 1988 simulation runs) are not currently available. Please address all inquiries about the EdGCM project and about implementing Model II on modern personal computers to Dr. Mark Chandler.
Persons interested in using the most recent version of the GISS climate model, a coupled atmosphere-ocean model, should see the ModelE homepage.
Gee… and it is STILL in use, though mostly in a teaching context.
I can live with that.
The “stuff on the right” says:
Downloads & Links
Model II Source Code
The 8°×10° (lat×lon) version of the GISS Model II is still in use as a research tool for paleoclimate and planetary studies, for very long simulations, or where limited computing resources are available. An up-to-date copy of this slightly modified version, with minor updates and bugfixes, is available from the EdGCM project.
The Columbia University EdGCM software is a graphical user interface which simplifies set-up and control of GISS Model II. This educational suite gives users the ability to create and conduct “Rediscovery Experiments”, simulations that reproduce many of the hundreds of experiments that have been conducted and published using this version of the NASA GISS GCM.
EdGCM Model II Forum
Persons attempting to compile the Model II FORTRAN source code may consult the EdGCM message boards for assistance from other model users.
Even has an active Forum for questions. This is not some antique dead code. This is an actively used teaching tool. Just what I’d like to have and where I’d be best served to start.
From their link:
EdGCM provides a research-grade Global Climate Model (GCM) with a user-friendly interface that can be run on a desktop computer. For the first time, students can explore the subject of climate change in the same way that actual research scientists do. In the process of using EdGCM, students will become knowledgeable about a topic that will surely affect their lives, and we will better prepare the next generation of scientists who will grapple with a myriad of complex climate issues.
Our goal is to improve the quality of teaching and learning of climate-change science through broader access to GCMs, and to provide appropriate technology and materials to help educators use these models effectively. With research-quality resources in place, linking classrooms to actual research projects is not only possible, but can also be beneficial to the education and research communities alike.
Just what I’m looking for. If it can run “on a desktop computer” it can run on a Pi (though perhaps more slowly…)
It even looks like, for the less adventuresome, you can just run it on your Mac or PC without the porting work:
EdGCM, or the Educational Global Climate Model, is a suite of software that allows users to run a fully functional 3D global climate model (GCM) on laptops or desktop computers (Macs and Windows PCs). Teachers, students and others can learn-by-doing to design climate experiments, run computer simulations, post-process data, analyze output using scientific visualization tools, and report on their results. All of this is done in the same manner and with the same tools used by climate scientists.
Click here for details about the specific components of the EdGCM software
The Global Climate Model (GCM) at the core of EdGCM was developed at NASA’s Goddard Institute for Space Studies, and has been used by researchers to study climates of the past, present and future. EdGCM makes it possible for people to use the GCM without requiring programming skills or supercomputers. Major components of the software include:
A graphical user interface to manage all aspects of working with the GCM.
A searchable database that organizes experiments, input files, and output data.
Scientific visualization software for mapping, plotting and analyzing climate model output.
An eJournal utility to help create reports or instructional materials (including images).
Automated conversion of graphics and reports to html for web publishing.
EdGCM also comes complete with 6 ready-to-run climate model experiments:
2 modern climate simulations
3 global warming simulations
1 ice age simulation
and educators have great flexibility in constructing their own scenarios to satisfy specific curricular requirements. EdGCM scales for use at levels from middle school to graduate school, making it a unique tool for bringing a real research experience into the classroom.
Me? I want to compile this from scratch as my intent is to add some bits to it from the Solar and Planetary Cycles POV. Hey, we need a “computer model run” to bash the AGW folks with just like they have been pushing their stuff at us…
The download page has a load of OS X Intel and OS X PPC builds, and more. Also some PC bits. For Linux they want you to run it under wine:
OS X GNU Fortran
This platform is not officially supported. There is a Makefile.gfortran in the source directory which works at the time of this writing. You’ll need to edit RANVAX.f and setpath.f to remove some “_”‘s.
GNU Tools w/ CygWin (Unofficial compile)
See http://forums.edgcm.columbia.edu/ for comments and user suggestions on building with CygWin
You cannot run EdGCM on Linux but you can run Model II. The following instructions are thanks to Patrick Lee
1. Create a run on Windows; suppose the name of the run is testrun.
2. Run the simulation for the first hour and then stop the simulation.
3. Transfer the whole directory ..\EdGCM 4d\EdGCM 3.0\Output\testrun to Linux. You may delete the file GCM-testrun.exe, which is the Lunar version (the file testrun.exe is Model II), and WhatToStart.TXT. If there is a file called SSW.STOPGCM, then you MUST delete it.
4. Use wine to run the testrun.exe on Linux (notice that the model is a command line program, so you should not be afraid when you see the black screen).
OK, so “some assembly required”, but it is known to be runnable under GNU Fortran and that’s my target language. As Fortran is highly portable, I’m thinking this likely is not all that hard to port, and these folks just love expensive Macs… “We’ll See” when the compile time comes…
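On the “_” edits the OS X gfortran note mentions: I haven’t opened RANVAX.f or setpath.f yet, so this is a hypothetical sketch of the kind of one-character fix I expect, done on a demo file (FDATE_ is a made-up example identifier; the real ones may differ):

```shell
# Hypothetical underscore cleanup of the sort the EdGCM notes describe.
# FDATE_ is an invented example; inspect RANVAX.f and setpath.f for the real names.
printf '      CALL FDATE_(STR)\n' > /tmp/demo_underscore.f
sed 's/FDATE_(/FDATE(/' /tmp/demo_underscore.f
# -> "      CALL FDATE(STR)"
```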
So I downloaded it. And unpacked it.
[root@ARCH_pi_64 trunk]# ls -l /GCM
total 304
drwxr-xr-x 3 root root   4096 Dec 26 23:20 GCM
-rw-r--r-- 1 root root 303284 Dec 26 23:20 modelII_source.zip
OK, it’s a zip file of 303 KB. Not big, really. Unzipped with ‘unzip’ it makes a directory named “GCM”. Wandering down it, you get to:
[root@ARCH_pi_64 trunk]# pwd
/GCM/GCM/modelII/trunk
(Note I named my working dir /GCM before I knew what it would do, so I have a double dip on GCM at the top of the path…)
[root@ARCH_pi_64 trunk]# du -ks .
1576	.
[root@ARCH_pi_64 trunk]# ls
B83XXDBL.COM         Pjal0C9.f
BA94jalC9.COM        PostBuild.sh
DB11pdC9.f           R83ZAmacDBL.f
FFT36macDBL.f        RANVAX.f
FORCINGSjalC9.f      RANVAXxlf.f
FORCINGSmac.COM      README.f.in
Info.plist           RFRCmacDBL.f
Makefile             UTILmacDBL.f
Makefile.Mac.PPC     UTILwinDBL.f
Makefile.README      commandlinetest.gui
Makefile.gfortran    modelF.r
Makefile.ifort       mrweprefs.r
Makefile.win         pd_COMMON
Makefile.win.gui     setpath.f
Mjal2cpdC9.f
So 1.5 MB of stuff once unpacked (and not including data files). I think I can live with that.
I suspect my biggest hurdles will be the GUI bits. That there is a MAKEFILE for gfortran is a Very Good Thing.
[root@ARCH_pi_64 trunk]# cat Makefile.gfortran
F77COMPILER= gfortran
LINKER = gfortran
F77_FLAGS = -c -s -fconvert=big-endian -fno-automatic -ff2c -O2
# ifort -O2 -convert big_endian -IPF_fma -save -zero -ftz -assume
# dummy_aliases -align none -mp -openmp -c L23_DAILY_MClim_CH4mths.f
LIBS = -L/Developer/SDKs/MacOSX10.5.sdk/usr/lib
TARGET= model.command
SRCS = RANVAX.f \
	setpath.f \
	RFRCmacDBL.f \
	UTILmacDBL.f \
	Mjal2cpdC9.f \
	Pjal0C9.f \
	FORCINGSjalC9.f \
	FFT36macDBL.f \
	R83ZAmacDBL.f \
	DB11pdC9.f \
	README.f

OBJS = $(SRCS:.f=.o)  # all objects

%.o: %.f
	$(F77COMPILER) -o $@ $(F77_FLAGS) $<

$(TARGET): $(OBJS)
	$(LINKER) $(LPATHS) $(OBJS) $(LNK_FLAGS) $(LIBS) -o $(TARGET)

clean:
	rm -f *.o
	rm -f $(TARGET)

.PHONEY: all clean
Looks pretty straightforward and simple. I think I can work with that.
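One Linux tweak I can already see coming: that LIBS line points at a Mac OS X SDK path that won’t exist on the Pi. A hedged sketch of blanking it (done on a demo copy; I haven’t actually built yet, so the real port may need more than this):

```shell
# Blank the Mac-only SDK library path in Makefile.gfortran.
# Demo on a scratch copy so nothing real is touched; sed keeps a .bak backup.
printf 'LIBS = -L/Developer/SDKs/MacOSX10.5.sdk/usr/lib\n' > /tmp/Makefile.gfortran.demo
sed -i.bak 's|^LIBS = .*|LIBS =|' /tmp/Makefile.gfortran.demo
cat /tmp/Makefile.gfortran.demo
# -> LIBS =
```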
As I already have gfortran installed under Arch Linux, I’m doing my first whack at it here. Once that works, I’ll try moving the whole thing over to Devuan on the Model 2 (assuming things are fast enough…)
So that’s where I’m at and what I’m doing for the next day or three…
The description of the Input Files here:
Gives an interesting POV of what is set at the start, and how many parameters you can play with (lots!).
You can download the source zip file from:
Then there’s a lot of documentation links on their top page for the model… So some “light reading” for the rest of the year… that thankfully is only a few more days ;-)
So that’s what I’m ‘up to’ for the next while. Seeing if I can make a Pi Port of this go, then seeing how hard it might be to splice on a Solar Cycle routine… As in all things software “We’ll see” ;-)