Upload the GISS ModelE1 distribution to the OSC cluster and uncompress it using gzip and tar. The files are named modelE1.tar.gz and fixed.tar.gz; the first is the model source and the second contains the boundary and initial conditions.
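A minimal sketch of the extraction step (assuming both tarballs sit in the working directory):

# unpack the model source; by default this creates the directory modelE1_pub
tar -xzf modelE1.tar.gz
# unpack the boundary and initial condition files
tar -xzf fixed.tar.gz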
See the documentation provided at GISS ModelE.
Placed the initialization files in the subdirectory initialization. Will need to specify a path to this directory when running ModelE.
It looks like the q-flux model in ModelE may not meet the need I have for determining fluxes within the ocean. Initializing the ocean involves the following:
To set up ModelE you need to upload the program code and a fixed dataset file. The program code is somewhat self-contained and locates important files relative to a root directory (by default this is the directory modelE1_pub). The decks directory contains a Makefile. Run the command gmake config in this directory and it will generate a file named .modelErc in the home directory. This file provides paths to directories important to the model run. By default these use a root directory of /u, which is the central directory at the GISS site. For my account on glenn.osc.edu, I will set this directory to /nfs/06/ced0013/ModelE/wocean. The location of NetCDF is /usr/local/netcdf-3.6.2. The fixed dataset file must be placed in the directory /nfs/06/ced0013/ModelE/wocean/cmrun.
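A sketch of the edited entries in ~/.modelErc (the key names are taken from the setup output quoted later in these notes; the exact directory layout under wocean is my assumption):

# ~/.modelErc (excerpt) -- paths for this account on glenn.osc.edu
CMRUNDIR=/nfs/06/ced0013/ModelE/wocean/cmrun
GCMSEARCHPATH=/nfs/06/ced0013/ModelE/wocean/cmrun
EXECDIR=/nfs/06/ced0013/ModelE/wocean/exec
SAVEDISK=/nfs/06/ced0013/ModelE/wocean/output
NETCDFHOME=/usr/local/netcdf-3.6.2
COMPILER=PGI    # value arrived at after the troubleshooting below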
The rundeck must be set up. For this first run, use the file E001.R, which uses fixed sea surface temperatures. Start in the directory ~/ModelE/modelE1_pub/decks and run the command gmake rundeck RUN=E001smg. This takes the sample deck file E001.R and copies it to the current directory as well as to the repository /nfs/06/ced0013/ModelE/wocean/cmrun/modelE/decks.
Now compile the program by running the command gmake gcm RUN=E001smg. Upon success, run the first-hour setup using the command gmake setup RUN=E001smg. If this is successful, the rest of the run can be made.
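Collected in one place, the sequence so far is:

cd ~/ModelE/modelE1_pub/decks
gmake rundeck RUN=E001smg    # copy the sample deck E001.R to E001smg.R
gmake gcm RUN=E001smg        # compile the model
gmake setup RUN=E001smg      # run the first-hour setup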
Troubleshooting: The compile did not work, evidently because it could not find the compiler. I went back to the file .modelErc, uncommented the compiler option, and changed the value to Portland Group. To set up the deck again, use the command gmake rundeck RUN=E001smg OVERWRITE=YES. The rundeck was generated; however, compiling the model gave a new error stating that there is no rule to make a target. Since this did not happen the first time, I assume that some files generated on the first compile are interfering with this compile. Therefore, I will try generating a new deck called E002smg. For the E001.R file to be used, the command for generating the deck is now gmake rundeck RUN=E002smg SRC=E001. If SRC is not specified, the default is E001.
Dealing with the following error:

Can't open ./.depend.E004smg: No such file or directory at -e line 1.
------------ Rebuilding Dependencies -------------
running CPP
Requested target is not supported on Linux
compiling RES_M12.f ...
Requested target is not supported on Linux
gmake[1]: *** [RES_M12.o] Error 2
After several hours of poking around in the Makefile and other files in the directory, I came across the file model/Rules.make. This file is called by the Makefile and sets a number of compiler flags. Looking through the list I found a conditional for a compiler named PGI (Portland Group). In the file .modelErc it was not clear how the compiler flag should be defined; I had assumed it was either Portland or Portland Group. By changing it to PGI, the compile was successful. Now test the compile by using the command gmake setup RUN=E001smg.
A successful setup was accomplished. Now the model needs to be run. This particular run lasts 6 days. It is initiated from the decks directory with the command ../exec/runE E001smg. To interrupt the run gracefully, use the command ../exec/sswE E001smg. The run can be resumed by issuing ../exec/runE E001smg again.
The run done at the end of the day yesterday did not complete correctly. After about 20 minutes it ended without notice. Looking through the file wocean/E001smg/nohup.out I saw the following message:
./E001smg: line 32: 29344 Killed    ./E001smg.exe -i ./$IFILE > $PRTFILE
./E001smg: line 37: /nfs/06/ced0013/ModelE/wocean/exec/runpmE: No such file or directory
./E001smg: line 37: exec: /nfs/06/ced0013/ModelE/wocean/exec/runpmE: cannot execute: No such file or directory
Looking through the documentation, it seems that the files contained in modelE1_pub/exec should also be present in the directory defined by the parameter EXECDIR in the file .modelErc. As a result of this conflict, I think it would be best to redefine the location of the root directory for the model run. Up to now it has been wocean. I think it should be the directory modelE1_pub. Since this change will involve a little work, it might be good to move the directory modelE1_pub to a location closer to the home directory. Let's start from scratch and set up using the following steps.
./E001smg: line 32: 14640 Killed ./E001smg.exe -i ./$IFILE > $PRTFILE
I'm not sure why it used the term Killed; however, it cleaned up the working files and left the following three files from the run, so it seems to have terminated gracefully. This run took 20 minutes.
An email message was sent that contained some output values from the model, and they do not terminate cleanly. In the cloud frequency output for 60S to 30S latitude, only three complete values are reported for the 900 pressure level, after which comes the message No automatic fixup for return code: 137 (return code 137 is 128 + 9, i.e. the process received SIGKILL). A clipping from the email message is as follows:
0ISCCP CLOUD FREQUENCY (NTAU,NPRES) %   60S-30S   PARTIAL
------------------------------------------------------------------------
 PRESS\TAU   0.   1.3   3.6   9.4    23    60    >
    90      0.0   0.0   0.0   0.0   0.0   0.0
   245      0.5   0.3   0.3   0.5   0.9   1.3
   375      0.3   0.8   1.3   2.8   3.4   3.0
   500      0.1   0.6   2.2   4.2   3.1   1.0
   630      0.3   1.9   8.1   8.5   2.0   0.2
   740      0.2   1.1   2.5   1.6   0.2   0.0
   900      0.3   2.5   7.4   4.No automatic fixup for return code: 137
Try a new run using the multiple-processors flag. This flag is located in the file ~/.modelErc. Set it to YES and run the previous series of steps for an identical run.
Ran into a snag on the compile (step 2). A comment in the file ~/.modelErc indicated that the multiple-processor flag is only recognized by SGI and Compaq; the conclusion is that this is not supported on Linux. The compile terminated with the following statement while trying to link the executable:
linking executable
pgf90-Error-Unknown switch: -Msmp
gmake[1]: *** [/nfs/06/ced0013/modelE/decks/E002smg_bin/E002smg.exe] Error 1
gmake: *** [gcm] Error 2
All the modules seem to compile just fine. I need to track down the switch -Msmp and see if there is an equivalent option supported by the Portland Group compiler. The only change from the previous compile and run would be the incorporation of OpenMP instructions.
Ran the command grep -r Msmp modelE to identify files that contain the string Msmp. The error message line showed up in a hidden file supposedly in ~/modelE/model; however, I could not find it in a normal file list. The other two hits were in the file ~/modelE/model/Rules.make. Scrolling through this file, there are two lines, FFLAGS += -Msmp and LFLAGS += -Msmp; these are Fortran compiler and linker flags. There is a comment in this file that these may need to be changed for PGI OpenMP compatibility, and I would agree. From the Portland Group web site it seems the option should be changed to -mp.
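One way to make the swap from the command line (GNU sed in-place edit; the change could equally be made by hand in an editor):

# change the SGI-style SMP switch to PGI's OpenMP flag in both flag lines
sed -i 's/-Msmp/-mp/g' ~/modelE/model/Rules.make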
Ran gmake clean and gmake vclean to get rid of object files from the failed compile. Now repeat step 2 and follow with steps 3 and 4. Step 2 resulted in a clean compile; however, step 3 resulted in a segmentation fault. The error message is as follows:
setting up run E002smg
output files will be saved in /nfs/06/ced0013/modelE/output/E002smg
using /nfs/06/ced0013/modelE/cmrun/AIC.RES_M12.D771201 for IC only
using /nfs/06/ced0013/modelE/cmrun/GIC.E046D3M20A.1DEC1955 for IC only
starting the execution
current dir is /nfs/06/ced0013/modelE/output/E002smg
Starting 1st hour in the background.
-bash-3.2$ sh: line 5: 23876 Segmentation fault      ./"E002smg".exe -i I >> E002smg.PRT
Problem encountered while running hour 1 :
cat: error_message: No such file or directory
>>>  <<<
It is possible the segmentation fault is due to a stack size that is too small. Originally I looked at the stack size and it said unlimited. However, the command ulimit -s reported a stack size of about 10240 kbytes. The stack size was raised with ulimit -s 32768, and running step 3 was then successful.
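The check and the fix as commands (32768 kbytes is the value used here; a slightly larger value is used later for the batch runs):

ulimit -s        # report the current stack limit in kbytes
ulimit -s 32768  # raise it for this shell before running gmake setup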
Running the multiple-processor version of the program also resulted in a kill at 20 minutes; in fact it was exactly at 20 minutes. It may be a limit placed on programs run interactively on glenn.osc.edu to prevent one job from dominating the cluster. The email sent by this run seems to be a portion of the file E002smg.PRT, and an error code of 137 shows up at the end of this run as well. The program was not at the same place in the run; this time it was writing a series of restart files for H2O generated by CH4 in the stratosphere. The following is a clip from this email.
total 98956
-rw-rw-r-- 1 ced0013 PCED0003 19954612 May 19 14:06 1JAN1950.rsfE001smg
-rw-rw-r-- 1 ced0013 PCED0003  4792472 May 19 14:06 DEC1949.accE001smg
lrwxrwxrwx 1 ced0013 PCED0003        7 May 19 13:35 E -> E001smg
-rwxrwxr-x 1 ced0013 PCED0003      838 May 19 13:35 E001smg
-rwxrwxr-x 1 ced0013 PCED0003 16156865 May 19 13:30 E001smg.exe
-rwxrwxr-x 1 ced0013 PCED0003     2572 May 19 13:35 E001smgln
-rw-rw-r-- 1 ced0013 PCED0003   769401 May 19 14:07 E001smg.PRT
-rwxrwxr-x 1 ced0013 PCED0003      382 May 19 13:35 E001smguln
-rw-rw-r-- 1 ced0013 PCED0003        9 May 19 13:46 flagGoStop
-rw-rw-r-- 1 ced0013 PCED0003 29668300 May 19 14:05 fort.1
-rw-rw-r-- 1 ced0013 PCED0003 29668300 May 19 14:06 fort.2
-rw-rw-r-- 1 ced0013 PCED0003     3807 May 19 13:47 fort.8
-rw-rw-r-- 1 ced0013 PCED0003   102400 May 19 13:47 fort.99
-rw-rw-r-- 1 ced0013 PCED0003      796 May 19 14:07 I
-rw-rw-r-- 1 ced0013 PCED0003      231 May 19 13:35 Ibp
-rw-rw-r-- 1 ced0013 PCED0003    16063 May 19 13:35 Iij
-rw-rw-r-- 1 ced0013 PCED0003     9606 May 19 13:35 Ijk
-rw------- 1 ced0013 PCED0003       87 May 19 14:07 nohup.out
-rwxrwxr-x 1 ced0013 PCED0003      165 May 19 13:35 runtime_opts
total 74220
lrwxrwxrwx 1 ced0013 PCED0003        7 May 19 15:11 E -> E002smg
-rwxrwxr-x 1 ced0013 PCED0003      838 May 19 15:11 E002smg
-rwxrwxr-x 1 ced0013 PCED0003 16363137 May 19 15:11 E002smg.exe
-rwxrwxr-x 1 ced0013 PCED0003     2572 May 19 15:11 E002smgln
-rw-rw-r-- 1 ced0013 PCED0003    27224 May 19 15:37 E002smg.PRT
-rwxrwxr-x 1 ced0013 PCED0003      382 May 19 15:11 E002smguln
-rw-rw-r-- 1 ced0013 PCED0003        9 May 19 15:16 flagGoStop
-rw-rw-r-- 1 ced0013 PCED0003 29668300 May 19 15:36 fort.1
-rw-rw-r-- 1 ced0013 PCED0003 29668300 May 19 15:36 fort.2
-rw-rw-r-- 1 ced0013 PCED0003     3807 May 19 15:17 fort.8
-rw-rw-r-- 1 ced0013 PCED0003   102400 May 19 15:17 fort.99
-rw-rw-r-- 1 ced0013 PCED0003      797 May 19 15:37 I
-rw-rw-r-- 1 ced0013 PCED0003      231 May 19 15:11 Ibp
-rw-rw-r-- 1 ced0013 PCED0003    16063 May 19 15:11 Iij
-rw-rw-r-- 1 ced0013 PCED0003     9606 May 19 15:11 Ijk
-rw------- 1 ced0013 PCED0003       87 May 19 15:37 nohup.out
-rwxrwxr-x 1 ced0013 PCED0003      165 May 19 15:11 runtime_opts
The purpose of these files appears to be the following:
Running the program pdE on the file DEC1949.accE001smg generated the following files:
Looking through the documentation, there are two things to try. The first is to use multiple processors; although the code is compiled for OpenMP, the number of processors needs to be specified in the runE command. Second, it would be useful to have the data saved in netCDF format, which must be specified in the rundeck before compiling the program. A third run will now be initiated incorporating these two changes. The procedure list is as follows:
Got stopped in step 3 (the compile) due to the linker not being able to find the appropriate files. This may go back to the problem with compiling CAM 3.0 last year. It may be a compiler flag issue, because the NetCDF directory is defined in the .modelErc parameter NETCDFHOME. Opened the file ~/modelE/model/Rules.make and looked for netcdf. The options are defined for SGI machines, but not for the Portland Group. The original section is:
#
# Check for extra options specified in modelErc
#
ifdef NETCDFHOME
ifeq ($(MACHINE),SGI)
LIBS += -L$(NETCDFHOME)/lib64 -lnetcdf
else
LIBS += -L$(NETCDFHOME)/lib -lnetcdf
endif
FFLAGS += -I$(NETCDFHOME)/include
INCS += -I $(NETCDFHOME)/include
endif
#
# Pattern rules
#
The changed section is:

#
# Check for extra options specified in modelErc
#
ifdef NETCDFHOME
ifeq ($(MACHINE),SGI)
LIBS += -L$(NETCDFHOME)/lib64 -lnetcdf
else
# LIBS += -L$(NETCDFHOME)/lib -lnetcdf
LIBS += -L$(NETCDFHOME)/lib -netcdf
endif
FFLAGS += -I$(NETCDFHOME)/include
INCS += -I $(NETCDFHOME)/include
endif
#
# Pattern rules
#
Can't get the netCDF portion to work yet. Changing the library flags did not work. Recalling the CAM 3.0 issues, it was also necessary to run the command module load netcdf, which sets environment variables; this did not fix it either. I may need to contact OSC to resolve this issue. I also tried the multiple-processors option in step 5. It appears to have worked; however, the job was terminated after about 5 minutes since I was using 4 processors (the 20-minute limit evidently applies to total CPU time). For the time being I am going to run E001smg again and see if it restarts correctly from the restart file. If so, I should be able to get another year of simulation in. Go to the file .modelErc, change multiple processors to NO, and then use the command ./runE E001smg.
Here is some useful information for planning a batch job that runs more than 20 minutes:
Task List of things to work on for ModelE
When running a batch job, a TMP directory needs to be generated. All of the working files need to be copied to this directory, and any data files need to be copied back before the batch job finishes. The following parameters determine the size of the TMP directory.
For a 6 year run (the default for E001) the TMP directory should be 2.2 GB, and the run should take 24 hours of computer time.
Attempt to compile E003smg.R using POUT_netcdf.f. This failed on May 20th and I never got it resolved. Tried it this time and got the same errors. The following are the error messages generated, which are due to the linker not finding the compiled netCDF library:
compiling POUT_netcdf.f ... Done
linking executable
POUT_netcdf.o: In function `ncout_open_out_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:100: undefined reference to `nf_create_'
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:104: undefined reference to `nf_put_att_text_'
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:104: undefined reference to `nf_put_att_text_'
POUT_netcdf.o: In function `ncout_def_dim_out_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:119: undefined reference to `nf_def_dim_'
POUT_netcdf.o: In function `ncout_close_out_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:154: undefined reference to `nf_close_'
POUT_netcdf.o: In function `wrtgattc_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:171: undefined reference to `nf_put_att_text_'
POUT_netcdf.o: In function `defarr_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:180: undefined reference to `nf_redef_'
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:180: undefined reference to `nf_inq_varid_'
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:186: undefined reference to `nf_def_var_'
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:189: undefined reference to `nf_put_att_text_'
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:192: undefined reference to `nf_put_att_text_'
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:195: undefined reference to `nf_put_att_real_'
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:199: undefined reference to `nf_put_att_real_'
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:199: undefined reference to `nf_enddef_'
POUT_netcdf.o: In function `setup_arrn_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:262: undefined reference to `nf_inq_varid_'
POUT_netcdf.o: In function `wrtdarrn_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:301: undefined reference to `nf_put_vara_double_'
POUT_netcdf.o: In function `wrtrarrn_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:318: undefined reference to `nf_put_vara_real_'
POUT_netcdf.o: In function `wrtiarrn_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:330: undefined reference to `nf_put_vara_int_'
POUT_netcdf.o: In function `wrtcarrn_':
/nfs/06/ced0013/modelE/model/./POUT_netcdf.F:342: undefined reference to `nf_put_vara_text_'
gmake[1]: *** [/nfs/06/ced0013/modelE/decks/E003smg_bin/E003smg.exe] Error 2
gmake: *** [gcm] Error 2
Checked to make sure that the libraries on the OSC computer match the parameter defined in ~/.modelErc; the version of NetCDF checked out. Looking at the environment variables set by module load netcdf, the link flags included both -lnetcdf and -lnetcdff. This second library is not included in the file Rules.make. Editing this file to include -lnetcdff before -lnetcdf resulted in a successful compile. The second library contains the Fortran routines for netCDF; the routines are implemented as C subroutines and then mapped to Fortran interfaces. The changed section of Rules.make is as follows:
#
# Check for extra options specified in modelErc
#
ifdef NETCDFHOME
ifeq ($(MACHINE),SGI)
LIBS += -L$(NETCDFHOME)/lib64 -lnetcdf
else
# LIBS += -L$(NETCDFHOME)/lib -lnetcdf
LIBS += -L$(NETCDFHOME)/lib -lnetcdff -lnetcdf
endif
FFLAGS += -I$(NETCDFHOME)/include
INCS += -I $(NETCDFHOME)/include
endif
Now recompile E003smg to produce netCDF output as well as use multiple processors. This will be done interactively for now so I have some results to evaluate. Once this is successful, I will move on to defining a batch job. Instructions for setting up E003smg:
Using the NetCDF output and post-processing the accumulated statistics, the following files were generated:
After 3 months of model time on the E003smg run, I did not terminate the run before the OSC 20 minute limit was hit. As a result, the run did not terminate cleanly and will not restart. Not sure how to resolve this. I am subscribing to the mailing list at giss-gcm-users-l@giss.nasa.gov; hopefully I will get a useful response.
I am going to set up a dynamic ocean model run to see what data is saved at the end of each month. The rundeck that has the Russell dynamic ocean is E001o.R; this needs to be specified when setting up the initial rundeck. Go through the following procedure to generate run E004smg, which is a modification of the previous run to get a dynamic ocean:
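Presumably the deck is generated with the SRC parameter introduced earlier (a sketch, not verified against the notes):

gmake rundeck RUN=E004smg SRC=E001o    # use the dynamic-ocean template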
Ran the dynamic ocean run out to March. Download the files and look at them later.
Looking at the data files at home with Panoply, the missing-data entries were treated as real values, which messed up Panoply's autoscaling. I contacted the author (Robert Schmunk) and he said that the data attribute within the NetCDF file should specify missing_value. I did a grep on the data files and found that only the jk files included the missing_value designation; all the other data files were missing it. I thought I would need to change the ModelE code to include it; however, I found a simpler solution. There is a collection of netCDF software tools, called the netCDF Operators (NCO), that can be run from a Linux command line. I considered installing these at OSC, but installing the precompiled packages would probably need administrator privileges. Instead I have installed them on my VirtualBox Linux installation. Putting the netCDF files into the shared directory, I can modify them with the following command:
ncatted -O -a missing_value,,c,f,-1.0e30 inout.nc
The utility program is called ncatted, which edits the attributes of a netCDF file. The option -O means overwrite the file. The option -a operates on an attribute and takes five comma-separated parameters: the attribute name, the name of the variable to change (left blank here, which applies the edit to all variables), the mode (c creates the attribute if not present), the attribute type (f for floating point), and the attribute value. The last argument is the name of the netCDF file to modify. Check the documentation to find other ways this program and the other NCO tools can modify netCDF files.
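A quick way to confirm the edit took effect (a sketch using ncdump from the standard netCDF tools, not from the original notes):

# dump only the header and look for the new attribute on each variable
ncdump -h inout.nc | grep missing_value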
Made a mistake on run E004smg: it did not include the dynamic ocean model. Not sure where the error occurred. Began a new project, E005smg, using the dynamic ocean model. The model compiled fine; however, it generated errors on the setup. The following is the error message:
--------------------------------------------------------------
--------- GCM successfully compiled ---------
--------- executable E005smg.exe was created ---------
--------------------------------------------------------------
gmake[1]: warning: Clock skew detected. Your build may be incomplete.
--------- Looks like it was compiled OK ---------
----- Saving Rundeck and other info to global repository -----
--------- Starting setup for E005smg ---------
--------------------------------------------------------------
Using settings from ~/.modelErc
CMRUNDIR = /nfs/06/ced0013/modelE/cmrun
EXECDIR = /nfs/06/ced0013/modelE/exec
GCMSEARCHPATH = /nfs/06/ced0013/modelE/cmrun
SAVEDISK = /nfs/06/ced0013/modelE/output
MAILTO = gollmers@cedarville.edu
UMASK = 002
setting up run E005smg
output files will be saved in /nfs/06/ced0013/modelE/output/E005smg
1DEC1969.rsfE050AoM20A not found in /nfs/06/ced0013/modelE/cmrun
using /nfs/06/ced0013/modelE/cmrun/1DEC1969.rsfE050AoM20A for IC only
gmake[1]: *** [setup_script] Error 1
gmake: *** [setup] Error 2
Need to find the initialization file for the ocean. Looking through the run script, it appears that the file E001o.R is set up to run a full ocean based on a spin-up run of 320 years; I would assume this is done to reach some equilibrium state. The restart file for 1DEC1969 is used instead of the initialization files AIC, GIC and OIC. The script E005smg.R was changed to comment out the restart file and to uncomment the definitions of AIC, GIC and OIC. Running setup now resulted in a read error on the file defined for AIC. I assumed it was due to an incompatible model resolution, so I looked at the other run scripts to see which initialization file for AIC is appropriate; the model resolution turned out to be fine. Looking at the PRT file in the directory E005smg, it indicates that an error was encountered reading the restart file for ISTART=8. Since the ocean initialization is similar to the script E001M20A.R, I compared the values between the two files. The values for AIC and GIC are correct, but ISTART=2 in that file. I will change the value of ISTART to 2 and see if setup works. I will also change the dates for the simulation so that it runs for 6 years rather than the century generated by default.
Looking at the results from E005smg indicates that the ocean model has problems. The conversion of the accumulated data did not complete cleanly. Looking at oij, there is only one data field (ocean potential temperature) and every value is 9.96921e+36 (the netCDF default fill value). Need to look at the ocean code and see if other fields should be recorded.
I want to get the model running on a desktop system. It seems this is best accomplished by using Ubuntu in a virtual machine. The model and generated data can be placed on a virtual drive, which can be easily backed up without affecting the Ubuntu install. From previous estimates, a century run using the 4 x 5 degree model will generate 30 GB of data. This is not practical for a virtual machine; however, a decade run of 3 GB of data would be useful for some preliminary runs. To push the limits of a DVD you could format the drive at 4.7 GB; however, if 1 kB is taken as 1024 bytes, then the size of a DVD is really 4.38 GB (4.7 x 10^9 bytes / 1024^3). Therefore, the virtual disk will be formatted to 4.38 GB, which will give a model run of 13 years. If the model is installed on the primary virtual drive, then the secondary virtual drive could contain only generated data, not run files and initialization files; this would extend the amount of data to 14 years. It is probably best just to put everything on the virtual data drive. Using VirtualBox, the drive was formatted to 4.32 GB, or 4,637,851,648 bytes.
Formatted the virtual drive and installed the model and initialization files. Ran make config to generate the file .modelErc. Edited this file to match the location of the model resources. Installed gfortran.
Will need to change the Rules.make file to compile correctly with gfortran.
Begin documenting the Russell ocean model. This document is at OceanModelNotes.html.
The model needs to be documented.
While looking through the ocean model, it seems reasonable to perform a qflux run to see what data fields are reported. There should be a sea surface temperature, but also a mixed layer depth.
Set up run E006smg using the script E001q.R, which is the q-flux model. Follow the ModelE procedure on OSC described above.
Over the last couple of days I have looked at how best to use R for converting initialization files into NetCDF format. I finally got it figured out, but have not gone much further yet.
Today I read through the main loop of modelE. One thing I noticed is a parameter KCOPY. This parameter determines which files are written during execution. The following lists the possible values:
Have added information to the file OceanModelNotes.html about file formats and the procedure for modifying E001.R to include a dynamic ocean. To test out my procedure I have set up the run E006smg; I modified E006smg.R accordingly and began a test run. The setup ran fine. Now I need to run it out to see if the acc and rsf files contain ocean parameters. The size of fort.1 for this run is 41.2 MB, while a non-dynamic ocean run has a file size of 29 MB, i.e. 42% bigger. When converting the acc file to NetCDF format, I get an error that the program stopped in POUT_netcdf.f. Looking at the generated files, the file DEC1949.otjE006smg has nothing in it; this may indicate where the routine is failing to write.
It looks like the subroutine io_rsf() handles all of the reading and writing of the acc and rsf files.
Need to look at POUT_netcdf.f to see where the program is failing to either read the acc file or to write the output file.
Received the restart file for the dynamic ocean and 20 layer atmosphere model. This includes a more resolved stratosphere, which probably will allow me to include the effects of aerosols better. To generate some useable statistics I will start a seventh run that is similar to E005smg, but will now use the restart file. I will use the rundeck E001o.R with modifications to the POUT call.
After running E007smg for a month of model time, I tried to generate the post-processing diagnostics with pdE. As before, I am getting a stopped process in POUT_netcdf.f. It seems that this subroutine does not take the dynamic ocean model into account: it opens the appropriate files, but then it stops.
I generated another run called E008smg. This run is the same as E007smg (using the restart file from the NASA GISS site); however, in this run I use POUT as is. This will generate a post-processing file in the GISS format. I should be able to figure out the file format and use it as is, or generate NetCDF files from it. Have run this model out for three months and generated the post-processing diagnostics. Everything seems to be fine.
I did notice on both runs (E007smg and E008smg) that a problem is flagged in the snow routine: water is not conserved. I assume that the routine corrects for this, but I am not certain.
Look at the messages recorded in E007smg.PRT and E008smg.PRT to see where the water conservation problem occurs and whether any corrective measures are taken by the routine.
After post-processing the E008smg files, I looked at the DEC1900.ojlE008smg file in a hex editor. It is now clear what the GISS file format is. Each record is as follows:
Looking through the file ODIAG_PRT.f, the subroutine for saving to the ojl file was found. It sets up the data to be saved and then sends it to the subroutine POUT_JL in the file POUT.f. The statement calling this routine is:
      IF (QDIAG) CALL POUT_JL(TITLE,LNAME,SNAME,UNITS,2,LMO+1,XJL
     *     ,ZOC1,"Latitude","Depth (m)")
The variable XJL is the data, and the values 2 and LMO+1 are integers. Looking at POUT_JL we get the following:
      WRITE (iu_jl) TITLE,JXMAX,KLMAX,1,1,
     *     ((REAL(XJL(J1+J-1,L),KIND=4),J=1,JXMAX),L=1,KLMAX)
     *     ,(REAL(XCOOR(J),KIND=4),J=1,JXMAX)
     *     ,(REAL(PM(L),KIND=4),L=1,KLMAX)
     *     ,REAL(1.,KIND=4),REAL(1.,KIND=4)
     *     ,CX,CY,CBLANK,CBLANK,'NASAGISS'
     *     ,(REAL(XJL(J,LM+LM_REQ+1),KIND=4),J=J1,JM+3)
     *     ,((REAL(XJL(J,L),KIND=4),J=JM+1,JM+3),L=1,KLMAX)
Adding this all up gives 80 + 4*(4+630+14) + 4*4 + 2*1 + 8 + 4*(48+42) = 3,058 bytes per record.
The file format for ij, il, and jl is now figured out. It is now possible to write a program or set up an R script to convert this data into NetCDF format. The following are the three files and the format of each of their records.
      WRITE(iu_ij) TITLE,REAL(XIJ,KIND=4),REAL(XJ,KIND=4),
     *     REAL(XSUM,KIND=4)
      WRITE (iu_il) TITLE,IM,KLMAX,1,1,
     *     ((REAL(XIL(I,L),KIND=4),I=1,IM),L=1,KLMAX)
     *     ,(REAL(XCOOR(I),KIND=4),I=1,IM)
     *     ,(REAL(PM(L),KIND=4),L=1,KLMAX)
     *     ,REAL(0.,KIND=4),REAL(0.,KIND=4)
     *     ,CX,CY,CBLANK,CBLANK,'NASAGISS'
     *     ,(REAL(ASUM(I),KIND=4),I=1,IM),REAL(GSUM,KIND=4)
     *     ,(REAL(ZONAL(L),KIND=4),L=1,KLMAX)
Worked on a Java program to read in the GISS format and convert it to NetCDF, using the Java-NetCDF libraries and Eclipse. Development worked fairly well in the Eclipse environment; however, I was not able to get the data to save in the right format. The variable definitions worked well and the data field was recorded, but when the file was opened in any NetCDF viewer, such as Panoply, the data field was not visible. The variable description was there and the data was in the file; it just was not accessible. I followed the tutorial for the Java-NetCDF library, but I just was not able to get it to work.
I went back to R to convert the OIJ file. The format is relatively simple, and it needs to be handled as a two-pass process. The first pass pulls out the data field information, and the NetCDF file is then created with all of the variables defined. On the second pass the data fields for each variable were saved to the file. The following things were observed in this process.
Check to see what white spaces are acceptable for NetCDF character fields and see if there is a way to clean up the title when using R so there is not an underline running through the data title.
Check to see if a forward slash is acceptable in NetCDF. If so, see if an escape sequence can be used in R to maintain this character in the units field.
See if there is a way to use Java to generate NetCDF files. It would be nice to have more control over strings when generating fields. R seems a bit crippled in the string manipulation area.
June 29, 2010
Now that the ocean data field can be viewed, it is time to generate a significant amount of additional data. This will only be possible by setting up the model as a batch run, which eliminates the 20-minute limit of computer time per job imposed at the terminal.
Several observations were made on May 20th about batch jobs. Since the ocean model has been added, these may no longer be valid. Here is a modified list based on the run E008smg.
For a 6 year run the TMP directory should be 4 GB, and the run should take 79.2 hours of computer time. This is 3.3 times the time estimated on May 27th for model E001, which does not include the stratosphere or the ocean.
To more accurately determine model performance, multiple runs of 19 minutes computing time (not the most precise way of stopping the run) were performed. This was done in interactive mode and, therefore, may be affected by system limitations on this mode compared to batch mode. A similar process might be performed at the batch level to determine performance.
Using 8 processors it should then take 14.6 hours of real time, which comes out to about 1 decade per day. This compares to 85.7 hours (3.6 days) per decade in real time with one processor. This latter number may give a realistic estimate of how fast the model can run on a desktop, if it is possible to successfully compile and run it there. This would be a goal for the future.
Set up the model to compile and run on a desktop. This could be done either under Windows using Cygwin or MinGW/MSYS, or through Linux on VirtualBox. Either way, one needs to get a good compile using either gfortran or g95.
The following things need to be done in order to implement a batch job at the OSC:
The batch script needs PBS header lines. The relevant information from the web page http://www.osc.edu/supercomputing/training/customize/docs/batch/batch_pbsheader.shtml is as follows.
Here is an example PBS script. It contains the PBS commands followed by the shell commands normally given at the terminal. A similar script needs to be generated for running the climate model.
#PBS -N nest
#PBS -l walltime=00:05:00
#PBS -j oe
#PBS -S /bin/ksh

set -x
cd $PBS_O_WORKDIR
gcc nest.c
cp a.out $TMPDIR
cd $TMPDIR
./a.out 459 121
Here is a similar script that reports timing for the program execution.
#PBS -l walltime=00:01:00
#PBS -N nest
#PBS -j oe
#PBS -S /bin/csh
#PBS -m e

set echo
cd $PBS_O_WORKDIR
gcc nest.c -lm
cp a.out $TMPDIR
cd $TMPDIR
echo "New run "
time ./a.out 10000 10000
echo "New run "
time ./a.out 15400 10032
The use of environment variables in a batch job is described in http://www.osc.edu/supercomputing/training/customize/docs/batch/batch_envvar.shtml. Here are some comments on how this may affect the climate model:
mkdir /tmp/$USER      # Create your own temporary directory.
cp files /tmp/$USER   # Copy the necessary files.
cd /tmp/$USER         # Move to the directory.
...                   # Do work (compile, execute, etc.).
cp new files $HOME    # Copy important new files back home.
cd $HOME              # Return to your home directory.
rm -rf /tmp/$USER     # Remove your temporary directory.
exit                  # End the session.
Looking through the scripts runE and sswE and the Makefile commands, it appears that the physical location of the model could change as long as the environment variables are changed accordingly. As a result, this seems like a good time to make a fresh start. I will eliminate all of the previous runs and set up the file .modelErc to reflect a different directory structure. The structure is as follows (this is reflected in the file .modelErc):
This will place the compiled program and run output into a directory within cmrun. When a batch job is initiated, the file .modelErc will need to be changed so that every occurrence of the string $MODELDIR is replaced with $TMPDIR. A copy of this file with the appropriate changes will be called modelErcBatch.
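A sketch of generating the batch copy (the substitution is exactly the one described above; file locations are my assumption):

# substitute $TMPDIR for $MODELDIR and save the result as modelErcBatch
sed 's/\$MODELDIR/$TMPDIR/g' ~/.modelErc > ~/modelErcBatch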
Since the results from previous runs have been copied to a local machine, the runs will be wiped from the OSC account. The model code will not be removed, and the fixed.tar.gz data will be kept. The initialization file for the ocean run will also be retained.
Once the model runs for the July conference are completed, the code and results should be backed up and saved to a DVD. Next, all of the model code and output on the OSC machine should be cleared out and replaced with up-to-date code obtained through CVS. At this point, model runs should be registered through the NASA GISS CVS site.
The following steps were performed to restructure the model on the OSC machine.
MODELDIR=$HOME/modelE
export MODELDIR
Now begin a new run. Name this one E010smg and get the ocean model by using E001o.R as the source deck. Follow the procedure given on June 1st. Once the model is compiled and the startup is successful, use the following shell script to make sure the model can be moved to a new location and run successfully.
Problem: Can't get the environment variable $MODELDIR to be interpreted correctly. For the Makefile to work it needs to be written as $(MODELDIR). This works up through the compile; however, when doing setup it fails. The variables get defined literally, without substitution of the environment variable, whether or not it is in parentheses. This may be an issue with which shell the model is run under. Need to look into this. If it is not a shell issue, then I may need to define each of the environment variables directly and then place a dummy .modelErc file into $HOME so it does not redefine them.
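A minimal illustration of the mismatch (my own sketch, not from the original notes):

# in the shell, $MODELDIR and ${MODELDIR} both expand the environment variable
echo $MODELDIR
# in gmake, a variable must be written $(MODELDIR) or ${MODELDIR};
# a bare $MODELDIR is parsed as $(M) followed by the literal text ODELDIR.
# gmake does import environment variables (which is why the compile works),
# but setup reads .modelErc through its own Perl script with its own rules.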
Looking through documentation for environment variables in a shell versus a makefile, it seems that the syntax is not identical. As a result, I don't see a method for placing an environment variable in the file .modelErc that will work; it seems to want an absolute reference. This will not work when I go to a batch job, because I need to reference $TMPDIR. The solution I will try is to move all of the variables in .modelErc to the environment; that way they can be declared once and should work for both the makefile and the running of the model. Had to make the following modifications.
Using these changes results in the environment variables not being set. There needs to be a different method. I will need to remove the changes and see what can be done.
I contacted Dr. Gavin Schmidt at GISS and he included someone else in the email response. Essentially he said that each system is unique and the makefile needs to be adjusted. I will try two more things before I contact Schmidt again. The first is to replace the environment variable with the .. directory designation. If this doesn't work, then I will try to modify the Perl script to see if I can get it to parse the environment variable correctly.
This process worked in interactive mode. Now it needs to be run as a batch job. Run E010smg is cleaned out and repeated as a batch job. The following is the process for running the batch job.
Had a problem with the #PBS mem directive. Have eliminated it and tried again.
Returned batch name is 3469115.opt-batch.osc.edu
The batch job failed. Removed the mkdir commands and just did a copy of the ~/modelE directory. Resubmitted the job, which is named 3469123.opt-batch.osc.edu. This one also failed; there seems to be a problem with getting the directories set up on $TMPDIR. Will try different variants and give the final form when done. Accidentally deleted the pbs file; here is the current modified version:
Changed the pbs script to copy everything back to modelE/cmrun. Also redirected the runE messages to E010smg.runoutput. Hopefully this will indicate what is going wrong with the batch job.
In an attempt to streamline the process, I am calling E010smg without using runE. This removes the nohup and nice calls. However, to use multiple processors, two environment variables need to be set. They are the following:
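As used in the working PBS file below, they are:

export MP_SET_NUMTHREADS=4
export OMP_NUM_THREADS=4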
With this change I was able to get a 2 month run to come to completion in 2 hours of computer time. The following is the pbs file that worked.
#PBS -N E010smg
#PBS -l walltime=03:00:00
#PBS -l nodes=1:ppn=4
#PBS -l mem=2GB
#PBS -j oe
#PBS -m e

export MP_SET_NUMTHREADS=4
export OMP_NUM_THREADS=4
cd $TMPDIR
mkdir E010smg
cp $HOME/modelE/E010smg/* ./E010smg
mkdir bcic
cp -r $HOME/modelE/bcic/* ./bcic
cd $TMPDIR/E010smg
ulimit -s 34768
./E010smg > ../E010smg.runoutput
cd $TMPDIR
cp -r ./* $HOME/cmrun
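To submit the job (assuming the script is saved as E010smg.pbs; the job names returned earlier follow the same pattern):

qsub E010smg.pbs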
This can be trimmed down a bit more by only copying the ic and bc files needed by this model run. Will need to check the generated files to make sure the results look reasonable. If so, then a 6 - 10 year run can be set up.
Since I had a successful batch job, I need to verify that the data looks reasonable. The acc files were converted using pdE and downloaded. An R script was run to convert the oij fields into NetCDF format. This run was then compared to the results from the interactive run.
The fields match up to within rounding errors due to calculation precision. I now want to run the model for 6 years. This will take 3 days of computer time; give it 4 days so there is room for error, and use 8 processors. This run will still be called E010smg.
The run completed successfully at 5:44 on July 9th. The job number was 3491574. It used 119:52.29 of CPU time, 276.792 MB of memory, 1,137.200 MB of virtual memory, and 15:14:35 of walltime. This comes out to 5 days of computer time; there are evidently some additional inefficiencies that I did not account for when running 8 processors.
Statistics need to be generated for the reference run. The following is the process for generating a yearly, seasonal and monthly averages.
To complete the documentation on the first run it would be good to list the boundary and initial condition files that are used in the dynamic ocean run. Several of them are not used because they are replaced by the restart file, but they are included in case a run from scratch is desired. The list is as follows:
Now the restart file needs to be explored. In order to do a sensitivity run it will be necessary to identify the proper fields to change. Initially I would like to try increasing the temperature of the water at the bottom of the ocean. This could be done in two different ways.
In both of these cases it should be possible to see how the added heat redistributes itself. I anticipate the heat to move upward, bringing with it increased vertical motion; this will also result in some regions developing a down-welling motion.
The file format of the RSF file is explored in the file OceanModelNotes.html. Modules are called by the subroutine io_rsf, defined in the file IORSF.f. The model values are saved to an rsf file within the file MODELE.f while running the main program loop. The subroutine io_rsf uses a flag iaction to determine whether it reads or writes a restart file.
There are many module calls to save values to the restart file. Each module labels the data it saves using a character string. Once all of the restart parameters are saved, the current status of the accumulated variables is saved; the accumulated variables will be saved in the acc file at the end of the month, under the label DIAG. The following labels are useful for changing the ocean variables for the sensitivity run. The file 1DEC1969.rsfE050AoM20A is used for locating the following labels:
The last three fields should also show up in the acc file. Look at an acc file and determine the size of these fields.
Comparing the original restart file with the model restart files, there is a discrepancy of 412 bytes in their lengths. Looking more closely, the following labels and values are compared between the two restart files.
Once we get to MODEL01, the shift of 412 bytes is the same throughout the file.
It looks like the ocean parameters can be changed without affecting the rest of the restart file. Therefore, the difference in sizes should not be a problem as long as the right start point is determined. A Java program can be written to find the start position; it can then step through the different data fields to find the one to be changed, and that field can be loaded in and modified. Treat this like a copy command where input data is saved to another file; when the appropriate field is found, substitute the modified field in place of a straight copy. Next we need to determine the fields saved in the OCDYN01 module, which has a size of 3,842,008 bytes. The OCSTR01 module deals with the straits between larger bodies of water and should not be affected immediately by changes to the water temperature in the larger ocean. The following information about OCDYN01 is present in OceanModelNotes.html and is elaborated on below. (Records are saved by io_ocdyn in OCNDYN.f; this subroutine is called by io_ocean, which is the standard call for all ocean models, both prescribed and dynamic. The analysis was done in RSF_ACC_Fields.xls.)
Now that the structure of the rsf file is known, a Java program was written to read through the records and extract the ones you want. This program was developed using Eclipse and is called GISSReader. It looks like the program works, but the values for salt and enthalpy seem large; they need to be considered relative to the amount of water present. The mass of water is given in kg/m^2, and each level of water has a different thickness. From Russell, Miller and Rind, 1995, A Coupled Atmosphere-Ocean Model for Transient Climate Change Studies, the layers are as follows. (These values were calculated using the average density for each level and the average mass per level from an ACC file; see SpecificEnthalpy.xls.)
Trying to track down how potential temperature is calculated for the ocean. Calculation of the ocean temperature is initiated from subroutine OIJOUT in file ODIAG_PRT.f. The calculation is performed as follows:
The next step is to change the potential enthalpy of the ocean layers and see how it translates into a change in ocean temperatures. Looked at the specific enthalpy for the bottom and the top of the ocean (see SpecificEnthalpy.xls). At the surface, the specific enthalpy is at 84% of the maximum allowed value; at the bottom of the ocean it is at 7% of max. Both top and bottom are within 60% of the minimum value. It looks like the best choice for warming the ocean is to increase the specific enthalpy by 10%. This should prevent any inconsistencies in the moments of the enthalpy and should result in a significantly warmer ocean. Since we are dealing with specific enthalpy, the change in enthalpy for any particular layer will need to be mass weighted.
A more precise calculation of layer thickness was attempted by taking the layer mass (from an ACC file) and dividing it by the average layer density (from the PRT file). Those values are recorded with the previous day's entries of level thickness. Either way, the values are within 1% of each other.
Today's project is to increase the potential enthalpy in the restart file and see how it affects the potential temperature of the ocean in the ACC file. Modify the GISSReader program to copy a file until the appropriate record is found and then add a set amount of enthalpy to each layer of the ocean. Each layer holds a different amount of enthalpy because layer thickness changes with depth. The Java code was rewritten to make it easier to access different data fields in the future. The OCDYN01 record is loaded in as double arrays for each variable. The variable G0M is changed by adding an amount of specific enthalpy that is 10% of the largest value at the surface. In order to verify that the changes are correct, only the OCDYN01 record was saved for both the original RSF file and the modified one; these are called DEC1969.ocndyn and HotEnth.ocndyn respectively. They were converted to NetCDF files using the R script OCDYN2NCDF.R. Enthalpy values for non-ocean grid points were originally zero, and this was maintained in the modified file. By shifting all of the specific enthalpies by the same amount, it is hoped that there will not be any significant effect on the values of GXMO, GYMO and GZMO (the moments of the enthalpy). When running the model, it is expected that the potential temperature of the whole ocean will be increased, with a more pronounced change in the deep ocean. There should also be an impact on sea surface temperatures and sea ice amounts.
The modified enthalpy file will be uploaded to OSC and a new run called E011smg will be initiated. A test run of one month will be performed to look at initial statistics. If this run looks good, E011smg will be rerun as a batch job for six years.
Problem encountered during setup. The error message is PBL: Ts out of range. I think the gradient between the sea surface temperature and the atmosphere might be too large. There are two possible solutions: decrease the change in the sea surface temperature, or leave the top layer unchanged. I will attempt the first and see if a month of model time can be generated. Keeping the top layer unchanged allowed the setup to work; however, during the initialization of the standard run (beyond the 1st hour), the same message was generated. I have now kept the whole ocean the same except the bottom layer, which allowed a run to go for 1 month. I will look through the ACC file and see what the ocean looks like due to this change.
It was possible to run with only the bottom layer increased in enthalpy; however, it didn't show much change, since the increase in enthalpy was set at 1% instead of 10%. The run worked and showed about a 0.3 C increase in temperature for the tropics, but a 19 C increase at the poles. No matter the layer, the north pole values were extreme. It is clear that the way enthalpy is increased is not right: the increase needs to take into account not only layer thickness but also latitude (grid cells cover less area toward the poles).
Looking at OGEOM.f, the weighting of area by latitude is calculated with the formula DXYPO(J) = DLON*RADIUS*RADIUS*(SINVN-SINVS). Each of the variables in the formula has the following values:
The calculated area by latitude is contained in SpecificEnthalpy.xls. At this point the Java program for modifying the enthalpy will use a specific enthalpy value of 135,200 as a base, which comes to 84% of the maximum allowed value. This value will be increased by 10%, meaning 13,520 will be added to every ocean specific enthalpy. To convert this to the potential enthalpy for each ocean grid point and layer, the value added will be 13,520*MO(i,j,l)*area(j).
Tried to modify the enthalpy as described above. There were some problems when checking the enthalpy field in OCDYN01; after several tries, the Java program was debugged. A new RSF file was generated with specific enthalpy increased by about 10%. A model run of E011smg was performed and the OIJ fields were inspected. The potential temperature in the ACC file ran about 0.3 to 4 C greater than in the ACC file for the control run. It looks like there is a change in salinity due to sea ice melt. The values for the one-month run seem reasonable. A batch job was set up and initiated; its job number is 3539120. Because the job was submitted at 1 AM Monday morning, it got off the queue very quickly. If this run takes 15 hours like the first one, it should be done by 6 PM tonight.
While waiting for the batch job to complete, R scripts need to be written to generate NetCDF files from the model's accumulated statistics. The following files are generated by pdE; some are text files and others are binary. The binary files will be converted to NetCDF format.
The batch job finished at 4:30 pm on July 19. The program pdE was used to generate diagnostic statistics on the resulting data. The process described on July 13, 2010 was used to generate annual and yearly seasonal averages; the yearly seasonal averages were done with the batch file batchconvert. Taking a preliminary look at the data, it looks like the run was a success. The ocean temperatures at 6 years are cooler than the initial values, but warmer than the reference run.
It is time to make a poster and presentation based on the model run. Comparisons will be made between the 6 year winter average of the reference run and both the beginning and ending winters of the modified run. The ending winter will also be compared to the ending summer, to see how large the seasonal cycle is relative to the perturbation of the model.
The temperature comparisons are done as a zonal average. This gives some sense of where the mixed layer ends and the deep ocean begins. Comparing the beginning winter with the reference, there is a nearly uniform 3.2 C increase in potential temperature due to the 10% increase of specific enthalpy relative to the surface. The plots generated for comparison are the reference, the beginning winter, the ending winter, the difference between ending winter and reference, and the difference between ending winter and summer. When making these plots, the following parameters are set in Panoply to make the plots look uniform.
Differences between the winter and summer are constrained to the top 3 layers of the model and range from -5.6 to 4.5 C. This layer depth translates into about 57 meters. Changes below this depth are in the range of 0.1 to 0.2 C.
The surface velocities are plotted for the reference winter and the modified 6th winter.
The ocean boundary layer depth is used by the K-Profile Parameterization (KPP) scheme. This scheme predicts an ocean boundary layer depth and then parameterizes mixing within it using a nonlocal bulk Richardson number and similarity theory of turbulence. The alternative means of calculating vertical mixing is the Pacanowski and Philander (PP) scheme. For a comparison of these schemes and their use, see Li, X. et al, 2000: A Comparison of Two Vertical Mixing Schemes in a Pacific Ocean General Circulation Model. J. Climate.
Continued to generate plots for the conference poster session.
This streamfunction is a vertically integrated relationship that determines the meridional mass transport. The units are Sverdrups (Sv), where 1 Sv = 10^6 m^3/s. According to Marshall and Plumb (Atmosphere, Ocean and Climate Dynamics, p. 215) the Amazon river has a volume transport of 0.2 Sv; from Wikipedia, the flow of fresh water from all the rivers of the world totals about 1 Sv. Regions where the volume flow is largest have no zonal flow. Regions with a high northward gradient of this streamfunction mark locations of very large flow from east to west; likewise, if the gradient is southward, the flow of the ocean gyre at that location is west to east. Regions of high streamfunction value occur near the westerly portion of the ocean basin.
The vertical mass fluxes were compared at levels 3, 6, and 9. The differences between the reference, beginning and ending winters are not significant for levels 3 and 6. However, for level 9 there is a noticeable increase in the vertical mass flux near the Arctic and Antarctic, most likely due to enhanced melting at the poles.
There are 9 data fields linked to this quantity. GM stands for the Gent-McWilliams scheme for reducing climate drift in coupled ocean-atmosphere models. This scheme is an ocean eddy parameterization for sub-grid processes. It is assumed that the eddy fluxes are quasi-adiabatic in the ocean interior and can, therefore, be represented as an eddy-induced velocity (see Ferrari, Eddy-mixed layer interactions in the ocean (MIT)). The picture this scheme leads to is that the creation of eddies depletes the energy source and, therefore, reduces the slope of the isopycnal surfaces. Once an eddy is formed, it slides along the isopycnal surface and in the process mixes temperature and tracers on the grid scale (see Neelin and Marotzke, Representing Ocean Eddies in Climate Models (Science)). This scheme does not work in the boundary layer of the ocean, where strong diabatic processes are going on; this leads to the need for tapering off the GM scheme and implementing a different one, such as the KPP scheme mentioned above. Through correspondence with Gavin Schmidt, the vertical heat flux is positive in the upward direction. When it is used in the subroutine GMFEXP, the flux (RFZT) is added to level L and subtracted from level L+1 (which is deeper in the ocean). Therefore, a positive value for the flux will add heat to an upper level and remove it from a lower level.
Looking at levels 3 and 6 of the vertical mass flux, there does not seem to be any significant difference between the control and modified runs.
This plot is a vector plot of the N-S and E-W heat fluxes integrated over the depth of the ocean model.
The integrated salt flux was compared between the control and the other runs. There is an increase of flux around Antarctica; however, this is nothing different from the increase in heat flux and surface velocity in this location. The salt flux increases by 27%, probably an indicator of increased deeper ocean circulation around the continent. No plots were generated.
Looking at the vertical profile of salinity shows little change at greater depths; any changes are in the upper 4 levels. The salinity is compared at several levels.
The vertical potential temperature profiles for the annual averages do not show much difference. Several levels, and their differences, are plotted.
The ocean surface height was compared between the reference and modified runs. The warmer ocean does give an increased surface height; on average the increase corresponds to about 2 meters.
Continued working on the poster for the CGS meeting next week.
While at the conference, I was able to rename the variables in the IJ file to match how the netCDF routines in ModelE do it. I was also able to successfully generate an OIL file; if the second dimension is relabeled to lat, Panoply reads and displays it, although it assumes the data are based on geographic coordinates.
Need to find a program other than Panoply to display NetCDF files. This can be done in R, but look for other options before developing a series of R scripts to do plotting. One possibility is NCL (NCAR Command Language).
Installing ModelE on a Linux virtual machine. Place all of the relevant files on a secondary drive. When using it, the .modelErc file needs to be copied to the home directory of the user. The data files will be saved on the secondary drive. Use the relative addressing scheme from the batch job runs on the OSC computer.
Trying to get ModelE to run on a Linux desktop. To get a successful compile, the netcdf calls in Rules.make were removed. Also, the function iargc is an intrinsic in gfortran (a holdover from g77). To get this to work, the file MODELE.f needs to have all declarations of iargc removed, and the function call iargc() needs to be replaced with command_argument_count(). The program compiled; however, when startup was run for the first month, a segmentation fault was received. The value of ulimit was expanded, but that had no effect. The memory of the virtual machine was also pushed up to 1.5 GB, but there was still a segmentation fault.
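A sketch of the call replacement as a one-line edit (my own suggestion; the declaration lines for iargc still have to be removed by hand):

# replace the calls in place; declarations such as INTEGER iargc
# must still be deleted manually
sed -i 's/iargc()/command_argument_count()/g' MODELE.f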
Try the following to get ModelE running on a Linux Desktop.
The second option (compile and download to the desktop) did not work as-is. There may be some issues with the shell script that calls the binary, but there does seem to be a significant problem that needs to be explored if the other option does not pan out.
I looked at the resource usage at OSC. Of the 5000 allocated units, 24.54 are used; the balance as of today is 4975.35768. Compare this balance with the balance at a later time to see if general use of the account reduces the RU balance. Most if not all of this usage is due to the two batch jobs, which each ran ModelE with a dynamic ocean for six years. Using this same model, it is then expected that each simulated year will cost about 2 RUs. That means that with the original allocation, a combined simulation of 2,500 years could be performed. Since some drift was observed in the model using the 1DEC1969 restart file (supposedly already run for 300 years), the folks at GISS suggest a total of 500 years of spin-up, i.e. an additional 200 years, which would use 400 RUs. Before this is done, the most recent version of ModelE should be downloaded using CVS.
Before downloading the most recent version of ModelE, the limits of the ocean model should be tested. An increase of 10% to specific enthalpy resulted in an average ocean temperature increase of 3 C. How much more can the ocean be warmed before it causes a problem with the model?
At this point, let's see if gfortran can be used to run the model on the OSC machine. Here are the changes made to the ModelE files to see if a compile can be accomplished.