I. Model Structure


The coupled model consists of a few directories and shell scripts. Go to /home/work1b/hseo/Couple_RsmRoms_tutorial on COMPAS.

1. Lib

a. codes: contains all the Fortran codes used in the coupler

b. exec: contains executables.
1) Coupler: Fortran executables (.x); these are copied here after compiling the .F files in codes. In the codes directory, use comp.sh for single- or dual-processor builds and comp2.sh for MPI builds. I use the PGI Fortran compiler; I have not tried any other.
2) ROMS: contains the ROMS input and launch files necessary for running ROMS on COMPAS, under the grid name indian25/
3) RSM: contains the RSM run scripts under each grid name
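As an illustration only, a comp.sh-style build step for one coupler source file might look like the sketch below. The file name coupler.F is a hypothetical placeholder, and the sketch only prints the compile command (a dry run); the real comp.sh and comp2.sh in Lib/codes are the authoritative build scripts.

```shell
# Illustrative sketch of a comp.sh-style build step for one coupler code.
# SRC is a hypothetical file name; the real scripts live in Lib/codes.
FC=pgf90                 # PGI Fortran compiler, as used in this tutorial
SRC=coupler.F            # hypothetical .F source file
EXE=${SRC%.F}.x          # coupler executables carry a .x suffix
CMD="$FC -o $EXE $SRC"
echo "$CMD"              # dry run: print the compile command, do not run it
```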

c. aux-files: other auxiliary text files needed by the coupler.

d. grids: contains the grid files (*.dat, *.nc); I use a separate directory for each application. This directory is created on your local machine after you set up the domain (see 1.3 below) and copied to COMPAS:
indian25/Coupler
indian25/matlab
indian25/ROMS
indian25/RSM
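Copying the grid directory up to COMPAS can be sketched as below. This is a dry run that only prints the transfer commands; the user name hseo and host alias compas are placeholders taken from the tutorial path, so substitute your own account and host.

```shell
# Dry-run sketch: print the commands that would copy the locally created
# grid directory for one application up to COMPAS.  The user name and
# host alias are placeholders.
APP=indian25
DEST=/home/work1b/hseo/Couple_RsmRoms_tutorial/Lib/grids/$APP
for sub in Coupler matlab ROMS RSM; do
  echo "scp -r grids/$APP/$sub hseo@compas:$DEST/"
done
```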

2. Model: RSM, ROMS and other model-related files

a. RSM: lib is shared among the different applications since it is independent of the domain and grid size, but separate rsm and runs directories are needed for each application. For installing RSM on COMPAS, see http://g-rsm.wikispaces.com/

1) lib: shared among all the applications.
2) rsm: one directory per application, following the convention application_name/rsm_#cpu. For example, indian25/rsm_32cpu is the rsm directory for the indian25 application using 32 CPUs. You will have to compile RSM according to the number of CPUs and the grid information (determined later, when you create the domain).
3) runs: same convention, application_name/runs_#cpu; e.g., indian25/runs_32cpu. Run configure-model there and cp the rsim script to Lib/exec/RSM/indian25. See my sample rsim script and change yours accordingly (the changes I made mostly concern the SST_ANL part of the script; search for "hyodae" in it):
Lib/exec/RSM/indian25/rsm_32cpu_day_indian25

b. ROMS: To install ROMS, see http://myroms.org/
I used ROMS 2.1, but there is a newer ROMS 3.0.
Compile the code by choosing the makefile for Linux and the mpgi90 compiler.
You should already know how to set up ROMS for your application; that is not discussed here. Refer to the webpage above.
After you compile the code (see the checklist below for compiling ROMS), you will have a single executable called something like oceanM. I usually rename it to something like oceanM21_nobulk, indicating MPI ROMS version 2.1 with no bulk parameterization (if you do use bulk parameterization, call it something like oceanM21_bulk). Link it to Lib/exec/ROMS/indian25:

ln -fs $your_ROMS_source_code_directory/oceanM21_nobulk Lib/exec/ROMS/indian25
Place your ocean.in file in Lib/exec/ROMS/indian25 as well. For ROMS 2.1 the sample looks like
Lib/exec/ROMS/indian25/ocean32_roms21_day_indian25.in
(32 CPUs, ROMS 2.1, daily coupling, and the application name followed by the resolution)

c. misc/indian25: contains all other files necessary for the run
(such as the generic, empty forcing file, the empty initial file, and sometimes your spinup solution). How you use this directory is up to you; you may store these files elsewhere, as long as you point to them correctly in your main.sh file. Run create_bulk_forc_init.m on your local machine to create the netcdf files, then sftp the resulting netcdf files into this directory (see II.4).
Also place your spinup initial condition file here. In this example it is called roms25-indian25_spinup_Jan3.nc: the end state of an 8-year forced ROMS run with COADS forcing and Levitus boundary conditions. To create such an initial condition, you can use my sample code coldstart_init.m (originally from Manu's toolbox).
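The upload step can be sketched as the dry run below, which only prints the transfer commands. The forcing and initial file names (forc_empty.nc, init_empty.nc) are hypothetical placeholders, as are the user name hseo and host alias compas; only the spinup file name comes from this example.

```shell
# Dry-run sketch: print the transfers for the netcdf files built by
# create_bulk_forc_init.m plus the spinup initial condition.
# All file names except the spinup file are hypothetical placeholders.
DEST=/home/work1b/hseo/Couple_RsmRoms_tutorial/Lib/misc/indian25
FILES="forc_empty.nc init_empty.nc roms25-indian25_spinup_Jan3.nc"
for f in $FILES; do
  echo "scp $f hseo@compas:$DEST/"
done
```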

3. Shell: All the necessary shell-scripts

I usually make a directory for each application and copy the shell scripts.
See, e.g., /home/work1b/hseo/Couple_RsmRoms_tutorial/Shell/indian25.
You may change each script according to your purpose.

Read the shell scripts, starting with main_couple.indian.1d.sh in the home directory.
After preparing all the necessary steps, it runs
couple_Nday_indian25.sh (N=1 in this case; a coupling frequency of 1 day).
Basically it does the following:
while day < end_day
run RSM,
call Rsm2Roms_nobulk_indian25.sh
call prepareROMS-indian25.sh
then run ROMS
call Roms2Rsm.sh (feeds the ROMS SST back to RSM)
end loop
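The loop above can be written as a minimal shell sketch. The day arithmetic and the END_DAY value are illustrative only, and the actual model launches are replaced by echo placeholders.

```shell
# Minimal dry-run sketch of the couple_Nday loop (N = coupling frequency
# in days).  Real model launches are replaced by echo placeholders.
N=1
DAY=1
END_DAY=5
while [ "$DAY" -lt "$END_DAY" ]; do
  echo "day $DAY: run RSM"
  echo "  sh Rsm2Roms_nobulk_indian25.sh"   # atmosphere -> ocean fields
  echo "  sh prepareROMS-indian25.sh"       # assemble the ROMS inputs
  echo "day $DAY: run ROMS"
  echo "  feed ROMS SST back to RSM"        # Roms2Rsm.sh step (see Run/SST)
  DAY=$((DAY + N))
done
```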
Other shell scripts:
uauo.sh: used only if you choose TOGA_COARE and UaUo.
putfile.sh is needed for sftping the model outputs from COMPAS to your local machine.
post.sh calls the following shell scripts.
a. summary3D.sh makes a netcdf file for the 3-D ROMS variables (like zeta), writing the field at each time step into a single netcdf file. You can check this file with Matlab or NCView to see that everything is all right.
http://meteora.ucsd.edu/~pierce/ncview_home_page.html
b. summary4D.sh does the same for the 4-D variables, like temp or salt.
c. summary_4Dto3D.sh reads, e.g., SST from the 4-D temperature field and writes it to a 3-D netcdf file. Rsm_Clim3hr is obsolete. Explanations for each shell script follow later.
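As a rough illustration of what summary_4Dto3D.sh does conceptually, the surface level of the 4-D temperature can be pulled out with an NCO command like the one printed below. The tool (ncks), the vertical dimension name (s_rho), and the level count are assumptions about a typical ROMS setup, not the actual contents of the script.

```shell
# Dry-run sketch: extract the top vertical level of temp (i.e., SST)
# from a 4-D averages file into a 3-D netcdf file.  ncks, s_rho, and
# NLEV are assumptions about the setup, not taken from the real script.
NLEV=20                 # example number of ROMS vertical levels
TOP=$((NLEV - 1))       # surface index (0-based)
CMD="ncks -v temp -d s_rho,$TOP avg.nc sst.nc"
echo "$CMD"             # print the command instead of executing it
```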

4. Run: stores the model outputs on COMPAS

a. RSM: stores the RSM outputs
b. ROMS: stores the ROMS outputs:
1) Avg: daily avg.nc files. I use the averages file instead of the history file.
2) Dia: if you choose the diagnostics option in the ROMS cppdefs.h, the diag files are stored here
3) Forcing_files: daily forc.nc files
4) Misc: summary files are located here
5) Run_Log: ROMS log text files
c. SST: daily SST grib files built from the ROMS SST (by Roms2Rsm.sh).
There are two types of SST grib files:
1) sstgrb:1993:01:02:00
2) SST_NDay/sstgrb_Day1

5. Grid_Setup: Information about the grid setup, for tutorial purposes.