<p><b>duda</b> 2009-09-02 12:38:43 -0600 (Wed, 02 Sep 2009)</p><p>Update the README file.<br>
<br>
M README<br>
</p><hr noshade><pre><font color="gray">Modified: trunk/swmodel/README
===================================================================
--- trunk/swmodel/README        2009-09-02 17:00:21 UTC (rev 39)
+++ trunk/swmodel/README        2009-09-02 18:38:43 UTC (rev 40)
@@ -1,5 +1,7 @@
swmodel: A 2d shallow water model on a C-grid staggering.
+2 Sept 2009 -- Long overdue update to this readme.
+
21 May 2009 -- Added vertical dimension (currently fixed at 1) and 3d scalar array dimensioned
(nTracers,nVertLevels,nCells).
Pushed loop over blocks down into RK4 time integration; now each RK step loops
@@ -19,43 +21,90 @@
I. Code layout
- Files: swmodel.F -- Contains the main program, which essentially calls routines within other
- modules.
+ The swmodel is written in Fortran (plus one C routine), with some of the code generated by a
+ C program called the "registry". All of the Fortran code for the model is contained in the
+ src/ directory, while the registry program resides in the Registry/ directory. When the model
+ is built with the 'make' command, the registry is first compiled in the Registry/ directory;
+ make then runs the registry to generate pieces of Fortran code, which are placed in the inc/
+ directory and included in various hand-written Fortran routines in the src/ directory. After
+ the registry program has run, the main swmodel code in the src/ directory is compiled, and the
+ final executable, swmodel, is linked into the top-level directory. At present, the only
+ external package needed by the model is the NetCDF library, which is linked into the swmodel
+ executable. However, should the need arise to build other software libraries along with the
+ swmodel code, such packages can be placed in the external/ directory.
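+
+ As a schematic illustration (the include file name and type member below are assumptions for
+ illustration only; the actual names are chosen by the registry and the Registry file), a
+ hand-written module in src/ might pull registry-generated members into a derived type:
+
+        ! sketch only: state_fields.inc is assumed to be generated by the
+        ! registry and placed in the inc/ directory at build time
+        type grid_state
+           real (kind=RKIND) :: xtime   ! hand-written member (illustrative)
+ #include "state_fields.inc"
+        end type grid_state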
+
+ Briefly, the main Fortran source files in the src/ directory are as follows:
- module_test_cases.F -- Contains routines (one per test case) to initialize the model state for
- the particular case specified in the user input (namelist) file.
-
- module_sw_solver.F -- Contains the main solver loop, which advances each block of the domain
- forward in time by a specified delta-t and periodically writes model state
- to an output file.
+ swmodel.F -- Contains the main program, which essentially calls routines within other
+ modules.
- module_time_integration.F -- Contains time integration routine (currently RK4) and a function
- to compute the tendencies for prognostic variables given the current
- model state.
+ module_block_decomp.F -- Contains code to read a mesh decomposition file to decompose the
+ model grid into "blocks", each of which is handled by a single MPI
+ process.
- module_grid_types.F -- Contains definitions of derived data types used in the model, as well
- as routines for allocating and deallocating each type.
+ module_configure.F -- Contains user-specified configuration information (e.g., model time,
+ number of timesteps to simulate, interval between writes of model state,
+ and which shallow water test case to run) that is read from a Fortran
+ namelist (namelist.input). The actual variables in the namelist are
+ defined in the Registry/Registry file, and the code to read these variables
+ as a namelist is generated by the registry program (an example
+ namelist is sketched after this list).
- module_io_input.F -- Contain code for reading grid and model state information from an input
- module_io_output.F file (grid.nc) and writing model state to an output file (output.nc).
+ module_constants.F -- Contains definitions of constants used at various points in the model
+ (e.g., radius of the earth, omega, gravity, pi).
- module_configure.F -- Contains user-specified configuration information (e.g., model time,
- number of timesteps to simulate, interval between writes of model state,
- and which shallow water test case to run) that is read from a Fortran
- namelist (namelist.input).
+ module_dmpar.F -- Contains routines to perform distributed memory operations, such as
+ updating ghost cells in a block, determining which block owns a list
+ of cells, and performing collective operations like broadcasts and sums.
+ These operations are currently implemented using MPI.
- module_constants.F -- Contains definitions of constants used at various points in the model
- (e.g., radius of the earth, omega, gravity, pi).
+ module_grid_types.F -- Contains definitions of derived data types used in the model, as well
+ as routines for allocating and deallocating each type. The members of
+ two of the types (grid_meta, which contains time-invariant fields, and
+ grid_state, which contains time-varying fields) are supplied by registry-
+ generated code, and, therefore, all fields in the model should ultimately
+ be defined in the Registry/Registry file.
+
+ module_hash.F -- A simple dictionary/hash table implementation.
+
+ module_io_input.F -- Contain code for reading grid and model state information from an input
+ module_io_output.F file (grid.nc or restart.nc) and writing model state to an output file
+ (output.nc or restart.nc). Like the fields in module_grid_types, the
+ actual code to read or write a particular field is generated by the
+ registry program at compile time.
+
+ module_sort.F -- Contains implementations of mergesort, quicksort, and binary search.
+
+ module_sw_solver.F -- Contains the main solver loop, which advances each block of the domain
+ forward in time by a specified delta-t and periodically writes model state
+ to an output file.
+
+ module_test_cases.F -- Contains routines (one per test case) to initialize the model state for
+ the particular case specified in the user input (namelist) file.
+
+ module_time_integration.F -- Contains time integration routine (currently RK4) and a function
+ to compute the tendencies for prognostic variables given the current
+ model state.
+
+ module_timer.F -- Contains functions to measure the execution time between two points
+ in the code.
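+
+ As an illustration of the namelist mentioned under module_configure above, a namelist.input
+ file might look like the following (the variable names and values here are hypothetical; the
+ authoritative set is defined in the Registry/Registry file):
+
+        &sw_model
+           config_test_case       = 5
+           config_dt              = 300.0
+           config_ntimesteps      = 3000
+           config_output_interval = 500
+        /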
+
+
II. Building the code
The swmodel code may be built using the included Makefile, after suitably editing this file to
- set compiler and compiler flags appropriate to the system. In particular, the following should
- be set:
+ set the compiler and compiler flags appropriate to the system. Additionally, the environment
+ variable NETCDF must be set to the path of the NetCDF installation. In the Makefile, the following
+ should be set:
- FC -- The Fortran compiler
+ FC -- The MPI Fortran compiler if compiling a parallel executable; otherwise, the serial
+ Fortran compiler.
+ CC -- The MPI C compiler if compiling a parallel executable; otherwise, the serial C compiler.
+
+ SFC -- The serial Fortran compiler; the same as $(FC) when compiling a serial executable.
+
+ SCC -- The serial C compiler; the same as $(CC) when compiling a serial executable.
+
FFLAGS -- Flags specified to the compiler when compiling Fortran source files; there aren't
any particular flags to be specified, although one might want to add debugging or
optimization flags here
@@ -65,21 +114,33 @@
CPPFLAGS -- These should *not* be changed, since -DRKIND=8 will set the kind of Fortran
reals to be 8-bytes, which is necessary for reading the double precision model initial
- state from the input file, grid.nc.
+ state from the input file, grid.nc. If building a serial executable, -D_MPI should
+ not appear in the definition of CPPFLAGS.
- INCLUDES -- The flags necessary to tell the compiler where to find NetCDF include/header files.
+ CPPINCLUDES -- The flags necessary to tell the C preprocessor where to find include/header files.
+ FCINCLUDES -- The flags necessary to tell the compiler where to find NetCDF include/header files.
+
LIBS -- The flags necessary to tell the compiler where to find NetCDF libraries; typically,
these flags must include a flag specifying the path to libraries, as well as flags to
link the libraries themselves (either libnetcdf.a, or in some installations, libnetcdff.a
and libnetcdf.a)
- So far, the code has only been tested on Intel Mac systems with PGI compilers, and on NCAR's IBM
- with xlf90 compilers.
+
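+ As a concrete sketch (the compiler names and NetCDF path here are only examples; actual
+ settings depend on the system), a parallel build might proceed as:
+
+        setenv NETCDF /usr/local/netcdf        (csh; use export NETCDF=... under sh/bash)
+        # in Makefile: FC = mpif90, CC = mpicc, SFC = pgf90, SCC = pgcc
+        make
+
+ after which the swmodel executable should appear in the top-level directory.
+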
-
III. Running the code
- Edit the namelist.input file to set appropriate parameters; then type 'swmodel' to run.
+ After successfully compiling but before running, it is necessary to download an input file
+ and, if the code was compiled for parallel execution, a grid decomposition file. These input
+ files may be downloaded for several grid sizes from http://www.mmm.ucar.edu/people/duda/files/mpas/,
+ where the name of the tar file indicates the number of grid cells. Once the input files (grid.nc
+ and graph.info.part.?) have been copied into the top-level directory, the swmodel executable may
+ be run.
+
+ For a parallel run, a graph.info.part.N file must be present in the run directory, where N is
+ the number of processors to run on; the swmodel executable may then be run with mpirun, mpiexec,
+ etc. For a serial run, no grid decomposition (i.e., graph.info.part.*) file is necessary. Note
+ that to run the code in serial, the code must not have been compiled with mpif90/mpicc, since no
+ graph.info.part.1 file is provided in the downloads (although one could be trivially generated).
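+
+ For example (the launcher name and process count will vary by MPI installation), after editing
+ the namelist.input file to set appropriate parameters:
+
+        mpirun -np 4 swmodel        (parallel build; graph.info.part.4 must be present)
+
+ or, for a serial build, simply:
+
+        swmodel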
+
</font>
</pre>