From nancy at ucar.edu Tue May 11 10:59:33 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Tue, 11 May 2010 10:59:33 -0600
Subject: [Dart-dev] [4358]
DART/trunk/diagnostics/threed_sphere/obs_diag.html: Minor revision to the
DART QC table. Tried to
Message-ID:
Revision: 4358
Author: nancy
Date: 2010-05-11 10:59:33 -0600 (Tue, 11 May 2010)
Log Message:
-----------
Minor revision to the DART QC table. Tried to make the descriptions
of the numbers more clear, and changed the headings slightly because
they were a little misleading.
Modified Paths:
--------------
DART/trunk/diagnostics/threed_sphere/obs_diag.html
-------------- next part --------------
Modified: DART/trunk/diagnostics/threed_sphere/obs_diag.html
===================================================================
--- DART/trunk/diagnostics/threed_sphere/obs_diag.html 2010-04-21 23:05:18 UTC (rev 4357)
+++ DART/trunk/diagnostics/threed_sphere/obs_diag.html 2010-05-11 16:59:33 UTC (rev 4358)
@@ -188,22 +188,19 @@
 DART QC flag value | meaning
 0  | observation assimilated
-1  | observation evaluated only
+1  | observation evaluated only (because of namelist settings)
-   | DART QC values higher than this means the prior and posterior are OK, but ...
+   | DART QC values higher than this means the prior is OK, but ...
 2  | assimilated, but the posterior forward operator failed
-3  | Evaluated only, but the posterior forward operator failed
+3  | evaluated only, but the posterior forward operator failed
-   | DART QC values higher than this means only the prior is OK, but ...
+   | DART QC values higher than this were not assimilated because ...
 4  | prior forward operator failed
-5  | not used because of namelist control
-   | DART QC values higher than this are bad news.
-6  | prior QC rejected
-7  | outlier rejected
+5  | not used because observation type not listed in namelist
+6  | rejected because incoming observation QC too large
+7  | rejected because failed outlier threshold test
 8+ | reserved for future use
From nancy at ucar.edu Tue May 11 11:40:10 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Tue, 11 May 2010 11:40:10 -0600
Subject: [Dart-dev] [4359] DART/trunk/DART_LAB/DART_LAB.html: There was no
generic page for the DART_LAB tutorial
Message-ID:
Revision: 4359
Author: nancy
Date: 2010-05-11 11:40:10 -0600 (Tue, 11 May 2010)
Log Message:
-----------
There was no generic page for the DART_LAB tutorial
information for me to point at. So here one is.
Added Paths:
-----------
DART/trunk/DART_LAB/DART_LAB.html
-------------- next part --------------
Added: DART/trunk/DART_LAB/DART_LAB.html
===================================================================
--- DART/trunk/DART_LAB/DART_LAB.html (rev 0)
+++ DART/trunk/DART_LAB/DART_LAB.html 2010-05-11 17:40:10 UTC (rev 4359)
@@ -0,0 +1,111 @@
+
+
+
+DART_LAB Tutorial
+
+
+
+
+
+
+The files in this directory contain PDF tutorial
+materials on DART, and Matlab exercises. See below for links to the
+PDF files and a list of the corresponding matlab scripts.
+
+
+This tutorial begins at a more introductory level than the materials
+in the tutorial directory, and includes hands-on exercises at several
+points. In a workshop setting, these materials and exercises took
+about 1.5 days to complete.
+
+
+
+
+
+
+
+DART Tutorial Presentations
+
+
+Here are the PDF files for the presentation part of the tutorial:
+
+In the 'matlab' subdirectory are a set of
+Matlab scripts and GUI (graphical user interface) programs
+which are exercises that go with the tutorial. Each is
+interactive with settings that can be changed and rerun to
+explore various options. A valid
+Matlab
+license is needed to run these scripts.
+
+
+The exercises include the following:
+
+  gaussian_product
+  oned_model
+  oned_ensemble
+  run_lorenz_63
+  run_lorenz_96
+  twod_ensemble
+
+
+
+To run these, cd into the matlab directory, start matlab,
+and type the names at the prompt. Matlab must be started
+with the Java virtual machine enabled as these scripts
+open windows and display graphical images as they run.
+
+
+
+
+
+
Property changes on: DART/trunk/DART_LAB/DART_LAB.html
___________________________________________________________________
Added: svn:mime-type
+ text/html
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
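For concreteness, the run instructions on the new page amount to a couple of
shell commands. A minimal csh sketch, assuming Matlab is on your PATH and the
checkout lives under an illustrative $DART directory:

    # cd into the DART_LAB matlab directory ($DART is an illustrative path)
    cd $DART/DART_LAB/matlab
    # start Matlab with the Java virtual machine enabled (the default);
    # do not use -nojvm, since the exercises open GUI windows and figures
    matlab
    # then, at the Matlab prompt, type one of the exercise names, e.g.
    #   >> oned_ensemble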
From nancy at ucar.edu Thu May 13 12:51:22 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Thu, 13 May 2010 12:51:22 -0600
Subject: [Dart-dev] [4361] DART/trunk/models/bgrid_solo/shell_scripts:
Rename this script to be consistent with other models and
Message-ID:
Revision: 4361
Author: nancy
Date: 2010-05-13 12:51:22 -0600 (Thu, 13 May 2010)
Log Message:
-----------
Rename this script to be consistent with other models and
our convention to use a .csh extension on scripts. Update
script to add comments about no async4 support here (I added
a pointer to where one exists), and clean up the section for
batch execution without a queue system. No functional changes.
Added Paths:
-----------
DART/trunk/models/bgrid_solo/shell_scripts/run_filter.csh
Removed Paths:
-------------
DART/trunk/models/bgrid_solo/shell_scripts/runme_filter
-------------- next part --------------
Copied: DART/trunk/models/bgrid_solo/shell_scripts/run_filter.csh (from rev 4357, DART/trunk/models/bgrid_solo/shell_scripts/runme_filter)
===================================================================
--- DART/trunk/models/bgrid_solo/shell_scripts/run_filter.csh (rev 0)
+++ DART/trunk/models/bgrid_solo/shell_scripts/run_filter.csh 2010-05-13 18:51:22 UTC (rev 4361)
@@ -0,0 +1,117 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+#
+# This is an example script for how to run the filter program
+# in parallel by submitting it to a batch system. Note that
+# this version does NOT have an async 4 option because this version
+# of the bgrid model is a serial-only program and does not use MPI.
+#
+# If you are looking for an example script for how to run async 4
+# (parallel/mpi filter AND parallel/mpi model) check the
+# DART/models/template/shell_scripts directory.
+#
+#=============================================================================
+# This block of directives constitutes the preamble for the LSF queuing system
+# LSF is used on the IBM Linux cluster 'lightning'
+# LSF is used on the IMAGe Linux cluster 'coral'
+# LSF is used on the IBM 'bluevista'
+# The queues on lightning and bluevista are supposed to be similar.
+#
+# the normal way to submit to the queue is: bsub < run_filter.csh
+#
+# an explanation of the most common directives follows:
+# -J Job name (master script job.csh presumes filter_server.xxxx.log)
+# -o STDOUT filename
+# -e STDERR filename
+# -P account
+# -q queue cheapest == [standby, economy, (regular,debug), premium] == $$$$
+# -n number of processors (really)
+# -W hh:mm max execution time (required on some platforms)
+##=============================================================================
+#BSUB -J filter
+#BSUB -o filter.%J.log
+#BSUB -q economy
+#BSUB -n 6
+#BSUB -W 0:30
+#
+#=============================================================================
+# This block of directives constitutes the preamble for the PBS queuing system
+# PBS is used on the CGD Linux cluster 'bangkok'
+# PBS is used on the CGD Linux cluster 'calgary'
+#
+# the normal way to submit to the queue is: qsub run_filter.csh
+#
+# an explanation of the most common directives follows:
+# -N Job name
+# -r n Declare job non-rerunable
+# -e filename for standard error
+# -o filename for standard out
+# -q Queue name (small, medium, long, verylong)
+# -l nodes=xx:ppn=2 requests BOTH processors on the node. On both bangkok
+# and calgary, there is no way to 'share' the processors
+# on the node with another job, so you might as well use
+# them both. (ppn == Processors Per Node)
+##=============================================================================
+#PBS -N filter
+#PBS -r n
+#PBS -e filter.err
+#PBS -o filter.log
+#PBS -q medium
+#PBS -l nodes=4:ppn=2
+
+# Check for the existence of variables that are set by different
+# queuing mechanisms. This way, we can make a single script which
+# works for any queuing system.
+
+if ($?LS_SUBCWD) then
+
+ # LSF has a list of processors already in a variable (LSB_HOSTS)
+
+ mpirun.lsf ./filter
+
+
+else if ($?PBS_O_WORKDIR) then
+
+ # PBS has a list of processors in a file whose name is (PBS_NODEFILE)
+
+ mpirun ./filter
+
+else if ($?NODEFILE) then
+
+ # a linux cluster with mpich or lam or openmpi and no formal
+ # queueing system. alter this to match the required nodes and
+ # to construct a simple launch script.
+
+ setenv MY_NODEFILE ~/nodelist
+ echo "node7:2" > $MY_NODEFILE
+ echo "node5:2" >> $MY_NODEFILE
+ echo "node3:2" >> $MY_NODEFILE
+ echo "node1:2" >> $MY_NODEFILE
+
+cat > ./filterscript <
+# $URL$
+# $Revision$
+# $Date$
+
Deleted: DART/trunk/models/bgrid_solo/shell_scripts/runme_filter
===================================================================
--- DART/trunk/models/bgrid_solo/shell_scripts/runme_filter 2010-05-11 17:55:56 UTC (rev 4360)
+++ DART/trunk/models/bgrid_solo/shell_scripts/runme_filter 2010-05-13 18:51:22 UTC (rev 4361)
@@ -1,115 +0,0 @@
-#!/bin/csh
-#
-# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
-# provided by UCAR, "as is", without charge, subject to all terms of use at
-# http://www.image.ucar.edu/DAReS/DART/DART_download
-#
-# $Id$
-#
-# start at a generic run script for the mpi version.
-#
-#=============================================================================
-# This block of directives constitutes the preamble for the LSF queuing system
-# LSF is used on the IBM Linux cluster 'lightning'
-# LSF is used on the IMAGe Linux cluster 'coral'
-# LSF is used on the IBM 'bluevista'
-# The queues on lightning and bluevista are supposed to be similar.
-#
-# the normal way to submit to the queue is: bsub < runme_filter
-#
-# an explanation of the most common directives follows:
-# -J Job name (master script job.csh presumes filter_server.xxxx.log)
-# -o STDOUT filename
-# -e STDERR filename
-# -P account
-# -q queue cheapest == [standby, economy, (regular,debug), premium] == $$$$
-# -n number of processors (really)
-# -W hh:mm max execution time (required on some platforms)
-##=============================================================================
-#BSUB -J filter
-#BSUB -o filter.%J.log
-#BSUB -q economy
-#BSUB -n 6
-#BSUB -W 0:30
-#
-#=============================================================================
-# This block of directives constitutes the preamble for the PBS queuing system
-# PBS is used on the CGD Linux cluster 'bangkok'
-# PBS is used on the CGD Linux cluster 'calgary'
-#
-# the normal way to submit to the queue is: qsub runme_filter
-#
-# an explanation of the most common directives follows:
-# -N Job name
-# -r n Declare job non-rerunable
-# -e filename for standard error
-# -o filename for standard out
-# -q Queue name (small, medium, long, verylong)
-# -l nodes=xx:ppn=2 requests BOTH processors on the node. On both bangkok
-# and calgary, there is no way to 'share' the processors
-# on the node with another job, so you might as well use
-# them both. (ppn == Processors Per Node)
-##=============================================================================
-#PBS -N filter
-#PBS -r n
-#PBS -e filter.err
-#PBS -o filter.log
-#PBS -q medium
-#PBS -l nodes=4:ppn=2
-
-# A common strategy for the beginning is to check for the existence of
-# some variables that get set by the different queuing mechanisms.
-# This way, we know which queuing mechanism we are working with,
-# and can set 'queue-independent' variables for use for the remainder
-# of the script.
-
-if ($?LS_SUBCWD) then
-
- # LSF has a list of processors already in a variable (LSB_HOSTS)
- # alias submit 'bsub < \!*'
-
- mpirun.lsf ./filter
-
-
-else if ($?PBS_O_WORKDIR) then
-
- # PBS has a list of processors in a file whose name is (PBS_NODEFILE)
- # alias submit 'qsub \!*'
-
- mpirun ./filter
-
-else if ($?OCOTILLO_NODEFILE) then
-
- # ocotillo is a 'special case'. It is the only cluster I know of with
- # no queueing system. You must generate a list of processors in a
- # file whose name is given to the mpirun command, and the executable
- # needs to be wrapped with a script that cds to the right directory.
- setenv OCOTILLO_NODEFILE ~/nodelist
- echo "node7:2" > $OCOTILLO_NODEFILE
- echo "node5:2" >> $OCOTILLO_NODEFILE
- echo "node3:2" >> $OCOTILLO_NODEFILE
- echo "node1:2" >> $OCOTILLO_NODEFILE
-
-cat > ./filterscript <
-# $URL$
-# $Revision$
-# $Date$
-
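The here-document that run_filter.csh writes for the no-queue-system branch is
truncated above, so for orientation only: a hedged csh sketch of the kind of
launch such a branch typically ends with once ~/nodelist has been written. The
mpirun flags follow the common MPICH/Open MPI style, and the node names and
paths are illustrative, not taken from the script.

    # ~/nodelist was written above as node7:2, node5:2, node3:2, node1:2,
    # i.e. 8 MPI tasks spread over 4 nodes (example names only)
    cd /your/run/directory
    mpirun -np 8 -machinefile ~/nodelist ./filter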
From nancy at ucar.edu Thu May 13 12:56:06 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Thu, 13 May 2010 12:56:06 -0600
Subject: [Dart-dev] [4362] DART/trunk/models/lorenz_96/shell_scripts: Rename
this script to be consistent with other models and
Message-ID:
Revision: 4362
Author: nancy
Date: 2010-05-13 12:56:06 -0600 (Thu, 13 May 2010)
Log Message:
-----------
Rename this script to be consistent with other models and
our convention to use a .csh extension on scripts. Update
script to add comments about no async4 support here (I added
a pointer to where one exists), and clean up the section for
batch execution without a queue system. No functional changes.
Added Paths:
-----------
DART/trunk/models/lorenz_96/shell_scripts/run_filter.csh
Removed Paths:
-------------
DART/trunk/models/lorenz_96/shell_scripts/runme_filter
-------------- next part --------------
Copied: DART/trunk/models/lorenz_96/shell_scripts/run_filter.csh (from rev 4300, DART/trunk/models/lorenz_96/shell_scripts/runme_filter)
===================================================================
--- DART/trunk/models/lorenz_96/shell_scripts/run_filter.csh (rev 0)
+++ DART/trunk/models/lorenz_96/shell_scripts/run_filter.csh 2010-05-13 18:56:06 UTC (rev 4362)
@@ -0,0 +1,117 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+#
+# This is an example script for how to run the filter program
+# in parallel by submitting it to a batch system. Note that
+# this version does NOT have an async 4 option because the
+# Lorenz 96 model is a serial-only program and does not use MPI.
+#
+# If you are looking for an example script for how to run async 4
+# (parallel/mpi filter AND parallel/mpi model) check the
+# DART/models/template/shell_scripts directory.
+#
+#=============================================================================
+# This block of directives constitutes the preamble for the LSF queuing system
+# LSF is used on the IBM Linux cluster 'lightning'
+# LSF is used on the IMAGe Linux cluster 'coral'
+# LSF is used on the IBM 'bluevista'
+# The queues on lightning and bluevista are supposed to be similar.
+#
+# the normal way to submit to the queue is: bsub < run_filter.csh
+#
+# an explanation of the most common directives follows:
+# -J Job name (master script job.csh presumes filter_server.xxxx.log)
+# -o STDOUT filename
+# -e STDERR filename
+# -P account
+# -q queue cheapest == [standby, economy, (regular,debug), premium] == $$$$
+# -n number of processors (really)
+# -W hh:mm max execution time (required on some platforms)
+##=============================================================================
+#BSUB -J filter
+#BSUB -o filter.%J.log
+#BSUB -q economy
+#BSUB -n 6
+#BSUB -W 0:30
+#
+#=============================================================================
+# This block of directives constitutes the preamble for the PBS queuing system
+# PBS is used on the CGD Linux cluster 'bangkok'
+# PBS is used on the CGD Linux cluster 'calgary'
+#
+# the normal way to submit to the queue is: qsub run_filter.csh
+#
+# an explanation of the most common directives follows:
+# -N Job name
+# -r n Declare job non-rerunable
+# -e filename for standard error
+# -o filename for standard out
+# -q Queue name (small, medium, long, verylong)
+# -l nodes=xx:ppn=2 requests BOTH processors on the node. On both bangkok
+# and calgary, there is no way to 'share' the processors
+# on the node with another job, so you might as well use
+# them both. (ppn == Processors Per Node)
+##=============================================================================
+#PBS -N filter
+#PBS -r n
+#PBS -e filter.err
+#PBS -o filter.log
+#PBS -q medium
+#PBS -l nodes=4:ppn=2
+
+# Check for the existence of variables that are set by different
+# queuing mechanisms. This way, we can make a single script which
+# works for any queuing system.
+
+if ($?LS_SUBCWD) then
+
+ # LSF has a list of processors already in a variable (LSB_HOSTS)
+
+ mpirun.lsf ./filter
+
+
+else if ($?PBS_O_WORKDIR) then
+
+ # PBS has a list of processors in a file whose name is (PBS_NODEFILE)
+
+ mpirun ./filter
+
+else if ($?NODEFILE) then
+
+ # a linux cluster with mpich or lam or openmpi and no formal
+ # queueing system. alter this to match the required nodes and
+ # to construct a simple launch script.
+
+ setenv MY_NODEFILE ~/nodelist
+ echo "node7:2" > $MY_NODEFILE
+ echo "node5:2" >> $MY_NODEFILE
+ echo "node3:2" >> $MY_NODEFILE
+ echo "node1:2" >> $MY_NODEFILE
+
+cat > ./filterscript <
+# $URL$
+# $Revision$
+# $Date$
+
Deleted: DART/trunk/models/lorenz_96/shell_scripts/runme_filter
===================================================================
--- DART/trunk/models/lorenz_96/shell_scripts/runme_filter 2010-05-13 18:51:22 UTC (rev 4361)
+++ DART/trunk/models/lorenz_96/shell_scripts/runme_filter 2010-05-13 18:56:06 UTC (rev 4362)
@@ -1,117 +0,0 @@
-#!/bin/tcsh
-#
-# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
-# provided by UCAR, "as is", without charge, subject to all terms of use at
-# http://www.image.ucar.edu/DAReS/DART/DART_download
-#
-# $Id$
-#
-# start at a generic run script for the mpi version
-#
-#=============================================================================
-# This block of directives constitutes the preamble for the LSF queuing system
-# LSF is used on the IBM Linux cluster 'lightning'
-# LSF is used on the IMAGe Linux cluster 'coral'
-# LSF is used on the IBM 'bluevista'
-# The queues on lightning and bluevista are supposed to be similar.
-#
-# the normal way to submit to the queue is: bsub < runme_filter
-#
-# an explanation of the most common directives follows:
-# -J Job name (master script job.csh presumes filter_server.xxxx.log)
-# -o STDOUT filename
-# -e STDERR filename
-# -P account
-# -q queue cheapest == [standby, economy, (regular,debug), premium] == $$$$
-# -n number of processors (really)
-# -W hh:mm execution time (must be specified on some hosts)
-##=============================================================================
-#BSUB -J filter
-#BSUB -o filter.%J.log
-#BSUB -q regular
-#BSUB -n 4
-#BSUB -W 00:10
-#
-#
-##=============================================================================
-## This block of directives constitutes the preamble for the PBS queuing system
-## PBS is used on the CGD Linux cluster 'bangkok'
-## PBS is used on the CGD Linux cluster 'calgary'
-##
-## the normal way to submit to the queue is: qsub runme_filter
-##
-## an explanation of the most common directives follows:
-## -N Job name
-## -r n Declare job non-rerunable
-## -e filename for standard error
-## -o filename for standard out
-## -q Queue name (small, medium, long, verylong)
-## -l nodes=xx:ppn=2 requests BOTH processors on the node. On both bangkok
-## and calgary, there is no way to 'share' the processors
-## on the node with another job, so you might as well use
-## them both. (ppn == Processors Per Node)
-##=============================================================================
-#PBS -N filter
-#PBS -r n
-#PBS -e filter.err
-#PBS -o filter.log
-#PBS -q medium
-#PBS -l nodes=4:ppn=2
-
-# A common strategy for the beginning is to check for the existence of
-# some variables that get set by the different queuing mechanisms.
-# This way, we know which queuing mechanism we are working with,
-# and can set 'queue-independent' variables for use for the remainder
-# of the script.
-
-if ($?LS_SUBCWD) then
-
- # LSF has a list of processors already in a variable (LSB_HOSTS)
- # alias submit 'bsub < \!*'
-
- mpirun.lsf ./filter
-
-
-else if ($?PBS_O_WORKDIR) then
-
- # PBS has a list of processors in a file whose name is (PBS_NODEFILE)
- # alias submit 'qsub \!*'
-
- mpirun ./filter
-
-else if ($?OCOTILLO_NODEFILE) then
-
- # ocotillo is a 'special case'. It is the only cluster I know of with
- # no queueing system. You must generate a list of processors in a
- # file whose name is given to the mpirun command, and the executable
- # needs to be wrapped with a script that cds to the right directory.
- setenv OCOTILLO_NODEFILE ~/nodelist
- echo "node7:2" > $OCOTILLO_NODEFILE
- echo "node5:2" >> $OCOTILLO_NODEFILE
- echo "node3:2" >> $OCOTILLO_NODEFILE
- echo "node1:2" >> $OCOTILLO_NODEFILE
-
- cat > ./filterscript <
-# $URL$
-# $Revision$
-# $Date$
-
From nancy at ucar.edu Thu May 13 13:19:47 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Thu, 13 May 2010 13:19:47 -0600
Subject: [Dart-dev] [4363]
DART/trunk/models/template/shell_scripts/run_filter.csh: Make it harder to
do this wrong, by getting the async= value from
Message-ID:
Revision: 4363
Author: nancy
Date: 2010-05-13 13:19:47 -0600 (Thu, 13 May 2010)
Log Message:
-----------
Make it harder to do this wrong by getting the async= value from
the input.nml and setting the parallel_model shell variable automatically.
Also set the advance command based on what's in the namelist.
Minor updates to the comments to remove names of obsolete machines
and add a comment about the -W bsub line.
Modified Paths:
--------------
DART/trunk/models/template/shell_scripts/run_filter.csh
-------------- next part --------------
Modified: DART/trunk/models/template/shell_scripts/run_filter.csh
===================================================================
--- DART/trunk/models/template/shell_scripts/run_filter.csh 2010-05-13 18:56:06 UTC (rev 4362)
+++ DART/trunk/models/template/shell_scripts/run_filter.csh 2010-05-13 19:19:47 UTC (rev 4363)
@@ -6,30 +6,33 @@
#
# $Id$
#
+# Script to start an MPI version of filter, and then optionally
+# run the model advance if &filter_nml has async=4 (parallel filter
+# AND parallel model). This version gets the number of ensemble members
+# and advance command out of the input.nml namelist file automatically.
+# You do have to set the
+#
#=============================================================================
# This block of directives constitutes the preamble for the LSF queuing system
-# LSF is used on the IBM Linux cluster 'lightning'
# LSF is used on the IMAGe Linux cluster 'coral'
-# LSF is used on the IBM 'bluevista'
-# The queues on lightning and bluevista are supposed to be similar.
+# LSF is used on the IBM 'bluefire'
#
# the normal way to submit to the queue is: bsub < run_filter.csh
#
# an explanation of the most common directives follows:
-# -J Job name (master script job.csh presumes filter_server.xxxx.log)
-# -o STDOUT filename
-# -e STDERR filename
-# -P account
+# -J Job_name
+# -o STDOUT_filename
+# -e STDERR_filename
+# -P account_code_number
# -q queue cheapest == [standby, economy, (regular,debug), premium] == $$$$
-# -n number of processors (really)
+# -n number of MPI processes (not nodes)
+# -W hh:mm wallclock time (required on some systems)
##=============================================================================
#BSUB -J filter
#BSUB -o filter.%J.log
-#BSUB -q regular
+#BSUB -q standby
#BSUB -n 20
-#BXXX -P 868500xx
-#BSUB -W 2:00
-#BSUB -N -u ${USER}@ucar.edu
+#BSUB -W 1:00
#
##=============================================================================
## This block of directives constitutes the preamble for the PBS queuing system
@@ -64,25 +67,40 @@
# mpirun -np 64 modelxxx (or whatever) for as many ensembles as you have,
# set this to "true"
-# if async=4, also check that the call to advance_model.csh
-# has the right number of ensemble members below; it must match
-# the input.nml number.
+# this script is going to determine several things by reading the input.nml
+# file which contains the &filter_nml namelist. make sure it exists first.
+if ( ! -e input.nml ) then
+ echo "ERROR - input.nml does not exist in local directory."
+ echo "ERROR - input.nml needed to determine several settings for this script."
+ exit 1
+endif
+# detect whether the model is supposed to run as an MPI job or not
+# by reading the "async = " from the &filter_nml namelist in input.nml.
+# some namelists contain the same string - be sure to get the filter_nml one
+# by grepping for lines which follow it.
+
+set ASYNCSTRING = `grep -A 42 filter_nml input.nml | grep async`
+set ASYNC_TYPE = `echo $ASYNCSTRING[3] | sed -e "s#,##"`
+
+if ( "${ASYNC_TYPE}" == "0" || "${ASYNC_TYPE}" == "2") then
+ set parallel_model = "false"
+else if ( "${ASYNC_TYPE}" == "4") then
set parallel_model = "true"
+else
+ echo 'cannot autodetect async value in the filter_nml namelist in input.nml file.'
+ echo 'hardcode the parallel_model shell variable and comment out these lines.'
+ exit -1
+ set parallel_model = "false"
+endif
# Determine the number of ensemble members from input.nml,
-# it may exist in more than one place.
-# Parse out the filter_nml string and see which
-# one is immediately after it ...
+# as well as the command for advancing the model.
-if ( ! -e input.nml ) then
- echo "ERROR - input.nml does not exist in local directory."
- echo "ERROR - input.nml needed to determine number of ensemble members."
- exit 1
-endif
-
set ENSEMBLESTRING = `grep -A 42 filter_nml input.nml | grep ens_size`
set NUM_ENS = `echo $ENSEMBLESTRING[3] | sed -e "s#,##"`
+set ADVANCESTRING = `grep -A 42 filter_nml input.nml | grep adv_ens_command`
+set ADV_CMD = `echo $ADVANCESTRING[3] | sed -e "s#,##"`
# A common strategy for the beginning is to check for the existence of
# some variables that get set by the different queuing mechanisms.
@@ -93,7 +111,6 @@
if ($?LS_SUBCWD) then
# LSF has a list of processors already in a variable (LSB_HOSTS)
- # alias submit 'bsub < \!*'
echo "LSF - using mpirun.lsf for execution"
# each filter task advances the ensembles, each running on 1 proc.
@@ -137,7 +154,7 @@
# of processors this job is using.
echo "calling model advance now:"
- ./advance_model.csh 0 ${NUM_ENS} filter_control00000 || exit 9
+ ${ADV_CMD} 0 ${NUM_ENS} filter_control00000 || exit 9
echo "restarting filter."
mpirun.lsf ./wakeup_filter
@@ -161,7 +178,6 @@
else if ($?PBS_O_WORKDIR) then
# PBS has a list of processors in a file whose name is (PBS_NODEFILE)
- # alias submit 'qsub \!*'
echo "PBS - using mpirun for execution"
# each filter task advances the ensembles, each running on 1 proc.
@@ -205,7 +221,7 @@
# of processors this job is using.
echo "calling model advance now:"
- ./advance_model.csh 0 ${NUM_ENS} filter_control00000 || exit 9
+ ${ADV_CMD} 0 ${NUM_ENS} filter_control00000 || exit 9
echo "restarting filter."
mpirun ./wakeup_filter
@@ -288,7 +304,7 @@
# of processors this job is using.
echo "calling model advance now:"
- ./advance_model.csh 0 ${NUM_ENS} filter_control00000 || exit 9
+ ${ADV_CMD} 0 ${NUM_ENS} filter_control00000 || exit 9
echo "restarting filter."
${MPICMD} ./wakeup_filter
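To make the auto-detection concrete: given a typical &filter_nml block, the
grep -A 42 / sed pipeline used above grabs the matching line, takes its third
whitespace-separated word, and strips the trailing comma. A small csh sketch
against a hypothetical input.nml fragment (the values shown are illustrative,
not from this commit):

    # hypothetical &filter_nml fragment in input.nml:
    #   &filter_nml
    #      async            = 4,
    #      ens_size         = 20,
    #      adv_ens_command  = "./advance_model.csh",
    #   /
    set ASYNCSTRING = `grep -A 42 filter_nml input.nml | grep async`
    # ASYNCSTRING is now the word list:  async = 4,
    set ASYNC_TYPE = `echo $ASYNCSTRING[3] | sed -e "s#,##"`
    # $ASYNCSTRING[3] is the third word, "4,"; sed removes the comma
    echo "detected async = ${ASYNC_TYPE}"

The same pattern is used for ens_size and adv_ens_command, which is why the
script can drop the hand-edited advance_model.csh call in favor of ${ADV_CMD}.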
From nancy at ucar.edu Thu May 13 13:21:55 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Thu, 13 May 2010 13:21:55 -0600
Subject: [Dart-dev] [4364]
DART/trunk/models/template/shell_scripts/run_filter.csh: Finish the editing
before committing. Change i
Message-ID:
Revision: 4364
Author: nancy
Date: 2010-05-13 13:21:55 -0600 (Thu, 13 May 2010)
Log Message:
-----------
Finish the editing before committing. Change in comments at
the top of the file only.
Modified Paths:
--------------
DART/trunk/models/template/shell_scripts/run_filter.csh
-------------- next part --------------
Modified: DART/trunk/models/template/shell_scripts/run_filter.csh
===================================================================
--- DART/trunk/models/template/shell_scripts/run_filter.csh 2010-05-13 19:19:47 UTC (rev 4363)
+++ DART/trunk/models/template/shell_scripts/run_filter.csh 2010-05-13 19:21:55 UTC (rev 4364)
@@ -10,7 +10,10 @@
# run the model advance if &filter_nml has async=4 (parallel filter
# AND parallel model). This version gets the number of ensemble members
# and advance command out of the input.nml namelist file automatically.
-# You do have to set the
+# It also gets the async setting and sets serial vs parallel model
+# automatically. The theory is that once you get this script working on
+# your system, you will not need to change anything here as you change the
+# number of ensemble members, async setting, or model advance command.
#
#=============================================================================
# This block of directives constitutes the preamble for the LSF queuing system
From nancy at ucar.edu Mon May 17 09:22:58 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Mon, 17 May 2010 09:22:58 -0600
Subject: [Dart-dev] [4365] DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90:
Forgot to add strings for new kinds to table. This wil
Message-ID:
Revision: 4365
Author: nancy
Date: 2010-05-17 09:22:58 -0600 (Mon, 17 May 2010)
Log Message:
-----------
Forgot to add strings for the new kinds to the table. This will eventually
be autogenerated, once we have a way to input the 'master list' that
works for everyone. Then we can autogenerate both the numbered parameter
section at the top and the strings-to-numbers-and-back table.
Until then I have to remember that when I add a new parameter I also
have to add a new line to the string table init code.
Modified Paths:
--------------
DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90
-------------- next part --------------
Modified: DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90
===================================================================
--- DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90 2010-05-13 19:21:55 UTC (rev 4364)
+++ DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90 2010-05-17 15:22:58 UTC (rev 4365)
@@ -327,12 +327,14 @@
obs_kind_names(39) = obs_kind_type(KIND_CLOUD_FRACTION, 'KIND_CLOUD_FRACTION')
obs_kind_names(40) = obs_kind_type(KIND_ICE_FRACTION, 'KIND_ICE_FRACTION')
obs_kind_names(41) = obs_kind_type(KIND_RELATIVE_HUMIDITY, 'KIND_RELATIVE_HUMIDITY')
+obs_kind_names(42) = obs_kind_type(KIND_ELECTRON_DENSITY, 'KIND_ELECTRON_DENSITY')
obs_kind_names(50) = obs_kind_type(KIND_SALINITY, 'KIND_SALINITY')
obs_kind_names(51) = obs_kind_type(KIND_U_CURRENT_COMPONENT, 'KIND_U_CURRENT_COMPONENT')
obs_kind_names(52) = obs_kind_type(KIND_V_CURRENT_COMPONENT, 'KIND_V_CURRENT_COMPONENT')
obs_kind_names(53) = obs_kind_type(KIND_SEA_SURFACE_HEIGHT, 'KIND_SEA_SURFACE_HEIGHT')
obs_kind_names(54) = obs_kind_type(KIND_DRY_LAND, 'KIND_DRY_LAND')
obs_kind_names(55) = obs_kind_type(KIND_SEA_SURFACE_PRESSURE, 'KIND_SEA_SURFACE_PRESSURE')
+obs_kind_names(56) = obs_kind_type(KIND_W_CURRENT_COMPONENT, 'KIND_W_CURRENT_COMPONENT')
obs_kind_names(60) = obs_kind_type(KIND_INFRARED_RADIANCE, 'KIND_INFRARED_RADIANCE')
obs_kind_names(61) = obs_kind_type(KIND_INFRARED_BRIGHT_TEMP, 'KIND_INFRARED_BRIGHT_TEMP')
obs_kind_names(62) = obs_kind_type(KIND_LANDMASK, 'KIND_LANDMASK')
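Until that autogeneration exists, a quick consistency check can catch the
forgotten-string-table mistake described in the log message. A hedged csh
sketch, assuming each kind is declared on its own 'parameter ... KIND_*' line
and each table entry is an 'obs_kind_names(' line; the actual layout of
DEFAULT_obs_kind_mod.F90 may differ, so treat the counts as a rough
cross-check only:

    cd DART/trunk/obs_kind
    # the two counts should agree; a difference suggests a KIND_ parameter
    # that never got a matching line in the string-table init code
    grep -c 'parameter.*KIND_'  DEFAULT_obs_kind_mod.F90
    grep -c 'obs_kind_names('   DEFAULT_obs_kind_mod.F90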
From nancy at ucar.edu Mon May 17 10:52:04 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Mon, 17 May 2010 10:52:04 -0600
Subject: [Dart-dev] [4366] DART/trunk/models: password-protected directory
for Tomoko and Ite
Message-ID:
Revision: 4366
Author: thoar
Date: 2010-05-17 10:52:04 -0600 (Mon, 17 May 2010)
Log Message:
-----------
password-protected directory for Tomoko and Ite
Added Paths:
-----------
DART/trunk/models/tiegcm/
DART/trunk/models/tiegcm/shell_scripts/
DART/trunk/models/tiegcm/work/
-------------- next part --------------
From nancy at ucar.edu Mon May 17 14:42:20 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Mon, 17 May 2010 14:42:20 -0600
Subject: [Dart-dev] [4367] DART/trunk/models/tiegcm: This is the initial
import of the TIEGCM interface. Tomoko an
Message-ID:
Revision: 4367
Author: thoar
Date: 2010-05-17 14:42:20 -0600 (Mon, 17 May 2010)
Log Message:
-----------
This is the initial import of the TIEGCM interface. Tomoko and I are
working on it on bluefire (IBM). At this point, perfect_model seems to
be working correctly. We are working on the model_mod_check routine
to ensure we can put a synthetic observation wherever we want; then
we will test a single assimilation with no model advance. The model
can be compiled as a single-threaded executable, so we are checking
async==2 first. After that works, we will explore async==4, the
anticipated method for production runs.
Added Paths:
-----------
DART/trunk/models/tiegcm/dart_to_model.f90
DART/trunk/models/tiegcm/model_mod.f90
DART/trunk/models/tiegcm/model_to_dart.f90
DART/trunk/models/tiegcm/shell_scripts/advance_model.csh
DART/trunk/models/tiegcm/shell_scripts/run_filter.csh
DART/trunk/models/tiegcm/shell_scripts/run_filter_async4.csh
DART/trunk/models/tiegcm/shell_scripts/run_perfect_model_obs.csh
DART/trunk/models/tiegcm/work/input.nml
DART/trunk/models/tiegcm/work/mkmf_create_fixed_network_seq
DART/trunk/models/tiegcm/work/mkmf_create_obs_sequence
DART/trunk/models/tiegcm/work/mkmf_dart_to_model
DART/trunk/models/tiegcm/work/mkmf_filter
DART/trunk/models/tiegcm/work/mkmf_model_mod_check
DART/trunk/models/tiegcm/work/mkmf_model_to_dart
DART/trunk/models/tiegcm/work/mkmf_obs_diag
DART/trunk/models/tiegcm/work/mkmf_perfect_model_obs
DART/trunk/models/tiegcm/work/mkmf_preprocess
DART/trunk/models/tiegcm/work/mkmf_wakeup_filter
DART/trunk/models/tiegcm/work/path_names_create_fixed_network_seq
DART/trunk/models/tiegcm/work/path_names_create_obs_sequence
DART/trunk/models/tiegcm/work/path_names_dart_to_model
DART/trunk/models/tiegcm/work/path_names_filter
DART/trunk/models/tiegcm/work/path_names_model_mod_check
DART/trunk/models/tiegcm/work/path_names_model_to_dart
DART/trunk/models/tiegcm/work/path_names_obs_diag
DART/trunk/models/tiegcm/work/path_names_perfect_model_obs
DART/trunk/models/tiegcm/work/path_names_preprocess
DART/trunk/models/tiegcm/work/path_names_wakeup_filter
DART/trunk/models/tiegcm/work/quickbuild.csh
-------------- next part --------------
Added: DART/trunk/models/tiegcm/dart_to_model.f90
===================================================================
--- DART/trunk/models/tiegcm/dart_to_model.f90 (rev 0)
+++ DART/trunk/models/tiegcm/dart_to_model.f90 2010-05-17 20:42:20 UTC (rev 4367)
@@ -0,0 +1,167 @@
+! DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+! provided by UCAR, "as is", without charge, subject to all terms of use at
+! http://www.image.ucar.edu/DAReS/DART/DART_download
+
+program dart_to_model
+
+!
+! $URL$
+! $Id$
+! $Revision$
+! $Date$
+
+!----------------------------------------------------------------------
+! purpose: interface between TIEGCM and DART
+!
+! method: Read DART state vector ("proprietary" format)
+! Reform state vector back into TIEGCM fields.
+! Replace those fields on the TIEGCM restart file with the new values,
+!
+! Replace 'mtime' variable in the TIEGCM restart file
+! with the 'valid time' of the DART state vector.
+! Write out updated namelist variables (e.g., model_time, adv_to_time)
+! in a file called 'namelist_update'
+!
+!----------------------------------------------------------------------
+
+use types_mod, only : r8
+use utilities_mod, only : get_unit, initialize_utilities, E_ERR, &
+ error_handler, timestamp, do_output
+use model_mod, only : model_type, get_model_size, init_model_instance, &
+ vector_to_prog_var, update_TIEGCM_restart, &
+ static_init_model
+use assim_model_mod, only : assim_model_type, aread_state_restart, &
+ open_restart_read, close_restart
+use time_manager_mod, only : time_type, get_time, get_date, set_calendar_type, &
+ print_time, print_date, set_date, set_time, &
+ operator(*), operator(+), operator(-), &
+ operator(>), operator(<), operator(/), &
+ operator(/=), operator(<=)
+
+
+implicit none
+
+! version controlled file description for error handling, do not edit
+character(len=128), parameter :: &
+ source = "$URL$", &
+ revision = "$Revision$", &
+ revdate = "$Date$"
+
+type(model_type) :: var
+type(time_type) :: model_time, adv_to_time, jan1, tbase, target_time
+real(r8), allocatable :: x_state(:)
+integer :: file_unit, x_size, ens_member, io
+character (len = 128) :: file_name = 'tiegcm_restart_p.nc', file_in = 'temp_ic'
+character (len = 128) :: file_namelist_out = 'namelist_update'
+integer :: model_doy, model_hour, model_minute ! current tiegcm mtime (doy,hour,minute)
+integer :: adv_to_doy, adv_to_hour, adv_to_minute ! advance tiegcm mtime
+integer :: target_doy, target_hour, target_minute ! forecast time interval in mtime format
+integer :: utsec, year, month, day, sec, model_year
+
+!----------------------------------------------------------------------
+!----------------------------------------------------------------------
+
+call initialize_utilities(progname='dart_to_model', output_flag=.true.)
+
+call static_init_model() ! reads input.nml, etc., sets the table
+x_size = get_model_size() ! now that we know how big state vector is ...
+allocate(x_state(x_size)) ! allocate space for the (empty) state vector
+
+! Open the DART model state ...
+! Read in the time to which TIEGCM must advance.
+! Read in the valid time for the model state
+! Read in state vector from DART
+
+file_unit = open_restart_read(file_in)
+
+call aread_state_restart(model_time, x_state, file_unit, adv_to_time)
+call close_restart(file_unit)
+
+if (do_output()) &
+ call print_time(model_time,'time for restart file '//trim(file_in))
+if (do_output()) &
+ call print_date(model_time,'date for restart file '//trim(file_in))
+
+if (do_output()) &
+ call print_time(adv_to_time,'time for restart file '//trim(file_in))
+if (do_output()) &
+ call print_date(adv_to_time,'date for restart file '//trim(file_in))
+
+
+! Parse the vector into TIEGCM fields (prognostic variables)
+call init_model_instance(var, model_time)
+call vector_to_prog_var(x_state, var)
+deallocate(x_state)
+
+!----------------------------------------------------------------------
+! This program writes out parameters to a file called 'namelist_update'
+! for TIEGCM namelist update used in advance_model.csh
+!----------------------------------------------------------------------
+
+file_unit = get_unit()
+open(unit = file_unit, file = file_namelist_out)
+
+! write fields to the binary TIEGCM restart file
+call update_TIEGCM_restart(file_name, var)
+
+
+! Get updated TIEGCM namelist variables
+!
+call get_date(model_time, model_year, month, day, model_hour, model_minute, sec)
+jan1 = set_date(model_year,1,1)
+tbase = model_time - jan1 ! total time since the start of the year
+call get_time(tbase, utsec, model_doy)
+model_doy = model_doy + 1 ! add jan1 back in
+
+call get_date(adv_to_time, year, month, day, adv_to_hour, adv_to_minute, sec)
+jan1 = set_date(year,1,1)
+tbase = adv_to_time - jan1 ! total time since the start of the year
+call get_time(tbase, utsec, adv_to_doy)
+adv_to_doy = adv_to_doy + 1 ! add jan1 back in
+
+! Calculate number of hours to advance tiegcm
+target_time = adv_to_time - model_time
+call get_time(target_time, sec, day)
+target_doy = day
+target_hour = sec/3600
+target_minute = (sec - target_hour*3600)/60
+
+!START_YEAR
+write(file_unit, *, iostat = io ) model_year
+if (io /= 0 )then
+ call error_handler(E_ERR,'dart_to_model:','cannot write model_year to STDOUT', &
+ source,revision,revdate)
+endif
+!START_DAY
+write(file_unit, *, iostat = io ) model_doy
+if (io /= 0 )then
+ call error_handler(E_ERR,'dart_to_model:','cannot write model_day to STDOUT', &
+ source,revision,revdate)
+endif
+!SOURCE_START, START, SECSTART
+write(file_unit, *, iostat = io ) model_doy,',',model_hour,',',model_minute
+if (io /= 0 )then
+ call error_handler(E_ERR,'dart_to_model:','cannot write mtime (day/hour/minute) to STDOUT', &
+ source,revision,revdate)
+endif
+!STOP, SECSTOP
+write(file_unit, *, iostat = io ) adv_to_doy,',',adv_to_hour,',',adv_to_minute
+if (io /= 0 )then
+ call error_handler(E_ERR,'dart_to_model:','cannot write adv_to_time mtime (day/hour/minute) to STDOUT', &
+ source,revision,revdate)
+endif
+!HIST, SAVE, SECHIST, SECSAVE
+write(file_unit, *, iostat = io ) target_doy,',',target_hour,',',target_minute
+if (io /= 0 )then
+ call error_handler(E_ERR,'dart_to_model:','cannot write target_time mtime (day/hour/minute) to STDOUT', &
+ source,revision,revdate)
+endif
+
+close(file_unit)
+
+!----------------------------------------------------------------------
+! When called with 'end', timestamp will also call finalize_utilities()
+!----------------------------------------------------------------------
+call timestamp(string1=source, pos='end')
+
+end program dart_to_model
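dart_to_model writes five lines to 'namelist_update', in the order of the
writes above: start year, start day-of-year, start mtime (doy, hour, minute),
stop mtime, and the history/save mtime interval. A hedged csh sketch of how a
driver script such as advance_model.csh might read them back; the variable
names and the echo are illustrative, not taken from the actual script:

    # read the lines dart_to_model wrote, in the same order as the writes above
    set year       = `head -1 namelist_update`
    set start_day  = `head -2 namelist_update | tail -1`
    # the mtime lines look like:  doy , hour , minute
    set start_time = `head -3 namelist_update | tail -1`
    set stop_time  = `head -4 namelist_update | tail -1`
    set hist_time  = `head -5 namelist_update | tail -1`
    echo "advance TIEGCM year ${year}: mtime ${start_time} to ${stop_time}"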
Property changes on: DART/trunk/models/tiegcm/dart_to_model.f90
___________________________________________________________________
Added: mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author URL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/tiegcm/model_mod.f90
===================================================================
--- DART/trunk/models/tiegcm/model_mod.f90 (rev 0)
+++ DART/trunk/models/tiegcm/model_mod.f90 2010-05-17 20:42:20 UTC (rev 4367)
@@ -0,0 +1,1842 @@
+! DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+! provided by UCAR, "as is", without charge, subject to all terms of use at
+! http://www.image.ucar.edu/DAReS/DART/DART_download
+
+module model_mod
+
+!
+! $URL$
+! $Id$
+! $Revision$
+! $Date$
+
+!------------------------------------------------------------------
+!
+! Interface for HAO-TIEGCM
+!
+!------------------------------------------------------------------
+
+! DART Modules
+use types_mod, only : r8, digits12
+use time_manager_mod, only : time_type, set_calendar_type, set_time_missing, &
+ set_time, get_time, print_time, &
+ set_date, get_date, print_date, &
+ operator(*), operator(+), operator(-), &
+ operator(>), operator(<), operator(/), &
+ operator(/=), operator(<=)
+use location_mod, only : location_type, get_close_maxdist_init, &
+ get_close_obs_init, get_close_obs, &
+ set_location, get_location, query_location, &
+ get_dist, vert_is_height
+use utilities_mod, only : file_exist, open_file, close_file, &
+ error_handler, E_ERR, E_MSG, E_WARN, nmlfileunit, &
+ do_output, find_namelist_in_file, check_namelist_read, &
+ do_nml_file, do_nml_term, nc_check, &
+ register_module
+use obs_kind_mod, only : KIND_U_WIND_COMPONENT, &! just for definition
+ KIND_V_WIND_COMPONENT, &! just for definition
+ KIND_TEMPERATURE, &! just for definition
+ KIND_ELECTRON_DENSITY, &! for Ne obs
+ KIND_DENSITY ! for neutral density observation
+ ! (for now [O]; minor species ignored)
+use random_seq_mod, only : random_seq_type, init_random_seq, random_gaussian
+
+use typesizes
+use netcdf
+
+implicit none
+private
+
+public :: get_model_size, &
+ adv_1step, &
+ get_state_meta_data, &
+ model_interpolate, &
+ get_model_time_step, &
+ end_model, &
+ static_init_model, &
+ init_time, &
+ init_conditions, &
+ nc_write_model_atts, &
+ nc_write_model_vars, &
+ pert_model_state, &
+ get_close_maxdist_init, &
+ get_close_obs_init, &
+ get_close_obs, &
+ ens_mean_for_model
+!TIEGCM specific routines
+public :: model_type, &
+ init_model_instance, &
+ end_model_instance, &
+ prog_var_to_vector, &
+ vector_to_prog_var, &
+ read_TIEGCM_restart, &
+ update_TIEGCM_restart, &
+ read_TIEGCM_definition, &
+ read_TIEGCM_secondary
+
+! version controlled file description for error handling, do not edit
+character(len=128), parameter :: &
+ source = "$URL$", &
+ revision = "$Revision$", &
+ revdate = "$Date$"
+
+!------------------------------------------------------------------
+! define model parameters
+
+
+integer :: nilev, nlev, nlon, nlat
+real(r8),dimension(:), allocatable :: lons, lats, levs, ilevs
+real(r8),dimension(:,:,:),allocatable :: ZGtiegcm !! auxiliary variable(geopotential height[cm])
+real :: TIEGCM_missing_value !! global attribute
+
+integer :: time_step_seconds = 120 ! tiegcm time step
+integer :: time_step_days = 0 ! tiegcm time step
+character (len=19) :: restart_file = 'tiegcm_restart_p.nc'
+character (len=11) :: secondary_file = 'tiegcm_s.nc'
+
+integer, parameter :: TYPE_local_NE = 0
+integer, parameter :: TYPE_local_TN = 1
+integer, parameter :: TYPE_local_TN_NM = 2
+integer, parameter :: TYPE_local_UN = 3
+integer, parameter :: TYPE_local_UN_NM = 4
+integer, parameter :: TYPE_local_VN = 5
+integer, parameter :: TYPE_local_VN_NM = 6
+integer, parameter :: TYPE_local_O1 = 7
+integer, parameter :: TYPE_local_O1_NM = 8
+
+type(time_type) :: time_step
+integer :: model_size
+
+type model_type
+ real(r8), pointer :: vars_3d(:,:,:,:)
+ type(time_type) :: valid_time
+end type model_type
+
+integer :: state_num_3d = 9
+ ! -- interface levels --
+ ! NE
+ ! -- midpoint levels --
+ ! O1 O1_NM (O2 O2_NM NO NO_NM excluded for now)
+ ! -- midpoint levels; top slot missing --
+ ! TN TN_NM UN UN_NM VN VN_NM
+logical :: output_state_vector = .false.
+ ! .true. results in a "state-vector" netCDF file
+ ! .false. results in a "prognostic-var" netCDF file
+namelist /model_nml/ output_state_vector, state_num_3d
+
+logical :: first_pert_call = .true.
+type(random_seq_type) :: random_seq
+
+
+
+!------------------------------------------------------------------
+
+character(len = 129) :: msgstring
+logical, save :: module_initialized = .false.
+
+contains
+
+!==================================================================
+
+
+subroutine static_init_model()
+!------------------------------------------------------------------
+!
+! Called to do one time initialization of the model. As examples,
+! might define information about the model size or model timestep.
+! In models that require pre-computed static data, for instance
+! spherical harmonic weights, these would also be computed here.
+! Can be a NULL INTERFACE for the simplest models.
+
+ integer :: i
+ integer :: iunit, io
+
+if (module_initialized) return ! only need to do this once
+
+! Print module information to log file and stdout.
+call register_module(source, revision, revdate)
+
+! Since this routine calls other routines that could call this routine
+! we'll say we've been initialized pretty dang early.
+module_initialized = .true.
+
+!! Read the namelist entry for model_mod from input.nml
+!call find_namelist_in_file("input.nml", "model_nml", iunit)
+!read(iunit, nml = model_nml, iostat = io)
+!call check_namelist_read(iunit, io, "model_nml")
+
+!if (do_nml_file()) write(nmlfileunit, nml=model_nml)
+!if (do_nml_term()) write( * , nml=model_nml)
+
+! Reading in TIEGCM grid definition etc from TIEGCM restart file
+call read_TIEGCM_definition(restart_file)
+
+! Reading in Geopotential Height from TIEGCM secondary output file
+call read_TIEGCM_secondary(secondary_file)
+
+! Compute overall model size
+model_size = nlon * nlat * nlev * state_num_3d
+
+if (do_output()) write(*,*)'nlon = ',nlon
+if (do_output()) write(*,*)'nlat = ',nlat
+if (do_output()) write(*,*)'nlev = ',nlev
+if (do_output()) write(*,*)'n3D = ',state_num_3d
+
+! Might as well use the Gregorian Calendar
+call set_calendar_type('Gregorian')
+
+! The time_step in terms of a time type must also be initialized.
+time_step = set_time(time_step_seconds, time_step_days)
+
+
+
+end subroutine static_init_model
+
+
+
+subroutine init_conditions(x)
+!------------------------------------------------------------------
+! subroutine init_conditions(x)
+!
+! Returns a model state vector, x, that is some sort of appropriate
+! initial condition for starting up a long integration of the model.
+! At present, this is only used if the namelist parameter
+! start_from_restart is set to .false. in the program perfect_model_obs.
+! If this option is not to be used in perfect_model_obs, or if no
+! synthetic data experiments using perfect_model_obs are planned,
+! this can be a NULL INTERFACE.
+
+real(r8), intent(out) :: x(:)
+
+if ( .not. module_initialized ) call static_init_model
+
+end subroutine init_conditions
+
+
+
+subroutine adv_1step(x, time)
+!------------------------------------------------------------------
+! subroutine adv_1step(x, time)
+!
+! Does a single timestep advance of the model. The input value of
+! the vector x is the starting condition and x is updated to reflect
+! the changed state after a timestep. The time argument is intent
+! in and is used for models that need to know the date/time to
+! compute a timestep, for instance for radiation computations.
+! This interface is only called if the namelist parameter
+! async is set to 0 in perfect_model_obs of filter or if the
+! program integrate_model is to be used to advance the model
+! state as a separate executable. If one of these options
+! is not going to be used (the model will only be advanced as
+! a separate model-specific executable), this can be a
+! NULL INTERFACE.
+
+real(r8), intent(inout) :: x(:)
+type(time_type), intent(in) :: time
+
+end subroutine adv_1step
+
+
+
+function get_model_size()
+!------------------------------------------------------------------
+!
+! Returns the size of the model as an integer. Required for all
+! applications.
+
+integer :: get_model_size
+
+if ( .not. module_initialized ) call static_init_model
+
+get_model_size = model_size
+
+end function get_model_size
+
+
+
+subroutine init_time(time)
+!------------------------------------------------------------------
+!
+! Companion interface to init_conditions. Returns a time that is somehow
+! appropriate for starting up a long integration of the model.
+! At present, this is only used if the namelist parameter
+! start_from_restart is set to .false. in the program perfect_model_obs.
+! If this option is not to be used in perfect_model_obs, or if no
+! synthetic data experiments using perfect_model_obs are planned,
+! this can be a NULL INTERFACE.
+
+type(time_type), intent(out) :: time
+
+if ( .not. module_initialized ) call static_init_model
+
+! for now, just set to 0
+time = set_time(0,0)
+
+end subroutine init_time
+
+
+
+subroutine model_interpolate(x, location, itype, obs_val, istatus)
+!------------------------------------------------------------------
+!
+! Given a state vector, a location, and a model state variable type,
+! interpolates the state variable field to that location and returns
+! the value in obs_val. The istatus variable should be returned as
+! 0 unless there is some problem in computing the interpolation in
+! which case an alternate value should be returned. The itype variable
+! is a model specific integer that specifies the type of field (for
+! instance temperature, zonal wind component, etc.). In low order
+! models that have no notion of types of variables, this argument can
+! be ignored. For applications in which only perfect model experiments
+! with identity observations (i.e. only the value of a particular
+! state variable is observed), this can be a NULL INTERFACE.
+
+real(r8), intent(in) :: x(:)
+type(location_type), intent(in) :: location
+integer, intent(in) :: itype
+real(r8), intent(out) :: obs_val
+integer, intent(out) :: istatus
+
+integer :: i, vstatus, which_vert
+integer :: lat_below, lat_above, lon_below, lon_above
+real(r8) :: lon_fract, temp_lon, lat_fract
+real(r8) :: lon, lat, height, lon_lat_lev(3)
+real(r8) :: bot_lon, top_lon, delta_lon, bot_lat, top_lat, delta_lat
+real(r8) :: val(2,2), a(2)
+
+if ( .not. module_initialized ) call static_init_model
+
+! Default for successful return
+istatus = 0
+vstatus = 0
+
+! Get the position, determine if it is model level or pressure in vertical
+lon_lat_lev = get_location(location)
+lon = lon_lat_lev(1) ! degree
+lat = lon_lat_lev(2) ! degree
+if(vert_is_height(location)) then
+ height = lon_lat_lev(3)
+else
+ which_vert = nint(query_location(location))
+ write(msgstring,*) 'vertical coordinate type:',which_vert,' cannot be handled'
+ call error_handler(E_ERR,'model_interpolate',msgstring,source,revision,revdate)
+endif
+
+! Get lon and lat grid specs
+bot_lon = lons(1) ! 0
+top_lon = lons(nlon) ! 355
+delta_lon = abs((lons(1)-lons(2))) ! 5
+bot_lat = lats(1) ! -87.5
+top_lat = lats(nlat) ! 87.5
+delta_lat = abs((lats(1)-lats(2))) ! 5
+
+
+! Compute bracketing lon indices: DART [0, 360] TIEGCM [0, 355]
+if(lon >= bot_lon .and. lon <= top_lon) then ! 0 <= lon <= 355
+ lon_below = int((lon - bot_lon) / delta_lon) + 1
+ lon_above = lon_below + 1
+ lon_fract = (lon - lons(lon_below)) / delta_lon
+elseif (lon < bot_lon) then ! lon < 0
+ temp_lon = lon + 360.0
+ lon_below = int((temp_lon - bot_lon) / delta_lon) + 1
+ lon_above = lon_below + 1
+ lon_fract = (temp_lon - lons(lon_below)) / delta_lon
+
+elseif (lon >= (top_lon+delta_lon)) then ! 360 <= lon
+ temp_lon = lon - 360.0
+ lon_below = int((temp_lon - bot_lon) / delta_lon) + 1
+ lon_above = lon_below + 1
+ lon_fract = (temp_lon - lons(lon_below)) / delta_lon
+else ! 355 < lon < 360 at wraparound point
+ lon_below = nlon
+ lon_above = 1
+ lon_fract = (lon - top_lon) / delta_lon
+endif
+
+! compute neighboring lat rows: TIEGCM [-87.5, 87.5] DART [-90, 90]
+! NEED TO BE VERY CAREFUL ABOUT POLES; WHAT'S BEING DONE IS NOT GREAT!
+if(lat >= bot_lat .and. lat <= top_lat) then ! -87.5 <= lat <= 87.5
+ lat_below = int((lat - bot_lat) / delta_lat) + 1
+   lat_above = lat_below + 1
+ lat_fract = (lat - lats(lat_below) ) / delta_lat
+else if(lat < bot_lat) then ! South of bottom lat
+ lat_below = 1
+ lat_above = 1
+ lat_fract = 1.
+else ! North of top lat
+ lat_below = nlat
+ lat_above = nlat
+ lat_fract = 1.
+endif
+
+! Now, need to find the values for the four corners
+ call get_val(val(1, 1), x, lon_below, lat_below, height, itype, vstatus)
+ if (vstatus /= 1) call get_val(val(1, 2), x, lon_below, lat_above, height, itype, vstatus)
+ if (vstatus /= 1) call get_val(val(2, 1), x, lon_above, lat_below, height, itype, vstatus)
+ if (vstatus /= 1) call get_val(val(2, 2), x, lon_above, lat_above, height, itype, vstatus)
+
+
+! istatus meaning return expected obs? assimilate?
+! 0 obs and model are fine; yes yes
+! 1 fatal problem; no no
+! 2 exclude valid obs yes no
+!
+istatus = vstatus
+if(istatus /= 1) then
+ do i = 1, 2
+ a(i) = lon_fract * val(2, i) + (1.0 - lon_fract) * val(1, i)
+ end do
+ obs_val = lat_fract * a(2) + (1.0 - lat_fract) * a(1)
+else
+ obs_val = 0.
+endif
+
+!print*, 'model_interpolate', lon, lat, height,obs_val
+
+
+end subroutine model_interpolate
+
+
+
+subroutine get_val(val, x, lon_index, lat_index, height, obs_kind, istatus)
+!------------------------------------------------------------------
+!
+real(r8), intent(out) :: val
+real(r8), intent(in) :: x(:)
+integer, intent(in) :: lon_index, lat_index
+real(r8), intent(in) :: height
+integer, intent(in) :: obs_kind
+integer, intent(out) :: istatus
+
+integer :: var_type
+integer :: k, lev_top, lev_bottom
+real(r8) :: zgrid, delta_z
+real(r8) :: val_top, val_bottom, frac_lev
+
+! No errors to start with
+istatus = 0
+
+! To find a layer height: what's the unit of height [cm] ?
+h_loop:do k = 1, nlev
+ zgrid = ZGtiegcm(lon_index,lat_index,k) ![cm]
+ if (height <= zgrid) then
+ lev_top = k
+ lev_bottom = lev_top -1
+ delta_z = zgrid - ZGtiegcm(lon_index,lat_index,lev_bottom)
+ frac_lev = (zgrid - height)/delta_z
+ exit h_loop
+ endif
+enddo h_loop
+
+if (obs_kind == KIND_DENSITY) then
+
+ var_type = TYPE_local_O1
+ val_top = x(get_index(lat_index, lon_index, lev_top, var_type))
+ val_bottom = x(get_index(lat_index, lon_index, lev_bottom, var_type))
+
+elseif (obs_kind == KIND_ELECTRON_DENSITY) then
+
+ var_type = TYPE_local_NE
+ val_top = x(get_index(lat_index, lon_index, lev_top, var_type))
+ val_bottom = x(get_index(lat_index, lon_index, lev_bottom, var_type))
+
+else
+
+ istatus = 1
+ val = 0.
+ return
+
+end if
+
+val = frac_lev * val_bottom + (1.0 - frac_lev) * val_top
+
+end subroutine get_val
+
+
+
+function get_index(lat_index, lon_index, lev_index, var_type)
+!------------------------------------------------------------------
+!
+integer, intent(in) :: lat_index, lon_index, lev_index, var_type
+integer :: get_index
+
+get_index = 1 + var_type + (lev_index -1)*state_num_3d &
+ + (lat_index -1)*state_num_3d*nlev &
+ + (lon_index -1)*state_num_3d*nlev*nlat
+
+end function get_index
+
+
+
+function get_model_time_step()
+!------------------------------------------------------------------
+!
+! Returns the the time step of the model; the smallest increment
+! in time that the model is capable of advancing the state in a given
+! implementation. This interface is required for all applications.
+
+type(time_type) :: get_model_time_step
+
+if ( .not. module_initialized ) call static_init_model
+
+get_model_time_step = time_step
+
+end function get_model_time_step
+
+
+
+subroutine get_state_meta_data(index_in, location, var_type)
+!------------------------------------------------------------------
+!
+! Given an integer index into the state vector structure, returns the
+! associated location. A second intent(out) optional argument kind
+! can be returned if the model has more than one type of field (for
+! instance temperature and zonal wind component). This interface is
+! required for all filter applications as it is required for computing
+! the distance between observations and state variables.
+
+integer, intent(in) :: index_in
+type(location_type), intent(out) :: location
+integer, optional, intent(out) :: var_type
+
+integer :: indx, num_per_col, col_num, col_elem
+integer :: lon_index, lat_index, lev_index
+real(r8) :: lon, lat, lev
+integer :: local_var_type, var_type_temp
+
+if ( .not. module_initialized ) call static_init_model
+
+! Easier to compute with a 0 to size -1 index
+indx = index_in -1
+
+! Compute number of items per column
+num_per_col = nlev * state_num_3d
+
+! What column is this index in
+col_num = indx / num_per_col
+col_elem = indx - col_num * num_per_col
+
+! what lon and lat index for this column
+lon_index = col_num /nlat
+lat_index = col_num - lon_index * nlat
+
+! Now figure out which beast in column this is
+lev_index = col_elem / state_num_3d
+
+! Get actual lon lat values from static_init_model arrays
+lon = lons(lon_index + 1)
+lat = lats(lat_index + 1)
+
+! Find which var_type this element is
+var_type_temp = mod(col_elem, state_num_3d)
+
+if (var_type_temp == 0) then !NE
+ local_var_type = KIND_ELECTRON_DENSITY
+else if (var_type_temp == 1) then !TN
+ local_var_type = KIND_TEMPERATURE
+else if (var_type_temp == 2) then !TN_NM
+ local_var_type = KIND_TEMPERATURE
+else if (var_type_temp == 3) then !UN
+ local_var_type = KIND_U_WIND_COMPONENT
+else if (var_type_temp == 4) then !UN_NM
+ local_var_type = KIND_U_WIND_COMPONENT
+else if (var_type_temp == 5) then !VN
+ local_var_type = KIND_V_WIND_COMPONENT
+else if (var_type_temp == 6) then !VN_NM
+ local_var_type = KIND_V_WIND_COMPONENT
+else if (var_type_temp == 7) then !O1
+ local_var_type = KIND_DENSITY
+else if (var_type_temp == 8) then !O1_NM
+ local_var_type = KIND_DENSITY
+else
+ write(msgstring,*)"unknown var_type for index ",index_in
+ call error_handler(E_ERR,"get_state_meta_data", msgstring, source, revision, revdate)
+endif
+
+if (var_type_temp == 0) then !NE defined at interface levels
+ lev = ilevs(lev_index + 1)
+else
+ lev = levs(lev_index + 1) !TN UN VN O1 defined at midpoints
+endif
+
+location = set_location(lon,lat,lev,2) ! 2 == pressure (hPa) 3 == height
+
+! If the type is wanted, return it
+if(present(var_type)) var_type = local_var_type
+
+end subroutine get_state_meta_data
+
+
+
+subroutine end_model()
+!------------------------------------------------------------------
+!
+! Does any shutdown and clean-up needed for model. Can be a NULL
+! INTERFACE if the model has no need to clean up storage, etc.
+
+if ( .not. module_initialized ) call static_init_model
+
+end subroutine end_model
+
+
+
+function nc_write_model_atts( ncFileID ) result (ierr)
+!------------------------------------------------------------------
+! TJH 24 Oct 2006 -- Writes the model-specific attributes to a netCDF file.
+! This includes coordinate variables and some metadata, but NOT
+! the model state vector. We do have to allocate SPACE for the model
+! state vector, but that variable gets filled as the model advances.
+!
+! As it stands, this routine will work for ANY model, with no modification.
+!
+! The simplest possible netCDF file would contain a 3D field
+! containing the state of 'all' the ensemble members. This requires
+! three coordinate variables -- one for each of the dimensions
+! [model_size, ensemble_member, time]. A little metadata is useful,
+! so we can also create some 'global' attributes.
+! This is what is implemented here.
+!
+! Once the simplest case is working, this routine (and nc_write_model_vars)
+! can be extended to create a more logical partitioning of the state vector,
+! fundamentally creating a netCDF file with variables that are easily
+! plotted. The bgrid model_mod is perhaps a good one to view, keeping
+! in mind it is complicated by the fact it has two coordinate systems.
+! There are stubs in this template, but they are only stubs.
+!
+! TJH 29 Jul 2003 -- for the moment, all errors are fatal, so the
+! return code is always '0 == normal', since the fatal errors stop execution.
+!
+! assim_model_mod:init_diag_output uses information from the location_mod
+! to define the location dimension and variable ID. All we need to do
+! is query, verify, and fill ...
+!
+! Typical sequence for adding new dimensions,variables,attributes:
+! NF90_OPEN ! open existing netCDF dataset
+! NF90_redef ! put into define mode
+! NF90_def_dim ! define additional dimensions (if any)
+! NF90_def_var ! define variables: from name, type, and dims
+! NF90_put_att ! assign attribute values
+! NF90_ENDDEF ! end definitions: leave define mode
+! NF90_put_var ! provide values for variable
+! NF90_CLOSE ! close: save updated netCDF dataset
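+!
+! A minimal sketch of that sequence (illustrative only: 'example.nc', the
+! 'level' dimension/variable, and the 'sketch' labels are not part of this
+! model; declarations are omitted):
+!
+!   call nc_check(nf90_open('example.nc', NF90_WRITE, ncid), 'sketch', 'open')
+!   call nc_check(nf90_redef(ncid),                          'sketch', 'redef')
+!   call nc_check(nf90_def_dim(ncid, 'level', 10, dimid),    'sketch', 'def_dim')
+!   call nc_check(nf90_def_var(ncid, 'level', nf90_double, dimid, varid), 'sketch', 'def_var')
+!   call nc_check(nf90_put_att(ncid, varid, 'units', 'hPa'), 'sketch', 'put_att')
+!   call nc_check(nf90_enddef(ncid),                         'sketch', 'enddef')
+!   call nc_check(nf90_put_var(ncid, varid, (/ (i*100.0_r8, i=1,10) /)), 'sketch', 'put_var')
+!   call nc_check(nf90_close(ncid),                          'sketch', 'close')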
+
+
+integer, intent(in) :: ncFileID ! netCDF file identifier
+integer :: ierr ! return value of function
+
+integer :: nDimensions, nVariables, nAttributes, unlimitedDimID
+
+integer :: StateVarDimID ! netCDF pointer to state variable dimension (model size)
+integer :: MemberDimID ! netCDF pointer to dimension of ensemble (ens_size)
+integer :: TimeDimID ! netCDF pointer to time dimension (unlimited)
+
+integer :: StateVarVarID ! netCDF pointer to state variable coordinate array
+integer :: StateVarID ! netCDF pointer to 3D [state,copy,time] array
+
+integer :: TNVarID, TN_NMVarID, UNVarID, UN_NMVarID, VNVarID, VN_NMVarID
+integer :: O1VarID, O1_NMVarID
+integer :: NEVarID
+integer :: lonDimID, latDimID, levDimID, ilevDimID
+integer :: lonVarID, latVarID, levVarID, ilevVarID
+
+! we are going to need these to record the creation date in the netCDF file.
+! This is entirely optional, but nice.
+
+character(len=8) :: crdate ! needed by F90 DATE_AND_TIME intrinsic
+character(len=10) :: crtime ! needed by F90 DATE_AND_TIME intrinsic
+character(len=5) :: crzone ! needed by F90 DATE_AND_TIME intrinsic
+integer, dimension(8) :: values ! needed by F90 DATE_AND_TIME intrinsic
+character(len=NF90_MAX_NAME) :: str1
+
+integer :: i
+
+if ( .not. module_initialized ) call static_init_model
+
+!-------------------------------------------------------------------------------
+! make sure ncFileID refers to an open netCDF file,
+! and then put into define mode.
+!-------------------------------------------------------------------------------
+
+ierr = -1 ! assume things go poorly
+
+call nc_check(nf90_Inquire(ncFileID, nDimensions, nVariables, &
+ nAttributes, unlimitedDimID), 'nc_write_model_atts','inquire')
+call nc_check(nf90_Redef(ncFileID),'nc_write_model_atts','redef')
+
+!-------------------------------------------------------------------------------
+! We need the dimension ID for the number of copies/ensemble members, and
+! we might as well check to make sure that Time is the Unlimited dimension.
+! Our job is to create the 'model size' dimension.
+!-------------------------------------------------------------------------------
+
+call nc_check(nf90_inq_dimid(ncid=ncFileID, name="copy", dimid=MemberDimID),&
+ 'nc_write_model_atts', 'copy dimid')
+call nc_check(nf90_inq_dimid(ncid=ncFileID, name="time", dimid= TimeDimID),&
+ 'nc_write_model_atts', 'time dimid')
+
+if ( TimeDimID /= unlimitedDimId ) then
+ write(msgstring,*)"Time Dimension ID ",TimeDimID, &
+ " should equal Unlimited Dimension ID",unlimitedDimID
+ call error_handler(E_ERR,"nc_write_model_atts", msgstring, source, revision, revdate)
+endif
+
+!-------------------------------------------------------------------------------
+! Define the model size / state variable dimension / whatever ...
+!-------------------------------------------------------------------------------
+call nc_check(nf90_def_dim(ncid=ncFileID, name="StateVariable", &
+ len=model_size, dimid = StateVarDimID),&
+ 'nc_write_model_atts', 'state def_dim')
+
+!-------------------------------------------------------------------------------
+! Write Global Attributes
+!-------------------------------------------------------------------------------
+
+call DATE_AND_TIME(crdate,crtime,crzone,values)
+write(str1,'(''YYYY MM DD HH MM SS = '',i4,5(1x,i2.2))') &
+ values(1), values(2), values(3), values(5), values(6), values(7)
+
+call nc_check(nf90_put_att(ncFileID, NF90_GLOBAL, "creation_date" ,str1 ),&
+ 'nc_write_model_atts', 'creation put')
+call nc_check(nf90_put_att(ncFileID, NF90_GLOBAL, "model_source" ,source ),&
+ 'nc_write_model_atts', 'source put')
+call nc_check(nf90_put_att(ncFileID, NF90_GLOBAL, "model_revision",revision),&
+ 'nc_write_model_atts', 'revision put')
+call nc_check(nf90_put_att(ncFileID, NF90_GLOBAL, "model_revdate" ,revdate ),&
+ 'nc_write_model_atts', 'revdate put')
+call nc_check(nf90_put_att(ncFileID, NF90_GLOBAL, "model","TIEGCM" ),&
+ 'nc_write_model_atts', 'model put')
+
+!-------------------------------------------------------------------------------
+! Here is the extensible part. The simplest scenario is to output the state vector
+! as-is; parsing the state vector into model-specific parts is complicated, and you
+! need to know the geometry, the output variables (PS,U,V,T,Q,...) etc. We're
+! skipping the complicated part.
+!-------------------------------------------------------------------------------
+
+if ( output_state_vector ) then
+
+ !----------------------------------------------------------------------------
+ ! Create a variable for the state vector
+ !----------------------------------------------------------------------------
+
+ ! Define the state vector coordinate variable and some attributes.
+ call nc_check(nf90_def_var(ncid=ncFileID,name="StateVariable", xtype=nf90_int, &
+ dimids=StateVarDimID, varid=StateVarVarID), &
+ 'nc_write_model_atts', 'statevariable def_var')
+ call nc_check(nf90_put_att(ncFileID, StateVarVarID, "long_name", "State Variable ID"), &
+ 'nc_write_model_atts', 'statevariable long_name')
+ call nc_check(nf90_put_att(ncFileID, StateVarVarID, "units", "indexical"), &
+ 'nc_write_model_atts', 'statevariable units')
+ call nc_check(nf90_put_att(ncFileID, StateVarVarID, "valid_range", (/ 1, model_size /)), &
+ 'nc_write_model_atts', 'statevariable valid_range')
+
+ ! Define the actual (3D) state vector, which gets filled as time goes on ...
+ call nc_check(nf90_def_var(ncid=ncFileID, name="state", xtype=nf90_real, &
+ dimids = (/ StateVarDimID, MemberDimID, unlimitedDimID /), &
+ varid=StateVarID), 'nc_write_model_atts', 'state def_var')
+ call nc_check(nf90_put_att(ncFileID, StateVarID, "long_name", "model state or fcopy"), &
+ 'nc_write_model_atts', 'state long_name')
+
+ ! Leave define mode so we can fill the coordinate variable.
+ call nc_check(nf90_enddef(ncfileID), 'nc_write_model_atts', 'state enddef')
+
+ ! Fill the state variable coordinate variable
+ call nc_check(nf90_put_var(ncFileID, StateVarVarID, (/ (i,i=1,model_size) /) ), &
+ 'nc_write_model_atts', 'state put_var')
+
+else
+
+ !----------------------------------------------------------------------------
+ ! We need to process the prognostic variables.
+ !----------------------------------------------------------------------------
+
+ ! This block is a stub for something more complicated.
+ ! Usually, the control for the execution of this block is a namelist variable.
+ ! Take a peek at the bgrid model_mod.f90 for a (rather complicated) example.
+
+ !----------------------------------------------------------------------------
+ ! Define the dimensions IDs
+ !----------------------------------------------------------------------------
+
+ call nc_check(nf90_def_dim(ncid=ncFileID, name="lon", &
+ & len = nlon, dimid = lonDimID), 'nc_write_model_atts')
+ call nc_check(nf90_def_dim(ncid=ncFileID, name="lat", &
+ & len = nlat, dimid = latDimID), 'nc_write_model_atts')
+ call nc_check(nf90_def_dim(ncid=ncFileID, name="lev", &
+ & len = nlev, dimid = levDimID), 'nc_write_model_atts')
+ call nc_check(nf90_def_dim(ncid=ncFileID, name="ilev", &
+ & len = nilev, dimid = ilevDimID), 'nc_write_model_atts')
+
+ !----------------------------------------------------------------------------
+ ! Create the (empty) Variables and the Attributes
+ !----------------------------------------------------------------------------
+
+ call nc_check(nf90_def_var(ncFileID, name="lon", &
+ & xtype=nf90_double, dimids=lonDimID, varid=lonVarID),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, lonVarID, &
+ & "long_name", "geographic longitude (-west, +east)"),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, lonVarID, "units", "degrees_east"),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, lonVarID, "valid_range", &
+ & (/ -180.0_r8, 180.0_r8 /)),'nc_write_model_atts')
+
+ call nc_check(nf90_def_var(ncFileID, name="lat", &
+ & xtype=nf90_double, dimids=latDimID, varid=latVarID),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, latVarID, &
+ & "long_name", "geographic latitude (-south +north)"),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, latVarID, "units", "degrees_north"),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, latVarID, "valid_range", &
+ & (/ -90.0_r8, 90.0_r8 /)),'nc_write_model_atts')
+
+ call nc_check(nf90_def_var(ncFileID, name="lev", &
+ & xtype=nf90_double, dimids=levDimID, varid=levVarID),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, levVarID, "long_name", "midpoint levels"),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, levVarID, "units", "hPa"),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, levVarID, "positive", "up"),&
+ 'nc_write_model_atts')
+
+ call nc_check(nf90_def_var(ncFileID, name="ilev", &
+ & xtype=nf90_double, dimids=ilevDimID, varid=ilevVarID),&
+ 'nc_write_model_atts')
+ call nc_check(nf90_put_att(ncFileID, ilevVarID, "long_name", "interface levels"),&
@@ Diff output truncated at 40000 characters. @@
From nancy at ucar.edu Mon May 17 15:13:13 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Mon, 17 May 2010 15:13:13 -0600
Subject: [Dart-dev] [4368] DART/trunk/obs_def/obs_def_upper_atm_mod.f90:
This is needed by the TIEGCM model.
Message-ID:
Revision: 4368
Author: thoar
Date: 2010-05-17 15:13:13 -0600 (Mon, 17 May 2010)
Log Message:
-----------
This is needed by the TIEGCM model.
Added Paths:
-----------
DART/trunk/obs_def/obs_def_upper_atm_mod.f90
-------------- next part --------------
Added: DART/trunk/obs_def/obs_def_upper_atm_mod.f90
===================================================================
--- DART/trunk/obs_def/obs_def_upper_atm_mod.f90 (rev 0)
+++ DART/trunk/obs_def/obs_def_upper_atm_mod.f90 2010-05-17 21:13:13 UTC (rev 4368)
@@ -0,0 +1,18 @@
+! DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+! provided by UCAR, "as is", without charge, subject to all terms of use at
+! http://www.image.ucar.edu/DAReS/DART/DART_download
+
+! This module supports upper-atmosphere observation types used with the TIEGCM
+! model: CHAMP density observations and GPS-derived electron density profiles.
+
+! BEGIN DART PREPROCESS KIND LIST
+! CHAMP_DENSITY, KIND_DENSITY, COMMON_CODE
+! GPS_PROFILE, KIND_ELECTRON_DENSITY, COMMON_CODE
+! END DART PREPROCESS KIND LIST
+
+!
+! $URL$
+! $Id$
+! $Revision$
+! $Date$
+
Property changes on: DART/trunk/obs_def/obs_def_upper_atm_mod.f90
___________________________________________________________________
Added: mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author URL Id
Added: svn:eol-style
+ native
From nancy at ucar.edu Fri May 21 09:54:45 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Fri, 21 May 2010 09:54:45 -0600
Subject: [Dart-dev] [4369] DART/trunk/models/wrf/model_mod.f90: Ryan Torn' s
changes to add additional metadata to the netCDF diagnostic files when
Message-ID:
Revision: 4369
Author: nancy
Date: 2010-05-21 09:54:45 -0600 (Fri, 21 May 2010)
Log Message:
-----------
Ryan Torn's changes to add additional metadata to the netCDF diagnostic files when
writing the state data as a single vector. There is no change to the code in the more
common case of breaking the state vector into the constituent variables, U, V, T, etc.,
and writing them in separate netCDF variables. The new additional metadata records
which variables are in the state vector, what order they are in, and what size they are.
This allows a postprocessing program to extract the various fields at a later point.
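As a sketch of how such a postprocessing step might look (the 'Prior_Diag.nc'
file name and the local variable names are only illustrative; 'WRFStateVariables'
and 'metadatalength' are the netCDF names used by this change), the state
variable names could be listed with something like:

   program list_wrf_state_variables
   use netcdf
   implicit none
   integer :: ncid, dimid, varid, nvars, namelen, i, istat
   character(len=256) :: name
   ! open an existing DART diagnostic file (illustrative name)
   istat = nf90_open('Prior_Diag.nc', NF90_NOWRITE, ncid)
   ! how many state variables are recorded, and how long are their names?
   istat = nf90_inq_dimid(ncid, 'WRFStateVariables', dimid)
   istat = nf90_inquire_dimension(ncid, dimid, len=nvars)
   istat = nf90_inq_dimid(ncid, 'metadatalength', dimid)
   istat = nf90_inquire_dimension(ncid, dimid, len=namelen)
   istat = nf90_inq_varid(ncid, 'WRFStateVariables', varid)
   do i = 1, nvars
      name = ' '
      ! each variable name is one column of the character matrix
      istat = nf90_get_var(ncid, varid, name, start=(/1,i/), count=(/namelen,1/))
      write(*,*) i, trim(name)
   end do
   istat = nf90_close(ncid)
   end program list_wrf_state_variables

(Checking the istat return codes is omitted for brevity.)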
Also included in this commit is a minor rearrangement of where the module global variables
are declared, to collect the 5 deprecated namelist items together with a comment about them
having no impact on the code. This has been true for a while now, but it wasn't as
clear as it could have been that setting those items has no impact on the code. Removing
them from the namelist is a non-backwards-compatible change (users will get a fatal error
at runtime if they remain in the user's namelist), but they should be removed as soon as
we freeze the code and make a release.
Modified Paths:
--------------
DART/trunk/models/wrf/model_mod.f90
-------------- next part --------------
Modified: DART/trunk/models/wrf/model_mod.f90
===================================================================
--- DART/trunk/models/wrf/model_mod.f90 2010-05-17 21:13:13 UTC (rev 4368)
+++ DART/trunk/models/wrf/model_mod.f90 2010-05-21 15:54:45 UTC (rev 4369)
@@ -163,25 +163,18 @@
logical :: default_state_variables = .true. ! use default state list?
character(len=129) :: wrf_state_variables(num_state_table_columns,max_state_variables) = 'NULL'
character(len=129) :: wrf_state_bounds(num_bounds_table_columns,max_state_variables) = 'NULL'
-integer :: num_moist_vars = 3
integer :: num_domains = 1
integer :: calendar_type = GREGORIAN
integer :: assimilation_period_seconds = 21600
-logical :: surf_obs = .true.
-logical :: soil_data = .true.
-logical :: h_diab = .false.
! Max height a surface obs can be away from the actual model surface
! and still be accepted (in meters)
real (kind=r8) :: sfc_elev_max_diff = -1.0_r8 ! could be something like 200.0_r8
-! adv_mod_command moved to dart_to_wrf namelist; ignored here.
-character(len = 72) :: adv_mod_command = ''
real (kind=r8) :: center_search_half_length = 500000.0_r8
real(r8) :: circulation_pres_level = 80000.0_r8
real(r8) :: circulation_radius = 108000.0_r8
integer :: center_spline_grid_scale = 10
integer :: vert_localization_coord = VERTISHEIGHT
-! Allow (or not) observations above the surface but below the lowest
-! sigma level.
+! Allow observations above the surface but below the lowest sigma level.
logical :: allow_obs_below_vol = .false.
!nc -- we are adding these to the model.nml until they appear in the NetCDF files
logical :: polar = .false. ! wrap over the poles
@@ -190,7 +183,19 @@
!JPH -- single column model flag
logical :: scm = .false. ! using the single column model
-! JPH note that soil_data and h_diab are never used and can be removed.
+! obsolete items; ignored by this code.
+! non-backwards-compatible change. should be removed,
+! but see note below about namelist.
+integer :: num_moist_vars
+logical :: surf_obs, soil_data, h_diab
+
+! adv_mod_command moved to dart_to_wrf namelist; ignored here.
+character(len = 72) :: adv_mod_command = ''
+
+! num_moist_vars, surf_obs, soil_data, h_diab, and adv_mod_command
+! are IGNORED no matter what their settings in the namelist are.
+! they are obsolete, but removing them here will cause a fatal error
+! until users remove them from their input.nml files as well.
namelist /model_nml/ output_state_vector, num_moist_vars, &
num_domains, calendar_type, surf_obs, soil_data, h_diab, &
default_state_variables, wrf_state_variables, &
@@ -3438,7 +3443,7 @@
!-----------------------------------------------------------------
integer :: nDimensions, nVariables, nAttributes, unlimitedDimID
-integer :: StateVarDimID, StateVarVarID, StateVarID, TimeDimID
+integer :: StateVarDimID, StateVarID, TimeDimID
integer, dimension(num_domains) :: weDimID, weStagDimID, snDimID, snStagDimID, &
btDimID, btStagDimID, slSDimID, tmp
@@ -3447,6 +3452,7 @@
integer :: DXVarID, DYVarID, TRUELAT1VarID, TRUELAT2VarID, STAND_LONVarID
integer :: CEN_LATVarID, CEN_LONVarID, MAP_PROJVarID
integer :: PERIODIC_XVarID, POLARVarID
+integer :: metadataID, wrfStateID, wrfDimID, WRFStateVarID, WRFStateDimID
integer, dimension(num_domains) :: DNVarID, ZNUVarID, DNWVarID, phbVarID, &
MubVarID, LonVarID, LatVarID, ilevVarID, XlandVarID, hgtVarID
@@ -3877,22 +3883,35 @@
! Create attributes for the state vector
!-----------------------------------------------------------------
- ! Define the state vector coordinate variable
+ call nc_check(nf90_inq_dimid(ncFileID, "metadatalength", metadataID), &
+ 'nc_write_model_atts','inq_dimid metadatalength')
- call nc_check(nf90_def_var(ncid=ncFileID,name="StateVariable", xtype=nf90_int, &
- dimids=StateVarDimID, varid=StateVarVarID), &
- 'nc_write_model_atts','def_var StateVariable')
+ call nc_check(nf90_def_dim(ncid=ncFileID, name="WRFStateVariables", &
+ len = wrf%dom(1)%number_of_wrf_variables, dimid = wrfStateID), &
+ 'nc_write_model_atts','def_dim WRFStateVariables')
- call nc_check(nf90_put_att(ncFileID, StateVarVarID, "long_name", &
- "State Variable ID"), &
- 'nc_write_model_atts','put_att StateVariable long_name')
- call nc_check(nf90_put_att(ncFileID, StateVarVarID, "units", &
- "indexical"), &
- 'nc_write_model_atts','put_att StateVariable units')
- call nc_check(nf90_put_att(ncFileID, StateVarVarID, "valid_range", &
- (/ 1, wrf%model_size /)), &
- 'nc_write_model_atts','put_att StateVariable valid_range')
+ call nc_check(nf90_def_dim(ncid=ncFileID, name="WRFVarDimension", &
+ len = 3, dimid = wrfDimID), &
+ 'nc_write_model_atts','def_dim WRFVarDimensionID')
+ ! Define the state variable name variable
+ call nc_check(nf90_def_var(ncid=ncFileID,name="WRFStateVariables", xtype=nf90_char, &
+ dimids=(/ metadataID, wrfStateID /), varid=WRFStateVarID), &
+ 'nc_write_model_atts','def_var WRFStateVariables')
+
+ call nc_check(nf90_put_att(ncFileID, WRFStateVarID, "long_name", &
+ "WRF State Variable Name"), &
+ 'nc_write_model_atts','put_att WRFStateVariables long_name')
+
+ ! Define the WRF state variable dimension lengths
+ call nc_check(nf90_def_var(ncid=ncFileID,name="WRFStateDimensions", xtype=nf90_int, &
+ dimids=(/ wrfDimID, wrfStateID, DomDimID /), varid=WRFStateDimID), &
+ 'nc_write_model_atts','def_var WRFStateDimensions')
+
+ call nc_check(nf90_put_att(ncFileID, WRFStateDimID, "long_name", &
+ "WRF State Variable Dimensions"), &
+ 'nc_write_model_atts','put_att WRFStateDimensions long_name')
+
! Define the actual state vector
call nc_check(nf90_def_var(ncid=ncFileID, name="state", xtype=nf90_real, &
@@ -3929,9 +3948,21 @@
call nc_check(nf90_enddef(ncfileID),'nc_write_model_atts','enddef')
- call nc_check(nf90_put_var(ncFileID,StateVarVarID,(/ (i,i=1,wrf%model_size) /)), &
- 'nc_write_model_atts','put_var StateVarVarID')
+ do ind = 1,wrf%dom(1)%number_of_wrf_variables
+ my_index = wrf%dom(1)%var_index_list(ind)
+ call nc_check(nf90_put_var(ncFileID,WRFStateVarID,trim(wrf_state_variables(1,my_index)), &
+ start = (/ 1, ind /), count = (/ len_trim(wrf_state_variables(1,my_index)), 1 /)), &
+ 'nc_write_model_atts', 'put_var WRFStateVariables')
+ enddo
+ do id = 1, num_domains
+ do ind = 1,wrf%dom(id)%number_of_wrf_variables
+ call nc_check(nf90_put_var(ncFileID,WRFStateDimID,wrf%dom(id)%var_size(:,ind), &
+ start = (/ 1, ind, id /), count = (/ 3, 1, 1 /)), &
+ 'nc_write_model_atts', 'put_var WRFStateDimensions')
+ enddo
+ enddo
+
else ! physical arrays
do id=1,num_domains
@@ -4444,7 +4475,6 @@
if (debug) &
print*, 'model_mod.f90 :: get_model_pressure_profile :: n, v_p() ', n, v_p(1:n)
- !if( wrf%dom(id)%surf_obs ) then
if ( wrf%dom(id)%type_ps >= 0 ) then
ill = wrf%dom(id)%dart_ind(ll(1), ll(2), 1, wrf%dom(id)%type_ps)
@@ -4718,7 +4748,6 @@
source, revision, revdate)
endif
-!if(wrf%dom(id)%surf_obs ) then
if ( wrf%dom(id)%type_ps >= 0 ) then
ips = wrf%dom(id)%dart_ind(i,j,1,wrf%dom(id)%type_ps)
model_pressure_s = x(ips)
From nancy at ucar.edu Fri May 21 16:07:29 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Fri, 21 May 2010 16:07:29 -0600
Subject: [Dart-dev] [4370] DART/trunk: Adding a couple observation kinds for
the COAMPS folks.
Message-ID:
Revision: 4370
Author: thoar
Date: 2010-05-21 16:07:29 -0600 (Fri, 21 May 2010)
Log Message:
-----------
Adding a couple observation kinds for the COAMPS folks.
Also took steps to ensure Fortran parameter names stay within the
32-character limit. This was potentially a problem when
trying to 'build' horizontal wind names from the components.
Had to change XXXXX_10_METER_HORIZONTAL_WIND (where XXXXX is the platform)
to XXXXX_10_M_HORZ_WIND. Change was made consistent in obs_diag
and should not affect any of the existing Matlab scripts.
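For example, the old suffix '_10_METER_HORIZONTAL_WIND' is 25 characters, which
leaves only 7 characters for the platform part of the name before hitting the
32-character limit; the new '_10_M_HORZ_WIND' suffix is 15 characters, leaving 17.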
Modified Paths:
--------------
DART/trunk/diagnostics/threed_sphere/obs_diag.f90
DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90
-------------- next part --------------
Modified: DART/trunk/diagnostics/threed_sphere/obs_diag.f90
===================================================================
--- DART/trunk/diagnostics/threed_sphere/obs_diag.f90 2010-05-21 15:54:45 UTC (rev 4369)
+++ DART/trunk/diagnostics/threed_sphere/obs_diag.f90 2010-05-21 22:07:29 UTC (rev 4370)
@@ -2550,7 +2550,7 @@
if (indx1 > 0) then ! must be _?_WIND_COMPONENT
str3 = str1(1:indx1)//'_HORIZONTAL_WIND'
else ! must be _?_10_METER_WIND
- str3 = str1(1:indx2)//'_10_METER_HORIZONTAL_WIND'
+ str3 = str1(1:indx2)//'_10_M_HORZ_WIND'
indx1 = indx2
endif
@@ -3593,7 +3593,7 @@
indxN = index(str1(1:indx1),str2(1:indx2))
if (indxN > 0) then ! we know they are matching kinds
nwinds = nwinds + 1
- str3 = str1(1:indx2)//'_10_METER_HORIZONTAL_WIND'
+ str3 = str1(1:indx2)//'_10_M_HORZ_WIND'
names(max_obs_kinds + nwinds) = str3
endif
endif
Modified: DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90
===================================================================
--- DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90 2010-05-21 15:54:45 UTC (rev 4369)
+++ DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90 2010-05-21 22:07:29 UTC (rev 4370)
@@ -48,7 +48,12 @@
! be rerun to generate a obs_kind_mod.f90 file for use by the rest of the
! DART system. Future versions of the preprocess program will be able to
! generate this table automatically.
-!
+
+! F90 currently has a maximum length of 32 characters for the NAME of
+! any Fortran PARAMETER.
+
+integer, parameter :: paramname_length = 32
+
! Definition and public access to the observation types/kinds
! Unique index values associated with each observation type and
! kind strings are defined here.
@@ -157,11 +162,14 @@
KIND_VORTEX_WMAX = 84, &
! kinds for COAMPS (Tim Whitcomb)
- KIND_EXNER_FUNCTION = 85
+ KIND_EXNER_FUNCTION = 85, &
+ KIND_TURBULENT_KINETIC_ENERGY = 86, &
+ KIND_VERTLEVEL = 87
+
!! PRIVATE ONLY TO THIS MODULE. see comment below near the max_obs_specific
!! declaration.
-integer, parameter :: max_obs_generic = 85
+integer, parameter :: max_obs_generic = 87
!----------------------------------------------------------------------------
! This list is autogenerated by the 'preprocess' program. To add new
@@ -218,7 +226,7 @@
! restrictions on the length of parameter identifiers.
type obs_type_type
integer :: index
- character(len = 32) :: name
+ character(len = paramname_length) :: name
integer :: var_type
logical :: assimilate
logical :: evaluate
@@ -231,7 +239,7 @@
type obs_kind_type
integer :: index
- character(len = 32) :: name
+ character(len = paramname_length) :: name
end type obs_kind_type
! An obs_kind_name_type is defined by the preprocess program to store
@@ -349,6 +357,8 @@
obs_kind_names(83) = obs_kind_type(KIND_VORTEX_PMIN, 'KIND_VORTEX_PMIN')
obs_kind_names(84) = obs_kind_type(KIND_VORTEX_WMAX, 'KIND_VORTEX_WMAX')
obs_kind_names(85) = obs_kind_type(KIND_EXNER_FUNCTION, 'KIND_EXNER_FUNCTION')
+obs_kind_names(86) = obs_kind_type(KIND_TURBULENT_KINETIC_ENERGY, 'KIND_TURBULENT_KINETIC_ENERGY')
+obs_kind_names(87) = obs_kind_type(KIND_VERTLEVEL, 'KIND_VERTLEVEL')
! count here, then output below
@@ -477,7 +487,7 @@
! Returns observation type name
integer, intent(in) :: obs_kind_ind
-character(len = 32) :: get_obs_kind_name
+character(len = paramname_length) :: get_obs_kind_name
if ( .not. module_initialized ) call initialize_module
@@ -501,7 +511,7 @@
! Returns observation kind name
integer, intent(in) :: obs_kind_ind
-character(len=32) :: get_raw_obs_kind_name
+character(len=paramname_length) :: get_raw_obs_kind_name
if (.not. module_initialized) call initialize_module
@@ -651,7 +661,7 @@
character(len=*), intent(in), optional :: fform
integer, intent(in), optional :: use_list(:)
-character(len=32) :: fileformat
+character(len=paramname_length) :: fileformat
integer :: i
if ( .not. module_initialized ) call initialize_module
@@ -805,7 +815,7 @@
integer :: get_kind_from_menu
integer :: i, ierr
-character(len=32) :: in
+character(len=paramname_length) :: in
if ( .not. module_initialized ) call initialize_module
@@ -862,8 +872,8 @@
integer :: add_wind_names
integer :: ivar, nwinds
-character(len=len(my_names)) :: str1, str2, str3
-character(len=len(my_names)), dimension(2*max_obs_kinds) :: names
+character(len=paramname_length) :: str1, str2, str3
+character(len=paramname_length), dimension(2*max_obs_kinds) :: names
! Initially, the array of obs_kind_names is exactly 'max_num_obs' in length.
! This block finds the U,V wind pairs and creates the 'horizontal_wind'
@@ -878,8 +888,6 @@
nwinds = 0
-if ( DEBUG ) write(*,*)'my_names length is ',len(my_names)
-
! Copy all the known obs kinds to a local list that is SURELY too BIG
! as we find wind pairs, we insert the new name at the end of the known
! names.
@@ -939,7 +947,7 @@
indxN = index(str1(1:indx1),str2(1:indx2))
if (indxN > 0) then ! we know they are matching kinds
nwinds = nwinds + 1
- str3 = str1(1:indx2)//'_10_METER_HORIZONTAL_WIND'
+ str3 = str1(1:indx2)//'_10_M_HORZ_WIND'
names(max_obs_kinds + nwinds) = str3
endif
endif
@@ -1049,7 +1057,7 @@
if (indx1 > 0) then ! must be _?_WIND_COMPONENT
str3 = str1(1:indx1)//'_HORIZONTAL_WIND'
else ! must be _?_10_METER_WIND
- str3 = str1(1:indx2)//'_10_METER_HORIZONTAL_WIND'
+ str3 = str1(1:indx2)//'_10_M_HORZ_WIND'
indx1 = indx2
endif
@@ -1058,7 +1066,7 @@
! str1(1:indx1) and str2(1:indx1) should be the wind name -
! 'RADIOSONDE_' or 'SHIP_' or 'AIREP_' or ...
-if (index(str1, str2(1:indx1)) < 1) then
+if ( str1(1:indx1) /= str2(1:indx1) ) then
write(msg_string,*) 'around OBS ', obskey, trim(str1),trim(str2), 'observation types not compatible.'
call error_handler(E_WARN,'do_obs_form_pair',msg_string,source,revision,revdate)
endif
From nancy at ucar.edu Fri May 21 16:23:38 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Fri, 21 May 2010 16:23:38 -0600
Subject: [Dart-dev] [4371]
DART/trunk/models/coamps/coamps_flat_file_mod.f90: adding support for flat
file manipulation
Message-ID:
Revision: 4371
Author: thoar
Date: 2010-05-21 16:23:38 -0600 (Fri, 21 May 2010)
Log Message:
-----------
adding support for flat file manipulation
Added Paths:
-----------
DART/trunk/models/coamps/coamps_flat_file_mod.f90
-------------- next part --------------
Added: DART/trunk/models/coamps/coamps_flat_file_mod.f90
===================================================================
--- DART/trunk/models/coamps/coamps_flat_file_mod.f90 (rev 0)
+++ DART/trunk/models/coamps/coamps_flat_file_mod.f90 2010-05-21 22:23:38 UTC (rev 4371)
@@ -0,0 +1,209 @@
+!------------------------------
+! MODULE: coamps_flatfile_mod
+! AUTHOR: T. R. Whitcomb and P. A. Reinecke
+! Naval Research Laboratory
+! DART VERSION: ?????
+!
+! Module containing the data structure and routines for dealing with
+! COAMPS flat files.
+!------------------------------
+module coamps_flat_file_mod
+
+ use coamps_util_mod, only : C_REAL, &
+ C_REAL4, &
+ check_io_status, &
+ check_alloc_status, &
+ check_dealloc_status, &
+ fix_for_platform4, &
+ uppercase, &
+ lowercase
+
+ implicit none
+
+ private
+
+ !------------------------------
+ ! BEGIN PUBLIC INTERFACE
+ !------------------------------
+
+ ! Initialization
+ public :: read_flat_file
+ public :: write_flat_file
+ public :: generate_one_flat_file_name
+ !------------------------------
+ ! END PUBLIC INTERFACE
+ !------------------------------
+
+ !------------------------------
+ ! BEGIN EXTERNAL INTERFACES
+ !------------------------------
+ ! [none]
+ !------------------------------
+ ! END EXTERNAL INTERFACES
+ !------------------------------
+
+
+ !------------------------------
+ ! BEGIN TYPES AND CONSTANTS
+ !------------------------------
+ ! [none]
+ !------------------------------
+ ! END TYPES AND CONSTANTS
+ !------------------------------
+
+ !------------------------------
+ ! BEGIN MODULE VARIABLES
+ !------------------------------
+ character(len=128) :: &
+ source = "models/coamps/coamps_flat_file_mod.f90 $", &
+ revision = "$Revision$", &
+ revdate = "$Date$"
+
+! source = "$URL$", &
+! revision = "$Revision$", &
+! revdate = "$Date$"
+
+ logical :: module_initialized = .false.
+ !------------------------------
+ ! END MODULE VARIABLES
+ !------------------------------
+contains
+
+ !------------------------------
+ ! BEGIN PUBLIC ROUTINES
+ !------------------------------
+
+ ! write_flat_file
+ ! ----------------
+ ! Given the unit number of an *open* COAMPS flat
+ ! file, write the contents of an array to the file.
+ ! PARAMETERS
+ ! IN flat_unit unit number of an open flat file
+ ! IN flat_array data array to be written
+ subroutine write_flat_file(flat_unit, flat_array)
+ integer, intent(in) :: flat_unit
+ real(kind=C_REAL), dimension(:), intent(in) :: flat_array
+
+ real(kind=C_REAL4), dimension(:), allocatable :: flat_array_tmp
+ character(len=*), parameter :: routine = 'write_flat_file'
+ integer :: io_status, alloc_status, dealloc_status
+ integer :: field_size
+
+ field_size=size(flat_array)
+ allocate(flat_array_tmp(field_size), stat=alloc_status)
+ call check_alloc_status(alloc_status, routine, source, revision, &
+ revdate, 'flat_array_tmp')
+
+ ! COAMPS flat files are real(kind=4)
+
+ flat_array_tmp(:) = real(flat_array(:), kind=C_REAL4)
+ call fix_for_platform4(flat_array_tmp, field_size, C_REAL4)
+
+ write(unit=flat_unit, rec=1, iostat=io_status) flat_array_tmp
+ call check_io_status(io_status, routine, source, revision, &
+ revdate, 'writing flat file')
+
+ deallocate(flat_array_tmp, stat=dealloc_status)
+ call check_dealloc_status(dealloc_status, routine, source, revision, &
+ revdate, 'flat_array_tmp')
+ end subroutine write_flat_file
+
+ ! read_flat_file
+ ! ----------------
+ ! Given the unit number of an *open* COAMPS flat
+ ! file, read the file into an array.
+ ! PARAMETERS
+ ! IN flat_unit unit number of an open flat file
+ ! OUT flat_array data array to be filled
+ subroutine read_flat_file(flat_unit, flat_array)
+ integer, intent(in) :: flat_unit
+ real(kind=C_REAL), dimension(:), intent(out) :: flat_array
+
+ real(kind=C_REAL4), dimension(:), allocatable :: flat_array_tmp
+ character(len=*), parameter :: routine = 'read_flat_file'
+ integer :: io_status, alloc_status, dealloc_status
+ integer :: field_size
+
+ field_size=size(flat_array)
+ allocate(flat_array_tmp(field_size), stat=alloc_status)
+ call check_alloc_status(alloc_status, routine, source, revision, &
+ revdate, 'flat_array_tmp')
+
+ ! Read in the data - COAMPS writes flat files as C_REAL4's
+ read(unit=flat_unit, rec=1, iostat=io_status) flat_array_tmp
+ call check_io_status(io_status, routine, source, revision, &
+ revdate, 'Reading flat file')
+ call fix_for_platform4(flat_array_tmp, field_size, C_REAL4)
+ flat_array(:)=real(flat_array_tmp(:) , kind=C_REAL)
+
+ deallocate(flat_array_tmp, stat=dealloc_status)
+ call check_dealloc_status(dealloc_status, routine, source, revision, &
+ revdate, 'flat_array_tmp')
+ end subroutine read_flat_file
+
+ ! generate_one_flat_file_name
+ ! -----------------------
+ ! Given field, level, and grid information, generate the properly
+ ! formatted 64-character COAMPS flat file name. Note that this
+ ! does *not* generate any path information - it only returns the
+ ! file name.
+ ! PARAMETERS
+ ! IN var_name the field the file contains
+ ! IN level_type vertical level type (height/pressure/etc)
+ ! IN level1 lowest vertical level in the file
+ ! IN level2 highest vertical level in the file
+ ! (for files for a single level, level1 is
+ ! that level and level2 is left to 0)
+ ! IN gridnum nest number (only 1 supported for now)
+ ! IN aoflag field type: (a)tmosphere or (o)cean
+ ! IN xpts number of points in the x direction
+ ! IN ypts number of points in the y direction
+ ! IN dtg base date-time group
+ ! IN tau_hh forecast lead time - hour component
+ ! IN tau_mm forecast lead time - minute component
+ ! IN tau_ss forecast lead time - second component
+ ! IN field_type type of field (e.g. fcstfld, infofld)
+ ! OUT file_name COAMPS flat file name
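+ ! EXAMPLE (illustrative values only): var_name='uuwind', level_type='sig',
+ ! level1=100, level2=0, gridnum=1, aoflag='a', xpts=101, ypts=101,
+ ! dtg='2010052600', tau_hh=6, tau_mm=0, tau_ss=0, field_type='fcstfld'
+ ! produces the 64-character name
+ ! uuwind_sig_000100_000000_1a0101x0101_2010052600_00060000_fcstfld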
+ subroutine generate_one_flat_file_name(var_name, level_type, level1, &
+ level2, gridnum, aoflag, xpts,&
+ ypts, dtg, tau_hh, tau_mm, &
+ tau_ss, field_type, &
+ file_name)
+ character(len=6), intent(in) :: var_name
+ character(len=3), intent(in) :: level_type
+ integer, intent(in) :: level1
+ integer, intent(in) :: level2
+ integer, intent(in) :: gridnum
+ character(len=1), intent(in) :: aoflag
+ integer, intent(in) :: xpts
+ integer, intent(in) :: ypts
+ character(len=10), intent(in) :: dtg
+ integer, intent(in) :: tau_hh
+ integer, intent(in) :: tau_mm
+ integer, intent(in) :: tau_ss
+ character(len=7), intent(in) :: field_type
+ character(len=64), intent(out) :: file_name
+
+ write(file_name, 100) var_name, level_type, &
+ & level1, level2, gridnum, aoflag, xpts, ypts, dtg, &
+ & tau_hh, tau_mm, tau_ss, field_type
+
+ ! make sure the file name is lower case
+ file_name=lowercase(file_name)
+
+100 format(A6,'_',A3,'_',I6.6,'_',I6.6,'_',I1,A1,I4.4,'x',I4.4,'_', &
+ A10,'_',I4.4,I2.2,I2.2,'_',A7)
+ end subroutine generate_one_flat_file_name
+
+ !------------------------------
+ ! END PUBLIC ROUTINES
+ !------------------------------
+
+ !------------------------------
+ ! BEGIN PRIVATE ROUTINES
+ !------------------------------
+
+ !------------------------------
+ ! END PRIVATE ROUTINES
+ !------------------------------
+end module
Property changes on: DART/trunk/models/coamps/coamps_flat_file_mod.f90
___________________________________________________________________
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
From nancy at ucar.edu Fri May 21 16:27:48 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Fri, 21 May 2010 16:27:48 -0600
Subject: [Dart-dev] [4372] DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90: Use
the utility routine to determine if the input/output obs_seq files
Message-ID:
Revision: 4372
Author: nancy
Date: 2010-05-21 16:27:48 -0600 (Fri, 21 May 2010)
Log Message:
-----------
Use the utility routine to determine if the input/output obs_seq files
are binary or ascii so the defaults are handled in a single code location.
Modified Paths:
--------------
DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90
-------------- next part --------------
Modified: DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90
===================================================================
--- DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90 2010-05-21 22:23:38 UTC (rev 4371)
+++ DART/trunk/obs_kind/DEFAULT_obs_kind_mod.F90 2010-05-21 22:27:48 UTC (rev 4372)
@@ -21,7 +21,7 @@
use utilities_mod, only : register_module, error_handler, &
E_ERR, E_MSG, E_WARN, &
logfileunit, find_namelist_in_file, &
- check_namelist_read, do_output
+ check_namelist_read, do_output, ascii_file_format
implicit none
private
@@ -659,57 +659,49 @@
integer, intent(in) :: ifile
character(len=*), intent(in), optional :: fform
-integer, intent(in), optional :: use_list(:)
+integer, intent(in), optional :: use_list(:)
-character(len=paramname_length) :: fileformat
-integer :: i
+integer :: i, ntypes
+logical :: is_ascii, restrict
if ( .not. module_initialized ) call initialize_module
-fileformat = "ascii" ! supply default
-if(present(fform)) fileformat = trim(adjustl(fform))
+is_ascii = ascii_file_format(fform)
-! Write the 5 character identifier for verbose formatted output
-SELECT CASE (fileformat)
- ! This header needs to be written for formatted OR unformatted
- ! If it's not present, it means to use the default old definitions
- CASE ("unf", "UNF", "unformatted", "UNFORMATTED")
- write(ifile) 'obs_kind_definitions'
- CASE DEFAULT
- write(ifile, 11)
-11 format('obs_kind_definitions')
-END SELECT
+! Write the 20 character identifier for verbose formatted output
+if (is_ascii) then
+ write(ifile, *) 'obs_kind_definitions'
+else
+ write(ifile) 'obs_kind_definitions'
+endif
-! Loop through the list to write out the integer indices and strings
-! For all the defined observation types
-SELECT CASE (fileformat)
- CASE ("unf", "UNF", "unformatted", "UNFORMATTED")
- ! Write the number of defined kinds, then the list
- if (present(use_list)) then
- write(ifile) count(use_list(:) > 0)
- else
- write(ifile) max_obs_specific
- endif
- do i = 1, max_obs_specific
- if (present(use_list)) then
- if (use_list(i) == 0) cycle
- endif
- write(ifile) obs_type_info(i)%index, obs_type_info(i)%name
- end do
- CASE DEFAULT
- if (present(use_list)) then
- write(ifile, *) count(use_list(:) > 0)
- else
- write(ifile, *) max_obs_specific
- endif
- do i = 1, max_obs_specific
- if (present(use_list)) then
- if (use_list(i) == 0) cycle
- endif
- write(ifile, *) obs_type_info(i)%index, obs_type_info(i)%name
- end do
-END SELECT
+! If this routine is called with a list of which types are actually
+! being used, restrict the table of contents to only those.
+! Otherwise, write all known types.
+if (present(use_list)) then
+ ntypes = count(use_list(:) > 0)
+ restrict = .true.
+else
+ ntypes = max_obs_specific
+ restrict = .false.
+endif
+if (is_ascii) then
+ write(ifile, *) ntypes
+else
+ write(ifile) ntypes
+endif
+
+do i = 1, max_obs_specific
+ if (restrict .and. use_list(i) == 0) cycle
+
+ if (is_ascii) then
+ write(ifile, *) obs_type_info(i)%index, obs_type_info(i)%name
+ else
+ write(ifile) obs_type_info(i)%index, obs_type_info(i)%name
+ endif
+end do
+
end subroutine write_obs_kind
!----------------------------------------------------------------------------
@@ -727,8 +719,9 @@
character(len=*), intent(in), optional :: fform
character(len=20) :: header
-character(len=32) :: fileformat, o_name
+character(len=paramname_length) :: o_name
integer :: i, num_def_kinds, o_index, list_index
+logical :: is_ascii
if ( .not. module_initialized ) call initialize_module
@@ -744,68 +737,51 @@
return
endif
-fileformat = "ascii" ! supply default
-if(present(fform)) fileformat = trim(adjustl(fform))
+is_ascii = ascii_file_format(fform)
-! Read the 5 character identifier for verbose formatted output
-SELECT CASE (fileformat)
- CASE ("unf", "UNF", "unformatted", "UNFORMATTED")
- ! Need to look for header string
- read(ifile) header
- if(header /= 'obs_kind_definitions') then
- call error_handler(E_ERR, 'read_obs_kind', &
- 'Did not find obs_kind_definitions string', &
- source, revision, revdate)
- endif
- CASE DEFAULT
- read(ifile, 11) header
-11 format(a20)
- if(header /= 'obs_kind_definitions') then
- call error_handler(E_ERR, 'read_obs_kind', &
- 'Did not find obs_kind_definitions string', &
- source, revision, revdate)
- endif
-END SELECT
+! Read the 20 character identifier for verbose formatted output
+if (is_ascii) then
+ read(ifile, *) header
+else
+ read(ifile) header
+endif
+if(header /= 'obs_kind_definitions') then
+ call error_handler(E_ERR, 'read_obs_kind', &
+ 'Did not find obs_kind_definitions string', &
+ source, revision, revdate)
+endif
+
! Loop through the list to read the integer indices and strings
! For all the defined observation types
! Set up the map from kinds in the obs_sequence file to those
! in the data structure in this module.
-SELECT CASE (fileformat)
- CASE ("unf", "UNF", "unformatted", "UNFORMATTED")
- read(ifile) num_def_kinds
- do i = 1, num_def_kinds
- read(ifile) o_index, o_name
- ! What is the integer associated with this o_name in this module?
- list_index = get_obs_kind_index(o_name)
- ! Check for error
- if(list_index == -1) then
- write(msg_string, *) 'Did not find observation kind ', o_name, &
- ' in obs_kind_mod list'
- call error_handler(E_ERR, 'read_obs_kind', msg_string, &
- source, revision, revdate)
- endif
- map(1, i) = o_index
- map(2, i) = list_index
- end do
- CASE DEFAULT
- read(ifile, *) num_def_kinds
- do i = 1, num_def_kinds
- read(ifile, *) o_index, o_name
- ! What is the integer associated with this o_name in this module?
- list_index = get_obs_kind_index(o_name)
- ! Check for error
- if(list_index == -1) then
- write(msg_string, *) 'Did not find observation kind ', o_name, &
- ' in obs_kind_mod list'
- call error_handler(E_ERR, 'read_obs_kind', msg_string, &
- source, revision, revdate)
- endif
- map(1, i) = o_index
- map(2, i) = list_index
- end do
-END SELECT
+if (is_ascii) then
+ read(ifile, *) num_def_kinds
+else
+ read(ifile) num_def_kinds
+endif
+do i = 1, num_def_kinds
+ if (is_ascii) then
+ read(ifile, *) o_index, o_name
+ else
+ read(ifile) o_index, o_name
+ endif
+
+ ! What is the integer associated with this o_name in this module?
+ list_index = get_obs_kind_index(o_name)
+ ! Check for error
+ if(list_index == -1) then
+ write(msg_string, *) 'Did not find observation kind ', o_name, &
+ ' in obs_kind_mod list'
+ call error_handler(E_ERR, 'read_obs_kind', msg_string, &
+ source, revision, revdate)
+ endif
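+ ! map(1,:) holds the type index as it appears in the obs_seq file;
+ ! map(2,:) holds the corresponding index in this module's table.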
+ map(1, i) = o_index
+ map(2, i) = list_index
+end do
+
end subroutine read_obs_kind
!----------------------------------------------------------------------------
@@ -831,8 +807,7 @@
end do
! Read the input as a string, convert to integers as appropriate
-read(*, 11) in
-11 format(A)
+read(*, '(A)') in
! If string is a positive or negative number, convert it to integer
read(in, *, IOSTAT = ierr) get_kind_from_menu
From nancy at ucar.edu Wed May 26 16:09:31 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Wed, 26 May 2010 16:09:31 -0600
Subject: [Dart-dev] [4373] DART/trunk/models/tiegcm: The assimilation is
working and the advance_model.csh script is working.
Message-ID:
Revision: 4373
Author: thoar
Date: 2010-05-26 16:09:31 -0600 (Wed, 26 May 2010)
Log Message:
-----------
The assimilation is working and the advance_model.csh script is working.
I have to check the file motion from advance_model.csh ... might be leaving
the tiegcm_restart_p.nc files in the advance directory without copying
them to CENTRALDIR - impact on cycling?
Must check that I did not break perfect_model_obs ... since advance_model.csh
expects the tiegcm support files (tiegcm_restart_p.nc & tiegcm_s.nc) to be
uniquely named in CENTRALDIR ...
Modified Paths:
--------------
DART/trunk/models/tiegcm/model_mod.f90
DART/trunk/models/tiegcm/shell_scripts/advance_model.csh
DART/trunk/models/tiegcm/shell_scripts/run_filter.csh
DART/trunk/models/tiegcm/shell_scripts/run_perfect_model_obs.csh
DART/trunk/models/tiegcm/work/input.nml
-------------- next part --------------
Modified: DART/trunk/models/tiegcm/model_mod.f90
===================================================================
--- DART/trunk/models/tiegcm/model_mod.f90 2010-05-21 22:27:48 UTC (rev 4372)
+++ DART/trunk/models/tiegcm/model_mod.f90 2010-05-26 22:09:31 UTC (rev 4373)
@@ -416,9 +416,9 @@
! No errors to start with
istatus = 0
-! To find a layer height: what's the unit of height [cm] ?
+! To find a layer height: what's the unit of height [m]
h_loop:do k = 1, nlev
- zgrid = ZGtiegcm(lon_index,lat_index,k) ![cm]
+ zgrid = ZGtiegcm(lon_index,lat_index,k)/100.0_r8 ! [m] = ZGtiegcm/100 [cm]
if (height <= zgrid) then
lev_top = k
lev_bottom = lev_top -1
@@ -559,7 +559,7 @@
lev = levs(lev_index + 1) !TN UN VN O1 defined at midpoints
endif
-location = set_location(lon,lat,lev,2) ! 2 == pressure (hPa) 3 == height
+location = set_location(lon,lat,lev,2) ! 2 == pressure 3 == height
! If the type is wanted, return it
if(present(var_type)) var_type = local_var_type
@@ -798,7 +798,7 @@
'nc_write_model_atts')
call nc_check(nf90_put_att(ncFileID, levVarID, "long_name", "midpoint levels"),&
'nc_write_model_atts')
- call nc_check(nf90_put_att(ncFileID, levVarID, "units", "hPa"),&
+ call nc_check(nf90_put_att(ncFileID, levVarID, "units", "Pa"),&
'nc_write_model_atts')
call nc_check(nf90_put_att(ncFileID, levVarID, "positive", "up"),&
'nc_write_model_atts')
@@ -808,7 +808,7 @@
'nc_write_model_atts')
call nc_check(nf90_put_att(ncFileID, ilevVarID, "long_name", "interface levels"),&
'nc_write_model_atts')
- call nc_check(nf90_put_att(ncFileID, ilevVarID, "units", "hPa"),&
+ call nc_check(nf90_put_att(ncFileID, ilevVarID, "units", "Pa"),&
'nc_write_model_atts')
call nc_check(nf90_put_att(ncFileID, ilevVarID, "positive", "up"),&
'nc_write_model_atts')
@@ -1739,7 +1739,7 @@
call nc_check(nf90_get_var(restart_id, var_lev_id, values=levs), &
'read_TIEGCM_definition', 'get_var lev')
- levs = p0 * exp(-levs) ![millibars] = [hPa]
+ levs = p0 * exp(-levs) * 100.0_r8 ![Pa] = 100* [millibars] = 100* [hPa]
call nc_check(nf90_inq_dimid(restart_id, 'ilev', dim_ilev_id), &
'read_TIEGCM_definition', 'inq_dimid ilev')
@@ -1751,7 +1751,7 @@
call nc_check(nf90_get_var(restart_id, var_ilev_id, values=ilevs), &
'read_TIEGCM_definition', 'get_var ilev')
- ilevs = p0 * exp(-ilevs) ![millibars] = [hPa]
+ ilevs = p0 * exp(-ilevs) * 100.0_r8 ! [Pa] = 100* [millibars] = 100* [hPa]
if (nlev .ne. nilev) then
write(msgstring, *) ' nlev = ',nlev,' nilev = ',nilev, 'are different; DART assumes them to be the same'
Modified: DART/trunk/models/tiegcm/shell_scripts/advance_model.csh
===================================================================
--- DART/trunk/models/tiegcm/shell_scripts/advance_model.csh 2010-05-21 22:27:48 UTC (rev 4372)
+++ DART/trunk/models/tiegcm/shell_scripts/advance_model.csh 2010-05-26 22:09:31 UTC (rev 4373)
@@ -64,10 +64,13 @@
# The EXPECTED input DART 'initial conditions' file name is 'temp_ic'
- ln -sf ../$input_file temp_ic || exit 2
- cp -p ../tiegcm_restart_p.nc . || exit 2
- cp -p ../tiegcm_s.nc . || exit 2
+ set tiesecond = `printf "tiegcm_s.nc.%04d" $ensemble_member`
+ set tierestart = `printf "tiegcm_restart_p.nc.%04d" $ensemble_member`
+ ln -sf ../$input_file temp_ic || exit 2
+ cp -p ../$tiesecond tiegcm_s.nc || exit 2
+ cp -p ../$tierestart tiegcm_restart_p.nc || exit 2
+
# echo "ensemble member $ensemble_member : before dart_to_model"
# ncdump -v mtime tiegcm_restart.nc
@@ -75,17 +78,17 @@
# update tiegcm namelist variables
- set start_year = "START_YEAR = "`head -1 namelist_update | tail -1`
- set start_day = "START_DAY = "`head -2 namelist_update | tail -1`
- set source_start = "SOURCE_START = "`head -3 namelist_update | tail -1`
- set start = "START = "`head -3 namelist_update | tail -1`
- set secstart = "SECSTART = "`head -3 namelist_update | tail -1`
- set stop = "STOP = "`head -4 namelist_update | tail -1`
- set secstop = "SECSTOP = "`head -4 namelist_update | tail -1`
- set hist = "HIST = "`head -5 namelist_update | tail -1`
- set sechist = "SECHIST = "`head -5 namelist_update | tail -1`
- set save = "SAVE = "`head -5 namelist_update | tail -1`
- set secsave = "SECSAVE = "`head -5 namelist_update | tail -1`
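+ # namelist_update supplies the values below, one per line: line 1 -> START_YEAR,
+ # line 2 -> START_DAY, line 3 -> SOURCE_START/START/SECSTART,
+ # line 4 -> STOP/SECSTOP, line 5 -> HIST/SECHIST/SAVE/SECSAVE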
+ set start_year = "START_YEAR = "`head -1 namelist_update | tail -1`
+ set start_day = "START_DAY = "`head -2 namelist_update | tail -1`
+ set source_start = "SOURCE_START = "`head -3 namelist_update | tail -1`
+ set start = "START = "`head -3 namelist_update | tail -1`
+ set secstart = "SECSTART = "`head -3 namelist_update | tail -1`
+ set stop = "STOP = "`head -4 namelist_update | tail -1`
+ set secstop = "SECSTOP = "`head -4 namelist_update | tail -1`
+ set hist = "HIST = "`head -5 namelist_update | tail -1`
+ set sechist = "SECHIST = "`head -5 namelist_update | tail -1`
+ set save = "SAVE = "`head -5 namelist_update | tail -1`
+ set secsave = "SECSAVE = "`head -5 namelist_update | tail -1`
sed -e 's/^;.*//' tiegcm.nml.original >! nml
@@ -105,8 +108,6 @@
mv tiegcm.nml.update tiegcm.nml
-# ls -lrt
-
#----------------------------------------------------------------------
# Block 3: Run the model
#----------------------------------------------------------------------
@@ -128,6 +129,12 @@
#----------------------------------------------------------------------
# Block 4: Convert the model output to form needed by DART
+ # At this point, the model has updated the information in tiegcm_restart_p.nc
+ # We need to get that information back into the DART state vector.
+ #
+ # model_to_dart expects the tiegcm input file to be 'tiegcm_restart_p.nc'
+ # model_to_dart expects the tiegcm secondary file to be 'tiegcm_s.nc'
+ # model_to_dart writes out the DART file to be 'temp_ud'
#----------------------------------------------------------------------
../model_to_dart
Modified: DART/trunk/models/tiegcm/shell_scripts/run_filter.csh
===================================================================
--- DART/trunk/models/tiegcm/shell_scripts/run_filter.csh 2010-05-21 22:27:48 UTC (rev 4372)
+++ DART/trunk/models/tiegcm/shell_scripts/run_filter.csh 2010-05-26 22:09:31 UTC (rev 4373)
@@ -120,6 +120,7 @@
set DARTDIR = /blhome/tmatsuo/DART/models/tiegcm
set TIEGCMDIR = /blhome/tmatsuo/DART/models/tiegcm/tiegcm_files
+set EXPERIMENT = /ptmp/tmatsuo/DART/tiegcm/2002_03_28/initial/filter
#-----------------------------------------------------------------------------
# Get the DART executables, scripts, and input files
@@ -135,8 +136,8 @@
${COPY} ${DARTDIR}/shell_scripts/advance_model.csh .
# data files
- ${COPY} ${DARTDIR}/work/obs_seq.out .
${COPY} ${DARTDIR}/work/input.nml .
+ ${COPY} ${EXPERIMENT}/obs_seq.out .
#-----------------------------------------------------------------------------
# Get the tiegcm executable, control files, and data files.
@@ -145,42 +146,62 @@
${COPY} ${TIEGCMDIR}/tiegcm-nompi tiegcm
#${COPY} ${TIEGCMDIR}/tiegcm .
${COPY} ${TIEGCMDIR}/tiegcm.nml .
- ${COPY} ${TIEGCMDIR}/tiegcm_s.nc .
#-----------------------------------------------------------------------------
-# Get the tiegcm input state ... for this experiment, we are using the
-# perturb routine from model_mod() to generate the ensemble from a single state.
-# REQUIREMENT:
-# input.nml:filter_nml:start_from_restart = .FALSE.
-# input.nml:filter_nml:restart_in_file = 'filter_ics'
+# Get the tiegcm input state ... for this experiment, we generated the ensemble by:
#
-# After you have run this once, there will be _many_ initial conditions files
-# for filter ...
+# ${COPY} ${TIEGCMDIR}/tiegcm_s.nc .
+# ${COPY} ${TIEGCMDIR}/tiegcm_restart_p.nc .
+# ./model_to_dart || exit 1
+# mv temp_ud filter_ics
#
-# Convert a TIEGCM file 'tiegcm_restart.nc' to a DART ics file 'filter_ics'
-# 'model_to_dart' has a hardwired output filename of 'temp_ud' ...
+# REQUIREMENT for the case where we have an initial ensemble:
+# input.nml:filter_nml:start_from_restart = .TRUE.
+# input.nml:filter_nml:restart_in_file = 'filter_ics'
#-----------------------------------------------------------------------------
+# Put all of the DART initial conditions files and all of the TIEGCM files
+# in the CENTRALDIR - preserving the ensemble member ID for each filename.
+# The advance_model.csh script will copy the appropriate files for each
+# ensemble member into the model advance directory.
+# These files may be linked to CENTRALDIR since they get copied to the
+# model advance directory.
+#-----------------------------------------------------------------------------
-${COPY} ${TIEGCMDIR}/tiegcm_restart_p.nc .
-./model_to_dart || exit 1
-mv temp_ud filter_ics
+set ENSEMBLESTRING = `/usr/local/bin/grep -A 42 filter_nml input.nml | grep ens_size`
+set NUM_ENS = `echo $ENSEMBLESTRING[3] | sed -e "s#,##"`
+@ i = 1
+while ( $i <= $NUM_ENS )
+
+ set darticname = `printf "filter_ics.%04d" $i`
+ set tiesecond = `printf "tiegcm_s.nc.%04d" $i`
+ set tierestart = `printf "tiegcm_restart_p.nc.%04d" $i`
+
+ ln -sf ${EXPERIMENT}/$darticname .
+ ln -sf ${EXPERIMENT}/$tiesecond .
+ ln -sf ${EXPERIMENT}/$tierestart .
+
+ @ i += 1
+end
+
#-----------------------------------------------------------------------------
# Run filter ...
#-----------------------------------------------------------------------------
+ln -sf tiegcm_restart_p.nc.0001 tiegcm_restart_p.nc
+ln -sf tiegcm_s.nc.0001 tiegcm_s.nc
+
mpirun.lsf ./filter || exit 2
echo "${JOBNAME} ($JOBID) finished at "`date`
#-----------------------------------------------------------------------------
# Move the output to storage after filter completes.
-# At this point, all the restart,diagnostic files are in the CENTRALDIR
+# At this point, all the DART restart,diagnostic files are in the CENTRALDIR
# and need to be moved to the 'experiment permanent' directory.
-# We have had problems with some, but not all, files being moved
-# correctly, so we are adding bulletproofing to check to ensure the filesystem
-# has completed writing the files, etc. Sometimes we get here before
-# all the files have finished being written.
+#
+# TJH: At this point, the output files have pretty 'generic' names.
+# The files should be archived with the assimilation date in their name.
#-----------------------------------------------------------------------------
echo "Listing contents of CENTRALDIR before archiving"
@@ -188,9 +209,9 @@
exit
-${MOVE} *.data *.meta ${experiment}/tiegcm
-${MOVE} data data.cal ${experiment}/tiegcm
-${MOVE} STD* ${experiment}/tiegcm
+${MOVE} tiegcm_s.nc* ${experiment}/tiegcm
+${MOVE} tiegcm_restart_p.nc* ${experiment}/tiegcm
+${MOVE} tiegcm_out_* ${experiment}/tiegcm
${MOVE} filter_restart* ${experiment}/DART
${MOVE} assim_model_state_ud[1-9]* ${experiment}/DART
@@ -206,8 +227,6 @@
${COPY} *.csh ${experiment}/DART
${COPY} $myname ${experiment}/DART
-ls -lrt
-
exit 0
#
Modified: DART/trunk/models/tiegcm/shell_scripts/run_perfect_model_obs.csh
===================================================================
--- DART/trunk/models/tiegcm/shell_scripts/run_perfect_model_obs.csh 2010-05-21 22:27:48 UTC (rev 4372)
+++ DART/trunk/models/tiegcm/shell_scripts/run_perfect_model_obs.csh 2010-05-26 22:09:31 UTC (rev 4373)
@@ -119,8 +119,9 @@
# Set variables containing various directory names where we will GET things
#-----------------------------------------------------------------------------
-set DARTDIR = /blhome/tmatsuo/DART/models/tiegcm
+set DARTDIR = /blhome/tmatsuo/DART/models/tiegcm
set TIEGCMDIR = /blhome/tmatsuo/DART/models/tiegcm/tiegcm_files
+set TIEGCMEXP = /ptmp/tmatsuo/ensemble/0
#-----------------------------------------------------------------------------
# Get the DART executables, scripts, and input files
@@ -136,7 +137,7 @@
${COPY} ${DARTDIR}/shell_scripts/advance_model.csh .
# data files
- ${COPY} ${DARTDIR}/work/obs_seq.in .
+ ${COPY} ${TIEGCMEXP}/obs_seq.in .
${COPY} ${DARTDIR}/work/input.nml .
#-----------------------------------------------------------------------------
@@ -147,9 +148,10 @@
${COPY} ${TIEGCMDIR}/tiegcm-nompi tiegcm
#${COPY} ${TIEGCMDIR}/tiegcm .
${COPY} ${TIEGCMDIR}/tiegcm.nml .
- ${COPY} ${TIEGCMDIR}/tiegcm_restart_p.nc .
- ${COPY} ${TIEGCMDIR}/tiegcm_s.nc .
+ ${COPY} ${TIEGCMEXP}/tiegcm_restart_p.nc .
+ ${COPY} ${TIEGCMEXP}/tiegcm_s.nc .
+
#-----------------------------------------------------------------------------
# Check that everything moved OK, and the table is set.
# Convert a TIEGCM file 'tiegcm_restart.nc' to a DART ics file 'perfect_ics'
@@ -161,8 +163,13 @@
#-----------------------------------------------------------------------------
# Run perfect_model_obs ... harvest the observations to populate obs_seq.out
+# model_mod expects a generic name // advance_model.csh expects a filename
+# with the ensemble member ID tacked on - must provide both.
#-----------------------------------------------------------------------------
+ln -sf tiegcm_restart_p.nc tiegcm_restart_p.nc.0001
+ln -sf tiegcm_s.nc tiegcm_s.nc.0001
+
./perfect_model_obs || exit 2
echo "${JOBNAME} ($JOBID) finished at "`date`
Modified: DART/trunk/models/tiegcm/work/input.nml
===================================================================
--- DART/trunk/models/tiegcm/work/input.nml 2010-05-21 22:27:48 UTC (rev 4372)
+++ DART/trunk/models/tiegcm/work/input.nml 2010-05-26 22:09:31 UTC (rev 4373)
@@ -25,8 +25,8 @@
async = 2,
adv_ens_command = "./advance_model.csh",
ens_size = 4,
- start_from_restart = .false.,
- output_restart = .false.,
+ start_from_restart = .true.,
+ output_restart = .true.,
obs_sequence_in_name = "obs_seq.out",
obs_sequence_out_name = "obs_seq.final",
restart_in_file_name = "filter_ics",
@@ -74,8 +74,8 @@
restart_out_file_name = 'smoother_restart' /
&ensemble_manager_nml
- single_restart_file_in = .true.,
- single_restart_file_out = .true.,
+ single_restart_file_in = .false.,
+ single_restart_file_out = .false.,
perturbation_amplitude = 0.2 /
&assim_tools_nml
From nancy at ucar.edu Wed May 26 16:16:03 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Wed, 26 May 2010 16:16:03 -0600
Subject: [Dart-dev] [4374]
DART/trunk/models/tiegcm/shell_scripts/advance_model.csh: added block to
move updated tiegcm files to CENTRALDIR
Message-ID:
Revision: 4374
Author: thoar
Date: 2010-05-26 16:16:03 -0600 (Wed, 26 May 2010)
Log Message:
-----------
added block to move updated tiegcm files to CENTRALDIR
Modified Paths:
--------------
DART/trunk/models/tiegcm/shell_scripts/advance_model.csh
-------------- next part --------------
Modified: DART/trunk/models/tiegcm/shell_scripts/advance_model.csh
===================================================================
--- DART/trunk/models/tiegcm/shell_scripts/advance_model.csh 2010-05-26 22:09:31 UTC (rev 4373)
+++ DART/trunk/models/tiegcm/shell_scripts/advance_model.csh 2010-05-26 22:16:03 UTC (rev 4374)
@@ -135,11 +135,16 @@
# model_to_dart expects the tiegcm input file to be 'tiegcm_restart_p.nc'
# model_to_dart expects the tiegcm secondary file to be 'tiegcm_s.nc'
# model_to_dart writes out the DART file to be 'temp_ud'
+ #
+ # The updated information needs to be moved into CENTRALDIR in
+ # preparation for the next cycle.
#----------------------------------------------------------------------
../model_to_dart
- mv temp_ud ../$output_file || exit 4
+ mv temp_ud ../$output_file || exit 4
+ mv tiegcm_s.nc ../$tiesecond || exit 4
+ mv tiegcm_restart_p.nc ../$tierestart || exit 4
@ state_copy++
@ ensemble_member_line = $ensemble_member_line + 3
From nancy at ucar.edu Thu May 27 13:05:43 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Thu, 27 May 2010 13:05:43 -0600
Subject: [Dart-dev] [4375] DART/trunk/models/tiegcm/shell_scripts: This
cycled for 10 3-hour timesteps!!! with 4 members.
Message-ID:
Revision: 4375
Author: tmatsuo
Date: 2010-05-27 13:05:43 -0600 (Thu, 27 May 2010)
Log Message:
-----------
This cycled for 10 3-hour timesteps with 4 ensemble members!
Needs to be re-run with a larger ensemble: 20? 40? N?
Modified Paths:
--------------
DART/trunk/models/tiegcm/shell_scripts/advance_model.csh
DART/trunk/models/tiegcm/shell_scripts/run_filter.csh
DART/trunk/models/tiegcm/shell_scripts/run_perfect_model_obs.csh
-------------- next part --------------
Modified: DART/trunk/models/tiegcm/shell_scripts/advance_model.csh
===================================================================
--- DART/trunk/models/tiegcm/shell_scripts/advance_model.csh 2010-05-26 22:16:03 UTC (rev 4374)
+++ DART/trunk/models/tiegcm/shell_scripts/advance_model.csh 2010-05-27 19:05:43 UTC (rev 4375)
@@ -120,7 +120,7 @@
# mpirun.lsf ../tiegcm < tiegcm.nml |& tee tiegcm_out_$ensemble_member
# without mpi
- ../tiegcm < tiegcm.nml |& tee tiegcm_out_$ensemble_member
+ ../tiegcm < tiegcm.nml >& tiegcm_out_$ensemble_member || exit 3
# echo "ensemble member $ensemble_member : after tiegcm"
# ncdump -v mtime tiegcm_restart.nc
@@ -140,7 +140,7 @@
# preparation for the next cycle.
#----------------------------------------------------------------------
- ../model_to_dart
+ ../model_to_dart || exit 4
mv temp_ud ../$output_file || exit 4
mv tiegcm_s.nc ../$tiesecond || exit 4
Modified: DART/trunk/models/tiegcm/shell_scripts/run_filter.csh
===================================================================
--- DART/trunk/models/tiegcm/shell_scripts/run_filter.csh 2010-05-26 22:16:03 UTC (rev 4374)
+++ DART/trunk/models/tiegcm/shell_scripts/run_filter.csh 2010-05-27 19:05:43 UTC (rev 4375)
@@ -36,7 +36,7 @@
#BSUB -P 35071364
#BSUB -q debug
#BSUB -n 4
-#BSUB -W 1:00
+#BSUB -W 2:00
#BSUB -N -u ${USER}@ucar.edu
#----------------------------------------------------------------------
@@ -118,8 +118,8 @@
# Set variables containing various directory names where we will GET things
#-----------------------------------------------------------------------------
-set DARTDIR = /blhome/tmatsuo/DART/models/tiegcm
-set TIEGCMDIR = /blhome/tmatsuo/DART/models/tiegcm/tiegcm_files
+set DARTDIR = /blhome/tmatsuo/DART/models/tiegcm
+set TIEGCMDIR = /blhome/tmatsuo/DART/models/tiegcm/tiegcm_files
set EXPERIMENT = /ptmp/tmatsuo/DART/tiegcm/2002_03_28/initial/filter
#-----------------------------------------------------------------------------
@@ -136,8 +136,8 @@
${COPY} ${DARTDIR}/shell_scripts/advance_model.csh .
# data files
- ${COPY} ${DARTDIR}/work/input.nml .
${COPY} ${EXPERIMENT}/obs_seq.out .
+ ${COPY} ${DARTDIR}/work/input.nml .
#-----------------------------------------------------------------------------
# Get the tiegcm executable, control files, and data files.
@@ -204,9 +204,6 @@
# The files should be archived with the assimilation date in their name.
#-----------------------------------------------------------------------------
-echo "Listing contents of CENTRALDIR before archiving"
-ls -l
-
exit
${MOVE} tiegcm_s.nc* ${experiment}/tiegcm
Modified: DART/trunk/models/tiegcm/shell_scripts/run_perfect_model_obs.csh
===================================================================
--- DART/trunk/models/tiegcm/shell_scripts/run_perfect_model_obs.csh 2010-05-26 22:16:03 UTC (rev 4374)
+++ DART/trunk/models/tiegcm/shell_scripts/run_perfect_model_obs.csh 2010-05-27 19:05:43 UTC (rev 4375)
@@ -39,7 +39,6 @@
#BSUB -W 0:50
#BSUB -N -u ${USER}@ucar.edu
-
#----------------------------------------------------------------------
# Turns out the scripts are a lot more flexible if you don't rely on
# the queuing-system-specific variables -- so I am converting them to
@@ -119,9 +118,9 @@
# Set variables containing various directory names where we will GET things
#-----------------------------------------------------------------------------
-set DARTDIR = /blhome/tmatsuo/DART/models/tiegcm
-set TIEGCMDIR = /blhome/tmatsuo/DART/models/tiegcm/tiegcm_files
-set TIEGCMEXP = /ptmp/tmatsuo/ensemble/0
+set DARTDIR = /blhome/tmatsuo/DART/models/tiegcm
+set TIEGCMDIR = /blhome/tmatsuo/DART/models/tiegcm/tiegcm_files
+set EXPERIMENT = /ptmp/tmatsuo/DART/tiegcm/2002_03_28/initial/perfect
#-----------------------------------------------------------------------------
# Get the DART executables, scripts, and input files
@@ -137,20 +136,19 @@
${COPY} ${DARTDIR}/shell_scripts/advance_model.csh .
# data files
- ${COPY} ${TIEGCMEXP}/obs_seq.in .
+ ${COPY} ${EXPERIMENT}/obs_seq.in .
${COPY} ${DARTDIR}/work/input.nml .
#-----------------------------------------------------------------------------
# Get the tiegcm executable, control files, and data files.
-# trying to use the CCSM naming conventions
#-----------------------------------------------------------------------------
${COPY} ${TIEGCMDIR}/tiegcm-nompi tiegcm
#${COPY} ${TIEGCMDIR}/tiegcm .
${COPY} ${TIEGCMDIR}/tiegcm.nml .
- ${COPY} ${TIEGCMEXP}/tiegcm_restart_p.nc .
- ${COPY} ${TIEGCMEXP}/tiegcm_s.nc .
+ ${COPY} ${EXPERIMENT}/tiegcm_restart_p.nc .
+ ${COPY} ${EXPERIMENT}/tiegcm_s.nc .
#-----------------------------------------------------------------------------
# Check that everything moved OK, and the table is set.
@@ -189,9 +187,9 @@
exit
-${MOVE} *.data *.meta ${experiment}/tiegcm
-${MOVE} data data.cal ${experiment}/tiegcm
-${MOVE} STD* ${experiment}/tiegcm
+${MOVE} tiegcm_s.nc* ${experiment}/tiegcm
+${MOVE} tiegcm_restart_p.nc* ${experiment}/tiegcm
+${MOVE} tiegcm_out_* ${experiment}/tiegcm
${MOVE} filter_restart* ${experiment}/DART
${MOVE} assim_model_state_ud[1-9]* ${experiment}/DART
From nancy at ucar.edu Thu May 27 13:05:57 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Thu, 27 May 2010 13:05:57 -0600
Subject: [Dart-dev] [4376] DART/trunk/observations/gps/advance_time.f90: The
source for the time utility was moved to the
Message-ID:
Revision: 4376
Author: nancy
Date: 2010-05-27 13:05:57 -0600 (Thu, 27 May 2010)
Log Message:
-----------
The source for the time utility was moved to the
DART/time_manager directory quite a while ago.
This copy is not up to date, and is not referenced by
the path_names build files in the work directory.
Removed Paths:
-------------
DART/trunk/observations/gps/advance_time.f90
-------------- next part --------------
Deleted: DART/trunk/observations/gps/advance_time.f90
===================================================================
--- DART/trunk/observations/gps/advance_time.f90 2010-05-27 19:05:43 UTC (rev 4375)
+++ DART/trunk/observations/gps/advance_time.f90 2010-05-27 19:05:57 UTC (rev 4376)
@@ -1,424 +0,0 @@
-! DART software - Copyright © 2004 - 2010 UCAR. This open source software is
-! provided by UCAR, "as is", without charge, subject to all terms of use at
-! http://www.image.ucar.edu/DAReS/DART/DART_download
-
-program advance_time
-
-!
-! $URL$
-! $Id$
-! $Revision$
-! $Date$
-
- ! modified from da_advance_cymdh,
- ! - has accuracy down to second,
- ! - can use day/hour/minute/second (with/without +/- sign) to advance time,
- ! - can digest various input date format if it still has the right order (ie. cc yy mm dd hh nn ss)
- ! - can digest flexible time increment
- ! - can output in wrf date format (ccyy-mm-dd_hh:nn:ss)
- ! - can specify output date format
- ! - can output Julian day
- ! - can output Gregorian days and seconds (since year 1601)
- !
- ! eg.: advance_time 20070730 12 # advance 12 h
- ! advance_time 2007073012 -1d2h30m30s # back 1 day 2 hours 30 minutes and 30 seconds
- ! advance_time 2007073012 1s-3h30m # back 3 hours 30 minutes less 1 second
- ! advance_time 200707301200 2d1s -w # advance 2 days and 1 second, output in wrf date format
- ! advance_time 2007-07-30_12:00:00 2d1s -w # same as previous example
- ! advance_time 200707301200 2d1s -f ccyy-mm-dd_hh:nn:ss # same as previous example
- ! advance_time 2007073006 120 -j # advance 120 h, and print year and Julian day
- ! advance_time 2007073006 120 -J # advance 120 h, print year, Julian day, hour, minute and second
- ! advance_time 2007073006 0 -g # print Gregorian day and second (since year 1601)
- !
-
- implicit none
-
-! NOTE: this block is required by some fortran compilers, but causes a fatal
-! error with others. (ibm xlf needs it; gfortran cannot have it; intel
-! ifort does not seem to care either way.) If you get a compiler error
-! building this program, comment the following 4 lines in or out and try again.
-!START BLOCK
- interface
- integer function iargc()
- end function iargc
- end interface
-!END BLOCK
-
- integer :: ccyy, mm, dd, hh, nn, ss, dday, dh, dn, ds, gday, gsec
-
- integer :: nargum, i, n, id, ih, in, is
-
- character(len=80), dimension(10) :: argum
-
- character(len=14) :: ccyymmddhhnnss
-
- character(len=80) :: out_date_format, dtime
-
- integer :: datelen
-
- character(len=1) :: ch
-
- integer, parameter :: stdout=6
-
- nargum=iargc()
-
- if ( nargum < 2 ) then
- write(unit=stdout, fmt='(a)') &
- 'Usage: advance_time ccyymmddhh[nnss] [+|-]dt[d|h|m|s] [-w|-W|-wrf|-WRF] [-f|-F date_format] [-j|-J] [-g|-G]'
- write(unit=stdout, fmt='(a)') &
- 'Option: -w|-W|-wrf|-WRF output in wrf date format as ccyy-mm-dd_hh:nn:ss'
- write(unit=stdout, fmt='(a)') &
- ' -f|-F specify output date format, such as ccyy-mm-dd_hh:nn:ss, or ''ccyy/mm/dd hh:nn:ss'''
- write(unit=stdout, fmt='(a)') &
- ' -j|-J print Julian day'
- write(unit=stdout, fmt='(a)') &
- ' -g|-G print Gregorian days and seconds (since year 1601)'
- write(unit=stdout, fmt='(a)') &
- 'Example: advance_time 20070730 12 # advance 12 h'
- write(unit=stdout, fmt='(a)') &
- ' advance_time 2007073012 -1d2h30m30s # back 1 day 2 hours 30 min and 30 sec'
- write(unit=stdout, fmt='(a)') &
- ' advance_time 2007073012 1s-3h30m # back 3 hours 30 minutes less 1 second'
- write(unit=stdout, fmt='(a)') &
- ' advance_time 200707301200 1d1s -w # advance 1 day 1 sec, output in wrf date format'
- write(unit=stdout, fmt='(a)') &
- ' advance_time 2007-07-30_12:00:00 2d1s -w # same as previous example'
- write(unit=stdout, fmt='(a)') &
- ' advance_time 200707301200 2d1s -f ccyy-mm-dd_hh:nn:ss # same as previous'
- write(unit=stdout, fmt='(a)') &
- ' advance_time 2007073006 120 -j # advance 120 h, and print year and Julian day'
- write(unit=stdout, fmt='(a)') &
- ' advance_time 2007073006 120 -J # advance 120 h, print year, Julian day, hour, minute and second'
- write(unit=stdout, fmt='(a)') &
- ' advance_time 2007073006 0 -g # print Gregorian day and second (since year 1601)'
- write(unit=stdout, fmt='(a)') ''
- stop 'try again.'
- end if
-
- do i=1,nargum
- do n=1,80
- argum(i)(n:n)=' '
- end do
- call getarg(i,argum(i))
- end do
-
- ccyymmddhhnnss = parsedate(argum(1))
- datelen = len_trim(ccyymmddhhnnss)
-
- if (datelen == 8) then
- read(ccyymmddhhnnss(1:10), fmt='(i4, 2i2)') ccyy, mm, dd
- hh = 0
- nn = 0
- ss = 0
- else if (datelen == 10) then
- read(ccyymmddhhnnss(1:10), fmt='(i4, 3i2)') ccyy, mm, dd, hh
- nn = 0
- ss = 0
- else if (datelen == 12) then
- read(ccyymmddhhnnss(1:12), fmt='(i4, 4i2)') ccyy, mm, dd, hh, nn
- ss = 0
- else if (datelen == 14) then
- read(ccyymmddhhnnss(1:14), fmt='(i4, 5i2)') ccyy, mm, dd, hh, nn, ss
- else
- stop 'wrong input date'
- endif
-
- if (.not. validdate(ccyy,mm,dd,hh,nn,ss)) then
- stop 'Start date is not valid, or has wrong format'
- endif
-
- i = 0
-
- dtime = trim(argum(2))
- call parsedt(dtime,dday,dh,dn,ds)
-
- hh = hh + dh
- nn = nn + dn
- ss = ss + ds
-
- ! advance minute according to second
- do while (ss < 0)
- ss = ss + 60
- nn = nn - 1
- end do
- do while (ss > 59)
- ss = ss - 60
- nn = nn + 1
- end do
-
- ! advance hour according to minute
- do while (nn < 0)
- nn = nn + 60
- hh = hh - 1
- end do
- do while (nn > 59)
- nn = nn - 60
- hh = hh + 1
- end do
-
- ! advance day according to hour
- do while (hh < 0)
- hh = hh + 24
- dday = dday - 1
- end do
-
- do while (hh > 23)
- hh = hh - 24
- dday = dday + 1
- end do
-
- ! advance day if dday /= 0
- if (dday /= 0) call change_date ( ccyy, mm, dd, dday)
-
- write(ccyymmddhhnnss(1:14), fmt='(i4, 5i2.2)') ccyy, mm, dd, hh, nn, ss
- if ( nargum == 2 ) then
- if (datelen<14) then
- if(nn /= 0) datelen=12
- if(ss /= 0) datelen=14
- endif
- write(unit=stdout, fmt='(a)') ccyymmddhhnnss(1:datelen)
- else if ( nargum > 2 ) then
- i = 3
- do while (i <= nargum)
- select case ( trim(argum(i)) )
- case ('-w', '-W', '-wrf','-WRF')
- out_date_format = 'ccyy-mm-dd_hh:nn:ss'
- write(unit=stdout, fmt='(a)') trim(formatdate(ccyymmddhhnnss, out_date_format))
- i = i+1
- case ('-f', '-F')
- out_date_format = trim(argum(i+1))
- write(unit=stdout, fmt='(a)') trim(formatdate(ccyymmddhhnnss, out_date_format))
- i = i+2
- case ('-j')
- write(unit=stdout, fmt='(I4,I4)') ccyy, julian_day(ccyy,mm,dd)
- i = i+1
- case ('-J')
- write(unit=stdout, fmt='(I4,I4,I3,I3,I3)') ccyy, julian_day(ccyy,mm,dd),hh,nn,ss
- i = i+1
- case ('-g','-G')
- call gregorian_day_sec(ccyy,mm,dd,hh,nn,ss,gday,gsec)
- write(unit=stdout, fmt='(I8,I8)') gday, gsec
- i = i+1
- case default
- i = i+1
- end select
- end do
- end if
-
-contains
-
-subroutine change_date( ccyy, mm, dd, delta )
-
- implicit none
-
- integer, intent(inout) :: ccyy, mm, dd
- integer, intent(in) :: delta
-
- integer, dimension(12) :: mmday
- integer :: dday, direction
-
- mmday = (/31,28,31,30,31,30,31,31,30,31,30,31/)
-
- mmday(2) = 28
-
- if (mod(ccyy,4) == 0) then
- mmday(2) = 29
-
- if (mod(ccyy,100) == 0) then
- mmday(2) = 28
- end if
-
- if (mod(ccyy,400) == 0) then
- mmday(2) = 29
- end if
- end if
-
- dday = abs(delta)
- direction = sign(1,delta)
-
- do while (dday > 0)
-
- dd = dd + direction
-
- if (dd == 0) then
- mm = mm - 1
-
- if (mm == 0) then
- mm = 12
- ccyy = ccyy - 1
- end if
-
- dd = mmday(mm)
- elseif ( dd > mmday(mm)) then
- dd = 1
- mm = mm + 1
- if(mm > 12 ) then
- mm = 1
- ccyy = ccyy + 1
- end if
- end if
-
- dday = dday - 1
-
- end do
- return
-end subroutine change_date
-
-
-function parsedate(datein)
- character(len=80) :: datein
- character(len=14) :: parsedate
- character(len=1 ) :: ch
- integer :: n, i
- parsedate = '00000000000000'
- i=0
- do n = 1, len_trim(datein)
- ch = datein(n:n)
- if (ch >= '0' .and. ch <= '9') then
- i=i+1
- parsedate(i:i)=ch
- end if
- end do
- if (parsedate(11:14) == '0000') then
- parsedate(11:14) = ''
- else if(parsedate(13:14) == '00') then
- parsedate(13:14) = ''
- end if
- return
-end function parsedate
-
-subroutine parsedt(dt,dday,dh,dn,ds)
- character(len=80) :: dt
- integer :: dday, dh, dn, ds
- character(len=1 ) :: ch
- integer :: n,i,d,s,nounit
- ! initialize time and sign
- nounit=1
- dday=0
- dh=0
- dn=0
- ds=0
- d=0
- s=1
- do n = 1, len_trim(dt)
- ch = dt(n:n)
- select case (ch)
- case ('0':'9')
- read(ch,fmt='(i1)') i
- d=d*10+i
- case ('-')
- s=-1
- case ('+')
- s=1
- case ('d')
- nounit=0
- dday=dday+d*s
- d=0
- case ('h')
- nounit=0
- dh=dh+d*s
- d=0
- case ('n','m')
- nounit=0
- dn=dn+d*s
- d=0
- case ('s')
- nounit=0
- ds=ds+d*s
- d=0
- case default
- end select
- end do
- if (nounit==1) dh=d*s
-end subroutine parsedt
-
-function formatdate(datein,dateform)
- character(len=14) :: datein
- character(len=80) :: dateform
- character(len=80) :: formatdate
- integer :: ic,iy,im,id,ih,in,is
- ic=index(dateform,'cc')
- iy=index(dateform,'yy')
- im=index(dateform,'mm')
- id=index(dateform,'dd')
- ih=index(dateform,'hh')
- in=index(dateform,'nn')
- is=index(dateform,'ss')
- formatdate=trim(dateform)
- if (ic /= 0) formatdate(ic:ic+1) = datein(1:2)
- if (iy /= 0) formatdate(iy:iy+1) = datein(3:4)
- if (im /= 0) formatdate(im:im+1) = datein(5:6)
- if (id /= 0) formatdate(id:id+1) = datein(7:8)
- if (ih /= 0) formatdate(ih:ih+1) = datein(9:10)
- if (in /= 0) formatdate(in:in+1) = datein(11:12)
- if (is /= 0) formatdate(is:is+1) = datein(13:14)
- return
-end function formatdate
-
-function julian_day(ccyy,mm,dd)
- integer :: ccyy,mm,dd
- integer :: julian_day
- integer, parameter, dimension( 13) :: &
- bgn_day = (/ 0, 31, 59, 90, 120, 151, &
- 181, 212, 243, 273, 304, 334, 365 /), &
- bgn_day_ly = (/ 0, 31, 60, 91, 121, 152, &
- 182, 213, 244, 274, 305, 335, 366 /)
- if (isleapyear(ccyy)) then
- julian_day = bgn_day_ly(mm)+dd
- else
- julian_day = bgn_day(mm)+dd
- end if
-end function julian_day
-
-function isleapyear(year)
- ! check if year is leapyear
- integer,intent(in) :: year
- logical :: isleapyear
- if( mod(year,4) .ne. 0 ) then
- isleapyear=.FALSE.
- else
- isleapyear=.TRUE.
- if ( mod(year,100) == 0 .and. mod(year,400) .ne. 0 ) isleapyear=.FALSE.
- endif
-end function isleapyear
-
-subroutine gregorian_day_sec(year,month,day,hours,minutes,seconds,gday,gsec)
- integer :: day, month, year, hours, minutes, seconds
- integer :: gday, gsec
- integer :: ndays, m, nleapyr
- integer :: base_year = 1601
- integer :: days_per_month(12) = (/31,28,31,30,31,30,31,31,30,31,30,31/)
-
- if( year < base_year ) stop "Year can not be before 1601!"
-
- ! compute number of leap years fully past since base_year
- nleapyr = (year - base_year) / 4 - (year - base_year) / 100 + (year - base_year) / 400
- ! Count up days in this year
- ndays = 0
- do m=1,month-1
- ndays = ndays + days_per_month(m)
- if(isleapyear(year) .and. m == 2) ndays = ndays + 1
- enddo
- gsec = seconds + 60*(minutes + 60*hours)
- gday = day - 1 + ndays + 365*(year - base_year - nleapyr) + 366*(nleapyr)
- return
-end subroutine gregorian_day_sec
-
-function validdate(ccyy,mm,dd,hh,nn,ss)
- integer :: ccyy,mm,dd,hh,nn,ss
- logical :: validdate
-
- validdate = .true.
-
- if(ss > 59 .or. ss < 0 .or. &
- nn > 59 .or. nn < 0 .or. &
- hh > 23 .or. hh < 0 .or. &
- dd < 1 .or. &
- mm > 12 .or. mm < 1 ) validdate = .false.
-
- if (mm == 2 .and. ( dd > 29 .or. &
- ((.not. isleapyear(ccyy)) .and. dd > 28))) &
- validdate = .false.
-end function validdate
-
-end program advance_time
From nancy at ucar.edu Thu May 27 13:17:29 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Thu, 27 May 2010 13:17:29 -0600
Subject: [Dart-dev] [4377] DART/trunk/models/MITgcm_ocean: Add a width field
to the fortran write statements; it makes the gfortran
Message-ID:
Revision: 4377
Author: nancy
Date: 2010-05-27 13:17:29 -0600 (Thu, 27 May 2010)
Log Message:
-----------
Add a width field to the Fortran write statements; it makes the gfortran
compiler happier. (These are some very old local changes that I'm looking
through, trying to either get them all committed or tossed out.)
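For reference, the underlying issue is that an F or I edit descriptor without an explicit field
width (e.g. '(f)') is a vendor extension that gfortran rejects at compile time. A minimal
standard-conforming sketch (hypothetical variable names and values, mirroring the format strings
in the diff below):

   program format_width_demo
      implicit none
      real    :: end_time = 3600.0
      integer :: nsecs    = 86400

      ! widthless descriptors such as '(f)' or '(i)' fail to compile with gfortran;
      ! an explicit width (plus a decimal count for reals) is portable
      write(*,'('' endTime = '',f12.6,'','')') end_time
      write(*,'('' PARM03 endTime '',i8,'' seconds'')') nsecs
   end program format_width_demo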
Modified Paths:
--------------
DART/trunk/models/MITgcm_ocean/model_mod.f90
DART/trunk/models/MITgcm_ocean/trans_sv_pv.f90
-------------- next part --------------
Modified: DART/trunk/models/MITgcm_ocean/model_mod.f90
===================================================================
--- DART/trunk/models/MITgcm_ocean/model_mod.f90 2010-05-27 19:05:57 UTC (rev 4376)
+++ DART/trunk/models/MITgcm_ocean/model_mod.f90 2010-05-27 19:17:29 UTC (rev 4377)
@@ -2929,16 +2929,16 @@
continue
elseif (index(uc_string,'STARTTIME') > 0) then
- write(nml_string,'('' startTime = '',f,'','')')0.0_r8
+ write(nml_string,'('' startTime = '',f12.6,'','')')0.0_r8
elseif (index(uc_string,'DUMPFREQ') > 0) then
- write(nml_string,'('' dumpFreq = '',f,'','')')dumpFreq
+ write(nml_string,'('' dumpFreq = '',f12.6,'','')')dumpFreq
elseif (index(uc_string,'ENDTIME') > 0) then
- write(nml_string,'('' endTime = '',f,'','')')endTime
+ write(nml_string,'('' endTime = '',f12.6,'','')')endTime
elseif (index(uc_string,'TAVEFREQ') > 0) then
- write(nml_string,'('' taveFreq = '',f,'','')')taveFreq
+ write(nml_string,'('' taveFreq = '',f12.8,'','')')taveFreq
endif
Modified: DART/trunk/models/MITgcm_ocean/trans_sv_pv.f90
===================================================================
--- DART/trunk/models/MITgcm_ocean/trans_sv_pv.f90 2010-05-27 19:05:57 UTC (rev 4376)
+++ DART/trunk/models/MITgcm_ocean/trans_sv_pv.f90 2010-05-27 19:17:29 UTC (rev 4377)
@@ -122,7 +122,7 @@
call print_time( model_time,'trans_sv_pv:dart model time')
call print_time(adv_to_time,'trans_sv_pv:advance_to time')
call print_time( offset,'trans_sv_pv:a distance of')
-write( * ,'(''trans_sv_pv:PARM03 endTime '',i,'' seconds'')') &
+write( * ,'(''trans_sv_pv:PARM03 endTime '',i8,'' seconds'')') &
(secs + days*SECPERDAY)
call print_date( model_time,'trans_sv_pv:dart model date',logfileunit)
@@ -130,7 +130,7 @@
call print_time( model_time,'trans_sv_pv:dart model time',logfileunit)
call print_time(adv_to_time,'trans_sv_pv:advance_to time',logfileunit)
call print_time( offset,'trans_sv_pv: a distance of',logfileunit)
-write(logfileunit,'(''trans_sv_pv:PARM03 endTime '',i,'' seconds'')') &
+write(logfileunit,'(''trans_sv_pv:PARM03 endTime '',i8,'' seconds'')') &
(secs + days*SECPERDAY)
call finalize_utilities()
From nancy at ucar.edu Thu May 27 16:39:43 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Thu, 27 May 2010 16:39:43 -0600
Subject: [Dart-dev] [4378] DART/trunk/models: Added the mkmf and path_names
files for the restart_file_tool
Message-ID:
Revision: 4378
Author: nancy
Date: 2010-05-27 16:39:43 -0600 (Thu, 27 May 2010)
Log Message:
-----------
Added the mkmf and path_names files for the restart_file_tool
to the model directories where they were missing.
Modified Paths:
--------------
DART/trunk/models/template/work/path_names_restart_file_tool
Added Paths:
-----------
DART/trunk/models/9var/work/mkmf_restart_file_tool
DART/trunk/models/9var/work/path_names_restart_file_tool
DART/trunk/models/MITgcm_annulus/work/mkmf_restart_file_tool
DART/trunk/models/MITgcm_annulus/work/path_names_restart_file_tool
DART/trunk/models/coamps/work/mkmf_restart_file_tool
DART/trunk/models/coamps/work/path_names_restart_file_tool
DART/trunk/models/forced_lorenz_96/work/mkmf_restart_file_tool
DART/trunk/models/forced_lorenz_96/work/path_names_restart_file_tool
DART/trunk/models/ikeda/work/mkmf_restart_file_tool
DART/trunk/models/ikeda/work/path_names_restart_file_tool
DART/trunk/models/lorenz_04/work/mkmf_restart_file_tool
DART/trunk/models/lorenz_04/work/path_names_restart_file_tool
DART/trunk/models/lorenz_63/work/mkmf_restart_file_tool
DART/trunk/models/lorenz_63/work/path_names_restart_file_tool
DART/trunk/models/lorenz_84/work/mkmf_restart_file_tool
DART/trunk/models/lorenz_84/work/path_names_restart_file_tool
DART/trunk/models/lorenz_96_2scale/work/mkmf_restart_file_tool
DART/trunk/models/lorenz_96_2scale/work/path_names_restart_file_tool
DART/trunk/models/rose/work/mkmf_restart_file_tool
DART/trunk/models/rose/work/path_names_restart_file_tool
DART/trunk/models/simple_advection/work/mkmf_restart_file_tool
DART/trunk/models/simple_advection/work/path_names_restart_file_tool
DART/trunk/models/tiegcm/work/mkmf_restart_file_tool
DART/trunk/models/tiegcm/work/path_names_restart_file_tool
-------------- next part --------------
Added: DART/trunk/models/9var/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/9var/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/9var/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/9var/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/9var/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/9var/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/9var/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/9var/model_mod.f90
+common/types_mod.f90
+location/oned/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/MITgcm_annulus/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/MITgcm_annulus/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/MITgcm_annulus/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/MITgcm_annulus/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/MITgcm_annulus/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/MITgcm_annulus/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/MITgcm_annulus/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/MITgcm_annulus/model_mod.f90
+common/types_mod.f90
+location/annulus/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/coamps/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/coamps/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/coamps/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/coamps/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/coamps/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/coamps/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/coamps/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/coamps/model_mod.f90
+common/types_mod.f90
+location/threed_sphere/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/forced_lorenz_96/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/forced_lorenz_96/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/forced_lorenz_96/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/forced_lorenz_96/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/forced_lorenz_96/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/forced_lorenz_96/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/forced_lorenz_96/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/forced_lorenz_96/model_mod.f90
+common/types_mod.f90
+location/oned/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/ikeda/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/ikeda/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/ikeda/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/ikeda/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/ikeda/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/ikeda/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/ikeda/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/ikeda/model_mod.f90
+common/types_mod.f90
+location/oned/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/lorenz_04/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/lorenz_04/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/lorenz_04/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/lorenz_04/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/lorenz_04/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/lorenz_04/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/lorenz_04/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/lorenz_04/model_mod.f90
+common/types_mod.f90
+location/oned/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/lorenz_63/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/lorenz_63/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/lorenz_63/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/lorenz_63/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/lorenz_63/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/lorenz_63/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/lorenz_63/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/lorenz_63/model_mod.f90
+common/types_mod.f90
+location/oned/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/lorenz_84/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/lorenz_84/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/lorenz_84/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/lorenz_84/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/lorenz_84/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/lorenz_84/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/lorenz_84/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/lorenz_84/model_mod.f90
+common/types_mod.f90
+location/oned/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/lorenz_96_2scale/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/lorenz_96_2scale/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/lorenz_96_2scale/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/lorenz_96_2scale/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/lorenz_96_2scale/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/lorenz_96_2scale/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/lorenz_96_2scale/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/lorenz_96_2scale/model_mod.f90
+common/types_mod.f90
+location/oned/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/rose/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/rose/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/rose/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/rose/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/rose/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/rose/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/rose/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/rose/model_mod.f90
+common/types_mod.f90
+location/threed_sphere/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/models/simple_advection/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/simple_advection/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/simple_advection/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/simple_advection/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/simple_advection/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/simple_advection/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/simple_advection/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/simple_advection/model_mod.f90
+common/types_mod.f90
+location/oned/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Modified: DART/trunk/models/template/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/template/work/path_names_restart_file_tool 2010-05-27 19:17:29 UTC (rev 4377)
+++ DART/trunk/models/template/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -11,7 +11,7 @@
assim_model/assim_model_mod.f90
models/template/model_mod.f90
common/types_mod.f90
-location/threed_sphere/location_mod.f90
+location/oned/location_mod.f90
random_seq/random_seq_mod.f90
random_nr/random_nr_mod.f90
time_manager/time_manager_mod.f90
Added: DART/trunk/models/tiegcm/work/mkmf_restart_file_tool
===================================================================
--- DART/trunk/models/tiegcm/work/mkmf_restart_file_tool (rev 0)
+++ DART/trunk/models/tiegcm/work/mkmf_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,18 @@
+#!/bin/csh
+#
+# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
+# provided by UCAR, "as is", without charge, subject to all terms of use at
+# http://www.image.ucar.edu/DAReS/DART/DART_download
+#
+# $Id$
+
+../../../mkmf/mkmf -p restart_file_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_restart_file_tool
+
+exit $status
+
+#
+# $URL$
+# $Revision$
+# $Date$
+
Property changes on: DART/trunk/models/tiegcm/work/mkmf_restart_file_tool
___________________________________________________________________
Added: svn:executable
+ *
Added: svn:mime-type
+ text/plain
Added: svn:keywords
+ Date Rev Author HeadURL Id
Added: svn:eol-style
+ native
Added: DART/trunk/models/tiegcm/work/path_names_restart_file_tool
===================================================================
--- DART/trunk/models/tiegcm/work/path_names_restart_file_tool (rev 0)
+++ DART/trunk/models/tiegcm/work/path_names_restart_file_tool 2010-05-27 22:39:43 UTC (rev 4378)
@@ -0,0 +1,19 @@
+utilities/restart_file_tool.f90
+ensemble_manager/ensemble_manager_mod.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+reg_factor/reg_factor_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/tiegcm/model_mod.f90
+common/types_mod.f90
+location/threed_sphere/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
From nancy at ucar.edu Fri May 28 12:24:38 2010
From: nancy at ucar.edu (nancy at ucar.edu)
Date: Fri, 28 May 2010 12:24:38 -0600
Subject: [Dart-dev] [4379] DART/trunk/models/tiegcm/model_mod.f90: Bugs in
model_interpolate and get_val subroutines are fixed.
Message-ID:
Revision: 4379
Author: tmatsuo
Date: 2010-05-28 12:24:38 -0600 (Fri, 28 May 2010)
Log Message:
-----------
Bugs in model_interpolate and get_val subroutines are fixed.
Density vertical interpolation are now done in after being converted to log-scale.
Modified Paths:
--------------
DART/trunk/models/tiegcm/model_mod.f90
-------------- next part --------------
Modified: DART/trunk/models/tiegcm/model_mod.f90
===================================================================
--- DART/trunk/models/tiegcm/model_mod.f90 2010-05-27 22:39:43 UTC (rev 4378)
+++ DART/trunk/models/tiegcm/model_mod.f90 2010-05-28 18:24:38 UTC (rev 4379)
@@ -301,7 +301,7 @@
integer :: lat_below, lat_above, lon_below, lon_above
real(r8) :: lon_fract, temp_lon, lat_fract
real(r8) :: lon, lat, height, lon_lat_lev(3)
-real(r8) :: bot_lon, top_lon, delta_lon, bot_lat, top_lat, delta_lat
+real(r8) :: bot_lon, top_lon, zero_lon, delta_lon, bot_lat, top_lat, delta_lat
real(r8) :: val(2,2), a(2)
if ( .not. module_initialized ) call static_init_model
@@ -323,34 +323,29 @@
endif
! Get lon and lat grid specs
-bot_lon = lons(1) ! 0
-top_lon = lons(nlon) ! 355
+bot_lon = lons(1) ! 180
+zero_lon = lons(37) ! 0
+top_lon = lons(nlon) ! 175
delta_lon = abs((lons(1)-lons(2))) ! 5
bot_lat = lats(1) ! -87.5
top_lat = lats(nlat) ! 87.5
delta_lat = abs((lats(1)-lats(2))) ! 5
-! Compute bracketing lon indices: DART [0, 360] TIEGCM [0, 355]
-if(lon >= bot_lon .and. lon <= top_lon) then ! 0 <= lon <= 355
+! Compute bracketing lon indices:
+! TIEGCM [-180 180] = DART [180, 185, ..., 355, 0, 5, ..., 175]
+if(lon > top_lon .and. lon < bot_lon) then ! at wraparound point [175 < lon < 180]
+ lon_below = nlon !175
+ lon_above = 1 !180
+ lon_fract = (lon - top_lon) / delta_lon
+else if (lon >= bot_lon) then ! [180 <= lon <= 360]
lon_below = int((lon - bot_lon) / delta_lon) + 1
lon_above = lon_below + 1
lon_fract = (lon - lons(lon_below)) / delta_lon
-elseif (lon < bot_lon) then ! lon < 0
- temp_lon = lon + 360.0
- lon_below = int((temp_lon - bot_lon) / delta_lon) + 1
+else ! [0 <= lon <= 175 ]
+ lon_below = int((lon - zero_lon) / delta_lon) + 37
lon_above = lon_below + 1
- lon_fract = (temp_lon - lons(lon_below)) / delta_lon
-
-elseif (lon >= (top_lon+delta_lon)) then ! 360 <= lon
- temp_lon = lon - 360.0
- lon_below = int((temp_lon - bot_lon) / delta_lon) + 1
- lon_above = lon_below + 1
- lon_fract = (temp_lon - lons(lon_below)) / delta_lon
-else ! 355 < lon < 360 at wraparound point
- lon_below = nlon
- lon_above = 1
- lon_fract = (lon - top_lon) / delta_lon
+ lon_fract = (lon - lons(lon_below)) / delta_lon
endif
! compute neighboring lat rows: TIEGCM [-87.5, 87.5] DART [-90, 90]
@@ -391,6 +386,7 @@
obs_val = 0.
endif
+!write(11,*,access='APPEND') lon, lat, lon_fract, lat_fract, a(1), a(2), obs_val
!print*, 'model_interpolate', lon, lat, height,obs_val
@@ -419,13 +415,27 @@
! To find a layer height: what's the unit of height [m]
h_loop:do k = 1, nlev
zgrid = ZGtiegcm(lon_index,lat_index,k)/100.0_r8 ! [m] = ZGtiegcm/100 [cm]
+
+ if (k == 1 .and. zgrid > height) then
+ istatus = 1
+ val = 0.0
+ return
+ endif
+
+ if (k == nlev .and. zgrid < height) then
+ istatus = 1
+ val = 0.0
+ return
+ endif
+
if (height <= zgrid) then
lev_top = k
lev_bottom = lev_top -1
- delta_z = zgrid - ZGtiegcm(lon_index,lat_index,lev_bottom)
+ delta_z = zgrid - ZGtiegcm(lon_index,lat_index,lev_bottom)/100.0_r8
frac_lev = (zgrid - height)/delta_z
exit h_loop
endif
+
enddo h_loop
if (obs_kind == KIND_DENSITY) then
@@ -448,8 +458,10 @@
end if
-val = frac_lev * val_bottom + (1.0 - frac_lev) * val_top
+val = exp(frac_lev * log(val_bottom) + (1.0 - frac_lev) * log(val_top))
+!write(11,*,access='APPEND') lat_index, lon_index, lev_top, lev_bottom, frac_lev, val_top, val_bottom, val
+
end subroutine get_val