[Dart-dev] [4031] DART/trunk/observations: First stab at a reader program for the temperature/salinity observations
nancy at ucar.edu
Wed Sep 2 16:48:16 MDT 2009
Added: DART/trunk/observations/GTSPP/GTSPP.html
===================================================================
--- DART/trunk/observations/GTSPP/GTSPP.html (rev 0)
+++ DART/trunk/observations/GTSPP/GTSPP.html 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,291 @@
+<HTML>
+<HEAD>
+<TITLE>GTSPP Observations</TITLE>
+<link rel="stylesheet" type="text/css" href="../doc/html/doc.css"></link>
+</HEAD>
+<BODY>
+<!--
+!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+!! !!
+!! GNU General Public License !!
+!! !!
+!! This file is part of the Data Assimilation Research Testbed (DART). !!
+!! !!
+!! DART is free software; you can redistribute it and/or modify !!
+!! it and are expected to follow the terms of the GNU General Public !!
+!! License as published by the Free Software Foundation. !!
+!! !!
+!! DART is distributed in the hope that it will be useful, !!
+!! but WITHOUT ANY WARRANTY; without even the implied warranty of !!
+!! MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the !!
+!! GNU General Public License for more details. !!
+!! !!
+!! You should have received a copy of the GNU General Public License !!
+!! along with DART; if not, write to: !!
+!! Free Software Foundation, Inc. !!
+!! 59 Temple Place, Suite 330 !!
+!! Boston, MA 02111-1307 USA !!
+!! or see: !!
+!! http://www.gnu.org/licenses/gpl.txt !!
+!! !!
+!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+-->
+
+<DIV ALIGN=CENTER>
+<A HREF="#Overview">OVERVIEW</A> /
+<A HREF="#DataSources">DATA SOURCES</A> /
+<A HREF="#Programs">PROGRAMS</A> /
+<A HREF="#Modules">MODULES USES</A> /
+<A HREF="#Namelist">NAMELIST</A> /
+<A HREF="#KnownBugs">KNOWN BUGS</A> /
+<A HREF="#FuturePlans">FUTURE PLANS</A>
+</DIV>
+
+<!--==================================================================-->
+
+<H1>GTSPP Observations</H1>
+<A NAME="HEADER"></A>
+
+<!--==================================================================-->
+
+<A NAME="Overview"></A>
+<H2>OVERVIEW</H2>
+
+<P>
+The GTSPP (Global Temperature-Salinity Profile Program) repository
+distributes vertical profile measurements of ocean temperature and salinity.
+The <a href="http://www.nodc.noaa.gov/GTSPP/index.html">GTSPP home page</a>
+has detailed information about the repository, observations, and datasets.
+The programs in this directory convert the netCDF files
+found in the repository into
+DART observation sequence (obs_seq) file format.
+</P>
+<P>
+FIXME : need more info here.
+</P>
+
+<!--==================================================================-->
+
+<A NAME="DataSources"></A>
+<BR><HR><BR>
+<H2>DATA SOURCES</H2>
+
+<P>
+Data from the GTSPP can be downloaded interactively from
+<a href="http://www.nodc.noaa.gov/cgi-bin/gtspp/gtsppform01.cgi">here</a>.
+It is delivered in
+<a href="http://www.unidata.ucar.edu/software/netcdf">netCDF</a>
+file format, one vertical profile per netCDF file.
+</P>
+
+<P>
+Currently each vertical profile is stored in a separate file,
+so converting a month's
+worth of observations involves downloading many individual files.
+The converter program can take a list of input files, so it is easy
+to collect a month of observations together into a single output file
+with one execution of the converter program.
+</P>
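+<P>
+The list of input files is just a text file with one input netCDF
+filename per line.  As an illustrative sketch (the first entry is the
+sample profile included in the <em class=file>data</em> directory of this
+converter; the other names are hypothetical), such a list might contain:
+</P>
+<pre>
+../data/1264095.nc
+1264096.nc
+1264097.nc
+</pre>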
+
+<!--==================================================================-->
+
+<A NAME="Programs"></A>
+<BR><HR><BR>
+<H2>PROGRAMS</H2>
+The data is distributed in
+<a href="http://www.unidata.ucar.edu/software/netcdf">netCDF</a>
+file format. DART requires all observations to be in a proprietary
+format often called DART "obs_seq" format.
+The files in this directory, a combination
+of C shell scripts and a Fortran converter program,
+do this data conversion.
+<P>
+Optional namelist interface
+<A HREF="#Namelist"> <em class=code>&convert_cosmic_gps_nml</em> </A>
+may be read from file <em class=file>input.nml</em>.
+</P>
+<P>
+The work directory contains the files needed to build the converter:
+<em class=file>quickbuild.csh</em> runs the DART
+<em class=code>preprocess</em> program and then compiles
+<em class=code>gtspp_to_obs</em>, along with the supporting
+<em class=code>advance_time</em> and <em class=code>obs_sequence_tool</em>
+utilities.
+Because each vertical profile is stored in its own netCDF file,
+downloading the raw data for a period of interest involves many
+individual files and can be lengthy.  You probably want to do the
+download as a separate preprocessing step, and keep the raw files
+until you are sure you are satisfied with the converted output files;
+then delete them by hand.
+</P>
+<P>
+The conversion executable, <em class=code>gtspp_to_obs</em>, reads its
+run-time options from the <em class=code>&amp;gtspp_to_obs_nml</em>
+namelist in the file <em class=file>input.nml</em>.  It can convert a
+single input file or a list of input files in one execution; see the
+<A HREF="#Namelist">namelist documentation</A> below.
+</P>
+
+<!--==================================================================-->
+
+<A NAME="Modules"></A>
+<HR>
+<H2>MODULES USED</H2>
+<PRE>
+types_mod
+time_manager_mod
+utilities_mod
+location_mod
+obs_sequence_mod
+obs_def_mod
+obs_def_ocean_mod
+obs_kind_mod
+netcdf
+</PRE>
+
+<!--==================================================================-->
+
+<A NAME="Namelist"></A>
+<BR><HR><BR>
+<H2>NAMELIST</H2>
+ <P>We adhere to the F90 standard of starting a namelist with an ampersand
+ '&' and terminating with a slash '/'.
+ <div class=namelist><pre>
+ <em class=call>namelist / gtspp_to_obs_nml / </em>
+    gtspp_netcdf_file, gtspp_netcdf_filelist, &
+    gtspp_out_file, avg_obs_per_file
+ </pre></div>
+ <H3 class=indent1>Discussion</H3>
+ <P>This namelist is read from a file called <em class=file>input.nml</em>.
+ </P>
+ <TABLE border=0 cellpadding=3 width=100%>
+ <TR><TH align=left>Contents </TH>
+ <TH align=left>Type </TH>
+ <TH align=left>Description </TH></TR>
+
+ <TR><!--contents--><TD valign=top>gtspp_netcdf_file</TD>
+ <!-- type --><TD valign=top>character(len=128)</TD>
+ <!--descript--><TD>The input filename when converting a single profile.
+ Only one of the 2 filenames can have a valid value,
+ so to use the single filename set the list name
+ ('gtspp_netcdf_filelist') to the empty string ('').
+ Default: '1234567.nc'</TD></TR>
+
+ <TR><!--contents--><TD valign=top>gtspp_netcdf_filelist</TD>
+ <!-- type --><TD valign=top>character(len=128)</TD>
+ <!--descript--><TD>To convert a series of profiles in a single execution
+ create a text file which contains each input file,
+ in ascii, one filename per line. Set this item to
+ the name of that file, and set 'gtspp_netcdf_file' to
+ the empty string ('').
+ Default: 'gtspp_to_obs_filelist'</TD></TR>
+
+ <TR><!--contents--><TD valign=top>gtspp_out_file</TD>
+ <!-- type --><TD valign=top>character(len=128)</TD>
+ <!--descript--><TD>The output file to be created. Note that if
+ this file already exists it will be read in and the
+ newly converted observations will be inserted into it.
+ Default: 'obs_seq.gtspp'</TD></TR>
+
+ <TR><!--contents--><TD valign=top>avg_obs_per_file</TD>
+ <!-- type --><TD valign=top>integer</TD>
+ <!--descript--><TD>An estimate of the average number of observations
+ per input file, used to size the output observation
+ sequence. When converting from a filelist, room is
+ allocated for this many observations times the number
+ of input files.
+ Default: 500</TD></TR>
+
+ </TABLE>
+<P>
+</P>
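+<P>
+As a hedged example, mirroring the <em class=file>work/input.nml</em> file
+included with this converter (the filelist name <em class=file>flist</em>
+is just a convention, and <em class=code>avg_obs_per_file</em> is shown
+at its default), a namelist that converts a list of downloaded profiles
+into a single output file could look like:
+</P>
+<div class=namelist><pre>
+&amp;gtspp_to_obs_nml
+   gtspp_netcdf_file     = '',
+   gtspp_netcdf_filelist = 'flist',
+   gtspp_out_file        = 'obs_seq.gtspp',
+   avg_obs_per_file      = 500,
+   /
+</pre></div>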
+
+<!--==================================================================-->
+
+<A NAME="KnownBugs"></A>
+<BR><HR><BR>
+<H2>KNOWN BUGS</H2>
+<P>
+This converter is a first implementation.  It does no windowing of
+observations by time, converts only the temperature values from each
+profile (salinity is not yet converted), and uses a placeholder
+observation error.  Observations outside a desired time window can be
+removed after conversion with the
+<a href="../../obs_sequence/obs_sequence_tool.html">obs_sequence_tool</a>.
+</P>
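+<P>
+As a sketch (the filenames and day/second values below are illustrative,
+not taken from an actual run), the
+<em class=code>&amp;obs_sequence_tool_nml</em> namelist in
+<em class=file>work/input.nml</em> can be set to keep only the
+observations between a first and last time, given as Gregorian days
+and seconds:
+</P>
+<div class=namelist><pre>
+&amp;obs_sequence_tool_nml
+   num_input_files   = 1,
+   filename_seq      = 'obs_seq.gtspp',
+   filename_out      = 'obs_seq.trimmed',
+   print_only        = .false.,
+   gregorian_cal     = .true.,
+   first_obs_days    = 148000,
+   first_obs_seconds = 0,
+   last_obs_days     = 148000,
+   last_obs_seconds  = 86399,
+   /
+</pre></div>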
+
+<!--==================================================================-->
+<!-- Describe Future Plans. -->
+<!--==================================================================-->
+
+<A NAME="FuturePlans"></A>
+<BR><HR><BR>
+<H2>FUTURE PLANS</H2>
+<P>
+Convert salinity as well as temperature, use the per-observation QC
+values and error information from the input files, and replace the
+placeholder observation error with realistic values.
+</P>
+
+<BR><HR><BR>
+
+<!--==================================================================-->
+<TABLE summary="">
+<TR><TD>Contact: </TD><TD> nancy collins </TD></TR>
+<TR><TD>Revision: </TD><TD> $Revision$ </TD></TR>
+<TR><TD>Source: </TD><TD> $URL$ </TD></TR>
+<TR><TD>Change Date: </TD><TD> $Date$ </TD></TR>
+<TR><TD>Change history:</TD><TD> try "svn log" or "svn diff" </TD></TR>
+</TABLE>
+<!--==================================================================-->
+
+<!--==================================================================-->
+
+<HR>
+</BODY>
+</HTML>
Property changes on: DART/trunk/observations/GTSPP/GTSPP.html
___________________________________________________________________
Name: svn:mime-type
+ text/html
Name: svn:keywords
+ Date Revision Author HeadURL Id
Added: DART/trunk/observations/GTSPP/data/1264095.nc
===================================================================
(Binary files differ)
Property changes on: DART/trunk/observations/GTSPP/data/1264095.nc
___________________________________________________________________
Name: svn:mime-type
+ application/x-netcdf
Added: DART/trunk/observations/GTSPP/gtspp_to_obs.f90
===================================================================
--- DART/trunk/observations/GTSPP/gtspp_to_obs.f90 (rev 0)
+++ DART/trunk/observations/GTSPP/gtspp_to_obs.f90 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,292 @@
+! Data Assimilation Research Testbed -- DART
+! Copyright 2004-2008, Data Assimilation Research Section
+! University Corporation for Atmospheric Research
+! Licensed under the GPL -- www.gpl.org/licenses/gpl.html
+
+!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+!
+! gtspp_to_obs - program that reads a list of NODC GTSPP observation
+! profiles of ocean temperature and salinity in netcdf
+! format and writes a DART observation sequence file.
+!
+! created 02-sept-2009, based on the GPS reader. nsc.
+!
+!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
+
+program gtspp_to_obs
+
+use types_mod, only : r8
+use time_manager_mod, only : time_type, set_calendar_type, GREGORIAN, set_time,&
+ increment_time, get_time, set_date, operator(-), &
+ print_date, operator(+)
+use utilities_mod, only : initialize_utilities, find_namelist_in_file, &
+ check_namelist_read, nmlfileunit, do_output, &
+ get_next_filename, error_handler, E_ERR, E_MSG, &
+ nc_check, find_textfile_dims
+use location_mod, only : VERTISHEIGHT, set_location
+use obs_sequence_mod, only : obs_sequence_type, obs_type, read_obs_seq, &
+ static_init_obs_sequence, init_obs, destroy_obs, &
+ write_obs_seq, init_obs_sequence, get_num_obs, &
+ insert_obs_in_seq, destroy_obs_sequence, &
+ set_copy_meta_data, set_qc_meta_data, set_qc, &
+ set_obs_values, set_obs_def, insert_obs_in_seq
+use obs_def_mod, only : obs_def_type, set_obs_def_time, set_obs_def_kind, &
+ set_obs_def_error_variance, set_obs_def_location, &
+ set_obs_def_key
+! FIXME: what actual instrument took these readings? FLOAT_xx is a placeholder
+! for now. must have obs_def_ocean_mod.f90 in the preprocess input list.
+use obs_kind_mod, only : KIND_TEMPERATURE, KIND_SALINITY, &
+ FLOAT_TEMPERATURE, FLOAT_SALINITY
+
+use netcdf
+
+implicit none
+
+! version controlled file description for error handling, do not edit
+character(len=128), parameter :: &
+ source = "$URL$", &
+ revision = "$Revision$", &
+ revdate = "$Date$"
+
+
+integer, parameter :: num_copies = 1, & ! number of copies in sequence
+ num_qc = 1 ! number of QC entries
+
+character (len=129) :: meta_data, next_infile
+character (len=80)  :: name
+integer :: ncid, varid, ndepths, k, nfiles, num_new_obs, &
+           oday, osec, iday, isec, obs_num, io, iunit, filenum, dummy
+logical :: file_exist, first_obs, did_obs, from_list = .false.
+real(r8) :: oerr, obs_val(1), qc_val(1), dtime, glat, glon, d_qc(1)
+
+type(obs_def_type) :: obs_def
+type(obs_sequence_type) :: obs_seq
+type(obs_type) :: obs, prev_obs
+type(time_type) :: obs_time, base_time, delta_time
+
+! initialize some values
+integer, parameter :: nmaxdepths = 5000 ! max number of observation depths
+real(r8) :: obs_depth(nmaxdepths) = -1.0_r8
+real(r8) :: temperature(nmaxdepths) = -888888.0_r8
+real(r8) :: salinity(nmaxdepths) = -888888.0_r8
+
+!------------------------------------------------------------------------
+! Declare namelist parameters
+!------------------------------------------------------------------------
+
+character(len=128) :: gtspp_netcdf_file = '1234567.nc'
+character(len=128) :: gtspp_netcdf_filelist = 'gtspp_to_obs_filelist'
+character(len=128) :: gtspp_out_file = 'obs_seq.gtspp'
+integer :: avg_obs_per_file = 500
+
+namelist /gtspp_to_obs_nml/ gtspp_netcdf_file, &
+ gtspp_netcdf_filelist, gtspp_out_file, &
+ avg_obs_per_file
+
+! start of executable code
+
+obs_num = 1
+d_qc(1) = 0.0_r8
+
+! time stored relative to jan 1st, 1900.
+call set_calendar_type(GREGORIAN)
+base_time = set_date(1900, 1, 1, 0, 0, 0)
+
+! read the necessary parameters from input.nml
+call initialize_utilities()
+call find_namelist_in_file("input.nml", "gtspp_to_obs_nml", iunit)
+read(iunit, nml = gtspp_to_obs_nml, iostat = io)
+call check_namelist_read(iunit, io, "gtspp_to_obs_nml")
+
+! Record the namelist values used for the run
+if (do_output()) write(nmlfileunit, nml=gtspp_to_obs_nml)
+
+! any needed namelist checks for sanity:
+
+! cannot have both a single filename and a list; the namelist must
+! shut one off.
+if (gtspp_netcdf_file /= '' .and. gtspp_netcdf_filelist /= '') then
+ call error_handler(E_ERR, 'gtspp_to_obs', &
+ 'One of gtspp_netcdf_file or filelist must be NULL', &
+ source, revision, revdate)
+endif
+if (gtspp_netcdf_filelist /= '') from_list = .true.
+
+! need to know a reasonable max number of obs that could be added here.
+if (from_list) then
+ call find_textfile_dims(gtspp_netcdf_filelist, nfiles, dummy)
+ num_new_obs = avg_obs_per_file * nfiles
+else
+ num_new_obs = avg_obs_per_file
+endif
+
+! FIXME: is this a good idea? to append to an existing file?
+! either read existing obs_seq or create a new one
+call static_init_obs_sequence()
+call init_obs(obs, num_copies, num_qc)
+call init_obs(prev_obs, num_copies, num_qc)
+inquire(file=gtspp_out_file, exist=file_exist)
+if ( file_exist ) then
+
+print *, "found existing obs_seq file, appending to ", trim(gtspp_out_file)
+ call read_obs_seq(gtspp_out_file, 0, 0, num_new_obs, obs_seq)
+
+else
+
+print *, "no existing obs_seq file, creating ", trim(gtspp_out_file)
+print *, "max entries = ", num_new_obs
+ call init_obs_sequence(obs_seq, num_copies, num_qc, num_new_obs)
+ do k = 1, num_copies
+ meta_data = 'GTSPP observation'
+ call set_copy_meta_data(obs_seq, k, meta_data)
+ end do
+ do k = 1, num_qc
+ meta_data = 'GTSPP QC'
+ call set_qc_meta_data(obs_seq, k, meta_data)
+ end do
+
+end if
+
+did_obs = .false.
+
+! main loop that does either a single file or a list of files
+
+filenum = 1
+fileloop: do ! until out of files
+
+ ! get the single name, or the next name from a list
+ if (from_list) then
+ next_infile = get_next_filename(gtspp_netcdf_filelist, filenum)
+ else
+ next_infile = gtspp_netcdf_file
+ if (filenum > 1) next_infile = ''
+ endif
+ if (next_infile == '') exit fileloop
+
+ ! open the next profile file
+ call nc_check( nf90_open(next_infile, nf90_nowrite, ncid), 'file open', next_infile)
+
+ ! time is stored in the file 2 ways: as real(double) days since 1900/1/1,
+ ! and as 4 and 2 digit strings for year/mon/day/hr/min
+ ! both of these are variables, not attributes
+
+ ! start out with converting the real time.
+ call nc_check( nf90_inq_varid(ncid, "time", varid) ,'inq varid time')
+ call nc_check( nf90_get_var(ncid, varid, dtime) ,'get var time')
+
+ ! convert to integer days and seconds, and add on to reference time.
+ iday = int(dtime)
+ isec = int((dtime - iday) * 86400)
+ delta_time = set_time(isec, iday)
+ obs_time = base_time + delta_time
+ call get_time(obs_time, osec, oday)
+
+ ! get the number of depths
+ call nc_check( nf90_inq_dimid(ncid, "depth", varid), 'inq dimid depth')
+ call nc_check( nf90_inquire_dimension(ncid, varid, name, ndepths), 'inq dim depth')
+
+ ! and read in the depth array
+ call nc_check( nf90_inq_varid(ncid, "depth", varid),'inq varid depth')
+ call nc_check( nf90_get_var(ncid, varid, obs_depth),'get var depth')
+
+ ! get the single lat/lon values
+ call nc_check( nf90_inq_varid(ncid,"longitude",varid) ,'inq varid longitude')
+ call nc_check( nf90_get_var(ncid, varid, glon), 'get var longitude')
+ call nc_check( nf90_inq_varid(ncid,"latitude",varid) ,'inq varid latitude')
+ call nc_check( nf90_get_var(ncid, varid, glat), 'get var latitude')
+
+ ! need to get the actual values from the 'temperature'
+ call nc_check( nf90_inq_varid(ncid,"temperature",varid) ,'inq varid temperature')
+ call nc_check( nf90_get_var(ncid, varid, temperature), 'get var temperature')
+
+ ! salinity? doesn't seem to be here
+
+ ! plus want file QC, and what about error?
+
+ call nc_check( nf90_close(ncid) , 'close file')
+
+
+ ! FIXME: the file QC values and per-observation errors are not read yet.
+ ! use placeholder values for now: 0 means good data, and the error
+ ! standard deviation is a constant 2 degrees C.
+ d_qc(1) = 0.0_r8
+ oerr = 2.0_r8
+
+ first_obs = .true.
+
+ obsloop: do k = 1, ndepths
+
+ ! check the qc here; 0 means good data. if bad, skip this observation.
+ if ( d_qc(1) /= 0 ) cycle obsloop
+
+ ! set location
+ call set_obs_def_location(obs_def, &
+ set_location(glon, glat, obs_depth(k),VERTISHEIGHT))
+ call set_obs_def_kind(obs_def, FLOAT_TEMPERATURE)
+ call set_obs_def_time(obs_def, set_time(osec, oday))
+
+ call set_obs_def_error_variance(obs_def, oerr * oerr)
+ call set_obs_def_key(obs_def, obs_num)
+ call set_obs_def(obs, obs_def)
+
+ obs_val(1) = temperature(k)
+ call set_obs_values(obs, obs_val)
+ qc_val(1) = d_qc(1)
+ call set_qc(obs, qc_val)
+
+ ! first one, insert with no prev. otherwise, since all times will be the
+ ! same for this column, insert with the prev obs as the starting point.
+ ! (the first insert with no prev means it will search for the right
+ ! time ordered starting point.)
+ if (first_obs) then
+ call insert_obs_in_seq(obs_seq, obs)
+ first_obs = .false.
+ else
+ call insert_obs_in_seq(obs_seq, obs, prev_obs)
+ endif
+ obs_num = obs_num+1
+ prev_obs = obs
+
+ if (.not. did_obs) did_obs = .true.
+
+ end do obsloop
+
+ filenum = filenum + 1
+
+end do fileloop
+
+! done with main loop. if we added any obs to the sequence, write it out.
+if (did_obs) then
+!print *, 'ready to write, nobs = ', get_num_obs(obs_seq)
+ if (get_num_obs(obs_seq) > 0) &
+ call write_obs_seq(obs_seq, gtspp_out_file)
+
+ ! minor stab at cleanup, in the off chance this will someday get turned
+ ! into a subroutine in a module. probably not all that needs to be done,
+ ! but a start.
+!print *, 'calling destroy_obs'
+ call destroy_obs(obs)
+ call destroy_obs(prev_obs)
+print *, 'skipping destroy_seq'
+ ! get core dumps here, not sure why?
+ !if (get_num_obs(obs_seq) > 0) call destroy_obs_sequence(obs_seq)
+endif
+
+! END OF MAIN ROUTINE
+
+! local subroutines/functions could go below a 'contains' statement.
+! two obvious candidates: something to convert a time given as days past
+! 1 jan 1900 into a DART time_type, and something to set the observation
+! error variance as a subroutine so it can be varied with depth and
+! instrument type.  a hedged sketch of both follows.
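+
+! ----------------------------------------------------------------------
+! the following 'contains' section is an illustrative sketch, not part of
+! the original converter: days1900_to_dart_time and temperature_error_sd
+! are hypothetical helpers showing one way the placeholders above could
+! be filled in.  nothing in the main code calls them yet.
+! ----------------------------------------------------------------------
+
+contains
+
+! convert a real-valued time, given as days since 1900-01-01 00:00:00,
+! into a DART time_type.  assumes the Gregorian calendar has already been
+! selected with set_calendar_type(), as is done at the top of this program.
+function days1900_to_dart_time(days_since_1900) result(otime)
+ real(r8), intent(in) :: days_since_1900
+ type(time_type)      :: otime
+ integer :: whole_days, leftover_secs
+
+ whole_days    = int(days_since_1900)
+ leftover_secs = int((days_since_1900 - whole_days) * 86400)
+ otime = set_date(1900, 1, 1, 0, 0, 0) + set_time(leftover_secs, whole_days)
+end function days1900_to_dart_time
+
+! return an observation error standard deviation, in degrees C, for a
+! temperature observation at the given depth (in meters).  the constant
+! matches the placeholder used in the main loop above; a real
+! implementation would vary it with depth and instrument type.
+function temperature_error_sd(depth) result(sd)
+ real(r8), intent(in) :: depth
+ real(r8)             :: sd
+
+ sd = 2.0_r8
+end function temperature_error_sd
+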
+end program
Property changes on: DART/trunk/observations/GTSPP/gtspp_to_obs.f90
___________________________________________________________________
Name: svn:keywords
+ Date Revision Author HeadURL Id
Added: DART/trunk/observations/GTSPP/work/input.nml
===================================================================
--- DART/trunk/observations/GTSPP/work/input.nml (rev 0)
+++ DART/trunk/observations/GTSPP/work/input.nml 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,55 @@
+
+
+! flist contains a list of input filenames to convert into a single
+! output file
+&gtspp_to_obs_nml
+ gtspp_netcdf_file = '',
+ gtspp_netcdf_filelist = 'flist',
+ gtspp_out_file = 'obs_seq.gtspp',
+ /
+
+
+&preprocess_nml
+ input_obs_kind_mod_file = '../../../obs_kind/DEFAULT_obs_kind_mod.F90',
+ output_obs_kind_mod_file = '../../../obs_kind/obs_kind_mod.f90',
+ input_obs_def_mod_file = '../../../obs_def/DEFAULT_obs_def_mod.F90',
+ output_obs_def_mod_file = '../../../obs_def/obs_def_mod.f90',
+ input_files = '../../../obs_def/obs_def_ocean_mod.f90' /
+
+&obs_kind_nml
+ /
+
+&obs_def_gps_nml
+ /
+
+&location_nml
+ /
+
+&utilities_nml
+ module_details = .false.,
+ nmlfilename = 'convert.nml'
+ /
+
+&obs_sequence_nml
+ write_binary_obs_sequence = .false. /
+
+&obs_sequence_tool_nml
+ num_input_files = 1,
+ filename_seq = '../obs_seq2007010106',
+ filename_out = 'unused',
+ print_only = .true.,
+ gregorian_cal = .true.,
+ first_obs_days = -1,
+ first_obs_seconds = -1,
+ last_obs_days = -1,
+ last_obs_seconds = -1,
+/
+! obs_types =
+! keep_types =
+! min_lat =
+! max_lat =
+! min_lon =
+! max_lon =
+
Property changes on: DART/trunk/observations/GTSPP/work/input.nml
___________________________________________________________________
Name: svn:mime-type
+ text/text
Name: svn:keywords
+ Date Revision Author HeadURL Id
Added: DART/trunk/observations/GTSPP/work/mkmf_advance_time
===================================================================
--- DART/trunk/observations/GTSPP/work/mkmf_advance_time (rev 0)
+++ DART/trunk/observations/GTSPP/work/mkmf_advance_time 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,15 @@
+#!/bin/csh
+#
+# Data Assimilation Research Testbed -- DART
+# Copyright 2004-2007, Data Assimilation Research Section
+# University Corporation for Atmospheric Research
+# Licensed under the GPL -- www.gpl.org/licenses/gpl.html
+#
+# <next few lines under version control, do not edit>
+# $URL: http://subversion.ucar.edu/DAReS/DART/trunk/models/bgrid_solo/work/mkmf_merge_obs_seq $
+# $Id: mkmf_merge_obs_seq 2691 2007-03-11 18:18:09Z thoar $
+# $Revision: 2691 $
+# $Date: 2007-03-11 12:18:09 -0600 (Sun, 11 Mar 2007) $
+#
+../../../mkmf/mkmf -p advance_time -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_advance_time
Property changes on: DART/trunk/observations/GTSPP/work/mkmf_advance_time
___________________________________________________________________
Name: svn:executable
+ *
Added: DART/trunk/observations/GTSPP/work/mkmf_gtspp_to_obs
===================================================================
--- DART/trunk/observations/GTSPP/work/mkmf_gtspp_to_obs (rev 0)
+++ DART/trunk/observations/GTSPP/work/mkmf_gtspp_to_obs 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,15 @@
+#!/bin/csh
+#
+# Data Assimilation Research Testbed -- DART
+# Copyright 2004-2007, Data Assimilation Research Section
+# University Corporation for Atmospheric Research
+# Licensed under the GPL -- www.gpl.org/licenses/gpl.html
+#
+# <next few lines under version control, do not edit>
+# $URL: http://subversion.ucar.edu/DAReS/DART/branches/nancy_work/models/wrf/work/mkmf_preprocess $
+# $Id: mkmf_preprocess 2691 2007-03-11 18:18:09Z thoar $
+# $Revision: 2691 $
+# $Date: 2007-03-11 12:18:09 -0600 (Sun, 11 Mar 2007) $
+#
+../../../mkmf/mkmf -p gtspp_to_obs -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_gtspp_to_obs
Property changes on: DART/trunk/observations/GTSPP/work/mkmf_gtspp_to_obs
___________________________________________________________________
Name: svn:executable
+ *
Added: DART/trunk/observations/GTSPP/work/mkmf_obs_sequence_tool
===================================================================
--- DART/trunk/observations/GTSPP/work/mkmf_obs_sequence_tool (rev 0)
+++ DART/trunk/observations/GTSPP/work/mkmf_obs_sequence_tool 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,16 @@
+#!/bin/csh
+#
+# Data Assimilation Research Testbed -- DART
+# Copyright 2004-2007, Data Assimilation Research Section
+# University Corporation for Atmospheric Research
+# Licensed under the GPL -- www.gpl.org/licenses/gpl.html
+#
+# <next few lines under version control, do not edit>
+# $URL: $
+# $Id: $
+# $Revision: 2691 $
+# $Date: 2007-03-11 12:18:09 -0600 (Sun, 11 Mar 2007) $
+#
+../../../mkmf/mkmf -p obs_sequence_tool -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_obs_sequence_tool
+
Property changes on: DART/trunk/observations/GTSPP/work/mkmf_obs_sequence_tool
___________________________________________________________________
Name: svn:executable
+ *
Added: DART/trunk/observations/GTSPP/work/mkmf_preprocess
===================================================================
--- DART/trunk/observations/GTSPP/work/mkmf_preprocess (rev 0)
+++ DART/trunk/observations/GTSPP/work/mkmf_preprocess 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,15 @@
+#!/bin/csh
+#
+# Data Assimilation Research Testbed -- DART
+# Copyright 2004-2007, Data Assimilation Research Section
+# University Corporation for Atmospheric Research
+# Licensed under the GPL -- www.gpl.org/licenses/gpl.html
+#
+# <next few lines under version control, do not edit>
+# $URL: http://subversion.ucar.edu/DAReS/DART/branches/nancy_work/models/wrf/work/mkmf_preprocess $
+# $Id: mkmf_preprocess 2691 2007-03-11 18:18:09Z thoar $
+# $Revision: 2691 $
+# $Date: 2007-03-11 12:18:09 -0600 (Sun, 11 Mar 2007) $
+#
+../../../mkmf/mkmf -p preprocess -t ../../../mkmf/mkmf.template -c"-Duse_netCDF" \
+ -a "../../.." path_names_preprocess
Property changes on: DART/trunk/observations/GTSPP/work/mkmf_preprocess
___________________________________________________________________
Name: svn:executable
+ *
Added: DART/trunk/observations/GTSPP/work/path_names_advance_time
===================================================================
--- DART/trunk/observations/GTSPP/work/path_names_advance_time (rev 0)
+++ DART/trunk/observations/GTSPP/work/path_names_advance_time 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,6 @@
+time_manager/advance_time.f90
+time_manager/time_manager_mod.f90
+common/types_mod.f90
+utilities/utilities_mod.f90
+utilities/parse_args_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/observations/GTSPP/work/path_names_gtspp_to_obs
===================================================================
--- DART/trunk/observations/GTSPP/work/path_names_gtspp_to_obs (rev 0)
+++ DART/trunk/observations/GTSPP/work/path_names_gtspp_to_obs 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,20 @@
+observations/GTSPP/gtspp_to_obs.f90
+assim_tools/assim_tools_mod.f90
+adaptive_inflate/adaptive_inflate_mod.f90
+sort/sort_mod.f90
+cov_cutoff/cov_cutoff_mod.f90
+obs_sequence/obs_sequence_mod.f90
+obs_model/obs_model_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/template/model_mod.f90
+common/types_mod.f90
+location/threed_sphere/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+utilities/utilities_mod.f90
+time_manager/time_manager_mod.f90
+reg_factor/reg_factor_mod.f90
+ensemble_manager/ensemble_manager_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/observations/GTSPP/work/path_names_obs_sequence_tool
===================================================================
--- DART/trunk/observations/GTSPP/work/path_names_obs_sequence_tool (rev 0)
+++ DART/trunk/observations/GTSPP/work/path_names_obs_sequence_tool 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,13 @@
+obs_sequence/obs_sequence_tool.f90
+obs_sequence/obs_sequence_mod.f90
+obs_kind/obs_kind_mod.f90
+obs_def/obs_def_mod.f90
+assim_model/assim_model_mod.f90
+models/template/model_mod.f90
+common/types_mod.f90
+location/threed_sphere/location_mod.f90
+random_seq/random_seq_mod.f90
+random_nr/random_nr_mod.f90
+time_manager/time_manager_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
Added: DART/trunk/observations/GTSPP/work/path_names_preprocess
===================================================================
--- DART/trunk/observations/GTSPP/work/path_names_preprocess (rev 0)
+++ DART/trunk/observations/GTSPP/work/path_names_preprocess 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,5 @@
+preprocess/preprocess.f90
+common/types_mod.f90
+utilities/utilities_mod.f90
+mpi_utilities/null_mpi_utilities_mod.f90
+time_manager/time_manager_mod.f90
Added: DART/trunk/observations/GTSPP/work/quickbuild.csh
===================================================================
--- DART/trunk/observations/GTSPP/work/quickbuild.csh (rev 0)
+++ DART/trunk/observations/GTSPP/work/quickbuild.csh 2009-09-02 22:48:15 UTC (rev 4031)
@@ -0,0 +1,70 @@
+#!/bin/csh
+#
+# Data Assimilation Research Testbed -- DART
+# Copyright 2004-2007, Data Assimilation Research Section
+# University Corporation for Atmospheric Research
+# Licensed under the GPL -- www.gpl.org/licenses/gpl.html
+#
+# <next few lines under version control, do not edit>
+# $URL$
+# $Id$
+# $Revision$
+# $Date$
+
+# This script compiles all executables in this directory.
+#
+#----------------------------------------------------------------------
+# 'preprocess' is a program that culls the appropriate sections of the
+# observation module for the observations types in 'input.nml'; the
+# resulting source file is used by all the remaining programs,
+# so this MUST be run first.
+#----------------------------------------------------------------------
+
+\rm -f preprocess *.o *.mod
+\rm -f ../../../obs_def/obs_def_mod.f90
+\rm -f ../../../obs_kind/obs_kind_mod.f90
+
+set MODEL = "gtspp_to_obs"
+
+@ n = 1
+
+echo
+echo
+echo "---------------------------------------------------------------"
+echo "${MODEL} build number ${n} is preprocess"
+
+csh mkmf_preprocess
+make || exit $n
+
+./preprocess || exit 99
+
+#----------------------------------------------------------------------
+# Build all the single-threaded targets
+#----------------------------------------------------------------------
+
+foreach TARGET ( mkmf_* )
+
+ set PROG = `echo $TARGET | sed -e 's#mkmf_##'`
+
+ switch ( $TARGET )
+ case mkmf_preprocess:
+ breaksw
+ case mkmf_advance_time:
+ echo "If advance_time fails to build with gfortran, edit the source"
+ echo "and comment out the interface block for iargc() and try again."
+ # fall through!
+ default:
+ @ n = $n + 1
+ echo
+ echo "---------------------------------------------------"
+ echo "${MODEL} build number ${n} is ${PROG}"
+ \rm -f ${PROG}
+ csh $TARGET || exit $n
+ make || exit $n
+ breaksw
+ endsw
+end
+
+echo "Success: All DART programs compiled."
+exit 0
+
Property changes on: DART/trunk/observations/GTSPP/work/quickbuild.csh
___________________________________________________________________
Name: svn:executable
+ *
Name: svn:keywords
+ Date Revision Author HeadURL Id