[Dart-dev] [4538] DART/trunk/observations/gps/shell_scripts: Major revision of the GPS scripts from Ryan Torn, with more tweaks by me.

nancy at ucar.edu
Fri Oct 22 10:54:11 MDT 2010


Revision: 4538
Author:   nancy
Date:     2010-10-22 10:54:11 -0600 (Fri, 22 Oct 2010)
Log Message:
-----------
Major revision of the GPS scripts from Ryan Torn, with more tweaks by me.  
The scripts now download and convert a day at a time, 0Z to 0Z.  To construct
files with obs from 03:01Z of one day through 03:00Z of the next day, use
the obs_sequence_tool.

There is now a single wrapper script and all the download, convert, and
cleanup is done by a second script.  The documentation has been updated
to match the revised scripts.
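The obs_sequence_tool step mentioned above can be sketched as a namelist
fragment.  This is a hedged illustration only: the item names follow the
usual DART &obs_sequence_tool_nml conventions, but exact names vary by DART
version, and the day values shown are placeholders rather than computed
Gregorian day numbers for these dates.

```fortran
&obs_sequence_tool_nml
   num_input_files   = 2
   filename_seq      = 'obs_seq.gpsro_20071001', 'obs_seq.gpsro_20071002'
   filename_out      = 'obs_seq.gpsro_windowed'
   first_obs_days    = -1       ! placeholder: gregorian day of the first day
   first_obs_seconds = 10860    ! 3*3600 + 60 = 03:01:00Z
   last_obs_days     = -1       ! placeholder: gregorian day of the next day
   last_obs_seconds  = 10800    ! 3*3600      = 03:00:00Z
/
```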

Modified Paths:
--------------
    DART/trunk/observations/gps/shell_scripts/README
    DART/trunk/observations/gps/shell_scripts/convert_script.csh
    DART/trunk/observations/gps/shell_scripts/cosmic_to_obsseq.csh

Removed Paths:
-------------
    DART/trunk/observations/gps/shell_scripts/cosmic_download.csh
    DART/trunk/observations/gps/shell_scripts/download_script.csh

-------------- next part --------------
Modified: DART/trunk/observations/gps/shell_scripts/README
===================================================================
--- DART/trunk/observations/gps/shell_scripts/README	2010-10-21 00:38:44 UTC (rev 4537)
+++ DART/trunk/observations/gps/shell_scripts/README	2010-10-22 16:54:11 UTC (rev 4538)
@@ -4,13 +4,15 @@
 #
 # DART $Id$
 
-The intent for the scripts in this directory are:
+This file describes the scripts provided to process the COSMIC and
+CHAMP GPS radio occultation data.
 
-Summary:  
+Summary of workflow:  
 1) cd to the ../work directory and run ./quickbuild.csh to compile everything.  
-2) Edit cosmic_download.csh and cosmic_to_obsseq.csh once to set DART_DIR.
-3) Edit ./download_script.csh to set the cosmic name, pw, and dates.  Run it.
-4) Edit ./convert_script.csh to set the cosmic name, pw, and dates.  Run it.
+2) Edit ./cosmic_to_obsseq.csh once to set the directory where the DART
+    code is installed, and your COSMIC web site user name and password.
+3) Edit ./convert_script.csh to set the days of data to download/convert/remove.
+4) Run ./convert_script.csh
 5) For additional days repeat steps 3 and 4.
 
 
@@ -26,47 +28,30 @@
 to do the GPS conversion into DART obs_sequence files.
 
 
-2) download_script.csh (which calls cosmic_download.csh):
+2) cosmic_to_obsseq.csh:
 
-Edit cosmic_download.csh once to set the DART_DIR to where you have
+Edit cosmic_to_obsseq.csh once to set the DART_DIR to where you have
 downloaded the DART distribution.  (There are a few additional options
 in this script, but the distribution version should be good for most users.)
-After this you should be able to ignore this script.
+If you are downloading data from the COSMIC web site, also set your
+COSMIC web site user name and password.  After this you should be able to
+ignore this script.
 
-Edit and run the download_script.csh.  You will need to set your
-cosmic web site user name and password, and set the days for which you 
-want to download data.
 
+3) convert_script.csh:
 
-3) convert_script.csh (which calls cosmic_to_obsseq.csh):
+A wrapper script that calls the converter script one day at a time.
+Set the days of data you want to download/convert/remove.  See the
+comments at the top of this script for the various options to set.
+Edit and rerun this script for each additional set of days you need.
 
-Edit cosmic_to_obsseq.csh once to set the DART_DIR to where you have
-downloaded the DART distribution.  (There are a few additional options
-in this script, but the distribution version should be good for most users.)
-After this you should be able to ignore this script.
 
-Edit and run the convert_script.csh.  You will need to set your
-cosmic web site user name and password, and set the days for which you 
-want to convert data into DART obs_sequence files.
 
-There are options on the convert_script.csh to bypass step 2 and do both
-the download and convert in a single step. For one or two days this
-might be a good option, but it is common to have problems during the download.
-It is easier to restart the download separately from the conversion
-if you're using the separate download script.  Also, the conversion needs
-3 hours from the following day to create a file which has all obs within
-the window (usually 3 hours) from the center time (usually 0Z).  So if the
-download is done as a part of the script it must download 2 days, do the
-conversion, and then move on to the next day.  The script is currently
-not smart enough to avoid redownloading data, so if you are converting
-multiple consecutive days it will redownload the next day's data anyway
-and will not realize you already have it.  It cannot simply look for the
-existance of a directory since that complicates restarting failed or
-partial downloads.  It is also risky to automatically delete the data files
-in the script, since if there are errors (full file system, etc) that
-don't cause the script to exit, you will delete the data files and not
-have the converted obs files.  This is all only a consideration because
-currently it is slow to download the data.  If that becomes faster, then
-most of the discussion in this paragraph is moot.
+It can be risky to use the automatic delete/cleanup option: if an error
+occurs during the download or conversion (file system full, bad file
+format, etc) that does not cause the script to exit, the input files can
+be deleted before the conversion has succeeded.  But if you have file
+quota concerns this option keeps the total disk usage lower.
 
 
+

Modified: DART/trunk/observations/gps/shell_scripts/convert_script.csh
===================================================================
--- DART/trunk/observations/gps/shell_scripts/convert_script.csh	2010-10-21 00:38:44 UTC (rev 4537)
+++ DART/trunk/observations/gps/shell_scripts/convert_script.csh	2010-10-22 16:54:11 UTC (rev 4538)
@@ -7,45 +7,60 @@
 # $Id$
 #
 # Main script:
-# generate multiple days of gps observations
+#    generate multiple days of gps observations
 #
-# calls the cosmic_to_obsseq script with 4 args:
+# calls the cosmic_to_obsseq script with 5 args:
 #
 #  - the date in YYYYMMDD format
-#  - the working directory location
-#  - whether to download the data automatically from the cosmic web site 
-#     (downloading data requires signing up for a username/password to 
-#     access the site, and then setting the username and password here 
-#     before running this script.)  set to 'no' if the data has already been
-#     downloaded separately before now.  set to 'both' to download both the
-#     current day plus the next day (needed to get the first N hours of the
-#     following day for assimilation windows centered on midnight).  set to
-#     'next' if the current day's data is already downloaded.
+#  - the processing directory location, relative to the 'work' dir.
+#  - whether to download the data automatically from the cosmic web site.
+#     'yes' will do the download, 'no' assumes the data is already downloaded
+#     and on local disk.  (downloading data requires signing up for a 
+#     username/password to access the site, and then setting the username 
+#     and password in the cosmic_to_obsseq script before running it.)
+#  - whether to convert the data.  set to 'yes' to make obs_seq files (the
+#     usual use of this script). 'no' if just downloading or just cleaning up.
 #  - whether to delete the data automatically from the local disk after the
-#     conversion is done.  values are 'no', 'curr', or 'both'.  'curr' deletes
-#     the current day but leaves the following day.  'both' deletes both
-#     the current and next day's data.
+#     conversion is done.  valid values are 'yes' or 'no'.
 #
 
-setenv cosmic_user xxx
-setenv cosmic_pw   yyy
+# examples of common use follow.  
 
-# assumes all data predownloaded, and will be deleted afterwards
-# by hand.
-./cosmic_to_obsseq.csh 20071001 ../cosmic no no
-./cosmic_to_obsseq.csh 20071002 ../cosmic no no
-./cosmic_to_obsseq.csh 20071003 ../cosmic no no
-./cosmic_to_obsseq.csh 20071004 ../cosmic no no
-./cosmic_to_obsseq.csh 20071005 ../cosmic no no
-./cosmic_to_obsseq.csh 20071006 ../cosmic no no
-./cosmic_to_obsseq.csh 20071007 ../cosmic no no
+# download only:
+./cosmic_to_obsseq.csh 20071001 ../cosmic yes no no
+./cosmic_to_obsseq.csh 20071002 ../cosmic yes no no
+./cosmic_to_obsseq.csh 20071003 ../cosmic yes no no
+./cosmic_to_obsseq.csh 20071004 ../cosmic yes no no
+./cosmic_to_obsseq.csh 20071005 ../cosmic yes no no
 
-## example of using both, curr and next for on-demand download and cleanup.
-#./cosmic_to_obsseq.csh 20071001 ../cosmic both curr
-#./cosmic_to_obsseq.csh 20071002 ../cosmic next curr
-#./cosmic_to_obsseq.csh 20071003 ../cosmic next curr
-#./cosmic_to_obsseq.csh 20071004 ../cosmic next both
+# convert only.  assume all data already downloaded:
+./cosmic_to_obsseq.csh 20071001 ../cosmic no yes no
+./cosmic_to_obsseq.csh 20071002 ../cosmic no yes no
+./cosmic_to_obsseq.csh 20071003 ../cosmic no yes no
+./cosmic_to_obsseq.csh 20071004 ../cosmic no yes no
+./cosmic_to_obsseq.csh 20071005 ../cosmic no yes no
 
+# download and convert, not removing files:
+./cosmic_to_obsseq.csh 20071001 ../cosmic yes yes no
+./cosmic_to_obsseq.csh 20071002 ../cosmic yes yes no
+./cosmic_to_obsseq.csh 20071003 ../cosmic yes yes no
+./cosmic_to_obsseq.csh 20071004 ../cosmic yes yes no
+./cosmic_to_obsseq.csh 20071005 ../cosmic yes yes no
+
+# clean up only after verifying conversion worked:
+./cosmic_to_obsseq.csh 20071001 ../cosmic no no yes
+./cosmic_to_obsseq.csh 20071002 ../cosmic no no yes
+./cosmic_to_obsseq.csh 20071003 ../cosmic no no yes
+./cosmic_to_obsseq.csh 20071004 ../cosmic no no yes
+./cosmic_to_obsseq.csh 20071005 ../cosmic no no yes
+
+# download, convert, and clean up all in one go:
+./cosmic_to_obsseq.csh 20071001 ../cosmic yes yes yes
+./cosmic_to_obsseq.csh 20071002 ../cosmic yes yes yes
+./cosmic_to_obsseq.csh 20071003 ../cosmic yes yes yes
+./cosmic_to_obsseq.csh 20071004 ../cosmic yes yes yes
+./cosmic_to_obsseq.csh 20071005 ../cosmic yes yes yes
+
 exit 0
 
 # <next few lines under version control, do not edit>

Deleted: DART/trunk/observations/gps/shell_scripts/cosmic_download.csh
===================================================================
--- DART/trunk/observations/gps/shell_scripts/cosmic_download.csh	2010-10-21 00:38:44 UTC (rev 4537)
+++ DART/trunk/observations/gps/shell_scripts/cosmic_download.csh	2010-10-22 16:54:11 UTC (rev 4538)
@@ -1,155 +0,0 @@
-#!/bin/csh
-#
-# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
-# provided by UCAR, "as is", without charge, subject to all terms of use at
-# http://www.image.ucar.edu/DAReS/DART/DART_download
-#
-# $Id$
-#
-########################################################################
-#
-#   cosmic_download.csh - script that downloads COSMIC observations.
-#         then you can run cosmic_to_obsseq.csh with a third argument of
-#         'no' so it does not re-download the same files.
-#
-# requires 2 args:
-#    $1 - analysis date (yyyymmdd format)
-#    $2 - base observation directory
-#
-# and update the 3 env settings at the top to match your system.
-#
-#     created May  2009, nancy collins ncar/cisl
-#                        converted from cosmic_to_obsseq.csh
-#     updated Aug  2009, nancy collins ncar/cisl
-#
-# from the cosmic web site about the use of 'wget' to download
-# the many files needed to do this process:
-# ------- 
-# Hints for using wget for fetching CDAAC files from CDAAC:
-# 
-# Here is one recipe for fetching all cosmic real time atmPrf files for one day:
-# 
-# wget -nd -np -r -l 10 -w 2 --http-user=xxxx --http-passwd=xxxx http://cosmic-io.cosmic.ucar.edu/cdaac/login/cosmicrt/level2/atmPrf/2009.007/
-# 
-# The option -np (no parents) is important. Without it, all manner of 
-# files from throughout the site will be loaded, I think due to the 
-# links back to the main page which are everywhere.
-# 
-# The option -r (recursive fetch) is necessary unless there is just 
-# one file you want to fetch.
-# 
-# The option -l 10 (limit of recursive depth to 10 levels) is necessary 
-# in order to get around the default 5 level depth.
-# 
-# The option -nd dumps all fetched files into your current directory. 
-# Otherwise a directory hierarchy will be created: 
-#   cosmic-io.cosmic.ucar.edu/cdaac/login/cosmic/level2/atmPrf/2006.207
-# 
-# The option -w 2 tells wget to wait two seconds between each file fetch 
-# so as to have pity on our poor web server.
-# ------- 
-# 
-########################################################################
-
-# should only have to set the DART_DIR and the rest should be in the
-# right place relative to it.
-setenv DART_DIR      /home/user/DART
-
-setenv DART_WORK_DIR  ${DART_DIR}/observations/gps/work
-setenv DATE_PROG      advance_time
-
-set chatty=yes
-set downld=yes
-
-set datea   = ${1}     # target date, YYYYMMDD
-set datadir = ${2}     # where to process the files
-
-set assim_freq          = 6  # hours, sets centers of windows.
-set download_window     = 3  # window half-width (some users choose 2 hours)
-set gps_repository_path = 'http://cosmic-io.cosmic.ucar.edu/cdaac/login'
-set wget_cmd            = 'wget -q -nd -np -r -l 10 -w 1'
-
-# i've done this wrong enough times and wasted a lot of download time,
-# so do a bunch of bullet-proofing here before going on.
-
-# verify the dirs all exist, the input.nml is in place.
-if ( ! -d ${DART_WORK_DIR} ) then
-  echo 'work directory not found: ' ${DART_WORK_DIR}
-  exit
-endif
-
-echo 'current dir is ' `pwd`
-if ( `pwd` != ${DART_WORK_DIR} ) then
-   echo 'if not already there, changing directory to work dir.'
-   cd ${DART_WORK_DIR}
-   echo 'current dir now ' `pwd`
-endif
-
-if ( ! -d ${datadir} ) then
-  echo 'data processing directory not found: ' ${datadir}
-  echo 'creating now.'
-  mkdir ${datadir}
-  ls -ld ${datadir}
-endif
-
-if ( ! -e ${datadir}/${DATE_PROG} ) then
-  echo 'data processing directory does not contain the time converter'
-  echo 'copying from work dir to data proc dir'
-  echo `pwd`/${DATE_PROG} '->' ${datadir}/${DATE_PROG}
-  cp -f ./${DATE_PROG} ${datadir}
-else
-  echo 'using time conversion program already found in data proc dir'
-endif
-
-echo 'changing dir to data proc directory'
-cd ${datadir}
-echo 'current dir now ' `pwd`
-
- if ( ! $?cosmic_user || ! $?cosmic_pw ) then
-    echo "You must setenv cosmic_user to your username for the cosmic web site"
-    echo "and setenv cosmic_pw to your password. (or export cosmic_user=val for"
-    echo "ksh/bash users) then rerun this script. "
-    exit -1
- endif
-
-if ( $chatty == 'yes' ) then
-   echo 'starting raw file download at' `date`
-endif
-
-set get = "${wget_cmd} --http-user=${cosmic_user} --http-passwd=${cosmic_pw}" 
-
-set yyyy   = `echo $datea | cut -b1-4`
-
-if ( ! -d ${datea} ) then
-  echo 'year/month/day directory not found: ' ${datea}
-  echo 'creating now.'
-  mkdir ${datea}
-endif
-
-cd ${datea}
-ln -sf ${DART_WORK_DIR}/input.nml .
-
-echo $datea
-set jyyyydd = `echo $datea 0 -j | ../${DATE_PROG}` 
-echo $jyyyydd
-@ mday = $jyyyydd[2] + 1000  ;  set mday = `echo $mday | cut -b2-4`
-${get} ${gps_repository_path}/cosmic/level2/atmPrf/${yyyy}.${mday}/
-rm -f *.html *.txt
-${get} ${gps_repository_path}/champ/level2/atmPrf/${yyyy}.${mday}/
-rm -f *.html *.txt input.nml
-
-if ( $chatty == 'yes' ) then
-   # the ls arg list line gets too long in some cases
-   echo `/bin/ls | grep _nc | wc -l` 'raw files'
-   echo 'all raw files download at ' `date`
-endif
-
-cd ..
-
-exit 0
-
-# <next few lines under version control, do not edit>
-# $URL$
-# $Revision$
-# $Date$
-

Modified: DART/trunk/observations/gps/shell_scripts/cosmic_to_obsseq.csh
===================================================================
--- DART/trunk/observations/gps/shell_scripts/cosmic_to_obsseq.csh	2010-10-21 00:38:44 UTC (rev 4537)
+++ DART/trunk/observations/gps/shell_scripts/cosmic_to_obsseq.csh	2010-10-22 16:54:11 UTC (rev 4538)
@@ -11,42 +11,106 @@
 #   cosmic_to_obsseq.csh - script that downloads COSMIC observations 
 #               and converts them to a DART observation sequence file.
 #
-# requires 3 args:
+# requires 5 args:
 #    $1 - analysis date (yyyymmdd format)
-#    $2 - base observation directory
-#    $3 - yes to download COSMIC data (calls cosmic_download.csh)
-#    $4 - yes to delete raw COSMIC data when finished
+#    $2 - base directory where files will be processed
+#    $3 - yes to download raw COSMIC data 
+#    $4 - yes to convert COSMIC data to an obs_seq file
+#    $5 - yes to delete raw COSMIC data when finished
 #
-# update the DART_DIR setting at the top to match your system.
+# update DART_DIR in this script to match your system, and if
+# downloading, set your COSMIC web page user name and password.
 #
+# edit the input.nml in the work directory to select any options;
+# it will be copied to the various places it is needed.
+#
+# the processing directory name is relative to the 'work' directory.
+#
+#
 #     created June 2008, Ryan Torn NCAR/MMM
 #     updated Nov  2008, nancy collins ncar/cisl
 #     updated Aug  2009, nancy collins ncar/cisl
+#     updated Oct  2010, Ryan and nancy
 # 
+#
+# ------- 
+# from the cosmic web site about the use of 'wget' to download
+# the many files needed to do this process:
+#
+#   Hints for using wget for fetching CDAAC files from CDAAC:
+#   
+#   Here is one recipe for fetching all cosmic real time atmPrf files for one day:
+#   
+#   wget -nd -np -r -l 10 -w 2 --http-user=xxxx --http-passwd=xxxx \
+#         http://cosmic-io.cosmic.ucar.edu/cdaac/login/cosmicrt/level2/atmPrf/2009.007/
+#   
+#   The option -np (no parents) is important. Without it, all manner of 
+#   files from throughout the site will be loaded, I think due to the 
+#   links back to the main page which are everywhere.
+#   
+#   The option -r (recursive fetch) is necessary unless there is just 
+#   one file you want to fetch.
+#   
+#   The option -l 10 (limit of recursive depth to 10 levels) is necessary 
+#   in order to get around the default 5 level depth.
+#   
+#   The option -nd dumps all fetched files into your current directory. 
+#   Otherwise a directory hierarchy will be created: 
+#     cosmic-io.cosmic.ucar.edu/cdaac/login/cosmic/level2/atmPrf/2006.207
+#   
+#   The option -w 2 tells wget to wait two seconds between each file fetch 
+#   so as to have pity on the poor web server.
+# ------- 
+# 
+# note: there are between 1000 and 3000 files per day.  without the -w
+# flag, i was getting about 5 files per second (each individual file is
+# relatively small).  but with -w 1 obviously we get slightly less than
+# a file a second, -w 2 is half that again.  this script uses -w 1 by
+# default, but if you are trying to download a lot of days i'd recommend
+# removing it.
+#
 ########################################################################
 
-# should only have to set the DART_DIR and the rest should be in the
-# right place relative to this directory.
-setenv DART_DIR      /home/user/dart
+########################################################################
 
+# should only have to set the DART_DIR, and if downloading, set the
+# web site user name and password for access.  expects to use the 'wget'
+# utility to download files from the web page.  
+
+# top level directory (where observations dir is found)
+setenv DART_DIR    /home/user/dart
+set cosmic_user    = xxx
+set cosmic_pw      = yyy
+
+set gps_repository_path = 'http://cosmic-io.cosmic.ucar.edu/cdaac/login'
+
 setenv DART_WORK_DIR  ${DART_DIR}/observations/gps/work
 setenv CONV_PROG      convert_cosmic_gps_cdf
 setenv DATE_PROG      advance_time
 
+# if you are in a hurry, this has no delay between requests:
+#set wget_cmd           = 'wget -q -nd -np -r -l 10 -w 0'
+set wget_cmd            = 'wget -q -nd -np -r -l 10 -w 1'
+
 set chatty=yes
 
 set datea   = ${1}     # target date, YYYYMMDD
-set datadir = ${2}     # where to process the files
-set downld  = ${3}     # download?  'both' (both days) 'next' (next day)
-set cleanup = ${4}     # delete COSMIC files at end? 'both' or 'curr'
+set datadir = ${2}     # under what directory to process the files
+set downld  = ${3}     # download?  'yes' or 'no'
+set convert = ${4}     # convert?   'yes' or 'no'
+set cleanup = ${5}     # delete COSMIC files at end? 'yes' or 'no'
 
-set assim_freq          = 6  # hours, sets centers of windows.
-set download_window     = 3  # window half-width (some users choose 2 hours)
 
-# i've done this wrong enough times and wasted a lot of download time,
-# so do a bunch of bullet-proofing here before going on.
+if ( $chatty == 'yes' ) then
+   echo 'starting gps script run at ' `date`
+endif
 
-# verify the dirs all exist, the input.nml is in place.
+# i've done this wrong enough times and wasted a lot of download 
+# time, so do a bunch of bullet-proofing here before going on.
+# e.g. verify the dirs all exist, the input.nml is in place.
+# make sure the user has not edited the download/processing
+# version of the input.nml - we use the one from the work dir.
+
 if ( ! -d ${DART_WORK_DIR} ) then
   echo 'work directory not found: ' ${DART_WORK_DIR}
   exit 1
@@ -60,10 +124,8 @@
 endif
 
 if ( ! -d ${datadir} ) then
-  echo 'data processing directory not found: ' ${datadir}
-  echo 'creating now.'
+  echo 'creating data processing directory: ' ${datadir}
   mkdir ${datadir}
-  ls -ld ${datadir}
 endif
 if ( ! -e ${datadir}/input.nml ) then
   echo 'data processing directory does not contain an input.nml'
@@ -71,137 +133,135 @@
   echo `pwd`/input.nml '->' ${datadir}/input.nml
   cp -f ./input.nml ${datadir}
 else
-  echo 'data processing directory already contains an input.nml'
-  echo 'which will be used for this run.'
   diff -q ./input.nml ${datadir}/input.nml
+  if ( $status == 1 ) then
+     echo 'the input.nml file in the work directory is different'
+     echo 'than the one in the data processing directory.'
+     echo 'update them to be consistent, or remove the one in the'
+     echo 'data processing directory and a new one will be copied'
+     echo 'over from the work directory.'
+     exit -1
+  endif
 endif
 
-if ( ! -e ${datadir}/${DATE_PROG} ) then
-  echo 'data processing directory does not contain the time converter'
-  echo 'copying from work dir to data proc dir'
-  echo `pwd`/${DATE_PROG} '->' ${datadir}/${DATE_PROG}
-  cp -f ./${DATE_PROG} ${datadir}
-else
-  echo 'using time conversion program already found in data proc dir'
-endif
+# copy over the date program and the converter from
+# the work dir to the data processing directory.
+cp -f ./${DATE_PROG} ./${CONV_PROG} ${datadir}
 
-if ( ! -e ${datadir}/${CONV_PROG} ) then
-  echo 'data processing directory does not contain the data converter'
-  echo 'copying from work dir to data proc dir'
-  echo `pwd`/${CONV_PROG} '->' ${datadir}/${CONV_PROG}
-  cp -f ./${CONV_PROG} ${datadir}
-else
-  echo 'using data conversion program already found in data proc dir'
-endif
+if ( $downld == 'yes' ) then
+   echo 'if not already there, changing dir to data proc directory'
+   cd ${datadir}
+   echo 'current dir now ' `pwd`
 
-set date2 = `echo $datea +1d -f ccyymmdd | ${DATE_PROG}`
+   if ( ! $?cosmic_user || ! $?cosmic_pw ) then
+      echo "You must set cosmic_user to your username for the cosmic web site"
+      echo "and set cosmic_pw to your password, then rerun this script. "
+      exit -1
+   endif
 
-if ( $downld == 'both' || $downld == 'next' ) then
+   if ( $chatty == 'yes' ) then
+      echo 'starting raw file download at' `date`
+   endif
+   
+   set get = "${wget_cmd} --http-user=${cosmic_user} --http-passwd=${cosmic_pw}" 
+   set yyyy   = `echo $datea | cut -b1-4`
+   
+   if ( ! -d ${datea} ) then
+     echo 'year/month/day directory not found: ' ${datea}
+     echo 'creating now.'
+     mkdir ${datea}
+   else
+     echo 'existing directory found, cleaning up before new download'
+     rm -fr ${datea}
+     mkdir ${datea}
+   endif
+   
+   cd ${datea}
+   ln -sf ${DART_WORK_DIR}/input.nml .
+   
+   set jyyyydd = `echo $datea 0 -j | ../${DATE_PROG}` 
+   @ mday = $jyyyydd[2] + 1000  ;  set mday = `echo $mday | cut -b2-4`
+   echo 'downloading obs for date: ' $datea ', which is julian day: ' $jyyyydd
 
-   if ( $downld == 'both' )  ./cosmic_download.csh ${datea} ${datadir}
-
-   ./cosmic_download.csh ${date2} ${datadir}
-
+   ${get} ${gps_repository_path}/cosmic/level2/atmPrf/${yyyy}.${mday}/
+   rm -f *.html *.txt
+   ${get} ${gps_repository_path}/champ/level2/atmPrf/${yyyy}.${mday}/
+   rm -f *.html *.txt input.nml
+   
+   if ( $chatty == 'yes' ) then
+      echo `/bin/ls . | grep _nc | wc -l` 'raw files downloaded at ' `date`
+   endif
+   
+   cd ${DART_WORK_DIR}
+else
+   echo 'not downloading data; assume it is already on local disk'
 endif
 
-echo 'changing dir to data proc directory'
-cd ${datadir}
-echo 'current dir now ' `pwd`
-
-# save original
-set dates = $datea
-
-@ hh = $assim_freq + 100  ;  set hh = `echo $hh | cut -b2-3`
-set datea = `echo $datea | cut -b1-8`${hh}
-set datee = `echo ${datea}00 24 | ./${DATE_PROG}`
-
-if ( $chatty == 'yes' ) then
-   echo 'starting gps conversion at ' `date`
-endif
-
-while ( $datea < $datee )
-
-  set yyyy = `echo $datea | cut -b1-4`
-  set   mm = `echo $datea | cut -b5-6`
-  set   dd = `echo $datea | cut -b7-8`
-  set   hh = `echo $datea | cut -b9-10` 
-
-  rm -f flist
-  @ nhours = 2 * $download_window  ;  set n = 1
-  set datef = `echo $datea -$download_window | ./${DATE_PROG}`
-  while ( $n <= $nhours ) 
-
-    set    yyyy = `echo $datef | cut -b1-4`
-    set      hh = `echo $datef | cut -b9-10`
-    set yyyymmdd = `echo $datef | cut -b1-8` 
-    set jyyyydd = `echo $datef 0 -j | ./${DATE_PROG}`
-    @ mday = $jyyyydd[2] + 1000  ;  set mday = `echo $mday | cut -b2-4`
-
-    /bin/ls ${yyyymmdd}/*.${yyyy}.${mday}.${hh}.*_nc >>! flist
-
-    if ( $chatty == 'yes' ) then
-      echo $datea ' hour ' $hh
-    endif
-
-    set datef = `echo $datef 1 | ./${DATE_PROG}`
-    @ n += 1
-  end
- 
-  set nfiles = `cat flist | wc -l`
-  if ( $chatty == 'yes' ) then
+if ( $convert == 'yes') then
+   echo 'if not already there, changing dir to data proc directory'
+   cd ${datadir}
+   echo 'current dir now ' `pwd`
+   
+   if ( $chatty == 'yes' ) then
+      echo 'starting gps conversion at ' `date`
+   endif
+   
+   rm -f flist
+   set yyyy    = `echo $datea | cut -b1-4`
+   set jyyyydd = `echo ${datea}00 0 -j | ./${DATE_PROG}`
+   @ mday = $jyyyydd[2] + 1000  ;  set mday = `echo $mday | cut -b2-4`
+   echo 'converting obs for date: ' $datea
+   
+   /bin/ls -1 ${datea}/*.${yyyy}.${mday}.*_nc >! flist
+   
+   set nfiles = `cat flist | wc -l`
+   if ( $chatty == 'yes' ) then
       echo $nfiles ' to process for file ' $datea 
-  endif
+   endif
+   
+   ./${CONV_PROG} >>! convert_output_log
 
-  ./${CONV_PROG} >>! convert_output_log
-  rm -rf cosmic_gps_input.nc  flist
-
-  set datef = $datea
-  if ( `echo $datea | cut -b9-10` == '00' ) then
-    set datef = `echo $datea -24 | ./${DATE_PROG}`
-    set datef = `echo $datef | cut -b1-8`24
-  endif
-  if ( -e obs_seq.gpsro )  mv obs_seq.gpsro ../obs_seq.gpsro_${datef}
-
-   if ( $chatty == 'yes' ) then
-      echo "moved accumulated obs to ../obs_seq.gpsro_${datef} at " `date`
+   rm -rf cosmic_gps_input.nc flist
+   if ( -e obs_seq.gpsro ) then
+      mv obs_seq.gpsro obs_seq.gpsro_${datea}
+   
+      if ( $chatty == 'yes' ) then
+         echo "all observations for day in file obs_seq.gpsro_${datea} at " `date`
+      endif
+   else
+      if ( $chatty == 'yes' ) then
+         echo "no obs found for date ${datea}, or conversion failed."
+      endif
    endif
 
-  set datea = `echo $datea $assim_freq | ./${DATE_PROG}`
+   cd ${DART_WORK_DIR}
+else
+   echo 'not converting data'
+endif
 
-end
-
-if ( $cleanup == 'both' || $cleanup == 'curr' ) then
-
+if ( $cleanup == 'yes' ) then
+   echo 'if not already there, changing dir to data proc directory'
+   cd ${datadir}
+   echo 'current dir now ' `pwd`
+   
    if ( $chatty == 'yes' ) then
       echo 'cleaning up files at ' `date`
    endif
 
-  # do in chunks because some systems have problems with the command
-  # line getting too long (large numbers of files here).
-  rm -f ${dates}/atmPrf_C001*
-  rm -f ${dates}/atmPrf_C002*
-  rm -f ${dates}/atmPrf_C003*
-  rm -f ${dates}/atmPrf_C004*
-  rm -f ${dates}/atmPrf_C005*
-  rm -f ${dates}/atmPrf_C006*
-  rm -f ${dates}/atmPrf_CHAM*
-  rm -f ${dates}/atmPrf_*
+   echo 'removing original cosmic data files for date: ' $datea
 
-  if ( $cleanup == 'both' ) then
-    rm -f ${date2}/atmPrf_C001*
-    rm -f ${date2}/atmPrf_C002*
-    rm -f ${date2}/atmPrf_C003*
-    rm -f ${date2}/atmPrf_C004*
-    rm -f ${date2}/atmPrf_C005*
-    rm -f ${date2}/atmPrf_C006*
-    rm -f ${date2}/atmPrf_CHAM*
-    rm -f ${date2}/atmPrf_*
-  endif
+   # just remove the whole subdir here.  trying to list individual
+   # files can cause problems with long command line lengths.
+   rm -fr ${datea}
 
+   cd ${DART_WORK_DIR}
+else
+   echo 'not removing original cosmic data files'
 endif
 
 if ( $chatty == 'yes' ) then
-   echo 'finished gps conversion at ' `date`
+   echo 'finished gps script run at ' `date`
+   echo ' '
 endif
 
 exit 0
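The day-of-year padding idiom used in the script above
(`@ mday = $jyyyydd[2] + 1000` followed by `cut -b2-4`) can be sketched
in plain shell.  This is an illustration of the trick, not code from the
distribution:

```shell
# Zero-pad a day-of-year to three digits the same way the csh idiom does:
# add 1000 (7 -> 1007), then drop the leading '1' (1007 -> 007).
doy=7
mday=$(( doy + 1000 ))
mday=${mday#1}
echo "$mday"

# the idiomatic one-liner does the same thing:
mday2=$(printf '%03d' "$doy")
echo "$mday2"
```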

Deleted: DART/trunk/observations/gps/shell_scripts/download_script.csh
===================================================================
--- DART/trunk/observations/gps/shell_scripts/download_script.csh	2010-10-21 00:38:44 UTC (rev 4537)
+++ DART/trunk/observations/gps/shell_scripts/download_script.csh	2010-10-22 16:54:11 UTC (rev 4538)
@@ -1,68 +0,0 @@
-#!/bin/csh
-#
-# DART software - Copyright © 2004 - 2010 UCAR. This open source software is
-# provided by UCAR, "as is", without charge, subject to all terms of use at
-# http://www.image.ucar.edu/DAReS/DART/DART_download
-#
-# $Id$
-#
-# Main script:
-# download multiple days of gps files
-#
-# calls the cosmic_download script with a date and the working
-# directory location.  downloading data requires signing up 
-# for a username/password to access the site, and then setting 
-# the username and password here before running this script.
-#
-#BSUB -J get_gps
-#BSUB -o gps.%J.log
-#BSUB -q standby
-#BSUB -n 1
-#BSUB -W 12:00
-
-setenv cosmic_user xxx
-setenv cosmic_pw   yyy
-
-./cosmic_download.csh 20071001 ../cosmic
-./cosmic_download.csh 20071002 ../cosmic
-./cosmic_download.csh 20071003 ../cosmic
-./cosmic_download.csh 20071004 ../cosmic
-./cosmic_download.csh 20071005 ../cosmic
-./cosmic_download.csh 20071006 ../cosmic
-./cosmic_download.csh 20071007 ../cosmic
- 
-#./cosmic_download.csh 20071008 ../cosmic
-#./cosmic_download.csh 20071009 ../cosmic
-#./cosmic_download.csh 20071010 ../cosmic
-#./cosmic_download.csh 20071011 ../cosmic
-#./cosmic_download.csh 20071012 ../cosmic
-#./cosmic_download.csh 20071013 ../cosmic
-#./cosmic_download.csh 20071014 ../cosmic
- 
-#./cosmic_download.csh 20071015 ../cosmic
-#./cosmic_download.csh 20071016 ../cosmic
-#./cosmic_download.csh 20071017 ../cosmic
-#./cosmic_download.csh 20071018 ../cosmic
-#./cosmic_download.csh 20071019 ../cosmic
-#./cosmic_download.csh 20071020 ../cosmic
-#./cosmic_download.csh 20071021 ../cosmic
- 
-#./cosmic_download.csh 20071022 ../cosmic
-#./cosmic_download.csh 20071023 ../cosmic
-#./cosmic_download.csh 20071024 ../cosmic
-#./cosmic_download.csh 20071025 ../cosmic
-#./cosmic_download.csh 20071026 ../cosmic
-#./cosmic_download.csh 20071027 ../cosmic
-#./cosmic_download.csh 20071028 ../cosmic
-
-#./cosmic_download.csh 20071029 ../cosmic
-#./cosmic_download.csh 20071030 ../cosmic
-#./cosmic_download.csh 20071031 ../cosmic
-
-exit 0
-
-# <next few lines under version control, do not edit>
-# $URL$
-# $Revision$
-# $Date$
-

