[Met_help] help in obtaining archived PREPBUFR observations

John Halley Gotway johnhg at rap.ucar.edu
Mon Jan 26 15:14:11 MST 2009


Jonathan,

I have good news and bad news for you.

First, the data you're actually looking for are the files that contain "prepbufr" in them - not simply "bufr".  The GDAS prepbufr data are stored in 6-hour chunks - 00Z, 06Z, 12Z, and 18Z.  I
believe each file contains +/- 3 hours of data centered on that time, so the 06Z file contains observations between 03Z and 09Z.

Here's one of the files you're looking for:
http://nomads.ncdc.noaa.gov/data/gdas/200803/20080330/gdas1.t06z.prepbufr.nr

Each one of these prepbufr files contains all of the observation types put together.  But unfortunately, you may find that this prepbufr archive does not go back in time as far as you need.  I think
it's only available here back to 12/14/2006.  Does that work for you?

Retrieving the data is actually pretty easy.  You can use the "wget" unix command to grab a whole bunch of files from the web.  You'd just need to write a script that constructs the full paths of the
files you'd like to retrieve, saves them to a list file, and passes that file to wget.

For example, suppose a file named "my_prepbufr_files.txt" contains the following 4 lines:

http://nomads.ncdc.noaa.gov/data/gdas/200612/20061231/gdas1.t00z.prepbufr.nr
http://nomads.ncdc.noaa.gov/data/gdas/200612/20061231/gdas1.t06z.prepbufr.nr
http://nomads.ncdc.noaa.gov/data/gdas/200612/20061231/gdas1.t12z.prepbufr.nr
http://nomads.ncdc.noaa.gov/data/gdas/200612/20061231/gdas1.t18z.prepbufr.nr

Run the following command to grab all those files:
wget -i my_prepbufr_files.txt

It should be pretty straightforward to generate the list of files you'd like.  Then you can just run the wget command overnight.
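
In case it helps, here's a rough sketch of a shell script that builds such a list for a range of days and then hands it to wget.  The start/end dates and file names are just placeholders, and it assumes GNU date is available for the date arithmetic:

#!/bin/sh
# Build a wget input list of GDAS prepbufr URLs, one set of 00/06/12/18Z
# files per day.  Adjust the placeholder dates to the period you need.
start=20061229
end=20061231
base=http://nomads.ncdc.noaa.gov/data/gdas
list=my_prepbufr_files.txt

: > $list
d=$start
while [ "$d" -le "$end" ]; do
    for hh in 00 06 12 18; do
        # Directory layout is YYYYMM/YYYYMMDD; ${d%??} strips the day.
        echo "$base/${d%??}/$d/gdas1.t${hh}z.prepbufr.nr" >> $list
    done
    d=$(date -d "$d + 1 day" +%Y%m%d)
done

wget -i $list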

Since the observations come in 6-hour chunks, you'll need to run them through the PB2NC tool to generate the 1-hour files you'd like.

In METv1.1 (the current released version), you'd need to run PB2NC 6 times to generate the six 1-hour files.  In METv2.0 (to be released in Feb/March), you'd only need to run the data through PB2NC
once and then use command line arguments to Point-Stat to control the time range of observations used for each Point-Stat run.
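
For the METv1.1 approach, the processing might look something like the sketch below - one PB2NC run per hour against the same 6-hour prepbufr file, with each run's 1-hour time window defined in its own config file (see the PB2NC section of the MET User's Guide for the exact options).  The file and config names here are just placeholders:

#!/bin/sh
# Run PB2NC six times on the 06Z GDAS file (which covers 03Z-09Z),
# producing one NetCDF observation file per hour.  Each PB2NCConfig_*
# file is assumed to define the corresponding 1-hour time window.
pbfile=gdas1.t06z.prepbufr.nr
for hh in 03 04 05 06 07 08; do
    pb2nc $pbfile gdas_${hh}z.nc PB2NCConfig_${hh}z
done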

If you'd like, in a couple of weeks, you'd be welcome to run the beta version of METv2.0.  Having more people test it out is always a good thing prior to a release.

Hope this helps.  Let me know if you still have questions.

John Halley Gotway
johnhg at ucar.edu

Case, Jonathan (MSFC) wrote:
> Dear MET help,
> 
>  
> 
> I'd like to get started in running the standard verification statistics
> programs such as point-stat.
> 
> On your web site, you point to http://nomads.ncdc.noaa.gov/data/gdas/ as
> a source of archived PREPBUFR observations that are used in the GDAS.
> However, there are numerous files with the string "bufr", the data only
> go back to 2006, and it would be cumbersome to download each file needed
> for verification.  
> 
>  
> 
> Therefore, I'd like to ask what is the best way to obtain hourly or
> sub-hourly PREPBUFR surface observations that can be used in point-stat
> to compute typical surface verification, and what datasets should I be
> looking for? 
> 
>  
> 
> I appreciate your assistance!
> Jonathan
> 
>  
> 
> *********************************************************** 
> Jonathan Case, ENSCO, Inc. 
> Aerospace Sciences & Engineering Division 
> Short-term Prediction Research and Transition Center 
> 320 Sparkman Drive, Room 3062 
> Huntsville, AL 35805-1912 
> Voice: (256) 961-7504   Fax: (256) 961-7788 
> Emails: Jonathan.Case-1 at nasa.gov
>         case.jonathan at ensco.com
> 
> ***********************************************************
> 
>  
> 
> 
> 
> 
> ------------------------------------------------------------------------
> 
> _______________________________________________
> Met_help mailing list
> Met_help at mailman.ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/met_help

