[Met_help] Input data for point-stat tool
John Halley Gotway
johnhg at rap.ucar.edu
Thu Oct 2 10:09:50 MDT 2008
Valerio,
Sorry for the delay in getting back to you.
When you run the Point-Stat tool, you pass it a GRIB forecast file, a set of observations in NetCDF, and a configuration file. Point-Stat extracts from the GRIB file the fields you requested
in the config file and compares those fields to the observations in the NetCDF file. No filtering of the observations by time is done at this point; presumably, that's been done upstream.
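For reference, a typical Point-Stat command line looks like this (the file names here are just placeholders):

point_stat my_fcst.grb my_obs.nc PointStatConfig -outdir out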
For those using PREPBUFR observations for verification, the PB2NC tool allows you to specify which observation times you'd like to retain from the input PREPBUFR file - as well as many other
filtering parameters. However, since you're using your own ASCII observations, it's up to you to decide which observation times should be compared to which forecast times.
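A minimal PB2NC run looks like this (placeholder names again; the time-filtering settings themselves live in the PB2NC config file):

pb2nc my_prepbufr_file my_obs.nc PB2NCConfig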
Point-Stat should be run once for each time you'd like to verify. So if you have 72 forecast times to verify, you should run Point-Stat 72 times, and for each run you'll need to create a
NetCDF file containing the observations to be compared with that forecast time. A sketch of such a loop is shown below.
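Here's a minimal shell sketch, assuming you've split your 72-h forecast into one GRIB file per lead time (e.g., with wgrib) and prepared one observation file per valid time with ASCII2NC; the fcst_${HH}.grb and obs_${HH}.nc naming is just an assumption:

#!/bin/sh
for HH in $(seq -w 1 72); do
    # fcst_${HH}.grb: forecast for lead time HH (assumed naming)
    # obs_${HH}.nc:   observations valid at that time, from ascii2nc
    point_stat fcst_${HH}.grb obs_${HH}.nc PointStatConfig -outdir out_${HH}
done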
Hopefully that's clear. Let me know if you have any more questions.
As you go through this process, I'd really like to hear feedback on how it's going. If you have any suggestions for changes or enhancements we could make to the MET tools that would make your job a
lot easier, please let me know. For example, would it be helpful in Point-Stat to allow the user to set a beginning and ending offset time for matching with observations? So if a forecast is valid
at time T, only match to observations that fall within the time window T plus or minus some offset? As I mentioned, currently that's taken care of in PB2NC, but we could add it to Point-Stat as
well for those using ASCII observations. Any suggestions?
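Such an enhancement might look something like this in the Point-Stat config file (purely hypothetical entries, not in the current release):

//
// Hypothetical: match only observations within +/- 30 minutes
// of the forecast valid time T.
//
obs_valid_beg_offset = -1800;  // seconds before T
obs_valid_end_offset =  1800;  // seconds after T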
Thanks,
John
capecchi wrote:
> John,
> I'm submitting you another question concerning the point_stat tool...
>
> 1. The forecast grib1 file I have comes from the above mentioned procedure
> and I'm attaching
> the grib2ctl output:
> dset ^meteosalute_2007122500-72fcst.grb
> index ^meteosalute_2007122500-72fcst.grb.idx
> undef 9.999E+20
> title meteosalute_2007122500-72fcst.grb
> * produced by grib2ctl v0.9.12.5p39lbeta
> dtype grib 255
> ydef 131 linear 35.000000 0
> xdef 131 linear 6.000000 0.000000
> tdef 73 linear 00Z25dec2007 1hr
> zdef 1 linear 1 1
> vars 1
> TMPsfc 0 11,1,0 ** surface Temp. [K]
> ENDVARS
>
> 2. The observational netcdf file I have comes from the ascii2nc procedure
> and I'm attaching the ncdump -h output:
> netcdf METEOS_2503_ts2007_redux {
> dimensions:
> mxstr = 15 ;
> hdr_arr_len = 3 ;
> obs_arr_len = 5 ;
> nhdr = 43800 ;
> nobs = UNLIMITED ; // (43800 currently)
> variables:
> char hdr_typ(nhdr, mxstr) ;
> hdr_typ:long_name = "message type" ;
> char hdr_sid(nhdr, mxstr) ;
> hdr_sid:long_name = "station identification" ;
> char hdr_vld(nhdr, mxstr) ;
> hdr_vld:long_name = "valid time" ;
> hdr_vld:units = "YYYYMMDD_HHMMSS UTC" ;
> float hdr_arr(nhdr, hdr_arr_len) ;
> hdr_arr:long_name = "array of observation station header values" ;
> hdr_arr:_fill_value = -9999.f ;
> hdr_arr:columns = "lat lon elv" ;
> hdr_arr:lat_long_name = "latitude" ;
> hdr_arr:lat_units = "degrees_north" ;
> hdr_arr:lon_long_name = "longitude" ;
> hdr_arr:lon_units = "degrees_east" ;
> hdr_arr:elv_long_name = "elevation " ;
> hdr_arr:elv_units = "meters above sea level (msl)" ;
> float obs_arr(nobs, obs_arr_len) ;
> obs_arr:long_name = "array of observation values" ;
> obs_arr:_fill_value = -9999.f ;
> obs_arr:columns = "hdr_id gc lvl hgt ob" ;
> obs_arr:hdr_id_long_name = "index of matching header data" ;
> obs_arr:gc_long_name = "grib code corresponding to the observation type" ;
> obs_arr:lvl_long_name = "pressure level (hPa) or accumulation interval (h)" ;
> obs_arr:hgt_long_name = "height in meters above sea level (msl)" ;
> obs_arr:ob_long_name = "observation value" ;
>
> // global attributes:
> :FileOrigins = "File /home/salute/data/output/products_verif2007/METEOS_2503_ts2007_redux.nc generated 20080926_152518 UTC on host pisolo by the ASCII2NC tool" ;
> }
> Basically the observational data consists of one file per station of hourly
> records for all year 2007.
>
> 3. After applying the point_stat tool I get this output in the FHO file:
> VERSION MODEL FCST_LEAD FCST_VALID OBTYPE VX_MASK TYPE VAR LEVEL = TOTAL F_RATE H_RATE O_RATE INTERP_MTHD INTERP_PNTS
> V1.1 MeteosaluteWRF 000000 20071201_000000 ADPSFC 2503 FHO>278.000 TMP Z0 = 8760 1.00000 0.97717 0.97717 UW_MEAN 1
> V1.1 MeteosaluteWRF 000000 20071201_000000 ADPSFC 2503 FHO>283.000 TMP Z0 = 8760 0.00000 0.00000 0.84384 UW_MEAN 1
> V1.1 MeteosaluteWRF 000000 20071201_000000 ADPSFC 2503 FHO>288.000 TMP Z0 = 8760 0.00000 0.00000 0.58836 UW_MEAN 1
> and similar results in the contingency table outputs (if you need it, I
> can send you my config file for the point_stat tool).
>
> It seems from the FCST_LEAD and FCST_VALID columns that verification is
> performed only for the analysis time step, and not for the following 72
> time steps (which is the actual forecast lead time of my predictions).
> Am I right? I'd like to validate the 72 forecast hours day by day, and
> not just the analysis time.
> How should I organize the verification commands and data?
>
> I apologize for the length of the question, regards, Valerio
>
>
> 2008/9/30 capecchi <v.capecchi at ibimet.cnr.it>
>
>> John,
>> I apologize for my delayed reply; I'm attaching one of my NetCDF sample
>> files.
>> The syntax and sequence of commands are the ones I sent in a previous
>> mail:
>> cdo selcode,-2 my_netcdf_file ftmp1
>> cdo expr,'tmpk=ta2m+273.15;' ftmp1 ftmp2
>> cdo -f grb setcode,011 ftmp2 my_grib1_file
>> It works for only one variable at a time.
>> Nevertheless, I'm looking for new methods for converting NetCDF to GRIB
>> while waiting for improvements to input-data ingestion in MET 2.0.
>> Thanks again for your help, regards, Valerio
>>
>> 2008/9/26 John Halley Gotway <johnhg at rap.ucar.edu>
>>
>>> Valerio,
>>>
>>> I didn't have CDO available on my machine but was able to download and
>>> install it. Seems to be working. I tried playing around with it and
>>> browsed the user's guide.
>>>
>>> However, I wasn't immediately able to get it to convert my sample NetCDF
>>> file to GRIB, but it looks promising.
>>>
>>> Would you mind sending me a sample NetCDF file that you used as input to
>>> the CDO commands you sent? I'd just like to verify that I can get it
>>> working.
>>>
>>> It does look limited in the types of grids it supports - lat/lon,
>>> Gaussian, global icosahedral-hexagonal... but not polar stereographic,
>>> Lambert conformal, or Mercator. Assuming that I'm reading the
>>> user's guide correctly!
>>>
>>> Thanks,
>>> John
>>>
>>> capecchi wrote:
>>>> John,
>>>> Even if only in a limited way, I was successful in converting my
>>>> NetCDF files to GRIB1 format. The following lines show the commands
>>>> I'm using in the terminal:
>>>>
>>>> cdo selcode,-2 my_netcdf_file ftmp1
>>>> cdo expr,'tmpk=ta2m+273.15;' ftmp1 ftmp2
>>>> cdo -f grb setcode,011 ftmp2 my_grib1_file
>>>>
>>>> Some of the previous operations are redundant, but since I'm mostly
>>>> interested in 2-meter temperature, the CDO method is quite fast even
>>>> with large amounts of data.
>>>> I wish I were able to use a user-defined GRIB table to convert all the
>>>> parameters in my NetCDF file with just this command:
>>>>
>>>> cdo -t my_user_defined_table -f grb copy my_netcdf_file my_grib1_file
>>>>
>>>> but, unfortunately, it was not possible and no change was performed.
>>>> Anyway, I'd tentatively suggest using the CDO operators for this task
>>>> until NetCDF files are accepted as input to the MET tools.
>>>> Thanks for your help, Valerio
>>
>