[Met_help] [rt.rap.ucar.edu #56393] History for met error in reading records for U
Paul Oldenburg via RT
met_help at ucar.edu
Thu May 17 16:08:48 MDT 2012
----------------------------------------------------------------
Initial Request
----------------------------------------------------------------
Hi,
We tried to validate U and V and encountered the following errors after running point_stat:
/discover/nobackup/szhou/nu-wrf-trunk/nu-wrf_v2beta2-3.2.1/MET/bin/point_stat wrfout_d01_2010-05-30_00:00:00_PLEV ./data_20100530.t00z.nc PointStatConfig_SFC_WRF_sjz
GSL_RNG_TYPE=mt19937
GSL_RNG_SEED=1160424783
Forecast File: wrfout_d01_2010-05-30_00:00:00_PLEV
Climatology File: none
Configuration File: PointStatConfig_SFC_WRF_sjz
Observation File: ./data_20100530.t00z.nc
--------------------------------------------------------------------------------
Reading records for T2(0,*,*).
For T2(0,*,*) found 1 forecast levels and 0 climatology levels.
--------------------------------------------------------------------------------
Reading records for Q2(0,*,*).
For Q2(0,*,*) found 1 forecast levels and 0 climatology levels.
--------------------------------------------------------------------------------
Reading records for U(0,*,*).
ERROR: read_levels_pinterp() -> error reading U(0,*,*) from the p_interp NetCDF file wrfout_d01_2010-05-30_00:00:00_PLEV
ERROR: read_levels_pinterp() -> the valid time for the U(0,*,*) variable in the p_interp NetCDF file wrfout_d01_2010-05-30_00:00:00_PLEV does not match the requested valid time: (19700101_000000 != 20100530_000000)
Here is the script for point_stat:
///////////////////////////////////////////////////////////////////////////////
//
// Default point_stat configuration file
//
////////////////////////////////////////////////////////////////////////////////
//
// Specify a name to designate the model being verified. This name will be
// written to the second column of the ASCII output generated.
//
model = "WRF";
//
// Beginning and ending time offset values in seconds for observations
// to be used. These time offsets are defined in reference to the
// forecast valid time, v. Observations with a valid time falling in the
// window [v+beg_ds, v+end_ds] will be used.
// These selections are overridden by the command line arguments
// -obs_valid_beg and -obs_valid_end.
//
beg_ds = -3600;
end_ds = 3600;
//
// Specify a comma-separated list of fields to be verified. The forecast and
// observation fields may be specified separately. If the obs_field parameter
// is left blank, it will default to the contents of fcst_field.
//
// Each field is specified as a GRIB code or abbreviation followed by an
// accumulation or vertical level indicator for GRIB files or as a variable name
// followed by a list of dimensions for NetCDF files output from p_interp or MET.
//
// Specifying verification fields for GRIB files:
// GC/ANNN for accumulation interval NNN
// GC/ZNNN for vertical level NNN
// GC/ZNNN-NNN for a range of vertical levels (MSL or AGL)
// GC/PNNN for pressure level NNN in hPa
// GC/PNNN-NNN for a range of pressure levels in hPa
// GC/LNNN for a generic level type
// GC/RNNN for a specific GRIB record number
// Where GC is the number or abbreviation of the GRIB code
// to be verified.
// http://www.nco.ncep.noaa.gov/pmb/docs/on388/table2.html
//
// Specifying verification fields for NetCDF files:
// var_name(i,...,j,*,*) for a single field
// var_name(i-j,*,*) for a range of fields
// Where var_name is the name of the NetCDF variable,
// and i,...,j specifies fixed dimension values,
// and i-j specifies a range of values for a single dimension,
// and *,* specifies the two dimensions for the gridded field.
//
// NOTE: To verify winds as vectors rather than scalars,
// specify UGRD (or 33) followed by VGRD (or 34) with the
// same level values.
//
// NOTE: To process a probability field, add "/PROB", such as "POP/Z0/PROB".
//
// e.g. fcst_field[] = [ "SPFH/P500", "TMP/P500" ]; for a GRIB input
// e.g. fcst_field[] = [ "QVAPOR(0,5,*,*)", "TT(0,5,*,*)" ]; for NetCDF input
//
//sjz what are TMP/Z2 and SPFH/Z2 ??
fcst_field[] = [ "T2(0,*,*)", "Q2(0,*,*)", "U(0,*,*)", "V(0,*,*)" ];
//fcst_field[] = [ "T2(0,*,*)"];
obs_field[] = [ "TMP/Z2", "SPFH/Z2", "UGRD/Z10", "VGRD/Z10" ];
//obs_field[] = [ "TMP/Z2"];
//
// Specify a comma-separated list of groups of thresholds to be applied to the
// fields listed above. Thresholds for the forecast and observation fields
// may be specified separately. If the obs_thresh parameter is left blank,
// it will default to the contents of fcst_thresh.
//
// At least one threshold must be provided for each field listed above. The
// lengths of the "fcst_field" and "fcst_thresh" arrays must match, as must
// lengths of the "obs_field" and "obs_thresh" arrays. To apply multiple
// thresholds to a field, separate the threshold values with a space.
//
// Each threshold must be preceded by a two letter indicator for the type of
// thresholding to be performed:
// 'lt' for less than 'le' for less than or equal to
// 'eq' for equal to 'ne' for not equal to
// 'gt' for greater than 'ge' for greater than or equal to
//
// NOTE: Thresholds for probabilities must begin with 0.0, end with 1.0,
// and be preceded by "ge".
//
// e.g. fcst_thresh[] = [ "gt80", "gt273" ];
//
// sjz is 300 too high?
fcst_thresh[] = [ "gt300", "gt0.0", "lt100", "lt100" ];
//fcst_thresh[] = [ "gt300"];
obs_thresh[] = [];
//
// Specify a comma-separated list of thresholds to be used when computing
// VL1L2 and VAL1L2 partial sums for winds. The thresholds are applied to the
// wind speed values derived from each U/V pair. Only those U/V pairs which meet
// the wind speed threshold criteria are retained. If the obs_wind_thresh
// parameter is left blank, it will default to the contents of fcst_wind_thresh.
//
// To apply multiple wind speed thresholds, separate the threshold values with a
// space. Use "NA" to indicate that no wind speed threshold should be applied.
//
// Each threshold must be preceded by a two letter indicator for the type of
// thresholding to be performed:
// 'lt' for less than 'le' for less than or equal to
// 'eq' for equal to 'ne' for not equal to
// 'gt' for greater than 'ge' for greater than or equal to
// 'NA' for no threshold
//
// e.g. fcst_wind_thresh[] = [ "NA", "ge1.0" ];
//
fcst_wind_thresh[] = [ "NA" ];
obs_wind_thresh[] = [];
//
// Specify a comma-separated list of PrepBufr message types with which
// to perform the verification. Statistics will be computed separately
// for each message type specified. At least one PrepBufr message type
// must be provided.
// List of valid message types:
// ADPUPA AIRCAR AIRCFT ADPSFC ERS1DA GOESND GPSIPW
// MSONET PROFLR QKSWND RASSDA SATEMP SATWND SFCBOG
// SFCSHP SPSSMI SYNDAT VADWND
// ANYAIR (= AIRCAR, AIRCFT)
// ANYSFC (= ADPSFC, SFCSHP, ADPUPA, PROFLR)
// ONLYSF (= ADPSFC, SFCSHP)
// http://www.emc.ncep.noaa.gov/mmb/data_processing/prepbufr.doc/table_1.htm
//
// e.g. message_type[] = [ "ADPUPA", "AIRCAR" ];
//
//message_type[] = [ "ADPSFC" ];
message_type[] = [ "ANYSFC" ];
//
// Specify a comma-separated list of grids to be used in masking the data over
// which to perform scoring. An empty list indicates that no masking grid
// should be applied. The standard NCEP grids are named "GNNN" where NNN
// indicates the three digit grid number. Enter "FULL" to score over the
// entire domain.
// http://www.nco.ncep.noaa.gov/pmb/docs/on388/tableb.html
//
// e.g. mask_grid[] = [ "FULL" ];
//
mask_grid[] = [ ];
//
// Specify a comma-separated list of masking regions to be applied.
// An empty list indicates that no additional masks should be used.
// The masking regions may be defined in one of 4 ways:
//
// (1) An ASCII file containing a lat/lon polygon.
// Latitude in degrees north and longitude in degrees east.
// By default, the first and last polygon points are connected.
// e.g. "MET_BASE/data/poly/EAST.poly" which consists of n points:
// "poly_name lat1 lon1 lat2 lon2... latn lonn"
//
// (2) The NetCDF output of the gen_poly_mask tool.
//
// (3) A NetCDF data file, followed by the name of the NetCDF variable
// to be used, and optionally, a threshold to be applied to the field.
// e.g. "sample.nc var_name gt0.00"
//
// (4) A GRIB data file, followed by a description of the field
// to be used, and optionally, a threshold to be applied to the field.
// e.g. "sample.grb APCP/A3 gt0.00"
//
// Any NetCDF or GRIB file used must have the same grid dimensions as the
// data being verified.
//
// MET_BASE may be used in the path for the files above.
//
// e.g. mask_poly[] = [ "MET_BASE/data/poly/EAST.poly",
// "poly_mask.ncf",
// "sample.nc APCP",
// "sample.grb HGT/Z0 gt100.0" ];
//
//mask_poly[] = [ "/gpfsm/dnb32/akumar3/NU-WRF/nu-wrf_v2dev4-3.2.1/MET/anil_test/SPL.poly" ];
//mask_poly[] = [ "/discover/nobackup/szhou/test_met/zhining_test/CalNex_LIS32/CONUS.poly" ];
mask_poly[] = [ "/discover/nobackup/szhou/test_met/zhining_april19_2012/WestCoast.poly" ];
//mask_poly[] = [ ];
//
// Specify the name of an ASCII file containing a space-separated list of
// station ID's at which to perform verification. Each station ID specified
// is treated as an individual masking region.
//
// An empty list file name indicates that no station ID masks should be used.
//
// MET_BASE may be used in the path for the station ID mask file name.
//
// e.g. mask_sid = "MET_BASE/data/stations/CONUS.stations";
//
mask_sid = "";
//
// Specify a comma-separated list of values for alpha to be used when computing
// confidence intervals. Values of alpha must be between 0 and 1.
//
// e.g. ci_alpha[] = [ 0.05, 0.10 ];
//
ci_alpha[] = [ 0.05 ];
//
// Specify the method to be used for computing bootstrap confidence intervals.
// The value for this is interpreted as follows:
// (0) Use the BCa interval method (computationally intensive)
// (1) Use the percentile interval method
//
boot_interval = 1;
//
// Specify a proportion between 0 and 1 to define the replicate sample size
// to be used when computing percentile intervals. The replicate sample
// size is set to boot_rep_prop * n, where n is the number of raw data points.
//
// e.g. boot_rep_prop = 0.80;
//
boot_rep_prop = 1.0;
//
// Specify the number of times each set of matched pair data should be
// resampled when computing bootstrap confidence intervals. A value of
// zero disables the computation of bootstrap confidence intervals.
//
// e.g. n_boot_rep = 1000;
//
n_boot_rep = 0;
//
// Specify the name of the random number generator to be used. See the MET
// Users Guide for a list of possible random number generators.
//
boot_rng = "mt19937";
//
// Specify the seed value to be used when computing bootstrap confidence
// intervals. If left unspecified, the seed will change for each run and
// the computed bootstrap confidence intervals will not be reproducible.
//
boot_seed = "";
//
// Specify a comma-separated list of interpolation method(s) to be used
// for comparing the forecast grid to the observation points. String values
// are interpreted as follows:
// MIN = Minimum in the neighborhood
// MAX = Maximum in the neighborhood
// MEDIAN = Median in the neighborhood
// UW_MEAN = Unweighted mean in the neighborhood
// DW_MEAN = Distance-weighted mean in the neighborhood
// LS_FIT = Least-squares fit in the neighborhood
//
// In all cases, vertical interpolation is performed in the natural log
// of pressure of the levels above and below the observation.
//
// e.g. interp_method[] = [ "UW_MEAN", "MEDIAN" ];
//
interp_method[] = [ "DW_MEAN" ];
//
// Specify a comma-separated list of box widths to be used by the
// interpolation techniques listed above. A value of 1 indicates that
// the nearest neighbor approach should be used. For a value of n
// greater than 1, the n*n grid points closest to the observation define
// the neighborhood.
//
// e.g. interp_width[] = [ 1, 3, 5 ];
//
interp_width[] = [ 3 ];
//
// When interpolating, compute a ratio of the number of valid data points
// to the total number of points in the neighborhood. If that ratio is
// less than this threshold, do not include the observation. This
// threshold must be between 0 and 1. Setting this threshold to 1 will
// require that each observation be surrounded by n*n valid forecast
// points.
//
// e.g. interp_thresh = 1.0;
//
interp_thresh = 1.0;
//
// Specify flags to indicate the type of data to be output:
// (1) STAT and FHO Text Files, Forecast, Hit, Observation Rates:
// Total (TOTAL),
// Forecast Rate (F_RATE),
// Hit Rate (H_RATE),
// Observation Rate (O_RATE)
//
// (2) STAT and CTC Text Files, Contingency Table Counts:
// Total (TOTAL),
// Forecast Yes and Observation Yes Count (FY_OY),
// Forecast Yes and Observation No Count (FY_ON),
// Forecast No and Observation Yes Count (FN_OY),
// Forecast No and Observation No Count (FN_ON)
//
// (3) STAT and CTS Text Files, Contingency Table Scores:
// Total (TOTAL),
// Base Rate (BASER),
// Forecast Mean (FMEAN),
// Accuracy (ACC),
// Frequency Bias (FBIAS),
// Probability of Detecting Yes (PODY),
// Probability of Detecting No (PODN),
// Probability of False Detection (POFD),
// False Alarm Ratio (FAR),
// Critical Success Index (CSI),
// Gilbert Skill Score (GSS),
// Hanssen and Kuipers Discriminant (HK),
// Heidke Skill Score (HSS),
// Odds Ratio (ODDS),
// NOTE: All statistics listed above contain parametric and/or
// non-parametric confidence interval limits.
//
// (4) STAT and MCTC Text Files, NxN Multi-Category Contingency Table Counts:
// Total (TOTAL),
// Number of Categories (N_CAT),
// Contingency Table Count columns repeated N_CAT*N_CAT times
//
// (5) STAT and MCTS Text Files, NxN Multi-Category Contingency Table Scores:
// Total (TOTAL),
// Number of Categories (N_CAT),
// Accuracy (ACC),
// Hanssen and Kuipers Discriminant (HK),
// Heidke Skill Score (HSS),
// Gerrity Score (GER),
// NOTE: All statistics listed above contain parametric and/or
// non-parametric confidence interval limits.
//
// (6) STAT and CNT Text Files, Statistics of Continuous Variables:
// Total (TOTAL),
// Forecast Mean (FBAR),
// Forecast Standard Deviation (FSTDEV),
// Observation Mean (OBAR),
// Observation Standard Deviation (OSTDEV),
// Pearson's Correlation Coefficient (PR_CORR),
// Spearman's Rank Correlation Coefficient (SP_CORR),
// Kendall Tau Rank Correlation Coefficient (KT_CORR),
// Number of ranks compared (RANKS),
// Number of tied ranks in the forecast field (FRANK_TIES),
// Number of tied ranks in the observation field (ORANK_TIES),
// Mean Error (ME),
// Standard Deviation of the Error (ESTDEV),
// Multiplicative Bias (MBIAS = FBAR / OBAR),
// Mean Absolute Error (MAE),
// Mean Squared Error (MSE),
// Bias-Corrected Mean Squared Error (BCMSE),
// Root Mean Squared Error (RMSE),
// Percentiles of the Error (E10, E25, E50, E75, E90)
// NOTE: Most statistics listed above contain parametric and/or
// non-parametric confidence interval limits.
//
// (7) STAT and SL1L2 Text Files, Scalar Partial Sums:
// Total (TOTAL),
// Forecast Mean (FBAR),
// = mean(f)
// Observation Mean (OBAR),
// = mean(o)
// Forecast*Observation Product Mean (FOBAR),
// = mean(f*o)
// Forecast Squared Mean (FFBAR),
// = mean(f^2)
// Observation Squared Mean (OOBAR)
// = mean(o^2)
//
// (8) STAT and SAL1L2 Text Files, Scalar Anomaly Partial Sums:
// Total (TOTAL),
// Forecast Anomaly Mean (FABAR),
// = mean(f-c)
// Observation Anomaly Mean (OABAR),
// = mean(o-c)
// Product of Forecast and Observation Anomalies Mean (FOABAR),
// = mean((f-c)*(o-c))
// Forecast Anomaly Squared Mean (FFABAR),
// = mean((f-c)^2)
// Observation Anomaly Squared Mean (OOABAR)
// = mean((o-c)^2)
//
// (9) STAT and VL1L2 Text Files, Vector Partial Sums:
// Total (TOTAL),
// U-Forecast Mean (UFBAR),
// = mean(uf)
// V-Forecast Mean (VFBAR),
// = mean(vf)
// U-Observation Mean (UOBAR),
// = mean(uo)
// V-Observation Mean (VOBAR),
// = mean(vo)
// U-Product Plus V-Product (UVFOBAR),
// = mean(uf*uo+vf*vo)
// U-Forecast Squared Plus V-Forecast Squared (UVFFBAR),
// = mean(uf^2+vf^2)
// U-Observation Squared Plus V-Observation Squared (UVOOBAR)
// = mean(uo^2+vo^2)
//
// (10) STAT and VAL1L2 Text Files, Vector Anomaly Partial Sums:
// U-Forecast Anomaly Mean (UFABAR),
// = mean(uf-uc)
// V-Forecast Anomaly Mean (VFABAR),
// = mean(vf-vc)
// U-Observation Anomaly Mean (UOABAR),
// = mean(uo-uc)
// V-Observation Anomaly Mean (VOABAR),
// = mean(vo-vc)
// U-Anomaly Product Plus V-Anomaly Product (UVFOABAR),
// = mean((uf-uc)*(uo-uc)+(vf-vc)*(vo-vc))
// U-Forecast Anomaly Squared Plus V-Forecast Anomaly Squared (UVFFABAR),
// = mean((uf-uc)^2+(vf-vc)^2)
// U-Observation Anomaly Squared Plus V-Observation Anomaly Squared (UVOOABAR)
// = mean((uo-uc)^2+(vo-vc)^2)
//
// (11) STAT and PCT Text Files, Nx2 Probability Contingency Table Counts:
// Total (TOTAL),
// Number of Forecast Probability Thresholds (N_THRESH),
// Probability Threshold Value (THRESH_i),
// Row Observation Yes Count (OY_i),
// Row Observation No Count (ON_i),
// NOTE: Previous 3 columns repeated for each row in the table.
// Last Probability Threshold Value (THRESH_n)
//
// (12) STAT and PSTD Text Files, Nx2 Probability Contingency Table Scores:
// Total (TOTAL),
// Number of Forecast Probability Thresholds (N_THRESH),
// Base Rate (BASER) with confidence interval limits,
// Reliability (RELIABILITY),
// Resolution (RESOLUTION),
// Uncertainty (UNCERTAINTY),
// Area Under the ROC Curve (ROC_AUC),
// Brier Score (BRIER) with confidence interval limits,
// Probability Threshold Value (THRESH_i)
// NOTE: Previous column repeated for each probability threshold.
//
// (13) STAT and PJC Text Files, Joint/Continuous Statistics of
// Probabilistic Variables:
// Total (TOTAL),
// Number of Forecast Probability Thresholds (N_THRESH),
// Probability Threshold Value (THRESH_i),
// Observation Yes Count Divided by Total (OY_TP_i),
// Observation No Count Divided by Total (ON_TP_i),
// Calibration (CALIBRATION_i),
// Refinement (REFINEMENT_i),
// Likelihood (LIKELIHOOD_i),
// Base Rate (BASER_i),
// NOTE: Previous 7 columns repeated for each row in the table.
// Last Probability Threshold Value (THRESH_n)
//
// (14) STAT and PRC Text Files, ROC Curve Points for
// Probabilistic Variables:
// Total (TOTAL),
// Number of Forecast Probability Thresholds (N_THRESH),
// Probability Threshold Value (THRESH_i),
// Probability of Detecting Yes (PODY_i),
// Probability of False Detection (POFD_i),
// NOTE: Previous 3 columns repeated for each row in the table.
// Last Probability Threshold Value (THRESH_n)
//
// (15) STAT and MPR Text Files, Matched Pair Data:
// Total (TOTAL),
// Index (INDEX),
// Observation Station ID (OBS_SID),
// Observation Latitude (OBS_LAT),
// Observation Longitude (OBS_LON),
// Observation Level (OBS_LVL),
// Observation Elevation (OBS_ELV),
// Forecast Value (FCST),
// Observation Value (OBS),
// Climatological Value (CLIMO)
//
// In the expressions above, f are forecast values, o are observed values,
// and c are climatological values.
//
// Values for these flags are interpreted as follows:
// (0) Do not generate output of this type
// (1) Write output to a STAT file
// (2) Write output to a STAT file and a text file
//
output_flag[] = [ 2, 2, 2, 0, 0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0 ];
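The output_flag array above maps positionally onto the 15 output line types enumerated in the comments (FHO, CTC, CTS, MCTC, MCTS, CNT, SL1L2, SAL1L2, VL1L2, VAL1L2, PCT, PSTD, PJC, PRC, MPR). A small sketch of which line types this particular setting enables (Python used purely for illustration; the list names are taken from the comment block above, not from MET itself):

```python
# Line types in the order the config comments enumerate them (1)-(15).
line_types = ["FHO", "CTC", "CTS", "MCTC", "MCTS", "CNT", "SL1L2", "SAL1L2",
              "VL1L2", "VAL1L2", "PCT", "PSTD", "PJC", "PRC", "MPR"]
# The output_flag values from the config above: 0 = off, 1 = STAT only,
# 2 = STAT plus a separate text file.
output_flag = [2, 2, 2, 0, 0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0]

enabled = [t for t, f in zip(line_types, output_flag) if f > 0]
print(enabled)  # → ['FHO', 'CTC', 'CTS', 'CNT']
```

So with this setting, only the FHO, CTC, CTS, and CNT line types are written, each to both a STAT file and a text file.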
//
// Flag to indicate whether Kendall's Tau and Spearman's Rank Correlation
// Coefficients should be computed. Computing them over large datasets is
// computationally intensive and slows down the runtime execution significantly.
// (0) Do not compute these correlation coefficients
// (1) Compute these correlation coefficients
//
rank_corr_flag = 0;
//
// Specify the GRIB Table 2 parameter table version number to be used
// for interpreting GRIB codes.
// http://www.nco.ncep.noaa.gov/pmb/docs/on388/table2.html
//
grib_ptv = 2;
//
// Directory where temporary files should be written.
//
tmp_dir = "./tmp";
//
// Prefix to be used for the output file names.
//
output_prefix = "";
//
// Indicate a version number for the contents of this configuration file.
// The value should generally not be modified.
//
version = "V3.0";
*********************************************************************************************************
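As a cross-check on the time-window settings, the beg_ds/end_ds pair in the config above selects observations with valid times in [v+beg_ds, v+end_ds]. With the configured values and this case's forecast valid time (20100530_000000), the window works out as follows (Python, illustration only):

```python
from datetime import datetime, timedelta

beg_ds, end_ds = -3600, 3600        # values from the config above, in seconds
v = datetime(2010, 5, 30, 0, 0, 0)  # forecast valid time for this case

# Observation window [v+beg_ds, v+end_ds], printed in MET's YYYYMMDD_HHMMSS form
beg = v + timedelta(seconds=beg_ds)
end = v + timedelta(seconds=end_ds)
print(beg.strftime("%Y%m%d_%H%M%S"), end.strftime("%Y%m%d_%H%M%S"))
# → 20100529_230000 20100530_010000
```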
Please let me know how to fix the problem. In addition, do you have an example script for validating a soil moisture variable?
Thanks,
Shujia
----------------------------------------------------------------
Complete Ticket History
----------------------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #56393] met error in reading records for U
From: Paul Oldenburg
Time: Fri May 11 10:09:27 2012
Shujia,

Can you please send me the model data and obs data that you are using? You can upload the data to our FTP site using the instructions here: http://www.dtcenter.org/met/users/support/met_help.php#ftp. It appears that you are trying to verify p_interp data, correct? If so, there may be a problem in how point_stat handles p_interp data.

Regarding your question about soil moisture, we do not have any example scripts for this. I did find the following discussion, which includes some talk about soil moisture: http://mailman.ucar.edu/pipermail/met_help/2009-August/000919.html (found via the Google search 'met_help archive soil moisture'). Sorry I can't be more help on this topic.

If you have any questions, please let me know.

Paul
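One note on the valid-time error itself: 19700101_000000 is the Unix epoch, which suggests MET decoded an empty or unreadable Times entry for U as time zero rather than the file's actual valid time. Assuming the standard WRF/p_interp Times string format, the timestamp comparison in the error can be sketched as follows (the helper name wrf_to_met_time is ours for illustration, not part of MET):

```python
from datetime import datetime

def wrf_to_met_time(times_str):
    # Parse a WRF/p_interp Times entry, e.g. "2010-05-30_00:00:00",
    # and render it in MET's YYYYMMDD_HHMMSS form.
    dt = datetime.strptime(times_str, "%Y-%m-%d_%H:%M:%S")
    return dt.strftime("%Y%m%d_%H%M%S")

# The requested valid time from the forecast file name:
print(wrf_to_met_time("2010-05-30_00:00:00"))          # → 20100530_000000
# A time of zero decodes to the Unix epoch, matching the error message:
print(datetime(1970, 1, 1).strftime("%Y%m%d_%H%M%S"))  # → 19700101_000000
```

If the Times variable in the p_interp file checks out when inspected directly (e.g. with ncdump), the mismatch points at how point_stat reads it rather than at the data.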
On 05/11/2012 09:48 AM, Shujia Zhou via RT wrote:
>
> Fri May 11 09:48:36 2012: Request 56393 was acted upon.
> Transaction: Ticket created by shujia.zhou-1 at nasa.gov
> Queue: met_help
> Subject: met error in reading records for U
> Owner: Nobody
> Requestors: shujia.zhou-1 at nasa.gov
> Status: new
> Ticket <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=56393>
>
>
> Hi,
>
> We tried to validate U and V and encountered the following errors
after running point_stat
>
>
> /discover/nobackup/szhou/nu-wrf-trunk/nu-wrf_v2beta2-
3.2.1/MET/bin/point_stat wrfout_d01_2010-05-30_00:00:00_PLEV
./data_20100530.t00z.nc PointStatConfig_SFC_WRF_sjz
>
> GSL_RNG_TYPE=mt19937
> GSL_RNG_SEED=1160424783
> Forecast File: wrfout_d01_2010-05-30_00:00:00_PLEV
> Climatology File: none
> Configuration File: PointStatConfig_SFC_WRF_sjz
> Observation File: ./data_20100530.t00z.nc
>
>
--------------------------------------------------------------------------------
>
> Reading records for T2(0,*,*).
> For T2(0,*,*) found 1 forecast levels and 0 climatology levels.
>
>
--------------------------------------------------------------------------------
>
> Reading records for Q2(0,*,*).
> For Q2(0,*,*) found 1 forecast levels and 0 climatology levels.
>
>
--------------------------------------------------------------------------------
>
> Reading records for U(0,*,*).
>
>
> ERROR: read_levels_pinterp() -> error reading U(0,*,*) from the
p_interp NetCDF file wrfout_d01_2010-05-30_00:00:00_PLEV
>
>
>
> ERROR: read_levels_pinterp() -> the valid time for the U(0,*,*)
variable in the p_interp NetCDF file wrfout_d01_2010-05-30_00
> :00:00_PLEV does not match the requested valid time:
(19700101_000000 != 20100530_000000
>
>
> Here is the script for point_stat:
>
>
>
///////////////////////////////////////////////////////////////////////////////
> //
> // Default point_stat configuration file
> //
>
////////////////////////////////////////////////////////////////////////////////
>
> //
> // Specify a name to designate the model being verified. This name
will be
> // written to the second column of the ASCII output generated.
> //
> model = "WRF";
>
> //
> // Beginning and ending time offset values in seconds for
observations
> // to be used. These time offsets are defined in reference to the
> // forecast valid time, v. Observations with a valid time falling
in the
> // window [v+beg_ds, v+end_ds] will be used.
> // These selections are overridden by the command line arguments
> // -obs_valid_beg and -obs_valid_end.
> //
> beg_ds = -3600;
> end_ds = 3600;
>
> //
> // Specify a comma-separated list of fields to be verified. The
forecast and
> // observation fields may be specified separately. If the obs_field
parameter
> // is left blank, it will default to the contents of fcst_field.
> //
> // Each field is specified as a GRIB code or abbreviation followed
by an
> // accumulation or vertical level indicator for GRIB files or as a
variable name
> // followed by a list of dimensions for NetCDF files output from
p_interp or MET.
> //
> // Specifying verification fields for GRIB files:
> // GC/ANNN for accumulation interval NNN
> // GC/ZNNN for vertical level NNN
> // GC/ZNNN-NNN for a range of vertical levels (MSL or AGL)
> // GC/PNNN for pressure level NNN in hPa
> // GC/PNNN-NNN for a range of pressure levels in hPa
> // GC/LNNN for a generic level type
> // GC/RNNN for a specific GRIB record number
> // Where GC is the number of or abbreviation for the grib code
> // to be verified.
> // http://www.nco.ncep.noaa.gov/pmb/docs/on388/table2.html
> //
> // Specifying verification fields for NetCDF files:
> // var_name(i,...,j,*,*) for a single field
> // var_name(i-j,*,*) for a range of fields
> // Where var_name is the name of the NetCDF variable,
> // and i,...,j specifies fixed dimension values,
> // and i-j specifies a range of values for a single dimension,
> // and *,* specifies the two dimensions for the gridded field.
> //
> // NOTE: To verify winds as vectors rather than scalars,
> // specify UGRD (or 33) followed by VGRD (or 34) with the
> // same level values.
> //
> // NOTE: To process a probability field, add "/PROB", such as
"POP/Z0/PROB".
> //
> // e.g. fcst_field[] = [ "SPFH/P500", "TMP/P500" ]; for a GRIB input
> // e.g. fcst_field[] = [ "QVAPOR(0,5,*,*)", "TT(0,5,*,*)" ]; for
NetCDF input
> //
> //sjz what are TMP/Z2 and SPFH/Z2 ??
> fcst_field[] = [ "T2(0,*,*)", "Q2(0,*,*)", "U(0,*,*)", "V(0,*,*)" ];
> //fcst_field[] = [ "T2(0,*,*)"];
> obs_field[] = [ "TMP/Z2", "SPFH/Z2","UGRD/Z10", "VGRD/Z10" ];
> //obs_field[] = [ "TMP/Z2"];
>
> //
> // Specify a comma-separated list of groups of thresholds to be
applied to the
> // fields listed above. Thresholds for the forecast and observation
fields
> // may be specified separately. If the obs_thresh parameter is left
blank,
> // it will default to the contents of fcst_thresh.
> //
> // At least one threshold must be provided for each field listed
above. The
> // lengths of the "fcst_field" and "fcst_thresh" arrays must match,
as must
> // lengths of the "obs_field" and "obs_thresh" arrays. To apply
multiple
> // thresholds to a field, separate the threshold values with a
space.
> //
> // Each threshold must be preceded by a two letter indicator for the
type of
> // thresholding to be performed:
> // 'lt' for less than 'le' for less than or equal to
> // 'eq' for equal to 'ne' for not equal to
> // 'gt' for greater than 'ge' for greater than or equal to
> //
> // NOTE: Thresholds for probabilities must begin with 0.0, end with
1.0,
> // and be preceeded by "ge".
> //
> // e.g. fcst_thresh[] = [ "gt80", "gt273" ];
> //
> // sjz is 300 too high?
> fcst_thresh[] = [ "gt300", "gt0.0", "lt100", "lt100" ];
> //fcst_thresh[] = [ "gt300"];
> obs_thresh[] = [];
>
> //
> // Specify a comma-separated list of thresholds to be used when
computing
> // VL1L2 and VAL1L2 partial sums for winds. The thresholds are
applied to the
> // wind speed values derived from each U/V pair. Only those U/V
pairs which meet
> // the wind speed threshold criteria are retained. If the
obs_wind_thresh
> // parameter is left blank, it will default to the contents of
fcst_wind_thresh.
> //
> // To apply multiple wind speed thresholds, separate the threshold
values with a
> // space. Use "NA" to indicate that no wind speed threshold should
be applied.
> //
> // Each threshold must be preceded by a two letter indicator for the
type of
> // thresholding to be performed:
> // 'lt' for less than 'le' for less than or equal to
> // 'eq' for equal to 'ne' for not equal to
> // 'gt' for greater than 'ge' for greater than or equal to
> // 'NA' for no threshold
> //
> // e.g. fcst_wind_thresh[] = [ "NA", "ge1.0" ];
> //
> fcst_wind_thresh[] = [ "NA" ];
> obs_wind_thresh[] = [];
>
> //
> // Specify a comma-separated list of PrepBufr message types with
which
> // to perform the verification. Statistics will be computed
separately
> // for each message type specified. At least one PrepBufr message
type
> // must be provided.
> // List of valid message types:
> // ADPUPA AIRCAR AIRCFT ADPSFC ERS1DA GOESND GPSIPW
> // MSONET PROFLR QKSWND RASSDA SATEMP SATWND SFCBOG
> // SFCSHP SPSSMI SYNDAT VADWND
> // ANYAIR (= AIRCAR, AIRCFT)
> // ANYSFC (= ADPSFC, SFCSHP, ADPUPA, PROFLR)
> // ONLYSF (= ADPSFC, SFCSHP)
> //
http://www.emc.ncep.noaa.gov/mmb/data_processing/prepbufr.doc/table_1.htm
> //
> // e.g. message_type[] = [ "ADPUPA", "AIRCAR" ];
> //
> //message_type[] = [ "ADPSFC" ];
> message_type[] = [ "ANYSFC" ];
>
> //
> // Specify a comma-separated list of grids to be used in masking the
data over
> // which to perform scoring. An empty list indicates that no
masking grid
> // should be performed. The standard NCEP grids are named "GNNN"
where NNN
> // indicates the three digit grid number. Enter "FULL" to score
over the
> // entire domain.
> // http://www.nco.ncep.noaa.gov/pmb/docs/on388/tableb.html
> //
> // e.g. mask_grid[] = [ "FULL" ];
> //
> mask_grid[] = [ ];
>
> //
> // Specify a comma-separated list of masking regions to be applied.
> // An empty list indicates that no additional masks should be used.
> // The masking regions may be defined in one of 4 ways:
> //
> // (1) An ASCII file containing a lat/lon polygon.
> // Latitude in degrees north and longitude in degrees east.
> // By default, the first and last polygon points are connected.
> // e.g. "MET_BASE/data/poly/EAST.poly" which consists of n
points:
> // "poly_name lat1 lon1 lat2 lon2... latn lonn"
> //
> // (2) The NetCDF output of the gen_poly_mask tool.
> //
> // (3) A NetCDF data file, followed by the name of the NetCDF
variable
> // to be used, and optionally, a threshold to be applied to the
field.
> // e.g. "sample.nc var_name gt0.00"
> //
> // (4) A GRIB data file, followed by a description of the field
> // to be used, and optionally, a threshold to be applied to the
field.
> // e.g. "sample.grb APCP/A3 gt0.00"
> //
> // Any NetCDF or GRIB file used must have the same grid dimensions
as the
> // data being verified.
> //
> // MET_BASE may be used in the path for the files above.
> //
> // e.g. mask_poly[] = [ "MET_BASE/data/poly/EAST.poly",
> // "poly_mask.ncf",
> // "sample.nc APCP",
> // "sample.grb HGT/Z0 gt100.0" ];
> //
> //mask_poly[] = [ "/gpfsm/dnb32/akumar3/NU-WRF/nu-wrf_v2dev4-
3.2.1/MET/anil_test/SPL.poly" ];
> //mask_poly[] = [
"/discover/nobackup/szhou/test_met/zhining_test/CalNex_LIS32/CONUS.poly"
];
> mask_poly[] = [
"/discover/nobackup/szhou/test_met/zhining_april19_2012/WestCoast.poly"
];
> //mask_poly[] = [ ];
> //
> // Specify the name of an ASCII file containing a space-separated
list of
> // station ID's at which to perform verification. Each station ID
specified
> // is treated as an individual masking region.
> //
> // An empty list file name indicates that no station ID masks should
be used.
> //
> // MET_BASE may be used in the path for the station ID mask file
name.
> //
> // e.g. mask_sid = "MET_BASE/data/stations/CONUS.stations";
> //
> mask_sid = "";
>
> //
> // Specify a comma-separated list of values for alpha to be used
when computing
> // confidence intervals. Values of alpha must be between 0 and 1.
> //
> // e.g. ci_alpha[] = [ 0.05, 0.10 ];
> //
> ci_alpha[] = [ 0.05 ];
>
> //
> // Specify the method to be used for computing bootstrap confidence
intervals.
> // The value for this is interpreted as follows:
> // (0) Use the BCa interval method (computationally intensive)
> // (1) Use the percentile interval method
> //
> boot_interval = 1;
>
> //
> // Specify a proportion between 0 and 1 to define the replicate sample size
> // to be used when computing percentile intervals. The replicate sample
> // size is set to boot_rep_prop * n, where n is the number of raw data points.
> //
> // e.g boot_rep_prop = 0.80;
> //
> boot_rep_prop = 1.0;
>
> //
> // Specify the number of times each set of matched pair data should be
> // resampled when computing bootstrap confidence intervals. A value of
> // zero disables the computation of bootstrap confidence intervals.
> //
> // e.g. n_boot_rep = 1000;
> //
> n_boot_rep = 0;
>
> //
> // Specify the name of the random number generator to be used. See the MET
> // Users Guide for a list of possible random number generators.
> //
> boot_rng = "mt19937";
>
> //
> // Specify the seed value to be used when computing bootstrap confidence
> // intervals. If left unspecified, the seed will change for each run and
> // the computed bootstrap confidence intervals will not be reproducible.
> //
> boot_seed = "";
>
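Taken together, the bootstrap settings above (boot_interval = 1 for percentile intervals, boot_rep_prop, n_boot_rep, and a seeded RNG) describe the following resampling procedure. This is a minimal sketch for the sample mean, using Python's stdlib RNG in place of GSL's mt19937; `percentile_ci` is illustrative only:

```python
import random

def percentile_ci(data, n_boot_rep=1000, boot_rep_prop=1.0, alpha=0.05, seed=1):
    """Percentile-method bootstrap CI (boot_interval = 1) for the sample mean."""
    rng = random.Random(seed)                    # fixed seed -> reproducible CIs
    m = max(1, int(boot_rep_prop * len(data)))   # replicate sample size
    stats = sorted(
        sum(rng.choices(data, k=m)) / m          # resample with replacement
        for _ in range(n_boot_rep)
    )
    lo = stats[int((alpha / 2) * n_boot_rep)]
    hi = stats[int((1 - alpha / 2) * n_boot_rep) - 1]
    return lo, hi
```

With n_boot_rep = 0, as in this configuration, no resampling is done and the bootstrap interval columns are left empty.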
> //
> // Specify a comma-separated list of interpolation method(s) to be used
> // for comparing the forecast grid to the observation points. String values
> // are interpreted as follows:
> // MIN = Minimum in the neighborhood
> // MAX = Maximum in the neighborhood
> // MEDIAN = Median in the neighborhood
> // UW_MEAN = Unweighted mean in the neighborhood
> // DW_MEAN = Distance-weighted mean in the neighborhood
> // LS_FIT = Least-squares fit in the neighborhood
> //
> // In all cases, vertical interpolation is performed in the natural log
> // of pressure of the levels above and below the observation.
> //
> // e.g. interp_method[] = [ "UW_MEAN", "MEDIAN" ];
> //
> interp_method[] = [ "DW_MEAN" ];
>
> //
> // Specify a comma-separated list of box widths to be used by the
> // interpolation techniques listed above. A value of 1 indicates that
> // the nearest neighbor approach should be used. For a value of n
> // greater than 1, the n*n grid points closest to the observation define
> // the neighborhood.
> //
> // e.g. interp_width = [ 1, 3, 5 ];
> //
> interp_width[] = [ 3 ];
>
> //
> // When interpolating, compute a ratio of the number of valid data points
> // to the total number of points in the neighborhood. If that ratio is
> // less than this threshold, do not include the observation. This
> // threshold must be between 0 and 1. Setting this threshold to 1 will
> // require that each observation be surrounded by n*n valid forecast
> // points.
> //
> // e.g. interp_thresh = 1.0;
> //
> interp_thresh = 1.0;
>
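The three interpolation settings above combine as follows: take the n*n neighborhood, skip the observation if the valid-data ratio falls below interp_thresh, and otherwise compute the distance-weighted mean. A sketch under the assumption of inverse-distance-squared weights (MET's exact weighting may differ); `dw_mean` is a hypothetical helper:

```python
import math

def dw_mean(nbrhood, obs_x, obs_y, interp_thresh=1.0):
    """DW_MEAN over an n*n neighborhood given as (x, y, value) tuples,
    where value is None for missing forecast points."""
    valid = [(x, y, v) for x, y, v in nbrhood if v is not None]
    if len(valid) / len(nbrhood) < interp_thresh:
        return None                                   # too few valid points: skip obs
    wsum = vsum = 0.0
    for x, y, v in valid:
        d = math.hypot(x - obs_x, y - obs_y) or 1e-6  # guard divide-by-zero
        w = 1.0 / d**2                                # inverse-distance-squared (assumed)
        wsum += w
        vsum += w * v
    return vsum / wsum
```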
> //
> // Specify flags to indicate the type of data to be output:
> // (1) STAT and FHO Text Files, Forecast, Hit, Observation Rates:
> // Total (TOTAL),
> // Forecast Rate (F_RATE),
> // Hit Rate (H_RATE),
> // Observation Rate (O_RATE)
> //
> // (2) STAT and CTC Text Files, Contingency Table Counts:
> // Total (TOTAL),
> // Forecast Yes and Observation Yes Count (FY_OY),
> // Forecast Yes and Observation No Count (FY_ON),
> // Forecast No and Observation Yes Count (FN_OY),
> // Forecast No and Observation No Count (FN_ON)
> //
> // (3) STAT and CTS Text Files, Contingency Table Scores:
> // Total (TOTAL),
> // Base Rate (BASER),
> // Forecast Mean (FMEAN),
> // Accuracy (ACC),
> // Frequency Bias (FBIAS),
> // Probability of Detecting Yes (PODY),
> // Probability of Detecting No (PODN),
> // Probability of False Detection (POFD),
> // False Alarm Ratio (FAR),
> // Critical Success Index (CSI),
> // Gilbert Skill Score (GSS),
> // Hanssen and Kuipers Discriminant (HK),
> // Heidke Skill Score (HSS),
> // Odds Ratio (ODDS),
> // NOTE: All statistics listed above contain parametric and/or
> // non-parametric confidence interval limits.
> //
> // (4) STAT and MCTC Text Files, NxN Multi-Category Contingency Table Counts:
> // Total (TOTAL),
> // Number of Categories (N_CAT),
> // Contingency Table Count columns repeated N_CAT*N_CAT times
> //
> // (5) STAT and MCTS Text Files, NxN Multi-Category Contingency Table Scores:
> // Total (TOTAL),
> // Number of Categories (N_CAT),
> // Accuracy (ACC),
> // Hanssen and Kuipers Discriminant (HK),
> // Heidke Skill Score (HSS),
> // Gerrity Score (GER),
> // NOTE: All statistics listed above contain parametric and/or
> // non-parametric confidence interval limits.
> //
> // (6) STAT and CNT Text Files, Statistics of Continuous Variables:
> // Total (TOTAL),
> // Forecast Mean (FBAR),
> // Forecast Standard Deviation (FSTDEV),
> // Observation Mean (OBAR),
> // Observation Standard Deviation (OSTDEV),
> // Pearson's Correlation Coefficient (PR_CORR),
> // Spearman's Rank Correlation Coefficient (SP_CORR),
> // Kendall Tau Rank Correlation Coefficient (KT_CORR),
> // Number of ranks compared (RANKS),
> // Number of tied ranks in the forecast field (FRANK_TIES),
> // Number of tied ranks in the observation field (ORANK_TIES),
> // Mean Error (ME),
> // Standard Deviation of the Error (ESTDEV),
> // Multiplicative Bias (MBIAS = FBAR / OBAR),
> // Mean Absolute Error (MAE),
> // Mean Squared Error (MSE),
> // Bias-Corrected Mean Squared Error (BCMSE),
> // Root Mean Squared Error (RMSE),
> // Percentiles of the Error (E10, E25, E50, E75, E90)
> // NOTE: Most statistics listed above contain parametric and/or
> // non-parametric confidence interval limits.
> //
> // (7) STAT and SL1L2 Text Files, Scalar Partial Sums:
> // Total (TOTAL),
> // Forecast Mean (FBAR),
> // = mean(f)
> // Observation Mean (OBAR),
> // = mean(o)
> // Forecast*Observation Product Mean (FOBAR),
> // = mean(f*o)
> // Forecast Squared Mean (FFBAR),
> // = mean(f^2)
> // Observation Squared Mean (OOBAR)
> // = mean(o^2)
> //
> // (8) STAT and SAL1L2 Text Files, Scalar Anomaly Partial Sums:
> // Total (TOTAL),
> // Forecast Anomaly Mean (FABAR),
> // = mean(f-c)
> // Observation Anomaly Mean (OABAR),
> // = mean(o-c)
> // Product of Forecast and Observation Anomalies Mean (FOABAR),
> // = mean((f-c)*(o-c))
> // Forecast Anomaly Squared Mean (FFABAR),
> // = mean((f-c)^2)
> // Observation Anomaly Squared Mean (OOABAR)
> // = mean((o-c)^2)
> //
> // (9) STAT and VL1L2 Text Files, Vector Partial Sums:
> // Total (TOTAL),
> // U-Forecast Mean (UFBAR),
> // = mean(uf)
> // V-Forecast Mean (VFBAR),
> // = mean(vf)
> // U-Observation Mean (UOBAR),
> // = mean(uo)
> // V-Observation Mean (VOBAR),
> // = mean(vo)
> // U-Product Plus V-Product (UVFOBAR),
> // = mean(uf*uo+vf*vo)
> // U-Forecast Squared Plus V-Forecast Squared (UVFFBAR),
> // = mean(uf^2+vf^2)
> // U-Observation Squared Plus V-Observation Squared (UVOOBAR)
> // = mean(uo^2+vo^2)
> //
> // (10) STAT and VAL1L2 Text Files, Vector Anomaly Partial Sums:
> // U-Forecast Anomaly Mean (UFABAR),
> // = mean(uf-uc)
> // V-Forecast Anomaly Mean (VFABAR),
> // = mean(vf-vc)
> // U-Observation Anomaly Mean (UOABAR),
> // = mean(uo-uc)
> // V-Observation Anomaly Mean (VOABAR),
> // = mean(vo-vc)
> // U-Anomaly Product Plus V-Anomaly Product (UVFOABAR),
> // = mean((uf-uc)*(uo-uc)+(vf-vc)*(vo-vc))
> // U-Forecast Anomaly Squared Plus V-Forecast Anomaly Squared (UVFFABAR),
> // = mean((uf-uc)^2+(vf-vc)^2)
> // U-Observation Anomaly Squared Plus V-Observation Anomaly Squared (UVOOABAR)
> // = mean((uo-uc)^2+(vo-vc)^2)
> //
> // (11) STAT and PCT Text Files, Nx2 Probability Contingency Table Counts:
> // Total (TOTAL),
> // Number of Forecast Probability Thresholds (N_THRESH),
> // Probability Threshold Value (THRESH_i),
> // Row Observation Yes Count (OY_i),
> // Row Observation No Count (ON_i),
> // NOTE: Previous 3 columns repeated for each row in the table.
> // Last Probability Threshold Value (THRESH_n)
> //
> // (12) STAT and PSTD Text Files, Nx2 Probability Contingency Table Scores:
> // Total (TOTAL),
> // Number of Forecast Probability Thresholds (N_THRESH),
> // Base Rate (BASER) with confidence interval limits,
> // Reliability (RELIABILITY),
> // Resolution (RESOLUTION),
> // Uncertainty (UNCERTAINTY),
> // Area Under the ROC Curve (ROC_AUC),
> // Brier Score (BRIER) with confidence interval limits,
> // Probability Threshold Value (THRESH_i)
> // NOTE: Previous column repeated for each probability threshold.
> //
> // (13) STAT and PJC Text Files, Joint/Continuous Statistics of
> // Probabilistic Variables:
> // Total (TOTAL),
> // Number of Forecast Probability Thresholds (N_THRESH),
> // Probability Threshold Value (THRESH_i),
> // Observation Yes Count Divided by Total (OY_TP_i),
> // Observation No Count Divided by Total (ON_TP_i),
> // Calibration (CALIBRATION_i),
> // Refinement (REFINEMENT_i),
> // Likelihood (LIKELIHOOD_i),
> // Base Rate (BASER_i),
> // NOTE: Previous 7 columns repeated for each row in the table.
> // Last Probability Threshold Value (THRESH_n)
> //
> // (14) STAT and PRC Text Files, ROC Curve Points for
> // Probabilistic Variables:
> // Total (TOTAL),
> // Number of Forecast Probability Thresholds (N_THRESH),
> // Probability Threshold Value (THRESH_i),
> // Probability of Detecting Yes (PODY_i),
> // Probability of False Detection (POFD_i),
> // NOTE: Previous 3 columns repeated for each row in the table.
> // Last Probability Threshold Value (THRESH_n)
> //
> // (15) STAT and MPR Text Files, Matched Pair Data:
> // Total (TOTAL),
> // Index (INDEX),
> // Observation Station ID (OBS_SID),
> // Observation Latitude (OBS_LAT),
> // Observation Longitude (OBS_LON),
> // Observation Level (OBS_LVL),
> // Observation Elevation (OBS_ELV),
> // Forecast Value (FCST),
> // Observation Value (OBS),
> // Climatological Value (CLIMO)
> //
> // In the expressions above, f are forecast values, o are observed values,
> // and c are climatological values.
> //
> // Values for these flags are interpreted as follows:
> // (0) Do not generate output of this type
> // (1) Write output to a STAT file
> // (2) Write output to a STAT file and a text file
> //
> output_flag[] = [ 2, 2, 2, 0, 0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0 ];
>
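The SL1L2 partial sums in (7) are sufficient to reconstruct several of the CNT statistics in (6), e.g. ME = FBAR - OBAR and MSE = FFBAR - 2*FOBAR + OOBAR, since mean((f-o)^2) expands into exactly those terms. A small worked sketch (not MET code):

```python
import math

def stats_from_sl1l2(f, o):
    """Recover ME, MSE and RMSE from the SL1L2 partial sums alone."""
    n = len(f)
    fbar  = sum(f) / n
    obar  = sum(o) / n
    fobar = sum(fi * oi for fi, oi in zip(f, o)) / n
    ffbar = sum(fi * fi for fi in f) / n
    oobar = sum(oi * oi for oi in o) / n
    me  = fbar - obar                    # mean error
    mse = ffbar - 2.0 * fobar + oobar    # mean((f-o)^2) expanded
    return me, mse, math.sqrt(mse)
```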
> //
> // Flag to indicate whether Kendall's Tau and Spearman's Rank Correlation
> // Coefficients should be computed. Computing them over large datasets is
> // computationally intensive and slows down the runtime execution significantly.
> // (0) Do not compute these correlation coefficients
> // (1) Compute these correlation coefficients
> //
> rank_corr_flag = 0;
>
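rank_corr_flag is disabled above because ranking is the expensive step: SP_CORR is just the Pearson correlation computed on ranks, which requires a sort of each field. A sketch that ignores tie handling:

```python
def spearman(f, o):
    """Spearman's rank correlation (SP_CORR): Pearson correlation of the ranks.
    The per-field sort is the costly step on large samples."""
    def ranks(xs):
        order = sorted(range(len(xs)), key=xs.__getitem__)
        r = [0.0] * len(xs)
        for rank, idx in enumerate(order, start=1):
            r[idx] = float(rank)        # no tie correction in this sketch
        return r
    rf, ro = ranks(f), ranks(o)
    n = len(f)
    mf, mo = sum(rf) / n, sum(ro) / n
    cov = sum((a - mf) * (b - mo) for a, b in zip(rf, ro))
    vf = sum((a - mf) ** 2 for a in rf)
    vo = sum((b - mo) ** 2 for b in ro)
    return cov / (vf * vo) ** 0.5
```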
> //
> // Specify the GRIB Table 2 parameter table version number to be used
> // for interpreting GRIB codes.
> // http://www.nco.ncep.noaa.gov/pmb/docs/on388/table2.html
> //
> grib_ptv = 2;
>
> //
> // Directory where temporary files should be written.
> //
> tmp_dir = "./tmp";
>
> //
> // Prefix to be used for the output file names.
> //
> output_prefix = "";
>
> //
> // Indicate a version number for the contents of this configuration file.
> // The value should generally not be modified.
> //
> version = "V3.0";
>
>
>
> *********************************************************************************************************
>
> Please let me know how to fix the problem. In addition, do you have an example script for validating a soil moisture variable?
>
>
> Thanks,
>
>
> Shujia
>
>
>
------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #56393] met error in reading records for U
From: Shujia Zhou
Time: Fri May 11 10:53:15 2012
Hi, Paul:
I have put the relevant files into zhou_may11_2012.
257 "/incoming/irap/met_help/zhou_may11_2012" is current directory.
ftp> ls
227 Entering Passive Mode (128,117,192,211,195,36)
150 Opening ASCII mode data connection for /bin/ls.
total 1659316
-rw-r--r-- 1 120 120      8326 May 11 16:32 PB2NCConfig_G212_SFC_WRF_sjz
-rw-r--r-- 1 120 120     20599 May 11 16:32 PointStatConfig_SFC_WRF_sjz
-rw-r--r-- 1 120 120      5510 May 11 16:33 WestCoast.poly
-rw-r--r-- 1 120 120    186824 May 11 16:20 data_20100530.t00z.nc
-rw-r--r-- 1 120 120       997 May 11 16:51 namelist.pinterp
-rw-r--r-- 1 120 120 872812156 May 11 16:45 wrfout_d01_2010_05_30_00 ----> rename to make ftp work
-rw-r--r-- 1 120 120 824413720 May 11 16:32 wrfout_d01_2010_05_30_00_PLEV ---> rename
Thanks,
Shujia
On May 11, 2012, at 12:09 PM, Paul Oldenburg via RT wrote:
> Shujia,
>
> Can you please send me the model data and obs data that you are using? You can upload the data to our FTP site using
> the instructions here: http://www.dtcenter.org/met/users/support/met_help.php#ftp. It appears that you are trying to
> verify p_interp data, correct? If so, there may be a problem in how point_stat handles p_interp data.
>
> Regarding your question about soil moisture, we do not have any example scripts for this. I did find the following
> discussion that includes talk about soil moisture:
>
> http://mailman.ucar.edu/pipermail/met_help/2009-August/000919.html
>
> I found it using the google search 'met_help archive soil moisture'. Sorry I can't be more help on this topic.
>
> If you have any questions, please let me know.
>
> Paul
>
>
> On 05/11/2012 09:48 AM, Shujia Zhou via RT wrote:
>>
>> Fri May 11 09:48:36 2012: Request 56393 was acted upon.
>> Transaction: Ticket created by shujia.zhou-1 at nasa.gov
>> Queue: met_help
>> Subject: met error in reading records for U
>> Owner: Nobody
>> Requestors: shujia.zhou-1 at nasa.gov
>> Status: new
>> Ticket<URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=56393>
>>
>>
>> Hi,
>>
>> We tried to validate U and V and encountered the following errors after running point_stat
>>
>>
>> /discover/nobackup/szhou/nu-wrf-trunk/nu-wrf_v2beta2-3.2.1/MET/bin/point_stat wrfout_d01_2010-05-30_00:00:00_PLEV ./data_20100530.t00z.nc PointStatConfig_SFC_WRF_sjz
>>
>> GSL_RNG_TYPE=mt19937
>> GSL_RNG_SEED=1160424783
>> Forecast File: wrfout_d01_2010-05-30_00:00:00_PLEV
>> Climatology File: none
>> Configuration File: PointStatConfig_SFC_WRF_sjz
>> Observation File: ./data_20100530.t00z.nc
>>
>>
>> --------------------------------------------------------------------------------
>>
>> Reading records for T2(0,*,*).
>> For T2(0,*,*) found 1 forecast levels and 0 climatology levels.
>>
>>
>> --------------------------------------------------------------------------------
>>
>> Reading records for Q2(0,*,*).
>> For Q2(0,*,*) found 1 forecast levels and 0 climatology levels.
>>
>>
>> --------------------------------------------------------------------------------
>>
>> Reading records for U(0,*,*).
>>
>>
>> ERROR: read_levels_pinterp() -> error reading U(0,*,*) from the p_interp NetCDF file wrfout_d01_2010-05-30_00:00:00_PLEV
>>
>>
>> ERROR: read_levels_pinterp() -> the valid time for the U(0,*,*) variable in the p_interp NetCDF file wrfout_d01_2010-05-30_00:00:00_PLEV does not match the requested valid time: (19700101_000000 != 20100530_000000
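One clue worth noting: 19700101_000000 is simply the Unix epoch, i.e. a timestamp of zero seconds, which typically means the valid time in the PLEV file's Times metadata could not be read, rather than a genuine 1970 valid time:

```python
from datetime import datetime, timezone

# A valid time of zero seconds formats to exactly the value in the error message.
t = datetime.fromtimestamp(0, timezone.utc).strftime("%Y%m%d_%H%M%S")
print(t)  # -> 19700101_000000
```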
>>
>>
>> Here is the script for point_stat:
>>
>>
>>
>> ///////////////////////////////////////////////////////////////////////////////
>> //
>> // Default point_stat configuration file
>> //
>> ////////////////////////////////////////////////////////////////////////////////
>>
>> //
>> // Specify a name to designate the model being verified. This name will be
>> // written to the second column of the ASCII output generated.
>> //
>> model = "WRF";
>>
>> //
>> // Beginning and ending time offset values in seconds for observations
>> // to be used. These time offsets are defined in reference to the
>> // forecast valid time, v. Observations with a valid time falling in the
>> // window [v+beg_ds, v+end_ds] will be used.
>> // These selections are overridden by the command line arguments
>> // -obs_valid_beg and -obs_valid_end.
>> //
>> beg_ds = -3600;
>> end_ds = 3600;
>>
>> //
>> // Specify a comma-separated list of fields to be verified. The forecast and
>> // observation fields may be specified separately. If the obs_field parameter
>> // is left blank, it will default to the contents of fcst_field.
>> //
>> // Each field is specified as a GRIB code or abbreviation followed by an
>> // accumulation or vertical level indicator for GRIB files or as a variable name
>> // followed by a list of dimensions for NetCDF files output from p_interp or MET.
>> //
>> // Specifying verification fields for GRIB files:
>> // GC/ANNN for accumulation interval NNN
>> // GC/ZNNN for vertical level NNN
>> // GC/ZNNN-NNN for a range of vertical levels (MSL or AGL)
>> // GC/PNNN for pressure level NNN in hPa
>> // GC/PNNN-NNN for a range of pressure levels in hPa
>> // GC/LNNN for a generic level type
>> // GC/RNNN for a specific GRIB record number
>> // Where GC is the number of or abbreviation for the grib code
>> // to be verified.
>> // http://www.nco.ncep.noaa.gov/pmb/docs/on388/table2.html
>> //
>> // Specifying verification fields for NetCDF files:
>> // var_name(i,...,j,*,*) for a single field
>> // var_name(i-j,*,*) for a range of fields
>> // Where var_name is the name of the NetCDF variable,
>> // and i,...,j specifies fixed dimension values,
>> // and i-j specifies a range of values for a single dimension,
>> // and *,* specifies the two dimensions for the gridded field.
>> //
>> // NOTE: To verify winds as vectors rather than scalars,
>> // specify UGRD (or 33) followed by VGRD (or 34) with the
>> // same level values.
>> //
>> // NOTE: To process a probability field, add "/PROB", such as "POP/Z0/PROB".
>> //
>> // e.g. fcst_field[] = [ "SPFH/P500", "TMP/P500" ]; for a GRIB input
>> // e.g. fcst_field[] = [ "QVAPOR(0,5,*,*)", "TT(0,5,*,*)" ]; for NetCDF input
>> //
>> //sjz what are TMP/Z2 and SPFH/Z2 ??
>> fcst_field[] = [ "T2(0,*,*)", "Q2(0,*,*)", "U(0,*,*)", "V(0,*,*)" ];
>> //fcst_field[] = [ "T2(0,*,*)"];
>> obs_field[] = [ "TMP/Z2", "SPFH/Z2","UGRD/Z10", "VGRD/Z10" ];
>> //obs_field[] = [ "TMP/Z2"];
>>
>> //
>> // Specify a comma-separated list of groups of thresholds to be applied to the
>> // fields listed above. Thresholds for the forecast and observation fields
>> // may be specified separately. If the obs_thresh parameter is left blank,
>> // it will default to the contents of fcst_thresh.
>> //
>> // At least one threshold must be provided for each field listed above. The
>> // lengths of the "fcst_field" and "fcst_thresh" arrays must match, as must
>> // lengths of the "obs_field" and "obs_thresh" arrays. To apply multiple
>> // thresholds to a field, separate the threshold values with a space.
>> //
>> // Each threshold must be preceded by a two letter indicator for the type of
>> // thresholding to be performed:
>> // 'lt' for less than 'le' for less than or equal to
>> // 'eq' for equal to 'ne' for not equal to
>> // 'gt' for greater than 'ge' for greater than or equal to
>> //
>> // NOTE: Thresholds for probabilities must begin with 0.0, end with 1.0,
>> // and be preceded by "ge".
>> //
>> // e.g. fcst_thresh[] = [ "gt80", "gt273" ];
>> //
>> // sjz is 300 too high?
>> fcst_thresh[] = [ "gt300", "gt0.0", "lt100", "lt100" ];
>> //fcst_thresh[] = [ "gt300"];
>> obs_thresh[] = [];
>>
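The two-letter indicators above map one-to-one onto comparison operators; a quick sketch of how such threshold strings could be interpreted (`parse_thresh` is a hypothetical helper, not MET's parser):

```python
import operator

# Map MET-style two-letter indicators to comparison operators.
_OPS = {"lt": operator.lt, "le": operator.le, "eq": operator.eq,
        "ne": operator.ne, "gt": operator.gt, "ge": operator.ge}

def parse_thresh(s):
    """Turn a string like 'gt300' or 'le0.5' into a predicate on a value."""
    op, val = _OPS[s[:2]], float(s[2:])
    return lambda x: op(x, val)
```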
>> //
>> // Specify a comma-separated list of thresholds to be used when computing
>> // VL1L2 and VAL1L2 partial sums for winds. The thresholds are applied to the
>> // wind speed values derived from each U/V pair. Only those U/V pairs which meet
>> // the wind speed threshold criteria are retained. If the obs_wind_thresh
>> // parameter is left blank, it will default to the contents of fcst_wind_thresh.
>> //
>> // To apply multiple wind speed thresholds, separate the threshold values with a
>> // space. Use "NA" to indicate that no wind speed threshold should be applied.
>> //
>> // Each threshold must be preceded by a two letter indicator for the type of
>> // thresholding to be performed:
>> // 'lt' for less than 'le' for less than or equal to
>> // 'eq' for equal to 'ne' for not equal to
>> // 'gt' for greater than 'ge' for greater than or equal to
>> // 'NA' for no threshold
>> //
>> // e.g. fcst_wind_thresh[] = [ "NA", "ge1.0" ];
>> //
>> fcst_wind_thresh[] = [ "NA" ];
>> obs_wind_thresh[] = [];
>>
>> //
>> // Specify a comma-separated list of PrepBufr message types with which
>> // to perform the verification. Statistics will be computed separately
>> // for each message type specified. At least one PrepBufr message type
>> // must be provided.
>> // List of valid message types:
>> // ADPUPA AIRCAR AIRCFT ADPSFC ERS1DA GOESND GPSIPW
>> // MSONET PROFLR QKSWND RASSDA SATEMP SATWND SFCBOG
>> // SFCSHP SPSSMI SYNDAT VADWND
>> // ANYAIR (= AIRCAR, AIRCFT)
>> // ANYSFC (= ADPSFC, SFCSHP, ADPUPA, PROFLR)
>> // ONLYSF (= ADPSFC, SFCSHP)
>> // http://www.emc.ncep.noaa.gov/mmb/data_processing/prepbufr.doc/table_1.htm
>> //
>> // e.g. message_type[] = [ "ADPUPA", "AIRCAR" ];
>> //
>> //message_type[] = [ "ADPSFC" ];
>> message_type[] = [ "ANYSFC" ];
>>
>> //
>> // Specify a comma-separated list of grids to be used in masking the data over
>> // which to perform scoring. An empty list indicates that no masking grid
>> // should be performed. The standard NCEP grids are named "GNNN" where NNN
>> // indicates the three digit grid number. Enter "FULL" to score over the
>> // entire domain.
>> // http://www.nco.ncep.noaa.gov/pmb/docs/on388/tableb.html
>> //
>> // e.g. mask_grid[] = [ "FULL" ];
>> //
>> mask_grid[] = [ ];
>>
>> //
>> // Specify a comma-separated list of masking regions to be applied.
>> // An empty list indicates that no additional masks should be used.
>> // The masking regions may be defined in one of 4 ways:
>> //
>> // (1) An ASCII file containing a lat/lon polygon.
>> // Latitude in degrees north and longitude in degrees east.
>> // By default, the first and last polygon points are connected.
>> // e.g. "MET_BASE/data/poly/EAST.poly" which consists of n points:
>> // "poly_name lat1 lon1 lat2 lon2... latn lonn"
>> //
>> // (2) The NetCDF output of the gen_poly_mask tool.
>> //
>> // (3) A NetCDF data file, followed by the name of the NetCDF
variable
>> // to be used, and optionally, a threshold to be applied to the
field.
>> // e.g. "sample.nc var_name gt0.00"
>> //
>> // (4) A GRIB data file, followed by a description of the field
>> // to be used, and optionally, a threshold to be applied to the
field.
>> // e.g. "sample.grb APCP/A3 gt0.00"
>> //
>> // Any NetCDF or GRIB file used must have the same grid dimensions
as the
>> // data being verified.
>> //
>> // MET_BASE may be used in the path for the files above.
>> //
>> // e.g. mask_poly[] = [ "MET_BASE/data/poly/EAST.poly",
>> // "poly_mask.ncf",
>> // "sample.nc APCP",
>> // "sample.grb HGT/Z0 gt100.0" ];
>> //
>> //mask_poly[] = [ "/gpfsm/dnb32/akumar3/NU-WRF/nu-wrf_v2dev4-
3.2.1/MET/anil_test/SPL.poly" ];
>> //mask_poly[] = [
"/discover/nobackup/szhou/test_met/zhining_test/CalNex_LIS32/CONUS.poly"
];
>> mask_poly[] = [
"/discover/nobackup/szhou/test_met/zhining_april19_2012/WestCoast.poly"
];
>> //mask_poly[] = [ ];
>> //
>> // Specify the name of an ASCII file containing a space-separated
list of
>> // station ID's at which to perform verification. Each station ID
specified
>> // is treated as an individual masking region.
>> //
>> // An empty list file name indicates that no station ID masks
should be used.
>> //
>> // MET_BASE may be used in the path for the station ID mask file
name.
>> //
>> // e.g. mask_sid = "MET_BASE/data/stations/CONUS.stations";
>> //
>> mask_sid = "";
>>
>> //
>> // Specify a comma-separated list of values for alpha to be used
when computing
>> // confidence intervals. Values of alpha must be between 0 and 1.
>> //
>> // e.g. ci_alpha[] = [ 0.05, 0.10 ];
>> //
>> ci_alpha[] = [ 0.05 ];
>>
>> //
>> // Specify the method to be used for computing bootstrap confidence
intervals.
>> // The value for this is interpreted as follows:
>> // (0) Use the BCa interval method (computationally intensive)
>> // (1) Use the percentile interval method
>> //
>> boot_interval = 1;
>>
>> //
>> // Specify a proportion between 0 and 1 to define the replicate
sample size
>> // to be used when computing percentile intervals. The replicate
sample
>> // size is set to boot_rep_prop * n, where n is the number of raw
data points.
>> //
>> // e.g boot_rep_prop = 0.80;
>> //
>> boot_rep_prop = 1.0;
>>
>> //
>> // Specify the number of times each set of matched pair data should
be
>> // resampled when computing bootstrap confidence intervals. A
value of
>> // zero disables the computation of bootstrap condifence intervals.
>> //
>> // e.g. n_boot_rep = 1000;
>> //
>> n_boot_rep = 0;
>>
>> //
>> // Specify the name of the random number generator to be used. See
the MET
>> // Users Guide for a list of possible random number generators.
>> //
>> boot_rng = "mt19937";
>>
>> //
>> // Specify the seed value to be used when computing bootstrap
confidence
>> // intervals. If left unspecified, the seed will change for each
run and
>> // the computed bootstrap confidence intervals will not be
reproducable.
>> //
>> boot_seed = "";
>>
>> //
>> // Specify a comma-separated list of interpolation method(s) to be
used
>> // for comparing the forecast grid to the observation points.
String values
>> // are interpreted as follows:
>> // MIN = Minimum in the neighborhood
>> // MAX = Maximum in the neighborhood
>> // MEDIAN = Median in the neighborhood
>> // UW_MEAN = Unweighted mean in the neighborhood
>> // DW_MEAN = Distance-weighted mean in the neighborhood
>> // LS_FIT = Least-squares fit in the neighborhood
>> //
>> // In all cases, vertical interpolation is performed in the natural
log
>> // of pressure of the levels above and below the observation.
>> //
>> // e.g. interp_method[] = [ "UW_MEAN", "MEDIAN" ];
>> //
>> interp_method[] = [ "DW_MEAN" ];
>>
>> //
>> // Specify a comma-separated list of box widths to be used by the
>> // interpolation techniques listed above. A value of 1 indicates
that
>> // the nearest neighbor approach should be used. For a value of n
>> // greater than 1, the n*n grid points closest to the observation
define
>> // the neighborhood.
>> //
>> // e.g. interp_width = [ 1, 3, 5 ];
>> //
>> interp_width[] = [ 3 ];
>>
>> //
>> // When interpolating, compute a ratio of the number of valid data
points
>> // to the total number of points in the neighborhood. If that
ratio is
>> // less than this threshold, do not include the observation. This
>> // threshold must be between 0 and 1. Setting this threshold to 1
will
>> // require that each observation be surrounded by n*n valid
forecast
>> // points.
>> //
>> // e.g. interp_thresh = 1.0;
>> //
>> interp_thresh = 1.0;
>>
>> //
>> // Specify flags to indicate the type of data to be output:
>> // (1) STAT and FHO Text Files, Forecast, Hit, Observation
Rates:
>> // Total (TOTAL),
>> // Forecast Rate (F_RATE),
>> // Hit Rate (H_RATE),
>> // Observation Rate (O_RATE)
>> //
>> // (2) STAT and CTC Text Files, Contingency Table Counts:
>> // Total (TOTAL),
>> // Forecast Yes and Observation Yes Count (FY_OY),
>> // Forecast Yes and Observation No Count (FY_ON),
>> // Forecast No and Observation Yes Count (FN_OY),
>> // Forecast No and Observation No Count (FN_ON)
>> //
>> // (3) STAT and CTS Text Files, Contingency Table Scores:
>> // Total (TOTAL),
>> // Base Rate (BASER),
>> // Forecast Mean (FMEAN),
>> // Accuracy (ACC),
>> // Frequency Bias (FBIAS),
>> // Probability of Detecting Yes (PODY),
>> // Probability of Detecting No (PODN),
>> // Probability of False Detection (POFD),
>> // False Alarm Ratio (FAR),
>> // Critical Success Index (CSI),
>> // Gilbert Skill Score (GSS),
>> // Hanssen and Kuipers Discriminant (HK),
>> // Heidke Skill Score (HSS),
>> // Odds Ratio (ODDS),
>> // NOTE: All statistics listed above contain parametric
and/or
>> // non-parametric confidence interval limits.
>> //
>> // (4) STAT and MCTC Text Files, NxN Multi-Category Contingency
Table Counts:
>> // Total (TOTAL),
>> // Number of Categories (N_CAT),
>> // Contingency Table Count columns repeated N_CAT*N_CAT
times
>> //
>> // (5) STAT and MCTS Text Files, NxN Multi-Category Contingency
Table Scores:
>> // Total (TOTAL),
>> // Number of Categories (N_CAT),
>> // Accuracy (ACC),
>> // Hanssen and Kuipers Discriminant (HK),
>> // Heidke Skill Score (HSS),
>> // Gerrity Score (GER),
>> // NOTE: All statistics listed above contain parametric
and/or
>> // non-parametric confidence interval limits.
>> //
>> // (6) STAT and CNT Text Files, Statistics of Continuous
Variables:
>> // Total (TOTAL),
>> // Forecast Mean (FBAR),
>> // Forecast Standard Deviation (FSTDEV),
>> // Observation Mean (OBAR),
>> // Observation Standard Deviation (OSTDEV),
>> // Pearson's Correlation Coefficient (PR_CORR),
>> // Spearman's Rank Correlation Coefficient (SP_CORR),
>> // Kendall Tau Rank Correlation Coefficient (KT_CORR),
>> // Number of ranks compared (RANKS),
>> // Number of tied ranks in the forecast field
(FRANK_TIES),
>> // Number of tied ranks in the observation field
(ORANK_TIES),
>> // Mean Error (ME),
>> // Standard Deviation of the Error (ESTDEV),
>> // Multiplicative Bias (MBIAS = FBAR - OBAR),
>> // Mean Absolute Error (MAE),
>> // Mean Squared Error (MSE),
>> // Bias-Corrected Mean Squared Error (BCMSE),
>> // Root Mean Squared Error (RMSE),
>> // Percentiles of the Error (E10, E25, E50, E75, E90)
>> // NOTE: Most statistics listed above contain parametric
and/or
>> // non-parametric confidence interval limits.
>> //
>> // (7) STAT and SL1L2 Text Files, Scalar Partial Sums:
>> // Total (TOTAL),
>> // Forecast Mean (FBAR),
>> // = mean(f)
>> // Observation Mean (OBAR),
>> // = mean(o)
>> // Forecast*Observation Product Mean (FOBAR),
>> // = mean(f*o)
>> // Forecast Squared Mean (FFBAR),
>> // = mean(f^2)
>> // Observation Squared Mean (OOBAR)
>> // = mean(o^2)
>> //
>> // (8) STAT and SAL1L2 Text Files, Scalar Anomaly Partial Sums:
>> // Total (TOTAL),
>> // Forecast Anomaly Mean (FABAR),
>> // = mean(f-c)
>> // Observation Anomaly Mean (OABAR),
>> // = mean(o-c)
>> // Product of Forecast and Observation Anomalies Mean
(FOABAR),
>> // = mean((f-c)*(o-c))
>> // Forecast Anomaly Squared Mean (FFABAR),
>> // = mean((f-c)^2)
>> // Observation Anomaly Squared Mean (OOABAR)
>> // = mean((o-c)^2)
>> //
>> // (9) STAT and VL1L2 Text Files, Vector Partial Sums:
>> // Total (TOTAL),
>> // U-Forecast Mean (UFBAR),
>> // = mean(uf)
>> // V-Forecast Mean (VFBAR),
>> // = mean(vf)
>> // U-Observation Mean (UOBAR),
>> // = mean(uo)
>> // V-Observation Mean (VOBAR),
>> // = mean(vo)
>> // U-Product Plus V-Product (UVFOBAR),
>> // = mean(uf*uo+vf*vo)
>> // U-Forecast Squared Plus V-Forecast Squared (UVFFBAR),
>> // = mean(uf^2+vf^2)
>> // U-Observation Squared Plus V-Observation Squared
(UVOOBAR)
>> // = mean(uo^2+vo^2)
>> //
>> // (10) STAT and VAL1L2 Text Files, Vector Anomaly Partial Sums:
>> // U-Forecast Anomaly Mean (UFABAR),
>> // = mean(uf-uc)
>> // V-Forecast Anomaly Mean (VFABAR),
>> // = mean(vf-vc)
>> // U-Observation Anomaly Mean (UOABAR),
>> // = mean(uo-uc)
>> // V-Observation Anomaly Mean (VOABAR),
>> // = mean(vo-vc)
>> // U-Anomaly Product Plus V-Anomaly Product (UVFOABAR),
>> // = mean((uf-uc)*(uo-uc)+(vf-vc)*(vo-vc))
>> // U-Forecast Anomaly Squared Plus V-Forecast Anomaly
Squared (UVFFABAR),
>> // = mean((uf-uc)^2+(vf-vc)^2)
>> // U-Observation Anomaly Squared Plus V-Observation
Anomaly Squared (UVOOABAR)
>> // = mean((uo-uc)^2+(vo-vc)^2)
>> //
>> // (11) STAT and PCT Text Files, Nx2 Probability Contingency
Table Counts:
>> // Total (TOTAL),
>> // Number of Forecast Probability Thresholds (N_THRESH),
>> // Probability Threshold Value (THRESH_i),
>> // Row Observation Yes Count (OY_i),
>> // Row Observation No Count (ON_i),
>> // NOTE: Previous 3 columns repeated for each row in the
table.
>> // Last Probability Threshold Value (THRESH_n)
>> //
>> // (12) STAT and PSTD Text Files, Nx2 Probability Contingency
Table Scores:
>> // Total (TOTAL),
>> // Number of Forecast Probability Thresholds (N_THRESH),
>> // Base Rate (BASER) with confidence interval limits,
>> // Reliability (RELIABILITY),
>> // Resolution (RESOLUTION),
>> // Uncertainty (UNCERTAINTY),
>> // Area Under the ROC Curve (ROC_AUC),
>> // Brier Score (BRIER) with confidence interval limits,
>> // Probability Threshold Value (THRESH_i)
>> // NOTE: Previous column repeated for each probability
threshold.
>> //
>> // (13) STAT and PJC Text Files, Joint/Continuous Statistics of
>> // Probabilistic Variables:
>> // Total (TOTAL),
>> // Number of Forecast Probability Thresholds (N_THRESH),
>> // Probability Threshold Value (THRESH_i),
>> // Observation Yes Count Divided by Total (OY_TP_i),
>> // Observation No Count Divided by Total (ON_TP_i),
>> // Calibration (CALIBRATION_i),
>> // Refinement (REFINEMENT_i),
>> //             Likelihood (LIKELIHOOD_i),
>> // Base Rate (BASER_i),
>> //             NOTE: Previous 7 columns repeated for each row in the table.
>> // Last Probability Threshold Value (THRESH_n)
>> //
>> // (14) STAT and PRC Text Files, ROC Curve Points for
>> // Probabilistic Variables:
>> // Total (TOTAL),
>> // Number of Forecast Probability Thresholds (N_THRESH),
>> // Probability Threshold Value (THRESH_i),
>> // Probability of Detecting Yes (PODY_i),
>> // Probability of False Detection (POFD_i),
>> //             NOTE: Previous 3 columns repeated for each row in the table.
>> // Last Probability Threshold Value (THRESH_n)
>> //
>> // (15) STAT and MPR Text Files, Matched Pair Data:
>> // Total (TOTAL),
>> // Index (INDEX),
>> // Observation Station ID (OBS_SID),
>> // Observation Latitude (OBS_LAT),
>> // Observation Longitude (OBS_LON),
>> // Observation Level (OBS_LVL),
>> // Observation Elevation (OBS_ELV),
>> // Forecast Value (FCST),
>> // Observation Value (OBS),
>> // Climatological Value (CLIMO)
>> //
>> //    In the expressions above, f are forecast values, o are observed values,
>> // and c are climatological values.
>> //
>> // Values for these flags are interpreted as follows:
>> // (0) Do not generate output of this type
>> // (1) Write output to a STAT file
>> // (2) Write output to a STAT file and a text file
>> //
>> output_flag[] = [ 2, 2, 2, 0, 0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0 ];
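The fifteen positions of output_flag[] map one-to-one onto the numbered line types (1)-(15) above; a small sketch of that mapping using the flag values from this config (line-type abbreviations taken from the comments; illustrative only):

```python
# Positions in output_flag[] follow the numbered line types (1)-(15) above.
LINE_TYPES = ["FHO", "CTC", "CTS", "MCTC", "MCTS", "CNT", "SL1L2", "SAL1L2",
              "VL1L2", "VAL1L2", "PCT", "PSTD", "PJC", "PRC", "MPR"]
output_flag = [2, 2, 2, 0, 0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0]

# A flag of 1 or 2 enables the line type (2 also writes a separate text file).
enabled = [t for t, f in zip(LINE_TYPES, output_flag) if f > 0]
print(enabled)  # ['FHO', 'CTC', 'CTS', 'CNT']
```

So this configuration produces FHO, CTC, CTS, and CNT output, each as both STAT and text files.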
>>
>> //
>> // Flag to indicate whether Kendall's Tau and Spearman's Rank Correlation
>> // Coefficients should be computed. Computing them over large datasets is
>> // computationally intensive and slows down the runtime execution significantly.
>> // (0) Do not compute these correlation coefficients
>> // (1) Compute these correlation coefficients
>> //
>> rank_corr_flag = 0;
>>
>> //
>> // Specify the GRIB Table 2 parameter table version number to be used
>> // for interpreting GRIB codes.
>> // http://www.nco.ncep.noaa.gov/pmb/docs/on388/table2.html
>> //
>> grib_ptv = 2;
>>
>> //
>> // Directory where temporary files should be written.
>> //
>> tmp_dir = "./tmp";
>>
>> //
>> // Prefix to be used for the output file names.
>> //
>> output_prefix = "";
>>
>> //
>> // Indicate a version number for the contents of this configuration file.
>> // The value should generally not be modified.
>> //
>> version = "V3.0";
>>
>>
>>
*********************************************************************************************************
>>
>> Please let me know how to fix the problem. In addition, do you have an exemplary script for validating the soil moisture variable?
>>
>>
>> Thanks,
>>
>>
>> Shujia
>>
>>
>>
>
>
------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #56393] met error in reading records for U
From: Paul Oldenburg
Time: Fri May 11 12:13:28 2012
Shujia,
We tried to get point_stat to verify UU and VV (instead of U and V),
but when we saw the variable dimensions, we
realized that MET does not support verification on a staggered grid:
float UU(Time, num_metgrid_levels, south_north, west_east_stag) ;
If possible, we recommend that you use the WRF Unified Post Processor
(UPP -
http://www.dtcenter.org/wrf-nmm/users/overview/upp_overview.php) to
post-process your WRF output files. MET provides
much better support for verifying the GRIB output of UPP, as opposed
to the output of the p_interp post-processor. I'm
sorry I can't be more help. Please let me know if you have any
questions.
Paul
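The staggering Paul describes shows up directly in the variable's dimension names; a minimal sketch of the check (pure string logic, with dimension tuples taken from the ncdump line quoted above):

```python
def is_staggered(dims):
    """A WRF variable sits on a staggered grid if any of its
    dimension names carries the '_stag' suffix."""
    return any(d.endswith("_stag") for d in dims)

# Dimensions of UU as shown in the thread (staggered in west_east):
uu_dims = ("Time", "num_metgrid_levels", "south_north", "west_east_stag")
# A mass-point variable such as T2 has no staggered dimensions:
t2_dims = ("Time", "south_north", "west_east")

print(is_staggered(uu_dims))  # True  -> not verifiable by MET directly
print(is_staggered(t2_dims))  # False
```

Running a check like this over a file header is a quick way to see which fields MET can and cannot verify.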
On 05/11/2012 10:53 AM, Shujia Zhou via RT wrote:
>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=56393>
>
> Hi, Paul:
>
> I have put the relevant files into zhou_may11_2012.
>
>
>
> 257 "/incoming/irap/met_help/zhou_may11_2012" is current directory.
> ftp> ls
> 227 Entering Passive Mode (128,117,192,211,195,36)
> 150 Opening ASCII mode data connection for /bin/ls.
> total 1659316
> -rw-r--r-- 1 120 120 8326 May 11 16:32 PB2NCConfig_G212_SFC_WRF_sjz
> -rw-r--r-- 1 120 120 20599 May 11 16:32 PointStatConfig_SFC_WRF_sjz
> -rw-r--r-- 1 120 120 5510 May 11 16:33 WestCoast.poly
> -rw-r--r-- 1 120 120 186824 May 11 16:20 data_20100530.t00z.nc
> -rw-r--r-- 1 120 120 997 May 11 16:51 namelist.pinterp
> -rw-r--r-- 1 120 120 872812156 May 11 16:45 wrfout_d01_2010_05_30_00 ---->rename to make ftp work
> -rw-r--r-- 1 120 120 824413720 May 11 16:32 wrfout_d01_2010_05_30_00_PLEV --->rename
>
>
>
> Thanks,
>
>
> Shujia
>
> On May 11, 2012, at 12:09 PM, Paul Oldenburg via RT wrote:
>
>> Shujia,
>>
>> Can you please send me the model data and obs data that you are using?
>> You can upload the data to our FTP site using the instructions here:
>> http://www.dtcenter.org/met/users/support/met_help.php#ftp. It appears
>> that you are trying to verify p_interp data, correct? If so, there may
>> be a problem in how point_stat handles p_interp data.
>>
>> Regarding your question about soil moisture, we do not have any example
>> scripts for this. I did find the following discussion that includes talk
>> about soil moisture:
>>
>> http://mailman.ucar.edu/pipermail/met_help/2009-August/000919.html
>>
>> I found it using the google search 'met_help archive soil moisture'.
>> Sorry I can't be more help on this topic.
>>
>> If you have any questions, please let me know.
>>
>> Paul
>>
>>
>> On 05/11/2012 09:48 AM, Shujia Zhou via RT wrote:
>>>
>>> Fri May 11 09:48:36 2012: Request 56393 was acted upon.
>>> Transaction: Ticket created by shujia.zhou-1 at nasa.gov
>>> Queue: met_help
>>> Subject: met error in reading records for U
>>> Owner: Nobody
>>> Requestors: shujia.zhou-1 at nasa.gov
>>> Status: new
>>> Ticket<URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=56393>
>>>
>>>
>>> Hi,
>>>
>>> We tried to validate U and V and encountered the following errors after running point_stat
>>>
>>>
>>> /discover/nobackup/szhou/nu-wrf-trunk/nu-wrf_v2beta2-3.2.1/MET/bin/point_stat wrfout_d01_2010-05-30_00:00:00_PLEV ./data_20100530.t00z.nc PointStatConfig_SFC_WRF_sjz
>>>
>>> GSL_RNG_TYPE=mt19937
>>> GSL_RNG_SEED=1160424783
>>> Forecast File: wrfout_d01_2010-05-30_00:00:00_PLEV
>>> Climatology File: none
>>> Configuration File: PointStatConfig_SFC_WRF_sjz
>>> Observation File: ./data_20100530.t00z.nc
>>>
>>>
>>> --------------------------------------------------------------------------------
>>>
>>> Reading records for T2(0,*,*).
>>> For T2(0,*,*) found 1 forecast levels and 0 climatology levels.
>>>
>>>
>>> --------------------------------------------------------------------------------
>>>
>>> Reading records for Q2(0,*,*).
>>> For Q2(0,*,*) found 1 forecast levels and 0 climatology levels.
>>>
>>>
>>> --------------------------------------------------------------------------------
>>>
>>> Reading records for U(0,*,*).
>>>
>>>
>>> ERROR: read_levels_pinterp() -> error reading U(0,*,*) from the p_interp NetCDF file wrfout_d01_2010-05-30_00:00:00_PLEV
>>>
>>>
>>>
>>> ERROR: read_levels_pinterp() -> the valid time for the U(0,*,*) variable in the p_interp NetCDF file wrfout_d01_2010-05-30_00:00:00_PLEV does not match the requested valid time: (19700101_000000 != 20100530_000000)
>>>
>>>
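The mismatch reported above compares the valid time MET parsed from the file against the requested valid time; 19700101_000000 is the Unix epoch, which usually means the file's Times entry could not be read or parsed at all. For reference, a sketch of the conversion from a WRF Times string to MET's timestamp form (stdlib only; illustrative, not MET's actual code):

```python
from datetime import datetime

def wrf_times_to_met(ts: str) -> str:
    """Convert a WRF Times entry like '2010-05-30_00:00:00'
    to MET's YYYYMMDD_HHMMSS timestamp form."""
    return datetime.strptime(ts, "%Y-%m-%d_%H:%M:%S").strftime("%Y%m%d_%H%M%S")

print(wrf_times_to_met("2010-05-30_00:00:00"))  # 20100530_000000
```

If the Times variable in the p_interp output holds a string in an unexpected layout, a parse like this fails and the time falls back to the epoch, producing exactly the 19700101_000000 != 20100530_000000 comparison seen here.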
>>> Here is the script for point_stat:
>>>
>>>
>>>
>>> ///////////////////////////////////////////////////////////////////////////////
>>> //
>>> // Default point_stat configuration file
>>> //
>>>
>>> ////////////////////////////////////////////////////////////////////////////////
>>>
>>> //
>>> // Specify a name to designate the model being verified. This name will be
>>> // written to the second column of the ASCII output generated.
>>> //
>>> model = "WRF";
>>>
>>> //
>>> // Beginning and ending time offset values in seconds for observations
>>> // to be used. These time offsets are defined in reference to the
>>> // forecast valid time, v. Observations with a valid time falling in the
>>> // window [v+beg_ds, v+end_ds] will be used.
>>> // These selections are overridden by the command line arguments
>>> // -obs_valid_beg and -obs_valid_end.
>>> //
>>> beg_ds = -3600;
>>> end_ds = 3600;
>>>
>>> //
>>> // Specify a comma-separated list of fields to be verified. The forecast and
>>> // observation fields may be specified separately. If the obs_field parameter
>>> // is left blank, it will default to the contents of fcst_field.
>>> //
>>> // Each field is specified as a GRIB code or abbreviation followed by an
>>> // accumulation or vertical level indicator for GRIB files or as a variable name
>>> // followed by a list of dimensions for NetCDF files output from p_interp or MET.
>>> //
>>> // Specifying verification fields for GRIB files:
>>> // GC/ANNN for accumulation interval NNN
>>> // GC/ZNNN for vertical level NNN
>>> // GC/ZNNN-NNN for a range of vertical levels (MSL or AGL)
>>> // GC/PNNN for pressure level NNN in hPa
>>> // GC/PNNN-NNN for a range of pressure levels in hPa
>>> // GC/LNNN for a generic level type
>>> // GC/RNNN for a specific GRIB record number
>>> // Where GC is the number of or abbreviation for the grib code
>>> // to be verified.
>>> // http://www.nco.ncep.noaa.gov/pmb/docs/on388/table2.html
>>> //
>>> // Specifying verification fields for NetCDF files:
>>> // var_name(i,...,j,*,*) for a single field
>>> // var_name(i-j,*,*) for a range of fields
>>> // Where var_name is the name of the NetCDF variable,
>>> // and i,...,j specifies fixed dimension values,
>>> // and i-j specifies a range of values for a single dimension,
>>> // and *,* specifies the two dimensions for the gridded field.
>>> //
>>> // NOTE: To verify winds as vectors rather than scalars,
>>> // specify UGRD (or 33) followed by VGRD (or 34) with the
>>> // same level values.
>>> //
>>> // NOTE: To process a probability field, add "/PROB", such as "POP/Z0/PROB".
>>> //
>>> // e.g. fcst_field[] = [ "SPFH/P500", "TMP/P500" ]; for GRIB input
>>> // e.g. fcst_field[] = [ "QVAPOR(0,5,*,*)", "TT(0,5,*,*)" ]; for NetCDF input
>>> //
>>> //sjz what are TMP/Z2 and SPFH/Z2 ??
>>> fcst_field[] = [ "T2(0,*,*)", "Q2(0,*,*)", "U(0,*,*)", "V(0,*,*)" ];
>>> //fcst_field[] = [ "T2(0,*,*)"];
>>> obs_field[] = [ "TMP/Z2", "SPFH/Z2","UGRD/Z10", "VGRD/Z10" ];
>>> //obs_field[] = [ "TMP/Z2"];
>>>
>>> //
>>> // Specify a comma-separated list of groups of thresholds to be applied to the
>>> // fields listed above. Thresholds for the forecast and observation fields
>>> // may be specified separately. If the obs_thresh parameter is left blank,
>>> // it will default to the contents of fcst_thresh.
>>> //
>>> // At least one threshold must be provided for each field listed above. The
>>> // lengths of the "fcst_field" and "fcst_thresh" arrays must match, as must
>>> // the lengths of the "obs_field" and "obs_thresh" arrays. To apply multiple
>>> // thresholds to a field, separate the threshold values with a space.
>>> //
>>> // Each threshold must be preceded by a two letter indicator for the type of
>>> // thresholding to be performed:
>>> // 'lt' for less than 'le' for less than or equal to
>>> // 'eq' for equal to 'ne' for not equal to
>>> // 'gt' for greater than 'ge' for greater than or equal to
>>> //
>>> // NOTE: Thresholds for probabilities must begin with 0.0, end with 1.0,
>>> // and be preceded by "ge".
>>> //
>>> // e.g. fcst_thresh[] = [ "gt80", "gt273" ];
>>> //
>>> // sjz is 300 too high?
>>> fcst_thresh[] = [ "gt300", "gt0.0", "lt100", "lt100" ];
>>> //fcst_thresh[] = [ "gt300"];
>>> obs_thresh[] = [];
>>>
>>> //
>>> // Specify a comma-separated list of thresholds to be used when computing
>>> // VL1L2 and VAL1L2 partial sums for winds. The thresholds are applied to the
>>> // wind speed values derived from each U/V pair. Only those U/V pairs which meet
>>> // the wind speed threshold criteria are retained. If the obs_wind_thresh
>>> // parameter is left blank, it will default to the contents of fcst_wind_thresh.
>>> //
>>> // To apply multiple wind speed thresholds, separate the threshold values with a
>>> // space. Use "NA" to indicate that no wind speed threshold should be applied.
>>> //
>>> // Each threshold must be preceded by a two letter indicator for the type of
>>> // thresholding to be performed:
>>> // 'lt' for less than 'le' for less than or equal to
>>> // 'eq' for equal to 'ne' for not equal to
>>> // 'gt' for greater than 'ge' for greater than or equal to
>>> // 'NA' for no threshold
>>> //
>>> // e.g. fcst_wind_thresh[] = [ "NA", "ge1.0" ];
>>> //
>>> fcst_wind_thresh[] = [ "NA" ];
>>> obs_wind_thresh[] = [];
>>>
>>> //
>>> // Specify a comma-separated list of PrepBufr message types with which
>>> // to perform the verification. Statistics will be computed separately
>>> // for each message type specified. At least one PrepBufr message type
>>> // must be provided.
>>> // List of valid message types:
>>> // ADPUPA AIRCAR AIRCFT ADPSFC ERS1DA GOESND GPSIPW
>>> // MSONET PROFLR QKSWND RASSDA SATEMP SATWND SFCBOG
>>> // SFCSHP SPSSMI SYNDAT VADWND
>>> // ANYAIR (= AIRCAR, AIRCFT)
>>> // ANYSFC (= ADPSFC, SFCSHP, ADPUPA, PROFLR)
>>> // ONLYSF (= ADPSFC, SFCSHP)
>>> // http://www.emc.ncep.noaa.gov/mmb/data_processing/prepbufr.doc/table_1.htm
>>> //
>>> // e.g. message_type[] = [ "ADPUPA", "AIRCAR" ];
>>> //
>>> //message_type[] = [ "ADPSFC" ];
>>> message_type[] = [ "ANYSFC" ];
>>>
>>> //
>>> // Specify a comma-separated list of grids to be used in masking the data over
>>> // which to perform scoring. An empty list indicates that no masking grid
>>> // should be applied. The standard NCEP grids are named "GNNN" where NNN
>>> // indicates the three digit grid number. Enter "FULL" to score over the
>>> // entire domain.
>>> // http://www.nco.ncep.noaa.gov/pmb/docs/on388/tableb.html
>>> //
>>> // e.g. mask_grid[] = [ "FULL" ];
>>> //
>>> mask_grid[] = [ ];
>>>
>>> //
>>> // Specify a comma-separated list of masking regions to be applied.
>>> // An empty list indicates that no additional masks should be used.
>>> // The masking regions may be defined in one of 4 ways:
>>> //
>>> // (1) An ASCII file containing a lat/lon polygon.
>>> // Latitude in degrees north and longitude in degrees east.
>>> // By default, the first and last polygon points are connected.
>>> // e.g. "MET_BASE/data/poly/EAST.poly" which consists of n points:
>>> // "poly_name lat1 lon1 lat2 lon2... latn lonn"
>>> //
>>> // (2) The NetCDF output of the gen_poly_mask tool.
>>> //
>>> // (3) A NetCDF data file, followed by the name of the NetCDF variable
>>> // to be used, and optionally, a threshold to be applied to the field.
>>> // e.g. "sample.nc var_name gt0.00"
>>> //
>>> // (4) A GRIB data file, followed by a description of the field
>>> // to be used, and optionally, a threshold to be applied to the field.
>>> // e.g. "sample.grb APCP/A3 gt0.00"
>>> //
>>> // Any NetCDF or GRIB file used must have the same grid dimensions as the
>>> // data being verified.
>>> //
>>> // MET_BASE may be used in the path for the files above.
>>> //
>>> // e.g. mask_poly[] = [ "MET_BASE/data/poly/EAST.poly",
>>> // "poly_mask.ncf",
>>> // "sample.nc APCP",
>>> // "sample.grb HGT/Z0 gt100.0" ];
>>> //
>>> //mask_poly[] = [ "/gpfsm/dnb32/akumar3/NU-WRF/nu-wrf_v2dev4-3.2.1/MET/anil_test/SPL.poly" ];
>>> //mask_poly[] = [ "/discover/nobackup/szhou/test_met/zhining_test/CalNex_LIS32/CONUS.poly" ];
>>> mask_poly[] = [ "/discover/nobackup/szhou/test_met/zhining_april19_2012/WestCoast.poly" ];
>>> //mask_poly[] = [ ];
>>> //
>>> // Specify the name of an ASCII file containing a space-separated list of
>>> // station ID's at which to perform verification. Each station ID specified
>>> // is treated as an individual masking region.
>>> //
>>> // An empty list file name indicates that no station ID masks should be used.
>>> //
>>> // MET_BASE may be used in the path for the station ID mask file name.
>>> //
>>> // e.g. mask_sid = "MET_BASE/data/stations/CONUS.stations";
>>> //
>>> mask_sid = "";
>>>
>>> //
>>> // Specify a comma-separated list of values for alpha to be used when computing
>>> // confidence intervals. Values of alpha must be between 0 and 1.
>>> //
>>> // e.g. ci_alpha[] = [ 0.05, 0.10 ];
>>> //
>>> ci_alpha[] = [ 0.05 ];
>>>
>>> //
>>> // Specify the method to be used for computing bootstrap confidence intervals.
>>> // The value for this is interpreted as follows:
>>> // (0) Use the BCa interval method (computationally intensive)
>>> // (1) Use the percentile interval method
>>> //
>>> boot_interval = 1;
>>>
>>> //
>>> // Specify a proportion between 0 and 1 to define the replicate sample size
>>> // to be used when computing percentile intervals. The replicate sample
>>> // size is set to boot_rep_prop * n, where n is the number of raw data points.
>>> //
>>> // e.g boot_rep_prop = 0.80;
>>> //
>>> boot_rep_prop = 1.0;
>>>
>>> //
>>> // Specify the number of times each set of matched pair data should be
>>> // resampled when computing bootstrap confidence intervals. A value of
>>> // zero disables the computation of bootstrap confidence intervals.
>>> //
>>> // e.g. n_boot_rep = 1000;
>>> //
>>> n_boot_rep = 0;
>>>
>>> //
>>> // Specify the name of the random number generator to be used. See the MET
>>> // Users Guide for a list of possible random number generators.
>>> //
>>> boot_rng = "mt19937";
>>>
>>> //
>>> // Specify the seed value to be used when computing bootstrap confidence
>>> // intervals. If left unspecified, the seed will change for each run and
>>> // the computed bootstrap confidence intervals will not be reproducible.
>>> //
>>> boot_seed = "";
>>>
>>> //
>>> // Specify a comma-separated list of interpolation method(s) to be used
>>> // for comparing the forecast grid to the observation points. String values
>>> // are interpreted as follows:
>>> // MIN = Minimum in the neighborhood
>>> // MAX = Maximum in the neighborhood
>>> // MEDIAN = Median in the neighborhood
>>> // UW_MEAN = Unweighted mean in the neighborhood
>>> // DW_MEAN = Distance-weighted mean in the neighborhood
>>> // LS_FIT = Least-squares fit in the neighborhood
>>> //
>>> // In all cases, vertical interpolation is performed in the natural log
>>> // of pressure of the levels above and below the observation.
>>> //
>>> // e.g. interp_method[] = [ "UW_MEAN", "MEDIAN" ];
>>> //
>>> interp_method[] = [ "DW_MEAN" ];
>>>
>>> //
>>> // Specify a comma-separated list of box widths to be used by the
>>> // interpolation techniques listed above. A value of 1 indicates that
>>> // the nearest neighbor approach should be used. For a value of n
>>> // greater than 1, the n*n grid points closest to the observation define
>>> // the neighborhood.
>>> //
>>> // e.g. interp_width[] = [ 1, 3, 5 ];
>>> //
>>> interp_width[] = [ 3 ];
>>>
>>> //
>>> // When interpolating, compute a ratio of the number of valid data points
>>> // to the total number of points in the neighborhood. If that ratio is
>>> // less than this threshold, do not include the observation. This
>>> // threshold must be between 0 and 1. Setting this threshold to 1 will
>>> // require that each observation be surrounded by n*n valid forecast points.
>>> //
>>> // e.g. interp_thresh = 1.0;
>>> //
>>> interp_thresh = 1.0;
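For example, DW_MEAN with interp_width = 3 averages the 3*3 grid points nearest each observation with distance-based weights; a toy sketch of that combination (not MET's implementation; the inverse-distance weighting form is an assumption):

```python
import math

def dw_mean(grid, obs_xy, width=3):
    """Distance-weighted mean over the width x width box of grid
    points nearest the observation. grid maps (x, y) -> value."""
    cx, cy = round(obs_xy[0]), round(obs_xy[1])
    half = width // 2
    wsum = vsum = 0.0
    for x in range(cx - half, cx + half + 1):
        for y in range(cy - half, cy + half + 1):
            # Inverse-distance weight; guard against a zero distance
            # when the observation sits exactly on a grid point.
            d = math.hypot(x - obs_xy[0], y - obs_xy[1]) or 1e-6
            w = 1.0 / d
            wsum += w
            vsum += w * grid[(x, y)]
    return vsum / wsum

# Sanity check: over a uniform field the weighted mean is the constant value.
grid = {(x, y): 5.0 for x in range(10) for y in range(10)}
print(round(dw_mean(grid, (4.3, 4.6)), 6))  # 5.0
```

With interp_thresh = 1.0, MET additionally requires all 9 neighborhood points to hold valid data before the pair is used.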
>>>
>>> //
>>> // Specify flags to indicate the type of data to be output:
>>> // (1) STAT and FHO Text Files, Forecast, Hit, Observation Rates:
>>> // Total (TOTAL),
>>> // Forecast Rate (F_RATE),
>>> // Hit Rate (H_RATE),
>>> // Observation Rate (O_RATE)
>>> //
>>> // (2) STAT and CTC Text Files, Contingency Table Counts:
>>> // Total (TOTAL),
>>> // Forecast Yes and Observation Yes Count (FY_OY),
>>> // Forecast Yes and Observation No Count (FY_ON),
>>> // Forecast No and Observation Yes Count (FN_OY),
>>> // Forecast No and Observation No Count (FN_ON)
>>> //
>>> // (3) STAT and CTS Text Files, Contingency Table Scores:
>>> // Total (TOTAL),
>>> // Base Rate (BASER),
>>> // Forecast Mean (FMEAN),
>>> // Accuracy (ACC),
>>> // Frequency Bias (FBIAS),
>>> // Probability of Detecting Yes (PODY),
>>> // Probability of Detecting No (PODN),
>>> // Probability of False Detection (POFD),
>>> // False Alarm Ratio (FAR),
>>> // Critical Success Index (CSI),
>>> // Gilbert Skill Score (GSS),
>>> // Hanssen and Kuipers Discriminant (HK),
>>> // Heidke Skill Score (HSS),
>>> // Odds Ratio (ODDS),
>>> // NOTE: All statistics listed above contain parametric and/or
>>> // non-parametric confidence interval limits.
>>> //
>>> // (4) STAT and MCTC Text Files, NxN Multi-Category Contingency Table Counts:
>>> // Total (TOTAL),
>>> // Number of Categories (N_CAT),
>>> // Contingency Table Count columns repeated N_CAT*N_CAT times
>>> //
>>> // (5) STAT and MCTS Text Files, NxN Multi-Category Contingency Table Scores:
>>> // Total (TOTAL),
>>> // Number of Categories (N_CAT),
>>> // Accuracy (ACC),
>>> // Hanssen and Kuipers Discriminant (HK),
>>> // Heidke Skill Score (HSS),
>>> // Gerrity Score (GER),
>>> // NOTE: All statistics listed above contain parametric and/or
>>> // non-parametric confidence interval limits.
>>> //
>>> // (6) STAT and CNT Text Files, Statistics of Continuous Variables:
>>> // Total (TOTAL),
>>> // Forecast Mean (FBAR),
>>> // Forecast Standard Deviation (FSTDEV),
>>> // Observation Mean (OBAR),
>>> // Observation Standard Deviation (OSTDEV),
>>> // Pearson's Correlation Coefficient (PR_CORR),
>>> // Spearman's Rank Correlation Coefficient (SP_CORR),
>>> // Kendall Tau Rank Correlation Coefficient (KT_CORR),
>>> // Number of ranks compared (RANKS),
>>> // Number of tied ranks in the forecast field (FRANK_TIES),
>>> // Number of tied ranks in the observation field (ORANK_TIES),
>>> // Mean Error (ME),
>>> // Standard Deviation of the Error (ESTDEV),
>>> // Multiplicative Bias (MBIAS = FBAR/OBAR),
>>> // Mean Absolute Error (MAE),
>>> // Mean Squared Error (MSE),
>>> // Bias-Corrected Mean Squared Error (BCMSE),
>>> // Root Mean Squared Error (RMSE),
>>> // Percentiles of the Error (E10, E25, E50, E75, E90)
>>> // NOTE: Most statistics listed above contain parametric and/or
>>> // non-parametric confidence interval limits.
>>> //
>>> // (7) STAT and SL1L2 Text Files, Scalar Partial Sums:
>>> // Total (TOTAL),
>>> // Forecast Mean (FBAR),
>>> // = mean(f)
>>> // Observation Mean (OBAR),
>>> // = mean(o)
>>> // Forecast*Observation Product Mean (FOBAR),
>>> // = mean(f*o)
>>> // Forecast Squared Mean (FFBAR),
>>> // = mean(f^2)
>>> // Observation Squared Mean (OOBAR)
>>> // = mean(o^2)
>>> //
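A handy property of these partial sums (standard algebra, not specific to MET): continuous scores such as the mean error and mean squared error can be reconstructed from them without revisiting the matched pairs:

```python
def scores_from_sl1l2(fbar, obar, fobar, ffbar, oobar):
    """Derive ME and MSE from SL1L2 partial sums:
    ME  = mean(f - o)     = FBAR - OBAR
    MSE = mean((f - o)^2) = FFBAR - 2*FOBAR + OOBAR
    """
    return fbar - obar, ffbar - 2.0 * fobar + oobar

# Check against a tiny sample of matched forecast/observation pairs.
f = [1.0, 2.0, 4.0]
o = [0.5, 2.5, 3.0]
n = len(f)
sums = (sum(f) / n, sum(o) / n,
        sum(x * y for x, y in zip(f, o)) / n,
        sum(x * x for x in f) / n,
        sum(y * y for y in o) / n)
me, mse = scores_from_sl1l2(*sums)
print(round(me, 6), round(mse, 6))  # 0.333333 0.5
```

This is why partial sums are useful for aggregation: SL1L2 lines from many runs can be combined (weighted by TOTAL) and the scores recomputed afterwards.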
>>> // (8) STAT and SAL1L2 Text Files, Scalar Anomaly Partial Sums:
>>> // Total (TOTAL),
>>> // Forecast Anomaly Mean (FABAR),
>>> // = mean(f-c)
>>> // Observation Anomaly Mean (OABAR),
>>> // = mean(o-c)
>>> // Product of Forecast and Observation Anomalies Mean (FOABAR),
>>> // = mean((f-c)*(o-c))
>>> // Forecast Anomaly Squared Mean (FFABAR),
>>> // = mean((f-c)^2)
>>> // Observation Anomaly Squared Mean (OOABAR)
>>> // = mean((o-c)^2)
>>> //
>>> // (9) STAT and VL1L2 Text Files, Vector Partial Sums:
>>> // Total (TOTAL),
>>> // U-Forecast Mean (UFBAR),
>>> // = mean(uf)
>>> // V-Forecast Mean (VFBAR),
>>> // = mean(vf)
>>> // U-Observation Mean (UOBAR),
>>> // = mean(uo)
>>> // V-Observation Mean (VOBAR),
>>> // = mean(vo)
>>> // U-Product Plus V-Product (UVFOBAR),
>>> // = mean(uf*uo+vf*vo)
>>> // U-Forecast Squared Plus V-Forecast Squared (UVFFBAR),
>>> // = mean(uf^2+vf^2)
>>> // U-Observation Squared Plus V-Observation Squared (UVOOBAR)
>>> // = mean(uo^2+vo^2)
>>> //
>>> // (10) STAT and VAL1L2 Text Files, Vector Anomaly Partial Sums:
>>> // U-Forecast Anomaly Mean (UFABAR),
>>> // = mean(uf-uc)
>>> // V-Forecast Anomaly Mean (VFABAR),
>>> // = mean(vf-vc)
>>> // U-Observation Anomaly Mean (UOABAR),
>>> // = mean(uo-uc)
>>> // V-Observation Anomaly Mean (VOABAR),
>>> // = mean(vo-vc)
>>> // U-Anomaly Product Plus V-Anomaly Product (UVFOABAR),
>>> // = mean((uf-uc)*(uo-uc)+(vf-vc)*(vo-vc))
>>> // U-Forecast Anomaly Squared Plus V-Forecast Anomaly Squared (UVFFABAR),
>>> // = mean((uf-uc)^2+(vf-vc)^2)
>>> // U-Observation Anomaly Squared Plus V-Observation Anomaly Squared (UVOOABAR)
>>> // = mean((uo-uc)^2+(vo-vc)^2)
>>> //
>>> // (11) STAT and PCT Text Files, Nx2 Probability Contingency Table Counts:
>>> // Total (TOTAL),
>>> // Number of Forecast Probability Thresholds (N_THRESH),
>>> // Probability Threshold Value (THRESH_i),
>>> // Row Observation Yes Count (OY_i),
>>> // Row Observation No Count (ON_i),
>>> // NOTE: Previous 3 columns repeated for each row in the table.
>>> // Last Probability Threshold Value (THRESH_n)
>>> //
>>> // (12) STAT and PSTD Text Files, Nx2 Probability Contingency Table Scores:
>>> // Total (TOTAL),
>>> // Number of Forecast Probability Thresholds (N_THRESH),
>>> // Base Rate (BASER) with confidence interval limits,
>>> // Reliability (RELIABILITY),
>>> // Resolution (RESOLUTION),
>>> // Uncertainty (UNCERTAINTY),
>>> // Area Under the ROC Curve (ROC_AUC),
>>> // Brier Score (BRIER) with confidence interval limits,
>>> // Probability Threshold Value (THRESH_i)
>>> // NOTE: Previous column repeated for each probability threshold.
>>> //
>>> // (13) STAT and PJC Text Files, Joint/Continuous Statistics of
>>> // Probabilistic Variables:
>>> // Total (TOTAL),
>>> // Number of Forecast Probability Thresholds (N_THRESH),
>>> // Probability Threshold Value (THRESH_i),
>>> // Observation Yes Count Divided by Total (OY_TP_i),
>>> // Observation No Count Divided by Total (ON_TP_i),
>>> // Calibration (CALIBRATION_i),
>>> // Refinement (REFINEMENT_i),
>>> // Likelihood (LIKELIHOOD_i),
>>> // Base Rate (BASER_i),
>>> // NOTE: Previous 7 columns repeated for each row in the table.
>>> // Last Probability Threshold Value (THRESH_n)
>>> //
>>> // (14) STAT and PRC Text Files, ROC Curve Points for
>>> // Probabilistic Variables:
>>> // Total (TOTAL),
>>> // Number of Forecast Probability Thresholds (N_THRESH),
>>> // Probability Threshold Value (THRESH_i),
>>> // Probability of Detecting Yes (PODY_i),
>>> // Probability of False Detection (POFD_i),
>>> // NOTE: Previous 3 columns repeated for each row in the table.
>>> // Last Probability Threshold Value (THRESH_n)
>>> //
>>> // (15) STAT and MPR Text Files, Matched Pair Data:
>>> // Total (TOTAL),
>>> // Index (INDEX),
>>> // Observation Station ID (OBS_SID),
>>> // Observation Latitude (OBS_LAT),
>>> // Observation Longitude (OBS_LON),
>>> // Observation Level (OBS_LVL),
>>> // Observation Elevation (OBS_ELV),
>>> // Forecast Value (FCST),
>>> // Observation Value (OBS),
>>> // Climatological Value (CLIMO)
>>> //
>>> // In the expressions above, f are forecast values, o are observed values,
>>> // and c are climatological values.
>>> //
>>> // Values for these flags are interpreted as follows:
>>> // (0) Do not generate output of this type
>>> // (1) Write output to a STAT file
>>> // (2) Write output to a STAT file and a text file
>>> //
>>> output_flag[] = [ 2, 2, 2, 0, 0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0 ];
>>>
>>> //
>>> // Flag to indicate whether Kendall's Tau and Spearman's Rank Correlation
>>> // Coefficients should be computed. Computing them over large datasets is
>>> // computationally intensive and slows down the runtime execution significantly.
>>> // (0) Do not compute these correlation coefficients
>>> // (1) Compute these correlation coefficients
>>> //
>>> rank_corr_flag = 0;
>>>
>>> //
>>> // Specify the GRIB Table 2 parameter table version number to be used
>>> // for interpreting GRIB codes.
>>> // http://www.nco.ncep.noaa.gov/pmb/docs/on388/table2.html
>>> //
>>> grib_ptv = 2;
>>>
>>> //
>>> // Directory where temporary files should be written.
>>> //
>>> tmp_dir = "./tmp";
>>>
>>> //
>>> // Prefix to be used for the output file names.
>>> //
>>> output_prefix = "";
>>>
>>> //
>>> // Indicate a version number for the contents of this configuration file.
>>> // The value should generally not be modified.
>>> //
>>> version = "V3.0";
>>>
>>>
>>>
*********************************************************************************************************
>>>
>>> Please let me know how to fix the problem. In addition, do you have an exemplary script for validating the soil moisture variable?
>>>
>>>
>>> Thanks,
>>>
>>>
>>> Shujia
>>>
>>>
>>>
>>
>>
>
------------------------------------------------
More information about the Met_help mailing list