[Met_help] More MET questions

Daniel Schaffer Daniel.S.Schaffer at noaa.gov
Mon Oct 20 16:48:37 MDT 2008


Hello John

I've run into a couple of new issues with running MET that I thought you might have some ideas on.

1. We have chosen to run MET on data that is on the NCWD lat/lon grid.  When I run the grid_stat tool, I'm getting an error in one of the GSL libraries:

gsl: gamma_inc.c:99: ERROR: error
Default GSL error handler invoked.

It looks like some max iteration threshold is being exceeded in the gamma_inc_P_series subroutine.

This error occurs even if the forecast and obs input files are identical.  I've attached the config file that I'm using and the header of the forecast netCDF file that I'm using.  I'm wondering if this error is occurring during calculation of the stats.  If so, is there any way to have MET just compute the dichotomous counts and thus bypass this error?  The counts are all that I need.
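In case it helps, the kind of stripped-down settings I have in mind (using the option names from the attached config, and assuming it's the confidence-interval code that trips the error) would be:

   n_boot_rep = 0;   // turn off bootstrap confidence intervals
   output_flag[] = [ 0, 2, 0, 0, 0, 0, 0, 0, 2, 0, 0, 0 ];  // CTC and NBRCTC counts only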

2. The other problem I'm having is that MET takes a very long time to process my mask polygon.  The polygon has O(3000) points.  Combined with the large NCWD grid, that means it takes about 20 minutes on my PC to finish handling the polygon (presumably generating bit masks).  Does that seem inordinately long to you?  If not, is there some way for me to cache the corresponding bit masks so that I don't have to pay this penalty every time I call MET (which will be hundreds or thousands of times)?
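To illustrate the caching idea, here is a rough pure-Python sketch (my own code, not MET's; the function names, the pickle cache format, and the ray-casting point-in-polygon test are all my own choices):

```python
import os
import pickle

def point_in_poly(lat, lon, poly):
    # Ray-casting test: a point is inside if a horizontal ray from it
    # crosses the polygon boundary an odd number of times.
    inside = False
    n = len(poly)
    for i in range(n):
        lat1, lon1 = poly[i]
        lat2, lon2 = poly[(i + 1) % n]
        if (lat1 > lat) != (lat2 > lat):
            crossing_lon = lon1 + (lat - lat1) / (lat2 - lat1) * (lon2 - lon1)
            if lon < crossing_lon:
                inside = not inside
    return inside

def grid_mask(poly, lats, lons, cache_file):
    # Load the cached bit mask if one exists; otherwise compute it once
    # over the grid and save it for every later run.
    if os.path.exists(cache_file):
        with open(cache_file, "rb") as f:
            return pickle.load(f)
    mask = [[point_in_poly(la, lo, poly) for lo in lons] for la in lats]
    with open(cache_file, "wb") as f:
        pickle.dump(mask, f)
    return mask
```

The first call pays the full point-in-polygon cost; later calls just deserialize the saved mask, which is the behavior I'm hoping MET could offer.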

Thanks,
Dan

-------------------------------------------------------------------------------

Dan Schaffer, Research Associate            e-mail: Daniel.S.Schaffer at noaa.gov
NOAA/OAR/Earth System Research Lab         phone:  (303) 497-7252
Aviation Branch                             fax:    (303) 497-6301
R/GSD5   325 Broadway                       
Boulder, CO 80305

www-ad.fsl.noaa.gov/ac/schaffer.html
-------------- next part --------------
netcdf forecast {
dimensions:
	lon = 1830 ;
	lat = 918 ;
variables:
	float lat(lat, lon) ;
		lat:units = "degrees_north" ;
		lat:long_name = "Latitude" ;
	float lon(lat, lon) ;
		lon:units = "degrees_east" ;
		lon:long_name = "Longitude" ;
	float APCP(lat, lon) ;
		APCP:units = "" ;
		APCP:long_name = "" ;
		APCP:level = "SFC" ;
		APCP:accum_time = "12 hours" ;
		APCP:_FillValue = -9.f ;
		APCP:init_time = "2008-09-01T00:00:00.000-07:00" ;
		APCP:init_time_ut = "1220252400000" ;
		APCP:valid_time = "2008-09-01T13:00:00.000-07:00" ;
		APCP:valid_time_ut = " 1220299200000" ;

// global attributes:
		:Conventions = "COARDS" ;
		:Projection = "LatLon" ;
		:delta_lat_deg = "0.035933 degrees_north" ;
		:lat_ll_deg = "20.01797 degrees_north" ;
		:lon_ll_deg = "-129.98088 degrees_east" ;
		:Nlat = "918 grid_points" ;
		:delta_lon_deg = "0.038239 degrees_east" ;
		:Nlon = "1830 grid_points" ;
}
-------------- next part --------------
////////////////////////////////////////////////////////////////////////////////
//
// Default grid_stat configuration file
//
////////////////////////////////////////////////////////////////////////////////

//
// Specify a name to designate the model being verified.  This name will be
// written to the second column of the ASCII output generated.
//
model = "WRF";

//
// Specify a comma-separated list of fields to be verified.  Each field is
// specified as a grib code or corresponding grib code abbreviation followed
// by an accumulation or vertical level indicator.
//
// Each verification field is specified as one of the following:
//    GC/ANNN for accumulation interval NNN
//    GC/ZNNN for vertical level NNN
//    GC/PNNN for pressure level NNN in hPa
//    GC/PNNN-NNN for a range of pressure levels in hPa
//    Where GC is the number of or abbreviation for the grib code
//    to be verified.
// http://www.nco.ncep.noaa.gov/pmb/docs/on388/table2.html
//
// e.g. vx_grib_code[] = [ "61/A3", "APCP/A24", "RH/L10" ];
//
vx_grib_code[] = [ "61/A12" ];

//
// Specify a comma-separated list of groups of thresholds to be applied to the
// verification fields listed above.  At least one threshold must be provided
// for each verification field listed above.  The lengths of the "vx_grib_code"
// array and the "thresholds" array must be the same.  To apply multiple
// thresholds to a verification field, separate the threshold values with a
// space.
//
// Each threshold must be preceded by a two letter indicator for the type of
// thresholding to be performed:
//    'lt' for less than     'le' for less than or equal to
//    'eq' for equal to      'ne' for not equal to
//    'gt' for greater than  'ge' for greater than or equal to
//
// e.g. thresholds[] = [ "gt0.0 ge5.0", "gt0.0", "lt80.0 ge80.0" ];
//
thresholds[] = [ "ge1.0" ];

//
// Specify a comma-separated list of grids to be used in masking the data over
// which to perform scoring.  An empty list indicates that no grid masking
// should be performed.  The standard NCEP grids are named "GNNN" where NNN
// indicates the three digit grid number.  Enter "FULL" to score over the
// entire domain.
// http://www.nco.ncep.noaa.gov/pmb/docs/on388/tableb.html
//
// e.g. mask_grids[] = [ "FULL" ];
//
mask_grids[] = [ "ncwd" ];

//
// Specify a comma-separated list of ASCII files containing lat/lon polygons to
// be used in masking the data over which to perform scoring.  An empty list
// indicates that no polygon mask should be used.
//
// Latitude values are given in degrees north and longitude values are given in
// degrees east.  By default, the first and last points defined for the polygon
// are connected.
//
// The entries in the list should be files containing a name for the polygon
// followed by a space-separated list of lat/lon points defining the polygon:
//    "poly_name lat1 lon1 lat2 lon2... latn lonn"
// Where n is the number of polygon points.
//
// MET_BASE may be used in the path for the lat/lon polygon files.
//
// e.g. mask_polys[] = [ "MET_BASE/data/poly/EAST.poly",
//                       "MET_BASE/data/poly/WEST.poly" ];
mask_polys[] = [ "MET_BASE/data/poly/NEVS_CONUS.poly" ];
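//
// For reference, a polygon file in the format described above might look
// like this (hypothetical name and corner coordinates, not the actual
// NEVS_CONUS.poly contents):
//
//    CONUS_BOX 25.0 -125.0 50.0 -125.0 50.0 -65.0 25.0 -65.0
//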

//
// Specify a comma-separated list of values for alpha to be used when computing
// confidence intervals.  Values of alpha must be between 0 and 1.
//
// e.g. ci_alpha[] = [ 0.05, 0.10 ];
//
ci_alpha[] = [ 0.10, 0.05 ];

//
// Specify the method to be used for computing bootstrap confidence intervals.
// The value for this is interpreted as follows:
//    (0) Use the BCa interval method (computationally intensive)
//    (1) Use the percentile interval method
//
boot_interval = 1;

//
// Specify a proportion between 0 and 1 to define the replicate sample size
// to be used when computing percentile intervals.  The replicate sample
// size is set to boot_rep_prop * n, where n is the number of raw data points.
//
// e.g. boot_rep_prop = 0.80;
//
boot_rep_prop = 1.0;

//
// Specify the number of times each set of matched pair data should be
// resampled when computing bootstrap confidence intervals.  A value of
// zero disables the computation of bootstrap confidence intervals.
//
// e.g. n_boot_rep = 1000;
//
n_boot_rep = 500;

//
// Specify the name of the random number generator to be used.  See the MET
// Users Guide for a list of possible random number generators.
//
boot_rng = "mt19937";

//
// Specify the seed value to be used when computing bootstrap confidence
// intervals.  If left unspecified, the seed will change for each run and
// the computed bootstrap confidence intervals will not be reproducible.
//
boot_seed = "";

//
// Specify a comma-separated list of interpolation method(s) to be used
// for smoothing the forecast grid prior to comparing it to the observation
// grid.  The value at each forecast grid point is replaced by the measure
// computed over the neighborhood defined around the grid point.
// String values are interpreted as follows:
//    MIN     = Minimum in the neighborhood
//    MAX     = Maximum in the neighborhood
//    MEDIAN  = Median in the neighborhood
//    UW_MEAN = Unweighted mean in the neighborhood
//
//    NOTE: The distance-weighted mean (DW_MEAN) is not an option here since
//          it will have no effect on a gridded field.
//
//    NOTE: The least-squares fit (LS_FIT) is not an option here since
//          it reduces to an unweighted mean on a grid.
//
// e.g. interp_method[] = [ "UW_MEAN", "MEDIAN" ];
//
interp_method[] = [ "UW_MEAN" ];

//
// Specify a comma-separated list of box widths to be used by the
// interpolation techniques listed above.  A value of 1 indicates that
// no smoothing should be performed.  For a value of n greater than 1,
// the n*n grid points around each point will be used to smooth
// the forecast field.
//
// e.g. interp_width[] = [ 1, 3, 5 ];
//
interp_width[] = [ 1 ];

//
// When smoothing, compute a ratio of the number of valid data points to
// the total number of points in the neighborhood.  If that ratio is less
// than this threshold, do not compute a smoothed forecast value.  This
// threshold must be between 0 and 1.  Setting this threshold to 1 will
// require that each observation be surrounded by n*n valid forecast
// points.
//
// e.g. interp_threshold = 1.0;
//
interp_threshold = 1.0;

//
// Specify a comma-separated list of box widths to be used to define
// the neighborhood size for the neighborhood verification methods.
// For a value of n greater than 1, the n*n grid points around each point
// will be used to define the neighborhood.
//
// e.g. nbr_width[] = [ 3, 5 ];
//
nbr_width[] = [ 5, 39, 79 ];

//
// When applying the neighborhood verification methods, compute a ratio
// of the number of valid data points to the total number of points in
// the neighborhood.  If that ratio is less than this threshold, do not
// include it in the computations.  This threshold must be between 0
// and 1.  Setting this threshold to 1 will require that each point be
// surrounded by n*n valid forecast points.
//
// e.g. nbr_threshold = 1.0;
//
nbr_threshold = 1.0;

//
// When applying the neighborhood verification methods, apply a threshold
// to the fractional coverage values to define contingency tables from
// which to compute statistics.
//
// e.g. nbr_frac_threshold[] = [ "ge0.25", "ge0.50" ];
//
nbr_frac_threshold[] = [ "ge0.08", "ge0.08", "ge0.08" ];
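//
// To make the fractional-coverage step concrete, here is a rough
// pure-Python sketch of my understanding of the neighborhood method (my
// own code, not MET's; it assumes the raw field has already been
// thresholded to 0/1 events, e.g. by the "ge1.0" threshold above):

```python
def frac_coverage(field, width):
    # Fraction of event points (value 1) in the width x width
    # neighborhood centered on each grid point; neighborhoods are
    # clipped at the grid edges.
    half = width // 2
    nrow, ncol = len(field), len(field[0])
    out = [[0.0] * ncol for _ in range(nrow)]
    for i in range(nrow):
        for j in range(ncol):
            count = total = 0
            for di in range(-half, half + 1):
                for dj in range(-half, half + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < nrow and 0 <= jj < ncol:
                        total += 1
                        count += field[ii][jj]
            out[i][j] = count / total
    return out
```

// Applying a threshold such as "ge0.08" to these coverage fractions then
// yields the binary field from which the neighborhood contingency
// tables are built.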

//
// Specify flags to indicate the type of data to be output:
//    (1) VSDB and FHO Text Files, FHO rates:
//           Total (TOTAL),
//           Forecast Rate (F_RATE),
//           Hit Rate (H_RATE),
//           Observation Rate (O_RATE)
//
//    (2) VSDB and CTC Text Files, Contingency Table Counts:
//           Total (TOTAL),
//           Forecast Yes and Observation Yes Count (FY_OY),
//           Forecast Yes and Observation No Count (FY_ON),
//           Forecast No and Observation Yes Count (FN_OY),
//           Forecast No and Observation No Count (FN_ON)
//
//    (3) VSDB and CTP Text Files, Contingency Table Proportions:
//           Total (TOTAL),
//           FY_OY/Total (FY_OY_TP),
//           FY_ON/Total (FY_ON_TP),
//           FN_OY/Total (FN_OY_TP),
//           FN_ON/Total (FN_ON_TP),
//           FY/Total    (FY_TP),
//           FN/Total    (FN_TP),
//           OY/Total    (OY_TP),
//           ON/Total    (ON_TP)
//
//    (4) VSDB and CFP Text Files, Contingency Table Forecast Proportions:
//           Total (TOTAL),
//           FY_OY/FY (FY_OY_FP),
//           FY_ON/FY (FY_ON_FP),
//           FN_OY/FN (FN_OY_FP),
//           FN_ON/FN (FN_ON_FP),
//           FY       (FY),
//           FN       (FN)
//
//    (5) VSDB and COP Text Files, Contingency Table Observation Proportions:
//           Total (TOTAL),
//           FY_OY/OY (FY_OY_OP),
//           FY_ON/ON (FY_ON_OP),
//           FN_OY/OY (FN_OY_OP),
//           FN_ON/ON (FN_ON_OP),
//           OY       (OY),
//           ON       (ON)
//
//    (6) VSDB and CTS Text Files, Contingency Table Scores:
//           Total (TOTAL),
//           Base Rate (BASER), BASER_CL, BASER_CU,
//           Forecast Mean (FMEAN), FMEAN_CL, FMEAN_CU,
//           Accuracy (ACC), ACC_CL, ACC_CU,
//           Bias (BIAS),
//           Probability of Detecting Yes (PODY), PODY_CL, PODY_CU,
//           Probability of Detecting No (PODN), PODN_CL, PODN_CU,
//           Probability of False Detection (POFD), POFD_CL, POFD_CU,
//           False Alarm Ratio (FAR), FAR_CL, FAR_CU,
//           Critical Success Index (CSI), CSI_CL, CSI_CU,
//           Gilbert Skill Score (GSS),
//           Hanssen and Kuipers Discriminant (HK), HK_CL, HK_CU,
//           Heidke Skill Score (HSS),
//           Odds Ratio (ODDS), ODDS_CL, ODDS_CU
//
//    (7) VSDB and CNT Text Files, Statistics of Continuous Variables:
//           Total (TOTAL),
//           Forecast Mean (FBAR), FBAR_CL, FBAR_CU,
//           Forecast Standard Deviation (FSTDEV), FSTDEV_CL, FSTDEV_CU,
//           Observation Mean (OBAR), OBAR_CL, OBAR_CU,
//           Observation Standard Deviation (OSTDEV), OSTDEV_CL, OSTDEV_CU,
//           Pearson's Correlation Coefficient (PR_CORR), PR_CORR_CL, PR_CORR_CU,
//           Spearman's Rank Correlation Coefficient (SP_CORR),
//           Kendall Tau Rank Correlation Coefficient (KT_CORR),
//           Number of ranks compared (RANKS),
//           Number of tied ranks in the forecast field (FRANK_TIES),
//           Number of tied ranks in the observation field (ORANK_TIES),
//           Mean Error (ME), ME_CL, ME_CU,
//           Standard Deviation of the Error (ESTDEV), ESTDEV_CL, ESTDEV_CU,
//           Frequency Bias (FBIAS),
//           Mean Absolute Error (MAE),
//           Mean Squared Error (MSE),
//           Bias-Corrected Mean Squared Error (BCMSE),
//           Root Mean Squared Error (RMSE),
//           Percentiles of the Error (E10, E25, E50, E75, E90)
//
//           NOTE: CL and CU values define lower and upper
//                 confidence interval limits.
//
//    (8) VSDB and SL1L2 Text Files, SL1L2
//           Total (TOTAL),
//           Forecast Mean (FBAR),
//              = mean(f)
//           Observation Mean (OBAR),
//              = mean(o)
//           Forecast*Observation Product Mean (FOBAR),
//              = mean(f*o)
//           Forecast Squared Mean (FFBAR),
//              = mean(f^2)
//           Observation Squared Mean (OOBAR)
//              = mean(o^2)
//
//    (9) VSDB and NBRCTC Text Files, Neighborhood Methods Contingency Table Counts:
//           Total (TOTAL),
//           Forecast Yes and Observation Yes Count (FY_OY),
//           Forecast Yes and Observation No Count (FY_ON),
//           Forecast No and Observation Yes Count (FN_OY),
//           Forecast No and Observation No Count (FN_ON),
//           Fractional Threshold Value (FRAC_T),
//           Neighborhood Size (INTERP_PNTS)
//
//   (10) VSDB and NBRCTS Text Files, Neighborhood Methods Contingency Table Scores:
//           Total (TOTAL),
//           Base Rate (BASER), BASER_CL, BASER_CU,
//           Forecast Mean (FMEAN), FMEAN_CL, FMEAN_CU,
//           Accuracy (ACC), ACC_CL, ACC_CU,
//           Bias (BIAS),
//           Probability of Detecting Yes (PODY), PODY_CL, PODY_CU,
//           Probability of Detecting No (PODN), PODN_CL, PODN_CU,
//           Probability of False Detection (POFD), POFD_CL, POFD_CU,
//           False Alarm Ratio (FAR), FAR_CL, FAR_CU,
//           Critical Success Index (CSI), CSI_CL, CSI_CU,
//           Gilbert Skill Score (GSS),
//           Hanssen and Kuipers Discriminant (HK), HK_CL, HK_CU,
//           Heidke Skill Score (HSS),
//           Odds Ratio (ODDS), ODDS_CL, ODDS_CU,
//           Fractional Threshold Value (FRAC_T),
//           Neighborhood Size (INTERP_PNTS)
//
//   (11) VSDB and NBRCNT Text Files, Neighborhood Methods Continuous Scores:
//           Total (TOTAL),
//           Fractions Brier Score (FBS),
//           Fractions Skill Score (FSS),
//           Neighborhood Size (INTERP_PNTS)
//
//   (12) NetCDF File containing difference fields for each grib
//        code/mask combination.  A non-zero value indicates that
//        this NetCDF file should be produced.  A value of 0
//        indicates that it should not be produced.
//
// Values for flags (1) through (11) are interpreted as follows:
//    (0) Do not generate output of this type
//    (1) Write output to a VSDB file
//    (2) Write output to a VSDB file and a text file
//
output_flag[] = [ 0, 2, 0, 0, 0, 0, 0, 0, 2, 0, 0, 0 ];
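//
// Since the output_flag above enables only the CTC and NBRCTC count
// lines, here is a rough sketch of how the scores listed under (6)
// follow from the four counts (my own Python, not MET code; shown only
// for a few of the scores):

```python
def cts_scores(fy_oy, fy_on, fn_oy, fn_on):
    # Contingency-table scores from the four CTC counts.
    total = fy_oy + fy_on + fn_oy + fn_on
    pody = fy_oy / (fy_oy + fn_oy)         # probability of detecting yes
    far = fy_on / (fy_oy + fy_on)          # false alarm ratio
    csi = fy_oy / (fy_oy + fy_on + fn_oy)  # critical success index
    # Gilbert skill score: CSI adjusted for hits expected by chance.
    chance = (fy_oy + fy_on) * (fy_oy + fn_oy) / total
    gss = (fy_oy - chance) / (fy_oy + fy_on + fn_oy - chance)
    return {"PODY": pody, "FAR": far, "CSI": csi, "GSS": gss}
```

// So keeping only the counts loses none of these scores; they can all
// be recomputed afterward without rerunning the tool.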

//
// Flag to indicate whether Kendall's Tau and Spearman's Rank Correlation
// Coefficients should be computed.  Computing them over large datasets is
// computationally intensive and slows down the runtime execution significantly.
//    (0) Do not compute these correlation coefficients
//    (1) Compute these correlation coefficients
//
rank_corr_flag = 1;

//
// Specify whether to use the NCEP/NWS Grib conventions.
// If set to zero, the following defaults will be used:
//    (1) Grib codes 128 to 254 will be named "GC_NNN" where NNN is the grib code
//    (2) Grib levels 204 to 254 will be named "GL_NNN" where NNN is the grib level
//
ncep_defaults = 1;

//
// Directory where temp files should be written by the point_stat tool
//
tmp_dir = "/tmp";

//
// Indicate a version number for the contents of this configuration file.
// The value should generally not be modified.
//
version = "V1.1";


