[Met_help] [rt.rap.ucar.edu #84539] History for using pcp_combine -sum for custom netcdf files

John Halley Gotway via RT met_help at ucar.edu
Wed Mar 28 18:35:21 MDT 2018


----------------------------------------------------------------
  Initial Request
----------------------------------------------------------------

I have a large number of forecast netcdf files (format from ncdump -h shown
below) in a directory, and I wish to run pcp_combine to sum updraft
helicity tracks across subsets of the files in that directory. But the
files have custom names and structure, so my attempts to do this have
resulted in an error:

ERROR  : sum_data_files() -> Cannot find a file with a valid time of
20170501_030000 and accumulation time of 010000 matching the regular
expression ".*"

The command I used was

./pcp_combine -sum 20170501_00 01 20170501_03 03 03hr_UH.nc -pcpdir
/condo/map/jdduda/HWT2017/201705010000/ -field MXUPHL_P8_2L103_GLC0_max
-name max_UH_03h

where the pcpdir is full of files with names like
-rw-r--r--. 1 jdduda map 21775984 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f000.nc
-rw-r--r--. 1 jdduda map 21775980 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f001.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f002.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f003.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f004.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f005.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f006.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f007.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f008.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f009.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f010.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f011.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f012.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f013.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f014.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f015.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f016.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f017.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f018.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f019.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f020.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f021.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f022.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f023.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f024.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f025.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f026.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f027.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f028.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f029.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f030.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f031.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f032.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f033.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f034.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f035.nc
-rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f036.nc

Here 'm01' represents ensemble member number 1; there are 10 such ensemble
members. So you can see the regularity in the file naming.

The structure of the files is all the same and includes:
[jdduda at schooner1 MODE]$ ncdump -h /condo/map/jdduda/HWT2017/201705010000/
m01_2017050100f001.nc
netcdf m01_2017050100f001 {
dimensions:
        ygrid_0 = 1120 ;
        xgrid_0 = 1620 ;
        lv_ISBL0 = 6 ;
        lv_ISBL1 = 5 ;
        lv_HTGL2 = 2 ;
        lv_HTGL3 = 2 ;
        lv_ISBL4 = 5 ;
        lv_SPDL5 = 3 ;
        lv_HTGL6 = 2 ;
        lv_HTGL7 = 2 ;
variables:
         ...
        float MXUPHL_P8_2L103_GLC0_max(ygrid_0, xgrid_0) ;
                MXUPHL_P8_2L103_GLC0_max:initial_time = "05/01/2017
(00:00)" ;
                MXUPHL_P8_2L103_GLC0_max:forecast_time_units = "hours" ;
                MXUPHL_P8_2L103_GLC0_max:forecast_time = 1 ;
                MXUPHL_P8_2L103_GLC0_max:statistical_process_duration =
"initial time to forecast time" ;
                MXUPHL_P8_2L103_GLC0_max:type_of_statistical_processing =
"Maximum" ;
                MXUPHL_P8_2L103_GLC0_max:level = 5000.f, 2000.f ;
                MXUPHL_P8_2L103_GLC0_max:level_type = "Specified height
level above ground (m)" ;

MXUPHL_P8_2L103_GLC0_max:parameter_template_discipline_category_number = 8,
0, 7, 199 ;
                MXUPHL_P8_2L103_GLC0_max:parameter_discipline_and_category
= "Meteorological products, Thermodynamic stability indices" ;
                MXUPHL_P8_2L103_GLC0_max:grid_type = "Lambert Conformal can
be secant or tangent, conical or bipolar" ;
                MXUPHL_P8_2L103_GLC0_max:coordinates = "gridlat_0
gridlon_0" ;
                MXUPHL_P8_2L103_GLC0_max:_FillValue = 9.999e+20f ;
                MXUPHL_P8_2L103_GLC0_max:units = "m2/s2" ;
                MXUPHL_P8_2L103_GLC0_max:long_name = "Hourly maximum of
updraft helicity over layer 2km to 5 km AGL" ;
                MXUPHL_P8_2L103_GLC0_max:production_status = "Operational
products" ;
                MXUPHL_P8_2L103_GLC0_max:center = "US National Weather
Service - NCEP (WMC)" ;
          ...

// global attributes:
                :creation_date = "Mon Mar  5 14:48:10 CST 2018" ;
                :NCL_Version = "6.2.0" ;
                :system = "Linux schooner1.oscer.ou.edu
3.10.0-693.11.6.el7.x86_64 #1 SMP Thu Jan 4 01:06:37 UTC 2018 x86_64 x86_64
x86_64 GNU/Linux" ;
                :Conventions = "None" ;
                :grib_source = "gsi-enkfCN_2017050100f001.grib2" ;
                :title = "NCL: convert-GRIB-to-netCDF" ;

There are other arrays in these files, but I omitted their entries for
simplicity. Is there something wrong with the structure of these files to
make pcp_combine not work? Otherwise, what should I use for -pcprx to do
the following: sum three consecutive 1-hourly files into 3-hourly files (so
let's start with f000-f003)?

Jeff Duda
-- 
Jeff Duda, Research Scientist

University of Colorado Boulder

Cooperative Institute for Research in Environmental Sciences

NOAA/OAR/ESRL/Global Systems Division
Boulder, CO


----------------------------------------------------------------
  Complete Ticket History
----------------------------------------------------------------

Subject: using pcp_combine -sum for custom netcdf files
From: John Halley Gotway
Time: Mon Mar 26 09:49:34 2018

Jeff,

The -sum option is intended to be used for summing up precipitation.
The logic for this is based on the input and output accumulation
intervals.  For example, if your input data has a 1-hour accumulation
and you'd like an output 12-hour accumulation, pcp_combine would go
hunting for 12 files (since 12 / 1 = 12) with the right timing
information.

Your data does not contain an accumulation interval... so the -sum
command will not be useful.
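For clarity, the file lookup that -sum performs can be illustrated with a
small shell sketch (this is an illustration of the logic, not MET source
code; it assumes GNU date and uses the times from the failing command
above):

```shell
# Illustration of the -sum lookup logic (not MET code; assumes GNU date).
# pcp_combine divides the output accumulation by the input accumulation
# to get the number of input files, then searches for one file valid at
# each intermediate time.
init="2017-05-01 00:00 UTC"   # initialization time from the -sum command
in_accum=1                    # input accumulation interval, hours
out_accum=3                   # output accumulation interval, hours
n_files=$((out_accum / in_accum))
echo "files needed: $n_files"
for h in $(seq 1 "$n_files"); do
  date -u -d "$init + $h hours" +"need file valid at %Y%m%d_%H%M%S"
done
```

Each of those valid times must be matched by a file whose name passes the
-pcprx pattern and whose contents carry an accumulation interval;
otherwise the search fails with the "Cannot find a file" error shown
above.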

However, you could still use the "-add" option, explicitly listing the
data
you want to select from each input file:

pcp_combine -add \
m01_2017050100f001.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
m01_2017050100f002.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
m01_2017050100f003.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
03hr_UH.nc -name max_UH_03h
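Repeating that -add command for each consecutive 3-hour window can be
scripted; here is a minimal sketch (the window start hours, the output
file naming, and the dry-run echo are my assumptions for illustration,
not from the thread):

```shell
# Generate one pcp_combine -add command per 3-hour window for member m01.
# The echo makes this a dry run; remove it to actually run the commands.
member=m01
init=2017050100
field='name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";'
for start in 0 3 6; do               # window start hours (f000, f003, f006)
  files=""
  for h in 1 2 3; do                 # the three 1-hourly files in the window
    files="$files $(printf '%s_%sf%03d.nc' "$member" "$init" $((start + h))) '$field'"
  done
  cmd="pcp_combine -add$files $(printf 'UH_03h_f%03d.nc' $((start + 3))) -name max_UH_03h"
  echo "$cmd"
done
```

An outer loop over m01 through m10 would cover all ten ensemble members
the same way.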

Does that produce the desired result?

Thanks,
John

On Sat, Mar 24, 2018 at 3:10 PM, Jeff Duda via RT <met_help at ucar.edu>
wrote:

>
> Sat Mar 24 15:10:37 2018: Request 84539 was acted upon.
> Transaction: Ticket created by jeffduda319 at gmail.com
>        Queue: met_help
>      Subject: using pcp_combine -sum for custom netcdf files
>        Owner: Nobody
>   Requestors: jeffduda319 at gmail.com
>       Status: new
>  Ticket <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >

------------------------------------------------
Subject: using pcp_combine -sum for custom netcdf files
From: Jeff Duda
Time: Mon Mar 26 12:29:32 2018

Thanks, John, but I am getting an error no matter which set of three
files I try:

[jdduda at schooner2 201705180000]$ /home/jdduda/MODE/pcp_combine -add m01_2017051800f021.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> m01_2017051800f022.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> m01_2017051800f023.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> 2017051823_03UH.nc -name max_UH_03h

DEBUG 1: Reading input file: m01_2017051800f021.nc
ERROR  :
ERROR  : get_field() -> can't open data file "m01_2017051800f021.nc"
ERROR  :
Jeff




------------------------------------------------
Subject: using pcp_combine -sum for custom netcdf files
From: John Halley Gotway
Time: Mon Mar 26 14:24:15 2018

Jeff,

Are you able to post a sample file to our anonymous ftp site following
these instructions?

https://dtcenter.org/met/users/support/met_help.php#ftp

Thanks,
John

On Mon, Mar 26, 2018 at 12:29 PM, Jeff Duda via RT <met_help at ucar.edu>
wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
>
> Thanks, John, but I am getting an error no matter which set of three
files
> I try:
>
> [jdduda at schooner2 201705180000]$ /home/jdduda/MODE/pcp_combine -add
> m01_2017051800f021.nc 'name="MXUPHL_P8_2L103_GLC0_max";
level="(*,*)";' \
>
> > m01_2017051800f022.nc 'name="MXUPHL_P8_2L103_GLC0_max";
level="(*,*)";'
> \
>
> > m01_2017051800f023.nc 'name="MXUPHL_P8_2L103_GLC0_max";
level="(*,*)";'
> \
>
> > 2017051823_03UH.nc -name max_UH_03h
>
> DEBUG 1: Reading input file: m01_2017051800f021.nc
>
> ERROR  :
>
> ERROR  : get_field() -> can't open data file "m01_2017051800f021.nc"
>
> ERROR  :
>
> Jeff
>
> On Mon, Mar 26, 2018 at 9:49 AM, John Halley Gotway via RT <
> met_help at ucar.edu> wrote:
>
> > Jeff,
> >
> > The -sum option is intended to be used for summing up
precipitation.  The
> > logic for this is based on the input and output accumulation
intervals.
> > For example, if you input data has a 1-hour accumulation and you'd
like
> an
> > output 12 hour accumulation, pcp_combine would go hunting for 12
files
> > (since 12 / 1 = 12) with the right timing information.
> >
> > You data does not contain an accumulation interval... so the -sum
command
> > will not be useful.
> >
> > However, you could still use the "-add" option, explicitly listing
the
> data
> > you want to select from each input file:
> >
> > pcp_combine -add \
> > m01_2017050100f001.nc <http://m01_2017050100f000.nc>
> > 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > m01_2017050100f002.nc <http://m01_2017050100f000.nc>
> > 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > m01_2017050100f003.nc <http://m01_2017050100f000.nc>
> > 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > 03hr_UH.nc -name max_UH_03h
> >
> > Does that produce the desired result?
> >
> > Thanks,
> > John
> >
>
> --
> Jeff Duda, Research Scientist
>
> University of Colorado Boulder
>
> Cooperative Institute for Research in Environmental Sciences
>
> NOAA/OAR/ESRL/Global Systems Division
> Boulder, CO
>
>
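As for the -pcprx question in the original request: -pcprx takes a POSIX regular expression that is matched against the file names found in -pcpdir. A pattern selecting forecast hours f000 through f003 for one member might look like the following, checked here with grep -E against hypothetical names (note that a matching regex alone would not fix the missing accumulation metadata):

```shell
# A -pcprx-style POSIX regex selecting forecast hours 000-003 for member m01.
pcprx='m01_2017050100f00[0-3]\.nc'
# Stand-in for the file names in -pcpdir:
matched=$(printf 'm01_2017050100f%03d.nc\n' 0 1 2 3 4 5 | grep -E "$pcprx")
echo "$matched"
```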

------------------------------------------------
Subject: using pcp_combine -sum for custom netcdf files
From: Jeff Duda
Time: Mon Mar 26 14:55:23 2018

John,
I dropped three files in there that should give reasonable results.

Jeff

On Mon, Mar 26, 2018 at 2:24 PM, John Halley Gotway via RT <
met_help at ucar.edu> wrote:

> Jeff,
>
> Are you able to post a sample file to our anonymous ftp site
following
> these instructions?
>
> https://dtcenter.org/met/users/support/met_help.php#ftp
>
> Thanks,
> John
>
> On Mon, Mar 26, 2018 at 12:29 PM, Jeff Duda via RT
<met_help at ucar.edu>
> wrote:
>
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> >
> > Thanks, John, but I am getting an error no matter which set of
three
> files
> > I try:
> >
> > [jdduda at schooner2 201705180000]$ /home/jdduda/MODE/pcp_combine -add \
> > m01_2017051800f021.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > m01_2017051800f022.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > m01_2017051800f023.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > 2017051823_03UH.nc -name max_UH_03h
> >
> > DEBUG 1: Reading input file: m01_2017051800f021.nc
> >
> > ERROR  :
> >
> > ERROR  : get_field() -> can't open data file "m01_2017051800f021.nc"
> >
> > ERROR  :
> >
> > Jeff
> >
> >
> >
> > --
> > Jeff Duda, Research Scientist
> >
> > University of Colorado Boulder
> >
> > Cooperative Institute for Research in Environmental Sciences
> >
> > NOAA/OAR/ESRL/Global Systems Division
> > Boulder, CO
> >
> >
>
>


--
Jeff Duda, Research Scientist

University of Colorado Boulder

Cooperative Institute for Research in Environmental Sciences

NOAA/OAR/ESRL/Global Systems Division
Boulder, CO
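When pcp_combine can actually read the inputs (for example the GRIB2 source files rather than these NetCDF conversions), the -add command line for each member and window can be assembled in a loop instead of being typed by hand. A sketch with hypothetical paths; the command is only echoed here, since running it requires a working pcp_combine:

```shell
# Assemble (but do not run) a pcp_combine -add command for one ensemble
# member over a 3-hour window.  File names and paths are hypothetical.
field='name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";'
mem=m01
init=2017050100
cmd="pcp_combine -add"
for fhr in 001 002 003; do
  cmd="$cmd ${mem}_${init}f${fhr}.nc '$field'"
done
cmd="$cmd 03hr_UH.nc -name max_UH_03h"
echo "$cmd"
```

Wrapping the loop in another loop over m01..m10 would cover all ten ensemble members mentioned in the original request.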

------------------------------------------------
Subject: using pcp_combine -sum for custom netcdf files
From: John Halley Gotway
Time: Mon Mar 26 16:33:09 2018

Jeff,

OK, so this is a flavor of NetCDF that MET doesn't know how to handle.
MET reads NetCDF files in one of 3 formats:
(1) The NetCDF output of the MET tools themselves.
(2) NetCDF files that follow the CF convention.
(3) The NetCDF output of the wrf_interp utility.

This file doesn't fall into any of those categories.  So unfortunately,
none of the MET utilities, including pcp_combine, will be able to read it.
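In practice the telltale sign is the global Conventions attribute: these NCL-converted files declare Conventions = "None", whereas CF-compliant files declare something like "CF-1.6". A rough illustration of that check in plain shell, working on the header text already shown in this thread (this is not MET's actual detection logic, which inspects far more than one attribute):

```shell
# Stand-in for:  hdr=$(ncdump -h m01_2017050100f001.nc | grep ':Conventions')
hdr=':Conventions = "None" ;'    # what the ncdump -h output above shows
case "$hdr" in
  *CF-*) verdict="advertises CF; MET should be able to read it" ;;
  *)     verdict="no CF convention advertised; MET will reject it" ;;
esac
echo "$verdict"
```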

Of course, they could easily read the GRIB2 data from which these files
were generated:
   gsi-enkfCN_2017051800f022.grib2

So unfortunately I don't have a nice, easy solution.

John



> > > > On Sat, Mar 24, 2018 at 3:10 PM, Jeff Duda via RT
<met_help at ucar.edu
> >
> > > > wrote:
> > > >
> > > > >
> > > > > Sat Mar 24 15:10:37 2018: Request 84539 was acted upon.
> > > > > Transaction: Ticket created by jeffduda319 at gmail.com
> > > > >        Queue: met_help
> > > > >      Subject: using pcp_combine -sum for custom netcdf files
> > > > >        Owner: Nobody
> > > > >   Requestors: jeffduda319 at gmail.com
> > > > >       Status: new
> > > > >  Ticket <URL: https://rt.rap.ucar.edu/rt/
> > Ticket/Display.html?id=84539
> > > >
> > > > >
> > > > >
> > > > > I have a large number of forecast netcdf files (format from
ncdump
> -h
> > > > shown
> > > > > below) in a directory, and I wish to run pcp_combine to sum
updraft
> > > > > helicity tracks across subsets of the files in that
directory. But
> > the
> > > > > files have custom names and structure, so my attempts to do
this
> > have
> > > > > resulted in an error:
> > > > >
> > > > > ERROR  : sum_data_files() -> Cannot find a file with a valid
time
> of
> > > > > 20170501_030000 and accumulation time of 010000 matching the
> regular
> > > > > expression ".*"
> > > > >
> > > > > The command I used was
> > > > >
> > > > > ./pcp_combine -sum 20170501_00 01 20170501_03 03 03hr_UH.nc
-pcpdir
> > > > > /condo/map/jdduda/HWT2017/201705010000/ -field
> > > MXUPHL_P8_2L103_GLC0_max
> > > > > -name max_UH_03h
> > > > >
> > > > > where the pcpdir is full of files with names like
> > > > > -rw-r--r--. 1 jdduda map 21775984 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f000.nc
> > > > > -rw-r--r--. 1 jdduda map 21775980 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f001.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f002.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f003.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f004.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f005.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f006.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f007.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f008.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f009.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f010.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f011.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f012.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f013.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f014.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f015.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f016.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f017.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f018.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f019.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f020.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f021.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f022.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f023.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f024.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f025.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f026.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f027.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f028.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f029.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f030.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f031.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f032.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f033.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f034.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f035.nc
> > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f036.nc
> > > > >
> > > > > Here 'm01' represents ensemble member number 1; there are 10
such
> > > > ensemble
> > > > > members. So you can see the regularity in the file naming.
> > > > >
> > > > > The structure of the files is all the same and includes:
> > > > > [jdduda at schooner1 MODE]$ ncdump -h
/condo/map/jdduda/HWT2017/
> > > > 201705010000/
> > > > > m01_2017050100f001.nc
> > > > > netcdf m01_2017050100f001 {
> > > > > dimensions:
> > > > >         ygrid_0 = 1120 ;
> > > > >         xgrid_0 = 1620 ;
> > > > >         lv_ISBL0 = 6 ;
> > > > >         lv_ISBL1 = 5 ;
> > > > >         lv_HTGL2 = 2 ;
> > > > >         lv_HTGL3 = 2 ;
> > > > >         lv_ISBL4 = 5 ;
> > > > >         lv_SPDL5 = 3 ;
> > > > >         lv_HTGL6 = 2 ;
> > > > >         lv_HTGL7 = 2 ;
> > > > > variables:
> > > > >          ...
> > > > >         float MXUPHL_P8_2L103_GLC0_max(ygrid_0, xgrid_0) ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:initial_time = "05/01/2017 (00:00)" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:forecast_time_units = "hours" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:forecast_time = 1 ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:statistical_process_duration = "initial time to forecast time" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:type_of_statistical_processing = "Maximum" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:level = 5000.f, 2000.f ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:level_type = "Specified height level above ground (m)" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:parameter_template_discipline_category_number = 8, 0, 7, 199 ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:parameter_discipline_and_category = "Meteorological products, Thermodynamic stability indices" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:grid_type = "Lambert Conformal can be secant or tangent, conical or bipolar" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:coordinates = "gridlat_0 gridlon_0" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:_FillValue = 9.999e+20f ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:units = "m2/s2" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:long_name = "Hourly maximum of updraft helicity over layer 2km to 5 km AGL" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:production_status = "Operational products" ;
> > > > >                 MXUPHL_P8_2L103_GLC0_max:center = "US National Weather Service - NCEP (WMC)" ;
> > > > >           ...
> > > > >
> > > > > // global attributes:
> > > > >                 :creation_date = "Mon Mar  5 14:48:10 CST 2018" ;
> > > > >                 :NCL_Version = "6.2.0" ;
> > > > >                 :system = "Linux schooner1.oscer.ou.edu 3.10.0-693.11.6.el7.x86_64 #1 SMP Thu Jan 4 01:06:37 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux" ;
> > > > >                 :Conventions = "None" ;
> > > > >                 :grib_source = "gsi-enkfCN_2017050100f001.grib2" ;
> > > > >                 :title = "NCL: convert-GRIB-to-netCDF" ;
> > > > >
> > > > > There are other arrays in these files, but I omitted their entries
> > > > > for simplicity. Is there something wrong with the structure of these
> > > > > files to make pcp_combine not work? Otherwise, what should I use for
> > > > > -pcprx to do the following: sum three consecutive 1-hourly files
> > > > > into 3-hourly files (so let's start with f000-f003)?
> > > > >
> > > > > Jeff Duda
> > > > > --
> > > > > Jeff Duda, Research Scientist
> > > > >
> > > > > University of Colorado Boulder
> > > > >
> > > > > Cooperative Institute for Research in Environmental Sciences
> > > > >
> > > > > NOAA/OAR/ESRL/Global Systems Division
> > > > > Boulder, CO

------------------------------------------------
Subject: using pcp_combine -sum for custom netcdf files
From: Jeff Duda
Time: Mon Mar 26 16:35:18 2018

John,
So I could run pcp_combine on the originating GRIB2 file and it would
work?
I do still have access to those original files.

Jeff
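
For readers of the archive: the lookup that the failing -sum command performs
can be sketched as follows. This is a minimal illustration of the search
arithmetic John describes in the quoted reply below, not MET's actual code:

```python
# Illustrative sketch of what "pcp_combine -sum init in_accum valid out_accum"
# hunts for: out_accum / in_accum input files, each with a matching valid
# time and an in_accum accumulation interval.  Not MET's actual code.
from datetime import datetime, timedelta

def sum_search_targets(valid, in_accum_h, out_accum_h):
    """Return the (valid time, accumulation) pairs -sum must find."""
    n = out_accum_h // in_accum_h
    return sorted(
        ((valid - timedelta(hours=i * in_accum_h)).strftime("%Y%m%d_%H%M%S"),
         in_accum_h)
        for i in range(n)
    )

# The failing command asked for a 3-h sum valid at 20170501_030000 from 1-h
# inputs, so three files are needed, valid at 01, 02, and 03 UTC, each
# carrying a 1-h accumulation.  The NCL-converted NetCDF files hold no
# accumulation metadata, hence the "Cannot find a file" error.
print(sum_search_targets(datetime(2017, 5, 1, 3), 1, 3))
```

The same arithmetic explains the error text quoted earlier: the search is for
a valid time of 20170501_030000 with an accumulation time of 010000.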

On Mon, Mar 26, 2018 at 4:33 PM, John Halley Gotway via RT <
met_help at ucar.edu> wrote:

> Jeff,
>
> OK, so this is a flavor of NetCDF that MET doesn't know how to handle.
> MET reads NetCDF files in one of 3 formats:
> (1) The NetCDF output of the MET tools themselves.
> (2) NetCDF files that follow the CF convention.
> (3) The NetCDF output of the wrf_interp utility.
>
> This file doesn't fall into any of those categories.  So unfortunately,
> none of the MET utilities, including pcp_combine, will be able to read it.
>
> Of course, they could easily read the GRIB2 data from which these
files
> were generated:
>    gsi-enkfCN_2017051800f022.grib2
>
> So unfortunately I don't have a nice, easy solution.
>
> John
>
>
>
> On Mon, Mar 26, 2018 at 2:55 PM, Jeff Duda via RT
<met_help at ucar.edu>
> wrote:
>
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> >
> > John,
> > I dropped three files in there that should give reasonable
results.
> >
> > Jeff
> >
> > On Mon, Mar 26, 2018 at 2:24 PM, John Halley Gotway via RT <
> > met_help at ucar.edu> wrote:
> >
> > > Jeff,
> > >
> > > Are you able to post a sample file to our anonymous ftp site
following
> > > these instructions?
> > >
> > > https://dtcenter.org/met/users/support/met_help.php#ftp
> > >
> > > Thanks,
> > > John
> > >
> > > On Mon, Mar 26, 2018 at 12:29 PM, Jeff Duda via RT
<met_help at ucar.edu>
> > > wrote:
> > >
> > > >
> > > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539
>
> > > >
> > > > Thanks, John, but I am getting an error no matter which set of
three
> > > files
> > > > I try:
> > > >
> > > > [jdduda at schooner2 201705180000]$ /home/jdduda/MODE/pcp_combine
-add
> > > > m01_2017051800f021.nc 'name="MXUPHL_P8_2L103_GLC0_max";
> > level="(*,*)";'
> > > \
> > > >
> > > > > m01_2017051800f022.nc 'name="MXUPHL_P8_2L103_GLC0_max";
> > > level="(*,*)";'
> > > > \
> > > >
> > > > > m01_2017051800f023.nc 'name="MXUPHL_P8_2L103_GLC0_max";
> > > level="(*,*)";'
> > > > \
> > > >
> > > > > 2017051823_03UH.nc -name max_UH_03h
> > > >
> > > > DEBUG 1: Reading input file: m01_2017051800f021.nc
> > > >
> > > > ERROR  :
> > > >
> > > > ERROR  : get_field() -> can't open data file
"m01_2017051800f021.nc"
> > > >
> > > > ERROR  :
> > > >
> > > > Jeff
> > > >
> > > > On Mon, Mar 26, 2018 at 9:49 AM, John Halley Gotway via RT <
> > > > met_help at ucar.edu> wrote:
> > > >
> > > > > Jeff,
> > > > >
> > > > > The -sum option is intended to be used for summing up
> precipitation.
> > > The
> > > > > logic for this is based on the input and output accumulation
> > intervals.
> > > > > For example, if your input data has a 1-hour accumulation and you'd
> > > > > like an output 12-hour accumulation, pcp_combine would go hunting
> > > > > for 12 files (since 12 / 1 = 12) with the right timing information.
> > > > >
> > > > > Your data does not contain an accumulation interval... so the -sum
> > > > > command will not be useful.
> > > > >
> > > > > However, you could still use the "-add" option, explicitly
listing
> > the
> > > > data
> > > > > you want to select from each input file:
> > > > >
> > > > > pcp_combine -add \
> > > > > m01_2017050100f001.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > > > > m01_2017050100f002.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > > > > m01_2017050100f003.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > > > > 03hr_UH.nc -name max_UH_03h
> > > > >
> > > > > Does that produce the desired result?
> > > > >
> > > > > Thanks,
> > > > > John
> > > > >
> > > > > On Sat, Mar 24, 2018 at 3:10 PM, Jeff Duda via RT <
> met_help at ucar.edu
> > >
> > > > > wrote:
> > > > >
> > > > > >
> > > > > > Sat Mar 24 15:10:37 2018: Request 84539 was acted upon.
> > > > > > Transaction: Ticket created by jeffduda319 at gmail.com
> > > > > >        Queue: met_help
> > > > > >      Subject: using pcp_combine -sum for custom netcdf
files
> > > > > >        Owner: Nobody
> > > > > >   Requestors: jeffduda319 at gmail.com
> > > > > >       Status: new
> > > > > >  Ticket <URL: https://rt.rap.ucar.edu/rt/
> > > Ticket/Display.html?id=84539
> > > > >
> > > > > >
> > > > > >
> > > > > > [snip: duplicate of the original request quoted in full above]
> > > > > >
> > > > > > Jeff Duda
> > > > > > --
> > > > > > Jeff Duda, Research Scientist
> > > > > >
> > > > > > University of Colorado Boulder
> > > > > >
> > > > > > Cooperative Institute for Research in Environmental
Sciences
> > > > > >
> > > > > > NOAA/OAR/ESRL/Global Systems Division
> > > > > > Boulder, CO
> > > > > >
> > > > > >


--
Jeff Duda, Research Scientist

University of Colorado Boulder

Cooperative Institute for Research in Environmental Sciences

NOAA/OAR/ESRL/Global Systems Division
Boulder, CO

------------------------------------------------
Subject: using pcp_combine -sum for custom netcdf files
From: John Halley Gotway
Time: Mon Mar 26 16:36:20 2018

Yes, sure.  pcp_combine should work fine on the original GRIB2 files.
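
A possible way to script that step: generate one pcp_combine -add command per
3-hour window over the original GRIB2 files. This is only a sketch; the GRIB2
field specification (name "MXUPHL", layer "Z2000-5000") and the output file
naming are assumptions to verify against the actual records (e.g. with
wgrib2) before use:

```python
# Sketch: build pcp_combine "-add" command lines over the original GRIB2
# files, one command per window of 1-hourly updraft-helicity forecast hours.
# The field/level spec and filenames below follow the thread's grib_source
# attribute; the GRIB2 record name and layer are assumptions to verify.
def build_add_commands(windows):
    field = "'name=\"MXUPHL\"; level=\"Z2000-5000\";'"  # assumed GRIB2 spec
    cmds = []
    for fhrs in windows:
        parts = ["pcp_combine -add"]
        for f in fhrs:
            parts.append(f"gsi-enkfCN_2017050100f{f:03d}.grib2 {field}")
        # Output file named after the last forecast hour in the window.
        parts.append(f"f{fhrs[-1]:03d}_03UH.nc -name max_UH_03h")
        cmds.append(" \\\n  ".join(parts))
    return cmds

for cmd in build_add_commands([(1, 2, 3), (4, 5, 6)]):
    print(cmd)
```

Each generated string is a complete shell command with line continuations,
ready to be written to a batch script and looped over ensemble members.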

John

On Mon, Mar 26, 2018 at 4:35 PM, Jeff Duda via RT <met_help at ucar.edu>
wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
>
> John,
> So I could run pcp_combine on the originating GRIB2 file and it
would work?
> I do still have access to those original files.
>
> Jeff
>
> On Mon, Mar 26, 2018 at 4:33 PM, John Halley Gotway via RT <
> met_help at ucar.edu> wrote:
>
> > Jeff,
> >
> > OK, so this is a flavor of NetCDF that MET doesn't know how to handle.
> > MET reads NetCDF files in one of 3 formats:
> > (1) The NetCDF output of the MET tools themselves.
> > (2) NetCDF files that follow the CF convention.
> > (3) The NetCDF output of the wrf_interp utility.
> >
> > This file doesn't fall into any of those categories.  So unfortunately,
> > none of the MET utilities, including pcp_combine, will be able to read it.
> >
> > Of course, they could easily read the GRIB2 data from which these
files
> > were generated:
> >    gsi-enkfCN_2017051800f022.grib2
> >
> > So unfortunately I don't have a nice, easy solution.
> >
> > John
> >
> >
> >
> > On Mon, Mar 26, 2018 at 2:55 PM, Jeff Duda via RT
<met_help at ucar.edu>
> > wrote:
> >
> > >
> > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> > >
> > > John,
> > > I dropped three files in there that should give reasonable
results.
> > >
> > > Jeff
> > >
> > > On Mon, Mar 26, 2018 at 2:24 PM, John Halley Gotway via RT <
> > > met_help at ucar.edu> wrote:
> > >
> > > > Jeff,
> > > >
> > > > Are you able to post a sample file to our anonymous ftp site
> following
> > > > these instructions?
> > > >
> > > > https://dtcenter.org/met/users/support/met_help.php#ftp
> > > >
> > > > Thanks,
> > > > John
> > > >
> > > > On Mon, Mar 26, 2018 at 12:29 PM, Jeff Duda via RT <
> met_help at ucar.edu>
> > > > wrote:
> > > >
> > > > >
> > > > > <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> > > > >
> > > > > Thanks, John, but I am getting an error no matter which set
of
> three
> > > > files
> > > > > I try:
> > > > >
> > > > > [jdduda at schooner2 201705180000]$
/home/jdduda/MODE/pcp_combine
> -add
> > > > > m01_2017051800f021.nc 'name="MXUPHL_P8_2L103_GLC0_max";
> > > level="(*,*)";'
> > > > \
> > > > >
> > > > > > m01_2017051800f022.nc 'name="MXUPHL_P8_2L103_GLC0_max";
> > > > level="(*,*)";'
> > > > > \
> > > > >
> > > > > > m01_2017051800f023.nc 'name="MXUPHL_P8_2L103_GLC0_max";
> > > > level="(*,*)";'
> > > > > \
> > > > >
> > > > > > 2017051823_03UH.nc -name max_UH_03h
> > > > >
> > > > > DEBUG 1: Reading input file: m01_2017051800f021.nc
> > > > >
> > > > > ERROR  :
> > > > >
> > > > > ERROR  : get_field() -> can't open data file "
> m01_2017051800f021.nc"
> > > > >
> > > > > ERROR  :
> > > > >
> > > > > Jeff
> > > > >
> > > > > On Mon, Mar 26, 2018 at 9:49 AM, John Halley Gotway via RT <
> > > > > met_help at ucar.edu> wrote:
> > > > >
> > > > > > Jeff,
> > > > > >
> > > > > > The -sum option is intended to be used for summing up
> > precipitation.
> > > > The
> > > > > > logic for this is based on the input and output
accumulation
> > > intervals.
> > > > > > For example, if your input data has a 1-hour accumulation and
> > > > > > you'd like an output 12-hour accumulation, pcp_combine would go
> > > > > > hunting for 12 files (since 12 / 1 = 12) with the right timing
> > > > > > information.
> > > > > >
> > > > > > Your data does not contain an accumulation interval... so the
> > > > > > -sum command will not be useful.
> > > > > >
> > > > > > However, you could still use the "-add" option, explicitly
> listing
> > > the
> > > > > data
> > > > > > you want to select from each input file:
> > > > > >
> > > > > > pcp_combine -add \
> > > > > > m01_2017050100f001.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > > > > > m01_2017050100f002.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > > > > > m01_2017050100f003.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > > > > > 03hr_UH.nc -name max_UH_03h
> > > > > >
> > > > > > Does that produce the desired result?
> > > > > >
> > > > > > Thanks,
> > > > > > John
> > > > > >
> > > > > > On Sat, Mar 24, 2018 at 3:10 PM, Jeff Duda via RT <
> > met_help at ucar.edu
> > > >
> > > > > > wrote:
> > > > > >
> > > > > > >
> > > > > > > Sat Mar 24 15:10:37 2018: Request 84539 was acted upon.
> > > > > > > Transaction: Ticket created by jeffduda319 at gmail.com
> > > > > > >        Queue: met_help
> > > > > > >      Subject: using pcp_combine -sum for custom netcdf
files
> > > > > > >        Owner: Nobody
> > > > > > >   Requestors: jeffduda319 at gmail.com
> > > > > > >       Status: new
> > > > > > >  Ticket <URL: https://rt.rap.ucar.edu/rt/
> > > > Ticket/Display.html?id=84539
> > > > > >
> > > > > > >
> > > > > > >
> > > > > > > I have a large number of forecast netcdf files (format
from
> > ncdump
> > > -h
> > > > > > shown
> > > > > > > below) in a directory, and I wish to run pcp_combine to
sum
> > updraft
> > > > > > > helicity tracks across subsets of the files in that
directory.
> > But
> > > > the
> > > > > > > files have custorm names and structure, so my attempts
to do
> this
> > > > have
> > > > > > > resulted in an error:
> > > > > > >
> > > > > > > ERROR  : sum_data_files() -> Cannot find a file with a
valid
> time
> > > of
> > > > > > > 20170501_030000 and accumulation time of 010000 matching
the
> > > regular
> > > > > > > expression ".*"
> > > > > > >
> > > > > > > The command I used was
> > > > > > >
> > > > > > > ./pcp_combine -sum 20170501_00 01 20170501_03 03
03hr_UH.nc
> > -pcpdir
> > > > > > > /condo/map/jdduda/HWT2017/201705010000/ -field
> > > > > MXUPHL_P8_2L103_GLC0_max
> > > > > > > -name max_UH_03h
> > > > > > >
> > > > > > > where the pcpdir is full of files with names like
> > > > > > > -rw-r--r--. 1 jdduda map 21775984 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f000.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775980 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f001.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f002.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f003.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f004.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f005.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f006.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f007.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f008.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f009.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f010.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f011.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f012.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f013.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f014.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f015.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f016.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f017.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f018.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f019.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f020.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f021.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f022.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f023.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:48
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f024.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f025.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f026.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f027.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f028.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f029.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f030.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f031.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f032.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f033.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f034.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f035.nc
> > > > > > > -rw-r--r--. 1 jdduda map 21775988 Mar  5 14:49
> > > > > > >
/condo/map/jdduda/HWT2017/201705010000/m01_2017050100f036.nc
> > > > > > >
> > > > > > > Here 'm01' represents ensemble member number 1; there
are 10
> such
> > > > > > ensemble
> > > > > > > members. So you can see the regularity in the file
naming.
> > > > > > >
> > > > > > > The structure of the files is all the same and includes:
> > > > > > > [jdduda at schooner1 MODE]$ ncdump -h
/condo/map/jdduda/HWT2017/
> > > > > > 201705010000/
> > > > > > > m01_2017050100f001.nc
> > > > > > > netcdf m01_2017050100f001 {
> > > > > > > dimensions:
> > > > > > >         ygrid_0 = 1120 ;
> > > > > > >         xgrid_0 = 1620 ;
> > > > > > >         lv_ISBL0 = 6 ;
> > > > > > >         lv_ISBL1 = 5 ;
> > > > > > >         lv_HTGL2 = 2 ;
> > > > > > >         lv_HTGL3 = 2 ;
> > > > > > >         lv_ISBL4 = 5 ;
> > > > > > >         lv_SPDL5 = 3 ;
> > > > > > >         lv_HTGL6 = 2 ;
> > > > > > >         lv_HTGL7 = 2 ;
> > > > > > > variables:
> > > > > > >          ...
> > > > > > >         float MXUPHL_P8_2L103_GLC0_max(ygrid_0, xgrid_0)
;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:initial_time =
> > > "05/01/2017
> > > > > > > (00:00)" ;
> > > > > > >
MXUPHL_P8_2L103_GLC0_max:forecast_time_units =
> > > > > "hours" ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:forecast_time =
1 ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:
> > > > statistical_process_duration
> > > > > =
> > > > > > > "initial time to forecast time" ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:type_
> > > > > of_statistical_processing
> > > > > > =
> > > > > > > "Maximum" ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:level = 5000.f,
> 2000.f ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:level_type =
> "Specified
> > > > > height
> > > > > > > level above ground (m)" ;
> > > > > > >
> > > > > > > MXUPHL_P8_2L103_GLC0_max:parameter_template_discipline_
> > > > category_number
> > > > > =
> > > > > > > 8,
> > > > > > > 0, 7, 199 ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:
> > parameter_discipline_and_
> > > > > > category
> > > > > > > = "Meteorological products, Thermodynamic stability
indices" ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:grid_type =
"Lambert
> > > > > Conformal
> > > > > > > can
> > > > > > > be secant or tangent, conical or bipolar" ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:coordinates =
> > "gridlat_0
> > > > > > > gridlon_0" ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:_FillValue =
> > 9.999e+20f ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:units = "m2/s2"
;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:long_name =
"Hourly
> > > maximum
> > > > > of
> > > > > > > updraft helicity over layer 2km to 5 km AGL" ;
> > > > > > >
MXUPHL_P8_2L103_GLC0_max:production_status =
> > > > > > "Operational
> > > > > > > products" ;
> > > > > > >                 MXUPHL_P8_2L103_GLC0_max:center = "US
National
> > > > Weather
> > > > > > > Service - NCEP (WMC)" ;
> > > > > > >           ...
> > > > > > >
> > > > > > > // global attributes:
> > > > > > >                 :creation_date = "Mon Mar  5 14:48:10
CST
> 2018" ;
> > > > > > >                 :NCL_Version = "6.2.0" ;
> > > > > > >                 :system = "Linux schooner1.oscer.ou.edu
> > > > > > > 3.10.0-693.11.6.el7.x86_64 #1 SMP Thu Jan 4 01:06:37 UTC
2018
> > > x86_64
> > > > > > x86_64
> > > > > > > x86_64 GNU/Linux" ;
> > > > > > >                 :Conventions = "None" ;
> > > > > > >                 :grib_source = "gsi-enkfCN_2017050100f001.grib2" ;
> > > > > > >                 :title = "NCL: convert-GRIB-to-netCDF" ;
> > > > > > >
> > > > > > > There are other arrays in these files, but I omitted
their
> > entries
> > > > for
> > > > > > > simplicity. Is there something wrong with the structure
of
> these
> > > > files
> > > > > to
> > > > > > > make pcp_combine not work? Otherwise, what should I use
for
> > -pcprx
> > > to
> > > > > do
> > > > > > > the following: sum three consecutive 1-hourly files into
> 3-hourly
> > > > files
> > > > > > (so
> > > > > > > let's start with f000-f003)?
> > > > > > >
> > > > > > > Jeff Duda
> > > > > > > --
> > > > > > > Jeff Duda, Research Scientist
> > > > > > >
> > > > > > > University of Colorado Boulder
> > > > > > >
> > > > > > > Cooperative Institute for Research in Environmental
Sciences
> > > > > > >
> > > > > > > NOAA/OAR/ESRL/Global Systems Division
> > > > > > > Boulder, CO

------------------------------------------------
Subject: using pcp_combine -sum for custom netcdf files
From: Jeff Duda
Time: Mon Mar 26 16:43:24 2018

John,
Okay, I tested this quickly and got it to sum things from three GRIB2
files. But how do I tell pcp_combine which specific array to combine?
It
seems to have summed the precip array by default.

Jeff
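
[The field can be named explicitly with the same config-string syntax John
showed for the -add option, so pcp_combine does not fall back to the default
precipitation field. A minimal dry-run sketch follows; the GRIB2 abbreviation
MXUPHL and the 2000-5000 m layer string are assumptions, not verified against
these files.]

```shell
# Print (not run) a pcp_combine -add command that names the GRIB2 record
# explicitly instead of letting the tool default to precipitation.
# NOTE: "MXUPHL" and the level string are assumptions; verify the record
# (e.g. with wgrib2) before running the printed command.
FIELD='name="MXUPHL"; level="Z2000-5000";'
echo pcp_combine -add \
  gsi-enkfCN_2017050100f001.grib2 "$FIELD" \
  gsi-enkfCN_2017050100f002.grib2 "$FIELD" \
  gsi-enkfCN_2017050100f003.grib2 "$FIELD" \
  03hr_UH.nc -name max_UH_03h
```

[When running it for real, keep each config string quoted as a single shell
argument, exactly as in John's example above.]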

On Mon, Mar 26, 2018 at 4:36 PM, John Halley Gotway via RT <
met_help at ucar.edu> wrote:

> Yes, sure.  pcp_combine should work fine on the GRIB2.
>
> John
>
> On Mon, Mar 26, 2018 at 4:35 PM, Jeff Duda via RT
<met_help at ucar.edu>
> wrote:
>
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> >
> > John,
> > So I could run pcp_combine on the originating GRIB2 file and it
would
> work?
> > I do still have access to those original files.
> >
> > Jeff
> >
> > On Mon, Mar 26, 2018 at 4:33 PM, John Halley Gotway via RT <
> > met_help at ucar.edu> wrote:
> >
> > > Jeff,
> > >
> > > OK, so this is a flavor of NetCDF that MET doesn't know how to handle.
> > > MET reads NetCDF files in one of 3 formats:
> > > (1) The NetCDF output of the MET tools themselves.
> > > (2) NetCDF files that follow the CF convention.
> > > (3) The NetCDF output of the wrf_interp utility.
> > >
> > > This file doesn't fall into any of those categories.  So
unfortunately,
> > > none of the MET utilities, including pcp_combine, will be able to
> > > read it.
> > >
> > > Of course, they could easily read the GRIB2 data from which
these files
> > > were generated:
> > >    gsi-enkfCN_2017051800f022.grib2
> > >
> > > So unfortunately I don't have a nice, easy solution.
> > >
> > > John
> > >
> > >
> > >
> > > On Mon, Mar 26, 2018 at 2:55 PM, Jeff Duda via RT
<met_help at ucar.edu>
> > > wrote:
> > >
> > > >
> > > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539
>
> > > >
> > > > John,
> > > > I dropped three files in there that should give reasonable
results.
> > > >
> > > > Jeff
> > > >
> > > > On Mon, Mar 26, 2018 at 2:24 PM, John Halley Gotway via RT <
> > > > met_help at ucar.edu> wrote:
> > > >
> > > > > Jeff,
> > > > >
> > > > > Are you able to post a sample file to our anonymous ftp site
> > following
> > > > > these instructions?
> > > > >
> > > > > https://dtcenter.org/met/users/support/met_help.php#ftp
> > > > >
> > > > > Thanks,
> > > > > John
> > > > >
> > > > > On Mon, Mar 26, 2018 at 12:29 PM, Jeff Duda via RT <
> > met_help at ucar.edu>
> > > > > wrote:
> > > > >
> > > > > >
> > > > > > <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> > > > > >
> > > > > > Thanks, John, but I am getting an error no matter which
set of
> > three
> > > > > files
> > > > > > I try:
> > > > > >
> > > > > > [jdduda at schooner2 201705180000]$
/home/jdduda/MODE/pcp_combine
> > -add
> > > > > > m01_2017051800f021.nc 'name="MXUPHL_P8_2L103_GLC0_max";
> > > > level="(*,*)";'
> > > > > \
> > > > > >
> > > > > > > m01_2017051800f022.nc 'name="MXUPHL_P8_2L103_GLC0_max";
> > > > > level="(*,*)";'
> > > > > > \
> > > > > >
> > > > > > > m01_2017051800f023.nc 'name="MXUPHL_P8_2L103_GLC0_max";
> > > > > level="(*,*)";'
> > > > > > \
> > > > > >
> > > > > > > 2017051823_03UH.nc -name max_UH_03h
> > > > > >
> > > > > > DEBUG 1: Reading input file: m01_2017051800f021.nc
> > > > > >
> > > > > > ERROR  :
> > > > > >
> > > > > > ERROR  : get_field() -> can't open data file "
> > m01_2017051800f021.nc"
> > > > > >
> > > > > > ERROR  :
> > > > > >
> > > > > > Jeff
> > > > > >
> > > > > > On Mon, Mar 26, 2018 at 9:49 AM, John Halley Gotway via RT
<
> > > > > > met_help at ucar.edu> wrote:
> > > > > >
> > > > > > > Jeff,
> > > > > > >
> > > > > > > The -sum option is intended to be used for summing up
> > > precipitation.
> > > > > The
> > > > > > > logic for this is based on the input and output
accumulation
> > > > intervals.
> > > > > > > For example, if your input data has a 1-hour accumulation and
> > > > > > > you'd like an output 12-hour accumulation, pcp_combine would go
> > > > > > > hunting for 12 files (since 12 / 1 = 12) with the right timing
> > > > > > > information.
> > > > > > >
> > > > > > > Your data does not contain an accumulation interval... so the
> > > > > > > -sum command will not be useful.
> > > > > > >
> > > > > > > However, you could still use the "-add" option,
explicitly
> > listing
> > > > the
> > > > > > data
> > > > > > > you want to select from each input file:
> > > > > > >
> > > > > > > pcp_combine -add \
> > > > > > > m01_2017050100f001.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > > > > > > m01_2017050100f002.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > > > > > > m01_2017050100f003.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> > > > > > > 03hr_UH.nc -name max_UH_03h
> > > > > > >
> > > > > > > Does that produce the desired result?
> > > > > > >
> > > > > > > Thanks,
> > > > > > > John
> > > > > > >
> > > > > > > On Sat, Mar 24, 2018 at 3:10 PM, Jeff Duda via RT <
> > > met_help at ucar.edu
> > > > >
> > > > > > > wrote:
> > > > > > >
> > > > > > > >
> > > > > > > > Sat Mar 24 15:10:37 2018: Request 84539 was acted
upon.
> > > > > > > > Transaction: Ticket created by jeffduda319 at gmail.com
> > > > > > > >        Queue: met_help
> > > > > > > >      Subject: using pcp_combine -sum for custom netcdf
files
> > > > > > > >        Owner: Nobody
> > > > > > > >   Requestors: jeffduda319 at gmail.com
> > > > > > > >       Status: new
> > > > > > > >  Ticket <URL: https://rt.rap.ucar.edu/rt/
> > > > > Ticket/Display.html?id=84539
> > > > > > >
> > > > > > > >
> > > > > > > >
> > > > > > > > I have a large number of forecast netcdf files (format
from
> > > ncdump
> > > > -h
> > > > > > > shown
> > > > > > > > below) in a directory, and I wish to run pcp_combine
to sum
> > > updraft
> > > > > > > > helicity tracks across subsets of the files in that
> directory.
> > > But
> > > > > the
> > > > > > > > files have custom names and structure, so my attempts to do
> > > > > > > > this have resulted in an error:
> > > > > > > >
> > > > > > > > ERROR  : sum_data_files() -> Cannot find a file with a
valid
> > time
> > > > of
> > > > > > > > 20170501_030000 and accumulation time of 010000
matching the
> > > > regular
> > > > > > > > expression ".*"
> > > > > > > >
> > > > > > > > The command I used was
> > > > > > > >
> > > > > > > > ./pcp_combine -sum 20170501_00 01 20170501_03 03
03hr_UH.nc
> > > -pcpdir
> > > > > > > > /condo/map/jdduda/HWT2017/201705010000/ -field
> > > > > > MXUPHL_P8_2L103_GLC0_max
> > > > > > > > -name max_UH_03h
> > > > > > > >
> > > > > > > > where the pcpdir is full of files with names like
> > > > > > > > -rw-r--r--. 1 jdduda map 21775984 Mar  5 14:48
> > > > > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f000.nc
> > > > > > > > [... identical entries for m01_2017050100f001.nc through m01_2017050100f036.nc elided ...]
> > > > > > > >
> > > > > > > > Here 'm01' represents ensemble member number 1; there
are 10
> > such
> > > > > > > ensemble
> > > > > > > > members. So you can see the regularity in the file
naming.
> > > > > > > >
> > > > > > > > The structure of the files is all the same and
includes:
> > > > > > > > [jdduda at schooner1 MODE]$ ncdump -h
> /condo/map/jdduda/HWT2017/
> > > > > > > 201705010000/
> > > > > > > > m01_2017050100f001.nc
> > > > > > > > netcdf m01_2017050100f001 {
> > > > > > > > dimensions:
> > > > > > > >         ygrid_0 = 1120 ;
> > > > > > > >         xgrid_0 = 1620 ;
> > > > > > > >         lv_ISBL0 = 6 ;
> > > > > > > >         lv_ISBL1 = 5 ;
> > > > > > > >         lv_HTGL2 = 2 ;
> > > > > > > >         lv_HTGL3 = 2 ;
> > > > > > > >         lv_ISBL4 = 5 ;
> > > > > > > >         lv_SPDL5 = 3 ;
> > > > > > > >         lv_HTGL6 = 2 ;
> > > > > > > >         lv_HTGL7 = 2 ;
> > > > > > > > variables:
> > > > > > > >          ...
> > > > > > > >         float MXUPHL_P8_2L103_GLC0_max(ygrid_0,
xgrid_0) ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:initial_time
=
> > > > "05/01/2017
> > > > > > > > (00:00)" ;
> > > > > > > >
MXUPHL_P8_2L103_GLC0_max:forecast_time_units
> =
> > > > > > "hours" ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:forecast_time
= 1 ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:
> > > > > statistical_process_duration
> > > > > > =
> > > > > > > > "initial time to forecast time" ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:type_
> > > > > > of_statistical_processing
> > > > > > > =
> > > > > > > > "Maximum" ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:level =
5000.f,
> > 2000.f ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:level_type =
> > "Specified
> > > > > > height
> > > > > > > > level above ground (m)" ;
> > > > > > > >
> > > > > > > >
MXUPHL_P8_2L103_GLC0_max:parameter_template_discipline_
> > > > > category_number
> > > > > > =
> > > > > > > > 8,
> > > > > > > > 0, 7, 199 ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:
> > > parameter_discipline_and_
> > > > > > > category
> > > > > > > > = "Meteorological products, Thermodynamic stability
indices"
> ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:grid_type =
> "Lambert
> > > > > > Conformal
> > > > > > > > can
> > > > > > > > be secant or tangent, conical or bipolar" ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:coordinates =
> > > "gridlat_0
> > > > > > > > gridlon_0" ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:_FillValue =
> > > 9.999e+20f ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:units =
"m2/s2" ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:long_name =
"Hourly
> > > > maximum
> > > > > > of
> > > > > > > > updraft helicity over layer 2km to 5 km AGL" ;
> > > > > > > >
MXUPHL_P8_2L103_GLC0_max:production_status =
> > > > > > > "Operational
> > > > > > > > products" ;
> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:center = "US
> National
> > > > > Weather
> > > > > > > > Service - NCEP (WMC)" ;
> > > > > > > >           ...
> > > > > > > >
> > > > > > > > // global attributes:
> > > > > > > >                 :creation_date = "Mon Mar  5 14:48:10
CST
> > 2018" ;
> > > > > > > >                 :NCL_Version = "6.2.0" ;
> > > > > > > >                 :system = "Linux
schooner1.oscer.ou.edu
> > > > > > > > 3.10.0-693.11.6.el7.x86_64 #1 SMP Thu Jan 4 01:06:37
UTC 2018
> > > > x86_64
> > > > > > > x86_64
> > > > > > > > x86_64 GNU/Linux" ;
> > > > > > > >                 :Conventions = "None" ;
> > > > > > > >                 :grib_source = "gsi-enkfCN_2017050100f001.grib2" ;
> > > > > > > >                 :title = "NCL: convert-GRIB-to-netCDF"
;
> > > > > > > >
> > > > > > > > There are other arrays in these files, but I omitted
their
> > > entries
> > > > > for
> > > > > > > > simplicity. Is there something wrong with the
structure of
> > these
> > > > > files
> > > > > > to
> > > > > > > > make pcp_combine not work? Otherwise, what should I
use for
> > > -pcprx
> > > > to
> > > > > > do
> > > > > > > > the following: sum three consecutive 1-hourly files
into
> > 3-hourly
> > > > > files
> > > > > > > (so
> > > > > > > > let's start with f000-f003)?
> > > > > > > >
> > > > > > > > Jeff Duda
> > > > > > > > --
> > > > > > > > Jeff Duda, Research Scientist
> > > > > > > >
> > > > > > > > University of Colorado Boulder
> > > > > > > >
> > > > > > > > Cooperative Institute for Research in Environmental
Sciences
> > > > > > > >
> > > > > > > > NOAA/OAR/ESRL/Global Systems Division
> > > > > > > > Boulder, CO


--
Jeff Duda, Research Scientist

University of Colorado Boulder

Cooperative Institute for Research in Environmental Sciences

NOAA/OAR/ESRL/Global Systems Division
Boulder, CO

------------------------------------------------
Subject: using pcp_combine -sum for custom netcdf files
From: Jeff Duda
Time: Mon Mar 26 16:46:08 2018

Nevermind. I figured it out! Thanks!

Jeff
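
[For scripting this over many 3-hour windows, the -add command from the
thread can be assembled in a loop. The sketch below only prints the commands
for review; the GRIB2 file-naming pattern, output names, and the field string
are assumptions carried over from the messages above.]

```shell
# Build and print one pcp_combine -add command per 3-hour window
# (f001-f003, f004-f006, f007-f009).  Nothing is executed; eval each
# printed line, or pipe the output through sh, once the names check out.
FIELD='name="MXUPHL"; level="Z2000-5000";'   # assumed GRIB2 record spec
for start in 1 4 7; do
  CMD="pcp_combine -add"
  for off in 0 1 2; do
    fhr=$(printf '%03d' $((start + off)))
    CMD="$CMD gsi-enkfCN_2017050100f$fhr.grib2 '$FIELD'"
  done
  CMD="$CMD UH_03h_f$(printf '%02d' $((start + 2))).nc -name max_UH_03h"
  echo "$CMD"
done
```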

On Mon, Mar 26, 2018 at 4:43 PM, Jeff Duda <jeffduda319 at gmail.com>
wrote:

> John,
> Okay, I tested this quickly and got it to sum things from three
GRIB2
> files. But how do I tell pcp_combine which specific array to
combine? It
> seems to have summed the precip array by default.
>
> Jeff
>
> On Mon, Mar 26, 2018 at 4:36 PM, John Halley Gotway via RT <
> met_help at ucar.edu> wrote:
>
>> Yes, sure.  pcp_combine should work fine on the GRIB2.
>>
>> John
>>
>> On Mon, Mar 26, 2018 at 4:35 PM, Jeff Duda via RT
<met_help at ucar.edu>
>> wrote:
>>
>> >
>> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
>> >
>> > John,
>> > So I could run pcp_combine on the originating GRIB2 file and it
would
>> work?
>> > I do still have access to those original files.
>> >
>> > Jeff
>> >
>> > On Mon, Mar 26, 2018 at 4:33 PM, John Halley Gotway via RT <
>> > met_help at ucar.edu> wrote:
>> >
>> > > Jeff,
>> > >
>> > > OK, so this is a flavor of NetCDF that MET doesn't know how to handle.
>> > > MET reads NetCDF files in one of 3 formats:
>> > > (1) The NetCDF output of the MET tools themselves.
>> > > (2) NetCDF files that follow the CF convention.
>> > > (3) The NetCDF output of the wrf_interp utility.
>> > >
>> > > This file doesn't fall into any of those categories.  So
>> unfortunately,
>> > > none of the MET utilities, including pcp_combine, will be able to
>> > > read it.
>> > >
>> > > Of course, they could easily read the GRIB2 data from which
these
>> files
>> > > were generated:
>> > >    gsi-enkfCN_2017051800f022.grib2
>> > >
>> > > So unfortunately I don't have a nice, easy solution.
>> > >
>> > > John
>> > >
>> > >
>> > >
>> > > On Mon, Mar 26, 2018 at 2:55 PM, Jeff Duda via RT
<met_help at ucar.edu>
>> > > wrote:
>> > >
>> > > >
>> > > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539
>
>> > > >
>> > > > John,
>> > > > I dropped three files in there that should give reasonable
results.
>> > > >
>> > > > Jeff
>> > > >
>> > > > On Mon, Mar 26, 2018 at 2:24 PM, John Halley Gotway via RT <
>> > > > met_help at ucar.edu> wrote:
>> > > >
>> > > > > Jeff,
>> > > > >
>> > > > > Are you able to post a sample file to our anonymous ftp
site
>> > following
>> > > > > these instructions?
>> > > > >
>> > > > > https://dtcenter.org/met/users/support/met_help.php#ftp
>> > > > >
>> > > > > Thanks,
>> > > > > John
>> > > > >
>> > > > > On Mon, Mar 26, 2018 at 12:29 PM, Jeff Duda via RT <
>> > met_help at ucar.edu>
>> > > > > wrote:
>> > > > >
>> > > > > >
>> > > > > > <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
>> > > > > >
>> > > > > > Thanks, John, but I am getting an error no matter which
set of
>> > three
>> > > > > files
>> > > > > > I try:
>> > > > > >
>> > > > > > [jdduda at schooner2 201705180000]$
/home/jdduda/MODE/pcp_combine
>> > -add
>> > > > > > m01_2017051800f021.nc 'name="MXUPHL_P8_2L103_GLC0_max";
>> > > > level="(*,*)";'
>> > > > > \
>> > > > > >
>> > > > > > > m01_2017051800f022.nc 'name="MXUPHL_P8_2L103_GLC0_max";
>> > > > > level="(*,*)";'
>> > > > > > \
>> > > > > >
>> > > > > > > m01_2017051800f023.nc 'name="MXUPHL_P8_2L103_GLC0_max";
>> > > > > level="(*,*)";'
>> > > > > > \
>> > > > > >
>> > > > > > > 2017051823_03UH.nc -name max_UH_03h
>> > > > > >
>> > > > > > DEBUG 1: Reading input file: m01_2017051800f021.nc
>> > > > > >
>> > > > > > ERROR  :
>> > > > > >
>> > > > > > ERROR  : get_field() -> can't open data file "
>> > m01_2017051800f021.nc"
>> > > > > >
>> > > > > > ERROR  :
>> > > > > >
>> > > > > > Jeff
>> > > > > >
>> > > > > > On Mon, Mar 26, 2018 at 9:49 AM, John Halley Gotway via
RT <
>> > > > > > met_help at ucar.edu> wrote:
>> > > > > >
>> > > > > > > Jeff,
>> > > > > > >
>> > > > > > > The -sum option is intended to be used for summing up
>> > > precipitation.
>> > > > > The
>> > > > > > > logic for this is based on the input and output
accumulation
>> > > > intervals.
>> > > > > > > For example, if your input data has a 1-hour accumulation and
>> > > > > > > you'd like an output 12-hour accumulation, pcp_combine would
>> > > > > > > go hunting for 12 files (since 12 / 1 = 12) with the right
>> > > > > > > timing information.
>> > > > > > >
>> > > > > > > Your data does not contain an accumulation interval... so the
>> > > > > > > -sum command will not be useful.
>> > > > > > >
>> > > > > > > However, you could still use the "-add" option,
explicitly
>> > listing
>> > > > the
>> > > > > > data
>> > > > > > > you want to select from each input file:
>> > > > > > >
>> > > > > > > pcp_combine -add \
>> > > > > > > m01_2017050100f001.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
>> > > > > > > m01_2017050100f002.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
>> > > > > > > m01_2017050100f003.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
>> > > > > > > 03hr_UH.nc -name max_UH_03h
>> > > > > > >
>> > > > > > > Does that produce the desired result?
>> > > > > > >
>> > > > > > > Thanks,
>> > > > > > > John
>> > > > > > >
>> > > > > > > On Sat, Mar 24, 2018 at 3:10 PM, Jeff Duda via RT <
>> > > met_help at ucar.edu
>> > > > >
>> > > > > > > wrote:
>> > > > > > >
>> > > > > > > >
>> > > > > > > > Sat Mar 24 15:10:37 2018: Request 84539 was acted
upon.
>> > > > > > > > Transaction: Ticket created by jeffduda319 at gmail.com
>> > > > > > > >        Queue: met_help
>> > > > > > > >      Subject: using pcp_combine -sum for custom
netcdf files
>> > > > > > > >        Owner: Nobody
>> > > > > > > >   Requestors: jeffduda319 at gmail.com
>> > > > > > > >       Status: new
>> > > > > > > >  Ticket <URL: https://rt.rap.ucar.edu/rt/
>> > > > > Ticket/Display.html?id=84539
>> > > > > > >
>> > > > > > > >
>> > > > > > > >
>> > > > > > > > I have a large number of forecast netcdf files
(format from
>> > > ncdump
>> > > > -h
>> > > > > > > shown
>> > > > > > > > below) in a directory, and I wish to run pcp_combine
to sum
>> > > updraft
>> > > > > > > > helicity tracks across subsets of the files in that
>> directory.
>> > > But
>> > > > > the
>> > > > > > > > files have custom names and structure, so my attempts to
>> > > > > > > > do this have resulted in an error:
>> > > > > > > >
>> > > > > > > > ERROR  : sum_data_files() -> Cannot find a file with
a valid
>> > time
>> > > > of
>> > > > > > > > 20170501_030000 and accumulation time of 010000
matching the
>> > > > regular
>> > > > > > > > expression ".*"
>> > > > > > > >
>> > > > > > > > The command I used was
>> > > > > > > >
>> > > > > > > > ./pcp_combine -sum 20170501_00 01 20170501_03 03
03hr_UH.nc
>> > > -pcpdir
>> > > > > > > > /condo/map/jdduda/HWT2017/201705010000/ -field
>> > > > > > MXUPHL_P8_2L103_GLC0_max
>> > > > > > > > -name max_UH_03h
>> > > > > > > >
>> > > > > > > > where the pcpdir is full of files with names like
>> > > > > > > > -rw-r--r--. 1 jdduda map 21775984 Mar  5 14:48
>> > > > > > > > /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f000.nc
>> > > > > > > > [... identical entries for m01_2017050100f001.nc through m01_2017050100f036.nc elided ...]
>> > > > > > > >
>> > > > > > > > Here 'm01' represents ensemble member number 1; there are 10 such ensemble members. So you can see the regularity in the file naming.
>> > > > > > > >
>> > > > > > > > The structure of the files is all the same and includes:
>> > > > > > > > [jdduda at schooner1 MODE]$ ncdump -h /condo/map/jdduda/HWT2017/201705010000/m01_2017050100f001.nc
>> > > > > > > > netcdf m01_2017050100f001 {
>> > > > > > > > dimensions:
>> > > > > > > >         ygrid_0 = 1120 ;
>> > > > > > > >         xgrid_0 = 1620 ;
>> > > > > > > >         lv_ISBL0 = 6 ;
>> > > > > > > >         lv_ISBL1 = 5 ;
>> > > > > > > >         lv_HTGL2 = 2 ;
>> > > > > > > >         lv_HTGL3 = 2 ;
>> > > > > > > >         lv_ISBL4 = 5 ;
>> > > > > > > >         lv_SPDL5 = 3 ;
>> > > > > > > >         lv_HTGL6 = 2 ;
>> > > > > > > >         lv_HTGL7 = 2 ;
>> > > > > > > > variables:
>> > > > > > > >          ...
>> > > > > > > >         float MXUPHL_P8_2L103_GLC0_max(ygrid_0, xgrid_0) ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:initial_time = "05/01/2017 (00:00)" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:forecast_time_units = "hours" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:forecast_time = 1 ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:statistical_process_duration = "initial time to forecast time" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:type_of_statistical_processing = "Maximum" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:level = 5000.f, 2000.f ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:level_type = "Specified height level above ground (m)" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:parameter_template_discipline_category_number = 8, 0, 7, 199 ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:parameter_discipline_and_category = "Meteorological products, Thermodynamic stability indices" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:grid_type = "Lambert Conformal can be secant or tangent, conical or bipolar" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:coordinates = "gridlat_0 gridlon_0" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:_FillValue = 9.999e+20f ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:units = "m2/s2" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:long_name = "Hourly maximum of updraft helicity over layer 2km to 5 km AGL" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:production_status = "Operational products" ;
>> > > > > > > >                 MXUPHL_P8_2L103_GLC0_max:center = "US National Weather Service - NCEP (WMC)" ;
>> > > > > > > >           ...
>> > > > > > > >
>> > > > > > > > // global attributes:
>> > > > > > > >                 :creation_date = "Mon Mar  5 14:48:10 CST 2018" ;
>> > > > > > > >                 :NCL_Version = "6.2.0" ;
>> > > > > > > >                 :system = "Linux schooner1.oscer.ou.edu 3.10.0-693.11.6.el7.x86_64 #1 SMP Thu Jan 4 01:06:37 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux" ;
>> > > > > > > >                 :Conventions = "None" ;
>> > > > > > > >                 :grib_source = "gsi-enkfCN_2017050100f001.grib2" ;
>> > > > > > > >                 :title = "NCL: convert-GRIB-to-netCDF" ;
>> > > > > > > >
>> > > > > > > > There are other arrays in these files, but I omitted their entries for simplicity. Is there something wrong with the structure of these files that makes pcp_combine not work? Otherwise, what should I use for -pcprx to do the following: sum three consecutive 1-hourly files into 3-hourly files (so let's start with f000-f003)?
>> > > > > > > >
>> > > > > > > > Jeff Duda
>> > > > > > > > --
>> > > > > > > > Jeff Duda, Research Scientist
>> > > > > > > >
>> > > > > > > > University of Colorado Boulder
>> > > > > > > >
>> > > > > > > > Cooperative Institute for Research in Environmental
Sciences
>> > > > > > > >
>> > > > > > > > NOAA/OAR/ESRL/Global Systems Division
>> > > > > > > > Boulder, CO
>> > > > > > > >



--
Jeff Duda, Research Scientist

University of Colorado Boulder

Cooperative Institute for Research in Environmental Sciences

NOAA/OAR/ESRL/Global Systems Division
Boulder, CO

------------------------------------------------
Subject: using pcp_combine -sum for custom netcdf files
From: John Halley Gotway
Time: Tue Mar 27 09:41:18 2018

Jeff,

OK, great, thanks for confirming.  I'll go ahead and resolve this ticket.

Thanks,
John

On Mon, Mar 26, 2018 at 4:46 PM, Jeff Duda via RT <met_help at ucar.edu>
wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
>
> Nevermind. I figured it out! Thanks!
>
> Jeff
>
> On Mon, Mar 26, 2018 at 4:43 PM, Jeff Duda <jeffduda319 at gmail.com>
wrote:
>
> > John,
> > Okay, I tested this quickly and got it to sum things from three GRIB2
> > files. But how do I tell pcp_combine which specific array to combine? It
> > seems to have summed the precip array by default.
> >
> > Jeff
> >
> > On Mon, Mar 26, 2018 at 4:36 PM, John Halley Gotway via RT <
> > met_help at ucar.edu> wrote:
> >
> >> Yes, sure.  pcp_combine should work fine on the GRIB2.
> >>
> >> John
> >>
> >> On Mon, Mar 26, 2018 at 4:35 PM, Jeff Duda via RT
<met_help at ucar.edu>
> >> wrote:
> >>
> >> >
> >> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> >> >
> >> > John,
> >> > So I could run pcp_combine on the originating GRIB2 file and it would work?
> >> > I do still have access to those original files.
> >> >
> >> > Jeff
> >> >
> >> > On Mon, Mar 26, 2018 at 4:33 PM, John Halley Gotway via RT <
> >> > met_help at ucar.edu> wrote:
> >> >
> >> > > Jeff,
> >> > >
> >> > > OK, so this is a flavor of NetCDF that MET doesn't know how to handle.  MET
> >> > > reads NetCDF files in one of 3 formats:
> >> > > (1) The NetCDF output of the MET tools themselves.
> >> > > (2) NetCDF files that follow the CF convention.
> >> > > (3) The NetCDF output of the wrf_interp utility.
> >> > >
> >> > > This file doesn't fall into any of those categories.  So unfortunately,
> >> > > none of the MET utilities, including pcp_combine, will be able to read it.
> >> > >
> >> > > Of course, they could easily read the GRIB2 data from which these
> >> > > files were generated:
> >> > >    gsi-enkfCN_2017051800f022.grib2
> >> > >
> >> > > So unfortunately I don't have a nice, easy solution.
> >> > >
> >> > > John
> >> > >
> >> > >
> >> > >
> >> > > On Mon, Mar 26, 2018 at 2:55 PM, Jeff Duda via RT <
> met_help at ucar.edu>
> >> > > wrote:
> >> > >
> >> > > >
> >> > > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> >> > > >
> >> > > > John,
> >> > > > I dropped three files in there that should give reasonable results.
> >> > > >
> >> > > > Jeff
> >> > > >
> >> > > > On Mon, Mar 26, 2018 at 2:24 PM, John Halley Gotway via RT
<
> >> > > > met_help at ucar.edu> wrote:
> >> > > >
> >> > > > > Jeff,
> >> > > > >
> >> > > > > Are you able to post a sample file to our anonymous ftp site
> >> > > > > following these instructions?
> >> > > > >
> >> > > > > https://dtcenter.org/met/users/support/met_help.php#ftp
> >> > > > >
> >> > > > > Thanks,
> >> > > > > John
> >> > > > >
> >> > > > > On Mon, Mar 26, 2018 at 12:29 PM, Jeff Duda via RT <
> >> > met_help at ucar.edu>
> >> > > > > wrote:
> >> > > > >
> >> > > > > >
> >> > > > > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> >> > > > > >
> >> > > > > > Thanks, John, but I am getting an error no matter which set of
> >> > > > > > three files I try:
> >> > > > > >
> >> > > > > > [jdduda at schooner2 201705180000]$ /home/jdduda/MODE/pcp_combine -add \
> >> > > > > > m01_2017051800f021.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> >> > > > > > m01_2017051800f022.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> >> > > > > > m01_2017051800f023.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> >> > > > > > 2017051823_03UH.nc -name max_UH_03h
> >> > > > > >
> >> > > > > > DEBUG 1: Reading input file: m01_2017051800f021.nc
> >> > > > > >
> >> > > > > > ERROR  :
> >> > > > > >
> >> > > > > > ERROR  : get_field() -> can't open data file "m01_2017051800f021.nc"
> >> > > > > >
> >> > > > > > ERROR  :
> >> > > > > >
> >> > > > > > Jeff
> >> > > > > >
> >> > > > > > On Mon, Mar 26, 2018 at 9:49 AM, John Halley Gotway via
RT <
> >> > > > > > met_help at ucar.edu> wrote:
> >> > > > > >
> >> > > > > > > Jeff,
> >> > > > > > >
> >> > > > > > > The -sum option is intended to be used for summing up precipitation.
> >> > > > > > > The logic for this is based on the input and output accumulation
> >> > > > > > > intervals. For example, if your input data has a 1-hour accumulation
> >> > > > > > > and you'd like an output 12-hour accumulation, pcp_combine would go
> >> > > > > > > hunting for 12 files (since 12 / 1 = 12) with the right timing
> >> > > > > > > information.
> >> > > > > > >
> >> > > > > > > Your data does not contain an accumulation interval... so the -sum
> >> > > > > > > command will not be useful.
> >> > > > > > >
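[Editor's note: to spell out the arithmetic John describes for Jeff's original command (1-hourly input, 3-hour output valid at 20170501_03), here is a small sketch of the file search pcp_combine performs. The variable names and the loop are my own illustration, not MET source code.]

```shell
# My illustration of the -sum file search, not MET code.
in_accum=1    # input accumulation interval, hours
out_accum=3   # requested output accumulation, hours
valid_hour=3  # valid hour of the output field (20170501_03)

# pcp_combine needs out_accum / in_accum files with matching timing info.
n_files=$(( out_accum / in_accum ))
echo "pcp_combine looks for $n_files files:"
i=0
while [ "$i" -lt "$n_files" ]; do
    printf '  valid 20170501_%02d0000, accumulation 010000\n' $(( valid_hour - i ))
    i=$(( i + 1 ))
done
```

This is exactly the combination the original error message says it could not find: no file in the directory carries a valid time plus a 1-hour accumulation interval that MET recognizes.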
> >> > > > > > > However, you could still use the "-add" option, explicitly listing
> >> > > > > > > the data you want to select from each input file:
> >> > > > > > >
> >> > > > > > > pcp_combine -add \
> >> > > > > > > m01_2017050100f001.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> >> > > > > > > m01_2017050100f002.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> >> > > > > > > m01_2017050100f003.nc 'name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";' \
> >> > > > > > > 03hr_UH.nc -name max_UH_03h
> >> > > > > > >
> >> > > > > > > Does that produce the desired result?
> >> > > > > > >
> >> > > > > > > Thanks,
> >> > > > > > > John
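[Editor's note: to extend John's -add example across the whole ensemble, a loop like the following would print one such command per member. The loop, the output file names, and the idea of echoing rather than executing are my own sketch; adjust paths and names as needed.]

```shell
# Sketch: print a pcp_combine -add command for each of the 10 ensemble
# members, summing f001-f003 into one 3-hour UH field per member.
# Echoes the commands instead of running them, so neither pcp_combine
# nor the data files are needed to try it.
dir=/condo/map/jdduda/HWT2017/201705010000
fld='name="MXUPHL_P8_2L103_GLC0_max"; level="(*,*)";'
for m in 01 02 03 04 05 06 07 08 09 10; do
    cmd="pcp_combine -add"
    for f in 001 002 003; do
        cmd="$cmd $dir/m${m}_2017050100f${f}.nc '$fld'"
    done
    cmd="$cmd m${m}_03hr_UH.nc -name max_UH_03h"
    echo "$cmd"
done
```

Piping the output through `sh` (or replacing the echo with the command itself) would actually run the combines once pcp_combine is on the PATH.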
> >> > > > > > >
> >> > > > > > > On Sat, Mar 24, 2018 at 3:10 PM, Jeff Duda via RT <
> >> > > met_help at ucar.edu
> >> > > > >
> >> > > > > > > wrote:
> >> > > > > > >
> >> > > > > > > >
> >> > > > > > > > Sat Mar 24 15:10:37 2018: Request 84539 was acted
upon.
> >> > > > > > > > Transaction: Ticket created by jeffduda319 at gmail.com
> >> > > > > > > >        Queue: met_help
> >> > > > > > > >      Subject: using pcp_combine -sum for custom netcdf files
> >> > > > > > > >        Owner: Nobody
> >> > > > > > > >   Requestors: jeffduda319 at gmail.com
> >> > > > > > > >       Status: new
> >> > > > > > > >  Ticket <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=84539 >
> >> > > > > > > >
> >> > > > > > > >
> >> > > > > > > >
> >> > > > > > > > Jeff Duda

------------------------------------------------


More information about the Met_help mailing list