[Met_help] [rt.rap.ucar.edu #96501] History for core dump with gen_vx_mask and grid_stat

John Halley Gotway via RT met_help at ucar.edu
Thu Oct 1 14:32:56 MDT 2020


----------------------------------------------------------------
  Initial Request
----------------------------------------------------------------

Folks - I have noticed that if I regrid a GRIB file from 0.25 deg to 1.0 deg with *remapbil,r360x181* (using the cdo tool), both gen_vx_mask and grid_stat core dump. I was wondering if this has happened before. Any help would be greatly appreciated. Thanks!

Efren A. Serra (Contractor)
Physicist

DeVine Consulting, Inc.
Naval Research Laboratory
Marine Meteorology Division
7 Grace Hopper Ave., STOP 2
Monterey, CA 93943
Code 7542
Mobile: 408-425-5027



----------------------------------------------------------------
  Complete Ticket History
----------------------------------------------------------------

Subject: core dump with gen_vx_mask and grid_stat
From: John Halley Gotway
Time: Mon Aug 31 13:38:57 2020

Hi Efren,

Well, core dumps are never an acceptable outcome. Whenever that happens, it indicates there's a problem that should be fixed. We should enhance the MET tools to do more error checking and exit with a useful error message rather than core dumping. Of course, if the core dump is actually coming from one of the dependent libraries (like NetCDF or something), then we have less control over it.

If possible, please send me the command you ran to generate the core dump and then post the file that caused it to our anonymous ftp site following these directions:

http://dtcenter.org/community-code/model-evaluation-tools-met/met-help-desk#ftp

Hopefully I'll be able to replicate this behavior and figure out what's going on.

Thanks,
John Halley Gotway


------------------------------------------------
Subject: core dump with gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Mon Aug 31 17:40:00 2020

John - I shall do that; what's your anonymous ftp site? Here is a little context on the core dump. Attached are a log file and the GridStatConfig file.

1] command line

grid_stat /omar_backup/leo_backup/serra/data_repos/ww3tcofcl-evaluation/dynamic/2018092900/WW3NAVGEM_prb/US058GOCN-GR1mdl.0050_0240_12000U0RL2018092900_0001_000000-000000prob_sig_wav_ht_gt12ft /omar_backup/leo_backup/serra/data_repos/ww3tcofcl-evaluation/dynamic/2018100400/WW3TCOFCL_det/US058GOCN-GR1mdl.0095_0200_00000F0RL2018100400_0001_000000-000000sig_wav_ht /omar_backup/leo_backup/serra/ww3tcofcl-evaluation/wp302018/WW3NAVGEM-WW3TCOFCL_fcst_cat_thresh_10/GridStatConfig_wp302018_gt12ft_WW3NAVGEM-WW3TCOFCL.2018100400_2018092900-120 -outdir /omar_backup/leo_backup/serra/ww3tcofcl-evaluation/wp302018/WW3NAVGEM-WW3TCOFCL_fcst_cat_thresh_10 -v 4

2] probability field (forecast input to grid_stat):
/omar_backup/leo_backup/serra/data_repos/ww3tcofcl-evaluation/dynamic/2018092900/WW3NAVGEM_prb/US058GOCN-GR1mdl.0050_0240_12000U0RL2018092900_0001_000000-000000prob_sig_wav_ht_gt12ft

wgrib -V output:

Undefined parameter table (center 58-0 table 3), using NCEP-opn
rec 1:0:date 2018092900 MFLX kpds5=172 kpds6=1 kpds7=0 levels=(0,0) grid=240 sfc 120hr fcst: bitmap: 29430 undef
  MFLX=Momentum flux [N/m^2]
  timerange 0 P1 120 P2 0 TimeU 1  nx 360 ny 181 GDS grid 0 num_in_ave 0 missing 0
  center 58 subcenter 0 process 50 Table 3 scan: WE:SN winds(N/S)
  latlon: lat  -90.000000 to 90.000000 by 1.000000  nxny 65160
          long 0.000000 to -1.000000 by 1.000000, (360 x 181) scan 64 mode 128 bdsgrid 1
  min/max data 0 1  num bits 7  BDS_Ref 0  DecScale 2 BinScale 0

3] observation field:
/omar_backup/leo_backup/serra/data_repos/ww3tcofcl-evaluation/dynamic/2018100400/WW3TCOFCL_det/US058GOCN-GR1mdl.0095_0200_00000F0RL2018100400_0001_000000-000000sig_wav_ht

wgrib -V output:

rec 1:0:date 2018100400 HTSGW kpds5=100 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc anl: bitmap: 29874 undef
  HTSGW=Sig height of wind waves and swell [m]
  timerange 0 P1 0 P2 0 TimeU 1  nx 360 ny 180 GDS grid 0 num_in_ave 0 missing 0
  center 58 subcenter 0 process 95 Table 3 scan: WE:SN winds(N/S)
  latlon: lat  -89.500000 to 89.500000 by 1.000000  nxny 64800
          long 0.000000 to 359.000000 by 1.000000, (360 x 180) scan 64 mode 128 bdsgrid 1
  min/max data 0 13.87  num bits 18  BDS_Ref 0  DecScale 0 BinScale -14





------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] core dump with gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Mon Aug 31 17:40:25 2020

I'm reading about the anonymous ftp; sorry!




------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] core dump with gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Tue Sep 01 08:03:23 2020

John - Files are in my ftp directory; listing below:

ftp> ls -lrt
227 Entering Passive Mode (128,117,14,132,196,154).
150 Opening ASCII mode data connection for file list
-rw-r--r--   1 ftp      ftp         39538 Sep  1 14:02 US058GOCN-GR1mdl.0050_0240_12000U0RL2018092900_0001_000000-000000prob_sig_wav_ht_gt12ft
-rw-r--r--   1 ftp      ftp         86776 Sep  1 14:02 US058GOCN-GR1mdl.0095_0200_00000F0RL2018100400_0001_000000-000000sig_wav_ht
-rw-r--r--   1 ftp      ftp        270598 Sep  1 14:02 wp302018-2018100400.nc
-rw-r--r--   1 ftp      ftp          3254 Sep  1 14:02 GridStatConfig_wp302018_gt12ft_WW3NAVGEM-WW3TCOFCL.2018100400_2018092900-120
226 Transfer complete
ftp> pwd
257 "/incoming/irap/met_help/serra" is the current directory




------------------------------------------------
Subject: core dump with gen_vx_mask and grid_stat
From: John Halley Gotway
Time: Tue Sep 01 09:52:15 2020

Efren,

Thanks for sending your sample data and commands. I see from your configuration file that you're using MET version 8.0 from Sept 2018. We released MET version 9.1 this August. I'll start by debugging with version 8.0 (plus all posted bugfixes) to see if I can replicate the problem. If I can, I'll then check to see if the problem still exists in version 9.1.

I started by running the plot_data_plane tool (version 9.1) on both input datasets:

*plot_data_plane US058GOCN-GR1mdl.0095_0200_00000F0RL2018100400_0001_000000-000000sig_wav_ht sig_wav_ht.ps 'name="HTSGW"; level="L0";'*

*plot_data_plane US058GOCN-GR1mdl.0050_0240_12000U0RL2018092900_0001_000000-000000prob_sig_wav_ht_gt12ft prob_sig_wav_ht_gt12ft.ps 'name="MFLX"; level="L0";'*


That created the attached images, which look fine to me. It's always a good idea to start with plot_data_plane when working with new data, to make sure MET is plotting the data correctly and in the expected location.

When I run met-8.0 grid_stat with the data you sent, I get an error message about the grids not matching:


*met-8.0_bugfix/bin/grid_stat US058GOCN-GR1mdl.0050_0240_12000U0RL2018092900_0001_000000-000000prob_sig_wav_ht_gt12ft US058GOCN-GR1mdl.0095_0200_00000F0RL2018100400_0001_000000-000000sig_wav_ht GridStatConfig_wp302018_gt12ft_WW3NAVGEM-WW3TCOFCL.2018100400_2018092900-120 -outdir out -v 3 -log run_gs.log*

*DEBUG 1: Default Config File: /Volumes/d1/projects/MET/MET_releases/met-8.0_bugfix/share/met/config/GridStatConfig_default*

*DEBUG 1: User Config File: GridStatConfig_wp302018_gt12ft_WW3NAVGEM-WW3TCOFCL.2018100400_2018092900-120*

*ERROR  : *
*ERROR  : parse_vx_grid() -> The forecast and observation grids do not match:*
*ERROR  : Projection: Lat/Lon Nx: 360 Ny: 181 lat_ll: -90.000 lon_ll: -0.000 delta_lat: 1.000 delta_lon: 1.000 !=*
*ERROR  : Projection: Lat/Lon Nx: 360 Ny: 180 lat_ll: -89.500 lon_ll: -0.000 delta_lat: 1.000 delta_lon: 1.000*
*ERROR  : Specify regridding logic in the config file "regrid" section.*
*ERROR  : *
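
For illustration only, the comparison behind that error amounts to an exact, attribute-by-attribute match of the two grid projections. This is a Python sketch of that idea, not MET's actual C++ implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LatLonGrid:
    """A handful of the lat/lon projection attributes MET compares."""
    nx: int
    ny: int
    lat_ll: float
    lon_ll: float
    delta_lat: float
    delta_lon: float

# Attribute values copied from the two ERROR lines in the log
fcst = LatLonGrid(nx=360, ny=181, lat_ll=-90.0, lon_ll=-0.0, delta_lat=1.0, delta_lon=1.0)
obs  = LatLonGrid(nx=360, ny=180, lat_ll=-89.5, lon_ll=-0.0, delta_lat=1.0, delta_lon=1.0)

# A parse_vx_grid()-style check: every attribute must match exactly, so the
# differing Ny (181 vs 180) and lat_ll (-90.0 vs -89.5) make the grids unequal
print(fcst == obs)  # prints False
```

Because the match must be exact, even a half-degree offset in the first latitude is enough to require explicit "regrid" logic.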


So I modified the config file to regrid to the "OBS" grid (regrid.to_grid = OBS;), and grid_stat then ran to completion without an error.
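
For reference, a "regrid" section along those lines might look like the following sketch (the method and width values here are illustrative choices, not values taken from Efren's config file):

```
regrid = {
   to_grid = OBS;    // put the forecast data on the observation grid
   method  = BILIN;  // interpolation method (illustrative)
   width   = 2;      // interpolation width (illustrative)
}
```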
However, I do have a few suggestions:


(1) I'd recommend upgrading to met-9.1 if/when that is possible.


(2) In your config file, setting model = "WW3NAVGEM"; in the "fcst" dictionary and model = "WW3TCOFCL"; in the "obs" dictionary will have no effect on the output. Instead, I'd recommend setting the "model" and "obtype" entries at the beginning of the Grid-Stat config file to populate the MODEL and OBTYPE columns in the output.


(3) Listed below are your "mask" settings, which define the verification regions over which statistics should be computed:

mask = { grid = [""]; poly = ["/omar_backup/leo_backup/serra/ww3tcofcl-evaluation/wp302018/wp302018-2018100400.nc"]; }

The grid and poly options are arrays. I suspect that you want 0 grid entries, but you currently have 1, and it's set to an empty string. Try specifying it like this instead:

mask = { grid = [ ]; poly = ["/omar_backup/leo_backup/serra/ww3tcofcl-evaluation/wp302018/wp302018-2018100400.nc"]; }

That should produce output only for the single polyline region you defined.


(4) Looking in wp302018-2018100400.nc, I see that the variable is named "box_mask":

float box_mask(lat, lon) ;

That causes the VX_MASK column to be populated with "box_mask" in the output. I'd recommend choosing a more descriptive name for that region. When you run gen_vx_mask, you can add the "-name" command line option to define the name of the output variable, and whatever name you choose will then show up in the VX_MASK column of the Grid-Stat output.
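
A hedged sketch of such a gen_vx_mask call, where the input grid file, polyline file, and region name are hypothetical placeholders:

```
gen_vx_mask input_1deg.grb wp302018_box.poly wp302018-2018100400.nc \
    -type poly -name WP302018_BOX
```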


Perhaps changing the mask.grid setting or upgrading to MET version 9.1 will resolve the behavior you're experiencing. If not, please let me know and I can try to help you debug this further remotely.


Thanks,
John

------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] core dump with gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Tue Sep 01 10:45:41 2020

John - I requested that MET 9.1 be installed; I'm waiting for it and shall let you know ASAP. Thanks!




------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] core dump with gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Tue Sep 01 10:48:45 2020

Also, the reason I had 360x180 in the probability file is that I noticed it fixed the core dump when I was using gen_vx_mask to create a mask file. I obtain the 360x180 probability field via the cdo command line "cdo -f grb1 remapbil,r360x180 ...". I think upgrading to MET v9.1 is the ticket. Let me do that, but let's keep this issue open. Thanks!
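
Spelled out with hypothetical input and output file names, that cdo regridding step would look like:

```
cdo -f grb1 remapbil,r360x180 sig_wav_ht_0p25deg.grb sig_wav_ht_1deg.grb
```

Here remapbil requests bilinear remapping onto the named r360x180 global grid, and -f grb1 keeps the output in GRIB1 format.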




------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] core dump with gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Tue Sep 01 10:56:52 2020

John - My IT guy (he's my personal IT guy) updated the MET tools to v9.0.2, and I used the lower resolution sig_wav_ht field at r360x181 (the same resolution as the probability field) and it *DID NOT* core dump. Freaking beautiful! Should we keep v9.0.2 or push to v9.1? Thanks for the help, mate; Buck is gonna be happy!




------------------------------------------------
Subject: core dump with gen_vx_mask and grid_stat
From: John Halley Gotway
Time: Tue Sep 01 11:08:17 2020

Great, glad to hear that upgrading to met-9.0.2 did the trick! There actually was a met-9.0.3 bugfix release, with release notes here:
http://dtcenter.org/community-code/model-evaluation-tools-met/met-version-9-0-3#notes

As for updating to met-9.1, it's really up to you. Here's a link to the met-9.1 release notes; we've switched to using GitHub pages for the met-9.1 release:
https://dtcenter.github.io/MET/Users_Guide/release-notes.html

So you can review those changes and see if anything jumps out at you as being useful. If updating to met-9.1 is easy enough, I'd recommend doing so.

Thanks,
John

On Tue, Sep 1, 2020 at 10:57 AM efren.serra.ctr at nrlmry.navy.mil via RT
<
met_help at ucar.edu> wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=96501 >
>
> John - My IT guy (he's my personal IT guy) updated MET tools to v 9.0.2
> and I used the lower resolution sig_wav_ht field at r360x181 (same
> resolution as the probability field) and it *DID NOT* core dump. Freaking
> beautiful! Should we keep v9.0.2 or push to v9.1? Thanks for the help
> mate, Buck is gonna be happy!
>

------------------------------------------------
Subject: core dump wtih from gen_vx_mask and grid_stat
From: Sampson, Mr. Buck
Time: Tue Sep 01 11:14:46 2020

Thanks John.  We heart MET.

Buck




------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] Resolved: core dump wtih from gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Wed Sep 02 13:51:29 2020

John - We updated to MET v9.1 and I'm getting core dumped; I'm going
to upload files as you indicated earlier. Thanks!

-----Original Message-----
From: John Halley Gotway via RT <met_help at ucar.edu>
Sent: Tuesday, September 1, 2020 10:23 AM
To: Serra, Mr. Efren, Contractor, Code 7531
<efren.serra.ctr at nrlmry.navy.mil>
Subject: [rt.rap.ucar.edu #96501] Resolved: core dump wtih from
gen_vx_mask and grid_stat

According to our records, your request has been resolved. If you have
any further questions or concerns, please respond to this message.


------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] core dump wtih from gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Wed Sep 02 13:56:34 2020

Here are the latest files:

US058GOCN-GR1mdl.0050_0240_00000U0RL2018102300_0001_000000-
000000prob_sig_wav_ht_gt12ft
-rw-r--r--   1 ftp      ftp         82192 Sep  2 19:55 US058GOCN-
GR1mdl.0095_0200_00000F0RL2018102300_0001_000000-000000sig_wav_ht
-rw-r--r--   1 ftp      ftp        270600 Sep  2 19:55 wp312018-
2018102300.nc
-rw-r--r--   1 ftp      ftp          3252 Sep  2 19:55
GridStatConfig_wp312018_gt12ft_WW3NAVGEM-
WW3TCOFCL.2018102300_2018102300-000

I am getting a core dump when using grid_stat from MET v9.1.


------------------------------------------------
Subject: core dump wtih from gen_vx_mask and grid_stat
From: John Halley Gotway
Time: Fri Sep 04 12:06:44 2020

Efren,

Sorry to hear that you're having more trouble with met-9.1 on this
dataset.
I did pull the updated files and run with the configuration you sent.

I was able to run without error, although there still was a warning. Let
me address that warning first...

grid_stat \

US058GOCN-GR1mdl.0050_0240_00000U0RL2018102300_0001_000000-
000000prob_sig_wav_ht_gt12ft
\

US058GOCN-GR1mdl.0095_0200_00000F0RL2018102300_0001_000000-
000000sig_wav_ht
\

GridStatConfig_wp312018_gt12ft_WW3NAVGEM-
WW3TCOFCL.2018102300_2018102300-000
\
-outdir out -v 4 -log run_gs.log

WARNING:

WARNING: check_hdr_str() -> null string!

WARNING:


Please edit your config file:

FROM: mask = { grid = [""]; poly =
["/omar_backup/leo_backup/serra/ww3tcofcl-evaluation/wp312018/
wp312018-2018102300.nc"]; }

TO:       mask = { grid = []; poly =
["/omar_backup/leo_backup/serra/ww3tcofcl-evaluation/wp312018/
wp312018-2018102300.nc"]; }


That empty string is being treated as a second masking region with no
name... which results in that "null string" warning.


But I don't see any obvious, repeatable source of a segfault here. If
this
were GRIB2 data, I'd suspect it would have something to do with the
compilation of the grib2c library.  But this is NOT GRIB2 data, it's
GRIB1.


We could try narrowing this down by running plot_data_plane on each of
the
input files to see if MET is having trouble reading any of the input
datasets. Do all 3 of these plotting commands run OK for you?


plot_data_plane
US058GOCN-GR1mdl.0050_0240_00000U0RL2018102300_0001_000000-
000000prob_sig_wav_ht_gt12ft
MFLX.ps 'name="MFLX"; level="Z0";'

plot_data_plane
US058GOCN-GR1mdl.0095_0200_00000F0RL2018102300_0001_000000-
000000sig_wav_ht
HTSGW.ps 'name="HTSGW"; level="Z0";'

plot_data_plane wp312018-2018102300.nc box_mask.ps 'name="box_mask";
level="(*,*)";'
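
To make a crashing command easy to spot, the three checks above could be wrapped in a small loop that reports each command's exit status (on Linux, a segfault typically shows up as status 139). This is only a sketch: the 'true'/'false' commands below are stand-ins for the three plot_data_plane calls, chosen so the loop itself is runnable anywhere:

```shell
# Sketch: run a list of commands and print each exit status so a
# crash stands out. 'true'/'false' stand in for the real
# plot_data_plane commands above.
for cmd in true true false; do
  "$cmd"
  echo "$cmd exited with status $?"
done
```

Substituting the real commands for the placeholders would show immediately which input file, if any, triggers the segfault.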


Thanks,
John

On Wed, Sep 2, 2020 at 3:39 PM efren.serra.ctr at nrlmry.navy.mil via RT
<
met_help at ucar.edu> wrote:

>
> Wed Sep 02 15:38:53 2020: Request 96548 was acted upon.
> Transaction: Ticket created by efren.serra.ctr at nrlmry.navy.mil
>        Queue: met_help
>      Subject: core dump with MET v9.1 grid_stat
>        Owner: Nobody
>   Requestors: efren.serra.ctr at nrlmry.navy.mil
>       Status: new
>  Ticket <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=96548 >
>
>
> Hi - we installed MET v9.1 and I got core dumped when invoking
grid_stat
> on two GRIB1 files. I placed these and auxiliary files at
> incoming/irap/met_help/serra
>
> Efren A. Serra (Contractor)
> Physicist
>
> DeVine Consulting, Inc.
> Naval Research Laboratory
> Marine Meteorology Division
> 7 Grace Hopper Ave., STOP 2
> Monterey, CA 93943
> Code 7542
> Mobile: 408-425-5027
>
>
>

------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] Re: [rt.rap.ucar.edu #96548] core dump with MET v9.1 grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Tue Sep 08 11:24:12 2020

John - Thanks for correction on my grid_stat config file; I just got
back from a little holiday and I'm going down your recommendations
below.


------------------------------------------------
Subject: core dump wtih from gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Tue Sep 08 12:56:31 2020

John - The plot_data_plane on sig_wav_ht gave a core dump. I should
point out that the sig_wav_ht fields are 1.0 deg GRIB1 fields derived
from .25 deg GRIB1 fields tapered down via cdo with this command: cdo
-f grb1 remapbil,r360x181
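
[For reference, the cdo command above is truncated in the original message; remapbil normally also takes input and output file arguments. A hypothetical completed form, with placeholder file names, would look like:]

```shell
# Hypothetical completion of the truncated cdo command; the input and
# output GRIB1 file names are placeholders, not the actual files.
CDO_CMD="cdo -f grb1 remapbil,r360x181 sig_wav_ht_0.25deg.grb sig_wav_ht_1.0deg.grb"
echo "$CDO_CMD"
```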


------------------------------------------------
Subject: core dump wtih from gen_vx_mask and grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Tue Sep 08 12:58:23 2020




------------------------------------------------
Subject: core dump wtih from gen_vx_mask and grid_stat
From: John Halley Gotway
Time: Tue Sep 08 14:28:38 2020

Efren,

So to be crystal clear, this is the exact input data file:
ftp://ftp.rap.ucar.edu/incoming/irap/met_help/serra/US058GOCN-
GR1mdl.0095_0200_00000F0RL2018102300_0001_000000-000000sig_wav_ht

And running that through plot_data_plane results in a segfault?

plot_data_plane \
US058GOCN-GR1mdl.0095_0200_00000F0RL2018102300_0001_000000-
000000sig_wav_ht
\
HTSGW.ps 'name="HTSGW"; level="Z0";'

I wish I was getting the same segfault as you so I could debug it. My
next
step would be checking that MET can plot the INPUT to the cdo command
you're running... so the same command as above, but using whatever the
0.25
degree input file is named.

As a side-note, MET could actually do this regridding to a 1.0 degree
grid
for you on the fly instead of needing to call cdo. You'd specify that
in
the "regrid" dictionary of the Grid-Stat config file.
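
A minimal sketch of that "regrid" dictionary follows. The values shown are illustrative assumptions, not the ticket's actual configuration; consult the MET User's Guide for the full option list and the exact grid-specification syntax:

```
// Illustrative Grid-Stat "regrid" settings: bilinear interpolation
// (matching cdo's remapbil) onto the observation grid.
regrid = {
   to_grid    = OBS;     // or an explicit spec such as
                         // "latlon 360 181 -90.0 0.0 1.0 1.0"
   method     = BILIN;   // bilinear interpolation
   width      = 2;       // 2x2 neighborhood for BILIN
   vld_thresh = 0.5;     // fraction of valid input points required
}
```

With to_grid = OBS, Grid-Stat regrids the forecast field to the observation grid on the fly, so the separate cdo step would not be needed.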

John



On Tue, Sep 8, 2020 at 1:07 PM efren.serra.ctr at nrlmry.navy.mil via RT
<
met_help at ucar.edu> wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=96501 >
>
> John - The plot_data_plane on sig_wav_ht gave a core dump. I should
point
> out that the sig_wav_ht fields are 1.0 deg GRIB1 fields derived from
.25
> deg GRIB1 fields tapered down via cdo with this command: cdo -f grb1
> remapbil,r360x181
>
> -----Original Message-----
> From: John Halley Gotway via RT <met_help at ucar.edu>
> Sent: Friday, September 4, 2020 11:07 AM
> To: Serra, Mr. Efren, Contractor, Code 7531 <
> efren.serra.ctr at nrlmry.navy.mil>
> Subject: [rt.rap.ucar.edu #96501] Re: [rt.rap.ucar.edu #96548] core
dump
> with MET v9.1 grid_stat
>
> Efren,
>
> Sorry to hear that you're having more trouble with met-9.1 on this
> dataset. I did pull the updated files and ran with the configuration
> you sent.
>
> I was able to run without error, although there still was a warning.
> Let me address that warning first...
>
> grid_stat \
> US058GOCN-GR1mdl.0050_0240_00000U0RL2018102300_0001_000000-000000prob_sig_wav_ht_gt12ft \
> US058GOCN-GR1mdl.0095_0200_00000F0RL2018102300_0001_000000-000000sig_wav_ht \
> GridStatConfig_wp312018_gt12ft_WW3NAVGEM-WW3TCOFCL.2018102300_2018102300-000 \
> -outdir out -v 4 -log run_gs.log
>
> WARNING:
>
> WARNING: check_hdr_str() -> null string!
>
> WARNING:
>
>
> Please edit your config file:
>
> FROM: mask = { grid = [""]; poly =
> ["/omar_backup/leo_backup/serra/ww3tcofcl-evaluation/wp312018/wp312018-2018102300.nc"]; }
>
> TO:   mask = { grid = []; poly =
> ["/omar_backup/leo_backup/serra/ww3tcofcl-evaluation/wp312018/wp312018-2018102300.nc"]; }
>
>
> That empty string is being treated as a second masking region with no
> name... which results in that "null string" warning.
>
>
> But I don't see any obvious, repeatable source of a segfault here. If
> this were GRIB2 data, I'd suspect it would have something to do with
> the compilation of the grib2c library. But this is NOT GRIB2 data,
> it's GRIB1.
>
>
> We could try narrowing this down by running plot_data_plane on each of
> the input files to see if MET is having trouble reading any of the
> input datasets. Do all 3 of these plotting commands run OK for you?
>
>
> plot_data_plane
> US058GOCN-GR1mdl.0050_0240_00000U0RL2018102300_0001_000000-000000prob_sig_wav_ht_gt12ft
> MFLX.ps 'name="MFLX"; level="Z0";'
>
> plot_data_plane
> US058GOCN-GR1mdl.0095_0200_00000F0RL2018102300_0001_000000-000000sig_wav_ht
> HTSGW.ps 'name="HTSGW"; level="Z0";'
>
> plot_data_plane wp312018-2018102300.nc box_mask.ps 'name="box_mask"; level="(*,*)";'
>
>
> Thanks,
> John
>
> On Wed, Sep 2, 2020 at 3:39 PM efren.serra.ctr at nrlmry.navy.mil via
RT <
> met_help at ucar.edu> wrote:
>
> >
> > Wed Sep 02 15:38:53 2020: Request 96548 was acted upon.
> > Transaction: Ticket created by efren.serra.ctr at nrlmry.navy.mil
> >        Queue: met_help
> >      Subject: core dump with MET v9.1 grid_stat
> >        Owner: Nobody
> >   Requestors: efren.serra.ctr at nrlmry.navy.mil
> >       Status: new
> >  Ticket <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=96548
> > >
> >
> >
> > Hi - we installed MET v9.1 and I got core dumped when invoking
> > grid_stat on two GRIB1 files. I placed these and auxiliary files
at
> > incoming/irap/met_help/serra
> >
> > Efren A. Serra (Contractor)
> > Physicist
> >
> > DeVine Consulting, Inc.
> > Naval Research Laboratory
> > Marine Meteorology Division
> > 7 Grace Hopper Ave., STOP 2
> > Monterey, CA 93943
> > Code 7542
> > Mobile: 408-425-5027
> >
> >
> >
>
>
>

------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] Re: [rt.rap.ucar.edu #96548] core dump with MET v9.1 grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Tue Sep 08 19:10:06 2020

John - I shall place the .25 deg input file in my ftp directory




------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #96501] Re: [rt.rap.ucar.edu #96548] core dump with MET v9.1 grid_stat
From: efren.serra.ctr at nrlmry.navy.mil
Time: Wed Sep 09 14:55:03 2020

John - is this a valid to_grid "spec" in the regrid object:
to_grid="latlon 360 181 -90.0 0.0 1.0 1.0";




------------------------------------------------
Subject: core dump with gen_vx_mask and grid_stat
From: John Halley Gotway
Time: Wed Sep 09 15:05:48 2020

Efren,

I didn't see any new GRIB files in this FTP directory:
ftp://ftp.rap.ucar.edu/incoming/irap/met_help/serra/

But I don't actually need it... since I'm not able to replicate the
issue you're having reading data from the regridded output file. But I
would recommend that you try running plot_data_plane on both the input
and output of your call to "cdo". If MET can read the input but not the
output, then at least we've identified where the problem lies. If MET
can read neither, then there's some other issue.
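
That check could be run as two commands like the ones below; the
filenames here are placeholders, since the actual 0.25 degree input
filename isn't given in this thread:

```
# Sketch of the input/output check; substitute the real filenames.
plot_data_plane input_0p25deg.grb1  input.ps  'name="HTSGW"; level="Z0";'
plot_data_plane output_1p0deg.grb1  output.ps 'name="HTSGW"; level="Z0";'
```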

To answer your question, yes, that does look like a valid "to_grid"
string definition. Here's a selection from this page of the user's
guide: https://dtcenter.github.io/MET/Users_Guide/data_io.html

//      - to_grid = "spec"; To define a grid specified as follows:
//         - lambert Nx Ny lat_ll lon_ll lon_orient D_km R_km standard_parallel_1 [standard_parallel_2] N|S
//         - stereo Nx Ny lat_ll lon_ll lon_orient D_km R_km lat_scale N|S
//         - latlon Nx Ny lat_ll lon_ll delta_lat delta_lon
//         - mercator Nx Ny lat_ll lon_ll lat_ur lon_ur
//         - gaussian lon_zero Nx Ny

And the formatting of "latlon 360 181 -90.0 0.0 1.0 1.0" matches the spec.
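
As a quick sanity check, the fields of that spec can be unpacked with a
few lines of plain Python (nothing MET-specific) to confirm the string
describes the same global 1.0 degree grid as cdo's r360x181 target:

```python
# Unpack "latlon Nx Ny lat_ll lon_ll delta_lat delta_lon" and verify
# that 181 latitudes at 1.0 deg starting at -90 reach the North Pole,
# and 360 longitudes at 1.0 deg starting at 0 cover the globe.
spec = "latlon 360 181 -90.0 0.0 1.0 1.0"
proj, nx, ny, lat_ll, lon_ll, dlat, dlon = spec.split()
nx, ny = int(nx), int(ny)
lat_ll, lon_ll, dlat, dlon = map(float, (lat_ll, lon_ll, dlat, dlon))

lat_ur = lat_ll + (ny - 1) * dlat    # northernmost grid latitude
lon_last = lon_ll + (nx - 1) * dlon  # last longitude before wrapping

print(lat_ur, lon_last)  # 90.0 359.0
```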

John


------------------------------------------------


More information about the Met_help mailing list