[Met_help] [rt.rap.ucar.edu #99775] History for Ensemble verification

John Halley Gotway via RT met_help at ucar.edu
Wed May 12 09:58:03 MDT 2021


----------------------------------------------------------------
  Initial Request
----------------------------------------------------------------

Hello,
I am wondering how to deal with different time steps in MET. My obs. files
are every 10 minutes, but my model output is hourly average data. I want
to do ensemble verification. I know I can compute an hourly average for
the obs. data first using other tools, but is it possible to do the
ensemble verification directly in MET without that pre-processing step?
Thank you.


Binyu


----------------------------------------------------------------
  Complete Ticket History
----------------------------------------------------------------

Subject: Ensemble verification
From: John Halley Gotway
Time: Mon May 03 16:52:16 2021

Binyu,

I see that you have questions about ensemble verification when the time
steps in your model and observation data differ. I'll probably need
several more details to give you a good answer.

First, what type of observations are you verifying against? Gridded
analyses or point obs?
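One quick way to check is to look at the NetCDF header with the standard
ncdump utility (a sketch only; "sample.nc" is a placeholder for your
file). Gridded data are typically dimensioned on 2-D spatial dimensions
such as lat and lon, while point obs usually appear as a 1-D list of
records, each carrying its own lat/lon:

```shell
# Sketch: "sample.nc" is a placeholder. "ncdump -h" prints only the
# header (dimensions and variables), which usually reveals whether the
# data are laid out on a 2-D grid or as a 1-D list of point records.
# Guarded so the sketch is harmless where ncdump or the file is missing.
FILE="sample.nc"
if command -v ncdump >/dev/null 2>&1 && [ -f "${FILE}" ]; then
  ncdump -h "${FILE}" | head -40
else
  echo "ncdump or ${FILE} not available here; sketch only"
fi
```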

If they are gridded observations, I would actually recommend doing a
pre-processing step. Run them through pcp_combine using the "-derive"
option to compute the hourly mean from the 10-minute analysis data.
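For example, a call along these lines would derive the hourly mean from
six 10-minute files (a sketch only; the file names and the
"ash_mass_loading" field are placeholders, and the command is guarded so
it is harmless where the MET tools are not installed):

```shell
# Sketch: placeholder file names. "-derive mean" averages the named
# field across all input files into one gridded NetCDF output.
if command -v pcp_combine >/dev/null 2>&1; then
  pcp_combine -derive mean \
    obs_2000.nc obs_2010.nc obs_2020.nc \
    obs_2030.nc obs_2040.nc obs_2050.nc \
    -field 'name="ash_mass_loading"; level="(*,*)";' \
    hourly_mean.nc
else
  echo "pcp_combine not found; sketch only"
fi
```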

If they are point observations, you could use the obs_summary option in
the Ensemble-Stat config file. Listed below is a description of that
option taken from:
https://met.readthedocs.io/en/latest/Users_Guide/data_io.html#data-io-met-configuration-file-options

I'd recommend defining the obs_window to be +/- 30 minutes around the
ensemble valid time and then setting "obs_summary = UW_MEAN". That way,
you'll verify against the hourly mean value from each point observation
location.

Hope that helps.

Thanks,
John

//
// The "obs_summary" entry specifies how to compute statistics on
// observations that appear at a single location (lat,lon,level,elev)
// in Point-Stat and Ensemble-Stat. Eight techniques are
// currently supported:
//
//    - "NONE" to use all point observations (legacy behavior)
//    - "NEAREST" use only the observation that has the valid
//      time closest to the forecast valid time
//    - "MIN" use only the observation that has the lowest value
//    - "MAX" use only the observation that has the highest value
//    - "UW_MEAN" compute an unweighted mean of the observations
//    - "DW_MEAN" compute a weighted mean of the observations based
//      on the time of the observation
//    - "MEDIAN" use the median observation
//    - "PERC" use the Nth percentile observation where N = obs_perc_value
//
// The reporting mechanism for this feature can be activated by
// specifying a verbosity level of three or higher. The report will show
// information about where duplicates were detected and which
// observations were used in those cases.
//
obs_summary = NONE;
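In the Ensemble-Stat config file, that recommendation might look like the
following sketch (obs_window values are in seconds relative to the
forecast valid time):

```
//
// Match point obs falling within +/- 30 minutes of the ensemble valid
// time, then verify against their unweighted mean at each location.
//
obs_window = {
   beg = -1800;
   end =  1800;
};

obs_summary = UW_MEAN;
```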


On Mon, May 3, 2021 at 11:49 AM binyu.wang at noaa.gov via RT <
met_help at ucar.edu> wrote:

>
> Mon May 03 11:48:35 2021: Request 99775 was acted upon.
> Transaction: Ticket created by binyu.wang at noaa.gov
>        Queue: met_help
>      Subject: Ensemble verification
>        Owner: Nobody
>   Requestors: binyu.wang at noaa.gov
>       Status: new
>  Ticket <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=99775 >
>
>
> Hello,
> I am wondering how to deal with different time steps in MET? My obs.
files
> are every 10 minutes, but my model is hourly average data. I want to
do the
> ensemble verification. I know I can get an hourly average first for
the
> obs. data using other tools, but is it possible that I can do the
ensemble
> verification directly using MET without the prep-process? Thank you.
>
>
> Binyu
>
>

------------------------------------------------
Subject: Ensemble verification
From: binyu.wang at noaa.gov
Time: Mon May 03 20:12:19 2021

Hello John,

Thank you for your quick response. Actually, I am not sure how to
determine whether the data is point or gridded. It is satellite obs.
data, so I guess it is point obs. because it is lat/lon based? But I am
not exactly sure.

Here is the data:

/gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/Satellite


Thank you.

Binyu

On Mon, May 3, 2021 at 6:52 PM John Halley Gotway via RT
<met_help at ucar.edu>
wrote:

> Binyu,
>
> I see that you have questions about ensemble verification when the
time
> steps in your model and observation data differ. I'll probably need
several
> more details to give you a good answer.
>
> First, what type of observations are you verifying against? Gridded
> analyses or point obs?
>
> If they are gridded observations, I would actually recommend doing a
> pre-processing step. Run them through pcp_combine using the "-
derive"
> option to compute the hourly mean from the 10-minute analysis data.
>
> If they are point observations, you could use the obs_summary option
in the
> Ensemble-Stat config file. Listed below is a description of that
option
> taken from:
>
> https://met.readthedocs.io/en/latest/Users_Guide/data_io.html#data-
io-met-configuration-file-options
>
> I'd recommend defining the obs_window to +/- 30 minutes around the
ensemble
> valid time and then set "obs_summary = UW_MEAN". That way, you'll
verify
> against the hourly mean value from each point observation location.
>
> Hope that helps.
>
> Thanks,
> John
>
> //
> // The "obs_summary" entry specifies how to compute statistics on
> // observations that appear at a single location
(lat,lon,level,elev)
> // in Point-Stat and Ensemble-Stat. Eight techniques are
> // currently supported:
> //
> //    - "NONE" to use all point observations (legacy behavior)
> //    - "NEAREST" use only the observation that has the valid
> //      time closest to the forecast valid time
> //    - "MIN" use only the observation that has the lowest value
> //    - "MAX" use only the observation that has the highest value
> //    - "UW_MEAN" compute an unweighted mean of the observations
> //    - "DW_MEAN" compute a weighted mean of the observations based
> //      on the time of the observation
> //    - "MEDIAN" use the median observation
> //    - "PERC" use the Nth percentile observation where N =
obs_perc_value
> //
> // The reporting mechanism for this feature can be activated by
specifying
> // a verbosity level of three or higher. The report will show
information
> // about where duplicates were detected and which observations were
used
> // in those cases.
> //
> obs_summary = NONE;
>
>
> On Mon, May 3, 2021 at 11:49 AM binyu.wang at noaa.gov via RT <
> met_help at ucar.edu> wrote:
>
> >
> > Mon May 03 11:48:35 2021: Request 99775 was acted upon.
> > Transaction: Ticket created by binyu.wang at noaa.gov
> >        Queue: met_help
> >      Subject: Ensemble verification
> >        Owner: Nobody
> >   Requestors: binyu.wang at noaa.gov
> >       Status: new
> >  Ticket <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=99775 >
> >
> >
> > Hello,
> > I am wondering how to deal with different time steps in MET? My
obs.
> files
> > are every 10 minutes, but my model is hourly average data. I want
to do
> the
> > ensemble verification. I know I can get an hourly average first
for the
> > obs. data using other tools, but is it possible that I can do the
> ensemble
> > verification directly using MET without the prep-process? Thank
you.
> >
> >
> > Binyu
> >
> >
>
>

------------------------------------------------
Subject: Ensemble verification
From: binyu.wang at noaa.gov
Time: Fri May 07 15:26:38 2021

Hello John,

There is an update on my previous question.

Here are the new files (they are in a different format from the previous
ones):

/gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/Satellite/VOLCAT_HIMAWARI-8_FLDK_s2020295_204000_v300250_VCB_w167_FLDK_b2020295_204000_g001_pc.nc


Following the same approach Howard used before, I can run:

1.
/gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-beta5/exec/point2grid VOLCAT_HIMAWARI-8_FLDK_s2020295_204000_v300250_VCB_w167_FLDK_b2020295_204000_g001_pc.nc "latlon 150 150 45 153 0.1 0.1" test1.nc -field 'name="ash_mass_loading"; level="(0,*,*)";'


then

2.
/gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-beta5/exec/plot_data_plane test.nc test1.ps 'name="ash_mass_loading"; level="(*,*)";'


As I asked before, I am not sure whether the original data is a gridded
analysis or point obs. Since I can run "point2grid", I guess the
intermediate product "test1.nc" is gridded data; is that correct? I need
to decide which tool to use to get the hourly average data. Thank you.



Binyu

------------------------------------------------
Subject: Ensemble verification
From: binyu.wang at noaa.gov
Time: Fri May 07 15:55:43 2021

Hello,

Continuing from my previous email, I tried to use "pcp_combine" to do the
average as below:

$ cd /gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/Satellite

[Binyu.Wang at v71a2 Satellite]$ /gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-beta5/exec/pcp_combine -mean test1.nc test2.nc 'name="ash_mass_loading"; level="(*,*)";' output_mean.nc

ERROR  :

ERROR  : CommandLine::next_option() -> unrecognized command-line switch "-mean"

ERROR  :


What is the problem here?


Thank you.

Binyu


------------------------------------------------
Subject: Ensemble verification
From: binyu.wang at noaa.gov
Time: Mon May 10 08:57:28 2021

Hello John,

I think I figured out how to get the mean using pcp_combine (see below);
is that correct?

/gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-beta5/exec/pcp_combine -derive mean bezy_2020295_2040_regrided.nc bezy_2020295_2050_regrided.nc -field 'name="ash_mass_loading"; level="(*,*)";' output_mean.nc


But I am still not sure how to tell whether a file is a gridded analysis
or point obs. Thank you.


Binyu


------------------------------------------------
Subject: Ensemble verification
From: John Halley Gotway
Time: Mon May 10 10:03:41 2021

Binyu,

I'm sorry for the delay in responding to this issue.

Yes, you're right, running pcp_combine with the "-derive mean" option
will compute the mean across whatever files you pass as input.

For the second question, yes, the output of the point2grid tool is
gridded data. So you're comparing the gridded NetCDF mean output from
pcp_combine to the gridded NetCDF output from point2grid.

When you run ensemble-stat, you pass gridded observations as input using
the "-grid_obs" command line option. So that's how you'd pass the
point2grid output.
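For example, a call along these lines would pass the pcp_combine mean as
gridded observations (a sketch only; the member files, member count, and
config file name are placeholders, and the exact argument order can vary
by MET version, so check "ensemble_stat -help"):

```shell
# Sketch: placeholder ensemble member files and config. Gridded obs,
# such as point2grid or pcp_combine output, are passed via -grid_obs.
# Guarded so the sketch is harmless where the MET tools are not installed.
if command -v ensemble_stat >/dev/null 2>&1; then
  ensemble_stat 3 ens_mem1.nc ens_mem2.nc ens_mem3.nc \
    EnsembleStatConfig \
    -grid_obs output_mean.nc \
    -outdir ens_out -v 2
else
  echo "ensemble_stat not found; sketch only"
fi
```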

Thanks,
John


------------------------------------------------
Subject: Ensemble verification
From: binyu.wang at noaa.gov
Time: Mon May 10 10:09:53 2021

Hello John,

Thank you for the clarification. But for the original file, how can I tell
whether it is a gridded analysis or point data? For example, this file:

/gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/SatelliteVOLCAT_HIMAWARI-8_FLDK_s2020296_185000_v300250_VCB_w167_FLDK_b2020295_204000_g001_pc.nc

Thank you.

On Mon, May 10, 2021 at 12:03 PM John Halley Gotway via RT <
met_help at ucar.edu> wrote:

> Binyu,
>
> I'm sorry for the delay in responding to this issue.
>
> Yes, you're right, running pcp_combine with the "-derive mean" option will
> compute the mean across whatever files you pass as input.
>
> For the second question, yes, the output of the point2grid tool is gridded
> data. So you're comparing the gridded NetCDF mean output from pcp_combine
> to the gridded NetCDF output from point2grid.
>
> When you run ensemble-stat, you pass gridded observations as input using
> the "-grid_obs" command line option. So that's how you'd pass the
> point2grid output.
>
> Thanks,
> John
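As a mental model of what pcp_combine's "-derive mean" does, here is a toy Python sketch: a gridpoint-wise mean across the input fields. The field lists and values are hypothetical, and this is not MET code:

```python
def derive_mean(fields):
    """Gridpoint-wise mean across input fields, conceptually what
    pcp_combine '-derive mean' computes. Each field here is a flat
    list standing in for a 2-D grid (illustrative sketch only)."""
    n = len(fields)
    return [sum(vals) / n for vals in zip(*fields)]

f2040 = [0.0, 2.0, 4.0]   # hypothetical ash_mass_loading at 20:40
f2050 = [2.0, 2.0, 6.0]   # hypothetical ash_mass_loading at 20:50
print(derive_mean([f2040, f2050]))  # [1.0, 2.0, 5.0]
```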
>
> On Mon, May 10, 2021 at 8:58 AM binyu.wang at noaa.gov via RT <
> met_help at ucar.edu> wrote:
>
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=99775 >
> >
> > Hello John,
> >
> > I think I have figured out how to get the mean using pcp_combine (see
> > below), is that correct?
> >
> > /gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-beta5/exec/pcp_combine \
> >   -derive mean bezy_2020295_2040_regrided.nc bezy_2020295_2050_regrided.nc \
> >   -field 'name="ash_mass_loading"; level="(*,*)";' output_mean.nc
> >
> > But I am still not sure how to tell whether the file is a gridded
> > analysis or point obs. Thank you.
> >
> > Binyu
> >
> > On Fri, May 7, 2021 at 5:55 PM Binyu Wang - NOAA Affiliate <
> > binyu.wang at noaa.gov> wrote:
> >
> > > Hello,
> > >
> > > Following up on my previous email, I tried to use "pcp_combine" to do
> > > the average as below:
> > >
> > > $ cd /gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/Satellite
> > >
> > > [Binyu.Wang at v71a2 Satellite]$ /gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-beta5/exec/pcp_combine \
> > >   -mean test1.nc test2.nc 'name="ash_mass_loading"; level="(*,*)";' output_mean.nc
> > >
> > > ERROR  : CommandLine::next_option() -> unrecognized command-line switch "-mean"
> > >
> > > What is the problem here?
> > >
> > > Thank you.
> > >
> > > Binyu
> > >
> > > On Fri, May 7, 2021 at 5:26 PM Binyu Wang - NOAA Affiliate <
> > > binyu.wang at noaa.gov> wrote:
> > >
> > >> Hello John,
> > >>
> > >> There is an update on my previous question:
> > >>
> > >> Here are the new files (in a different format from the previous one):
> > >>
> > >> /gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/Satellite/VOLCAT_HIMAWARI-8_FLDK_s2020295_204000_v300250_VCB_w167_FLDK_b2020295_204000_g001_pc.nc
> > >>
> > >> Using the same approach as Howard did before, I can do:
> > >>
> > >> 1. /gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-beta5/exec/point2grid \
> > >>      VOLCAT_HIMAWARI-8_FLDK_s2020295_204000_v300250_VCB_w167_FLDK_b2020295_204000_g001_pc.nc \
> > >>      "latlon 150 150 45 153 0.1 0.1" test1.nc \
> > >>      -field 'name="ash_mass_loading"; level="(0,*,*)";'
> > >>
> > >> then
> > >>
> > >> 2. /gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-beta5/exec/plot_data_plane \
> > >>      test.nc test1.ps 'name="ash_mass_loading"; level="(*,*)";'
> > >>
> > >> As I asked before, I am not sure whether the original data is a
> > >> gridded analysis or point obs. Since I can run "point2grid", I guess
> > >> the intermediate product "test1.nc" is gridded data, is that correct?
> > >> I need to decide which tool to use to get hourly average data.
> > >> Thank you.
> > >>
> > >> Binyu
> > >> On Mon, May 3, 2021 at 10:12 PM Binyu Wang - NOAA Affiliate <
> > >> binyu.wang at noaa.gov> wrote:
> > >>
> > >>> Hello John,
> > >>>
> > >>> Thank you for your quick response. Actually, I am not sure how to
> > >>> determine whether the data is point or gridded. It is satellite obs.
> > >>> data, so I guess it is point obs. because it is lat/lon based? But I
> > >>> am not exactly sure.
> > >>>
> > >>> Here is the data:
> > >>>
> > >>> /gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/Satellite
> > >>>
> > >>> Thank you.
> > >>>
> > >>> Binyu
> > >>>

------------------------------------------------
Subject: Ensemble verification
From: John Halley Gotway
Time: Mon May 10 10:22:13 2021

Binyu,

If you're processing the Himawari data with the point2grid tool in MET,
then its output is gridded data. That's the whole point of that tool:
taking data defined at individual lat/lon point locations and
interpolating them onto a grid. The Himawari data is a dense mesh of
lat/lon points, but I believe those are just satellite pixels. They are
not gridded.

The point2grid tool puts them onto a grid.
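To illustrate what "putting points onto a grid" means, here is a toy Python sketch that bins scattered satellite-pixel values into a regular lat/lon grid by cell averaging. This is an illustration of the idea only, not point2grid's actual algorithm, and the points and grid are hypothetical:

```python
def point2grid_sketch(points, lat0, lon0, dlat, dlon, nlat, nlon):
    """Average scattered (lat, lon, value) points into the regular
    lat/lon cell each one falls in. A toy stand-in for a
    point-to-grid tool; not MET's point2grid."""
    sums = [[0.0] * nlon for _ in range(nlat)]
    counts = [[0] * nlon for _ in range(nlat)]
    for lat, lon, val in points:
        i = int((lat - lat0) / dlat)   # row index of the cell
        j = int((lon - lon0) / dlon)   # column index of the cell
        if 0 <= i < nlat and 0 <= j < nlon:
            sums[i][j] += val
            counts[i][j] += 1
    # Cells with no points stay empty (None).
    return [[sums[i][j] / counts[i][j] if counts[i][j] else None
             for j in range(nlon)] for i in range(nlat)]

# Hypothetical pixels near the 45N/153E corner of a 0.1-degree grid.
pts = [(45.02, 153.01, 1.0), (45.03, 153.08, 3.0), (45.15, 153.25, 5.0)]
grid = point2grid_sketch(pts, 45.0, 153.0, 0.1, 0.1, 2, 3)
print(grid)  # [[2.0, None, None], [None, None, 5.0]]
```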

John


------------------------------------------------
Subject: Ensemble verification
From: binyu.wang at noaa.gov
Time: Mon May 10 10:53:04 2021

Hello John,

I understand that after using point2grid, we will get gridded data as the
output. But my question is: before we use point2grid, how can we tell
whether the original file is gridded or point data by looking at the file
header? We need to decide which tool (point2grid, grid2grid, point_stat,
or grid_stat, etc.) to use based on the file type, right? Thank you.

Binyu


------------------------------------------------
Subject: Ensemble verification
From: John Halley Gotway
Time: Mon May 10 11:24:54 2021

Binyu,

Ah, OK, I understand the question better now.

Perhaps it's easiest to ask: is this gridded data that MET can read
directly? If it's in GRIB1 or GRIB2 format, then the answer is yes. And
there are three flavors of NetCDF that MET can read directly:
- Gridded NetCDF output from one of the MET tools, which includes global
attributes for the "MET_version" and the "Projection", followed by
additional attributes defining that projection.
- The output from the wrf_interp utility for WRF output, which is also
identifiable via NetCDF attributes (I don't remember all those details
right now).
- NetCDF data following the climate-forecast (CF) convention, which has a
global "Conventions" attribute with a string like "CF-1.7".
Lastly, users can also write python scripts to pass gridded data to the
MET tools in memory.

So if your data doesn't fall into one of those categories, then it's not a
gridded dataset that MET can handle directly. Satellite data, in general,
will not be gridded. Typically it contains a dense mesh of data at lat/lon
points, but those points are usually not evenly spaced on a regular grid.

While MET's point2grid tool does support some satellite data inputs, it is
limited. Using python embedding is another option for handling new
datasets not supported natively by MET.
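The attribute checks above can be sketched in Python. Given a file's global attributes (for example, read off a `ncdump -h` listing), a rough heuristic might look like this (a sketch of the rules above, not MET's actual file-type detection, and the attribute dicts are hypothetical):

```python
def classify_netcdf_attrs(attrs):
    """Guess whether a NetCDF file is gridded data MET can read
    directly, based only on its global attributes. Heuristic sketch
    of the rules described above; not MET's detection code."""
    if "MET_version" in attrs and "Projection" in attrs:
        return "gridded (MET tool output)"
    if str(attrs.get("Conventions", "")).startswith("CF-"):
        return "gridded (CF convention)"
    return "not recognized as gridded; likely point/pixel data"

print(classify_netcdf_attrs({"Conventions": "CF-1.7"}))
print(classify_netcdf_attrs({"MET_version": "V10.0", "Projection": "LatLon"}))
print(classify_netcdf_attrs({"sensor": "AHI"}))  # e.g. raw satellite pixels
```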

Thanks
John

On Mon, May 10, 2021 at 10:53 AM binyu.wang at noaa.gov via RT <
met_help at ucar.edu> wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=99775 >
>
> Hello John,
>
> I understand that after using point2grid, we will get gridded data
as the
> output. But my question is: before we use point2grid, how to tell if
the
> original file is grid or point by looking at the header file?
Because we
> need to decide which tool (point2grid, grid2grid, point_stat, or
> grid_stat etc ) to use based on the file type, right? Thank you.
>
> Binyu
>
> On Mon, May 10, 2021 at 12:22 PM John Halley Gotway via RT <
> met_help at ucar.edu> wrote:
>
> > Binyu,
> >
> > If you're processing the Himawari data using the point2grid tool in MET,
> > then its output is gridded data. That's the whole point of that tool...
> > taking data defined at individual lat/lon point locations and
> > interpolating them onto a grid. The Himawari data is a dense mesh of
> > lat/lon points, but they are just satellite pixels, I believe. They are
> > not gridded.
> >
> > The point2grid tool puts them onto a grid.
> >
> > John
> >
> > On Mon, May 10, 2021 at 10:10 AM binyu.wang at noaa.gov via RT <
> > met_help at ucar.edu> wrote:
> >
> > >
> > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=99775 >
> > >
> > > Hello John,
> > >
> > >
> > > Thank you for the clarification. But for the original one, how to
> > > tell if it is a gridded analysis or point data? Like this file:
> > >
> > >
> > >
> > >
> >
>
/gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/SatelliteVOLCAT_HIMAWARI-
8_FLDK_s2020296_185000_v300250_VCB_w167_FLDK_
> > > b2020295_204000_g001_pc.nc
> > >
> > >
> > > Thank you.
> > >
> > > On Mon, May 10, 2021 at 12:03 PM John Halley Gotway via RT <
> > > met_help at ucar.edu> wrote:
> > >
> > > > Binyu,
> > > >
> > > > I'm sorry for the delay in responding to this issue.
> > > >
> > > > Yes, you're right, running pcp_combine with the "-derive mean"
> > > > option will compute the mean across whatever files you pass as input.
> > > >
> > > > For the second question, yes, the output of the point2grid tool is
> > > > gridded data. So you're comparing the gridded NetCDF mean output from
> > > > pcp_combine to the gridded NetCDF output from point2grid.
> > > >
> > > > When you run ensemble-stat, you pass gridded observations as input
> > > > using the "-grid_obs" command line option. So that's how you'd pass
> > > > the point2grid output.
> > > >
> > > > Thanks,
> > > > John
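Conceptually, "-derive mean" reduces to an elementwise mean across the
input fields at each grid point. A toy numpy sketch, with invented field
values standing in for the two regridded files (this is not MET code):

```python
import numpy as np

# Two toy 2x2 fields standing in for the regridded 20:40 and 20:50
# "ash_mass_loading" analyses (values invented for illustration).
field_2040 = np.array([[1.0, 2.0],
                       [3.0, 4.0]])
field_2050 = np.array([[3.0, 2.0],
                       [1.0, 0.0]])

# "-derive mean" conceptually takes the elementwise mean across the
# input files at each grid point.
hourly_mean = np.mean(np.stack([field_2040, field_2050]), axis=0)
print(hourly_mean)  # every point averages to 2.0 in this toy case
```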
> > > >
> > > > On Mon, May 10, 2021 at 8:58 AM binyu.wang at noaa.gov via RT <
> > > > met_help at ucar.edu> wrote:
> > > >
> > > > >
> > > > > <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=99775 >
> > > > >
> > > > > Hello John,
> > > > >
> > > > > I think I figured out how to get the mean using pcp_combine (see
> > > > > below); is that correct?
> > > > >
> > > > >
> > > > >
> > > >
> > >
> >
> /gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-
beta5/exec/pcp_combine
> > > > > -derive mean  bezy_2020295_2040_regrided.nc
> > > > > bezy_2020295_2050_regrided.nc  -field
> > > > > 'name="ash_mass_loading"; level="(*,*)";'   output_mean.nc
> > > > >
> > > > >
> > > > > But I am still not sure how to tell if the file is a gridded
> > > > > analysis or point obs. Thank you.
> > > > >
> > > > >
> > > > > Binyu
> > > > >
> > > > > On Fri, May 7, 2021 at 5:55 PM Binyu Wang - NOAA Affiliate <
> > > > > binyu.wang at noaa.gov> wrote:
> > > > >
> > > > > > Hello,
> > > > > >
> > > > > > Continuing from my previous email, I tried to use "pcp_combine"
> > > > > > to do the averaging as below:
> > > > > >
> > > > > > $ cd
> > > > > >
> > > >
> >
/gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/Satellite
> > > > > >
> > > > > > [Binyu.Wang at v71a2 Satellite]$
> > > > > >
> > > > >
> > > >
> > >
> >
> /gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-
beta5/exec/pcp_combine
> > > > > > -mean test1.nc test2.nc 'name="ash_mass_loading";
> level="(*,*)";'
> > > > > > output_mean.nc
> > > > > >
> > > > > > ERROR  :
> > > > > >
> > > > > > ERROR  : CommandLine::next_option() -> unrecognized
> > > > > > command-line switch "-mean"
> > > > > >
> > > > > > ERROR  :
> > > > > >
> > > > > >
> > > > > > What is the problem here?
> > > > > >
> > > > > >
> > > > > > Thank you.
> > > > > >
> > > > > > Binyu
> > > > > >
> > > > > > On Fri, May 7, 2021 at 5:26 PM Binyu Wang - NOAA Affiliate
<
> > > > > > binyu.wang at noaa.gov> wrote:
> > > > > >
> > > > > >> Hello John,
> > > > > >>
> > > > > >> There is some update about my previous question:
> > > > > >>
> > > > > >> Here are the new files (it is a different format from the
> > > > > >> previous one):
> > > > > >>
> > > > > >>
> > > > >
> > > >
> > >
> >
>
/gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/Satellite/
> > > > > >>
> > > > >
> > > >
> > >
> >
> VOLCAT_HIMAWARI-
8_FLDK_s2020295_204000_v300250_VCB_w167_FLDK_b2020295_204000_
> > > > > g001_pc.nc
> > > > > >>
> > > > > >>
> > > > > >> Following the same approach Howard used before, I can do:
> > > > > >>
> > > > > >> 1.
> > > > > >>
> > > > >
> > > >
> > >
> >
> /gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-
beta5/exec/point2grid
> > > > > >>
> > > > >
> > > >
> > >
> >
> VOLCAT_HIMAWARI-
8_FLDK_s2020295_204000_v300250_VCB_w167_FLDK_b2020295_204000_
> > > > > g001_pc.nc
> > > > > >> "latlon 150 150 45 153 0.1 0.1" test1.nc -field
> > > > > >> 'name="ash_mass_loading"; level="(0,*,*)";'
> > > > > >>
> > > > > >>
> > > > > >> then
> > > > > >>
> > > > > >> 2.
> > > > > >>
> > > > >
> > > >
> > >
> >
> /gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0-
beta5/exec/plot_data_plane
> > > > > >> test.nc test1.ps 'name="ash_mass_loading";
level="(*,*)";'
> > > > > >>
> > > > > >>
> > > > > >> As I asked before, I am not sure if the original data is a
> > > > > >> gridded analysis or point obs. Since I can run "point2grid", I
> > > > > >> guess the intermediate product "test1.nc" is gridded data; is
> > > > > >> that correct? I need to decide which tool to use to get hourly
> > > > > >> average data.  Thank you.
> > > > > >>
> > > > > >>
> > > > > >>
> > > > > >> Binyu
> > > > > >>
> > > > > >>
> > > > > >>
> > > > > >>
> > > > > >>
> > > > > >>
> > > > > >>
> > > > > >>
> > > > > >>
> > > > > >> On Mon, May 3, 2021 at 10:12 PM Binyu Wang - NOAA
Affiliate <
> > > > > >> binyu.wang at noaa.gov> wrote:
> > > > > >>
> > > > > >>> Hello John,
> > > > > >>>
> > > > > >>> Thank you for your quick response. Actually, I am not sure
> > > > > >>> how to determine whether the data is point or gridded. It is
> > > > > >>> satellite obs. data; I guess it is point obs. because it is
> > > > > >>> lat/lon based? But I am not exactly sure.
> > > > > >>>
> > > > > >>> Here is the data:
> > > > > >>>
> > > > > >>>
> > > > >
> > >
>
/gpfs/dell2/emc/modeling/noscrub/Binyu.Wang/Download/Bezymianny/Satellite
> > > > > >>>
> > > > > >>>
> > > > > >>> Thank you.
> > > > > >>>
> > > > > >>> Binyu
> > > > > >>>
> > > > > >>> On Mon, May 3, 2021 at 6:52 PM John Halley Gotway via RT
<
> > > > > >>> met_help at ucar.edu> wrote:
> > > > > >>>
> > > > > >>>> Binyu,
> > > > > >>>>
> > > > > >>>> I see that you have questions about ensemble verification
> > > > > >>>> when the time steps in your model and observation data
> > > > > >>>> differ. I'll probably need several more details to give you
> > > > > >>>> a good answer.
> > > > > >>>>
> > > > > >>>> First, what type of observations are you verifying against?
> > > > > >>>> Gridded analyses or point obs?
> > > > > >>>>
> > > > > >>>> If they are gridded observations, I would actually recommend
> > > > > >>>> doing a pre-processing step. Run them through pcp_combine
> > > > > >>>> using the "-derive" option to compute the hourly mean from
> > > > > >>>> the 10-minute analysis data.
> > > > > >>>>
> > > > > >>>> If they are point observations, you could use the obs_summary
> > > > > >>>> option in the Ensemble-Stat config file. Listed below is a
> > > > > >>>> description of that option taken from:
> > > > > >>>>
> > > > > >>>> https://met.readthedocs.io/en/latest/Users_Guide/data_io.html#data-io-met-configuration-file-options
> > > > > >>>>
> > > > > >>>> I'd recommend defining the obs_window to +/- 30 minutes around
> > > > > >>>> the ensemble valid time and then set "obs_summary = UW_MEAN".
> > > > > >>>> That way, you'll verify against the hourly mean value from
> > > > > >>>> each point observation location.
> > > > > >>>>
> > > > > >>>> Hope that helps.
> > > > > >>>>
> > > > > >>>> Thanks,
> > > > > >>>> John
> > > > > >>>>
> > > > > >>>> //
> > > > > >>>> // The "obs_summary" entry specifies how to compute
> > > > > >>>> // statistics on observations that appear at a single
> > > > > >>>> // location (lat,lon,level,elev) in Point-Stat and
> > > > > >>>> // Ensemble-Stat. Eight techniques are currently supported:
> > > > > >>>> //
> > > > > >>>> //    - "NONE" to use all point observations (legacy behavior)
> > > > > >>>> //    - "NEAREST" use only the observation that has the valid
> > > > > >>>> //      time closest to the forecast valid time
> > > > > >>>> //    - "MIN" use only the observation that has the lowest value
> > > > > >>>> //    - "MAX" use only the observation that has the highest value
> > > > > >>>> //    - "UW_MEAN" compute an unweighted mean of the observations
> > > > > >>>> //    - "DW_MEAN" compute a weighted mean of the observations
> > > > > >>>> //      based on the time of the observation
> > > > > >>>> //    - "MEDIAN" use the median observation
> > > > > >>>> //    - "PERC" use the Nth percentile observation where
> > > > > >>>> //      N = obs_perc_value
> > > > > >>>> //
> > > > > >>>> // The reporting mechanism for this feature can be activated
> > > > > >>>> // by specifying a verbosity level of three or higher. The
> > > > > >>>> // report will show information about where duplicates were
> > > > > >>>> // detected and which observations were used in those cases.
> > > > > >>>> //
> > > > > >>>> obs_summary = NONE;
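To make the obs_window / UW_MEAN combination above concrete, here is a
small plain-python illustration of keeping only the observations within
a +/- 30 minute window and averaging them (times and values invented;
this is not MET code):

```python
from datetime import datetime, timedelta

# Invented 10-minute observations at one station: (obs time, value).
obs = [
    (datetime(2020, 10, 21, 20, 10), 1.0),  # 50 min early -> excluded
    (datetime(2020, 10, 21, 20, 40), 2.0),  # 20 min early -> included
    (datetime(2020, 10, 21, 21, 0), 3.0),   # at valid time -> included
    (datetime(2020, 10, 21, 21, 40), 9.0),  # 40 min late  -> excluded
]

valid_time = datetime(2020, 10, 21, 21, 0)  # ensemble valid time
window = timedelta(minutes=30)              # obs_window = +/- 30 minutes

# Keep observations inside the window, then take their unweighted mean,
# which is what "obs_summary = UW_MEAN" reports for this location.
in_window = [value for t, value in obs if abs(t - valid_time) <= window]
uw_mean = sum(in_window) / len(in_window)
print(uw_mean)  # -> 2.5
```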
> > > > > >>>>
> > > > > >>>>
> > > > > >>>> On Mon, May 3, 2021 at 11:49 AM binyu.wang at noaa.gov via
RT <
> > > > > >>>> met_help at ucar.edu> wrote:
> > > > > >>>>
> > > > > >>>> >
> > > > > >>>> > Mon May 03 11:48:35 2021: Request 99775 was acted upon.
> > > > > >>>> > Transaction: Ticket created by binyu.wang at noaa.gov
> > > > > >>>> >        Queue: met_help
> > > > > >>>> >      Subject: Ensemble verification
> > > > > >>>> >        Owner: Nobody
> > > > > >>>> >   Requestors: binyu.wang at noaa.gov
> > > > > >>>> >       Status: new
> > > > > >>>> >  Ticket <URL:
> > > > > https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=99775
> > > > > >>>> >
> > > > > >>>> >
> > > > > >>>> >
> > > > > >>>> > Hello,
> > > > > >>>> > I am wondering how to deal with different time steps in
> > > > > >>>> > MET? My obs. files are every 10 minutes, but my model is
> > > > > >>>> > hourly average data. I want to do the ensemble
> > > > > >>>> > verification. I know I can get an hourly average first
> > > > > >>>> > for the obs. data using other tools, but is it possible
> > > > > >>>> > that I can do the ensemble verification directly using
> > > > > >>>> > MET without the pre-processing? Thank you.
> > > > > >>>> >
> > > > > >>>> >
> > > > > >>>> > Binyu
> > > > > >>>> >
> > > > > >>>> >
> > > > > >>>>
> > > > > >>>>
> > > > >
> > > > >
> > > >
> > > >
> > >
> > >
> >
> >
>
>

------------------------------------------------
Subject: Ensemble verification
From: binyu.wang at noaa.gov
Time: Tue May 11 16:44:10 2021

Hello John,
Thank you for your explanation.

On Mon, May 10, 2021 at 1:25 PM John Halley Gotway via RT <
met_help at ucar.edu> wrote:

------------------------------------------------


More information about the Met_help mailing list