[Met_help] [rt.rap.ucar.edu #81564] History for MET v5.1 with MRMS data runs extremely slow

Julie Prestopnik via RT met_help at ucar.edu
Wed Aug 16 14:53:04 MDT 2017


----------------------------------------------------------------
  Initial Request
----------------------------------------------------------------

Dear MET Help,

I'm using MRMS gauge-corrected radar gridded QPE (7000 x 3500 large grid at 0.01-deg spacing) running through MET (version 5.1 is what I'm running), but it's processing extremely slowly.  I have a 12-km grid-spaced model test run over the SE U.S., which I interpolated to the MRMS grid using the regrid_data_plane utility.
This alone takes quite a while to do (perhaps a few minutes).  Then running grid_stat takes several more minutes per scenario.
In all, I've only gotten through the first 14 hours of my model run hourly output while running jobs in a script overnight.
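A sketch of the regridding step described above, for reference. All file names and the field string are placeholders, not taken from the ticket; the second argument names the file whose grid is copied (here, the MRMS file), and option names follow the MET Users Guide:

```shell
# Hedged sketch of a regrid_data_plane call; file names and the
# -field string are illustrative placeholders.
regrid_data_plane \
  wrf_d01_f012.grb \
  mrms_gc_qpe_20170810-120000.grib2 \
  fcst_on_mrms_grid.nc \
  -field 'name="APCP"; level="A1";' \
  -method BUDGET -width 2 -v 2
```

The budget (-method BUDGET) option is the usual mass-conserving choice for accumulated precipitation.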

Do you know of a faster way to process high-res precip grids within MET?  This seems prohibitively slow, since I need to run verification for daily forecasts across multiple months.  Perhaps I should be interpolating fields to the coarser-resolution grid (whether the obs or forecast grid?)?
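The point counts alone suggest why this grid is expensive. A quick back-of-envelope check (MRMS dimensions from the message above; the 12-km domain size is a hypothetical illustration):

```python
# Back-of-envelope comparison of grid sizes.  MRMS dimensions are
# from the ticket; the 12-km SE-U.S. domain below is an assumed,
# illustrative size (250 x 200 points).
mrms_points = 7000 * 3500      # 0.01-deg MRMS grid: 24,500,000 points
model_points = 250 * 200       # hypothetical 12-km domain: 50,000 points

# Verifying on the MRMS grid makes every pass touch ~490x more points
# than verifying on the coarser model grid would.
print(mrms_points // model_points)
```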

Appreciate the help,
Jon

*****************************************************************
Jonathan Case;  Research Meteorologist at ENSCO, Inc./NASA
Short-term Prediction Research and Transition (SPoRT) Center
320 Sparkman Dr., Room 3008; Huntsville, AL 35805
Emails: Jonathan.Case-1 at nasa.gov<mailto:Jonathan.Case-1 at nasa.gov> (preferred) or case.jonathan at ensco.com<mailto:case.jonathan at ensco.com>
Voice: 256.961.7504  ;  Fax: 256.961.7788
http://weather.msfc.nasa.gov/sport/
*****************************************************************



----------------------------------------------------------------
  Complete Ticket History
----------------------------------------------------------------

Subject: MET v5.1 with MRMS data runs extremely slow
From: Julie Prestopnik
Time: Thu Aug 10 09:23:57 2017

Hi Jonathan.  Our MET expert is out of the office for the rest of this week.  However, I do believe I have some useful information for you.  Thank you for letting us know what version of MET you are using.  I see that you are currently using METv5.1.  We did fix a problem with Grid-Stat (back in January, in METv5.2 but not METv5.1) because users reported very slow performance when computing continuous statistics on high-resolution grids:

----

From: http://www.dtcenter.org/met/users/support/known_issues/METv5.2/index.php

Fix memory allocation inefficiencies.
Posted 01/26/2017
*Problem:* Users reported very slow performance from Grid-Stat when computing continuous statistics on high resolution grids. Debugging revealed very inefficient memory allocation logic in the MET statistics libraries. Many small memory allocations and reallocations slowed down the code considerably.
*Solution:* The fix is to update the MET library and application code to allocate the expected amount of required memory in one big chunk. Updating this logic caused the runtime of one test case to improve from 36 minutes to 18 seconds.

----
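The fix described above is a general pattern rather than anything MET-specific. A minimal Python sketch of the same idea (MET itself is C++; this only contrasts the two allocation strategies):

```python
import numpy as np

# Pattern the fix removed: growing an array one element at a time.
# Every np.append allocates a fresh array and copies the old one,
# so filling n elements costs O(n^2) copying.
def grow_incrementally(n):
    a = np.empty(0)
    for i in range(n):
        a = np.append(a, i)
    return a

# Pattern the fix introduced: allocate the expected size once up
# front, then fill in place -- O(n) total work.
def preallocate(n):
    a = np.empty(n)
    for i in range(n):
        a[i] = float(i)
    return a

# Both produce identical results; only the allocation strategy differs.
assert (grow_incrementally(1000) == preallocate(1000)).all()
```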

It is possible that MET is running slowly simply due to the dense grid that you are using.  However, if possible, I would suggest that you upgrade to METv6.0 (our latest release), give that a shot using your current grid, and check for a speedup in processing before you interpolate to a coarser-resolution grid.  Please note that when compiling METv6.0, *do not use* the previous MET environments.  In v6.0, we've made a large change from NetCDF3 to NetCDF4.  You'll need to have installed both NetCDF4 and HDF5, upon which NetCDF4 is built.  For more details please take a look at the:

*MET Users Guide*
http://www.dtcenter.org/met/users/docs/users_guide/MET_Users_Guide_v6.0.pdf

*Online tutorial*
http://www.dtcenter.org/met/users/support/online_tutorial/METv6.0/tutorial.php?name=compilation&category=index
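The build sequence looks roughly like the following. Every path here is a placeholder, and MET_NETCDF / MET_HDF5 are the environment variables the configure script is expected to read (see the compilation tutorial linked above for the authoritative list):

```shell
# Hedged sketch of a METv6.0 build; all paths are placeholders.
export MET_NETCDF=/usr/local/netcdf4   # NetCDF4 install root
export MET_HDF5=/usr/local/hdf5        # HDF5 root (NetCDF4 dependency)

cd met-6.0
./configure --prefix=/usr/local/met-6.0 CXXFLAGS="-O2" FFLAGS="-O2"
make
make install
make test   # optional sanity check against the bundled test data
```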

I hope this helps.  Please let us know how it goes.

Thanks,
Julie


------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs extremely slow
From: Case, Jonathan[ENSCO INC]
Time: Thu Aug 10 09:27:31 2017

Thank you Julie,

I look forward to hearing back from you.  In the meantime, it sounds like it would be worthwhile for me to upgrade the MET version I'm using in my scripts to at least v5.2.  I generally want to interpolate all data to the obs grids for consistency's sake, so a faster solution is certainly desired!

A possible problem we encountered is that our MET v5.1 appears to have been installed with debug compiler flags, which could explain the slow performance as well.  We re-compiled with -O2 optimization to see if this improves performance significantly.
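Newer MET releases can also regrid on the fly inside Grid-Stat itself, avoiding a separate regrid_data_plane step. A hedged config-file sketch (option names per the MET Users Guide; which of FCST/OBS to use depends on which grid you verify on, and availability depends on the MET version):

```
// Hedged Grid-Stat config fragment: on-the-fly regridding in
// newer MET releases.  Shown verifying on the coarser forecast
// grid; use OBS to verify on the observation grid instead.
regrid = {
   to_grid = FCST;
   method  = BUDGET;   // mass-conserving choice for precipitation
   width   = 2;
}
```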

Sincerely,
Jonathan



------------------------------------------------
Subject: MET v5.1 with MRMS data runs extremely slow
From: Julie Prestopnik
Time: Thu Aug 10 09:39:14 2017

Please let us know if you see a performance speedup after you upgrade.  Meanwhile, I'm sure John will follow up next week if he has any additional helpful information.

Thanks,
Julie


------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs extremely slow
From: Case, Jonathan[ENSCO INC]
Time: Thu Aug 10 09:40:54 2017

Thanks.  I think we'll also pursue installation of MET v5.2.
-Jonathan



------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs extremely slow
From: Case, Jonathan[ENSCO INC]
Time: Thu Aug 10 11:04:50 2017

Hi again Julie,

I see in the MET 6.0 documentation that performance improvements can be realized in several programs, including grid_stat, but not in the MET v5.2 patch.  Was this related to the MET v5.2 fix you referred to below?  Perhaps I could simply obtain the files you implemented in MET v5.2 for this improvement to grid_stat continuous stats, because that is definitely where I'm seeing the biggest backlog.  Were new source code files released for this?

Jonathan



------------------------------------------------
Subject: MET v5.1 with MRMS data runs extremely slow
From: Julie Prestopnik
Time: Thu Aug 10 12:25:19 2017

Hi Jonathan.

Yes, the description I listed previously with the Problem and Solution was included in the METv5.2 patch on 1/26/2017, as described further on this page:

http://www.dtcenter.org/met/users/support/known_issues/METv5.2/index.php

Also, underneath the Problem and Solution on that page are the specific files that were affected.  I do not know if it would make sense to simply grab those files to replace in METv5.1, and I would not suggest doing that.  If possible, I highly recommend that you upgrade to our latest version 6.0, but if that doesn't make sense for you right now, I would suggest getting the latest METv5.2 release with all of the patches and installing it.

I hope that helps.  Please let us know if you have other questions.

Julie




------------------------------------------------
Subject: MET v5.1 with MRMS data runs extremely slow
From: John Halley Gotway
Time: Mon Aug 14 10:59:37 2017

Jonathan,

This is John Halley Gotway.  I read through your message history with
Julie.  If possible, I'd suggest upgrading from METv5.1 to version
6.0.  Be
aware though that version 6.0 is a pretty big change since we switched
from
using NetCDF3 to NetCDF4.

Practically speaking that requires NetCDF4 built upon HDF5.  So the
MET
configuration/compilation process has changed slightly.
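
As a rough sketch of what that new build setup might look like (the install
paths below are placeholders, and the exact environment-variable names should
be checked against the METv6.0 compilation tutorial, which is authoritative):

```shell
# Hypothetical METv6.0 build sketch -- paths are placeholders.
# NetCDF4 must itself have been built against this HDF5 install.
export MET_NETCDF=/usr/local/netcdf4   # NetCDF4 (C library) install prefix
export MET_HDF5=/usr/local/hdf5        # HDF5 install prefix

./configure --prefix=/usr/local/met-6.0
make
make install
```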

We are currently working on version 6.1 which will require NetCDF4 as
well.  So as long as you're recompiling, I'd recommend using 6.0.

And yes, Julie's correct.  We were doing some frankly pretty dumb
memory
allocation.  Grid-Stat is running so slowly because it keeps
allocating
memory, deleting it, and then re-allocating a bit more.  The fix is
simple... figure out how much memory is required ahead of time and
allocate
it once.  We looked for this same issue in other applications and
applied
the fix wherever we found it.

Grid-Stat should run dramatically faster in version 6.0.
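
The allocation problem described here is a generic pattern; a minimal C++
sketch (illustrative only, not MET's actual library code) shows the difference
between the two strategies:

```cpp
#include <cassert>
#include <vector>

// Slow pattern: grow the buffer one element at a time, which can force
// many small reallocations and copies as pairs are accumulated.
std::vector<double> accumulate_slow(int n) {
    std::vector<double> pairs;
    for (int i = 0; i < n; ++i) {
        pairs.resize(pairs.size() + 1);   // may reallocate on this call
        pairs[i] = i * 0.5;
    }
    return pairs;
}

// Fixed pattern: the required size is known up front (the grid dimensions
// are fixed), so allocate the whole buffer in one chunk.
std::vector<double> accumulate_fast(int n) {
    std::vector<double> pairs;
    pairs.reserve(n);                     // one allocation
    for (int i = 0; i < n; ++i) {
        pairs.push_back(i * 0.5);
    }
    return pairs;
}
```

Both versions produce identical results; reserving the known size once, rather
than growing incrementally, is the essence of the fix described above.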

Thanks,
John


On Thu, Aug 10, 2017 at 12:25 PM, Julie Prestopnik via RT
<met_help at ucar.edu
> wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
>
> Hi Jonathan.
>
> Yes, the description I had listed previously with the Problem and
Solution
> were included in the METv5.2 patch on 1/26/2017 as described further
on
> this page:
>
> http://www.dtcenter.org/met/users/support/known_issues/METv5.2/index.php
>
> Also, underneath the Problem and Solution on that page are specific
files
> that were affected.  I do not know if it would make sense to simply
grab
> those files to replace in METv5.1 and would not suggest doing that.
If
> possible, I highly recommend that you upgrade to our latest version
6.0,
> but if that doesn't make sense for you right now, I would suggest
> getting the latest METv5.2 release with all of the patches and
> installing it.
>
> I hope that helps.  Please let us know if you have other questions.
>
> Julie
>
>
>
> On Thu, Aug 10, 2017 at 11:04 AM, Case, Jonathan[ENSCO INC] via RT <
> met_help at ucar.edu> wrote:
>
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
> >
> > Hi again Julie,
> >
> > I see in the MET 6.0 documentation that performance
> > improvements can be realized in several programs including
grid_stat, but
> > not in the MET v5.2 patch.  Was this related to the MET v5.2 fix
you
> > referred to below?  Perhaps I could simply obtain the files you
> implemented
> > in MET v5.2 for this improvement to grid_stat continuous stats
because
> that
> > is definitely where I'm seeing the biggest backlog.  Were new
source
> code
> > files released for this?
> >
> > Jonathan
> >

------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs extremely slow
From: Case, Jonathan[ENSCO INC]
Time: Mon Aug 14 11:04:17 2017

Hi John H-G,

I'm presently working on upgrading to MET v6.0.  We have most of it
compiled, but one of the new utilities requires some new capabilities
to build it (mode graphics, I believe).  Thankfully, my co-worker
Jayanthi is handling the MET compilation.

One thing I noticed is that grid_stat isn't working past the 1-h
forecast, as it did successfully in versions 5.1 and 5.2 with my
scripts.  I'm presently debugging it, so I'll get back to you if I
can't figure out the problem.  It appears that it's not finding the
correct APCP forecast field.

I'll let you know how much faster it runs for me once I get v6.0
running.
-Jonathan



------------------------------------------------
Subject: MET v5.1 with MRMS data runs extremely slow
From: John Halley Gotway
Time: Mon Aug 14 11:09:41 2017

Jon,

If the MODE graphics utility is causing issues, I'd recommend just
skipping
over it (i.e. don't use --enable-mode_graphics).  It basically
produces
some PNG plots of MODE NetCDF output files rather than the PostScript
plots
produced by MODE itself or the plot_data_plane utility.

It definitely isn't in the critical path, so I wouldn't let it slow
you
down.

FYI, we've also been working on containerized versions of MET, which
are described here:
   http://www.dtcenter.org/met/users/downloads/docker_container.php

We currently have containers set up for running the online tutorial
using
met-5.2 and met-6.0.  In the coming months, we plan to create a
container
for the METViewer database and display tool as well.  And we'll be
discussing this at a short course at AMS this year.

Thanks,
John


------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs extremely slow
From: Case, Jonathan[ENSCO INC]
Time: Mon Aug 14 13:47:03 2017

Hello again John H-G,

I've got my scripts set up to run regrid_data_plane and pcp_combine on
the model precip to sum/subtract and re-project onto the obs grid (in
my case, MRMS), prior to running grid_stat.  I had no trouble with
this method prior to upgrading to version 6.0, but now I've
encountered issues beyond the 1-h forecast time in the model run I'm
testing.
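
The workflow described above could be sketched along these lines (file names
are placeholders, and the -subtract/-field/-method arguments shown are
illustrative; each tool's usage message and the MET Users Guide should be
checked for exact syntax):

```shell
# Hypothetical sketch of the pcp_combine -> regrid_data_plane -> grid_stat
# chain; all file names are placeholders.

# 1. Difference two model accumulation buckets to get a 3-h total:
pcp_combine -subtract model_f006.grb2 06 model_f003.grb2 03 model_apcp_03.nc

# 2. Re-project the model total onto the MRMS grid (budget interpolation
#    is commonly used for precipitation fields):
regrid_data_plane model_apcp_03.nc mrms_qpe.grib2 model_on_mrms.nc \
    -field 'name="APCP_03"; level="(*,*)";' -method BUDGET -width 2

# 3. Verify the re-projected forecast against the MRMS QPE:
grid_stat model_on_mrms.nc mrms_qpe.grib2 GridStatConfig -outdir out
```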

In version 6.0, it appears that an extra level field is being appended
onto the variable name in what I believe is the output from
pcp_combine.  For example, in the summed 3-hour forecast precip field,
the name appears as "APCP_03_A3" instead of simply "APCP_03".  In both
instances, the level is "A3".  Why is "_A3" being appended onto the
variable name in the resulting netcdf output file?  I believe this is
the source of my problems with v6.0 output since I wasn't experiencing
this issue when running grid_stat in the 5.1/5.2 versions.

FYI, when running grid_stat, I set the level to be '(*,*)' for the
forecast model fields, but this doesn't seem to be working with the
variable names being output with the level appended onto the name.
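
As a stopgap until the naming question is resolved, the grid_stat forecast
field entry could be pointed at the exact NetCDF variable name that
pcp_combine wrote (a hedged sketch; "APCP_03_A3" is simply the name reported
above):

```
fcst = {
   field = [
      {
         name  = "APCP_03_A3";   // variable name as written by pcp_combine v6.0
         level = [ "(*,*)" ];    // NetCDF data is already on a single level
      }
   ];
};
```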

Please advise on how I should best proceed.

Most sincerely,
Jonathan

-----Original Message-----
From: John Halley Gotway via RT [mailto:met_help at ucar.edu]
Sent: Monday, August 14, 2017 12:00 PM
To: Case, Jonathan (MSFC-ST11)[ENSCO INC] <jonathan.case-1 at nasa.gov>
Subject: Re: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs
extremely slow

Johnathan,

This is John Halley Gotway.  I read through your message history with
Julie.  If possible, I'd suggest upgrading from METv5.1 to version
6.0.  Be aware though that version 6.0 is a pretty big change since we
switched from using NetCDF3 to NetCDF4.

Practically speaking that requires NetCDF4 built upon HDF5.  So the
MET configuration/compilation process has changed slightly.

We are currently working on version 6.1 which will require NetCDF4 as
well.  So as long as you're recompiling, I'd recommend using 6.0.

And yes, Julie's correct.  We were doing some frankly pretty dumb
memory allocation.  Grid-Stat is running so slowly because it keeps
allocating memory, deleting it, and then re-allocating a bit more.
The fix is simple... figure out how much memory is required ahead of
time and allocate it once.  We looked for this same issue in other
applications and applied the fix wherever we found it.

Grid-Stat should run dramatically faster in version 6.0.

Thanks,
John


On Thu, Aug 10, 2017 at 12:25 PM, Julie Prestopnik via RT
<met_help at ucar.edu
> wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
>
> Hi Jonathan.
>
> Yes, the description I had listed previously with the Problem and
> Solution were included in the METv5.2 patch on 1/26/2017 as
described
> further on this page:
>
> http://www.dtcenter.org/met/us <http://goog_287487168>
> ers/support/known_issues/METv5 <http://goog_287487168>.2/index.php
>
> Also, underneath the Problem and Solution on that page are specific
> files that were affected.  I do not know if it would make sense to
> simply grab those files to replace in METv5.1 and would not suggest
> doing that.  If possible, I highly recommend that you upgrade to our
> latest version 6.0, but if that doesn't make sense for you right
now,
> I would suggest get the latest METv5.2 release with all of the
patches and installing it.
>
> I hope that helps.  Please let us know if you have other questions.
>
> Julie
>
>
>
> On Thu, Aug 10, 2017 at 11:04 AM, Case, Jonathan[ENSCO INC] via RT <
> met_help at ucar.edu> wrote:
>
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
> >
> > Hi again Julie,
> >
> > I see in the MET 6.0 documentation that performance improvements
> > can be realized in several programs, including grid_stat, but not
> > in the MET v5.2 patch.  Was this related to the MET v5.2 fix you
> > referred to below?  Perhaps I could simply obtain the files you
> > implemented in MET v5.2 for this improvement to grid_stat
> > continuous stats, because that is definitely where I'm seeing the
> > biggest backlog.  Were new source code files released for this?
> >
> > Jonathan
> >
> > -----Original Message-----
> > From: Julie Prestopnik via RT [mailto:met_help at ucar.edu]
> > Sent: Thursday, August 10, 2017 10:24 AM
> > To: Case, Jonathan (MSFC-ST11)[ENSCO INC] <jonathan.case-
1 at nasa.gov>
> > Subject: Re: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs
> > extremely slow
> >
> > Hi Jonathan.  Our MET expert is out of the office for the rest of
> > this week.  However, I do believe I have some useful information
> > for you.  Thank you for letting us know what version of MET you
> > are using.  I see that you are currently using METv5.1.  We did
> > fix a problem (back in January in METv5.2, but not METv5.1) with
> > Grid-Stat because users reported very slow performance when
> > computing continuous statistics on high resolution grids:
> >
> > ----
> >
> > From: http://www.dtcenter.org/met/users/support/known_issues/METv5.2/index.php
> > Fix memory allocation inefficiencies.
> > Posted 01/26/2017 *Problem:* Users reported very slow performance
> > from Grid-Stat when computing continuous statistics on high
resolution grids.
> > Debugging revealed very inefficient memory allocation logic in the
> > MET statistics libraries. Many small memory allocations and
> > reallocations slowed down the code considerably.
> > *Solution:* The fix is to update the MET library and application
> > code to allocate the expected amount of required memory in one big
chunk.
> Updating
> > this logic caused the runtime of one test case to improve from 36
> > minutes to 18 seconds.
> >
> > ----
> >
> > It is possible that MET is running slowly due to the dense grid
> > that you are using; however, if possible, I would suggest that you
> > upgrade to METv6.0 (our latest release) and give that a shot using
> > your current grid and check for a speed up in processing before
> > you interpolate to a coarser-res grid.  Please note that when
> > compiling METv6.0 *do not use* the previous MET environments.  In
> > v6.0, we've made a large change from using NetCDF3 to 4.  You'll
> > need to have installed both NetCDF4 and HDF5, upon which NetCDF4
> > is built.  For more details please take a look at the:
> >
> > *MET Users Guide*
> > http://www.dtcenter.org/met/users/docs/users_guide/MET_Users_Guide_v6.0.pdf
> >
> > *Online tutorial*
> > http://www.dtcenter.org/met/users/support/online_tutorial/METv6.0/tutorial.php?name=compilation&category=index
> >
> > I hope this helps.  Please let us know how it goes.
> >
> > Thanks,
> > Julie
> >
> > On Thu, Aug 10, 2017 at 8:22 AM, Case, Jonathan[ENSCO INC] via RT
<
> > met_help at ucar.edu> wrote:
> >
> > >
> > > Thu Aug 10 08:22:26 2017: Request 81564 was acted upon.
> > > Transaction: Ticket created by jonathan.case-1 at nasa.gov
> > >        Queue: met_help
> > >      Subject: MET v5.1 with MRMS data runs extremely slow
> > >        Owner: Nobody
> > >   Requestors: jonathan.case-1 at nasa.gov
> > >       Status: new
> > >  Ticket <URL:
> > > https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564
> > > >
> > >
> > >
> > > Dear MET Help,
> > >
> > > I'm using MRMS gauge-corrected radar gridded QPE (7000 x 3500
> > > large grid at 0.01-deg spacing) running through MET (version 5.1
> > > is what I'm running), but it's processing extremely slowly.  I
> > > have a 12-km grid-spaced model test run over the SE U.S., which
> > > I interpolated to the MRMS grid using the regrid_data_plane
> > > utility.  This alone takes quite a while to do (perhaps a few
> > > minutes).  Then running grid_stat takes several more minutes per
> > > scenario.  In all, I've only gotten through the first 14 hours
> > > of my model run hourly output while running jobs in a script
> > > overnight.
> > >
> > > Do you know of a faster way to process high-res precip grids
> > > within MET?  This seems prohibitively slow since I need to run
> > > verification for daily forecasts across multiple months.
> > > Perhaps I should be interpolating fields to the coarser-res grid
> > > (whether the obs or forecast grid?)?
> > >
> > > Appreciate the help,
> > > Jon
> > >
> > >
*****************************************************************
> > > Jonathan Case;  Research Meteorologist at ENSCO, Inc./NASA
> > > Short-term Prediction Research and Transition (SPoRT) Center
> > > 320 Sparkman Dr., Room 3008; Huntsville, AL 35805
> > > Emails: Jonathan.Case-1 at nasa.gov<mailto:Jonathan.Case-
1 at nasa.gov>
> > > (preferred) or
> > > case.jonathan at ensco.com<mailto:case.jonathan at ensco.com>
> > > Voice: 256.961.7504  ;  Fax: 256.961.7788
> > > http://weather.msfc.nasa.gov/sport/
> > >
*****************************************************************
> > >
> > >
> > >
> >
> >
> >
> >
>
>



------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #81564] regrid_data_plane problem in v6.0 (CORRECTION to last email)
From: Case, Jonathan[ENSCO INC]
Time: Mon Aug 14 14:02:45 2017

CORRECTION: It looks like regrid_data_plane is appending the extra
level field onto the output netcdf field name.
The output from pcp_combine writes out the field name as "APCP_03",
whereas the output from regrid_data_plane writes out the field as
"APCP_03_A3".

Sorry for the confusion,
Jonathan

-----Original Message-----
From: Case, Jonathan (MSFC-ST11)[ENSCO INC]
Sent: Monday, August 14, 2017 2:47 PM
To: 'met_help at ucar.edu' <met_help at ucar.edu>
Subject: RE: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs
extremely slow

Hello again John H-G,

I've got my scripts set up to run regrid_data_plane and pcp_combine on
the model precip to sum/subtract and re-project onto the obs grid (in
my case, MRMS), prior to running grid_stat.  I had no trouble with
this method prior to upgrading to version 6.0, but now I've
encountered issues beyond the 1-h forecast time in the model run I'm
testing.

In version 6.0, it appears that an extra level field is being appended
onto the variable name in what I believe is the output from
pcp_combine.  For example, in the summed 3-hour forecast precip field,
the name appears as "APCP_03_A3" instead of simply "APCP_03".  In both
instances, the level is "A3".  Why is "_A3" being appended onto the
variable name in the resulting netcdf output file?  I believe this is
the source of my problems with v6.0 output since I wasn't experiencing
this issue when running grid_stat in the 5.1/5.2 versions.

FYI, when running grid_stat, I set the level to be '(*,*)' for the
forecast model fields, but this doesn't seem to be working with the
variable names being output with the level appended onto the name.
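For reference, the forecast field entry in my grid_stat config file
looks essentially like this (paraphrasing; the exact entry may
differ slightly):

```
fcst = {
   field = [
      { name = "APCP_03"; level = [ "(*,*)" ]; }
   ];
}
```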

Please advise on how I should best proceed.

Most sincerely,
Jonathan



------------------------------------------------
Subject: MET v5.1 with MRMS data runs extremely slow
From: John Halley Gotway
Time: Tue Aug 15 09:53:26 2017

Jonathan,

The pcp_combine tool includes a "-name" command line option for the
user to override the output variable name that's written.  I think the
simplest fix would be updating your script that calls pcp_combine by
adding "-name APCP_03", assuming that's the variable name that's
expected by Grid-Stat.

Hopefully that'll get them running again smoothly.
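For example, a call along these lines (the timestamps and file names
here are hypothetical) would write the summed field out as "APCP_03":

```
pcp_combine -sum 20170814_000000 01 20170814_030000 03 \
    model_apcp_03.nc -name APCP_03
```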

Thanks,
John


------------------------------------------------
Subject: MET v5.1 with MRMS data runs extremely slow
From: John Halley Gotway
Time: Tue Aug 15 09:59:06 2017

Jon,

I answered the pcp_combine question before reading this... the
regrid_data_plane tool also supports the "-name" command line option
for the user to specify the output variable name to be written.

It's writing APCP_03_A3 because, in general, it appends the variable
name ("APCP_03") and the level information ("A3") when creating the
output variable name.  I do realize that for precip that's a bit
redundant!

But hopefully adding the "-name" command line option to your
regrid_data_plane calls will get it running smoothly.
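For example, something along these lines (file names are hypothetical;
the MRMS file defines the target grid) regrids and renames in one
step:

```
regrid_data_plane model_apcp_03.nc mrms_qpe.grib2 \
    model_apcp_03_obs_grid.nc \
    -field 'name="APCP_03"; level="(*,*)";' -name APCP_03
```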

Actually, there's another option to consider.  Instead of calling both
pcp_combine and regrid_data_plane on your model data, you could just
call pcp_combine and then let grid_stat do the regridding for you on
the fly.  Either way would work fine... doing it on the fly has the
advantage of one fewer output file.  But it's up to you.
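If you go the on-the-fly route, the "regrid" dictionary in the
Grid-Stat config file controls it.  A sketch, keeping the v6.0
defaults for everything except to_grid:

```
regrid = {
   to_grid    = OBS;
   method     = NEAREST;
   width      = 1;
   vld_thresh = 0.5;
}
```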

John

On Mon, Aug 14, 2017 at 2:02 PM, Case, Jonathan[ENSCO INC] via RT <
met_help at ucar.edu> wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
>
> CORRECTION: It looks like regrid_data_place is appending the extra
level
> field onto the output netcdf field name.
> The output from pcp_combine writes out the field name as "APCP_03",
> whereas the output from regrid_data_plane writes out the field as
> "APCP_03_A3".
>
> Sorry for the confusion,
> Jonathan
>
> -----Original Message-----
> From: Case, Jonathan (MSFC-ST11)[ENSCO INC]
> Sent: Monday, August 14, 2017 2:47 PM
> To: 'met_help at ucar.edu' <met_help at ucar.edu>
> Subject: RE: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs
> extremely slow
>
> Hello again John H-G,
>
> I've got my scripts set up to run regrid_data_plane and pcp_combine
on the
> model precip to sum/subtract and re-project onto the obs grid (in my
case,
> MRMS), prior to running grid_stat.  I had no trouble with this
method prior
> to upgrading to version 6.0, but now I've encountered issues beyond
the 1-h
> forecast time in the model run I'm testing.
>
> In version 6.0, it appears that an extra level field is being
appended
> onto the variable name in what I believe is the output from
pcp_combine.
> For example, in the summed 3-hour forecast precip field, the name
appears
> as "APCP_03_A3" instead of simply "APCP_03".  In both instances, the
level
> is "A3".  Why is "_A3" being appended onto the variable name in the
> resulting netcdf output file?  I believe this is the source of my
problems
> with v6.0 output since I wasn't experiencing this issue when running
> grid_stat in the 5.1/5.2 versions.
>
> FYI, when running grid_stat, I set the level to be '(*,*)' for the
> forecast model fields, but this doesn't seem to be working with the
> variable names being output with the level appended onto the name.
>
> Please advise on how I should best proceed.
>
> Most sincerely,
> Jonathan
>
> -----Original Message-----
> From: John Halley Gotway via RT [mailto:met_help at ucar.edu]
> Sent: Monday, August 14, 2017 12:00 PM
> To: Case, Jonathan (MSFC-ST11)[ENSCO INC] <jonathan.case-1 at nasa.gov>
> Subject: Re: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs
> extremely slow
>
> Johnathan,
>
> This is John Halley Gotway.  I read through your message history
with
> Julie.  If possible, I'd suggest upgrading from METv5.1 to version
6.0.  Be
> aware though that version 6.0 is a pretty big change since we
switched from
> using NetCDF3 to NetCDF4.
>
> Practically speaking that requires NetCDF4 built upon HDF5.  So the
MET
> configuration/compilation process has changed slightly.
>
> We are currently working on version 6.1 which will require NetCDF4
as
> well.  So as long as you're recompiling, I'd recommend using 6.0.
>
> And yes, Julie's correct.  We were doing some frankly pretty dumb
memory
> allocation.  Grid-Stat is running so slowly because it keeps
allocating
> memory, deleting it, and then re-allocating a bit more.  The fix is
> simple... figure out how much memory is required ahead of time and
allocate
> it once.  We looked for this same issue in other applications and
applied
> the fix wherever we found it.
>
> Grid-Stat should run dramatically faster in version 6.0.
>
> Thanks,
> John
>
>
> On Thu, Aug 10, 2017 at 12:25 PM, Julie Prestopnik via RT <
> met_help at ucar.edu
> > wrote:
>
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
> >
> > Hi Jonathan.
> >
> > Yes, the description I had listed previously with the Problem and
> > Solution were included in the METv5.2 patch on 1/26/2017 as
described
> > further on this page:
> >
> > http://www.dtcenter.org/met/us <http://goog_287487168>
> > ers/support/known_issues/METv5 <http://goog_287487168>.2/index.php
> >
> > Also, underneath the Problem and Solution on that page are
specific
> > files that were affected.  I do not know if it would make sense to
> > simply grab those files to replace in METv5.1 and would not
suggest
> > doing that.  If possible, I highly recommend that you upgrade to
our
> > latest version 6.0, but if that doesn't make sense for you right
now,
> > I would suggest get the latest METv5.2 release with all of the
patches
> and installing it.
> >
> > I hope that helps.  Please let us know if you have other
questions.
> >
> > Julie

------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #81564] regrid_data_plane problem in v6.0 (CORRECTION to last email)
From: Case, Jonathan[ENSCO INC]
Time: Tue Aug 15 10:01:07 2017

Thanks John!

I will modify ASAP.  I already made some mods to accommodate the new
name from regrid_data_plane, but I much prefer your recommendation
because I found that my method also required modifications for
stat_analysis.

Thanks for the info,
Jon

-----Original Message-----
From: John Halley Gotway via RT [mailto:met_help at ucar.edu]
Sent: Tuesday, August 15, 2017 10:59 AM
To: Case, Jonathan (MSFC-ST11)[ENSCO INC] <jonathan.case-1 at nasa.gov>
Subject: Re: [rt.rap.ucar.edu #81564] regrid_data_plane problem in
v6.0 (CORRECTION to last email)

Jon,

I answered the pcp_combine question before reading this... the
regrid_data_plane tool also supports the "-name" command line option
for the user to specify the output variable name to be written.

It's writing APCP_03_A3 because, in general, it appends the level
information ("A3") to the variable name ("APCP_03") when creating the
output variable name.  I do realize that for precip that's a bit redundant!

But hopefully adding the "-name" command line option to your
regrid_data_plane calls will get it running smoothly.

Actually, there's another option to consider.  Instead of calling both
pcp_combine and regrid_data_plane on your model data, you could just
call pcp_combine and then let grid_stat do the regridding for you on
the fly.
Either way would work fine... Doing it on the fly has the advantage of
one fewer output file.  But it's up to you.

John
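As a sketch of the on-the-fly approach John describes (illustrative only; the method and width shown are example choices, not MET defaults), the Grid-Stat config file in v6.0 has a regrid dictionary that can match the forecast to the observation grid without a separate regrid_data_plane step:

```
// Illustrative Grid-Stat config fragment (MET v6.0-style).  "to_grid = OBS"
// tells Grid-Stat to regrid the forecast onto the observation (MRMS) grid
// on the fly; BUDGET interpolation with width 2 is one common choice for
// accumulated precipitation, shown here only as an example.
regrid = {
   to_grid = OBS;
   method  = BUDGET;
   width   = 2;
}
```

With this in place, only pcp_combine would be needed upstream, as John suggests.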

On Mon, Aug 14, 2017 at 2:02 PM, Case, Jonathan[ENSCO INC] via RT <
met_help at ucar.edu> wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
>
> CORRECTION: It looks like regrid_data_plane is appending the extra
> level field onto the output netcdf field name.
> The output from pcp_combine writes out the field name as "APCP_03",
> whereas the output from regrid_data_plane writes out the field as
> "APCP_03_A3".
>
> Sorry for the confusion,
> Jonathan
>
> -----Original Message-----
> From: Case, Jonathan (MSFC-ST11)[ENSCO INC]
> Sent: Monday, August 14, 2017 2:47 PM
> To: 'met_help at ucar.edu' <met_help at ucar.edu>
> Subject: RE: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs
> extremely slow
>
> Hello again John H-G,
>
> I've got my scripts set up to run regrid_data_plane and pcp_combine
on
> the model precip to sum/subtract and re-project onto the obs grid
(in
> my case, MRMS), prior to running grid_stat.  I had no trouble with
> this method prior to upgrading to version 6.0, but now I've
> encountered issues beyond the 1-h forecast time in the model run I'm
testing.
>
> In version 6.0, it appears that an extra level field is being
appended
> onto the variable name in what I believe is the output from
pcp_combine.
> For example, in the summed 3-hour forecast precip field, the name
> appears as "APCP_03_A3" instead of simply "APCP_03".  In both
> instances, the level is "A3".  Why is "_A3" being appended onto the
> variable name in the resulting netcdf output file?  I believe this
is
> the source of my problems with v6.0 output since I wasn't
experiencing
> this issue when running grid_stat in the 5.1/5.2 versions.
>
> FYI, when running grid_stat, I set the level to be '(*,*)' for the
> forecast model fields, but this doesn't seem to be working with the
> variable names being output with the level appended onto the name.
>
> Please advise on how I should best proceed.
>
> Most sincerely,
> Jonathan
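For context, the '(*,*)' setting Jonathan mentions lives in the Grid-Stat config's forecast field entry; a minimal fragment (using his variable name, purely illustrative) looks like:

```
// Illustrative fcst.field entry for Grid-Stat reading MET NetCDF output.
// The "name" must match the NetCDF variable exactly -- which is why an
// unexpected "APCP_03_A3" in the file breaks a config that asks for
// "APCP_03" -- and "(*,*)" selects the full 2D plane.
fcst = {
   field = [
      {
        name  = "APCP_03";
        level = [ "(*,*)" ];
      }
   ];
}
```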
>
> -----Original Message-----
> From: John Halley Gotway via RT [mailto:met_help at ucar.edu]
> Sent: Monday, August 14, 2017 12:00 PM
> To: Case, Jonathan (MSFC-ST11)[ENSCO INC] <jonathan.case-1 at nasa.gov>
> Subject: Re: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs
> extremely slow
>
> Jonathan,
>
> This is John Halley Gotway.  I read through your message history
with
> Julie.  If possible, I'd suggest upgrading from METv5.1 to version
> 6.0.  Be aware though that version 6.0 is a pretty big change since
we
> switched from using NetCDF3 to NetCDF4.
>
> Practically speaking that requires NetCDF4 built upon HDF5.  So the
> MET configuration/compilation process has changed slightly.
>
> We are currently working on version 6.1 which will require NetCDF4
as
> well.  So as long as you're recompiling, I'd recommend using 6.0.
>
> And yes, Julie's correct.  We were doing some frankly pretty dumb
> memory allocation.  Grid-Stat is running so slowly because it keeps
> allocating memory, deleting it, and then re-allocating a bit more.
> The fix is simple... figure out how much memory is required ahead of
> time and allocate it once.  We looked for this same issue in other
> applications and applied the fix wherever we found it.
>
> Grid-Stat should run dramatically faster in version 6.0.
>
> Thanks,
> John
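The allocation fix John outlines can be sketched generically (this is not MET source code, just the pattern: size the buffer once when the total is known, instead of growing it piecemeal):

```cpp
#include <cassert>
#include <vector>

// Slow pattern: grow one element at a time; the vector may reallocate and
// copy its contents many times as it grows.
std::vector<double> fill_incremental(int n_points) {
    std::vector<double> v;
    for (int i = 0; i < n_points; ++i) v.push_back(i * 0.5);
    return v;
}

// Fixed pattern: the number of grid points is known up front, so allocate
// the whole buffer in one big chunk and then fill it -- no reallocations.
std::vector<double> fill_preallocated(int n_points) {
    std::vector<double> v;
    v.reserve(n_points);  // one allocation for all n_points elements
    for (int i = 0; i < n_points; ++i) v.push_back(i * 0.5);
    return v;
}
```

For a 7000 x 3500 MRMS grid (24.5 million points per field), eliminating repeated reallocation of buffers this size is the kind of change behind the 36-minute-to-18-second improvement described above.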



------------------------------------------------
Subject: RE: [rt.rap.ucar.edu #81564] regrid_data_plane problem in v6.0 (CORRECTION to last email)
From: Case, Jonathan[ENSCO INC]
Time: Tue Aug 15 10:34:29 2017

Hi again John H-G,

Well this is embarrassing!  I'm already using the -name option when
running regrid_data_plane and found that I had an error in my python
script.  My script supports both total accumulation and bucket
accumulation types of output from WRF.  I've typically run with total
accumulation, but I'm currently testing with a bucket accumulation
model run with hourly output.  Oddly enough, I didn't run into this
problem until upgrading to v6.0.  But I did find a bug in that I was
not naming the APCP variable the same in the bucket section as in the
total accumulation section.  I'm working to make it consistent now and
hopefully this will clean up my issues.

-Jon




------------------------------------------------
Subject: MET v5.1 with MRMS data runs extremely slow
From: John Halley Gotway
Time: Wed Aug 16 09:38:25 2017

Jon,

Glad you were able to track down the issue.  Just let us know if more
issues or questions arise.

Thanks,
John

On Tue, Aug 15, 2017 at 10:34 AM, Case, Jonathan[ENSCO INC] via RT <
met_help at ucar.edu> wrote:

>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
>
> Hi again John H-G,
>
> Well this is embarrassing!  I'm already using the -name option when
> running regrid_data_plane and found that I had an error in my python
> script.  My script supports both total accumulation and bucket
accumulation
> type of output from WRF.  I've typically run with total
accumulation, but
> I'm currently testing with a bucket accumulation model run with
hourly
> output.  Oddly enough, I didn't run into this problem until
upgrading to
> v6.0.  But I did find a bug in that I was not naming the APCP
variable the
> same in the bucket section as in the total accumulation section.
I'm
> working to make it consistent now and hopefully this will clean up
my
> issues.
>
> -Jon
>
> -----Original Message-----
> From: John Halley Gotway via RT [mailto:met_help at ucar.edu]
> Sent: Tuesday, August 15, 2017 10:59 AM
> To: Case, Jonathan (MSFC-ST11)[ENSCO INC] <jonathan.case-1 at nasa.gov>
> Subject: Re: [rt.rap.ucar.edu #81564] regrid_data_plane problem in
v6.0
> (CORRECTION to last email)
>
> Jon,
>
> I answered the pcp_combine question before reading this... the
> regrid_data_plane tool also supports the "-name" command line option
for
> the user to specify the output variable name to be written.
>
> It's writing APCP_03_A3 because in general it appends the variable
name
> "APCP_03" and the level information "A3" when creating the output
variable
> name.  I do realize that for precip that's a bit redundant!
>
> But hopefully adding the "-name" command line option to your
> regrid_data_plane calls will get it running smoothly.
>
> Actually, there's another option to consider.  Instead of calling
both
> pcp_combine and regrid_data_plane on your model data, you could just
call
> pcp_combine and then let grid_stat do the regridding for you on the
fly.
> Either way would work fine... Doing it on the fly has the advantage
of one
> fewer output file.  But it's up to you.
>
> John
>
> On Mon, Aug 14, 2017 at 2:02 PM, Case, Jonathan[ENSCO INC] via RT <
> met_help at ucar.edu> wrote:
>
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
> >
> > CORRECTION: It looks like regrid_data_place is appending the extra
> > level field onto the output netcdf field name.
> > The output from pcp_combine writes out the field name as
"APCP_03",
> > whereas the output from regrid_data_plane writes out the field as
> > "APCP_03_A3".
> >
> > Sorry for the confusion,
> > Jonathan
> >
> > -----Original Message-----
> > From: Case, Jonathan (MSFC-ST11)[ENSCO INC]
> > Sent: Monday, August 14, 2017 2:47 PM
> > To: 'met_help at ucar.edu' <met_help at ucar.edu>
> > Subject: RE: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs
> > extremely slow
> >
> > Hello again John H-G,
> >
> > I've got my scripts set up to run regrid_data_plane and
pcp_combine on
> > the model precip to sum/subtract and re-project onto the obs grid
(in
> > my case, MRMS), prior to running grid_stat.  I had no trouble with
> > this method prior to upgrading to version 6.0, but now I've
> > encountered issues beyond the 1-h forecast time in the model run
I'm
> testing.
> >
> > In version 6.0, it appears that an extra level field is being appended
> > onto the variable name in what I believe is the output from pcp_combine.
> > For example, in the summed 3-hour forecast precip field, the name
> > appears as "APCP_03_A3" instead of simply "APCP_03".  In both
> > instances, the level is "A3".  Why is "_A3" being appended onto the
> > variable name in the resulting netcdf output file?  I believe this is
> > the source of my problems with v6.0 output since I wasn't experiencing
> > this issue when running grid_stat in the 5.1/5.2 versions.
> >
> > FYI, when running grid_stat, I set the level to be '(*,*)' for the
> > forecast model fields, but this doesn't seem to be working with the
> > variable names being output with the level appended onto the name.
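[Editor's note: one possible workaround, sketched from the names Jonathan
reports and not confirmed in this thread, would be to reference the renamed
variable directly in the Grid-Stat config.  The entry below is hypothetical:]

```
// Hypothetical fcst field entry in a Grid-Stat config file, matching
// the renamed variable that regrid_data_plane writes out in v6.0.
fcst = {
   field = [
      {
        name  = "APCP_03_A3";  // name as written by regrid_data_plane
        level = [ "(*,*)" ];   // NetCDF dimension wildcard
      }
   ];
};
```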
> >
> > Please advise on how I should best proceed.
> >
> > Most sincerely,
> > Jonathan
> >
> > -----Original Message-----
> > From: John Halley Gotway via RT [mailto:met_help at ucar.edu]
> > Sent: Monday, August 14, 2017 12:00 PM
> > To: Case, Jonathan (MSFC-ST11)[ENSCO INC] <jonathan.case-1 at nasa.gov>
> > Subject: Re: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data runs
> > extremely slow
> >
> > Jonathan,
> >
> > This is John Halley Gotway.  I read through your message history with
> > Julie.  If possible, I'd suggest upgrading from METv5.1 to version
> > 6.0.  Be aware though that version 6.0 is a pretty big change since we
> > switched from using NetCDF3 to NetCDF4.
> >
> > Practically speaking that requires NetCDF4 built upon HDF5.  So the
> > MET configuration/compilation process has changed slightly.
> >
> > We are currently working on version 6.1 which will require NetCDF4 as
> > well.  So as long as you're recompiling, I'd recommend using 6.0.
> >
> > And yes, Julie's correct.  We were doing some frankly pretty dumb
> > memory allocation.  Grid-Stat is running so slowly because it keeps
> > allocating memory, deleting it, and then re-allocating a bit more.
> > The fix is simple... figure out how much memory is required ahead of
> > time and allocate it once.  We looked for this same issue in other
> > applications and applied the fix wherever we found it.
> >
> > Grid-Stat should run dramatically faster in version 6.0.
> >
> > Thanks,
> > John
> >
> >
> > On Thu, Aug 10, 2017 at 12:25 PM, Julie Prestopnik via RT <
> > met_help at ucar.edu> wrote:
> >
> > >
> > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
> > >
> > > Hi Jonathan.
> > >
> > > Yes, the Problem and Solution I had listed previously were included
> > > in the METv5.2 patch on 1/26/2017, as described further on this page:
> > >
> > > http://www.dtcenter.org/met/users/support/known_issues/METv5.2/index.php
> > >
> > > Also, underneath the Problem and Solution on that page are specific
> > > files that were affected.  I do not know if it would make sense to
> > > simply grab those files to replace in METv5.1 and would not suggest
> > > doing that.  If possible, I highly recommend that you upgrade to our
> > > latest version 6.0, but if that doesn't make sense for you right
> > > now, I would suggest getting the latest METv5.2 release with all of
> > > the patches and installing it.
> > >
> > > I hope that helps.  Please let us know if you have other questions.
> > >
> > > Julie
> > >
> > >
> > >
> > > On Thu, Aug 10, 2017 at 11:04 AM, Case, Jonathan[ENSCO INC] via RT <
> > > met_help at ucar.edu> wrote:
> > >
> > > >
> > > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564 >
> > > >
> > > > Hi again Julie,
> > > >
> > > > I see in the MET 6.0 documentation that performance improvements
> > > > can be realized in several programs including grid_stat, but not
> > > > in the MET v5.2 patch.  Was this related to the MET v5.2 fix you
> > > > referred to below?  Perhaps I could simply obtain the files you
> > > > implemented in MET v5.2 for this improvement to grid_stat
> > > > continuous stats, because that is definitely where I'm seeing the
> > > > biggest back-log.  Were new source code files released for this?
> > > >
> > > > Jonathan
> > > >
> > > > -----Original Message-----
> > > > From: Julie Prestopnik via RT [mailto:met_help at ucar.edu]
> > > > Sent: Thursday, August 10, 2017 10:24 AM
> > > > To: Case, Jonathan (MSFC-ST11)[ENSCO INC]
> > > > <jonathan.case-1 at nasa.gov>
> > > > Subject: Re: [rt.rap.ucar.edu #81564] MET v5.1 with MRMS data
runs
> > > > extremely slow
> > > >
> > > > Hi Jonathan.  Our MET expert is out of the office for the rest of
> > > > this week.  However, I do believe I have some useful information
> > > > for you.  Thank you for letting us know what version of MET you
> > > > are using.  I see that you are currently using METv5.1.  We did
> > > > fix a problem (back in January in METv5.2, but not METv5.1) with
> > > > Grid-Stat because users reported very slow performance when
> > > > computing continuous statistics on high resolution grids:
> > > >
> > > > ----
> > > >
> > > > From: http://www.dtcenter.org/met/users/support/known_issues/METv5.2/index.php
> > > >
> > > > Fix memory allocation inefficiencies.  Posted 01/26/2017
> > > > *Problem:* Users reported very slow performance from Grid-Stat
> > > > when computing continuous statistics on high resolution grids.
> > > > Debugging revealed very inefficient memory allocation logic in the
> > > > MET statistics libraries.  Many small memory allocations and
> > > > reallocations slowed down the code considerably.
> > > > *Solution:* The fix is to update the MET library and application
> > > > code to allocate the expected amount of required memory in one big
> > > > chunk.  Updating this logic caused the runtime of one test case to
> > > > improve from 36 minutes to 18 seconds.
> > > >
> > > > ----
> > > >
> > > > It is possible that MET is running slowly due to the dense grid
> > > > that you are using; however, if possible, I would suggest that you
> > > > upgrade to METv6.0 (our latest release) and give that a shot using
> > > > your current grid and check for a speed up in processing before
> > > > you interpolate to a coarser-res grid.  Please note that when
> > > > compiling METv6.0 *do not use* the previous MET environments.  In
> > > > v6.0, we've made a large change from using NetCDF3 to 4.  You'll
> > > > need to have installed both NetCDF4 and HDF5, upon which NetCDF4
> > > > is built.  For more details please take a look at the:
> > > >
> > > > *MET Users Guide*
> > > > http://www.dtcenter.org/met/users/docs/users_guide/MET_Users_Guide_v6.0.pdf
> > > >
> > > > *Online tutorial*
> > > > http://www.dtcenter.org/met/users/support/online_tutorial/METv6.0/tutorial.php?name=compilation&category=index
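[Editor's note: a minimal sketch of a v6.0 build environment.  The install
prefixes are placeholders, and the MET_* variable names follow the MET
compilation instructions linked above; verify them against the tutorial
for your platform.]

```shell
# Placeholders -- point these at your own NetCDF4/HDF5 installations.
export MET_NETCDF=/usr/local/netcdf4   # NetCDF4 library (built on HDF5)
export MET_HDF5=/usr/local/hdf5        # HDF5 required by NetCDF4
export CC=gcc
export CXX=g++

# Configure, build, and test from the untarred MET source directory.
./configure --prefix=/usr/local/met-6.0
make install
make test
```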
> > > >
> > > > I hope this helps.  Please let us know how it goes.
> > > >
> > > > Thanks,
> > > > Julie
> > > >
> > > > On Thu, Aug 10, 2017 at 8:22 AM, Case, Jonathan[ENSCO INC] via RT <
> > > > met_help at ucar.edu> wrote:
> > > >
> > > > >
> > > > > Thu Aug 10 08:22:26 2017: Request 81564 was acted upon.
> > > > > Transaction: Ticket created by jonathan.case-1 at nasa.gov
> > > > >        Queue: met_help
> > > > >      Subject: MET v5.1 with MRMS data runs extremely slow
> > > > >        Owner: Nobody
> > > > >   Requestors: jonathan.case-1 at nasa.gov
> > > > >       Status: new
> > > > >  Ticket <URL:
> > > > > https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=81564
> > > > > >
> > > > >
> > > > >
> > > > > Dear MET Help,
> > > > >
> > > > > I'm using MRMS gauge-corrected radar gridded QPE (7000 x 3500
> > > > > large grid at 0.01-deg spacing) running through MET (version 5.1
> > > > > is what I'm running), but it's processing extremely slowly.  I
> > > > > have a 12-km grid-spaced model test run over the SE U.S., which
> > > > > I interpolated to the MRMS grid using the regrid_data_plane
> > > > > utility.  This alone takes quite a while to do (perhaps a few
> > > > > minutes).  Then running grid_stat takes several more minutes per
> > > > > scenario.  In all, I've only gotten through the first 14 hours
> > > > > of my model run hourly output while running jobs in a script
> > > > > overnight.
> > > > >
> > > > > Do you know of a faster way to process high-res precip grids
> > > > > within MET?  This seems prohibitively too slow since I need to
> > > > > run verification for daily forecasts across multiple months.
> > > > > Perhaps I should be interpolating fields to the coarser-res grid
> > > > > (whether the obs or forecast grid?)?
> > > > >
> > > > > Appreciate the help,
> > > > > Jon
> > > > >
> > > > >
> > > > > *****************************************************************
> > > > > Jonathan Case;  Research Meteorologist at ENSCO, Inc./NASA
> > > > > Short-term Prediction Research and Transition (SPoRT) Center
> > > > > 320 Sparkman Dr., Room 3008; Huntsville, AL 35805
> > > > > Emails: Jonathan.Case-1 at nasa.gov (preferred) or
> > > > > case.jonathan at ensco.com
> > > > > Voice: 256.961.7504  ;  Fax: 256.961.7788
> > > > > http://weather.msfc.nasa.gov/sport/
> > > > > *****************************************************************
> > > > >
> > > > >
> > > > >
> > > >
> > > >
> > > >
> > > >
> > >
> > >
> >
> >
> >
> >
>
>
>
>

------------------------------------------------


More information about the Met_help mailing list