[Met_help] [rt.rap.ucar.edu #70099] History for MET: validating WRF shallow cumuli fields with MODIS data

John Halley Gotway via RT met_help at ucar.edu
Wed Jan 7 14:12:39 MST 2015


----------------------------------------------------------------
  Initial Request
----------------------------------------------------------------

Good afternoon,

I'm interested in validating the shallow cumuli in my WRF simulations, and
for this I was thinking of using MODIS data.  My thought was to follow this
plan:

- use the UPP to destagger WRF output and get it into GRIB format
- use MODIS_regrid in MET to reformat MODIS Level 2 data
- use copygb to get WRF output and MODIS data onto a common verification grid
- use MET tools, probably GRID_STAT, to validate WRF output against MODIS data

However, I'm running into some apparent complications:

1. MODIS_regrid seems to only be able to output MODIS data in netCDF format, yet copygb seems to require both WRF output and any observations to be in GRIB format.  First, is that true?  If so, what tool do you recommend for converting MODIS netCDF output into GRIB format?  I've found ncl_convert2nc, ncl_grib_2nc, and cdo -f grb <netCDF file> <GRIB file>, but I'm not familiar with any of them and wondered if you have recommendations and/or cautions.

2. A few questions have arisen about how to use UPP.  I've read the instructions on pg. 347-369 of the WRF Users' Guide.  For starters, does it have to be installed in the same directory as WRF?  I didn't see that statement in the instructions, but based on TOP_DIR's description as "the top-level directory for source codes (UPPV1 and WRFV3)", I suspect it does.  So I should have WPS, WRFV3, and UPPV1 all at the same subdirectory level within the same parent directory?

Thank you,
Jess


----------------------------------------------------------------
  Complete Ticket History
----------------------------------------------------------------

Subject: MET: validating WRF shallow cumuli fields with MODIS data
From: Kunke, Jessica
Time: Mon Dec 15 13:00:46 2014

Correction to my previous email:  I was saying UPPV1 because that's what the WRF users' guide says in the instructions, but I actually have version 2.

Also, the instructions say to set DOMAINPATH to the "directory where UPP will be run from."  What limits which directory I can choose to run UPP from?  Can I set it to the path to my UPPV2 directory?

------------------------------------------------
Subject: MET: validating WRF shallow cumuli fields with MODIS data
From: Kunke, Jessica
Time: Mon Dec 15 13:05:19 2014

Version 2.2, actually, so my directory name is UPPV2.2

------------------------------------------------
Subject: MET: validating WRF shallow cumuli fields with MODIS data
From: John Halley Gotway
Time: Mon Dec 15 14:56:18 2014

Hello Jess,

I see that you'd like to start using MET to compare your WRF output with MODIS data.  Hopefully we'll be able to help you get going on that.  Sounds like you have a pretty good plan laid out.  Unfortunately, there is no easy way in general to regrid NetCDF data.  However, you're in luck - the MODIS_regrid tool is one of the only tools in MET that enables you to regrid data.

Here's what I'd suggest (a rough sketch of the commands follows this list)...
 - Run UPP to post-process your raw WRF output files.
 - Run copygb to re-grid the GRIB output to your desired evaluation domain.
 - Run modis_regrid to re-grid the MODIS data to your desired evaluation domain (use the -data_file option to pass it a GRIB file defining your desired output domain).
 - Run grid_stat, mode, wavelet-stat, or series-analysis to compare the GRIB output of UPP to the NetCDF output of modis_regrid and compute whatever statistics you'd like.  As long as the data is on the same grid, they can be in different file formats.
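
As a rough sketch of those commands (the file names, the grid navigation string, and the modis_regrid options other than -data_file are placeholders based on my reading of the MET Users Guide, so check them against the usage statements for your versions):

  # 1. Post-process the raw wrfout file(s) with UPP (run from your postprd directory)
  ./run_unipost_frames

  # 2. Re-grid the UPP GRIB output to the evaluation grid; the quoted -g string
  #    is the GRIB grid navigation covered in the copygb section of the MET
  #    online tutorial
  copygb -xg"${GRID_SPEC}" WRFPRS_d01.00 wrfprs_d01_regrid.grb

  # 3. Put the MODIS Level 2 data on the same grid, using the re-gridded GRIB
  #    file to define the output domain (additional options such as -scale,
  #    -offset, and -fill may be required for some fields)
  modis_regrid -data_file wrfprs_d01_regrid.grb -field Cloud_Fraction \
               -out modis_regrid.nc MOD06_L2.sample_granule.hdf

  # 4. Compare the two gridded files and compute statistics
  grid_stat wrfprs_d01_regrid.grb modis_regrid.nc GridStatConfig -outdir out/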

I haven't actually used the modis_regrid tool myself that much.  So if questions come up about that, I'll refer you to the developer of that tool.

Regarding UPP, support for it is provided through wrfhelp at ucar.edu.  I can take a stab at easy questions, but will refer you to wrfhelp for tougher ones!

We have WPS, WRF, and UPP compiled in a directory structure similar to what you describe on our machines here.  Actually though, we have WPS, WRF, and UPP all at the same level.  The WRF directory contains subdirectories for many different versions (v3.5, v3.6, and so on).  Likewise, UPP has subdirectories for different versions (UPPV1.1 and UPPV2.1).  So there is no explicit requirement for how to organize your directory structure.

When you run "configure" for UPP, it does need to locate WRF since it links to some WRF libraries.  You can specify where to find WRF by (1) setting the WRF_DIR environment variable, (2) having WRFV3 at the same level as UPPV2.1, or (3) explicitly entering it at the prompt from "configure".

Hope that helps.
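
For what it's worth, option (1) could look something like the following before building UPP (the paths here are just placeholders):

  export WRF_DIR=/path/to/WRFV3    # wherever your compiled WRFV3 lives
  cd /path/to/UPPV2.2
  ./configure
  ./compile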

Thanks,
John Halley Gotway
met_help at ucar.edu

------------------------------------------------
Subject: MET: validating WRF shallow cumuli fields with MODIS data
From: Kunke, Jessica
Time: Tue Dec 16 09:09:56 2014

Hello John,

Thanks for your prompt reply.  Here are a few follow-up questions, and if they need to be sent to wrfhelp instead, please just let me know.

1. The only reason I anticipated having to convert the MODIS netCDF to GRIB format was because I thought the way copygb worked was that it required two GRIB-format files, and that it would take those two files and regrid them onto the same grid.  Is that not true?  How do I tell copygb what my desired grid and/or evaluation domain would be?  I'm actually fairly confused about how to use copygb.  I read the instructions on pg. 358-359 of the WRF users' guide, and it's not clear to me what series of commands I would use with what options, for my purpose of getting WRF output onto the grid I will put MODIS on (or putting them on the same grid using copygb, if I input both the model and observation files).

2. The four scripts run_unipost, run_unipost_frames, run_unipost_gracet, and run_unipost_minutes seem to be pretty similar.  Can you explain the difference between them and when I would use one versus another?  I've tried using diff on the files, but I still don't quite get the differences in their application.

3. I just wanted to check that I understand the environment variables.  Before I configured and compiled UPP, I set WRF_DIR in my .bashrc file.  Now in whichever script I choose to copy over to postprd and edit, I set the following variables:

export TOP_DIR=<path to UPPV2.2, even if WRFV3 is somewhere else?>  <== since UNIPOST_HOME and WRFPATH are what define where UPPV2.2 and WRFV3 are, do I even need TOP_DIR?

export DOMAINPATH=${TOP_DIR}/DOMAINS/test_case  <== what is /DOMAINS/test_case?  I don't have this directory.  Do I run from UPPV2.2, from bin, or from postprd?

export WRFPATH=${WRF_DIR}

export UNIPOST_HOME=${TOP_DIR}/UPPV2.0   <== can I change this to ${TOP_DIR}?

export POSTEXEC=${UNIPOST_HOME}/bin

4. My WRF simulation started at Aug 30 2013 06:00:00 and ended at Aug 31 2013 00:00:00, and I set history_interval = 60 and frames_per_outfile = 80, so that it saves data to my wrfout file every hour and puts up to 80 such hourly outputs in the same file.  What does this mean for how I set startdate, fhr, lastfhr, and incrementhr?  MODIS data is only available at 10:30am and 1:30pm, so in theory I'm only comparing at these two times, but I may want to do some temporal averaging or something in case WRF captured the MODIS data well but was off by a little time.

Thanks,
Jess



------------------------------------------------
Subject: MET: validating WRF shallow cumuli fields with MODIS data
From: John Halley Gotway
Time: Tue Dec 16 10:08:03 2014

Jess,

1. copygb takes as input one GRIB file and a specification of the desired output grid.  The grid is specified by a long and rather cryptic series of numbers.  Please take a look at the MET online tutorial, which includes a section on running copygb:

http://www.dtcenter.org/met/users/support/online_tutorial/METv5.0/copygb/index.php

Just click the forward arrows in the bottom-right corner to advance to the next page.  There are examples of running copygb to regrid to lat/lon, lambert conformal, and polar stereographic grids.  Mercator is supported as well.  Just try to work through an example and let me know if you get stuck.
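
The general shape of the command for the lat/lon case is below; the bracketed fields are placeholders, and the exact field order and values should be taken from that tutorial page for whichever projection you actually want:

  # Hypothetical example -- fill in the <...> values using the tutorial page
  NAV="255 0 <nx> <ny> <corner lat> <corner lon> <res flag> <opposite corner lat> <opposite corner lon> <dlat> <dlon> <scan mode>"
  copygb -xg"${NAV}" WRFPRS_d01.00 wrfprs_d01_latlon.grb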

2. Here's my best guess... run_unipost is the original and most basic one.  run_unipost_frames is for when you have multiple output times in a single wrfout file.  Typically, you write one output file per time, but wrf can be configured to write multiple times into a single file, and you'd use the "_frames" variant in that case.  run_unipost_minutes would be used when your output times don't fall exactly on the hour, e.g. wrfout every 15 minutes.  And I have no idea about "_gracet".  Rather than having 4 versions of essentially the same script, it would be preferable to have a single script that supports multiple options.  But unfortunately, that's not what's available.

3. I think you should write wrfhelp at ucar.edu with this question.  While I run these scripts periodically, I really don't know them well enough to answer your questions about setting up the directory structure and environment variables.

4. Since you'll have multiple output times per file, you should use the "run_unipost_frames" script.  I assume you'd set startdate to the beginning time of your simulation (following the time format already in the script).  fhr would be 0, lastfhr would be 80, and incrementhr would be 1.  At least that's what I'd try first.
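
Concretely, in the copy of run_unipost_frames that you edit, that would look something like the lines below (the startdate format is my assumption here -- follow whatever format the template in the script already uses):

  export startdate=2013083006   # simulation start: Aug 30 2013, 06 UTC
  export fhr=00                 # first forecast hour to post-process
  export lastfhr=80             # as suggested above; the run itself spans
                                # 18 hours, so 18 would also cover every frame
  export incrementhr=01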

Hope that helps.

Thanks,
John


------------------------------------------------
Subject: MET: validating WRF shallow cumuli fields with MODIS data
From: Kunke, Jessica
Time: Tue Dec 16 11:06:25 2014

Hello John,

Thanks again for your reply.  I'll check out the copygb part of the tutorial (I've been looking at the tutorial, but you're right that it would be good to revisit that section).  My one question about the run_unipost_frames script is that regular run_unipost seems to be able to accommodate multiple times too, in its use of fhr, lastfhr, and incrementhr.  Is that something different?  And I've followed up with wrfhelp on these UPP questions I've had.

Thank you!  Best wishes,
Jess


------------------------------------------------
Subject: MET: validating WRF shallow cumuli fields with MODIS data
From: John Halley Gotway
Time: Tue Dec 16 14:15:46 2014

Jess,

Looking at those two scripts that are 95% identical really makes me think they should be one script with multiple options!

That aside, here's the substantive difference:
154c158
< ../wrfprd/wrfout_${domain}_${YY}-${MM}-${DD}_${HH}:00:00
---
> ../wrfprd/wrfout_${domain}_${YYi}-${MMi}-${DDi}_${HHi}:00:00

In run_unipost, the timestamp in the wrfout file name (${YY}-${MM}-${DD}_${HH}) is the time being processed.  If you're processing 10 different output times, it looks for 10 different file names, one for each timestamp.

In run_unipost_frames, the timestamp in the wrfout file name (${YYi}-${MMi}-${DDi}_${HHi}) is the initialization time.  I'm guessing that's why they appended the 'i' to the variable names.  If you're processing 10 different output times, all with the same initialization, it looks for a single file name that includes the initialization timestamp.

Since your data has multiple output times in each file, use run_unipost_frames.
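
For example, with the run described earlier in this ticket (initialized 2013-08-30_06, domain d01 assumed), run_unipost would look for one file per valid time:

  ../wrfprd/wrfout_d01_2013-08-30_06:00:00
  ../wrfprd/wrfout_d01_2013-08-30_07:00:00
  ../wrfprd/wrfout_d01_2013-08-30_08:00:00
  ...

while run_unipost_frames would look for the single file stamped with the initialization time, which contains all of the hourly frames:

  ../wrfprd/wrfout_d01_2013-08-30_06:00:00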

Thanks,
John

------------------------------------------------
Subject: MET: validating WRF shallow cumuli fields with MODIS data
From: Kunke, Jessica
Time: Tue Dec 16 14:19:06 2014

Hello John,

That makes sense about having one script! And thank you for your helpful
clarification on the frames script.

Have a lovely holiday!
Jess

On Tue, Dec 16, 2014 at 4:15 PM, John Halley Gotway via RT <met_help at ucar.edu>
wrote:
>
> Jess,
>
> Looking at those two scripts that are 95% identical really makes me think
> they should be one script with multiple options!
>
> That aside, here's the substantive difference:
> 154c158
> < ../wrfprd/wrfout_${domain}_${YY}-${MM}-${DD}_${HH}:00:00
> ---
> > ../wrfprd/wrfout_${domain}_${YYi}-${MMi}-${DDi}_${HHi}:00:00
>
> In run_unipost, the timestamp in the wrfout file name
> (${YY}-${MM}-${DD}_${HH}) is the time being processed.  If you're
> processing 10 different output times, it looks for 10 different file
> names, one for each timestamp.
>
> In run_unipost_frames, the timestamp in the wrfout file name
> (${YYi}-${MMi}-${DDi}_${HHi}) is the initialization time.  I'm guessing
> that's why they appended the 'i' to the variable names.  If you're
> processing 10 different output times, all with the same initialization,
> it looks for a single file name that includes the initialization
> timestamp.
>
> Since your data has multiple output times in each file, use
> run_unipost_frames.
>
> Thanks,
> John
>
>
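To make that naming difference concrete, here is a minimal sketch of the two
patterns (the domain name and dates are illustrative, drawn from the Aug 30
2013 06 UTC run described later in this thread, not from the actual scripts):

    # run_unipost: one wrfout file per valid time being processed, e.g.
    #   ../wrfprd/wrfout_d01_2013-08-30_07:00:00
    #   ../wrfprd/wrfout_d01_2013-08-30_08:00:00
    wrfout=../wrfprd/wrfout_${domain}_${YY}-${MM}-${DD}_${HH}:00:00

    # run_unipost_frames: a single wrfout file named for the initialization
    # time, holding all of the output frames, e.g.
    #   ../wrfprd/wrfout_d01_2013-08-30_06:00:00
    wrfout=../wrfprd/wrfout_${domain}_${YYi}-${MMi}-${DDi}_${HHi}:00:00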
> On Tue, Dec 16, 2014 at 11:06 AM, Kunke, Jessica via RT <met_help at ucar.edu>
> wrote:
> >
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=70099 >
> >
> > Hello John,
> >
> > Thanks again for your reply.  I'll check out the copygb part of the
> > tutorial (I've been looking at the tutorial, but you're right that it
> > would be good to revisit that section).  My one question about the
> > run_unipost_frames script is that regular run_unipost seems to be able
> > to accommodate multiple times too, in its use of fhr, lastfhr, and
> > incrementhr.  Is that something different?  And I've followed up with
> > wrfhelp on these UPP questions I've had.
> >
> > Thank you!  Best wishes,
> > Jess
> >
> > On Tue, Dec 16, 2014 at 12:08 PM, John Halley Gotway via RT <
> > met_help at ucar.edu> wrote:
> > >
> > > Jess,
> > >
> > > 1. copygb takes as input one GRIB file and a specification of the
> > > desired output grid.  The grid is specified by a long and rather
> > > cryptic series of numbers.  Please take a look at the MET online
> > > tutorial which includes a section on running copygb:
> > >
> > > http://www.dtcenter.org/met/users/support/online_tutorial/METv5.0/copygb/index.php
> > >
> > > Just click the forward arrows in the bottom-right corner to advance to
> > > the next page.  There are examples of running copygb to regrid to
> > > lat/lon, lambert conformal, and polar stereographic grids.  Mercator
> > > is supported as well.  Just try to work through an example and let me
> > > know if you get stuck.
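For reference, a copygb call generally takes the form below.  The quoted grid
specification is the "cryptic series of numbers" John mentions; copy or adapt
the appropriate string from the tutorial examples for your projection (the
file names here are hypothetical):

    # regrid a UPP GRIB file onto a user-defined evaluation grid
    # ${grid_spec} = the full grid specification string from the tutorial
    copygb.exe -xg"${grid_spec}" WRFPRS_d01.06 wrfprs_d01.06.regrid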
> > >
> > > 2. Here's my best guess... run_unipost is the original and most basic
> > > one.  run_unipost_frames is for when you have multiple output times in
> > > a single wrfout file.  Typically, you write one output file per time.
> > > But wrf can be configured to write multiple times into a single file.
> > > And you'd use the "_frames" variant in that case.
> > > run_unipost_minutes would be used when your output times don't fall
> > > exactly on the hour, e.g. wrfout every 15 minutes.  And I have no idea
> > > about "_gracet".  Rather than having 4 versions of essentially the
> > > same script, it would be preferable to have a single script that
> > > supports multiple options.  But unfortunately, that's not what's
> > > available.
> > >
> > > 3. I think you should write wrfhelp at ucar.edu with this question.
> > > While I run these scripts periodically, I really don't know them well
> > > enough to answer your questions about setting up the directory
> > > structure and environment variables.
> > >
> > > 4. Since you'll have multiple output times per file, you should use
> > > the "run_unipost_frames" script.  I assume you'd set startdate to the
> > > beginning time of your simulation (following the time format already
> > > in the script).  fhr would be 0, lastfhr would be 80 and incrementhr
> > > would be 1.  At least that's what I'd try first.
> > >
> > > Hope that helps.
> > >
> > > Thanks,
> > > John
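In the script's own variables, point 4 might translate to something like the
following (the YYYYMMDDHH startdate format is an assumption based on the
stock run_unipost scripts; check the comments in your copy):

    # edits near the top of run_unipost_frames, following John's suggestion
    export startdate=2013083006   # simulation start: Aug 30 2013, 06 UTC
    export fhr=00                 # first forecast hour to post-process
    export lastfhr=80             # last forecast hour to attempt
    export incrementhr=01         # step through the hourly frames one by one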
> > >
> > > > On Tue, Dec 16, 2014 at 9:09 AM, Kunke, Jessica via RT <met_help at ucar.edu>
> > > > wrote:
> > > > >
> > > > >
> > > > > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=70099 >
> > > > >
> > > > > Hello John,
> > > > >
> > > > > Thanks for your prompt reply.  Here are a few follow-up questions,
> > > > > and if they need to be sent to wrfhelp instead, please just let me
> > > > > know.
> > > > >
> > > > > 1. The only reason I anticipated having to convert the MODIS netCDF
> > > > > to GRIB format was because I thought the way copygb worked was that
> > > > > it required two GRIB-format files, and that it would take those two
> > > > > files and regrid them onto the same grid.  Is that not true?  How
> > > > > do I tell copygb what my desired grid and/or evaluation domain
> > > > > would be?  I'm actually fairly confused about how to use copygb.  I
> > > > > read the instructions on pg. 358-359 of the WRF users' guide, and
> > > > > it's not clear to me what series of commands I would use with what
> > > > > options, for my purpose of getting WRF output onto the grid I will
> > > > > put MODIS on (or putting them on the same grid using copygb, if I
> > > > > input both the model and observation files).
> > > > >
> > > > > 2. The four scripts run_unipost, run_unipost_frames,
> > > > > run_unipost_gracet, and run_unipost_minutes seem to be pretty
> > > > > similar.  Can you explain the difference between them, when I would
> > > > > use one versus the other?  I've tried using diff on the files, but
> > > > > I still don't quite get the differences in their application.
> > > >
> > > > > 3. I just wanted to check that I understand the environment
> > > > > variables.  Before I configured and compiled UPP, I set WRF_DIR in
> > > > > my .bashrc file.  Now in whichever script I choose to copy over to
> > > > > postprd and edit, I set the following variables:
> > > > >
> > > > > export TOP_DIR=<path to UPPV2.2, even if WRFV3 is somewhere else?>
> > > > > <== since UNIPOST_HOME and WRFPATH are what define where UPPV2.2
> > > > > and WRFV3 are, do I even need TOP_DIR?
> > > > >
> > > > > export DOMAINPATH=${TOP_DIR}/DOMAINS/test_case  <== what is
> > > > > /DOMAINS/test_case?  I don't have this directory.  Do I run from
> > > > > UPPV2.2, from bin, or from postprd?
> > > > >
> > > > > export WRFPATH=${WRF_DIR}
> > > > >
> > > > > export UNIPOST_HOME=${TOP_DIR}/UPPV2.0   <== can I change this to
> > > > > ${TOP_DIR}?
> > > > >
> > > > > export POSTEXEC=${UNIPOST_HOME}/bin
> > > > >
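Purely as an illustration of one self-consistent way to fill these in (all
paths are hypothetical, and as John notes above, wrfhelp is the authority on
these settings):

    # assumes UPP was built under $HOME/UPPV2.2 and WRF under $HOME/WRFV3
    export TOP_DIR=$HOME                  # parent directory of the source trees
    export UNIPOST_HOME=${TOP_DIR}/UPPV2.2
    export WRFPATH=${WRF_DIR}             # WRF_DIR already exported in .bashrc
    export POSTEXEC=${UNIPOST_HOME}/bin
    export DOMAINPATH=${UNIPOST_HOME}     # assumes postprd/ and parm/ live here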
> > > > > 4. My WRF simulation started at Aug 30 2013 06:00:00 and ended at
> > > > > Aug 31 2013 00:00:00, and I set history_interval = 60 and
> > > > > frames_per_outfile=80, so that it saves data to my wrfout file
> > > > > every hour and puts up to 80 such hourly outputs in the same file.
> > > > > What does this mean for how I set startdate, fhr, lastfhr, and
> > > > > incrementhr?  MODIS data is only available at 10:30am and 1:30pm,
> > > > > so in theory I'm only comparing at these two times, but I may want
> > > > > to do some temporal averaging or something in case WRF captured the
> > > > > MODIS data well but was off by a little time.
> > > > >
> > > > > Thanks,
> > > > > Jess
> > > >
> > > >
> > > > On Mon, Dec 15, 2014 at 4:56 PM, John Halley Gotway via RT <
> > > > met_help at ucar.edu> wrote:
> > > > >
> > > > > Hello Jess,
> > > > >
> > > > > I see that you'd like to start using MET to compare your WRF output
> > > > > with MODIS data.  Hopefully we'll be able to help you get going on
> > > > > that.  Sounds like you have a pretty good plan laid out.
> > > > > Unfortunately, there is no easy way in general to regrid NetCDF
> > > > > data.  However, you're in luck - the MODIS_regrid tool is one of
> > > > > the only tools in MET that enables you to regrid data.
> > > > >
> > > > > Here's what I'd suggest...
> > > > >  - Run UPP to post-process your raw WRF output files.
> > > > >  - Run copygb to re-grid the GRIB output to your desired evaluation
> > > > >    domain.
> > > > >  - Run modis_regrid to re-grid the MODIS data to your desired
> > > > >    evaluation domain (use the -data_file option to pass it a GRIB
> > > > >    file defining your desired output domain).
> > > > >  - Run grid_stat, mode, wavelet-stat, or series-analysis to compare
> > > > >    the GRIB output of UPP to the NetCDF output of modis_regrid and
> > > > >    compute whatever statistics you'd like.  As long as the data is
> > > > >    on the same grid, they can be in different file formats.
> > > > > I haven't actually used the modis_regrid tool myself that much.
> > > > > So if questions come up about that, I'll refer you to the developer
> > > > > of that tool.
> > > > >
> > > > > Regarding UPP, support for it is provided through wrfhelp at ucar.edu.
> > > > > I can take a stab at easy questions, but will refer you to wrfhelp
> > > > > for tougher ones!
> > > > >
> > > > > We have WPS, WRF, and UPP compiled in a directory structure similar
> > > > > to what you describe on our machines here.  Actually though, we
> > > > > have WPS, WRF, and UPP all at the same level.  The WRF directory
> > > > > contains subdirectories for many different versions (v3.5, v3.6,
> > > > > and so on).  Likewise, UPP has subdirectories for different
> > > > > versions (UPPV1.1 and UPPV2.1).  So there is no explicit
> > > > > requirement for how to organize your directory structure.
> > > > >
> > > > > When you run "configure" for UPP, it does need to locate WRF since
> > > > > it links to some WRF libraries.  You can specify where to find WRF
> > > > > by (1) setting the WRF_DIR environment variable or (2) having WRFV3
> > > > > at the same level as UPPV2.1 or (3) by explicitly entering it at
> > > > > the prompt from "configure".
> > > > >
> > > > > Hope that helps.
> > > > >
> > > > > Thanks,
> > > > > John Halley Gotway
> > > > > met_help at ucar.edu
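For option (1), that can be as simple as the following before running
configure in the UPP directory (paths hypothetical):

    export WRF_DIR=$HOME/WRFV3   # point UPP's configure at the WRF build
    cd $HOME/UPPV2.2
    ./configure                  # then build UPP as usual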

------------------------------------------------
Subject: MET: validating WRF shallow cumuli fields with MODIS data
From: John Halley Gotway
Time: Tue Dec 16 14:27:06 2014

Thanks, you too.  I'll resolve this ticket.  Just let us know if any more
questions arise in your use of MET.

Thanks,
John

On Tue, Dec 16, 2014 at 2:19 PM, Kunke, Jessica via RT <met_help at ucar.edu>
wrote:
>
>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=70099 >
>
> Hello John,
>
> That makes sense about having one script! And thank you for your helpful
> clarification on the frames script.
>
> Have a lovely holiday!
> Jess

------------------------------------------------


More information about the Met_help mailing list