[Met_help] [rt.rap.ucar.edu #62175] History for MET 4 (or 4.1) question(s)

John Halley Gotway via RT met_help at ucar.edu
Tue Aug 27 10:39:00 MDT 2013


----------------------------------------------------------------
  Initial Request
----------------------------------------------------------------

Hi. 
First, to the developer, thank you for providing MET to the world. It
looks outstanding and very useful. If you don't get enough thanks, I hope
just one more "thank you" makes your day.

I installed MET 4.1 on a Linux machine.
I am ready to start using MET to compare 40 station observations of daily
accumulated precip to 64 WRF-ARW simulations for the same area.
I have read the MET 4.1 manual for the past week and gone through all the
examples.

But I am very confused about one thing:

Once a user uses p_interp to process WRF-ARW output, where do they go from
there?
The p_interp output is still NetCDF (variables are on pressure levels), yet
most of the modules want GRIB. Yet, in the second chapter, the manual
states that p_interp NetCDF data can be used. How?
How do I feed p_interp-processed data into the other MET modules,
particularly if I want to do the above and compare station data to WRF?
Perhaps I missed it?


I used ascii2nc to process the station data, so they are ready. I have the
p_interp-processed WRF-ARW data. How do I put the WRF data into the
Point-Stat module?

If it still needs to be in GRIB, what do I do with the p_interp
results? I could have skipped the p_interp step and just run the WRF-ARW
output through UPP, right? I am confused.

If a MET user or developer could help me understand the next step, that
would be greatly appreciated.

In addition, has anyone successfully installed this software on an Apple
machine?

Sincerely, 
Erik Noble





----------------------------------------------------------------
  Complete Ticket History
----------------------------------------------------------------

Subject: Re: [rt.rap.ucar.edu #62175] MET 4 (or 4.1) question(s)
From: John Halley Gotway
Time: Thu Jul 11 09:54:11 2013

Erik,

We appreciate the feedback!  It's been a long road getting MET out the
door and supporting it, but it's nice to hear that it's useful to the
modelling community.

I see that you're asking about the support in MET for the NetCDF
output of the pinterp utility.  The output of pinterp is a gridded
NetCDF file that the MET tools do support, but with some
limitations.  WRF is run on a staggered grid with the wind fields
staggered, while the mass points are on a regular grid.  The pinterp
output is indeed on pressure levels, but the wind points are
still staggered, and MET is unable to read them.  So basically, using
pinterp is not a good choice if you're interested in winds.  But as
long as you're not using winds, the gridded pinterp output can
be used in any place in MET that GRIB1 or GRIB2 is used.

The other big drawback to pinterp is that it's NetCDF, and therefore,
it's not easy to regrid.  When doing grid-to-grid comparisons, you
need to put the forecast and observation fields on a common
grid.  That's easy to do for GRIB using the copygb utility, but not
easy in general for NetCDF.

So the other WRF post-processing alternative is the Unified
PostProcessor (UPP).  Its output is in GRIB1 format, which MET fully
supports.  If possible, I'd suggest using UPP instead of pinterp to
avoid the staggered grid and regridding limitations of NetCDF.
Support for UPP is provided via wrfhelp at ucar.edu.

But to get to your question...

If you're dealing with point observations, the only MET tool to be
used is the Point-Stat tool.  I suggest running Point-Stat to compare
the output of pinterp (or UPP, if you switch) to the point
observation output of ASCII2NC.  With only 40 points, I'd suggest
writing out the matched pair (MPR) line type from Point-Stat.  Point-Stat
is run once per valid time; however, you can then use the
STAT-Analysis tool to aggregate results through time.  Suppose you've
run Point-Stat over a month of output for those 40 stations.  You
could run STAT-Analysis to aggregate together that month of
Point-Stat output and compute monthly statistics for each of the 40
stations individually.
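That loop-plus-aggregate workflow can be sketched as follows. All file and directory names here are hypothetical placeholders, and the point_stat and stat_analysis arguments follow their usual command-line usage but should be checked against the MET User's Guide for your version. The function only prints the commands; remove the echo to actually run the tools.

```shell
# Hypothetical sketch: one Point-Stat run per valid time, then one
# STAT-Analysis call to aggregate the matched pairs (MPR) across all
# runs into continuous statistics (CNT).
emit_commands() {
  for vtime in "$@"; do
    # forecast file + its matching obs file + config, per valid time
    echo "point_stat wrf_precip_${vtime}.nc obs_40stn.nc PointStatConfig -outdir out/${vtime}"
  done
  # aggregate all the MPR output through time
  echo "stat_analysis -lookin out -job aggregate_stat -line_type MPR -out_line_type CNT"
}

emit_commands 2006090200 2006090300 2006090400
```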

Hopefully that helps.  If you get stuck on setting up the Point-Stat
configuration file, just let me know and I'll be happy to help you get
it going.

Thanks,
John Halley Gotway
met_help at ucar.edu

On 07/10/2013 09:45 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>
> Wed Jul 10 09:45:10 2013: Request 62175 was acted upon.
> Transaction: Ticket created by erik.noble at nasa.gov
>         Queue: met_help
>       Subject: MET 4 (or 4.1) question(s)
>         Owner: Nobody
>    Requestors: erik.noble at nasa.gov
>        Status: new
>   Ticket <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>

------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #62175] MET 4 (or 4.1) question(s)
From: Noble, Erik U.[COLUMBIA UNIVERSITY]
Time: Fri Jul 12 09:23:48 2013

Hi, thank you. Your reply clarifies a lot.

2 questions:
You wrote, "Point-Stat is run once per valid time ..."
I have 40 ASCII files that hold 12-day daily precipitation accumulations
at each station. (1) Do the ASCII files need to be once per valid time,
or can I still use the 40 12-day files?

Before I discovered MET, I went through the trouble of writing my own
scripts that do the same thing as p_interp and destagger the winds. Also,
I found an easy way to regrid NetCDF files, both observed and WRF-ARW
files, to the same grid using CDO. Details of the command line are here,
where the 2nd-to-last entry provides the best explanation
(https://code.zmaw.de/boards/1/topics/1051#message-1056).
(2) What does the NetCDF file need to have in order to still be used in
MET (Point-Stat, etc.)? Do the lats and lons still need to be 2-D
(lon=XLAT,XLONG; lat=XLAT,XLONG), or can they be 1-D (lat = lat; lon =
lon)?

Thank you.
Erik




------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #62175] MET 4 (or 4.1) question(s)
From: John Halley Gotway
Time: Fri Jul 12 11:58:59 2013

Erik,

The timing of the data within the ASCII files doesn't matter.  So
using 40 12-day files is fine.

But each time you run Point-Stat, you'll pass it a forecast file and
an observation file containing the observations that should be used to
verify that forecast.  So you'll need to know which obs
files go with which forecast files.  Point-Stat reads the timing
information from the forecast file and then defines a time window
around that valid time (see "obs_window" setting in the Point-Stat
config file and see data/config/README for details).  Any point
observations falling within that time window will be used.  Any point
observations falling outside that time window will be skipped over.
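For reference, a minimal sketch of that setting as it appears in a MET 4.x Point-Stat config file (the +/- 90-minute window is just an example value; see data/config/README for the authoritative description):

```
//
// Match point observations falling within +/- 90 minutes (specified
// in seconds) of the forecast valid time.
//
obs_window = {
   beg = -5400;
   end =  5400;
};
```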

If you were using GRIB forecast files, you could actually set this up
in such a way as to verify all of the output times in a single call to
Point-Stat.  You'd literally 'cat' together all of the GRIB
files and then set up a more complex configuration file to tell Point-
Stat what to do.  I suppose you could do the same using NetCDF if you
defined a bunch of different 2D variables, each for precip
with a different time (e.g. APCP_06_2012020500, APCP_06_2012020506,
...).  Or you can keep all the NetCDF files separate and just loop
through them in a script.  That's what we typically do.

In order to create a NetCDF file that MET can handle, you basically
need to structure it like the NetCDF output of the pcp_combine tool:
(1) 2 dimensions named lat and lon.
(2) The lat and lon variables are *NOT* required.  MET doesn't
actually use them.  We put them in there since other plotting tools
(like IDV) use them.
(3) The data variables should be 2 dimensional, defined (lat, lon).
(4) Need data variable attributes specifying timing information.
(5) MET expects bad data to be encoded as -9999.
(6) Need global attributes specifying grid definition information.
(7) Need "MET_version" global attribute.
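As a sketch, a file with that structure would ncdump roughly like this. The variable name, times, and grid values are hypothetical placeholders, and the attribute names are modeled on pcp_combine output from MET 4.x; compare against a file produced by your own pcp_combine run before relying on them.

```
netcdf sample_met {
dimensions:
	lat = 220 ;
	lon = 280 ;
variables:
	float ACC_RAIN_24(lat, lon) ;
		ACC_RAIN_24:name = "ACC_RAIN_24" ;
		ACC_RAIN_24:long_name = "Daily accumulated precipitation" ;
		ACC_RAIN_24:level = "A24" ;
		ACC_RAIN_24:units = "mm" ;
		ACC_RAIN_24:init_time = "20060902_000000" ;
		ACC_RAIN_24:init_time_ut = 1157155200 ;
		ACC_RAIN_24:valid_time = "20060903_000000" ;
		ACC_RAIN_24:valid_time_ut = 1157241600 ;
		ACC_RAIN_24:accum_time = "240000" ;
		ACC_RAIN_24:accum_time_sec = 86400 ;
		ACC_RAIN_24:_FillValue = -9999.f ;

// global attributes:
		:MET_version = "V4.1" ;
		:Projection = "LatLon" ;
		:lat_ll = "-19.875 degrees_north" ;
		:lon_ll = "-34.875 degrees_east" ;
		:delta_lat = "0.25 degrees" ;
		:delta_lon = "0.25 degrees" ;
		:Nlat = "220 grid_points" ;
		:Nlon = "280 grid_points" ;
}
```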

Ultimately, we'd like to support CF-compliant NetCDF files, but we
haven't gotten there yet.  For now, we're stuck with this rather
restrictive NetCDF format that we use internally.

If you need help setting up the global attributes, just send me a
description of the grid you're using.  Give it a shot and if you run
into problems, just send me a sample NetCDF file and I can let
you know what changes are needed.

Thanks,
John


------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #62175] MET 4 (or 4.1) question(s)
From: Noble, Erik U.[COLUMBIA UNIVERSITY]
Time: Fri Jul 12 13:15:23 2013

Wow, thank you.
I think I understand now. Your description of the point-ascii files is
clear and I think I can proceed.

In your description of the netcdf files you wrote:
(1) 2 dimensions named lat and lon.
(2) The lat and lon variables are *NOT* required. . .
(3) The data variables should be 2 dimensional, defined (lat, lon).
(4) Need data variable attributes specifying timing information.
(5) MET expects bad data to be encoded as -9999.
(6) Need global attributes specifying grid definition information.
(7) Need "MET_version" global attribute.


I have all of that except 4, 6 and 7.

A description of the NetCDF files I have is below. They are on a
0.25°x0.25° grid.
The variable in the WRF file is:  ACC_RAIN(time, lat, lon)
The variable in the TRMM file is: ACC_RAIN(time, lat, lon)

	They are both on the same 0.25°x0.25° grid; time = 12, and lat and
lon are 1-D.

First, where would the global attributes specifying grid definition
information go?
Is the timing info OK?
By MET version, do I just need to add "Metv4.1"?

Thank you.
-Erik


*****************
Grid netcdf file description (0.25x0.25_grid_template.nc)

*****************
#
# gridID 0
#
gridtype  = lonlat
gridsize  = 61600
xname     = lon
xlongname = longitude
xunits    = degrees_east
yname     = lat
ylongname = latitude
yunits    = degrees_north
xsize     = 280
ysize     = 220
xfirst    = -34.875
xinc      = 0.25
yfirst    = -19.875
yinc      = 0.25


****************
 netcdf file: Grid template
****************
$ ncdump -h 0.25x0.25_grid_template.nc
netcdf \0.25x0.25_grid_template {
dimensions:
	lon = 280 ;
	lat = 220 ;
variables:
	double lon(lon) ;
		lon:standard_name = "longitude" ;
		lon:long_name = "longitude" ;
		lon:units = "degrees_east" ;
		lon:axis = "X" ;
	double lat(lat) ;
		lat:standard_name = "latitude" ;
		lat:long_name = "latitude" ;
		lat:units = "degrees_north" ;
		lat:axis = "Y" ;
	float random(lat, lon) ;

// global attributes:
		:CDI = "Climate Data Interface version 1.5.5
(http://code.zmaw.de/projects/cdi)" ;
		:Conventions = "CF-1.4" ;
		:history = "Tue May 22 15:38:08 2012: cdo -f nc
-sellonlatbox,-35,35,-20,35 -random,global_0.25
0.25x0.25_grid_template.nc" ;
		:CDO = "Climate Data Operators version 1.5.5
(http://code.zmaw.de/projects/cdo)" ;
}
****************
TRMM netcdf file (12 days, daily accumulated precip)
****************
gs611-noble:precip_work eunoble$ ncdump -h
0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily.nc
netcdf \0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily {
dimensions:
	lon = 280 ;
	lat = 220 ;
	time = UNLIMITED ; // (12 currently)
variables:
	double lon(lon) ;
		lon:standard_name = "longitude" ;
		lon:long_name = "longitude" ;
		lon:units = "degrees_east" ;
		lon:axis = "X" ;
	double lat(lat) ;
		lat:standard_name = "latitude" ;
		lat:long_name = "latitude" ;
		lat:units = "degrees_north" ;
		lat:axis = "Y" ;
	double time(time) ;
		time:standard_name = "time" ;
		time:units = "hours since 1997-01-01 00:00:00" ;
		time:calendar = "standard" ;
	float ACC_RAIN(time, lat, lon) ;
		ACC_RAIN:long_name = "TRMM_3B42 3-hourly accumulation" ;
		ACC_RAIN:units = "mm" ;
		ACC_RAIN:_FillValue = -9999.f ;

// global attributes:
		:CDI = "Climate Data Interface version 1.5.5
(http://code.zmaw.de/projects/cdi)" ;
		:Conventions = "CF-1.4" ;
		:history = "Thu Oct 25 20:37:02 2012: cdo remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0.25_grid_template.nc -sellonlatbox,-35,35,-20,35 -seltimestep,2/13/1 TRMM_accum_precip_daily.nc TRMM_precip_daily_accum-0902-0914_0.25_grid.nc\n",
			"Wed Oct 24 17:01:05 2012: cdo daysum -chname,PRC_ACCUM,ACC_RAIN,
-selname,PRC_ACCUM mergefile_accum.nc TRMM_accum_precip_daily.nc\n",
			"Wed Oct 24 16:54:32 2012: ncks -v PRC_ACCUM mergefile.nc
mergefile_accum.nc\n",
			"Wed Oct 24 16:52:44 2012: ncrcat 3B42.060901.0.6.HDF.nc <snip>
3B42.060930.9.6.HDF.nc mergefile.nc" ;
		:creation_date = "Fri Jun  1 18:14:08 EDT 2012" ;
		:info = "\n",
			"The 3B-42 estimates are scaled to match the monthly rain gauge analyses\n",
			"used in 3B-43. The output is rainfall for 0.25x0.25 degree grid boxes\n",
			"every 3 hours.\n",
			"" ;
		:description = "\n",
			"http://disc.sci.gsfc.nasa.gov/precipitation/documentation/TRMM_README/TRMM_3B42_readme.shtml\n",
			"" ;
		:ftp = "\n",
			"http://disc.sci.gsfc.nasa.gov/data/datapool/TRMM_DP/01_Data_Products/02_Gridded/06_3-hour_Gpi_Cal_3B_42\n",
			"" ;
		:title = "TRMM_3B42" ;
		:nco_openmp_thread_number = 1 ;
		:NCO = "4.1.0" ;
		:CDO = "Climate Data Operators version 1.5.5
(http://code.zmaw.de/projects/cdo)" ;



************
WRF FILE (processed,12 days, daily accumulated precip)
************
$ ncdump -h 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc
netcdf \0.25x0.25_grid_Experiment_01_ACC_RAIN_daily {
dimensions:
	lon = 280 ;
	lat = 220 ;
	time = UNLIMITED ; // (12 currently)
variables:
	double lon(lon) ;
		lon:standard_name = "longitude" ;
		lon:long_name = "longitude" ;
		lon:units = "degrees_east" ;
		lon:axis = "X" ;
	double lat(lat) ;
		lat:standard_name = "latitude" ;
		lat:long_name = "latitude" ;
		lat:units = "degrees_north" ;
		lat:axis = "Y" ;
	double time(time) ;
		time:standard_name = "time" ;
		time:units = "hours since 2006-09-02 00:00:00" ;
		time:calendar = "standard" ;
	float ACC_RAIN(time, lat, lon) ;
		ACC_RAIN:standard_name = "convective_precipitation_amount" ;
		ACC_RAIN:long_name = "Accumulated Total Cumulus Precipitation" ;
		ACC_RAIN:units = "mm" ;
		ACC_RAIN:_FillValue = -9.e+33f ;

// global attributes:
		:CDI = "Climate Data Interface version 1.5.5
(http://code.zmaw.de/projects/cdi)" ;
		:Conventions = "CF-1.1" ;
		:history = "Wed May 23 14:30:35 2012: cdo chname,RAINC,ACC_RAIN -sub
-seltimestep,2/13/1 temp_RAIN.nc -seltimestep,1/12/1 temp_RAIN.nc
0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc\n",
			"Wed May 23 14:30:34 2012: cdo add -selname,RAINC temp1.nc
-selname,RAINNC temp1.nc temp_RAIN.nc\n",
			"Wed May 23 14:30:34 2012: cdo seltimestep,1/97/8
-selname,RAINC,RAINNC
0.25x0.25_grid_Experiment_01.nc temp1.nc\n",
			"Wed May 23 14:30:03 2012: cdo -P 4 remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0.25_grid_template.nc -sellonlatbox,-35,35,-20,35 Experiment_01.nc 0.25x0.25_grid_Experiment_01.nc\n",
			"Wed May 23 14:27:04 2012: ncks -v u_gr_p,v_gr_p,T_p,rh_p,q_p,precip_c,precip_g,LHFlx post_wrfout_d01_2006-09-02_00:00:00.nc temp_post_wrfout_d01_2006-09-02_00:00:00.nc" ;
		:institution = "NASA GODDARD INSTITUTE FOR SPACE STUDIES - CIRES" ;
		:title = "post_wrfout_d01_2006-09-02_00:00:00.nc" ;
		:notes = "Created with NCL script:  wrfout_to_cf_Erik.ncl v1.0" ;
		:created_by = "Erik Noble - erik.noble at nasa.gov" ;
		:creation_date = "Thu Dec  8 02:28:45 EST 2011" ;
		:NCO = "4.1.0" ;
		:CDO = "Climate Data Operators version 1.5.5
(http://code.zmaw.de/projects/cdo)" ;
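For the missing items (4, 6, 7), one possible NCO-based sketch follows, since NCO is already in this toolchain. Everything here is hypothetical: the output file name is a placeholder, and the attribute names and value formats are assumed from MET 4.x pcp_combine output rather than confirmed in this thread, so they should be checked against a real pcp_combine file. Note item (3) also applies, so the 12-step file is first cut down to one 2-D slice. The script only prints the NCO commands; pipe its output to sh to run them.

```shell
# Hypothetical sketch: make one 2-D slice of the regridded WRF file,
# then stamp on the attributes MET expects.  Prints commands only.
in=0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc
out=wrf_ACC_RAIN_2006090300.nc

echo "ncks -O -d time,1 $in $out"   # pull the 2nd daily total
echo "ncwa -O -a time $out $out"    # drop the degenerate time dimension

# -a name,var,mode,type,value: c = create, c = character type
cmd="ncatted -O \
  -a MET_version,global,c,c,'V4.1' \
  -a Projection,global,c,c,'LatLon' \
  -a lat_ll,global,c,c,'-19.875 degrees_north' \
  -a lon_ll,global,c,c,'-34.875 degrees_east' \
  -a delta_lat,global,c,c,'0.25 degrees' \
  -a delta_lon,global,c,c,'0.25 degrees' \
  -a Nlat,global,c,c,'220 grid_points' \
  -a Nlon,global,c,c,'280 grid_points' \
  -a init_time,ACC_RAIN,c,c,'20060902_000000' \
  -a valid_time,ACC_RAIN,c,c,'20060903_000000' \
  -a accum_time,ACC_RAIN,c,c,'240000' \
  $out"
echo "$cmd"
```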



On 7/12/13 1:58 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
wrote:

>Eric,
>
>The timing of the data within the ASCII files doesn't matter.  So
using
>40 12-day files is fine.
>
>But each time you run Point-Stat, you'll pass it a forecast file and
an
>observation file containing the observations that should be used to
>verify that forecast.  So you'll need to know which obs
>files go with which forecast files.  Point-Stat reads the timing
>information from the forecast file and then defines a time window
around
>that valid time (see "obs_window" setting in the Point-Stat
>config file and see data/config/README for details).  Any point
>observations falling within that time window will be used.  Any point
>observations falling outside that time window will be skipped over.
>
>If you were using GRIB forecast files, you could actually set this up
in
>such a way as to verify all of the output times in a single call to
>Point-Stat.  You'd literally 'cat' together all of the GRIB
>files and then set up a more complex configuration file to tell
>Point-Stat what to do.  I suppose you could do the same using NetCDF
if
>you defined a bunch of different 2D variables, each for precip
>with a different time (e.g. APCP_06_2012020500, APCP_06_2012020506,
...).
> Or you can keep all the NetCDF files separate and just loop through
them
>in a script.  That's what we typically do.
>
>In order to create a NetCDF file that MET can handle you basically
need
>to structure it like the NetCDF output of the pcp_combine tool:
>(1) 2 dimensions named lat and lon.
>(2) The lat and lon variables are *NOT* required.  MET doesn't
actually
>use them.  We put them in there since other plotting tools (like IDV)
use
>them.
>(3) The data variables should be 2 dimensional, defined (lat, lon).
>(4) Need data variable attributes specifying timing information.
>(5) MET expects bad data to be encoded as -9999.
>(6) Need global attributes specifying grid definition information.
>(7) Need "MET_version" global attribute.
>
>Ultimately, we'd like to support CF-compliant NetCDF files, but we
>haven't gotten there yet.  For now, we're stuck with this rather
>restrictive NetCDF format that we use internally.
>
>If you need help setting up the global attributes, just send me a
>description of the grid you're using.  Give it a shot and if you run
into
>problems, just send me a sample NetCDF file and I can let
>you know what changes are needed.
>
>Thanks,
>John
>
>On 07/12/2013 09:23 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>>
>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>
>> Hi, thank you. Your reply clarifies a lot.
>>
>> 2 questions:
>> You wrote, "Point-Stat is run once per valid time,Š"
>> I have 40 ascii files that hold 12-day daily precipitation
accumulations
>> at each station. (1) Do the ascii files need to be once per valid
time
>>or
>> can I still use the 40 12-day files?
>>
>> Before I discovered MET, I went through the trouble of writing my
own
>> scripts that do the same thing as p_interp and destaggers the
winds.
>>Also,
>> I found an easy way to regrid netcdf files, both observed and wrf-
arw
>> files to the same grid, using CDO. Details of the command line are
here,
>> where the 2nd-to-last-entry provides the best explanation
>> (https://code.zmaw.de/boards/1/topics/1051#message-1056)
>> What does the netcdf file need to have in order to still be used in
MET
>> (point-stat, etc.) Do the lats and lons still need to be 2-D
>> (lon=XLAT,XLONG;lat=XLAT,XLONG) or can they be 1D (lat = lat;lon =
lon)
>>
>> Thank you.
>> Erik
>>
>> On 7/11/13 11:54 AM, "John Halley Gotway via RT"
<met_help at ucar.edu>
>>wrote:
>>
>>> Erik,
>>>
>>> We appreciate the feedback!  It's been a long road getting MET out
the
>>> door and supporting it, but it's nice to hear that it's useful to
the
>>> modelling community.
>>>
>>> I see that you're asking about the support in MET for the NetCDF
output
>>> of the pinterp utility.  The output of pinterp is a gridded NetCDF
file
>>> that the MET tools do support, but with some
>>> limitations.  WRF is run on a staggered grid with the wind fields
>>> staggered, while the mass points are on a regular grid.  The
pinterp
>>> output is indeed on pressure levels, but the wind points are
>>> still staggered, and MET is unable to read them.  So basically,
using
>>> pinterp is not a good choice if you're interested in winds.  But
as
>>>long
>>> as you're not using winds, the gridded pinterp output can
>>> be used is any place in MET that GRIB1 or GRIB2 is used.
>>>
>>> The other big drawback to pinterp is that it's NetCDF, and
therefore,
>>> it's not easy to regrid.  When doing grid-to-grid comparisons, you
need
>>> to put the forecast and observation fields on a common
>>> grid.  That's easy to do for GRIB using the copygb utility, but
not
>>>easy
>>> in general for NetCDF.
>>>
>>> So the other WRF post-processing alternative is the Unified
>>>PostProcessor
>>> (UPP).  Its output is in GRIB1 format, which MET fully supports.
If
>>> possible, I'd suggest using UPP instead of pinterp to
>>> avoid the staggered grid and regridding limitations of NetCDF.
Support
>>> for UPP is provided via wrfhelp at ucar.edu.
>>>
>>> But to get to your question...
>>>
>>> If you're dealing with point observations, the only MET tool to be
used
>>> is the Point-Stat tool.  I suggest running Point-Stat to compare
the
>>> output of pinterp (or UPP, if you switch) to the point
>>> observation output of ASCII2NC.  With only 40 points, I'd suggest
>>>writing
>>> out the matched pair (MPR) line type from Point-Stat.  Point-Stat
is
>>>run
>>> once per valid time, however, you can then use the
>>> STAT-Analysis tool to aggregate results through time.  Suppose
you've
>>>run
>>> Point-Stat over a month of output for those 40 stations.  You
could run
>>> STAT-Analysis to aggregate together that month of
>>> Point-Stat output and compute monthly statistics for each of the
40
>>> stations individually.
>>>
>>> Hopefully that helps.  If you get stuck on setting up the Point-
Stat
>>> configuration file, just let me know and I'll be happy to help you
get
>>>it
>>> going.
>>>
>>> Thanks,
>>> John Halley Gotway
>>> met_help at ucar.edu
>>>
>>> On 07/10/2013 09:45 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
>>>wrote:
>>>>
>>>> Wed Jul 10 09:45:10 2013: Request 62175 was acted upon.
>>>> Transaction: Ticket created by erik.noble at nasa.gov
>>>>          Queue: met_help
>>>>        Subject: MET 4 (or 4.1) question(s)
>>>>          Owner: Nobody
>>>>     Requestors: erik.noble at nasa.gov
>>>>         Status: new
>>>>    Ticket <URL:
>>>>https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>
>>>>
>>>> Hi.
>>>> First, to the developer, thank you for providing MET to the
world. It
>>>> looks outstanding and very useful. If you don't get enough
thanks, I
>>>> hope
>>>> just one more "thank you" makes your day.
>>>>
>>>> I installed MET 4.1 on a Linux.
>>>> I am ready to start using MET to compare 40 station observations
of
>>>> daily
>>>> accumulated precip to 64 WRF-ARW simulations for the same area.
>>>> I have read the MET 4.1 manual for the past week and gone through
>>>> all the examples.
>>>>
>>>> But I am very confused about one thing:
>>>>
>>>> Once a user uses p_interp to process WRF-ARW output, where do
they go
>>>> from
>>>> there?
>>>> The p_interp output is still netcdf (variables are on pressure
level)
>>>> yet
>>>> most of the modules want GRIB. Yet, in the second chapter, the
>>>> manual states that p_interp netcdf data can be used. How?
>>>> How do I feed p_interp processed data into the other MET modules,
>>>> particularly if I want to do the above, compare station data to WRF.
>>>> Perhaps I missed it?
>>>>
>>>>
>>>> I used ascii2nc to process the station data, so they are ready. I
have
>>>> the
>>>> wrf-arw p_interp processed data. How do I put the wrf-data into
the
>>>> point-stat module?
>>>>
>>>> If it needs to still be in GRIB, what do I do with the p_interp
>>>> results? I could have skipped the p_interp step and just run the
>>>> wrf-arw output through UPP, right? I am confused.
>>>>
>>>> If a MET user or developer could help me understand the next step,
>>>> that would be greatly appreciated.
>>>>
>>>> In addition, has anyone successfully installed this software on
an
>>>>Apple
>>>> machine?
>>>>
>>>> Sincerely,
>>>> Erik Noble
>>>>
>>>>
>>>>
>>>
>>
>>
>



------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #62175] MET 4 (or 4.1) question(s)
From: John Halley Gotway
Time: Fri Jul 12 16:14:14 2013

Erik,

I see that the variables are dimensioned like this: ACC_RAIN(time,
lat, lon).  The problem is that the MET tools won't read the timing
information from the "time" dimension as you'd probably expect.
Instead, it'll try to read the timing information from the variable
attributes.  For example, the output of pcp_combine looks like this:
         float APCP_12(lat, lon) ;
                 APCP_12:name = "APCP_12" ;
                 APCP_12:long_name = "Total precipitation" ;
                 APCP_12:level = "A12" ;
                 APCP_12:units = "kg/m^2" ;
                 APCP_12:_FillValue = -9999.f ;
                 APCP_12:init_time = "20050807_000000" ;
                 APCP_12:init_time_ut = 1123372800 ;
                 APCP_12:valid_time = "20050807_120000" ;
                 APCP_12:valid_time_ut = 1123416000 ;
                 APCP_12:accum_time = "120000" ;
                 APCP_12:accum_time_sec = 43200 ;

It reads the model initialization time (init_time_ut), valid time
(valid_time_ut), and accumulation interval (accum_time_sec) from those
variable attributes.  Your NetCDF files follow the CF-1.4
convention, but unfortunately MET doesn't have the ability to handle
that yet.

If you want to have multiple times in the file, they'd need to be
stored in separate variables, with the corresponding timing
information defined in the variable attribute section.

So unfortunately, it's messier than we'd like.  I do see that you're
using TRMM data, and we do have an Rscript on our website that
reformats ASCII TRMM data into the flavor of NetCDF that MET is
expecting.  You can find information about that here (trmm2nc.R):
    http://www.dtcenter.org/met/users/downloads/observation_data.php

Let me know how you'd like to proceed, and what I can do to help.

Thanks,
John
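[Editor's note] As a quick illustration of how the attribute values above fit together (this is added example code, not part of MET): the *_ut attributes are plain Unix times in UTC, and accum_time_sec is their difference when the accumulation starts at the initialization time. A minimal Python sketch, assuming MET's YYYYMMDD_HHMMSS timestamp format:

```python
import calendar
import time

FMT = "%Y%m%d_%H%M%S"  # MET's timestamp format, e.g. "20050807_000000"

def met_time_attrs(init, valid):
    """Sketch: build pcp_combine-style timing attributes (UTC) from
    YYYYMMDD_HHMMSS strings.  Assumes the accumulation runs from the
    initialization time to the valid time."""
    init_ut = calendar.timegm(time.strptime(init, FMT))
    valid_ut = calendar.timegm(time.strptime(valid, FMT))
    accum = valid_ut - init_ut
    # HHMMSS string; the hours field may exceed 24 for long accumulations
    accum_str = "%02d%02d%02d" % (accum // 3600, (accum % 3600) // 60, accum % 60)
    return {"init_time": init, "init_time_ut": init_ut,
            "valid_time": valid, "valid_time_ut": valid_ut,
            "accum_time": accum_str, "accum_time_sec": accum}

# Reproduces the APCP_12 attribute values shown above:
print(met_time_attrs("20050807_000000", "20050807_120000"))
```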


On 07/12/2013 01:15 PM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>
> Wow, thank you.
> I think I understand now. Your description of the point-ascii files
is
> clear and I think I can proceed.
>
> In your description of the netcdf files, you wrote:
> (1) 2 dimensions named lat and lon.
> (2) The lat and lon variables are *NOT* required. . .
> (3) The data variables should be 2 dimensional, defined (lat, lon).
> (4) Need data variable attributes specifying timing information.
> (5) MET expects bad data to be encoded as -9999.
> (6) Need global attributes specifying grid definition information.
> (7) Need "MET_version" global attribute.
>
>
> I have all of that except 4, 6 and 7.
>
> A description of the netcdf files I have is below. They are on a
> 0.25°x0.25° grid.
> The variable in the WRF file is:  ACC_RAIN(time, lat, lon)
> The variable in the TRMM file is: ACC_RAIN(time, lat, lon)
>
> 	where they are both on the same 0.25°x0.25° grid. time = 12,
and lat
> and lon are 1D.
>
> First, where would global attributes specifying grid definition
> information go?
> Is the timing info ok?
> By Met version, I just need to add "Metv4.1" ?
>
> Thank you.
> -Erik
>
>
> *****************
> Grid netcdf file description (0.25x0.25_grid_template.nc)
>
> *****************
> 1 #
>
>
>    2 # gridID 0
>    3 #
>    4 gridtype  = lonlat
>    5 gridsize  = 61600
>    6 xname     = lon
>    7 xlongname = longitude
>    8 xunits    = degrees_east
>    9 yname     = lat
>   10 ylongname = latitude
>   11 yunits    = degrees_north
>   12 xsize     = 280
>   13 ysize     = 220
>   14 xfirst    = -34.875
>   15 xinc      = 0.25
>   16 yfirst    = -19.875
>   17 yinc      = 0.25
>
>
> ****************
>   netcdf file: Grid template
> ****************
> $ ncdump -h 0.25x0.25_grid_template.nc
> netcdf \0.25x0.25_grid_template {
> dimensions:
> 	lon = 280 ;
> 	lat = 220 ;
> variables:
> 	double lon(lon) ;
> 		lon:standard_name = "longitude" ;
> 		lon:long_name = "longitude" ;
> 		lon:units = "degrees_east" ;
> 		lon:axis = "X" ;
> 	double lat(lat) ;
> 		lat:standard_name = "latitude" ;
> 		lat:long_name = "latitude" ;
> 		lat:units = "degrees_north" ;
> 		lat:axis = "Y" ;
> 	float random(lat, lon) ;
>
> // global attributes:
> 		:CDI = "Climate Data Interface version 1.5.5
> (http://code.zmaw.de/projects/cdi)" ;
> 		:Conventions = "CF-1.4" ;
> 		:history = "Tue May 22 15:38:08 2012: cdo -f nc
> -sellonlatbox,-35,35,-20,35 -random,global_0.25
> 0.25x0.25_grid_template.nc" ;
> 		:CDO = "Climate Data Operators version 1.5.5
> (http://code.zmaw.de/projects/cdo)" ;
> }
> ****************
> TRMM netcdf file (12 days, daily accumulated precip)
> ****************
> gs611-noble:precip_work eunoble$ ncdump -h
> 0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily.nc
> netcdf \0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily {
> dimensions:
> 	lon = 280 ;
> 	lat = 220 ;
> 	time = UNLIMITED ; // (12 currently)
> variables:
> 	double lon(lon) ;
> 		lon:standard_name = "longitude" ;
> 		lon:long_name = "longitude" ;
> 		lon:units = "degrees_east" ;
> 		lon:axis = "X" ;
> 	double lat(lat) ;
> 		lat:standard_name = "latitude" ;
> 		lat:long_name = "latitude" ;
> 		lat:units = "degrees_north" ;
> 		lat:axis = "Y" ;
> 	double time(time) ;
> 		time:standard_name = "time" ;
> 		time:units = "hours since 1997-01-01 00:00:00" ;
> 		time:calendar = "standard" ;
> 	float ACC_RAIN(time, lat, lon) ;
> 		ACC_RAIN:long_name = "TRMM_3B42 3-hourly accumulation" ;
> 		ACC_RAIN:units = "mm" ;
> 		ACC_RAIN:_FillValue = -9999.f ;
>
> // global attributes:
> 		:CDI = "Climate Data Interface version 1.5.5
> (http://code.zmaw.de/projects/cdi)" ;
> 		:Conventions = "CF-1.4" ;
> 		:history = "Thu Oct 25 20:37:02 2012: cdo
>
remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0.2
> 5_grid_template.nc -sellonlatbox,-35,35,-20,35 -seltimestep,2/13/1
> TRMM_accum_precip_daily.nc
> TRMM_precip_daily_accum-0902-0914_0.25_grid.nc\n",
> 			"Wed Oct 24 17:01:05 2012: cdo daysum -chname,PRC_ACCUM,ACC_RAIN,
> -selname,PRC_ACCUM mergefile_accum.nc TRMM_accum_precip_daily.nc\n",
> 			"Wed Oct 24 16:54:32 2012: ncks -v PRC_ACCUM mergefile.nc
> mergefile_accum.nc\n",
> 			"Wed Oct 24 16:52:44 2012: ncrcat 3B42.060901.0.6.HDF.nc <snip>
> 3B42.060930.9.6.HDF.nc mergefile.nc" ;
> 		:creation_date = "Fri Jun  1 18:14:08 EDT 2012" ;
> 		:info = "\n",
> 			"The 3B-42 estimates are scaled to match the monthly rain gauge
> analyses\n",
> 			"used in 3B-43.The output is rainfall for 0.25x0.25 degree grid
boxes
> \n",
> 			"every 3 hours.\n",
> 			"" ;
> 		:description = "\n",
>
"http://disc.sci.gsfc.nasa.gov/precipitation/documentation/TRMM_README/T
> RMM_3B42_readme.shtml\n",
> 			"" ;
> 		:ftp = "\n",
>
"http://disc.sci.gsfc.nasa.gov/data/datapool/TRMM_DP/01_Data_Products/02
> _Gridded/06_3-hour_Gpi_Cal_3B_42\n",
> 			"" ;
> 		:title = "TRMM_3B42" ;
> 		:nco_openmp_thread_number = 1 ;
> 		:NCO = "4.1.0" ;
> 		:CDO = "Climate Data Operators version 1.5.5
> (http://code.zmaw.de/projects/cdo)" ;
>
>
>
> ************
> WRF FILE (processed,12 days, daily accumulated precip)
> ************
> $ ncdump -h 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc
> netcdf \0.25x0.25_grid_Experiment_01_ACC_RAIN_daily {
> dimensions:
> 	lon = 280 ;
> 	lat = 220 ;
> 	time = UNLIMITED ; // (12 currently)
> variables:
> 	double lon(lon) ;
> 		lon:standard_name = "longitude" ;
> 		lon:long_name = "longitude" ;
> 		lon:units = "degrees_east" ;
> 		lon:axis = "X" ;
> 	double lat(lat) ;
> 		lat:standard_name = "latitude" ;
> 		lat:long_name = "latitude" ;
> 		lat:units = "degrees_north" ;
> 		lat:axis = "Y" ;
> 	double time(time) ;
> 		time:standard_name = "time" ;
> 		time:units = "hours since 2006-09-02 00:00:00" ;
> 		time:calendar = "standard" ;
> 	float ACC_RAIN(time, lat, lon) ;
> 		ACC_RAIN:standard_name = "convective_precipitation_amount" ;
> 		ACC_RAIN:long_name = "Accumulated Total Cumulus Precipitation" ;
> 		ACC_RAIN:units = "mm" ;
> 		ACC_RAIN:_FillValue = -9.e+33f ;
>
> // global attributes:
> 		:CDI = "Climate Data Interface version 1.5.5
> (http://code.zmaw.de/projects/cdi)" ;
> 		:Conventions = "CF-1.1" ;
> 		:history = "Wed May 23 14:30:35 2012: cdo chname,RAINC,ACC_RAIN
-sub
> -seltimestep,2/13/1 temp_RAIN.nc -seltimestep,1/12/1 temp_RAIN.nc
> 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc\n",
> 			"Wed May 23 14:30:34 2012: cdo add -selname,RAINC temp1.nc
> -selname,RAINNC temp1.nc temp_RAIN.nc\n",
> 			"Wed May 23 14:30:34 2012: cdo seltimestep,1/97/8
-selname,RAINC,RAINNC
> 0.25x0.25_grid_Experiment_01.nc temp1.nc\n",
> 			"Wed May 23 14:30:03 2012: cdo -P 4
>
remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0.2
> 5_grid_template.nc -sellonlatbox,-35,35,-20,35 Experiment_01.nc
> 0.25x0.25_grid_Experiment_01.nc\n",
> 			"Wed May 23 14:27:04 2012: ncks -v
> u_gr_p,v_gr_p,T_p,rh_p,q_p,precip_c,precip_g,LHFlx
> post_wrfout_d01_2006-09-02_00:00:00.nc
> temp_post_wrfout_d01_2006-09-02_00:00:00.nc" ;
> 		:institution = "NASA GODDARD INSTITUTE FOR SPACE STUDIES - CIRES"
;
> 		:title = "post_wrfout_d01_2006-09-02_00:00:00.nc" ;
> 		:notes = "Created with NCL script:  wrfout_to_cf_Erik.ncl v1.0" ;
> 		:created_by = "Erik Noble - erik.noble at nasa.gov" ;
> 		:creation_date = "Thu Dec  8 02:28:45 EST 2011" ;
> 		:NCO = "4.1.0" ;
> 		:CDO = "Climate Data Operators version 1.5.5
> (http://code.zmaw.de/projects/cdo)" ;
>
>
>
> On 7/12/13 1:58 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
wrote:
>
>> Erik,
>>
>> The timing of the data within the ASCII files doesn't matter.  So
using
>> 40 12-day files is fine.
>>
>> But each time you run Point-Stat, you'll pass it a forecast file
and an
>> observation file containing the observations that should be used to
>> verify that forecast.  So you'll need to know which obs
>> files go with which forecast files.  Point-Stat reads the timing
>> information from the forecast file and then defines a time window
around
>> that valid time (see "obs_window" setting in the Point-Stat
>> config file and see data/config/README for details).  Any point
>> observations falling within that time window will be used.  Any
point
>> observations falling outside that time window will be skipped over.
>>
>> If you were using GRIB forecast files, you could actually set this
up in
>> such a way as to verify all of the output times in a single call to
>> Point-Stat.  You'd literally 'cat' together all of the GRIB
>> files and then set up a more complex configuration file to tell
>> Point-Stat what to do.  I suppose you could do the same using
NetCDF if
>> you defined a bunch of different 2D variables, each for precip
>> with a different time (e.g. APCP_06_2012020500, APCP_06_2012020506,
...).
>> Or you can keep all the NetCDF files separate and just loop through
them
>> in a script.  That's what we typically do.
>>
>> In order to create a NetCDF file that MET can handle you basically
need
>> to structure it like the NetCDF output of the pcp_combine tool:
>> (1) 2 dimensions named lat and lon.
>> (2) The lat and lon variables are *NOT* required.  MET doesn't
actually
>> use them.  We put them in there since other plotting tools (like
IDV) use
>> them.
>> (3) The data variables should be 2 dimensional, defined (lat, lon).
>> (4) Need data variable attributes specifying timing information.
>> (5) MET expects bad data to be encoded as -9999.
>> (6) Need global attributes specifying grid definition information.
>> (7) Need "MET_version" global attribute.
>>
>> Ultimately, we'd like to support CF-compliant NetCDF files, but we
>> haven't gotten there yet.  For now, we're stuck with this rather
>> restrictive NetCDF format that we use internally.
>>
>> If you need help setting up the global attributes, just send me a
>> description of the grid you're using.  Give it a shot and if you
run into
>> problems, just send me a sample NetCDF file and I can let
>> you know what changes are needed.
>>
>> Thanks,
>> John
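[Editor's note] To make requirement (6) concrete for a regular lat/lon grid like the 0.25° one described elsewhere in this thread, here is an added sketch mapping a CDO "lonlat" grid description onto the kind of global attributes pcp_combine writes for a latlon grid. The attribute names and value formats (Projection, lat_ll, delta_lat, Nlat, ...) are an assumption based on pcp_combine output; verify them with ncdump against a file written by your own MET installation:

```python
def met_latlon_attrs(xfirst, xinc, xsize, yfirst, yinc, ysize):
    """Sketch: translate a CDO 'lonlat' grid description into the
    global attributes MET expects for a regular lat/lon grid.
    Attribute names/formats are assumed from pcp_combine output."""
    return {
        "MET_version": "V4.1",                    # requirement (7)
        "Projection": "LatLon",
        "lat_ll": "%g degrees_north" % yfirst,    # lower-left corner
        "lon_ll": "%g degrees_east" % xfirst,
        "delta_lat": "%g degrees" % yinc,
        "delta_lon": "%g degrees" % xinc,
        "Nlat": "%d grid_points" % ysize,
        "Nlon": "%d grid_points" % xsize,
    }

# The 0.25-degree grid from the CDO description in this thread:
attrs = met_latlon_attrs(xfirst=-34.875, xinc=0.25, xsize=280,
                         yfirst=-19.875, yinc=0.25, ysize=220)
for k, v in attrs.items():
    print("%s = %s" % (k, v))
```

These could then be attached to the file with, e.g., NCO's ncatted.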
>>
>> On 07/12/2013 09:23 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>>>
>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>
>>> Hi, thank you. Your reply clarifies a lot.
>>>
>>> 2 questions:
>>> You wrote, "Point-Stat is run once per valid time..."
>>> I have 40 ascii files that hold 12-day daily precipitation
accumulations
>>> at each station. (1) Do the ascii files need to be once per valid
time
>>> or
>>> can I still use the 40 12-day files?
>>>
>>> Before I discovered MET, I went through the trouble of writing my
own
>>> scripts that do the same thing as p_interp and destagger the
winds.
>>> Also,
>>> I found an easy way to regrid netcdf files, both observed and wrf-
arw
>>> files to the same grid, using CDO. Details of the command line are
here,
>>> where the 2nd-to-last-entry provides the best explanation
>>> (https://code.zmaw.de/boards/1/topics/1051#message-1056)
>>> What does the netcdf file need to have in order to still be used
in MET
>>> (point-stat, etc.) Do the lats and lons still need to be 2-D
>>> (lon=XLAT,XLONG;lat=XLAT,XLONG) or can they be 1D (lat = lat;lon =
lon)?
>>>
>>> Thank you.
>>> Erik
>>>
>>> On 7/11/13 11:54 AM, "John Halley Gotway via RT"
<met_help at ucar.edu>
>>> wrote:
>>>
>>>> Erik,
>>>>
>>>> We appreciate the feedback!  It's been a long road getting MET
out the
>>>> door and supporting it, but it's nice to hear that it's useful to
the
>>>> modelling community.
>>>>
>>>> I see that you're asking about the support in MET for the NetCDF
output
>>>> of the pinterp utility.  The output of pinterp is a gridded
NetCDF file
>>>> that the MET tools do support, but with some
>>>> limitations.  WRF is run on a staggered grid with the wind fields
>>>> staggered, while the mass points are on a regular grid.  The
pinterp
>>>> output is indeed on pressure levels, but the wind points are
>>>> still staggered, and MET is unable to read them.  So basically,
using
>>>> pinterp is not a good choice if you're interested in winds.  But
as
>>>> long
>>>> as you're not using winds, the gridded pinterp output can
>>>> be used in any place in MET that GRIB1 or GRIB2 is used.
>>>>
>>>> The other big drawback to pinterp is that it's NetCDF, and
therefore,
>>>> it's not easy to regrid.  When doing grid-to-grid comparisons,
you need
>>>> to put the forecast and observation fields on a common
>>>> grid.  That's easy to do for GRIB using the copygb utility, but
not
>>>> easy
>>>> in general for NetCDF.
>>>>
>>>> So the other WRF post-processing alternative is the Unified
>>>> PostProcessor
>>>> (UPP).  Its output is in GRIB1 format, which MET fully supports.
If
>>>> possible, I'd suggest using UPP instead of pinterp to
>>>> avoid the staggered grid and regridding limitations of NetCDF.
Support
>>>> for UPP is provided via wrfhelp at ucar.edu.
>>>>
>>>> But to get to your question...
>>>>
>>>> If you're dealing with point observations, the only MET tool to
be used
>>>> is the Point-Stat tool.  I suggest running Point-Stat to compare
the
>>>> output of pinterp (or UPP, if you switch) to the point
>>>> observation output of ASCII2NC.  With only 40 points, I'd suggest
>>>> writing
>>>> out the matched pair (MPR) line type from Point-Stat.  Point-Stat
is
>>>> run
>>>> once per valid time, however, you can then use the
>>>> STAT-Analysis tool to aggregate results through time.  Suppose
you've
>>>> run
>>>> Point-Stat over a month of output for those 40 stations.  You
could run
>>>> STAT-Analysis to aggregate together that month of
>>>> Point-Stat output and compute monthly statistics for each of the
40
>>>> stations individually.
>>>>
>>>> Hopefully that helps.  If you get stuck on setting up the Point-
Stat
>>>> configuration file, just let me know and I'll be happy to help
you get
>>>> it
>>>> going.
>>>>
>>>> Thanks,
>>>> John Halley Gotway
>>>> met_help at ucar.edu
>>>>
>>>> On 07/10/2013 09:45 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via
RT
>>>> wrote:
>>>>>
>>>>> Wed Jul 10 09:45:10 2013: Request 62175 was acted upon.
>>>>> Transaction: Ticket created by erik.noble at nasa.gov
>>>>>           Queue: met_help
>>>>>         Subject: MET 4 (or 4.1) question(s)
>>>>>           Owner: Nobody
>>>>>      Requestors: erik.noble at nasa.gov
>>>>>          Status: new
>>>>>     Ticket <URL:
>>>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>>
>>>>>
>>>>> Hi.
>>>>> First, to the developer, thank you for providing MET to the
world. It
>>>>> looks outstanding and very useful. If you don't get enough
thanks, I
>>>>> hope
>>>>> just one more "thank you" makes your day.
>>>>>
>>>>> I installed MET 4.1 on a Linux.
>>>>> I am ready to start using MET to compare 40 station observations
of
>>>>> daily
>>>>> accumulated precip to 64 WRF-ARW simulations for the same area.
>>>>> I have read the MET 4.1 manual for the past week and gone through
>>>>> all the examples.
>>>>>
>>>>> But I am very confused about one thing:
>>>>>
>>>>> Once a user uses p_interp to process WRF-ARW output, where do
they go
>>>>> from
>>>>> there?
>>>>> The p_interp output is still netcdf (variables are on pressure
level)
>>>>> yet
>>>>> most of the modules want GRIB. Yet, in the second chapter, the
>>>>> manual states that p_interp netcdf data can be used. How?
>>>>> How do I feed p_interp processed data into the other MET modules,
>>>>> particularly if I want to do the above, compare station data to WRF.
>>>>> Perhaps I missed it?
>>>>>
>>>>>
>>>>> I used ascii2nc to process the station data, so they are ready.
I have
>>>>> the
>>>>> wrf-arw p_interp processed data. How do I put the wrf-data into
the
>>>>> point-stat module?
>>>>>
>>>>> If it needs to still be in GRIB, what do I do with the p_interp
>>>>> results? I could have skipped the p_interp step and just run the
>>>>> wrf-arw output through UPP, right? I am confused.
>>>>>
>>>>> If a MET user or developer could help me understand the next step,
>>>>> that would be greatly appreciated.
>>>>>
>>>>> In addition, has anyone successfully installed this software on
an
>>>>> Apple
>>>>> machine?
>>>>>
>>>>> Sincerely,
>>>>> Erik Noble
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>
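[Editor's note] For reference, the "obs_window" setting mentioned above is a time window in seconds around the forecast valid time. A sketch of what the entry looks like in a MET 4.1 Point-Stat configuration file; the ±5400-second window is purely illustrative, and data/config/README remains the authoritative reference for the exact syntax:

```
// Point observations with valid times inside
// [forecast valid time + beg, forecast valid time + end]
// are used; observations outside the window are skipped.
obs_window = {
   beg = -5400;   // 90 minutes before the forecast valid time (illustrative)
   end =  5400;   // 90 minutes after
}
```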

------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #62175] MET 4 (or 4.1) question(s)
From: Noble, Erik U.[COLUMBIA UNIVERSITY]
Time: Mon Jul 15 08:11:21 2013

Dear John,
Thank you. I am working with both TRMM and CMORPH, in addition to
station
data. So there is some overlap and new parts to consider.

"If you want to have multiple times in the file, they'd need to be
stored
in separate variables, with the corresponding timing information
defined
in the variable attribute section."


Is there an example file in the sample data that shows this?
-Erik
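[Editor's note] The per-variable layout John describes would look roughly like the following CDL sketch. The variable names, dates, and values are invented for illustration; the timing attributes follow the pcp_combine example earlier in the thread, and the grid attributes are left as a placeholder:

```
netcdf multiple_times_sketch {
dimensions:
        lat = 220 ;
        lon = 280 ;
variables:
        float ACC_RAIN_24_20060903(lat, lon) ;
                ACC_RAIN_24_20060903:long_name = "daily accumulated precip" ;
                ACC_RAIN_24_20060903:level = "A24" ;
                ACC_RAIN_24_20060903:units = "mm" ;
                ACC_RAIN_24_20060903:_FillValue = -9999.f ;
                ACC_RAIN_24_20060903:init_time = "20060902_000000" ;
                ACC_RAIN_24_20060903:init_time_ut = 1157155200 ;
                ACC_RAIN_24_20060903:valid_time = "20060903_000000" ;
                ACC_RAIN_24_20060903:valid_time_ut = 1157241600 ;
                ACC_RAIN_24_20060903:accum_time = "240000" ;
                ACC_RAIN_24_20060903:accum_time_sec = 86400 ;
        float ACC_RAIN_24_20060904(lat, lon) ;
                // same attribute set, with valid_time = "20060904_000000"
                // (valid_time_ut = 1157328000); for a daily total the
                // accumulation interval stays "240000" / 86400 seconds

// global attributes:
                :MET_version = "V4.1" ;
                // plus the grid definition attributes for the 0.25-degree grid
}
```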

On 7/12/13 6:14 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
wrote:

>Erik,
>
>I see that the variables are dimensioned like this: ACC_RAIN(time,
lat,
>lon).  The problem is that the MET tools won't read the timing
>information from the "time" dimension as you'd probably expect.
>Instead, it'll try to read the timing information from the variable
>attributes.  For example, the output of pcp_combine looks like this:
>         float APCP_12(lat, lon) ;
>                 APCP_12:name = "APCP_12" ;
>                 APCP_12:long_name = "Total precipitation" ;
>                 APCP_12:level = "A12" ;
>                 APCP_12:units = "kg/m^2" ;
>                 APCP_12:_FillValue = -9999.f ;
>                 APCP_12:init_time = "20050807_000000" ;
>                 APCP_12:init_time_ut = 1123372800 ;
>                 APCP_12:valid_time = "20050807_120000" ;
>                 APCP_12:valid_time_ut = 1123416000 ;
>                 APCP_12:accum_time = "120000" ;
>                 APCP_12:accum_time_sec = 43200 ;
>
>It reads the model initialization time (init_time_ut), valid time
>(valid_time_ut), and accumulation interval (accum_time_sec) from
those
>variable attributes.  Your NetCDF files follow the CF-1.4
>convention, but unfortunately MET doesn't have the ability to handle
that
>yet.
>
>If you want to have multiple times in the file, they'd need to be
stored
>in separate variables, with the corresponding timing information
defined
>in the variable attribute section.
>
>So unfortunately, it's messier than we'd like.  I do see that you're
>using TRMM data, and we do have an Rscript on our website that
reformats
>ASCII TRMM data into the flavor of NetCDF that MET is
>expecting.  You can find information about that here (trmm2nc.R):
>    http://www.dtcenter.org/met/users/downloads/observation_data.php
>
>Let me know how you'd like to proceed, and what I can do to help.
>
>Thanks,
>John
>
>
>On 07/12/2013 01:15 PM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>>
>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>
>> Wow, thank you.
>> I think I understand now. Your description of the point-ascii files
is
>> clear and I think I can proceed.
>>
>> In your description of the netcdf files, you wrote:
>> (1) 2 dimensions named lat and lon.
>> (2) The lat and lon variables are *NOT* required. . .
>> (3) The data variables should be 2 dimensional, defined (lat, lon).
>> (4) Need data variable attributes specifying timing information.
>> (5) MET expects bad data to be encoded as -9999.
>> (6) Need global attributes specifying grid definition information.
>> (7) Need "MET_version" global attribute.
>>
>>
>> I have all of that except 4, 6 and 7.
>>
>> A description of the netcdf files I have is below. They are on a
>> 0.25°x0.25° grid.
>> The variable in the WRF file is:  ACC_RAIN(time, lat, lon)
>> The variable in the TRMM file is: ACC_RAIN(time, lat, lon)
>>
>> 	where they are both on the same 0.25°x0.25° grid. time = 12,
and
>>lat
>> and lon are 1D.
>>
>> First, where would global attributes specifying grid definition
>> information go?
>> Is the timing info ok?
>> By Met version, I just need to add "Metv4.1" ?
>>
>> Thank you.
>> -Erik
>>
>>
>> *****************
>> Grid netcdf file description (0.25x0.25_grid_template.nc)
>>
>> *****************
>> 1 #
>>
>>
>>    2 # gridID 0
>>    3 #
>>    4 gridtype  = lonlat
>>    5 gridsize  = 61600
>>    6 xname     = lon
>>    7 xlongname = longitude
>>    8 xunits    = degrees_east
>>    9 yname     = lat
>>   10 ylongname = latitude
>>   11 yunits    = degrees_north
>>   12 xsize     = 280
>>   13 ysize     = 220
>>   14 xfirst    = -34.875
>>   15 xinc      = 0.25
>>   16 yfirst    = -19.875
>>   17 yinc      = 0.25
>>
>>
>> ****************
>>   netcdf file: Grid template
>> ****************
>> $ ncdump -h 0.25x0.25_grid_template.nc
>> netcdf \0.25x0.25_grid_template {
>> dimensions:
>> 	lon = 280 ;
>> 	lat = 220 ;
>> variables:
>> 	double lon(lon) ;
>> 		lon:standard_name = "longitude" ;
>> 		lon:long_name = "longitude" ;
>> 		lon:units = "degrees_east" ;
>> 		lon:axis = "X" ;
>> 	double lat(lat) ;
>> 		lat:standard_name = "latitude" ;
>> 		lat:long_name = "latitude" ;
>> 		lat:units = "degrees_north" ;
>> 		lat:axis = "Y" ;
>> 	float random(lat, lon) ;
>>
>> // global attributes:
>> 		:CDI = "Climate Data Interface version 1.5.5
>> (http://code.zmaw.de/projects/cdi)" ;
>> 		:Conventions = "CF-1.4" ;
>> 		:history = "Tue May 22 15:38:08 2012: cdo -f nc
>> -sellonlatbox,-35,35,-20,35 -random,global_0.25
>> 0.25x0.25_grid_template.nc" ;
>> 		:CDO = "Climate Data Operators version 1.5.5
>> (http://code.zmaw.de/projects/cdo)" ;
>> }
>> ****************
>> TRMM netcdf file (12 days, daily accumulated precip)
>> ****************
>> gs611-noble:precip_work eunoble$ ncdump -h
>> 0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily.nc
>> netcdf \0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily {
>> dimensions:
>> 	lon = 280 ;
>> 	lat = 220 ;
>> 	time = UNLIMITED ; // (12 currently)
>> variables:
>> 	double lon(lon) ;
>> 		lon:standard_name = "longitude" ;
>> 		lon:long_name = "longitude" ;
>> 		lon:units = "degrees_east" ;
>> 		lon:axis = "X" ;
>> 	double lat(lat) ;
>> 		lat:standard_name = "latitude" ;
>> 		lat:long_name = "latitude" ;
>> 		lat:units = "degrees_north" ;
>> 		lat:axis = "Y" ;
>> 	double time(time) ;
>> 		time:standard_name = "time" ;
>> 		time:units = "hours since 1997-01-01 00:00:00" ;
>> 		time:calendar = "standard" ;
>> 	float ACC_RAIN(time, lat, lon) ;
>> 		ACC_RAIN:long_name = "TRMM_3B42 3-hourly accumulation" ;
>> 		ACC_RAIN:units = "mm" ;
>> 		ACC_RAIN:_FillValue = -9999.f ;
>>
>> // global attributes:
>> 		:CDI = "Climate Data Interface version 1.5.5
>> (http://code.zmaw.de/projects/cdi)" ;
>> 		:Conventions = "CF-1.4" ;
>> 		:history = "Thu Oct 25 20:37:02 2012: cdo
>>
>>remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0
>>.2
>> 5_grid_template.nc -sellonlatbox,-35,35,-20,35 -seltimestep,2/13/1
>> TRMM_accum_precip_daily.nc
>> TRMM_precip_daily_accum-0902-0914_0.25_grid.nc\n",
>> 			"Wed Oct 24 17:01:05 2012: cdo daysum
-chname,PRC_ACCUM,ACC_RAIN,
>> -selname,PRC_ACCUM mergefile_accum.nc
TRMM_accum_precip_daily.nc\n",
>> 			"Wed Oct 24 16:54:32 2012: ncks -v PRC_ACCUM mergefile.nc
>> mergefile_accum.nc\n",
>> 			"Wed Oct 24 16:52:44 2012: ncrcat 3B42.060901.0.6.HDF.nc <snip>
>> 3B42.060930.9.6.HDF.nc mergefile.nc" ;
>> 		:creation_date = "Fri Jun  1 18:14:08 EDT 2012" ;
>> 		:info = "\n",
>> 			"The 3B-42 estimates are scaled to match the monthly rain gauge
>> analyses\n",
>> 			"used in 3B-43.The output is rainfall for 0.25x0.25 degree grid
boxes
>> \n",
>> 			"every 3 hours.\n",
>> 			"" ;
>> 		:description = "\n",
>>
>>
"http://disc.sci.gsfc.nasa.gov/precipitation/documentation/TRMM_README
>>/T
>> RMM_3B42_readme.shtml\n",
>> 			"" ;
>> 		:ftp = "\n",
>>
>>
"http://disc.sci.gsfc.nasa.gov/data/datapool/TRMM_DP/01_Data_Products/
>>02
>> _Gridded/06_3-hour_Gpi_Cal_3B_42\n",
>> 			"" ;
>> 		:title = "TRMM_3B42" ;
>> 		:nco_openmp_thread_number = 1 ;
>> 		:NCO = "4.1.0" ;
>> 		:CDO = "Climate Data Operators version 1.5.5
>> (http://code.zmaw.de/projects/cdo)" ;
>>
>>
>>
>> ************
>> WRF FILE (processed,12 days, daily accumulated precip)
>> ************
>> $ ncdump -h 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc
>> netcdf \0.25x0.25_grid_Experiment_01_ACC_RAIN_daily {
>> dimensions:
>> 	lon = 280 ;
>> 	lat = 220 ;
>> 	time = UNLIMITED ; // (12 currently)
>> variables:
>> 	double lon(lon) ;
>> 		lon:standard_name = "longitude" ;
>> 		lon:long_name = "longitude" ;
>> 		lon:units = "degrees_east" ;
>> 		lon:axis = "X" ;
>> 	double lat(lat) ;
>> 		lat:standard_name = "latitude" ;
>> 		lat:long_name = "latitude" ;
>> 		lat:units = "degrees_north" ;
>> 		lat:axis = "Y" ;
>> 	double time(time) ;
>> 		time:standard_name = "time" ;
>> 		time:units = "hours since 2006-09-02 00:00:00" ;
>> 		time:calendar = "standard" ;
>> 	float ACC_RAIN(time, lat, lon) ;
>> 		ACC_RAIN:standard_name = "convective_precipitation_amount" ;
>> 		ACC_RAIN:long_name = "Accumulated Total Cumulus Precipitation" ;
>> 		ACC_RAIN:units = "mm" ;
>> 		ACC_RAIN:_FillValue = -9.e+33f ;
>>
>> // global attributes:
>> 		:CDI = "Climate Data Interface version 1.5.5
>> (http://code.zmaw.de/projects/cdi)" ;
>> 		:Conventions = "CF-1.1" ;
>> 		:history = "Wed May 23 14:30:35 2012: cdo chname,RAINC,ACC_RAIN
-sub
>> -seltimestep,2/13/1 temp_RAIN.nc -seltimestep,1/12/1 temp_RAIN.nc
>> 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc\n",
>> 			"Wed May 23 14:30:34 2012: cdo add -selname,RAINC temp1.nc
>> -selname,RAINNC temp1.nc temp_RAIN.nc\n",
>> 			"Wed May 23 14:30:34 2012: cdo seltimestep,1/97/8
>>-selname,RAINC,RAINNC
>> 0.25x0.25_grid_Experiment_01.nc temp1.nc\n",
>> 			"Wed May 23 14:30:03 2012: cdo -P 4
>>
>>remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0
>>.2
>> 5_grid_template.nc -sellonlatbox,-35,35,-20,35 Experiment_01.nc
>> 0.25x0.25_grid_Experiment_01.nc\n",
>> 			"Wed May 23 14:27:04 2012: ncks -v
>> u_gr_p,v_gr_p,T_p,rh_p,q_p,precip_c,precip_g,LHFlx
>> post_wrfout_d01_2006-09-02_00:00:00.nc
>> temp_post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>> 		:institution = "NASA GODDARD INSTITUTE FOR SPACE STUDIES - CIRES"
;
>> 		:title = "post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>> 		:notes = "Created with NCL script:  wrfout_to_cf_Erik.ncl v1.0" ;
>> 		:created_by = "Erik Noble - erik.noble at nasa.gov" ;
>> 		:creation_date = "Thu Dec  8 02:28:45 EST 2011" ;
>> 		:NCO = "4.1.0" ;
>> 		:CDO = "Climate Data Operators version 1.5.5
>> (http://code.zmaw.de/projects/cdo)" ;
>>
>>
>>
>> On 7/12/13 1:58 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
>>wrote:
>>
>>> Eric,
>>>
>>> The timing of the data within the ASCII files doesn't matter.  So using
>>> 40 12-day files is fine.
>>>
>>> But each time you run Point-Stat, you'll pass it a forecast file and an
>>> observation file containing the observations that should be used to
>>> verify that forecast.  So you'll need to know which obs
>>> files go with which forecast files.  Point-Stat reads the timing
>>> information from the forecast file and then defines a time window around
>>> that valid time (see "obs_window" setting in the Point-Stat
>>> config file and see data/config/README for details).  Any point
>>> observations falling within that time window will be used.  Any point
>>> observations falling outside that time window will be skipped over.
>>>
>>> If you were using GRIB forecast files, you could actually set this up in
>>> such a way as to verify all of the output times in a single call to
>>> Point-Stat.  You'd literally 'cat' together all of the GRIB
>>> files and then set up a more complex configuration file to tell
>>> Point-Stat what to do.  I suppose you could do the same using NetCDF if
>>> you defined a bunch of different 2D variables, each for precip
>>> with a different time (e.g. APCP_06_2012020500, APCP_06_2012020506, ...).
>>> Or you can keep all the NetCDF files separate and just loop through them
>>> in a script.  That's what we typically do.
>>>
>>> In order to create a NetCDF file that MET can handle, you basically need
>>> to structure it like the NetCDF output of the pcp_combine tool:
>>> (1) 2 dimensions named lat and lon.
>>> (2) The lat and lon variables are *NOT* required.  MET doesn't actually
>>> use them.  We put them in there since other plotting tools (like IDV) use
>>> them.
>>> (3) The data variables should be 2 dimensional, defined (lat, lon).
>>> (4) Need data variable attributes specifying timing information.
>>> (5) MET expects bad data to be encoded as -9999.
>>> (6) Need global attributes specifying grid definition information.
>>> (7) Need "MET_version" global attribute.
>>>
>>> Ultimately, we'd like to support CF-compliant NetCDF files, but we
>>> haven't gotten there yet.  For now, we're stuck with this rather
>>> restrictive NetCDF format that we use internally.
>>>
>>> If you need help setting up the global attributes, just send me a
>>> description of the grid you're using.  Give it a shot and if you run into
>>> problems, just send me a sample NetCDF file and I can let
>>> you know what changes are needed.
>>>
>>> Thanks,
>>> John
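To make the checklist above concrete, here is a minimal Python sketch (stdlib only) that assembles the required attribute values for one 24-hour precipitation variable on the 0.25-degree lat/lon grid discussed later in this thread. The timing-attribute names come from the pcp_combine example in this exchange; the grid-definition attribute names (Projection, lat_ll, lon_ll, delta_lat, delta_lon, Nlat, Nlon) are assumptions modeled on pcp_combine's lat/lon output and should be verified against a real pcp_combine file. Actually writing the NetCDF file would additionally require a NetCDF library.

```python
import calendar
import time

def unixtime(s):
    """Convert a MET-style "YYYYMMDD_HHMMSS" string to UTC unixtime."""
    return calendar.timegm(time.strptime(s, "%Y%m%d_%H%M%S"))

# (4) Per-variable timing attributes, e.g. a 24-hour accumulation
#     from a run initialized 20060902_000000, valid 20060903_000000.
init_time  = "20060902_000000"
valid_time = "20060903_000000"
var_attrs = {
    "name":           "APCP_24",
    "long_name":      "Total precipitation",
    "level":          "A24",
    "units":          "kg/m^2",
    "_FillValue":     -9999.0,            # (5) bad data encoded as -9999
    "init_time":      init_time,
    "init_time_ut":   unixtime(init_time),
    "valid_time":     valid_time,
    "valid_time_ut":  unixtime(valid_time),
    "accum_time":     "240000",
    "accum_time_sec": 24 * 3600,
}

# (6)+(7) Global attributes for the 0.25-degree lat/lon grid described
# below (220x280 points, lower-left corner at 19.875S, 34.875W).
# NOTE: these attribute names are assumptions; check them against the
# global attributes of an actual pcp_combine output file.
global_attrs = {
    "MET_version": "V4.1",
    "Projection":  "LatLon",
    "lat_ll":      "-19.875 degrees_north",
    "lon_ll":      "-34.875 degrees_east",
    "delta_lat":   "0.25 degrees",
    "delta_lon":   "0.25 degrees",
    "Nlat":        "220 grid_points",
    "Nlon":        "280 grid_points",
}

print(var_attrs["valid_time_ut"] - var_attrs["init_time_ut"])  # 86400
```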
>>>
>>> On 07/12/2013 09:23 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
>>>wrote:
>>>>
>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>
>>>> Hi, thank you. Your reply clarifies a lot.
>>>>
>>>> 2 questions:
>>>> You wrote, "Point-Stat is run once per valid time, ..."
>>>> I have 40 ascii files that hold 12-day daily precipitation accumulations
>>>> at each station. (1) Do the ascii files need to be once per valid time or
>>>> can I still use the 40 12-day files?
>>>>
>>>> Before I discovered MET, I went through the trouble of writing my own
>>>> scripts that do the same thing as p_interp and destagger the winds. Also,
>>>> I found an easy way to regrid netcdf files, both observed and wrf-arw
>>>> files, to the same grid using CDO. Details of the command line are here,
>>>> where the 2nd-to-last entry provides the best explanation
>>>> (https://code.zmaw.de/boards/1/topics/1051#message-1056)
>>>> What does the netcdf file need to have in order to still be used in MET
>>>> (point-stat, etc.)? Do the lats and lons still need to be 2-D
>>>> (lon=XLAT,XLONG; lat=XLAT,XLONG) or can they be 1D (lat = lat; lon = lon)?
>>>>
>>>> Thank you.
>>>> Erik
>>>>
>>>> On 7/11/13 11:54 AM, "John Halley Gotway via RT"
<met_help at ucar.edu>
>>>> wrote:
>>>>
>>>>> Erik,
>>>>>
>>>>> We appreciate the feedback!  It's been a long road getting MET out the
>>>>> door and supporting it, but it's nice to hear that it's useful to the
>>>>> modelling community.
>>>>>
>>>>> I see that you're asking about the support in MET for the NetCDF output
>>>>> of the pinterp utility.  The output of pinterp is a gridded NetCDF file
>>>>> that the MET tools do support, but with some
>>>>> limitations.  WRF is run on a staggered grid with the wind fields
>>>>> staggered, while the mass points are on a regular grid.  The pinterp
>>>>> output is indeed on pressure levels, but the wind points are
>>>>> still staggered, and MET is unable to read them.  So basically, using
>>>>> pinterp is not a good choice if you're interested in winds.  But as long
>>>>> as you're not using winds, the gridded pinterp output can
>>>>> be used in any place in MET that GRIB1 or GRIB2 is used.
>>>>>
>>>>> The other big drawback to pinterp is that it's NetCDF, and therefore,
>>>>> it's not easy to regrid.  When doing grid-to-grid comparisons, you need
>>>>> to put the forecast and observation fields on a common
>>>>> grid.  That's easy to do for GRIB using the copygb utility, but not easy
>>>>> in general for NetCDF.
>>>>>
>>>>> So the other WRF post-processing alternative is the Unified PostProcessor
>>>>> (UPP).  Its output is in GRIB1 format, which MET fully supports.  If
>>>>> possible, I'd suggest using UPP instead of pinterp to
>>>>> avoid the staggered grid and regridding limitations of NetCDF.  Support
>>>>> for UPP is provided via wrfhelp at ucar.edu.
>>>>>
>>>>> But to get to your question...
>>>>>
>>>>> If you're dealing with point observations, the only MET tool to be used
>>>>> is the Point-Stat tool.  I suggest running Point-Stat to compare the
>>>>> output of pinterp (or UPP, if you switch) to the point
>>>>> observation output of ASCII2NC.  With only 40 points, I'd suggest writing
>>>>> out the matched pair (MPR) line type from Point-Stat.  Point-Stat is run
>>>>> once per valid time, however, you can then use the
>>>>> STAT-Analysis tool to aggregate results through time.  Suppose you've run
>>>>> Point-Stat over a month of output for those 40 stations.  You could run
>>>>> STAT-Analysis to aggregate together that month of
>>>>> Point-Stat output and compute monthly statistics for each of the 40
>>>>> stations individually.
>>>>>
>>>>> Hopefully that helps.  If you get stuck on setting up the Point-Stat
>>>>> configuration file, just let me know and I'll be happy to help you get it
>>>>> going.
>>>>>
>>>>> Thanks,
>>>>> John Halley Gotway
>>>>> met_help at ucar.edu
>>>>>
>>>>> On 07/10/2013 09:45 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via
RT
>>>>> wrote:
>>>>>>
>>>>>> Wed Jul 10 09:45:10 2013: Request 62175 was acted upon.
>>>>>> Transaction: Ticket created by erik.noble at nasa.gov
>>>>>>           Queue: met_help
>>>>>>         Subject: MET 4 (or 4.1) question(s)
>>>>>>           Owner: Nobody
>>>>>>      Requestors: erik.noble at nasa.gov
>>>>>>          Status: new
>>>>>>     Ticket <URL:
>>>>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>>>
>>>>>>
>>>>>> Hi.
>>>>>> First, to the developer, thank you for providing MET to the world.  It
>>>>>> looks outstanding and very useful. If you don't get enough thanks, I hope
>>>>>> just one more "thank you" makes your day.
>>>>>>
>>>>>> I installed MET 4.1 on a Linux machine.
>>>>>> I am ready to start using MET to compare 40 station observations of daily
>>>>>> accumulated precip to 64 WRF-ARW simulations for the same area.
>>>>>> I have read the MET 4.1 manual for the past week and gone through all the
>>>>>> examples.
>>>>>>
>>>>>> But I am very confused about one thing:
>>>>>>
>>>>>> Once a user uses p_interp to process WRF-ARW output, where do they go
>>>>>> from there?
>>>>>> The p_interp output is still netcdf (variables are on pressure levels),
>>>>>> yet most of the modules want grib. Yet, in the second chapter, the manual
>>>>>> states that p_interp netcdf data can be used. How?
>>>>>> How do I feed p_interp processed data into the other MET modules,
>>>>>> particularly if I want to do the above, compare station data to WRF?
>>>>>> Perhaps I missed it?
>>>>>>
>>>>>>
>>>>>> I used ascii2nc to process the station data, so they are ready. I have the
>>>>>> wrf-arw p_interp processed data. How do I put the wrf-data into the
>>>>>> point-stat module?
>>>>>>
>>>>>> If it needs to still be in grib, what do I do with the p_interp
>>>>>> results? I could have skipped the p_interp step and just run the wrf-arw
>>>>>> output through UPP, right? I am confused.
>>>>>>
>>>>>> If a MET user or developer could help me understand the next step, that
>>>>>> would be greatly appreciated.
>>>>>>
>>>>>> In addition, has anyone successfully installed this software on an Apple
>>>>>> machine?
>>>>>>
>>>>>> Sincerely,
>>>>>> Erik Noble
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>



------------------------------------------------
Subject: MET 4 (or 4.1) question(s)
From: Noble, Erik U.[COLUMBIA UNIVERSITY]
Time: Mon Jul 15 08:41:28 2013

Thank you.

One last question: attached is the ascii version of the station file, and
the netcdf version of the file from the ascii2nc script.
The attributes don't give time information. I would not have known about
this without your email. Will this be a problem? What needs to be changed?

Sincerely,
Erik



> ncdump -h NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt.nc
netcdf NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt {
dimensions:
	mxstr = 30 ;
	hdr_arr_len = 3 ;
	obs_arr_len = 5 ;
	nhdr = 12 ;
	nobs = UNLIMITED ; // (12 currently)
variables:
	char hdr_typ(nhdr, mxstr) ;
		hdr_typ:long_name = "message type" ;
	char hdr_sid(nhdr, mxstr) ;
		hdr_sid:long_name = "station identification" ;
	char hdr_vld(nhdr, mxstr) ;
		hdr_vld:long_name = "valid time" ;
		hdr_vld:units = "YYYYMMDD_HHMMSS UTC" ;
	float hdr_arr(nhdr, hdr_arr_len) ;
		hdr_arr:long_name = "array of observation station header values" ;
		hdr_arr:_fill_value = -9999.f ;
		hdr_arr:columns = "lat lon elv" ;
		hdr_arr:lat_long_name = "latitude" ;
		hdr_arr:lat_units = "degrees_north" ;
		hdr_arr:lon_long_name = "longitude" ;
		hdr_arr:lon_units = "degrees_east" ;
		hdr_arr:elv_long_name = "elevation " ;
		hdr_arr:elv_units = "meters above sea level (msl)" ;
	char obs_qty(nobs, mxstr) ;
		obs_qty:long_name = "quality flag" ;
	float obs_arr(nobs, obs_arr_len) ;
		obs_arr:long_name = "array of observation values" ;
		obs_arr:_fill_value = -9999.f ;
		obs_arr:columns = "hdr_id gc lvl hgt ob" ;
		obs_arr:hdr_id_long_name = "index of matching header data" ;
		obs_arr:gc_long_name = "grib code corresponding to the observation type" ;
		obs_arr:lvl_long_name = "pressure level (hPa) or accumulation interval (sec)" ;
		obs_arr:hgt_long_name = "height in meters above sea level or ground level (msl or agl)" ;
		obs_arr:ob_long_name = "observation value" ;

// global attributes:
		:FileOrigins = "File
NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt.nc generated
20130709_215434 UTC on host borgb113 by the MET ascii2nc tool" ;
		:MET_version = "V4.1" ;
		:MET_tool = "ascii2nc" ;
}




On 7/12/13 6:14 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
wrote:

>Erik,
>
>I see that the variables are dimensioned like this: ACC_RAIN(time, lat,
>lon).  The problem is that the MET tools won't read the timing
>information from the "time" dimension as you'd probably expect.
>Instead, it'll try to read the timing information from the variable
>attributes.  For example, the output of pcp_combine looks like this:
>         float APCP_12(lat, lon) ;
>                 APCP_12:name = "APCP_12" ;
>                 APCP_12:long_name = "Total precipitation" ;
>                 APCP_12:level = "A12" ;
>                 APCP_12:units = "kg/m^2" ;
>                 APCP_12:_FillValue = -9999.f ;
>                 APCP_12:init_time = "20050807_000000" ;
>                 APCP_12:init_time_ut = 1123372800 ;
>                 APCP_12:valid_time = "20050807_120000" ;
>                 APCP_12:valid_time_ut = 1123416000 ;
>                 APCP_12:accum_time = "120000" ;
>                 APCP_12:accum_time_sec = 43200 ;
>
>It reads the model initialization time (init_time_ut), valid time
>(valid_time_ut), and accumulation interval (accum_time_sec) from those
>variable attributes.  Your NetCDF files follow the CF-1.4
>convention, but unfortunately MET doesn't have the ability to handle that
>yet.
>
>If you want to have multiple times in the file, they'd need to be stored
>in separate variables, with the corresponding timing information defined
>in the variable attribute section.
>
>So unfortunately, it's messier than we'd like.  I do see that you're
>using TRMM data, and we do have an Rscript on our website that reformats
>ASCII TRMM data into the flavor of NetCDF that MET is
>expecting.  You can find information about that here (trmm2nc.R):
>    http://www.dtcenter.org/met/users/downloads/observation_data.php
>
>Let me know how you'd like to proceed, and what I can do to help.
>
>Thanks,
>John
>
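Splitting a CF-style (time, lat, lon) variable into the separate per-time 2D variables described above is mostly bookkeeping. The following stdlib-only Python sketch shows just that bookkeeping step for the "hours since 2006-09-02 00:00:00" time axis used in this thread; the variable-name pattern (ACC_RAIN_24_YYYYMMDDHH) and the helper name split_time_axis are illustrative, not MET requirements, and writing the actual variables would require a NetCDF library.

```python
import calendar
import time

# Base time taken from the CF units string
# "hours since 2006-09-02 00:00:00".
BASE_UT = calendar.timegm(time.strptime("2006-09-02 00:00:00",
                                        "%Y-%m-%d %H:%M:%S"))

def split_time_axis(hours_since_base, init_ut=BASE_UT, accum_sec=86400):
    """For each CF time offset, build the per-time variable name and
    the timing attributes MET expects on a separate 2D variable."""
    out = []
    for h in hours_since_base:
        valid_ut = BASE_UT + int(h) * 3600
        stamp = time.strftime("%Y%m%d%H", time.gmtime(valid_ut))
        out.append({
            "var_name":       "ACC_RAIN_24_%s" % stamp,  # illustrative
            "init_time_ut":   init_ut,
            "valid_time_ut":  valid_ut,
            "accum_time_sec": accum_sec,
        })
    return out

# Daily accumulations valid 24, 48, 72 hours after the base time:
for v in split_time_axis([24, 48, 72]):
    print(v["var_name"], v["valid_time_ut"])
```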
>
>On 07/12/2013 01:15 PM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>>
>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>
>> Wow, thank you.
>> I think I understand now. Your description of the point-ascii files is
>> clear and I think I can proceed.
>>
>> In your description of the netcdf files you wrote:
>> (1) 2 dimensions named lat and lon.
>> (2) The lat and lon variables are *NOT* required. . .
>> (3) The data variables should be 2 dimensional, defined (lat, lon).
>> (4) Need data variable attributes specifying timing information.
>> (5) MET expects bad data to be encoded as -9999.
>> (6) Need global attributes specifying grid definition information.
>> (7) Need "MET_version" global attribute.
>>
>>
>> I have all of that except 4, 6 and 7.
>>
>> A description of the netcdf files I have is below. They are on a
>> 0.25°x0.25° grid.
>> The variable in the WRF file is:  ACC_RAIN(time, lat, lon)
>> The variable in the TRMM file is: ACC_RAIN(time, lat, lon)
>>
>> 	where they are both on the same 0.25°x0.25° grid. time = 12, and lat
>> and lon are 1D.
>>
>> First, where would global attributes specifying grid definition
>> information go?
>> Is the timing info ok?
>> By MET version, I just need to add "Metv4.1"?
>>
>> Thank you.
>> -Erik
>>
>>
>> *****************
>> Grid netcdf file description (0.25x0.25_grid_template.nc)
>>
>> *****************
>> 1 #
>>
>>
>>    2 # gridID 0
>>    3 #
>>    4 gridtype  = lonlat
>>    5 gridsize  = 61600
>>    6 xname     = lon
>>    7 xlongname = longitude
>>    8 xunits    = degrees_east
>>    9 yname     = lat
>>   10 ylongname = latitude
>>   11 yunits    = degrees_north
>>   12 xsize     = 280
>>   13 ysize     = 220
>>   14 xfirst    = -34.875
>>   15 xinc      = 0.25
>>   16 yfirst    = -19.875
>>   17 yinc      = 0.25
>>
>>
>> ****************
>>   netcdf file: Grid template
>> ****************
>> $ ncdump -h 0.25x0.25_grid_template.nc
>> netcdf \0.25x0.25_grid_template {
>> dimensions:
>> 	lon = 280 ;
>> 	lat = 220 ;
>> variables:
>> 	double lon(lon) ;
>> 		lon:standard_name = "longitude" ;
>> 		lon:long_name = "longitude" ;
>> 		lon:units = "degrees_east" ;
>> 		lon:axis = "X" ;
>> 	double lat(lat) ;
>> 		lat:standard_name = "latitude" ;
>> 		lat:long_name = "latitude" ;
>> 		lat:units = "degrees_north" ;
>> 		lat:axis = "Y" ;
>> 	float random(lat, lon) ;
>>
>> // global attributes:
>> 		:CDI = "Climate Data Interface version 1.5.5
>> (http://code.zmaw.de/projects/cdi)" ;
>> 		:Conventions = "CF-1.4" ;
>> 		:history = "Tue May 22 15:38:08 2012: cdo -f nc
>> -sellonlatbox,-35,35,-20,35 -random,global_0.25
>> 0.25x0.25_grid_template.nc" ;
>> 		:CDO = "Climate Data Operators version 1.5.5
>> (http://code.zmaw.de/projects/cdo)" ;
>> }
>> ****************
>> TRMM netcdf file (12 days, daily accumulated precip)
>> ****************
>> gs611-noble:precip_work eunoble$ ncdump -h
>> 0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily.nc
>> netcdf \0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily {
>> dimensions:
>> 	lon = 280 ;
>> 	lat = 220 ;
>> 	time = UNLIMITED ; // (12 currently)
>> variables:
>> 	double lon(lon) ;
>> 		lon:standard_name = "longitude" ;
>> 		lon:long_name = "longitude" ;
>> 		lon:units = "degrees_east" ;
>> 		lon:axis = "X" ;
>> 	double lat(lat) ;
>> 		lat:standard_name = "latitude" ;
>> 		lat:long_name = "latitude" ;
>> 		lat:units = "degrees_north" ;
>> 		lat:axis = "Y" ;
>> 	double time(time) ;
>> 		time:standard_name = "time" ;
>> 		time:units = "hours since 1997-01-01 00:00:00" ;
>> 		time:calendar = "standard" ;
>> 	float ACC_RAIN(time, lat, lon) ;
>> 		ACC_RAIN:long_name = "TRMM_3B42 3-hourly accumulation" ;
>> 		ACC_RAIN:units = "mm" ;
>> 		ACC_RAIN:_FillValue = -9999.f ;
>>
>> // global attributes:
>> 		:CDI = "Climate Data Interface version 1.5.5
>> (http://code.zmaw.de/projects/cdi)" ;
>> 		:Conventions = "CF-1.4" ;
>> 		:history = "Thu Oct 25 20:37:02 2012: cdo
>> remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0.25_grid_template.nc
>> -sellonlatbox,-35,35,-20,35 -seltimestep,2/13/1
>> TRMM_accum_precip_daily.nc
>> TRMM_precip_daily_accum-0902-0914_0.25_grid.nc\n",
>> 			"Wed Oct 24 17:01:05 2012: cdo daysum -chname,PRC_ACCUM,ACC_RAIN,
>> -selname,PRC_ACCUM mergefile_accum.nc TRMM_accum_precip_daily.nc\n",
>> 			"Wed Oct 24 16:54:32 2012: ncks -v PRC_ACCUM mergefile.nc
>> mergefile_accum.nc\n",
>> 			"Wed Oct 24 16:52:44 2012: ncrcat 3B42.060901.0.6.HDF.nc <snip>
>> 3B42.060930.9.6.HDF.nc mergefile.nc" ;
>> 		:creation_date = "Fri Jun  1 18:14:08 EDT 2012" ;
>> 		:info = "\n",
>> 			"The 3B-42 estimates are scaled to match the monthly rain gauge
>> analyses\n",
>> 			"used in 3B-43.The output is rainfall for 0.25x0.25 degree grid
boxes
>> \n",
>> 			"every 3 hours.\n",
>> 			"" ;
>> 		:description = "\n",
>> 			"http://disc.sci.gsfc.nasa.gov/precipitation/documentation/TRMM_README/TRMM_3B42_readme.shtml\n",
>> 			"" ;
>> 		:ftp = "\n",
>> 			"http://disc.sci.gsfc.nasa.gov/data/datapool/TRMM_DP/01_Data_Products/02_Gridded/06_3-hour_Gpi_Cal_3B_42\n",
>> 			"" ;
>> 		:title = "TRMM_3B42" ;
>> 		:nco_openmp_thread_number = 1 ;
>> 		:NCO = "4.1.0" ;
>> 		:CDO = "Climate Data Operators version 1.5.5
>> (http://code.zmaw.de/projects/cdo)" ;
>>
>>
>>
>> ************
>> WRF FILE (processed,12 days, daily accumulated precip)
>> ************
>> $ ncdump -h 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc
>> netcdf \0.25x0.25_grid_Experiment_01_ACC_RAIN_daily {
>> dimensions:
>> 	lon = 280 ;
>> 	lat = 220 ;
>> 	time = UNLIMITED ; // (12 currently)
>> variables:
>> 	double lon(lon) ;
>> 		lon:standard_name = "longitude" ;
>> 		lon:long_name = "longitude" ;
>> 		lon:units = "degrees_east" ;
>> 		lon:axis = "X" ;
>> 	double lat(lat) ;
>> 		lat:standard_name = "latitude" ;
>> 		lat:long_name = "latitude" ;
>> 		lat:units = "degrees_north" ;
>> 		lat:axis = "Y" ;
>> 	double time(time) ;
>> 		time:standard_name = "time" ;
>> 		time:units = "hours since 2006-09-02 00:00:00" ;
>> 		time:calendar = "standard" ;
>> 	float ACC_RAIN(time, lat, lon) ;
>> 		ACC_RAIN:standard_name = "convective_precipitation_amount" ;
>> 		ACC_RAIN:long_name = "Accumulated Total Cumulus Precipitation" ;
>> 		ACC_RAIN:units = "mm" ;
>> 		ACC_RAIN:_FillValue = -9.e+33f ;
>>
>> // global attributes:
>> 		:CDI = "Climate Data Interface version 1.5.5
>> (http://code.zmaw.de/projects/cdi)" ;
>> 		:Conventions = "CF-1.1" ;
>> 		:history = "Wed May 23 14:30:35 2012: cdo chname,RAINC,ACC_RAIN -sub
>> -seltimestep,2/13/1 temp_RAIN.nc -seltimestep,1/12/1 temp_RAIN.nc
>> 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc\n",
>> 			"Wed May 23 14:30:34 2012: cdo add -selname,RAINC temp1.nc
>> -selname,RAINNC temp1.nc temp_RAIN.nc\n",
>> 			"Wed May 23 14:30:34 2012: cdo seltimestep,1/97/8 -selname,RAINC,RAINNC
>> 0.25x0.25_grid_Experiment_01.nc temp1.nc\n",
>> 			"Wed May 23 14:30:03 2012: cdo -P 4
>> remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0.25_grid_template.nc
>> -sellonlatbox,-35,35,-20,35 Experiment_01.nc
>> 0.25x0.25_grid_Experiment_01.nc\n",
>> 			"Wed May 23 14:27:04 2012: ncks -v
>> u_gr_p,v_gr_p,T_p,rh_p,q_p,precip_c,precip_g,LHFlx
>> post_wrfout_d01_2006-09-02_00:00:00.nc
>> temp_post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>> 		:institution = "NASA GODDARD INSTITUTE FOR SPACE STUDIES - CIRES" ;
>> 		:title = "post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>> 		:notes = "Created with NCL script:  wrfout_to_cf_Erik.ncl v1.0" ;
>> 		:created_by = "Erik Noble - erik.noble at nasa.gov" ;
>> 		:creation_date = "Thu Dec  8 02:28:45 EST 2011" ;
>> 		:NCO = "4.1.0" ;
>> 		:CDO = "Climate Data Operators version 1.5.5
>> (http://code.zmaw.de/projects/cdo)" ;
>>
>>
>>


------------------------------------------------
Subject: MET 4 (or 4.1) question(s)
From: Noble, Erik U.[COLUMBIA UNIVERSITY]
Time: Mon Jul 15 08:41:28 2013

ADPSFC AMMA01 20060902_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA    19.56
ADPSFC AMMA01 20060903_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA     6.86
ADPSFC AMMA01 20060904_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA     0.00
ADPSFC AMMA01 20060905_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA     0.00
ADPSFC AMMA01 20060906_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA    26.42
ADPSFC AMMA01 20060907_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA    50.04
ADPSFC AMMA01 20060908_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA     1.02
ADPSFC AMMA01 20060909_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA     0.00
ADPSFC AMMA01 20060910_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA     0.00
ADPSFC AMMA01 20060911_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA     2.79
ADPSFC AMMA01 20060912_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA     0.25
ADPSFC AMMA01 20060913_000000    14.57   -16.76 -9999.00 61 24 -9999.00 NA     9.14
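The records above follow the 11-column whitespace-separated input format that ascii2nc reads (message type, station ID, valid time, lat, lon, elevation, GRIB code, level, height, QC string, value). A small stdlib-only Python sketch that parses one record, treating -9999 as missing; the function name parse_obs_line is illustrative, not part of MET:

```python
# Columns of the ascii2nc point-observation input format, as in the
# records above (11 whitespace-separated fields).
FIELDS = ("message_type", "station_id", "valid_time",
          "lat", "lon", "elevation",
          "grib_code", "level", "height", "qc", "value")

def parse_obs_line(line):
    """Parse one ascii2nc input record into a dict, converting the
    numeric columns and treating -9999 as missing (None)."""
    toks = line.split()
    if len(toks) != len(FIELDS):
        raise ValueError("expected 11 columns, got %d" % len(toks))
    rec = dict(zip(FIELDS, toks))
    for key in ("lat", "lon", "elevation", "level", "height", "value"):
        val = float(rec[key])
        rec[key] = None if val == -9999.0 else val
    rec["grib_code"] = int(rec["grib_code"])
    return rec

rec = parse_obs_line(
    "ADPSFC AMMA01 20060902_000000 14.57 -16.76 -9999.00 "
    "61 24 -9999.00 NA 19.56")
print(rec["station_id"], rec["grib_code"], rec["value"])
```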

------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #62175] MET 4 (or 4.1) question(s)
From: John Halley Gotway
Time: Mon Jul 15 11:58:08 2013

Erik,

MET uses a NetCDF point observation file format (output of pb2nc,
ascii2nc, and madis2nc), and a gridded NetCDF file format (output of
pcp_combine and gen_poly_mask).  Those file formats differ.

For gridded data, you need to make it look like the output of
pcp_combine.  If you wanted to put multiple times into the same NetCDF
file, it might look something like this:

             float APCP_12_2013071500(lat, lon) ;
                   APCP_12_2013071500:name = "APCP_12" ;
                   APCP_12_2013071500:long_name = "Total precipitation" ;
                   APCP_12_2013071500:level = "A12" ;
                   APCP_12_2013071500:units = "kg/m^2" ;
                   APCP_12_2013071500:_FillValue = -9999.f ;
                   APCP_12_2013071500:init_time = "20130714_000000" ;
                   APCP_12_2013071500:init_time_ut = 1373760000 ;
                   APCP_12_2013071500:valid_time = "20130715_000000" ;
                   APCP_12_2013071500:valid_time_ut = 1373846400 ;
                   APCP_12_2013071500:accum_time = "120000" ;
                   APCP_12_2013071500:accum_time_sec = 43200 ;
             float APCP_12_2013071512(lat, lon) ;
                   APCP_12_2013071512:name = "APCP_12" ;
                   APCP_12_2013071512:long_name = "Total precipitation" ;
                   APCP_12_2013071512:level = "A12" ;
                   APCP_12_2013071512:units = "kg/m^2" ;
                   APCP_12_2013071512:_FillValue = -9999.f ;
                   APCP_12_2013071512:init_time = "20130714_000000" ;
                   APCP_12_2013071512:init_time_ut = 1373760000 ;
                   APCP_12_2013071512:valid_time = "20130715_120000" ;
                   APCP_12_2013071512:valid_time_ut = 1373889600 ;
                   APCP_12_2013071512:accum_time = "120000" ;
                   APCP_12_2013071512:accum_time_sec = 43200 ;

In the above example, I've included 2 separate variables of 12-hour
accumulated precip.  Assume that we initialized a model run at
20130714_000000 (see init_time and init_time_ut variable attributes).
  This first variable (APCP_12_2013071500) is the 12-hour accumulation
valid at 20130715_000000.  So that'd be between forecast hours 12 and
24.  The second variable (APCP_12_2013071512) is the next
12-hour accumulation, valid at 20130715_120000.  So that'd be between
forecast hours 24 and 36.  And you could add as many of these
variables as you'd like.
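
For gridded NetCDF, MET also needs global attributes describing the grid
itself. For the 0.25 degree lat/lon grid discussed in this thread (280x220
points, lower-left corner at 34.875W, 19.875S), they would look roughly
like the following; I'm reconstructing the attribute names from
pcp_combine output, so please verify them against a file pcp_combine
actually wrote:

```
// global attributes:
        :MET_version = "V4.1" ;
        :Projection = "LatLon" ;
        :lat_ll = "-19.875 degrees_north" ;
        :lon_ll = "-34.875 degrees_east" ;
        :delta_lat = "0.25 degrees" ;
        :delta_lon = "0.25 degrees" ;
        :Nlat = "220 grid_points" ;
        :Nlon = "280 grid_points" ;
```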

Structuring the NetCDF file this way should enable you to verify them
all in a single call to Point-Stat; however, as I mentioned, the config
file will get messier.  You'd have to list each NetCDF variable
name separately...

    cat_thresh = [ >0.0 ];

    field = [
       { name = "APCP_12_2013071500"; level = [ "(*,*)" ]; },
       { name = "APCP_12_2013071512"; level = [ "(*,*)" ]; }
    ];

FYI, if you're not familiar with unixtime (that's what the _ut stands
for), here's how you can use the Unix date command to compute it:
    # Go from YYYY-MM-DD HH:MM:SS to unixtime
    date -ud ''2013-07-14' UTC '00:00:00'' +%s

    # Go from unixtime to YYYY-MM-DD HH:MM:SS
    date -ud '1970-01-01 UTC '1373760000' seconds' +"%Y-%m-%d
%H:%M:%S"
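
The same conversions can be scripted. Here is a small plain-Python sketch
(the helper name met_time_attrs is mine, not part of MET) that builds a
consistent set of the timing attributes shown above:

```python
from datetime import datetime, timezone

def met_time_attrs(init, valid, accum_sec):
    """Build MET-style timing attributes for one gridded variable.

    init and valid are YYYYMMDD_HHMMSS strings (UTC); accum_sec is the
    accumulation interval in seconds.
    """
    def unixtime(s):
        # Parse as UTC, then convert to seconds since 1970-01-01.
        dt = datetime.strptime(s, "%Y%m%d_%H%M%S").replace(tzinfo=timezone.utc)
        return int(dt.timestamp())

    hh, rem = divmod(accum_sec, 3600)
    mm, ss = divmod(rem, 60)
    return {
        "init_time":      init,
        "init_time_ut":   unixtime(init),
        "valid_time":     valid,
        "valid_time_ut":  unixtime(valid),
        "accum_time":     "%02d%02d%02d" % (hh, mm, ss),
        "accum_time_sec": accum_sec,
    }

attrs = met_time_attrs("20130714_000000", "20130715_000000", 43200)
print(attrs["init_time_ut"], attrs["valid_time_ut"], attrs["accum_time"])
# 1373760000 1373846400 120000
```

You can then copy those values into the variable attributes with ncatted
or whatever NetCDF tool you prefer.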

Just let me know if more questions arise.

Thanks,
John

On 07/15/2013 08:41 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>
> Thank you.
>
> One last question: attached is the ascii version of the station
> file, and the netcdf version of the file from the ascii2nc script.
> The attributes don't give time information. I would not have known
about
> this without your email. Will this be a problem? What needs to be
changed?
>
> Sincerely,
> Erik
>
>
>
>> ncdump -h NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt.nc
> netcdf NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt {
> dimensions:
> 	mxstr = 30 ;
> 	hdr_arr_len = 3 ;
> 	obs_arr_len = 5 ;
> 	nhdr = 12 ;
> 	nobs = UNLIMITED ; // (12 currently)
> variables:
> 	char hdr_typ(nhdr, mxstr) ;
> 		hdr_typ:long_name = "message type" ;
> 	char hdr_sid(nhdr, mxstr) ;
> 		hdr_sid:long_name = "station identification" ;
> 	char hdr_vld(nhdr, mxstr) ;
> 		hdr_vld:long_name = "valid time" ;
> 		hdr_vld:units = "YYYYMMDD_HHMMSS UTC" ;
> 	float hdr_arr(nhdr, hdr_arr_len) ;
> 		hdr_arr:long_name = "array of observation station header values" ;
> 		hdr_arr:_fill_value = -9999.f ;
> 		hdr_arr:columns = "lat lon elv" ;
> 		hdr_arr:lat_long_name = "latitude" ;
> 		hdr_arr:lat_units = "degrees_north" ;
> 		hdr_arr:lon_long_name = "longitude" ;
> 		hdr_arr:lon_units = "degrees_east" ;
> 		hdr_arr:elv_long_name = "elevation " ;
> 		hdr_arr:elv_units = "meters above sea level (msl)" ;
> 	char obs_qty(nobs, mxstr) ;
> 		obs_qty:long_name = "quality flag" ;
> 	float obs_arr(nobs, obs_arr_len) ;
> 		obs_arr:long_name = "array of observation values" ;
> 		obs_arr:_fill_value = -9999.f ;
> 		obs_arr:columns = "hdr_id gc lvl hgt ob" ;
> 		obs_arr:hdr_id_long_name = "index of matching header data" ;
> 		obs_arr:gc_long_name = "grib code corresponding to the observation
type"
> ;
> 		obs_arr:lvl_long_name = "pressure level (hPa) or accumulation
interval
> (sec)" ;
> 		obs_arr:hgt_long_name = "height in meters above sea level or
ground
> level (msl or agl)" ;
> 		obs_arr:ob_long_name = "observation value" ;
>
> // global attributes:
> 		:FileOrigins = "File
> NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt.nc generated
> 20130709_215434 UTC on host borgb113 by the MET ascii2nc tool" ;
> 		:MET_version = "V4.1" ;
> 		:MET_tool = "ascii2nc" ;
> }
>
>
>
>
> On 7/12/13 6:14 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
wrote:
>
>> Erik,
>>
>> I see that the variables are dimensioned like this: ACC_RAIN(time,
lat,
>> lon).  The problem is that the MET tools won't read the timing
>> information from the "time" dimension as you'd probably expect.
>> Instead, it'll try to read the timing information from the variable
>> attributes.  For example, the output of pcp_combine looks like
this:
>>          float APCP_12(lat, lon) ;
>>                  APCP_12:name = "APCP_12" ;
>>                  APCP_12:long_name = "Total precipitation" ;
>>                  APCP_12:level = "A12" ;
>>                  APCP_12:units = "kg/m^2" ;
>>                  APCP_12:_FillValue = -9999.f ;
>>                  APCP_12:init_time = "20050807_000000" ;
>>                  APCP_12:init_time_ut = 1123372800 ;
>>                  APCP_12:valid_time = "20050807_120000" ;
>>                  APCP_12:valid_time_ut = 1123416000 ;
>>                  APCP_12:accum_time = "120000" ;
>>                  APCP_12:accum_time_sec = 43200 ;
>>
>> It reads the model initialization time (init_time_ut), valid time
>> (valid_time_ut), and accumulation interval (accum_time_sec) from
those
>> variable attributes.  Your NetCDF files follow the CF-1.4
>> convention, but unfortunately MET doesn't have the ability to
handle that
>> yet.
>>
>> If you want to have multiple times in the file, they'd need to be
stored
>> in separate variables, with the corresponding timing information
defined
>> in the variable attribute section.
>>
>> So unfortunately, it's messier than we'd like.  I do see that
you're
>> using TRMM data, and we do have an Rscript on our website that
reformats
>> ASCII TRMM data into the flavor of NetCDF that MET is
>> expecting.  You can find information about that here (trmm2nc.R):
>>
http://www.dtcenter.org/met/users/downloads/observation_data.php
>>
>> Let me know how you'd like to proceed, and what I can do to help.
>>
>> Thanks,
>> John
>>
>>
>> On 07/12/2013 01:15 PM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>>>
>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>
>>> Wow, thank you.
>>> I think I understand now. Your description of the point-ascii
files is
>>> clear and I think I can proceed.
>>>
>>> In your description of the netcdf files, you wrote:
>>> (1) 2 dimensions named lat and lon.
>>> (2) The lat and lon variables are *NOT* required. . .
>>> (3) The data variables should be 2 dimensional, defined (lat,
lon).
>>> (4) Need data variable attributes specifying timing information.
>>> (5) MET expects bad data to be encoded as -9999.
>>> (6) Need global attributes specifying grid definition information.
>>> (7) Need "MET_version" global attribute.
>>>
>>>
>>> I have all of that except 4, 6 and 7.
>>>
>>> A description of the netcdf files I have is below. They are on a
>>> 0.25°x0.25° grid.
>>> The variable in the WRF file is:  ACC_RAIN(time, lat, lon)
>>> The variable in the TRMM file is: ACC_RAIN(time, lat, lon)
>>>
>>> 	where they are both on the same grid 0.25°x0.25° grid. time = 12,
and
>>> lat
>>> and lon are 1D.
>>>
>>> First, where would global attributes specifying grid definition
>>> information go?
>>> Is the timing info ok?
>>> By Met version, I just need to add "Metv4.1" ?
>>>
>>> Thank you.
>>> -Erik
>>>
>>>
>>> *****************
>>> Grid netcdf file description (0.25x0.25_grid_template.nc)
>>>
>>> *****************
>>> 1 #
>>>
>>>
>>>     2 # gridID 0
>>>     3 #
>>>     4 gridtype  = lonlat
>>>     5 gridsize  = 61600
>>>     6 xname     = lon
>>>     7 xlongname = longitude
>>>     8 xunits    = degrees_east
>>>     9 yname     = lat
>>>    10 ylongname = latitude
>>>    11 yunits    = degrees_north
>>>    12 xsize     = 280
>>>    13 ysize     = 220
>>>    14 xfirst    = -34.875
>>>    15 xinc      = 0.25
>>>    16 yfirst    = -19.875
>>>    17 yinc      = 0.25
>>>
>>>
>>> ****************
>>>    netcdf file: Grid template
>>> ****************
>>> $ ncdump -h 0.25x0.25_grid_template.nc
>>> netcdf \0.25x0.25_grid_template {
>>> dimensions:
>>> 	lon = 280 ;
>>> 	lat = 220 ;
>>> variables:
>>> 	double lon(lon) ;
>>> 		lon:standard_name = "longitude" ;
>>> 		lon:long_name = "longitude" ;
>>> 		lon:units = "degrees_east" ;
>>> 		lon:axis = "X" ;
>>> 	double lat(lat) ;
>>> 		lat:standard_name = "latitude" ;
>>> 		lat:long_name = "latitude" ;
>>> 		lat:units = "degrees_north" ;
>>> 		lat:axis = "Y" ;
>>> 	float random(lat, lon) ;
>>>
>>> // global attributes:
>>> 		:CDI = "Climate Data Interface version 1.5.5
>>> (http://code.zmaw.de/projects/cdi)" ;
>>> 		:Conventions = "CF-1.4" ;
>>> 		:history = "Tue May 22 15:38:08 2012: cdo -f nc
>>> -sellonlatbox,-35,35,-20,35 -random,global_0.25
>>> 0.25x0.25_grid_template.nc" ;
>>> 		:CDO = "Climate Data Operators version 1.5.5
>>> (http://code.zmaw.de/projects/cdo)" ;
>>> }
>>> ****************
>>> TRMM netcdf file (12 days, daily accumulated precip)
>>> ****************
>>> gs611-noble:precip_work eunoble$ ncdump -h
>>> 0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily.nc
>>> netcdf \0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily {
>>> dimensions:
>>> 	lon = 280 ;
>>> 	lat = 220 ;
>>> 	time = UNLIMITED ; // (12 currently)
>>> variables:
>>> 	double lon(lon) ;
>>> 		lon:standard_name = "longitude" ;
>>> 		lon:long_name = "longitude" ;
>>> 		lon:units = "degrees_east" ;
>>> 		lon:axis = "X" ;
>>> 	double lat(lat) ;
>>> 		lat:standard_name = "latitude" ;
>>> 		lat:long_name = "latitude" ;
>>> 		lat:units = "degrees_north" ;
>>> 		lat:axis = "Y" ;
>>> 	double time(time) ;
>>> 		time:standard_name = "time" ;
>>> 		time:units = "hours since 1997-01-01 00:00:00" ;
>>> 		time:calendar = "standard" ;
>>> 	float ACC_RAIN(time, lat, lon) ;
>>> 		ACC_RAIN:long_name = "TRMM_3B42 3-hourly accumulation" ;
>>> 		ACC_RAIN:units = "mm" ;
>>> 		ACC_RAIN:_FillValue = -9999.f ;
>>>
>>> // global attributes:
>>> 		:CDI = "Climate Data Interface version 1.5.5
>>> (http://code.zmaw.de/projects/cdi)" ;
>>> 		:Conventions = "CF-1.4" ;
>>> 		:history = "Thu Oct 25 20:37:02 2012: cdo
>>>
>>>
remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0
>>> .2
>>> 5_grid_template.nc -sellonlatbox,-35,35,-20,35 -seltimestep,2/13/1
>>> TRMM_accum_precip_daily.nc
>>> TRMM_precip_daily_accum-0902-0914_0.25_grid.nc\n",
>>> 			"Wed Oct 24 17:01:05 2012: cdo daysum
-chname,PRC_ACCUM,ACC_RAIN,
>>> -selname,PRC_ACCUM mergefile_accum.nc
TRMM_accum_precip_daily.nc\n",
>>> 			"Wed Oct 24 16:54:32 2012: ncks -v PRC_ACCUM mergefile.nc
>>> mergefile_accum.nc\n",
>>> 			"Wed Oct 24 16:52:44 2012: ncrcat 3B42.060901.0.6.HDF.nc <snip>
>>> 3B42.060930.9.6.HDF.nc mergefile.nc" ;
>>> 		:creation_date = "Fri Jun  1 18:14:08 EDT 2012" ;
>>> 		:info = "\n",
>>> 			"The 3B-42 estimates are scaled to match the monthly rain gauge
>>> analyses\n",
>>> 			"used in 3B-43.The output is rainfall for 0.25x0.25 degree grid
boxes
>>> \n",
>>> 			"every 3 hours.\n",
>>> 			"" ;
>>> 		:description = "\n",
>>>
>>>
"http://disc.sci.gsfc.nasa.gov/precipitation/documentation/TRMM_README
>>> /T
>>> RMM_3B42_readme.shtml\n",
>>> 			"" ;
>>> 		:ftp = "\n",
>>>
>>>
"http://disc.sci.gsfc.nasa.gov/data/datapool/TRMM_DP/01_Data_Products/
>>> 02
>>> _Gridded/06_3-hour_Gpi_Cal_3B_42\n",
>>> 			"" ;
>>> 		:title = "TRMM_3B42" ;
>>> 		:nco_openmp_thread_number = 1 ;
>>> 		:NCO = "4.1.0" ;
>>> 		:CDO = "Climate Data Operators version 1.5.5
>>> (http://code.zmaw.de/projects/cdo)" ;
>>>
>>>
>>>
>>> ************
>>> WRF FILE (processed,12 days, daily accumulated precip)
>>> ************
>>> $ ncdump -h 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc
>>> netcdf \0.25x0.25_grid_Experiment_01_ACC_RAIN_daily {
>>> dimensions:
>>> 	lon = 280 ;
>>> 	lat = 220 ;
>>> 	time = UNLIMITED ; // (12 currently)
>>> variables:
>>> 	double lon(lon) ;
>>> 		lon:standard_name = "longitude" ;
>>> 		lon:long_name = "longitude" ;
>>> 		lon:units = "degrees_east" ;
>>> 		lon:axis = "X" ;
>>> 	double lat(lat) ;
>>> 		lat:standard_name = "latitude" ;
>>> 		lat:long_name = "latitude" ;
>>> 		lat:units = "degrees_north" ;
>>> 		lat:axis = "Y" ;
>>> 	double time(time) ;
>>> 		time:standard_name = "time" ;
>>> 		time:units = "hours since 2006-09-02 00:00:00" ;
>>> 		time:calendar = "standard" ;
>>> 	float ACC_RAIN(time, lat, lon) ;
>>> 		ACC_RAIN:standard_name = "convective_precipitation_amount" ;
>>> 		ACC_RAIN:long_name = "Accumulated Total Cumulus Precipitation" ;
>>> 		ACC_RAIN:units = "mm" ;
>>> 		ACC_RAIN:_FillValue = -9.e+33f ;
>>>
>>> // global attributes:
>>> 		:CDI = "Climate Data Interface version 1.5.5
>>> (http://code.zmaw.de/projects/cdi)" ;
>>> 		:Conventions = "CF-1.1" ;
>>> 		:history = "Wed May 23 14:30:35 2012: cdo chname,RAINC,ACC_RAIN
-sub
>>> -seltimestep,2/13/1 temp_RAIN.nc -seltimestep,1/12/1 temp_RAIN.nc
>>> 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc\n",
>>> 			"Wed May 23 14:30:34 2012: cdo add -selname,RAINC temp1.nc
>>> -selname,RAINNC temp1.nc temp_RAIN.nc\n",
>>> 			"Wed May 23 14:30:34 2012: cdo seltimestep,1/97/8
>>> -selname,RAINC,RAINNC
>>> 0.25x0.25_grid_Experiment_01.nc temp1.nc\n",
>>> 			"Wed May 23 14:30:03 2012: cdo -P 4
>>>
>>>
remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0
>>> .2
>>> 5_grid_template.nc -sellonlatbox,-35,35,-20,35 Experiment_01.nc
>>> 0.25x0.25_grid_Experiment_01.nc\n",
>>> 			"Wed May 23 14:27:04 2012: ncks -v
>>> u_gr_p,v_gr_p,T_p,rh_p,q_p,precip_c,precip_g,LHFlx
>>> post_wrfout_d01_2006-09-02_00:00:00.nc
>>> temp_post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>>> 		:institution = "NASA GODDARD INSTITUTE FOR SPACE STUDIES -
CIRES" ;
>>> 		:title = "post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>>> 		:notes = "Created with NCL script:  wrfout_to_cf_Erik.ncl v1.0"
;
>>> 		:created_by = "Erik Noble - erik.noble at nasa.gov" ;
>>> 		:creation_date = "Thu Dec  8 02:28:45 EST 2011" ;
>>> 		:NCO = "4.1.0" ;
>>> 		:CDO = "Climate Data Operators version 1.5.5
>>> (http://code.zmaw.de/projects/cdo)" ;
>>>
>>>
>>>
>>> On 7/12/13 1:58 PM, "John Halley Gotway via RT"
<met_help at ucar.edu>
>>> wrote:
>>>
>>>> Erik,
>>>>
>>>> The timing of the data within the ASCII files doesn't matter.  So
using
>>>> 40 12-day files is fine.
>>>>
>>>> But each time you run Point-Stat, you'll pass it a forecast file
and an
>>>> observation file containing the observations that should be used
to
>>>> verify that forecast.  So you'll need to know which obs
>>>> files go with which forecast files.  Point-Stat reads the timing
>>>> information from the forecast file and then defines a time window
>>>> around
>>>> that valid time (see "obs_window" setting in the Point-Stat
>>>> config file and see data/config/README for details).  Any point
>>>> observations falling within that time window will be used.  Any
point
>>>> observations falling outside that time window will be skipped
over.
>>>>
>>>> If you were using GRIB forecast files, you could actually set
this up
>>>> in
>>>> such a way as to verify all of the output times in a single call
to
>>>> Point-Stat.  You'd literally 'cat' together all of the GRIB
>>>> files and then set up a more complex configuration file to tell
>>>> Point-Stat what to do.  I suppose you could do the same using
NetCDF if
>>>> you defined a bunch of different 2D variables, each for precip
>>>> with a different time (e.g. APCP_06_2012020500,
APCP_06_2012020506,
>>>> ...).
>>>> Or you can keep all the NetCDF files separate and just loop
through
>>>> them
>>>> in a script.  That's what we typically do.
>>>>
>>>> In order to create a NetCDF file that MET can handle you
basically need
>>>> to structure it like the NetCDF output of the pcp_combine tool:
>>>> (1) 2 dimensions named lat and lon.
>>>> (2) The lat and lon variables are *NOT* required.  MET doesn't
actually
>>>> use them.  We put them in there since other plotting tools (like
IDV)
>>>> use
>>>> them.
>>>> (3) The data variables should be 2 dimensional, defined (lat,
lon).
>>>> (4) Need data variable attributes specifying timing information.
>>>> (5) MET expects bad data to be encoded as -9999.
>>>> (6) Need global attributes specifying grid definition
information.
>>>> (7) Need "MET_version" global attribute.
>>>>
>>>> Ultimately, we'd like to support CF-compliant NetCDF files, but
we
>>>> haven't gotten there yet.  For now, we're stuck with this rather
>>>> restrictive NetCDF format that we use internally.
>>>>
>>>> If you need help setting up the global attributes, just send me a
>>>> description of the grid you're using.  Give it a shot and if you
run
>>>> into
>>>> problems, just send me a sample NetCDF file and I can let
>>>> you know what changes are needed.
>>>>
>>>> Thanks,
>>>> John
>>>>
>>>> On 07/12/2013 09:23 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via
RT
>>>> wrote:
>>>>>
>>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>>
>>>>> Hi, thank you. Your reply clarifies a lot.
>>>>>
>>>>> 2 questions:
>>>>> You wrote, "Point-Stat is run once per valid time..."
>>>>> I have 40 ascii files that hold 12-day daily precipitation
>>>>> accumulations
>>>>> at each station. (1) Do the ascii files need to be once per
valid time
>>>>> or
>>>>> can I still use the 40 12-day files?
>>>>>
>>>>> Before I discovered MET, I went through the trouble of writing
my own
>>>>> scripts that do the same thing as p_interp and destagger the
winds.
>>>>> Also,
>>>>> I found an easy way to regrid netcdf files, both observed and
wrf-arw
>>>>> files to the same grid, using CDO. Details of the command line
are
>>>>> here,
>>>>> where the 2nd-to-last-entry provides the best explanation
>>>>> (https://code.zmaw.de/boards/1/topics/1051#message-1056)
>>>>> What does the netcdf file need to have in order to still be used
in
>>>>> MET
>>>>> (point-stat, etc.) Do the lats and lons still need to be 2-D
>>>>> (lon=XLAT,XLONG;lat=XLAT,XLONG) or can they be 1D (lat = lat;lon
=
>>>>> lon)
>>>>>
>>>>> Thank you.
>>>>> Erik
>>>>>
>>>>> On 7/11/13 11:54 AM, "John Halley Gotway via RT"
<met_help at ucar.edu>
>>>>> wrote:
>>>>>
>>>>>> Erik,
>>>>>>
>>>>>> We appreciate the feedback!  It's been a long road getting MET
out
>>>>>> the
>>>>>> door and supporting it, but it's nice to hear that it's useful
to the
>>>>>> modelling community.
>>>>>>
>>>>>> I see that you're asking about the support in MET for the
NetCDF
>>>>>> output
>>>>>> of the pinterp utility.  The output of pinterp is a gridded
NetCDF
>>>>>> file
>>>>>> that the MET tools do support, but with some
>>>>>> limitations.  WRF is run on a staggered grid with the wind
fields
>>>>>> staggered, while the mass points are on a regular grid.  The
pinterp
>>>>>> output is indeed on pressure levels, but the wind points are
>>>>>> still staggered, and MET is unable to read them.  So basically,
using
>>>>>> pinterp is not a good choice if you're interested in winds.
But as
>>>>>> long
>>>>>> as you're not using winds, the gridded pinterp output can
>>>>>> be used is any place in MET that GRIB1 or GRIB2 is used.
>>>>>>
>>>>>> The other big drawback to pinterp is that it's NetCDF, and
therefore,
>>>>>> it's not easy to regrid.  When doing grid-to-grid comparisons,
you
>>>>>> need
>>>>>> to put the forecast and observation fields on a common
>>>>>> grid.  That's easy to do for GRIB using the copygb utility, but
not
>>>>>> easy
>>>>>> in general for NetCDF.
>>>>>>
>>>>>> So the other WRF post-processing alternative is the Unified
>>>>>> PostProcessor
>>>>>> (UPP).  Its output is in GRIB1 format, which MET fully
supports.  If
>>>>>> possible, I'd suggest using UPP instead of pinterp to
>>>>>> avoid the staggered grid and regridding limitations of NetCDF.
>>>>>> Support
>>>>>> for UPP is provided via wrfhelp at ucar.edu.
>>>>>>
>>>>>> But to get to your question...
>>>>>>
>>>>>> If you're dealing with point observations, the only MET tool to
be
>>>>>> used
>>>>>> is the Point-Stat tool.  I suggest running Point-Stat to
compare the
>>>>>> output of pinterp (or UPP, if you switch) to the point
>>>>>> observation output of ASCII2NC.  With only 40 points, I'd
suggest
>>>>>> writing
>>>>>> out the matched pair (MPR) line type from Point-Stat.  Point-
Stat is
>>>>>> run
>>>>>> once per valid time, however, you can then use the
>>>>>> STAT-Analysis tool to aggregate results through time.  Suppose
you've
>>>>>> run
>>>>>> Point-Stat over a month of output for those 40 stations.  You
could
>>>>>> run
>>>>>> STAT-Analysis to aggregate together that month of
>>>>>> Point-Stat output and compute monthly statistics for each of
the 40
>>>>>> stations individually.
>>>>>>
>>>>>> Hopefully that helps.  If you get stuck on setting up the
Point-Stat
>>>>>> configuration file, just let me know and I'll be happy to help
you
>>>>>> get
>>>>>> it
>>>>>> going.
>>>>>>
>>>>>> Thanks,
>>>>>> John Halley Gotway
>>>>>> met_help at ucar.edu
>>>>>>
>>>>>> On 07/10/2013 09:45 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via
RT
>>>>>> wrote:
>>>>>>>
>>>>>>> Wed Jul 10 09:45:10 2013: Request 62175 was acted upon.
>>>>>>> Transaction: Ticket created by erik.noble at nasa.gov
>>>>>>>            Queue: met_help
>>>>>>>          Subject: MET 4 (or 4.1) question(s)
>>>>>>>            Owner: Nobody
>>>>>>>       Requestors: erik.noble at nasa.gov
>>>>>>>           Status: new
>>>>>>>      Ticket <URL:
>>>>>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>>>>
>>>>>>>
>>>>>>> Hi.
>>>>>>> First, to the developer, thank you for providing MET to the
world.
>>>>>>> It
>>>>>>> looks outstanding and very useful. If you don't get enough
thanks, I
>>>>>>> hope
>>>>>>> just one more "thank you" makes your day.
>>>>>>>
>>>>>>> I installed MET 4.1 on a Linux.
>>>>>>> I am ready to start using MET to compare 40 station
observations of
>>>>>>> daily
>>>>>>> accumulated precip to 64 WRF-ARW simulations for the same
area.
>>>>>>> I have read the MET 4.1 manual for the past week and gone
through
>>>>>>> all
>>>>>>> the
>>>>>>> examples.
>>>>>>>
>>>>>>> But I am very confused about one thing:
>>>>>>>
>>>>>>> Once a user uses p_interp to process WRF-ARW output, where do
they
>>>>>>> go
>>>>>>> from
>>>>>>> there?
>>>>>>> The p_interp output is still netcdf (variables are on pressure
>>>>>>> level)
>>>>>>> yet
>>>>>>> most of the modules want grib. Yet, in the second chapter, the
>>>>>>> manual states that p_interp netcdf data can be used. How?
>>>>>>> How do I feed p_interp processed data into the other MET
>>>>>>> modules, particularly if I want to do the above, compare station
>>>>>>> data to WRF?
>>>>>>> Perhaps I missed it?
>>>>>>>
>>>>>>>
>>>>>>> I used ascii2nc to process the station data, so they are
>>>>>>> ready. I have the
>>>>>>> wrf-arw p_interp processed data. How do I put the wrf-data
into the
>>>>>>> point-stat module?
>>>>>>>
>>>>>>> If it needs to still be in grib, what do I do with the p_interp
>>>>>>> results? I could have skipped the p_interp step and just run
>>>>>>> the wrf-arw
>>>>>>> output through UPP, right? I am confused.
>>>>>>>
>>>>>>> If a MET user or developer could help me understand the next
>>>>>>> step, that would be greatly appreciated.
>>>>>>>
>>>>>>> In addition, has anyone successfully installed this software
on an
>>>>>>> Apple
>>>>>>> machine?
>>>>>>>
>>>>>>> Sincerely,
>>>>>>> Erik Noble
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>

------------------------------------------------
Subject: MET 4 (or 4.1) question(s)
From: Noble, Erik U.[COLUMBIA UNIVERSITY]
Time: Mon Jul 15 13:03:22 2013

Hi. Attached is the TRMM file I grabbed from TOVAS; the file name is
descriptive, it is one ascii file that has daily trmm data over a
region
of interest, yet it is 12-day's worth.
To use TRMM data with MET, it seems like it would be better for me to
use
12 separate ascii files of daily TRMM data (using trmm2nc.R)? Is this
right?
-Erik

On 7/15/13 1:58 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
wrote:

>Erik,
>
>MET uses a NetCDF point observation file format (output of pb2nc,
>ascii2nc, and madis2nc), and a gridded NetCDF file format (output of
>pcp_combine and gen_poly_mask).  Those file formats differ.
>
>For gridded data, you need to make it look like the output of
>pcp_combine.  If you wanted to put multiple times into the same
NetCDF
>file, it might look something like this:
>
>             float APCP_12_2013071500(lat, lon) ;
>                   APCP_12:name = "APCP_12" ;
>                   APCP_12:long_name = "Total precipitation" ;
>                   APCP_12:level = "A12" ;
>                   APCP_12:units = "kg/m^2" ;
>                   APCP_12:_FillValue = -9999.f ;
>                   APCP_12:init_time = "20130714_000000" ;
>                   APCP_12:init_time_ut = 1373760000 ;
>                   APCP_12:valid_time = "20130715_000000" ;
>                   APCP_12:valid_time_ut = 1373846400 ;
>                   APCP_12:accum_time = "120000" ;
>                   APCP_12:accum_time_sec = 43200 ;
>             float APCP_12_2013071512(lat, lon) ;
>                   APCP_12:name = "APCP_12" ;
>                   APCP_12:long_name = "Total precipitation" ;
>                   APCP_12:level = "A12" ;
>                   APCP_12:units = "kg/m^2" ;
>                   APCP_12:_FillValue = -9999.f ;
>                   APCP_12:init_time = "20130714_000000" ;
>                   APCP_12:init_time_ut = 1373803200 ;
>                   APCP_12:valid_time = "20130715_120000" ;
>                   APCP_12:valid_time_ut = 1373889600 ;
>                   APCP_12:accum_time = "120000" ;
>                   APCP_12:accum_time_sec = 43200 ;
>
>In the above example, I've included 2 separate variables of 12-hour
>accumulated precip.  Assume that we initialized a model run at
>20130714_000000 (see init_time and init_time_ut variable attributes).
>  This first variable (APCP_12_2013071500) is the 12-hour
accumulation
>valid at 20130715_000000.  So that'd be between forecast hours 12 and
24.
> The second variable (APCP_12_2013071512) is the next
>12-hour accumulation, valid at 20130715_120000.  So that'd be between
>forecast hours 24 and 36.  And you could add as many of these
variables
>as you'd like.
>
>Structuring the NetCDF file this way should enable you to verify them
>all in a single call to Point-Stat; however, as I mentioned, the config
>file will get messier.  You'd have to list each NetCDF variable
>name separately...
>
>    cat_thresh = [ >0.0 ];
>
>    field = [
>       { name = "APCP_12_2013071500"; level = [ "(*,*)" ]; },
>       { name = "APCP_12_2013071512"; level = [ "(*,*)" ]; }
>    ];
>
>FYI, if you're not familiar with unixtime (that's what the _ut stands
>for), here's how you can use the Unix date command to compute it:
>    # Go from YYYY-MM-DD HH:MM:SS to unixtime
>    date -ud ''2013-07-14' UTC '00:00:00'' +%s
>
>    # Go from unixtime to YYYY-MM-DD HH:MM:SS
>    date -ud '1970-01-01 UTC '1373760000' seconds' +"%Y-%m-%d
%H:%M:%S"
>
>Just let me know if more questions arise.
>
>Thanks,
>John
>
>On 07/15/2013 08:41 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>>
>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>
>> Thank you.
>>
>> One last question: attached is the ascii version of the station
>> file, and the netcdf version of the file from the ascii2nc script.
>> The attributes don't give time information. I would not have known
about
>> this without your email. Will this be a problem? What needs to be
>>changed?
>>
>> Sincerely,
>> Erik
>>
>>
>>
>>> ncdump -h NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt.nc
>> netcdf NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt {
>> dimensions:
>> 	mxstr = 30 ;
>> 	hdr_arr_len = 3 ;
>> 	obs_arr_len = 5 ;
>> 	nhdr = 12 ;
>> 	nobs = UNLIMITED ; // (12 currently)
>> variables:
>> 	char hdr_typ(nhdr, mxstr) ;
>> 		hdr_typ:long_name = "message type" ;
>> 	char hdr_sid(nhdr, mxstr) ;
>> 		hdr_sid:long_name = "station identification" ;
>> 	char hdr_vld(nhdr, mxstr) ;
>> 		hdr_vld:long_name = "valid time" ;
>> 		hdr_vld:units = "YYYYMMDD_HHMMSS UTC" ;
>> 	float hdr_arr(nhdr, hdr_arr_len) ;
>> 		hdr_arr:long_name = "array of observation station header values"
;
>> 		hdr_arr:_fill_value = -9999.f ;
>> 		hdr_arr:columns = "lat lon elv" ;
>> 		hdr_arr:lat_long_name = "latitude" ;
>> 		hdr_arr:lat_units = "degrees_north" ;
>> 		hdr_arr:lon_long_name = "longitude" ;
>> 		hdr_arr:lon_units = "degrees_east" ;
>> 		hdr_arr:elv_long_name = "elevation " ;
>> 		hdr_arr:elv_units = "meters above sea level (msl)" ;
>> 	char obs_qty(nobs, mxstr) ;
>> 		obs_qty:long_name = "quality flag" ;
>> 	float obs_arr(nobs, obs_arr_len) ;
>> 		obs_arr:long_name = "array of observation values" ;
>> 		obs_arr:_fill_value = -9999.f ;
>> 		obs_arr:columns = "hdr_id gc lvl hgt ob" ;
>> 		obs_arr:hdr_id_long_name = "index of matching header data" ;
>> 		obs_arr:gc_long_name = "grib code corresponding to the
observation
>>type"
>> ;
>> 		obs_arr:lvl_long_name = "pressure level (hPa) or accumulation
interval
>> (sec)" ;
>> 		obs_arr:hgt_long_name = "height in meters above sea level or
ground
>> level (msl or agl)" ;
>> 		obs_arr:ob_long_name = "observation value" ;
>>
>> // global attributes:
>> 		:FileOrigins = "File
>> NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt.nc generated
>> 20130709_215434 UTC on host borgb113 by the MET ascii2nc tool" ;
>> 		:MET_version = "V4.1" ;
>> 		:MET_tool = "ascii2nc" ;
>> }
>>
>>
>>
>>
>> On 7/12/13 6:14 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
>>wrote:
>>
>>> Erik,
>>>
>>> I see that the variables are dimensioned like this: ACC_RAIN(time,
lat,
>>> lon).  The problem is that the MET tools won't read the timing
>>> information from the "time" dimension as you'd probably expect.
>>> Instead, it'll try to read the timing information from the
variable
>>> attributes.  For example, the output of pcp_combine looks like
this:
>>>          float APCP_12(lat, lon) ;
>>>                  APCP_12:name = "APCP_12" ;
>>>                  APCP_12:long_name = "Total precipitation" ;
>>>                  APCP_12:level = "A12" ;
>>>                  APCP_12:units = "kg/m^2" ;
>>>                  APCP_12:_FillValue = -9999.f ;
>>>                  APCP_12:init_time = "20050807_000000" ;
>>>                  APCP_12:init_time_ut = 1123372800 ;
>>>                  APCP_12:valid_time = "20050807_120000" ;
>>>                  APCP_12:valid_time_ut = 1123416000 ;
>>>                  APCP_12:accum_time = "120000" ;
>>>                  APCP_12:accum_time_sec = 43200 ;
>>>
>>> It reads the model initialization time (init_time_ut), valid time
>>> (valid_time_ut), and accumulation interval (accum_time_sec) from
those
>>> variable attributes.  Your NetCDF files follow the CF-1.4
>>> convention, but unfortunately MET doesn't have the ability to
handle
>>>that
>>> yet.
>>>
>>> If you want to have multiple times in the file, they'd need to be
>>>stored
>>> in separate variables, with the corresponding timing information
>>>defined
>>> in the variable attribute section.
>>>
>>> So unfortunately, it's messier than we'd like.  I do see that
you're
>>> using TRMM data, and we do have an Rscript on our website that
>>>reformats
>>> ASCII TRMM data into the flavor of NetCDF that MET is
>>> expecting.  You can find information about that here (trmm2nc.R):
>>>
http://www.dtcenter.org/met/users/downloads/observation_data.php
>>>
>>> Let me know how you'd like to proceed, and what I can do to help.
>>>
>>> Thanks,
>>> John
>>>
>>>
>>> On 07/12/2013 01:15 PM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
>>>wrote:
>>>>
>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>
>>>> Wow, thank you.
>>>> I think I understand now. Your description of the point-ascii
files is
>>>> clear and I think I can proceed.
>>>>
>>>> In your description of the netcdf files you wrote:
>>>> (1) 2 dimensions named lat and lon.
>>>> (2) The lat and lon variables are *NOT* required. . .
>>>> (3) The data variables should be 2 dimensional, defined (lat,
lon).
>>>> (4) Need data variable attributes specifying timing information.
>>>> (5) MET expects bad data to be encoded as -9999.
>>>> (6) Need global attributes specifying grid definition
information.
>>>> (7) Need "MET_version" global attribute.
>>>>
>>>>
>>>> I have all of that except 4, 6 and 7.
>>>>
>>>> A description of the netcdf files I have is below. They are on a
>>>> 0.25°x0.25° grid.
>>>> The variable in the WRF file is:  ACC_RAIN(time, lat, lon)
>>>> The variable in the TRMM file is: ACC_RAIN(time, lat, lon)
>>>>
>>>> 	where they are both on the same 0.25°x0.25° grid. time =
12, and
>>>> lat
>>>> and lon are 1D.
>>>>
>>>> First, where would global attributes specifying grid definition
>>>> information go?
>>>> Is the timing info ok?
>>>> By Met version, I just need to add "Metv4.1" ?
>>>>
>>>> Thank you.
>>>> -Erik
>>>>
>>>>
>>>> *****************
>>>> Grid netcdf file description (0.25x0.25_grid_template.nc)
>>>>
>>>> *****************
>>>> 1 #
>>>>
>>>>
>>>>     2 # gridID 0
>>>>     3 #
>>>>     4 gridtype  = lonlat
>>>>     5 gridsize  = 61600
>>>>     6 xname     = lon
>>>>     7 xlongname = longitude
>>>>     8 xunits    = degrees_east
>>>>     9 yname     = lat
>>>>    10 ylongname = latitude
>>>>    11 yunits    = degrees_north
>>>>    12 xsize     = 280
>>>>    13 ysize     = 220
>>>>    14 xfirst    = -34.875
>>>>    15 xinc      = 0.25
>>>>    16 yfirst    = -19.875
>>>>    17 yinc      = 0.25
>>>>
>>>>
>>>> ****************
>>>>    netcdf file: Grid template
>>>> ****************
>>>> $ ncdump -h 0.25x0.25_grid_template.nc
>>>> netcdf \0.25x0.25_grid_template {
>>>> dimensions:
>>>> 	lon = 280 ;
>>>> 	lat = 220 ;
>>>> variables:
>>>> 	double lon(lon) ;
>>>> 		lon:standard_name = "longitude" ;
>>>> 		lon:long_name = "longitude" ;
>>>> 		lon:units = "degrees_east" ;
>>>> 		lon:axis = "X" ;
>>>> 	double lat(lat) ;
>>>> 		lat:standard_name = "latitude" ;
>>>> 		lat:long_name = "latitude" ;
>>>> 		lat:units = "degrees_north" ;
>>>> 		lat:axis = "Y" ;
>>>> 	float random(lat, lon) ;
>>>>
>>>> // global attributes:
>>>> 		:CDI = "Climate Data Interface version 1.5.5
>>>> (http://code.zmaw.de/projects/cdi)" ;
>>>> 		:Conventions = "CF-1.4" ;
>>>> 		:history = "Tue May 22 15:38:08 2012: cdo -f nc
>>>> -sellonlatbox,-35,35,-20,35 -random,global_0.25
>>>> 0.25x0.25_grid_template.nc" ;
>>>> 		:CDO = "Climate Data Operators version 1.5.5
>>>> (http://code.zmaw.de/projects/cdo)" ;
>>>> }
>>>> ****************
>>>> TRMM netcdf file (12 days, daily accumulated precip)
>>>> ****************
>>>> gs611-noble:precip_work eunoble$ ncdump -h
>>>> 0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily.nc
>>>> netcdf \0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily {
>>>> dimensions:
>>>> 	lon = 280 ;
>>>> 	lat = 220 ;
>>>> 	time = UNLIMITED ; // (12 currently)
>>>> variables:
>>>> 	double lon(lon) ;
>>>> 		lon:standard_name = "longitude" ;
>>>> 		lon:long_name = "longitude" ;
>>>> 		lon:units = "degrees_east" ;
>>>> 		lon:axis = "X" ;
>>>> 	double lat(lat) ;
>>>> 		lat:standard_name = "latitude" ;
>>>> 		lat:long_name = "latitude" ;
>>>> 		lat:units = "degrees_north" ;
>>>> 		lat:axis = "Y" ;
>>>> 	double time(time) ;
>>>> 		time:standard_name = "time" ;
>>>> 		time:units = "hours since 1997-01-01 00:00:00" ;
>>>> 		time:calendar = "standard" ;
>>>> 	float ACC_RAIN(time, lat, lon) ;
>>>> 		ACC_RAIN:long_name = "TRMM_3B42 3-hourly accumulation" ;
>>>> 		ACC_RAIN:units = "mm" ;
>>>> 		ACC_RAIN:_FillValue = -9999.f ;
>>>>
>>>> // global attributes:
>>>> 		:CDI = "Climate Data Interface version 1.5.5
>>>> (http://code.zmaw.de/projects/cdi)" ;
>>>> 		:Conventions = "CF-1.4" ;
>>>> 		:history = "Thu Oct 25 20:37:02 2012: cdo
>>>>
>>>>
>>>>remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25
>>>>x0
>>>> .2
>>>> 5_grid_template.nc -sellonlatbox,-35,35,-20,35
-seltimestep,2/13/1
>>>> TRMM_accum_precip_daily.nc
>>>> TRMM_precip_daily_accum-0902-0914_0.25_grid.nc\n",
>>>> 			"Wed Oct 24 17:01:05 2012: cdo daysum
-chname,PRC_ACCUM,ACC_RAIN,
>>>> -selname,PRC_ACCUM mergefile_accum.nc
TRMM_accum_precip_daily.nc\n",
>>>> 			"Wed Oct 24 16:54:32 2012: ncks -v PRC_ACCUM mergefile.nc
>>>> mergefile_accum.nc\n",
>>>> 			"Wed Oct 24 16:52:44 2012: ncrcat 3B42.060901.0.6.HDF.nc
<snip>
>>>> 3B42.060930.9.6.HDF.nc mergefile.nc" ;
>>>> 		:creation_date = "Fri Jun  1 18:14:08 EDT 2012" ;
>>>> 		:info = "\n",
>>>> 			"The 3B-42 estimates are scaled to match the monthly rain
gauge
>>>> analyses\n",
>>>> 			"used in 3B-43.The output is rainfall for 0.25x0.25 degree
grid
>>>>boxes
>>>> \n",
>>>> 			"every 3 hours.\n",
>>>> 			"" ;
>>>> 		:description = "\n",
>>>>
>>>>
>>>>
"http://disc.sci.gsfc.nasa.gov/precipitation/documentation/TRMM_READ
>>>>ME
>>>> /T
>>>> RMM_3B42_readme.shtml\n",
>>>> 			"" ;
>>>> 		:ftp = "\n",
>>>>
>>>>
>>>>
"http://disc.sci.gsfc.nasa.gov/data/datapool/TRMM_DP/01_Data_Product
>>>>s/
>>>> 02
>>>> _Gridded/06_3-hour_Gpi_Cal_3B_42\n",
>>>> 			"" ;
>>>> 		:title = "TRMM_3B42" ;
>>>> 		:nco_openmp_thread_number = 1 ;
>>>> 		:NCO = "4.1.0" ;
>>>> 		:CDO = "Climate Data Operators version 1.5.5
>>>> (http://code.zmaw.de/projects/cdo)" ;
>>>>
>>>>
>>>>
>>>> ************
>>>> WRF FILE (processed,12 days, daily accumulated precip)
>>>> ************
>>>> $ ncdump -h 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc
>>>> netcdf \0.25x0.25_grid_Experiment_01_ACC_RAIN_daily {
>>>> dimensions:
>>>> 	lon = 280 ;
>>>> 	lat = 220 ;
>>>> 	time = UNLIMITED ; // (12 currently)
>>>> variables:
>>>> 	double lon(lon) ;
>>>> 		lon:standard_name = "longitude" ;
>>>> 		lon:long_name = "longitude" ;
>>>> 		lon:units = "degrees_east" ;
>>>> 		lon:axis = "X" ;
>>>> 	double lat(lat) ;
>>>> 		lat:standard_name = "latitude" ;
>>>> 		lat:long_name = "latitude" ;
>>>> 		lat:units = "degrees_north" ;
>>>> 		lat:axis = "Y" ;
>>>> 	double time(time) ;
>>>> 		time:standard_name = "time" ;
>>>> 		time:units = "hours since 2006-09-02 00:00:00" ;
>>>> 		time:calendar = "standard" ;
>>>> 	float ACC_RAIN(time, lat, lon) ;
>>>> 		ACC_RAIN:standard_name = "convective_precipitation_amount" ;
>>>> 		ACC_RAIN:long_name = "Accumulated Total Cumulus Precipitation"
;
>>>> 		ACC_RAIN:units = "mm" ;
>>>> 		ACC_RAIN:_FillValue = -9.e+33f ;
>>>>
>>>> // global attributes:
>>>> 		:CDI = "Climate Data Interface version 1.5.5
>>>> (http://code.zmaw.de/projects/cdi)" ;
>>>> 		:Conventions = "CF-1.1" ;
>>>> 		:history = "Wed May 23 14:30:35 2012: cdo chname,RAINC,ACC_RAIN
-sub
>>>> -seltimestep,2/13/1 temp_RAIN.nc -seltimestep,1/12/1 temp_RAIN.nc
>>>> 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc\n",
>>>> 			"Wed May 23 14:30:34 2012: cdo add -selname,RAINC temp1.nc
>>>> -selname,RAINNC temp1.nc temp_RAIN.nc\n",
>>>> 			"Wed May 23 14:30:34 2012: cdo seltimestep,1/97/8
>>>> -selname,RAINC,RAINNC
>>>> 0.25x0.25_grid_Experiment_01.nc temp1.nc\n",
>>>> 			"Wed May 23 14:30:03 2012: cdo -P 4
>>>>
>>>>
>>>>remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25
>>>>x0
>>>> .2
>>>> 5_grid_template.nc -sellonlatbox,-35,35,-20,35 Experiment_01.nc
>>>> 0.25x0.25_grid_Experiment_01.nc\n",
>>>> 			"Wed May 23 14:27:04 2012: ncks -v
>>>> u_gr_p,v_gr_p,T_p,rh_p,q_p,precip_c,precip_g,LHFlx
>>>> post_wrfout_d01_2006-09-02_00:00:00.nc
>>>> temp_post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>>>> 		:institution = "NASA GODDARD INSTITUTE FOR SPACE STUDIES -
CIRES" ;
>>>> 		:title = "post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>>>> 		:notes = "Created with NCL script:  wrfout_to_cf_Erik.ncl v1.0"
;
>>>> 		:created_by = "Erik Noble - erik.noble at nasa.gov" ;
>>>> 		:creation_date = "Thu Dec  8 02:28:45 EST 2011" ;
>>>> 		:NCO = "4.1.0" ;
>>>> 		:CDO = "Climate Data Operators version 1.5.5
>>>> (http://code.zmaw.de/projects/cdo)" ;
>>>>
>>>>
>>>>
>>>> On 7/12/13 1:58 PM, "John Halley Gotway via RT"
<met_help at ucar.edu>
>>>> wrote:
>>>>
>>>>> Erik,
>>>>>
>>>>> The timing of the data within the ASCII files doesn't matter.
So
>>>>>using
>>>>> 40 12-day files is fine.
>>>>>
>>>>> But each time you run Point-Stat, you'll pass it a forecast file
and
>>>>>an
>>>>> observation file containing the observations that should be used
to
>>>>> verify that forecast.  So you'll need to know which obs
>>>>> files go with which forecast files.  Point-Stat reads the timing
>>>>> information from the forecast file and then defines a time
window
>>>>> around
>>>>> that valid time (see "obs_window" setting in the Point-Stat
>>>>> config file and see data/config/README for details).  Any point
>>>>> observations falling within that time window will be used.  Any
point
>>>>> observations falling outside that time window will be skipped
over.
>>>>>
>>>>> If you were using GRIB forecast files, you could actually set
this up
>>>>> in
>>>>> such a way as to verify all of the output times in a single call
to
>>>>> Point-Stat.  You'd literally 'cat' together all of the GRIB
>>>>> files and then set up a more complex configuration file to tell
>>>>> Point-Stat what to do.  I suppose you could do the same using
NetCDF
>>>>>if
>>>>> you defined a bunch of different 2D variables, each for precip
>>>>> with a different time (e.g. APCP_06_2012020500,
APCP_06_2012020506,
>>>>> ...).
>>>>> Or you can keep all the NetCDF files separate and just loop
through
>>>>> them
>>>>> in a script.  That's what we typically do.
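That loop-through-the-files approach might be sketched as follows (an editorial dry run that only prints the commands; the forecast file naming scheme, config file name, and output directory are hypothetical, while the observation file name is the one from later in this thread):

```python
# Dry-run sketch of looping Point-Stat over separate daily NetCDF files.
# The wrf_ACC_RAIN_* names, PointStatConfig, and out/ are assumptions.
days = ["200609%02d" % d for d in range(2, 14)]  # 2006-09-02 .. 2006-09-13
obs = "NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt.nc"
cmds = []
for day in days:
    fcst = "wrf_ACC_RAIN_%s.nc" % day           # hypothetical forecast file
    cmds.append("point_stat %s %s PointStatConfig -outdir out/%s"
                % (fcst, obs, day))
for c in cmds:
    print(c)  # swap print for subprocess.run(c.split()) to actually execute
```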
>>>>>
>>>>> In order to create a NetCDF file that MET can handle you
basically
>>>>>need
>>>>> to structure it like the NetCDF output of the pcp_combine tool:
>>>>> (1) 2 dimensions named lat and lon.
>>>>> (2) The lat and lon variables are *NOT* required.  MET doesn't
>>>>>actually
>>>>> use them.  We put them in there since other plotting tools (like
IDV)
>>>>> use
>>>>> them.
>>>>> (3) The data variables should be 2 dimensional, defined (lat,
lon).
>>>>> (4) Need data variable attributes specifying timing information.
>>>>> (5) MET expects bad data to be encoded as -9999.
>>>>> (6) Need global attributes specifying grid definition
information.
>>>>> (7) Need "MET_version" global attribute.
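Putting items (1), (3), (4), (6), and (7) together for the 0.25° lat/lon grid described elsewhere in this thread, a minimal header might look like the CDL sketch below. The grid attribute names follow what pcp_combine writes for a "LatLon" projection in MET v4.1, but this is an illustrative reconstruction; values and attribute spellings should be checked against an `ncdump -h` of an actual pcp_combine output file:

```
netcdf wrf_acc_rain_day1 {
dimensions:
	lat = 220 ;
	lon = 280 ;
variables:
	float ACC_RAIN_24(lat, lon) ;
		ACC_RAIN_24:name = "ACC_RAIN_24" ;
		ACC_RAIN_24:long_name = "Daily accumulated precipitation" ;
		ACC_RAIN_24:level = "A24" ;
		ACC_RAIN_24:units = "mm" ;
		ACC_RAIN_24:_FillValue = -9999.f ;
		ACC_RAIN_24:init_time = "20060902_000000" ;
		ACC_RAIN_24:init_time_ut = 1157155200 ;
		ACC_RAIN_24:valid_time = "20060903_000000" ;
		ACC_RAIN_24:valid_time_ut = 1157241600 ;
		ACC_RAIN_24:accum_time = "240000" ;
		ACC_RAIN_24:accum_time_sec = 86400 ;

// global attributes:
		:MET_version = "V4.1" ;
		:Projection = "LatLon" ;
		:lat_ll = "-19.875 degrees_north" ;
		:lon_ll = "-34.875 degrees_east" ;
		:delta_lat = "0.25 degrees" ;
		:delta_lon = "0.25 degrees" ;
		:Nlat = "220 grid_points" ;
		:Nlon = "280 grid_points" ;
}
```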
>>>>>
>>>>> Ultimately, we'd like to support CF-compliant NetCDF files, but
we
>>>>> haven't gotten there yet.  For now, we're stuck with this rather
>>>>> restrictive NetCDF format that we use internally.
>>>>>
>>>>> If you need help setting up the global attributes, just send me
a
>>>>> description of the grid you're using.  Give it a shot and if you
run
>>>>> into
>>>>> problems, just send me a sample NetCDF file and I can let
>>>>> you know what changes are needed.
>>>>>
>>>>> Thanks,
>>>>> John
>>>>>
>>>>> On 07/12/2013 09:23 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via
RT
>>>>> wrote:
>>>>>>
>>>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>>>
>>>>>> Hi, thank you. Your reply clarifies a lot.
>>>>>>
>>>>>> 2 questions:
>>>>>> You wrote, "Point-Stat is run once per valid time..."
>>>>>> I have 40 ascii files that hold 12-day daily precipitation
>>>>>> accumulations
>>>>>> at each station. (1) Do the ascii files need to be once per
valid
>>>>>>time
>>>>>> or
>>>>>> can I still use the 40 12-day files?
>>>>>>
>>>>>> Before I discovered MET, I went through the trouble of writing
my
>>>>>>own
>>>>>> scripts that do the same thing as p_interp and destagger the
winds.
>>>>>> Also,
>>>>>> I found an easy way to regrid netcdf files, both observed and
>>>>>>wrf-arw
>>>>>> files to the same grid, using CDO. Details of the command line
are
>>>>>> here,
>>>>>> where the 2nd-to-last-entry provides the best explanation
>>>>>> (https://code.zmaw.de/boards/1/topics/1051#message-1056)
>>>>>> What does the netcdf file need to have in order to still be
used in
>>>>>> MET
>>>>>> (point-stat, etc.) Do the lats and lons still need to be 2-D
>>>>>> (lon=XLAT,XLONG;lat=XLAT,XLONG) or can they be 1D (lat =
lat;lon =
>>>>>> lon)
>>>>>>
>>>>>> Thank you.
>>>>>> Erik
>>>>>>
>>>>>> On 7/11/13 11:54 AM, "John Halley Gotway via RT"
<met_help at ucar.edu>
>>>>>> wrote:
>>>>>>
>>>>>>> Erik,
>>>>>>>
>>>>>>> We appreciate the feedback!  It's been a long road getting MET
out
>>>>>>> the
>>>>>>> door and supporting it, but it's nice to hear that it's useful
to
>>>>>>>the
>>>>>>> modelling community.
>>>>>>>
>>>>>>> I see that you're asking about the support in MET for the
NetCDF
>>>>>>> output
>>>>>>> of the pinterp utility.  The output of pinterp is a gridded
NetCDF
>>>>>>> file
>>>>>>> that the MET tools do support, but with some
>>>>>>> limitations.  WRF is run on a staggered grid with the wind
fields
>>>>>>> staggered, while the mass points are on a regular grid.  The
>>>>>>>pinterp
>>>>>>> output is indeed on pressure levels, but the wind points are
>>>>>>> still staggered, and MET is unable to read them.  So
basically,
>>>>>>>using
>>>>>>> pinterp is not a good choice if you're interested in winds.
But as
>>>>>>> long
>>>>>>> as you're not using winds, the gridded pinterp output can
>>>>>>> be used in any place in MET that GRIB1 or GRIB2 is used.
>>>>>>>
>>>>>>> The other big drawback to pinterp is that it's NetCDF, and
>>>>>>>therefore,
>>>>>>> it's not easy to regrid.  When doing grid-to-grid comparisons,
you
>>>>>>> need
>>>>>>> to put the forecast and observation fields on a common
>>>>>>> grid.  That's easy to do for GRIB using the copygb utility,
but not
>>>>>>> easy
>>>>>>> in general for NetCDF.
>>>>>>>
>>>>>>> So the other WRF post-processing alternative is the Unified
>>>>>>> PostProcessor
>>>>>>> (UPP).  Its output is in GRIB1 format, which MET fully
supports.
>>>>>>>If
>>>>>>> possible, I'd suggest using UPP instead of pinterp to
>>>>>>> avoid the staggered grid and regridding limitations of NetCDF.
>>>>>>> Support
>>>>>>> for UPP is provided via wrfhelp at ucar.edu.
>>>>>>>
>>>>>>> But to get to your question...
>>>>>>>
>>>>>>> If you're dealing with point observations, the only MET tool
to be
>>>>>>> used
>>>>>>> is the Point-Stat tool.  I suggest running Point-Stat to
compare
>>>>>>>the
>>>>>>> output of pinterp (or UPP, if you switch) to the point
>>>>>>> observation output of ASCII2NC.  With only 40 points, I'd
suggest
>>>>>>> writing
>>>>>>> out the matched pair (MPR) line type from Point-Stat.  Point-
Stat
>>>>>>>is
>>>>>>> run
>>>>>>> once per valid time, however, you can then use the
>>>>>>> STAT-Analysis tool to aggregate results through time.  Suppose
>>>>>>>you've
>>>>>>> run
>>>>>>> Point-Stat over a month of output for those 40 stations.  You
could
>>>>>>> run
>>>>>>> STAT-Analysis to aggregate together that month of
>>>>>>> Point-Stat output and compute monthly statistics for each of
the 40
>>>>>>> stations individually.
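The aggregation step described above might look like this on the command line (an editorial sketch; the -lookin directory and variable name are assumptions, and the job options should be checked against the MET 4.1 data/config/README):

```
stat_analysis -lookin out \
   -job aggregate_stat -line_type MPR -out_line_type CNT \
   -fcst_var APCP_24
```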
>>>>>>>
>>>>>>> Hopefully that helps.  If you get stuck on setting up the
>>>>>>>Point-Stat
>>>>>>> configuration file, just let me know and I'll be happy to help
you
>>>>>>> get
>>>>>>> it
>>>>>>> going.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> John Halley Gotway
>>>>>>> met_help at ucar.edu
>>>>>>>
>>>>>>> On 07/10/2013 09:45 AM, Noble, Erik U.[COLUMBIA UNIVERSITY]
via RT
>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Wed Jul 10 09:45:10 2013: Request 62175 was acted upon.
>>>>>>>> Transaction: Ticket created by erik.noble at nasa.gov
>>>>>>>>            Queue: met_help
>>>>>>>>          Subject: MET 4 (or 4.1) question(s)
>>>>>>>>            Owner: Nobody
>>>>>>>>       Requestors: erik.noble at nasa.gov
>>>>>>>>           Status: new
>>>>>>>>      Ticket <URL:
>>>>>>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>>>>>
>>>>>>>>
>>>>>>>> Hi.
>>>>>>>> First, to the developer, thank you for providing MET to the
world.
>>>>>>>> It
>>>>>>>> looks outstanding and very useful. If you don't get enough
>>>>>>>>thanks, I
>>>>>>>> hope
>>>>>>>> just one more "thank you" makes your day.
>>>>>>>>
>>>>>>>> I installed MET 4.1 on a Linux.
>>>>>>>> I am ready to start using MET to compare 40 station
observations
>>>>>>>>of
>>>>>>>> daily
>>>>>>>> accumulated precip to 64 WRF-ARW simulations for the same
area.
>>>>>>>> I have read the MET 4.1 manual for the past week and gone
through
>>>>>>>> all
>>>>>>>> the
>>>>>>>> examples.
>>>>>>>>
>>>>>>>> But I am very confused about one thing:
>>>>>>>>
>>>>>>>> Once a user uses p_interp to process WRF-ARW output, where do
they
>>>>>>>> go
>>>>>>>> from
>>>>>>>> there?
>>>>>>>> The p_interp output is still netcdf (variables are on
pressure
>>>>>>>> level)
>>>>>>>> yet
>>>>>>>> most of the modules want grib. Yet, in the second chapter, the
>>>>>>>>manual
>>>>>>>> states that p_interp netcdf data can be used. How?
>>>>>>>> How do I feed p_interp processed data into the other MET
modules,
>>>>>>>> particularly if I want to do the above, compare station data
to
>>>>>>>>WRF.
>>>>>>>> Perhaps I missed it?
>>>>>>>>
>>>>>>>>
>>>>>>>> I used ascii2nc to process the station data, so they are
ready. I
>>>>>>>> have
>>>>>>>> the
>>>>>>>> wrf-arw p_interp processed data. How do I put the wrf-data
into
>>>>>>>>the
>>>>>>>> point-stat module?
>>>>>>>>
>>>>>>>> If it needs to still be in grib, what do I do with the P-interp
>>>>>>>> results? I could have skipped p_interp step and just run the
>>>>>>>>wrf-arw
>>>>>>>> output through UPP, right? I am confused.
>>>>>>>>
>>>>>>>> If a MET user or developer could help me understand the next
step,
>>>>>>>> that
>>>>>>>> would be greatly appreciated.
>>>>>>>>
>>>>>>>> In addition, has anyone successfully installed this software
on an
>>>>>>>> Apple
>>>>>>>> machine?
>>>>>>>>
>>>>>>>> Sincerely,
>>>>>>>> Erik Noble
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>


------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #62175] MET 4 (or 4.1) question(s)
From: John Halley Gotway
Time: Mon Jul 15 14:30:21 2013

Erik,

Yes, that's what I'd do.  I think that'd make things a lot simpler.

Thanks,
John

On 07/15/2013 01:03 PM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>
> Hi. Attached is the TRMM file I grabbed from TOVAS; the file name is
> descriptive, it is one ascii file that has daily trmm data over a
region
> of interest, yet it is 12 days' worth.
> To use TRMM data with MET, it seems like it would be better for me
to use
> 12 separate ascii files of daily TRMM data (using trmm2nc.R)? Is
this
> right?
> -Erik
>
> On 7/15/13 1:58 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
wrote:
>
>> Erik,
>>
>> MET uses a NetCDF point observation file format (output of pb2nc,
>> ascii2nc, and madis2nc), and a gridded NetCDF file format (output
of
>> pcp_combine and gen_poly_mask).  Those file formats differ.
>>
>> For gridded data, you need to make it look like the output of
>> pcp_combine.  If you wanted to put multiple times into the same
NetCDF
>> file, it might look something like this:
>>
>>              float APCP_12_2013071500(lat, lon) ;
>>                    APCP_12_2013071500:name = "APCP_12" ;
>>                    APCP_12_2013071500:long_name = "Total precipitation" ;
>>                    APCP_12_2013071500:level = "A12" ;
>>                    APCP_12_2013071500:units = "kg/m^2" ;
>>                    APCP_12_2013071500:_FillValue = -9999.f ;
>>                    APCP_12_2013071500:init_time = "20130714_000000" ;
>>                    APCP_12_2013071500:init_time_ut = 1373760000 ;
>>                    APCP_12_2013071500:valid_time = "20130715_000000" ;
>>                    APCP_12_2013071500:valid_time_ut = 1373846400 ;
>>                    APCP_12_2013071500:accum_time = "120000" ;
>>                    APCP_12_2013071500:accum_time_sec = 43200 ;
>>              float APCP_12_2013071512(lat, lon) ;
>>                    APCP_12_2013071512:name = "APCP_12" ;
>>                    APCP_12_2013071512:long_name = "Total precipitation" ;
>>                    APCP_12_2013071512:level = "A12" ;
>>                    APCP_12_2013071512:units = "kg/m^2" ;
>>                    APCP_12_2013071512:_FillValue = -9999.f ;
>>                    APCP_12_2013071512:init_time = "20130714_000000" ;
>>                    APCP_12_2013071512:init_time_ut = 1373760000 ;
>>                    APCP_12_2013071512:valid_time = "20130715_120000" ;
>>                    APCP_12_2013071512:valid_time_ut = 1373889600 ;
>>                    APCP_12_2013071512:accum_time = "120000" ;
>>                    APCP_12_2013071512:accum_time_sec = 43200 ;
>>
>> In the above example, I've included 2 separate variables of 12-hour
>> accumulated precip.  Assume that we initialized a model run at
>> 20130714_000000 (see init_time and init_time_ut variable
attributes).
>>   This first variable (APCP_12_2013071500) is the 12-hour
accumulation
>> valid at 20130715_000000.  So that'd be between forecast hours 12
and 24.
>> The second variable (APCP_12_2013071512) is the next
>> 12-hour accumulation, valid at 20130715_120000.  So that'd be
between
>> forecast hours 24 and 36.  And you could add as many of these
variables
>> as you'd like.
>>
>> Structuring the NetCDF file this way should enable you to verify them
all in
>> a single call to Point-Stat, however, as I mentioned the config
file will
>> get messier.  You'd have to list each NetCDF variable
>> name separately...
>>
>>     cat_thresh = [ >0.0 ];
>>
>>     field = [
>>        { name = "APCP_12_2013071500"; level = [ "(*,*)" ]; },
>>        { name = "APCP_12_2013071512"; level = [ "(*,*)" ]; }
>>     ];
>>
>> FYI, if you're not familiar with unixtime (that's what the _ut
stands
>> for), here's how you can use the Unix date command to compute it:
>>     # Go from YYYY-MM-DD HH:MM:SS to unixtime
>>     date -ud '2013-07-14 UTC 00:00:00' +%s
>>
>>     # Go from unixtime to YYYY-MM-DD HH:MM:SS
>>     date -ud '1970-01-01 UTC 1373760000 seconds' +"%Y-%m-%d %H:%M:%S"
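The same conversions can be done in standard-library Python (an editorial sketch, not part of MET):

```python
from datetime import datetime, timezone

def to_unixtime(s):
    # "YYYY-MM-DD HH:MM:SS" (UTC) -> unixtime
    return int(datetime.strptime(s, "%Y-%m-%d %H:%M:%S")
               .replace(tzinfo=timezone.utc).timestamp())

def from_unixtime(ut):
    # unixtime -> "YYYY-MM-DD HH:MM:SS" (UTC)
    return datetime.fromtimestamp(ut, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

print(to_unixtime("2013-07-14 00:00:00"))  # -> 1373760000
print(from_unixtime(1373760000))           # -> 2013-07-14 00:00:00
```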
>>
>> Just let me know if more questions arise.
>>
>> Thanks,
>> John
>>
>> On 07/15/2013 08:41 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via RT
wrote:
>>>
>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>
>>> Thank you.
>>>
>>> One last question: attached is the ascii version of the station
file,
>>> and
>>> netdf version of the file from the ascii2nc script.
>>> The attributes don't give time information. I would not have known
about
>>> this without your email. Will this be a problem? What needs to be
>>> changed?
>>>
>>> Sincerely,
>>> Erik
>>>
>>>
>>>
>>>> ncdump -h NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt.nc
>>> netcdf NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt {
>>> dimensions:
>>> 	mxstr = 30 ;
>>> 	hdr_arr_len = 3 ;
>>> 	obs_arr_len = 5 ;
>>> 	nhdr = 12 ;
>>> 	nobs = UNLIMITED ; // (12 currently)
>>> variables:
>>> 	char hdr_typ(nhdr, mxstr) ;
>>> 		hdr_typ:long_name = "message type" ;
>>> 	char hdr_sid(nhdr, mxstr) ;
>>> 		hdr_sid:long_name = "station identification" ;
>>> 	char hdr_vld(nhdr, mxstr) ;
>>> 		hdr_vld:long_name = "valid time" ;
>>> 		hdr_vld:units = "YYYYMMDD_HHMMSS UTC" ;
>>> 	float hdr_arr(nhdr, hdr_arr_len) ;
>>> 		hdr_arr:long_name = "array of observation station header values"
;
>>> 		hdr_arr:_fill_value = -9999.f ;
>>> 		hdr_arr:columns = "lat lon elv" ;
>>> 		hdr_arr:lat_long_name = "latitude" ;
>>> 		hdr_arr:lat_units = "degrees_north" ;
>>> 		hdr_arr:lon_long_name = "longitude" ;
>>> 		hdr_arr:lon_units = "degrees_east" ;
>>> 		hdr_arr:elv_long_name = "elevation " ;
>>> 		hdr_arr:elv_units = "meters above sea level (msl)" ;
>>> 	char obs_qty(nobs, mxstr) ;
>>> 		obs_qty:long_name = "quality flag" ;
>>> 	float obs_arr(nobs, obs_arr_len) ;
>>> 		obs_arr:long_name = "array of observation values" ;
>>> 		obs_arr:_fill_value = -9999.f ;
>>> 		obs_arr:columns = "hdr_id gc lvl hgt ob" ;
>>> 		obs_arr:hdr_id_long_name = "index of matching header data" ;
>>> 		obs_arr:gc_long_name = "grib code corresponding to the
observation
>>> type"
>>> ;
>>> 		obs_arr:lvl_long_name = "pressure level (hPa) or accumulation
interval
>>> (sec)" ;
>>> 		obs_arr:hgt_long_name = "height in meters above sea level or
ground
>>> level (msl or agl)" ;
>>> 		obs_arr:ob_long_name = "observation value" ;
>>>
>>> // global attributes:
>>> 		:FileOrigins = "File
>>> NAMMA_AMMA02_rain_daily_20060902-13_MET_ascii.txt.nc generated
>>> 20130709_215434 UTC on host borgb113 by the MET ascii2nc tool" ;
>>> 		:MET_version = "V4.1" ;
>>> 		:MET_tool = "ascii2nc" ;
>>> }
>>>
>>>
>>>
>>>
>>> On 7/12/13 6:14 PM, "John Halley Gotway via RT"
<met_help at ucar.edu>
>>> wrote:
>>>
>>>> Erik,
>>>>
>>>> I see that the variables are dimensioned like this:
ACC_RAIN(time, lat,
>>>> lon).  The problem is that the MET tools won't read the timing
>>>> information from the "time" dimension as you'd probably expect.
>>>> Instead, it'll try to read the timing information from the
variable
>>>> attributes.  For example, the output of pcp_combine looks like
this:
>>>>           float APCP_12(lat, lon) ;
>>>>                   APCP_12:name = "APCP_12" ;
>>>>                   APCP_12:long_name = "Total precipitation" ;
>>>>                   APCP_12:level = "A12" ;
>>>>                   APCP_12:units = "kg/m^2" ;
>>>>                   APCP_12:_FillValue = -9999.f ;
>>>>                   APCP_12:init_time = "20050807_000000" ;
>>>>                   APCP_12:init_time_ut = 1123372800 ;
>>>>                   APCP_12:valid_time = "20050807_120000" ;
>>>>                   APCP_12:valid_time_ut = 1123416000 ;
>>>>                   APCP_12:accum_time = "120000" ;
>>>>                   APCP_12:accum_time_sec = 43200 ;
>>>>
>>>> It reads the model initialization time (init_time_ut), valid time
>>>> (valid_time_ut), and accumulation interval (accum_time_sec) from
those
>>>> variable attributes.  Your NetCDF files follow the CF-1.4
>>>> convention, but unfortunately MET doesn't have the ability to
handle
>>>> that
>>>> yet.
>>>>
>>>> If you want to have multiple times in the file, they'd need to be
>>>> stored
>>>> in separate variables, with the corresponding timing information
>>>> defined
>>>> in the variable attribute section.
>>>>
>>>> So unfortunately, it's messier than we'd like.  I do see that
you're
>>>> using TRMM data, and we do have an Rscript on our website that
>>>> reformats
>>>> ASCII TRMM data into the flavor of NetCDF that MET is
>>>> expecting.  You can find information about that here (trmm2nc.R):
>>>>
http://www.dtcenter.org/met/users/downloads/observation_data.php
>>>>
>>>> Let me know how you'd like to proceed, and what I can do to help.
>>>>
>>>> Thanks,
>>>> John
>>>>
>>>>
>>>> On 07/12/2013 01:15 PM, Noble, Erik U.[COLUMBIA UNIVERSITY] via
RT
>>>> wrote:
>>>>>
>>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>>
>>>>> Wow, thank you.
>>>>> I think I understand now. Your description of the point-ascii
files is
>>>>> clear and I think I can proceed.
>>>>>
>>>>> In your description of the netcdf files you wrote:
>>>>> (1) 2 dimensions named lat and lon.
>>>>> (2) The lat and lon variables are *NOT* required. . .
>>>>> (3) The data variables should be 2 dimensional, defined (lat,
lon).
>>>>> (4) Need data variable attributes specifying timing information.
>>>>> (5) MET expects bad data to be encoded as -9999.
>>>>> (6) Need global attributes specifying grid definition
information.
>>>>> (7) Need "MET_version" global attribute.
>>>>>
>>>>>
>>>>> I have all of that except 4, 6 and 7.
>>>>>
>>>>> A description of the netcdf files I have is below. They are on
a
>>>>> 0.25°x0.25° grid.
>>>>> The variable in the WRF file is:  ACC_RAIN(time, lat, lon)
>>>>> The variable in the TRMM file is: ACC_RAIN(time, lat, lon)
>>>>>
>>>>> 	where they are both on the same 0.25°x0.25° grid. time =
12, and
>>>>> lat
>>>>> and lon are 1D.
>>>>>
>>>>> First, where would global attributes specifying grid definition
>>>>> information go?
>>>>> Is the timing info ok?
>>>>> By Met version, I just need to add "Metv4.1" ?
>>>>>
>>>>> Thank you.
>>>>> -Erik
>>>>>
>>>>>
>>>>> *****************
>>>>> Grid netcdf file description (0.25x0.25_grid_template.nc)
>>>>>
>>>>> *****************
>>>>> 1 #
>>>>>
>>>>>
>>>>>      2 # gridID 0
>>>>>      3 #
>>>>>      4 gridtype  = lonlat
>>>>>      5 gridsize  = 61600
>>>>>      6 xname     = lon
>>>>>      7 xlongname = longitude
>>>>>      8 xunits    = degrees_east
>>>>>      9 yname     = lat
>>>>>     10 ylongname = latitude
>>>>>     11 yunits    = degrees_north
>>>>>     12 xsize     = 280
>>>>>     13 ysize     = 220
>>>>>     14 xfirst    = -34.875
>>>>>     15 xinc      = 0.25
>>>>>     16 yfirst    = -19.875
>>>>>     17 yinc      = 0.25
>>>>>
>>>>>
>>>>> ****************
>>>>>     netcdf file: Grid template
>>>>> ****************
>>>>> $ ncdump -h 0.25x0.25_grid_template.nc
>>>>> netcdf \0.25x0.25_grid_template {
>>>>> dimensions:
>>>>> 	lon = 280 ;
>>>>> 	lat = 220 ;
>>>>> variables:
>>>>> 	double lon(lon) ;
>>>>> 		lon:standard_name = "longitude" ;
>>>>> 		lon:long_name = "longitude" ;
>>>>> 		lon:units = "degrees_east" ;
>>>>> 		lon:axis = "X" ;
>>>>> 	double lat(lat) ;
>>>>> 		lat:standard_name = "latitude" ;
>>>>> 		lat:long_name = "latitude" ;
>>>>> 		lat:units = "degrees_north" ;
>>>>> 		lat:axis = "Y" ;
>>>>> 	float random(lat, lon) ;
>>>>>
>>>>> // global attributes:
>>>>> 		:CDI = "Climate Data Interface version 1.5.5
>>>>> (http://code.zmaw.de/projects/cdi)" ;
>>>>> 		:Conventions = "CF-1.4" ;
>>>>> 		:history = "Tue May 22 15:38:08 2012: cdo -f nc
>>>>> -sellonlatbox,-35,35,-20,35 -random,global_0.25
>>>>> 0.25x0.25_grid_template.nc" ;
>>>>> 		:CDO = "Climate Data Operators version 1.5.5
>>>>> (http://code.zmaw.de/projects/cdo)" ;
>>>>> }
>>>>> ****************
>>>>> TRMM netcdf file (12 days, daily accumulated precip)
>>>>> ****************
>>>>> gs611-noble:precip_work eunoble$ ncdump -h
>>>>> 0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily.nc
>>>>> netcdf \0.25x0.25_grid_Obs1_TRMM_ACC_RAIN_daily {
>>>>> dimensions:
>>>>> 	lon = 280 ;
>>>>> 	lat = 220 ;
>>>>> 	time = UNLIMITED ; // (12 currently)
>>>>> variables:
>>>>> 	double lon(lon) ;
>>>>> 		lon:standard_name = "longitude" ;
>>>>> 		lon:long_name = "longitude" ;
>>>>> 		lon:units = "degrees_east" ;
>>>>> 		lon:axis = "X" ;
>>>>> 	double lat(lat) ;
>>>>> 		lat:standard_name = "latitude" ;
>>>>> 		lat:long_name = "latitude" ;
>>>>> 		lat:units = "degrees_north" ;
>>>>> 		lat:axis = "Y" ;
>>>>> 	double time(time) ;
>>>>> 		time:standard_name = "time" ;
>>>>> 		time:units = "hours since 1997-01-01 00:00:00" ;
>>>>> 		time:calendar = "standard" ;
>>>>> 	float ACC_RAIN(time, lat, lon) ;
>>>>> 		ACC_RAIN:long_name = "TRMM_3B42 3-hourly accumulation" ;
>>>>> 		ACC_RAIN:units = "mm" ;
>>>>> 		ACC_RAIN:_FillValue = -9999.f ;
>>>>>
>>>>> // global attributes:
>>>>> 		:CDI = "Climate Data Interface version 1.5.5
>>>>> (http://code.zmaw.de/projects/cdi)" ;
>>>>> 		:Conventions = "CF-1.4" ;
>>>>> 		:history = "Thu Oct 25 20:37:02 2012: cdo
>>>>>
>>>>>
>>>>>
remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25
>>>>> x0
>>>>> .2
>>>>> 5_grid_template.nc -sellonlatbox,-35,35,-20,35
-seltimestep,2/13/1
>>>>> TRMM_accum_precip_daily.nc
>>>>> TRMM_precip_daily_accum-0902-0914_0.25_grid.nc\n",
>>>>> 			"Wed Oct 24 17:01:05 2012: cdo daysum
-chname,PRC_ACCUM,ACC_RAIN,
>>>>> -selname,PRC_ACCUM mergefile_accum.nc
TRMM_accum_precip_daily.nc\n",
>>>>> 			"Wed Oct 24 16:54:32 2012: ncks -v PRC_ACCUM mergefile.nc
>>>>> mergefile_accum.nc\n",
>>>>> 			"Wed Oct 24 16:52:44 2012: ncrcat 3B42.060901.0.6.HDF.nc
<snip>
>>>>> 3B42.060930.9.6.HDF.nc mergefile.nc" ;
>>>>> 		:creation_date = "Fri Jun  1 18:14:08 EDT 2012" ;
>>>>> 		:info = "\n",
>>>>> 			"The 3B-42 estimates are scaled to match the monthly rain
gauge
>>>>> analyses\n",
>>>>> 			"used in 3B-43.The output is rainfall for 0.25x0.25 degree
grid
>>>>> boxes
>>>>> \n",
>>>>> 			"every 3 hours.\n",
>>>>> 			"" ;
>>>>> 		:description = "\n",
>>>>> 			"http://disc.sci.gsfc.nasa.gov/precipitation/documentation/TRMM_README/TRMM_3B42_readme.shtml\n",
>>>>> 			"" ;
>>>>> 		:ftp = "\n",
>>>>> 			"http://disc.sci.gsfc.nasa.gov/data/datapool/TRMM_DP/01_Data_Products/02_Gridded/06_3-hour_Gpi_Cal_3B_42\n",
>>>>> 			"" ;
>>>>> 		:title = "TRMM_3B42" ;
>>>>> 		:nco_openmp_thread_number = 1 ;
>>>>> 		:NCO = "4.1.0" ;
>>>>> 		:CDO = "Climate Data Operators version 1.5.5 (http://code.zmaw.de/projects/cdo)" ;
>>>>>
>>>>>
>>>>>
>>>>> ************
>>>>> WRF FILE (processed,12 days, daily accumulated precip)
>>>>> ************
>>>>> $ ncdump -h 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc
>>>>> netcdf \0.25x0.25_grid_Experiment_01_ACC_RAIN_daily {
>>>>> dimensions:
>>>>> 	lon = 280 ;
>>>>> 	lat = 220 ;
>>>>> 	time = UNLIMITED ; // (12 currently)
>>>>> variables:
>>>>> 	double lon(lon) ;
>>>>> 		lon:standard_name = "longitude" ;
>>>>> 		lon:long_name = "longitude" ;
>>>>> 		lon:units = "degrees_east" ;
>>>>> 		lon:axis = "X" ;
>>>>> 	double lat(lat) ;
>>>>> 		lat:standard_name = "latitude" ;
>>>>> 		lat:long_name = "latitude" ;
>>>>> 		lat:units = "degrees_north" ;
>>>>> 		lat:axis = "Y" ;
>>>>> 	double time(time) ;
>>>>> 		time:standard_name = "time" ;
>>>>> 		time:units = "hours since 2006-09-02 00:00:00" ;
>>>>> 		time:calendar = "standard" ;
>>>>> 	float ACC_RAIN(time, lat, lon) ;
>>>>> 		ACC_RAIN:standard_name = "convective_precipitation_amount" ;
>>>>> 		ACC_RAIN:long_name = "Accumulated Total Cumulus Precipitation" ;
>>>>> 		ACC_RAIN:units = "mm" ;
>>>>> 		ACC_RAIN:_FillValue = -9.e+33f ;
>>>>>
>>>>> // global attributes:
>>>>> 		:CDI = "Climate Data Interface version 1.5.5 (http://code.zmaw.de/projects/cdi)" ;
>>>>> 		:Conventions = "CF-1.1" ;
>>>>> 		:history = "Wed May 23 14:30:35 2012: cdo chname,RAINC,ACC_RAIN -sub -seltimestep,2/13/1 temp_RAIN.nc -seltimestep,1/12/1 temp_RAIN.nc 0.25x0.25_grid_Experiment_01_ACC_RAIN_daily.nc\n",
>>>>> 			"Wed May 23 14:30:34 2012: cdo add -selname,RAINC temp1.nc
>>>>> -selname,RAINNC temp1.nc temp_RAIN.nc\n",
>>>>> 			"Wed May 23 14:30:34 2012: cdo seltimestep,1/97/8
>>>>> -selname,RAINC,RAINNC
>>>>> 0.25x0.25_grid_Experiment_01.nc temp1.nc\n",
>>>>> 			"Wed May 23 14:30:03 2012: cdo -P 4 remapcon,/Volumes/DATA_1_Terabyte/POST_PROCESSING/regridding_tools/0.25x0.25_grid_template.nc -sellonlatbox,-35,35,-20,35 Experiment_01.nc 0.25x0.25_grid_Experiment_01.nc\n",
>>>>> 			"Wed May 23 14:27:04 2012: ncks -v u_gr_p,v_gr_p,T_p,rh_p,q_p,precip_c,precip_g,LHFlx post_wrfout_d01_2006-09-02_00:00:00.nc temp_post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>>>>> 		:institution = "NASA GODDARD INSTITUTE FOR SPACE STUDIES - CIRES" ;
>>>>> 		:title = "post_wrfout_d01_2006-09-02_00:00:00.nc" ;
>>>>> 		:notes = "Created with NCL script:  wrfout_to_cf_Erik.ncl v1.0" ;
>>>>> 		:created_by = "Erik Noble - erik.noble at nasa.gov" ;
>>>>> 		:creation_date = "Thu Dec  8 02:28:45 EST 2011" ;
>>>>> 		:NCO = "4.1.0" ;
>>>>> 		:CDO = "Climate Data Operators version 1.5.5 (http://code.zmaw.de/projects/cdo)" ;
>>>>>
>>>>>
>>>>>
>>>>> On 7/12/13 1:58 PM, "John Halley Gotway via RT" <met_help at ucar.edu>
>>>>> wrote:
>>>>>
>>>>>> Erik,
>>>>>>
>>>>>> The timing of the data within the ASCII files doesn't matter.  So
>>>>>> using 40 12-day files is fine.
>>>>>>
>>>>>> But each time you run Point-Stat, you'll pass it a forecast file
>>>>>> and an observation file containing the observations that should be
>>>>>> used to verify that forecast.  So you'll need to know which obs
>>>>>> files go with which forecast files.  Point-Stat reads the timing
>>>>>> information from the forecast file and then defines a time window
>>>>>> around that valid time (see the "obs_window" setting in the
>>>>>> Point-Stat config file and see data/config/README for details).
>>>>>> Any point observations falling within that time window will be
>>>>>> used.  Any point observations falling outside that time window
>>>>>> will be skipped over.
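[Editor's note: for reference, the obs_window setting described above is written like this in the MET 4.x-style Point-Stat config file.  The ±5400-second (90-minute) window is only an illustration; check data/config/README for the actual defaults.]

```
obs_window = {
   beg = -5400;   // seconds before the forecast valid time
   end =  5400;   // seconds after the forecast valid time
}
```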
>>>>>>
>>>>>> If you were using GRIB forecast files, you could actually set
this up
>>>>>> in
>>>>>> such a way as to verify all of the output times in a single
call to
>>>>>> Point-Stat.  You'd literally 'cat' together all of the GRIB
>>>>>> files and then set up a more complex configuration file to tell
>>>>>> Point-Stat what to do.  I suppose you could do the same using
NetCDF
>>>>>> if
>>>>>> you defined a bunch of different 2D variables, each for precip
>>>>>> with a different time (e.g. APCP_06_2012020500,
APCP_06_2012020506,
>>>>>> ...).
>>>>>> Or you can keep all the NetCDF files separate and just loop
through
>>>>>> them
>>>>>> in a script.  That's what we typically do.
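[Editor's note: a minimal sketch of the loop-in-a-script approach.  The file names, config file name, and output directory are hypothetical; 'echo' is included so the loop only prints the point_stat commands it would run.]

```shell
# One Point-Stat run per forecast valid time (hypothetical file names).
for DAY in 02 03 04 05 06 07 08 09 10 11 12 13; do
    FCST="wrf_precip_2006-09-${DAY}.nc"   # one forecast file per valid time
    OBS="gauge_obs_2006-09.nc"            # ascii2nc output; obs_window selects matching obs
    echo point_stat "$FCST" "$OBS" PointStatConfig -outdir out -v 2
done
```

Dropping the 'echo' runs the commands for real once point_stat is on your PATH.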
>>>>>>
>>>>>> In order to create a NetCDF file that MET can handle, you
>>>>>> basically need to structure it like the NetCDF output of the
>>>>>> pcp_combine tool:
>>>>>> (1) 2 dimensions named lat and lon.
>>>>>> (2) The lat and lon variables are *NOT* required.  MET doesn't
>>>>>> actually use them.  We put them in there since other plotting
>>>>>> tools (like IDV) use them.
>>>>>> (3) The data variables should be 2 dimensional, defined (lat, lon).
>>>>>> (4) Need data variable attributes specifying timing information.
>>>>>> (5) MET expects bad data to be encoded as -9999.
>>>>>> (6) Need global attributes specifying grid definition information.
>>>>>> (7) Need "MET_version" global attribute.
>>>>>>
>>>>>> Ultimately, we'd like to support CF-compliant NetCDF files, but we
>>>>>> haven't gotten there yet.  For now, we're stuck with this rather
>>>>>> restrictive NetCDF format that we use internally.
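[Editor's note: for illustration, a CDL header matching the seven points above for the 0.25-degree grid in the dumps in this thread.  The variable name is hypothetical, and the attribute names mirror pcp_combine output for a "LatLon" projection as recalled from the MET documentation; treat this as a sketch to compare against real pcp_combine output, not an authoritative template.]

```
netcdf met_ready_example {          // hypothetical file; structure only
dimensions:
	lat = 220 ;
	lon = 280 ;
variables:
	float ACC_RAIN_24(lat, lon) ;   // 2-D data variable, defined (lat, lon)
		ACC_RAIN_24:name = "ACC_RAIN_24" ;
		ACC_RAIN_24:long_name = "daily accumulated precipitation" ;
		ACC_RAIN_24:level = "A24" ;
		ACC_RAIN_24:units = "mm" ;
		ACC_RAIN_24:init_time = "20060902_000000" ;     // timing attributes
		ACC_RAIN_24:init_time_ut = 1157155200 ;
		ACC_RAIN_24:valid_time = "20060903_000000" ;
		ACC_RAIN_24:valid_time_ut = 1157241600 ;
		ACC_RAIN_24:accum_time = "240000" ;
		ACC_RAIN_24:accum_time_sec = 86400 ;
		ACC_RAIN_24:_FillValue = -9999.f ;              // MET bad-data value

// global attributes:
		:Projection = "LatLon" ;                        // grid definition
		:lat_ll = "-20.000000 degrees_north" ;
		:lon_ll = "-35.000000 degrees_east" ;
		:delta_lat = "0.250000 degrees" ;
		:delta_lon = "0.250000 degrees" ;
		:Nlat = "220 grid_points" ;
		:Nlon = "280 grid_points" ;
		:MET_version = "V4.1" ;                         // required
}
```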
>>>>>>
>>>>>> If you need help setting up the global attributes, just send me a
>>>>>> description of the grid you're using.  Give it a shot and if you
>>>>>> run into problems, just send me a sample NetCDF file and I can let
>>>>>> you know what changes are needed.
>>>>>>
>>>>>> Thanks,
>>>>>> John
>>>>>>
>>>>>> On 07/12/2013 09:23 AM, Noble, Erik U.[COLUMBIA UNIVERSITY] via
RT
>>>>>> wrote:
>>>>>>>
>>>>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>>>>
>>>>>>> Hi, thank you. Your reply clarifies a lot.
>>>>>>>
>>>>>>> 2 questions:
>>>>>>> You wrote, "Point-Stat is run once per valid time, ..."
>>>>>>> I have 40 ascii files that hold 12-day daily precipitation
>>>>>>> accumulations at each station. (1) Do the ascii files need to be
>>>>>>> once per valid time or can I still use the 40 12-day files?
>>>>>>>
>>>>>>> Before I discovered MET, I went through the trouble of writing my
>>>>>>> own scripts that do the same thing as p_interp and destagger the
>>>>>>> winds.  Also, I found an easy way to regrid netcdf files, both
>>>>>>> observed and wrf-arw files, to the same grid using CDO.  Details
>>>>>>> of the command line are here, where the 2nd-to-last entry provides
>>>>>>> the best explanation:
>>>>>>> (https://code.zmaw.de/boards/1/topics/1051#message-1056)
>>>>>>> (2) What does the netcdf file need to have in order to still be
>>>>>>> used in MET (point-stat, etc.)?  Do the lats and lons still need
>>>>>>> to be 2-D (lon=XLAT,XLONG; lat=XLAT,XLONG) or can they be 1-D
>>>>>>> (lat = lat; lon = lon)?
>>>>>>>
>>>>>>> Thank you.
>>>>>>> Erik
>>>>>>>
>>>>>>> On 7/11/13 11:54 AM, "John Halley Gotway via RT"
<met_help at ucar.edu>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Erik,
>>>>>>>>
>>>>>>>> We appreciate the feedback!  It's been a long road getting MET
>>>>>>>> out the door and supporting it, but it's nice to hear that it's
>>>>>>>> useful to the modelling community.
>>>>>>>>
>>>>>>>> I see that you're asking about the support in MET for the NetCDF
>>>>>>>> output of the pinterp utility.  The output of pinterp is a
>>>>>>>> gridded NetCDF file that the MET tools do support, but with some
>>>>>>>> limitations.  WRF is run on a staggered grid with the wind
>>>>>>>> fields staggered, while the mass points are on a regular grid.
>>>>>>>> The pinterp output is indeed on pressure levels, but the wind
>>>>>>>> points are still staggered, and MET is unable to read them.  So
>>>>>>>> basically, using pinterp is not a good choice if you're
>>>>>>>> interested in winds.  But as long as you're not using winds, the
>>>>>>>> gridded pinterp output can be used in any place in MET that
>>>>>>>> GRIB1 or GRIB2 is used.
>>>>>>>>
>>>>>>>> The other big drawback to pinterp is that it's NetCDF, and
>>>>>>>> therefore, it's not easy to regrid.  When doing grid-to-grid
>>>>>>>> comparisons, you need to put the forecast and observation fields
>>>>>>>> on a common grid.  That's easy to do for GRIB using the copygb
>>>>>>>> utility, but not easy in general for NetCDF.
>>>>>>>>
>>>>>>>> So the other WRF post-processing alternative is the Unified
>>>>>>>> PostProcessor (UPP).  Its output is in GRIB1 format, which MET
>>>>>>>> fully supports.  If possible, I'd suggest using UPP instead of
>>>>>>>> pinterp to avoid the staggered grid and regridding limitations
>>>>>>>> of NetCDF.  Support for UPP is provided via wrfhelp at ucar.edu.
>>>>>>>>
>>>>>>>> But to get to your question...
>>>>>>>>
>>>>>>>> If you're dealing with point observations, the only MET tool to
>>>>>>>> be used is the Point-Stat tool.  I suggest running Point-Stat to
>>>>>>>> compare the output of pinterp (or UPP, if you switch) to the
>>>>>>>> point observation output of ASCII2NC.  With only 40 points, I'd
>>>>>>>> suggest writing out the matched pair (MPR) line type from
>>>>>>>> Point-Stat.  Point-Stat is run once per valid time; however, you
>>>>>>>> can then use the STAT-Analysis tool to aggregate results through
>>>>>>>> time.  Suppose you've run Point-Stat over a month of output for
>>>>>>>> those 40 stations.  You could run STAT-Analysis to aggregate
>>>>>>>> together that month of Point-Stat output and compute monthly
>>>>>>>> statistics for each of the 40 stations individually.
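[Editor's note: a hedged sketch of that aggregation step.  The output directory and variable name are hypothetical, and the exact job options should be checked against the STAT-Analysis chapter of the MET User's Guide; 'echo' keeps this a dry run.]

```shell
# Aggregate a month of Point-Stat MPR lines into continuous statistics.
# Remove the leading 'echo' to actually run stat_analysis.
echo stat_analysis \
    -lookin out \
    -job aggregate_stat -line_type MPR -out_line_type CNT \
    -fcst_var ACC_RAIN_24
```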
>>>>>>>>
>>>>>>>> Hopefully that helps.  If you get stuck on setting up the
>>>>>>>> Point-Stat configuration file, just let me know and I'll be
>>>>>>>> happy to help you get it going.
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> John Halley Gotway
>>>>>>>> met_help at ucar.edu
>>>>>>>>
>>>>>>>> On 07/10/2013 09:45 AM, Noble, Erik U.[COLUMBIA UNIVERSITY]
via RT
>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> Wed Jul 10 09:45:10 2013: Request 62175 was acted upon.
>>>>>>>>> Transaction: Ticket created by erik.noble at nasa.gov
>>>>>>>>>             Queue: met_help
>>>>>>>>>           Subject: MET 4 (or 4.1) question(s)
>>>>>>>>>             Owner: Nobody
>>>>>>>>>        Requestors: erik.noble at nasa.gov
>>>>>>>>>            Status: new
>>>>>>>>>       Ticket <URL:
>>>>>>>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=62175 >
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Hi.
>>>>>>>>> First, to the developer, thank you for providing MET to the
>>>>>>>>> world.  It looks outstanding and very useful.  If you don't get
>>>>>>>>> enough thanks, I hope just one more "thank you" makes your day.
>>>>>>>>>
>>>>>>>>> I installed MET 4.1 on a Linux machine.
>>>>>>>>> I am ready to start using MET to compare 40 station
>>>>>>>>> observations of daily accumulated precip to 64 WRF-ARW
>>>>>>>>> simulations for the same area.  I have read the MET 4.1 manual
>>>>>>>>> for the past week and gone through all the examples.
>>>>>>>>>
>>>>>>>>> But I am very confused about one thing:
>>>>>>>>>
>>>>>>>>> Once a user uses p_interp to process WRF-ARW output, where do
>>>>>>>>> they go from there?
>>>>>>>>> The p_interp output is still netcdf (variables are on pressure
>>>>>>>>> levels), yet most of the modules want grib.  Yet, in the second
>>>>>>>>> chapter, the manual states that p_interp netcdf data can be
>>>>>>>>> used.  How?
>>>>>>>>> How do I feed p_interp processed data into the other MET
>>>>>>>>> modules, particularly if I want to do the above and compare
>>>>>>>>> station data to WRF?  Perhaps I missed it?
>>>>>>>>>
>>>>>>>>> I used ascii2nc to process the station data, so they are ready.
>>>>>>>>> I have the wrf-arw p_interp processed data.  How do I put the
>>>>>>>>> wrf data into the point-stat module?
>>>>>>>>>
>>>>>>>>> If it needs to still be in grib, what do I do with the p_interp
>>>>>>>>> results?  I could have skipped the p_interp step and just run
>>>>>>>>> the wrf-arw output through UPP, right?  I am confused.
>>>>>>>>>
>>>>>>>>> If a MET user or developer could help me understand the next
>>>>>>>>> step, that would be greatly appreciated.
>>>>>>>>>
>>>>>>>>> In addition, has anyone successfully installed this software on
>>>>>>>>> an Apple machine?
>>>>>>>>>
>>>>>>>>> Sincerely,
>>>>>>>>> Erik Noble
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>

------------------------------------------------


More information about the Met_help mailing list