[Met_help] [rt.rap.ucar.edu #64630] History for MET on yellowstone

John Halley Gotway via RT met_help at ucar.edu
Tue Jan 7 15:29:06 MST 2014


----------------------------------------------------------------
  Initial Request
----------------------------------------------------------------

Hi,

I am a new user of MET, and would like to know if
it is already installed on yellowstone, so that
I can use the pre-installed version for my
purpose.

thanks,
Mukul


----------------------------------------------------------------
  Complete Ticket History
----------------------------------------------------------------

Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: John Halley Gotway
Time: Mon Dec 16 13:43:03 2013

Mukul,

METv4.1 is installed on yellowstone in:
    /glade/p/ral/jnt/MET/MET_releases/METv4.1

You're welcome to use that version if you'd like.

Thanks,
John Halley Gotway
met_help at ucar.edu

On 12/13/2013 04:33 PM, Mukul Tewari via RT wrote:
>
> Fri Dec 13 16:33:09 2013: Request 64630 was acted upon.
> Transaction: Ticket created by mukul at rap.ucar.edu
>         Queue: met_help
>       Subject: MET on yellowstone
>         Owner: Nobody
>    Requestors: mukul at ucar.edu
>        Status: new
>   Ticket <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>
>
> Hi,
>
> I am a new user of MET, and would like to know if
> it is already installed on yellowstone, so that
> I can use the pre-installed version for my
> purpose.
>
> thanks,
> Mukul
>

------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: Mukul Tewari
Time: Mon Dec 16 15:41:13 2013

Hi John,

I don't see any built executables in bin/??

Mukul


------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: John Halley Gotway
Time: Mon Dec 16 16:47:53 2013

Mukul,

Sorry about that.  You can use a version of it that's compiled here:
    /glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1

I'm trying to recompile the version I sent you, but for some reason
I'm getting a compilation error I haven't seen in the past.

Thanks,
John


------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: Mukul Tewari
Time: Mon Dec 16 17:53:06 2013

Thanks, John. I was also getting an error while compiling: a
NetCDF-related error that I don't understand, because I assigned
the right path.

thanks,
Mukul


------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: John Halley Gotway
Time: Mon Dec 16 23:09:32 2013

Mine is a compilation error from the lex code in the vx_config
library.  I'm compiling using Intel, but this is a new error; I
compiled that same code on yellowstone successfully a couple of
months ago.

I'll look more into it later this week.

Thanks,
John




------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: Mukul Tewari
Time: Wed Dec 18 15:15:26 2013

Hi John,

I think I can use the pre-compiled version.

I want to do grid-to-grid comparison of WRF precip
and TRMM precip, I have both on the same grid
and both are netcdf files.

I am still going through the online tutorial. Do
I need to run p_interp first on the WRF output?

Also, does MET directly accept the TRMM precipitation in NetCDF?

thanks,
Mukul


------------------------------------------------
Subject: MET on yellowstone
From: John Halley Gotway
Time: Wed Dec 18 15:30:52 2013

Mukul,

You do need to run it through a post-processor.  You could use either
pinterp or the Unified Post Processor (UPP).  We recommend UPP since
its output is in GRIB1 format, which can easily be regridded
using the copygb utility.  If you do not need to regrid your data
later, you could use pinterp.

MET will likely not read the NetCDF TRMM data you currently have.
It's expecting a rather particular format for the NetCDF file.

You could try using the Rscript I've attached to this message.  It
reads TRMM binary data - look at the top of the script for a source of
that data.  Then it subsets the data based on the "output
domain specification" listed near the top of the script and writes a
NetCDF file that MET can read.  Just change the definition of out_res,
out_lat_ll, and so on, to define the grid you want.

You run this on the command line like this:
   Rscript trmmbin2nc.R my_trmm_file.bin my_trmm_file.nc

It relies on the R "ncdf" package for reading/writing NetCDF files.

We'll ultimately post this script to the MET website, but it needs a
bit more work.

Hope that helps get you going.

Thanks,
John


------------------------------------------------
Subject: MET on yellowstone
From: John Halley Gotway
Time: Wed Dec 18 15:30:52 2013

# Daily TRMM binary files:
#   ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/Derived_Products/3B42_V7/Daily/YYYY/3B42_daily.YYYY.MM.DD.7.bin
# 3-Hourly TRMM binary files:
#   ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/3B42_V7/YYYYMM/3B42.YYMMDD.HHz.7.precipitation.bin

########################################################################
#
# Required libraries.
#
########################################################################

library(ncdf)

########################################################################
#
# Constants and command line arguments
#
########################################################################

hdr_len    = 5           # Each pcp file begins with 5 header lines.
hdr_file   = "trmm.hdr"  # Temporary header file.
missing    = -9999       # Missing pcp value to be used in MET.
save       = FALSE       # If set to TRUE, call save.image()
sec_per_hr = 60*60

# Native domain specification
in_res     = 0.25        # Resolution in degrees
in_lat_ll  = -50.00      # Latitude of lower-left corner
in_lon_ll  =   0.00      # Longitude of lower-left corner
in_lat_ur  =  50.00      # Latitude of upper-right corner
in_lon_ur  = 359.75      # Longitude of upper-right corner

# Output domain specification
out_res    = 0.25        # Resolution in degrees
out_lat_ll =  -25.00     # Latitude of lower-left corner
out_lon_ll = -150.00     # Longitude of lower-left corner
out_lat_ur =   60.00     # Latitude of upper-right corner
out_lon_ur =   10.00     # Longitude of upper-right corner

rescale_lon <- function (x) {
  while(x < -180) x = x + 360;
  while(x >  180) x = x - 360;
  return(x)
}
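The rescale_lon logic above is plain modular wrapping of a longitude
into the range [-180, 180].  A minimal Python port, for illustration
only (the script itself is R):

```python
def rescale_lon(x):
    """Wrap a longitude in degrees into the range [-180, 180]."""
    while x < -180:
        x += 360
    while x > 180:
        x -= 360
    return x

# TRMM's native grid runs 0 to 359.75; eastern longitudes wrap negative.
print(rescale_lon(359.75))   # -0.25
print(rescale_lon(-150.0))   # -150.0
```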

########################################################################
#
# Handle the arguments.
#
########################################################################

# Retrieve the arguments
args = commandArgs(TRUE)

# Check the number of arguments
if(length(args) < 2) {
   cat("Usage: trmmbin2nc.R\n")
   cat("        trmm_file\n")
   cat("        nc_file\n")
   cat("        [-save]\n")
   cat("        where \"trmm_file\" is a binary TRMM files.\n")
   cat("              \"nc_file\"   is the out NetCDF file to be
written.\n")
   cat("              \"-save\"     to call save.image().\n\n")
   quit()
}

# Store the input file names
trmm_file = args[1]
nc_file   = args[2]

# Parse optional arguments
for(i in 1:length(args)) {
   if(args[i] == "-save") {
      save = TRUE
   }
}

########################################################################
#
# Parse accumulation interval and time from file name.
#
########################################################################

# Daily files contain the string "daily"
if(grepl("daily", basename(trmm_file), ignore.case=TRUE)) {
  tok   = unlist(strsplit(basename(trmm_file), '.', fixed=TRUE))
  ftime = strptime(paste(tok[2], tok[3], tok[4]), format="%Y %m %d", tz="GMT")
  init  = as.POSIXct(ftime -  0*sec_per_hr)
  valid = as.POSIXct(ftime + 24*sec_per_hr)

# 3-hourly files contain the string "[0-9][0-9]z"
} else if(grepl("[0-9][0-9]z", basename(trmm_file), ignore.case=TRUE)) {
  tok   = unlist(strsplit(basename(trmm_file), '.', fixed=TRUE))
  ftime = strptime(paste(tok[2], tok[3]), format="%y%m%d %H", tz="GMT")
  init  = as.POSIXct(ftime - 0*sec_per_hr)
  valid = as.POSIXct(ftime + 3*sec_per_hr)

# Fail otherwise
} else {
  cat("\n\nERROR: Can\'t figure out the accumulation interval!\n\n")
  quit(status=1)
}

# Compute the accumulation interval
acc_sec = as.double(valid - init, units="secs")
acc_hr  = floor(acc_sec / 3600)
acc_str = sprintf("%.2i", acc_hr)
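The file-name parsing and accumulation-interval computation above can
be sketched in Python.  This is an illustrative port only; the file
names follow the FTP patterns listed at the top of the script:

```python
from datetime import datetime, timedelta

def parse_trmm_times(name):
    """Derive (init, valid, accum string) from a TRMM file name,
    mirroring the R script: daily files carry a 24-hour accumulation,
    3-hourly files a 3-hour accumulation."""
    tok = name.split(".")
    if "daily" in name.lower():
        # 3B42_daily.YYYY.MM.DD.7.bin
        init = datetime(int(tok[1]), int(tok[2]), int(tok[3]))
        valid = init + timedelta(hours=24)
    else:
        # 3B42.YYMMDD.HHz.7.precipitation.bin
        init = datetime.strptime(tok[1] + tok[2], "%y%m%d%Hz")
        valid = init + timedelta(hours=3)
    acc_hr = int((valid - init).total_seconds() // 3600)
    return init, valid, "%02d" % acc_hr

print(parse_trmm_times("3B42_daily.2013.12.18.7.bin")[2])          # 24
print(parse_trmm_times("3B42.131218.03z.7.precipitation.bin")[2])  # 03
```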

########################################################################
#
# Read the 1/4 degree binary TRMM data.
#
########################################################################

in_lat  = seq(in_lat_ll, in_lat_ur, in_res)
in_lon  = seq(in_lon_ll, in_lon_ur, in_res)
in_nlat = length(in_lat)
in_nlon = length(in_lon)
data    = readBin(trmm_file, "numeric", n=in_nlon*in_nlat, size=4, endian="big")
in_pcp  = array(data, dim=c(in_nlon, in_nlat),
                dimnames=list(lon=NULL, lat=NULL))  # dimnames must be a list

# Rescale the input longitudes to the range -180 to 180
in_lon = sapply(in_lon, rescale_lon)
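The readBin call above pulls the whole grid as big-endian 4-byte
floats.  The equivalent read can be sketched in Python with the
standard struct module (illustration only; the round-trip file here is
synthetic, not a real TRMM file):

```python
import os
import struct
import tempfile

def read_be_float32(path, n):
    """Read n big-endian float32 values, like R's
    readBin(..., size=4, endian="big")."""
    with open(path, "rb") as f:
        raw = f.read(4 * n)
    return list(struct.unpack(">%df" % n, raw))

# Round-trip a tiny grid to show the byte order is respected.
vals = [0.0, 0.25, -9999.0]
with tempfile.NamedTemporaryFile(delete=False, suffix=".bin") as tmp:
    tmp.write(struct.pack(">3f", *vals))
print(read_be_float32(tmp.name, 3))  # [0.0, 0.25, -9999.0]
os.unlink(tmp.name)
```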

########################################################################
#
# Select subset of data to be written.
#
########################################################################

out_lat  = seq(out_lat_ll, out_lat_ur, out_res)
out_lon  = seq(out_lon_ll, out_lon_ur, out_res)
out_nlat = length(out_lat)
out_nlon = length(out_lon)

# Rescale the output longitudes to the range -180 to 180
out_lon = sort(sapply(out_lon, rescale_lon))

# Extract the output data
out_pcp = matrix(nrow=out_nlon, ncol=out_nlat)
out_cnt = 0
out_vld = 0
out_sum = 0

for(cur_out_lon in 1:out_nlon) {
  for(cur_out_lat in 1:out_nlat) {

    cur_in_lon = which(out_lon[cur_out_lon] == in_lon)
    cur_in_lat = which(out_lat[cur_out_lat] == in_lat)

    if(length(cur_in_lon) == 1 &&
       length(cur_in_lat) == 1) {
      out_pcp[cur_out_lon, cur_out_lat] = in_pcp[cur_in_lon, cur_in_lat]
      out_vld = out_vld + 1
      out_sum = out_sum + out_pcp[cur_out_lon, cur_out_lat]
    }

    out_cnt = out_cnt + 1
  }
}

########################################################################
#
# Create the NetCDF output file.
#
########################################################################

# Define dimensions
dim_lat = dim.def.ncdf("lat", "degrees_north", out_lat,
                       create_dimvar=TRUE)
dim_lon = dim.def.ncdf("lon", "degrees_east",  out_lon,
                       create_dimvar=TRUE)

# Define variables
var_pcp = var.def.ncdf(paste("APCP_", acc_str, sep=''), "kg/m^2",
                       list(dim_lon, dim_lat), missing,
                       longname="Total precipitation", prec="single")

# Define file
nc = create.ncdf(nc_file, var_pcp)

# Add variable attributes
att.put.ncdf(nc, var_pcp, "name", "APCP")
att.put.ncdf(nc, var_pcp, "level", paste("A", acc_hr, sep=''))
att.put.ncdf(nc, var_pcp, "grib_code", 61)
att.put.ncdf(nc, var_pcp, "_FillValue", missing, prec="single")
att.put.ncdf(nc, var_pcp, "init_time", format(init, "%Y%m%d_%H%M%S", tz="GMT"), prec="text")
att.put.ncdf(nc, var_pcp, "init_time_ut", as.numeric(init), prec="int")
att.put.ncdf(nc, var_pcp, "valid_time", format(valid, "%Y%m%d_%H%M%S", tz="GMT"), prec="text")
att.put.ncdf(nc, var_pcp, "valid_time_ut", as.numeric(valid), prec="int")
att.put.ncdf(nc, var_pcp, "accum_time", paste(acc_str, "0000", sep=''))
att.put.ncdf(nc, var_pcp, "accum_time_sec", acc_sec, prec="int")

# Add global attributes
cur_time = Sys.time()
att.put.ncdf(nc, 0, "FileOrigins", paste("File", nc_file, "generated",
                                         format(cur_time, "%Y%m%d_%H%M%S"),
                                         "on host", Sys.info()[4],
                                         "by the Rscript trmmbin2nc.R"))
att.put.ncdf(nc, 0, "MET_version", "V4.1")
att.put.ncdf(nc, 0, "Projection", "LatLon", prec="text")
att.put.ncdf(nc, 0, "lat_ll", paste(min(out_lat), "degrees_north"), prec="text")
att.put.ncdf(nc, 0, "lon_ll", paste(min(out_lon), "degrees_east"), prec="text")
att.put.ncdf(nc, 0, "delta_lat", paste(out_res, "degrees"), prec="text")
att.put.ncdf(nc, 0, "delta_lon", paste(out_res, "degrees"), prec="text")
att.put.ncdf(nc, 0, "Nlat", paste(out_nlat, "grid_points"), prec="text")
att.put.ncdf(nc, 0, "Nlon", paste(out_nlon, "grid_points"), prec="text")

# Add pcp to the file
put.var.ncdf(nc, var_pcp, out_pcp)

# Close the file
close.ncdf(nc)

# Print summary info

cat(paste("Output File:\t", nc_file, "\n", sep=""))
cat(paste("Output Domain:\t", out_nlat, " X ", out_nlon, " from (",
    out_lat_ll, ", ", out_lon_ll, ") to (",
    out_lat_ur, ", ", out_lon_ur, ") by ", out_res, " deg\n", sep=""))
cat(paste("Output Precip:\t", out_vld, " of ", out_cnt, " points valid, ",
    round(out_sum, 2), " mm total, ",
    round(out_sum/out_vld, 2), " mm avg\n", sep=""))

########################################################################
#
# Finished.
#
########################################################################

# Optionally, save all of the pcp to an .RData file
if(save == TRUE) save.image()

------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: Mukul Tewari
Time: Wed Dec 18 15:35:22 2013

Thanks a lot, John, I will try it.

I am using a pre-compiled version of the UPP; I need to learn a little
more about copygb.  Then I can compare the WRF output (after running
UPP and copygb) against TRMM after using the R script.

Since it is my first time, it is taking a little longer. Thanks
again for all your help.

Mukul

On Wed, Dec 18, 2013 at 03:30:52PM -0700, John Halley Gotway via RT
wrote:
> Mukul,
>
> You do need to run it through a post-processor.  You could use
either pinterp or the Unified PostProcessor (UPP).  We recommend UPP
since it's output is in GRIB1 format which can easily be regridded
> using the copygb utility.  If you do not need to regrid your data
later, you could use pinterp.
>
> MET will likely not read the NetCDF TRMM data you currently have.
It's expecting a rather particular format for the NetCDF file.
>
> You could try using the Rscript I've attached to this message.  It
reads TRMM binary data - look at the top of the script for a source of
that data.  Then it subsets the data based on the "output
> domain specification" listed near the top of the script and writes a
NetCDF file that MET can read.  Just change the definition of out_res,
out_lat_ll, and so on, to define the grid you want.
>
> You run this on the command line like this:
>    Rscript trmmbin2nc.R my_trmm_file.bin my_trmm_file.nc
>
> It relies of the R "ncdf" package for reading/writing NetCDF files.
>
> We'll ultimately post this script to the MET website, but it needs a
bit more work.
>
> Hope that helps get you going.
>
> Thanks,
> John
>
> On 12/18/2013 03:15 PM, Mukul Tewari via RT wrote:
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >
> > Hi John,
> >
> > I think I can use the pre-compiled version.
> >
> > I want to do grid-to-grid comparison of WRF precip
> > and TRMM precip, I have both on the same grid
> > and both are netcdf files.
> >
> > I am still going through the online tutorial. Do
> > I need to run p_interp first on the WRF output?
> >
> > Also does the MET take directly the TRMM pcp in netcdf?
> >
> > thanks,
> > Mukul
> >
> > On Mon, Dec 16, 2013 at 11:09:32PM -0700, John Halley Gotway via
RT wrote:
> >> Mine is a compilation error from the lex code in the vx_config
library.
> >> I'm compiling using intel.  But this is a new error, I compiled
that same
> >> code on yellowstone successfully a couple of months ago.
> >>
> >> I'll look more into it later this week.
> >>
> >> Thanks,
> >> John
> >>
> >>>
> >>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >>>
> >>> Thanks, John. I was also getting an error while compiling. I am
getting an
> >>> error related with netcdf error which I am not sure because I
assigned the
> >>> right path.
> >>>
> >>> thanks,
> >>> Mukul
> >>>
> >>> On Mon, Dec 16, 2013 at 04:47:54PM -0700, John Halley Gotway via
RT wrote:
> >>>> Mukul,
> >>>>
> >>>> Sorry about that.  You can use a version of it that's compiled
here:
> >>>>      /glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1
> >>>>
> >>>> I'm trying to recompile the version I sent you, but for some
reason I'm
> >>>> getting a compilation error I haven't seen in the past.
> >>>>
> >>>> Thanks,
> >>>> John
> >>>>
> >>>> On 12/16/2013 03:41 PM, Mukul Tewari via RT wrote:
> >>>>>
> >>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630
>
> >>>>>
> >>>>> Hi John,
> >>>>>
> >>>>> I don't see any built executables in bin/??
> >>>>>
> >>>>> Mukul
> >>>>>
> >>>>> On Mon, Dec 16, 2013 at 01:43:04PM -0700, John Halley Gotway
via RT
> >>>> wrote:
> >>>>>> Mukul,
> >>>>>>
> >>>>>> METv4.1 is installed on yellowstone in:
> >>>>>>       /glade/p/ral/jnt/MET/MET_releases/METv4.1
> >>>>>>
> >>>>>> You're welcome to use that version if you'd like.
> >>>>>>
> >>>>>> Thanks,
> >>>>>> John Halley Gotway
> >>>>>> met_help at ucar.edu
> >>>>>>
> >>>>>> On 12/13/2013 04:33 PM, Mukul Tewari via RT wrote:
> >>>>>>>
> >>>>>>> Fri Dec 13 16:33:09 2013: Request 64630 was acted upon.
> >>>>>>> Transaction: Ticket created by mukul at rap.ucar.edu
> >>>>>>>           Queue: met_help
> >>>>>>>         Subject: MET on yellowstone
> >>>>>>>           Owner: Nobody
> >>>>>>>      Requestors: mukul at ucar.edu
> >>>>>>>          Status: new
> >>>>>>>     Ticket <URL:
> >>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >>>>>>>
> >>>>>>>
> >>>>>>> Hi,
> >>>>>>>
> >>>>>>> I am a new user of MET, and would like to know if
> >>>>>>> it is already installed on yellowstone, so that
> >>>>>>> I can use the pre-installed version for my
> >>>>>>> purpose.
> >>>>>>>
> >>>>>>> thanks,
> >>>>>>> Mukul
> >>>>>>>
> >>>
> >>
> >>
>

> # Daily TRMM binary files:
> #
ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/Derived_Products/3B42_V7/Daily/YYYY/3B42_daily.YYYY.MM.DD.7.bin
> # 3-Hourly TRMM binary files:
> #
ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/3B42_V7/YYYYMM/3B42.YYMMDD.HHz.7.precipitation.bin
>
>
########################################################################
> #
> # Required libraries.
> #
>
########################################################################
>
> library(ncdf)
>
>
########################################################################
> #
> # Constants and command line arguments
> #
>
########################################################################
>
> hdr_len    = 5           # Each pcp file begins with 5 header lines.
> hdr_file   = "trmm.hdr"  # Temporary header file.
> missing    = -9999       # Missing pcp value to be used in MET.
> save       = FALSE       # If set to TRUE, call save.image()
> sec_per_hr = 60*60
>
> # Native domain specification
> in_res     = 0.25        # Resolution in degrees
> in_lat_ll  = -50.00      # Latitude of lower-left corner
> in_lon_ll  =   0.00      # Longitude of lower-left corner
> in_lat_ur  =  50.00      # Latitude of upper-right corner
> in_lon_ur  = 359.75      # Longitude of upper-right corner
>
> # Output domain specification
> out_res    = 0.25        # Resolution in degrees
> out_lat_ll =  -25.00     # Latitude of lower-left corner
> out_lon_ll = -150.00     # Longitude of lower-left corner
> out_lat_ur =   60.00     # Latitude of upper-right corner
> out_lon_ur =   10.00     # Longitude of upper-right corner
>
> rescale_lon <- function (x) {
>   while(x < -180) x = x + 360;
>   while(x >  180) x = x - 360;
>   return(x)
> }
>
>
########################################################################
> #
> # Handle the arguments.
> #
>
########################################################################
>
> # Retreive the arguments
> args = commandArgs(TRUE)
>
> # Check the number of arguments
> if(length(args) < 2) {
>    cat("Usage: trmmbin2nc.R\n")
>    cat("        trmm_file\n")
>    cat("        nc_file\n")
>    cat("        [-save]\n")
>    cat("        where \"trmm_file\" is a binary TRMM files.\n")
>    cat("              \"nc_file\"   is the out NetCDF file to be
written.\n")
>    cat("              \"-save\"     to call save.image().\n\n")
>    quit()
> }
>
> # Store the input file names
> trmm_file = args[1]
> nc_file   = args[2]
>
> # Parse optional arguments
> for(i in 1:length(args)) {
>    if(args[i] == "-save") {
>       save = TRUE
>    }
> }
>
>
########################################################################
> #
> # Parse accumulation interval and time from file name.
> #
>
########################################################################
>
> # Daily files contain the string "daily"
> if(grepl("daily", basename(trmm_file), ignore.case=TRUE)) {
>   tok   = unlist(strsplit(basename(trmm_file), '.', fixed=TRUE))
>   ftime = strptime(paste(tok[2], tok[3], tok[4]), format="%Y %m %d",
tz="GMT")
>   init  = as.POSIXct(ftime -  0*sec_per_hr)
>   valid = as.POSIXct(ftime + 24*sec_per_hr)
>
> # 3-hourly files contain the string "[0-9][0-9]z"
> } else if(grepl("[0-9][0-9]z", basename(trmm_file),
ignore.case=TRUE)) {
>   tok   = unlist(strsplit(basename(trmm_file), '.', fixed=TRUE))
>   ftime = strptime(paste(tok[2], tok[3]), format="%y%m%d %H",
tz="GMT")
>   init  = as.POSIXct(ftime - 0*sec_per_hr)
>   valid = as.POSIXct(ftime + 3*sec_per_hr)
>
> # Fail otherwise
> } else {
>   cat("\n\nERROR: Can\'t figure out the accumulation interval!\n\n")
>   quit(status=1)
> }
>
> # Compute the accumulation interval
> acc_sec = as.double(valid - init, units="secs")
> acc_hr  = floor(acc_sec / 3600)
> acc_str = sprintf("%.2i", acc_hr)
>
>
########################################################################
> #
> # Read the 1/4 degree binary TRMM data.
> #
>
########################################################################
>
> in_lat  = seq(in_lat_ll, in_lat_ur, in_res)
> in_lon  = seq(in_lon_ll, in_lon_ur, in_res)
> in_nlat = length(in_lat)
> in_nlon = length(in_lon)
> data    = readBin(trmm_file, "numeric", n=in_nlon*in_nlat, size = 4,
endian="big")
> in_pcp  = array(data, dim=c(in_nlon, in_nlat), dimnames=c("lon",
"lat"))
>
> # Rescale the input longitudes from -180.0 to 180
> in_lon = sapply(in_lon, rescale_lon)
>
>
########################################################################
> #
> # Select subset of data to be written.
> #
>
########################################################################
>
> out_lat  = seq(out_lat_ll, out_lat_ur, out_res)
> out_lon  = seq(out_lon_ll, out_lon_ur, out_res)
> out_nlat = length(out_lat)
> out_nlon = length(out_lon)
>
> # Rescale the output longitudes from -180.0 to 180
> out_lon = sort(sapply(out_lon, rescale_lon))
>
> # Extract the output data
> out_pcp = matrix(nrow=out_nlon, ncol=out_nlat)
> out_cnt = 0
> out_vld = 0
> out_sum = 0
>
> for(cur_out_lon in 1:out_nlon) {
>   for(cur_out_lat in 1:out_nlat) {
>
>     cur_in_lon = which(out_lon[cur_out_lon] == in_lon)
>     cur_in_lat = which(out_lat[cur_out_lat] == in_lat)
>
>     if(length(cur_in_lon) == 1 &&
>        length(cur_in_lat) == 1) {
>       out_pcp[cur_out_lon, cur_out_lat] = in_pcp[cur_in_lon,
cur_in_lat]
>       out_vld = out_vld + 1
>       out_sum = out_sum + out_pcp[cur_out_lon, cur_out_lat];
>     }
>
>     out_cnt = out_cnt + 1
>   }
> }
>
>
########################################################################
> #
> # Create the NetCDF output file.
> #
>
########################################################################
>
> # Define dimensions
> dim_lat = dim.def.ncdf("lat", "degrees_north", out_lat,
>                        create_dimvar=TRUE)
> dim_lon = dim.def.ncdf("lon", "degrees_east",  out_lon,
>                        create_dimvar=TRUE)
>
> # Define variables
> var_pcp = var.def.ncdf(paste("APCP_", acc_str, sep=''), "kg/m^2",
>                        list(dim_lon, dim_lat), missing,
>                        longname="Total precipitation",
prec="single")
>
> # Define file
> nc = create.ncdf(nc_file, var_pcp)
>
> # Add variable attributes
> att.put.ncdf(nc, var_pcp, "name", "APCP")
> att.put.ncdf(nc, var_pcp, "level", paste("A", acc_hr, sep=''))
> att.put.ncdf(nc, var_pcp, "grib_code", 61)
> att.put.ncdf(nc, var_pcp, "_FillValue", missing, prec="single")
> att.put.ncdf(nc, var_pcp, "init_time", format(init, "%Y%m%d_%H%M%S",
tz="GMT"), prec="text")
> att.put.ncdf(nc, var_pcp, "init_time_ut", as.numeric(init),
prec="int")
> att.put.ncdf(nc, var_pcp, "valid_time", format(valid,
"%Y%m%d_%H%M%S", tz="GMT"), prec="text")
> att.put.ncdf(nc, var_pcp, "valid_time_ut", as.numeric(valid),
prec="int")
> att.put.ncdf(nc, var_pcp, "accum_time", paste(acc_str, "0000",
sep=''))
> att.put.ncdf(nc, var_pcp, "accum_time_sec", acc_sec, prec="int")
>
> # Add global attributes
> cur_time = Sys.time()
> att.put.ncdf(nc, 0, "FileOrigins", paste("File", nc_file,
"generated", format(Sys.time(), "%Y%m%d_%H%M%S"),
>                                          "on host", Sys.info()[4],
"by the Rscript trmmbin2nc.R"))
> att.put.ncdf(nc, 0, "MET_version", "V4.1")
> att.put.ncdf(nc, 0, "Projection", "LatLon", prec="text")
> att.put.ncdf(nc, 0, "lat_ll", paste(min(out_lat), "degrees_north"),
prec="text")
> att.put.ncdf(nc, 0, "lon_ll", paste(min(out_lon), "degrees_east"),
prec="text")
> att.put.ncdf(nc, 0, "delta_lat", paste(out_res, "degrees"),
prec="text")
> att.put.ncdf(nc, 0, "delta_lon", paste(out_res, "degrees"),
prec="text")
> att.put.ncdf(nc, 0, "Nlat", paste(out_nlat, "grid_points"),
prec="text")
> att.put.ncdf(nc, 0, "Nlon", paste(out_nlon, "grid_points"),
prec="text")
>
> # Add pcp to the file
> put.var.ncdf(nc, var_pcp, out_pcp)
>
> # Close the file
> close.ncdf(nc)
>
> # Print summary info
>
> cat(paste("Output File:\t", nc_file, "\n", sep=""))
> cat(paste("Output Domain:\t", out_nlat, " X ", out_nlon, " from (",
>     out_lat_ll, ", ", out_lon_ll, ") to (",
>     out_lat_ur, ", ", out_lon_ur, ") by ", out_res, " deg\n",
sep=""))
> cat(paste("Output Precip:\t", out_vld, " of ", out_cnt, " points
valid, ",
>     round(out_sum, 2), " mm total, ",
>     round(out_sum/out_vld, 2), " mm avg\n", sep=""))
>
>
########################################################################
> #
> # Finished.
> #
>
########################################################################
>
> # Optionally, save all of the pcp to an .RData file
> if(save == TRUE) save.image()
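[Editor's note: the script above reads the full 1/4-degree TRMM grid before subsetting. As a quick sanity check of the array sizes involved, the grid dimensions implied by the script's native-domain constants can be recomputed in Python (illustration only; the actual conversion is done by the R script):]

```python
# Recompute the TRMM input grid dimensions used by trmmbin2nc.R
# from the script's native-domain constants (in_res, in_lat_ll, etc.).

def grid_points(lo, hi, res):
    """Number of points in the inclusive sequence lo, lo+res, ..., hi."""
    return int(round((hi - lo) / res)) + 1

in_res  = 0.25
in_nlat = grid_points(-50.00, 50.00, in_res)   # latitudes  -50.00 .. 50.00
in_nlon = grid_points(0.00, 359.75, in_res)    # longitudes   0.00 .. 359.75

print(in_nlat, in_nlon, in_nlat * in_nlon)
# -> 401 1440 577440
```

That is, 401 latitudes x 1440 longitudes = 577440 values, which matches the n=in_nlon*in_nlat 4-byte big-endian reals the script pulls in with readBin().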


------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: Mukul Tewari
Time: Fri Dec 27 11:58:03 2013

Hi John,

As you mentioned below, I used the UPP and copygb to convert the
wrf output and regrid. I also used the r_script you provided
to change the domain info.

Now when I try the grid to grid analysis using,

===============================
[mukul at yslogin1 r_script]$
/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/series_analysis -fcst
wrf -obs trmm -out tmp.nc -config
/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/data/config/SeriesAnalysisConfig_default
DEBUG 1: Reading ASCII file list: wrf
DEBUG 1: Reading ASCII file list: trmm
DEBUG 1: Default Config File:
/glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1/data/config/SeriesAnalysisConfig_default
DEBUG 1: User Config File:
/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/data/config/SeriesAnalysisConfig_default
ERROR  :
ERROR  : SeriesAnalysisConfInfo::process_config() -> the "fcst" and
"obs" settings may not be empty.
ERROR  :
===============================

I get the above error. I created the 24-hr TRMM files and WRF files,
and the paths to these files are listed in

/glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/wrf
and
/glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/trmm

Can you please help with resolving the error? Do you think it is the
problem associated with series_analysis or with the copygb step?

thanks,
Mukul


On Wed, Dec 18, 2013 at 03:30:52PM -0700, John Halley Gotway via RT
wrote:
> Mukul,
>
> You do need to run it through a post-processor.  You could use
either pinterp or the Unified PostProcessor (UPP).  We recommend UPP
since its output is in GRIB1 format, which can easily be regridded
> using the copygb utility.  If you do not need to regrid your data
later, you could use pinterp.
>
> MET will likely not read the NetCDF TRMM data you currently have.
It's expecting a rather particular format for the NetCDF file.
>
> You could try using the Rscript I've attached to this message.  It
reads TRMM binary data - look at the top of the script for a source of
that data.  Then it subsets the data based on the "output
> domain specification" listed near the top of the script and writes a
NetCDF file that MET can read.  Just change the definition of out_res,
out_lat_ll, and so on, to define the grid you want.
>
> You run this on the command line like this:
>    Rscript trmmbin2nc.R my_trmm_file.bin my_trmm_file.nc
>
> It relies on the R "ncdf" package for reading/writing NetCDF files.
>
> We'll ultimately post this script to the MET website, but it needs a
bit more work.
>
> Hope that helps get you going.
>
> Thanks,
> John
>
> On 12/18/2013 03:15 PM, Mukul Tewari via RT wrote:
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >
> > Hi John,
> >
> > I think I can use the pre-compiled version.
> >
> > I want to do grid-to-grid comparison of WRF precip
> > and TRMM precip, I have both on the same grid
> > and both are netcdf files.
> >
> > I am still going through the online tutorial. Do
> > I need to run p_interp first on the WRF output?
> >
> > Also does the MET take directly the TRMM pcp in netcdf?
> >
> > thanks,
> > Mukul
> >
> > On Mon, Dec 16, 2013 at 11:09:32PM -0700, John Halley Gotway via
RT wrote:
> >> Mine is a compilation error from the lex code in the vx_config
library.
> >> I'm compiling using intel.  But this is a new error, I compiled
that same
> >> code on yellowstone successfully a couple of months ago.
> >>
> >> I'll look more into it later this week.
> >>
> >> Thanks,
> >> John
> >>
> >>>
> >>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >>>
> >>> Thanks, John. I was also getting an error while compiling. I am
getting an
> >>> error related with netcdf error which I am not sure because I
assigned the
> >>> right path.
> >>>
> >>> thanks,
> >>> Mukul
> >>>
> >>> On Mon, Dec 16, 2013 at 04:47:54PM -0700, John Halley Gotway via
RT wrote:
> >>>> Mukul,
> >>>>
> >>>> Sorry about that.  You can use a version of it that's compiled
here:
> >>>>      /glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1
> >>>>
> >>>> I'm trying to recompile the version I sent you, but for some
reason I'm
> >>>> getting a compilation error I haven't seen in the past.
> >>>>
> >>>> Thanks,
> >>>> John
> >>>>
> >>>> On 12/16/2013 03:41 PM, Mukul Tewari via RT wrote:
> >>>>>
> >>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630
>
> >>>>>
> >>>>> Hi John,
> >>>>>
> >>>>> I don't see any built executables in bin/??
> >>>>>
> >>>>> Mukul
> >>>>>
> >>>>> On Mon, Dec 16, 2013 at 01:43:04PM -0700, John Halley Gotway
via RT
> >>>> wrote:
> >>>>>> Mukul,
> >>>>>>
> >>>>>> METv4.1 is installed on yellowstone in:
> >>>>>>       /glade/p/ral/jnt/MET/MET_releases/METv4.1
> >>>>>>
> >>>>>> You're welcome to use that version if you'd like.
> >>>>>>
> >>>>>> Thanks,
> >>>>>> John Halley Gotway
> >>>>>> met_help at ucar.edu
> >>>>>>
> >>>>>> On 12/13/2013 04:33 PM, Mukul Tewari via RT wrote:
> >>>>>>>
> >>>>>>> Fri Dec 13 16:33:09 2013: Request 64630 was acted upon.
> >>>>>>> Transaction: Ticket created by mukul at rap.ucar.edu
> >>>>>>>           Queue: met_help
> >>>>>>>         Subject: MET on yellowstone
> >>>>>>>           Owner: Nobody
> >>>>>>>      Requestors: mukul at ucar.edu
> >>>>>>>          Status: new
> >>>>>>>     Ticket <URL:
> >>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >>>>>>>
> >>>>>>>
> >>>>>>> Hi,
> >>>>>>>
> >>>>>>> I am a new user of MET, and would like to know if
> >>>>>>> it is already installed on yellowstone, so that
> >>>>>>> I can use the pre-installed version for my
> >>>>>>> purpose.
> >>>>>>>
> >>>>>>> thanks,
> >>>>>>> Mukul
> >>>>>>>
> >>>
> >>
> >>
>

> # Daily TRMM binary files:
> #
ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/Derived_Products/3B42_V7/Daily/YYYY/3B42_daily.YYYY.MM.DD.7.bin
> # 3-Hourly TRMM binary files:
> #
ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/3B42_V7/YYYYMM/3B42.YYMMDD.HHz.7.precipitation.bin
>
>
########################################################################
> #
> # Required libraries.
> #
>
########################################################################
>
> library(ncdf)
>
>
########################################################################
> #
> # Constants and command line arguments
> #
>
########################################################################
>
> hdr_len    = 5           # Each pcp file begins with 5 header lines.
> hdr_file   = "trmm.hdr"  # Temporary header file.
> missing    = -9999       # Missing pcp value to be used in MET.
> save       = FALSE       # If set to TRUE, call save.image()
> sec_per_hr = 60*60
>
> # Native domain specification
> in_res     = 0.25        # Resolution in degrees
> in_lat_ll  = -50.00      # Latitude of lower-left corner
> in_lon_ll  =   0.00      # Longitude of lower-left corner
> in_lat_ur  =  50.00      # Latitude of upper-right corner
> in_lon_ur  = 359.75      # Longitude of upper-right corner
>
> # Output domain specification
> out_res    = 0.25        # Resolution in degrees
> out_lat_ll =  -25.00     # Latitude of lower-left corner
> out_lon_ll = -150.00     # Longitude of lower-left corner
> out_lat_ur =   60.00     # Latitude of upper-right corner
> out_lon_ur =   10.00     # Longitude of upper-right corner
>
> rescale_lon <- function (x) {
>   while(x < -180) x = x + 360;
>   while(x >  180) x = x - 360;
>   return(x)
> }
>
>
########################################################################
> #
> # Handle the arguments.
> #
>
########################################################################
>
> # Retrieve the arguments
> args = commandArgs(TRUE)
>
> # Check the number of arguments
> if(length(args) < 2) {
>    cat("Usage: trmmbin2nc.R\n")
>    cat("        trmm_file\n")
>    cat("        nc_file\n")
>    cat("        [-save]\n")
>    cat("        where \"trmm_file\" is a binary TRMM file.\n")
>    cat("              \"nc_file\"   is the output NetCDF file to be written.\n")
>    cat("              \"-save\"     to call save.image().\n\n")
>    quit()
> }
>
> # Store the input file names
> trmm_file = args[1]
> nc_file   = args[2]
>
> # Parse optional arguments
> for(i in 1:length(args)) {
>    if(args[i] == "-save") {
>       save = TRUE
>    }
> }
>
>
########################################################################
> #
> # Parse accumulation interval and time from file name.
> #
>
########################################################################
>
> # Daily files contain the string "daily"
> if(grepl("daily", basename(trmm_file), ignore.case=TRUE)) {
>   tok   = unlist(strsplit(basename(trmm_file), '.', fixed=TRUE))
>   ftime = strptime(paste(tok[2], tok[3], tok[4]), format="%Y %m %d",
tz="GMT")
>   init  = as.POSIXct(ftime -  0*sec_per_hr)
>   valid = as.POSIXct(ftime + 24*sec_per_hr)
>
> # 3-hourly files contain the string "[0-9][0-9]z"
> } else if(grepl("[0-9][0-9]z", basename(trmm_file),
ignore.case=TRUE)) {
>   tok   = unlist(strsplit(basename(trmm_file), '.', fixed=TRUE))
>   ftime = strptime(paste(tok[2], tok[3]), format="%y%m%d %H",
tz="GMT")
>   init  = as.POSIXct(ftime - 0*sec_per_hr)
>   valid = as.POSIXct(ftime + 3*sec_per_hr)
>
> # Fail otherwise
> } else {
>   cat("\n\nERROR: Can\'t figure out the accumulation interval!\n\n")
>   quit(status=1)
> }
>
> # Compute the accumulation interval
> acc_sec = as.double(valid - init, units="secs")
> acc_hr  = floor(acc_sec / 3600)
> acc_str = sprintf("%.2i", acc_hr)
>
>
########################################################################
> #
> # Read the 1/4 degree binary TRMM data.
> #
>
########################################################################
>
> in_lat  = seq(in_lat_ll, in_lat_ur, in_res)
> in_lon  = seq(in_lon_ll, in_lon_ur, in_res)
> in_nlat = length(in_lat)
> in_nlon = length(in_lon)
> data    = readBin(trmm_file, "numeric", n=in_nlon*in_nlat, size = 4,
endian="big")
> in_pcp  = array(data, dim=c(in_nlon, in_nlat), dimnames=c("lon",
"lat"))
>
> # Rescale the input longitudes from -180.0 to 180
> in_lon = sapply(in_lon, rescale_lon)
>
>
########################################################################
> #
> # Select subset of data to be written.
> #
>
########################################################################
>
> out_lat  = seq(out_lat_ll, out_lat_ur, out_res)
> out_lon  = seq(out_lon_ll, out_lon_ur, out_res)
> out_nlat = length(out_lat)
> out_nlon = length(out_lon)
>
> # Rescale the output longitudes from -180.0 to 180
> out_lon = sort(sapply(out_lon, rescale_lon))
>
> # Extract the output data
> out_pcp = matrix(nrow=out_nlon, ncol=out_nlat)
> out_cnt = 0
> out_vld = 0
> out_sum = 0
>
> for(cur_out_lon in 1:out_nlon) {
>   for(cur_out_lat in 1:out_nlat) {
>
>     cur_in_lon = which(out_lon[cur_out_lon] == in_lon)
>     cur_in_lat = which(out_lat[cur_out_lat] == in_lat)
>
>     if(length(cur_in_lon) == 1 &&
>        length(cur_in_lat) == 1) {
>       out_pcp[cur_out_lon, cur_out_lat] = in_pcp[cur_in_lon,
cur_in_lat]
>       out_vld = out_vld + 1
>       out_sum = out_sum + out_pcp[cur_out_lon, cur_out_lat];
>     }
>
>     out_cnt = out_cnt + 1
>   }
> }
>
>
########################################################################
> #
> # Create the NetCDF output file.
> #
>
########################################################################
>
> # Define dimensions
> dim_lat = dim.def.ncdf("lat", "degrees_north", out_lat,
>                        create_dimvar=TRUE)
> dim_lon = dim.def.ncdf("lon", "degrees_east",  out_lon,
>                        create_dimvar=TRUE)
>
> # Define variables
> var_pcp = var.def.ncdf(paste("APCP_", acc_str, sep=''), "kg/m^2",
>                        list(dim_lon, dim_lat), missing,
>                        longname="Total precipitation",
prec="single")
>
> # Define file
> nc = create.ncdf(nc_file, var_pcp)
>
> # Add variable attributes
> att.put.ncdf(nc, var_pcp, "name", "APCP")
> att.put.ncdf(nc, var_pcp, "level", paste("A", acc_hr, sep=''))
> att.put.ncdf(nc, var_pcp, "grib_code", 61)
> att.put.ncdf(nc, var_pcp, "_FillValue", missing, prec="single")
> att.put.ncdf(nc, var_pcp, "init_time", format(init, "%Y%m%d_%H%M%S",
tz="GMT"), prec="text")
> att.put.ncdf(nc, var_pcp, "init_time_ut", as.numeric(init),
prec="int")
> att.put.ncdf(nc, var_pcp, "valid_time", format(valid,
"%Y%m%d_%H%M%S", tz="GMT"), prec="text")
> att.put.ncdf(nc, var_pcp, "valid_time_ut", as.numeric(valid),
prec="int")
> att.put.ncdf(nc, var_pcp, "accum_time", paste(acc_str, "0000",
sep=''))
> att.put.ncdf(nc, var_pcp, "accum_time_sec", acc_sec, prec="int")
>
> # Add global attributes
> cur_time = Sys.time()
> att.put.ncdf(nc, 0, "FileOrigins", paste("File", nc_file,
"generated", format(Sys.time(), "%Y%m%d_%H%M%S"),
>                                          "on host", Sys.info()[4],
"by the Rscript trmmbin2nc.R"))
> att.put.ncdf(nc, 0, "MET_version", "V4.1")
> att.put.ncdf(nc, 0, "Projection", "LatLon", prec="text")
> att.put.ncdf(nc, 0, "lat_ll", paste(min(out_lat), "degrees_north"),
prec="text")
> att.put.ncdf(nc, 0, "lon_ll", paste(min(out_lon), "degrees_east"),
prec="text")
> att.put.ncdf(nc, 0, "delta_lat", paste(out_res, "degrees"),
prec="text")
> att.put.ncdf(nc, 0, "delta_lon", paste(out_res, "degrees"),
prec="text")
> att.put.ncdf(nc, 0, "Nlat", paste(out_nlat, "grid_points"),
prec="text")
> att.put.ncdf(nc, 0, "Nlon", paste(out_nlon, "grid_points"),
prec="text")
>
> # Add pcp to the file
> put.var.ncdf(nc, var_pcp, out_pcp)
>
> # Close the file
> close.ncdf(nc)
>
> # Print summary info
>
> cat(paste("Output File:\t", nc_file, "\n", sep=""))
> cat(paste("Output Domain:\t", out_nlat, " X ", out_nlon, " from (",
>     out_lat_ll, ", ", out_lon_ll, ") to (",
>     out_lat_ur, ", ", out_lon_ur, ") by ", out_res, " deg\n",
sep=""))
> cat(paste("Output Precip:\t", out_vld, " of ", out_cnt, " points
valid, ",
>     round(out_sum, 2), " mm total, ",
>     round(out_sum/out_vld, 2), " mm avg\n", sep=""))
>
>
########################################################################
> #
> # Finished.
> #
>
########################################################################
>
> # Optionally, save all of the pcp to an .RData file
> if(save == TRUE) save.image()


------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: John Halley Gotway
Time: Mon Jan 06 15:14:15 2014

Mukul,

I was able to run with your data on yellowstone today and produce the
error message you're seeing.  Basically, this is a problem in the
config file - you need to tell the tool what field you'd like to
extract from each input gridded file to define the series.

I took a look at the observation file list:
/glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/trmm
That list contains 3 NetCDF files, each of which contains a 24-hour
accumulation of precipitation:

         float APCP_24(lat, lon) ;
                 APCP_24:units = "kg/m^2" ;
                 APCP_24:missing_value = -9999.f ;
                 APCP_24:long_name = "Total precipitation" ;
                 APCP_24:name = "APCP" ;
                 APCP_24:level = "A24" ;
                 ...
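[Editor's note: given that APCP_24 variable, non-empty "fcst" and "obs" entries in the configuration file would look roughly like the following. This is a sketch following METv4.1 config-file conventions for NetCDF input; see the comments in SeriesAnalysisConfig_default for the exact supported syntax:]

```
fcst = {
   field = [ { name = "APCP_24"; level = [ "(*,*)" ]; } ];
};
obs = {
   field = [ { name = "APCP_24"; level = [ "(*,*)" ]; } ];
};
```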

So I assume you're trying to define a series of 24-hourly accumulated
precip.  But looking in your forecast GRIB files, I see the following:
    [johnhg at yslogin2 tewari_data_20130106]$ wgrib
/glade/p/work/mukul/gfdl-vortextracker/work3/tmp.grib | grep APCP
    329:19953278:d=04063012:APCP:kpds5=61:kpds6=1:kpds7=0:TR=4:P1=0:P2=0:TimeU=1:sfc:0-
0hr acc:NAve=0
    [johnhg at yslogin2 tewari_data_20130106]$ wgrib
/glade/p/work/mukul/gfdl-vortextracker/work3/tmp1.grib | grep APCP
    329:20387954:d=04063012:APCP:kpds5=61:kpds6=1:kpds7=0:TR=4:P1=0:P2=24:TimeU=1:sfc:0-
24hr acc:NAve=0
    [johnhg at yslogin2 tewari_data_20130106]$ wgrib
/glade/p/work/mukul/gfdl-vortextracker/work3/tmp2.grib | grep APCP
    329:20469514:d=04063012:APCP:kpds5=61:kpds6=1:kpds7=0:TR=4:P1=0:P2=48:TimeU=1:sfc:0-
48hr acc:NAve=0

You have a runtime accumulation of precip.  The first file has 0-hr
precip, the second has 24-hour precip, and the third has 48-hour
precip.  So you really only have two 24-hour accumulations: 0-24
hours and 24-48 hours.  You'll first need to run the pcp_combine tool
to compute 24-hour accumulations for your forecast data.  So I ran the
following commands:

/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/pcp_combine \
-subtract /glade/p/work/mukul/gfdl-vortextracker/work3/tmp1.grib 24 \
           /glade/p/work/mukul/gfdl-vortextracker/work3/tmp.grib 0 \
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/tmp1_APCP_24.nc

/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/pcp_combine \
-subtract /glade/p/work/mukul/gfdl-vortextracker/work3/tmp2.grib 48 \
           /glade/p/work/mukul/gfdl-vortextracker/work3/tmp1.grib 24 \
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/tmp2_APCP_24.nc

And I created new versions of the forecast file list, observation file
list, and configuration file...


/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/series_analysis \
    -fcst /glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/wrf \
    -obs /glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/trmm \
    -config
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/SeriesAnalysisConfig
\
    -out tmp.nc

However, the next problem is that the valid times for your sample data
files don't line up.

For the forecast files (tmp1_APCP_24.nc and tmp2_APCP_24.nc) the valid
times are 20040701_120000 and 20040702_120000.  But for the
observation files (myfile1.nc and myfile2.nc) the valid times are
20040703_000000 and 20040704_000000.

To get going, you'll need to get forecast and observation files whose
valid times match up.  Looks like you're just off by a couple of days.

Hope that helps get you going.  And sorry for the delay in getting
back to you.

Thanks,
John Halley Gotway
met_help at ucar.edu

On 12/27/2013 11:58 AM, Mukul Tewari via RT wrote:
>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>
> Hi John,
>
> As you mentioned below, I used the UPP and copygb to convert the
> wrf output and regrid. I also used the r_script you provided
> to change the domain info.
>
> Now when I try the grid to grid analysis using,
>
> ===============================
> [mukul at yslogin1 r_script]$
> /glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/series_analysis
-fcst
> wrf -obs trmm -out tmp.nc -config
/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/data/config/SeriesAnalysisConfig_default
> DEBUG 1: Reading ASCII file list: wrf
> DEBUG 1: Reading ASCII file list: trmm
> DEBUG 1: Default Config File:
/glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1/data/config/SeriesAnalysisConfig_default
> DEBUG 1: User Config File:
/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/data/config/SeriesAnalysisConfig_default
> ERROR  :
> ERROR  : SeriesAnalysisConfInfo::process_config() -> the "fcst" and
> "obs" settings may not be empty.
> ERROR  :
> ===============================
>
> I get the above error. I created the 24-hr TRMM files and WRF files,
> and the paths to these files are listed in
>
> /glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/wrf
> and
> /glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/trmm
>
>
> Can you please help with resolving the error? Do you think it is the
> problem associated with series_analysis or with the copygb step?
>
> thanks,
> Mukul
>
>
> On Wed, Dec 18, 2013 at 03:30:52PM -0700, John Halley Gotway via RT
wrote:
>> Mukul,
>>
>> You do need to run it through a post-processor.  You could use
either pinterp or the Unified PostProcessor (UPP).  We recommend UPP
since its output is in GRIB1 format, which can easily be regridded
>> using the copygb utility.  If you do not need to regrid your data
later, you could use pinterp.
>>
>> MET will likely not read the NetCDF TRMM data you currently have.
It's expecting a rather particular format for the NetCDF file.
>>
>> You could try using the Rscript I've attached to this message.  It
reads TRMM binary data - look at the top of the script for a source of
that data.  Then it subsets the data based on the "output
>> domain specification" listed near the top of the script and writes
a NetCDF file that MET can read.  Just change the definition of
out_res, out_lat_ll, and so on, to define the grid you want.
>>
>> You run this on the command line like this:
>>     Rscript trmmbin2nc.R my_trmm_file.bin my_trmm_file.nc
>>
>> It relies on the R "ncdf" package for reading/writing NetCDF files.
>>
>> We'll ultimately post this script to the MET website, but it needs
a bit more work.
>>
>> Hope that helps get you going.
>>
>> Thanks,
>> John
>>
>> On 12/18/2013 03:15 PM, Mukul Tewari via RT wrote:
>>>
>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>>>
>>> Hi John,
>>>
>>> I think I can use the pre-compiled version.
>>>
>>> I want to do grid-to-grid comparison of WRF precip
>>> and TRMM precip, I have both on the same grid
>>> and both are netcdf files.
>>>
>>> I am still going through the online tutorial. Do
>>> I need to run p_interp first on the WRF output?
>>>
>>> Also does the MET take directly the TRMM pcp in netcdf?
>>>
>>> thanks,
>>> Mukul
>>>
>>> On Mon, Dec 16, 2013 at 11:09:32PM -0700, John Halley Gotway via
RT wrote:
>>>> Mine is a compilation error from the lex code in the vx_config
library.
>>>> I'm compiling using intel.  But this is a new error, I compiled
that same
>>>> code on yellowstone successfully a couple of months ago.
>>>>
>>>> I'll look more into it later this week.
>>>>
>>>> Thanks,
>>>> John
>>>>
>>>>>
>>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>>>>>
>>>>> Thanks, John. I was also getting an error while compiling. I am
getting an
>>>>> error related with netcdf error which I am not sure because I
assigned the
>>>>> right path.
>>>>>
>>>>> thanks,
>>>>> Mukul
>>>>>
>>>>> On Mon, Dec 16, 2013 at 04:47:54PM -0700, John Halley Gotway via
RT wrote:
>>>>>> Mukul,
>>>>>>
>>>>>> Sorry about that.  You can use a version of it that's compiled
here:
>>>>>>       /glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1
>>>>>>
>>>>>> I'm trying to recompile the version I sent you, but for some
reason I'm
>>>>>> getting a compilation error I haven't seen in the past.
>>>>>>
>>>>>> Thanks,
>>>>>> John
>>>>>>
>>>>>> On 12/16/2013 03:41 PM, Mukul Tewari via RT wrote:
>>>>>>>
>>>>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630
>
>>>>>>>
>>>>>>> Hi John,
>>>>>>>
>>>>>>> I don't see any built executables in bin/??
>>>>>>>
>>>>>>> Mukul
>>>>>>>
>>>>>>> On Mon, Dec 16, 2013 at 01:43:04PM -0700, John Halley Gotway
via RT
>>>>>> wrote:
>>>>>>>> Mukul,
>>>>>>>>
>>>>>>>> METv4.1 is installed on yellowstone in:
>>>>>>>>        /glade/p/ral/jnt/MET/MET_releases/METv4.1
>>>>>>>>
>>>>>>>> You're welcome to use that version if you'd like.
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> John Halley Gotway
>>>>>>>> met_help at ucar.edu
>>>>>>>>
>>>>>>>> On 12/13/2013 04:33 PM, Mukul Tewari via RT wrote:
>>>>>>>>>
>>>>>>>>> Fri Dec 13 16:33:09 2013: Request 64630 was acted upon.
>>>>>>>>> Transaction: Ticket created by mukul at rap.ucar.edu
>>>>>>>>>            Queue: met_help
>>>>>>>>>          Subject: MET on yellowstone
>>>>>>>>>            Owner: Nobody
>>>>>>>>>       Requestors: mukul at ucar.edu
>>>>>>>>>           Status: new
>>>>>>>>>      Ticket <URL:
>>>>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> I am a new user of MET, and would like to know if
>>>>>>>>> it is already installed on yellowstone, so that
>>>>>>>>> I can use the pre-installed version for my
>>>>>>>>> purpose.
>>>>>>>>>
>>>>>>>>> thanks,
>>>>>>>>> Mukul
>>>>>>>>>
>>>>>
>>>>
>>>>
>>
>
>> # Daily TRMM binary files:
>> #
ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/Derived_Products/3B42_V7/Daily/YYYY/3B42_daily.YYYY.MM.DD.7.bin
>> # 3-Hourly TRMM binary files:
>> #
ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/3B42_V7/YYYYMM/3B42.YYMMDD.HHz.7.precipitation.bin
>>
>>
########################################################################
>> #
>> # Required libraries.
>> #
>>
########################################################################
>>
>> library(ncdf)
>>
>>
########################################################################
>> #
>> # Constants and command line arguments
>> #
>>
########################################################################
>>
>> hdr_len    = 5           # Each pcp file begins with 5 header
lines.
>> hdr_file   = "trmm.hdr"  # Temporary header file.
>> missing    = -9999       # Missing pcp value to be used in MET.
>> save       = FALSE       # If set to TRUE, call save.image()
>> sec_per_hr = 60*60
>>
>> # Native domain specification
>> in_res     = 0.25        # Resolution in degrees
>> in_lat_ll  = -50.00      # Latitude of lower-left corner
>> in_lon_ll  =   0.00      # Longitude of lower-left corner
>> in_lat_ur  =  50.00      # Latitude of upper-right corner
>> in_lon_ur  = 359.75      # Longitude of upper-right corner
>>
>> # Output domain specification
>> out_res    = 0.25        # Resolution in degrees
>> out_lat_ll =  -25.00     # Latitude of lower-left corner
>> out_lon_ll = -150.00     # Longitude of lower-left corner
>> out_lat_ur =   60.00     # Latitude of upper-right corner
>> out_lon_ur =   10.00     # Longitude of upper-right corner
>>
>> rescale_lon <- function (x) {
>>    while(x < -180) x = x + 360;
>>    while(x >  180) x = x - 360;
>>    return(x)
>> }
>>
>>
########################################################################
>> #
>> # Handle the arguments.
>> #
>>
########################################################################
>>
>> # Retrieve the arguments
>> args = commandArgs(TRUE)
>>
>> # Check the number of arguments
>> if(length(args) < 2) {
>>     cat("Usage: trmmbin2nc.R\n")
>>     cat("        trmm_file\n")
>>     cat("        nc_file\n")
>>     cat("        [-save]\n")
>>     cat("        where \"trmm_file\" is a binary TRMM file.\n")
>>     cat("              \"nc_file\"   is the output NetCDF file to be written.\n")
>>     cat("              \"-save\"     to call save.image().\n\n")
>>     quit()
>> }
>>
>> # Store the input file names
>> trmm_file = args[1]
>> nc_file   = args[2]
>>
>> # Parse optional arguments
>> for(i in 1:length(args)) {
>>     if(args[i] == "-save") {
>>        save = TRUE
>>     }
>> }
>>
>>
########################################################################
>> #
>> # Parse accumulation interval and time from file name.
>> #
>>
########################################################################
>>
>> # Daily files contain the string "daily"
>> if(grepl("daily", basename(trmm_file), ignore.case=TRUE)) {
>>    tok   = unlist(strsplit(basename(trmm_file), '.', fixed=TRUE))
>>    ftime = strptime(paste(tok[2], tok[3], tok[4]), format="%Y %m
%d", tz="GMT")
>>    init  = as.POSIXct(ftime -  0*sec_per_hr)
>>    valid = as.POSIXct(ftime + 24*sec_per_hr)
>>
>> # 3-hourly files contain the string "[0-9][0-9]z"
>> } else if(grepl("[0-9][0-9]z", basename(trmm_file),
ignore.case=TRUE)) {
>>    tok   = unlist(strsplit(basename(trmm_file), '.', fixed=TRUE))
>>    ftime = strptime(paste(tok[2], tok[3]), format="%y%m%d %H",
tz="GMT")
>>    init  = as.POSIXct(ftime - 0*sec_per_hr)
>>    valid = as.POSIXct(ftime + 3*sec_per_hr)
>>
>> # Fail otherwise
>> } else {
>>    cat("\n\nERROR: Can\'t figure out the accumulation
interval!\n\n")
>>    quit(status=1)
>> }
>>
>> # Compute the accumulation interval
>> acc_sec = as.double(valid - init, units="secs")
>> acc_hr  = floor(acc_sec / 3600)
>> acc_str = sprintf("%.2i", acc_hr)
>>
>>
########################################################################
>> #
>> # Read the 1/4 degree binary TRMM data.
>> #
>>
########################################################################
>>
>> in_lat  = seq(in_lat_ll, in_lat_ur, in_res)
>> in_lon  = seq(in_lon_ll, in_lon_ur, in_res)
>> in_nlat = length(in_lat)
>> in_nlon = length(in_lon)
>> data    = readBin(trmm_file, "numeric", n=in_nlon*in_nlat, size =
4, endian="big")
>> in_pcp  = array(data, dim=c(in_nlon, in_nlat), dimnames=c("lon",
"lat"))
>>
>> # Rescale the input longitudes to the range -180 to 180
>> in_lon = sapply(in_lon, rescale_lon)
>>
>>
########################################################################
>> #
>> # Select subset of data to be written.
>> #
>>
########################################################################
>>
>> out_lat  = seq(out_lat_ll, out_lat_ur, out_res)
>> out_lon  = seq(out_lon_ll, out_lon_ur, out_res)
>> out_nlat = length(out_lat)
>> out_nlon = length(out_lon)
>>
>> # Rescale the output longitudes to the range -180 to 180
>> out_lon = sort(sapply(out_lon, rescale_lon))
>>
>> # Extract the output data
>> out_pcp = matrix(nrow=out_nlon, ncol=out_nlat)
>> out_cnt = 0
>> out_vld = 0
>> out_sum = 0
>>
>> for(cur_out_lon in 1:out_nlon) {
>>    for(cur_out_lat in 1:out_nlat) {
>>
>>      cur_in_lon = which(out_lon[cur_out_lon] == in_lon)
>>      cur_in_lat = which(out_lat[cur_out_lat] == in_lat)
>>
>>      if(length(cur_in_lon) == 1 &&
>>         length(cur_in_lat) == 1) {
>>        out_pcp[cur_out_lon, cur_out_lat] = in_pcp[cur_in_lon,
cur_in_lat]
>>        out_vld = out_vld + 1
>>        out_sum = out_sum + out_pcp[cur_out_lon, cur_out_lat];
>>      }
>>
>>      out_cnt = out_cnt + 1
>>    }
>> }
>>
>>
########################################################################
>> #
>> # Create the NetCDF output file.
>> #
>>
########################################################################
>>
>> # Define dimensions
>> dim_lat = dim.def.ncdf("lat", "degrees_north", out_lat,
>>                         create_dimvar=TRUE)
>> dim_lon = dim.def.ncdf("lon", "degrees_east",  out_lon,
>>                         create_dimvar=TRUE)
>>
>> # Define variables
>> var_pcp = var.def.ncdf(paste("APCP_", acc_str, sep=''), "kg/m^2",
>>                         list(dim_lon, dim_lat), missing,
>>                         longname="Total precipitation",
prec="single")
>>
>> # Define file
>> nc = create.ncdf(nc_file, var_pcp)
>>
>> # Add variable attributes
>> att.put.ncdf(nc, var_pcp, "name", "APCP")
>> att.put.ncdf(nc, var_pcp, "level", paste("A", acc_hr, sep=''))
>> att.put.ncdf(nc, var_pcp, "grib_code", 61)
>> att.put.ncdf(nc, var_pcp, "_FillValue", missing, prec="single")
>> att.put.ncdf(nc, var_pcp, "init_time", format(init,
"%Y%m%d_%H%M%S", tz="GMT"), prec="text")
>> att.put.ncdf(nc, var_pcp, "init_time_ut", as.numeric(init),
prec="int")
>> att.put.ncdf(nc, var_pcp, "valid_time", format(valid,
"%Y%m%d_%H%M%S", tz="GMT"), prec="text")
>> att.put.ncdf(nc, var_pcp, "valid_time_ut", as.numeric(valid),
prec="int")
>> att.put.ncdf(nc, var_pcp, "accum_time", paste(acc_str, "0000",
sep=''))
>> att.put.ncdf(nc, var_pcp, "accum_time_sec", acc_sec, prec="int")
>>
>> # Add global attributes
>> cur_time = Sys.time()
>> att.put.ncdf(nc, 0, "FileOrigins", paste("File", nc_file,
"generated", format(Sys.time(), "%Y%m%d_%H%M%S"),
>>                                           "on host", Sys.info()[4],
"by the Rscript trmmbin2nc.R"))
>> att.put.ncdf(nc, 0, "MET_version", "V4.1")
>> att.put.ncdf(nc, 0, "Projection", "LatLon", prec="text")
>> att.put.ncdf(nc, 0, "lat_ll", paste(min(out_lat), "degrees_north"),
prec="text")
>> att.put.ncdf(nc, 0, "lon_ll", paste(min(out_lon), "degrees_east"),
prec="text")
>> att.put.ncdf(nc, 0, "delta_lat", paste(out_res, "degrees"),
prec="text")
>> att.put.ncdf(nc, 0, "delta_lon", paste(out_res, "degrees"),
prec="text")
>> att.put.ncdf(nc, 0, "Nlat", paste(out_nlat, "grid_points"),
prec="text")
>> att.put.ncdf(nc, 0, "Nlon", paste(out_nlon, "grid_points"),
prec="text")
>>
>> # Add pcp to the file
>> put.var.ncdf(nc, var_pcp, out_pcp)
>>
>> # Close the file
>> close.ncdf(nc)
>>
>> # Print summary info
>>
>> cat(paste("Output File:\t", nc_file, "\n", sep=""))
>> cat(paste("Output Domain:\t", out_nlat, " X ", out_nlon, " from (",
>>      out_lat_ll, ", ", out_lon_ll, ") to (",
>>      out_lat_ur, ", ", out_lon_ur, ") by ", out_res, " deg\n",
sep=""))
>> cat(paste("Output Precip:\t", out_vld, " of ", out_cnt, " points
valid, ",
>>      round(out_sum, 2), " mm total, ",
>>      round(out_sum/out_vld, 2), " mm avg\n", sep=""))
>>
>>
########################################################################
>> #
>> # Finished.
>> #
>>
########################################################################
>>
>> # Optionally, save all of the pcp to an .RData file
>> if(save == TRUE) save.image()
>

------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: Mukul Tewari
Time: Tue Jan 07 14:49:56 2014

Thanks, John. Looks like it is working. I got the output file tmp.nc. I
might
have some questions which I will ask you later once I get more
familiar.

Again, thanks so much for your help.

Mukul

On Mon, Jan 06, 2014 at 03:14:16PM -0700, John Halley Gotway via RT
wrote:
> Mukul,
>
> I was able to run with your data on yellowstone today and produce
the error message you're seeing.  Basically, this is a problem in the
config file - you need to tell the tool what field you'd like to
> extract from each input gridded file to define the series.
>
> I took a look at the observation file list:
/glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/trmm
> That list contains 3 NetCDF files, each of which contains a 24-hour
accumulation of precipitation:
>
>          float APCP_24(lat, lon) ;
>                  APCP_24:units = "kg/m^2" ;
>                  APCP_24:missing_value = -9999.f ;
>                  APCP_24:long_name = "Total precipitation" ;
>                  APCP_24:name = "APCP" ;
>                  APCP_24:level = "A24" ;
>                  ...
>
> So I assume you're trying to define a series of 24-hourly
accumulated precip.  But looking in your forecast GRIB files, I see
the following:
>     [johnhg at yslogin2 tewari_data_20130106]$ wgrib
/glade/p/work/mukul/gfdl-vortextracker/work3/tmp.grib | grep APCP
>
329:19953278:d=04063012:APCP:kpds5=61:kpds6=1:kpds7=0:TR=4:P1=0:P2=0:TimeU=1:sfc:0-
0hr acc:NAve=0
>     [johnhg at yslogin2 tewari_data_20130106]$ wgrib
/glade/p/work/mukul/gfdl-vortextracker/work3/tmp1.grib | grep APCP
>
329:20387954:d=04063012:APCP:kpds5=61:kpds6=1:kpds7=0:TR=4:P1=0:P2=24:TimeU=1:sfc:0-
24hr acc:NAve=0
>     [johnhg at yslogin2 tewari_data_20130106]$ wgrib
/glade/p/work/mukul/gfdl-vortextracker/work3/tmp2.grib | grep APCP
>
329:20469514:d=04063012:APCP:kpds5=61:kpds6=1:kpds7=0:TR=4:P1=0:P2=48:TimeU=1:sfc:0-
48hr acc:NAve=0
>
> You have a runtime accumulation of precip.  The first file has 0-hr
precip, the second has 24-hour precip, and the third has 48-hour
precip.  So you really only have two 24-hour accumulations: 0-24
> hours and 24-48 hours.  You'll first need to run the pcp_combine
tool to compute 24-hour accumulations for your forecast data.  So I
ran the following commands:
>
> /glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/pcp_combine \
> -subtract /glade/p/work/mukul/gfdl-vortextracker/work3/tmp1.grib 24
\
>            /glade/p/work/mukul/gfdl-vortextracker/work3/tmp.grib 0 \
>
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/tmp1_APCP_24.nc
>
> /glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/pcp_combine \
> -subtract /glade/p/work/mukul/gfdl-vortextracker/work3/tmp2.grib 48
\
>            /glade/p/work/mukul/gfdl-vortextracker/work3/tmp1.grib 24
\
>
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/tmp2_APCP_24.nc
>
> And I created new versions of the forecast file list, observation
file list, and configuration file...
>
>
> /glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/series_analysis \
>     -fcst /glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/wrf
\
>     -obs /glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/trmm
\
>     -config
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/SeriesAnalysisConfig
\
>     -out tmp.nc
>
> However, the next problem is that the valid times for your sample
data files don't line up.
>
> For the forecast files (tmp1_APCP_24.nc and tmp2_APCP_24.nc) the
valid times are 20040701_120000 and 20040702_120000.  But for the
observation files (myfile1.nc and myfile2.nc) the valid times are
> 20040703_000000 and 20040704_000000.
>
> To get going, you'll need to get forecast and observation files
whose valid times match up.  Looks like you're just off by a couple of
days.
>
> Hope that helps get you going.  And sorry for the delay in getting
back to you.
>
> Thanks,
> John Halley Gotway
> met_help at ucar.edu
>
> On 12/27/2013 11:58 AM, Mukul Tewari via RT wrote:
> >
> > <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >
> > Hi John,
> >
> > As you mentioned below, I used the UPP and copygb to convert the
> > wrf output and regrid. I also used the r_script you provided
> > to change the domain info.
> >
> > Now when I try the grid to grid analysis using,
> >
> > ===============================
> > [mukul at yslogin1 r_script]$
> > /glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/series_analysis
-fcst
> > wrf -obs trmm -out tmp.nc -config
/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/data/config/SeriesAnalysisConfig_default
> > DEBUG 1: Reading ASCII file list: wrf
> > DEBUG 1: Reading ASCII file list: trmm
> > DEBUG 1: Default Config File:
/glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1/data/config/SeriesAnalysisConfig_default
> > DEBUG 1: User Config File:
/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/data/config/SeriesAnalysisConfig_default
> > ERROR  :
> > ERROR  : SeriesAnalysisConfInfo::process_config() -> the "fcst"
and
> > "obs" settings may not be empty.
> > ERROR  :
> > ===============================
> >
> > I get the above error. I created the 24hr TRMM files and WRF files
and the path for
> > these files are in
> >
> > /glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/wrf
> > and
> > /glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/trmm
> >
> > files.
> >
> > Can you please help with resolving the error? Do you think it is
the
> > problem associated with series_analysis or with the copygb step?
> >
> > thanks,
> > Mukul
> >
> >
> > On Wed, Dec 18, 2013 at 03:30:52PM -0700, John Halley Gotway via
RT wrote:
> >> Mukul,
> >>
> >> You do need to run it through a post-processor.  You could use
either pinterp or the Unified PostProcessor (UPP).  We recommend UPP
since its output is in GRIB1 format, which can easily be regridded
> >> using the copygb utility.  If you do not need to regrid your data
later, you could use pinterp.
> >>
> >> MET will likely not read the NetCDF TRMM data you currently have.
It's expecting a rather particular format for the NetCDF file.
> >>
> >> You could try using the Rscript I've attached to this message.
It reads TRMM binary data - look at the top of the script for a source
of that data.  Then it subsets the data based on the "output
> >> domain specification" listed near the top of the script and
writes a NetCDF file that MET can read.  Just change the definition of
out_res, out_lat_ll, and so on, to define the grid you want.
> >>
> >> You run this on the command line like this:
> >>     Rscript trmmbin2nc.R my_trmm_file.bin my_trmm_file.nc
> >>
> >> It relies on the R "ncdf" package for reading/writing NetCDF
files.
> >>
> >> We'll ultimately post this script to the MET website, but it
needs a bit more work.
> >>
> >> Hope that helps get you going.
> >>
> >> Thanks,
> >> John
> >>
> >> On 12/18/2013 03:15 PM, Mukul Tewari via RT wrote:
> >>>
> >>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >>>
> >>> Hi John,
> >>>
> >>> I think I can use the pre-compiled version.
> >>>
> >>> I want to do grid-to-grid comparison of WRF precip
> >>> and TRMM precip, I have both on the same grid
> >>> and both are netcdf files.
> >>>
> >>> I am still going through the online tutorial. Do
> >>> I need to run p_interp first on the WRF output?
> >>>
> >>> Also does the MET take directly the TRMM pcp in netcdf?
> >>>
> >>> thanks,
> >>> Mukul
> >>>
> >>> On Mon, Dec 16, 2013 at 11:09:32PM -0700, John Halley Gotway via
RT wrote:
> >>>> Mine is a compilation error from the lex code in the vx_config
library.
> >>>> I'm compiling using intel.  But this is a new error, I compiled
that same
> >>>> code on yellowstone successfully a couple of months ago.
> >>>>
> >>>> I'll look more into it later this week.
> >>>>
> >>>> Thanks,
> >>>> John
> >>>>
> >>>>>
> >>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630
>
> >>>>>
> >>>>> Thanks, John. I was also getting an error while compiling. I
am getting an
> >>>>> error related to NetCDF, which I am not sure about because I
assigned the
> >>>>> right path.
> >>>>>
> >>>>> thanks,
> >>>>> Mukul
> >>>>>
> >>>>> On Mon, Dec 16, 2013 at 04:47:54PM -0700, John Halley Gotway
via RT wrote:
> >>>>>> Mukul,
> >>>>>>
> >>>>>> Sorry about that.  You can use a version of it that's
compiled here:
> >>>>>>       /glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1
> >>>>>>
> >>>>>> I'm trying to recompile the version I sent you, but for some
reason I'm
> >>>>>> getting a compilation error I haven't seen in the past.
> >>>>>>
> >>>>>> Thanks,
> >>>>>> John
> >>>>>>
> >>>>>> On 12/16/2013 03:41 PM, Mukul Tewari via RT wrote:
> >>>>>>>
> >>>>>>> <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >>>>>>>
> >>>>>>> Hi John,
> >>>>>>>
> >>>>>>> I don't see any built executables in bin/??
> >>>>>>>
> >>>>>>> Mukul
> >>>>>>>
> >>>>>>> On Mon, Dec 16, 2013 at 01:43:04PM -0700, John Halley Gotway
via RT
> >>>>>> wrote:
> >>>>>>>> Mukul,
> >>>>>>>>
> >>>>>>>> METv4.1 is installed on yellowstone in:
> >>>>>>>>        /glade/p/ral/jnt/MET/MET_releases/METv4.1
> >>>>>>>>
> >>>>>>>> You're welcome to use that version if you'd like.
> >>>>>>>>
> >>>>>>>> Thanks,
> >>>>>>>> John Halley Gotway
> >>>>>>>> met_help at ucar.edu
> >>>>>>>>
> >>>>>>>> On 12/13/2013 04:33 PM, Mukul Tewari via RT wrote:
> >>>>>>>>>
> >>>>>>>>> Fri Dec 13 16:33:09 2013: Request 64630 was acted upon.
> >>>>>>>>> Transaction: Ticket created by mukul at rap.ucar.edu
> >>>>>>>>>            Queue: met_help
> >>>>>>>>>          Subject: MET on yellowstone
> >>>>>>>>>            Owner: Nobody
> >>>>>>>>>       Requestors: mukul at ucar.edu
> >>>>>>>>>           Status: new
> >>>>>>>>>      Ticket <URL:
> >>>>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>> Hi,
> >>>>>>>>>
> >>>>>>>>> I am a new user of MET, and would like to know if
> >>>>>>>>> it is already installed on yellowstone, so that
> >>>>>>>>> I can use the pre-installed version for my
> >>>>>>>>> purpose.
> >>>>>>>>>
> >>>>>>>>> thanks,
> >>>>>>>>> Mukul
> >>>>>>>>>
> >>>>>
> >>>>
> >>>>
> >>
> >
> >> [trmmbin2nc.R snipped; quoted in full earlier in this thread]

------------------------------------------------
Subject: Re: [rt.rap.ucar.edu #64630] MET on yellowstone
From: John Halley Gotway
Time: Tue Jan 07 15:28:57 2014

Mukul,

OK, great.  I'll resolve this ticket, but if you have more questions,
just write a new question to met_help at ucar.edu.

Thanks,
John

On 01/07/2014 02:49 PM, Mukul Tewari via RT wrote:
>
> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>
> Thanks John, Looks like it is working. I got the output file tmp.nc.
I might
> have some questions which I will ask you later once I get more
familiar.
>
> Again, thanks so much for your help.
>
> Mukul
>
> On Mon, Jan 06, 2014 at 03:14:16PM -0700, John Halley Gotway via RT
wrote:
>> Mukul,
>>
>> I was able to run with your data on yellowstone today and produce
the error message you're seeing.  Basically, this is a problem in the
config file - you need to tell the tool what field you'd like to
>> extract from each input gridded file to define the series.
>>
>> I took at look at the observation file list:
/glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/trmm
>> That list contains 3 NetCDF files, each of which contains a 24-hour
accumulation of precipitation:
>>
>>           float APCP_24(lat, lon) ;
>>                   APCP_24:units = "kg/m^2" ;
>>                   APCP_24:missing_value = -9999.f ;
>>                   APCP_24:long_name = "Total precipitation" ;
>>                   APCP_24:name = "APCP" ;
>>                   APCP_24:level = "A24" ;
>>                   ...
>>
>> So I assume you're trying to define a series of 24-hourly
accumulated precip.  But looking in your forecast GRIB files, I see
the following:
>>      [johnhg at yslogin2 tewari_data_20130106]$ wgrib
/glade/p/work/mukul/gfdl-vortextracker/work3/tmp.grib | grep APCP
>>
329:19953278:d=04063012:APCP:kpds5=61:kpds6=1:kpds7=0:TR=4:P1=0:P2=0:TimeU=1:sfc:0-
0hr acc:NAve=0
>>      [johnhg at yslogin2 tewari_data_20130106]$ wgrib
/glade/p/work/mukul/gfdl-vortextracker/work3/tmp1.grib | grep APCP
>>
329:20387954:d=04063012:APCP:kpds5=61:kpds6=1:kpds7=0:TR=4:P1=0:P2=24:TimeU=1:sfc:0-
24hr acc:NAve=0
>>      [johnhg at yslogin2 tewari_data_20130106]$ wgrib
/glade/p/work/mukul/gfdl-vortextracker/work3/tmp2.grib | grep APCP
>>
329:20469514:d=04063012:APCP:kpds5=61:kpds6=1:kpds7=0:TR=4:P1=0:P2=48:TimeU=1:sfc:0-
48hr acc:NAve=0
>>
>> You have a runtime accumulation of precip.  The first file has 0-hr
precip, the second has 24-hour precip, and the third has 48-hour
precip.  So you really only have two 24-hour accumulations: 0-24
>> hours and 24-48 hours.  You'll first need to run the pcp_combine
tool to compute 24-hour accumulations for your forecast data.  So I
ran the following commands:
>>
>> /glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/pcp_combine \
>> -subtract /glade/p/work/mukul/gfdl-vortextracker/work3/tmp1.grib 24
\
>>             /glade/p/work/mukul/gfdl-vortextracker/work3/tmp.grib 0
\
>>
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/tmp1_APCP_24.nc
>>
>> /glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/pcp_combine \
>> -subtract /glade/p/work/mukul/gfdl-vortextracker/work3/tmp2.grib 48
\
>>             /glade/p/work/mukul/gfdl-vortextracker/work3/tmp1.grib
24 \
>>
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/tmp2_APCP_24.nc
>>
>> And I created new versions of the forecast file list, observation
file list, and configuration file...
>>
>>
>> /glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/series_analysis \
>>      -fcst
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/wrf \
>>      -obs
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/trmm \
>>      -config
/glade/u/home/johnhg/MET/MET_Help/tewari_data_20130106/SeriesAnalysisConfig
\
>>      -out tmp.nc
>>
>> However, the next problem is that the valid times for your sample
data files don't line up.
>>
>> For the forecast files (tmp1_APCP_24.nc and tmp2_APCP_24.nc) the
valid times are 20040701_120000 and 20040702_120000.  But for the
observation files (myfile1.nc and myfile2.nc) the valid times are
>> 20040703_000000 and 20040704_000000.
>>
>> To get going, you'll need to get forecast and observation files
whose valid times match up.  Looks like you're just of by a couple of
days.
>>
>> Hope that helps get you going.  And sorry for the delay in getting
back to you.
>>
>> Thanks,
>> John Halley Gotway
>> met_help at ucar.edu
>>
>> On 12/27/2013 11:58 AM, Mukul Tewari via RT wrote:
>>>
>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>>>
>>> Hi John,
>>>
>>> As you mentioned below, I used the UPP and copygb to convert the
>>> wrf output and regrid. I also used the r_script you provided
>>> to changing the domain info.
>>>
>>> Now when I try the grid to grid analysis using,
>>>
>>> ===============================
>>> [mukul at yslogin1 r_script]$
>>> /glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/bin/series_analysis
-fcst
>>> wrf -obs trmm -out tmp.nc -config
/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/data/config/SeriesAnalysisConfig_default
>>> DEBUG 1: Reading ASCII file list: wrf
>>> DEBUG 1: Reading ASCII file list: trmm
>>> DEBUG 1: Default Config File:
/glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1/data/config/SeriesAnalysisConfig_default
>>> DEBUG 1: User Config File:
/glade/scratch/mukul/MET4.1/METv4.1_PRE_COM/data/config/SeriesAnalysisConfig_default
>>> ERROR  :
>>> ERROR  : SeriesAnalysisConfInfo::process_config() -> the "fcst"
and
>>> "obs" settings may not be empty.
>>> ERROR  :
>>> ===============================
>>>
>>> I get the above error. I created the 24hr TRMM files and WRF files
and the path for
>>> these files are in
>>>
>>> /glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/wrf
>>> and
>>> /glade/scratch/mukul/2004_WRF_RUN/ENSEMBLE_RUNS_TRMM/r_script/trmm
>>>
>>> files.
>>>
>>> Can you please help with resolving the error? Do you think it is
the
>>> problem associated with series_analysis or with the copygb step?
>>>
>>> thanks,
>>> Mukul
>>>
>>>
>>> On Wed, Dec 18, 2013 at 03:30:52PM -0700, John Halley Gotway via
RT wrote:
>>>> Mukul,
>>>>
>>>> You do need to run it through a post-processor.  You could use
either pinterp or the Unified PostProcessor (UPP).  We recommend UPP
since it's output is in GRIB1 format which can easily be regridded
>>>> using the copygb utility.  If you do not need to regrid your data
later, you could use pinterp.
>>>>
>>>> MET will likely not read the NetCDF TRMM data you currently have.
It's expecting a rather particular format for the NetCDF file.
>>>>
>>>> You could try using the Rscript I've attached to this message.
It reads TRMM binary data - look at the top of the script for a source
of that data.  Then it subsets the data based on the "output
>>>> domain specification" listed near the top of the script and
writes a NetCDF file that MET can read.  Just change the definition of
out_res, out_lat_ll, and so on, to define the grid you want.
>>>>
>>>> You run this on the command line like this:
>>>>      Rscript trmmbin2nc.R my_trmm_file.bin my_trmm_file.nc
>>>>
>>>> It relies on the R "ncdf" package for reading/writing NetCDF
>>>> files.
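
[Editor's note: the script stops at library(ncdf) if that package is
not installed. A small preflight check from within R, illustrative
only -- note that "ncdf" has since been superseded on CRAN by "ncdf4",
which has a different API, so the script would need porting on
current systems:

```
# Load the "ncdf" package, installing it from CRAN first if necessary.
if (!require("ncdf", character.only=TRUE, quietly=TRUE)) {
  install.packages("ncdf")
  library(ncdf)
}
```
]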
>>>>
>>>> We'll ultimately post this script to the MET website, but it
>>>> needs a bit more work.
>>>>
>>>> Hope that helps get you going.
>>>>
>>>> Thanks,
>>>> John
>>>>
>>>> On 12/18/2013 03:15 PM, Mukul Tewari via RT wrote:
>>>>>
>>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>>>>>
>>>>> Hi John,
>>>>>
>>>>> I think I can use the pre-compiled version.
>>>>>
>>>>> I want to do a grid-to-grid comparison of WRF precip
>>>>> and TRMM precip. I have both on the same grid,
>>>>> and both are NetCDF files.
>>>>>
>>>>> I am still going through the online tutorial. Do
>>>>> I need to run p_interp first on the WRF output?
>>>>>
>>>>> Also, does MET read the TRMM precipitation directly in NetCDF?
>>>>>
>>>>> thanks,
>>>>> Mukul
>>>>>
>>>>> On Mon, Dec 16, 2013 at 11:09:32PM -0700, John Halley Gotway via
RT wrote:
>>>>>> Mine is a compilation error from the lex code in the vx_config
>>>>>> library.  I'm compiling with the Intel compilers.  But this is
>>>>>> a new error; I compiled that same code on yellowstone
>>>>>> successfully a couple of months ago.
>>>>>>
>>>>>> I'll look more into it later this week.
>>>>>>
>>>>>> Thanks,
>>>>>> John
>>>>>>
>>>>>>>
>>>>>>> <URL: https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630
>
>>>>>>>
>>>>>>> Thanks, John. I was also getting an error while compiling: a
>>>>>>> NetCDF-related error, which I am not sure about because I had
>>>>>>> assigned the correct path.
>>>>>>>
>>>>>>> thanks,
>>>>>>> Mukul
>>>>>>>
>>>>>>> On Mon, Dec 16, 2013 at 04:47:54PM -0700, John Halley Gotway
via RT wrote:
>>>>>>>> Mukul,
>>>>>>>>
>>>>>>>> Sorry about that.  You can use a version of it that's
compiled here:
>>>>>>>>        /glade/p/ral/jnt/MMET/CODE/MET/v4.1/METv4.1
>>>>>>>>
>>>>>>>> I'm trying to recompile the version I sent you, but for some
reason I'm
>>>>>>>> getting a compilation error I haven't seen in the past.
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> John
>>>>>>>>
>>>>>>>> On 12/16/2013 03:41 PM, Mukul Tewari via RT wrote:
>>>>>>>>>
>>>>>>>>> <URL:
https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>>>>>>>>>
>>>>>>>>> Hi John,
>>>>>>>>>
>>>>>>>>> I don't see any built executables in bin/??
>>>>>>>>>
>>>>>>>>> Mukul
>>>>>>>>>
>>>>>>>>> On Mon, Dec 16, 2013 at 01:43:04PM -0700, John Halley Gotway
via RT
>>>>>>>> wrote:
>>>>>>>>>> Mukul,
>>>>>>>>>>
>>>>>>>>>> METv4.1 is installed on yellowstone in:
>>>>>>>>>>         /glade/p/ral/jnt/MET/MET_releases/METv4.1
>>>>>>>>>>
>>>>>>>>>> You're welcome to use that version if you'd like.
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>> John Halley Gotway
>>>>>>>>>> met_help at ucar.edu
>>>>>>>>>>
>>>>>>>>>> On 12/13/2013 04:33 PM, Mukul Tewari via RT wrote:
>>>>>>>>>>>
>>>>>>>>>>> Fri Dec 13 16:33:09 2013: Request 64630 was acted upon.
>>>>>>>>>>> Transaction: Ticket created by mukul at rap.ucar.edu
>>>>>>>>>>>             Queue: met_help
>>>>>>>>>>>           Subject: MET on yellowstone
>>>>>>>>>>>             Owner: Nobody
>>>>>>>>>>>        Requestors: mukul at ucar.edu
>>>>>>>>>>>            Status: new
>>>>>>>>>>>       Ticket <URL:
>>>>>>>> https://rt.rap.ucar.edu/rt/Ticket/Display.html?id=64630 >
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> Hi,
>>>>>>>>>>>
>>>>>>>>>>> I am a new user of MET, and would like to know if
>>>>>>>>>>> it is already installed on yellowstone, so that
>>>>>>>>>>> I can use the pre-installed version for my
>>>>>>>>>>> purpose.
>>>>>>>>>>>
>>>>>>>>>>> thanks,
>>>>>>>>>>> Mukul
>>>>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>
>>>
>>>> # Daily TRMM binary files:
>>>> #
ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/Derived_Products/3B42_V7/Daily/YYYY/3B42_daily.YYYY.MM.DD.7.bin
>>>> # 3-Hourly TRMM binary files:
>>>> #
ftp://disc2.nascom.nasa.gov/data/TRMM/Gridded/3B42_V7/YYYYMM/3B42.YYMMDD.HHz.7.precipitation.bin
>>>>
>>>>
########################################################################
>>>> #
>>>> # Required libraries.
>>>> #
>>>>
########################################################################
>>>>
>>>> library(ncdf)
>>>>
>>>>
########################################################################
>>>> #
>>>> # Constants and command line arguments
>>>> #
>>>>
########################################################################
>>>>
>>>> hdr_len    = 5           # Each pcp file begins with 5 header lines.
>>>> hdr_file   = "trmm.hdr"  # Temporary header file.
>>>> missing    = -9999       # Missing pcp value to be used in MET.
>>>> save       = FALSE       # If set to TRUE, call save.image()
>>>> sec_per_hr = 60*60
>>>>
>>>> # Native domain specification
>>>> in_res     = 0.25        # Resolution in degrees
>>>> in_lat_ll  = -50.00      # Latitude of lower-left corner
>>>> in_lon_ll  =   0.00      # Longitude of lower-left corner
>>>> in_lat_ur  =  50.00      # Latitude of upper-right corner
>>>> in_lon_ur  = 359.75      # Longitude of upper-right corner
>>>>
>>>> # Output domain specification
>>>> out_res    = 0.25        # Resolution in degrees
>>>> out_lat_ll =  -25.00     # Latitude of lower-left corner
>>>> out_lon_ll = -150.00     # Longitude of lower-left corner
>>>> out_lat_ur =   60.00     # Latitude of upper-right corner
>>>> out_lon_ur =   10.00     # Longitude of upper-right corner
>>>>
>>>> rescale_lon <- function (x) {
>>>>     while(x < -180) x = x + 360;
>>>>     while(x >  180) x = x - 360;
>>>>     return(x)
>>>> }
>>>>
>>>>
########################################################################
>>>> #
>>>> # Handle the arguments.
>>>> #
>>>>
########################################################################
>>>>
>>>> # Retrieve the arguments
>>>> args = commandArgs(TRUE)
>>>>
>>>> # Check the number of arguments
>>>> if(length(args) < 2) {
>>>>      cat("Usage: trmmbin2nc.R\n")
>>>>      cat("        trmm_file\n")
>>>>      cat("        nc_file\n")
>>>>      cat("        [-save]\n")
>>>>      cat("        where \"trmm_file\" is a binary TRMM file.\n")
>>>>      cat("              \"nc_file\"   is the output NetCDF file to be written.\n")
>>>>      cat("              \"-save\"     to call save.image().\n\n")
>>>>      quit()
>>>> }
>>>>
>>>> # Store the input file names
>>>> trmm_file = args[1]
>>>> nc_file   = args[2]
>>>>
>>>> # Parse optional arguments
>>>> for(i in 1:length(args)) {
>>>>      if(args[i] == "-save") {
>>>>         save = TRUE
>>>>      }
>>>> }
>>>>
>>>>
########################################################################
>>>> #
>>>> # Parse accumulation interval and time from file name.
>>>> #
>>>>
########################################################################
>>>>
>>>> # Daily files contain the string "daily"
>>>> if(grepl("daily", basename(trmm_file), ignore.case=TRUE)) {
>>>>     tok   = unlist(strsplit(basename(trmm_file), '.',
fixed=TRUE))
>>>>     ftime = strptime(paste(tok[2], tok[3], tok[4]),
>>>>                      format="%Y %m %d", tz="GMT")
>>>>     init  = as.POSIXct(ftime -  0*sec_per_hr)
>>>>     valid = as.POSIXct(ftime + 24*sec_per_hr)
>>>>
>>>> # 3-hourly files contain the string "[0-9][0-9]z"
>>>> } else if(grepl("[0-9][0-9]z", basename(trmm_file),
ignore.case=TRUE)) {
>>>>     tok   = unlist(strsplit(basename(trmm_file), '.',
fixed=TRUE))
>>>>     ftime = strptime(paste(tok[2], tok[3]), format="%y%m%d %H",
tz="GMT")
>>>>     init  = as.POSIXct(ftime - 0*sec_per_hr)
>>>>     valid = as.POSIXct(ftime + 3*sec_per_hr)
>>>>
>>>> # Fail otherwise
>>>> } else {
>>>>     cat("\n\nERROR: Can't figure out the accumulation interval!\n\n")
>>>>     quit(status=1)
>>>> }
>>>>
>>>> # Compute the accumulation interval
>>>> acc_sec = as.double(valid - init, units="secs")
>>>> acc_hr  = floor(acc_sec / 3600)
>>>> acc_str = sprintf("%.2i", acc_hr)
>>>>
>>>>
########################################################################
>>>> #
>>>> # Read the 1/4 degree binary TRMM data.
>>>> #
>>>>
########################################################################
>>>>
>>>> in_lat  = seq(in_lat_ll, in_lat_ur, in_res)
>>>> in_lon  = seq(in_lon_ll, in_lon_ur, in_res)
>>>> in_nlat = length(in_lat)
>>>> in_nlon = length(in_lon)
>>>> data    = readBin(trmm_file, "numeric", n=in_nlon*in_nlat, size =
4, endian="big")
>>>> in_pcp  = array(data, dim=c(in_nlon, in_nlat),
>>>>                 dimnames=list(lon=NULL, lat=NULL))
>>>>
>>>> # Rescale the input longitudes to the range -180 to 180
>>>> in_lon = sapply(in_lon, rescale_lon)
>>>>
>>>>
########################################################################
>>>> #
>>>> # Select subset of data to be written.
>>>> #
>>>>
########################################################################
>>>>
>>>> out_lat  = seq(out_lat_ll, out_lat_ur, out_res)
>>>> out_lon  = seq(out_lon_ll, out_lon_ur, out_res)
>>>> out_nlat = length(out_lat)
>>>> out_nlon = length(out_lon)
>>>>
>>>> # Rescale the output longitudes to the range -180 to 180
>>>> out_lon = sort(sapply(out_lon, rescale_lon))
>>>>
>>>> # Extract the output data
>>>> out_pcp = matrix(nrow=out_nlon, ncol=out_nlat)
>>>> out_cnt = 0
>>>> out_vld = 0
>>>> out_sum = 0
>>>>
>>>> for(cur_out_lon in 1:out_nlon) {
>>>>     for(cur_out_lat in 1:out_nlat) {
>>>>
>>>>       cur_in_lon = which(out_lon[cur_out_lon] == in_lon)
>>>>       cur_in_lat = which(out_lat[cur_out_lat] == in_lat)
>>>>
>>>>       if(length(cur_in_lon) == 1 &&
>>>>          length(cur_in_lat) == 1) {
>>>>         out_pcp[cur_out_lon, cur_out_lat] = in_pcp[cur_in_lon,
cur_in_lat]
>>>>         out_vld = out_vld + 1
>>>>         out_sum = out_sum + out_pcp[cur_out_lon, cur_out_lat];
>>>>       }
>>>>
>>>>       out_cnt = out_cnt + 1
>>>>     }
>>>> }
>>>>
>>>>
########################################################################
>>>> #
>>>> # Create the NetCDF output file.
>>>> #
>>>>
########################################################################
>>>>
>>>> # Define dimensions
>>>> dim_lat = dim.def.ncdf("lat", "degrees_north", out_lat,
>>>>                          create_dimvar=TRUE)
>>>> dim_lon = dim.def.ncdf("lon", "degrees_east",  out_lon,
>>>>                          create_dimvar=TRUE)
>>>>
>>>> # Define variables
>>>> var_pcp = var.def.ncdf(paste("APCP_", acc_str, sep=''), "kg/m^2",
>>>>                          list(dim_lon, dim_lat), missing,
>>>>                          longname="Total precipitation",
prec="single")
>>>>
>>>> # Define file
>>>> nc = create.ncdf(nc_file, var_pcp)
>>>>
>>>> # Add variable attributes
>>>> att.put.ncdf(nc, var_pcp, "name", "APCP")
>>>> att.put.ncdf(nc, var_pcp, "level", paste("A", acc_hr, sep=''))
>>>> att.put.ncdf(nc, var_pcp, "grib_code", 61)
>>>> att.put.ncdf(nc, var_pcp, "_FillValue", missing, prec="single")
>>>> att.put.ncdf(nc, var_pcp, "init_time", format(init,
"%Y%m%d_%H%M%S", tz="GMT"), prec="text")
>>>> att.put.ncdf(nc, var_pcp, "init_time_ut", as.numeric(init),
prec="int")
>>>> att.put.ncdf(nc, var_pcp, "valid_time", format(valid,
"%Y%m%d_%H%M%S", tz="GMT"), prec="text")
>>>> att.put.ncdf(nc, var_pcp, "valid_time_ut", as.numeric(valid),
prec="int")
>>>> att.put.ncdf(nc, var_pcp, "accum_time", paste(acc_str, "0000",
sep=''))
>>>> att.put.ncdf(nc, var_pcp, "accum_time_sec", acc_sec, prec="int")
>>>>
>>>> # Add global attributes
>>>> cur_time = Sys.time()
>>>> att.put.ncdf(nc, 0, "FileOrigins", paste("File", nc_file,
"generated", format(Sys.time(), "%Y%m%d_%H%M%S"),
>>>>                                            "on host",
Sys.info()[4], "by the Rscript trmmbin2nc.R"))
>>>> att.put.ncdf(nc, 0, "MET_version", "V4.1")
>>>> att.put.ncdf(nc, 0, "Projection", "LatLon", prec="text")
>>>> att.put.ncdf(nc, 0, "lat_ll", paste(min(out_lat),
"degrees_north"), prec="text")
>>>> att.put.ncdf(nc, 0, "lon_ll", paste(min(out_lon),
"degrees_east"), prec="text")
>>>> att.put.ncdf(nc, 0, "delta_lat", paste(out_res, "degrees"),
prec="text")
>>>> att.put.ncdf(nc, 0, "delta_lon", paste(out_res, "degrees"),
prec="text")
>>>> att.put.ncdf(nc, 0, "Nlat", paste(out_nlat, "grid_points"),
prec="text")
>>>> att.put.ncdf(nc, 0, "Nlon", paste(out_nlon, "grid_points"),
prec="text")
>>>>
>>>> # Add pcp to the file
>>>> put.var.ncdf(nc, var_pcp, out_pcp)
>>>>
>>>> # Close the file
>>>> close.ncdf(nc)
>>>>
>>>> # Print summary info
>>>>
>>>> cat(paste("Output File:\t", nc_file, "\n", sep=""))
>>>> cat(paste("Output Domain:\t", out_nlat, " X ", out_nlon, " from (",
>>>>       out_lat_ll, ", ", out_lon_ll, ") to (",
>>>>       out_lat_ur, ", ", out_lon_ur, ") by ", out_res, " deg\n",
>>>>       sep=""))
>>>> cat(paste("Output Precip:\t", out_vld, " of ", out_cnt, " points valid, ",
>>>>       round(out_sum, 2), " mm total, ",
>>>>       round(out_sum/out_vld, 2), " mm avg\n", sep=""))
>>>>
>>>>
########################################################################
>>>> #
>>>> # Finished.
>>>> #
>>>>
########################################################################
>>>>
>>>> # Optionally, save all of the pcp to an .RData file
>>>> if(save == TRUE) save.image()
>>>

------------------------------------------------


More information about the Met_help mailing list