<html>
<head>
<meta content="text/html; charset=utf-8" http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
Afwande,<br>
<br>
Under this situation, I don't think you can recover the rainfall information.
You will have to rerun the case. <br>
<br>
WRFHELP<br>
<br>
<div class="moz-cite-prefix">On 2/28/16 11:56 AM, afwande juliet
wrote:<br>
</div>
<blockquote
cite="mid:CANVsOohHCvjw2YVbSQL2L=uUEZpTja7TQbUTxNEi8ZnRnVBZgA@mail.gmail.com"
type="cite">
<p dir="ltr">Thanks , the bucket scheme was on when running
experiment, and unfortunately, I didn't carry l_rainc and
l_rainnc since I was running the model in another country and it
seems the data was deleted from server to create space, how can
I help in this case</p>
<div class="gmail_quote">On Feb 27, 2016 2:40 AM, "wrfhelp" <<a
moz-do-not-send="true" href="mailto:wrfhelp@ucar.edu"><a class="moz-txt-link-abbreviated" href="mailto:wrfhelp@ucar.edu">wrfhelp@ucar.edu</a></a>>
wrote:<br type="attribution">
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#000000"> <br>
Afwande,<br>
<br>
I am not sure whether you have turned on the bucket scheme.
If so, you should calculate the total convective and
resolvable-scale precipitation as below:<br>
<br>
total_RAINC = RAINC + I_RAINC * bucket_mm<br>
total_RAINNC = RAINNC + I_RAINNC * bucket_mm<br>
<br>
Daily and monthly total precipitation should be derived from
the total precipitation. For example, suppose you have run WRF
for two months and need the monthly precipitation for the
second month. Calculate the accumulated precipitation at the
end of the first month (call it PRECIP1) and at the end of the
second month (call it PRECIP2). The monthly precipitation for
the second month is then:<br>
<br>
PRECIP2 - PRECIP1 <br>
<br>
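<p>A minimal Python sketch of this calculation (the netCDF4 library is an
assumption, and the wrfout file names are hypothetical placeholders for
outputs valid at the two month boundaries; BUCKET_MM must match the
bucket_mm namelist value used in the run):</p>
<pre>
from netCDF4 import Dataset

BUCKET_MM = 100.0  # assumption: must equal the bucket_mm namelist setting

def total_precip(path):
    # Bucket-corrected accumulated precipitation at the last time in the file
    ds = Dataset(path)
    rainc = ds.variables["RAINC"][-1] + ds.variables["I_RAINC"][-1] * BUCKET_MM
    rainnc = ds.variables["RAINNC"][-1] + ds.variables["I_RAINNC"][-1] * BUCKET_MM
    ds.close()
    return rainc + rainnc

# Hypothetical files valid at the end of the first and second months
precip1 = total_precip("wrfout_d01_month1_end.nc")
precip2 = total_precip("wrfout_d01_month2_end.nc")
month2_total = precip2 - precip1  # mm accumulated during the second month
</pre>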
WRFHELP<br>
<br>
<br>
<br>
<div>On 2/26/16 10:50 AM, Jimy Dudhia wrote:<br>
</div>
<blockquote type="cite">
<div dir="ltr">Maybe they have the bucket scheme on? If
so, point them to the user guide on the use of the
bucket_mm variables like i_rainnc
<div>Jimy</div>
<div><br>
<div class="gmail_quote">---------- Forwarded message
----------<br>
From: <b class="gmail_sendername">afwande juliet</b>
<span dir="ltr"><<a moz-do-not-send="true"
href="mailto:afwandej965@gmail.com"
target="_blank">afwandej965@gmail.com</a>></span><br>
Date: Fri, Feb 26, 2016 at 1:38 AM<br>
Subject: Re: [Wrf-users] Wrf-users Digest, Vol 138,
Issue 16 - (no subject) (afwande juliet)<br>
To: Felipe Neris <<a moz-do-not-send="true"
href="mailto:felipenc2@gmail.com" target="_blank">felipenc2@gmail.com</a>><br>
Cc: wrf users group <<a moz-do-not-send="true"
href="mailto:wrf-users@ucar.edu" target="_blank">wrf-users@ucar.edu</a>><br>
<br>
<br>
<div dir="ltr">
<div>
<div>
<div>OK, thanks.<br>
</div>
When I investigate my RAINC and RAINNC, I find that RAINC
doesn't look cumulative (values decrease and increase
randomly over the course of the simulation). Looking at
RAINC further, there are large values up to 1000 mm with
some negative values in between. RAINNC, however, looks
cumulative: the values increase with time and there are no
negative values. Other variables like temperature look
reasonable.<br>
<br>
</div>
And you know that precip = RAINC + RAINNC.<br>
</div>
What could be the problem and how can I correct
it?</div>
<div>
<div>
<div class="gmail_extra"><br>
<div class="gmail_quote">On Fri, Feb 26, 2016
at 3:16 AM, Felipe Neris <span dir="ltr"><<a
moz-do-not-send="true"
href="mailto:felipenc2@gmail.com"
target="_blank"><a class="moz-txt-link-abbreviated" href="mailto:felipenc2@gmail.com">felipenc2@gmail.com</a></a>></span>
wrote:<br>
<blockquote class="gmail_quote"
style="margin:0 0 0 .8ex;border-left:1px
#ccc solid;padding-left:1ex">
<div dir="ltr">Hi Juliet,
<div>If I got your question right, I
suppose the answer is: WRF sums all
precipitation that occurs since the
beginning of the simulation into the
variables RAINNC (from microphysics
parameterization) and RAINC (from
cumulus parameterization). Therefor,
if you want to have the accumulated
precipitation of a certain day, you
must specify/set the corresponding
time for these variables and display. </div>
<div>Hope I could be of any help!</div>
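<p>A minimal sketch of that differencing in Python (netCDF4 and the
file name are assumptions; add the I_RAINC/I_RAINNC bucket terms if
the bucket scheme is on):</p>
<pre>
from netCDF4 import Dataset

# Hypothetical wrfout file containing one day of output
ds = Dataset("wrfout_d01_1981-06-01.nc")
acc = ds.variables["RAINC"][:] + ds.variables["RAINNC"][:]  # accumulated since t = 0
daily_total = acc[-1] - acc[0]  # mm that fell between the first and last output time
ds.close()
</pre>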
<div>
<div class="gmail_extra"><br
clear="all">
<div>
<div>
<div dir="ltr">
<div dir="ltr">
<div>Felipe Neris</div>
<div>
<div> </div>
</div>
</div>
</div>
</div>
</div>
<br>
<div class="gmail_quote">On Thu, Feb
25, 2016 at 3:45 PM, <span
dir="ltr"><<a
moz-do-not-send="true"
href="mailto:wrf-users-request@ucar.edu"
target="_blank"><a class="moz-txt-link-abbreviated" href="mailto:wrf-users-request@ucar.edu">wrf-users-request@ucar.edu</a></a>></span>
wrote:<br>
<blockquote class="gmail_quote"
style="margin:0px 0px 0px
0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">Send
Wrf-users mailing list
submissions to<br>
<a moz-do-not-send="true" href="mailto:wrf-users@ucar.edu" target="_blank">wrf-users@ucar.edu</a><br>
<br>
To subscribe or unsubscribe via
the World Wide Web, visit<br>
<a moz-do-not-send="true" href="http://mailman.ucar.edu/mailman/listinfo/wrf-users" rel="noreferrer" target="_blank">http://mailman.ucar.edu/mailman/listinfo/wrf-users</a><br>
or, via email, send a message
with subject or body 'help' to<br>
<a moz-do-not-send="true" href="mailto:wrf-users-request@ucar.edu" target="_blank">wrf-users-request@ucar.edu</a><br>
<br>
You can reach the person
managing the list at<br>
<a moz-do-not-send="true" href="mailto:wrf-users-owner@ucar.edu" target="_blank">wrf-users-owner@ucar.edu</a><br>
<br>
When replying, please edit your
Subject line so it is more
specific<br>
than "Re: Contents of Wrf-users
digest..."<br>
<br>
<br>
Today's Topics:<br>
<br>
1. (no subject) (afwande
juliet)<br>
2. Nesting and Domain
Decomposition (Douglas Lowe)<br>
3. Re: Nesting and Domain
Decomposition (Tabish Ansari)<br>
<br>
<br>
----------------------------------------------------------------------<br>
<br>
Message: 1<br>
Date: Thu, 25 Feb 2016 15:45:08
+0300<br>
From: afwande juliet <<a moz-do-not-send="true" href="mailto:afwandej965@gmail.com" target="_blank">afwandej965@gmail.com</a>><br>
Subject: [Wrf-users] (no
subject)<br>
To: wrf users group <<a moz-do-not-send="true" href="mailto:wrf-users@ucar.edu" target="_blank">wrf-users@ucar.edu</a>>,
wrfhelp <<a moz-do-not-send="true" href="mailto:wrfhelp@ucar.edu" target="_blank">wrfhelp@ucar.edu</a>><br>
Message-ID:<br>
<<a moz-do-not-send="true" href="mailto:CANVsOojVG3LzL1r82T0oKnBRAU-Nq2vVqHc5GKz4SSj9j0P3Cw@mail.gmail.com" target="_blank">CANVsOojVG3LzL1r82T0oKnBRAU-Nq2vVqHc5GKz4SSj9j0P3Cw@mail.gmail.com</a>><br>
Content-Type: text/plain;
charset="utf-8"<br>
<br>
I ask this again:<br>
I have WRF simulations for 1981. The model output is 3-hourly, i.e., 8<br>
timesteps per day.<br>
When I want daily or monthly values, do I take every 8th timestep to<br>
be the rain total for each day and sum those up to get monthly
totals?<br>
Do I have to divide the units *mm* by any number to get mm/day?<br>
<br>
<br>
<br>
thanks<br>
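<p>For what it's worth, a sketch of that operation (numpy and netCDF4 are
assumptions, as is the concatenated output file; the accumulated totals
are differenced at day boundaries rather than summed, and no unit
conversion is needed because each difference is already mm per day):</p>
<pre>
import numpy as np
from netCDF4 import Dataset

# Hypothetical file holding a year of 3-hourly output (8 steps per day)
ds = Dataset("wrfout_d01_1981.nc")
acc = ds.variables["RAINC"][:] + ds.variables["RAINNC"][:]  # (time, y, x), mm
daily = np.diff(acc[::8], axis=0)  # mm/day: accumulation differenced at day boundaries
monthly = daily[:31].sum(axis=0)   # e.g., January total in mm
ds.close()
</pre>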
-------------- next part
--------------<br>
An HTML attachment was
scrubbed...<br>
URL: <a moz-do-not-send="true"
href="http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/c10344ce/attachment-0001.html"
rel="noreferrer"
target="_blank">http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/c10344ce/attachment-0001.html</a><br>
<br>
------------------------------<br>
<br>
Message: 2<br>
Date: Thu, 25 Feb 2016 13:59:29
+0000<br>
From: Douglas Lowe <<a moz-do-not-send="true" href="mailto:Douglas.Lowe@manchester.ac.uk" target="_blank">Douglas.Lowe@manchester.ac.uk</a>><br>
Subject: [Wrf-users] Nesting and
Domain Decomposition<br>
To: "<a moz-do-not-send="true"
href="mailto:wrf-users@ucar.edu"
target="_blank">wrf-users@ucar.edu</a>"
<<a moz-do-not-send="true"
href="mailto:wrf-users@ucar.edu"
target="_blank">wrf-users@ucar.edu</a>><br>
Message-ID:<br>
<<a moz-do-not-send="true" href="mailto:43E6B083008E774B87C4283E0FFA4E70012C7A0C91@MBXP02.ds.man.ac.uk" target="_blank">43E6B083008E774B87C4283E0FFA4E70012C7A0C91@MBXP02.ds.man.ac.uk</a>><br>
Content-Type: text/plain;
charset="us-ascii"<br>
<br>
Hi all,<br>
<br>
I'm running WRF-Chem with a nest
of 3 domains, with the settings
listed below. I'd like to be<br>
able to split this across as
many processes as possible in
order to speed things up
(currently<br>
I'm only managing 3x real time,
which isn't very good when
running multiday simulations).<br>
Unfortunately I am finding that
WRF hangs when calling the
photolysis driver for my 2nd
domain<br>
(which is the smallest of the
domains) if I use too many
processors.<br>
<br>
The (relevant) model domain settings are:<br>
max_dom = 3,<br>
e_we    = 134, 81, 91,<br>
e_sn    = 146, 81, 91,<br>
e_vert  = 41, 41, 41,<br>
num_metgrid_levels = 38,<br>
dx      = 15000, 3000, 1000,<br>
dy      = 15000, 3000, 1000,<br>
<br>
WRF will run when I split over up to 168 processes (7 nodes on the ARCHER supercomputer),<br>
but won't work if I split over 192 (or more) processes (8 nodes on ARCHER).<br>
<br>
Looking at the log messages I *think* that WRF is splitting each domain into the same<br>
number of patches, and sending one patch from each domain to a single process for<br>
analysis. However, this means that I am limited by the smallest domain as to how many<br>
patches I can split a domain into before we end up with patches which are dwarfed by<br>
the halos around them.<br>
<br>
Would it not make more sense to be able to split each domain into different numbers<br>
of patches (so that each patch is of a similar size, regardless of which domain it is from) and<br>
send one patch from one domain to a single process (or, perhaps, send more patches from the<br>
outer domains to a single process, if needed for balancing computational demands)? And<br>
is there any way for me to do this with WRF?<br>
<br>
Thanks,<br>
Doug<br>
<br>
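<p>As a rough sanity check on why the smallest domain caps the process
count, a sketch under the assumption of a near-square process grid
(WRF's actual decomposition may choose a different factorization):</p>
<pre>
import math

def patch_size(e_we, e_sn, nprocs):
    # Split nprocs into the most nearly square nx x ny grid, then cut the domain.
    nx = int(math.sqrt(nprocs))
    while nprocs % nx:
        nx -= 1
    ny = nprocs // nx
    return e_we // nx, e_sn // ny

# Every domain is decomposed over the same process grid, so the 81 x 81 d02
# ends up with patches only a few cells across, comparable to the halo width.
print(patch_size(81, 81, 168))  # -> (6, 5)
print(patch_size(81, 81, 192))  # -> (6, 5)
</pre>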
------------------------------<br>
<br>
Message: 3<br>
Date: Thu, 25 Feb 2016 16:37:26
+0000<br>
From: Tabish Ansari <<a moz-do-not-send="true" href="mailto:tabishumaransari@gmail.com" target="_blank">tabishumaransari@gmail.com</a>><br>
Subject: Re: [Wrf-users] Nesting
and Domain Decomposition<br>
To: Douglas Lowe <<a moz-do-not-send="true" href="mailto:Douglas.Lowe@manchester.ac.uk" target="_blank">Douglas.Lowe@manchester.ac.uk</a>><br>
Cc: "<a moz-do-not-send="true"
href="mailto:wrf-users@ucar.edu"
target="_blank">wrf-users@ucar.edu</a>"
<<a moz-do-not-send="true"
href="mailto:wrf-users@ucar.edu"
target="_blank">wrf-users@ucar.edu</a>><br>
Message-ID:<br>
<<a moz-do-not-send="true" href="mailto:CALLVTyvwh3nimJczxjfAy+gNML1PfJpDudJT8TQhkDQrCLnwZw@mail.gmail.com" target="_blank">CALLVTyvwh3nimJczxjfAy+gNML1PfJpDudJT8TQhkDQrCLnwZw@mail.gmail.com</a>><br>
Content-Type: text/plain;
charset="utf-8"<br>
<br>
Hi Doug,<br>
<br>
I'm not too knowledgeable in
this but have some literature
which might be<br>
of relevance. Please have a look
at the attached files.<br>
<br>
Cheers,<br>
<br>
Tabish<br>
<br>
Tabish U Ansari<br>
PhD student, Lancaster
Environment Center<br>
Lancaster University<br>
Bailrigg, Lancaster,<br>
LA1 4YW, United Kingdom<br>
<br>
On 25 February 2016 at 13:59, Douglas Lowe <<a moz-do-not-send="true" href="mailto:Douglas.Lowe@manchester.ac.uk" target="_blank">Douglas.Lowe@manchester.ac.uk</a>><br>
wrote:<br>
<br>
> Hi all,<br>
><br>
> I'm running WRF-Chem with a
nest of 3 domains, with the
settings listed<br>
> below. I'd like to be<br>
> able to split this across
as many processes as possible in
order to speed<br>
> things up (currently<br>
> I'm only managing 3x real
time, which isn't very good when
running<br>
> multiday simulations).<br>
> Unfortunately I am finding
that WRF hangs when calling the
photolysis<br>
> driver for my 2nd domain<br>
> (which is the smallest of
the domains) if I use too many
processors.<br>
><br>
> The (relevant) model domain settings are:<br>
> max_dom = 3,<br>
> e_we    = 134, 81, 91,<br>
> e_sn    = 146, 81, 91,<br>
> e_vert  = 41, 41, 41,<br>
> num_metgrid_levels = 38,<br>
> dx      = 15000, 3000, 1000,<br>
> dy      = 15000, 3000, 1000,<br>
><br>
> WRF will run when I split over up to 168 processes (7 nodes on the ARCHER<br>
> supercomputer),<br>
> but won't work if I split over 192 (or more) processes (8 nodes on ARCHER).<br>
><br>
> Looking at the log messages I *think* that WRF is splitting each domain<br>
> into the same<br>
> number of patches, and sending one patch from each domain to a single<br>
> process for<br>
> analysis. However, this means that I am limited by the smallest domain as<br>
> to how many<br>
> patches I can split a domain into before we end up with patches which are<br>
> dwarfed by<br>
> the halos around them.<br>
><br>
> Would it not make more sense to be able to split each domain into<br>
> different numbers<br>
> of patches (so that each patch is of a similar size, regardless of which<br>
> domain it is from) and<br>
> send one patch from one domain to a single process (or, perhaps, send more<br>
> patches from the<br>
> outer domains to a single process, if needed for balancing computational<br>
> demands)? And<br>
> is there any way for me to do this with WRF?<br>
><br>
> Thanks,<br>
> Doug<br>
>
_______________________________________________<br>
> Wrf-users mailing list<br>
> <a moz-do-not-send="true"
href="mailto:Wrf-users@ucar.edu" target="_blank">Wrf-users@ucar.edu</a><br>
> <a moz-do-not-send="true"
href="http://mailman.ucar.edu/mailman/listinfo/wrf-users"
rel="noreferrer"
target="_blank">http://mailman.ucar.edu/mailman/listinfo/wrf-users</a><br>
><br>
-------------- next part
--------------<br>
An HTML attachment was
scrubbed...<br>
URL: <a moz-do-not-send="true"
href="http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment.html"
rel="noreferrer"
target="_blank">http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment.html</a><br>
-------------- next part
--------------<br>
A non-text attachment was
scrubbed...<br>
Name: WRF-HPC.pdf<br>
Type: application/pdf<br>
Size: 243897 bytes<br>
Desc: not available<br>
Url : <a moz-do-not-send="true"
href="http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment.pdf"
rel="noreferrer"
target="_blank">http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment.pdf</a><br>
-------------- next part
--------------<br>
A non-text attachment was
scrubbed...<br>
Name: WRF-chapter-multicore.pdf<br>
Type: application/pdf<br>
Size: 230093 bytes<br>
Desc: not available<br>
Url : <a moz-do-not-send="true"
href="http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0001.pdf"
rel="noreferrer"
target="_blank">http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0001.pdf</a><br>
-------------- next part
--------------<br>
A non-text attachment was
scrubbed...<br>
Name: CUDA-WRF_ppt.pdf<br>
Type: application/pdf<br>
Size: 2314206 bytes<br>
Desc: not available<br>
Url : <a moz-do-not-send="true"
href="http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0002.pdf"
rel="noreferrer"
target="_blank">http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0002.pdf</a><br>
<br>
------------------------------<br>
<br>
_______________________________________________<br>
Wrf-users mailing list<br>
<a moz-do-not-send="true"
href="mailto:Wrf-users@ucar.edu"
target="_blank">Wrf-users@ucar.edu</a><br>
<a moz-do-not-send="true"
href="http://mailman.ucar.edu/mailman/listinfo/wrf-users"
rel="noreferrer"
target="_blank">http://mailman.ucar.edu/mailman/listinfo/wrf-users</a><br>
<br>
<br>
End of Wrf-users Digest, Vol
138, Issue 16<br>
******************************************<br>
</blockquote>
</div>
<br>
</div>
</div>
</div>
<br>
_______________________________________________<br>
Wrf-users mailing list<br>
<a moz-do-not-send="true"
href="mailto:Wrf-users@ucar.edu"
target="_blank">Wrf-users@ucar.edu</a><br>
<a moz-do-not-send="true"
href="http://mailman.ucar.edu/mailman/listinfo/wrf-users"
rel="noreferrer" target="_blank">http://mailman.ucar.edu/mailman/listinfo/wrf-users</a><br>
<br>
</blockquote>
</div>
<br>
</div>
</div>
</div>
<br>
_______________________________________________<br>
Wrf-users mailing list<br>
<a moz-do-not-send="true"
href="mailto:Wrf-users@ucar.edu" target="_blank">Wrf-users@ucar.edu</a><br>
<a moz-do-not-send="true"
href="http://mailman.ucar.edu/mailman/listinfo/wrf-users"
rel="noreferrer" target="_blank">http://mailman.ucar.edu/mailman/listinfo/wrf-users</a><br>
<br>
</div>
<br>
</div>
</div>
</blockquote>
<br>
</div>
</blockquote>
</div>
</blockquote>
<br>
</body>
</html>