[Wrf-users] Fwd: Wrf-users Digest, Vol 138, Issue 16 - (no subject) (afwande juliet)

afwande juliet afwandej965 at gmail.com
Mon Feb 29 01:03:29 MST 2016


Thanks. The bucket scheme was on when running the experiment, but
unfortunately I didn't keep I_RAINC and I_RAINNC, since I was running the
model in another country and it seems the data was deleted from the server
to create space. What can I do in this case?

> On Feb 27, 2016 2:40 AM, "wrfhelp" <wrfhelp at ucar.edu> wrote:
>
>>
>> Afwande,
>>
>> I am not sure whether you have turned on the bucket scheme. If so, then
>> you should calculate the total convective and resolvable-scale
>> precipitation as below:
>>
>> RAINC=RAINC+I_RAINC*bucket_mm
>> RAINNC=RAINNC+I_RAINNC*bucket_mm
>>
>> Daily and monthly total precipitation should be derived from this total
>> precipitation. For example, suppose you have run WRF for two months and
>> you need to calculate the monthly precipitation for the second month. Then
>> you need to calculate the accumulated precipitation at the end of the
>> first month (call it PRECIP1) and at the end of the second month (call it
>> PRECIP2). Monthly precipitation for the second month is then:
>>
>> PRECIP2 - PRECIP1
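>>
>> For illustration, here is a minimal Python sketch of that calculation
>> (assumptions: the netCDF4 library, a bucket_mm of 100 mm, and placeholder
>> wrfout file names; adjust these to your own setup):
>>
>> from netCDF4 import Dataset
>>
>> BUCKET_MM = 100.0  # must match the bucket_mm value used in the run
>>
>> def total_precip(path, t=-1):
>>     """Total accumulated precipitation (mm) at output time index t."""
>>     with Dataset(path) as nc:
>>         rainc = nc.variables["RAINC"][t] + nc.variables["I_RAINC"][t] * BUCKET_MM
>>         rainnc = nc.variables["RAINNC"][t] + nc.variables["I_RAINNC"][t] * BUCKET_MM
>>     return rainc + rainnc
>>
>> # Monthly total for the second month: accumulation at the end of month 2
>> # minus accumulation at the end of month 1 (placeholder file names).
>> precip1 = total_precip("wrfout_d01_1981-02-01_00:00:00")
>> precip2 = total_precip("wrfout_d01_1981-03-01_00:00:00")
>> month2_precip = precip2 - precip1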
>>
>> WRFHELP
>>
>>
>>
>> On 2/26/16 10:50 AM, Jimy Dudhia wrote:
>>
>> Maybe they have the bucket scheme on? If so, point them to the user guide
>> on the use of the bucket_mm variables like i_rainnc
>> Jimy
>>
>> ---------- Forwarded message ----------
>> From: afwande juliet <afwandej965 at gmail.com>
>> Date: Fri, Feb 26, 2016 at 1:38 AM
>> Subject: Re: [Wrf-users] Wrf-users Digest, Vol 138, Issue 16 - (no
>> subject) (afwande juliet)
>> To: Felipe Neris <felipenc2 at gmail.com>
>> Cc: wrf users group <wrf-users at ucar.edu>
>>
>>
>> OK, thanks.
>> When I investigate my RAINC and RAINNC, I find that RAINC doesn't look
>> cumulative (values decrease and increase randomly over the course of the
>> simulation). Looking at RAINC further, there are large values of up to
>> 1000 mm with some negative values in between. RAINNC, however, does look
>> cumulative: its values increase with time and there are no negative
>> values. Other variables, like temperature, look reasonable.
>>
>> And, as you know, precip = RAINC + RAINNC.
>> What could be the problem, and how can I correct it?
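>>
>> A quick way to check this is to look for time steps where RAINC decreases
>> or goes negative; a short Python sketch (placeholder file name) might be:
>>
>> from netCDF4 import Dataset
>> import numpy as np
>>
>> with Dataset("wrfout_d01.nc") as nc:  # placeholder file name
>>     rainc = np.asarray(nc.variables["RAINC"][:])  # (Time, south_north, west_east)
>>
>> print("RAINC min:", rainc.min(), "max:", rainc.max())
>> drops = np.diff(rainc, axis=0) < 0
>> print("steps where RAINC decreases somewhere:",
>>       np.where(drops.any(axis=(1, 2)))[0])
>>
>> If the bucket scheme is on, such drops are expected: RAINC is reduced by
>> bucket_mm each time the corresponding I_RAINC counter is incremented.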
>>
>> On Fri, Feb 26, 2016 at 3:16 AM, Felipe Neris <felipenc2 at gmail.com> wrote:
>>
>>> Hi Juliet,
>>> If I got your question right, I suppose the answer is: WRF sums all
>>> precipitation that occurs since the beginning of the simulation into the
>>> variables RAINNC (from the microphysics parameterization) and RAINC (from
>>> the cumulus parameterization). Therefore, if you want the accumulated
>>> precipitation for a certain day, you must select the corresponding time
>>> for these variables and display it.
>>> Hope I could be of some help!
>>>
>>> Felipe Neris
>>>
>>>
>>> On Thu, Feb 25, 2016 at 3:45 PM, <wrf-users-request at ucar.edu> wrote:
>>>
>>>>
>>>> Today's Topics:
>>>>
>>>>    1. (no subject) (afwande juliet)
>>>>    2. Nesting and Domain Decomposition (Douglas Lowe)
>>>>    3. Re: Nesting and Domain Decomposition (Tabish Ansari)
>>>>
>>>>
>>>> ----------------------------------------------------------------------
>>>>
>>>> Message: 1
>>>> Date: Thu, 25 Feb 2016 15:45:08 +0300
>>>> From: afwande juliet <afwandej965 at gmail.com>
>>>> Subject: [Wrf-users] (no subject)
>>>> To: wrf users group <wrf-users at ucar.edu>, wrfhelp <wrfhelp at ucar.edu>
>>>>
>>>> I ask this again:
>>>> I have WRF simulations for 1981. The model output is 3-hourly, i.e. 8
>>>> time steps per day.
>>>> When I want daily or monthly values, do I take every 8th time step as
>>>> the rain total for each day and sum these up to get monthly totals?
>>>> Do I have to divide the units (mm) by any number to get mm/day?
>>>>
>>>>
>>>>
>>>> thanks
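>>>>
>>>> For illustration, a minimal Python sketch of one way to do this
>>>> (placeholder file name; if the bucket scheme is on, apply the
>>>> I_RAINC/I_RAINNC correction described elsewhere in this thread first).
>>>> Daily totals come from differencing the accumulated field at day
>>>> boundaries, not from summing the 3-hourly values, and the result is
>>>> already in mm per day:
>>>>
>>>> from netCDF4 import Dataset
>>>> import numpy as np
>>>>
>>>> STEPS_PER_DAY = 8  # 3-hourly output
>>>>
>>>> with Dataset("wrfout_d01_1981.nc") as nc:  # placeholder file name
>>>>     # Precipitation accumulated since the start of the simulation, in mm.
>>>>     accum = np.asarray(nc.variables["RAINC"][:]) + np.asarray(nc.variables["RAINNC"][:])
>>>>
>>>> def daily_total(day):
>>>>     """Precipitation (mm) that fell during day number `day` (0-based)."""
>>>>     start = day * STEPS_PER_DAY   # first output time of the day
>>>>     end = start + STEPS_PER_DAY   # first output time of the next day
>>>>     # Exact indices depend on whether the initial time is in the output.
>>>>     return accum[end] - accum[start]
>>>>
>>>> # Monthly totals work the same way: accumulation at the end of the month
>>>> # minus the accumulation at its start.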
>>>>
>>>> ------------------------------
>>>>
>>>> Message: 2
>>>> Date: Thu, 25 Feb 2016 13:59:29 +0000
>>>> From: Douglas Lowe <Douglas.Lowe at manchester.ac.uk>
>>>> Subject: [Wrf-users] Nesting and Domain Decomposition
>>>> To: "wrf-users at ucar.edu" <wrf-users at ucar.edu>
>>>>
>>>> Hi all,
>>>>
>>>> I'm running WRF-Chem with a nest of 3 domains, with the settings listed
>>>> below. I'd like to be
>>>> able to split this across as many processes as possible in order to
>>>> speed things up (currently
>>>> I'm only managing 3x real time, which isn't very good when running
>>>> multiday simulations).
>>>> Unfortunately I am finding that WRF hangs when calling the photolysis
>>>> driver for my 2nd domain
>>>> (which is the smallest of the domains) if I use too many processors.
>>>>
>>>> The (relevant) model domain settings are:
>>>> max_dom            = 3,
>>>> e_we               = 134,   81,   91,
>>>> e_sn               = 146,   81,   91,
>>>> e_vert             = 41,    41,   41,
>>>> num_metgrid_levels = 38,
>>>> dx                 = 15000, 3000, 1000,
>>>> dy                 = 15000, 3000, 1000,
>>>>
>>>> WRF will run when I split over up to 168 processes (7 nodes on the
>>>> ARCHER supercomputer), but won't work if I split over 192 (or more)
>>>> processes (8 nodes on ARCHER).
>>>>
>>>> Looking at the log messages I *think* that WRF is splitting each domain
>>>> into the same
>>>> number of patches, and sending one patch from each domain to a single
>>>> process for
>>>> analysis. However, this means that I am limited by the smallest domain
>>>> as to how many
>>>> patches I can split a domain into before we end up with patches which
>>>> are dwarfed by
>>>> the halos around them.
>>>>
>>>> Would it not make more sense to be able to split each domain into
>>>> different numbers
>>>> of patches (so that each patch is of a similar size, regardless of
>>>> which domain it is from) and
>>>> send one patch from one domain to a single process (or, perhaps, send
>>>> more patches from the
>>>> outer domains to a single process, if needed for balancing
>>>> computational demands)? And
>>>> is there any way for me to do this with WRF?
>>>>
>>>> Thanks,
>>>> Doug
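>>>>
>>>> To illustrate why the smallest domain is the limiting factor, here is a
>>>> rough back-of-the-envelope Python sketch (the process-grid shapes used
>>>> for 168 and 192 processes below are assumptions; WRF chooses nproc_x and
>>>> nproc_y itself unless they are set explicitly):
>>>>
>>>> # Patch size per domain when every domain is split over the same
>>>> # process grid, as the log messages suggest WRF is doing.
>>>> domains = {"d01": (134, 146), "d02": (81, 81), "d03": (91, 91)}  # (e_we, e_sn)
>>>>
>>>> for nprocs, (nx, ny) in {168: (12, 14), 192: (12, 16)}.items():
>>>>     print(f"{nprocs} processes as a {nx} x {ny} process grid:")
>>>>     for name, (e_we, e_sn) in domains.items():
>>>>         print(f"  {name}: patches of roughly {e_we // nx} x {e_sn // ny} points")
>>>>
>>>> With 192 processes split as 12 x 16, the d02 patches come out at roughly
>>>> 6 x 5 grid points, small enough that the halo regions dominate each patch.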
>>>>
>>>> ------------------------------
>>>>
>>>> Message: 3
>>>> Date: Thu, 25 Feb 2016 16:37:26 +0000
>>>> From: Tabish Ansari <tabishumaransari at gmail.com>
>>>> Subject: Re: [Wrf-users] Nesting and Domain Decomposition
>>>> To: Douglas Lowe <Douglas.Lowe at manchester.ac.uk>
>>>> Cc: "wrf-users at ucar.edu" <wrf-users at ucar.edu>
>>>>
>>>> Hi Doug,
>>>>
>>>> I'm not too knowledgeable in this but have some literature which might
>>>> be
>>>> of relevance. Please have a look at the attached files.
>>>>
>>>> Cheers,
>>>>
>>>> Tabish
>>>>
>>>> Tabish U Ansari
>>>> PhD student, Lancaster Environment Center
>>>> Lancaster University
>>>> Bailrigg, Lancaster,
>>>> LA1 4YW, United Kingdom
>>>>
>>>> Attachments (scrubbed by the list archive):
>>>> Name: WRF-HPC.pdf (application/pdf, 243897 bytes)
>>>> Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment.pdf
>>>> Name: WRF-chapter-multicore.pdf (application/pdf, 230093 bytes)
>>>> Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0001.pdf
>>>> Name: CUDA-WRF_ppt.pdf (application/pdf, 2314206 bytes)
>>>> Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0002.pdf
>>>>
>>>> ------------------------------
>>>>
>>>>
>>>>
>>>> End of Wrf-users Digest, Vol 138, Issue 16
>>>> ******************************************
>>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>>
>>

