<html>
  <head>
    <meta content="text/html; charset=utf-8" http-equiv="Content-Type">
  </head>
  <body bgcolor="#FFFFFF" text="#000000">
    <br>
    Afwande,<br>
    <br>
    I am not sure whether you have turned on the bucket scheme. If so,
    you should calculate the total convective and resolvable-scale
    precipitation as follows:<br>
     <br>
    total RAINC = RAINC + I_RAINC * bucket_mm<br>
    total RAINNC = RAINNC + I_RAINNC * bucket_mm<br>
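    <br>
    For example, a minimal sketch of that calculation in Python with the
    netCDF4 module (the file name is hypothetical, and bucket_mm should
    be the value set in your namelist):
    <pre>
# Sketch: reconstruct total accumulated precipitation when the bucket
# scheme is on.  Assumes a wrfout file with RAINC, RAINNC, I_RAINC and
# I_RAINNC variables.
from netCDF4 import Dataset

bucket_mm = 100.0  # assumption: the bucket_mm value from namelist.input

nc = Dataset("wrfout_d01_1981-02-01_00:00:00")  # hypothetical file name
rainc_total = nc.variables["RAINC"][:] + nc.variables["I_RAINC"][:] * bucket_mm
rainnc_total = nc.variables["RAINNC"][:] + nc.variables["I_RAINNC"][:] * bucket_mm

# total accumulated precipitation (mm) since the simulation start
precip_total = rainc_total + rainnc_total
</pre>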
    <br>
    Daily and monthly total precipitation should be derived from this
    total precipitation. For example, suppose you have run WRF for two
    months and need the monthly precipitation for the second month.
    Then calculate the accumulated precipitation at the end of the
    first month (call it PRECIP1) and at the end of the second month
    (call it PRECIP2). The monthly precipitation for the second month
    is:<br>
    <br>
    PRECIP2 - PRECIP1    <br>
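    <br>
    Continuing the sketch above (the time indices here are hypothetical;
    use the last output time of each month from your own files):
    <pre>
# Sketch: monthly precipitation as a difference of accumulated totals.
# The Times dimension is assumed to be the first axis of precip_total.
i_end_month1 = 247  # hypothetical index: last 3-hourly output of month 1
i_end_month2 = 471  # hypothetical index: last 3-hourly output of month 2

precip1 = precip_total[i_end_month1]  # accumulated precip (mm), end of month 1
precip2 = precip_total[i_end_month2]  # accumulated precip (mm), end of month 2

monthly_precip = precip2 - precip1  # precipitation (mm) during month 2
</pre>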
    <br>
    WRFHELP<br>
    <br>
     <br>
    <br>
    <div class="moz-cite-prefix">On 2/26/16 10:50 AM, Jimy Dudhia wrote:<br>
    </div>
    <blockquote
cite="mid:CAHpO-oOZ6YwrhQbjAWnHFGWsPRkk5yrOPSZ+REkn1we0bZttmw@mail.gmail.com"
      type="cite">
      <div dir="ltr">Maybe they have the bucket scheme on? If so, point
        them to the user guide on the use of the bucket_mm variables
        like i_rainnc
        <div>Jimy</div>
        <div><br>
          <div class="gmail_quote">---------- Forwarded message
            ----------<br>
            From: <b class="gmail_sendername">afwande juliet</b> <span
              dir="ltr">&lt;<a moz-do-not-send="true"
                href="mailto:afwandej965@gmail.com">afwandej965@gmail.com</a>&gt;</span><br>
            Date: Fri, Feb 26, 2016 at 1:38 AM<br>
            Subject: Re: [Wrf-users] Wrf-users Digest, Vol 138, Issue 16
            - (no subject) (afwande juliet)<br>
            To: Felipe Neris &lt;<a moz-do-not-send="true"
              href="mailto:felipenc2@gmail.com">felipenc2@gmail.com</a>&gt;<br>
            Cc: wrf users group &lt;<a moz-do-not-send="true"
              href="mailto:wrf-users@ucar.edu">wrf-users@ucar.edu</a>&gt;<br>
            <br>
            <br>
            <div dir="ltr">
              <div>
                <div>
                  <div>ok thanks<br>
                  </div>
                  when I do investigation about my rainc&amp;rainnc, i
                  find that rainc doesnt look cumulative (values
                  decrease and increase randomly within the time of
                  simulation). Looking at rainc further, there are big
                  values upto 1000mm with some negative values in
                  between. However rainnc, looks cumulative and the
                  values increase with time and there are no negative
                  values. Any other variables like temp etc looks
                  reasonable.<br>
                  <br>
                </div>
                And you know that precip=rainc+rainnc<br>
              </div>
              What could be the problem and how can i correct this?</div>
            <div class="HOEnZb">
              <div class="h5">
                <div class="gmail_extra"><br>
                  <div class="gmail_quote">On Fri, Feb 26, 2016 at 3:16
                    AM, Felipe Neris <span dir="ltr">&lt;<a
                        moz-do-not-send="true"
                        href="mailto:felipenc2@gmail.com"
                        target="_blank"><a class="moz-txt-link-abbreviated" href="mailto:felipenc2@gmail.com">felipenc2@gmail.com</a></a>&gt;</span>
                    wrote:<br>
                    <blockquote class="gmail_quote" style="margin:0 0 0
                      .8ex;border-left:1px #ccc solid;padding-left:1ex">
                      <div dir="ltr">Hi Juliet,
                        <div>If I got your question right, I suppose the
                          answer is: WRF sums all precipitation that
                          occurs since the beginning of the simulation
                          into the variables RAINNC (from microphysics
                          parameterization) and RAINC (from cumulus
                          parameterization). Therefor, if you want to
                          have the accumulated precipitation of a
                          certain day, you must specify/set the
                          corresponding time for these variables and
                          display. </div>
                        <div>Hope I could be of any help!</div>
                        <div>
                          <div class="gmail_extra"><br clear="all">
                            <div>
                              <div>
                                <div dir="ltr">
                                  <div dir="ltr">
                                    <div>Felipe Neris</div>
                                    <div>
                                      <div> </div>
                                    </div>
                                  </div>
                                </div>
                              </div>
                            </div>
                            <br>
                            <div class="gmail_quote">On Thu, Feb 25,
                              2016 at 3:45 PM, <span dir="ltr">&lt;<a
                                  moz-do-not-send="true"
                                  href="mailto:wrf-users-request@ucar.edu"
                                  target="_blank"><a class="moz-txt-link-abbreviated" href="mailto:wrf-users-request@ucar.edu">wrf-users-request@ucar.edu</a></a>&gt;</span>
                              wrote:<br>
                              <blockquote class="gmail_quote"
                                style="margin:0px 0px 0px
0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">Send
                                Wrf-users mailing list submissions to<br>
                                        <a moz-do-not-send="true"
                                  href="mailto:wrf-users@ucar.edu"
                                  target="_blank">wrf-users@ucar.edu</a><br>
                                <br>
                                To subscribe or unsubscribe via the
                                World Wide Web, visit<br>
                                        <a moz-do-not-send="true"
                                  href="http://mailman.ucar.edu/mailman/listinfo/wrf-users"
                                  rel="noreferrer" target="_blank">http://mailman.ucar.edu/mailman/listinfo/wrf-users</a><br>
                                or, via email, send a message with
                                subject or body 'help' to<br>
                                        <a moz-do-not-send="true"
                                  href="mailto:wrf-users-request@ucar.edu"
                                  target="_blank">wrf-users-request@ucar.edu</a><br>
                                <br>
                                You can reach the person managing the
                                list at<br>
                                        <a moz-do-not-send="true"
                                  href="mailto:wrf-users-owner@ucar.edu"
                                  target="_blank">wrf-users-owner@ucar.edu</a><br>
                                <br>
                                When replying, please edit your Subject
                                line so it is more specific<br>
                                than "Re: Contents of Wrf-users
                                digest..."<br>
                                <br>
                                <br>
                                Today's Topics:<br>
                                <br>
                                   1. (no subject) (afwande juliet)<br>
                                   2. Nesting and Domain Decomposition
                                (Douglas Lowe)<br>
                                   3. Re: Nesting and Domain
                                Decomposition (Tabish Ansari)<br>
                                <br>
                                <br>
----------------------------------------------------------------------<br>
                                <br>
                                Message: 1<br>
                                Date: Thu, 25 Feb 2016 15:45:08 +0300<br>
                                From: afwande juliet &lt;<a
                                  moz-do-not-send="true"
                                  href="mailto:afwandej965@gmail.com"
                                  target="_blank"><a class="moz-txt-link-abbreviated" href="mailto:afwandej965@gmail.com">afwandej965@gmail.com</a></a>&gt;<br>
                                Subject: [Wrf-users] (no subject)<br>
                                To: wrf users group &lt;<a
                                  moz-do-not-send="true"
                                  href="mailto:wrf-users@ucar.edu"
                                  target="_blank"><a class="moz-txt-link-abbreviated" href="mailto:wrf-users@ucar.edu">wrf-users@ucar.edu</a></a>&gt;,
                                wrfhelp &lt;<a moz-do-not-send="true"
                                  href="mailto:wrfhelp@ucar.edu"
                                  target="_blank">wrfhelp@ucar.edu</a>&gt;<br>
                                Message-ID:<br>
                                        &lt;<a moz-do-not-send="true"
href="mailto:CANVsOojVG3LzL1r82T0oKnBRAU-Nq2vVqHc5GKz4SSj9j0P3Cw@mail.gmail.com"
                                  target="_blank">CANVsOojVG3LzL1r82T0oKnBRAU-Nq2vVqHc5GKz4SSj9j0P3Cw@mail.gmail.com</a>&gt;<br>
                                Content-Type: text/plain;
                                charset="utf-8"<br>
                                <br>
                                I ask this again.<br>
                                I have WRF simulations for 1981. The
                                model output is 3-hourly, i.e. 8
                                timesteps per day.<br>
                                When I want daily or monthly values, do
                                I take every 8th timestep as the rain
                                total for each day and sum those up to
                                get monthly totals?<br>
                                Do I have to divide the units *mm* by
                                any number to get mm/day?<br>
                                <br>
                                <br>
                                <br>
                                thanks<br>
                                <br>
                                ------------------------------<br>
                                <br>
                                Message: 2<br>
                                Date: Thu, 25 Feb 2016 13:59:29 +0000<br>
                                From: Douglas Lowe &lt;<a
                                  moz-do-not-send="true"
                                  href="mailto:Douglas.Lowe@manchester.ac.uk"
                                  target="_blank"><a class="moz-txt-link-abbreviated" href="mailto:Douglas.Lowe@manchester.ac.uk">Douglas.Lowe@manchester.ac.uk</a></a>&gt;<br>
                                Subject: [Wrf-users] Nesting and Domain
                                Decomposition<br>
                                To: "<a moz-do-not-send="true"
                                  href="mailto:wrf-users@ucar.edu"
                                  target="_blank">wrf-users@ucar.edu</a>"
                                &lt;<a moz-do-not-send="true"
                                  href="mailto:wrf-users@ucar.edu"
                                  target="_blank">wrf-users@ucar.edu</a>&gt;<br>
                                Message-ID:<br>
                                        &lt;<a moz-do-not-send="true"
href="mailto:43E6B083008E774B87C4283E0FFA4E70012C7A0C91@MBXP02.ds.man.ac.uk"
                                  target="_blank">43E6B083008E774B87C4283E0FFA4E70012C7A0C91@MBXP02.ds.man.ac.uk</a>&gt;<br>
                                Content-Type: text/plain;
                                charset="us-ascii"<br>
                                <br>
                                Hi all,<br>
                                <br>
                                I'm running WRF-Chem with a nest of 3
                                domains, with the settings listed below.
                                I'd like to be<br>
                                able to split this across as many
                                processes as possible in order to speed
                                things up (currently<br>
                                I'm only managing 3x real time, which
                                isn't very good when running multiday
                                simulations).<br>
                                Unfortunately I am finding that WRF
                                hangs when calling the photolysis driver
                                for my 2nd domain<br>
                                (which is the smallest of the domains)
                                if I use too many processors.<br>
                                <br>
                                The (relevant) model domain settings are:<br>
                                max_dom            = 3,<br>
                                e_we               = 134, 81, 91,<br>
                                e_sn               = 146, 81, 91,<br>
                                e_vert             = 41, 41, 41,<br>
                                num_metgrid_levels = 38,<br>
                                dx                 = 15000, 3000, 1000,<br>
                                dy                 = 15000, 3000, 1000,<br>
                                <br>
                                WRF will run when I split over up to 168
                                processes (7 nodes on the ARCHER
                                supercomputer),<br>
                                but won't work if I split over 192 (or
                                more) processes (8 nodes on ARCHER).<br>
                                <br>
                                Looking at the log messages, I *think*
                                that WRF is splitting each domain into
                                the same number of patches, and sending
                                one patch from each domain to a single
                                process for analysis. However, this
                                means that the smallest domain limits
                                how many patches I can split a domain
                                into before we end up with patches
                                which are dwarfed by the halos around
                                them.<br>
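                                <br>
                                A rough back-of-the-envelope check of
                                the patch sizes in Python (decomposing
                                192 processes as 12 x 16 is an
                                assumption; WRF chooses its own
                                factorization of the process count):
                                <pre>
# Rough illustration: patch sizes when every domain is split over the
# same process grid.  The 12 x 16 decomposition of 192 processes is an
# assumption; WRF picks its own factorization.
domains = {"d01": (134, 146), "d02": (81, 81), "d03": (91, 91)}
nproc_x, nproc_y = 12, 16

for name, (e_we, e_sn) in domains.items():
    patch = (e_we // nproc_x, e_sn // nproc_y)
    print(name, "patch is roughly", patch, "grid points")

# d02 patches come out around 6 x 5 points, comparable to the width of
# the halos around them.
</pre>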
                                <br>
                                Would it not make more sense to be able
                                to split each domain into a different
                                number of patches (so that each patch is
                                of a similar size, regardless of which
                                domain it is from) and send one patch
                                from one domain to a single process (or,
                                perhaps, send more patches from the
                                outer domains to a single process, if
                                needed to balance the computational
                                load)? And is there any way for me to do
                                this with WRF?<br>
                                <br>
                                Thanks,<br>
                                Doug<br>
                                <br>
                                ------------------------------<br>
                                <br>
                                Message: 3<br>
                                Date: Thu, 25 Feb 2016 16:37:26 +0000<br>
                                From: Tabish Ansari &lt;<a
                                  moz-do-not-send="true"
                                  href="mailto:tabishumaransari@gmail.com"
                                  target="_blank"><a class="moz-txt-link-abbreviated" href="mailto:tabishumaransari@gmail.com">tabishumaransari@gmail.com</a></a>&gt;<br>
                                Subject: Re: [Wrf-users] Nesting and
                                Domain Decomposition<br>
                                To: Douglas Lowe &lt;<a
                                  moz-do-not-send="true"
                                  href="mailto:Douglas.Lowe@manchester.ac.uk"
                                  target="_blank"><a class="moz-txt-link-abbreviated" href="mailto:Douglas.Lowe@manchester.ac.uk">Douglas.Lowe@manchester.ac.uk</a></a>&gt;<br>
                                Cc: "<a moz-do-not-send="true"
                                  href="mailto:wrf-users@ucar.edu"
                                  target="_blank">wrf-users@ucar.edu</a>"
                                &lt;<a moz-do-not-send="true"
                                  href="mailto:wrf-users@ucar.edu"
                                  target="_blank">wrf-users@ucar.edu</a>&gt;<br>
                                Message-ID:<br>
                                        &lt;<a moz-do-not-send="true"
href="mailto:CALLVTyvwh3nimJczxjfAy%2BgNML1PfJpDudJT8TQhkDQrCLnwZw@mail.gmail.com"
                                  target="_blank">CALLVTyvwh3nimJczxjfAy+gNML1PfJpDudJT8TQhkDQrCLnwZw@mail.gmail.com</a>&gt;<br>
                                Content-Type: text/plain;
                                charset="utf-8"<br>
                                <br>
                                Hi Doug,<br>
                                <br>
                                I'm not too knowledgeable in this but
                                have some literature which might be<br>
                                of relevance. Please have a look at the
                                attached files.<br>
                                <br>
                                Cheers,<br>
                                <br>
                                Tabish<br>
                                <br>
                                Tabish U Ansari<br>
                                PhD student, Lancaster Environment
                                Centre<br>
                                Lancaster University<br>
                                Bailrigg, Lancaster,<br>
                                LA1 4YW, United Kingdom<br>
                                <br>
                                -------------- next part --------------<br>
                                A non-text attachment was scrubbed...<br>
                                Name: WRF-HPC.pdf<br>
                                Type: application/pdf<br>
                                Size: 243897 bytes<br>
                                Desc: not available<br>
                                Url : <a moz-do-not-send="true"
href="http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment.pdf"
                                  rel="noreferrer" target="_blank">http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment.pdf</a><br>
                                -------------- next part --------------<br>
                                A non-text attachment was scrubbed...<br>
                                Name: WRF-chapter-multicore.pdf<br>
                                Type: application/pdf<br>
                                Size: 230093 bytes<br>
                                Desc: not available<br>
                                Url : <a moz-do-not-send="true"
href="http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0001.pdf"
                                  rel="noreferrer" target="_blank">http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0001.pdf</a><br>
                                -------------- next part --------------<br>
                                A non-text attachment was scrubbed...<br>
                                Name: CUDA-WRF_ppt.pdf<br>
                                Type: application/pdf<br>
                                Size: 2314206 bytes<br>
                                Desc: not available<br>
                                Url : <a moz-do-not-send="true"
href="http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0002.pdf"
                                  rel="noreferrer" target="_blank">http://mailman.ucar.edu/pipermail/wrf-users/attachments/20160225/02afb9aa/attachment-0002.pdf</a><br>
                                <br>
                                ------------------------------<br>
                                <br>
_______________________________________________<br>
                                Wrf-users mailing list<br>
                                <a moz-do-not-send="true"
                                  href="mailto:Wrf-users@ucar.edu"
                                  target="_blank">Wrf-users@ucar.edu</a><br>
                                <a moz-do-not-send="true"
                                  href="http://mailman.ucar.edu/mailman/listinfo/wrf-users"
                                  rel="noreferrer" target="_blank">http://mailman.ucar.edu/mailman/listinfo/wrf-users</a><br>
                                <br>
                                <br>
                                End of Wrf-users Digest, Vol 138, Issue
                                16<br>
******************************************<br>
                              </blockquote>
                            </div>
                            <br>
                          </div>
                        </div>
                      </div>
                      <br>
                    </blockquote>
                  </div>
                  <br>
                </div>
              </div>
            </div>
            <br>
            <br>
          </div>
          <br>
        </div>
      </div>
    </blockquote>
    <br>
  </body>
</html>