From hding101 at googlemail.com Tue Sep 3 17:07:05 2024
From: hding101 at googlemail.com (Hui Ding)
Date: Tue, 3 Sep 2024 17:07:05 -0600
Subject: [ncl-talk] a question on the NCL function gradsf
Message-ID:

Dear Sir or Madam,

I note that the function gradsf can calculate gradients along longitude and latitude. I wonder about the sign of the gradient. For example, for igradsf(T_grad_lon, T_grad_lat, T), does T_grad_lon equal d(T)/dlon or -1*d(T)/dlon? The same question for T_grad_lat.

Thank you!

Best regards,
Hui Ding

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From shea at ucar.edu Wed Sep 4 08:18:04 2024
From: shea at ucar.edu (Dennis Shea)
Date: Wed, 4 Sep 2024 08:18:04 -0600
Subject: [ncl-talk] a question on the NCL function gradsf
In-Reply-To:
References:
Message-ID:

If I understand your question correctly: d(T)/dlon

https://www.ncl.ucar.edu/Applications/gradients.shtml

_______________________________________________
ncl-talk mailing list
ncl-talk at mailman.ucar.edu
List instructions, subscriber options, unsubscribe:
https://mailman.ucar.edu/mailman/listinfo/ncl-talk

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From g.graffino at tim.it Mon Sep 9 10:26:18 2024
From: g.graffino at tim.it (Giorgio Graffino)
Date: Mon, 9 Sep 2024 18:26:18 +0200 (CEST)
Subject: [ncl-talk] Query about probability density functions
Message-ID: <417671e7.9ed90.191d79b8288.Webtop.53@tim.it>

Hello NCL experts,
I'm using pdfx to compute probability density functions (https://www.ncl.ucar.edu/Document/Functions/Contributed/pdfx.shtml). I noticed that the cumulative density functions sum to 100 in NCL, instead of 1 as per the definition. Why is that? How can I normalize them back to 1?

Thanks for your help.

Cheers,
Giorgio

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From shea at ucar.edu Mon Sep 9 12:15:51 2024
From: shea at ucar.edu (Dennis Shea)
Date: Mon, 9 Sep 2024 12:15:51 -0600
Subject: [ncl-talk] Query about probability density functions
In-Reply-To: <417671e7.9ed90.191d79b8288.Webtop.53@tim.it>
References: <417671e7.9ed90.191d79b8288.Webtop.53@tim.it>
Message-ID:

From the pdfx documentation: "The PDF units are percent [%]."

Divide by 100.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From g.graffino at tim.it Wed Sep 11 08:01:19 2024
From: g.graffino at tim.it (Giorgio Graffino)
Date: Wed, 11 Sep 2024 16:01:19 +0200 (CEST)
Subject: [ncl-talk] Query about probability density functions
In-Reply-To: <417671e7.9ed90.191d79b8288.Webtop.53@tim.it>
References: <417671e7.9ed90.191d79b8288.Webtop.53@tim.it>
Message-ID: <1ab9988c.57a1e.191e1637d72.Webtop.51@tim.it>

Hi Dennis,

Thank you for pointing out that detail.
I've found the code of the pdfx function here (https://github.com/NCAR/ncl/blob/develop/ni/src/examples/gsun/contributed.ncl), and the frequency is computed as pdf = 100d0*pdf/nMax. I tried to normalize the pdf by multiplying it by the number of data points and dividing by 100, but the result differs greatly depending on how many data points are used to compute the pdf. The results also differ from similar computations in Python, especially with a large number of data points. Without knowing what the Fortran routine does (and I can't find it), I don't know how to normalize NCL to match Python.

Any suggestions?

Cheers,
Giorgio

-------------- next part --------------
An HTML attachment was scrubbed...
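[Editor's note] A quick way to see the mismatch described above: pdfx returns a percent frequency per bin, p_i = 100*n_i/N, while Python's density histogram returns n_i/(N*w) for bin width w. The minimal NumPy sketch below (an illustration, not from the original thread) shows that dividing by 100 alone is not enough to match Python; you must also divide by the bin width:

```python
# pdfx-style percent frequency vs. a NumPy density histogram.
# Conversion: density_i = percent_i / 100 / bin_width
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)

nbins = 25
counts, edges = np.histogram(x, bins=nbins)      # raw counts n_i
width = edges[1] - edges[0]                      # uniform bin width w

pdf_percent = 100.0 * counts / counts.sum()      # what pdfx returns
density = pdf_percent / 100.0 / width            # convert to a true density

# NumPy's density=True gives n_i / (N * w) directly
ref, _ = np.histogram(x, bins=edges, density=True)

assert np.allclose(density, ref)                 # identical up to roundoff
assert np.isclose((density * width).sum(), 1.0)  # integrates to 1
```

Note that dividing by 100 alone yields a per-bin fraction; whenever the bin width is less than 1 in the data's units, that fraction is smaller than the density, which is consistent with the "much smaller than Python or IDL" observation later in this thread.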
URL:

From dave.allured at noaa.gov Wed Sep 11 08:58:02 2024
From: dave.allured at noaa.gov (Dave Allured - NOAA Affiliate)
Date: Wed, 11 Sep 2024 08:58:02 -0600
Subject: [ncl-talk] Query about probability density functions
In-Reply-To: <1ab9988c.57a1e.191e1637d72.Webtop.51@tim.it>
References: <417671e7.9ed90.191d79b8288.Webtop.53@tim.it> <1ab9988c.57a1e.191e1637d72.Webtop.51@tim.it>
Message-ID:

Giorgio, the Fortran routine is at ni/src/lib/nfpfort/xy1pdf77.f in the NCL v6.6.2 source code.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From g.graffino at tim.it Wed Sep 11 09:35:38 2024
From: g.graffino at tim.it (Giorgio Graffino)
Date: Wed, 11 Sep 2024 17:35:38 +0200 (CEST)
Subject: [ncl-talk] Query about probability density functions
In-Reply-To: <1ab9988c.57a1e.191e1637d72.Webtop.51@tim.it>
References: <1ab9988c.57a1e.191e1637d72.Webtop.51@tim.it>
Message-ID: <520ac947.58137.191e1b9d7ec.Webtop.51@tim.it>

Thanks Dave,

The Fortran code looks straightforward. The last bit converts the pdf into a frequency ("if flag is set", according to the comment), but the same code is also present in contributed.ncl. I'm assuming the flag isn't set, otherwise the conversion would be done twice.

Unfortunately I'm not getting any closer to solving this problem. Simply dividing the NCL pdfs by 100 makes them much smaller than the Python or IDL pdfs. I don't know if anyone has encountered this issue before.

Cheers,
Giorgio

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From g.graffino at tim.it Fri Sep 13 09:25:20 2024
From: g.graffino at tim.it (Giorgio Graffino)
Date: Fri, 13 Sep 2024 17:25:20 +0200 (CEST)
Subject: [ncl-talk] Query about probability density functions
In-Reply-To: <1ab9988c.57a1e.191e1637d72.Webtop.51@tim.it>
References: <1ab9988c.57a1e.191e1637d72.Webtop.51@tim.it>
Message-ID: <28c3044f.5f1f6.191ebfd229f.Webtop.51@tim.it>

I've found a way to normalize the PDF so that it matches Python and IDL.

Here is the modified function for reference. I removed the call to the Fortran subroutine, so the function is now a bit slower than the original version, and I made all values float instead of double.
undef("pdfx_norm")
function pdfx_norm(x:numeric, nbin[1]:integer, opt:logical)
local nGood, nbins, xMin, xMax, mnmxint, xSpace \
    , bin, pdf, nTot, XMIN, XMAX, nLoOut, nHiOut
begin
  if (nbin.le.2) then
      nbins = 25                           ; default
  else
      nbins = nbin
  end if

  nGood = num(.not.ismissing(x))
  if (nGood.lt.3) then
      print("pdfx_norm: nGood="+nGood+" : Need more non-missing points")
      pdf = new( nbins, "double", getFillValue(x))
      return( pdf )
  end if

  XMIN = min(x)*1.
  XMAX = max(x)*1.

  xMin = 0.
  xMax = 0.

  if (opt .and. isatt(opt,"bin_min")) then
      xMin = opt@bin_min                   ; user set
  else
      xMin = XMIN                          ; calculate
  end if

  if (opt .and. isatt(opt,"bin_max")) then
      xMax = opt@bin_max                   ; user set
  else
      xMax = XMAX                          ; calculate
  end if

  if (opt .and. isatt(opt,"bin_nice")) then  ; nice xMin, xMax
      outside = False
      if (isatt(opt,"bin_nice_outside")) then
          outside = opt@bin_nice_outside
      end if
      mnmxint = nice_mnmxintvl( XMIN, XMAX, nbins, outside)
      xMin    = mnmxint(0)
      xMax    = mnmxint(1)
      xSpace  = mnmxint(2)
    ;;nbins   = round( (xMax-xMin)/xSpace , 3)
  end if

;;dbin        = (xMax-xMin)/(nbins-1)      ; 5.2.0
  dbin        = (xMax-xMin)/nbins
  binBound    = xMin + ispan(0,nbins,1)*dbin
  binBound(nbins) = xMax                   ; avoid roundoff

  binCenter   = (binBound(0:nbins-1) + binBound(1:nbins))*0.5

  binBoundMin = binBound(0)
  binBoundMax = binBound(nbins)

  pdf         = new( nbins, typeof(x), getFillValue(x))
  pdf         = 0.

  do nb=0,nbins-1
     pdf(nb) = num( x.ge.binBound(nb) .and. x.lt.binBound(nb+1) )
     if (nb.eq.(nbins-1)) then             ; last bin
         pdf(nb) = pdf(nb) + num( x.eq.binBound(nb+1) )  ; include last bound
     end if
  end do
;;----

  pdf!0    = "x"
  pdf&x    = binCenter
                                           ; max possible in data
  nMax     = num(x.ge.XMIN .and. x.le.XMAX)
                                           ; actual number used
  nUse     = num(x.ge.binBoundMin .and. x.le.binBoundMax)

  nLoOut   = num(x.lt.binBoundMin)         ; number of outliers
  nHiOut   = num(x.gt.binBoundMax)

; pdf      = pdf/nMax*1.e2                 ; original percent-frequency PDF
  pdf      = pdf/sum(pdf)/dbin             ; modified normalised PDF

  pdf@bin_center     = binCenter
  pdf@bin_bounds     = binBound
  pdf@bin_bound_min  = binBoundMin
  pdf@bin_bound_max  = binBoundMax
  pdf@bin_spacing    = dbin                ; binBound(2)-binBound(1)
  pdf@nbins          = nbins
  pdf@nMax           = nMax
  pdf@nUse           = nUse
  if (nLoOut.gt.0 .or. nHiOut.gt.0) then
      pdf@nLoOut     = nLoOut
      pdf@nHiOut     = nHiOut
  end if

  pdf@long_name      = "PDF"
  if (isatt(x,"long_name")) then
      pdf@long_name  = "PDF: "+x@long_name
  end if

  return( pdf )
end

-------------- next part --------------
An HTML attachment was scrubbed...
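[Editor's note] As a cross-check of the modified function above, its binning scheme can be reproduced outside NCL. This sketch (an illustration assuming NumPy, not from the original thread) mirrors the pdfx_norm loop, with half-open bins and the upper bound folded into the last bin, and verifies that the pdf/sum(pdf)/dbin normalisation agrees with numpy.histogram(..., density=True) and integrates to 1:

```python
# Python port of pdfx_norm's core: nbins equal bins over [min, max],
# each bin [b_k, b_{k+1}) half-open, except the last bin also includes
# the upper bound; then normalise counts to a density.
import numpy as np

def pdfx_norm_py(x, nbins=25):
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), nbins + 1)
    dbin = edges[1] - edges[0]
    counts = np.zeros(nbins)
    for k in range(nbins):
        counts[k] = np.count_nonzero((x >= edges[k]) & (x < edges[k + 1]))
    counts[-1] += np.count_nonzero(x == edges[-1])   # include last bound
    pdf = counts / counts.sum() / dbin               # normalised PDF
    return pdf, edges

rng = np.random.default_rng(1)
x = rng.normal(size=5_000)
pdf, edges = pdfx_norm_py(x)

# NumPy's density PDF over the same edges uses the same bin convention
ref, _ = np.histogram(x, bins=edges, density=True)
assert np.allclose(pdf, ref)
assert np.isclose(np.sum(pdf * np.diff(edges)), 1.0)
```

The agreement holds because, with no points outside the bin bounds, sum(pdf_counts) equals N, so counts/N/dbin is exactly the n_i/(N*w) density that NumPy computes.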
URL:

From dave.allured at noaa.gov Fri Sep 13 09:42:06 2024
From: dave.allured at noaa.gov (Dave Allured - NOAA Affiliate)
Date: Fri, 13 Sep 2024 09:42:06 -0600
Subject: [ncl-talk] Query about probability density functions
In-Reply-To: <28c3044f.5f1f6.191ebfd229f.Webtop.51@tim.it>
References: <1ab9988c.57a1e.191e1637d72.Webtop.51@tim.it> <28c3044f.5f1f6.191ebfd229f.Webtop.51@tim.it>
Message-ID:

Giorgio, can you briefly explain what the basic numerical differences between the NCL and Python/IDL versions are? Please explain mathematically, not with code.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From g.graffino at tim.it Tue Sep 17 04:09:55 2024
From: g.graffino at tim.it (Giorgio Graffino)
Date: Tue, 17 Sep 2024 12:09:55 +0200 (CEST)
Subject: [ncl-talk] Query about probability density functions
In-Reply-To: <28c3044f.5f1f6.191ebfd229f.Webtop.51@tim.it>
References: <28c3044f.5f1f6.191ebfd229f.Webtop.51@tim.it>
Message-ID: <255bd78c.2a385.191ff75cb56.Webtop.43@tim.it>

Hi Dave,

Unfortunately I have no idea. I think the Python and IDL functions do a kind of normalisation similar to pdf = pdf/sum(pdf)/dbin, but I don't know whether they use a different formula than NCL.

-------------- next part --------------
An HTML attachment was scrubbed...
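[Editor's note] To close the loop on the question that opened this thread: because pdfx reports percent per bin, its cumulative sum ends at 100, and dividing by 100 recovers a CDF that ends at 1. A minimal sketch (assuming NumPy; not from the original thread):

```python
# Percent-frequency PDF (pdfx convention) and its cumulative sum.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=2_000)

counts, edges = np.histogram(x, bins=25)
pdf_percent = 100.0 * counts / counts.sum()   # pdfx-style percent PDF

cdf_percent = np.cumsum(pdf_percent)
assert np.isclose(cdf_percent[-1], 100.0)     # ends at 100, as observed

cdf = cdf_percent / 100.0                     # normalised CDF
assert np.isclose(cdf[-1], 1.0)               # ends at 1, as per definition
```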
URL:

From mnotaro at wisc.edu Fri Sep 27 10:39:39 2024
From: mnotaro at wisc.edu (Michael Notaro)
Date: Fri, 27 Sep 2024 16:39:39 +0000
Subject: [ncl-talk] SLP
Message-ID:

Is there a method for estimating hourly sea-level pressure from the SRF model output variables below? I do have ATM 3D atmospheric fields, like temperature, but they are daily, making them not useful for an hourly SLP calculation.

Michael

        float topo(iy, jx) ;
                topo:long_name = "Surface Model Elevation" ;
                topo:standard_name = "surface_altitude" ;
                topo:units = "m" ;
                topo:coordinates = "xlat xlon" ;
                topo:grid_mapping = "rcm_map" ;
        float ps(time, iy, jx) ;
                ps:long_name = "Surface Pressure" ;
                ps:standard_name = "surface_air_pressure" ;
                ps:units = "hPa" ;
                ps:coordinates = "xlat xlon" ;
                ps:grid_mapping = "rcm_map" ;
                ps:cell_methods = "time: point" ;
        float ts(time, iy, jx) ;
                ts:long_name = "Ground surface temperature" ;
                ts:standard_name = "surface_temperature" ;
                ts:units = "K" ;
                ts:coordinates = "xlat xlon" ;
                ts:grid_mapping = "rcm_map" ;
                ts:cell_methods = "time: point" ;
        float tas(time, m2, iy, jx) ;
                tas:long_name = "Near surface air temperature" ;
                tas:standard_name = "air_temperature" ;
                tas:units = "K" ;
                tas:coordinates = "xlat xlon" ;
                tas:grid_mapping = "rcm_map" ;
                tas:cell_methods = "time: point" ;
        float qas(time, m2, iy, jx) ;
                qas:long_name = "Near surface air specific humidity" ;
                qas:standard_name = "specific_humidity" ;
                qas:units = "1" ;
                qas:coordinates = "xlat xlon" ;
                qas:grid_mapping = "rcm_map" ;
                qas:cell_methods = "time: point" ;

Michael Notaro
Director
Nelson Institute Center for Climatic Research
University of Wisconsin-Madison
Phone: (608) 261-1503
Email: mnotaro at wisc.edu

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From shea at ucar.edu Fri Sep 27 15:51:58 2024
From: shea at ucar.edu (Dennis Shea)
Date: Fri, 27 Sep 2024 15:51:58 -0600
Subject: [ncl-talk] SLP
In-Reply-To:
References:
Message-ID:

Hi Michael,

I am not aware of any method to estimate hourly sea-level pressure from daily data. If you find a method, post an outline to ncl-talk.

Regards

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
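[Editor's note] The thread leaves the SLP question open. Note, however, that the hourly fields listed in the question (ps in hPa, tas in K, topo in m) are sufficient for a standard hypsometric reduction of surface pressure to sea level; no 3D daily fields are needed for that approximation. The sketch below is one common textbook formulation, not a method endorsed in the thread; the constants (g = 9.80665 m/s^2, Rd = 287.05 J/(kg K), 6.5 K/km lapse rate for the mean layer temperature) are assumptions of this illustration:

```python
# Hypsometric sea-level reduction: extrapolate surface pressure down
# through a fictitious layer of thickness topo_m whose mean temperature
# is estimated from the near-surface air temperature and a standard
# lapse rate.
import numpy as np

G = 9.80665        # gravity, m/s^2
RD = 287.05        # dry-air gas constant, J/(kg K)
LAPSE = 0.0065     # standard lapse rate, K/m

def slp_hypsometric(ps_hpa, tas_k, topo_m):
    """SLP [hPa] from surface pressure [hPa], 2 m temperature [K],
    and surface elevation [m]."""
    t_mean = tas_k + 0.5 * LAPSE * topo_m       # mean layer temperature
    return ps_hpa * np.exp(G * topo_m / (RD * t_mean))

# At sea level the reduction is the identity
assert np.isclose(slp_hypsometric(1000.0, 288.15, 0.0), 1000.0)

# A 500 m grid cell at 950 hPa reduces to a plausible SLP near 1008 hPa
slp = slp_hypsometric(950.0, 285.0, 500.0)
assert 1000.0 < slp < 1015.0
```

Applied gridpoint-wise to the hourly ps, tas, and topo arrays, this yields an hourly SLP field; more elaborate reductions (e.g. humidity corrections using qas) refine the mean layer temperature but follow the same structure.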