<div dir="ltr"><div>Giorgio, the fortran routine is at <b>ni/src/lib/nfpfort/xy1pdf77.f</b> in the NCL v6.6.2 source code.</div><div><span style="color:rgb(0,0,0);font-family:Menlo;font-size:18px;background-color:rgb(255,238,235)"><br></span></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Sep 11, 2024 at 8:01 AM Giorgio Graffino via ncl-talk <<a href="mailto:ncl-talk@mailman.ucar.edu">ncl-talk@mailman.ucar.edu</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-style:solid;border-left-color:rgb(204,204,204);padding-left:1ex"><p><span style="font-family:Arial;font-size:12pt">Hi Dennis,</span></p><p><span style="font-family:Arial;font-size:12pt">Thank you for pointing out that detail.</span></p><p><span style="font-family:Arial;font-size:12pt">I've found the code of the pdfx function in here (<a href="https://github.com/NCAR/ncl/blob/develop/ni/src/examples/gsun/contributed.ncl" target="_blank">https://github.com/NCAR/ncl/blob/develop/ni/src/examples/gsun/contributed.ncl</a>) and the frequency is computed as pdf = 100d0*pdf/nMax. I tried to normalize the pdf by multiplying the frequency pdf by the number of data and divide by 100, but the results is very different depending on the number of data to compute the pdf. Also, the results are different from similar computations using Python, especially with large number of data. Without knowing what the Fortran routine does (and I can't find it), I don't know how to normalize NCL to match Python.</span></p><p><span style="font-family:Arial;font-size:12pt">Any suggestions?</span></p><p><span style="font-family:Arial;font-size:12pt">Cheers,</span></p><p><span style="font-family:Arial;font-size:12pt">Giorgio</span><br><br> </p><blockquote><p>------ Messaggio Originale ------<br>Da: <a href="mailto:shea@ucar.edu" target="_blank">shea@ucar.edu</a><br>A: <a href="mailto:g.graffino@tim.it" target="_blank">g.graffino@tim.it</a> Cc: <a href="mailto:ncl-talk@ucar.edu" target="_blank">ncl-talk@ucar.edu</a><br>Inviato: lunedì, settembre 9º 2024, 08:15 PM<br>Oggetto: Re: [ncl-talk] Query about probability density functions<br></p><div dir="ltr"><div>From the <a rel="noopener noreferrer" href="https://www.ncl.ucar.edu/Document/Functions/Contributed/pdfx.shtml" target="_blank"><strong>pdfx</strong></a> documentation: "The PDF units are percent [%]."</div><div> </div><div>Divide by 100.<br></div></div><p> </p><div class="gmail_quote"><div class="gmail_attr" dir="ltr">On Mon, Sep 9, 2024 at 10:26 AM Giorgio Graffino via ncl-talk <<a href="mailto:ncl-talk@mailman.ucar.edu" target="_blank"><span>ncl-talk@mailman.ucar.edu</span></a>> wrote:<br> </div><blockquote class="gmail_quote" style="border-left-width:1px;border-left-style:solid;border-left-color:rgb(204,204,204);margin:0px 0px 0px 0.8ex;padding-left:1ex"><p><span style="font-family:Arial;font-size:12pt">Hello NCL experts,</span></p><p><span style="font-family:Arial;font-size:12pt">I'm using pdfx to compute probability density functions (</span><a rel="noopener noreferrer" href="https://www.ncl.ucar.edu/Document/Functions/Contributed/pdfx.shtml" target="_blank"><span style="font-family:Arial;font-size:12pt">https://www.ncl.ucar.edu/Document/Functions/Contributed/pdfx.shtml</span></a><span style="font-family:Arial;font-size:12pt">). I noticed that the cumulative density functions sum up to 100 in NCL, instead of 1 as per definition. Why is that? 
How can I normalize them back to 1?</span><br></p><p><span style="font-family:Arial;font-size:12pt">Thanks for your help.</span></p><p><span style="font-family:Arial;font-size:12pt">Cheers,</span></p><p><span style="font-family:Arial;font-size:12pt">Giorgio</span></p></blockquote></div></blockquote>
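
For reference: since pdfx returns the percentage of values falling in each bin (pdf = 100d0*pdf/nMax in contributed.ncl), turning it into a density that integrates to 1 (what Python's density histograms return) means dividing by 100 and by the bin width, not scaling by the sample size. A minimal NCL sketch, assuming the bin_spacing attribute that pdfx attaches to its output and using placeholder names x and nbins for the data and bin count:

    ; x is the 1-D data array, nbins the number of bins (placeholders)
    pdf  = pdfx(x, nbins, False)    ; pdfx output: percent of values per bin

    frac = pdf/100d0                ; fraction per bin; sum(frac) = 1

    bw      = pdf@bin_spacing       ; bin width attached by pdfx (per contributed.ncl);
                                    ; if absent, use differences of pdf@bin_center
    density = pdf/(100d0*bw)        ; density per bin; sum(density*bw) = 1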
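With the same bin edges on both sides, this should agree with a Python histogram normalized as a density (e.g. numpy.histogram with density=True, which for equal-width bins returns count/(N*bin_width)): the sample size N cancels out of the comparison, which would explain why scaling the pdfx output by the number of data points gives results that drift with the sample size. Likewise, a cumulative sum of frac (the percentages divided by 100) reaches 1, addressing the original question about cumulative curves summing to 100.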