[ncl-talk] trend_manken data size limit?

Dennis Shea shea at ucar.edu
Sun Apr 5 08:17:04 MDT 2015


10,000 => 3 sec;   20,000 => 10 sec;   25,000 => 15 sec

To my knowledge, there is no 'internal' NCL reason why 29,737 points should take "infinity" seconds.
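For context, those timings are consistent with a roughly quadratic cost: the Mann-Kendall statistic is built from all pairwise comparisons of the series, so doubling the length roughly quadruples the work. The sketch below is a generic pure-Python illustration of that pairwise S statistic, not NCL's actual trend_manken implementation.

```python
# Generic illustration of the Mann-Kendall S statistic (NOT NCL source):
# S sums sign(x[j] - x[i]) over all pairs i < j, which is O(n^2) in the
# series length -- hence runtime grows ~quadratically with n.

def mann_kendall_s(x):
    """Count concordant minus discordant pairs over all i < j."""
    n = len(x)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = x[j] - x[i]
            s += (diff > 0) - (diff < 0)
    return s

# A strictly increasing series makes every pair concordant,
# so S equals the total number of pairs, n*(n-1)/2.
print(mann_kendall_s([1, 2, 3, 4]))  # 6
```

By that scaling, 29,737 points would extrapolate to on the order of 20 seconds, not a hang, which is why the behavior reported below is puzzling.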

===
To my knowledge, all TRMM data are float ... not double. Hence, your comment "I switched from double to floating point" is a surprise.

FYI:
*All* NCL computational functions use double precision. Float values are temporarily promoted to double for computations. The results are then 'demoted' to float and returned. Punch line: users need not create double versions of the data.
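A rough Python analogy of that promote/compute/demote round trip (using the standard struct module to round to IEEE float32; this mimics the idea, it is not NCL's internals):

```python
# Analogy only: float input is widened to double for the arithmetic,
# then the answer is rounded back down to float before being returned.
import struct

def to_float32(x):
    """Round a Python float (double) to the nearest IEEE float32."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

vals32 = [to_float32(v) for v in [0.1, 0.2, 0.3]]  # the "float" input data
total  = sum(vals32)        # Python arithmetic runs in double precision
result = to_float32(total)  # result 'demoted' back to float on return
```

The user-visible answer is a float either way, which is why hand-converting the input to double buys nothing.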


On Sat, Apr 4, 2015 at 12:56 PM, David Adams <dave.k.adams at gmail.com> wrote:

> Hi all,
> I am using the trend_manken function to look at the significance of
> precipitation intensity trends (using TRMM data) over 15 years of 3-hourly
> data.  When I use 10,000 data points it takes about 3 seconds to run,
> 20,000 data points about 10 seconds, 25,000 about 15 seconds.  When I use
> the entire time series of 29,737 data points, it never finishes.  I have
> waited more than 20 minutes and it keeps spinning.  I switched from double
> to floating point, but that makes no difference.
>
> ;---------------------------------
> ; Mann Kendall test
> ;---------------------------------
> pa = trend_manken(precip_ave, False, 0)
> print(pa)
>
>
>
> Any ideas?
>
> saludos,
> Dave
>
> _______________________________________________
> ncl-talk mailing list
> List instructions, subscriber options, unsubscribe:
> http://mailman.ucar.edu/mailman/listinfo/ncl-talk
>
>
