[ncl-talk] memory intensive script

Adam Herrington adam.herrington at stonybrook.edu
Fri Jul 22 17:11:56 MDT 2016


Hi all,

I'm getting a core dump from an NCL script that I'm running on Yellowstone,
most likely due to memory issues. The core dump occurs after I load the very
costly 28 km global simulations. My usual remedy is to loop through each
history file individually (there are ~60 history files for each model run)
instead of using "addfiles", which reduces the size of the variables held in
memory at any one time.
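
For reference, the per-file loop looks roughly like this (the case name, file
pattern, and variable name "T" below are just placeholders, not the actual
fields in my script):

  files  = systemfunc("ls case.cam.h0.*.nc")  ; one run's ~60 history files
  nfiles = dimsizes(files)

  do n = 0, nfiles - 1
    f = addfile(files(n), "r")   ; open a single history file
    T = f->T                     ; read only this file's portion of the data
    ; ... reduce/accumulate the needed quantities here ...
    delete(T)                    ; free memory before the next iteration
    delete(f)
  end do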

Unfortunately, I'm still getting the core dump. I've attached my script and
would appreciate any suggestions on ways to cut down on memory use.

I've never tried splitting the variables into smaller chunks, but I guess I'll
start trying that, unless someone has a better suggestion?
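
What I have in mind is something like the sketch below, reading one hyperslab
at a time rather than the full array (again, the variable name "T" and its
(time, lev, lat, lon) layout are just hypothetical placeholders):

  f    = addfile(files(n), "r")
  dims = getfilevardimsizes(f, "T")   ; e.g. (time, lev, lat, lon)
  nlev = dims(1)

  do k = 0, nlev - 1
    Tk = f->T(:, k, :, :)             ; read one vertical level per iteration
    ; ... compute and store the reduced quantity for this level ...
    delete(Tk)
  end do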

Thanks!

Adam
Attachment: dzonals3.ncl (application/octet-stream, 11782 bytes)
URL: http://mailman.ucar.edu/pipermail/ncl-talk/attachments/20160722/86b5c52a/attachment.obj

