[ncl-talk] ERRNO=12
Dave Allured - NOAA Affiliate
dave.allured at noaa.gov
Thu Oct 8 19:22:32 MDT 2020
Please CC the mailing list on your replies.
Serializing means processing in chunks, which you are already doing. I did
not mean anything more than that. If you are getting memory errors, then
perhaps your chunks are too large.
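For concreteness, here is a minimal sketch of chunked processing in NCL. The
file name "in.nc", the variable name "x", and its time x lat x lon layout are
assumptions for illustration; a time mean is computed by reading one time
step at a time:

f = addfile ("in.nc", "r")              ; open read-only; reads metadata only
dims  = getfilevardimsizes (f, "x")     ; dimension sizes from metadata alone
ntime = dims(0)

xsum = todouble (f->x(0,:,:))           ; first time step initializes the sum
do it = 1, ntime - 1
  xsum = xsum + f->x(it,:,:)            ; read and accumulate one step at a time
end do
xavg = xsum / ntime                     ; time mean, never holding all steps

With whole time steps as chunks, as in this sketch, memory use stays near the
size of one 2-D slice rather than the size of the whole variable.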
It is also possible that something in your code is trying to read the
entire data set, and you don't realize it. For example, each of these
statements seems innocent, but each one tries to read an entire array
into memory before producing a very simple result. With a very large file,
either one can cause a memory error or very slow performance:
printVarSummary (f->x)     ; reads all of x into memory just to print a summary
dims = dimsizes (f->x)     ; reads all of x just to get its dimension sizes
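When you only need the metadata, NCL can get the same information without
reading any data. A minimal sketch, using the same file handle f and variable
name "x" as above:

printFileVarSummary (f, "x")          ; summary built from file metadata only
dims = getfilevardimsizes (f, "x")    ; dimension sizes, no data read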
If that does not help solve the problem, then please show the complete NCL
statement that gets the memory error, and the complete error message. Also
show the dimensions of any arrays used in that statement.
On Thu, Oct 8, 2020 at 7:01 PM STEFAN RAHIMI-ESFARJANI <s.rahimi at ucla.edu>
wrote:
> Thanks for this,
>
> The 3 TB is not all meant to be in memory simultaneously. I have set up the code to
> read "chunks" of data at a time, average them, and then move on.
>
> Any specific recommendations would be appreciated... I'm not sure what you
> mean by serializing the code...
>
> Thanks again,
> -Stefan
>
> On Thu, Oct 8, 2020 at 6:31 PM Dave Allured - NOAA Affiliate via ncl-talk <
> ncl-talk at mailman.ucar.edu> wrote:
>
>> 3 TB is a very large data set, difficult to contain all in memory at the
>> same time. I recommend using serial methods to perform your analysis along
>> one or two dimensions, one step at a time. You can implement serial
>> methods in NCL or any other scientific programming language. In NCL, you
>> can use the full complement of library functions to assist your serial
>> methods.
>>
>>
>> On Thu, Oct 8, 2020 at 12:08 PM Chathurika via ncl-talk <
>> ncl-talk at mailman.ucar.edu> wrote:
>>
>>>
>>> Hello,
>>>
>>>
>>> I want to handle a large data set (more than 3 TB). However, I am getting
>>> error number 12 due to a memory allocation failure. I have read a lot of
>>> Q&A threads on this issue, but there is no proper answer. I have to
>>> analyze this large data set, so please let me know how I can solve this
>>> memory allocation issue. I hope someone already has a solution for this.
>>>
>>>
>>> Thank you so much and best regards
>>> ------------------------------
>>>
>>> Wickramage Chathurika Hemamali
>>>
>>> Msc in Physical Oceanography
>>> State Key Laboratory of Tropical Oceanography
>>> South China Sea Institute of Oceanology
>>> University of Chinese Academy of Science
>>> China
>>>
>>> Specialized in Oceanography and Marine Geology (Bachelor)
>>> University of Ruhuna
>>> Matara
>>> Sri Lanka
>>>
>>> Email: wickramagechathurika at rocketmail.com
>>> chatu at scsio.ac.cn
>>>
>>>