[ncl-talk] Dataset required

Soma Roy somaroy892 at gmail.com
Mon Feb 24 11:54:54 MST 2020


Not helpful for my case.

But thank you for trying to find a solution for me.

Soma

On Tue, Feb 25, 2020, 00:20 Gus Correa via ncl-talk <ncl-talk at ucar.edu>
wrote:

> Hi Soma, Rick, list
>
> I don't know if this applies to NCL, but in the past I had trouble running
> Fortran and C programs when either the stacksize or the max number of open
> files was too small, as seems to be the case on Soma's computer.
>
> Actually, I have never seen a Linux box with the max number of open files
> set below 1024 (I would suggest 4x that or more).
> The stacksize also seems too small (~2 MB); why not make it a more sizeable
> fraction of the total memory in your computer?
> If NCL passes arrays to subroutines/functions through the stack (not the
> heap), a small stacksize may be the limiting factor.
> [Rick: Does NCL use the stack to pass arrays to subroutines?]
>
> Anyway, those numbers appear to be low.
> Is this a virtual machine, perhaps?
>
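> For instance, here is a minimal sketch of how those limits could be raised
> for the current shell session (assuming a bash shell, and hard limits that
> allow it; the values below are only illustrative):
>
>     ulimit -n 4096        # raise the max number of open files from 256
>     ulimit -s 65532       # enlarge the stack (in kbytes); or "unlimited"
>     ulimit -a             # verify the new limits
>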
> Error messages can also be misleading red herrings: an error reported as
> out of memory is sometimes caused by something else entirely.
>
> I hope this helps,
> Gus Correa
>
> On Mon, Feb 24, 2020 at 8:59 AM Rick Brownrigg via ncl-talk <
> ncl-talk at ucar.edu> wrote:
>
>> So your system is reporting wsMaximumSize as 100MB, which is the default,
>> and should be more than sufficient for the dataset in question. I tried
>> limiting my system to 100MB (I generally have it set to 500MB), and the
>> script ran just fine.
>>
>> So I'm not sure what to advise you at this point. You can try to increase
>> the wsMaximumSize, but I'm skeptical that's going to make a difference. You
>> can do that programmatically by placing the following lines at the top of
>> your script:
>>
>> setvalues NhlGetWorkspaceObjectId()
>>   "wsMaximumSize" : 500000000
>> end setvalues
>>
>> More commonly, people override the default by setting wsMaximumSize in
>> the so-called ".hluresfile"; see:
>>
>>     http://ncl.ucar.edu/Document/Graphics/hlures.shtml
>>
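>> For example, a single line like the following in ~/.hluresfile sets the
>> limit to ~500 MB (a sketch; the leading asterisk is part of the resource
>> syntax):
>>
>>     *wsMaximumSize : 500000000
>>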
>> Good luck!
>> Rick
>>
>>
>>
>> On Mon, Feb 24, 2020 at 1:41 AM Soma Roy <somaroy892 at gmail.com> wrote:
>>
>>> The output is as below:
>>> ncl 0> getvalues  NhlGetWorkspaceObjectId()
>>> ncl 1>    "wsMaximumSize" : maxMem
>>> ncl 2> end getvalues
>>> ncl 3> print(maxMem)
>>>
>>>
>>> Variable: maxMem
>>> Type: long
>>> Total Size: 4 bytes
>>>             1 values
>>> Number of Dimensions: 1
>>> Dimensions and sizes:   [1]
>>> Coordinates:
>>> (0)     100000000
>>>
>>> And the result of ulimit -a is below:
>>>
>>> $ ulimit -a
>>> core file size          (blocks, -c) unlimited
>>> data seg size           (kbytes, -d) unlimited
>>> file size               (blocks, -f) unlimited
>>> open files                      (-n) 256
>>> pipe size            (512 bytes, -p) 8
>>> stack size              (kbytes, -s) 2036
>>> cpu time               (seconds, -t) unlimited
>>> max user processes              (-u) 256
>>> virtual memory          (kbytes, -v) unlimited
>>>
>>> Soma
>>>
>>> On Mon, Feb 24, 2020 at 8:28 AM Rick Brownrigg <brownrig at ucar.edu>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> The script and that dataset work fine on my system, and barely make a
>>>> mark in the amount of memory consumed.  That message is coming from NCL
>>>> itself, not from your system. Please try running this little scriptlet from
>>>> within NCL, and report back to the group:
>>>>
>>>> getvalues  NhlGetWorkspaceObjectId()
>>>>    "wsMaximumSize" : maxMem
>>>> end getvalues
>>>> print(maxMem)
>>>>
>>>> Also, please report the results of "ulimit -a", as Gus advised.
>>>>
>>>> Finally, do you have a file named ".hluresfile" (note the leading
>>>> period) in your home directory?  If so, is there a line that begins
>>>> "*wsMaximumSize: xxxxx", and what is the value for xxxxx?  Perhaps bump it
>>>> up to 500000000.
>>>>
>>>> Rick
>>>>
>>>> On Sat, Feb 22, 2020 at 12:29 PM Soma Roy via ncl-talk <
>>>> ncl-talk at ucar.edu> wrote:
>>>>
>>>>> Thank you..
>>>>>
>>>>> But I am getting the error "dynamical memory allocation error".
>>>>>
>>>>> Approximately how much disk space is required to plot this data?
>>>>>
>>>>> Thanks & Regards,
>>>>> Soma
>>>>>
>>>>> On Sun, Feb 23, 2020, 00:24 Dennis Shea <shea at ucar.edu> wrote:
>>>>>
>>>>>> Attached
>>>>>>
>>>>>> On Sat, Feb 22, 2020 at 11:26 AM Soma Roy via ncl-talk <
>>>>>> ncl-talk at ucar.edu> wrote:
>>>>>>
>>>>>>> Hello,
>>>>>>> I am trying to run the script vegland_1.ncl, and for that I require a
>>>>>>> sample input file.
>>>>>>>
>>>>>>> Input file name: IGBPa_1198.map.nc
>>>>>>>
>>>>>>> Please kindly send me the input file.
>>>>>>>
>>>>>>> Thanking you,
>>>>>>> Soma
>>>>>>>

