[ncl-talk] Dataset required

Rick Brownrigg brownrig at ucar.edu
Mon Feb 24 06:58:54 MST 2020


So your system is reporting wsMaximumSize as 100MB, which is the default and
should be more than sufficient for the dataset in question. I tried limiting
my system to 100MB (I generally have it set to 500MB), and the script ran
just fine.

So I'm not sure what to advise you at this point. You can try increasing
wsMaximumSize, but I'm skeptical that will make a difference. You can do so
programmatically by placing the following lines at the top of your script:

setvalues NhlGetWorkspaceObjectId()
    "wsMaximumSize" : 500000000
end setvalues

More commonly, people override the default by setting wsMaximumSize in the
so-called ".hluresfile"; see:

    http://ncl.ucar.edu/Document/Graphics/hlures.shtml
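
For example, an entry in ~/.hluresfile would look something like the
following (the 500000000 value is just an illustration; use whatever limit
makes sense for your system):

    *wsMaximumSize : 500000000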

Good luck!
Rick



On Mon, Feb 24, 2020 at 1:41 AM Soma Roy <somaroy892 at gmail.com> wrote:

> The output is as below:
> ncl 0> getvalues  NhlGetWorkspaceObjectId()
> ncl 1>    "wsMaximumSize" : maxMem
> ncl 2> end getvalues
> ncl 3> print(maxMem)
>
>
> Variable: maxMem
> Type: long
> Total Size: 4 bytes
>             1 values
> Number of Dimensions: 1
> Dimensions and sizes:   [1]
> Coordinates:
> (0)     100000000
>
> And the result of "ulimit -a" is below:
>
> $ ulimit -a
> core file size          (blocks, -c) unlimited
> data seg size           (kbytes, -d) unlimited
> file size               (blocks, -f) unlimited
> open files                      (-n) 256
> pipe size            (512 bytes, -p) 8
> stack size              (kbytes, -s) 2036
> cpu time               (seconds, -t) unlimited
> max user processes              (-u) 256
> virtual memory          (kbytes, -v) unlimited
>
> Soma
>
> On Mon, Feb 24, 2020 at 8:28 AM Rick Brownrigg <brownrig at ucar.edu> wrote:
>
>> Hi,
>>
>> The script and that dataset work fine on my system, and barely make a
>> mark in the amount of memory consumed.  That message is coming from NCL
>> itself, not from your system. Please try running this little scriptlet from
>> within NCL, and report back to the group:
>>
>> getvalues  NhlGetWorkspaceObjectId()
>>    "wsMaximumSize" : maxMem
>> end getvalues
>> print(maxMem)
>>
>> Also, please report the results of "ulimit -a", as Gus advised.
>>
>> Finally, do you have a file named ".hluresfile" (note the leading period)
>> in your home directory?  If so, is there a line that begins
>> "*wsMaximumSize: xxxxx", and what is the value for xxxxx?  Perhaps bump it
>> up to 500000000.
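>>
>> A quick way to check from a shell (assuming the file is in your home
>> directory, as described above) is:
>>
>>     grep -i wsMaximumSize ~/.hluresfile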
>>
>> Rick
>>
>> On Sat, Feb 22, 2020 at 12:29 PM Soma Roy via ncl-talk <ncl-talk at ucar.edu>
>> wrote:
>>
>>> Thank you.
>>>
>>> But I am getting the error "dynamical memory allocation error".
>>>
>>> Approximately how much disk space is required to plot this data?
>>>
>>> Thanks & Regards,
>>> Soma
>>>
>>> On Sun, Feb 23, 2020, 00:24 Dennis Shea <shea at ucar.edu> wrote:
>>>
>>>> Attached
>>>>
>>>> On Sat, Feb 22, 2020 at 11:26 AM Soma Roy via ncl-talk <
>>>> ncl-talk at ucar.edu> wrote:
>>>>
>>>>> Hello,
>>>>> I am trying to run the script vegland_1.ncl, and for that I require the
>>>>> sample input file.
>>>>>
>>>>> Input file name: IGBPa_1198.map.nc
>>>>>
>>>>> Please kindly send me the input file.
>>>>>
>>>>> Thanking you,
>>>>> Soma
>>>>>