[ncl-talk] Running the NCL code becomes slower
Zhifeng Yang
yangzf01 at gmail.com
Fri Jun 26 14:26:44 MDT 2015
Hi Walter & Dennis,
Actually, I just found that the slowdown is mainly caused by the
plotting subroutine. Since SEVIRI data have a high spatial resolution
(3 km near nadir), the contour plot takes a long time to draw, even
though I set the RasterFill resource as below.
res@cnFillMode = "RasterFill"
or
res@trGridType = "TriangularMesh"
Do you know whether there is another way to plot this kind of
high-resolution data as a contour plot much faster?
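
(For reference, a sketch of one common workaround: subsample the field
before contouring, since a 3712 x 3712 array has far more pixels than most
output devices can render. The cn/gsn resources below are standard, but the
variable names "tau", "lat2d", "lon2d" are placeholders, not from the
script in this thread.)

```ncl
; Sketch: plot every Nth pixel of a high-resolution 2D field.
; "tau", "lat2d", "lon2d" are placeholder variables.
  stride = 4
  tau_sub       = tau(::stride, ::stride)
  tau_sub@lat2d = lat2d(::stride, ::stride)  ; curvilinear coordinates
  tau_sub@lon2d = lon2d(::stride, ::stride)

  res                = True
  res@cnFillOn       = True
  res@cnFillMode     = "RasterFill"   ; faster than the default AreaFill
  res@cnLinesOn      = False          ; skip drawing contour lines
  res@cnLineLabelsOn = False

  wks  = gsn_open_wks("png", "seviri_tau")
  plot = gsn_csm_contour_map(wks, tau_sub, res)
```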
Thank you
Zhifeng
On Fri, Jun 26, 2015 at 2:08 PM, Dennis Shea <shea at ucar.edu> wrote:
> This type of behavior *may* be associated with a memory leak. However,
> more information is needed.
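
(A sketch of one pattern worth checking when a time loop slows down:
per-iteration variables that are never freed. Variable and file names
here are placeholders.)

```ncl
; Sketch: free per-iteration variables at the bottom of the time loop
; so memory cannot accumulate across iterations.
; "files" is a placeholder array of input file paths.
do itime = 0, ntimes - 1
  a   = addfile(files(itime), "r")
  tau = a->Cloud_Optical_Thickness_16(start_ind_lat:end_ind_lat, \
                                      start_ind_lon:end_ind_lon)
  ; ... process tau ...
  delete([/a, tau/])   ; or reassign with := in newer NCL versions
end do
```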
>
> [0]
> Please *always* include what version of NCL you are using.
>
> [1]
> As noted by Walter, multiple embedded do loops (you have 5 levels) in any
> interpreted language (NCL, Matlab, Python, R, ...) will lead to slow
> execution times. Again, as noted by Walter, you may have to 'rethink' your
> approach to obtaining the variable.
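
(To illustrate the "rethink" suggestion, a sketch of replacing an
elementwise nested loop with one whole-array call; where() is a built-in
NCL function, and the variable names are placeholders.)

```ncl
; Sketch: slow version visits every pixel in interpreted code.
; do j = 0, ny-1
;   do i = 0, nx-1
;     if (tau(j,i) .gt. 0.) then
;       good(j,i) = tau(j,i)
;     end if
;   end do
; end do

; Fast version: one whole-array operation.
good = where(tau .gt. 0., tau, tau@_FillValue)
```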
>
> [2]
> You have provided no indication of the format of the SEVIRI files you are
> using.
>
> The only SEVIRI file in the NCL archive is an HDF5 file. It has no
> variable named 'Cloud_Optical_Thickness_16'. You may point ncl-talk to a
> WWW location from which some sample SEVIRI files of the type you are
> using may be obtained. Or you can ftp some files to:
>
> ftp ftp.cgd.ucar.edu
> anonymous
> email
> cd incoming
> put ... SEVIRI_file_01...
> put ... SEVIRI_file_02...
> quit
>
> [3]
> NCL's handling of HDF5 (H5) files has improved *significantly* over the
> last few releases. See the following pages, which document the HDF5
> improvements.
>
> 6.3.1: http://www.ncl.ucar.edu/future_release.shtml
> Not yet released **but** a test binary is **available**
>
> 6.3.0 and previous releases:
> http://www.ncl.ucar.edu/prev_releases.shtml
>
> In particular, 6.3.1 seems to be fairly robust.
>
> On Fri, Jun 26, 2015 at 10:23 AM, Zhifeng Yang <yangzf01 at gmail.com> wrote:
>
>> Hi
>>
>> I am trying to read SEVIRI data with many variables, each with
>> dimensions 3712 x 3712. I know the data are pretty large, but the
>> computer should read them smoothly, since I specified about 50 GB of
>> memory. Unfortunately, the code becomes slower and slower as it works
>> through the time loop. Here is a sample of my code.
>>
>> ; SET UP THE START TIME AND END TIME
>> start_year = 2008
>> end_year = 2008
>> start_month= 6
>> end_month = 6
>> start_day = 1
>> start_hour = 0
>> end_hour = 23
>> start_min = 0
>> end_min = 45
>> min_stride = 15
>> start_ind_lat = 1400
>> end_ind_lat = 3000
>> start_ind_lon = 1100
>> end_ind_lon = 2600
>>
>> ; DO YEAR LOOP
>> do iyear = start_year, end_year
>>
>> ; DO MONTH LOOP
>> do imonth = start_month, end_month
>>
>> ; CALCULATE THE NUMBER OF DAYS IN THIS MONTH
>> nday_month = days_in_month(iyear, imonth)
>> ; DO DAY LOOP
>> do iday = start_day, 10;nday_month
>> ; DO HOUR LOOP
>> do ihour = start_hour, end_hour
>> ; DO MINUTE LOOP
>> do imin = start_min, end_min, min_stride
>> ; READ VARIABLES FROM HDF FILE
>> a = addfile(dir + siyear + "/" + symd1 + "/" + filename, "r")
>> lat = (/a->MSG_Latitude(start_ind_lat:end_ind_lat, start_ind_lon:end_ind_lon)/)
>> lon = (/a->MSG_Longitude(start_ind_lat:end_ind_lat, start_ind_lon:end_ind_lon)/)
>> Cloud_Optical_Thickness_16 = a->Cloud_Optical_Thickness_16(start_ind_lat:end_ind_lat, start_ind_lon:end_ind_lon)
>>
>> end do ;imin
>> end do ;ihour
>> end do ;iday
>> end do ;imonth
>> end do ;iyear
>>
>>
>> Thank you
>> Zhifeng
>>
>> _______________________________________________
>> ncl-talk mailing list
>> ncl-talk at ucar.edu
>> List instructions, subscriber options, unsubscribe:
>> http://mailman.ucar.edu/mailman/listinfo/ncl-talk
>>
>>
>