[Wrf-users] Re: Wrf-users Digest, Vol 52, Issue 12

wrfhelp wrfhelp at ucar.edu
Wed Dec 24 17:20:04 MST 2008


Please see presentation p11.5 from this year's users' workshop; it
may help you improve performance on your computer.
wrfhelp

On Dec 24, 2008, at 12:00 PM, wrf-users-request at ucar.edu wrote:

> Send Wrf-users mailing list submissions to
> 	wrf-users at ucar.edu
>
> To subscribe or unsubscribe via the World Wide Web, visit
> 	http://mailman.ucar.edu/mailman/listinfo/wrf-users
> or, via email, send a message with subject or body 'help' to
> 	wrf-users-request at ucar.edu
>
> You can reach the person managing the list at
> 	wrf-users-owner at ucar.edu
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Wrf-users digest..."
>
>
> Today's Topics:
>
>   1. Nesting problem (Wg. Cdr. Dileep Puranik (Faculty at DOSS))
>   2. RE: Nesting problem
>      (Bridgham, Christopher J TSgt USAF PACAF 36 OSS/OSW)
>   3. WRF-Var compilation error with intel compiler (Abdullah Kahraman)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Wed, 24 Dec 2008 02:24:28 +0530 (IST)
> From: "Wg. Cdr. Dileep Puranik (Faculty at DOSS)"
> 	<dileepmp at unipune.ernet.in>
> Subject: [Wrf-users] Nesting problem
> To: wrf-users at ucar.edu
> Message-ID: <1230.59.95.31.48.1230065668.squirrel at unipune.ernet.in>
> Content-Type: text/plain; charset=iso-8859-1
>
> Hello,
>
> I have WRF V3.0 installed on an Intel quad-core machine with 4 GB of
> RAM, running Fedora 9 Linux.
>
> WRF V3.0 is compiled with the 'dmpar' option, since OpenMPI-2 is used
> for parallelization. This troubles me, since all four CPU cores
> address the same memory. Should we not use 'smpar' instead? The
> nesting option is 'basic'.
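>
> For reference, running a dmpar build on a single multi-core node is
> common: each MPI task has its own private address space, so sharing
> the physical RAM is not in itself a problem. As a minimal sketch,
> assuming a standard WRF 3.0 source tree (configure option numbers
> vary by platform), the build-and-run cycle is:
>
>   ./clean -a
>   ./configure                        # choose a dmpar option from the menu
>   ./compile em_real >& compile.log
>   mpirun -np 4 ./wrf.exe             # one MPI task per core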
>
> The model is run for a modestly sized domain, 15 x 15 degrees. When
> the model is run at dx = 27 km, 15 km, or 9 km, it runs smoothly.
> When it is run over a 360 km x 360 km domain at dx = 5 km or even
> 3 km, it also runs without problems. In the &domains part of the WRF
> namelist file, feedback = 1, so 2-way nesting should be OK. However,
> with even one tiny nest of 5 x 5 grid points, the computation aborts
> with a (memory) segmentation fault.
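>
> For concreteness, an abridged &domains sketch of the kind of
> two-domain setup described above (values are illustrative, not the
> actual namelist):
>
>   &domains
>    max_dom           = 2,
>    e_we              = 61,    7,
>    e_sn              = 61,    7,
>    dx                = 27000, 9000,
>    dy                = 27000, 9000,
>    i_parent_start    = 1,     30,
>    j_parent_start    = 1,     30,
>    parent_grid_ratio = 1,     3,
>    feedback          = 1,
>   /
>
> One thing to check: WRF expects each nest dimension to satisfy
> e_we = n * parent_grid_ratio + 1 (likewise e_sn), and a 5 x 5 nest
> fails that test for a 3:1 ratio; an invalid nest size could by itself
> crash the run instead of producing a clean error message.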
>
> I request help to solve the problem.
>
> Dileep Puranik
>
> Dr D M Puranik
> Department of Atmospheric and Space Science
> University of Pune, Pune 411007, India
>
>
>
>
> ------------------------------
>
> Message: 2
> Date: Wed, 24 Dec 2008 10:01:19 +1000
> From: "Bridgham, Christopher J TSgt USAF PACAF 36 OSS/OSW"
> 	<christopher.bridgham at andersen.af.mil>
> Subject: RE: [Wrf-users] Nesting problem
> To: <wrf-users at ucar.edu>
> Message-ID:
> 	<9C868151DDCA58479AEEAF22DD9B92B403898906 at ANMLMB02.andersen.pacaf.ds.af.mil>
> Content-Type: text/plain; charset="us-ascii"
>
> Dr. Puranik,
>
> 	I have found that WRF needs at least 2000 grid points to
> complete a run. I have run into the same problem trying to run small
> grids, and increasing the grid size to about 2000 points does solve
> it. I don't know whether there is an option you can set at compile
> time to improve on this.
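>
> My guess as to why, though I have not verified it: under dmpar each
> MPI task gets one patch of the grid plus a few halo rows, so a very
> small grid split four ways leaves patches thinner than the halo. One
> way to test a tiny grid is to force a single patch with the standard
> &domains decomposition variables:
>
>   &domains
>    nproc_x = 1,   ! force a 1 x 1 patch decomposition
>    nproc_y = 1,   ! (and launch a single task: mpirun -np 1 ./wrf.exe)
>   /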
>
> Respectfully,
> Chris Bridgham
> Weather Forecaster
> United States Air Force
>
> -----Original Message-----
> From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu]
> On Behalf Of Wg. Cdr. Dileep Puranik (Faculty at DOSS)
> Sent: Wednesday, December 24, 2008 6:54 AM
> To: wrf-users at ucar.edu
> Subject: [Wrf-users] Nesting problem
>
>
> ------------------------------
>
> Message: 3
> Date: Wed, 24 Dec 2008 14:31:39 +0200
> From: "Abdullah Kahraman" <havadurumu at gmail.com>
> Subject: [Wrf-users] WRF-Var compilation error with intel compiler
> To: wrf-users at ucar.edu
> Message-ID:
> 	<4313f910812240431i1b2f1c6enb7d262ea46c95926 at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Hello,
> I am trying to install WRF-Var on a Linux (Fedora) computer, using
> the Intel compiler (version 10.1.008). I have successfully installed
> and run WRF v3.0.1.1 and WPS. The error output from my compilation
> attempt is about 4700 lines long, so I am giving the first errors
> here.
> Did anyone encounter problems compiling WRF-Var with the Intel
> compiler? Any suggestions would be helpful.
> Best,
> Abdullah Kahraman
> Istanbul Technical University
> Turkey
>
> ...
>
> io_int.o: In function `ext_int_write_field_':
> io_int.f:(.text+0x1b84): undefined reference to `wrf_error_fatal_'
> io_int.f:(.text+0x1bbe): undefined reference to `wrf_error_fatal_'
> io_int.f:(.text+0x1c27): undefined reference to `wrf_error_fatal_'
> io_int.o: In function `ext_int_read_field_':
> io_int.f:(.text+0x2819): undefined reference to `wrf_error_fatal_'
> io_int.f:(.text+0x2838): undefined reference to `wrf_error_fatal_'
> io_int.f:(.text+0x2aeb): undefined reference to `wrf_message_'
> io_int.f:(.text+0x3d0c): undefined reference to `wrf_message_'
> io_int.o: In function `ext_int_put_var_td_double_':
> io_int.f:(.text+0x3eb4): undefined reference to `wrf_error_fatal_'
>
> ...
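>
> The undefined references above are link-stage errors: wrf_error_fatal
> and wrf_message are defined in WRF's frame/module_wrf_error.F, so the
> frame/ objects were probably never built or never passed to the
> linker. A quick check, assuming the standard source layout:
>
>   nm frame/module_wrf_error.o | grep -i wrf_error_fatal
>
> If nothing is listed, rebuilding from a clean tree (sketched after
> the next set of errors) is the simplest fix.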
>
>
> ifort -o da_par_util.o -c -O3 -w -ftz -align all -fno-alias -fp-model precise -FR -convert big_endian -r8 -i4 da_par_util.f
> fortcom: Error: da_par_util.f, line 11: Error in opening the compiled module file.  Check INCLUDE paths.   [MODULE_DOMAIN]
>   use module_domain, only : domain, xpose_type
> -------^
> fortcom: Error: da_par_util.f, line 13: Error in opening the compiled module file.  Check INCLUDE paths.   [DA_REPORTING]
>   use da_reporting, only : message
> -------^
> fortcom: Error: da_par_util.f, line 15: Error in opening the compiled module file.  Check INCLUDE paths.   [DA_DEFINE_STRUCTURES]
>   use da_define_structures, only : be_subtype, &
> -------^
> fortcom: Error: da_par_util.f, line 19: Error in opening the compiled module file.  Check INCLUDE paths.   [DA_CONTROL]
>   use da_control, only : trace_use, num_ob_indexes, myproc, root, comm, ierr, &
> -------^
> fortcom: Error: da_par_util.f, line 26: Error in opening the compiled module file.  Check INCLUDE paths.   [DA_REPORTING]
>   use da_reporting, only : da_error
> -------^
> fortcom: Error: da_par_util.f, line 27: Error in opening the compiled module file.  Check INCLUDE paths.   [DA_TRACING]
>   use da_tracing, only : da_trace_entry, da_trace_exit
> -------^
>
>
>
> ...
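>
> The fortcom errors are a different failure: "Error in opening the
> compiled module file" means the .mod files for module_domain,
> da_reporting, and so on were never produced, i.e. those modules
> failed to compile before da_par_util.f needed them, so the first
> error in the log is the one to chase. A minimal recovery sketch,
> assuming the usual clean/configure/compile cycle (the exact compile
> target, e.g. all_wrfvar, depends on the WRF-Var release):
>
>   ./clean -a                            # remove stale .o and .mod files
>   ./configure                           # re-select the Intel (ifort) option
>   ./compile all_wrfvar >& compile.log
>   grep -in error compile.log            # find the first real error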
>
> ------------------------------
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>
>
> End of Wrf-users Digest, Vol 52, Issue 12
> *****************************************


