[Wrf-users] error while running wrf.exe in 2way nesting
turuncu at be.itu.edu.tr
Thu Sep 11 00:49:01 MDT 2008
Hi,
I have exactly the same problem with the nesting option. I get the following error:
Fatal error in MPI_Wait: Other MPI error, error stack:
MPI_Wait(139).............................: MPI_Wait(request=0x3fdf078, status=0x7fbf813340) failed
MPIDI_CH3_Progress(904)...................: handle_sock_op failed
MPIDI_CH3I_Progress_handle_sock_event(294):
MPIDU_Socki_handle_read(623)..............: connection closed by peer (set=0,sock=3)
I am using the Intel MPI library, the Intel compiler, and Red Hat AS4U4 Linux.
> uname -a
Linux cn02 2.6.9-42.0.10.EL_SFS2.2_1smp #1 SMP Tue Feb 26 11:50:41 EET
2008 x86_64 x86_64 x86_64 GNU/Linux
I have not been able to solve the problem yet. Single-domain cases, such as
the JAN 2000 test case, run successfully under MPI. I hope this problem can
be solved. Please let me know if you find any information about it.
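
For anyone hitting the same abort: when one rank dies, the other ranks usually show only the generic MPI_Wait message, so the real failure is in one rank's rsl log. Here is a sketch of how I look for it (the directory and log contents below are fabricated purely so the commands are self-contained and runnable anywhere; in a real run you would cd to the WRF run directory instead):

```shell
# Sketch: find the MPI rank whose rsl log contains the real failure.
# A dmpar run writes one rsl.error.NNNN file per rank.
# Demo setup with fabricated files so this runs outside a WRF run dir:
cd "$(mktemp -d)"
printf 'd01 2008-07-30_00:00:00 ok\n'     > rsl.error.0000
printf 'FATAL CALLED FROM FILE: <demo>\n' > rsl.error.0001

# Ranks other than the one that died show only the generic abort;
# grep for WRF's FATAL marker to find the culprit rank's log.
grep -l 'FATAL' rsl.error.*
```

Then read the tail of that one file rather than the others.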
Best,
--ufuk
> Dear users,
> We ran WRF with two nests. After running wrf.exe it shows the error
> below, but it still produces wrfout files for all three domains.
>
> (Our system details:
>
> Darwin mets-computer.local 8.11.1 Darwin Kernel Version 8.11.1: Wed Oct 10
> 18:23:28 PDT 2007; root:xnu-792.25.20~1/RELEASE_I386 i386 i386)
>
> In the terminal it shows:
>
> rank 1 in job 19 mets-computer.local_49299 caused collective abort of all ranks
> exit status of rank 1: killed by signal 4
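
(An aside on that exit status: signal 4 is SIGILL, "illegal instruction". One common cause, though only a guess here, is an executable built for a different CPU architecture than the one it runs on, rather than a namelist problem. The mapping can be confirmed from Python's standard signal table:)

```python
import signal

# Map the "killed by signal 4" exit status to its POSIX signal name.
# On POSIX systems (Linux and Darwin alike) signal 4 is SIGILL.
print(signal.Signals(4).name)   # prints SIGILL
```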
>
> In rsl.out it shows:
>
> d01 2008-07-30_00:00:00 *** Initializing nest domain # 2 by horizontally interpolating parent domain # 1. ***
> STEPRA,STEPCU,STEPBL 54 10 1
> STEPRA,STEPCU,STEPBL 18 3 1
> Timing for Writing wrfout_d01_2008-07-30_00:00:00 for domain 1:
> 1.40720 elapsed seconds.
> Timing for processing lateral boundary for domain 1: 0.26370
> elapsed seconds.
> WRF NUMBER OF TILES = 1
>
> Our namelist.input:
>
> &time_control
> run_days = 1,
> run_hours = 0,
> run_minutes = 0,
> run_seconds = 0,
> start_year = 2008, 2008, 2008,
> start_month = 07, 07, 07,
> start_day = 30, 30, 30,
> start_hour = 00, 00, 00,
> start_minute = 00, 00, 00,
> start_second = 00, 00, 00,
> end_year = 2008, 2008, 2008,
> end_month = 07, 07, 07,
> end_day = 31, 31, 31,
> end_hour = 00, 00, 00,
> end_minute = 00, 00, 00,
> end_second = 00, 00, 00,
> interval_seconds = 21600,
> input_from_file = .true.,.false.,.false.,
> history_interval = 360, 360, 360,
> frames_per_outfile = 1000, 1000, 1000,
> restart = .false.,
> restart_interval = 5000,
> io_form_history = 2
> io_form_restart = 2
> io_form_input = 2
> io_form_boundary = 2
> debug_level = 0
> /
>
> &domains
> time_step = 90,
> time_step_fract_num = 0,
> time_step_fract_den = 1,
> max_dom = 3,
> s_we = 1, 1, 1,
> e_we = 100, 136, 262,
> s_sn = 1, 1, 1,
> e_sn = 123, 163, 178,
> s_vert = 1, 1, 1,
> e_vert = 28, 28, 28,
> num_metgrid_levels = 27
> dx =27000, 9000, 3000,
> dy =27000, 9000, 3000,
> grid_id = 1, 2, 3,
> parent_id = 0, 1, 2,
> i_parent_start = 1, 12, 20,
> j_parent_start = 1, 6, 9,
> parent_grid_ratio = 1, 3, 3,
> parent_time_step_ratio = 1, 3, 3,
> feedback = 1,
> smooth_option = 0
> /
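
As a quick cross-check of the &domains values above, here is a rough sketch of the usual rules of thumb (not the official WRF checker): for each nest, (e_we - 1) and (e_sn - 1) should be divisible by parent_grid_ratio, and the nest must fit inside its parent domain. The quoted settings pass both checks, so the nest geometry itself does not look like the culprit:

```python
# Sanity-check the nest geometry quoted in &domains above.
e_we      = [100, 136, 262]
e_sn      = [123, 163, 178]
parent_id = [0, 1, 2]          # 1-based domain ids; 0 means "no parent"
i_start   = [1, 12, 20]
j_start   = [1, 6, 9]
ratio     = [1, 3, 3]

def check_nests(e_we, e_sn, parent_id, i_start, j_start, ratio):
    problems = []
    for d in range(1, len(e_we)):          # skip the coarse domain
        p = parent_id[d] - 1               # parent index (0-based)
        if (e_we[d] - 1) % ratio[d] or (e_sn[d] - 1) % ratio[d]:
            problems.append(f"d{d+1:02d}: dims not divisible by ratio")
        if i_start[d] + (e_we[d] - 1) // ratio[d] > e_we[p]:
            problems.append(f"d{d+1:02d}: runs past parent in west-east")
        if j_start[d] + (e_sn[d] - 1) // ratio[d] > e_sn[p]:
            problems.append(f"d{d+1:02d}: runs past parent in south-north")
    return problems

print(check_nests(e_we, e_sn, parent_id, i_start, j_start, ratio))
# prints [] for the settings above (no problems found)
```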
>
> &physics
> mp_physics = 3, 3, 3,
> ra_lw_physics = 1, 1, 1,
> ra_sw_physics = 1, 1, 1,
> radt = 27, 27, 27,
> sf_sfclay_physics = 1, 1, 1,
> sf_surface_physics = 1, 1, 1,
> bl_pbl_physics = 1, 1, 1,
> bldt = 0, 0, 0,
> cu_physics = 1, 1, 0,
> cudt = 5, 5, 5,
> isfflx = 1,
> ifsnow = 0,
> icloud = 1,
> surface_input_source = 1,
> num_soil_layers = 5,
> ucmcall = 0,
> mp_zero_out = 0,
> maxiens = 1,
> maxens = 3,
> maxens2 = 3,
> maxens3 = 16,
> ensdim = 144,
> /
>
> &fdda
> /
>
> &dynamics
> w_damping = 0,
> diff_opt = 1,
> km_opt = 4,
> diff_6th_opt = 0,
> diff_6th_factor = 0.12,
> base_temp = 290.
> damp_opt = 0,
> zdamp = 5000., 5000., 5000.,
> dampcoef = 0.01, 0.01, 0.01
> khdif = 0, 0, 0,
> kvdif = 0, 0, 0,
> non_hydrostatic = .true., .true., .true.,
> pd_moist = .false., .false., .false.,
> pd_scalar = .false., .false., .false.,
> /
>
> &bdy_control
> spec_bdy_width = 5,
> spec_zone = 1,
> relax_zone = 4,
> specified = .true., .false.,.false.,
> nested = .false., .true., .true.,
> /
>
> &grib2
> /
>
> &namelist_quilt
> nio_tasks_per_group = 0,
> nio_groups = 1,
> /
>
>
> Waiting for your reply.
>
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>