Hi,

Problem description:

I am running WRF 3.2.1 in "dm+sm" parallel mode on a cluster of 40 nodes (16 CPUs per node) running a 64-bit Linux OS. The cluster uses Open MPI 1.4.2 compiled with ifort 11.1.073. Compiling the model for "dm+sm" appeared to succeed.
When I set OMP_NUM_THREADS to 2 or greater, the model terminates with the error message:

forrtl: severe (40): recursive I/O operation, unit 0, file unknown

When I set OMP_NUM_THREADS to 1, the model runs successfully.

Please help me out. Thanks a lot!
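In case it helps with the diagnosis: my (unverified) understanding is that forrtl severe (40) on unit 0 can occur when two OpenMP threads perform I/O on the same unit at the same time and the executable is linked against the non-thread-safe Intel Fortran runtime (libifcore) rather than the thread-safe libifcoremt, which the ifort documentation says is selected by compiling/linking with -threads. A quick check, as a sketch:

# Sketch: inspect which Intel Fortran runtime wrf.exe is linked against.
# libifcoremt.so should be the thread-safe variant; plain libifcore.so is not.
ldd ./wrf.exe | grep -i ifcore

I have not confirmed that relinking with -threads fixes the crash, so please treat this as a guess.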
I am using Open MPI with PBS; the job file looks like this:

#!/bin/csh
#PBS -l nodes=2:ppn=8
#PBS -m ae
setenv OMP_NUM_THREADS 2
time mpirun wrf.csh

where wrf.csh unlimits the stack size and then executes the model:

#!/bin/csh
limit stacksize unlimited
exec wrf.exe
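One thing I am not sure of is whether OMP_NUM_THREADS, set inside the PBS job, actually reaches the ranks started on the second node. Below is a variant of the job file I am considering, as a sketch: -x and -npernode are taken from the Open MPI 1.4 mpirun man page, and the 4-ranks-by-2-threads layout is my own assumption for filling 8 cores per node, not something I have verified against WRF.

#!/bin/csh
#PBS -l nodes=2:ppn=8
#PBS -m ae
setenv OMP_NUM_THREADS 2
# -x exports OMP_NUM_THREADS into every rank's environment;
# -npernode 4 starts 4 MPI ranks per node, so 4 ranks x 2 threads = 8 cores.
time mpirun -x OMP_NUM_THREADS -npernode 4 wrf.csh

Without -npernode, my current script starts one rank per ppn slot (16 ranks), and each rank then spawns 2 threads, which would oversubscribe the nodes.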
Below is the output of ompi_info:

Package: Open MPI
Open MPI SVN revision: r23093
Open MPI release date: May 04, 2010
Open RTE: 1.4.2
Open RTE SVN revision: r23093
Open RTE release date: May 04, 2010
OPAL: 1.4.2
OPAL SVN revision: r23093
OPAL release date: May 04, 2010
Ident string: 1.4.2
Prefix: /usr/local/openmpi-intel-11.1.073
Configured architecture: x86_64-unknown-linux-gnu
Configure host: sirius
Configured by: marc
Configured on: Mon Aug 30 18:04:32 EDT 2010
Configure host: sirius
Built by: marc
Built on: Mon Aug 30 18:12:48 EDT 2010
Built host: sirius
C bindings: yes
C++ bindings: yes
Fortran77 bindings: yes (all)
Fortran90 bindings: yes
Fortran90 bindings size: small
C compiler: icc
C compiler absolute: /usr/local/intel/Compiler/11.1/073/bin/intel64/icc
C++ compiler: icpc
C++ compiler absolute: /usr/local/intel/Compiler/11.1/073/bin/intel64/icpc
Fortran77 compiler: ifort
Fortran77 compiler abs: /usr/local/intel/Compiler/11.1/073/bin/intel64/ifort
Fortran90 compiler: ifort
Fortran90 compiler abs: /usr/local/intel/Compiler/11.1/073/bin/intel64/ifort
C profiling: yes
C++ profiling: yes
Fortran77 profiling: yes
Fortran90 profiling: yes
C++ exceptions: no
Thread support: posix (mpi: no, progress: no)
Sparse Groups: no
Internal debug support: no
MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
libltdl support: yes
Heterogeneous support: no
mpirun default --prefix: yes
MPI I/O support: yes
MPI_WTIME support: gettimeofday
Symbol visibility support: yes
FT Checkpoint support: no (checkpoint thread: no)
MCA backtrace: execinfo (MCA v2.0, API v2.0, Component v1.4.2)
MCA memory: ptmalloc2 (MCA v2.0, API v2.0, Component v1.4.2)
MCA paffinity: linux (MCA v2.0, API v2.0, Component v1.4.2)
MCA carto: auto_detect (MCA v2.0, API v2.0, Component v1.4.2)
MCA carto: file (MCA v2.0, API v2.0, Component v1.4.2)
MCA maffinity: first_use (MCA v2.0, API v2.0, Component v1.4.2)
MCA maffinity: libnuma (MCA v2.0, API v2.0, Component v1.4.2)
MCA timer: linux (MCA v2.0, API v2.0, Component v1.4.2)
MCA installdirs: env (MCA v2.0, API v2.0, Component v1.4.2)
MCA installdirs: config (MCA v2.0, API v2.0, Component v1.4.2)
MCA dpm: orte (MCA v2.0, API v2.0, Component v1.4.2)
MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.4.2)
MCA allocator: basic (MCA v2.0, API v2.0, Component v1.4.2)
MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: basic (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: inter (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: self (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: sm (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: sync (MCA v2.0, API v2.0, Component v1.4.2)
MCA coll: tuned (MCA v2.0, API v2.0, Component v1.4.2)
MCA io: romio (MCA v2.0, API v2.0, Component v1.4.2)
MCA mpool: fake (MCA v2.0, API v2.0, Component v1.4.2)
MCA mpool: rdma (MCA v2.0, API v2.0, Component v1.4.2)
MCA mpool: sm (MCA v2.0, API v2.0, Component v1.4.2)
MCA pml: cm (MCA v2.0, API v2.0, Component v1.4.2)
MCA pml: csum (MCA v2.0, API v2.0, Component v1.4.2)
MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.4.2)
MCA pml: v (MCA v2.0, API v2.0, Component v1.4.2)
MCA bml: r2 (MCA v2.0, API v2.0, Component v1.4.2)
MCA rcache: vma (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: ofud (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: openib (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: self (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: sm (MCA v2.0, API v2.0, Component v1.4.2)
MCA btl: tcp (MCA v2.0, API v2.0, Component v1.4.2)
MCA topo: unity (MCA v2.0, API v2.0, Component v1.4.2)
MCA osc: pt2pt (MCA v2.0, API v2.0, Component v1.4.2)
MCA osc: rdma (MCA v2.0, API v2.0, Component v1.4.2)
MCA iof: hnp (MCA v2.0, API v2.0, Component v1.4.2)
MCA iof: orted (MCA v2.0, API v2.0, Component v1.4.2)
MCA iof: tool (MCA v2.0, API v2.0, Component v1.4.2)
MCA oob: tcp (MCA v2.0, API v2.0, Component v1.4.2)
MCA odls: default (MCA v2.0, API v2.0, Component v1.4.2)
MCA ras: slurm (MCA v2.0, API v2.0, Component v1.4.2)
MCA ras: tm (MCA v2.0, API v2.0, Component v1.4.2)
MCA rmaps: load_balance (MCA v2.0, API v2.0, Component v1.4.2)
MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.4.2)
MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.4.2)
MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.4.2)
MCA rml: oob (MCA v2.0, API v2.0, Component v1.4.2)
MCA routed: binomial (MCA v2.0, API v2.0, Component v1.4.2)
MCA routed: direct (MCA v2.0, API v2.0, Component v1.4.2)
MCA routed: linear (MCA v2.0, API v2.0, Component v1.4.2)
MCA plm: rsh (MCA v2.0, API v2.0, Component v1.4.2)
MCA plm: slurm (MCA v2.0, API v2.0, Component v1.4.2)
MCA plm: tm (MCA v2.0, API v2.0, Component v1.4.2)
MCA filem: rsh (MCA v2.0, API v2.0, Component v1.4.2)
MCA errmgr: default (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: env (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: hnp (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: singleton (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: slurm (MCA v2.0, API v2.0, Component v1.4.2)
MCA ess: tool (MCA v2.0, API v2.0, Component v1.4.2)
MCA grpcomm: bad (MCA v2.0, API v2.0, Component v1.4.2)
MCA grpcomm: basic (MCA v2.0, API v2.0, Component v1.4.2)

Regards,

Zhenduo