Hi Maite,

Last week I had the same problem with our Linux / MPICH2 / ifort cluster. wrf.exe ran OK with 1 process, but with 2 or more processes it hung and never finished.
After a lot of time spent changing configurations, reinstalling the MPICH2 software, etc., I found that one of the computers that make up the cluster had its clock 5 minutes fast. After synchronizing the clocks of all the computers in the cluster with an SNTP server on the Internet, wrf.exe ran fine.
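
In case it is useful, this is roughly how you can check for the problem. The host names below are just examples (not our real node names), and any NTP/SNTP server your site allows will do:

    # Print the clock on every node and look for one that is off
    for host in node01 node02 node03 node04; do
        echo -n "$host: "; ssh $host date
    done

    # On the node whose clock is wrong, force a one-time sync (run as root)
    ntpdate pool.ntp.org

Running an NTP client (e.g. ntpd) on every node afterwards keeps the clocks from drifting apart again.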

I hope this idea can help you.

Best regards,

Jesus

2009/2/2 Maite Merino <mmerino@am.ub.es>:
Dear wrf-users,

I hope you can help me, because I think I'm stuck with a WRF problem in parallel mode.
I'm trying to work with WRF on a cluster with MPI and ifort. I have no problem compiling it in serial or dm mode. In serial mode, geogrid, ungrib, metgrid, real and wrf worked perfectly.
But in dm mode wrf.exe is not working, and I have no clue why.

Once WRFV3 and WPS were compiled in dm mode with Linux+ifort without any problem, all the WPS programs worked fine, and real.exe works well with 1 process executed as:
./real.exe
real.exe also works OK when executed through mpirun with more processes (I've tried up to 4). On this cluster, we send the mpirun command through a couple of scripts, called in this case real.sh and qsub_real.sh. We use the same script structure for running other programs there, such as MM5.
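
(Just so the setup is clear: the scripts are basically a queue wrapper around mpirun. The sketch below only shows the general idea; the real paths and queue options are site-specific and are not copied from our actual scripts.)

    #!/bin/bash
    # real.sh -- illustrative sketch only, not our actual script
    cd /path/to/wrf/run          # placeholder run directory
    mpirun -np 4 ./real.exe      # process count is whatever qsub_real.sh asks for

qsub_real.sh then simply submits real.sh to the queue (roughly "qsub real.sh", plus whatever resource options the queue needs).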

But with wrf.exe it does not work well. Compiled in serial mode, wrf.exe works fine. Compiled in dm mode with Linux+ifort, if I execute it as ./wrf.exe it works OK. It also works fine if I use the corresponding scripts, wrf.sh and qsub_wrf.sh, to execute it through an mpirun command asking for 1 process.
But if I try to use it with 2 or more processes, it starts to do its tasks but never finishes. I've waited up to 3 days for a simulation (which with 1 process takes only a couple of hours) and finally had to cancel it as well. When this happens, I cannot see any error or warning in the rsl.* files, etc. It seems as if it simply hangs and waits for something forever. It always stops in the same place, after the message "WRF NUMBER OF TILES=1".
Neither I nor the IT staff in my department have any idea of what can be happening. I hope you can help.
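
To be concrete, the situation with several processes boils down to something like this (the command line is simplified; the real run goes through wrf.sh and qsub_wrf.sh):

    mpirun -np 4 ./wrf.exe                # starts working, then stops progressing
    tail rsl.out.0000                     # last line is "WRF NUMBER OF TILES=1"
    grep -iE "error|warn" rsl.error.*     # nothing is reported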

To make it easy for you to look into this, I've made three different runs with the Colorado (NAM) case that we worked with during the July '08 tutorial in Boulder. I attach to this e-mail one zip file with the relevant output files of each:

1.- pr_serial.zip => the relevant files of the run as ./real.exe and ./wrf.exe. It worked fine. It might be useful for you to compare with the others.

2.- pr_1procs.zip => the relevant files of the run as ./wrf.sh, asking for only 1 process through mpirun. I did not redo real.exe in this case; I used the same wrfinput and wrfbdy files as before. You can see there that wrf.sh worked fine.

3.- pr_4procs.zip => the relevant files of the run as ./real.sh and ./wrf.sh, asking for 4 processes through mpirun (of course I cleaned out the outputs of the previous runs first). You can see there that real worked fine but that I finally had to cancel wrf because it was taking forever.

I also attach a pdf file with all the information that the cluster can give us about the pr_4procs.zip run, half a day after it started but before I cancelled it. You can also find there the specifications of the cluster itself.

With all this information, could you please help me discover what's happening and how to solve it? Any answer, help or suggestion will certainly be welcomed.
I'll be looking forward to your reply.
Best regards,

Maite Merino