From hedde.cea at gmail.com Mon Jan 3 08:20:04 2011 From: hedde.cea at gmail.com (Thierry HEDDE) Date: Mon, 3 Jan 2011 16:20:04 +0100 Subject: [Wrf-users] Resolution of global data In-Reply-To: <624705.6571.qm@web76010.mail.sg1.yahoo.com> References: <624705.6571.qm@web76010.mail.sg1.yahoo.com> Message-ID: I didn't see any response to your mail, so here is one: 2.5° is not 250 km! Along an Earth circumference (a meridian, for example), 1 minute of arc is 1 nautical mile (nm), which is 1852 m. You may apply this rule to latitudes (1° = 60 min = 60 nm ≈ 111 km, so 2.5° ≈ 278 km). For longitudes, unless you live on the equator, you have to multiply by cos(phi), where phi is the latitude. Cordially Thierry HEDDE 2010/12/7 Asnor Muizan Ishak > Dear Everyone, > > May I ask you all a question about the degree of resolution of global data, > for instance ECMWF? This global data is considered as an input for WRF or MM5 as > well. According to ECMWF, the data can be downloaded from 2.5 degrees down to > 0.0 degrees. The 2.5-degree resolution is equal to 250 km, am I right? But in my case I > have used 1.0-degree data, so should I set my coarse domain to 100 km > resolution as well, to match the 1.0-degree resolution? Will this make a > huge error after post-processing? At the moment my coarse domains are 81 km and > 27 km and my finest domains are 3 km and 1 km (I have 2 cases here). Can > anyone help me with this conflict of information? I tried to find > this information in the guidelines or procedures, but so far I can't find > it. Please help me with this information. > > Many thanks in advance. > Asnor > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- Thierry HEDDE Laboratoire de Modélisation des Transferts dans l'Environnement CEA/CADARACHE DEN/DTN/SMTM/LMTE Bât.
307 Pièce 9 13108 ST PAUL LEZ DURANCE CEDEX FRANCE -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110103/296266c7/attachment.html From Don.Morton at alaska.edu Wed Jan 5 17:00:57 2011 From: Don.Morton at alaska.edu (Don Morton) Date: Wed, 5 Jan 2011 15:00:57 -0900 Subject: [Wrf-users] Alaska Weather Symposium, 15-16 March 2011, Call for Abstracts Message-ID: The 2011 Alaska Weather Symposium (AWS '11) 15-16 March 2011 University of Alaska Fairbanks Call for Abstracts Symposium Web Page: http://weather.arsc.edu/Events/AWS11/ Symposium flyer suitable for adorning walls, windows, doors, etc.: http://weather.arsc.edu/Events/AWS11/AWS-2011-Flyer.pdf The following sponsors (listed alphabetically) - Alaska Region, NOAA National Weather Service - Arctic Region Supercomputing Center (UAF) - College of Natural Science and Mathematics (UAF) - Geophysical Institute (UAF) - International Arctic Research Center (UAF) invite you to attend the 2011 Alaska Weather Symposium. The symposium provides a forum for the exchange of operational and research information related to weather in the Alaska environment. Participation from academic, research, government, military, and private sectors is encouraged. Primary areas of focus are anticipated to be - Air quality - Data assimilation - Models and evaluation for arctic systems - Boundary layer processes - Communication of predictability and uncertainty However, as usual, all abstracts relating to weather in Alaska are welcome. --Schedule/Venue-- This will be a two-day symposium held at the University of Alaska Fairbanks campus on Tuesday and Wednesday, 15-16 March 2011. Snacks will be provided and evening meals will be on-your-own, with an organized evening out (pay on your own) at a local establishment. --Abstract Submission-- The deadline for one-paragraph abstracts for a 20-minute presentation is Tuesday, 25 January 2011.
See the symposium web page for abstract submission procedures. --Registration-- Registration is required by Tuesday, 08 March 2011. See the symposium web page for registration instructions. There is no registration fee. Symposium Web Page: http://weather.arsc.edu/Events/AWS11/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110105/d1ea3d66/attachment.html From Chris.Franks at noaa.gov Wed Jan 5 14:49:16 2011 From: Chris.Franks at noaa.gov (Chris Franks) Date: Wed, 05 Jan 2011 15:49:16 -0600 Subject: [Wrf-users] WPP Error Message-ID: <4D24E75C.4010709@noaa.gov> Greetings All, I am new to the list...but have seen this error message being discussed before in other places. I'm getting this kind of error: + mv WRFPRS001.tm00 WRFPRS_d01.001 mv: cannot stat `WRFPRS001.tm00': No such file or directory + ls -l WRFPRS_d01.001 ls: cannot access WRFPRS_d01.001: No such file or directory + err1=2 + test 2 -ne 0 + echo 'WRF POST FAILED, EXITTING' WRF POST FAILED, EXITTING + exit Does anyone have a working script that processes 15-min data? 
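The sub-hourly renaming that such scripts perform reduces to a small loop over the minutes of each forecast hour. A minimal sketch follows; the file-name patterns are assumptions modeled on the script quoted in this message, not verified WPP output names:

```shell
# Build the source/destination names for each 15-min output time of one
# forecast hour.  WPP is assumed to write WRFPRS<fhr>.tm00 on the hour
# and WRFPRS<fhr>:<min>.tm00 otherwise (per the script in this thread).
fhr=001
domain=d02
for min in 00 15 30 45; do
  if [ "$min" = "00" ]; then
    src="WRFPRS${fhr}.tm00"
  else
    src="WRFPRS${fhr}:${min}.tm00"
  fi
  # Print rather than execute the rename, since this is only a sketch.
  echo "mv $src WRFPRS_${domain}.${fhr}_${min}"
done
```

Note that in the log above, the `mv` fails because WRFPRS001.tm00 was never created, which suggests the wrfpost step itself failed before any renaming could happen.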
If I have 15-min WRF netCDF output, do I have to use a post-processing script like this one (found online)? Thanks, Chris ___________________________________________________________________ # Specify forecast start date # fhr is the first forecast hour to be post-processed # lastfhr is the last forecast hour to be post-processed # incrementhr is the increment (in hours) between forecast files export startdate=2010061200 export fhr=00 export lastfhr=48 export incrementhr=01 export incrementmin=15 export lastmin=45 # Path names for WRF_POSTPROC and WRFV3 export WRF_POSTPROC_HOME=${TOP_DIR}/WPPV3 export POSTEXEC=${WRF_POSTPROC_HOME}/exec export SCRIPTS=${WRF_POSTPROC_HOME}/scripts export WRFPATH=${TOP_DIR}/WRFV3 # cd to working directory cd ${DOMAINPATH}/postprd # Link Ferrier's microphysics table and WRF-POSTPROC control file ln -fs ${WRFPATH}/run/ETAMPNEW_DATA eta_micro_lookup.dat ln -fs ${DOMAINPATH}/parm/wrf_cntrl.parm . export tmmark=tm00 export MP_SHARED_MEMORY=yes export MP_LABELIO=yes ####################################################### # 1. Run WRF-POSTPROC # # The WRF-POSTPROC is used to read native WRF model # output and put out isobaric state fields and derived fields.
# ####################################################### pwd ls -x export NEWDATE=$startdate YYi=`echo $NEWDATE | cut -c1-4` MMi=`echo $NEWDATE | cut -c5-6` DDi=`echo $NEWDATE | cut -c7-8` HHi=`echo $NEWDATE | cut -c9-10` while [ $fhr -le $lastfhr ] ; do typeset -Z3 fhr NEWDATE=`${POSTEXEC}/ndate.exe +${fhr} $startdate` YY=`echo $NEWDATE | cut -c1-4` MM=`echo $NEWDATE | cut -c5-6` DD=`echo $NEWDATE | cut -c7-8` HH=`echo $NEWDATE | cut -c9-10` echo 'NEWDATE' $NEWDATE echo 'YY' $YY export min=00 while [ $min -le $lastmin ] ; do #for domain in d01 d02 d03 for domain in d02 do echo '*****-----*****-----*****----*****' cat > itag < wrfpost_${domain}.$fhr.out 2>&1 if [ $min = 00 ]; then mv WRFPRS$fhr.tm00 WRFPRS_${domain}.${fhr}_${min} else mv WRFPRS$fhr:$min.tm00 WRFPRS_${domain}.${fhr}_${min} fi From mkudsy at gmail.com Wed Jan 5 18:53:26 2011 From: mkudsy at gmail.com (M Kudsy) Date: Thu, 6 Jan 2011 08:53:26 +0700 Subject: [Wrf-users] Error in two-way nesting Message-ID: Hi, I get an error when trying to run wrf.exe using 2-way nesting. I run real.exe successfully for 2 domains and get wrfbdy_d01, wrfinput_d01 and wrfinput_d02 files. When I try to run the model, the following messages appear. Would someone give me some hints to correct the error? Many thanks, Mahally [kudsy at cumulus em_real]$ ./wrf.exe Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain.
--- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 2, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 5 WRF V3.2.1 MODEL ************************************* Parent domain ids,ide,jds,jde 1 120 1 139 ims,ime,jms,jme -4 125 -4 144 ips,ipe,jps,jpe 1 120 1 139 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1 , 314807212 bytes allocated med_initialdata_input: calling input_input INPUT LandUse = "USGS" LANDUSE TYPE = "USGS" FOUND 33 CATEGORIES 2 SEASONS WATER CATEGORY = 16 SNOW CATEGORY = 24 ************************************* Nesting domain ids,ide,jds,jde 1 100 1 100 ims,ime,jms,jme -4 105 -4 105 ips,ipe,jps,jpe 1 100 1 100 INTERMEDIATE domain ids,ide,jds,jde 146 184 28 66 ims,ime,jms,jme 141 189 23 71 ips,ipe,jps,jpe 144 186 26 68 ************************************* d01 2010-10-02_00:00:00 alloc_space_field: domain 2 , 197559732 bytes allocated d01 2010-10-02_00:00:00 alloc_space_field: domain 2 , 7433496 bytes allocated d01 2010-10-02_00:00:00 *** Initializing nest domain # 2 from an input file. 
*** -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 67 program wrf: error opening wrfinput_d02 for reading ierr= -1010 -- Mahally Kudsy Weather Modification Technology Center Agency for the Assessment and Application of Technology Jln MH Thamrin 8, Jakarta, Indonesia Telp:62-21-3168830 Fax:62-21-3906225 mkudsy at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110106/25b5464f/attachment.html From drostkier at yahoo.com Thu Jan 6 00:42:21 2011 From: drostkier at yahoo.com (Dorita Rostkier-Edelstein) Date: Wed, 5 Jan 2011 23:42:21 -0800 (PST) Subject: [Wrf-users] WRF-SCM drive with 3D-WRF Message-ID: <441606.15646.qm@web113101.mail.gq1.yahoo.com> Hi users, I would like to know if any of you have initialized and driven the WRF-SCM with WRF output. Thanks, Dorita -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110105/e29aba51/attachment.html From agnes.mika at bmtargoss.com Thu Jan 6 00:42:33 2011 From: agnes.mika at bmtargoss.com (Agnes Mika) Date: Thu, 6 Jan 2011 08:42:33 +0100 Subject: [Wrf-users] Error in two-way nesting In-Reply-To: References: Message-ID: <20110106074233.GA2241@aggedor.argoss.nl> Hallo, Please post your namelist.input file as well. There is an error reading your wrfinput_d02 file; it might have to do with a wrong setting in your namelist. Agnes M Kudsy wrote: > Hi, > > I get an error when trying to run wrf.exe using 2-way nesting. I run > real.exe successfully for 2 domains and get wrfbdy_d01, wrfinput_d01 and > wrfinput_d02 files. When I try to run the model, the following messages > appear. Would someone give me some hints to correct the error? > > Many thanks, > > Mahally > > > [kudsy at cumulus em_real]$ ./wrf.exe > Namelist dfi_control not found in namelist.input.
Using registry defaults > for variables in dfi_control > Namelist tc not found in namelist.input. Using registry defaults for > variables in tc > Namelist scm not found in namelist.input. Using registry defaults for > variables in scm > Namelist fire not found in namelist.input. Using registry defaults for > variables in fire > --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and > auxinput4_interval = 0 for all domains > --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and > auxinput4_interval = 0 for all domains > --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and > ending time to 0 for that domain. > --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, > setting sgfdda interval and ending time to 0 for that domain. > --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging > interval and ending time to 0 for that domain. > --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval and > ending time to 0 for that domain. > --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 2, > setting sgfdda interval and ending time to 0 for that domain. > --- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging > interval and ending time to 0 for that domain. 
> --- NOTE: num_soil_layers has been set to 5 > WRF V3.2.1 MODEL > ************************************* > Parent domain > ids,ide,jds,jde 1 120 1 139 > ims,ime,jms,jme -4 125 -4 144 > ips,ipe,jps,jpe 1 120 1 139 > ************************************* > DYNAMICS OPTION: Eulerian Mass Coordinate > alloc_space_field: domain 1 , 314807212 bytes allocated > med_initialdata_input: calling input_input > INPUT LandUse = "USGS" > LANDUSE TYPE = "USGS" FOUND 33 CATEGORIES 2 SEASONS > WATER CATEGORY = 16 SNOW CATEGORY = 24 > ************************************* > Nesting domain > ids,ide,jds,jde 1 100 1 100 > ims,ime,jms,jme -4 105 -4 105 > ips,ipe,jps,jpe 1 100 1 100 > INTERMEDIATE domain > ids,ide,jds,jde 146 184 28 66 > ims,ime,jms,jme 141 189 23 71 > ips,ipe,jps,jpe 144 186 26 68 > ************************************* > d01 2010-10-02_00:00:00 alloc_space_field: domain 2 , > 197559732 bytes allocated > d01 2010-10-02_00:00:00 alloc_space_field: domain 2 , > 7433496 bytes allocated > d01 2010-10-02_00:00:00 *** Initializing nest domain # 2 from an input > file. *** > -------------- FATAL CALLED --------------- > FATAL CALLED FROM FILE: LINE: 67 > program wrf: error opening wrfinput_d02 for reading ierr= > -1010 > > -- > > Mahally Kudsy > Weather Modification Technology Center > Agency for the Assessment and Application of Technology > Jln MH Thamrin 8, Jakarta, Indonesia > Telp:62-21-3168830 Fax:62-21-3906225 mkudsy at gmail.com > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users -- Dr. Ágnes Mika Advisor, Meteorology and Air Quality Tel: +31 (0)527-242299 Fax: +31 (0)527-242016 Web: www.bmtargoss.com BMT ARGOSS P.O. Box 61, 8325 ZH Vollenhove Voorsterweg 28, 8316 PT Marknesse The Netherlands Confidentiality Notice & Disclaimer The contents of this e-mail and any attachments are intended for the use of the mail addressee(s) shown.
If you are not that person, you are not allowed to take any action based upon it or to copy it, forward, distribute or disclose its contents and you should delete it from your system. BMT ARGOSS does not accept liability for any errors or omissions in the context of this e-mail or its attachments which arise as a result of internet transmission, nor accept liability for statements which are those of the author and clearly not made on behalf of BMT ARGOSS. From mkudsy at gmail.com Thu Jan 6 01:15:19 2011 From: mkudsy at gmail.com (M Kudsy) Date: Thu, 6 Jan 2011 15:15:19 +0700 Subject: [Wrf-users] Error in two-way nesting In-Reply-To: <20110106074233.GA2241@aggedor.argoss.nl> References: <20110106074233.GA2241@aggedor.argoss.nl> Message-ID: Dear all, Thanks for the answers. I just realized that I made different settings of I_PARENT_START and J_PARENT_START in the WPS and WRF namelists for the second domain. Mahally On Thu, Jan 6, 2011 at 2:42 PM, Agnes Mika wrote: > Hallo, > > Please post your namelist.input file as well. There is an error > reading your wrfinput_d02 file; it might have to do with a wrong > setting in your namelist. > > Agnes > > M Kudsy wrote: > > Hi, > > > > I get an error when trying to run wrf.exe using 2-way nesting. I run > > real.exe successfully for 2 domains and get wrfbdy_d01, wrfinput_d01 and > > wrfinput_d02 files. When I try to run the model, the following messages > > appear. Would someone give me some hints to correct the error? > > > > Many thanks, > > > > Mahally > > > > > > [kudsy at cumulus em_real]$ ./wrf.exe > > Namelist dfi_control not found in namelist.input. Using registry > defaults > > for variables in dfi_control > > Namelist tc not found in namelist.input. Using registry defaults for > > variables in tc > > Namelist scm not found in namelist.input. Using registry defaults for > > variables in scm > > Namelist fire not found in namelist.input.
Using registry defaults for > > variables in fire > > --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and > > auxinput4_interval = 0 for all domains > > --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and > > auxinput4_interval = 0 for all domains > > --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and > > ending time to 0 for that domain. > > --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, > > setting sgfdda interval and ending time to 0 for that domain. > > --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging > > interval and ending time to 0 for that domain. > > --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval and > > ending time to 0 for that domain. > > --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 2, > > setting sgfdda interval and ending time to 0 for that domain. > > --- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging > > interval and ending time to 0 for that domain. > > --- NOTE: num_soil_layers has been set to 5 > > WRF V3.2.1 MODEL > > ************************************* > > Parent domain > > ids,ide,jds,jde 1 120 1 139 > > ims,ime,jms,jme -4 125 -4 144 > > ips,ipe,jps,jpe 1 120 1 139 > > ************************************* > > DYNAMICS OPTION: Eulerian Mass Coordinate > > alloc_space_field: domain 1 , 314807212 bytes > allocated > > med_initialdata_input: calling input_input > > INPUT LandUse = "USGS" > > LANDUSE TYPE = "USGS" FOUND 33 CATEGORIES 2 SEASONS > > WATER CATEGORY = 16 SNOW CATEGORY = 24 > > ************************************* > > Nesting domain > > ids,ide,jds,jde 1 100 1 100 > > ims,ime,jms,jme -4 105 -4 105 > > ips,ipe,jps,jpe 1 100 1 100 > > INTERMEDIATE domain > > ids,ide,jds,jde 146 184 28 66 > > ims,ime,jms,jme 141 189 23 71 > > ips,ipe,jps,jpe 144 186 26 68 > > ************************************* > > d01 2010-10-02_00:00:00 alloc_space_field: domain 2 , > > 197559732 bytes allocated > > d01 2010-10-02_00:00:00 
alloc_space_field: domain 2 , > > 7433496 bytes allocated > > d01 2010-10-02_00:00:00 *** Initializing nest domain # 2 from an input > > file. *** > > -------------- FATAL CALLED --------------- > > FATAL CALLED FROM FILE: LINE: 67 > > program wrf: error opening wrfinput_d02 for reading ierr= > > -1010 > > > > -- > > > > Mahally Kudsy > > Weather Modification Technology Center > > Agency for the Assessment and Application of Technology > > Jln MH Thamrin 8, Jakarta, Indonesia > > Telp:62-21-3168830 Fax:62-21-3906225 mkudsy at gmail.com > > > _______________________________________________ > > Wrf-users mailing list > > Wrf-users at ucar.edu > > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- > Dr. Ágnes Mika > Advisor, Meteorology and Air Quality > > Tel: +31 (0)527-242299 > Fax: +31 (0)527-242016 > Web: www.bmtargoss.com > > BMT ARGOSS > P.O. Box 61, 8325 ZH Vollenhove > Voorsterweg 28, 8316 PT Marknesse > The Netherlands > > Confidentiality Notice & Disclaimer > > The contents of this e-mail and any attachments are intended for the > use of the mail addressee(s) shown. If you are not that person, you > are not allowed to take any action based upon it or to copy it, > forward, distribute or disclose its contents and you should delete it > from your system. BMT ARGOSS does not accept liability for any errors > or omissions in the context of this e-mail or its attachments which > arise as a result of internet transmission, nor accept liability for > statements which are those of the author and clearly not made on > behalf of BMT ARGOSS. > > > -- Dr. Mahally Kudsy Weather Modification Technology Center Agency for the Assessment and Application of Technology Jln MH Thamrin 8, Jakarta, Indonesia Telp:62-21-3168830 Fax:62-21-3906225 mkudsy at gmail.com -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110106/5768f660/attachment.html From Matthew.Foster at noaa.gov Fri Jan 7 06:00:01 2011 From: Matthew.Foster at noaa.gov (Matt Foster) Date: Fri, 07 Jan 2011 07:00:01 -0600 Subject: [Wrf-users] Radiative transfer performance in DM+SM run Message-ID: <4D270E51.6030005@noaa.gov> During some recent testing of a newly-configured domain (474x474, 3km) I noticed a rather significant difference in the performance of the radiative transfer (RRTM/Dudhia) when running a DM+SM build. The radiative transfer timesteps are sometimes as much as 60% longer in DM+SM vs DM. I've found this to be true with either OpenMPI or MVAPICH2, although it seems to be more pronounced with MVAPICH2. Does anyone know why I'm seeing this behavior? Matt -- Do not go where the path may lead; go instead where there is no path and leave a trail. -- Ralph Waldo Emerson -------------- next part -------------- A non-text attachment was scrubbed... Name: matthew_foster.vcf Type: text/x-vcard Size: 229 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110107/516791bf/attachment.vcf From jagan at tnau.ac.in Mon Jan 10 07:07:35 2011 From: jagan at tnau.ac.in (jagan TNAU) Date: Mon, 10 Jan 2011 19:37:35 +0530 Subject: [Wrf-users] ARWpost processing problems Message-ID: Dear Users, I have hyperslabbed a WRF-ARW output file for the variables RAINC and RAINNC using NCO and am trying to post-process it with ARWpost. However, I received the following error while running ARWpost.exe. ----------------------------------------- FOUND the following input files: ./rain.nc START PROCESSING DATA ERROR: Error in ext_pkg_open_for_read --------------------------------------------- I have compiled ARWpost after modifying gridinfo_module.F90 to read the 'basic' variables RAINC and RAINNC. When the original file with all the variables was post-processed, it was successful. I would appreciate users' help in overcoming this issue.
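One possibility worth checking (an assumption, not a confirmed diagnosis): ARWpost reads WRF metadata beyond the requested fields, so a subset file may need the time and coordinate variables alongside the rain fields. A dry-run sketch of such an NCO extraction follows; the variable list is a guess at what ARWpost needs, not its documented requirements:

```shell
# Dry-run sketch: print the ncks command that would keep WRF metadata
# variables (Times, coordinates) alongside the rain fields, so the
# subset file still resembles WRF output to downstream tools.
meta_vars="Times,XLAT,XLONG"     # assumed metadata ARWpost may require
rain_vars="RAINC,RAINNC"
echo ncks -v "${meta_vars},${rain_vars}" \
    wrfout_d01_2010-06-01_00:00:00 rain.nc
```

Drop the `echo` to actually run the extraction on a real wrfout file.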
-- With regards Dr.R.Jagannathan Professor of Agronomy, Department of Agronomy Tamil Nadu Agricultural University, Coimbatore - 641 003 India PHONE: Mob: +91 94438 89891 DO NOT PRINT THIS E-MAIL UNLESS NECESSARY. THE ENVIRONMENT CONCERNS US ALL. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110110/9ea6926b/attachment.html From jcadam at wsu.edu Mon Jan 10 09:17:51 2011 From: jcadam at wsu.edu (Adam, Jennifer C) Date: Mon, 10 Jan 2011 08:17:51 -0800 Subject: [Wrf-users] Earth Systems Modeling Postdoctoral Opportunity at Washington State University Message-ID: <1A8BF21ABC97FE4DB7F2E89E18AFED350151481E@EXCHANGEVS-04.ad.wsu.edu> Position Announcement Post-doctoral Research Position Center for Environmental Research, Education, and Outreach Washington State University Pullman, WA 99164-1030 The Center for Environmental Research, Education and Outreach (CEREO) at Washington State University (WSU) invites applications for a post-doctoral research appointment to participate in a large multi-institutional interdisciplinary effort to develop a biosphere-relevant earth systems model (Bio-EaSM). This Bio-EaSM will be developed to enable investigation of land-atmosphere interactions among carbon, nitrogen, and water under decadal-scale climate variability. Components of the model include the Weather Research and Forecast (WRF) model, the Community Multi-scale Air Quality (CMAQ) system, and the Variable Infiltration Capacity (VIC) land surface model. The successful candidate should have experience with the application and development of modeling applications based upon one or more of these modeling systems in a Linux environment. The selected candidate will also have demonstrated strong written and oral communication skills. An earned Ph.D. in a relevant science or engineering field is required before the date of hire. 
CEREO is a faculty-led initiative whose premise is to make WSU's outstanding environmental programs more than just the sum of their parts through building synergism and creative collaboration among faculty involved in environmentally-oriented activities (http://www.cereo.wsu.edu/index.html). The candidate will also work directly with researchers at the Laboratory for Atmospheric Research (LAR) and in the Water Resource group in Civil and Environmental Engineering. LAR is an air quality research group with a strong reputation for instrument development and field observations of atmospheric chemistry and for numerical modeling of regional atmospheric chemistry and air quality (http://www.lar.wsu.edu ). This effort is also closely linked to an NSF IGERT on nitrogen cycling (http://igert.nspire.wsu.edu/). Screening of applications will begin on February 1, 2011 and will continue until the position is filled. Candidates should submit (via e-mail) a letter of application which addresses all of the above requirements and describes their research interests, a curriculum vitae, and the names and addresses of five references to the following individuals: Drs. Brian Lamb and Jennifer Adam blamb at wsu.edu and jcadam at wsu.edu ATTN: Post-doctoral Search Department of Civil and Environmental Engineering Washington State University PO Box 642910 Pullman, WA 99164-2910 WSU is an EEO employer. Protected group members are encouraged to apply. Jennifer C. Adam, Assistant Professor Civil and Environmental Engineering, Washington State University Pullman, WA 99164-2910 Phone: 509-335-7751 Fax: 509-335-7632 jcadam at wsu.edu http://www.ce.wsu.edu/Faculty_Staff/Profiles/adam.htm http://hydro.cee.wsu.edu/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110110/0d87b2d5/attachment.html From kganbour at yahoo.com Sun Jan 9 04:06:28 2011 From: kganbour at yahoo.com (Khaled Ganbour) Date: Sun, 9 Jan 2011 03:06:28 -0800 (PST) Subject: [Wrf-users] failure to get *.exe files Message-ID: <995774.58345.qm@web46309.mail.sp1.yahoo.com> Dear All: I have tried to compile WRF on a 32-bit machine running openSUSE Linux with the GNU compiler (Linux linux-vfrh 2.6.31.5-0.1-default #1 SMP 2009-10-26 15:49:03 +0100 i686 i686 i386 GNU/Linux), and also with PGI Workstation 10.0, with netCDF 4.1.1. But I haven't gotten any .exe files after the compilation so far. I tried with WRFV3.2.1 and netCDF 4.1.1 as well. I am going to attend the next workshop in Colorado on 31 January and I would like to have the WRF model running before then. I would like to know the meaning of the following message after the ./compile command: khaled1 at linux-vfrh:~/WRFV3.2> ./compile compile [-d] [-j n] wrf compile wrf in run dir (NOTE: no real.exe, ndown.exe, or ideal.exe generated) I compiled with em_real and nmm_real; the log files are attached, and I also attached the compilation output displayed on the screen and the file listing of the main directory after compilation. I hope you can help me. Best regards Khaled -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110109/fe195d43/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: compile_em_real.log Type: text/x-log Size: 385559 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110109/fe195d43/attachment-0002.bin -------------- next part -------------- A non-text attachment was scrubbed...
Name: compile_nmm_real.log Type: text/x-log Size: 368946 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110109/fe195d43/attachment-0003.bin -------------- next part -------------- A non-text attachment was scrubbed... Name: em_real_screen Type: application/octet-stream Size: 97879 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110109/fe195d43/attachment-0002.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: nmm_real_screen Type: application/octet-stream Size: 57562 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110109/fe195d43/attachment-0003.obj From moudipascal at yahoo.fr Fri Jan 7 11:06:51 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Fri, 7 Jan 2011 18:06:51 +0000 (GMT) Subject: [Wrf-users] Gen_be Message-ID: <790359.47596.qm@web25106.mail.ukl.yahoo.com> Hi to all, and Happy New Year. I would like to generate a be.dat file for my own region. I have the forecasts for the month of June, initialized at 0000 UTC. I generate the forecasts every 3 hours. Could someone help me generate the be.dat file, please? I am getting the following error when I try to test my data for 1 day with an interval of 12 hours: Run Stage 0: Calculate ensemble perturbations from model forecasts.
--------------------------------------------------------------- Beginning CPU time: Fri Jan 7 19:06:06 WAT 2011 gen_be_stage0_wrf: Calculating standard perturbation fields valid at time 2010060100 mv: cannot stat `pert.2010060100*': No such file or directory 2010053100 /home/wrfvar/WRF/WRFV3/run/BE//2010053100/wrfout_d01_2010-06-01_00:00:00 /home/wrfvar/WRF/WRFV3/run/BE//2010053112/wrfout_d01_2010-06-01_00:00:00 gen_be_stage0_wrf: Calculating standard perturbation fields valid at time 2010060112 mv: cannot stat `pert.2010060112*': No such file or directory 2010053112 /home/wrfvar/WRF/WRFV3/run/BE//2010053112/wrfout_d01_2010-06-01_12:00:00 /home/wrfvar/WRF/WRFV3/run/BE//2010060100/wrfout_d01_2010-06-01_12:00:00 gen_be_stage0_wrf: Calculating standard perturbation fields valid at time 2010060200 mv: cannot stat `pert.2010060200*': No such file or directory 2010060100 /home/wrfvar/WRF/WRFV3/run/BE//2010060100/wrfout_d01_2010-06-02_00:00:00 /home/wrfvar/WRF/WRFV3/run/BE//2010060112/wrfout_d01_2010-06-02_00:00:00 Ending CPU time: Fri Jan 7 19:06:07 WAT 2011 --------------------------------------------------------------- Run Stage 1: Read standard fields, and remove time/ensemble/area mean. --------------------------------------------------------------- Beginning CPU time: Fri Jan 7 19:06:07 WAT 2011 Stage 1 failed with error 24 Pascal MOUDI IGRI Ph-D Student at the Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110107/a25467a7/attachment.html From wxprofessor at gmail.com Sat Jan 8 20:31:40 2011 From: wxprofessor at gmail.com (patrick) Date: Sat, 08 Jan 2011 22:31:40 -0500 Subject: [Wrf-users] wrf3.2 Message-ID: <4D292C1C.3020008@gmail.com> real.exe failure on a Debian Lenny system uname -a Linux 2.6.26-2-686 #1 SMP Thu Sep 16 19:35:51 UTC 2010 i686 GNU/Linux Any advice would be appreciated; it appears that " ./real.exe: free(): invalid next size (fast): 0x109b8980 ***" is the issue. cheers, --patrick output below: 2850-01:WRFV3/run# ./real.exe Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 4 REAL_EM V3.2.1 PREPROCESSOR ************************************* Parent domain ids,ide,jds,jde 1 74 1 61 ims,ime,jms,jme -4 79 -4 66 ips,ipe,jps,jpe 1 74 1 61 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1 , 110361480 bytes allocated Time period # 1 to process = 2011-01-07_22:00:00. Time period # 2 to process = 2011-01-07_22:06:00. Time period # 3 to process = 2011-01-07_22:12:00. Time period # 4 to process = 2011-01-07_22:18:00. Time period # 5 to process = 2011-01-07_22:24:00.
Time period # 6 to process = 2011-01-07_22:30:00. Time period # 7 to process = 2011-01-07_22:36:00. Time period # 8 to process = 2011-01-07_22:42:00. Time period # 9 to process = 2011-01-07_22:48:00. Time period # 10 to process = 2011-01-07_22:54:00. Time period # 11 to process = 2011-01-07_23:00:00. Total analysis times to input = 11. ----------------------------------------------------------------------------- Domain 1: Current date being processed: 2011-01-07_22:00:00.0000, which is loop # 1 out of 11 configflags%julyr, %julday, %gmt: 2011 7 22.000000 metgrid input_wrf.F first_date_input = 2011-01-07_22:00:00 metgrid input_wrf.F first_date_nml = 2011-01-07_22:00:00 d01 2011-01-07_22:00:00 Timing for input 0 s. d01 2011-01-07_22:00:00 flag_soil_levels read from met_em file is 1 Max map factor in domain 1 = 0.98. Scale the dt in the model accordingly. Converged znw(kte) should be about 0.0 = 3.89766930E-09 d01 2011-01-07_22:00:00 Old data, no inland lake information Assume RUC LSM 6-level input *** glibc detected *** ./real.exe: free(): invalid next size (fast): 0x109b8980 *** ======= Backtrace: ========= /lib/i686/cmov/libc.so.6[0xb75c6764] /lib/i686/cmov/libc.so.6(cfree+0x96)[0xb75c8966] ./real.exe[0x84c8c9c] ./real.exe[0x84d5314] ./real.exe[0x8086978] ./real.exe[0x808f86f] ./real.exe[0x8053e9a] ./real.exe[0x80550ac] ./real.exe[0x8dbc159] /lib/i686/cmov/libc.so.6(__libc_start_main+0xe5)[0xb756e455] ./real.exe[0x804aa11] ======= Memory map: ======== 08048000-08eaa000 r-xp 00000000 08:01 4169734 /home/ldm/WRFV3/main/real.exe 08eaa000-09a34000 rw-p 00e62000 08:01 4169734 /home/ldm/WRFV3/main/real.exe 09a34000-0dfb0000 rw-p 09a34000 00:00 0 0fd59000-10ae0000 rw-p 0fd59000 00:00 0 [heap] b0400000-b0421000 rw-p b0400000 00:00 0 b0421000-b0500000 ---p b0421000 00:00 0 b05d3000-b7558000 rw-p b05d3000 00:00 0 b7558000-b76ad000 r-xp 00000000 08:01 5529966 /lib/i686/cmov/libc-2.7.so b76ad000-b76ae000 r--p 00155000 08:01 5529966 /lib/i686/cmov/libc-2.7.so b76ae000-b76b0000 
rw-p 00156000 08:01 5529966 /lib/i686/cmov/libc-2.7.so b76b0000-b76b3000 rw-p b76b0000 00:00 0 b76b3000-b76bf000 r-xp 00000000 08:01 5529603 /lib/libgcc_s.so.1 b76bf000-b76c0000 rw-p 0000b000 08:01 5529603 /lib/libgcc_s.so.1 b76c0000-b76c1000 rw-p b76c0000 00:00 0 b76c1000-b76e5000 r-xp 00000000 08:01 5529619 /lib/i686/cmov/libm-2.7.so b76e5000-b76e7000 rw-p 00023000 08:01 5529619 /lib/i686/cmov/libm-2.7.so b76e7000-b7797000 r-xp 00000000 08:01 548076 /usr/lib/libgfortran.so.3.0.0 b7797000-b7798000 rw-p 000af000 08:01 548076 /usr/lib/libgfortran.so.3.0.0 b7798000-b7799000 rw-p b7798000 00:00 0 b77a2000-b77a4000 rw-p b77a2000 00:00 0 b77a4000-b77a5000 r-xp b77a4000 00:00 0 [vdso] b77a5000-b77bf000 r-xp 00000000 08:01 5529939 /lib/ld-2.7.so b77bf000-b77c1000 rw-p 0001a000 08:01 5529939 /lib/ld-2.7.so bfe20000-bfe3d000 rw-p bffe2000 00:00 0 [stack] Aborted From wxprofessor at gmail.com Mon Jan 10 11:51:36 2011 From: wxprofessor at gmail.com (patrick) Date: Mon, 10 Jan 2011 13:51:36 -0500 Subject: [Wrf-users] real.exe In-Reply-To: <790359.47596.qm@web25106.mail.ukl.yahoo.com> References: <790359.47596.qm@web25106.mail.ukl.yahoo.com> Message-ID: <4D2B5538.1080504@gmail.com> real.exe chokes with the following error: " ./real.exe: free(): invalid next size (fast): 0x109b8980 *** wrfv3.2 gfortran serial debian lenny uname -a Linux 2.6.26-2-686 #1 SMP Thu Sep 16 19:35:51 UTC 2010 i686 GNU/Linux help please :) cheers, --patrick From yesubabu2006 at gmail.com Mon Jan 10 21:56:06 2011 From: yesubabu2006 at gmail.com (V.YesuBabu) Date: Tue, 11 Jan 2011 10:26:06 +0530 Subject: [Wrf-users] "Re: Contents of Wrf-users digest...Vol 77, Issue 5" Message-ID: Dear Moudi Pascal, wrf forecasts for one month experiments should be in 12hrs output interval i.e., If you are generating be.dat file for Nov 2010 month then it will be output for gen_be directory as follows 2010110100/wrfout_d01_2010-11-01_00:00:00 2010110100/wrfout_d01_2010-11-01_12:00:00 
2010110100/wrfout_d01_2010-11-02_00:00:00
2010110112/wrfout_d01_2010-11-01_12:00:00
2010110112/wrfout_d01_2010-11-02_00:00:00
2010110112/wrfout_d01_2010-11-02_12:00:00
2010110200/wrfout_d01_2010-11-02_00:00:00
2010110200/wrfout_d01_2010-11-02_12:00:00
2010110200/wrfout_d01_2010-11-03_00:00:00
2010110212/wrfout_d01_2010-11-02_12:00:00
2010110212/wrfout_d01_2010-11-03_00:00:00
2010110212/wrfout_d01_2010-11-03_12:00:00
---------------------------------------------------------
---------------------------------------------------------
2010112900/wrfout_d01_2010-11-29_00:00:00
2010112900/wrfout_d01_2010-11-29_12:00:00
2010112900/wrfout_d01_2010-11-30_00:00:00

In WRFV32, change the following options in your namelist.input:

&time_control
 history_interval   = 720, 60, 60,
 frames_per_outfile = 1,   1,  1000,

V Yesubabu,
Project Engineer, CAS/SECG,
C-DAC, Main Building, Pune University, Pune, India.
Phone: 020-25704226

On 11 January 2011 00:30, wrote:

> Send Wrf-users mailing list submissions to
> wrf-users at ucar.edu
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
> or, via email, send a message with subject or body 'help' to
> wrf-users-request at ucar.edu
>
> You can reach the person managing the list at
> wrf-users-owner at ucar.edu
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Wrf-users digest..."
>
>
> Today's Topics:
>
> 1. Gen_be (moudi pascal)
> 2. wrf3.2 (patrick)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Fri, 7 Jan 2011 18:06:51 +0000 (GMT)
> From: moudi pascal
> Subject: [Wrf-users] Gen_be
> To: WRF User's , WRF DA ,
> wrf_users at ucar.edu
> Message-ID: <790359.47596.qm at web25106.mail.ukl.yahoo.com>
> Content-Type: text/plain; charset="utf-8"
>
> Hi to all and Happy New Year.
> I would like to generate be.dat file for my own region.
> I get the forecasts for the month of June, initialized at 0000 UTC. I
> generate the forecasts every 3 hours.
> May someone help me to generate the be.dat file please?
> I am having the following error when I tried to test my data for 1 day
> with an interval of 12 hours:
>
> Run Stage 0: Calculate ensemble perturbations from model forecasts.
> gen_be_stage0_wrf: Calculating standard perturbation fields valid at time
> 2010060100
> mv: cannot stat `pert.2010060100*': No such file or directory
> [...]
> Run Stage 1: Read standard fields, and remove time/ensemble/area mean.
> Beginning CPU time: Fri Jan 7 19:06:07 WAT 2011
> Stage 1 failed with error 24
>
> Pascal MOUDI IGRI
>
> ------------------------------
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>
> End of Wrf-users Digest, Vol 77, Issue 5
> ****************************************
>
-- 
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110111/9ccdab4b/attachment.html
From moudipascal at yahoo.fr Wed Jan 12 06:10:20 2011
From: moudipascal at yahoo.fr (moudi pascal)
Date: Wed, 12 Jan 2011 13:10:20 +0000 (GMT)
Subject: [Wrf-users] How to modify e_vert.
Message-ID: <984653.41310.qm@web25102.mail.ukl.yahoo.com>

Hi to all,

I want to modify e_vert when running WRF, but when I change it, it always
stays at 27. How do I change it? Is it in namelist.wps? When I put it
there, the run is unsuccessful.
I have noticed that when I run ncdump -c met_em* it reports
:BOTTOM-TOP_GRID_DIMENSION = 27. How can I change it from 27 to 41, for
example?

Thank you

Pascal MOUDI IGRI

Ph-D Student at the
Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP)
Faculty of Science
University of Yaounde I, Cameroon
National Advanced Training School for Technical Education,
Electricity Engineering, Douala

Tel:+237 75 32 58 52
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110112/8e0d6699/attachment.html
From M.Collins at exeter.ac.uk Fri Jan 14 09:12:51 2011
From: M.Collins at exeter.ac.uk (Collins, Matthew)
Date: Fri, 14 Jan 2011 16:12:51 +0000
Subject: [Wrf-users] Papers for IPCC AR5 Ch12 - Long term Climate Change Projections, Commitments and Irreversibility
Message-ID: 

Dear Colleagues,

We would like you to send us any papers you have submitted, have in press or have recently published that are relevant to the forthcoming IPCC WG1 AR5 Chapter 12, "Long term Climate Change: Projections, Commitments and Irreversibility". An outline of the chapter is included below.

Please send all papers to reto.knutti at env.ethz.ch and he will arrange to hold them in a central place and pass them on to the relevant Lead Author. Also, please do keep us up to date on the progress of papers and send revised versions via Reto. Papers need to be submitted by 31 July 2012 and accepted by 15 March 2013 in order to be cited in the report. Note, however, that there is no guarantee that your paper will be cited even if you send it to us.

Please feel free to pass this message on to any of your colleagues we may have missed. Apologies if you get more than one copy.

Best wishes,
IPCC WG1 AR5 Chapter 12 authors

Chapter 12 Outline
1. General introduction
1.2. Uncertainties in the chain from emissions to projections
2. Projected changes in emissions, concentrations and radiative forcing
2.1. Description of scenarios
2.2. Implementation of scenarios and forcings in CMIP5
3. Projected changes over the 21st century
3.1. Time-evolving global quantities
3.2. Changes in temperature and energy budget
3.3. Changes in the Water Cycle
3.4. Changes in Atmospheric Circulation
3.5. Pattern scaling
3.6. Changes in high-latitude climate and cryosphere
3.7. Changes in the ocean
3.8. Consistency and main differences CMIP3/CMIP5 and SRES/RCP
3.9. Changes associated with biogeochemical feedbacks
4. Global measures of climate sensitivity and transient response
4.1. Estimates based on climatology, ranges of CMIP5 and comparison to earlier CMIPs
4.2. Forcing and response, timescales of feedbacks
5. Long Term Climate Change, Commitment and Irreversibility
5.1. RCP extensions
5.2. Commitment
5.3. Climate stabilization
5.4. Abrupt change and irreversibility

From Chris.Franks at noaa.gov Fri Jan 14 20:46:25 2011
From: Chris.Franks at noaa.gov (Chris Franks)
Date: Fri, 14 Jan 2011 21:46:25 -0600
Subject: [Wrf-users] WPP errors with sub hourly data
Message-ID: <4D311891.90904@noaa.gov>

Hi all,

I tried tweaking the minutes script but haven't been able to get it to run. I didn't actually compile WPP, as I'm working on someone else's machine, but it seems there weren't any compilation errors. So it looks like something is wrong with my script (the itag file looks off). I've attached my script: do you see my problem here? I think my paths are right, and it doesn't look like one needs to change much. I'm testing it on a small sample set of WRF output:

wrfout_d01_2010-06-17_18:00:00
wrfout_d01_2010-06-17_18:15:00
wrfout_d01_2010-06-17_18:30:00
wrfout_d01_2010-06-17_18:45:00
wrfout_d01_2010-06-17_19:00:00
wrfout_d01_2010-06-17_19:15:00
wrfout_d01_2010-06-17_19:30:00
wrfout_d01_2010-06-17_19:45:00

Thanks,
Chris
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: Chris.txt
Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110114/f8967563/attachment.txt
From eric.kemp at nasa.gov Fri Jan 14 11:25:08 2011
From: eric.kemp at nasa.gov (Kemp, Eric M. (GSFC-610.0)[NORTHROP GRUMMAN INFORMATION TECH])
Date: Fri, 14 Jan 2011 12:25:08 -0600
Subject: [Wrf-users] FW: DRAFT WRF 3.2.1 Bug Report
In-Reply-To: 
Message-ID: 

Dear wrfhelp:

I am experiencing problems running WRF3.2.1 on a case that works with WRF3.1.1.
Two grids are used:

 max_dom            = 2,
 s_we               = 1,    1,    1,
 e_we               = 1200, 393,  400,
 s_sn               = 1,    1,    1,
 e_sn               = 800,  385,  400,
 s_vert             = 1,    1,    1,
 e_vert             = 35,   35,   35,
 num_metgrid_levels = 40
 dx                 = 4000, 2000, 1000,
 dy                 = 4000, 2000, 1000,

The initial and lateral boundary conditions are provided by the NAM (00Z 10 April 2009 - 12Z 11 April 2009). WRF 3.1.1 runs this case without incident. However, WRF 3.2.1 will either fail with this message:

FATAL CALLED FROM FILE: LINE: 15172
-------------- FATAL CALLED ---------------
frame/module_domain.f: Failed to allocate grid%smois(sm31:em31,1:model_config_rec%num_soil_layers,sm33:em33).

or else 3.2.1 will *usually* run with random and immediate CFL errors if "too many" MPI processes are used (above 40). In a handful of cases, however, the same executable will run without issue. I've noticed that the 3.2.1 real.exe executable also experiences the memory allocation error. Also, these problems occur regardless of whether WPS 3.1.1 or 3.2.1 is used to process the NAM data.

I'm running on a Linux distributed cluster, using 40 nodes with 2 dual-core Intel Woodcrest chips and 4 GB RAM per node. WRF is compiled with ifort 11.1.038 and Intel MPI 3.2.011. I've experimented with other MPI implementations (Intel MPI 4.0.0.025 and MVAPICH2-1.4.1) and I see the same behavior. I have sample namelist.input, wrfinput and wrfbdy files available for upload.

Thanks,
-Eric

--------------------------------------------------------------------
Eric M. Kemp
Northrop Grumman Corporation
Meteorologist
Information Systems
Civil Enterprise Solutions
Civil Systems Division
Goddard Space Flight Center
Mailstop 610.3
Greenbelt, MD 20771
Telephone 301-286-9768
Fax 301-286-1775
E-mail: eric.kemp at nasa.gov
E-mail: eric.kemp at ngc.com
--------------------------------------------------------------------
------ End of Forwarded Message
-------------- next part --------------
An HTML attachment was scrubbed...
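As a rough editorial aside (not part of the original report): a back-of-envelope sizing of the 1200x800x35 parent domain from the namelist above shows how close such a run can sit to a 4 GB/node limit. The 4-byte reals and the count of "~300 3-D-equivalent arrays" are assumptions for illustration; WRF's real total depends on the physics options chosen.

```python
# Hypothetical sizing sketch: memory for a few WRF arrays on the
# 1200 x 800 x 35 parent domain, assuming 4-byte reals and no halos.
nx, ny, nz = 1200, 800, 35
num_soil_layers = 4
bytes_per_real = 4

smois = nx * ny * num_soil_layers * bytes_per_real   # the grid%smois array that failed
field3d = nx * ny * nz * bytes_per_real              # one full 3-D field

print(f"smois        : {smois / 2**20:6.1f} MiB")
print(f"one 3-D field: {field3d / 2**20:6.1f} MiB")

# WRF allocates hundreds of fields; with an assumed ~300 3-D-equivalent
# arrays, the undecomposed domain needs on the order of:
total = 300 * field3d
print(f"~300 fields  : {total / 2**30:6.1f} GiB")
```

Divided over 160 MPI ranks that is well under 1 GiB per rank, so if a single small allocation like smois fails, the node was likely already near its limit when the call was made.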
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110114/5f201553/attachment.html
From miguel at gw-frias.homeip.net Wed Jan 19 05:45:55 2011
From: miguel at gw-frias.homeip.net (Miguel Frias)
Date: Wed, 19 Jan 2011 12:45:55 +0000
Subject: [Wrf-users] Question about GFS data retrieval
Message-ID: <4D36DD03.6050508@gw-frias.homeip.net>

Hello,

I don't know if this is the correct place to ask this, but here it goes. I'm trying to use GFS to get upper-air winds at a specific altitude. I found out about the partial HTTP transfer using curl and then wgrib to get the data. Now my question is: I need to supply as variables the latitude, longitude and altitude, and specify that I want wind direction and speed. As an example, I would need the winds at 38N020W at 33000 ft (11000 m, or a specific pressure level), which would then give me, for instance, 274/38 (274 degrees wind direction and 38 knots [or whatever units are used]). I know that GFS has all this information; it's how to ask for it and how to properly decode it that I'm in doubt about. Finally, the information I want must be raw data; I don't want any images/graphs or the like.

What I'm using:
http://nomads.ncep.noaa.gov/txt_descriptions/fast_downloading_grib.shtml
However, the above doesn't let me specify latitude, longitude or altitude.

Thanks for any help,
Miguel Frias
From preeti at csa.iisc.ernet.in Tue Jan 18 23:23:58 2011
From: preeti at csa.iisc.ernet.in (Preeti)
Date: Wed, 19 Jan 2011 11:53:58 +0530
Subject: [Wrf-users] Compressing NetCDF
Message-ID: 

Hello

Does anyone know of any better compression software than zlib/gzip for data formats like NetCDF? I find that if the WRF output NetCDF file has only a few variables, say pressure P, temperature T etc., then gzip/zlib performs very poorly: the compression is from 892K (original) to only 742K (gzip-ed).
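An editorial aside on the NetCDF compression question above: the low-order mantissa bytes of packed floating-point fields are close to random, so a byte-oriented compressor like gzip finds little redundancy, which matches the 892K-to-742K result. Discarding those bytes (a lossy truncation, in the same spirit as netCDF-4's internal deflate plus least-significant-digit options) restores compressibility. A stdlib-only sketch with synthetic data:

```python
import random
import struct
import zlib

random.seed(42)

# Synthetic single-precision "temperature" field: smooth trend plus noise,
# loosely resembling a WRF output variable.
vals = [280.0 + 0.001 * i + random.random() for i in range(50_000)]
raw = struct.pack("<%df" % len(vals), *vals)

plain = zlib.compress(raw, 9)          # gzip-style compression, full precision

# Lossy truncation: zero the two low-order (little-endian) mantissa bytes
# of every 4-byte float, keeping roughly 2-3 significant decimal digits.
trunc = bytearray(raw)
for i in range(0, len(trunc), 4):
    trunc[i] = trunc[i + 1] = 0
lossy = zlib.compress(bytes(trunc), 9)

print(len(raw), len(plain), len(lossy))
```

The truncated stream compresses far better than the full-precision one, at the cost of precision you must decide you can afford to lose.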
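And on Miguel's GFS winds question above: once a u/v component pair has been extracted from the GRIB file (wgrib2's `-lon lon lat` option, for example, prints the value at the nearest grid point), converting it to the aviation-style direction/speed pair is plain arithmetic. A sketch; the sample component values below are made up, not taken from any actual GFS file:

```python
import math

def wind_dir_speed(u, v):
    """Return (direction the wind blows FROM, in degrees; speed in knots)
    given eastward u and northward v components in m/s."""
    speed_ms = math.hypot(u, v)
    direction = (270.0 - math.degrees(math.atan2(v, u))) % 360.0
    return direction, speed_ms * 1.943844  # m/s -> knots

# Hypothetical 250 hPa components near 38N 020W:
d, s = wind_dir_speed(-19.5, 1.4)
print(f"{d:03.0f}/{s:.0f}")  # prints 094/38
```

The `% 360.0` keeps a due-north wind at 0 rather than 360; swap that convention if your consumer expects 360 for north.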
Thanks Preeti From nriemer at illinois.edu Thu Jan 20 15:09:07 2011 From: nriemer at illinois.edu (Nicole Riemer) Date: Thu, 20 Jan 2011 16:09:07 -0600 Subject: [Wrf-users] Postdoc position in Atmospheric Chemistry Modeling at UIUC Message-ID: <20231C69-ECE3-4ED6-ADF6-21E1FC8D26F9@illinois.edu> Department of Atmospheric Sciences at the University of Illinois at Urbana-Champaign invites applications for Postdoctoral Researcher Opportunity One Postdoctoral Researcher position is open at the Department of Atmospheric Sciences at the University of Illinois at Urbana-Champaign, starting in May 2011. The successful candidate will work on developing a state-of-the-art regional model representation of nighttime chemistry building on existing model capabilities of the community model WRF/Chem. The NOAA-funded project involves improving the understanding of interaction of transport and chemistry during nighttime, quantifying the impact of nighttime chemistry for chlorine cycling, nitrate formation and ozone production, and validating the model with field observations from the CalNeX Campaign 2010. The initial appointment is for one year with the possibility of renewal for two additional years based on performance and continuation of funding. The stipend depends on qualifications. The applicant must have a Ph.D. in atmospheric sciences or a related field, with experience in regional modeling of atmospheric chemistry. Advanced scientific programming skills are essential and previous experience with WRF/Chem is highly desirable. To apply for this opportunity please email a cover letter, brief research statement, CV including publications list, and contact information for three persons willing to serve as references to Dr. Nicole Riemer at nriemer at illinois.edu. Screening of applications will begin on January 31 2011 and will continue until the position is filled. 
Nicole Riemer Assistant Professor Department of Atmospheric Sciences University of Illinois at Urbana-Champaign 105 S. Gregory Urbana, IL 61801 phone: 217-244-2844 nriemer at illinois.edu http://www.atmos.uiuc.edu/~nriemer From bbrashers at Environcorp.com Thu Jan 20 18:11:59 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Thu, 20 Jan 2011 17:11:59 -0800 Subject: [Wrf-users] Using high resolution snow data Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF30852061ED7BC@irvine01.irvine.environ.local> I would like to supply WRF with snow data at higher spatial resolution than in my initialization dataset (12km NAM) for my inner-most grids. Has anyone used one of the MODIS products, e.g. http://nsidc.org/data/myd10c1v5.html? It comes in HDF format, which is quasi-related to netCDF format. Has anyone written an HDF-to-intermediate converter that they're willing to share (maybe called unhdf)? Or a netCDF-to-intermediate converter? Or perhaps there's another data source for ~1km snow data that people like? Any advice would be appreciated. Bart bbrashers at environcorp.com This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. -------------- next part -------------- An HTML attachment was scrubbed... 
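An editorial aside on the snow-data question above: no "unhdf" converter was posted in this thread, but the WPS intermediate format is simple enough to emit directly once the HDF/MODIS grid has been read (e.g. with pyhdf). Below is a hedged Python sketch of one slab in the version-5 format as I understand it from WPS's ungrib/metgrid sources; treat the field widths (24/32/9/25/46), the earth-radius value, and the `SNOWH` field name as assumptions to verify against your WPS release.

```python
import io
import struct

def rec(payload: bytes) -> bytes:
    # Fortran "unformatted sequential" framing: big-endian length, payload, length.
    n = struct.pack(">i", len(payload))
    return n + payload + n

def write_slab(f, field, units, desc, hdate, xlvl, nx, ny,
               startlat, startlon, dlat, dlon, data):
    # One 2-D slab, WPS intermediate format version 5, lat-lon grid (iproj = 0).
    f.write(rec(struct.pack(">i", 5)))                    # format version
    f.write(rec(struct.pack(">24sf32s9s25s46sfiii",
                            hdate.ljust(24).encode(), 0.0,   # HDATE, XFCST
                            b"MODIS".ljust(32),              # MAP_SOURCE
                            field.ljust(9).encode(),
                            units.ljust(25).encode(),
                            desc.ljust(46).encode(),
                            xlvl, nx, ny, 0)))
    f.write(rec(struct.pack(">8sfffff", b"SWCORNER",      # lat-lon projection record
                            startlat, startlon, dlat, dlon, 6367470.0)))
    f.write(rec(struct.pack(">i", 0)))                    # IS_WIND_EARTH_REL = .false.
    f.write(rec(struct.pack(">%df" % (nx * ny), *data)))  # the slab itself

buf = io.BytesIO()
write_slab(buf, "SNOWH", "m", "Snow depth", "2010-06-17_18:00:00",
           200100.0, 2, 2, 38.0, -20.0, 0.05, 0.05, [0.0, 0.1, 0.2, 0.3])
blob = buf.getvalue()
print(len(blob), "bytes")
```

The xlvl value 200100.0 is WPS's conventional code for a surface field; ungrib-produced files can be compared byte-for-byte against this output to validate the layout before feeding metgrid.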
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110120/951736f9/attachment.html
From giselezepka at bol.com.br Fri Jan 21 13:13:42 2011
From: giselezepka at bol.com.br (Gisele Zepka)
Date: Fri, 21 Jan 2011 18:13:42 -0200
Subject: [Wrf-users] precipitation ice mass
Message-ID: <4D39E8F6.5080703@bol.com.br>

Dear users,

I am very interested in one of the storm parameters that can be extracted from WRF; however, I have been unable to get that variable from my simulations. I wonder if you could tell me how I can extract the "precipitation ice mass" from the model. My domain has a 3 km grid spacing and I am using wrf_post from version 2.2 and ARWpost from version 3.1 to post-process my data.

Thanks!

All best,
Gisele
From kganbour at yahoo.com Fri Jan 21 01:01:45 2011
From: kganbour at yahoo.com (Khaled Ganbour)
Date: Fri, 21 Jan 2011 00:01:45 -0800 (PST)
Subject: [Wrf-users] How to get data for my domain
Message-ID: <718472.69699.qm@web46306.mail.sp1.yahoo.com>

Dear All:
I have compiled the WRF Model with em_real, and I have run it with the case from the online tutorial, "January 2000 Case".
I would like to run the model on my own domain, for example the Libya area.
Please, how can I get the input data for my domain?

I am looking forward to your reply.

With best regards

Khaled
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110121/cf189d29/attachment.html
From kganbour at yahoo.com Fri Jan 21 02:17:24 2011
From: kganbour at yahoo.com (Khaled Ganbour)
Date: Fri, 21 Jan 2011 01:17:24 -0800 (PST)
Subject: [Wrf-users] WRF Compile for many cases
Message-ID: <492106.24473.qm@web46302.mail.sp1.yahoo.com>

Dear All:
I have run the ARW WRF model with em_real. Can I compile for another case, such as the idealized cases or NMM WRF, and keep the first compilation?

best regards

Khaled
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110121/16bed134/attachment.html
From totangjie at gmail.com Thu Jan 20 23:29:44 2011
From: totangjie at gmail.com (Jie TANG)
Date: Fri, 21 Jan 2011 14:29:44 +0800
Subject: [Wrf-users] where can I find the detail introduction of IDEAL test of WRF?
Message-ID: 

Hi, WRF user group:

I want to find a detailed description of the numerical schemes used in the idealized tests provided with WRF, for example em_quarter_ss, em_grav2d_x and em_hill2d_x. The README files for these ideal tests are too brief and I cannot get the details from them.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110121/866b599f/attachment.html
From FLiu at azmag.gov Mon Jan 24 09:09:43 2011
From: FLiu at azmag.gov (Feng Liu)
Date: Mon, 24 Jan 2011 16:09:43 +0000
Subject: [Wrf-users] WRF Compile for many cases
In-Reply-To: <492106.24473.qm@web46302.mail.sp1.yahoo.com>
References: <492106.24473.qm@web46302.mail.sp1.yahoo.com>
Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C029EA14D@mag9006>

Hi Khaled,

You may save all the current executable files in another directory (not in ../main/); alternatively, you can rename those executables if you still want to keep them in ../main when you do ./clean -a and compile another case. You can then invoke the differently named executables, or the ones in different directories, when you run different cases. Please keep in mind that a different Registry may be needed when compiling a different case. Thanks.

Feng

From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Khaled Ganbour
Sent: Friday, January 21, 2011 2:17 AM
To: wrf-users at ucar.edu
Subject: [Wrf-users] WRF Compile for many cases

Dear All:
I have run ARW WRF model with em_real. Can I compile with another case such Idealized cases or NMM WRF and keep the first compilation?
best regards
Khaled
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110124/3d350a7c/attachment.html
From kganbour at yahoo.com Mon Jan 24 09:18:24 2011
From: kganbour at yahoo.com (Khaled Ganbour)
Date: Mon, 24 Jan 2011 08:18:24 -0800 (PST)
Subject: [Wrf-users] [wrf-users]I couldn't run NCL
Message-ID: <729147.9049.qm@web46305.mail.sp1.yahoo.com>

Dear All

I have tried to install the NCL program, but I always get errors. I have attached the log file from the installation in case you can help me.

I am looking forward to your reply.

With best regards
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110124/3333ed17/attachment-0001.html
-------------- next part --------------
A non-text attachment was scrubbed...
Name: make-output
Type: application/octet-stream
Size: 990162 bytes
Desc: not available
Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110124/3333ed17/attachment-0001.obj
From mkudsy at gmail.com Sat Jan 22 20:58:47 2011
From: mkudsy at gmail.com (M Kudsy)
Date: Sun, 23 Jan 2011 10:58:47 +0700
Subject: [Wrf-users] How to get data for my domain
In-Reply-To: <718472.69699.qm@web46306.mail.sp1.yahoo.com>
References: <718472.69699.qm@web46306.mail.sp1.yahoo.com>
Message-ID: 

You can get the input data from many sources, for example from
ftp://ftpprd.ncep.noaa.gov/pub/data/nccf/com/gfs/prod
Under its sub-directories there are daily global forecast products, from the 00Z cycle out to several days, in GRIB format.
Other sources of input are listed at
http://www.mmm.ucar.edu/wrf/users/download/free_data.html
Have a try.

Mahally

On Fri, Jan 21, 2011 at 3:01 PM, Khaled Ganbour wrote:

> Dear All:
> I have compiled the WRF Model with em_real, and I have run it with the case
> from the online tutorial "January 2000 Case".
> I would like to run the model on my domain, for example the Libya area.
> Please how can I get the input data for my domain?
>
> I am looking forward to your reply.
>
> With best regards
>
> Khaled
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>
-- 
Dr.Mahally Kudsy
Weather Modification Technology Center
Agency for the Assessment and Application of Technology
Jln MH Thamrin 8, Jakarta, Indonesia
Telp:62-21-3168830 Fax:62-21-3906225
mkudsy at gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110123/78eb4925/attachment.html From mkudsy at gmail.com Sat Jan 22 21:14:54 2011 From: mkudsy at gmail.com (M Kudsy) Date: Sun, 23 Jan 2011 11:14:54 +0700 Subject: [Wrf-users] precipitation ice mass In-Reply-To: <4D39E8F6.5080703@bol.com.br> References: <4D39E8F6.5080703@bol.com.br> Message-ID: Can you describe your domain? On Sat, Jan 22, 2011 at 3:13 AM, Gisele Zepka wrote: > Dear users, > > I was very interested in one of the storm parameters that can be > extracted from the WRF, however I was unable to get that variable from > my simulations. I wonder if you could tell me how I can extract the > "precipitation ice mass" from the model. My domain has a 3 km grid > spacing and I am using wrf_post from version 2.2 and ARWpost from > version 3.1 to post-process my data. > > Thanks! > > All best, > > Gisele > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -- Dr.Mahally Kudsy Weather Modification Technology Center Agency for the Assessment and Application of Technology Jln MH Thamrin 8, Jakarta, Indonesia Telp:62-21-3168830 Fax:62-21-3906225 mkudsy at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110123/fb2fc4f7/attachment.html From kganbour at yahoo.com Mon Jan 24 10:40:23 2011 From: kganbour at yahoo.com (Khaled Ganbour) Date: Mon, 24 Jan 2011 09:40:23 -0800 (PST) Subject: [Wrf-users] [wrf-users]WPPS Message-ID: <925436.65956.qm@web46305.mail.sp1.yahoo.com> Dear All: I compiled WPPS but I think it wasn't compatible with my system. Does WPPS only work with certain processors? I attached the compilation log, the config.wpp file, and the error I get when I run it. best regards -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110124/ce09a7ca/attachment-0001.html -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: WPPS.txt Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110124/ce09a7ca/attachment-0001.txt -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.wpp Type: application/octet-stream Size: 2046 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110124/ce09a7ca/attachment-0001.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: compile_wpp.log Type: text/x-log Size: 51738 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110124/ce09a7ca/attachment-0001.bin From mkudsy at gmail.com Mon Jan 24 17:17:44 2011 From: mkudsy at gmail.com (M Kudsy) Date: Tue, 25 Jan 2011 07:17:44 +0700 Subject: [Wrf-users] [wrf-users]I couldn't run NCL In-Reply-To: <729147.9049.qm@web46305.mail.sp1.yahoo.com> References: <729147.9049.qm@web46305.mail.sp1.yahoo.com> Message-ID: Khaled, I think you haven't specified the location of the X11 libraries. On Unix systems they may be located in different places, such as /usr/X11/lib, /usr/lib/X11 or /usr/X11/lib/X11. Try to find a library such as Xaw using the command: locate Xaw | more Greetings, Mahally On Mon, Jan 24, 2011 at 11:18 PM, Khaled Ganbour wrote: > > Dear All > > I have tried to install the NCL program, but I always get errors. I attached the > installation log file; I hope you can help me. > > I am looking forward to your reply.
> > With best regards > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- Dr.Mahally Kudsy Weather Modification Technology Center Agency for the Assessment and Application of Technology Jln MH Thamrin 8, Jakarta, Indonesia Telp:62-21-3168830 Fax:62-21-3906225 mkudsy at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110125/245ee71b/attachment.html From maemarcus at gmail.com Tue Jan 25 14:04:47 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Wed, 26 Jan 2011 00:04:47 +0300 Subject: [Wrf-users] [wrf-users]WPPS In-Reply-To: <925436.65956.qm@web46305.mail.sp1.yahoo.com> References: <925436.65956.qm@web46305.mail.sp1.yahoo.com> Message-ID: Khaled, The application complains about the absence of SSE3 instruction support on your CPU. In the compile_wpp.log you attached there is the line: ifort -free -O3 -xT Here -xT, according to the Intel Fortran manual, means use of SSE3. I think if you disable this option, the application should work fine for you. Good luck, - D. 2011/1/24 Khaled Ganbour > Dear All: > I compiled WPPS but I think it wasn't compatible with my system. > Does WPPS only work with certain processors? > I attached the compilation log, the config.wpp file, and the error I get when I run > it. > > > best regards > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed...
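[Editor's note: Dmitry's diagnosis is easy to verify on Linux before touching the compile flags, because SSE3 is reported in /proc/cpuinfo under the flag name "pni" (Prescott New Instructions). This check is generic shell, not part of the WPPS build itself:]

```shell
# SSE3 appears as the "pni" flag in /proc/cpuinfo on Linux.
if grep -qw pni /proc/cpuinfo 2>/dev/null; then
  echo "SSE3 supported - a -xT build should run here"
else
  echo "no SSE3 - remove -xT from the ifort flags in configure.wpp"
fi
```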
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110126/7712d614/attachment.html From FLiu at azmag.gov Wed Jan 26 16:08:53 2011 From: FLiu at azmag.gov (Feng Liu) Date: Wed, 26 Jan 2011 23:08:53 +0000 Subject: [Wrf-users] wrong time interval for wrfout Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C2262F1A1@mag9006> Hi, Users, I want to have hourly output for d01, d02, and d03 but 5-minute output for d04; please see the time_control section in the attached namelist.input. I do not know why the time in wrfout_d04 increases by about 2 seconds at each time level, which is not an exact 5-minute interval. The output file names look messy as well (see the attached wrfout_d04_files). Any hint will be helpful. Thank you. Times = "2008-03-13_00:00:00", "2008-03-13_00:05:02", "2008-03-13_00:10:04", "2008-03-13_00:15:06", "2008-03-13_00:20:08", "2008-03-13_00:25:11", "2008-03-13_00:30:13", "2008-03-13_00:35:15", "2008-03-13_00:40:17", "2008-03-13_00:45:20", "2008-03-13_00:50:22", "2008-03-13_00:55:24" ; Feng -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110126/4324d479/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input Type: application/octet-stream Size: 1737 bytes Desc: namelist.input Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110126/4324d479/attachment.obj -------------- next part -------------- A non-text attachment was scrubbed...
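[Editor's note, an assumption since the attached namelist was scrubbed from the archive: WRF writes a history frame on the first model step at or after each history interval, so if the d04 time step does not divide the 300 s interval evenly, the timestamps creep forward a few seconds per frame, exactly the pattern in the Times listing above. A combination that avoids this might look like the fragment below, where every domain's step divides its history interval exactly; the specific numbers are illustrative only:]

```fortran
&time_control
 history_interval       = 60, 60, 60, 5,   ! minutes: hourly d01-d03, 5 min d04
/
&domains
 time_step              = 90,              ! d01 step in seconds; 90 divides 3600
 parent_time_step_ratio = 1, 2, 3, 3,      ! d04 step = 90/(2*3*3) = 5 s; 5 divides 300
/
```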
Name: wrfout_d04_files Type: application/octet-stream Size: 992 bytes Desc: wrfout_d04_files Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110126/4324d479/attachment-0001.obj From kganbour at yahoo.com Wed Jan 26 13:33:45 2011 From: kganbour at yahoo.com (Khaled Ganbour) Date: Wed, 26 Jan 2011 12:33:45 -0800 (PST) Subject: [Wrf-users] [wrf-users]ARWpost.exe fail to run Message-ID: <910736.46103.qm@web46312.mail.sp1.yahoo.com> Dear All: After I ran the ARW WRF model, I compiled ARWpost fine, but when I run ARWpost.exe I get a segmentation fault error. I work on a small computer, but I entered: ulimit -s unlimited and got no result. I attached the error and compile.log of ARWpost. Best regards Khaled -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110126/00af1e40/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: compile.log Type: application/octet-stream Size: 8049 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110126/00af1e40/attachment-0001.obj -------------- next part -------------- An embedded and charset-unspecified text was scrubbed...
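[Editor's note on the ulimit step, a general shell pitfall rather than anything ARWpost-specific: the raised stack limit only applies to the shell where it was set and to processes started from that shell, and on some systems a hard limit silently blocks the change, so read the limit back before rerunning:]

```shell
# Raise the stack limit, then read it back; launch ARWpost.exe from this
# same shell only if the second command actually prints "unlimited".
ulimit -s unlimited 2>/dev/null || echo "could not raise the stack limit"
ulimit -s
```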
Name: ARWpost.txt Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110126/00af1e40/attachment-0001.txt From jliang at arb.ca.gov Thu Jan 27 14:00:18 2011 From: jliang at arb.ca.gov (Liang, Jinyou@ARB) Date: Thu, 27 Jan 2011 13:00:18 -0800 Subject: [Wrf-users] WPS: ungrib input data Message-ID: <730450480DDFFB42A80EE000074F3E3005D8309ACA@MDTSSWECCR15.rf01.itservices.ca.gov> Dear WRF expert, I am a new user of WRF, and intend to run WRF-arw during Aug 16 (12Z) --- Aug 26 (12Z), 2010 in the following domain, as defined in the namelist.wps: __________________________________________________________________________________________________ &share wrf_core = 'ARW', max_dom = 2, io_form_geogrid = 2, start_date = '2010-08-16_12:00:00','2010-08-16_12:00:00', end_date = '2010-08-26_12:00:00','2010-08-26_12:00:00', interval_seconds = 10800 / &geogrid parent_id = 1, 1, parent_grid_ratio = 1, 3, s_we = 1, 1, s_sn = 1, 1, e_we = 273, 190, e_sn = 273, 190, i_parent_start = 1, 29, j_parent_start = 1, 49, geog_data_res = 'modis_landuse_20class_30s+5m','modis_landuse_20class_30s+2m', dx = 12000, dy = 12000, map_proj = 'lambert', ref_lat = 37.0, ref_lon = -120.5, truelat1 = 30.0, truelat2 = 60.0, stand_lon = -120.5, ... At the WPS steps, ungrib needs input GRIB files, for 'link_grib.csh', which I do not have. If you could provide the necessary data files, I would greatly appreciate your kindness. With best regards, Paul Jinyou (Paul) Liang, Ph.D. Staff Air Pollution Specialist California Air Resources Board Sacramento, CA 95812 Phone: (916) 327-8543 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110127/0b0bab50/attachment.html From maemarcus at gmail.com Thu Jan 27 14:00:49 2011 From: maemarcus at gmail.com (Dmitry N. 
Mikushin) Date: Fri, 28 Jan 2011 00:00:49 +0300 Subject: [Wrf-users] [wrf-users]ARWpost.exe fail to run In-Reply-To: <910736.46103.qm@web46312.mail.sp1.yahoo.com> References: <910736.46103.qm@web46312.mail.sp1.yahoo.com> Message-ID: Khaled, This is just a segfault message with no exact place where it happens. You can try collecting a core dump or running the app under a debugger, for example gdb. It's simple: first check that the "-g" option is included in your compilation options; if yes, just issue [khaled at localhost ARWpost]$ gdb ./ARWpost.exe then type "r" (without quotes) or "run" at the gdb prompt. After the error occurs, type "bt" for a backtrace to see the trace of function calls inside the application. Post your trace here; this way it should become clearer where it fails. 2011/1/26 Khaled Ganbour > Dear All: > After I ran the ARW WRF model, I compiled ARWpost fine, but when I run > ARWpost.exe I get a segmentation fault error. > I work on a small computer, but I entered: ulimit -s unlimited > and got no result. > I attached the error and compile.log of ARWpost. > > > Best regards > > > Khaled > > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110128/62c8b293/attachment.html From hedde.cea at gmail.com Mon Jan 31 05:25:15 2011 From: hedde.cea at gmail.com (Thierry HEDDE) Date: Mon, 31 Jan 2011 13:25:15 +0100 Subject: [Wrf-users] WPS: ungrib input data In-Reply-To: <730450480DDFFB42A80EE000074F3E3005D8309ACA@MDTSSWECCR15.rf01.itservices.ca.gov> References: <730450480DDFFB42A80EE000074F3E3005D8309ACA@MDTSSWECCR15.rf01.itservices.ca.gov> Message-ID: Dear Jinyou, you may find the GFS data here: http://nomads.ncep.noaa.gov/ For a beginner, I would advise using WRF Portal to start with WRF: http://www.wrfportal.org/ Cordially Thierry HEDDE Laboratoire de Modélisation des Transferts dans l'Environnement CEA/CADARACHE 13108 ST PAUL LEZ DURANCE CEDEX FRANCE 2011/1/27 Liang, Jinyou at ARB > Dear WRF expert, > > > > I am a new user of WRF, and intend to run WRF-arw during Aug 16 (12Z) --- > Aug 26 (12Z), 2010 in the following domain, as defined in the namelist.wps: > > > > > __________________________________________________________________________________________________ > > &share > > wrf_core = 'ARW', > > max_dom = 2, > > io_form_geogrid = 2, > > start_date = '2010-08-16_12:00:00','2010-08-16_12:00:00', > > end_date = '2010-08-26_12:00:00','2010-08-26_12:00:00', > > interval_seconds = 10800 > > / > > &geogrid > > parent_id = 1, 1, > > parent_grid_ratio = 1, 3, > > s_we = 1, 1, > > s_sn = 1, 1, > > e_we = 273, 190, > > e_sn = 273, 190, > > i_parent_start = 1, 29, > > j_parent_start = 1, 49, > > geog_data_res = > 'modis_landuse_20class_30s+5m','modis_landuse_20class_30s+2m', > > dx = 12000, > > dy = 12000, > > map_proj = 'lambert', > > ref_lat = 37.0, > > ref_lon = -120.5, > > truelat1 = 30.0, > > truelat2 = 60.0, > > stand_lon = -120.5, > > ... > > > > At the WPS steps, ungrib needs input GRIB files, for 'link_grib.csh', which > I do not have. If you could provide the necessary data files, I would > greatly appreciate your kindness.
> > > > With best regards, > > Paul > > Jinyou (Paul) Liang, Ph.D. > > Staff Air Pollution Specialist > > California Air Resources Board > > Sacramento, CA 95812 > > Phone: (916) 327-8543 > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110131/c2ca7602/attachment.html From dbh409 at ku.edu Mon Jan 31 23:28:56 2011 From: dbh409 at ku.edu (Huber, David) Date: Tue, 1 Feb 2011 06:28:56 +0000 Subject: [Wrf-users] Compile WPS with gfortran and gcc Message-ID: Hello all, I have run into a situation where I need to have a version of WPS compiled with the GNU compilers. I need to be able to write WPS intermediate files in NCL using a Fortran stub so that WPS can process them and generate NetCDF input files. I am using a GNU precompiled binaries version of NCL, so WPS and the stub must also be compiled with the GNU compilers. This build of WPS will not be used normally, and I don't need any of the utilities, geogrid.exe, or ungrib.exe, just metgrid.exe. It's a 64-bit CentOS machine running bash. The architecture specific settings in configure.wps are as follows: #### Architecture specific settings #### # Settings for PC Linux x86_64, gfortran compiler, serial, NO GRIB2 # COMPRESSION_LIBS = COMPRESSION_INC = FDEFS = FC = gfortran SFC = gfortran FFLAGS = -ffree-form -w -fno-underscoring F77FLAGS = -ffixed-form -w -fno-underscoring FNGFLAGS = $(FFLAGS) LDFLAGS = -w CC = gcc SCC = gcc CFLAGS = -w CPP = /usr/bin/cpp -C -P -traditional CPPFLAGS = -D_UNDERSCORE -DBYTESWAP -DLINUX -DIO_NETCDF -DBIT32 I get several errors with this and the compilation is not successful. I'm not sure what flags I have wrong here, but any guidance would be much appreciated.
Thanks, Dave From dbh409 at ku.edu Tue Feb 1 12:55:17 2011 From: dbh409 at ku.edu (Huber, David) Date: Tue, 1 Feb 2011 19:55:17 +0000 Subject: [Wrf-users] Compile WPS with gfortran and gcc In-Reply-To: References: , Message-ID: Dmitry, Primarily "undefined reference" errors. For instance, module_debug.o: In function `__module_debug__mprintf': module_debug.f90:(.text+0xfe): undefined reference to `cio_set_log_filename' module_debug.f90:(.text+0x1cb): undefined reference to `cio_set_log_filename' module_debug.f90:(.text+0x2a4): undefined reference to `cio_set_log_filename' module_debug.f90:(.text+0x7b5): undefined reference to `cio_prints' module_debug.f90:(.text+0xbe9): undefined reference to `cio_prints' module_debug.f90:(.text+0xc93): undefined reference to `cio_prints' module_debug.f90:(.text+0x10c8): undefined reference to `cio_prints' module_debug.f90:(.text+0x1185): undefined reference to `cio_prints' module_debug.o:module_debug.f90:(.text+0x11a0): more undefined references to `cio_prints' follow output_module.o: In function `__output_module__output_close': output_module.f90:(.text+0x3f): undefined reference to `ext_int_ioclose' output_module.f90:(.text+0x58): undefined reference to `ext_ncd_ioclose' output_module.f90:(.text+0x71): undefined reference to `ext_gr1_ioclose' output_module.f90:(.text+0x1e2): undefined reference to `ext_int_ioexit' output_module.f90:(.text+0x1f6): undefined reference to `ext_ncd_ioexit' output_module.f90:(.text+0x20a): undefined reference to `ext_gr1_ioexit' output_module.o: In function `__output_module__ext_put_dom_ti_char': output_module.f90:(.text+0x3fd): undefined reference to `ext_int_put_dom_ti_char' output_module.f90:(.text+0x47e): undefined reference to `ext_ncd_put_dom_ti_char' output_module.f90:(.text+0x4ff): undefined reference to `ext_gr1_put_dom_ti_char' Dave From dbh409 at ku.edu Tue Feb 1 14:24:36 2011 From: dbh409 at ku.edu (Huber, David) Date: Tue, 1 Feb 2011 21:24:36 +0000 Subject: [Wrf-users] Compile WPS with 
gfortran and gcc In-Reply-To: References: , Message-ID: Dmitry, Thanks for the recipe! This resulted in different errors, so I tried adding the flags -DIO_BINARY and -DIO_GRIB1 to CPPFLAGS, but that didn't change anything. Also, in LDFLAGS, I tried explicitly adding the library directory with -L/home/dbh409/WRFV3/external/iogrib1 to no avail. I'm getting the same "undefined reference" errors but also /usr/bin/ld: cannot find -lio_grib1 collect2: ld returned 1 exit status ________________________________________ From: Dmitry N. Mikushin [maemarcus at gmail.com] Sent: Tuesday, February 01, 2011 2:38 PM To: Huber, David Cc: wrf-users at ucar.edu Subject: Re: [Wrf-users] Compile WPS with gfortran and gcc David, Here's a common recipe to look up missing symbols. Navigate to the project dir and grep for the particular symbol you have an issue with, for instance: marcusmae at msiwind:~$ cd Programming/wrfv3/ marcusmae at msiwind:~/Programming/wrfv3$ grep ext_gr1_put_dom_ti_char * -R arch/md_calls.inc: CALL ext_gr1_put_dom_ti_char ( Hndl, Element, Data, & external/io_grib1/io_grib1.f90:SUBROUTINE ext_gr1_put_dom_ti_char ( DataHandle,Element, Data, & external/io_grib1/io_grib1.f90: call wrf_debug ( DEBUG , 'Entering ext_gr1_put_dom_ti_char') external/io_grib1/io_grib1.f90:END SUBROUTINE ext_gr1_put_dom_ti_char Binary file external/io_grib1/libio_grib1.a matches external/io_grib1/io_grib1.F:SUBROUTINE ext_gr1_put_dom_ti_char ( DataHandle,Element, Data, & external/io_grib1/io_grib1.F: call wrf_debug ( DEBUG , 'Entering ext_gr1_put_dom_ti_char') external/io_grib1/io_grib1.F:END SUBROUTINE ext_gr1_put_dom_ti_char Binary file external/io_grib1/io_grib1.o matches frame/module_io.f90: CALL ext_gr1_put_dom_ti_char ( Hndl, Element, Data, & frame/md_calls.inc: CALL ext_gr1_put_dom_ti_char ( Hndl, Element, Data, & Binary file frame/module_io.o matches frame/module_io_quilt.F: CALL ext_gr1_put_dom_ti_char ( handle(DataHandle), TRIM(Element), TRIM(CData), Status) Binary file main/wrf.exe
matches Binary file main/libwrflib.a matches Binary file main/ideal.exe matches So, we've found some sources define target symbol, and some libraries incorporate either its references or definitions. Now we can look which libraries just use the symbol, and where is it really defined. The nm tool can help: marcusmae at msiwind:~/Programming/wrfv3$ nm main/libwrflib.a | grep ext_gr1_put_dom_ti_char U ext_gr1_put_dom_ti_char_ "U" means libwrflib.a references, but does not define. It could be in some other library: marcusmae at msiwind:~/Programming/wrfv3$ nm external/io_grib1/libio_grib1.a | grep ext_gr1_put_dom_ti_char 0000df30 T ext_gr1_put_dom_ti_char_ 00d63600 b ext_gr1_put_dom_ti_char_$TMPSTR.0.56 - aha, "T" - means function body defined here, in libio_grib1. So my guess would be to check if your app is linked with -lio_grib1, and if not, adding -lio_grib1 to LDFLAGS should help. Hope it helps, - D. 2011/2/1 Huber, David : > Dmitry, > > Primarily "undefined reference" errors. For instance, > > > module_debug.o: In function `__module_debug__mprintf': > module_debug.f90:(.text+0xfe): undefined reference to `cio_set_log_filename' > module_debug.f90:(.text+0x1cb): undefined reference to `cio_set_log_filename' > module_debug.f90:(.text+0x2a4): undefined reference to `cio_set_log_filename' > module_debug.f90:(.text+0x7b5): undefined reference to `cio_prints' > module_debug.f90:(.text+0xbe9): undefined reference to `cio_prints' > module_debug.f90:(.text+0xc93): undefined reference to `cio_prints' > module_debug.f90:(.text+0x10c8): undefined reference to `cio_prints' > module_debug.f90:(.text+0x1185): undefined reference to `cio_prints' > module_debug.o:module_debug.f90:(.text+0x11a0): more undefined references to `cio_prints' follow > > > output_module.o: In function `__output_module__output_close': > output_module.f90:(.text+0x3f): undefined reference to `ext_int_ioclose' > output_module.f90:(.text+0x58): undefined reference to `ext_ncd_ioclose' > 
output_module.f90:(.text+0x71): undefined reference to `ext_gr1_ioclose' > output_module.f90:(.text+0x1e2): undefined reference to `ext_int_ioexit' > output_module.f90:(.text+0x1f6): undefined reference to `ext_ncd_ioexit' > output_module.f90:(.text+0x20a): undefined reference to `ext_gr1_ioexit' > output_module.o: In function `__output_module__ext_put_dom_ti_char': > output_module.f90:(.text+0x3fd): undefined reference to `ext_int_put_dom_ti_char' > output_module.f90:(.text+0x47e): undefined reference to `ext_ncd_put_dom_ti_char' > output_module.f90:(.text+0x4ff): undefined reference to `ext_gr1_put_dom_ti_char' > > Dave From dbh409 at ku.edu Tue Feb 1 15:01:58 2011 From: dbh409 at ku.edu (Huber, David) Date: Tue, 1 Feb 2011 22:01:58 +0000 Subject: [Wrf-users] Compile WPS with gfortran and gcc In-Reply-To: References: , Message-ID: Dmitry, Oh now that was just silly. OK, that takes care of the bulk of the errors. There are still a few undefined references, but, I think I can take it from here. Thanks a bunch! Dave ________________________________________ From: Dmitry N. Mikushin [maemarcus at gmail.com] Sent: Tuesday, February 01, 2011 3:46 PM To: Huber, David Cc: wrf-users at ucar.edu Subject: Re: [Wrf-users] Compile WPS with gfortran and gcc David, I think you're very close to get a working solution! The "-D" prefix just defines symbol in fortran or C source code, e.g. if there is #ifdef IO_BINARY ... #endif in code, then it will be effective with -DIO_BINARY. Is it what you want? Also, with -L option you're right, but is there a "_" missing, i.e. -L/home/dbh409/WRFV3/external/io_grib1 marcusmae at msiwind:~$ cd Programming/wrfv3/ marcusmae at msiwind:~/Programming/wrfv3$ find . -type f -name "libio_grib1.a" ./external/io_grib1/libio_grib1.a ? - D. 2011/2/2 Huber, David : > Dmitry, > > Thanks for the recipe! This resulted in different erros, so I tried adding the flags -DIO_BINARY and -DIO_GRIB1 to CPPFLAGS, but that didn't change anything. 
Also, in LDFLAGS, I tried explicitly adding the library directory with -L/home/dbh409/WRFV3/external/iogrib1 to no avail. I'm getting the same "undefined reference" errors but also > > /usr/bin/ld: cannot find -lio_grib1 > collect2: ld returned 1 exit status > > > ________________________________________ > From: Dmitry N. Mikushin [maemarcus at gmail.com] > Sent: Tuesday, February 01, 2011 2:38 PM > To: Huber, David > Cc: wrf-users at ucar.edu > Subject: Re: [Wrf-users] Compile WPS with gfortran and gcc > > David, > > Here's a common recipe to lookup for missing symbols. Navigate to the > project dir and grep for particular symbol you have issue with, for > instance: > > marcusmae at msiwind:~$ cd Programming/wrfv3/ > marcusmae at msiwind:~/Programming/wrfv3$ grep ext_gr1_put_dom_ti_char * -R > arch/md_calls.inc: CALL ext_gr1_put_dom_ti_char ( Hndl, > Element, Data, & > external/io_grib1/io_grib1.f90:SUBROUTINE ext_gr1_put_dom_ti_char ( > DataHandle,Element, Data, & > external/io_grib1/io_grib1.f90: call wrf_debug ( DEBUG , 'Entering > ext_gr1_put_dom_ti_char') > external/io_grib1/io_grib1.f90:END SUBROUTINE ext_gr1_put_dom_ti_char > Binary file external/io_grib1/libio_grib1.a matches > external/io_grib1/io_grib1.F:SUBROUTINE ext_gr1_put_dom_ti_char ( > DataHandle,Element, Data, & > external/io_grib1/io_grib1.F: call wrf_debug ( DEBUG , 'Entering > ext_gr1_put_dom_ti_char') > external/io_grib1/io_grib1.F:END SUBROUTINE ext_gr1_put_dom_ti_char > Binary file external/io_grib1/io_grib1.o matches > frame/module_io.f90: CALL ext_gr1_put_dom_ti_char ( Hndl, > Element, Data, & > frame/md_calls.inc: CALL ext_gr1_put_dom_ti_char ( Hndl, > Element, Data, & > Binary file frame/module_io.o matches > frame/module_io_quilt.F: CALL > ext_gr1_put_dom_ti_char ( handle(DataHandle), TRIM(Element), > TRIM(CData), Status) > Binary file main/wrf.exe matches > Binary file main/libwrflib.a matches > Binary file main/ideal.exe matches > > So, we've found some sources define target symbol, 
and some libraries > incorporate either its references or definitions. Now we can look > which libraries just use the symbol, and where is it really defined. > The nm tool can help: > > marcusmae at msiwind:~/Programming/wrfv3$ nm main/libwrflib.a | grep > ext_gr1_put_dom_ti_char > U ext_gr1_put_dom_ti_char_ > > "U" means libwrflib.a references, but does not define. It could be in > some other library: > > marcusmae at msiwind:~/Programming/wrfv3$ nm > external/io_grib1/libio_grib1.a | grep ext_gr1_put_dom_ti_char > 0000df30 T ext_gr1_put_dom_ti_char_ > 00d63600 b ext_gr1_put_dom_ti_char_$TMPSTR.0.56 > > - aha, "T" - means function body defined here, in libio_grib1. > > So my guess would be to check if your app is linked with -lio_grib1, > and if not, adding -lio_grib1 to LDFLAGS should help. > > Hope it helps, > - D. > > 2011/2/1 Huber, David : >> Dmitry, >> >> Primarily "undefined reference" errors. For instance, >> >> >> module_debug.o: In function `__module_debug__mprintf': >> module_debug.f90:(.text+0xfe): undefined reference to `cio_set_log_filename' >> module_debug.f90:(.text+0x1cb): undefined reference to `cio_set_log_filename' >> module_debug.f90:(.text+0x2a4): undefined reference to `cio_set_log_filename' >> module_debug.f90:(.text+0x7b5): undefined reference to `cio_prints' >> module_debug.f90:(.text+0xbe9): undefined reference to `cio_prints' >> module_debug.f90:(.text+0xc93): undefined reference to `cio_prints' >> module_debug.f90:(.text+0x10c8): undefined reference to `cio_prints' >> module_debug.f90:(.text+0x1185): undefined reference to `cio_prints' >> module_debug.o:module_debug.f90:(.text+0x11a0): more undefined references to `cio_prints' follow >> >> >> output_module.o: In function `__output_module__output_close': >> output_module.f90:(.text+0x3f): undefined reference to `ext_int_ioclose' >> output_module.f90:(.text+0x58): undefined reference to `ext_ncd_ioclose' >> output_module.f90:(.text+0x71): undefined reference to `ext_gr1_ioclose' >> 
output_module.f90:(.text+0x1e2): undefined reference to `ext_int_ioexit' >> output_module.f90:(.text+0x1f6): undefined reference to `ext_ncd_ioexit' >> output_module.f90:(.text+0x20a): undefined reference to `ext_gr1_ioexit' >> output_module.o: In function `__output_module__ext_put_dom_ti_char': >> output_module.f90:(.text+0x3fd): undefined reference to `ext_int_put_dom_ti_char' >> output_module.f90:(.text+0x47e): undefined reference to `ext_ncd_put_dom_ti_char' >> output_module.f90:(.text+0x4ff): undefined reference to `ext_gr1_put_dom_ti_char' >> >> Dave > From francina at hwr.arizona.edu Wed Feb 2 10:46:49 2011 From: francina at hwr.arizona.edu (Francina Dominguez) Date: Wed, 2 Feb 2011 10:46:49 -0700 Subject: [Wrf-users] Postdoctoral Position University of Arizona Message-ID: Postdoctoral Scholar ? Exploring hydrologic extremes in a changing climate using regional climate modeling Starting in the Spring of 2011, we seek a motivated scientist to become part of our expanding interdisciplinary regional modeling group, with a focus on land-atmosphere interactions. The selected candidate will work on a DOE-funded project that seeks to understand if regional climate models (in particular WRF) reproduce observed extreme historical hydrologic events and provide useful information about future hydrologic extremes. The selected candidate will be tasked with high-resolution nested simulations of extreme events, and their statistical characterization. Experience with WRF, good programming skills and a background in statistical analysis is desired. As part of the Hydrometeorology program at the University of Arizona, this project offers the opportunity to collaborate with an interdisciplinary group of scientists. The interested candidates should contact Francina Dominguez (francina at hwr.arizona.edu) or Chistopher Castro (castro at atmo.arizona.edu). 
-- Francina Dominguez Assistant Professor Department of Atmospheric Sciences Department of Hydrology and Water Resources University of Arizona From maemarcus at gmail.com Tue Feb 1 12:43:15 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Tue, 1 Feb 2011 22:43:15 +0300 Subject: [Wrf-users] Compile WPS with gfortran and gcc In-Reply-To: References: Message-ID: David, And what are the errors you've encountered? - D. 2011/2/1 Huber, David : > Hello all, > > I have run into a situation where I need to have a version of WPS compiled with the GNU compilers. I need to be able to write WPS intermediate files in NCL using a Fortran stub so that WPS can process them and generate NetCDF input files. I am using a GNU precompiled binaries version of NCL, so WPS and the stub must also be compiled with the GNU compilers. This build of WPS will not be used normally, and I don't need any of the utilities, geogrid.exe, or ungrib.exe, just metgrid.exe. It's a 64-bit CentOS machine running bash. The architecture specific settings in configure.wps are as follows: > > #### Architecture specific settings #### > > # Settings for PC Linux x86_64, gfortran compiler, serial, NO GRIB2 > # > COMPRESSION_LIBS = > COMPRESSION_INC = > FDEFS = > FC = gfortran > SFC = gfortran > FFLAGS = -ffree-form -w -fno-underscoring > F77FLAGS = -ffixed-form -w -fno-underscoring > FNGFLAGS = $(FFLAGS) > LDFLAGS = -w > CC = gcc > SCC = gcc > CFLAGS = -w > CPP = /usr/bin/cpp -C -P -traditional > CPPFLAGS = -D_UNDERSCORE -DBYTESWAP -DLINUX -DIO_NETCDF -DBIT32 > > I get several errors with this and the compilation is not successful. I'm not sure what flags I have wrong here, but any guidance would be much appreciated.
> > Thanks, > > Dave > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > From maemarcus at gmail.com Tue Feb 1 13:38:24 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Tue, 1 Feb 2011 23:38:24 +0300 Subject: [Wrf-users] Compile WPS with gfortran and gcc In-Reply-To: References: Message-ID: David, Here's a common recipe to lookup for missing symbols. Navigate to the project dir and grep for particular symbol you have issue with, for instance: marcusmae at msiwind:~$ cd Programming/wrfv3/ marcusmae at msiwind:~/Programming/wrfv3$ grep ext_gr1_put_dom_ti_char * -R arch/md_calls.inc: CALL ext_gr1_put_dom_ti_char ( Hndl, Element, Data, & external/io_grib1/io_grib1.f90:SUBROUTINE ext_gr1_put_dom_ti_char ( DataHandle,Element, Data, & external/io_grib1/io_grib1.f90: call wrf_debug ( DEBUG , 'Entering ext_gr1_put_dom_ti_char') external/io_grib1/io_grib1.f90:END SUBROUTINE ext_gr1_put_dom_ti_char Binary file external/io_grib1/libio_grib1.a matches external/io_grib1/io_grib1.F:SUBROUTINE ext_gr1_put_dom_ti_char ( DataHandle,Element, Data, & external/io_grib1/io_grib1.F: call wrf_debug ( DEBUG , 'Entering ext_gr1_put_dom_ti_char') external/io_grib1/io_grib1.F:END SUBROUTINE ext_gr1_put_dom_ti_char Binary file external/io_grib1/io_grib1.o matches frame/module_io.f90: CALL ext_gr1_put_dom_ti_char ( Hndl, Element, Data, & frame/md_calls.inc: CALL ext_gr1_put_dom_ti_char ( Hndl, Element, Data, & Binary file frame/module_io.o matches frame/module_io_quilt.F: CALL ext_gr1_put_dom_ti_char ( handle(DataHandle), TRIM(Element), TRIM(CData), Status) Binary file main/wrf.exe matches Binary file main/libwrflib.a matches Binary file main/ideal.exe matches So, we've found some sources define target symbol, and some libraries incorporate either its references or definitions. Now we can look which libraries just use the symbol, and where is it really defined. 
The nm tool can help:

marcusmae@msiwind:~/Programming/wrfv3$ nm main/libwrflib.a | grep ext_gr1_put_dom_ti_char
         U ext_gr1_put_dom_ti_char_

"U" means libwrflib.a references, but does not define, the symbol. It could be in some other library:

marcusmae@msiwind:~/Programming/wrfv3$ nm external/io_grib1/libio_grib1.a | grep ext_gr1_put_dom_ti_char
0000df30 T ext_gr1_put_dom_ti_char_
00d63600 b ext_gr1_put_dom_ti_char_$TMPSTR.0.56

Aha: "T" means the function body is defined here, in libio_grib1. So my guess would be to check whether your app is linked with -lio_grib1; if not, adding -lio_grib1 to LDFLAGS should help. Hope it helps, - D. 2011/2/1 Huber, David : > Dmitry, > > Primarily "undefined reference" errors. For instance:
>
> module_debug.o: In function `__module_debug__mprintf':
> module_debug.f90:(.text+0xfe): undefined reference to `cio_set_log_filename'
> module_debug.f90:(.text+0x1cb): undefined reference to `cio_set_log_filename'
> module_debug.f90:(.text+0x2a4): undefined reference to `cio_set_log_filename'
> module_debug.f90:(.text+0x7b5): undefined reference to `cio_prints'
> module_debug.f90:(.text+0xbe9): undefined reference to `cio_prints'
> module_debug.f90:(.text+0xc93): undefined reference to `cio_prints'
> module_debug.f90:(.text+0x10c8): undefined reference to `cio_prints'
> module_debug.f90:(.text+0x1185): undefined reference to `cio_prints'
> module_debug.o:module_debug.f90:(.text+0x11a0): more undefined references to `cio_prints' follow
>
> output_module.o: In function `__output_module__output_close':
> output_module.f90:(.text+0x3f): undefined reference to `ext_int_ioclose'
> output_module.f90:(.text+0x58): undefined reference to `ext_ncd_ioclose'
> output_module.f90:(.text+0x71): undefined reference to `ext_gr1_ioclose'
> output_module.f90:(.text+0x1e2): undefined reference to `ext_int_ioexit'
> output_module.f90:(.text+0x1f6): undefined reference to `ext_ncd_ioexit'
> output_module.f90:(.text+0x20a): undefined reference to `ext_gr1_ioexit'
>
output_module.o: In function `__output_module__ext_put_dom_ti_char':
> output_module.f90:(.text+0x3fd): undefined reference to `ext_int_put_dom_ti_char'
> output_module.f90:(.text+0x47e): undefined reference to `ext_ncd_put_dom_ti_char'
> output_module.f90:(.text+0x4ff): undefined reference to `ext_gr1_put_dom_ti_char'
>
> Dave From maemarcus at gmail.com Tue Feb 1 14:46:52 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Wed, 2 Feb 2011 00:46:52 +0300 Subject: [Wrf-users] Compile WPS with gfortran and gcc In-Reply-To: References: Message-ID: David, I think you're very close to a working solution! The "-D" prefix just defines a symbol for the Fortran or C preprocessor, e.g. if there is

#ifdef IO_BINARY
...
#endif

in the code, then it will take effect with -DIO_BINARY. Is that what you want? Also, with the -L option you're right, but is there a "_" missing, i.e. -L/home/dbh409/WRFV3/external/io_grib1

marcusmae@msiwind:~$ cd Programming/wrfv3/
marcusmae@msiwind:~/Programming/wrfv3$ find . -type f -name "libio_grib1.a"
./external/io_grib1/libio_grib1.a

?

- D. 2011/2/2 Huber, David : > Dmitry, > > Thanks for the recipe! This resulted in different errors, so I tried adding the flags -DIO_BINARY and -DIO_GRIB1 to CPPFLAGS, but that didn't change anything. Also, in LDFLAGS, I tried explicitly adding the library directory with -L/home/dbh409/WRFV3/external/iogrib1 to no avail. I'm getting the same "undefined reference" errors but also
>
> /usr/bin/ld: cannot find -lio_grib1
> collect2: ld returned 1 exit status
>
> ________________________________________ > From: Dmitry N. Mikushin [maemarcus at gmail.com] > Sent: Tuesday, February 01, 2011 2:38 PM > To: Huber, David > Cc: wrf-users at ucar.edu > Subject: Re: [Wrf-users] Compile WPS with gfortran and gcc > > David, > > Here's a common recipe to lookup for missing symbols.
Navigate to the > project dir and grep for particular symbol you have issue with, for > instance: > > marcusmae at msiwind:~$ cd Programming/wrfv3/ > marcusmae at msiwind:~/Programming/wrfv3$ grep ext_gr1_put_dom_ti_char * -R > arch/md_calls.inc: ? ? ? ? ? CALL ext_gr1_put_dom_ti_char ( Hndl, > Element, ? Data, & > external/io_grib1/io_grib1.f90:SUBROUTINE ext_gr1_put_dom_ti_char ( > DataHandle,Element, ? Data, ?& > external/io_grib1/io_grib1.f90: ?call wrf_debug ( DEBUG , 'Entering > ext_gr1_put_dom_ti_char') > external/io_grib1/io_grib1.f90:END SUBROUTINE ext_gr1_put_dom_ti_char > Binary file external/io_grib1/libio_grib1.a matches > external/io_grib1/io_grib1.F:SUBROUTINE ext_gr1_put_dom_ti_char ( > DataHandle,Element, ? Data, ?& > external/io_grib1/io_grib1.F: ?call wrf_debug ( DEBUG , 'Entering > ext_gr1_put_dom_ti_char') > external/io_grib1/io_grib1.F:END SUBROUTINE ext_gr1_put_dom_ti_char > Binary file external/io_grib1/io_grib1.o matches > frame/module_io.f90: ? ? ? ? ? CALL ext_gr1_put_dom_ti_char ( Hndl, > Element, Data, & > frame/md_calls.inc: ? ? ? ? ? CALL ext_gr1_put_dom_ti_char ( Hndl, > Element, ? Data, & > Binary file frame/module_io.o matches > frame/module_io_quilt.F: ? ? ? ? ? ? ? ? ? ?CALL > ext_gr1_put_dom_ti_char ( handle(DataHandle), TRIM(Element), > TRIM(CData), Status) > Binary file main/wrf.exe matches > Binary file main/libwrflib.a matches > Binary file main/ideal.exe matches > > So, we've found some sources define target symbol, and some libraries > incorporate either its references or definitions. Now we can look > which libraries just use the symbol, and where is it really defined. > The nm tool can help: > > marcusmae at msiwind:~/Programming/wrfv3$ nm main/libwrflib.a | grep > ext_gr1_put_dom_ti_char > ? ? ? ? U ext_gr1_put_dom_ti_char_ > > "U" means libwrflib.a references, but does not define. 
It could be in > some other library: > > marcusmae at msiwind:~/Programming/wrfv3$ nm > external/io_grib1/libio_grib1.a | grep ext_gr1_put_dom_ti_char > 0000df30 T ext_gr1_put_dom_ti_char_ > 00d63600 b ext_gr1_put_dom_ti_char_$TMPSTR.0.56 > > - aha, "T" - means function body defined here, in libio_grib1. > > So my guess would be to check if your app is linked with -lio_grib1, > and if not, adding -lio_grib1 to LDFLAGS should help. > > Hope it helps, > - D. > > 2011/2/1 Huber, David : >> Dmitry, >> >> Primarily "undefined reference" errors. ?For instance, >> >> >> module_debug.o: In function `__module_debug__mprintf': >> module_debug.f90:(.text+0xfe): undefined reference to `cio_set_log_filename' >> module_debug.f90:(.text+0x1cb): undefined reference to `cio_set_log_filename' >> module_debug.f90:(.text+0x2a4): undefined reference to `cio_set_log_filename' >> module_debug.f90:(.text+0x7b5): undefined reference to `cio_prints' >> module_debug.f90:(.text+0xbe9): undefined reference to `cio_prints' >> module_debug.f90:(.text+0xc93): undefined reference to `cio_prints' >> module_debug.f90:(.text+0x10c8): undefined reference to `cio_prints' >> module_debug.f90:(.text+0x1185): undefined reference to `cio_prints' >> module_debug.o:module_debug.f90:(.text+0x11a0): more undefined references to `cio_prints' follow >> >> >> output_module.o: In function `__output_module__output_close': >> output_module.f90:(.text+0x3f): undefined reference to `ext_int_ioclose' >> output_module.f90:(.text+0x58): undefined reference to `ext_ncd_ioclose' >> output_module.f90:(.text+0x71): undefined reference to `ext_gr1_ioclose' >> output_module.f90:(.text+0x1e2): undefined reference to `ext_int_ioexit' >> output_module.f90:(.text+0x1f6): undefined reference to `ext_ncd_ioexit' >> output_module.f90:(.text+0x20a): undefined reference to `ext_gr1_ioexit' >> output_module.o: In function `__output_module__ext_put_dom_ti_char': >> output_module.f90:(.text+0x3fd): undefined reference to 
`ext_int_put_dom_ti_char' >> output_module.f90:(.text+0x47e): undefined reference to `ext_ncd_put_dom_ti_char' >> output_module.f90:(.text+0x4ff): undefined reference to `ext_gr1_put_dom_ti_char' >> >> Dave > From maemarcus at gmail.com Tue Feb 1 15:06:09 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Wed, 2 Feb 2011 01:06:09 +0300 Subject: [Wrf-users] Compile WPS with gfortran and gcc In-Reply-To: References: Message-ID: Great to hear it's solved now, good luck! :) - D. 2011/2/2 Huber, David : > Dmitry, > > Oh now that was just silly. ?OK, that takes care of the bulk of the errors. ?There are still a few undefined references, but, I think I can take it from here. > > Thanks a bunch! > > Dave > ________________________________________ > From: Dmitry N. Mikushin [maemarcus at gmail.com] > Sent: Tuesday, February 01, 2011 3:46 PM > To: Huber, David > Cc: wrf-users at ucar.edu > Subject: Re: [Wrf-users] Compile WPS with gfortran and gcc > > David, > > I think you're very close to get a working solution! The "-D" prefix > just defines symbol in fortran or C source code, e.g. if there is > > #ifdef IO_BINARY > ... > #endif > > in code, then it will be effective with -DIO_BINARY. Is it what you want? > > Also, with -L option you're right, but is there a "_" missing, i.e. > -L/home/dbh409/WRFV3/external/io_grib1 > > marcusmae at msiwind:~$ cd Programming/wrfv3/ > marcusmae at msiwind:~/Programming/wrfv3$ find . -type f -name "libio_grib1.a" > ./external/io_grib1/libio_grib1.a > > ? > > - D. > > 2011/2/2 Huber, David : >> Dmitry, >> >> Thanks for the recipe! ?This resulted in different erros, so I tried adding the flags -DIO_BINARY and -DIO_GRIB1 to CPPFLAGS, but that didn't change anything. ?Also, in LDFLAGS, I tried explicitly adding the library directory with -L/home/dbh409/WRFV3/external/iogrib1 to no avail. 
I'm getting the same "undefined reference" errors but also
>> /usr/bin/ld: cannot find -lio_grib1
>> collect2: ld returned 1 exit status
>>
>> [snip: Dmitry's recipe and the error listing, quoted in full earlier in the thread]
>>> Dave >> > From pliu34 at gatech.edu Wed Feb 2 13:41:01 2011 From: pliu34 at gatech.edu (Liu, Peng) Date: Wed, 2 Feb 2011 15:41:01 -0500 (EST) Subject: [Wrf-users] ungrib fixed data in NARR Message-ID: <401451887.609307.1296679261902.JavaMail.root@mail5.gatech.edu> Dear WRF users, I encounter a problem when trying to ungrib the fixed
data of NARR. I downloaded the 32km_native.EGDFIX.fixed data, ran link_grib.csh, and then ran ungrib.exe (just following the advice in Vtable.NARR of WPS). However, it turned out:

ungrib - grib edition num 1
GRIB SECTION 0:
  Grib Length : 172058
  Grib Edition : 1
GRIB SECTION 1:
  Length of PDS : 28
  Parameter Table Version : 131
  Center ID : 7
  Process ID : 83
  Grid ID : 192
  Is there a Grid Desc. Section (GDS)? : Yes
  Is there a Bit Map Section (BMS)? : No
  Parameter : 7
  Level type : 1
  Height, pressure, etc : 0
  Year : 79
  Month : 11
  Day : 8
  Hour : 0
  Minute : 0
  Forecast time unit : 1
  P1 : 0
  P2 : 0
  Time Range Indicator : 0
  Number in Ave? : 0
  Number missing from ave? : 0
  Century : 20
  Sub-center : 15
  Decimal scale factor : 0
GRIB SECTION 2:
  Length of GRID Desc. Section : 32
  Number of V. Coordinate Parms : 0
  List Starting point : 255
  Data Representation type : 203
GRIB SECTION 4:
  Length of BDS : 0
  0/1: grid-point or sph. harm. data : 0
  0/1: simple or complex packing : 0
  0/1: floating or integer : 0
  0/1: No addl flags or addl flags : 0
  Unused bits : 0
  Binary Scale Factor : 0
  Reference Value : 0.00000000
  Number of bits for packing : 0
Unrecognized grid: 203
This grid is not currently supported.
Write your own program to put the data to the intermediate format

And ungrib cannot run successfully. Has anyone encountered this problem, and do you know how to solve it?
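[Editor's note: when ungrib rejects a file like this, it can help to confirm the raw GRIB bytes by hand. The sketch below fabricates a synthetic 8-byte GRIB1 Section 0 (the "GRIB" magic, a 3-byte message length, then the edition octet) so it can run anywhere; on a real NARR file you would point the same dd/od commands at the downloaded data. The file name sample.grb is made up. The "Data Representation type: 203" in the dump above appears to be the NARR native staggered (Arakawa E) grid, which is why ungrib gives up.]

```shell
# Synthetic GRIB1 Section 0: "GRIB" magic, 3-byte total length, edition octet = 1.
# (Stand-in data only; substitute the real NARR GRIB file for sample.grb.)
printf 'GRIB\000\001\000\001' > sample.grb

dd if=sample.grb bs=1 count=4 2>/dev/null; echo   # magic bytes -> GRIB
od -An -tu1 -j7 -N1 sample.grb | tr -d ' '        # edition octet -> 1
```

If the magic or edition is not what ungrib expects, the download (or the byte order of the transfer) is suspect before any Vtable question arises.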
Thanks in advance Peng Liu From thomas.schwitalla at uni-hohenheim.de Thu Feb 3 00:03:15 2011 From: thomas.schwitalla at uni-hohenheim.de (Thomas Schwitalla) Date: Thu, 03 Feb 2011 08:03:15 +0100 Subject: [Wrf-users] Wrf-users Digest, Vol 78, Issue 1 In-Reply-To: References: Message-ID: <4D4A5333.2050204@uni-hohenheim.de> Dave, this is what I use successfully:

FC       = gfortran
SFC      = gfortran
FFLAGS   = -ffree-form -O -fno-second-underscore
F77FLAGS = -ffixed-form -O -fno-second-underscore
FNGFLAGS = $(FFLAGS)
LDFLAGS  =
CC       = gcc
SCC      = gcc
CFLAGS   =
CPP      = /lib/cpp -C -P -traditional
CPPFLAGS = -D_UNDERSCORE -DBYTESWAP -DLINUX -DIO_NETCDF -DBIT32

Thomas On 02.02.2011 20:00, wrf-users-request at ucar.edu wrote: > [snip: digest copy of "Compile WPS with gfortran and gcc" (Huber, David), quoted in full earlier in the thread] > > End of Wrf-users Digest, Vol 78, Issue 1 > **************************************** From Don.Morton at alaska.edu Thu Feb 3 11:33:56 2011 From: Don.Morton at alaska.edu (Don Morton) Date: Thu, 3 Feb 2011 09:33:56 -0900 Subject: [Wrf-users] ungrib fixed data in NARR In-Reply-To: <401451887.609307.1296679261902.JavaMail.root@mail5.gatech.edu> References: <401451887.609307.1296679261902.JavaMail.root@mail5.gatech.edu> Message-ID: On Wed, Feb 2, 2011 at 11:41 AM, Liu, Peng wrote: > Dear WRF users, > I encounter a problem when trying to ungrib the fixed data of NARR. > I downloaded the 32km_native.EGDFIX.fixed data and ran link_grib.csh > and then ran ungrib.exe (just follow the advice in Vtable.NARR of WPS). 
> Howdy, I tend to use the 32km_output.AWIP32.fixed. I have some crude notes available in Appendix A of http://weather.arsc.edu/Training/WRFTutorial-June2009/SampleCaseStudies_June2009.pdf They're written for local use, but maybe they'll give you some ideas. Cheers, Don Morton -- Voice: 907 450 8679 Arctic Region Supercomputing Center http://weather.arsc.edu/ http://www.arsc.edu/~morton/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110203/0b756cb9/attachment.html From c.chemel at herts.ac.uk Thu Feb 3 15:14:38 2011 From: c.chemel at herts.ac.uk (Chemel Charles) Date: Thu, 3 Feb 2011 22:14:38 +0000 Subject: [Wrf-users] Research Fellows in Air Pollution - Health and Air Quality - Climate Interactions In-Reply-To: <18EF08266D889C41A14D1099C7102CE2B6D488D40F@UH-MAILSTOR.herts.ac.uk> References: <18EF08266D889C41A14D1099C7102CE2B6D488D40F@UH-MAILSTOR.herts.ac.uk> Message-ID: Research Fellow in Air Pollution and Health Impacts University of Hertfordshire - Centre for Atmospheric and Instrumentation Research (CAIR) Research Opportunities For A Self-Motivated And Dynamic Scientist Starting salary up to: £31,671 (initially available for 3 years) You will be a high-achieving individual with the drive and ambition to undertake research into topics related to air pollution and health impacts, such as measurement and characterisation of particulate matter, source apportionment, the health impact of aerosols, or multiscale integrated modelling of air quality. As part of your duties you will assist in managing projects and explore further development of CAIR research programmes. You must have excellent communication skills in English and some experience of project management involving multiple interdisciplinary partners, preferably as part of the European Framework Programme. 
Closing Date: Friday 11 February 2011 Quote Reference: 001882AC You must have a good first degree in a relevant subject and a PhD in atmospheric sciences or related fields. Please contact Professor Ranjeet S Sokhi for informal discussions Tel: +44 (0) 1707 284520 Email: r.s.sokhi at herts.ac.uk. Information on CAIR can be found at the following links: http://www.transphorm.eu and http://strc.herts.ac.uk/cair. FOR FIXED TERM CONTRACTS "Under current UKBA regulations, the University is unlikely to be able to get a work permit in respect of this post. We can therefore only accept applications from people who will have the right to work in the UK for the total duration of the contract." The University offers a range of benefits including a final salary pension scheme, professional development, family-friendly policies, child care vouchers, waiving of course fees for the children of staff at UH, discounted memberships at the Hertfordshire Sports Village and generous annual leave. Apply online at http://www.herts.ac.uk/jobs or request an application pack from Human Resources on 01707 284802 (24hr voicemail), quoting the appropriate reference number. ------------------------------------------------- Research Fellow in Modelling Air Quality and Climate Interactions on Regional Scales University of Hertfordshire - Centre for Atmospheric and Instrumentation Research (CAIR) Research Opportunities For A Self-Motivated And Dynamic Scientist Starting salary up to: £26,523 (initially available for 2 years) You will be a high-achieving research scientist who has the drive and ambition to work on frontier science problems in the field of air quality and climate interactions. As part of international project teams, you will develop methodologies for understanding the feedback interactions between air quality and climate processes. You will have experience of using and developing high-resolution, regional- and global-scale air pollution and climate models in combination with measurements. You must have a good first degree in a relevant subject and a PhD in atmospheric sciences or related fields. Please contact Professor Ranjeet S Sokhi for informal discussions Tel: +44 (0) 1707 284520 Email: r.s.sokhi at herts.ac.uk. Information on CAIR can be found at the following link: http://strc.herts.ac.uk/cair. Closing Date: 11 February 2011 Quote Reference: 001282 FOR FIXED TERM CONTRACTS "Under current UKBA regulations, the University is unlikely to be able to get a work permit in respect of this post. We can therefore only accept applications from people who will have the right to work in the UK for the total duration of the contract." The University offers a range of benefits including a final salary pension scheme, professional development, family-friendly policies, child care vouchers, waiving of course fees for the children of staff at UH, discounted memberships at the Hertfordshire Sports Village and generous annual leave. Apply online at http://www.herts.ac.uk/jobs or request an application pack from Human Resources on 01707 284802 (24hr voicemail), quoting the appropriate reference number. From dbh409 at ku.edu Thu Feb 3 15:01:50 2011 From: dbh409 at ku.edu (Huber, David) Date: Thu, 3 Feb 2011 22:01:50 +0000 Subject: [Wrf-users] Plot a line on a map with WRF data Message-ID: Howdy all, I've created a cross-section plot and I would like to plot a line on a map with land cover showing the location of the cross-section. 
Following is the snippet I've written for that purpose:

in_C = addfile("/opt2/newC/wrfout_d01_2001-07-01_01.nc","r")
file_pre_C = "/opt2/newC/wrfout_d01_2001-"
file_pre_A = "/opt2/newA/wrfout_d01_2001-"
t_max = int2flt(n-m)
z_max = dimsizes(in_C->QVAPOR(0,:,0,0))
zw2 = z_max+1
x1 = 100
x2 = 130
xu1 = x1
xu2 = x2+1
y1 = 80
y2 = 125
yv1 = y1
yv2 = y2+1
y_max = y2-y1+1
x_max = x2-x1+1
xu_max = x_max+1
yv_max = y_max+1
lc = in_C->LU_INDEX(0,:,:)
res = True
res@cnFillOn = True
res@cnLevelSelectionMode = "ManualLevels"
res@cnLevelSpacingF = 1
res@cnFillMode = "RasterFill"
wks3 = gsn_open_wks("eps", "slice_line")
gsn_define_colormap(wks3, "wind_17lev")
mpres = True
mpres@mpUSStateLineThicknessF = 2
mpres@mpUSStateLineColor = "black"
mpres@mpGeophysicalLineColor = "black"
mpres@mpGeophysicalLineThicknessF = 2
mpres@mpNationalLineThicknessF = 2
mpres@mpNationalLineColor = "black"
plot_bg = wrf_contour(in_C,wks3,lc,res)
pres = True
pres@gsLineColor = "black"
pres@gsLineThicknessF = 1.5
plot_line = gsn_add_polyline(wks3,plot_bg,(/x1,y1/),(/x2,y2/),pres)
map_line = wrf_map_overlays(in_C, wks3, (/plot_bg,plot_line/), True, mpres)
delete(lc)
delete(mpres)
delete(pres)

When I run this code, I get the following warnings and errors:

[dbh409@Starbuck ncl_scripts]$ ncl slice_winds.ncl
Copyright (C) 1995-2010 - All Rights Reserved
University Corporation for Atmospheric Research
NCAR Command Language Version 5.2.0
The use of this software is governed by a License Agreement.
See http://www.ncl.ucar.edu/ for more details.
warning:tiMainString isn't a resource in this object
warning:NhlGetValues:Error retrieving tiMainString
fatal:Execute: Error occurred at or near line 3426 in file /usr/local/lib/ncarg/nclscripts/wrf/WRFUserARW.ncl
fatal:Execute: Error occurred at or near line 52 in file slice_winds.ncl

Where line 52 corresponds to the line "map_line = wrf_map_overlays(in_C, wks3, (/plot_bg,plot_line/), True, mpres)".
It's fairly apparent that wrf_map_overlays doesn't like adding the gsn_add_polyline plot to the mix. I've gone looking for other built-in functions for this purpose, but have yet to find anything else that might work. Has anyone else found a suitable function? If not, is there a way to force gsn_add_polyline to play nice? Thanks, Dave From k_radhika at tropmet.res.in Sun Feb 6 23:44:56 2011 From: k_radhika at tropmet.res.in (Kanase Radhika D.) Date: Mon, 7 Feb 2011 12:14:56 +0530 (IST) Subject: [Wrf-users] BMJ Cumulus rain In-Reply-To: <313405511.145597.1297060958065.JavaMail.root@mail1.tropmet.res.in> Message-ID: <510971665.145614.1297061096722.JavaMail.root@mail1.tropmet.res.in> Hi all, I ran the WRF (3.1.1) model for a tropical cyclone case with cumulus, microphysics and PBL sensitivity. But when I checked the cumulus rain for the BMJ scheme for all combinations of PBL and microphysics in GrADS, it shows a constant field value of 0 (even though the cumulus convection is kept on). But for the KF and GD schemes this parameter is displayed well. I am not able to resolve the problem. The namelist file is attached for reference. Suggestions are welcome. Smt. Radhika D. Kanase IITM Research Fellow T.S. Division IITM, Pune-08 -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input.BMJ_YSU_W Type: application/octet-stream Size: 4461 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110207/fa53dfa3/attachment.obj From shachen at ucdavis.edu Sun Feb 6 23:54:53 2011 From: shachen at ucdavis.edu (Shu-Hua Chen) Date: Sun, 06 Feb 2011 22:54:53 -0800 Subject: [Wrf-users] Upcoming deadline for the 14th Mesoscale conference Message-ID: <4D4F973D.8090202@ucdavis.edu> Dear WRF users, I would like to bring to your attention the upcoming 14th conference on mesoscale processes, which will be held 1-4 August 2011 in Los Angeles, collocated with ARAM. The abstract deadline is 1 April. 
Below is the link to the call for papers! http://www.ametsoc.org/MEET/ann/callforpapers.html#14meso Shuhua -- +--+--+--+--+--+--+--+--+--+--+--+--+--+--+--+--+--+--+ | Shu-Hua Chen | + Dept. of Land, Air & Water Resources + | University of California | + Tel: 530-752-1822, + | Fax: 530-752-1793 | + email: shachen at ucdavis.edu + | http://atm.ucdavis.edu/mmg | +--+--+--+--+--+--+--+--+--+--+--+--+--+--+--+--+--+--+ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110206/681c7bc7/attachment-0001.html From izzaroyan at yahoo.com Tue Feb 8 01:16:17 2011 From: izzaroyan at yahoo.com (fatkhuroyan -) Date: Tue, 8 Feb 2011 00:16:17 -0800 (PST) Subject: [Wrf-users] wrf-var3.2.1 gen_be error Message-ID: <843374.40900.qm@web46109.mail.sp1.yahoo.com> ----- Forwarded Message ---- From: fatkhuroyan - To: wrfhelp at ucar.edu Sent: Mon, February 7, 2011 4:01:52 PM Subject: wrf-var3.2.1 gen_be error Dear all, I am trying to create my own be.dat, but there were error messages when I ran gen_be_wrapper.ksh using my wrfout:

"Run Stage 3: Read 3D control variable fields, and calculate vertical covariances.
---------------------------------------------------------------
Beginning CPU time: Mon Feb 7 10:40:09 WIT 2011
Ending CPU time: Mon Feb 7 10:40:33 WIT 2011
Beginning CPU time: Mon Feb 7 10:40:33 WIT 2011
---------------------------------------------------------------
Run Stage 4: Calculate horizontal covariances (regional lengthscales).
---------------------------------------------------------------
Ending CPU time: Mon Feb 7 11:14:12 WIT 2011
Stage gen_be_diags failed with error 2 "

But when I use the wrfout test data and then run gen_be_wrapper, it is a SUCCESS. I don't know; there must be something wrong with my wrfout. Could someone help me, please! Here I attach my namelist.wps, namelist.input, and two files from the gen_be result. Thanks -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110208/c173d856/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/jpeg Size: 9858 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110208/c173d856/attachment-0001.jpe -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input Type: application/octet-stream Size: 4594 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110208/c173d856/attachment-0003.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.wps Type: application/vnd.ms-works Size: 820 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110208/c173d856/attachment-0001.bin -------------- next part -------------- A non-text attachment was scrubbed... Name: gen_be_stage3.t_u.log Type: application/octet-stream Size: 890 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110208/c173d856/attachment-0004.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: gen_be_stage4_regional.log Type: application/octet-stream Size: 13378 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110208/c173d856/attachment-0005.obj From soleiman at rambler.ru Tue Feb 8 10:23:50 2011 From: soleiman at rambler.ru (Mostamandy Soleiman) Date: Tue, 8 Feb 2011 20:23:50 +0300 Subject: [Wrf-users] WRFV3.2 compilation Error Message-ID: <1162519986.20110208202350@rambler.ru> Hello, Wrf-users. I tried to compile WRF V3.2.1 with gfortran + gcc, but unfortunately I haven't had success. I attached the compile.log and configure.wps; could someone help me? What is the problem? Thanks in advance! --
?????????, Mostamandy mailto:soleiman at rambler.ru From eric.kemp at nasa.gov Tue Feb 8 12:10:42 2011 From: eric.kemp at nasa.gov (Kemp, Eric M. (GSFC-610.0)[NORTHROP GRUMMAN INFORMATION TECH]) Date: Tue, 8 Feb 2011 13:10:42 -0600 Subject: [Wrf-users] Problems running WRF-Var 3.2.1 with PREPBUFR Message-ID: Dear wrfhelp/wrf-users: I am learning to use the WRF-Var 3.2.1 program and following the example in the on-line User?s Guide. I can successfully run the program using the conventional observations in ASCII format. However, when I try reading the PREPBUFR file instead of the ASCII version I encounter run-time errors and fail to execute an analysis. I am running on an Intel Xeon platform with the ifort/icc 11.1.038 compilers and Intel MPI 3.2.011 (I?m not using the shared memory build option). I set the BUFR environment variable to 1 before running ?configure? and see ?-DBUFR? in the resulting ?configure.wrf? file. I also confirm that BUFRLIB in var/external/bufr is built ? I modified preproc.sh to echo all commands as they are executed before I compiled WRFDA. So far I?ve attempted six different tests (BUFRLIB compiled with ?DLITTLE_ENDIAN unless otherwise noted): Test 1. Compile w/ -convert big_endian in configure.wrf. Use linux PREPBUFR. BUFRLIB aborts, cannot file "BUFR" in file. Test 2. Compile w/ -convert big_endian in configure.wrf. Use regular PREPBUFR. BUFRLIB doesn't return any obs, program completes w/o data assimilation. Test 3. Compile w/o -convert big_endian in configure.wrf. Use linux PREPBUFR. Reads obs. Cannot read be file. Test 4. Compile w/o -convert big_endian in configure.wrf. Use regular PREPBUFR. BUFRLIB aborts; cannot find "BUFR" in file. Test 5. Compile w/ -convert big_endian, force BUFRLIB compile with -DBIG_ENDIAN. Use linux PREPBUFR. BUFRLIB aborts, can't determine machine native language. Test 6. Compile w/ -convert big_endian, force BUFRLIB compile with -DBIG_ENDIAN. Use regular PREPBUFR. 
BUFRLIB aborts, can't determine machine native language. Note that I'm only trying to assimilate conventional obs at this point (satellite radiances will come later). Also, I have a separate utility that can successfully read the linux PREPBUFR file (it can't read the regular PREPBUFR). Can anyone help? Thanks, -Eric -------------------------------------------------------------------- Eric M. Kemp Northrop Grumman Corporation Meteorologist Information Systems Civil Enterprise Solutions Civil Systems Division Goddard Space Flight Center Mailstop 610.3 Greenbelt, MD 20771 Telephone 301-286-9768 Fax 301-286-1775 E-mail: eric.kemp at nasa.gov E-mail: eric.kemp at ngc.com -------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110208/0d10b8a3/attachment.html From soleiman at rambler.ru Tue Feb 8 10:47:59 2011 From: soleiman at rambler.ru (Mostamandy Soleiman) Date: Tue, 8 Feb 2011 20:47:59 +0300 Subject: [Wrf-users] WRFV3.2 compilation Error Message-ID: <69822787.20110208204759@rambler.ru> Hello, Wrf-users. I tried to compile WRF V3.2.1 with gfortran + gcc, but unfortunately I haven't had success. I attached the compile.log and configure.wps. Could someone help me? What is the problem? Thanks in advance! -- Best regards, Mostamandy mailto:soleiman at rambler.ru -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.wrf Type: application/octet-stream Size: 19430 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110208/62f098cd/attachment-0002.obj -------------- next part -------------- A non-text attachment was scrubbed... 
Name: compile.log Type: application/octet-stream Size: 566231 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110208/62f098cd/attachment-0003.obj From albi.jk at gmail.com Wed Feb 9 02:15:09 2011 From: albi.jk at gmail.com (alberto giacomini) Date: Wed, 9 Feb 2011 10:15:09 +0100 Subject: [Wrf-users] WRFV3.2 compilation Error In-Reply-To: <1162519986.20110208202350@rambler.ru> References: <1162519986.20110208202350@rambler.ru> Message-ID: hi, I gave a suggestion some time ago; the problem usually arose with old versions of gcc/gfortran. Update the compiler suite, or apply what I suggested at that time: substitute the Times definition (in external/io_grib2/io_grib2.F): ! character (DateStrLen), dimension(:),allocatable :: Times(:) character (DateStrLen), pointer :: Times(:) and use associated() instead of allocated: ! if (allocated(fileinfo(DataHandle)%Times)) then ! deallocate(fileinfo(DataHandle)%Times) if (associated(fileinfo(DataHandle)%Times)) then deallocate(fileinfo(DataHandle)%Times) It compiles with gfortran 4.1.2, but I don't know if it really works. Let me know the performance if you compare with commercial compilers. alberto 2011/2/8 Mostamandy Soleiman : > Hello, Wrf-users. > > I tried to compile WRF V3.2.1 with gfortran + gcc, but > unfortunately I haven't had success. I attached the compile.log and > configure.wps > Could someone help me? What is the problem? > > Thanks in advance! > > > > > -- > Best regards, > Mostamandy 
> mailto:soleiman at rambler.ru > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -- Web site: http://www.webalice.it/albi.jk/ From hamed319 at yahoo.com Wed Feb 9 10:25:43 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Wed, 9 Feb 2011 09:25:43 -0800 (PST) Subject: [Wrf-users] Fatal - Online Tutorial Message-ID: <46587.73479.qm@web161210.mail.bf1.yahoo.com> Dear all, I set everything just as in the online tutorial, but when I run ./wrf.exe for domain 2 I get this error: any suggestion? Timing for Writing wrfout_d01_2007-08-16_00:00:00 for domain 1: 0.15610 elapsed seconds. 2 input_wrf: wrf_get_next_time current_date: 2007-08-16_00:00:00 Status = -4 -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 705 ... May have run out of valid boundary conditions in file wrfbdy_d01 Thanks in advance, Hamed ____________________________________________________________________________________ Finding fabulous fares is fun. Let Yahoo! FareChase search your favorite travel sites to find flight and hotel bargains. http://farechase.yahoo.com/promo-generic-14795097 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110209/987c1303/attachment.html From hedde.cea at gmail.com Wed Feb 9 05:37:40 2011 From: hedde.cea at gmail.com (Thierry HEDDE) Date: Wed, 9 Feb 2011 13:37:40 +0100 Subject: [Wrf-users] WRFV3.2 compilation Error In-Reply-To: <1162519986.20110208202350@rambler.ru> References: <1162519986.20110208202350@rambler.ru> Message-ID: Hi, I had the problem with WRF 3.1 and I see it is not corrected yet. Question for the WRF team: how should I go about contributing improvements to WRF? 
Here is the reference to my question : http://mailman.ucar.edu/pipermail/wrf-users/2009/001410.html The Fortran 90 coding used in the routine io_grib2.F is not standard, so I had to modify this routine a bit. I attach the modified routine for WRF 3.1 (be careful when merging!). Here are the lines I had to change around the variable "Times" : allocatable becomes pointer allocated becomes associated character (DateStrLen), dimension(:), pointer :: Times(:) ... if (associated(fileinfo(DataHandle)%Times)) then ... if (associated(fileinfo(DataHandle)%Times)) & Good luck! -- Thierry HEDDE Laboratoire de Modélisation des Transferts dans l'Environnement CEA/CADARACHE DEN/DTN/SMTM/LMTE Bât. 307 Pièce 9 13108 ST PAUL LEZ DURANCE CEDEX FRANCE 2011/2/8 Mostamandy Soleiman > Hello, Wrf-users. > > I tried to compile WRF V3.2.1 with gfortran + gcc, but > unfortunately I haven't had success. I attached the compile.log and > configure.wps > Could someone help me? What is the problem? > > Thanks in advance! > > > > > -- > Best regards, > Mostamandy mailto:soleiman at rambler.ru > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110209/c739bb9f/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: io_grib2.F Type: application/octet-stream Size: 138740 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110209/c739bb9f/attachment-0001.obj From maemarcus at gmail.com Tue Feb 8 15:20:10 2011 From: maemarcus at gmail.com (Dmitry N. 
Mikushin) Date: Wed, 9 Feb 2011 01:20:10 +0300 Subject: [Wrf-users] WRFV3.2 compilation Error In-Reply-To: <69822787.20110208204759@rambler.ru> References: <69822787.20110208204759@rambler.ru> Message-ID: Mostamandy, At least one of your errors character (DateStrLen), dimension(:),allocatable :: Times(:) 1 Error: Attribute at (1) is not allowed in a TYPE definition can't be reproduced with gfortran 4.4.3. What version of gfortran are you using? On 8 February 2011 20:47, Mostamandy Soleiman wrote: > Hello, Wrf-users. > > I tried to compile WRF V3.2.1 with gfortran + gcc, but > unfortunately I haven't had success. I attached the compile.log and > configure.wps > Could someone help me? What is the problem? > > Thanks in advance! > > > > > -- > Best regards, > Mostamandy mailto:soleiman at rambler.ru > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > From ZWu at trcsolutions.com Wed Feb 9 12:01:17 2011 From: ZWu at trcsolutions.com (Wu, Zhong-Xiang (Lowell,MA-US)) Date: Wed, 9 Feb 2011 14:01:17 -0500 Subject: [Wrf-users] NAM 12-km Tiled data for WRF Initialization Message-ID: <10BD6AED41582E47946AABB34C58D1F803112C2778@EXVMBX2.EMPLOYEES.ROOT.local> Hi, I am trying to use NAM 12-km tiled forecasts from NCEP in the WPS processing to create met initial files for WRF modeling. My WRF domain covers more than one NAM 12-km tile, so multiple NAM files are needed at one hour. Is there anyone who has used the NAM 12-km tiled forecasts to initialize WRF in a similar way? And how should I use the data if WPS accepts multiple tiled files? 
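[For reference, WPS can already take several GRIB files for one valid time: link_grib.csh links every input file as GRIBFILE.AAA, GRIBFILE.AAB, and so on, and ungrib.exe reads all of those links together. A minimal shell sketch of that linking step; the tile file names are hypothetical stand-ins, not actual NCEP names:

```shell
# Mimic what WPS's link_grib.csh does: link each input GRIB file as
# GRIBFILE.AAA, GRIBFILE.AAB, ... so ungrib.exe picks them all up.
# The NAM tile file names below are made-up examples.
workdir=$(mktemp -d) && cd "$workdir"
touch nam.t00z.awip12.tile1.grb nam.t00z.awip12.tile2.grb

# Assign the three-letter suffixes in glob (sorted) order, exactly
# the naming scheme ungrib.exe expects.
set -- AAA AAB AAC AAD
for f in nam.*.grb; do
    ln -sf "$f" "GRIBFILE.$1"
    shift
done
```

If the tiles need different treatment (different Vtables, say), the "Using Multiple Meteorological Data Sources" section of the Users' Guide covers running ungrib separately per source with distinct prefixes.]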
Thanks, Zhong ********************************************************* Zhong-Xiang Wu, Ph.D TRC 650 Suffolk Street Lowell, Massachusetts, 01854 Email: zwu at trcsolutions.com zhong at alum.mit.edu Tel: 978-656-3667 Fax: 978-453-1995 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110209/ed21843d/attachment.html From FLiu at azmag.gov Thu Feb 10 11:44:05 2011 From: FLiu at azmag.gov (Feng Liu) Date: Thu, 10 Feb 2011 18:44:05 +0000 Subject: [Wrf-users] NAM 12-km Tiled data for WRF Initialization In-Reply-To: <10BD6AED41582E47946AABB34C58D1F803112C2778@EXVMBX2.EMPLOYEES.ROOT.local> References: <10BD6AED41582E47946AABB34C58D1F803112C2778@EXVMBX2.EMPLOYEES.ROOT.local> Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C226414F9@mag9006> Zhong, There are two resources from which you can download the 12-km NAM model datasets: (1) http://nomads.ncdc.noaa.gov/data.php#hires_weather_datasets, you may have direct access to this site. (2) http://dss.ucar.edu/datasets/ds335.0/matrix.html you may need to register before you get access. That is free. The grid value is 218 (B). You should build your WPS with the GRIB2 option because the 12-km NAM files are GRIB2 data. You may check out the basic grid information about the datasets below before you download them. Regarding the application of WPS with multiple tiles of data, please refer to pages 3-34 to 3-37 in the latest version of the User's Guide issued in July 2010. I hope it is helpful. VALUE GRID DESCRIPTIONS about 12-km NAM Data 218 (B)[B] Grid over the Contiguous United States (used by the 12-km NAM Model) (Lambert Conformal) Nx 614 Ny 428 La1 12.190N Lo1 226.514E = 133.459W Res. & Comp. 
Flag 0 0 0 0 1 0 0 0 Lov 265.000E = 95.000W Dx 12.19058 km Dy 12.19058 km Projection Flag (bit 1) 0 (not bipolar) Scanning Mode (bits 1 2 3) 0 1 0 Lat/Lon values of the corners of the grid (1,1) 12.190N, 133.459W (1,428) 54.564N, 152.878W (614,428) 57.328N, 49.420W (614,1) 14.342N, 65.127W Pole point (I,J) (347.668, 1190.097) Good luck. Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Wu, Zhong-Xiang (Lowell,MA-US) Sent: Wednesday, February 09, 2011 12:01 PM To: wrf-users at ucar.edu Subject: [Wrf-users] NAM 12-km Tiled data for WRF Initialization Hi, I am trying to use NAM 12-km tiled forecasts from NCEP in the WPS processing to create met initial files for WRF modeling. My WRF domain covers more than one NAM 12-km tile, so multiple NAM files are needed at one hour. Is there anyone who has used the NAM 12-km tiled forecasts to initialize WRF in a similar way? And how should I use the data if WPS accepts multiple tiled files? Thanks, Zhong ********************************************************* Zhong-Xiang Wu, Ph.D TRC 650 Suffolk Street Lowell, Massachusetts, 01854 Email: zwu at trcsolutions.com zhong at alum.mit.edu Tel: 978-656-3667 Fax: 978-453-1995 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110210/44a04951/attachment.html From hamed319 at yahoo.com Thu Feb 10 09:03:05 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Thu, 10 Feb 2011 08:03:05 -0800 (PST) Subject: [Wrf-users] ./real.exe - FATAL Message-ID: <124414.2730.qm@web161207.mail.bf1.yahoo.com> Dear All, When I run ./real.exe I just get this fatal error: Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables 
in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 2, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 4 REAL_EM V3.2.1 PREPROCESSOR ************************************* Parent domain ids,ide,jds,jde 1 74 1 61 ims,ime,jms,jme -4 79 -4 66 ips,ipe,jps,jpe 1 74 1 61 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 104874600 bytes allocated Time period # 1 to process = 2007-07-23_00:00:00. Time period # 2 to process = 2007-07-23_01:00:00. Time period # 3 to process = 2007-07-23_02:00:00. Time period # 4 to process = 2007-07-23_03:00:00. Time period # 5 to process = 2007-07-23_04:00:00. Time period # 6 to process = 2007-07-23_05:00:00. Time period # 7 to process = 2007-07-23_06:00:00. Total analysis times to input = 7. 
----------------------------------------------------------------------------- Domain 1: Current date being processed: 2007-07-23_00:00:00.0000, which is loop # 1 out of 7 configflags%julyr, %julday, %gmt: 2007 204 0.0000000E+00 metgrid input_wrf.F first_date_input = 2007-07-23_00:00:00 metgrid input_wrf.F first_date_nml = 2007-07-23_00:00:00 -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 678 input_wrf.F: SIZE MISMATCH: namelist ide,jde,num_metgrid_levels= 74 61 27; input data ide,jde,num_metgrid_levels= 100 111 27 ------------------------------------------- Any suggestion would be appreciated. Hamed -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110210/08ef2be8/attachment-0001.html From mmkamal at sciborg.uwaterloo.ca Wed Feb 9 21:35:14 2011 From: mmkamal at sciborg.uwaterloo.ca (mmkamal at sciborg.uwaterloo.ca) Date: Wed, 09 Feb 2011 23:35:14 -0500 Subject: [Wrf-users] NetCDF error with NARR data Message-ID: <20110209233514.16204717mkq1fjsw@www.nexusmail.uwaterloo.ca> Hi All, I have been trying to run WRF 3.2.1 in Linux 2.6.18-194.32.1.el5 x86_64 GNU/Linux with the following modules: intel/intel-v11.1.072 openmpi/1.4.1-intel-v11.0-ofed fftw/2.1.5-intel-openmpi netcdf/4.0.1_nc3_intel I have compiled WPS with the serial option and WRFV3 with the OpenMP option, and successfully finished running the winter storm 2000 test case; the output is okay. But when I try to run a simulation using NARR (North American Regional Reanalysis) data, I find the following messages during real.exe, although the files wrfbdy_d01 and wrfinput_d01 are generated. 
In the log file I get the following message: Domain 1: Current date being processed: 2005-08-28_00:00:00.0000, which is loop # 1 out of 9 configflags%julyr, %julday, %gmt: 2005 240 0.0000000E+00 med_sidata_input: calling open_r_dataset for met_em.d. med_sidata_input: calling input_auxinput1 metgrid input_wrf.F first_date_input = 2005-08-28_00:00:00 metgrid input_wrf.F first_date_nml = 2005-08-28_00:00:00 NetCDF error: NetCDF: Attribute not found NetCDF error in ext_ncd_get_dom_ti.code REAL, line 83 Element P_TOP NetCDF error: NetCDF: Attribute not found NetCDF error in ext_ncd_get_dom_ti.code REAL, line 83 Element GMT NetCDF error: NetCDF: Attribute not found .............. ............... ............... ............... med_sidata_input: back from init_domain LBC valid between these times 2005-08-28_21:00:00.0000 2005-08-29_00:00:00 Timing for output 0 s. Timing for loop # 9 = 0 s. backfrom med_sidata_input real_em: SUCCESS COMPLETE REAL_EM INIT ---------------------------------------- Begin PBS Epilogue Wed Feb 9 22:58:32 EST 2011 1297310312 Job ID: 4353809.gpc-sched Username: mkamal Group: jcl Job Name: T74-04-1cpu-8d Session: 30241 Limits: neednodes=1:ppn=8,nodes=1:ppn=8,walltime=02:00:00 Resources: cput=00:00:01,mem=112668kb,vmem=206264kb,walltime=00:00:08 Queue: batch_eth Account: Nodes: gpc-f104n027 Killing leftovers... End PBS Epilogue Wed Feb 9 22:58:32 EST 2011 1297310312 I could not succeed running WRF using NARR data. Could any one please help me to find out the problem. 
Thanks in advance Kamal From wrf at nusculus.com Wed Feb 9 18:33:10 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Wed, 9 Feb 2011 18:33:10 -0700 Subject: [Wrf-users] NAM 12-km Tiled data for WRF Initialization In-Reply-To: <10BD6AED41582E47946AABB34C58D1F803112C2778@EXVMBX2.EMPLOYEES.ROOT.local> References: <10BD6AED41582E47946AABB34C58D1F803112C2778@EXVMBX2.EMPLOYEES.ROOT.local> Message-ID: I have not personally tried using tiled data, but you could probably handle the different tiles as different sources of data. There is a section in the WRF users guide that describes how to do that. It is titled: Using Multiple Meteorological Data Sources Or perhaps someone else has the experience to give specific directions. Kevin On Wed, Feb 9, 2011 at 12:01 PM, Wu, Zhong-Xiang (Lowell,MA-US) < ZWu at trcsolutions.com> wrote: > Hi, > > > > I am trying to use NAM 12-km tiled forecasts from NCEP in the WPS > processing to create met initial files for WRF modeling. My WRF domain > covers an area more than one NAM 12-km tile, so multiple NAM files are > needed at one hour. > > > > Is there anyone who used the NAM 12-km tiled forecasts to initialize WRF in > the similar way? And how should I use the data if WPS accepts multiple tiled > files? > > > > Thanks, > > > > Zhong > > > > > > > > ********************************************************* > > *Zhong-Xiang Wu, Ph.D* > > TRC > > 650 Suffolk Street > > Lowell, Massachusetts, 01854 > > > > Email: zwu at trcsolutions.com > > zhong at alum.mit.edu > > Tel: 978-656-3667 > > Fax: 978-453-1995 > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110209/40d86192/attachment.html From wrf at nusculus.com Wed Feb 9 18:45:13 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Wed, 9 Feb 2011 18:45:13 -0700 Subject: [Wrf-users] Fatal -Onlone Tutorial In-Reply-To: <46587.73479.qm@web161210.mail.bf1.yahoo.com> References: <46587.73479.qm@web161210.mail.bf1.yahoo.com> Message-ID: Greetings, When I get that message it usually means that real.exe, which creates the wrfbdy_d01 file, had different run times than wrf.exe, which uses the wrfbdy_d01 file. That can happen when I have run_days, run_hours, etc that do not match the start_year, start_month, etc and the end_year, end_month, etc. That is because wrf.exe uses the end_year, end_month, etc but real.exe uses the run_days, run_hours, etc. However, in newer versions of WRF, if you completely leave the run_days, run_hours, etc out of the namelist.input file, that forces real.exe to use the end_year, end_month, etc fields. That is a bit confusing, but try looking into that part of the namelist.input file. Kevin On Wed, Feb 9, 2011 at 10:25 AM, Hamed Sharifi wrote: > Dear all, > I set every thing just as online tutorial. but when I run the ./wrf.exe for > domain 2 I got this error: > any suggestion? > > Timing for Writing wrfout_d01_2007-08-16_00:00:00 for domain 1: > 0.15610 elapsed seconds. > 2 input_wrf: wrf_get_next_time current_date: > 2007-08-16_00:00:00 Status = -4 > -------------- FATAL CALLED --------------- > FATAL CALLED FROM FILE: LINE: 705 > ... May have run out of valid boundary conditions in file wrfbdy_d01 > > Thanks in advance, > Hamed > > > ------------------------------ > We won't tell. Get more on shows you hate to love > (and love to hate): Yahoo! TV's Guilty Pleasures list. 
> _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110209/315baded/attachment.html From FLiu at azmag.gov Fri Feb 11 11:43:53 2011 From: FLiu at azmag.gov (Feng Liu) Date: Fri, 11 Feb 2011 18:43:53 +0000 Subject: [Wrf-users] ./real.exe - FATAL In-Reply-To: <124414.2730.qm@web161207.mail.bf1.yahoo.com> References: <124414.2730.qm@web161207.mail.bf1.yahoo.com> Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C22641794@mag9006> Hamed, Try to check the num_metgrid_levels in the met_em.d0* files with `ncdump -h met_em.d01.<date>.nc | grep num_metgrid_levels`, then set a consistent number for num_metgrid_levels in your namelist.input. Thanks. Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Hamed Sharifi Sent: Thursday, February 10, 2011 9:03 AM To: wrf-users at ucar.edu Subject: [Wrf-users] ./real.exe - FATAL Dear All, When I run the ./real.exe I just get this Fatal: Namelist dfi_control not found in namelist.input. Using registry defaults for v ariables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variable s in fire --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending t ime to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting 
--- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval an d ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval and ending t ime to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 2, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging interval an d ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 4 REAL_EM V3.2.1 PREPROCESSOR ************************************* Parent domain ids,ide,jds,jde 1 74 1 61 ims,ime,jms,jme -4 79 -4 66 ips,ipe,jps,jpe 1 74 1 61 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 104874600 bytes allocated Time period # 1 to process = 2007-07-23_00:00:00. Time period # 2 to process = 2007-07-23_01:00:00. Time period # 3 to process = 2007-07-23_02:00:00. Time period # 4 to process = 2007-07-23_03:00:00. Time period # 5 to process = 2007-07-23_04:00:00. Time period # 6 to process = 2007-07-23_05:00:00. Time period # 7 to process = 2007-07-23_06:00:00. Total analysis times to input = 7. ----------------------------------------------------------------------------- Domain 1: Current date being processed: 2007-07-23_00:00:00.0000, which is loop # 1 out of 7 configflags%julyr, %julday, %gmt: 2007 204 0.0000000E+00 metgrid input_wrf.F first_date_input = 2007-07-23_00:00:00 metgrid input_wrf.F first_date_nml = 2007-07-23_00:00:00 -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 678 input_wrf.F: SIZE MISMATCH: namelist ide,jde,num_metgrid_levels= 74 61 27; input data ide,jde,num_metgrid_levels= 100 111 27 ------------------------------------------- Any suggestion would be appreciated. Hamed -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110211/27793ede/attachment.html From df6.626 at gmail.com Sat Feb 12 04:15:07 2011 From: df6.626 at gmail.com (Dmitry Vinokurov) Date: Sat, 12 Feb 2011 16:15:07 +0500 Subject: [Wrf-users] NetCDF paths Message-ID: Hi! I'm trying to compile WRF 3.2.1 on CentOS 5.5 (netcdf packages from rep) and get following error after ./configure: ---- grep: /usr/include/netcdf.inc: No such file or directory ---- NETCDF environment variable is set to "/usr", so as I understand WRF expect includes to be in "${NETCDF}/include" etc. Actually grep is right, there is no such file "/usr/include/netcdf.inc", NetCDF includes are located in "/usr/include/netcdf-3/" and libraries are in "/usr/lib64". The question is: is it possible to set separate paths for NetCDF includes and libraries? For example, Jasper uses $JASPERINC and $JASPERLIB variables and I think it'll be nice to work with NetCDF in the same way. Thanks. -- Best regards, Dmitry Vinokurov -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110212/d660f31a/attachment.html From Matthew.Foster at noaa.gov Mon Feb 14 09:28:10 2011 From: Matthew.Foster at noaa.gov (Matt Foster) Date: Mon, 14 Feb 2011 10:28:10 -0600 Subject: [Wrf-users] Using RUC input data with NOAH LSM Message-ID: <4D59581A.40900@noaa.gov> Is the use of RUC input data (for initial and/or boundary conditions) possible when using the NOAH LSM? Comments I found in module_initialize_real seem to imply that it is, however, when I try it real hangs. Matt -- Do not go where the path may lead; go instead where there is no path and leave a trail. -- Ralph Waldo Emerson -------------- next part -------------- A non-text attachment was scrubbed... 
Name: matthew_foster.vcf Type: text/x-vcard Size: 229 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110214/77f8543a/attachment.vcf From irene.gallai at arpa.fvg.it Tue Feb 15 00:51:59 2011 From: irene.gallai at arpa.fvg.it (Irene Gallai) Date: Tue, 15 Feb 2011 08:51:59 +0100 Subject: [Wrf-users] ./real.exe - FATAL In-Reply-To: <124414.2730.qm@web161207.mail.bf1.yahoo.com> References: <124414.2730.qm@web161207.mail.bf1.yahoo.com> Message-ID: <4D5A309F.7050401@arpa.fvg.it> Your domain size in namelist.input (the ide and jde entries) needs to be exactly the same as that used to run the preprocessing (the WPS namelist). In your case the input files you are using are defined over a grid of 100x111 points, whereas in your namelist you defined a domain of 74x61 points. The vertical levels are correct. On 02/10/2011 05:03 PM, Hamed Sharifi wrote: > Dear All, > > When I run the ./real.exe I just get this Fatal: > > Namelist dfi_control not found in namelist.input. Using registry > defaults for v > ariables in dfi_control > Namelist tc not found in namelist.input. Using registry defaults for > variables > in tc > Namelist scm not found in namelist.input. Using registry defaults for > variables > in scm > Namelist fire not found in namelist.input. Using registry defaults > for variable > s in fire > --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and > auxinput4_interval > = 0 for all domains > --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and > auxinput4_interval > = 0 for all domains > --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval > and ending t > ime to 0 for that domain. > --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain > 1, setting > sgfdda interval and ending time to 0 for that domain. > --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging > interval an > d ending time to 0 for that domain. 
> --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval > and ending t > ime to 0 for that domain. > --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain > 2, setting > sgfdda interval and ending time to 0 for that domain. > --- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging > interval an > d ending time to 0 for that domain. > --- NOTE: num_soil_layers has been set to 4 > REAL_EM V3.2.1 PREPROCESSOR > ************************************* > Parent domain > ids,ide,jds,jde 1 74 1 61 > ims,ime,jms,jme -4 79 -4 66 > ips,ipe,jps,jpe 1 74 1 61 > ************************************* > DYNAMICS OPTION: Eulerian Mass Coordinate > alloc_space_field: domain 1, 104874600 bytes allocated > Time period # 1 to process = 2007-07-23_00:00:00. > Time period # 2 to process = 2007-07-23_01:00:00. > Time period # 3 to process = 2007-07-23_02:00:00. > Time period # 4 to process = 2007-07-23_03:00:00. > Time period # 5 to process = 2007-07-23_04:00:00. > Time period # 6 to process = 2007-07-23_05:00:00. > Time period # 7 to process = 2007-07-23_06:00:00. > Total analysis times to input = 7. > > ----------------------------------------------------------------------------- > > Domain 1: Current date being processed: 2007-07-23_00:00:00.0000, > which is loop # 1 out of 7 > configflags%julyr, %julday, %gmt: 2007 204 0.0000000E+00 > metgrid input_wrf.F first_date_input = 2007-07-23_00:00:00 > metgrid input_wrf.F first_date_nml = 2007-07-23_00:00:00 > -------------- FATAL CALLED --------------- > FATAL CALLED FROM FILE: LINE: 678 > input_wrf.F: SIZE MISMATCH: namelist > ide,jde,num_metgrid_levels= 74 > 61 27; input data > ide,jde,num_metgrid_levels= 100 > 111 27 > > ------------------------------------------- > > Any suggestion would be appreciated. 
> Hamed
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users

--
+------------------------------------+
Irene Gallai
Agenzia Regionale per la Protezione dell'Ambiente (ARPA FVG)
Via Cairoli 14 I-33057 Palmanova (UD) ITALY
tel centr.: +39 0432 922611
e-mail : irene.gallai at arpa.fvg.it
+------------------------------------+
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110215/16c3df05/attachment.html

From hamed319 at yahoo.com Tue Feb 15 02:23:09 2011
From: hamed319 at yahoo.com (Hamed Sharifi)
Date: Tue, 15 Feb 2011 01:23:09 -0800 (PST)
Subject: [Wrf-users] wrfcamx Compiling
Message-ID: <950937.80965.qm@web161216.mail.bf1.yahoo.com>

Dear All,
I used the "make --check" command to compile the wrfcamx program, but I got this error:

ld: cannot find -lnetcdf

Any suggestion?
Thanks in advance,
Hamed
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110215/3093ee56/attachment.html

From objacoe at hotmail.com Tue Feb 15 07:37:26 2011
From: objacoe at hotmail.com (Obadias Cossa)
Date: Tue, 15 Feb 2011 16:37:26 +0200
Subject: [Wrf-users] ECHAM5 data to ingest in WRF
Message-ID:

Dear Users

I have to ingest into WRF data from a global model, say ECHAM5. The problem, however, is that WRF requests data in vertical pressure coordinates, but ECHAM5 gives it in sigma coordinates. Could someone advise on how to handle this problem? What would be the steps to follow in building Vtables and the other necessary inputs for WRF?
The choice of ECHAM5 was due to the possibility it has of reproducing past climates. In fact, I intend to simulate the climates of the past in my region (Africa) using WRF.

Thanks,

Obadias Cossa
Student at UCT Department of Oceanography
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110215/11c7c202/attachment.html

From maemarcus at gmail.com Mon Feb 14 09:51:29 2011
From: maemarcus at gmail.com (Dmitry N. Mikushin)
Date: Mon, 14 Feb 2011 19:51:29 +0300
Subject: [Wrf-users] NetCDF paths
In-Reply-To:
References:
Message-ID:

Hi,

Different distros often move around the stuff that goes in dev (devel) packages, e.g. headers or static libraries. There is less deviation with shared libraries, but sometimes they too are kept under subdirectories rather than in /usr/lib64; for example, mpich on Red Hat systems is in /usr/lib64/mpich2/lib/ and will also fail with simple -lmpich linking. So *manual* adjustment of paths can't be fully avoided; there are too many different situations. It would be better to have the configure script perform some basic search and prompt for input if one of multiple variants can be used. Also, ./configure can use pkg-config to locate dependencies, if the package provides .pc info.

- D.

2011/2/12 Dmitry Vinokurov :
> Hi!
>
> I'm trying to compile WRF 3.2.1 on CentOS 5.5 (netcdf packages from rep) and
> get the following error after ./configure:
> ----
> grep: /usr/include/netcdf.inc: No such file or directory
> ----
> NETCDF environment variable is set to "/usr", so as I understand WRF expects
> includes to be in "${NETCDF}/include" etc.
>
> Actually grep is right, there is no such file "/usr/include/netcdf.inc",
> NetCDF includes are located in "/usr/include/netcdf-3/" and libraries are in
> "/usr/lib64".
>
> The question is: is it possible to set separate paths for NetCDF includes
> and libraries?
> For example, Jasper uses $JASPERINC and $JASPERLIB variables
> and I think it'll be nice to work with NetCDF in the same way.
>
> Thanks.

From wrf at nusculus.com Tue Feb 15 11:05:28 2011
From: wrf at nusculus.com (Kevin Matthew Nuss)
Date: Tue, 15 Feb 2011 11:05:28 -0700
Subject: [Wrf-users] NetCDF paths
In-Reply-To:
References:
Message-ID:

Hi,

Perhaps this would work (or not): create your own NetCDF directory that simply has links, with the proper names, pointing to the include and library directories that have the wrong names. Then have $NETCDF point to your directory.

mkdir myNetcdf
cd myNetcdf
ln -s /usr/include/netcdf-3/ include
ln -s /usr/lib64 lib
export NETCDF=`pwd`

If it works, perhaps you could reply to the group. It seems like a problem others might encounter. Otherwise, folks have to compile their own NetCDF libraries just to get the correct directory structure. If it doesn't work - sorry 'bout that.

Kevin

On Sat, Feb 12, 2011 at 4:15 AM, Dmitry Vinokurov wrote:
> Hi!
>
> I'm trying to compile WRF 3.2.1 on CentOS 5.5 (netcdf packages from rep)
> and get the following error after ./configure:
> ----
> grep: /usr/include/netcdf.inc: No such file or directory
> ----
> NETCDF environment variable is set to "/usr", so as I understand WRF expects
> includes to be in "${NETCDF}/include" etc.
>
> Actually grep is right, there is no such file "/usr/include/netcdf.inc",
> NetCDF includes are located in "/usr/include/netcdf-3/" and libraries are in
> "/usr/lib64".
>
> The question is: is it possible to set separate paths for NetCDF includes
> and libraries? For example, Jasper uses $JASPERINC and $JASPERLIB variables
> and I think it'll be nice to work with NetCDF in the same way.
>
> Thanks.
>
> --
> Best regards,
> Dmitry Vinokurov
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110215/547abcaa/attachment.html

From kganbour at yahoo.com Wed Feb 16 03:25:57 2011
From: kganbour at yahoo.com (Khaled Ganbour)
Date: Wed, 16 Feb 2011 02:25:57 -0800 (PST)
Subject: [Wrf-users] [WRF-users] data for my domain
Message-ID: <449815.85099.qm@web46312.mail.sp1.yahoo.com>

Dear all:

I would like to run WRF with real data. I tried to download from the sites http://polar.... and ncep....., but there are many files, and the files I downloaded were very large, maybe 400MB. Please tell me if you know the best site to get data, and whether it is possible to get this data only for my domain.

best regards

Khaled
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110216/85752e7c/attachment.html

From ahsanshah01 at gmail.com Thu Feb 17 20:23:19 2011
From: ahsanshah01 at gmail.com (Ahsan Ali)
Date: Fri, 18 Feb 2011 08:23:19 +0500
Subject: [Wrf-users] Wrf-users Digest, Vol 78, Issue 17
In-Reply-To:
References:
Message-ID:

Try this http://nomad3.ncep.noaa.gov/

On Fri, Feb 18, 2011 at 12:00 AM, wrote:
> Send Wrf-users mailing list submissions to
> wrf-users at ucar.edu
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
> or, via email, send a message with subject or body 'help' to
> wrf-users-request at ucar.edu
>
> You can reach the person managing the list at
> wrf-users-owner at ucar.edu
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Wrf-users digest..."
>
>
> Today's Topics:
>
> 1.
[WRF-users] data for my domain (Khaled Ganbour) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Wed, 16 Feb 2011 02:25:57 -0800 (PST) > From: Khaled Ganbour > Subject: [Wrf-users] [WRF-users] data for my domain > To: wrf-users at ucar.edu > Message-ID: <449815.85099.qm at web46312.mail.sp1.yahoo.com> > Content-Type: text/plain; charset="us-ascii" > > Dear all: > > I would like to run WRF with real data. > I tried to download from site of http://polar.... and ncep.....but there > are many files and I downloaded files but they were very large mab be 400MB. > Please If you know the best site to get data and If possible to get this > data only on my domain,tell me > > > best regards > > > Khaled > > > > > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: > http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110216/85752e7c/attachment-0001.html > > ------------------------------ > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > End of Wrf-users Digest, Vol 78, Issue 17 > ***************************************** > -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110218/5b265223/attachment.html From jonathan.case-1 at nasa.gov Fri Feb 18 12:01:46 2011 From: jonathan.case-1 at nasa.gov (Case, Jonathan (MSFC-VP61)[ENSCO INC]) Date: Fri, 18 Feb 2011 13:01:46 -0600 Subject: [Wrf-users] WRF-NMM mis-handling of SEAICE for small lakes In-Reply-To: <4D5EB695.1060002@noaa.gov> References: <4D5EB695.1060002@noaa.gov> Message-ID: Hello Dave/All, Here is just a quick point of clarification on how the WRF-NMM incorporates sea ice into the initial condition for version 3.1.1. Upon looking at the real_nmm code, I realized that the WRF-NMM initialization routine does NOT allow the user to set any sea ice based on input skin temperature thresholds. The only way the WRF-NMM can initialize ice is by reading in a SEAICE grid from an external GRIB2 file (e.g. NAM or SPoRT's Great Lakes SEAICE via the icegl option in the EMS). As a result, smaller water bodies in Canada and the Northeast U.S. are left as open water and assigned unrealistically cold water temperatures well below freezing in high-resolution NMM runs. I have not confirmed whether this has been fixed in WRF v3.2, but in case it hasn't been fixed, I have CC'd the wrfhelp and wrf-users forum email lists , as well as Bob Rozumalski to the reply. Best regards, Jonathan Case (NASA SPoRT Center) From: Dave Radell [mailto:David.Radell at noaa.gov] Sent: Friday, February 18, 2011 12:13 PM To: Michael Evans; Paul Sisson; Molthan, Andrew L. (MSFC-VP61); David Zaff; Robert LaPlante; Jeff Waldstreicher; Kenneth Johnson; Brian Miretzky; Joshua Watson; Fred Pierce; Christopher Mello; Chuck McGill; Ron Murphy; David Radell Cc: Case, Jonathan (MSFC-VP61)[ENSCO INC]; Vasil Koleci Subject: Update: Northeast WRF Ensemble Hi All, I figured I'd take this opportunity, while the winter weather is somewhat quieter, to give everyone an update on some recent WRF LES ensemble developments. 1. 
NMM Ice Cover Mask Issues SSD discovered a problem with the SpORT ice mask used in the NMM core of the WRF-EMS. We noticed this last week during the Feb. 9-11 event in Buffalo's CWA, though the problem has likely persisted much longer. Essentially, the SpORT ice mask in the NMM-WRF was supposed to be set by the SpORT SST product (if SST<273K, set the point to ice) , and that was not happening correctly. So for each model run, the NMM would revert back to deriving ice cover from NAM skin temperatures which for last week's event covered most of Lake Ontario with ice---not very realistic. So the SpORT group quickly came up with a fix that involved creating a separate ice mask (independent of the SST product) grib file for use in creating the initial WRF conditions. We have tested this here with NMM runs, and it's working correctly, giving the proper lake ice cover. Note that this did not affect those running the ARW core. Here are the configuration options to check in your NMM configuration files to ensure that you're set up correctly for the SpORT SST and ice cover product: To set up a run using both our SSTs (ftp://ftp.nsstc.org/outgoing/lafonta/sst/grib2/conus/) and the Great Lakes ice mask (ftp://ftp.nsstc.org/outgoing/lafonta/sst/grib2/great_lakes/) requires setting the following parameters in the conf/ems_auto/ems_autorun.conf file: SFC = sstsport,icegl BESTHR = sstsport (optional, for the diurnal matching of SSTs to model initialization hour) If running the EMS manually, then the following options should be used in ems_prep (as a working example): ems_prep --dslist namptile --sfc sstsport,icegl Also, EMS users must be using at least EMS v3.1.1.5.1 for the "icegl" option to work. A quick test of running 'ems_prep --dslist ' will reveal if the icegl option is available. If you have questions on this, please let me know, and we'll coordinate with Andrew Molthan and Jon Case at SPoRT as appropriate. 2. 
BUFKIT Soundings
Ensemble soundings look to be generating regularly on the ER FTP server in \share\BTV\BUFKIT. A quick check this morning has sounding data for the following locations: CHD, CLE, COLT, EON, ERI, GOU, HZY, KBGM, KBTV, KITH, KMSS, KSLK, KSYR, RME, STAR. It appears CLE, BTV, and BGM are regularly providing their ensemble members for inclusion.

3. LES Ensemble Configurations
As promised and for those interested, I have posted the information I've received from you all on the individual ensemble members on the ER Local Model Wiki page at the address below. If you would like to make changes or updates to these configurations, please feel free to edit away directly on the Wiki page.

https://collaborate.werh.noaa.gov/wiki/index.php/Great_Lakes_Ensemble_Configuration
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110218/57925120/attachment.html

From yesubabu2006 at gmail.com Thu Feb 17 19:33:48 2011
From: yesubabu2006 at gmail.com (V.YesuBabu)
Date: Fri, 18 Feb 2011 08:03:48 +0530
Subject: [Wrf-users] "Re: Contents of Wrf-users digest Vol 78, Issue 17
Message-ID:

Dear Khaled,

You can get input global data for WRF from http://nomads.ncdc.noaa.gov/data.php?name=access#hires_weather_datasets or you can refer to http://www.mmm.ucar.edu/wrf/users/download/free_data.html to get free data to initialize WRF code.

--
***********************************
V.Yesubabu,
Project Engineer, CAS/SECG,C-DAC,
Main Building, Pune University,Ganeshkhind,Pune.
Phone :020-25704226
Cell: 07276298863
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110218/93c7db27/attachment-0001.html

From ahsanshah01 at gmail.com Fri Feb 18 20:29:22 2011
From: ahsanshah01 at gmail.com (Ahsan Ali)
Date: Sat, 19 Feb 2011 08:29:22 +0500
Subject: [Wrf-users] Restart Run
Message-ID:

Dear,

When we restart WRF from a specific time using a restart file, a new wrf output file is made rather than the output being written to the previously saved file. Is there any way to have the wrf output file remain the same although we restart the run?
--
Syed Ahsan Ali Bokhari
Electronic Engineer (EE)
Research & Development Division
Pakistan Meteorological Department
H-8/4, Islamabad.
Phone # off +92518358714
Cell # +923155145014
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110219/8e70338a/attachment.html

From ebeigi3 at tigers.lsu.edu Sat Feb 19 23:04:14 2011
From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi)
Date: Sun, 20 Feb 2011 00:04:14 -0600
Subject: [Wrf-users] input and output data
Message-ID:

I am a PhD student at LSU, and I want to dynamically downscale coarse-resolution GCM data and then evaluate the effect of climate change on the hydrologic cycle (temperature, precipitation, ...). Is there available input data for WRF to run over Louisiana? I am not familiar with the input and output data of the WRF software. I am looking forward to hearing from you.

Best Regards

--
Ehsan Beigi
PhD Student
Department of Civil and Environmental Engineering
2408 Patrick F. Taylor Hall
Louisiana State University
Baton Rouge, LA, 70803
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110220/1c64a93c/attachment.html

From hedde.cea at gmail.com Mon Feb 21 08:36:12 2011
From: hedde.cea at gmail.com (Thierry HEDDE)
Date: Mon, 21 Feb 2011 16:36:12 +0100
Subject: [Wrf-users] wrfcamx Compiling
In-Reply-To: <950937.80965.qm@web161216.mail.bf1.yahoo.com>
References: <950937.80965.qm@web161216.mail.bf1.yahoo.com>
Message-ID:

Hi

"make --check" doesn't compile; it is just a check you can run after compiling. You have to run "make" alone to compile your code. If you still get the error message, you have to check your netCDF library directory: maybe you should add the netcdf path in your configure step, or set a NETCDF environment variable.
Type ./configure --help

bye

2011/2/15 Hamed Sharifi
> Dear All,
> I used the "make --check" command to compile the wrfcamx program, but I got this
> error:
>
> ld: cannot find -lnetcdf
>
> Any suggestion?
> Thanks in advance,
> Hamed
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users

--
Thierry HEDDE
Laboratoire de Modélisation des Transferts dans l'Environnement
CEA/CADARACHE DEN/DTN/SMTM/LMTE
Bât. 307 Pièce 9
13108 ST PAUL LEZ DURANCE CEDEX
FRANCE
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110221/0f437bd8/attachment.html

From hedde.cea at gmail.com Mon Feb 21 08:45:07 2011
From: hedde.cea at gmail.com (Thierry HEDDE)
Date: Mon, 21 Feb 2011 16:45:07 +0100
Subject: [Wrf-users] WRFV3.2 compilation Error
In-Reply-To:
References: <69822787.20110208204759@rambler.ru>
Message-ID:

I am using this version of gcc:

gcc version 4.1.2 20080704 (Red Hat 4.1.2-48)

I'm waiting for an update from my support. I'll tell you if it works as soon as I can.

Cordially

2011/2/8 Dmitry N. Mikushin
> Mostamandy,
>
> At least one of your errors
>
> character (DateStrLen), dimension(:),allocatable :: Times(:)
> 1
> Error: Attribute at (1) is not allowed in a TYPE definition
>
> can't be reproduced with gfortran 4.4.3. What version of gfortran are you
> using?
>
> On 8 February 2011 at 20:47, Mostamandy Soleiman wrote:
> > Hello, Wrf-users.
> >
> > I tried to compile WRF V3.2.1 with gfortran + gcc but
> > unfortunately I haven't had success. I attached the compile.log and
> > configure.wps.
> > Could someone help me! What is the problem?
> >
> > Thanks in advance!
> >
> > --
> > Regards,
> > Mostamandy mailto:soleiman at rambler.ru
> > _______________________________________________
> > Wrf-users mailing list
> > Wrf-users at ucar.edu
> > http://mailman.ucar.edu/mailman/listinfo/wrf-users
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users

--
Thierry HEDDE
Laboratoire de Modélisation des Transferts dans l'Environnement
CEA/CADARACHE DEN/DTN/SMTM/LMTE
Bât. 307 Pièce 9
13108 ST PAUL LEZ DURANCE CEDEX
FRANCE
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110221/cc466458/attachment.html

From doyle at cima.fcen.uba.ar Tue Feb 22 10:20:55 2011
From: doyle at cima.fcen.uba.ar (doyle at cima.fcen.uba.ar)
Date: Tue, 22 Feb 2011 14:20:55 -0300 (ART)
Subject: [Wrf-users] 2m temperature error
Message-ID: <38745.157.92.4.71.1298395255.squirrel@www.cima.fcen.uba.ar>

Hi everyone

I am running WRF-ARW on a linux cluster with 16 processors over all of South America with the following physics in the namelist:

&physics
 mp_physics           = 5, 3, 3,
 ra_lw_physics        = 99, 1, 1,
 ra_sw_physics        = 99, 1, 1,
 radt                 = 50, 30, 30,
 sf_sfclay_physics    = 2, 1, 1,
 sf_surface_physics   = 2, 2, 2,
 bl_pbl_physics       = 2, 1, 1,
 bldt                 = 0, 0, 0,
 cu_physics           = 2, 1, 0,
 cudt                 = 5, 5, 5,
 isfflx               = 1,
 ifsnow               = 1,
 icloud               = 1,
 surface_input_source = 1,
 num_soil_layers      = 4,
 sf_urban_physics     = 0, 0, 0,
 maxiens              = 1,
 maxens               = 3,
 maxens2              = 3,
 maxens3              = 16,
 ensdim               = 144,

When I look at the 2m temperature field (first figure in the pdf file) it seems to show horizontal stripes, far from reality. When I change ra_lw_physics and ra_sw_physics to 1, the field is close to reality (figure 2). The same occurs when changing just one of ra_lw_physics or ra_sw_physics (figures 3 and 4). I tried running ARW V3.2.1 and 3.0 and find the same problem, in the latter case with values very far from reality.
Has anyone run using this physics configuration? Any idea why this combination is not working?

Thanks a lot

Moira Doyle
Dpt Atmospheric and Oceanic Sciences
Univ. Buenos Aires

--
This message has been scanned by the CIMA mail server for viruses and other dangerous content, and is considered clean.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: comparacion.pdf
Type: application/pdf
Size: 133123 bytes
Desc: not available
Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110222/bc10dba7/attachment-0001.pdf

From ahsanshah01 at gmail.com Tue Feb 22 21:02:51 2011
From: ahsanshah01 at gmail.com (Ahsan Ali)
Date: Wed, 23 Feb 2011 09:02:51 +0500
Subject: [Wrf-users] Running wrf parallel problem
Message-ID:

Hello,

I am stuck on a problem regarding running WRF v3.2.1. I get the following error while running with mpirun. Any help would be highly appreciated.

[pmdtest at pmd02 em_real]$ mpirun -np 4 wrf.exe
starting wrf task 0 of 4
starting wrf task 1 of 4
starting wrf task 3 of 4
starting wrf task 2 of 4
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 6044 on node pmd02.pakmet.com
exited on signal 11 (Segmentation fault).

--
Syed Ahsan Ali Bokhari
Electronic Engineer (EE)
Research & Development Division
Pakistan Meteorological Department
H-8/4, Islamabad.
Phone # off +92518358714
Cell # +923155145014
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110223/ba31c559/attachment.html

From nbon0004 at um.edu.mt Wed Feb 23 02:19:28 2011
From: nbon0004 at um.edu.mt (Norbert Bonnici)
Date: Wed, 23 Feb 2011 10:19:28 +0100
Subject: [Wrf-users] forecasting procedure
Message-ID:

Dear all,

I would like to know how to set up a simulation that involves a forecast. Until now all I have managed to do is run a simulation from previous data.

Thanks,
--
Norbert Bonnici

From tnquanghuy at gmail.com Wed Feb 23 08:54:01 2011
From: tnquanghuy at gmail.com (Huy Tran)
Date: Wed, 23 Feb 2011 06:54:01 -0900
Subject: [Wrf-users] WRF one-way nesting error
Message-ID: <4D652D99.20106@gmail.com>

Good morning everyone,

I'm currently having trouble with running one-way nesting in WRF. The domain configuration includes one coarse domain of 300x400 grid cells at 12 km (domain 1) and one fine domain of 201x201 grid cells at 4 km (domain 2). The time steps are 36 s and 12 s for domain 1 and domain 2, respectively. Frame output is every 1 hour. Simulations are conducted on an 8-processor machine.

I finished running domain 1 successfully, with frame output every 1 hour. Then I followed the instructions for running one-way nesting with ndown (http://www.mmm.ucar.edu/wrf/users/wrfv2/runwrf.html#oneway). I finished all the steps and got the wrfinput_d02 and wrfbdy_d02 for domain 2 with ndown, and renamed them to wrfinput_d01 and wrfbdy_d01, respectively. I took a look into these files and verified that the time steps and domain configuration are all correct. But when I run wrf.exe, it finishes only up to 5 time steps and stops there with the error "APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)", and there is no error recorded in the rsl.error.* files. Note that if I run domain 2 without ndown (i.e. generating the wrfinput_d01 and wrfbdy_d01 with real.exe and going no further with ndown), wrf.exe runs flawlessly without any error.
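For reference, the file shuffle described above can be sketched as shell commands. This is only an illustration of the renaming step (dummy files created with touch stand in for the actual output of ndown.exe, and the exact ndown procedure differs between WRF versions):

```shell
# One-way nesting file shuffle, per the steps described above (sketch only):
#   ./real.exe    ->  wrfinput_d01, wrfinput_d02 (both domains)
#   ./wrf.exe     ->  wrfout_d01_* (coarse-domain run)
#   ./ndown.exe   ->  wrfinput_d02, wrfbdy_d02 (nest initial/boundary files)
touch wrfinput_d02 wrfbdy_d02    # dummies standing in for ndown.exe output
# The nested wrf.exe run expects the _d01 names:
mv wrfinput_d02 wrfinput_d01
mv wrfbdy_d02 wrfbdy_d01
ls wrfinput_d01 wrfbdy_d01       # both files now carry the _d01 names
```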
I attach the rsl.error.0000, the namelist.input of domain 1 and domain 2 for your more information. Your suggestions are highly appreciated. Thanks. Huy -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: rsl.error.0000 Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110223/502b67d9/attachment-0003.pl -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: d01_namelist.input Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110223/502b67d9/attachment-0004.pl -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: d02_namelist.input Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110223/502b67d9/attachment-0005.pl From drostkier at yahoo.com Thu Feb 24 05:35:21 2011 From: drostkier at yahoo.com (Dorita Rostkier-Edelstein) Date: Thu, 24 Feb 2011 04:35:21 -0800 (PST) Subject: [Wrf-users] WRF and stratus Message-ID: <27627.27479.qm@web113104.mail.gq1.yahoo.com> Hi, I am looking for publications on this subject. Sensitivity to parameterizations, etc. I appreciate getting any references. Thanks, Dorita -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110224/100fb231/attachment.html From FLiu at azmag.gov Wed Feb 23 14:57:34 2011 From: FLiu at azmag.gov (Feng Liu) Date: Wed, 23 Feb 2011 21:57:34 +0000 Subject: [Wrf-users] WRF one-way nesting error In-Reply-To: <4D652D99.20106@gmail.com> References: <4D652D99.20106@gmail.com> Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C2266359B@mag9006> Hi Huy, Switch off microphysics and set mp_physics = 0 because ndown.exe does not work with mp_physics. Hope this is the solution to your problem. 
Feng -----Original Message----- From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Huy Tran Sent: Wednesday, February 23, 2011 8:54 AM To: wrf-users at ucar.edu Subject: [Wrf-users] WRF one-way nesting error Good morning everyone, I'm currently having trouble with running one-way nesting in WRF. The domain configuration includes 1 coarse domain of 300x400 grid-cells x 12km (domain 1) and 1 fine domain of 201x201 grid-cells x 4km (domain 2). The time steps are 36s and 12s for the domain 1 and domain 2, respectively. Frame output is every 1hour. Simulations are conducted with 8 processors machine. I finished running domain 1 successfully and the frame output is every 1hour. Then I follow the instruction of running one-way nesting with ndown (http://www.mmm.ucar.edu/wrf/users/wrfv2/runwrf.html#oneway). I finished all the steps and got the wrfinput_d02 and wrfbdy_d02 for the domain 2 with ndown, and renamed them to wrfinput_d01 and wrfbdy_d01, respectively. I took a look into these files and verified that all timestep and domain configuration are correctly. But when I run wrf.exe, it just finished upto 5 time steps and stopped there with the error "APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)" and there is no error recorded in the rsl.error.* files. Note that if I run the domain 2 without ndown (i.e. generated the wrfinput_d01 and wrfbdy_d01 with real.exe and go no further with ndown), wrf.exe run flawlessly without any error. I attach the rsl.error.0000, the namelist.input of domain 1 and domain 2 for your more information. Your suggestions are highly appreciated. Thanks. 
Huy From FLiu at azmag.gov Wed Feb 23 15:26:20 2011 From: FLiu at azmag.gov (Feng Liu) Date: Wed, 23 Feb 2011 22:26:20 +0000 Subject: [Wrf-users] 2m temperature error In-Reply-To: <38745.157.92.4.71.1298395255.squirrel@www.cima.fcen.uba.ar> References: <38745.157.92.4.71.1298395255.squirrel@www.cima.fcen.uba.ar> Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C22663629@mag9006> Hi Doyle, You have taken cloud effect to the optical depth in radiation (icloud = 1) which only works for ra_sw_physics =1 and ra_lw_physics = 1 ). It makes no sense to compare the results if you take other long wave and short wave radiation schemes. Please refer to Page 5-42 in the USER's GUIDE. Thanks. Feng -----Original Message----- From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of doyle at cima.fcen.uba.ar Sent: Tuesday, February 22, 2011 10:21 AM To: wrf-users at ucar.edu Subject: [Wrf-users] 2m temperature error Hi everyone I am running WRF-ARW on a linux cluster with 16 processors over all south America with the following physics in the namelist &physics mp_physics = 5, 3, 3, ra_lw_physics = 99, 1, 1, ra_sw_physics = 99, 1, 1, radt = 50, 30, 30, sf_sfclay_physics = 2, 1, 1, sf_surface_physics = 2, 2, 2, bl_pbl_physics = 2, 1, 1, bldt = 0, 0, 0, cu_physics = 2, 1, 0, cudt = 5, 5, 5, isfflx = 1, ifsnow = 1, icloud = 1, surface_input_source = 1, num_soil_layers = 4, sf_urban_physics = 0, 0, 0, maxiens = 1, maxens = 3, maxens2 = 3, maxens3 = 16, ensdim = 144, When I look at the 2m temperature field (first figure in the pdf file) it seems to behave with horizontal stripes, far from reality. When I change ra_lw_physics and ra_sw_physics to 1 the field is close to reality (figure 2). The same occurs when changing just one ra_lw_physics or ra_sw_physics (figures 3 and 4). I tried running ARW V3.2.1 and 3.0 and find the same problem, in the last case with values vary far from reality. Has anyone run using this physics configuration? 
Any idea why this combination is not working?

Thanks a lot

Moira Doyle
Dpt Atmospheric and Oceanic Sciences
Univ. Buenos Aires

--
This message has been scanned by the CIMA mail server for viruses and other dangerous content, and is considered clean.

From kotroni at meteo.noa.gr Tue Mar 1 01:38:44 2011
From: kotroni at meteo.noa.gr (Vassiliki Kotroni)
Date: Tue, 1 Mar 2011 10:38:44 +0200
Subject: [Wrf-users] WRF execution problem on amd processors + intel
Message-ID: <002801cbd7ec$136968c0$3a3c3a40$@noa.gr>

Dear colleagues

We have been successfully running WRF on our AMD-Opteron cluster with pgfortran and mpi. We are trying to switch to ifort and we have the following problem (we tested both on our AMD-Opteron cluster and on a 6-core AMD-Phenom): we successfully compiled mpich2, netcdf and wrf with ifort, and wrf crashes just after writing the 00 hour wrfout and before giving the first timestep. Has anybody else encountered the same problem? On the other hand, on the same processors we have successfully compiled mm5 with ifort and we are running it without any problem.

We would appreciate any help on the subject.

best regards
Vasso
----------------------------------------------------------------------------
Dr. Vassiliki KOTRONI
Institute of Environmental Research
National Observatory of Athens
Lofos Koufou, P. Pendeli, GR-15236 Athens, Greece
Tel: +30 2 10 8109126 Fax: +30 2 10 8103236
Daily weather forecasts at:
www.noa.gr/forecast (in english)
www.meteo.gr (in greek)
www.eurometeo.gr
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110301/882bd1a0/attachment.html From df6.626 at gmail.com Tue Mar 1 02:53:54 2011 From: df6.626 at gmail.com (Dmitry Vinokurov) Date: Tue, 1 Mar 2011 14:53:54 +0500 Subject: [Wrf-users] NetCDF paths In-Reply-To: References: Message-ID: We were moving to another office; now I've returned to working on compiling WRF. Thanks all for the tips. It seems the suggested solution with symlinks worked; I don't get the configure error anymore. Anyway, I think using two environment variables for NetCDF would improve portability and convenience. If the sources and online tutorial were fixed in this way, new users would not waste time tracking down this error and googling. If my reasoning is correct, maybe I should contact the WRF developers? 2011/2/15 Kevin Matthew Nuss > Hi, > > Perhaps this would work (or not): > > Create your own NetCDF directory and simply have links, with the proper > names, that point to the include and library directories that have the wrong > names. Then have $NETCDF point to your directory. >
> mkdir myNetcdf
> cd myNetcdf
> ln -s /usr/include/netcdf-3/ include
> ln -s /usr/lib64 lib
> export NETCDF=`pwd`
> > If it works, perhaps you could reply to the group. It seems like a problem > others might encounter. Otherwise, folks have to compile their own NetCDF > libraries just to get the correct directory structure. If it doesn't work - > sorry 'bout that. > > Kevin > > On Sat, Feb 12, 2011 at 4:15 AM, Dmitry Vinokurov wrote: > >> Hi! >> >> I'm trying to compile WRF 3.2.1 on CentOS 5.5 (netcdf packages from rep) >> and get the following error after ./configure: >> ---- >> grep: /usr/include/netcdf.inc: No such file or directory >> ---- >> NETCDF environment variable is set to "/usr", so as I understand WRF >> expects includes to be in "${NETCDF}/include" etc.
>> >> Actually grep is right, there is no such file "/usr/include/netcdf.inc", >> NetCDF includes are located in "/usr/include/netcdf-3/" and libraries are in >> "/usr/lib64". >> >> The question is: is it possible to set separate paths for NetCDF includes >> and libraries? For example, Jasper uses $JASPERINC and $JASPERLIB variables >> and I think it'll be nice to work with NetCDF in the same way. >> >> Thanks. >> >> -- >> Best regards, >> Dmitry Vinokurov >> >> _______________________________________________ >> Wrf-users mailing list >> Wrf-users at ucar.edu >> http://mailman.ucar.edu/mailman/listinfo/wrf-users >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110301/278a65b4/attachment.html From eric.kemp at nasa.gov Mon Feb 28 11:08:51 2011 From: eric.kemp at nasa.gov (Kemp, Eric M. (GSFC-610.0)[NORTHROP GRUMMAN INFORMATION TECH]) Date: Mon, 28 Feb 2011 12:08:51 -0600 Subject: [Wrf-users] Bug in WRF 3.2.1 RSL_LITE library Message-ID: Dear wrfhelp: I continue to investigate a problem I reported several months ago: occasional CFL errors and/or crashes with identical initial conditions when running a particular domain. While I have not found a solution to this problem, I have found a bug in the RSL_LITE library. In WRFV3/external/RSL_LITE/c_code.c, function RSL_LITE_INIT_EXCH contains two local variables (nbytes_x_recv and nbytes_y_recv). These variables are declared without initial values. If the code is compiled with MPI support, these variables are assigned values within if blocks (i.e., under special conditions; in some cases the statements in the if blocks are not executed). If RSL_LITE is compiled serially, then the if statements are removed by the preprocessor. At the end of the function (near line 300), these variables are used to calculate xp_curs_recv and yp_curs_recv. 
My bug patch:

Near line 229:

#ifndef STUBMPI
  MPI_Comm comm, *comm0, dummy_comm ;
  nbytes_x_recv = 0; /* Bug fix */
  nbytes_y_recv = 0; /* Bug fix */
  comm0 = &dummy_comm ;

Near line 295:

      buffer_for_proc ( xm , nbytes, RSL_SENDBUF ) ;
    }
  }
#else /* Bug fix */
  nbytes_x_recv = 0; /* Bug fix */
  nbytes_y_recv = 0; /* Bug fix */
#endif
  yp_curs = 0 ; ym_curs = 0 ; xp_curs = 0 ; xm_curs = 0 ;

I found this bug using ifort 11.1.038 and the '-check' compiler flag. Cheers, -Eric -------------------------------------------------------------------- Eric M. Kemp Northrop Grumman Corporation Meteorologist Information Systems Civil Enterprise Solutions Civil Systems Division Goddard Space Flight Center Mailstop 610.3 Greenbelt, MD 20771 Telephone 301-286-9768 Fax 301-286-1775 E-mail: eric.kemp at nasa.gov E-mail: eric.kemp at ngc.com -------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110228/be7b01df/attachment.html From moudipascal at yahoo.fr Wed Mar 2 06:21:42 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Wed, 2 Mar 2011 13:21:42 +0000 (GMT) Subject: [Wrf-users] How can i attache my simulation to wrfhelp Message-ID: <945851.47753.qm@web25107.mail.ukl.yahoo.com> Dear all, I am having trouble with gen_be generation (obtaining the be.dat file). I would like to upload my simulations (52 MB each). How can I do this? I want wrfhelp to compile it so that they can help me fix the problem. Pascal MOUDI IGRI Ph-D Student at the Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110302/7c4ce93d/attachment.html From geeta124 at hotmail.com Thu Mar 3 03:57:30 2011 From: geeta124 at hotmail.com (Geeta Geeta) Date: Thu, 3 Mar 2011 10:57:30 +0000 Subject: [Wrf-users] WRF 3.2 Model not running after 4hrs of Integration. Message-ID: Dear All, I am running WRF 3.2 for 3 domains, at 27, 9 and 3 km. After integrating for 4 hours, the model gives a segmentation fault; these directories are created and it does not integrate further.

bash-3.2$ ncdump -v Times wrfout_d01_2010-08-04_00:00:00
Times = "2010-08-04_00:00:00", "2010-08-04_01:00:00", "2010-08-04_02:00:00", "2010-08-04_03:00:00", "2010-08-04_04:00:00" ;
}

bash-3.2$ ncdump -v Times wrfout_d02_2010-08-04_00:00:00
Times = "2010-08-04_00:00:00", "2010-08-04_01:00:00", "2010-08-04_02:00:00", "2010-08-04_03:00:00", "2010-08-04_04:00:00" ;

bash-3.2$ ncdump -v Times wrfout_d03_2010-08-04_00:00:00
Times = "2010-08-04_00:00:00", "2010-08-04_01:00:00", "2010-08-04_02:00:00", "2010-08-04_03:00:00", "2010-08-04_04:00:00" ;

After integrating for 4 hours, the model gives a segmentation fault, and these directories are created:

drwxr-xr-x 2 256 Mar 03 04:20 coredir.5
drwxr-xr-x 2 256 Mar 03 04:20 coredir.3
drwxr-xr-x 2 256 Mar 03 04:20 coredir.2
drwxr-xr-x 2 256 Mar 03 04:20 coredir.1

Kindly help. The namelist.input file is also attached for reference. Geeta -------------- next part -------------- An HTML attachment was scrubbed...
Name: namelist.input Type: application/octet-stream Size: 4583 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110303/c2863e66/attachment.obj From nbon0004 at um.edu.mt Thu Mar 3 04:39:12 2011 From: nbon0004 at um.edu.mt (Norbert Bonnici) Date: Thu, 3 Mar 2011 12:39:12 +0100 Subject: [Wrf-users] RSL CTL error Message-ID: Dear all, I made a domain over Australia to simulate the Yasi tropical storm (1-2 Feb 2011). When I ran the simulation with this namelist.input I got this error (files are attached). Can someone help me, please? Regards -- Norbert Bonnici -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.error.0000 Type: application/octet-stream Size: 381855 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110303/0525bc7b/attachment-0003.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.out.0000 Type: application/octet-stream Size: 380973 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110303/0525bc7b/attachment-0004.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input Type: application/octet-stream Size: 3448 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110303/0525bc7b/attachment-0005.obj From jagabandhu at gmail.com Thu Mar 3 22:17:39 2011 From: jagabandhu at gmail.com (jagabandhu panda) Date: Fri, 4 Mar 2011 13:17:39 +0800 Subject: [Wrf-users] problem with arwpost In-Reply-To: References: Message-ID: Hi I am trying to compute cape, cin and mcape through ARWpost (these are calculated through the fortran code called module_calc_cape.f90 which is provided in ARWpost). These variables are diagnosed through this package and are expected to be produced if the package is compiled successfully.
However, I am not able to do so even though I compiled the package successfully and specified the variables correctly in the namelist. Some other variables calculated through module_calc_dbz.f90 are also not being diagnosed. I even tried the latest version of ARWpost uploaded on the website yesterday! However, it is the same! I am specifying things properly through the namelist. However, I see a segmentation fault when I include cape, mcape, cin or dbz. Even if I increase the debug level, still nothing comes out. I have actually tried various directions, but I have not been successful! I am not sure what to do now! I will really be happy and thankful if I get some help in this direction. Thanks in advance ~Jagabandhu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/0ee7bfe5/attachment.html From geeta124 at hotmail.com Thu Mar 3 21:32:04 2011 From: geeta124 at hotmail.com (Geeta Geeta) Date: Fri, 4 Mar 2011 04:32:04 +0000 Subject: [Wrf-users] WRF 3.2 Model not running after 4hrs of Integration. In-Reply-To: <4D703059.7000203@ucar.edu> References: , <4D703059.7000203@ucar.edu> Message-ID: Dear Sir, Thanks for the reply. I am running WRF 3.2 for the 3 domains (27, 9 and 3 km) for the 48-hour forecast on the IBM P570 machine. The operating system is AIX 5.3 and the code is compiled using xlf with POE in place.
>>>What is the error message in your output file? Did you find any CFL violation? Your namelist.input looks fine. Please send us error messages in your run. I am attatching the rsl.error* and the rsl.out* files, namelist.input and namelist.wps files. It seems to me that there is no CFL violation. Thanks. geeta Date: Thu, 3 Mar 2011 17:20:41 -0700 From: wrfhelp at ucar.edu To: geeta124 at hotmail.com Subject: Re: WRF 3.2 Model not running after 4hrs of Integration. Geeta, What is the error message in your output file? Did you find any CFL violation? Your namelist.input looks fine. Please send us error messages in your run. On 3/3/2011 3:57 AM, Geeta Geeta wrote: Dear All, I am running WRF3.2 for 3 domains, at 27, 9 and 3kms. After integrating for 4hours, the model gives segmentation fault. and these directories are created and does not integrate beyond. bash-3.2$ ncdump -v Times wrfout_d01_2010-08-04_00:00:00 Times = "2010-08-04_00:00:00", "2010-08-04_01:00:00", "2010-08-04_02:00:00", "2010-08-04_03:00:00", "2010-08-04_04:00:00" ; } bash-3.2$ ncdump -v Times wrfout_d02_2010-08-04_00:00:00 Times = "2010-08-04_00:00:00", "2010-08-04_01:00:00", "2010-08-04_02:00:00", "2010-08-04_03:00:00", "2010-08-04_04:00:00" ; bash-3.2$ ncdump -v Times wrfout_d03_2010-08-04_00:00:00 Times = "2010-08-04_00:00:00", "2010-08-04_01:00:00", "2010-08-04_02:00:00", "2010-08-04_03:00:00", "2010-08-04_04:00:00" ; After integrating for 4hours, the model gives segmentation fault. and these directories are created and does not integrate beyond. drwxr-xr-x 2 256 Mar 03 04:20 coredir.5 drwxr-xr-x 2 256 Mar 03 04:20 coredir.3 drwxr-xr-x 2 256 Mar 03 04:20 coredir.2 drwxr-xr-x 2 256 Mar 03 04:20 coredir.1 Kindly help. The namelist.input file is also attached for reference. Geeta -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input Type: application/octet-stream Size: 4695 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0018.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.wps Type: application/octet-stream Size: 1280 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0019.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.error.0000 Type: application/octet-stream Size: 129542 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0020.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.error.0001 Type: application/octet-stream Size: 5301 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0021.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.error.0002 Type: application/octet-stream Size: 6211 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0022.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.error.0003 Type: application/octet-stream Size: 6937 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0023.obj -------------- next part -------------- A non-text attachment was scrubbed... 
Name: rsl.error.0004 Type: application/octet-stream Size: 6214 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0024.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.error.0005 Type: application/octet-stream Size: 6216 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0025.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.error.0006 Type: application/octet-stream Size: 5304 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0026.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.error.0007 Type: application/octet-stream Size: 5306 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0027.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.out.0000 Type: application/octet-stream Size: 130228 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0028.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.out.0001 Type: application/octet-stream Size: 10145 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0029.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.out.0002 Type: application/octet-stream Size: 11054 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0030.obj -------------- next part -------------- A non-text attachment was scrubbed... 
Name: rsl.out.0003 Type: application/octet-stream Size: 10103 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0031.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.out.0004 Type: application/octet-stream Size: 11055 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0032.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.out.0005 Type: application/octet-stream Size: 11059 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0033.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.out.0006 Type: application/octet-stream Size: 10147 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0034.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.out.0007 Type: application/octet-stream Size: 10150 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0035.obj From gorangas at gmail.com Fri Mar 4 07:21:39 2011 From: gorangas at gmail.com (Goran Gasparac) Date: Fri, 4 Mar 2011 15:21:39 +0100 Subject: [Wrf-users] wrf exe stops without error Message-ID: Dear users, how can I find out the type of an error other than from the rsl.* files? My WRF stops after a few seconds; real.exe and all preprocessing were successful, but wrf.exe stops and doesn't give any error in the rsl.* files. Thanks, Regards, Goran *rsl.error*: * taskid: 0 hostname: Quilting with 1 groups of 0 I/O tasks. Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input.
Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire Ntasks in X 1 , ntasks in Y 1 WRF V3.1 MODEL ************************************* Parent domain ids,ide,jds,jde 1 102 1 95 ims,ime,jms,jme -4 107 -4 100 ips,ipe,jps,jpe 1 102 1 95 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate med_initialdata_input: calling input_model_input INPUT LandUse = "USGS" LANDUSE TYPE = "USGS" FOUND 33 CATEGORIES 2 SEASONS WATER CATEGORY = 16 SNOW CATEGORY = 24 INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES ************************************* Nesting domain ids,ide,jds,jde 1 139 1 130 ims,ime,jms,jme -4 144 -4 135 ips,ipe,jps,jpe 1 139 1 130 INTERMEDIATE domain ids,ide,jds,jde 29 80 23 71 ims,ime,jms,jme 24 85 18 76 ips,ipe,jps,jpe 27 82 21 73 ************************************* d01 2011-03-02_00:00:00 *** Initializing nest domain # 2 by horizontally interpolating parent domain # 1. *** INPUT LandUse = "USGS" LANDUSE TYPE = "USGS" FOUND 33 CATEGORIES 2 SEASONS WATER CATEGORY = 16 SNOW CATEGORY = 24 INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES INPUT LandUse = "USGS" LANDUSE TYPE = "USGS" FOUND 33 CATEGORIES 2 SEASONS WATER CATEGORY = 16 SNOW CATEGORY = 24 INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES Timing for Writing wrfout_d01_2011-03-02_00:00:00 for domain 1: 0.30316 elapsed seconds. Timing for processing lateral boundary for domain 1: 0.01309 elapsed seconds. WRF NUMBER OF TILES = 1* -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/6e74f401/attachment.html From tnquanghuy at gmail.com Thu Mar 3 12:52:24 2011 From: tnquanghuy at gmail.com (Huy Tran) Date: Thu, 03 Mar 2011 10:52:24 -0900 Subject: [Wrf-users] WRF 3.2 Model not running after 4hrs of Integration. In-Reply-To: References: Message-ID: <4D6FF178.80506@gmail.com> Hi Geeta, I would suggest turning off the feedback option (feedback = 0 in &domains) to see if this solves the problem. This option may or may not be important depending on your study. Otherwise, it's quite difficult to tell the reason for the segmentation fault. You might want to acquire more memory to run wrf, and/or optimize the fortran compiler and mpi with appropriate flag settings. The latter option highly depends on your system and the compiler you chose. Hope this helps. Huy On 3/3/2011 1:57 AM, Geeta Geeta wrote: > Dear All, > I am running WRF3.2 for 3 domains, at 27, 9 and 3kms. After > integrating for 4hours, the model gives segmentation fault. and these > directories are created and does not integrate beyond. > > bash-3.2$ ncdump -v Times wrfout_d01_2010-08-04_00:00:00 > Times = > "2010-08-04_00:00:00", > "2010-08-04_01:00:00", > "2010-08-04_02:00:00", > "2010-08-04_03:00:00", > "2010-08-04_04:00:00" ; > } > > bash-3.2$ ncdump -v Times wrfout_d02_2010-08-04_00:00:00 > Times = > "2010-08-04_00:00:00", > "2010-08-04_01:00:00", > "2010-08-04_02:00:00", > "2010-08-04_03:00:00", > "2010-08-04_04:00:00" ; > > bash-3.2$ ncdump -v Times wrfout_d03_2010-08-04_00:00:00 > Times = > "2010-08-04_00:00:00", > "2010-08-04_01:00:00", > "2010-08-04_02:00:00", > "2010-08-04_03:00:00", > "2010-08-04_04:00:00" ; > > After integrating for 4hours, the model gives segmentation fault. and > these directories are created and does not integrate beyond.
> drwxr-xr-x 2 256 Mar 03 04:20 coredir.5 > drwxr-xr-x 2 256 Mar 03 04:20 coredir.3 > drwxr-xr-x 2 256 Mar 03 04:20 coredir.2 > drwxr-xr-x 2 256 Mar 03 04:20 coredir.1 > > Kindly help. The namelist.input file is also attached for reference. > > Geeta > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110303/a29a8eed/attachment.html From hamed319 at yahoo.com Sat Mar 5 07:02:00 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Sat, 5 Mar 2011 06:02:00 -0800 (PST) Subject: [Wrf-users] LAMBERT 2 UTM convertor Message-ID: <73610.63440.qm@web161206.mail.bf1.yahoo.com> Hi All, Does anyone know of free software for converting LAMBERT parameters to UTM ones? Thanks in advance, Hamed -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110305/dafc9ea2/attachment.html From hedde.cea at gmail.com Mon Mar 7 01:47:44 2011 From: hedde.cea at gmail.com (Thierry HEDDE) Date: Mon, 7 Mar 2011 09:47:44 +0100 Subject: [Wrf-users] RSL CTL error In-Reply-To: References: Message-ID: Hi Norbert, I just noticed that you are asking for a 2-day simulation whereas your start and end times give a 42-hour simulation. I'm not experienced enough with the physics and dynamics parts to say anything about those. Cordially 2011/3/3 Norbert Bonnici > Dear all, > I made a domain over Australia to simulate the Yasi tropical storm > (1-2Feb2011) When I ran the simulation with this namelist.input I got > this error. (files are attached) > Can someone help me please.
> Regards > -- > Norbert Bonnici > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- Thierry HEDDE Laboratoire de Modélisation des Transferts dans l'Environnement CEA/CADARACHE DEN/DTN/SMTM/LMTE Bât. 307 Pièce 9 13108 ST PAUL LEZ DURANCE CEDEX FRANCE -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110307/d107e9c7/attachment.html From k_radhika at tropmet.res.in Fri Mar 4 22:21:35 2011 From: k_radhika at tropmet.res.in (Kanase Radhika D.) Date: Sat, 5 Mar 2011 10:51:35 +0530 (IST) Subject: [Wrf-users] Wrf-users Digest, Vol 79, Issue 4, message 1 In-Reply-To: Message-ID: <165235044.169815.1299302495117.JavaMail.root@mail1.tropmet.res.in> Are you using GrADS to view the output? If yes, then try the latest version of GrADS and also check the disk space on your system. Also try the output with ncview; if it shows something, then ARWpost doesn't have any problem, and the problem is probably with GrADS or whatever software you are using for display. Radhika ----- Original Message ----- From: wrf-users-request at ucar.edu To: wrf-users at ucar.edu Sent: Saturday, March 5, 2011 12:23:51 AM Subject: Wrf-users Digest, Vol 79, Issue 4 Send Wrf-users mailing list submissions to wrf-users at ucar.edu To subscribe or unsubscribe via the World Wide Web, visit http://mailman.ucar.edu/mailman/listinfo/wrf-users or, via email, send a message with subject or body 'help' to wrf-users-request at ucar.edu You can reach the person managing the list at wrf-users-owner at ucar.edu When replying, please edit your Subject line so it is more specific than "Re: Contents of Wrf-users digest..." Today's Topics: 1. problem with arwpost (jagabandhu panda) 2. Re: WRF 3.2 Model not running after 4hrs of Integration.
(Geeta Geeta)
Name: rsl.out.0007 Type: application/octet-stream Size: 10150 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110304/73010b5f/attachment-0017.obj ------------------------------ _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users End of Wrf-users Digest, Vol 79, Issue 4 **************************************** From manas.soni at gmail.com Sat Mar 5 02:08:23 2011 From: manas.soni at gmail.com (manish soni) Date: Sat, 5 Mar 2011 14:38:23 +0530 Subject: [Wrf-users] problem with arwpost Liquid water content Message-ID: Hi, How to derive Liquid Water Content (LWC) from the output of WRF model? -- Manish Soni, System Administrator & Jr. Scientific Officer, RSD, BIT Extension Center, Jaipur -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110305/b92d34db/attachment.html From Michael.Zulauf at iberdrolaren.com Fri Mar 4 17:09:52 2011 From: Michael.Zulauf at iberdrolaren.com (Zulauf, Michael) Date: Fri, 4 Mar 2011 16:09:52 -0800 Subject: [Wrf-users] problem with "shifted" SST grid when using RTG_SST_HR Message-ID: Hi WRF Community. . .
I'm attempting to use the 1/12 degree RTG_SST_HR data with WRF (from http://polar.ncep.noaa.gov/sst/) in conjunction with 1/2 degree GFS output, and I'm having some problems. What I'm seeing is that the SST data is not being properly mapped to the model grid - instead it is being offset by a small (but significant) amount. For example, my d03 domain covers the Pacific Northwest at 3km resolution. If I examine the met_em.d03 files and look at the SST field, I see that the coastline is shifted west by approximately 1 degree (and by a smaller amount north) compared to the GFS and static fields. It appears to me as if the ungrib/metgrid combo isn't reading the grid locations properly. I'm guessing that might be because the grid definition in the RTG_SST_HR files seems a bit wonky. Using wgrib2 to output the grid information, I get this: % wgrib2 -grid /filer_data/tmp/rtgssthr_grb_0.083.grib2 1:0:grid_template=0: lat-lon grid:(4320 x 2160) units 1e-06 input WE:NS output WE:SN res 48 lat 89.958000 to -89.958000 by 0.083000 lon 0.042000 to 359.958000 by 0.083000 #points=9331200 The number of points is correct (assuming a 1/12 degree spacing), but the stated spacing and start/stop lat/lon values are only approximate. It seems to me that the lat and lon should be defined as: lon = (i - 1/2) * ds (i from 1 to 4320) lat = -90 + (j - 1/2) * ds (j from 1 to 2160) ds = 1/12 degree Is there any way to override the internal grid description in the grib file? Or rewrite the grid description in the grib file? Has anybody seen this problem before? Thanks, Mike -- PLEASE NOTE - NEW E-MAIL ADDRESS: michael.zulauf at iberdrolaren.com Mike Zulauf Meteorologist, Lead Senior Wind Asset Management Iberdrola Renewables 1125 NW Couch, Suite 700 Portland, OR 97209 Office: 503-478-6304 Cell: 503-913-0403 Please be advised that email addresses for Iberdrola Renewables personnel have changed to first.last at iberdrolaREN.com effective Aug. 16, 2010. Please make a note. Thank you. 
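[Editorial aside] The cell-center registration Mike proposes can be checked with a few lines of arithmetic. A minimal sketch (plain Python; the constant and function names are just for illustration, using his 1-based `i`, `j` convention):

```python
# Cell-center coordinates for a 1/12-degree global lat-lon grid, per the
# formulas in the message above: centers sit on half-steps, so the first
# longitude is ds/2 = 0.041666... deg and the first latitude is
# -90 + ds/2 = -89.958333... deg, matching wgrib2's rounded 0.042 / -89.958.

NI, NJ = 4320, 2160   # grid dimensions reported by wgrib2
DS = 1.0 / 12.0       # grid spacing in degrees

def rtg_sst_hr_lon(i):
    """Longitude of column i (1-based), cells centered on half-steps."""
    return (i - 0.5) * DS

def rtg_sst_hr_lat(j):
    """Latitude of row j (1-based), counted from the south pole up."""
    return -90.0 + (j - 0.5) * DS

print(rtg_sst_hr_lon(1), rtg_sst_hr_lon(NI))   # 0.041666... 359.958333...
print(rtg_sst_hr_lat(1), rtg_sst_hr_lat(NJ))   # -89.958333... 89.958333...
```

The exact values (0.041666..., 359.958333...) round to the 0.042 and 359.958 that wgrib2 prints, which supports the half-cell interpretation rather than the truncated spacing of 0.083 stored in the GRIB header.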
This message is intended for the exclusive attention of the recipient(s) indicated. Any information contained herein is strictly confidential and privileged. If you are not the intended recipient, please notify us by return e-mail and delete this message from your computer system. Any unauthorized use, reproduction, alteration, filing or sending of this message and/or any attached files may lead to legal action being taken against the party(ies) responsible for said unauthorized use. Any opinion expressed herein is solely that of the author(s) and does not necessarily represent the opinion of the Company. The sender does not guarantee the integrity, speed or safety of this message, and does not accept responsibility for any possible damage arising from the interception, incorporation of viruses, or any other damage as a result of manipulation. From nbon0004 at um.edu.mt Mon Mar 7 11:56:44 2011 From: nbon0004 at um.edu.mt (Norbert Bonnici) Date: Mon, 7 Mar 2011 19:56:44 +0100 Subject: [Wrf-users] RSL CTL error In-Reply-To: References: Message-ID: thanks :) On Mon, Mar 7, 2011 at 9:47 AM, Thierry HEDDE wrote: > Hi Norbert, > > I just found that you are asking for a 2-day simulation, whereas your start > and end times give a 42-hour simulation. > I'm not experienced enough in the physics and dynamics parts to say > anything. > > Cordially > > 2011/3/3 Norbert Bonnici >> >> Dear all, >> I made a domain over Australia to simulate the Yasi tropical storm >> (1-2 Feb 2011). When I ran the simulation with this namelist.input I got >> this error (files are attached). >> Can someone help me, please? >> Regards >> -- >> Norbert Bonnici >> >> _______________________________________________ >> Wrf-users mailing list >> Wrf-users at ucar.edu >> http://mailman.ucar.edu/mailman/listinfo/wrf-users >> > > > > -- > Thierry HEDDE > Laboratoire de Modélisation des Transferts dans l'Environnement > CEA/CADARACHE > DEN/DTN/SMTM/LMTE > Bât.
307 Pièce 9 > 13108 ST PAUL LEZ DURANCE CEDEX > FRANCE > > -- Norbert Bonnici From bbrashers at Environcorp.com Tue Mar 8 11:17:07 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Tue, 8 Mar 2011 10:17:07 -0800 Subject: [Wrf-users] Recent smpar-only benchmarks? Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF308520691BB35@irvine01.irvine.environ.local> Has anyone done some recent smpar-only WRF benchmarking of these two systems: Intel Xeon X5660 (6 core, 2.8 GHz) AMD Opteron 6174 (12 core, 2.2 GHz) You can get twice as many cores from the AMD system compared to the Intel system, for about the same amount of money. Note I'm not interested in dmpar scaling, so only up to 24 (Intel) or 48 (AMD) cores per run. I typically have many 5-day chunks to run, so I use smpar only. Thanks, Bart This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. From df6.626 at gmail.com Tue Mar 8 08:20:21 2011 From: df6.626 at gmail.com (Dmitry Vinokurov) Date: Tue, 08 Mar 2011 20:20:21 +0500 Subject: [Wrf-users] WPS linking error, undefined reference to `_gfortran_pow_r4_i4' etc Message-ID: <4D764935.5010404@gmail.com> Hi! I'm trying to compile WRF for real data on CentOS 5.5 with gcc44. WRF was built successfully (after some digging and rebuilding netcdf with gcc44 from srpm), but WPS didn't.
According to the manual ( http://www.mmm.ucar.edu/wrf/users/docs/user_guide_V3/users_guide_chap2.htm#Build_WPS ), only plotfmt.exe and plotgrids.exe are missing; the other executables (geogrid.exe, metgrid.exe, ungrib.exe, and avg_tsfc.exe, calc_ecmwf_p.exe, g1print.exe, g2print.exe, height_ukmo.exe, mod_levs.exe, rd_intermediate.exe) were built successfully. As far as I can see from the build log, there are a couple of linker errors: ---------------- ... gfortran44 -o plotfmt.exe -g plotfmt.o read_met_module.o module_debug.o \ misc_definitions_module.o cio.o met_data_module.o \ -L/usr/lib64/ncarg -L/usr/lib/gcc/x86_64-redhat-linux6E/4.4.0/ -lgfortran -lg2c -lncarg -lncarg_gks -lncarg_c -L/usr/X11R6/lib -lX11 /usr/lib64/ncarg/libncarg.a(cpcldr.o): In function `cpcldr_': (.text+0x734): undefined reference to `_gfortran_pow_r4_i4' /usr/lib64/ncarg/libncarg.a(cpcldr.o): In function `cpcldr_': (.text+0x79e): undefined reference to `_gfortran_pow_r4_i4' /usr/lib64/ncarg/libncarg.a(cpcldr.o): In function `cpcldr_': (.text+0xd2c): undefined reference to `_gfortran_copy_string' ... gfortran44 -o plotgrids.exe -g module_map_utils.o module_debug.o cio.o constants_module.o misc_definitions_module.o \ plotgrids.o -L/usr/lib64/ncarg -L/usr/lib/gcc/x86_64-redhat-linux6E/4.4.0/ -lgfortran -lg2c -lncarg -lncarg_gks -lncarg_c -L/usr/X11R6/lib -lX11 /usr/lib64/ncarg/libncarg.a(mdpit.o): In function `mdpit_': (.text+0x7ec): undefined reference to `_gfortran_pow_r8_i4' /usr/lib64/ncarg/libncarg.a(mdpit.o): In function `mdpit_': (.text+0xdef): undefined reference to `_gfortran_pow_r8_i4' /usr/lib64/ncarg/libncarg.a(mdlndr.o): In function `mdlndr_': ... ---------------- As I understand, the _gfortran functions come from libgfortran.so.
Here are the results of searching for one such function: ---------------- [user at localhost ~]$ nm -D /usr/lib/gcc/x86_64-redhat-linux/4.1.2/libgfortran.so | grep _gfortran_pow_r4_i4 0000003e7c665c40 T _gfortran_pow_r4_i4 [user at localhost ~]$ nm -D /usr/lib/gcc/x86_64-redhat-linux6E/4.4.0/libgfortran.so | grep _gfortran_pow_r4_i4 [user at localhost ~]$ ---------------- It seems there is no such function in the 4.4 libs at all. Could anybody advise how I could fix these errors, or where they come from? Maybe I should rebuild the ncl packages with gcc44? -- Best regards, Dmitry Vinokurov +7 905 862 17 11 skype: d.a.vinokurov From francisco.salamanca at ciemat.es Tue Mar 8 02:29:13 2011 From: francisco.salamanca at ciemat.es (Salamanca Palou, Francisco) Date: Tue, 8 Mar 2011 10:29:13 +0100 Subject: [Wrf-users] (no subject) Message-ID: <19034_1299576554_4D75F6EA_19034_6655_1_AB6D2377B3C7434F88A329B01223649A166C8C@STRC.ciemat.es> Dear all, I am simulating with WRF V3.2.1 a fog episode of three days. The model overestimates the fog formation, and consequently the air temperature close to the ground is very low compared with measurements. How can I improve the results? Any help will be welcome. Thanks so much Francisco Salamanca Palou CIEMAT (Research Centre for Energy, Environment and Technology) Avenida Complutense 22 28040 (MADRID) SPAIN ---------------------------- Disclaimer: This message and its attached files is intended exclusively for its recipients and may contain confidential information.
If you received this e-mail in error you are hereby notified that any dissemination, copy or disclosure of this communication is strictly prohibited and may be unlawful. In this case, please notify us by a reply and delete this email and its contents immediately. ---------------------------- From francisco.salamanca at ciemat.es Tue Mar 8 02:33:35 2011 From: francisco.salamanca at ciemat.es (Salamanca Palou, Francisco) Date: Tue, 8 Mar 2011 10:33:35 +0100 Subject: [Wrf-users] problem with arwpost Liquid water content Message-ID: Hi Manish, The variable QCLOUD is the liquid water content (kg kg-1) and QVAPOR is the vapor water content (kg kg-1). Warm regards Francisco ________________________________ From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On behalf of manish soni Sent: Saturday, 5 March 2011 10:17 To: wrf-users at ucar.edu Subject: [Wrf-users] problem with arwpost Liquid water content Hi, How to derive Liquid Water Content (LWC) from the output of WRF model? -- Manish Soni Jaipur -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110308/3d5dc63a/attachment.html From gorangas at gmail.com Tue Mar 8 07:03:45 2011 From: gorangas at gmail.com (Goran Gasparac) Date: Tue, 8 Mar 2011 15:03:45 +0100 Subject: [Wrf-users] wrf exe stops without error In-Reply-To: References: Message-ID: Dear users, thanks everybody for the useful tips; after all, I didn't change anything and it's working now. Very strange?! In fact I have had that kind of error for quite a long time. Sometimes I need to start wrf.exe 5, 6, 10, even 15 times before WRF starts working. Every time it stopped (when it did stop), it was after calling RRTM; I don't know if this is connected. Hm... in the rsl* files (with debug option 5000) there isn't any sign of the word "error" or "warning". Has anybody had that kind of problem?
It is also strange that the WRF run (the run that stopped a few times and finally, after a few tries, started working) had excellent results, without any sign of the extreme values that would give a clue that WRF is breaking because of a possible CFL violation, a wrong parameter combination, etc. I also did a check on the met_em files with ncview, and everything is OK there as well. Any suggestion what could be wrong? Maybe the configuration or compilation? Just to recall, here is my old post: http://mailman.ucar.edu/pipermail/wrf-users/2010/001550.html Thanks, kind regards, Goran Gašparac -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110308/e657349b/attachment.html From yesubabu2006 at gmail.com Mon Mar 7 21:39:26 2011 From: yesubabu2006 at gmail.com (V.YesuBabu) Date: Tue, 8 Mar 2011 10:09:26 +0530 Subject: [Wrf-users] Contents of Wrf-users digest... Message-ID: Reply for: Wrf-users Digest, Vol 79, Issue 6 [ problem with "shifted" SST grid when using RTG_SST_ Zulauf, Michael ] Dear Mike, In the intermediate file format of WRF WPS, SST and the skin temperature of land are represented by the same parameter, "SKINTEMP". If you want to replace the GFS SST with satellite SST, you first need to bring the GFS skin temperature to 1/12-degree-resolution binary or ASCII data, and then substitute your RTG_SST_HR data at the same GFS SST grid points. Use the sample program and steps for ingesting other data in intermediate format in the WRF online tutorial: http://www.mmm.ucar.edu/wrf/OnLineTutorial/Basics/UNGRIB/index.html -- *********************************** V.Yesubabu, Project Engineer, CAS/SECG, C-DAC, Main Building, Pune University, Pune. India-411007. Phone: 020-25704226 -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110308/5b07d79a/attachment.html From kotroni at meteo.noa.gr Tue Mar 8 13:38:09 2011 From: kotroni at meteo.noa.gr (Vassiliki Kotroni) Date: Tue, 8 Mar 2011 22:38:09 +0200 Subject: [Wrf-users] amd processors and intel compilers Message-ID: <001801cbddd0$bd2907f0$377b17d0$@noa.gr> Dear colleagues, has anyone been able to run WRF on AMD processors with Intel Fortran compilers (dmpar)? I would be grateful for any feedback about possible incompatibilities. Thank you in advance, Vasso ---------------------------------------------------------------------------- -------- Dr. Vassiliki KOTRONI Institute of Environmental Research National Observatory of Athens Lofos Koufou, P. Pendeli, GR-15236 Athens, Greece Tel: +30 2 10 8109126 Fax: +30 2 10 8103236 Daily weather forecasts at: www.noa.gr/forecast (in English) www.meteo.gr (in Greek) www.eurometeo.gr -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110308/1a3c04a4/attachment-0001.html From andrew.robbie at gmail.com Tue Mar 8 15:16:22 2011 From: andrew.robbie at gmail.com (Andrew Robbie (Gmail)) Date: Wed, 9 Mar 2011 09:16:22 +1100 Subject: [Wrf-users] Recent smpar-only benchmarks? In-Reply-To: <1B8D1B9BF4DCDC4A90A42E312FF308520691BB35@irvine01.irvine.environ.local> References: <1B8D1B9BF4DCDC4A90A42E312FF308520691BB35@irvine01.irvine.environ.local> Message-ID: <1EEC4D3D-0C05-47D9-8478-75B768FCF091@gmail.com> On 09/03/2011, at 5:17 AM, Bart Brashers wrote: > Has anyone done some recent smpar-only WRF benchmarking of these two > systems: > > Intel Xeon X5660 (6 core, 2.8 GHz) > AMD Opteron 6174 (12 core, 2.2 GHz) > > You can get twice as many cores from the AMD system compared to the > Intel system, for about the same amount of money. I'd like to see some numbers on this too.
It is actually quite possible to get a quad-socket board populated with AMD 8-core parts (e.g. the 6134) for less than a dual-socket X5660 system. So 32 AMD Opteron cores for the same price as 12 Nehalem ones, because the AMD CPU has no price penalty for running quad-socket. However, I think the more relevant limitation is the available memory bandwidth. Each AMD socket addresses one four-way interleaved RAM bank, and each Intel socket has a three-way interleaved bank. However, especially in the case of the AMD 12-core, that bandwidth is being split among many cores. The AMD chip also has an unusual cache design, and each pair of cores has shared access to a pair of FPUs -- hard to guess how this affects real-world performance. cf. http://www.realworldtech.com/page.cfm?ArticleID=RWT082610181333&p=3 > Note I'm not interested in dmpar scaling, so only up to 24 (Intel) > or 48 > (AMD) cores per run. I typically have many 5-day chunks to run, so I > use smpar only. I think dmpar benchmarks might also be greatly influenced by other MPI-related tunables. Regards, Andrew From ebeigi3 at tigers.lsu.edu Tue Mar 8 15:37:00 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Tue, 8 Mar 2011 16:37:00 -0600 Subject: [Wrf-users] run an example Message-ID: Dear Sir/Madam, I want to run an example in the WRF model. I installed WRF version 3.2.1 and tried to run the example mentioned on page 89 of ARWUsersGuideV3.pdf (Real Data Test Case: 2000 January 24/12 through 25/12), but I couldn't run it because of this error: "Fortran 71 (integer divided by zero)". Does this example run only with WRF version 2.2? Is there any example I can run with WRF 3.2.1? Please let me know how I can find some examples for running WRF 3.2.1. Best Regards -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110308/a43f3801/attachment.html From ahsanshah01 at gmail.com Tue Mar 8 20:37:47 2011 From: ahsanshah01 at gmail.com (Ahsan Ali) Date: Wed, 9 Mar 2011 08:37:47 +0500 Subject: [Wrf-users] Vtable In-Reply-To: References: Message-ID: Dear, I am using GFS Grib2 data for a WRF run. I am a little confused about the use of variable tables. There are many variables present in the GFS data, so why are there just a few variables in Vtable.GFS (the default Vtable)? Also, there is a field for humidity in the Vtable as well as in the GFS data, but there is no humidity in the WRF output file. Files are attached. Please help me. Thanking you in advance. regards, -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/5be687d9/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: Vtable.GFS Type: application/octet-stream Size: 3679 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/5be687d9/attachment-0003.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: wrfout.ctl Type: application/octet-stream Size: 8766 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/5be687d9/attachment-0004.obj -------------- next part -------------- A non-text attachment was scrubbed...
Name: gfs_00z.ctl Type: application/octet-stream Size: 25046 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/5be687d9/attachment-0005.obj From ahsanshah01 at gmail.com Tue Mar 8 20:44:53 2011 From: ahsanshah01 at gmail.com (Ahsan Ali) Date: Wed, 9 Mar 2011 08:44:53 +0500 Subject: [Wrf-users] Winds Message-ID: Hello, I am running WRF 3.2.1 at 11km resolution for selected region. I am having problem while visualizing Winds at different pressure levels. while going down from 200hpa to 850hpa there is increasing white patch in some area showing no winds. The images are attached. Help is needed. Regards, -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: 44.jpg Type: image/jpeg Size: 28019 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0004.jpg -------------- next part -------------- A non-text attachment was scrubbed... Name: 11.jpg Type: image/jpeg Size: 27045 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0005.jpg -------------- next part -------------- A non-text attachment was scrubbed... Name: 22.jpg Type: image/jpeg Size: 28574 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0006.jpg -------------- next part -------------- A non-text attachment was scrubbed...
Name: 33.jpg Type: image/jpeg Size: 27566 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0007.jpg -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.wps Type: application/octet-stream Size: 742 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0002.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input Type: application/octet-stream Size: 4674 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0003.obj From bbrashers at Environcorp.com Tue Mar 8 16:09:03 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Tue, 8 Mar 2011 15:09:03 -0800 Subject: [Wrf-users] SNODAS in WRF? Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF308520691BEDC@irvine01.irvine.environ.local> Has anyone used the SNODAS data in WRF? It's available in netCDF format here: http://nsidc.org/data/polaris/. 1km resolution snow data, including snow depth and liquid water coverage. Any suggestions on the mechanics of getting it to work, or feedback on whether it helped the simulation, would be greatly appreciated. Bart
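[Editorial aside] Whatever the ingestion route, the unit bookkeeping for snow fields is worth spelling out: WRF's real/metgrid convention is SNOW as water equivalent in kg m-2 and SNOWH as physical depth in m. A minimal sketch (plain Python; the assumption that the source provides depth in metres and SWE in millimetres is illustrative, not a description of the actual SNODAS files):

```python
# Convert a (snow depth, snow water equivalent) pair into WRF's
# (SNOWH, SNOW) convention. 1 mm of liquid water spread over 1 m^2
# weighs 1 kg, so SWE expressed in mm is numerically equal to
# SNOW in kg m-2; depth in metres maps directly to SNOWH.

def snow_to_wrf(depth_m, swe_mm):
    """Return (SNOWH [m], SNOW [kg m-2]) from depth [m] and SWE [mm]."""
    snowh = depth_m       # physical depth, already in metres
    snow = swe_mm * 1.0   # mm of water -> kg m-2
    return snowh, snow

snowh, snow = snow_to_wrf(depth_m=0.50, swe_mm=120.0)
print(snowh, snow)  # 0.5 m of snowpack holding 120 kg m-2 of water
```

The implied bulk density (SNOW / SNOWH, here 240 kg m-3) is also a quick sanity check on any regridded snow analysis before handing it to real.exe.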
From geeta124 at hotmail.com Wed Mar 9 02:35:09 2011 From: geeta124 at hotmail.com (Geeta Geeta) Date: Wed, 9 Mar 2011 09:35:09 +0000 Subject: [Wrf-users] WRF 3.2 Model not running after 4hrs of Integration. Message-ID: Dear all, I was running the model on the IBM P570 machine with 3 nested domains at 27, 9 and 3 km. I had reported this problem earlier as well. The time step I have used is 120 sec. The model ran up to 6 hrs, after which it aborted, leaving core files. I am attaching the namelist.input file and the rsl files. bash-3.2$ ls -lrt wrfout* -rw-r--r-- 1 51017984 Mar 08 03:06 wrfout_d01_2010-08-06_00:00:00 -rw-r--r-- 1 176907272 Mar 08 03:06 wrfout_d02_2010-08-06_00:00:00 -rw-r--r-- 1 252611432 Mar 08 03:06 wrfout_d03_2010-08-06_00:00:00 bash-3.2$ bash-3.2$ ncdump -v Times wrfout_d01_2010-08-06_00:00:00 Times = "2010-08-06_00:00:00", "2010-08-06_01:00:00", "2010-08-06_02:00:00", "2010-08-06_03:00:00", "2010-08-06_04:00:00", "2010-08-06_05:00:00", "2010-08-06_06:00:00" ; } bash-3.2$ As suggested, I have tried to look for NaN in the wrfout files, but it does not exist!
bash-3.2$ ncdump wrfout_d01_2010-08-06_00:00:00 | grep -i NaN IVGTYP:description = "DOMINANT VEGETATION CATEGORY" ; ISLTYP:description = "DOMINANT SOIL CATEGORY" ; bash-3.2$ ncdump wrfout_d02_2010-08-06_00:00:00 | grep -i NaN IVGTYP:description = "DOMINANT VEGETATION CATEGORY" ; ISLTYP:description = "DOMINANT SOIL CATEGORY" ; bash-3.2$ ncdump wrfout_d03_2010-08-06_00:00:00 | grep -i NaN IVGTYP:description = "DOMINANT VEGETATION CATEGORY" ; ISLTYP:description = "DOMINANT SOIL CATEGORY" ; bash-3.2$ bash-3.2$ date Tue Mar 8 03:29:15 IST 2011 bash-3.2$ bash-3.2$ ls -lrt core* coredir.5: total 655848 -rw-r--r-- 1 335790224 Mar 08 03:07 core coredir.3: total 643664 -rw-r--r-- 1 329553200 Mar 08 03:07 core coredir.2: total 646352 -rw-r--r-- 1 330929600 Mar 08 03:07 core coredir.1: total 611240 -rw-r--r-- 1 312953392 Mar 08 03:07 core coredir.7: total 606376 -rw-r--r-- 1 310461104 Mar 08 03:07 core coredir.6: total 605224 -rw-r--r-- 1 309871168 Mar 07 05:15 core coredir.4: total 658528 -rw-r--r-- 1 337166176 Mar 08 03:07 core coredir.0: total 758952 -rw-r--r-- 1 388580736 Mar 08 03:07 core bash-3.2$ ls -lrt core* I have used GREP in the rsl files ash-3.2$ grep -i NAN rsl.error.0000 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 22 1 -NaNQ 5000.000000 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 49 1 NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.error.0001 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 65 1 -NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.error.0002 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 1 32 -NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.error.0003 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 63 32 -NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.error.0004 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 1 63 -NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.error.0005 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 63 81 -NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.error.0006 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 2 94 NaNQ 5000.000000 bash-3.2$ grep -i NAN 
rsl.error.0007 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 63 94 -NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.out.0000 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 22 1 -NaNQ 5000.000000 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 49 1 NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.out.0001 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 65 1 -NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.out.0002 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 1 32 -NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.out.0003 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 63 32 -NaNQ 5000.000000 bash-3.2$ grep -i NAN rsl.out.0004 WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 1 63 -NaNQ 5000.000000 I have decreased the time step to 90 sec but it has not helped. The model aborts after running successfully for about 2-3 hours. Please suggest. geeta Date: Fri, 4 Mar 2011 12:46:29 -0700 From: wrfhelp at ucar.edu To: geeta124 at hotmail.com Subject: Re: WRF 3.2 Model not running after 4hrs of Integration. Message body Can you use ncview to look at your wrfout files? You may need to do this for the outputs written right before the model stopped. Please check whether there is any weird data in those output files. Did you see any NaN when typing ncdump wrfout | grep -i Nan? On 3/3/2011 9:32 PM, Geeta Geeta wrote: Dear Sir, Thanks for the reply. I am running WRF 3.2 for the 3 domains at 27, 9 and 3 km for the 48-hr forecast on the IBM P570 machine. The operating system is AIX 5.3 and the code is compiled using xlf with POE in place. I had given the command bash$ time poe ../wrf.exe -procs 8. 1. The model gave a segmentation fault after 4 hrs of integration. I also plotted the model forecast for the 2nd and 3rd hour at 27 km resolution (Domain 1), so it showed me some data being plotted. But when I tried to plot the output for the 2nd and 3rd domains, it says ENTIRE GRID UNDEFINED. Kindly suggest. 2. >>>What data have I used?
I am using the .grib2 files as Initial and BC taken from ftp://ftpprd.ncep.noaa.gov.in at 1x1 degree resolution. 3. >>>What is the error message in your output file? Did you find any CFL violation? Your namelist.input looks fine. Please send us error messages in your run. I am attatching the rsl.error* and the rsl.out* files, namelist.input and namelist.wps files. It seems to me that there is no CFL violation. Thanks. geeta Date: Thu, 3 Mar 2011 17:20:41 -0700 From: wrfhelp at ucar.edu To: geeta124 at hotmail.com Subject: Re: WRF 3.2 Model not running after 4hrs of Integration. Geeta, What is the error message in your output file? Did you find any CFL violation? Your namelist.input looks fine. Please send us error messages in your run. On 3/3/2011 3:57 AM, Geeta Geeta wrote: Dear All, I am running WRF3.2 for 3 domains, at 27, 9 and 3kms. After integrating for 4hours, the model gives segmentation fault. and these directories are created and does not integrate beyond. bash-3.2$ ncdump -v Times wrfout_d01_2010-08-04_00:00:00 Times = "2010-08-04_00:00:00", "2010-08-04_01:00:00", "2010-08-04_02:00:00", "2010-08-04_03:00:00", "2010-08-04_04:00:00" ; } bash-3.2$ ncdump -v Times wrfout_d02_2010-08-04_00:00:00 Times = "2010-08-04_00:00:00", "2010-08-04_01:00:00", "2010-08-04_02:00:00", "2010-08-04_03:00:00", "2010-08-04_04:00:00" ; bash-3.2$ ncdump -v Times wrfout_d03_2010-08-04_00:00:00 Times = "2010-08-04_00:00:00", "2010-08-04_01:00:00", "2010-08-04_02:00:00", "2010-08-04_03:00:00", "2010-08-04_04:00:00" ; After integrating for 4hours, the model gives segmentation fault. and these directories are created and does not integrate beyond. drwxr-xr-x 2 256 Mar 03 04:20 coredir.5 drwxr-xr-x 2 256 Mar 03 04:20 coredir.3 drwxr-xr-x 2 256 Mar 03 04:20 coredir.2 drwxr-xr-x 2 256 Mar 03 04:20 coredir.1 Kindly help. The namelist.input file is also attached for reference. Geeta -------------- next part -------------- An HTML attachment was scrubbed... 
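[Editor's note: rather than grepping each rsl file by hand as in the session above, the whole check can be scripted. A minimal sketch, assuming the rsl.error.* and rsl.out.* files sit in the current run directory:

```shell
# Count lines containing "NaN" in every rsl log in one pass.
# Case-sensitive on purpose: a case-insensitive search also matches the
# "nan" inside "DOMINANT", which is why the ncdump greps above returned
# the harmless IVGTYP/ISLTYP description lines.
for f in rsl.error.* rsl.out.*; do
    [ -e "$f" ] || continue          # skip if the glob matched nothing
    hits=$(grep -c 'NaN' "$f")
    [ "$hits" -gt 0 ] && echo "$f: $hits NaN line(s)"
done
```

The same idea works for the NetCDF output, e.g. `ncdump wrfout_d01_... | grep -c 'NaN'`, without the false positives seen in the session above.]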
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/6593bffd/attachment.html From rcalmeida at terra.com.br Wed Mar 9 12:40:28 2011 From: rcalmeida at terra.com.br (Ricardo Almeida) Date: Wed, 9 Mar 2011 16:40:28 -0300 Subject: [Wrf-users] Wrf-users Digest, Vol 79, Issue 11 In-Reply-To: References: Message-ID: <969B40445D244E338F945DB910232A63@ricardoPC> Dear Syed, That happens because some isobaric surfaces are "below" the surface level. Therefore winds are not defined on them. You should use some post processing software that extrapolates winds to those surfaces. I suggest you check WRF documentation. Regards, Ricardo ----- Original Message ----- From: To: Sent: Wednesday, March 09, 2011 3:43 PM Subject: Wrf-users Digest, Vol 79, Issue 11 > Send Wrf-users mailing list submissions to > wrf-users at ucar.edu > > To subscribe or unsubscribe via the World Wide Web, visit > http://mailman.ucar.edu/mailman/listinfo/wrf-users > or, via email, send a message with subject or body 'help' to > wrf-users-request at ucar.edu > > You can reach the person managing the list at > wrf-users-owner at ucar.edu > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of Wrf-users digest..." > > > Today's Topics: > > 1. Winds (Ahsan Ali) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Wed, 9 Mar 2011 08:44:53 +0500 > From: Ahsan Ali > Subject: [Wrf-users] Winds > To: wrf-users at ucar.edu > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > Hello, > > I am running WRF 3.2.1 at 11km resolution for selected region. I am having > problem while visualizing Winds at different pressure levels. while going > down from 200hpa to 850hpa there is increasing white patch in some area > showing no winds. The images are attached. Help is needed. 
> > Regards, > > -- > Syed Ahsan Ali Bokhari > Electronic Engineer (EE) > > Research & Development Division > Pakistan Meteorological Department H-8/4, Islamabad. > Phone # off +92518358714 > Cell # +923155145014 > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: > http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment.html > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: 44.jpg > Type: image/jpeg > Size: 28019 bytes > Desc: not available > Url : > http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment.jpg > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: 11.jpg > Type: image/jpeg > Size: 27045 bytes > Desc: not available > Url : > http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0001.jpg > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: 22.jpg > Type: image/jpeg > Size: 28574 bytes > Desc: not available > Url : > http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0002.jpg > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: 33.jpg > Type: image/jpeg > Size: 27566 bytes > Desc: not available > Url : > http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0003.jpg > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: namelist.wps > Type: application/octet-stream > Size: 742 bytes > Desc: not available > Url : > http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment.obj > -------------- next part -------------- > A non-text attachment was scrubbed... 
> Name: namelist.input
> Type: application/octet-stream
> Size: 4674 bytes
> Desc: not available
> Url :
> http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/03f1edc5/attachment-0001.obj
>
> ------------------------------
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>
> End of Wrf-users Digest, Vol 79, Issue 11
> *****************************************

From agnes.mika at bmtargoss.com Thu Mar 10 00:52:07 2011
From: agnes.mika at bmtargoss.com (Agnes Mika)
Date: Thu, 10 Mar 2011 08:52:07 +0100
Subject: [Wrf-users] Winds
In-Reply-To: References: Message-ID: <20110310075207.GA16554@aggedor.argoss.nl>

Hallo Ahsan,

It looks like there are mountains in your region. ;-) So no wonder that you see no wind when the mountains are higher than the altitude of your pressure level (e.g., 850 hPa corresponds to roughly 1500 m). WRF does not produce forecasts under the ground level.

Greetings,
Agnes

Ahsan Ali wrote:
> Hello,
>
> I am running WRF 3.2.1 at 11km resolution for selected region. I am having
> problem while visualizing winds at different pressure levels. While going
> down from 200 hPa to 850 hPa there is an increasing white patch in some
> areas showing no winds. The images are attached. Help is needed.
>
> Regards,
>
> --
> Syed Ahsan Ali Bokhari
> Electronic Engineer (EE)
>
> Research & Development Division
> Pakistan Meteorological Department H-8/4, Islamabad.
> Phone # off +92518358714
> Cell # +923155145014
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users

--
Dr. Ágnes Mika
Advisor, Meteorology and Air Quality
Tel: +31 (0)527-242299 Fax: +31 (0)527-242016
Web: www.bmtargoss.com

BMT ARGOSS
P.O. Box 61, 8325 ZH Vollenhove
Voorsterweg 28, 8316 PT Marknesse
The Netherlands

Confidentiality Notice & Disclaimer
The contents of this e-mail and any attachments are intended for the use of the mail addressee(s) shown. If you are not that person, you are not allowed to take any action based upon it or to copy it, forward, distribute or disclose its contents and you should delete it from your system. BMT ARGOSS does not accept liability for any errors or omissions in the context of this e-mail or its attachments which arise as a result of internet transmission, nor accept liability for statements which are those of the author and clearly not made on behalf of BMT ARGOSS.

From agnes.mika at bmtargoss.com Thu Mar 10 01:01:30 2011
From: agnes.mika at bmtargoss.com (Agnes Mika)
Date: Thu, 10 Mar 2011 09:01:30 +0100
Subject: [Wrf-users] WRF 3.2 Model not running after 4hrs of Integration.
In-Reply-To: References: Message-ID: <20110310080130.GB16554@aggedor.argoss.nl>

Hallo Geeta,

The "WOULD GO OFF TOP" messages mean vertical CFL criteria violations. You do have to reduce your timestep further or decrease your vertical level spacing. It can happen (primarily in convective situations) that there are high vertical velocities, and your timestep is then too large to resolve them (hence the CFL error).

Where is your domain? If you are running the model for mountainous regions, these errors are even more likely to occur. In this case you can also opt for smoothing your terrain data first so that you get less steep slopes.
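[Editor's note: the timestep constraint described above can be sanity-checked with a rough back-of-the-envelope calculation. The sketch below uses illustrative values (27 km spacing, an assumed 60 m/s wind maximum), not numbers taken from the thread; the 6*dx figure is the commonly quoted WRF rule of thumb of roughly 6 seconds of timestep per km of grid spacing:

```shell
# Rough horizontal advective CFL check: the timestep must satisfy
# dt < dx / u_max. Vertical CFL violations, as in the rsl logs above,
# depend on vertical velocity and level spacing instead.
dx_m=27000   # grid spacing in metres (example value for a 27 km domain)
u_max=60     # assumed maximum wind speed in m/s
awk -v dx="$dx_m" -v u="$u_max" 'BEGIN {
    printf "advective limit: %.0f s; 6*dx rule of thumb: %.0f s\n", dx/u, 6 * dx / 1000
}'
```

Treat this only as a first screen before reducing the timestep; the model's actual stability limit is stricter.]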
For instance, when I did simulations for Switzerland at 1 km resolution, without terrain smoothing, I had to use a timestep of 1 second to avoid violations of the CFL criterion - which makes the model very slow... In most cases you can re-set the timestep to a larger value after several hours (so to speed up your model run you could consider writing restart files: run the first, say, 5 hours with your original 120 sec timestep, then run as long as needed with a much shorter timestep, then the rest of the simulation again with a larger timestep).

Hope this helps,
Agnes

Geeta Geeta wrote:
>
> Dear all,
> I was running the model on the IBM P570 machine with 3 nested domains,
> 27, 9 and 3 km. This problem I had reported earlier as well. The time step
> I have used is 120 sec. The model was running up to 6 hrs, after which it
> aborted, leaving core files.
>
> I am attaching the namelist.input file and the rsl files.
>
> bash-3.2$ ls -lrt wrfout*
> -rw-r--r-- 1 51017984 Mar 08 03:06 wrfout_d01_2010-08-06_00:00:00
> -rw-r--r-- 1 176907272 Mar 08 03:06 wrfout_d02_2010-08-06_00:00:00
> -rw-r--r-- 1 252611432 Mar 08 03:06 wrfout_d03_2010-08-06_00:00:00
> bash-3.2$
> bash-3.2$ ncdump -v Times wrfout_d01_2010-08-06_00:00:00
> Times =
> "2010-08-06_00:00:00",
> "2010-08-06_01:00:00",
> "2010-08-06_02:00:00",
> "2010-08-06_03:00:00",
> "2010-08-06_04:00:00",
> "2010-08-06_05:00:00",
> "2010-08-06_06:00:00" ;
> }
> bash-3.2$
> AS SUGGESTED, I have tried to look for NaN in the wrfout files but it does not exist!
> > bash-3.2$ ncdump wrfout_d01_2010-08-06_00:00:00 | grep -i NaN > IVGTYP:description = "DOMINANT VEGETATION CATEGORY" ; > ISLTYP:description = "DOMINANT SOIL CATEGORY" ; > bash-3.2$ ncdump wrfout_d02_2010-08-06_00:00:00 | grep -i NaN > IVGTYP:description = "DOMINANT VEGETATION CATEGORY" ; > ISLTYP:description = "DOMINANT SOIL CATEGORY" ; > bash-3.2$ ncdump wrfout_d03_2010-08-06_00:00:00 | grep -i NaN > IVGTYP:description = "DOMINANT VEGETATION CATEGORY" ; > ISLTYP:description = "DOMINANT SOIL CATEGORY" ; > bash-3.2$ > bash-3.2$ date > Tue Mar 8 03:29:15 IST 2011 > bash-3.2$ > > bash-3.2$ ls -lrt core* > coredir.5: > total 655848 > -rw-r--r-- 1 335790224 Mar 08 03:07 core > > coredir.3: > total 643664 > -rw-r--r-- 1 329553200 Mar 08 03:07 core > > coredir.2: > total 646352 > -rw-r--r-- 1 330929600 Mar 08 03:07 core > > coredir.1: > total 611240 > -rw-r--r-- 1 312953392 Mar 08 03:07 core > > coredir.7: > total 606376 > -rw-r--r-- 1 310461104 Mar 08 03:07 core > > coredir.6: > total 605224 > -rw-r--r-- 1 309871168 Mar 07 05:15 core > > coredir.4: > total 658528 > -rw-r--r-- 1 337166176 Mar 08 03:07 core > > coredir.0: > total 758952 > -rw-r--r-- 1 388580736 Mar 08 03:07 core > bash-3.2$ ls -lrt core* > > I have used GREP in the rsl files > > > > > ash-3.2$ grep -i NAN > rsl.error.0000 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 22 1 -NaNQ 5000.000000 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 49 1 NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.error.0001 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 65 1 -NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.error.0002 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 1 32 -NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.error.0003 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 63 32 -NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.error.0004 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 1 63 -NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.error.0005 > > WOULD GO OFF TOP: > 
KF_ETA_PARA I,J,DPTHMX,DPMIN 63 81 -NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.error.0006 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 2 94 NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.error.0007 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 63 94 -NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.out.0000 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 22 1 -NaNQ 5000.000000 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 49 1 NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.out.0001 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 65 1 -NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.out.0002 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 1 32 -NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.out.0003 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 63 32 -NaNQ 5000.000000 > > bash-3.2$ grep -i NAN > rsl.out.0004 > > WOULD GO OFF TOP: > KF_ETA_PARA I,J,DPTHMX,DPMIN 1 63 -NaNQ 5000.000000 > > > I have Decreases the TIME step to 90sec but It has not helped me. The model aborts after running successfully for about 2-3 hours. > PLS suggest. > > geeta > > > > Date: Fri, 4 Mar 2011 12:46:29 -0700 > From: wrfhelp at ucar.edu > To: geeta124 at hotmail.com > Subject: Re: WRF 3.2 Model not running after 4hrs of Integration. > > > > > > > > Message body > > > Can you use ncview to look at your wrfout files? You may need to do > to those outputs that are right before model stopped. Please check > whether there is any weird data in those output files? > > > > Did you see any NaN when typing ncdump wrfout | grep -i Nan? > > > > > > On 3/3/2011 9:32 PM, Geeta Geeta wrote: > > > Dear Sir, > > Thanks for the reply. I am running WRf 3.2 for the 3 domains 27, 9 > and 3 km for the 48hrs Forecast on the IBM P570 Machine. The > Operating system is AIX5.3 and the code is compiled using xlf with > POE in place. > > I had given the command bash$ time poe ../wrf.exe -procs 8. > > > > 1. 
The model gave segmentation fault after 4hrs of Integration. I > also plotted the Model forecast for 2nd and 3rd hour at 27km > resolution (Domain 1) So it showed me some data being plotted > > But when I tried to plot the output for the 2nd and 3rd > domain, IT says ENTIRE GRID UNDEFINED. Kindly suggest. > > > > > 2. >>>What data I have used ?? > > I am using the .grib2 files as Initial and BC taken from > ftp://ftpprd.ncep.noaa.gov.in at 1x1 degree resolution. > > > > 3. >>>What is the error message in your output file? Did > you find any CFL violation? Your namelist.input looks fine. > Please send us error messages in your run. > > I am attatching the rsl.error* and the rsl.out* files, > namelist.input and namelist.wps files. It seems to me > that there is no CFL violation. > > > > Thanks. > > geeta > > > > > > > > Date: Thu, 3 Mar 2011 17:20:41 -0700 > > From: wrfhelp at ucar.edu > > To: geeta124 at hotmail.com > > Subject: Re: WRF 3.2 Model not running after 4hrs of Integration. > > > > Geeta, > > > > What is the error message in your output file? Did you find any > CFL violation? > > > > Your namelist.input looks fine. Please send us error messages in > your run. > > > > > > On 3/3/2011 3:57 AM, Geeta Geeta wrote: > > Dear All, > > I am running WRF3.2 for 3 domains, at 27, 9 and 3kms. After > integrating for 4hours, the model gives segmentation fault. and > these directories are created and does not integrate beyond. 
> > > > bash-3.2$ ncdump -v Times wrfout_d01_2010-08-04_00:00:00 > > Times = > > "2010-08-04_00:00:00", > > "2010-08-04_01:00:00", > > "2010-08-04_02:00:00", > > "2010-08-04_03:00:00", > > "2010-08-04_04:00:00" ; > > } > > > > bash-3.2$ ncdump -v Times wrfout_d02_2010-08-04_00:00:00 > > Times = > > "2010-08-04_00:00:00", > > "2010-08-04_01:00:00", > > "2010-08-04_02:00:00", > > "2010-08-04_03:00:00", > > "2010-08-04_04:00:00" ; > > > > bash-3.2$ ncdump -v Times wrfout_d03_2010-08-04_00:00:00 > > Times = > > "2010-08-04_00:00:00", > > "2010-08-04_01:00:00", > > "2010-08-04_02:00:00", > > "2010-08-04_03:00:00", > > "2010-08-04_04:00:00" ; > > > > After integrating for 4hours, the model gives segmentation > fault. and these directories are created and does not integrate > beyond. > > drwxr-xr-x 2 256 Mar 03 04:20 coredir.5 > > drwxr-xr-x 2 256 Mar 03 04:20 coredir.3 > > drwxr-xr-x 2 256 Mar 03 04:20 coredir.2 > > drwxr-xr-x 2 256 Mar 03 04:20 coredir.1 > > > > Kindly help. The namelist.input file is also attached for > reference. > > > > Geeta > > > > > > > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users -- Dr. ?gnes Mika Advisor, Meteorology and Air Quality Tel: +31 (0)527-242299 Fax: +31 (0)527-242016 Web: www.bmtargoss.com BMT ARGOSS P.O. Box 61, 8325 ZH Vollenhove Voorsterweg 28, 8316 PT Marknesse The Netherlands Confidentiality Notice & Disclaimer The contents of this e-mail and any attachments are intended for the use of the mail addressee(s) shown. If you are not that person, you are not allowed to take any action based upon it or to copy it, forward, distribute or disclose its contents and you should delete it from your system. 
BMT ARGOSS does not accept liability for any errors or omissions in the context of this e-mail or its attachments which arise as a result of internet transmission, nor accept liability for statements which are those of the author and clearly not made on behalf of BMT ARGOSS. From bbrashers at Environcorp.com Wed Mar 9 12:27:44 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Wed, 9 Mar 2011 11:27:44 -0800 Subject: [Wrf-users] Winds In-Reply-To: References: Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF308520697471C@irvine01.irvine.environ.local> The white areas are where that pressure level is below the ground. 850 hPa is about 1350 meters above sea level, so those areas have no data at that pressure level. Bart From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Ahsan Ali Sent: Tuesday, March 08, 2011 7:45 PM To: wrf-users at ucar.edu Subject: [Wrf-users] Winds Hello, I am running WRF 3.2.1 at 11km resolution for selected region. I am having problem while visualizing Winds at different pressure levels. while going down from 200hpa to 850hpa there is increasing white patch in some area showing no winds. The images are attached. Help is needed. Regards, -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. -------------- next part -------------- An HTML attachment was scrubbed... 
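[Editor's note: both replies quote a rough height for the 850 hPa surface (about 1350-1500 m). That figure can be reproduced from the standard-atmosphere (ISA) barometric formula. This is only a sketch: real pressure-level heights vary with the actual temperature profile, which is the whole point of looking at model output.

```shell
# Height of a pressure level in the ICAO standard atmosphere:
#   z = (T0/L) * (1 - (p/p0)^(R*L/(g*M)))
p_hpa=850
awk -v p="$p_hpa" 'BEGIN {
    T0 = 288.15     # sea-level temperature, K
    L  = 0.0065     # tropospheric lapse rate, K/m
    p0 = 1013.25    # sea-level pressure, hPa
    k  = 0.190263   # R*L/(g*M), dimensionless
    z  = (T0 / L) * (1 - (p / p0) ^ k)
    printf "%.0f hPa sits near %.0f m above sea level (ISA)\n", p, z
}'
```

For 850 hPa this gives roughly 1460 m, consistent with the 1350-1500 m figures quoted in the thread: any terrain higher than that pokes through the pressure surface, leaving undefined values there.]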
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/f2c8001d/attachment-0001.html

From bbrashers at Environcorp.com Mon Mar 14 13:08:49 2011
From: bbrashers at Environcorp.com (Bart Brashers)
Date: Mon, 14 Mar 2011 12:08:49 -0700
Subject: [Wrf-users] Benchmarking problems
Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF30852069D17E8@irvine01.irvine.environ.local>

I'm trying to benchmark WRF on two comparable systems, Intel X5660 and AMD 6174, before I buy. I'm also trying to do a benchmark for those of us who have many 5-day WRF runs to complete -- many runs with relatively low core counts, rather than a single run with large core counts (the focus of most benchmarks).

I downloaded the WRF 3.0 benchmark parts from http://www.mmm.ucar.edu/wrf/WG2/bench/. I compiled using option 2 (smpar for PGI/gcc) with no problems. In the namelist.input I specified (for one particular run):

&domains
 ...snip...
 numtiles = 1,
 nproc_x = 3,
 nproc_y = 2,
 num_metgrid_levels = 40,
/

I set OMP_NUM_THREADS to 6 in my run script that calls wrf.exe. And yet, when I look in the resulting wrf.out file I see:

WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 6
WRF NUMBER OF TILES = 6

Hey! I told you to use 1 tile and split it 3 by 2! Is this a problem with WRF v3.0? Looking at some WRF 3.2.1 runs where I have numtiles = 1 and specified 4 by 1, I got more verbose output like "WRF TILE 1 IS 1 IE 165 JS 1 JE 33". Scaling is poor after only 4 cores, so I suspect something is going wrong.

Any suggestions you have would be greatly appreciated.

Bart

From claudiomet at gmail.com Wed Mar 9 13:20:24 2011
From: claudiomet at gmail.com (claudiomet)
Date: Wed, 9 Mar 2011 17:20:24 -0300
Subject: [Wrf-users] Winds
In-Reply-To: References: Message-ID:

Is there topography in the blank areas, mountains perhaps? Look at the wind vectors around the blank areas; their behavior looks as if it is affected by topography.

Cheers!

2011/3/9 Ahsan Ali :
> Hello,
> I am running WRF 3.2.1 at 11km resolution for selected region. I am having
> problem while visualizing winds at different pressure levels. While going
> down from 200 hPa to 850 hPa there is an increasing white patch in some
> areas showing no winds. The images are attached. Help is needed.
> Regards,
>
> --
> Syed Ahsan Ali Bokhari
> Electronic Engineer (EE)
> Research & Development Division
> Pakistan Meteorological Department H-8/4, Islamabad.
> Phone?#?off ?+92518358714 > Cell # +923155145014 > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- Claudio Cortes +56 (2) 2994121 Meteorologo Laboratorio de Inform?tica Ambiental (LIA) Unidad de Modelacion y Gestion de la Calidad del Aire (UMGCA) Centro Nacional del Medio Ambiente (CENMA) -- Claudio Cortes +56 (2) 2994121 Meteorologist Laboratory Research, Innovation and Applied Informatics Modeling and Air Quality Management Unit National Enviroment Center, Chile (CENMA) From claudiomet at gmail.com Wed Mar 9 13:22:57 2011 From: claudiomet at gmail.com (claudiomet) Date: Wed, 9 Mar 2011 17:22:57 -0300 Subject: [Wrf-users] Vtable In-Reply-To: References: Message-ID: Add diagnostic fields in your ARWpost namelist &io io_form_input = 2, input_root_name = 'INPUT_FILE', output_root_name = 'OUTPUT_FILE', plot = 'all_list' fields = 'height,geopt,theta,tc,tk,td,td2,rh,rh2,umet,vmet,pressure,u10m,v10m,wdir,wspd,wd10,ws10,slp,mcape,mcin,lcl,lfc,cape,cin,dbz,max_dbz,clfr' output_type = 'grads' mercator_defs = .true. / Cheers ! 2011/3/9 Ahsan Ali : > > Dear, > ?I am using GFS Grib2 data for WRF run. I am little confused about the use > of variable tables. In GFS data there are many variables present then why > ?there are just few variables in Vtable.GFS (the default Vtable). Also there > is field regarding humidity in the Vtable as well as in the GFS data but > there is no humidity in the wrf output file. Files are attached. Please help > me. > Thanking you in advance. > regards, > > > > -- > Syed Ahsan Ali Bokhari > Electronic Engineer (EE) > Research & Development Division > Pakistan Meteorological Department H-8/4, Islamabad. 
> Phone # off +92518358714
> Cell # +923155145014
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users

--
Claudio Cortes
+56 (2) 2994121
Meteorólogo
Laboratorio de Informática Ambiental (LIA)
Unidad de Modelación y Gestión de la Calidad del Aire (UMGCA)
Centro Nacional del Medio Ambiente (CENMA)

--
Claudio Cortes
+56 (2) 2994121
Meteorologist
Laboratory Research, Innovation and Applied Informatics
Modeling and Air Quality Management Unit
National Environment Center, Chile (CENMA)

From claudiomet at gmail.com Thu Mar 10 05:20:02 2011
From: claudiomet at gmail.com (claudiomet)
Date: Thu, 10 Mar 2011 09:20:02 -0300
Subject: [Wrf-users] Winds
In-Reply-To: References: Message-ID:

Set extrapolate = .true. in your ARWpost namelist to extrapolate variables below ground. In my experience the extrapolation is not accurate, because the variables lose the modification imposed by the topography.

Cheers!

2011/3/10 Ahsan Ali :
> Yeah, that area is hilly, with high mountains (upper areas of Pakistan).
> But we are using another model, HRM (High Resolution Regional Model), at
> 11 km, that shows winds at all levels in this area. So is there any way
> to change the topography? Also there is a large difference between the
> model height and the actual height of this area.
>
> On Thu, Mar 10, 2011 at 1:20 AM, claudiomet wrote:
>> Is there topography in the blank areas, mountains perhaps?
>> Look at the wind vectors around the blank areas; the vector behavior
>> looks as if it is affected by topography.
>> Cheers!
>>
>> 2011/3/9 Ahsan Ali :
>> > Hello,
>> > I am running WRF 3.2.1 at 11km resolution for selected region. I am
>> > having problem while visualizing winds at different pressure levels.
>> > While going down from 200 hPa to 850 hPa there is an increasing white
>> > patch in some areas showing no winds. The images are attached. Help is needed.
>> > Regards, >> > >> > -- >> > Syed Ahsan Ali Bokhari >> > Electronic Engineer (EE) >> > Research & Development Division >> > Pakistan Meteorological Department H-8/4, Islamabad. >> > Phone?#?off ?+92518358714 >> > Cell # +923155145014 >> > >> > >> > _______________________________________________ >> > Wrf-users mailing list >> > Wrf-users at ucar.edu >> > http://mailman.ucar.edu/mailman/listinfo/wrf-users >> > >> > >> >> >> >> -- >> Claudio Cortes >> +56 (2) 2994121 >> >> Meteorologo >> Laboratorio de Inform?tica Ambiental (LIA) >> Unidad de Modelacion y Gestion de la Calidad del Aire (UMGCA) >> Centro Nacional del Medio Ambiente (CENMA) >> >> -- >> Claudio Cortes >> +56 (2) 2994121 >> >> Meteorologist >> Laboratory Research, Innovation and Applied Informatics >> Modeling and Air Quality Management Unit >> National Enviroment Center, Chile (CENMA) > > > > -- > Syed Ahsan Ali Bokhari > Electronic Engineer (EE) > Research & Development Division > Pakistan Meteorological Department H-8/4, Islamabad. > Phone?#?off ?+92518358714 > Cell # +923155145014 > > -- Claudio Cortes +56 (2) 2994121 Meteorologo Laboratorio de Inform?tica Ambiental (LIA) Unidad de Modelacion y Gestion de la Calidad del Aire (UMGCA) Centro Nacional del Medio Ambiente (CENMA) -- Claudio Cortes +56 (2) 2994121 Meteorologist Laboratory Research, Innovation and Applied Informatics Modeling and Air Quality Management Unit National Enviroment Center, Chile (CENMA) From ebeigi3 at tigers.lsu.edu Fri Mar 11 15:34:07 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Fri, 11 Mar 2011 16:34:07 -0600 Subject: [Wrf-users] running an Idealized test case em_b_wave Message-ID: I have this problem when i want to run an idealized test case (em_b_wave), I have ifort and Icc compiler. what should i do? 
( cd main ; make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/ehsan/WRFV3/external/ esmf_time_f90 -I/home/ehsan/WRFV3/main -I/home/ehsan/WRFV3/external/io_netcdf -I/home/ehsan/WRFV3/external/io_int -I/home/ehsan/WRFV3/frame -I/home/ehsan/WRFV3/share -I/home/ehsan/WRFV3/phys -I/home/ehsan/WRFV3/chem -I/home/ehsan/WRFV3/inc -I/home/ehsan/netcdf3/include " SOLVER=em IDEAL_CASE=b_wave em_ideal ) make[1]: Entering directory `/home/ehsan/WRFV3/main' ( cd ../dyn_em ; make -i -r module_initialize_b_wave.o ) make[2]: Entering directory `/home/ehsan/WRFV3/dyn_em' make[2]: `module_initialize_b_wave.o' is up to date. make[2]: Leaving directory `/home/ehsan/WRFV3/dyn_em' ranlib libwrflib.a ifort -o ideal.exe -O3 -w -ftz -align all -fno-alias -fp-model precise -FR -convert big_endian -ip ideal.o ../dyn_em/module_initialize_b_wave.o libwrflib.a /home/ehsan/WRFV3/external/fftpack/fftpack5/libfftpack.a /home/ehsan/WRFV3/external/io_grib1/libio_grib1.a /home/ehsan/WRFV3/external/io_grib_share/libio_grib_share.a /home/ehsan/WRFV3/external/io_int/libwrfio_int.a /home/ehsan/WRFV3/external/esmf_time_f90/libesmf_time.a /home/ehsan/WRFV3/external/RSL_LITE/librsl_lite.a /home/ehsan/WRFV3/frame/module_internal_header_util.o /home/ehsan/WRFV3/frame/pack_utils.o /home/ehsan/WRFV3/external/io_netcdf/libwrfio_nf.a -L/home/ehsan/netcdf3/lib -lnetcdf make[1]: ifort: Command not found make[1]: [em_ideal] Error 127 (ignored) make[1]: Leaving directory `/home/ehsan/WRFV3/main' ( cd test/em_b_wave ; /bin/rm -f wrf.exe ; ln -s ../../main/wrf.exe . ) ( cd test/em_b_wave ; /bin/rm -f ideal.exe ; ln -s ../../main/ideal.exe . ) ( cd test/em_b_wave ; /bin/rm -f README.namelist ; ln -s ../../run/README.namelist . ) ( cd test/em_b_wave ; /bin/rm -f gribmap.txt ; ln -s ../../run/gribmap.txt . ) ( cd test/em_b_wave ; /bin/rm -f grib2map.tbl ; ln -s ../../run/grib2map.tbl . ) ( cd run ; /bin/rm -f ideal.exe ; ln -s ../main/ideal.exe . 
)
( cd run ; if test -f namelist.input ; then \
 /bin/cp -f namelist.input namelist.input.backup ; fi ; \
 /bin/rm -f namelist.input ; ln -s ../test/em_b_wave/namelist.input . )
( cd run ; /bin/rm -f input_jet ; ln -s ../test/em_b_wave/input_jet . )
build started: Thu Mar 10 18:36:50 CST 2011
build completed: Thu Mar 10 18:36:56 CST 2011

After typing ./ideal.exe in the WRFV3/main dir, I see this error:

-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: LINE: 6919
ERROR OPENING namelist.input

What should I do?

Best Regards

--
*Ehsan Beigi*
*PhD Student*
*Department of Civil and Environmental Engineering
2408 Patrick F. Taylor Hall
Louisiana State University
Baton Rouge, LA, 70803*
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110311/7fd35435/attachment-0001.html

From ebeigi3 at tigers.lsu.edu Sun Mar 13 11:35:14 2011
From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi)
Date: Sun, 13 Mar 2011 12:35:14 -0500
Subject: [Wrf-users] running an idealized test case
In-Reply-To: References: <672E3C46-C28B-4A06-A3D1-0136FF09A707@ucar.edu> <5947CCA8-ADC0-4A0E-9209-C140A22D3776@ucar.edu>
Message-ID:

Dear Sir/Madam,

Thanks for your previous help. I installed the ifort versions (l_fcompxe_ia32_2011.2.137.tgz, l_fc_p_10.0.023.tar.gz, and also l_fc_p_10.1.026_ia32.tar.gz) and configured WRF with each of these three kinds of ifort. After compiling em_real and also em_b_wave, I didn't see any error in compile.log:

ar: creating ./libio_grib_share.a
ar: creating ../libio_grib1.a
ar: creating libesmf_time.a
ar: creating libfftpack.a
registry.c(22): warning #1079: return type of function "main" must be "int"
  main( int argc, char *argv[], char *env[] )
  ^
registry.c(59): warning #266: function "sym_forget" declared implicitly
  sym_forget() ;
  ^
registry.c(131): warning #266: function "gen_io_boilerplate" declared implicitly
  gen_io_boilerplate() ; /* 20091213 jm.
Generate the io_boilerplate_temporary.inc file */ ^ registry.c(133): warning #266: function "init_parser" declared implicitly init_parser() ; ^ registry.c(156): warning #266: function "pre_parse" declared implicitly if ( pre_parse( dir, fp_in, fp_tmp ) ) { ^ registry.c(176): warning #266: function "check_dimspecs" declared implicitly check_dimspecs() ; ^ registry.c(186): warning #266: function "gen_actual_args_new" declared implicitly gen_actual_args_new( "inc" ) ; ^ registry.c(188): warning #266: function "gen_dummy_args_new" declared implicitly gen_dummy_args_new( "inc" ) ; ^ registry.c(190): warning #266: function "gen_dummy_decls_new" declared implicitly gen_dummy_decls_new( "inc" ) ; ^ registry.c(192): warning #266: function "gen_namelist_statements" declared implicitly gen_namelist_statements("inc") ; ^ registry.c(202): warning #266: function "gen_nest_interp" declared implicitly gen_nest_interp( "inc" ) ; ^ registry.c(204): warning #266: function "gen_streams" declared implicitly gen_streams("inc") ; ^ registry.c(207): warning #266: function "gen_comms" declared implicitly gen_comms( "inc" ) ; /* this is either package supplied (by copying a */ ^ reg_parse.c(227): warning #266: function "tolower" declared implicitly x = tolower(tokens[F_DIMS][i]) ; ^ reg_parse.c(292): warning #177: label "normal" was declared but never referenced normal: ^ reg_parse.c(453): warning #266: function "tolower" declared implicitly if ( tolower(tokens[FIELD_STAG][i]) == 'x' || sw_all_x_staggered ) field_struct->stag_x = 1 ; ^ reg_parse.c(454): warning #266: function "tolower" declared implicitly if ( tolower(tokens[FIELD_STAG][i]) == 'y' || sw_all_y_staggered ) field_struct->stag_y = 1 ; ^ reg_parse.c(455): warning #266: function "tolower" declared implicitly if ( tolower(tokens[FIELD_STAG][i]) == 'z' ) field_struct->stag_z = 1 ; ^ reg_parse.c(474): warning #266: function "tolower" declared implicitly x = tolower(tmp[i]) ; ^ reg_parse.c(514): warning #266: function "tolower" 
declared implicitly x = tolower(tokens[FIELD_IO][i]) ; ^ misc.c(175): warning #1011: missing return statement at end of non-void function "range_of_dimension" } ^ misc.c(217): warning #592: variable "zdex" is used before its value is set sprintf(tmp,"%ssm3%d,%ssm3%d,1,1", r,bdex,r,zdex ) ; ^ misc.c(321): warning #1011: missing return statement at end of non-void function "get_elem" } ^ misc.c(423): warning #1011: missing return statement at end of non-void function "close_the_file" } ^ misc.c(430): warning #266: function "getpid" declared implicitly sprintf(tempfile,"regtmp1%d",getpid()) ; ^ misc.c(444): warning #266: function "getpid" declared implicitly sprintf(tempfile,"regtmp1%d",getpid()) ; ^ misc.c(462): warning #266: function "toupper" declared implicitly for ( p = str ; *p ; p++ ) *p = toupper(*p) ; ^ misc.c(472): warning #266: function "tolower" declared implicitly for ( p = str ; *p ; p++ ) *p = tolower(*p) ; ^ misc.c(645): warning #1011: missing return statement at end of non-void function "dimension_size_expression" } ^ gen_allocs.c(73): warning #1011: missing return statement at end of non-void function "get_count_for_alloc" } ^ gen_allocs.c(109): warning #266: function "make_upper_case" declared implicitly make_upper_case(dname_tmp) ; ^ gen_scalar_indices.c(197): warning #266: function "make_lower_case" declared implicitly make_lower_case(fname) ; ^ gen_config.c(135): warning #266: function "sym_forget" declared implicitly sym_forget() ; ^ gen_config.c(167): warning #266: function "toupper" declared implicitly fputc(toupper(*i),fp); ^ gen_config.c(172): warning #266: function "toupper" declared implicitly fputc(toupper(*i),fp); ^ gen_config.c(178): warning #266: function "toupper" declared implicitly fputc(toupper(*i),fp); ^ gen_config.c(409): warning #266: function "sym_forget" declared implicitly sym_forget() ; ^ sym.c(73): warning #266: function "create_ht" declared implicitly create_ht( &symtab ) ; ^ sym.c(77): warning #266: function "exit" 
declared implicitly exit(1) ; ^ sym.c(153): warning #266: function "create_ht" declared implicitly create_ht( &symtab ) ; ^ sym.c(157): warning #266: function "exit" declared implicitly exit(1) ; ^ symtab_gen.c(62): warning #266: function "hash" declared implicitly index = hash( name ) ; ^ gen_comms.c(157): warning #1011: missing return statement at end of non-void function "print_4d_i1_decls" } ^ gen_comms.c(196): warning #1011: missing return statement at end of non-void function "print_decl" } ^ gen_comms.c(206): warning #1011: missing return statement at end of non-void function "print_body" } ^ gen_comms.c(248): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(266): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(985): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(1178): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(1312): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(1454): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(1672): warning #268: the format string ends before this argument sprintf(fname,"shift_halo",*direction) ; ^ gen_comms.c(1949): warning #1011: missing return statement at end of non-void function "gen_shift" } ^ gen_comms.c(2428): warning #1011: missing return statement at end of non-void function "gen_debug" } ^ set_dim_strs.c(145): warning #1011: missing return statement at end of non-void function "set_dim_strs" } ^ set_dim_strs.c(153): warning #1011: missing return statement at end of non-void function "set_dim_strs2" } ^ set_dim_strs.c(159): warning #1011: missing return statement at end of non-void function "set_dim_strs3" } ^ gen_wrf_io.c(36): warning #266: function "sym_forget" declared implicitly 
OP_F(fp,"wrf_bdyout.inc") ; ^ gen_wrf_io.c(452): warning #266: function "make_upper_case" declared implicitly make_upper_case(dname) ; ^ gen_streams.c(25): warning #266: function "gen_io_domain_defs" declared implicitly gen_io_domain_defs( fp ) ; ^ gen_streams.c(33): warning #266: function "gen_set_timekeeping_defs" declared implicitly gen_set_timekeeping_defs( fp ) ; ^ gen_streams.c(41): warning #266: function "gen_set_timekeeping_alarms" declared implicitly gen_set_timekeeping_alarms( fp ) ; ^ gen_streams.c(49): warning #266: function "gen_io_form_for_dataset" declared implicitly gen_io_form_for_dataset( fp ) ; ^ gen_streams.c(57): warning #266: function "gen_io_form_for_stream" declared implicitly gen_io_form_for_stream( fp ) ; ^ gen_streams.c(65): warning #266: function "gen_switches_and_alarms" declared implicitly gen_switches_and_alarms( fp ) ; ^ gen_streams.c(73): warning #266: function "gen_check_auxstream_alarms" declared implicitly gen_check_auxstream_alarms( fp ) ; ^ gen_streams.c(81): warning #266: function "gen_fine_stream_input" declared implicitly gen_fine_stream_input( fp ) ; ^ gen_streams.c(89): warning #266: function "gen_med_auxinput_in" declared implicitly gen_med_auxinput_in( fp ) ; ^ gen_streams.c(97): warning #266: function "gen_med_hist_out_opens" declared implicitly gen_med_hist_out_opens( fp ) ; ^ gen_streams.c(105): warning #266: function "gen_med_hist_out_closes" declared implicitly gen_med_hist_out_closes( fp ) ; ^ gen_streams.c(113): warning #266: function "gen_med_auxinput_in_closes" declared implicitly gen_med_auxinput_in_closes( fp ) ; ^ gen_streams.c(121): warning #266: function "gen_med_last_solve_io" declared implicitly gen_med_last_solve_io( fp ) ; ^ gen_streams.c(129): warning #266: function "gen_med_open_esmf_calls" declared implicitly gen_med_open_esmf_calls( fp ) ; ^ gen_streams.c(137): warning #266: function "gen_med_find_esmf_coupling" declared implicitly gen_med_find_esmf_coupling( fp ) ; ^ gen_streams.c(145): warning 
#266: function "gen_shutdown_closes" declared implicitly gen_shutdown_closes( fp ) ; ^ gen_streams.c(180): warning #1011: missing return statement at end of non-void function "gen_io_domain_defs" } ^ gen_streams.c(213): warning #1011: missing return statement at end of non-void function "gen_set_timekeeping_defs" } ^ gen_streams.c(296): warning #1011: missing return statement at end of non-void function "gen_set_timekeeping_alarms" } ^ gen_streams.c(323): warning #1011: missing return statement at end of non-void function "gen_io_form_for_dataset" } ^ gen_streams.c(350): warning #1011: missing return statement at end of non-void function "gen_io_form_for_stream" } ^ gen_streams.c(369): warning #1011: missing return statement at end of non-void function "gen_switches_and_alarms" } ^ gen_streams.c(400): warning #1011: missing return statement at end of non-void function "gen_check_auxstream_alarms" } ^ gen_streams.c(422): warning #1011: missing return statement at end of non-void function "gen_fine_stream_input" } ^ gen_streams.c(437): warning #1011: missing return statement at end of non-void function "gen_med_auxinput_in" } ^ gen_streams.c(452): warning #1011: missing return statement at end of non-void function "gen_med_hist_out_opens" } ^ gen_streams.c(468): warning #1011: missing return statement at end of non-void function "gen_med_hist_out_closes" } ^ gen_streams.c(484): warning #1011: missing return statement at end of non-void function "gen_med_auxinput_in_closes" } ^ gen_streams.c(497): warning #1011: missing return statement at end of non-void function "gen_med_last_solve_io" } ^ gen_streams.c(508): warning #1011: missing return statement at end of non-void function "gen_shutdown_closes" } ^ gen_streams.c(628): warning #1011: missing return statement at end of non-void function "gen_io_boilerplate" } ^ standard.c(43): warning #266: function "strncpy" declared implicitly strncpy(q,p,4) ; q+=4 ; ^ standard.c(54): warning #266: function "strncmp" declared 
implicitly if ( !strncmp( wrf_error_fatal_str, "wrf_error_fatal", 15 ) && wrf_error_fatal_str[15] != '3' ) ^ standard.c(78): warning #266: function "strcpy" declared implicitly strcpy(lineo,p+3+ns) ; ^ standard.c(88): warning #266: function "strcat" declared implicitly strcat(lineo,linei) ; ^ standard.c(166): warning #1011: missing return statement at end of non-void function "drop_comment" } ^ standard.c(176): warning #1011: missing return statement at end of non-void function "change_to_lower" } ^ opening Registry/registry.dimspec including Registry/registry.dimspec opening Registry/registry.les including Registry/registry.les opening Registry/registry.io_boilerplate including Registry/registry.io_boilerplate opening Registry/io_boilerplate_temporary.inc including Registry/io_boilerplate_temporary.inc opening Registry/registry.fire including Registry/registry.fire opening Registry/registry.avgflx including Registry/registry.avgflx Registry INFO variable counts: 0d 1823 1d 69 2d 469 3d 281 ADVISORY: RSL_LITE version of gen_comms is linked in with registry program. ar: creating ../main/libwrflib.a After typing ./real.exe I see this error: Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain.
--- NOTE: num_soil_layers has been set to 4 REAL_EM V3.2.1 PREPROCESSOR ************************************* Parent domain ids,ide,jds,jde 1 74 1 61 ims,ime,jms,jme -4 79 -4 66 ips,ipe,jps,jpe 1 74 1 61 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 104874600 bytes allocated Time period # 1 to process = 2000-01-24_12:00:00. Time period # 2 to process = 2000-01-24_18:00:00. Time period # 3 to process = 2000-01-25_00:00:00. Time period # 4 to process = 2000-01-25_06:00:00. Time period # 5 to process = 2000-01-25_12:00:00. Total analysis times to input = 5. ----------------------------------------------------------------------------- Domain 1: Current date being processed: 2000-01-24_12:00:00.0000, which is loop # 1 out of 5 configflags%julyr, %julday, %gmt: 2000 24 12.00000 metgrid input_wrf.F first_date_input = 2000-01-24_12:00:00 metgrid input_wrf.F first_date_nml = 2000-01-24_12:00:00 d01 2000-01-24_12:00:00 Timing for input 0 s. Max map factor in domain 1 = 1.03. Scale the dt in the model accordingly. forrtl: severe (66): output statement overflows record, unit -5, file Internal Formatted Write Image PC Routine Line Source real.exe 08DBF613 Unknown Unknown Unknown real.exe 08DBE330 Unknown Unknown Unknown real.exe 08D7B22E Unknown Unknown Unknown real.exe 08D32BBC Unknown Unknown Unknown real.exe 08D324BA Unknown Unknown Unknown real.exe 08D62029 Unknown Unknown Unknown real.exe 08093551 Unknown Unknown Unknown real.exe 0809C118 Unknown Unknown Unknown after typing ./wrf.exe I see this error: Namelist dfi_control not found in namelist.input. Using registry defaults for v ariables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. 
Using registry defaults for variable s in fire --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending t ime to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval an d ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 4 WRF V3.2.1 MODEL ************************************* Parent domain ids,ide,jds,jde 1 74 1 61 ims,ime,jms,jme -4 79 -4 66 ips,ipe,jps,jpe 1 74 1 61 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 98194920 bytes allocated -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 67 program wrf: error opening wrfinput_d01 for reading ierr= -1021 I would appreciate it if you could help me solve the problem. Best Regards Ehsan Beigi On Fri, Mar 11, 2011 at 7:20 PM, Ehsan Beigi wrote: > I had run the real test case em_real and i had this error : > > *forrtl: severe (71): integer divide by zero* > > > > do I have problem with my intel fortran compiler? > > Best Regards > > Ehsan > > > > > On Fri, Mar 11, 2011 at 7:13 PM, wrfhelp wrote: > > You should first check whether the code is properly compiled. I see that >> you have error messages like: >> >> >> make[1]: ifort: Command not found >> >> that seems to suggest that the code is not compiled because it didn't find >> the ifort compiler. >> >> wrfhelp >> >> >> On Mar 11, 2011, at 6:04 PM, Ehsan Beigi wrote: >> >> Thanks for your reply, I did run in this directory = test/em_b_wave , >>> but the problem doesn't solve, what should i do? 
>>> >>> Best Regards >>> Ehsan Beigi >>> >>> >>> >>> On Fri, Mar 11, 2011 at 6:58 PM, wrfhelp wrote: >>> You are supposed to run the model in test/em_b_wave directory, where a >>> namelist file exists. >>> >>> wrfhelp >>> >>> >>> On Mar 11, 2011, at 2:42 PM, Ehsan Beigi wrote: >>> >>> >>> Dear Sir/Madam, >>> >>> I have this problem when i want to run an idealized test case >>> (em_b_wave), I have ifort and Icc compiler. what should i do? >>> >>> ( cd main ; make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm >>> -I/home/ehsan/WRFV3/external/esmf_time_f90 -I/home/ehsan/WRFV3/main >>> -I/home/ehsan/WRFV3/external/io_netcdf -I/home/ehsan/WRFV3/external/io_int >>> -I/home/ehsan/WRFV3/frame -I/home/ehsan/WRFV3/share -I/home/ehsan/WRFV3/phys >>> -I/home/ehsan/WRFV3/chem -I/home/ehsan/WRFV3/inc >>> -I/home/ehsan/netcdf3/include " SOLVER=em IDEAL_CASE=b_wave em_ideal ) >>> make[1]: Entering directory `/home/ehsan/WRFV3/main' >>> ( cd ../dyn_em ; make -i -r module_initialize_b_wave.o ) >>> make[2]: Entering directory `/home/ehsan/WRFV3/dyn_em' >>> make[2]: `module_initialize_b_wave.o' is up to date. 
>>> make[2]: Leaving directory `/home/ehsan/WRFV3/dyn_em' >>> ranlib libwrflib.a >>> ifort -o ideal.exe -O3 -w -ftz -align all -fno-alias -fp-model precise >>> -FR -convert big_endian -ip ideal.o ../dyn_em/module_initialize_b_wave.o >>> libwrflib.a /home/ehsan/WRFV3/external/fftpack/fftpack5/libfftpack.a >>> /home/ehsan/WRFV3/external/io_grib1/libio_grib1.a >>> /home/ehsan/WRFV3/external/io_grib_share/libio_grib_share.a >>> /home/ehsan/WRFV3/external/io_int/libwrfio_int.a >>> /home/ehsan/WRFV3/external/esmf_time_f90/libesmf_time.a >>> /home/ehsan/WRFV3/external/RSL_LITE/librsl_lite.a >>> /home/ehsan/WRFV3/frame/module_internal_header_util.o >>> /home/ehsan/WRFV3/frame/pack_utils.o >>> /home/ehsan/WRFV3/external/io_netcdf/libwrfio_nf.a >>> -L/home/ehsan/netcdf3/lib -lnetcdf >>> make[1]: ifort: Command not found >>> make[1]: [em_ideal] Error 127 (ignored) >>> make[1]: Leaving directory `/home/ehsan/WRFV3/main' >>> ( cd test/em_b_wave ; /bin/rm -f wrf.exe ; ln -s ../../main/wrf.exe . ) >>> ( cd test/em_b_wave ; /bin/rm -f ideal.exe ; ln -s ../../main/ideal.exe . >>> ) >>> ( cd test/em_b_wave ; /bin/rm -f README.namelist ; ln -s >>> ../../run/README.namelist . ) >>> ( cd test/em_b_wave ; /bin/rm -f gribmap.txt ; ln -s >>> ../../run/gribmap.txt . ) >>> ( cd test/em_b_wave ; /bin/rm -f grib2map.tbl ; ln -s >>> ../../run/grib2map.tbl . ) >>> ( cd run ; /bin/rm -f ideal.exe ; ln -s ../main/ideal.exe . ) >>> ( cd run ; if test -f namelist.input ; then \ >>> /bin/cp -f namelist.input namelist.input.backup ; fi ; \ >>> /bin/rm -f namelist.input ; ln -s ../test/em_b_wave/namelist.input >>> . ) >>> ( cd run ; /bin/rm -f input_jet ; ln -s ../test/em_b_wave/input_jet . 
) >>> build started: Thu Mar 10 18:36:50 CST 2011 >>> build completed: Thu Mar 10 18:36:56 CST 2011 >>> >>> >>> >>> after typing ./ideal.exe in WRFV3/main dir, i see this error: >>> -------------- FATAL CALLED --------------- >>> FATAL CALLED FROM FILE: LINE: 6919 >>> ERROR OPENING namelist.input >>> >>> >>> What should i do? >>> >>> Best Regards >>> >>> >>> >>> >>> -- >>> Ehsan Beigi >>> PhD Student >>> Department of Civil and and Environmental Engineering >>> 2408 Patrick F. Taylor Hall >>> Louisiana State University >>> Baton Rouge, LA, 70803 >>> >>> >>> >>> wrfhelp >>> http://www.mmm.ucar.edu/wrf/users/supports/wrfhelp.html >>> >>> >>> >>> >>> >>> >>> -- >>> Ehsan Beigi >>> PhD Student >>> Department of Civil and and Environmental Engineering >>> 2408 Patrick F. Taylor Hall >>> Louisiana State University >>> Baton Rouge, LA, 70803 >>> >>> >>> >> wrfhelp >> http://www.mmm.ucar.edu/wrf/users/supports/wrfhelp.html >> >> >> >> > > > -- > *Ehsan Beigi* > *PhD Student* > *Department of Civil and and Environmental Engineering > 2408 Patrick F. Taylor Hall > Louisiana State University > Baton Rouge, LA, 70803* > > > -- *Ehsan Beigi* *PhD Student* *Department of Civil and and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110313/5289cc57/attachment-0001.html From ebeigi3 at tigers.lsu.edu Sun Mar 13 13:17:55 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Sun, 13 Mar 2011 14:17:55 -0500 Subject: [Wrf-users] running real test case (em_real) Message-ID: *Dear Sir/Madam, my compiler is ifort and icc, i compiled em_real , and this is* compile.log ar: creating ./libio_grib_share.a ar: creating ../libio_grib1.a ar: creating libesmf_time.a ar: creating libfftpack.a registry.c(22): warning #1079: return type of function "main" must be "int" main( int argc, char *argv[], char *env[] ) ^ registry.c(59): warning #266: function "sym_forget" declared implicitly sym_forget() ; ^ registry.c(131): warning #266: function "gen_io_boilerplate" declared implicitly gen_io_boilerplate() ; /* 20091213 jm. Generate the io_boilerplate_temporary.inc file */ ^ registry.c(133): warning #266: function "init_parser" declared implicitly init_parser() ; ^ registry.c(156): warning #266: function "pre_parse" declared implicitly if ( pre_parse( dir, fp_in, fp_tmp ) ) { ^ registry.c(176): warning #266: function "check_dimspecs" declared implicitly check_dimspecs() ; ^ registry.c(186): warning #266: function "gen_actual_args_new" declared implicitly gen_actual_args_new( "inc" ) ; ^ registry.c(188): warning #266: function "gen_dummy_args_new" declared implicitly gen_dummy_args_new( "inc" ) ; ^ registry.c(190): warning #266: function "gen_dummy_decls_new" declared implicitly gen_dummy_decls_new( "inc" ) ; ^ registry.c(192): warning #266: function "gen_namelist_statements" declared implicitly gen_namelist_statements("inc") ; ^ registry.c(202): warning #266: function "gen_nest_interp" declared implicitly gen_nest_interp( "inc" ) ; ^ registry.c(204): warning #266: function "gen_streams" declared implicitly gen_streams("inc") ; ^ registry.c(207): warning #266: function "gen_comms" declared implicitly gen_comms( "inc" ) ; /* this is either package 
supplied (by copying a */ ^ reg_parse.c(227): warning #266: function "tolower" declared implicitly x = tolower(tokens[F_DIMS][i]) ; ^ reg_parse.c(292): warning #177: label "normal" was declared but never referenced normal: ^ reg_parse.c(453): warning #266: function "tolower" declared implicitly if ( tolower(tokens[FIELD_STAG][i]) == 'x' || sw_all_x_staggered ) field_struct->stag_x = 1 ; ^ reg_parse.c(454): warning #266: function "tolower" declared implicitly if ( tolower(tokens[FIELD_STAG][i]) == 'y' || sw_all_y_staggered ) field_struct->stag_y = 1 ; ^ reg_parse.c(455): warning #266: function "tolower" declared implicitly if ( tolower(tokens[FIELD_STAG][i]) == 'z' ) field_struct->stag_z = 1 ; ^ reg_parse.c(474): warning #266: function "tolower" declared implicitly x = tolower(tmp[i]) ; ^ reg_parse.c(514): warning #266: function "tolower" declared implicitly x = tolower(tokens[FIELD_IO][i]) ; ^ misc.c(175): warning #1011: missing return statement at end of non-void function "range_of_dimension" } ^ misc.c(217): warning #592: variable "zdex" is used before its value is set sprintf(tmp,"%ssm3%d,%ssm3%d,1,1", r,bdex,r,zdex ) ; ^ misc.c(321): warning #1011: missing return statement at end of non-void function "get_elem" } ^ misc.c(423): warning #1011: missing return statement at end of non-void function "close_the_file" } ^ misc.c(430): warning #266: function "getpid" declared implicitly sprintf(tempfile,"regtmp1%d",getpid()) ; ^ misc.c(444): warning #266: function "getpid" declared implicitly sprintf(tempfile,"regtmp1%d",getpid()) ; ^ misc.c(462): warning #266: function "toupper" declared implicitly for ( p = str ; *p ; p++ ) *p = toupper(*p) ; ^ misc.c(472): warning #266: function "tolower" declared implicitly for ( p = str ; *p ; p++ ) *p = tolower(*p) ; ^ misc.c(645): warning #1011: missing return statement at end of non-void function "dimension_size_expression" } ^ gen_allocs.c(73): warning #1011: missing return statement at end of non-void function 
"get_count_for_alloc" } ^ gen_allocs.c(109): warning #266: function "make_upper_case" declared implicitly make_upper_case(dname_tmp) ; ^ gen_scalar_indices.c(197): warning #266: function "make_lower_case" declared implicitly make_lower_case(fname) ; ^ gen_config.c(135): warning #266: function "sym_forget" declared implicitly sym_forget() ; ^ gen_config.c(167): warning #266: function "toupper" declared implicitly fputc(toupper(*i),fp); ^ gen_config.c(172): warning #266: function "toupper" declared implicitly fputc(toupper(*i),fp); ^ gen_config.c(178): warning #266: function "toupper" declared implicitly fputc(toupper(*i),fp); ^ gen_config.c(409): warning #266: function "sym_forget" declared implicitly sym_forget() ; ^ sym.c(73): warning #266: function "create_ht" declared implicitly create_ht( &symtab ) ; ^ sym.c(77): warning #266: function "exit" declared implicitly exit(1) ; ^ sym.c(153): warning #266: function "create_ht" declared implicitly create_ht( &symtab ) ; ^ sym.c(157): warning #266: function "exit" declared implicitly exit(1) ; ^ symtab_gen.c(62): warning #266: function "hash" declared implicitly index = hash( name ) ; ^ gen_comms.c(157): warning #1011: missing return statement at end of non-void function "print_4d_i1_decls" } ^ gen_comms.c(196): warning #1011: missing return statement at end of non-void function "print_decl" } ^ gen_comms.c(206): warning #1011: missing return statement at end of non-void function "print_body" } ^ gen_comms.c(248): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(266): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(985): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(1178): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(1312): warning #266: function "make_upper_case" declared implicitly 
make_upper_case(commname) ; ^ gen_comms.c(1454): warning #266: function "make_upper_case" declared implicitly make_upper_case(commname) ; ^ gen_comms.c(1672): warning #268: the format string ends before this argument sprintf(fname,"shift_halo",*direction) ; ^ gen_comms.c(1949): warning #1011: missing return statement at end of non-void function "gen_shift" } ^ gen_comms.c(2428): warning #1011: missing return statement at end of non-void function "gen_debug" } ^ set_dim_strs.c(145): warning #1011: missing return statement at end of non-void function "set_dim_strs" } ^ set_dim_strs.c(153): warning #1011: missing return statement at end of non-void function "set_dim_strs2" } ^ set_dim_strs.c(159): warning #1011: missing return statement at end of non-void function "set_dim_strs3" } ^ gen_wrf_io.c(36): warning #266: function "sym_forget" declared implicitly OP_F(fp,"wrf_bdyout.inc") ; ^ gen_wrf_io.c(452): warning #266: function "make_upper_case" declared implicitly make_upper_case(dname) ; ^ gen_streams.c(25): warning #266: function "gen_io_domain_defs" declared implicitly gen_io_domain_defs( fp ) ; ^ gen_streams.c(33): warning #266: function "gen_set_timekeeping_defs" declared implicitly gen_set_timekeeping_defs( fp ) ; ^ gen_streams.c(41): warning #266: function "gen_set_timekeeping_alarms" declared implicitly gen_set_timekeeping_alarms( fp ) ; ^ gen_streams.c(49): warning #266: function "gen_io_form_for_dataset" declared implicitly gen_io_form_for_dataset( fp ) ; ^ gen_streams.c(57): warning #266: function "gen_io_form_for_stream" declared implicitly gen_io_form_for_stream( fp ) ; ^ gen_streams.c(65): warning #266: function "gen_switches_and_alarms" declared implicitly gen_switches_and_alarms( fp ) ; ^ gen_streams.c(73): warning #266: function "gen_check_auxstream_alarms" declared implicitly gen_check_auxstream_alarms( fp ) ; ^ gen_streams.c(81): warning #266: function "gen_fine_stream_input" declared implicitly gen_fine_stream_input( fp ) ; ^ gen_streams.c(89): 
warning #266: function "gen_med_auxinput_in" declared implicitly gen_med_auxinput_in( fp ) ; ^ gen_streams.c(97): warning #266: function "gen_med_hist_out_opens" declared implicitly gen_med_hist_out_opens( fp ) ; ^ gen_streams.c(105): warning #266: function "gen_med_hist_out_closes" declared implicitly gen_med_hist_out_closes( fp ) ; ^ gen_streams.c(113): warning #266: function "gen_med_auxinput_in_closes" declared implicitly gen_med_auxinput_in_closes( fp ) ; ^ gen_streams.c(121): warning #266: function "gen_med_last_solve_io" declared implicitly gen_med_last_solve_io( fp ) ; ^ gen_streams.c(129): warning #266: function "gen_med_open_esmf_calls" declared implicitly gen_med_open_esmf_calls( fp ) ; ^ gen_streams.c(137): warning #266: function "gen_med_find_esmf_coupling" declared implicitly gen_med_find_esmf_coupling( fp ) ; ^ gen_streams.c(145): warning #266: function "gen_shutdown_closes" declared implicitly gen_shutdown_closes( fp ) ; ^ gen_streams.c(180): warning #1011: missing return statement at end of non-void function "gen_io_domain_defs" } ^ gen_streams.c(213): warning #1011: missing return statement at end of non-void function "gen_set_timekeeping_defs" } ^ gen_streams.c(296): warning #1011: missing return statement at end of non-void function "gen_set_timekeeping_alarms" } ^ gen_streams.c(323): warning #1011: missing return statement at end of non-void function "gen_io_form_for_dataset" } ^ gen_streams.c(350): warning #1011: missing return statement at end of non-void function "gen_io_form_for_stream" } ^ gen_streams.c(369): warning #1011: missing return statement at end of non-void function "gen_switches_and_alarms" } ^ gen_streams.c(400): warning #1011: missing return statement at end of non-void function "gen_check_auxstream_alarms" } ^ gen_streams.c(422): warning #1011: missing return statement at end of non-void function "gen_fine_stream_input" } ^ gen_streams.c(437): warning #1011: missing return statement at end of non-void function 
"gen_med_auxinput_in" } ^ gen_streams.c(452): warning #1011: missing return statement at end of non-void function "gen_med_hist_out_opens" } ^ gen_streams.c(468): warning #1011: missing return statement at end of non-void function "gen_med_hist_out_closes" } ^ gen_streams.c(484): warning #1011: missing return statement at end of non-void function "gen_med_auxinput_in_closes" } ^ gen_streams.c(497): warning #1011: missing return statement at end of non-void function "gen_med_last_solve_io" } ^ gen_streams.c(508): warning #1011: missing return statement at end of non-void function "gen_shutdown_closes" } ^ gen_streams.c(628): warning #1011: missing return statement at end of non-void function "gen_io_boilerplate" } ^ standard.c(43): warning #266: function "strncpy" declared implicitly strncpy(q,p,4) ; q+=4 ; ^ standard.c(54): warning #266: function "strncmp" declared implicitly if ( !strncmp( wrf_error_fatal_str, "wrf_error_fatal", 15 ) && wrf_error_fatal_str[15] != '3' ) ^ standard.c(78): warning #266: function "strcpy" declared implicitly strcpy(lineo,p+3+ns) ; ^ standard.c(88): warning #266: function "strcat" declared implicitly strcat(lineo,linei) ; ^ standard.c(166): warning #1011: missing return statement at end of non-void function "drop_comment" } ^ standard.c(176): warning #1011: missing return statement at end of non-void function "change_to_lower" } ^ opening Registry/registry.dimspec including Registry/registry.dimspec opening Registry/registry.les including Registry/registry.les opening Registry/registry.io_boilerplate including Registry/registry.io_boilerplate opening Registry/io_boilerplate_temporary.inc including Registry/io_boilerplate_temporary.inc opening Registry/registry.fire including Registry/registry.fire opening Registry/registry.avgflx including Registry/registry.avgflx Registry INFO variable counts: 0d 1823 1d 69 2d 469 3d 281 ADVISORY: RSL_LITE version of gen_comms is linked in with registry program. 
ar: creating ../main/libwrflib.a *and then downloaded the WRF Model Test Data (January 2000 WPS output data: http://www.mmm.ucar.edu/wrf/users/download/get_source2.html), after that in the test/em_real directory, I used this command :* cp namelist.input.jan00 namelist.input *and then ./real.exe . i saw this :* Namelist dfi_control not found in namelist.input. Using registry defaults for v ariables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variable s in fire --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending t ime to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval an d ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 4 REAL_EM V3.2.1 PREPROCESSOR ************************************* Parent domain ids,ide,jds,jde 1 74 1 61 ims,ime,jms,jme -4 79 -4 66 ips,ipe,jps,jpe 1 74 1 61 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 104874600 bytes allocated Time period # 1 to process = 2000-01-24_12:00:00. Time period # 2 to process = 2000-01-24_18:00:00. Time period # 3 to process = 2000-01-25_00:00:00. Time period # 4 to process = 2000-01-25_06:00:00. Time period # 5 to process = 2000-01-25_12:00:00. Total analysis times to input = 5. 
----------------------------------------------------------------------------- Domain 1: Current date being processed: 2000-01-24_12:00:00.0000, which is loop # 1 out of 5 configflags%julyr, %julday, %gmt: 2000 24 12.00000 metgrid input_wrf.F first_date_input = 2000-01-24_12:00:00 metgrid input_wrf.F first_date_nml = 2000-01-24_12:00:00 d01 2000-01-24_12:00:00 Timing for input 0 s. Max map factor in domain 1 = 1.03. Scale the dt in the model accordingly. forrtl: severe (66): output statement overflows record, unit -5, file Internal Formatted Write Image PC Routine Line Source real.exe 08DBF613 Unknown Unknown Unknown real.exe 08DBE330 Unknown Unknown Unknown real.exe 08D7B22E Unknown Unknown Unknown real.exe 08D32BBC Unknown Unknown Unknown real.exe 08D324BA Unknown Unknown Unknown real.exe 08D62029 Unknown Unknown Unknown real.exe 08093551 Unknown Unknown Unknown real.exe 0809C118 Unknown Unknown Unknown *and after that , wrf.exe : i saw this * Namelist dfi_control not found in namelist.input. Using registry defaults for v ariables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variable s in fire --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending t ime to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval an d ending time to 0 for that domain. 
--- NOTE: num_soil_layers has been set to 4 WRF V3.2.1 MODEL ************************************* Parent domain ids,ide,jds,jde 1 74 1 61 ims,ime,jms,jme -4 79 -4 66 ips,ipe,jps,jpe 1 74 1 61 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 98194920 bytes allocated -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 67 program wrf: error opening wrfinput_d01 for reading ierr= -1021 What should I do? Please help me with this. Best Regards -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110313/a5638164/attachment-0001.html From kganbour at yahoo.com Sun Mar 13 05:21:25 2011 From: kganbour at yahoo.com (Khaled Ganbour) Date: Sun, 13 Mar 2011 04:21:25 -0700 (PDT) Subject: [Wrf-users] [WRF-users] Message-ID: <786644.29253.qm@web46307.mail.sp1.yahoo.com> Dear All: I have run the WRF model for NMM & ARW with test data according to the tutorial. Now I am running with real data for NMM WRF, but I have an error when I execute real.exe. I have attached the namelist.input file and the error in the real.log file. Best regards Khaled -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110313/f4842d53/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input Type: application/octet-stream Size: 4593 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110313/f4842d53/attachment.obj -------------- next part -------------- A non-text attachment was scrubbed... 
Name: real.log Type: text/x-log Size: 3632 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110313/f4842d53/attachment.bin From wrf at nusculus.com Wed Mar 9 12:23:11 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Wed, 9 Mar 2011 12:23:11 -0700 Subject: [Wrf-users] WRF 3.2 Model not running after 4hrs of Integration. In-Reply-To: References: <4D703059.7000203@ucar.edu> Message-ID: Hi Geeta, I noticed that in your rsl.error.0003 file you had multiple CFL errors. These were kind of hidden because you did not delete the rsl file between running real.exe and wrf.exe. When CFL errors occur, I often get segmentation faults, usually because of the radiation scheme. Segmentation faults can cause core dumps. If not caused by the radiation scheme, maybe it occurred in the Kain-Fritsch cumulus scheme. I say that because you did get the message: "WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 40 18 NaNQ 5000.000000" which includes a NaN. To fix the CFL error, you may have to keep reducing the timestep, even though yours is not long for your grid resolution. That may fix things. If the timestep becomes impractically short, try adding some vertical damping: w_damping = 1 or play around a little with epssm: epssm = 0.3, 0.3, 0.3 Increasing the vertical size of grid cells might help too with CFL errors, but you probably have those where you want them. It will help you find the errors I found if you delete or remove the rsl files before running wrf.exe. best wishes, Kevin On Thu, Mar 3, 2011 at 9:32 PM, Geeta Geeta wrote: > Dear Sir, > Thanks for the reply. I am running WRF 3.2 for the 3 domains 27, 9 and 3 km > for the 48hrs Forecast on the IBM P570 Machine. The Operating system is > AIX5.3 and the code is compiled using xlf with POE in place. > I had given the command *bash$ time poe ../wrf.exe -procs 8. * > > 1. The model gave segmentation fault after 4hrs of Integration. 
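The two settings Kevin suggests above go in the &dynamics section of namelist.input. A minimal sketch, with illustrative values only (the rest of the section stays as it was):

```
&dynamics
 w_damping = 1,              ! damp vertical motion where CFL is threatened
 epssm     = 0.3, 0.3, 0.3,  ! sound-step off-centering, one value per domain
/
```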
I also > plotted the Model forecast for 2nd and 3rd hour at 27km resolution (Domain > 1) So it showed me some data being plotted > *But when I tried to plot the output for the 2nd and 3rd domain, IT says ENTIRE > GRID UNDEFINED. Kindly suggest. * > > 2. >>>What data I have used ?? > *I am using the .grib2 files as Initial and BC taken from > ftp://ftpprd.ncep.noaa.gov.in at 1x1 degree resolution. > * > 3. >>>What is the error message in your output file? Did you find any CFL > violation? Your namelist.input looks fine. Please send us error messages in > your run. > *I am attaching the rsl.error* and the rsl.out* files, namelist.input and > namelist.wps files. It seems to me that there is no CFL violation. * > > Thanks. > geeta > > > > ------------------------------ > Date: Thu, 3 Mar 2011 17:20:41 -0700 > From: wrfhelp at ucar.edu > To: geeta124 at hotmail.com > Subject: Re: WRF 3.2 Model not running after 4hrs of Integration. > > Geeta, > > What is the error message in your output file? Did you find any CFL > violation? > > Your namelist.input looks fine. Please send us error messages in your run. > > > On 3/3/2011 3:57 AM, Geeta Geeta wrote: > > Dear All, > I am running WRF3.2 for 3 domains, at 27, 9 and 3kms. After integrating > for 4hours, the model gives segmentation fault. and these directories are > created and does not integrate beyond. 
> > bash-3.2$ ncdump -v Times wrfout_d01_2010-08-04_00:00:00 > Times = > "2010-08-04_00:00:00", > "2010-08-04_01:00:00", > "2010-08-04_02:00:00", > "2010-08-04_03:00:00", > "2010-08-04_04:00:00" ; > } > > bash-3.2$ ncdump -v Times wrfout_d02_2010-08-04_00:00:00 > Times = > "2010-08-04_00:00:00", > "2010-08-04_01:00:00", > "2010-08-04_02:00:00", > "2010-08-04_03:00:00", > "2010-08-04_04:00:00" ; > > bash-3.2$ ncdump -v Times wrfout_d03_2010-08-04_00:00:00 > Times = > "2010-08-04_00:00:00", > "2010-08-04_01:00:00", > "2010-08-04_02:00:00", > "2010-08-04_03:00:00", > "2010-08-04_04:00:00" ; > > After integrating for 4hours, the model gives segmentation fault. and > these directories are created and does not integrate beyond. > drwxr-xr-x 2 256 Mar 03 04:20 coredir.5 > drwxr-xr-x 2 256 Mar 03 04:20 coredir.3 > drwxr-xr-x 2 256 Mar 03 04:20 coredir.2 > drwxr-xr-x 2 256 Mar 03 04:20 coredir.1 > > Kindly help. The namelist.input file is also attached for reference. > > Geeta > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110309/3969bd7d/attachment.html From andrew.porter at stfc.ac.uk Tue Mar 15 03:28:21 2011 From: andrew.porter at stfc.ac.uk (Andrew Porter) Date: Tue, 15 Mar 2011 09:28:21 +0000 Subject: [Wrf-users] Benchmarking problems In-Reply-To: <1B8D1B9BF4DCDC4A90A42E312FF30852069D17E8@irvine01.irvine.environ.local> References: <1B8D1B9BF4DCDC4A90A42E312FF30852069D17E8@irvine01.irvine.environ.local> Message-ID: <4D7F3135.6090909@stfc.ac.uk> Hi Bart, > > &domains > ...snip... > numtiles = 1, > nproc_x = 3, > nproc_y = 2, > num_metgrid_levels = 40, > / > > I set OMP_NUM_THREADS to 6 in my run script that calls wrf.exe. 
> > And yet, when I look in the resulting wrf.out file I see: > > WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 6 > WRF NUMBER OF TILES = 6 > > Hey! I told you to use 1 tile and split it 3 by 2! > I think nproc_x and nproc_y only apply to MPI processes. By telling it to use only 1 tile I think you'll end up with 5 threads not doing anything! You can leave numtiles out to have it set to OMP_NUM_THREADS by default, or you can set it greater than OMP_NUM_THREADS to see whether that fits things into cache more successfully. God bless, Andy. -- Dr. Andrew Porter Computational Scientist Advanced Research Computing Group Computational Science and Engineering Dept. STFC Daresbury Laboratory Keckwick Lane Daresbury WA4 4AD Tel. : +44 (0)1925 603607 email: andrew.porter at stfc.ac.uk From Craig.Tierney at noaa.gov Mon Mar 14 14:30:59 2011 From: Craig.Tierney at noaa.gov (Craig Tierney) Date: Mon, 14 Mar 2011 14:30:59 -0600 Subject: [Wrf-users] Benchmarking problems In-Reply-To: <1B8D1B9BF4DCDC4A90A42E312FF30852069D17E8@irvine01.irvine.environ.local> References: <1B8D1B9BF4DCDC4A90A42E312FF30852069D17E8@irvine01.irvine.environ.local> Message-ID: <4D7E7B03.6060006@noaa.gov> On 3/14/11 1:08 PM, Bart Brashers wrote: > I'm trying to benchmark WRF on two comparable systems, Intel X5660 and > AMD 6174, before I buy. I'm also trying to do a benchmark for those of > us who have many 5-day WRF runs to complete -- many runs with relatively > low core counts, rather than a single run with large core counts (the > focus of most benchmarks). > > I downloaded the WRF 3.0 Benchmark parts from > http://www.mmm.ucar.edu/wrf/WG2/bench/. I compiled using option 2 > (smpar for PGI/gcc) with no problems. In the namelist.input I specified > (for one particular run): > > &domains > ...snip... > numtiles = 1, > nproc_x = 3, > nproc_y = 2, > num_metgrid_levels = 40, > / > > I set OMP_NUM_THREADS to 6 in my run script that calls wrf.exe. 
> > And yet, when I look in the resulting wrf.out file I see: > > WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 6 > WRF NUMBER OF TILES = 6 > > Hey! I told you to use 1 tile and split it 3 by 2! > > Is this a problem with WRF v3.0? Looking at some WRF 3.2.1 runs where I > have numtiles = 1 and specified 4 by 1, I got more verbose output like " > WRF TILE 1 IS 1 IE 165 JS 1 JE 33". > > Scaling is poor after only 4 cores, so I suspect something is going > wrong. Any suggestions you have would be greatly appreciated. > I am not sure about the split, but I do know that at low core counts you are going to get better performance using MPI, even if you are only using a single node. If you are using OpenMP, you need to look into issues related to socket affinity to ensure that your OpenMP threads stay local to the socket. If you don't get the affinity right, it is going to hurt scalability. Craig > Bart > > > > This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. 
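Craig's affinity advice can be sketched in a run script like this. A sketch only: the mpirun flag spelling varies by OpenMPI version (--bind-to-core is the 1.4-era form), and KMP_AFFINITY is specific to Intel's OpenMP runtime.

```shell
# Pure MPI on one 8-core node -- often faster than OpenMP at low core counts.
# Commented out here because it needs an MPI launcher and a built wrf.exe:
#   mpirun --bind-to-core -np 8 ./wrf.exe
# For an smpar (OpenMP) build, keep the threads together on one socket:
export OMP_NUM_THREADS=6
export KMP_AFFINITY=compact   # Intel runtime; gcc builds use GOMP_CPU_AFFINITY instead
```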
> _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users From mmkamal at sciborg.uwaterloo.ca Mon Mar 14 18:38:54 2011 From: mmkamal at sciborg.uwaterloo.ca (mmkamal at sciborg.uwaterloo.ca) Date: Mon, 14 Mar 2011 20:38:54 -0400 Subject: [Wrf-users] Model is not running with CAM radiation scheme Message-ID: <20110314203854.214378yszfv4l9wc@www.nexusmail.uwaterloo.ca> Hi All, I have been trying to run WRF 3.2 over North-Eastern Canada & USA using the CAM shortwave and longwave radiation schemes. But I am having problems with those two radiation schemes, although I have no problem with RRTM (longwave radiation) & the Dudhia scheme (shortwave). I am attaching the two namelist files and the rsl.error & rsl.out log files. Could anybody please help me with the problem. Thanks in advance Kamal -------------- next part -------------- &time_control run_days = 5, run_hours = 0, run_minutes = 0, run_seconds = 0, start_year = 2008, 2008, 2008, start_month = 06, 06, 06, start_day = 21, 21, 21, start_hour = 00, 00, 00, start_minute = 00, 00, 00, start_second = 00, 00, 00, end_year = 2008, 2008, 2008, end_month = 06, 06, 06, end_day = 29, 29, 29, end_hour = 00, 00, 00, end_minute = 00, 00, 00, end_second = 00, 00, 00, interval_seconds = 10800 input_from_file = .true.,.true.,.true., history_interval = 180, 180, 180, frames_per_outfile = 1000, 1000, 1000, restart = .false., restart_interval = 5000, io_form_history = 2 io_form_restart = 2 io_form_input = 2 io_form_boundary = 2 debug_level = 50 / &domains time_step = 48, time_step_fract_num = 0, time_step_fract_den = 1, max_dom = 3, e_we = 225, 202, 181, e_sn = 175, 202, 271, e_vert = 28, 28, 28, p_top_requested = 10000, num_metgrid_levels = 30 num_metgrid_soil_levels = 4, dx = 8000, 2666.666, 888.888, dy = 8000, 2666.666, 888.888, grid_id = 1, 2, 3, parent_id = 1, 1, 2, i_parent_start = 1, 75, 70, j_parent_start = 1, 65, 50, parent_grid_ratio = 1, 3, 3, 
parent_time_step_ratio = 1, 3, 3, feedback = 1, smooth_option = 0 / &physics mp_physics = 6, 6, 6, ra_lw_physics = 3, 3, 3, ra_sw_physics = 3, 3, 3, radt = 8, 8, 8, cam_abs_freq_s = 21600 levsiz = 59 paerlev = 29 cam_abs_dim1 = 4 cam_abs_dim2 = 28 sf_sfclay_physics = 1, 1, 1, sf_surface_physics = 3, 3, 3, bl_pbl_physics = 1, 1, 1, bldt = 0, 0, 0, cu_physics = 1, 0, 0, cudt = 5, 5, 5, isfflx = 1, ifsnow = 0, icloud = 1, surface_input_source = 1, num_soil_layers = 6, sf_urban_physics = 0, 0, 0, maxiens = 1, maxens = 3, maxens2 = 3, maxens3 = 16, ensdim = 144, / &dynamics w_damping = 0, diff_opt = 1, km_opt = 4, diff_6th_opt = 0, 0, 0, diff_6th_factor = 0.12, 0.12, 0.12, base_temp = 290. damp_opt = 0, zdamp = 5000., 5000., 5000., dampcoef = 0.2, 0.2, 0.2 khdif = 0, 0, 0, kvdif = 0, 0, 0, non_hydrostatic = .true., .true., .true., moist_adv_opt = 1, 1, 1, scalar_adv_opt = 1, 1, 1, / &bdy_control spec_bdy_width = 5, spec_zone = 1, relax_zone = 4, specified = .true., .false.,.false., nested = .false., .true., .true., / &grib2 / &namelist_quilt nio_tasks_per_group = 0, nio_groups = 1, / -------------- next part -------------- &share wrf_core = 'ARW', max_dom = 3, start_date = '2008-06-21_00:00:00','2008-06-21_00:00:00', '2008-06-21_00:00:00', end_date = '2008-06-29_00:00:00','2008-06-21_00:00:00', '2008-06-21_00:00:00', interval_seconds = 10800 io_form_geogrid = 2, / &geogrid parent_id = 1, 1, 2, parent_grid_ratio = 1, 3, 3, i_parent_start = 1, 75, 70, j_parent_start = 1, 65, 50, e_we = 225, 202, 181, e_sn = 175, 202, 271, geog_data_res = 'modis_30s+10m', 'modis_30s+2m', 'modis_30s+30s' 
dx = 8000, dy = 8000, map_proj = 'lambert', ref_lat = 43.0, ref_lon = -80.0, truelat1 = 30.0, truelat2 = 60.0, stand_lon = -80.0, geog_data_path = '/work/kamal/brown/DATA/geog' / &ungrib out_format = 'WPS', prefix = 'NARR', / &metgrid fg_name = 'NARR' io_form_metgrid = 2, constants_name ='./NARRFIX:1979-11-08_00' / &mod_levs press_pa = 201300 , 200100 , 100000 , 95000 , 90000 , 85000 , 80000 , 75000 , 70000 , 65000 , 60000 , 55000 , 50000 , 45000 , 40000 , 35000 , 30000 , 25000 , 20000 , 15000 , 10000 , 5000 , 1000 / -------------- next part -------------- taskid: 18 hostname: bro61 Ntasks in X 8, ntasks in Y 16 NOTE: CAM radiation is in use, setting: paerlev=29, levsiz=59, cam_abs_dim1=4, cam_abs_dim2=e_vert NOTE: num_soil_layers has been set to 6 WRF V3.2 MODEL ************************************* Parent domain ids,ide,jds,jde 1 225 1 175 ims,ime,jms,jme 50 91 16 40 ips,ipe,jps,jpe 57 84 23 33 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 29081796 bytes allocated wrf main: calling open_r_dataset for wrfinput med_initialdata_input: calling input_input mminlu = 'USGS' WRF TILE 1 IS 57 IE 84 JS 23 JE 33 set_tiles3: NUMBER OF TILES = 1 INPUT LandUse = "USGS" num_months = 13 forrtl: severe (174): SIGSEGV, segmentation fault occurred Image PC Routine Line Source wrf.exe 00000000014217C1 Unknown Unknown Unknown wrf.exe 0000000001423821 Unknown Unknown Unknown wrf.exe 00000000014225FA Unknown Unknown Unknown wrf.exe 00000000011A6D5A Unknown Unknown Unknown wrf.exe 0000000000D2D2D8 Unknown Unknown Unknown wrf.exe 0000000000D27583 Unknown Unknown Unknown wrf.exe 0000000000A8B6F7 Unknown Unknown Unknown wrf.exe 00000000009CA6AE Unknown Unknown Unknown wrf.exe 00000000009C82E7 Unknown Unknown Unknown wrf.exe 000000000047FBC2 Unknown Unknown Unknown wrf.exe 000000000047F009 Unknown Unknown Unknown wrf.exe 000000000047EFAC Unknown Unknown Unknown libc.so.6 00002ACA1BAA0994 Unknown Unknown Unknown wrf.exe 
000000000047EEB9 Unknown Unknown Unknown -------------- next part -------------- Ntasks in X 8, ntasks in Y 16 NOTE: CAM radiation is in use, setting: paerlev=29, levsiz=59, cam_abs_dim1=4, cam_abs_dim2=e_vert NOTE: num_soil_layers has been set to 6 WRF V3.2 MODEL ************************************* Parent domain ids,ide,jds,jde 1 225 1 175 ims,ime,jms,jme 190 230 38 62 ips,ipe,jps,jpe 197 225 45 55 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 28547296 bytes allocated wrf main: calling open_r_dataset for wrfinput med_initialdata_input: calling input_input mminlu = 'USGS' WRF TILE 1 IS 197 IE 225 JS 45 JE 55 set_tiles3: NUMBER OF TILES = 1 INPUT LandUse = "USGS" LANDUSE TYPE = "USGS" FOUND 33 CATEGORIES 2 SEASONS WATER CATEGORY = 16 SNOW CATEGORY = 24 *** SATURATION VAPOR PRESSURE TABLE COMPLETED *** num_months = 13 AEROSOLS: Background aerosol will be limited to bottom 6 model interfaces. From yagnesh at lowtem.hokudai.ac.jp Mon Mar 14 15:15:36 2011 From: yagnesh at lowtem.hokudai.ac.jp (yagnesh) Date: Tue, 15 Mar 2011 06:15:36 +0900 Subject: [Wrf-users] automation scripts to run real case.? Message-ID: <4D7E8578.8030408@lowtem.hokudai.ac.jp> Hello wrf-users, are there any scripts (Perl/bash) to automate the real-case simulation? -- yyr From bbrashers at Environcorp.com Wed Mar 16 13:04:05 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Wed, 16 Mar 2011 12:04:05 -0700 Subject: [Wrf-users] OpenMPI vs MPICH2 Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF3085206A55C5A@irvine01.irvine.environ.local> Has anyone done any benchmark testing of WRF dmpar using OpenMPI vs. MPICH2? Which one is "better" or "faster"? Bart This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). 
Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. From FLiu at azmag.gov Tue Mar 15 11:22:39 2011 From: FLiu at azmag.gov (Feng Liu) Date: Tue, 15 Mar 2011 17:22:39 +0000 Subject: [Wrf-users] Model is not running with CAM radiation scheme In-Reply-To: <20110314203854.214378yszfv4l9wc@www.nexusmail.uwaterloo.ca> References: <20110314203854.214378yszfv4l9wc@www.nexusmail.uwaterloo.ca> Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C25471F9F@mag9006> Kamal, Notice that you are using NARR for your initial conditions, and I am sure the Noah land surface model is applicable for your case. You may check that the number of soil layers should be 4, as in Vtable.NARR (see also num_metgrid_soil_levels = 4 in your namelist.input), rather than 6, which is used for the RUC land surface model. Double-check the surface physics scheme in your namelist.input and replace RUC with Noah as below; you should be fine then. Change sf_surface_physics = 3, 3, 3, into sf_surface_physics = 2, 2, 2, I hope it will be helpful. Feng -----Original Message----- From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of mmkamal at sciborg.uwaterloo.ca Sent: Monday, March 14, 2011 5:39 PM To: wrf-users at ucar.edu Subject: [Wrf-users] Model is not running with CAM radiation scheme Hi All, I have been trying to run WRF 3.2 over North-Eastern Canada & USA using the CAM shortwave and longwave radiation schemes. But I am having problems with those two radiation schemes, although I have no problem with RRTM (longwave radiation) & the Dudhia scheme (shortwave). I am attaching the two namelist files and the rsl.error & rsl.out log files. Could anybody please help me with the problem. 
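In namelist.input terms, the change Feng describes looks like this. A sketch only: option 2 selects the Noah LSM, which expects 4 soil layers, while option 3 (RUC) uses 6; everything else in the &physics and &domains sections stays as it was.

```
&domains
 num_metgrid_soil_levels = 4,   ! NARR supplies 4 soil levels
/
&physics
 sf_surface_physics = 2, 2, 2,  ! 2 = Noah LSM (was 3 = RUC)
 num_soil_layers    = 4,        ! Noah uses 4 soil layers (RUC uses 6)
/
```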
Thanks in advance Kamal From maria.frediani at gmail.com Tue Mar 15 12:28:06 2011 From: maria.frediani at gmail.com (Maria Eugenia) Date: Tue, 15 Mar 2011 14:28:06 -0400 Subject: [Wrf-users] problem installing wrf v3.2.1 with gcc v4.2.4 Message-ID: Dear all, I'm trying to install WRFV3.2.1 (dmpar, without support for grib2 io) on an SGI Altix 3700 IA64 Itanium 2. I'm using gcc/gfortran v 4.2.4, netcdf v4.1.1, mpich2 v1.3.2 and libpng v1.2.12. I have successfully built WRF V3.2 with the above configuration, but the newest version does not compile completely. After compiling for em_real, I only have wrf.exe in the WRFV3/main directory, no real.exe nor any of the other executables that should be there. I don't understand why wrf.exe compiles and real.exe doesn't.... I also tried to compile it serial only, to exclude a possible problem with mpich2, but the result is the same. Maybe it is relevant to mention that gcc is not the default compiler; the default is ifort 9.0. However, I have checked the paths and installed the required software with gcc (netcdf, mpich2, libpng). Besides that, when I run ./configure, I only see options to use the Intel compiler: 1. Linux SGI Altix, ifort compiler with icc 9.x,10.x (serial) 2. Linux SGI Altix, ifort compiler with icc 9.x,10.x (smpar) 3. Linux SGI Altix, ifort compiler with icc 9.x,10.x (dmpar) 4. Linux SGI Altix, ifort compiler with icc 9.x,10.x (dm+sm) Therefore I created my configure.wrf file based on gcc. Anyway, I don't believe this is the problem, given that I successfully built v3.2 with the same configuration. The configure.wrf files for the serial and dmpar builds are attached, as well as the compile.log. I sincerely appreciate your help. Maria E. B. 
Frediani ------------------------------------------------------------------------------------- Visiting Scholar University of Connecticut School of Engineering 261 Glenbrook Rd Storrs, CT 06269 frediani at engr.uconn.edu -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.wrfv3.2.1_x86_64_gfortran_gcc_serial Type: application/octet-stream Size: 19317 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/attachment-0004.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: compile.log.wrfv3.2.1.serial Type: application/octet-stream Size: 492280 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/attachment-0005.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.wrfv3.2.1_x86_64_gfortran_gcc_dmpar Type: application/octet-stream Size: 19428 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/attachment-0006.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: compile.log.wrfv3.2.1.dmpar Type: application/octet-stream Size: 509771 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/attachment-0007.obj From Don.Morton at alaska.edu Wed Mar 16 19:15:38 2011 From: Don.Morton at alaska.edu (Don Morton) Date: Wed, 16 Mar 2011 17:15:38 -0800 Subject: [Wrf-users] OpenMPI vs MPICH2 In-Reply-To: <1B8D1B9BF4DCDC4A90A42E312FF3085206A55C5A@irvine01.irvine.environ.local> References: <1B8D1B9BF4DCDC4A90A42E312FF3085206A55C5A@irvine01.irvine.environ.local> Message-ID: Bart, we missed you at the Alaska Weather Symposium this year! 
The only comment I'll offer is that we were running WRF dmpar using MPICH2 on a Sun Opteron system: http://www.arsc.edu/resources/midnight.html compiled with PGI and on large problems, especially those with nests, we found intermittent "hang" problems with the scatterv operation of the LBC's to the tasks. On certain domains, the hang would inevitably occur, but not always at the same time. Since we've moved to OpenMPI we haven't had the issue. Other users on this system have experienced the same. Best, Don On Wed, Mar 16, 2011 at 11:04 AM, Bart Brashers wrote: > Has anyone done any benchmark testing of WRF dmpar using OpenMPI vs. > MPICH2? Which one is "better" or "faster"? > > Bart > > > > This message contains information that may be confidential, privileged or > otherwise protected by law from disclosure. It is intended for the exclusive > use of the Addressee(s). Unless you are the addressee or authorized agent of > the addressee, you may not review, copy, distribute or disclose to anyone > the message or any information contained within. If you have received this > message in error, please contact the sender by electronic reply to > email at environcorp.com and immediately delete all copies of the message. > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -- Voice: 907 450 8679 Arctic Region Supercomputing Center http://weather.arsc.edu/ http://www.arsc.edu/~morton/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110316/426994b7/attachment.html From ebeigi3 at tigers.lsu.edu Wed Mar 16 17:02:54 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Wed, 16 Mar 2011 18:02:54 -0500 Subject: [Wrf-users] running real test case (em_real) In-Reply-To: References: <4D7E8A8D.5030601@ucar.edu> Message-ID: Dear Sir/Madam, There are a lot of data on the website you mentioned in your previous email (http://dss.ucar.edu/datasets/ds083.2/). I downloaded some files, but I don't know how to run them. Could you please send me just an example of running WRF? I always get this error (MMINLU error on input), and I am not able to run any of the test cases. Could you please help me with that? Best Regards Ehsan Beigi On Mon, Mar 14, 2011 at 5:10 PM, Ehsan Beigi wrote: > Thanks for your reply. Yes, the compile is successful and I can see all of > the .exe files except plotgride.exe and > > On Mon, Mar 14, 2011 at 4:37 PM, wrfhelp wrote: > >> Those warning messages don't matter. My question is, have you >> successfully built WRF and WPS? Did you see all those .exe files in your >> directory? >> >> Can you download FNL data from http://dss.ucar.edu/datasets/ds083.2/ , >> then go through the WPS process by yourself? I make this suggestion because >> sometimes our data at the website doesn't work for some users. We are not sure >> what the problem is. Usually we suggest our users run WPS by themselves, >> then move on to REAL and WRF. 
>> >> >> >> >> On 3/13/2011 1:17 PM, Ehsan Beigi wrote: >> >> *Dear Sir/Madam, >> my compiler is ifort and icc, i compiled em_real , and this is* compile.log >> >> >> ar: creating ./libio_grib_share.a >> ar: creating ../libio_grib1.a >> ar: creating libesmf_time.a >> ar: creating libfftpack.a >> registry.c(22): warning #1079: return type of function "main" must be >> "int" >> main( int argc, char *argv[], char *env[] ) >> ^ >> >> registry.c(59): warning #266: function "sym_forget" declared implicitly >> sym_forget() ; >> ^ >> >> registry.c(131): warning #266: function "gen_io_boilerplate" declared >> implicitly >> gen_io_boilerplate() ; /* 20091213 jm. Generate the >> io_boilerplate_temporary.inc file */ >> ^ >> >> registry.c(133): warning #266: function "init_parser" declared implicitly >> init_parser() ; >> ^ >> >> registry.c(156): warning #266: function "pre_parse" declared implicitly >> if ( pre_parse( dir, fp_in, fp_tmp ) ) { >> ^ >> >> registry.c(176): warning #266: function "check_dimspecs" declared >> implicitly >> check_dimspecs() ; >> ^ >> >> registry.c(186): warning #266: function "gen_actual_args_new" declared >> implicitly >> gen_actual_args_new( "inc" ) ; >> ^ >> >> registry.c(188): warning #266: function "gen_dummy_args_new" declared >> implicitly >> gen_dummy_args_new( "inc" ) ; >> ^ >> >> registry.c(190): warning #266: function "gen_dummy_decls_new" declared >> implicitly >> gen_dummy_decls_new( "inc" ) ; >> ^ >> >> registry.c(192): warning #266: function "gen_namelist_statements" declared >> implicitly >> gen_namelist_statements("inc") ; >> ^ >> >> registry.c(202): warning #266: function "gen_nest_interp" declared >> implicitly >> gen_nest_interp( "inc" ) ; >> ^ >> >> registry.c(204): warning #266: function "gen_streams" declared implicitly >> gen_streams("inc") ; >> ^ >> >> registry.c(207): warning #266: function "gen_comms" declared implicitly >> gen_comms( "inc" ) ; /* this is either package supplied (by copying >> a */ >> ^ >> >> 
reg_parse.c(227): warning #266: function "tolower" declared implicitly >> x = tolower(tokens[F_DIMS][i]) ; >> ^ >> >> reg_parse.c(292): warning #177: label "normal" was declared but never >> referenced >> normal: >> ^ >> >> reg_parse.c(453): warning #266: function "tolower" declared implicitly >> if ( tolower(tokens[FIELD_STAG][i]) == 'x' || sw_all_x_staggered ) >> field_struct->stag_x = 1 ; >> ^ >> >> reg_parse.c(454): warning #266: function "tolower" declared implicitly >> if ( tolower(tokens[FIELD_STAG][i]) == 'y' || sw_all_y_staggered ) >> field_struct->stag_y = 1 ; >> ^ >> >> reg_parse.c(455): warning #266: function "tolower" declared implicitly >> if ( tolower(tokens[FIELD_STAG][i]) == 'z' ) field_struct->stag_z = >> 1 ; >> ^ >> >> reg_parse.c(474): warning #266: function "tolower" declared implicitly >> x = tolower(tmp[i]) ; >> ^ >> >> reg_parse.c(514): warning #266: function "tolower" declared implicitly >> x = tolower(tokens[FIELD_IO][i]) ; >> ^ >> >> misc.c(175): warning #1011: missing return statement at end of non-void >> function "range_of_dimension" >> } >> ^ >> >> misc.c(217): warning #592: variable "zdex" is used before its value is set >> sprintf(tmp,"%ssm3%d,%ssm3%d,1,1", r,bdex,r,zdex ) ; >> ^ >> >> misc.c(321): warning #1011: missing return statement at end of non-void >> function "get_elem" >> } >> ^ >> >> misc.c(423): warning #1011: missing return statement at end of non-void >> function "close_the_file" >> } >> ^ >> >> misc.c(430): warning #266: function "getpid" declared implicitly >> sprintf(tempfile,"regtmp1%d",getpid()) ; >> ^ >> >> misc.c(444): warning #266: function "getpid" declared implicitly >> sprintf(tempfile,"regtmp1%d",getpid()) ; >> ^ >> >> misc.c(462): warning #266: function "toupper" declared implicitly >> for ( p = str ; *p ; p++ ) *p = toupper(*p) ; >> ^ >> >> misc.c(472): warning #266: function "tolower" declared implicitly >> for ( p = str ; *p ; p++ ) *p = tolower(*p) ; >> ^ >> >> misc.c(645): warning #1011: missing 
return statement at end of non-void >> function "dimension_size_expression" >> } >> ^ >> >> gen_allocs.c(73): warning #1011: missing return statement at end of >> non-void function "get_count_for_alloc" >> } >> ^ >> >> gen_allocs.c(109): warning #266: function "make_upper_case" declared >> implicitly >> make_upper_case(dname_tmp) ; >> ^ >> >> gen_scalar_indices.c(197): warning #266: function "make_lower_case" >> declared implicitly >> make_lower_case(fname) ; >> ^ >> >> gen_config.c(135): warning #266: function "sym_forget" declared implicitly >> sym_forget() ; >> ^ >> >> gen_config.c(167): warning #266: function "toupper" declared implicitly >> fputc(toupper(*i),fp); >> ^ >> >> gen_config.c(172): warning #266: function "toupper" declared implicitly >> fputc(toupper(*i),fp); >> ^ >> >> gen_config.c(178): warning #266: function "toupper" declared implicitly >> fputc(toupper(*i),fp); >> ^ >> >> gen_config.c(409): warning #266: function "sym_forget" declared implicitly >> sym_forget() ; >> ^ >> >> sym.c(73): warning #266: function "create_ht" declared implicitly >> create_ht( &symtab ) ; >> ^ >> >> sym.c(77): warning #266: function "exit" declared implicitly >> exit(1) ; >> ^ >> >> sym.c(153): warning #266: function "create_ht" declared implicitly >> create_ht( &symtab ) ; >> ^ >> >> sym.c(157): warning #266: function "exit" declared implicitly >> exit(1) ; >> ^ >> >> symtab_gen.c(62): warning #266: function "hash" declared implicitly >> index = hash( name ) ; >> ^ >> >> gen_comms.c(157): warning #1011: missing return statement at end of >> non-void function "print_4d_i1_decls" >> } >> ^ >> >> gen_comms.c(196): warning #1011: missing return statement at end of >> non-void function "print_decl" >> } >> ^ >> >> gen_comms.c(206): warning #1011: missing return statement at end of >> non-void function "print_body" >> } >> ^ >> >> gen_comms.c(248): warning #266: function "make_upper_case" declared >> implicitly >> make_upper_case(commname) ; >> ^ >> >> gen_comms.c(266): 
warning #266: function "make_upper_case" declared >> implicitly >> make_upper_case(commname) ; >> ^ >> >> gen_comms.c(985): warning #266: function "make_upper_case" declared >> implicitly >> make_upper_case(commname) ; >> ^ >> >> gen_comms.c(1178): warning #266: function "make_upper_case" declared >> implicitly >> make_upper_case(commname) ; >> ^ >> >> gen_comms.c(1312): warning #266: function "make_upper_case" declared >> implicitly >> make_upper_case(commname) ; >> ^ >> >> gen_comms.c(1454): warning #266: function "make_upper_case" declared >> implicitly >> make_upper_case(commname) ; >> ^ >> >> gen_comms.c(1672): warning #268: the format string ends before this >> argument >> sprintf(fname,"shift_halo",*direction) ; >> ^ >> >> gen_comms.c(1949): warning #1011: missing return statement at end of >> non-void function "gen_shift" >> } >> ^ >> >> gen_comms.c(2428): warning #1011: missing return statement at end of >> non-void function "gen_debug" >> } >> ^ >> >> set_dim_strs.c(145): warning #1011: missing return statement at end of >> non-void function "set_dim_strs" >> } >> ^ >> >> set_dim_strs.c(153): warning #1011: missing return statement at end of >> non-void function "set_dim_strs2" >> } >> ^ >> >> set_dim_strs.c(159): warning #1011: missing return statement at end of >> non-void function "set_dim_strs3" >> } >> ^ >> >> gen_wrf_io.c(36): warning #266: function "sym_forget" declared implicitly >> OP_F(fp,"wrf_bdyout.inc") ; >> ^ >> >> gen_wrf_io.c(452): warning #266: function "make_upper_case" declared >> implicitly >> make_upper_case(dname) ; >> ^ >> >> gen_streams.c(25): warning #266: function "gen_io_domain_defs" declared >> implicitly >> gen_io_domain_defs( fp ) ; >> ^ >> >> gen_streams.c(33): warning #266: function "gen_set_timekeeping_defs" >> declared implicitly >> gen_set_timekeeping_defs( fp ) ; >> ^ >> >> gen_streams.c(41): warning #266: function "gen_set_timekeeping_alarms" >> declared implicitly >> gen_set_timekeeping_alarms( fp ) ; >> ^ >> >> 
gen_streams.c(49): warning #266: function "gen_io_form_for_dataset" >> declared implicitly >> gen_io_form_for_dataset( fp ) ; >> ^ >> >> gen_streams.c(57): warning #266: function "gen_io_form_for_stream" >> declared implicitly >> gen_io_form_for_stream( fp ) ; >> ^ >> >> gen_streams.c(65): warning #266: function "gen_switches_and_alarms" >> declared implicitly >> gen_switches_and_alarms( fp ) ; >> ^ >> >> gen_streams.c(73): warning #266: function "gen_check_auxstream_alarms" >> declared implicitly >> gen_check_auxstream_alarms( fp ) ; >> ^ >> >> gen_streams.c(81): warning #266: function "gen_fine_stream_input" declared >> implicitly >> gen_fine_stream_input( fp ) ; >> ^ >> >> gen_streams.c(89): warning #266: function "gen_med_auxinput_in" declared >> implicitly >> gen_med_auxinput_in( fp ) ; >> ^ >> >> gen_streams.c(97): warning #266: function "gen_med_hist_out_opens" >> declared implicitly >> gen_med_hist_out_opens( fp ) ; >> ^ >> >> gen_streams.c(105): warning #266: function "gen_med_hist_out_closes" >> declared implicitly >> gen_med_hist_out_closes( fp ) ; >> ^ >> >> gen_streams.c(113): warning #266: function "gen_med_auxinput_in_closes" >> declared implicitly >> gen_med_auxinput_in_closes( fp ) ; >> ^ >> >> gen_streams.c(121): warning #266: function "gen_med_last_solve_io" >> declared implicitly >> gen_med_last_solve_io( fp ) ; >> ^ >> >> gen_streams.c(129): warning #266: function "gen_med_open_esmf_calls" >> declared implicitly >> gen_med_open_esmf_calls( fp ) ; >> ^ >> >> gen_streams.c(137): warning #266: function "gen_med_find_esmf_coupling" >> declared implicitly >> gen_med_find_esmf_coupling( fp ) ; >> ^ >> >> gen_streams.c(145): warning #266: function "gen_shutdown_closes" declared >> implicitly >> gen_shutdown_closes( fp ) ; >> ^ >> >> gen_streams.c(180): warning #1011: missing return statement at end of >> non-void function "gen_io_domain_defs" >> } >> ^ >> >> gen_streams.c(213): warning #1011: missing return statement at end of >> non-void function 
"gen_set_timekeeping_defs" >> } >> ^ >> >> gen_streams.c(296): warning #1011: missing return statement at end of >> non-void function "gen_set_timekeeping_alarms" >> } >> ^ >> >> gen_streams.c(323): warning #1011: missing return statement at end of >> non-void function "gen_io_form_for_dataset" >> } >> ^ >> >> gen_streams.c(350): warning #1011: missing return statement at end of >> non-void function "gen_io_form_for_stream" >> } >> ^ >> >> gen_streams.c(369): warning #1011: missing return statement at end of >> non-void function "gen_switches_and_alarms" >> } >> ^ >> >> gen_streams.c(400): warning #1011: missing return statement at end of >> non-void function "gen_check_auxstream_alarms" >> } >> ^ >> >> gen_streams.c(422): warning #1011: missing return statement at end of >> non-void function "gen_fine_stream_input" >> } >> ^ >> >> gen_streams.c(437): warning #1011: missing return statement at end of >> non-void function "gen_med_auxinput_in" >> } >> ^ >> >> gen_streams.c(452): warning #1011: missing return statement at end of >> non-void function "gen_med_hist_out_opens" >> } >> ^ >> >> gen_streams.c(468): warning #1011: missing return statement at end of >> non-void function "gen_med_hist_out_closes" >> } >> ^ >> >> gen_streams.c(484): warning #1011: missing return statement at end of >> non-void function "gen_med_auxinput_in_closes" >> } >> ^ >> >> gen_streams.c(497): warning #1011: missing return statement at end of >> non-void function "gen_med_last_solve_io" >> } >> ^ >> >> gen_streams.c(508): warning #1011: missing return statement at end of >> non-void function "gen_shutdown_closes" >> } >> ^ >> >> gen_streams.c(628): warning #1011: missing return statement at end of >> non-void function "gen_io_boilerplate" >> } >> ^ >> >> standard.c(43): warning #266: function "strncpy" declared implicitly >> strncpy(q,p,4) ; q+=4 ; >> ^ >> >> standard.c(54): warning #266: function "strncmp" declared implicitly >> if ( !strncmp( wrf_error_fatal_str, "wrf_error_fatal", 15 
) && >> wrf_error_fatal_str[15] != '3' ) >> ^ >> >> standard.c(78): warning #266: function "strcpy" declared implicitly >> strcpy(lineo,p+3+ns) ; >> ^ >> >> standard.c(88): warning #266: function "strcat" declared implicitly >> strcat(lineo,linei) ; >> ^ >> >> standard.c(166): warning #1011: missing return statement at end of >> non-void function "drop_comment" >> } >> ^ >> >> standard.c(176): warning #1011: missing return statement at end of >> non-void function "change_to_lower" >> } >> ^ >> >> opening Registry/registry.dimspec >> including Registry/registry.dimspec >> opening Registry/registry.les >> including Registry/registry.les >> opening Registry/registry.io_boilerplate >> including Registry/registry.io_boilerplate >> opening Registry/io_boilerplate_temporary.inc >> including Registry/io_boilerplate_temporary.inc >> opening Registry/registry.fire >> including Registry/registry.fire >> opening Registry/registry.avgflx >> including Registry/registry.avgflx >> Registry INFO variable counts: 0d 1823 1d 69 2d 469 3d 281 >> ADVISORY: RSL_LITE version of gen_comms is linked in with registry >> program. >> ar: creating ../main/libwrflib.a >> >> >> *and then downloaded the WRF Model Test Data (January 2000 WPS output >> data: >> http://www.mmm.ucar.edu/wrf/users/download/get_source2.html), after that >> in the test/em_real directory, I used this command :* >> >> cp namelist.input.jan00 namelist.input >> >> *and then ./real.exe . i saw this :* >> >> >> Namelist dfi_control not found in namelist.input. Using registry defaults >> for v >> ariables in dfi_control >> Namelist tc not found in namelist.input. Using registry defaults for >> variables >> in tc >> Namelist scm not found in namelist.input. Using registry defaults for >> variables >> in scm >> Namelist fire not found in namelist.input. 
Using registry defaults for >> variable >> s in fire >> --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and >> auxinput4_interval >> = 0 for all domains >> --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and >> ending t >> ime to 0 for that domain. >> --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, >> setting >> sgfdda interval and ending time to 0 for that domain. >> --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging >> interval an >> d ending time to 0 for that domain. >> --- NOTE: num_soil_layers has been set to 4 >> REAL_EM V3.2.1 PREPROCESSOR >> ************************************* >> Parent domain >> ids,ide,jds,jde 1 74 1 61 >> ims,ime,jms,jme -4 79 -4 66 >> ips,ipe,jps,jpe 1 74 1 61 >> ************************************* >> DYNAMICS OPTION: Eulerian Mass Coordinate >> alloc_space_field: domain 1, 104874600 bytes allocated >> Time period # 1 to process = 2000-01-24_12:00:00. >> Time period # 2 to process = 2000-01-24_18:00:00. >> Time period # 3 to process = 2000-01-25_00:00:00. >> Time period # 4 to process = 2000-01-25_06:00:00. >> Time period # 5 to process = 2000-01-25_12:00:00. >> Total analysis times to input = 5. >> >> >> ----------------------------------------------------------------------------- >> >> Domain 1: Current date being processed: 2000-01-24_12:00:00.0000, which >> is loop # 1 out of 5 >> configflags%julyr, %julday, %gmt: 2000 24 12.00000 >> metgrid input_wrf.F first_date_input = 2000-01-24_12:00:00 >> metgrid input_wrf.F first_date_nml = 2000-01-24_12:00:00 >> d01 2000-01-24_12:00:00 Timing for input 0 s. >> Max map factor in domain 1 = 1.03. Scale the dt in the model >> accordingly. 
>> forrtl: severe (66): output statement overflows record, unit -5, file >> Internal Formatted Write >> Image PC Routine Line >> Source >> real.exe 08DBF613 Unknown Unknown Unknown >> real.exe 08DBE330 Unknown Unknown Unknown >> real.exe 08D7B22E Unknown Unknown Unknown >> real.exe 08D32BBC Unknown Unknown Unknown >> real.exe 08D324BA Unknown Unknown Unknown >> real.exe 08D62029 Unknown Unknown Unknown >> real.exe 08093551 Unknown Unknown Unknown >> real.exe 0809C118 Unknown Unknown Unknown >> >> *and after that , wrf.exe : i saw this * >> >> >> Namelist dfi_control not found in namelist.input. Using registry defaults >> for v >> ariables in dfi_control >> Namelist tc not found in namelist.input. Using registry defaults for >> variables >> in tc >> Namelist scm not found in namelist.input. Using registry defaults for >> variables >> in scm >> Namelist fire not found in namelist.input. Using registry defaults for >> variable >> s in fire >> --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and >> auxinput4_interval >> = 0 for all domains >> --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and >> ending t >> ime to 0 for that domain. >> --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, >> setting >> sgfdda interval and ending time to 0 for that domain. >> --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging >> interval an >> d ending time to 0 for that domain. >> --- NOTE: num_soil_layers has been set to 4 >> WRF V3.2.1 MODEL >> ************************************* >> Parent domain >> ids,ide,jds,jde 1 74 1 61 >> ims,ime,jms,jme -4 79 -4 66 >> ips,ipe,jps,jpe 1 74 1 61 >> ************************************* >> DYNAMICS OPTION: Eulerian Mass Coordinate >> alloc_space_field: domain 1, 98194920 bytes allocated >> -------------- FATAL CALLED --------------- >> FATAL CALLED FROM FILE: LINE: 67 >> program wrf: error opening wrfinput_d01 for reading ierr= >> -1021 >> >> >> What should i do? please help me with this. 
>> >> Best Regards
>>
>> --
>> *Ehsan Beigi*
>> *PhD Student*
>> *Department of Civil and Environmental Engineering
>> 2408 Patrick F. Taylor Hall
>> Louisiana State University
>> Baton Rouge, LA, 70803*
>
> --
> *Ehsan Beigi*
> *PhD Student*
> *Department of Civil and Environmental Engineering
> 2408 Patrick F. Taylor Hall
> Louisiana State University
> Baton Rouge, LA, 70803*

--
*Ehsan Beigi*
*PhD Student*
*Department of Civil and Environmental Engineering
2408 Patrick F. Taylor Hall
Louisiana State University
Baton Rouge, LA, 70803*
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110316/33f81762/attachment-0001.html

From ebeigi3 at tigers.lsu.edu Thu Mar 17 15:09:33 2011
From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi)
Date: Thu, 17 Mar 2011 16:09:33 -0500
Subject: [Wrf-users] namelist.input file
Message-ID:

I am running the real data test case described in the "ARW User Guide" on page 93, Real Data Test Case: 2000 January 24/12 through 25/12. I have already installed WPS. I get this error when I type ./real.exe:

Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control
Namelist tc not found in namelist.input. Using registry defaults for variables in tc
Namelist scm not found in namelist.input. Using registry defaults for variables in scm
Namelist fire not found in namelist.input. Using registry defaults for variables in fire
--- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains
--- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain.
--- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain.
--- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain.
--- NOTE: num_soil_layers has been set to 4
REAL_EM V3.2.1 PREPROCESSOR
*************************************
Parent domain
ids,ide,jds,jde 1 74 1 61
ims,ime,jms,jme -4 79 -4 66
ips,ipe,jps,jpe 1 74 1 61
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1, 104874600 bytes allocated
Time period # 1 to process = 2000-01-24_12:00:00.
Time period # 2 to process = 2000-01-24_18:00:00.
Time period # 3 to process = 2000-01-25_00:00:00.
Time period # 4 to process = 2000-01-25_06:00:00.
Time period # 5 to process = 2000-01-25_12:00:00.
Total analysis times to input = 5.
-----------------------------------------------------------------------------
Domain 1: Current date being processed: 2000-01-24_12:00:00.0000, which is loop # 1 out of 5
configflags%julyr, %julday, %gmt: 2000 24 12.00000
metgrid input_wrf.F first_date_input = 2000-01-24_12:00:00
metgrid input_wrf.F first_date_nml = 2000-01-24_12:00:00
d01 2000-01-24_12:00:00 Timing for input 0 s.
Max map factor in domain 1 = 1.03. Scale the dt in the model accordingly.
forrtl: severe (66): output statement overflows record, unit -5, file Internal Formatted Write
Image PC Routine Line Source
real.exe 08DBF613 Unknown Unknown Unknown
real.exe 08DBE330 Unknown Unknown Unknown
real.exe 08D7B22E Unknown Unknown Unknown
real.exe 08D32BBC Unknown Unknown Unknown
real.exe 08D324BA Unknown Unknown Unknown
real.exe 08D62029 Unknown Unknown Unknown
real.exe 08093551 Unknown Unknown Unknown
real.exe 0809C118 Unknown Unknown Unknown

When I type ./wrf.exe, I see this:

Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control
Namelist tc not found in namelist.input. Using registry defaults for variables in tc
Namelist scm not found in namelist.input. Using registry defaults for variables in scm
Namelist fire not found in namelist.input.
Using registry defaults for variables in fire
--- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains
--- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain.
--- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain.
--- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain.
--- NOTE: num_soil_layers has been set to 4
WRF V3.2.1 MODEL
*************************************
Parent domain
ids,ide,jds,jde 1 74 1 61
ims,ime,jms,jme -4 79 -4 66
ips,ipe,jps,jpe 1 74 1 61
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1, 98194920 bytes allocated
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: LINE: 67
program wrf: error opening wrfinput_d01 for reading ierr= -1021

I think the problem is related to the input file. Could you please send me an input file for the Jan 00 em_real test case?

Best Regards

--
*Ehsan Beigi*
*PhD Student*
*Department of Civil and Environmental Engineering
2408 Patrick F. Taylor Hall
Louisiana State University
Baton Rouge, LA, 70803*
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110317/d86de73e/attachment.html

From ebeigi3 at tigers.lsu.edu Fri Mar 18 14:25:46 2011
From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi)
Date: Fri, 18 Mar 2011 15:25:46 -0500
Subject: [Wrf-users] running the real test case
Message-ID:

I am running the real data test case described in the "ARW User Guide" on page 93, Real Data Test Case: 2000 January 24/12 through 25/12. I have already installed WPS. I get this error when I type ./real.exe:

Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control
Namelist tc not found in namelist.input.
Using registry defaults for variables in tc
Namelist scm not found in namelist.input. Using registry defaults for variables in scm
Namelist fire not found in namelist.input. Using registry defaults for variables in fire
--- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains
--- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain.
--- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain.
--- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain.
--- NOTE: num_soil_layers has been set to 4
REAL_EM V3.2.1 PREPROCESSOR
*************************************
Parent domain
ids,ide,jds,jde 1 74 1 61
ims,ime,jms,jme -4 79 -4 66
ips,ipe,jps,jpe 1 74 1 61
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1, 104874600 bytes allocated
Time period # 1 to process = 2000-01-24_12:00:00.
Time period # 2 to process = 2000-01-24_18:00:00.
Time period # 3 to process = 2000-01-25_00:00:00.
Time period # 4 to process = 2000-01-25_06:00:00.
Time period # 5 to process = 2000-01-25_12:00:00.
Total analysis times to input = 5.
-----------------------------------------------------------------------------
Domain 1: Current date being processed: 2000-01-24_12:00:00.0000, which is loop # 1 out of 5
configflags%julyr, %julday, %gmt: 2000 24 12.00000
metgrid input_wrf.F first_date_input = 2000-01-24_12:00:00
metgrid input_wrf.F first_date_nml = 2000-01-24_12:00:00
d01 2000-01-24_12:00:00 Timing for input 0 s.
Max map factor in domain 1 = 1.03. Scale the dt in the model accordingly.
forrtl: severe (66): output statement overflows record, unit -5, file Internal Formatted Write
Image PC Routine Line Source
real.exe 08DBF613 Unknown Unknown Unknown
real.exe 08DBE330 Unknown Unknown Unknown
real.exe 08D7B22E Unknown Unknown Unknown
real.exe 08D32BBC Unknown Unknown Unknown
real.exe 08D324BA Unknown Unknown Unknown
real.exe 08D62029 Unknown Unknown Unknown
real.exe 08093551 Unknown Unknown Unknown
real.exe 0809C118 Unknown Unknown Unknown

When I type ./wrf.exe, I see this:

Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control
Namelist tc not found in namelist.input. Using registry defaults for variables in tc
Namelist scm not found in namelist.input. Using registry defaults for variables in scm
Namelist fire not found in namelist.input. Using registry defaults for variables in fire
--- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains
--- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain.
--- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain.
--- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain.
--- NOTE: num_soil_layers has been set to 4
WRF V3.2.1 MODEL
*************************************
Parent domain
ids,ide,jds,jde 1 74 1 61
ims,ime,jms,jme -4 79 -4 66
ips,ipe,jps,jpe 1 74 1 61
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1, 98194920 bytes allocated
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: LINE: 67
program wrf: error opening wrfinput_d01 for reading ierr= -1021

I think the problem is related to the input file. Could you please send me an input file for the Jan 00 em_real test case?
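[Editor's note] The wrf.exe failure above follows directly from the real.exe crash: real.exe aborted with the forrtl severe (66) error before writing its output, so there is no wrfinput_d01 on disk, which is what the "error opening wrfinput_d01 for reading" message is reporting. A minimal pre-flight check along these lines (file names assumed for a single-domain ARW run started from test/em_real) makes that failure mode obvious before wrf.exe is launched:

```python
# Minimal sketch: verify that real.exe actually produced the files wrf.exe
# needs before launching the model. File names assume a single-domain ARW
# run; adjust max_dom for nested runs.
import os

def missing_wrf_inputs(run_dir=".", max_dom=1):
    """Return the real.exe output files that are absent from run_dir."""
    required = ["wrfbdy_d01"] + [f"wrfinput_d{d:02d}" for d in range(1, max_dom + 1)]
    return [f for f in required if not os.path.isfile(os.path.join(run_dir, f))]

# In a directory where real.exe never completed, everything is reported
# missing, e.g. missing_wrf_inputs(".") -> ['wrfbdy_d01', 'wrfinput_d01']
```

If this list is non-empty, fixing the real.exe crash (not wrf.exe) is the place to start.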
Best Regards

--
*Ehsan Beigi*
*PhD Student*
*Department of Civil and Environmental Engineering
2408 Patrick F. Taylor Hall
Louisiana State University
Baton Rouge, LA, 70803*

--
*Ehsan Beigi*
*PhD Student*
*Department of Civil and Environmental Engineering
2408 Patrick F. Taylor Hall
Louisiana State University
Baton Rouge, LA, 70803*
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110318/ab26bc6f/attachment.html

From mmkamal at sciborg.uwaterloo.ca Thu Mar 17 19:50:09 2011
From: mmkamal at sciborg.uwaterloo.ca (mmkamal at sciborg.uwaterloo.ca)
Date: Thu, 17 Mar 2011 21:50:09 -0400
Subject: [Wrf-users] Model is crashing with CAM shortwave scheme
Message-ID: <20110317215009.11151ig42kp8yr8k@www.nexusmail.uwaterloo.ca>

Hi All,

After finishing a successful run with the Dudhia shortwave scheme I tried to use the CAM shortwave scheme but could not succeed. I was wondering whether anyone has faced the same problem while using the CAM radiation scheme. I get the following error message:

taskid: 0 hostname: bro95
Namelist fdda not found in namelist.input. Using registry defaults for variables in fdda
Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control
Namelist tc not found in namelist.input. Using registry defaults for variables in tc
Namelist scm not found in namelist.input. Using registry defaults for variables in scm
Namelist fire not found in namelist.input.
Using registry defaults for variables in fire
Ntasks in X 2, ntasks in Y 4
NOTE: CAM radiation is in use, setting: paerlev=29, levsiz=59, cam_abs_dim1=4, cam_abs_dim2=e_vert
NOTE: num_soil_layers has been set to 4
WRF V3.2 MODEL
wrf: calling alloc_and_configure_domain
*************************************
Parent domain
ids,ide,jds,jde 1 225 1 175
ims,ime,jms,jme -4 119 -4 51
ips,ipe,jps,jpe 1 112 1 44
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1, 184262200 bytes allocated
wrf: calling model_to_grid_config_rec
wrf: calling set_scalar_indices_from_config
wrf: calling init_wrfio
Entering ext_gr1_ioinit
setup_timekeeping: set xtime to 0.0000000E+00
setup_timekeeping: set julian to 172.0000
setup_timekeeping: returning...
wrf main: calling open_r_dataset for wrfinput
med_initialdata_input: calling input_input
input_wrf: set xtime to 0.0000000E+00
mminlu = 'USGS'
med_initialdata_input: back from input_input
checking boundary conditions for grid
boundary conditions OK for grid
start_domain_em: Before call to phy_init
WRF TILE 1 IS 1 IE 112 JS 1 JE 44
set_tiles3: NUMBER OF TILES = 1
top of phy_init
phy_init: start_of_simulation = T
calling nl_get_iswater, nl_get_isice, nl_get_mminlu_loc
after nl_get_iswater, nl_get_isice, nl_get_mminlu_loc
top of landuse_init
INPUT LandUse = "USGS"
returning from of landuse_init
reading CAM_ABS_DATA
num_months = 13
reading CAM_AEROPT_DATA
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image PC Routine Line Source
wrf.exe 00000000014217C1 Unknown Unknown Unknown
wrf.exe 0000000001423821 Unknown Unknown Unknown
wrf.exe 00000000014225FA Unknown Unknown Unknown
wrf.exe 00000000011A6D5A Unknown Unknown Unknown
wrf.exe 0000000000D2DB60 Unknown Unknown Unknown
wrf.exe 0000000000D27583 Unknown Unknown Unknown
wrf.exe 0000000000A8B6F7 Unknown Unknown Unknown
wrf.exe 00000000009CA6AE Unknown Unknown Unknown
wrf.exe 00000000009C82E7 Unknown Unknown Unknown
wrf.exe
000000000047FBC2 Unknown Unknown Unknown
wrf.exe 000000000047F009 Unknown Unknown Unknown
wrf.exe 000000000047EFAC Unknown Unknown Unknown
libc.so.6 00002AE11F2F9994 Unknown Unknown Unknown
wrf.exe 000000000047EEB9 Unknown Unknown Unknown

############################################################################
### The namelist.input file I am using is given below ##############
############################################################################

&time_control
run_days = 5,
run_hours = 0,
run_minutes = 0,
run_seconds = 0,
start_year = 2008, 2008, 2008,
start_month = 06, 06, 06,
start_day = 21, 21, 21,
start_hour = 00, 00, 00,
start_minute = 00, 00, 00,
start_second = 00, 00, 00,
end_year = 2008, 2008, 2008,
end_month = 06, 06, 06,
end_day = 29, 29, 29,
end_hour = 00, 00, 00,
end_minute = 00, 00, 00,
end_second = 00, 00, 00,
interval_seconds = 10800
input_from_file = .true.,.true.,.true.,
history_interval = 180, 180, 180,
frames_per_outfile = 1000, 1000, 1000,
restart = .false.,
restart_interval = 5000,
io_form_history = 2
io_form_restart = 2
io_form_input = 2
io_form_boundary = 2
debug_level = 100
/

&domains
time_step = 48,
time_step_fract_num = 0,
time_step_fract_den = 1,
max_dom = 3,
e_we = 225, 202, 181,
e_sn = 175, 202, 271,
e_vert = 28, 28, 28,
p_top_requested = 10000,
num_metgrid_levels = 30
num_metgrid_soil_levels = 4,
dx = 8000, 2666.666, 888.888,
dy = 8000, 2666.666, 888.888,
grid_id = 1, 2, 3,
parent_id = 1, 1, 2,
i_parent_start = 1, 75, 70,
j_parent_start = 1, 65, 50,
parent_grid_ratio = 1, 3, 3,
parent_time_step_ratio = 1, 3, 3,
feedback = 1,
smooth_option = 0
/

&physics
mp_physics = 6, 6, 6,
ra_lw_physics = 1, 1, 1,
ra_sw_physics = 3, 3, 3,
radt = 8, 8, 8,
sf_sfclay_physics = 1, 1, 1,
sf_surface_physics = 2, 2, 2,
bl_pbl_physics = 1, 1, 1,
bldt = 0, 0, 0,
cu_physics = 1, 0, 0,
cudt = 5, 5, 5,
isfflx = 1,
ifsnow = 0,
icloud = 1,
surface_input_source = 1,
num_soil_layers = 4,
sf_urban_physics = 0, 0, 0,
maxiens = 1,
maxens = 3,
maxens2 = 3,
maxens3 = 16,
ensdim = 144,
/

&dynamics
w_damping = 0,
diff_opt = 1,
km_opt = 4,
diff_6th_opt = 0, 0, 0,
diff_6th_factor = 0.12, 0.12, 0.12,
base_temp = 290.
damp_opt = 0,
zdamp = 5000., 5000., 5000.,
dampcoef = 0.2, 0.2, 0.2
khdif = 0, 0, 0,
kvdif = 0, 0, 0,
non_hydrostatic = .true., .true., .true.,
moist_adv_opt = 1, 1, 1,
scalar_adv_opt = 1, 1, 1,
/

&bdy_control
spec_bdy_width = 5,
spec_zone = 1,
relax_zone = 4,
specified = .true., .false.,.false.,
nested = .false., .true., .true.,
/

&grib2
/

&namelist_quilt
nio_tasks_per_group = 0,
nio_groups = 1,
/

############################################################################
### The namelist.wps file I am using is given below ##############
############################################################################

&share
wrf_core = 'ARW',
max_dom = 3,
start_date = '2008-06-21_00:00:00','2008-06-21_00:00:00', '2008-06-21_00:00:00',
end_date = '2008-06-29_00:00:00','2008-06-21_00:00:00', '2008-06-21_00:00:00',
interval_seconds = 10800
io_form_geogrid = 2,
/

&geogrid
parent_id = 1, 1, 2,
parent_grid_ratio = 1, 3, 3,
i_parent_start = 1, 75, 70,
j_parent_start = 1, 65, 50,
e_we = 225, 202, 181,
e_sn = 175, 202, 271,
geog_data_res = 'modis_30s+5m', 'modis_30s+2m', 'modis_30s+30s'
dx = 8000,
dy = 8000,
map_proj = 'lambert',
ref_lat = 43.0,
ref_lon = -80.0,
truelat1 = 30.0,
truelat2 = 60.0,
stand_lon = -80.0,
geog_data_path = '/work/kamal/brown/DATA/geog'
/

&ungrib
out_format = 'WPS',
prefix = 'NARR',
/

&metgrid
fg_name = 'NARR'
io_form_metgrid = 2,
constants_name ='./NARRFIX:1979-11-08_00'
/

&mod_levs
press_pa = 201300 , 200100 , 100000 , 95000 , 90000 , 85000 , 80000 , 75000 , 70000 , 65000 , 60000 , 55000 , 50000 , 45000 , 40000 , 35000 , 30000 , 25000 , 20000 , 15000 , 10000 , 5000 , 1000
/

So, it only seems to be a problem (for me) when I use the CAM shortwave scheme instead of the Dudhia scheme. Does anyone know the reason for this?
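[Editor's note] The crash above occurs right after "reading CAM_AEROPT_DATA". The CAM radiation scheme allocates much larger work arrays than Dudhia (the log shows WRF raising paerlev, levsiz and cam_abs_dim* automatically), and a frequently reported cause of a SIGSEGV at exactly this point is an undersized process stack, addressed by raising the limit (e.g. `ulimit -s unlimited`) before launching wrf.exe. That is a hedged diagnosis, not a certain one. A small Python sketch for checking the current soft limit follows; the 64 MiB warning threshold is an arbitrary assumption, not an official WRF requirement:

```python
# Diagnostic sketch: report the soft stack limit, a common culprit for
# segmentation faults while WRF's CAM radiation reads its lookup tables.
# POSIX-only (uses the 'resource' module); the 64 MiB threshold is a guess.
import resource

def stack_limit_mb():
    """Return the soft stack limit in MiB, or None if unlimited."""
    soft, _hard = resource.getrlimit(resource.RLIMIT_STACK)
    return None if soft == resource.RLIM_INFINITY else soft / 2**20

limit = stack_limit_mb()
if limit is not None and limit < 64:
    print(f"soft stack limit is only {limit:.0f} MiB; "
          "consider 'ulimit -s unlimited' before running wrf.exe")
```

Run it in the same shell (or batch script) that launches wrf.exe, since the limit is per-process and inherited.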
Thanks
Kamal

From thomas.schwitalla at uni-hohenheim.de Thu Mar 17 06:37:54 2011
From: thomas.schwitalla at uni-hohenheim.de (Thomas Schwitalla)
Date: Thu, 17 Mar 2011 13:37:54 +0100
Subject: [Wrf-users] Wrf-users Digest, Vol 79, Issue 19
In-Reply-To:
References:
Message-ID: <4D8200A2.1050103@uni-hohenheim.de>

Dear Maria,

the problems are the "intent(in)" and "intent(out)" statements at lines 2363/2364 of dyn_em/module_initialize_real.F. Try deleting the intent statements; that should help for gfortran 4.2.4. The other way would be to upgrade to gfortran 4.3.x, which can handle these pointer declarations without problems.

Best regards,
Thomas

On 16.03.2011 20:53, wrf-users-request at ucar.edu wrote:
> Message: 3
> Date: Tue, 15 Mar 2011 14:28:06 -0400
> From: Maria Eugenia
> Subject: [Wrf-users] problem installing wrf v3.2.1 with gcc v4.2.4
> To: wrf-users at ucar.edu
> Message-ID:
> Content-Type: text/plain; charset="iso-8859-1"
>
> Dear all,
>
> I'm trying to install WRFV3.2.1 (dmpar, without support for grib2 io)
> on an SGI Altix 3700 IA64 Itanium 2.
> I'm using gcc/gfortran v 4.2.4, netcdf v4.1.1, mpich2 v1.3.2 and libpng v1.2.12.
> I have successfully built WRF V3.2 with the above configuration, but
> the newest version does not compile completely.
>
> After compiling for em_real, I only have wrf.exe in the WRFV3/main
> directory, no real.exe nor any of the other executables that should be
> there. I don't understand why wrf.exe compiles and real.exe doesn't...
>
> I also tried to compile it serial only, to exclude a possible problem
> with mpich2, but the result is the same.
> Maybe it is relevant to mention that gcc is not the default compiler;
> the default is ifort 9.0. However, I have checked the paths and
> installed the required software with gcc (netcdf, mpich2, libpng).
> Besides that, when I run ./configure, I only see options to use the Intel
> compiler:
> 1. Linux SGI Altix, ifort compiler with icc 9.x,10.x (serial)
> 2.
Linux SGI Altix, ifort compiler with icc 9.x,10.x (smpar) > 3. Linux SGI Altix, ifort compiler with icc 9.x,10.x (dmpar) > 4. Linux SGI Altix, ifort compiler with icc 9.x,10.x (dm+sm) > Therefore I created my configure.wrf file based on gcc. Anyway, I > don't believe this is the problem given that I successfully built v3.2 > with the same configuration. > > The configure.wrf file for the serial and dmpar build are attached as > well as the compile.log. > I sincerely appreciate your help. > > Maria E. B. Frediani > ------------------------------------------------------------------------------------- > Visiting Scholar > University of Connecticut > School of Engineering > 261 Glenbrook Rd > Storrs, CT 06269 > frediani at engr.uconn.edu > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: configure.wrfv3.2.1_x86_64_gfortran_gcc_serial > Type: application/octet-stream > Size: 19317 bytes > Desc: not available > Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/attachment.obj > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: compile.log.wrfv3.2.1.serial > Type: application/octet-stream > Size: 492280 bytes > Desc: not available > Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/attachment-0001.obj > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: configure.wrfv3.2.1_x86_64_gfortran_gcc_dmpar > Type: application/octet-stream > Size: 19428 bytes > Desc: not available > Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/attachment-0002.obj > -------------- next part -------------- > A non-text attachment was scrubbed... 
> Name: compile.log.wrfv3.2.1.dmpar > Type: application/octet-stream > Size: 509771 bytes > Desc: not available > Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/attachment-0003.obj > > ------------------------------ > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > End of Wrf-users Digest, Vol 79, Issue 19 > ***************************************** From ahsanshah01 at gmail.com Fri Mar 18 21:51:55 2011 From: ahsanshah01 at gmail.com (Ahsan Ali) Date: Sat, 19 Mar 2011 08:51:55 +0500 Subject: [Wrf-users] Error Ungrib Message-ID: Hello, I am trying to run WPS with NCEP GFS 1.0deg Global model data at 00Z with analysis and 180 hour forecasts included in each file. Forecast variables are available at 3 hour intervals. But ./ungrib.exe is giving following error. Namelist.wps attached. Please help me. *[pmdtest at pmd02 wps]$ ./ungrib.exe * * *** Starting program ungrib.exe **** *Start_date = 2010-07-20_00:00:00 , End_date = 2010-07-21_00:00:00* *output format is WPS* *Path to intermediate files is ./* * ungrib - grib edition num 2* * * * ############################################################################### * * * *Inventory for date = 2010-07-20 00:00:00* * * *PRES TT UU VV RH HGT PSFC PMSL SM000010 SM010040 SM040100 SM100200 SM010200 ST000010 ST010040 ST040100 ST100200 ST010200 SEAICE LANDSEA SOILHGT SKINTEMP SNOW SOILCAT VEGCAT * * ------------------------------------------------------------------------------- * *2013.0 O O O O O O X O O O O O O O O O O O O O O O O O * *2001.0 X X X X O X O X X X X O X X X X O X X X X X O O * *1000.0 X X X X X * * 975.0 X X X X X * * 950.0 X X X X X * * 925.0 X X X X X * * 900.0 X X X X X * * 850.0 X X X X X * * 800.0 X X X X X * * 750.0 X X X X X * * 700.0 X X X X X * * 650.0 X X X X X * * 600.0 X X X X X * * 550.0 X X X X X * * 500.0 X X X X X * * 450.0 X X X X X * * 400.0 X X X X X 
* * 350.0 X X X X X * * 300.0 X X X X X * * 250.0 X X X X X * * 200.0 X X X X X * * 150.0 X X X X X * * 100.0 X X X X X * * 70.0 X X X X X * * 50.0 X X X X X * * 30.0 X X X X X * * 20.0 X X X X * * 10.0 X X X X X * * ------------------------------------------------------------------------------- * * Begin rrpr* *Interpolating to fill in RH at level 2000* * * * ############################################################################### * * * *Inventory for date = 2010-07-20 00:00:00* * * *PRES TT UU VV RH HGT PSFC PMSL SM000010 SM010040 SM040100 SM100200 SM010200 ST000010 ST010040 ST040100 ST100200 ST010200 SEAICE LANDSEA SOILHGT SKINTEMP SNOW SOILCAT VEGCAT * * ------------------------------------------------------------------------------- * *2013.0 O O O O O O X O O O O O O O O O O O O O O O O O * *2001.0 X X X X O X O X X X X O X X X X O X X X X X O O * *1000.0 X X X X X * * 975.0 X X X X X * * 950.0 X X X X X * * 925.0 X X X X X * * 900.0 X X X X X * * 850.0 X X X X X * * 800.0 X X X X X * * 750.0 X X X X X * * 700.0 X X X X X * * 650.0 X X X X X * * 600.0 X X X X X * * 550.0 X X X X X * * 500.0 X X X X X * * 450.0 X X X X X * * 400.0 X X X X X * * 350.0 X X X X X * * 300.0 X X X X X * * 250.0 X X X X X * * 200.0 X X X X X * * 150.0 X X X X X * * 100.0 X X X X X * * 70.0 X X X X X * * 50.0 X X X X X * * 30.0 X X X X X * * 20.0 X X X X X * * 10.0 X X X X X * * ------------------------------------------------------------------------------- * *Subroutine DATINT: Interpolating 3-d files to fill in any missing data...* *Looking for data at time 2010-07-20_00* *Found file: GFS:2010-07-20_00* *Looking for data at time 2010-07-20_03* *ERROR: Data not found: 2010-07-20_03:00:00.0000* -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 -------------- next part -------------- An HTML attachment was scrubbed... 
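[Archive note] DATINT's complaint above means ungrib reached an intermediate time for which no matching GRIB record was found. A quick way to see exactly which timestamps ungrib will request is to enumerate them from start_date, end_date and interval_seconds — a sketch (the dates and the 3-hourly interval are taken from the log above):

```python
from datetime import datetime, timedelta

def expected_times(start, end, interval_seconds):
    """List the timestamps (YYYY-MM-DD_HH) that ungrib will look for,
    from start_date to end_date inclusive, every interval_seconds."""
    step = timedelta(seconds=interval_seconds)
    out = []
    t = start
    while t <= end:
        out.append(t.strftime("%Y-%m-%d_%H"))
        t += step
    return out

# Matches the log above: 00 UTC 20 July to 00 UTC 21 July, 3-hourly
times = expected_times(datetime(2010, 7, 20), datetime(2010, 7, 21), 10800)
print(times[:2], len(times))  # ['2010-07-20_00', '2010-07-20_03'] 9
```

Each of these timestamps must be covered by the GRIB files handed to link_grib.csh; a single 00Z file that contains only the analysis satisfies 2010-07-20_00 and then fails at 2010-07-20_03, exactly as in the log.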
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110319/0cf1f72c/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.wps Type: application/octet-stream Size: 742 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110319/0cf1f72c/attachment-0001.obj From claudiomet at gmail.com Sat Mar 19 14:26:53 2011 From: claudiomet at gmail.com (claudiomet) Date: Sat, 19 Mar 2011 17:26:53 -0300 Subject: [Wrf-users] vertical levels interpolation Message-ID: Greetings ! I have an operational WRF running at 1x1 km with 27 vertical levels. Now, I need to extract information every 10 vertical meters for the first 2000 meters above ground. Is it possible to do this with ARWpost, or do I need another tool? Greetings ! -- Claudio Cortes +56 (2) 2994121 Meteorologo Laboratorio de Innovación e Informática Ambiental (LIIA) Unidad de Modelacion y Gestion de la Calidad del Aire (UMGCA) Centro Nacional del Medio Ambiente (CENMA) -- Claudio Cortes +56 (2) 2994121 Meteorologist Laboratory of Innovation and Environmental Informatics Modeling and Air Quality Management Unit National Environment Center, Chile (CENMA) -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110319/39f43b14/attachment.html From eric.kemp at nasa.gov Mon Mar 21 07:25:05 2011 From: eric.kemp at nasa.gov (Kemp, Eric M. (GSFC-610.3)[NORTHROP GRUMMAN INFORMATION TECH]) Date: Mon, 21 Mar 2011 08:25:05 -0500 Subject: [Wrf-users] Wrf-users Digest, Vol 79, Issue 19 In-Reply-To: <4D8200A2.1050103@uni-hohenheim.de> Message-ID: I've also encountered this problem with gfortran 4.2. It's worth noting that declaring pointer arguments with the intent attribute is illegal in Fortran 90, and only became legal with Fortran 2003.
So perhaps those intent statements should be removed in the official source code distribution for better standard conformance. -Eric On 3/17/11 8:37 AM, "Thomas Schwitalla" wrote: > Dear Maria, > > the problems are the "intent(in)" and "intent(out)" statements at line > 2363/2364 of dyn_em/module_initialize_real.F . Try to delete the intent > statements, it should help for gfortran 4.2.4. The other way would be to > upgrade to gfortran 4.3.x which can handle these pointer declarations > without problems. > > Best regards, > Thomas > > Am 16.03.2011 20:53, schrieb wrf-users-request at ucar.edu: >> Message: 3 >> Date: Tue, 15 Mar 2011 14:28:06 -0400 >> From: Maria Eugenia >> Subject: [Wrf-users] problem installing wrf v3.2.1 with gcc v4.2.4 >> To: wrf-users at ucar.edu >> Message-ID: >> >> Content-Type: text/plain; charset="iso-8859-1" >> >> Dear all, >> >> I'm trying to install WRFV3.2.1 (dmpar, without support for grib2 io) >> in a SGI Altix 3700 IA64 Itanium 2. >> I'm using gcc/gfortran v 4.2.4, netcdf v4.1.1, mpich2 v1.3.2 and libpng >> v1.2.12. >> I have successfully built WRF V3.2, with the above configuration, but >> the newest version does not compile completely. >> >> After compiling for em_real, I only have wrf.exe in the WRFV3/main >> directory, no real.exe nor any of the other executables that should be >> there. I don't understand why wrf.exe compiles and real.exe >> doesn't.... >> >> I also tried to compile it serial only, to exclude a possible problem >> with mpich2, but the result is the same. >> Maybe it is relevant to mention that ggc is not the default compiler, >> the default is ifort 9.0. However I have checked the paths and >> installed the required softwares with gcc (netcdf,mpich2,libpng). >> Besides that, when I run ./configure, I only see options to use intel >> compiler: >> 1. Linux SGI Altix, ifort compiler with icc 9.x,10.x (serial) >> 2. Linux SGI Altix, ifort compiler with icc 9.x,10.x (smpar) >> 3. 
Linux SGI Altix, ifort compiler with icc 9.x,10.x (dmpar) >> 4. Linux SGI Altix, ifort compiler with icc 9.x,10.x (dm+sm) >> Therefore I created my configure.wrf file based on gcc. Anyway, I >> don't believe this is the problem given that I successfully built v3.2 >> with the same configuration. >> >> The configure.wrf file for the serial and dmpar build are attached as >> well as the compile.log. >> I sincerely appreciate your help. >> >> Maria E. B. Frediani >> ----------------------------------------------------------------------------- >> -------- >> Visiting Scholar >> University of Connecticut >> School of Engineering >> 261 Glenbrook Rd >> Storrs, CT 06269 >> frediani at engr.uconn.edu >> -------------- next part -------------- >> A non-text attachment was scrubbed... >> Name: configure.wrfv3.2.1_x86_64_gfortran_gcc_serial >> Type: application/octet-stream >> Size: 19317 bytes >> Desc: not available >> Url : >> http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/att >> achment.obj >> -------------- next part -------------- >> A non-text attachment was scrubbed... >> Name: compile.log.wrfv3.2.1.serial >> Type: application/octet-stream >> Size: 492280 bytes >> Desc: not available >> Url : >> http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/att >> achment-0001.obj >> -------------- next part -------------- >> A non-text attachment was scrubbed... >> Name: configure.wrfv3.2.1_x86_64_gfortran_gcc_dmpar >> Type: application/octet-stream >> Size: 19428 bytes >> Desc: not available >> Url : >> http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/att >> achment-0002.obj >> -------------- next part -------------- >> A non-text attachment was scrubbed... 
>> Name: compile.log.wrfv3.2.1.dmpar >> Type: application/octet-stream >> Size: 509771 bytes >> Desc: not available >> Url : >> http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110315/fbbd2fa2/att >> achment-0003.obj >> >> ------------------------------ >> >> _______________________________________________ >> Wrf-users mailing list >> Wrf-users at ucar.edu >> http://mailman.ucar.edu/mailman/listinfo/wrf-users >> >> >> End of Wrf-users Digest, Vol 79, Issue 19 >> ***************************************** > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users -------------------------------------------------------------------- Eric M. Kemp Northrop Grumman Corporation Meteorologist Information Systems Civil Enterprise Solutions Civil Systems Division Goddard Space Flight Center Mailstop 610.3 Greenbelt, MD 20771 Telephone 301-286-9768 Fax 301-286-1775 E-mail: eric.kemp at nasa.gov E-mail: eric.kemp at ngc.com -------------------------------------------------------------------- From edonnell at weatherzone.com.au Mon Mar 21 22:55:22 2011 From: edonnell at weatherzone.com.au (Elisabeth Donnell) Date: Tue, 22 Mar 2011 15:55:22 +1100 (EST) Subject: [Wrf-users] How to get RH field from dew point using ungrib? In-Reply-To: <1656872308.549.1300768902148.JavaMail.root@zimbra.theweather.com.au> Message-ID: <139141404.560.1300769722711.JavaMail.root@zimbra.theweather.com.au> Hello wrf-users, I am using CMC to drive WRF-NMM (V3.2). My input grib files contain dew point at 2m and dew point depression at other vertical levels (not RH). I think that WRF needs as input an RH field. I noticed there is some code in the ungrib (rrpr.F) that looks like it should do the conversion for me (see at end of email). My question is therefore ...is there something I should be doing to utilise this code, as at present when I run ungrib, RH does not appear in the intermediate file ? 
Should I set up my Vtable (and METGRID.TBL) differently? Any ideas much appreciated. This is the section of code I am referring to in in rrpr.F ! If surface RH is missing, see if we can compute RH from Specific Humidity ! or Dewpoint or Dewpoint depression: ! if (.not. is_there (200100, 'RH')) then if (is_there(200100, 'TT').and. & is_there(200100, 'PSFC' ) .and. & is_there(200100, 'SPECHUMD')) then call get_dims(200100, 'TT') call compute_rh_spechumd(map%nx, map%ny) call mprintf(.true.,DEBUG, & "RRPR: SURFACE RH is computed") elseif (is_there(200100, 'TT' ).and. & is_there(200100, 'DEWPT')) then call get_dims(200100, 'TT') call compute_rh_dewpt(map%nx, map%ny) elseif (is_there(200100, 'TT').and. & is_there(200100, 'DEPR')) then call get_dims(200100, 'TT') call compute_rh_depr(map%nx, map%ny, 200100.) endif endif and the Vtable wrf at model11:/var/domains/AUS9_CMC/wpsprd$ more Vtable GRIB1| Level| From | To | metgrid | metgrid | metgrid | Param| Type |Level1|Level2| Name | Units | Description | -----+------+------+------+----------+---------+------------------------------------------+ 11 | 100 | * | | TT | K | Temperature | 33 | 100 | * | | UU | m s-1 | U | 34 | 100 | * | | VV | m s-1 | V | 7 | 100 | * | | HGT | m | Height | 11 | 105 | 2 | | TT | K | Temperature at 2 m | 17 | 105 | 2 | | DEWPT | K | Dew point temperature at 2 m | 33 | 105 | 10 | | UU | m s-1 | U at 10 m | 34 | 105 | 10 | | VV | m s-1 | V at 10 m | 81 | 1 | 0 | | LANDSEA | proprtn | Land/Sea flag (1=land,0=sea) | 2 | 102 | 0 | | PMSL | Pa | Sea-level Pressure | 80 | 1 | 0 | | SST | K | Sea Surface Temperature | 76 | 100 | * | | QC | kg kg-1 | Cloud water mixing ratio -----+------+------+------+----------+---------+------------------------------------------+ Cheers, Liz Dr. Elisabeth Donnell weatherzone? M 0425322032 -------------- next part -------------- An HTML attachment was scrubbed... 
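[Archive note] The rrpr.F branch quoted above falls through to compute_rh_dewpt when temperature and DEWPT are both present. As a numerical sanity check of that kind of conversion, here is a sketch using the common Bolton (1980) saturation vapour-pressure approximation — the constants may differ in detail from the exact formula compiled into rrpr.F, so treat it as illustrative:

```python
import math

def sat_vapor_pressure(t_k):
    """Saturation vapour pressure in hPa (Bolton 1980 approximation)."""
    t_c = t_k - 273.15
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

def rh_from_dewpoint(t_k, td_k):
    """Relative humidity (%) from temperature and dew point, both in kelvin."""
    return 100.0 * sat_vapor_pressure(td_k) / sat_vapor_pressure(t_k)

print(rh_from_dewpoint(293.15, 293.15))  # dew point equals temperature -> 100.0
print(rh_from_dewpoint(293.15, 283.15))  # 10 K dew-point depression -> roughly 50-55
```

Dew-point depression (DEPR) is handled the same way, with td = t - depression, which is what compute_rh_depr does.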
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110322/3b3edf56/attachment.html From guitingsong at ntu.edu.sg Tue Mar 22 02:51:30 2011 From: guitingsong at ntu.edu.sg (Song Guiting) Date: Tue, 22 Mar 2011 16:51:30 +0800 Subject: [Wrf-users] wrf_user_intrp3d problem Message-ID: Dear WRF users and NCL guys, when I use wrf_user_intrp3d in NCL to make a vertical interpolation (in meters), I find the first interpolated level is always a missing value. For example, my model top is 30 km, so the first interpolated level is 300 meters, and it is always a missing value. As you can see from the figure Z_Vertical_CrossSection.000004.png, the white part at the surface represents the missing value, which contaminates the true terrain height. I would appreciate any suggestions. I copy the script as follows. Best regards, Guiting load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl" load "$NCARG_ROOT/lib/ncarg/nclscripts/wrf/WRFUserARW.ncl" ;load "./WRFUserARW.ncl" begin ; ; The WRF ARW input file. ; This needs to have a ".nc" appended, so just do it. a = addfile("./wrfout.nc","r") ; We generate plots, but what kind do we prefer? type = "x11" ; type = "pdf" ; type = "ps" ; type = "ncgm" type = "png" wks = gsn_open_wks(type,"Z_Vertical_CrossSection") ; Set some basic resources res = True res@MainTitle = "REAL-TIME WRF" res@Footer = False pltres = True ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ; What times and how many time steps are in the data set?
FirstTime = True FirstTimeMap = True times = wrf_user_list_times(a) ; get times in the file ntimes = dimsizes(times) ; number of times in the file ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; xlat = wrf_user_getvar(a, "XLAT",0) xlon = wrf_user_getvar(a, "XLONG",0) ter = wrf_user_getvar(a, "HGT",0) do it = 0,ntimes-1,2 ; TIME LOOP print("Working on time: " + times(it) ) res@TimeLabel = times(it) ; Set Valid time to use on plots ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ; First get the variables we will need tc = wrf_user_getvar(a,"tc",it) ; T in C rh = wrf_user_getvar(a,"rh",it) ; relative humidity z = wrf_user_getvar(a, "z",it) ; grid point height ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; do ip = 1, 3 ; we are doing 3 plots ; all with the pivot point (plane) in the center of the domain ; at angles 0, 45 and 90 ; | ; | ; angle=0 is |, angle=90 is ------ ; | ; | ; Build plane (pivot point) through which the cross section will go ; OR set to zero, if start and end points are specified ; IF using plane and angle, set opts in wrf_user_intrp3d to False dimsrh = dimsizes(rh) plane = new(2,float) plane = (/ dimsrh(2)/2, dimsrh(1)/2 /) ; pivot point is center of domain opts = False if(ip .eq. 1) then angle = 90. X_plane = wrf_user_intrp2d(xlon,plane,angle,opts) X_desc = "longitude" end if if(ip .eq. 2) then angle = 0. X_plane = wrf_user_intrp2d(xlat,plane,angle,opts) X_desc = "latitude" end if if(ip .eq. 3) then angle = 45.
X_plane = wrf_user_intrp2d(xlon,plane,angle,opts) X_desc = "longitude" end if ; X-axis labels dimsX = dimsizes(X_plane) xmin = X_plane(0) xmax = X_plane(dimsX(0)-1) xspan = dimsX(0)-1 nxlabs = floattoint( (xmax-xmin)/2 + 1) if (FirstTimeMap) then lat_plane = wrf_user_intrp2d(xlat,plane,angle,opts) lon_plane = wrf_user_intrp2d(xlon,plane,angle,opts) mpres = True mpres@mpGeophysicalLineColor = "Black" mpres@mpNationalLineColor = "Black" mpres@mpUSStateLineColor = "Black" mpres@mpGridLineColor = "Black" mpres@mpLimbLineColor = "Black" mpres@mpPerimLineColor = "Black" pltres = True pltres@FramePlot = False optsM = res optsM@NoHeaderFooter = True optsM@cnFillOn = True optsM@lbTitleOn = False contour = wrf_contour(a,wks,ter,optsM) plot = wrf_map_overlays(a,wks,(/contour/),pltres,mpres) lnres = True lnres@gsLineThicknessF = 3.0 lnres@gsLineColor = "black";"Red" do ii = 0,dimsX(0)-2 gsn_polyline(wks,plot,(/lon_plane(ii),lon_plane(ii+1)/),(/lat_plane(ii),lat_plane(ii+1)/),lnres) end do frame(wks) delete(lon_plane) delete(lat_plane) pltres@FramePlot = True end if if (FirstTime) then ; THIS IS NEEDED FOR LABELS - ALWAYS DO (Z axis only needed once. X every time plane changes) ; Y-axis labels zmax = 6000. ; We only want to see the first 6 km zz = wrf_user_intrp3d(z,z,"v",plane,angle,opts) dims = dimsizes(zz) do imax = 0,dims(0)-1 if ( .not.ismissing(zz(imax,0)) .and. zz(imax,0) .lt. zmax ) then zmax_pos = imax end if end do zspan = zmax_pos zmin = z(0,0,0) zmax = zz(zmax_pos,0) print(zmax) zmin=zmin/1000. zmax=zmax/1000.
nzlabs = floattoint(zmax + 1) FirstTime = False ; END OF ALWAYS DO end if ; Interpolate data vertically (in z) rh_plane = wrf_user_intrp3d(rh,z,"v",plane,angle,opts) tc_plane = wrf_user_intrp3d(tc,z,"v",plane,angle,opts) ; Options for XY Plots opts_xy = res opts_xy@tiXAxisString = X_desc opts_xy@tiYAxisString = "Height (km)" opts_xy@cnMissingValPerimOn = True opts_xy@cnMissingValFillColor = 0 opts_xy@cnMissingValFillPattern = 11 opts_xy@tmXTOn = False opts_xy@tmYROn = False opts_xy@tmXBMode = "Explicit" opts_xy@tmXBValues = fspan(0,xspan,nxlabs) ; Create the correct tick marks opts_xy@tmXBLabels = sprintf("%.1f",fspan(xmin,xmax,nxlabs)); Create labels opts_xy@tmXBLabelFontHeightF = 0.015 opts_xy@tmYLMode = "Explicit" opts_xy@tmYLValues = fspan(0,zspan,nzlabs) ; Create the correct tick marks opts_xy@tmYLLabels = sprintf("%.1f",fspan(zmin,zmax,nzlabs)); Create labels opts_xy@tiXAxisFontHeightF = 0.020 opts_xy@tiYAxisFontHeightF = 0.020 opts_xy@tmXBMajorLengthF = 0.02 opts_xy@tmYLMajorLengthF = 0.02 opts_xy@tmYLLabelFontHeightF = 0.015 opts_xy@PlotOrientation = tc_plane@Orientation ; Plotting options for RH opts_rh = opts_xy opts_rh@ContourParameters = (/ 10., 90., 10. /) opts_rh@pmLabelBarOrthogonalPosF = -0.1 opts_rh@cnFillOn = True opts_rh@cnFillColors = (/"White","White","White", \ "White","Chartreuse","Green", \ "Green3","Green4", \ "ForestGreen","PaleGreen4"/) ; Plotting options for Temperature opts_tc = opts_xy opts_tc@cnInfoLabelZone = 1 opts_tc@cnInfoLabelSide = "Top" opts_tc@cnInfoLabelPerimOn = True opts_tc@cnInfoLabelOrthogonalPosF = -0.00005 opts_tc@ContourParameters = (/ 5.
/) ; Get the contour info for the rh and temp contour_tc = wrf_contour(a,wks,tc_plane(0:zmax_pos,:),opts_tc) contour_rh = wrf_contour(a,wks,rh_plane(0:zmax_pos,:),opts_rh) ; MAKE PLOTS plot = wrf_overlays(a,wks,(/contour_rh,contour_tc/),pltres) ; Delete options and fields, so we don't have carry over delete(opts_xy) delete(opts_tc) delete(opts_rh) delete(tc_plane) delete(rh_plane) delete(X_plane) end do ; make next cross section ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; FirstTimeMap = False end do ; END OF TIME LOOP end ________________________________ CONFIDENTIALITY: This email is intended solely for the person(s) named and may be confidential and/or privileged. If you are not the intended recipient, please delete it, notify us and do not copy, use, or disclose its content. Thank you. Towards A Sustainable Earth: Print Only When Necessary -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110322/694e58e1/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: wrf_CrossSection4.ncl Type: application/octet-stream Size: 7900 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110322/694e58e1/attachment-0001.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: Z_Vertical_CrossSection.000003.png Type: image/png Size: 143595 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110322/694e58e1/attachment-0002.png -------------- next part -------------- A non-text attachment was scrubbed... 
Name: Z_Vertical_CrossSection.000004.png Type: image/png Size: 109681 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110322/694e58e1/attachment-0003.png From davidstephenbryan at yahoo.com Wed Mar 23 12:02:02 2011 From: davidstephenbryan at yahoo.com (David Bryan) Date: Wed, 23 Mar 2011 11:02:02 -0700 (PDT) Subject: [Wrf-users] limiting WRF output Message-ID: <552372.45132.qm@web56205.mail.re3.yahoo.com> I want to limit the output of WRF 3.2 ARW to just a few parameters--wind speed and temperature--while keeping the rest of its operation as a default installation. After reading the User's Guide, I thought that perhaps changing the Registry file was the way to address that. However, after looking at the Registry files, all the parameters seemed to affect the internal operation of the model; I didn't find a part that was solely concerned with output. Should I be focusing on the namelist.output instead? Also, is there a way to limit the output file to a subset of gridpoints within a given domain? Thanks! From J.Kala at murdoch.edu.au Thu Mar 24 20:29:24 2011 From: J.Kala at murdoch.edu.au (Jatin Kala) Date: Fri, 25 Mar 2011 10:29:24 +0800 Subject: [Wrf-users] WRF is "hanging" Message-ID: Dear WRF-users, I have compiled WRF3.2 on our new supercomputing facility, and having some trouble. Namely, WRF is just "hanging" at: d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 84045408 bytes allocated d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 3084672 bytes allocated d01 2009-10-01_00:00:00 *** Initializing nest domain # 2 from an input file. *** d01 2009-10-01_00:00:00 med_initialdata_input: calling input_input The job remains in the queue, i.e., does not error out until walltime is elapsed. I have compiled with -O0 but that did not help. I have also compiled with the updated "gen_allocs.c" from the WRF website, but that has not helped either. I did do a "clean -a" before.
I have compiled WRF with the following libs: intel-compilers/2011.1.107 jasper/1.900.1 ncarg/5.2.1 mpi/intel/openmpi/1.4.2-qlc netcdf/4.0.1/intel-2011.1.107 export WRFIO_NCD_LARGE_FILE_SUPPORT=1 Any help would be greatly appreciated! Kind regards, Jatin -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110325/af72eebd/attachment.html From johannes.keller at psi.ch Thu Mar 24 00:48:00 2011 From: johannes.keller at psi.ch (Keller Johannes) Date: Thu, 24 Mar 2011 07:48:00 +0100 Subject: [Wrf-users] limiting WRF output In-Reply-To: <552372.45132.qm@web56205.mail.re3.yahoo.com> References: <552372.45132.qm@web56205.mail.re3.yahoo.com> Message-ID: <9A1F6C5E-C547-46FA-897B-F8881E8C30AE@psi.ch> David, save the original Registry file Registry.EM as a backup, then remove the "h" (stands for history) in the "IO" column of those state variables you don't want to be written to the wrfout_.... file(s). Hannes On Mar 23, 2011, at 7:02 PM, David Bryan wrote: > I want to limit the output of WRF 3.2 ARW to just a few parameters--wind speed > and temperature--while keeping the rest of its operation as a default > installation. After reading the User's Guide, I thought that perhaps changing > the Registry file was the way to address that. However, after looking at the > Registry files, all the parameters seemed to affect the internal operation of > the model; I didn't find a part that was solely concerned with output. > > Should I be focusing on the namelist.output instead? > > Also, is there a way to limit the output file to a subset of gridpoints within a > given domain? > > Thanks!
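[Archive note] One caveat to the Registry advice: the Registry is processed at compile time, so after editing Registry.EM a `./clean -a` and a full recompile are needed before the change shows up in the wrfout files. The write frequency itself is controlled from &time_control in namelist.input; a hedged sketch with option names as in the V3.x ARW User's Guide (the auxhist1 entries only matter if you route fields to an auxiliary output stream — verify against your version):

```fortran
&time_control
 history_interval    = 180, 60, 60,              ! minutes, one value per domain
 frames_per_outfile  = 1,   1,  1,               ! one time level per file
 auxhist1_outname    = "auxout_d<domain>_<date>",
 auxhist1_interval   = 60,  60, 60,              ! minutes
 frames_per_auxhist1 = 1,   1,  1,
 io_form_auxhist1    = 2,                        ! netCDF
/
```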
> > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users --------------------------------------------------------------------- Dr.Johannes Keller Paul Scherrer Institut (PSI) Laboratory of Atmospheric Chemistry (LAC) Building OFLA / 012 CH-5232 Villigen PSI, Switzerland Phone: +41 56 310 20 65, Fax: +41 56 310 45 25 e-mail: johannes.keller(at)psi.ch http://www.psi.ch/ http://lac.web.psi.ch --------------------------------------------------------------------- From nilima.2002 at gmail.com Thu Mar 24 01:26:35 2011 From: nilima.2002 at gmail.com (nilima natoo) Date: Thu, 24 Mar 2011 08:26:35 +0100 Subject: [Wrf-users] wps compilation error In-Reply-To: References: Message-ID: Hello wrf users, I have successfully compiled the WRFV3 and now trying to do for WPS. However there are some errors while compiling WPS. Please find attached the log file and kindly let me know what corrections are necessary.. I am using 64 bit cluster machine, 4.0.1 version netcdf compiled with gfortran. I successfully compiled WRFV3 with gfortran and was trying the same for WPS. many thanks and regards, nma -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110324/17e06cd1/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... 
Name: compile.log Type: text/x-log Size: 88090 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110324/17e06cd1/attachment-0001.bin From tnquanghuy at gmail.com Thu Mar 24 12:12:38 2011 From: tnquanghuy at gmail.com (Huy Tran) Date: Thu, 24 Mar 2011 10:12:38 -0800 Subject: [Wrf-users] limiting WRF output In-Reply-To: <552372.45132.qm@web56205.mail.re3.yahoo.com> References: <552372.45132.qm@web56205.mail.re3.yahoo.com> Message-ID: <4D8B8996.9090200@gmail.com> Hi David, Modifying the Registry file is the right path. You can read more on the WRF Registry here: http://www.mmm.ucar.edu/wrf/users/tutorial/200807/WRF%20Registry%20and%20Examples.pdf For each parameter there is an I/O description, such as "i01rhusdf". Here the "h" indicates how that parameter will be written out in the WRF history output file. The "h" may be followed by integer numbers, for example "h01", which indicates that the parameter will be written both to the principal output and to the first auxiliary file. So if you eliminate the "h" and any numbers following it, that parameter will not be written out. If you just want to limit output to a few parameters, the best way is to have those parameters written to an auxiliary file (e.g. auxiliary stream 1). Then in namelist.input set auxhist1_interval to the desired frequency, and change history_interval to a much larger value so that WRF writes the principal output less frequently. This way you only modify the parameters of interest and do not have to touch the tons of other parameters in the Registry. As far as I know there is no way to limit the output file to a subset of gridpoints of a domain. Hope this helps. Huy On 3/23/2011 10:02 AM, David Bryan wrote: > I want to limit the output of WRF 3.2 ARW to just a few parameters--wind speed > and temperature--while keeping the rest of its operation as a default > installation.
After reading the User's Guide, I thought that perhaps changing > the Registry file was the way to address that. However, after looking at the > Registry files, all the parameters seemed to affect the internal operation of > the model; I didn't find a part that was solely concerned with output. > > Should I be focusing on the namelist.output instead? > > Also, is there a way to limit the output file to a subset of gridpoints within a > given domain? > > Thanks! > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > From bllamptey at gmail.com Fri Mar 25 16:07:12 2011 From: bllamptey at gmail.com (Benjamin Lamptey) Date: Fri, 25 Mar 2011 22:07:12 +0000 Subject: [Wrf-users] WRF Physics Schemes Message-ID: Hello, I am preparing some notes on the WRF Physics schemes. I wish to have information that will ensure the appropriate use of the schemes. I am using the schemes in v3.2.1. I should be grateful if you could address the following questions/issues about any of the schemes for me. I am willing to make the information available after the compilation. Thanks Ben WRF Physics Schemes (WRFV3.2.1) The objective of this is for the following issues to be answered for each scheme: 1. What was the objective for which the scheme was developed? 2. What temporal scale was it developed for? 3. What kind of weather phenomena was it developed for? 4. What spatial scale or resolution was it developed for? 5. Any other information to ensure appropriate use of the schemes 6. Include appropriate references 14.1 Microphysics Schemes 14.1.1 Kessler 14.1.2 Lin et al.
14.1.3 WRF Single Moment 3 (WSM3) 14.1.4 WRF Single Moment 5 (WSM5) 14.1.5 Ferrier (new Eta) 14.1.6 WRF Single Moment 6 (WSM6) 14.1.7 Goddard GCE 14.1.8 Thompson 14.1.9 Milbrandt-Yau 14.1.10 Morrison (2 moments) 14.1.11 WDM 5-class 14.1.12 WDM 6-class 14.1.13 Thompson scheme (version from V3.0) 14.2 Convective Schemes 14.2.1 Kain-Fritsch (new Eta) 14.2.2 Betts-Miller-Janjic 14.2.3 Grell-Devenyi ensemble scheme 14.2.4 Simplified Arakawa-Schubert (NMM only) 14.2.5 Grell 3D ensemble scheme 14.2.6 previous Kain-Fritsch 14.3 Boundary Layer Schemes 14.3.1 YSU scheme 14.3.2 Mellor-Yamada-Janjic TKE scheme 14.3.3 NCEP Global Forecast System scheme (NMM only) 14.3.4 Quasi-Normal Scale Elimination PBL 14.3.5 MYNN 2.5 level TKE scheme, works with sf_sfclay_physics=1 or 2 as well as 5 14.3.6 MYNN 3rd level TKE scheme, works only with MYNNSFC (sf_sfclay_physics = 5) 14.3.7 ACM2 (Pleim) PBL (ARW) 14.3.8 Bougeault and Lacarrere (BouLac) PBL 14.3.9 MRF scheme -- Benjamin Lamptey, PhD Senior Lecturer Nautical Science Department Regional Maritime University P.O. Box GP 1115, Accra Ghana and Scientist (Meteorologist and Geoscientist) West African Science Service Centre for Climate and Adapted Land Use c/o AGRA at CSIR Office Complex PMB KIA 114, Airport-Accra, Ghana FAX: +233 030 2 768602 Cell: +233(0)273135062 bllamptey at gmail.com http://www.rap.ucar.edu/~lamptey -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110325/771cff85/attachment.html From FLiu at azmag.gov Fri Mar 25 19:04:28 2011 From: FLiu at azmag.gov (Feng Liu) Date: Sat, 26 Mar 2011 01:04:28 +0000 Subject: [Wrf-users] WRF is "hanging" In-Reply-To: References: Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C25478A31@mag9006> Hi Jatin, I do not know exactly what is wrong for your case, but one thing you can try is to reduce time_step in namelist.input by 3 times. Good luck.
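[Archive note] The time_step advice above reflects the usual ARW rule of thumb: a time step of at most about 6 x dx (with dx in km) seconds, and smaller still over steep terrain or with strong winds. A trivial sketch of that rule:

```python
def recommended_time_step(dx_meters, factor=6.0):
    """Rule-of-thumb ARW time step in seconds: about 6 x dx in km.
    Use a smaller factor for steep terrain or strong winds."""
    return factor * dx_meters / 1000.0

print(recommended_time_step(8000))  # 8 km grid -> 48.0 s
print(recommended_time_step(1000))  # 1 km grid -> 6.0 s
```

If the model is crashing (rather than hanging) with CFL errors in the rsl files, cutting the time step below this estimate is the standard first fix.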
Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jatin Kala Sent: Thursday, March 24, 2011 7:29 PM To: wrf-users at ucar.edu Subject: [Wrf-users] WRF is "hanging" Dear WRF-users, I have compiled WRF3.2 on our new supercomputing facility, and am having some trouble. Namely, WRF is just "hanging" at: d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 84045408 b ytes allocated d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 3084672 b ytes allocated d01 2009-10-01_00:00:00 *** Initializing nest domain # 2 from an input file. ** * d01 2009-10-01_00:00:00 med_initialdata_input: calling input_input The job remains in the queue, i.e., does not error out until the walltime is elapsed. I have compiled with -O0 but that did not help. I have also compiled with the updated "gen_allocs.c" from the WRF website, but that has not helped either. I did do a "clean -a" before. I have compiled WRF with the following libs: intel-compilers/2011.1.107 jasper/1.900.1 ncarg/5.2.1 mpi/intel/openmpi/1.4.2-qlc netcdf/4.0.1/intel-2011.1.107 export WRFIO_NCD_LARGE_FILE_SUPPORT=1 Any help would be greatly appreciated! Kind regards, Jatin -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110326/305a5be3/attachment.html From J.Kala at murdoch.edu.au Sat Mar 26 01:19:03 2011 From: J.Kala at murdoch.edu.au (Jatin Kala) Date: Sat, 26 Mar 2011 15:19:03 +0800 Subject: [Wrf-users] WRF is "hanging" In-Reply-To: <9BDE2A7F9712AF45A0C08451B3CD8E5C25478A31@mag9006> References: <9BDE2A7F9712AF45A0C08451B3CD8E5C25478A31@mag9006> Message-ID: Thanks for the suggestion Feng, but this is not related to namelist inputs. The namelist I am running worked fine on a different machine. The issue here is that WRF simply hangs and does nothing at initialisation of Grid 2.
I.e., the rsl.out and rsl.error files print out: d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 84045408 b ytes allocated d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 3084672 b ytes allocated d01 2009-10-01_00:00:00 *** Initializing nest domain # 2 from an input file. ** * d01 2009-10-01_00:00:00 med_initialdata_input: calling input_input and that's it. The rsl.error and rsl.out files do not keep growing in size; there are no more prints, they just stop printing stuff. The job however is still in the queue and does NOT error out until the walltime is elapsed. No wrfout_d0* files are created. Other people seem to have had this issue before: http://mailman.ucar.edu/pipermail/wrf-users/2010/001749.html http://mailman.ucar.edu/pipermail/wrf-users/2010/001747.html Any help more than welcome. Regards, Jatin From: Feng Liu [mailto:FLiu at azmag.gov] Sent: Saturday, 26 March 2011 9:04 AM To: Jatin Kala; wrf-users at ucar.edu Subject: RE: WRF is "hanging" Hi Jatin, I do not know exactly what is wrong in your case, but one thing you can try is to reduce the time_step in namelist.input by a factor of 3. Good luck. Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jatin Kala Sent: Thursday, March 24, 2011 7:29 PM To: wrf-users at ucar.edu Subject: [Wrf-users] WRF is "hanging" Dear WRF-users, I have compiled WRF3.2 on our new supercomputing facility, and am having some trouble. Namely, WRF is just "hanging" at: d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 84045408 b ytes allocated d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 3084672 b ytes allocated d01 2009-10-01_00:00:00 *** Initializing nest domain # 2 from an input file. ** * d01 2009-10-01_00:00:00 med_initialdata_input: calling input_input The job remains in the queue, i.e., does not error out until the walltime is elapsed. I have compiled with -O0 but that did not help.
I have also compiled with the updated "gen_allocs.c" from the WRF website, but that has not helped either. I did do a "clean -a" before. I have compiled WRF with the following libs: intel-compilers/2011.1.107 jasper/1.900.1 ncarg/5.2.1 mpi/intel/openmpi/1.4.2-qlc netcdf/4.0.1/intel-2011.1.107 export WRFIO_NCD_LARGE_FILE_SUPPORT=1 Any help would be greatly appreciated! Kind regards, Jatin -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110326/451d6989/attachment-0001.html From Don.Morton at alaska.edu Mon Mar 28 12:55:27 2011 From: Don.Morton at alaska.edu (Don Morton) Date: Mon, 28 Mar 2011 10:55:27 -0800 Subject: [Wrf-users] WRF is "hanging" In-Reply-To: References: <9BDE2A7F9712AF45A0C08451B3CD8E5C25478A31@mag9006> Message-ID: I have run into these kinds of issues a number of times. In one case, it was a buggy implementation of MPI in the scatterv() call, and switching to openmpi fixed the problem. In other cases, there were simply bad nodes on the machine. My own theory (which may be completely wrong) is that these hangs very frequently occur while the master task is scattering data to all the slaves. This seems to be a good operation for stressing MPI and/or node communications. I have found that these kinds of problems are often (but not always) intermittent, and sometimes reducing the number of tasks will get it running (presumably because you're not stressing the underlying software and hardware infrastructure). To date, I've never found these to be "WRF" problems. Good luck! Don
-- Voice: 907 450 8679 Arctic Region Supercomputing Center http://weather.arsc.edu/ http://www.arsc.edu/~morton/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110328/eceddcf4/attachment.html From kotroni at meteo.noa.gr Mon Mar 28 13:07:07 2011 From: kotroni at meteo.noa.gr (Vassiliki Kotroni) Date: Mon, 28 Mar 2011 22:07:07 +0300 Subject: [Wrf-users] WRF is "hanging" In-Reply-To: References: <9BDE2A7F9712AF45A0C08451B3CD8E5C25478A31@mag9006> Message-ID: <001f01cbed7b$56674580$0335d080$@noa.gr> Dear all, we recently had the same problem. We had compiled MPI and WRF with the latest available version of Intel and, when trying to run, the model was hanging. We found out that the problem was that we had only installed the 64-bit Intel libraries (as our system is 64-bit, an AMD Phenom), but installation of the 32-bit libraries on the same system was also needed. Once we installed the 32-bit version, without any recompilation, the model ran OK. Bizarre, but that is what happened to us.
best Vasso ---------------------------------------------------------------------------- -------- Dr. Vassiliki KOTRONI Institute of Environmental Research National Observatory of Athens Lofos Koufou, P. Pendeli, GR-15236 Athens, Greece Tel: +30 2 10 8109126 Fax: +30 2 10 8103236 Daily weather forecasts at: www.noa.gr/forecast (in english) www.meteo.gr (in greek) www.eurometeo.gr From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Don Morton Sent: 28 March 2011 21:55 To: Jatin Kala Cc: wrf-users at ucar.edu Subject: Re: [Wrf-users] WRF is "hanging"
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110328/6f088b99/attachment-0001.html From Amy.Maples at noblis.org Tue Mar 29 08:51:05 2011 From: Amy.Maples at noblis.org (Maples, Amy C.) Date: Tue, 29 Mar 2011 14:51:05 +0000 Subject: [Wrf-users] CCSM data into WPS Message-ID: <732F5334E2229D4A9F0D8B22293489C10F532822@mbx1.mitretek.org> WRF users, We are attempting to run CCSM data through WRF using a domain of the continental US. We've set up a preprocessing script to turn our netCDF data files into appropriately formatted GRIB files. When we run NNRP data in netCDF through this script followed by WPS and WRF, the output appears fine. However, the CCSM data, after being run through the exact same script, has extremely strange precipitation patterns when it comes out of WRF (at the moment, precipitation only over bodies of water). We believe the main issue lies with the pressures. Both input sets of GRIB files are set up on pressure levels and include variables for PMSL and PSFC (all in Pa). When we run the NNRP data through WPS, the output PSFC and PMSL values look like the input GRIB data. However, when we run the CCSM values through WPS, the output PSFC and PMSL have identical values which span an unreasonably small range. The two data sets are run through WPS using the same Vtable and without any changes to METGRID.TBL. The only changes to namelist.wps are in the date fields. The only real difference between the datasets, once in GRIB format, is that the CCSM data has missing values.
The CCSM data is interpolated from sigma levels and so has missing values below the surface of the earth; however, we could extrapolate below the surface to eliminate the missing values. The NNRP data, by contrast, does not have missing values. We have tried setting the missing_value parameters in METGRID.TBL to be the same as in the GRIB files, but this doesn't change the WRF output. Does anyone know what could be causing the difference in PSFC and PMSL? Or other possible causes of the precipitation patterns? Thanks, Amy Maples Center For Sustainability Noblis, Inc. Falls Church, VA -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110329/a3943b78/attachment.html From bbrashers at Environcorp.com Wed Mar 30 09:50:46 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Wed, 30 Mar 2011 08:50:46 -0700 Subject: [Wrf-users] Alternative to RT_fdda_reformat_obsnud.pl? Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF3085206C6D606@irvine01.irvine.environ.local> Hi all, Does anyone have an alternative to RT_fdda_reformat_obsnud.pl? I'm running WRF in 5.5-day chunks, with about 50 MB/hour of OBS in LITTLE_R format. Because the amount of data is so large, this Perl script takes a really long time to run -- about 4 days per 5.5-day init (running on a compute node with 4 GB of RAM: I/O over NFS, and clearly swapping). That's longer than I expect my WRF run to take. Has anyone re-written this program in FORTRAN or another compiled language that might be more efficient? Bart This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within.
If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. From ebouzeid at princeton.edu Wed Mar 30 10:58:46 2011 From: ebouzeid at princeton.edu (Elie Bou-Zeid) Date: Wed, 30 Mar 2011 12:58:46 -0400 Subject: [Wrf-users] postdoc or more senior position available in Princeton University Message-ID: <4D936146.3040402@princeton.edu> _*Postdoctoral Research Associate in Urban Hydrometeorology*_ A postdoctoral or more senior researcher position(s) in hydrometeorological modeling is available in the Department of Civil and Environmental Engineering at Princeton University. The prospective researcher will perform multi-scale atmospheric modeling to investigate the effects of urbanization and other anthropogenic forcings on the water cycle in China. The work will use the Weather Research and Forecasting model for mesoscale and large-eddy simulations, focusing on droughts, floods, and land-atmospheric exchanges of relevance to the hydrological cycle. The research will be supervised by Professors Jim Smith (http://hydrometeorology.princeton.edu/) and Elie Bou-Zeid (http://efm.princeton.edu/). Applicants from all academic backgrounds will be considered, but previous experience in atmospheric modeling and/or computational fluid dynamics is strongly desired. A PhD is required. Interested candidates are encouraged to apply as soon as possible; the position will remain open until filled. All applications should be submitted through the Princeton University jobs site http://jobs.princeton.edu position number 0110134. Please include a CV, a brief statement of research experience and interests, and the names and emails of three references. This appointment is for one year, and may be renewed pending satisfactory performance and sufficient funding. Princeton University is an equal opportunity employer and complies with applicable EEO and affirmative action regulations. 
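On Bart's question above about RT_fdda_reformat_obsnud.pl: the symptoms he describes (4 GB of RAM exhausted, swapping) are typical of slurping an entire obs file into memory at once, and a streaming pass avoids that regardless of the implementation language. The sketch below is illustrative only -- the helper name and the assumption that each report's header line carries a YYYYMMDDHHMMSS timestamp are mine, not the real LITTLE_R layout, which should be taken from the actual script.

```python
# Illustrative sketch: stream a large obs file and keep only the reports
# whose header timestamp falls inside a time window, without ever holding
# the whole file in memory. The 14-digit YYYYMMDDHHMMSS header timestamp
# is an assumed, hypothetical convention, not the real LITTLE_R format.
import re

TS = re.compile(r"\b(\d{14})\b")  # YYYYMMDDHHMMSS

def filter_window(lines, start, end):
    """Yield only the lines belonging to reports inside [start, end].

    Lines between one header (the line bearing the timestamp) and the
    next inherit the current report's keep/drop decision, so whole
    reports are kept or dropped together. Timestamps are compared as
    strings, which is safe for fixed-width YYYYMMDDHHMMSS values.
    """
    keep = False
    for line in lines:
        m = TS.search(line)
        if m:  # header line: decide for the report that starts here
            keep = start <= m.group(1) <= end
        if keep:
            yield line

# Usage: write one chunk's worth of obs to a smaller file, streaming
# from source to destination.
# with open("obs.all") as src, open("obs.window", "w") as dst:
#     dst.writelines(filter_window(src, "20091001000000", "20091006120000"))
```

Because both file objects and the generator are lazy, peak memory is one line, not 50 MB/hour times the simulation length; the same pattern carries over directly to a Fortran or C rewrite.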
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110330/6f3c3bee/attachment.html From FLiu at azmag.gov Mon Mar 28 16:00:47 2011 From: FLiu at azmag.gov (Feng Liu) Date: Mon, 28 Mar 2011 22:00:47 +0000 Subject: [Wrf-users] WRF is "hanging" In-Reply-To: References: <9BDE2A7F9712AF45A0C08451B3CD8E5C25478A31@mag9006> Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C2547A0CF@mag9006> Jatin and Don, If the problem cannot be resolved by reducing time_step, I suspect it most likely stems from your system. I experienced a similar issue before and spent a lot of time researching what had happened to the system. We also asked wrfhelp for a solution, but in vain; please see the replies from wrfhelp immediately after my reply. We have a cluster that runs different models such as WRF, WRF/Chem, CMAQ, CAMx, etc. Our cluster is equipped with 8 nodes, each of which is dual quad-core; it originally had a 100 Mb switch, and all models including WRF 3.2.1 ran perfectly. In order to improve computing efficiency we upgraded the switch to 1 Gb. The CMAQ and CAMx models still ran fine; however, the WRF 3.2.1 model hung up more often as the number of processors involved in the computation increased. It seemed fairly random. When it hung up, there was no error message and no stop. We checked MPI libraries, MPICH, compiler flags..., many things. So we have now gone back to the 100 Mb switch. Everything works well and no hang-up happens to the WRF model any more, though the cluster is slower. Why did CMAQ (parallel version) run successfully with the 1 Gb switch and full nodes, but WRF 3.2.1 did not? Until we order another 1 Gb or more advanced switch and test it, the question remains open. For an effective test, you may use the pilot program test.f attached. If your WRF hangs up, it should hang up too, but it gives you a quick check. I hope this is helpful. I will keep a close eye on this issue. Thanks.
Feng -------------------------------------------- Wrfhelp reply: Since we have not had reports from other users, I am guessing the problem has more to do with your system than with the code. If you can get help from your system support or the vendor, that might be helpful. wrfhelp On Jan 21, 2011, at 9:41 AM, Feng Liu wrote: > Hi, > I re-compiled WRF3.2.1. The "hung up" problem sometimes still happens, > sometimes does not; it seems fairly random. It hangs more often with > increasing processor number. I also consulted with our IT staff but no > solution so far. Your support is highly appreciated. > Feng > > > -----Original Message----- > From: wrfhelp [mailto:wrfhelp at ucar.edu] > Sent: Thursday, January 06, 2011 4:38 PM > To: Feng Liu > Subject: Re: job hang up without error message when I used > > Could you work with your system support people and see if they can > help? > wrfhelp > > On Jan 5, 2011, at 8:15 PM, Feng Liu wrote: > >> Hi, >> I had the hang-up problem with version 3.2 no matter whether multiple >> processors or a single one was used, but when I modified namelist.input, it >> did work. For version 3.2.1 I think the cause of this problem is >> different because it does work with the master node. >> On the stability of the computer you mentioned, you may be right. We upgraded >> the network from a 100 Mb switch to 1 Gb. We had stable WRF running >> with the old network even though its performance was poor. However, >> I can run CMAQ 4.7.1 with 64 processors (we have 8 nodes, each of >> which has 8 processors) successfully, and the speedup factor is almost >> 2.8 compared against the system with the old network. >> Thanks. >> Feng >> >> >> -----Original Message----- >> From: wrfhelp [mailto:wrfhelp at ucar.edu] >> Sent: Wednesday, January 05, 2011 7:24 PM >> To: Feng Liu >> Subject: Re: job hang up without error message when I used >> >> Have you seen this problem with other versions of the model code >> before? Is your system stable?
>> Can you run other MPI jobs steadily on this system? What I am saying >> is that it is possible that it is a problem with the computer, not the >> model code. >> >> wrfhelp >> >> On Jan 5, 2011, at 5:11 PM, Feng Liu wrote: >> >>> Hi, >>> Thanks for your response. But I am using WRF3.2.1, which has the same >>> problem. I have no idea so far. >>> Thanks. >>> Feng >>> >>> >>> -----Original Message----- >>> From: wrfhelp [mailto:wrfhelp at ucar.edu] >>> Sent: Wednesday, January 05, 2011 4:20 PM >>> To: Feng Liu >>> Subject: Re: job hang up without error message when I used >>> >>> Mike was using 3.2 at the time, and the fix has been included in >>> 3.2.1. >>> >>> wrfhelp >>> >>> On Jan 5, 2011, at 1:30 PM, Feng Liu wrote: >>> >>>> Hi, >>>> I can run WRF3.2.1 successfully if I only use the master node with 8 >>>> processors. However, my jobs (with MPI) hang up when I use 16 or more >>>> processors, or more than two nodes; no error message, no >>>> crashes. This problem was described by Michael Zulauf as below >>>> (see http://mailman.ucar.edu/pipermail/wrf-users/2010/001745.html): >>>> >>>> "My jobs sporadically (but usually eventually) hang up, most often >>>> after a new wrfout file is opened. No error messages, no crashes >>>> - the processes continue, but _all_ output stops. I eventually >>>> just have to kill the job. The wrfouts are small, and all output >>>> looks good up until the failed wrfout." >>>> >>>> Mike mentioned he got modified code from wrfhelp that seemed to >>>> fix this issue. I also need to know which code needs to be modified >>>> and what the problem is related to. Thanks for support on fixing >>>> this problem. >>>> Feng >>>> >>> >>> wrfhelp
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110328/77173821/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: test.f Type: application/octet-stream Size: 397 bytes Desc: test.f Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110328/77173821/attachment-0001.obj From J.Kala at murdoch.edu.au Wed Mar 30 19:58:19 2011 From: J.Kala at murdoch.edu.au (Jatin Kala) Date: Thu, 31 Mar 2011 09:58:19 +0800 Subject: [Wrf-users] WRF is "hanging" In-Reply-To: <9BDE2A7F9712AF45A0C08451B3CD8E5C2547A0CF@mag9006> References: <9BDE2A7F9712AF45A0C08451B3CD8E5C25478A31@mag9006> <9BDE2A7F9712AF45A0C08451B3CD8E5C2547A0CF@mag9006> Message-ID: Hi all, Thanks for all the replies. It turns out that there was something "funny" with our openmpi library. Our system admins have partly fixed the issue, so now WRF only hangs on certain numbers of CPUs per node (don't know why yet!). They are still working on it. Cheers, Jatin
I experienced the similar issue before and took a lot of time to research what happened to system. We also asked wrfhelp for the solution but in vain, please see replies from wrfhelp immediate after my reply. We have a cluster to run different models such as WRF, WRF/Chem, CMAQ, CAMx, etc. our cluster equipped with 8 nodes each of which is dual quad-core, originally had a 100M switch and all models including WRF3.2.1 ran perfectly. In order to improve computing efficiency we upgraded the switch to 1 Gb and the CMAQ, CAMx models still run fine, however, WRF 3.2.1 model hang up more often with increasing the number of processor involved in the computing. But it seemed fairly random. When it hang up, no error message, no stop. We checked MPI libraries, MPICH, compiler flags..., many things. So we now leave it back to and on the 100 M switch. Everything works well and no hang up happens to WRF model any more though cluster slows down. Why did CMAQ (parallel version) ran successfully with the 1 Gb switch and full nodes, but WRF 3.2.1 did not? Before we re-order another 1 Gb or more advanced switch and test it the question is still opened. For effective test, you may use a pilot program test.f attached. If your WRF hangs up, it should hang up too, but it lets you get quick check. I hope it is helpful. I will keep a close eye on this issue. Thanks. Feng -------------------------------------------- Wrfhelp repliy: Since we have not had report from other users, I am guessing the problem has to do with your system than with the code. If you can get help from your system support or the vendor, that might be helpful. wrfhelp On Jan 21, 2011, at 9:41 AM, Feng Liu wrote: > Hi, > I re-compiled WRF3.2.1. The "hung up" problem sometimes still happens, > sometimes does not, it seems fairly random. It hangs more often with > increasing processor number. I also consulted with our IT staff but no > solution so far. Your support is highly appreciated. 
> Feng > > > -----Original Message----- > From: wrfhelp [mailto:wrfhelp at ucar.edu] > Sent: Thursday, January 06, 2011 4:38 PM > To: Feng Liu > Subject: Re: job hang up without error message when I used > > Could you work with your system support people and see if they can > help? > wrfhelp > > On Jan 5, 2011, at 8:15 PM, Feng Liu wrote: > >> Hi, >> I had the hang up problem with version 3.2 no matter multiple >> processors or single one used but when I modified namelist.input, it >> did work. For version 3.2.1 I think the course of this problem is >> different because it does work with master nodes. >> On stability of computer you mentioned, you may be right. We updated >> network work card from 100M switch to 1 G. We had stable WRF running >> with old network card even though its performance was poor. However, >> I can run CMAQ4.7.1 with 64 processors ( we have 8 nodes each of >> which has 8 processors) successfully, and speedup factor is almost >> 2.8 comparing against the system with old network card. >> Thanks. >> Feng >> >> >> -----Original Message----- >> From: wrfhelp [mailto:wrfhelp at ucar.edu] >> Sent: Wednesday, January 05, 2011 7:24 PM >> To: Feng Liu >> Subject: Re: job hang up without error message when I used >> >> Have you seen this problem with other versions of the model code >> before? Is your system stable? >> Can you run other MPI jobs steadily on this system? What I am saying >> is that it is possible that it is problem with the computer, not the >> model code. >> >> wrfhelp >> >> On Jan 5, 2011, at 5:11 PM, Feng Liu wrote: >> >>> Hi, >>> Thanks for your response. But I am using WRF3.2.1 which has the same >>> problem. I have no idea so far. >>> Thanks. 
>>> Feng >>> >>> >>> -----Original Message----- >>> From: wrfhelp [mailto:wrfhelp at ucar.edu] >>> Sent: Wednesday, January 05, 2011 4:20 PM >>> To: Feng Liu >>> Subject: Re: job hang up without error message when I used >>> >>> Mike was using 3.2 at the time, and the fix has been included in >>> 3.2.1. >>> >>> wrfhelp >>> >>> On Jan 5, 2011, at 1:30 PM, Feng Liu wrote: >>> >>>> Hi, >>>> I can run WRF 3.2.1 successfully if I only use the master node with 8 >>>> processors. However, my jobs (with MPI) hang up when I use 16 >>>> or more processors, or more than two nodes: no error message, no >>>> crashes. This problem was described by Michael Zulauf as below >>>> (see http://mailman.ucar.edu/pipermail/wrf-users/2010/001745.html): >>>> >>>> "My jobs sporadically (but usually eventually) hang up, most often >>>> after a new wrfout file is opened. No error messages, no crashes >>>> - the processes continue, but _all_ output stops. I eventually >>>> just have to kill the job. The wrfouts are small, and all output >>>> looks good up until the failed wrfout." >>>> >>>> Mike mentioned he got modified code from wrfhelp that seemed to >>>> fix this issue. I also need to know which code needs to be modified >>>> and what the problem is related to. Thanks for your support in fixing >>>> this problem. >>>> Feng >>>> >>> >>> wrfhelp From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jatin Kala Sent: Saturday, March 26, 2011 12:19 AM To: wrf-users at ucar.edu Subject: Re: [Wrf-users] WRF is "hanging" Thanks for the suggestion Feng, but this is not related to namelist inputs. The namelist I am running worked fine on a different machine. The issue here is that WRF simply hangs and does nothing at initialisation of Grid 2. 
I.e., the rsl.out and rsl.error files print out:

d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 84045408 bytes allocated
d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 3084672 bytes allocated
d01 2009-10-01_00:00:00 *** Initializing nest domain # 2 from an input file. ***
d01 2009-10-01_00:00:00 med_initialdata_input: calling input_input

and that's it. The rsl.error and rsl.out files do not keep growing in size; there are no more prints, they just stop printing. The job however is still in the queue and does NOT error out until the walltime has elapsed. No wrfout_d0* files are created. Other people seem to have had this issue before: http://mailman.ucar.edu/pipermail/wrf-users/2010/001749.html http://mailman.ucar.edu/pipermail/wrf-users/2010/001747.html Any help more than welcome. Regards, Jatin From: Feng Liu [mailto:FLiu at azmag.gov] Sent: Saturday, 26 March 2011 9:04 AM To: Jatin Kala; wrf-users at ucar.edu Subject: RE: WRF is "hanging" Hi Jatin, I do not know exactly what is wrong in your case, but one thing you can try is to reduce the time_step in namelist.input by a factor of 3. Good luck. Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jatin Kala Sent: Thursday, March 24, 2011 7:29 PM To: wrf-users at ucar.edu Subject: [Wrf-users] WRF is "hanging" Dear WRF-users, I have compiled WRF 3.2 on our new supercomputing facility and am having some trouble. Namely, WRF is just "hanging" at:

d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 84045408 bytes allocated
d01 2009-10-01_00:00:00 alloc_space_field: domain 2, 3084672 bytes allocated
d01 2009-10-01_00:00:00 *** Initializing nest domain # 2 from an input file. ***
d01 2009-10-01_00:00:00 med_initialdata_input: calling input_input

The job remains in the queue, i.e., does not error out until the walltime has elapsed. I have compiled with -O0 but that did not help. 
I have also compiled with the updated "gen_allocs.c" from the WRF website, but that has not helped either. I did do a "clean -a" before. I have compiled WRF with the following libs:

intel-compilers/2011.1.107
jasper/1.900.1
ncarg/5.2.1
mpi/intel/openmpi/1.4.2-qlc
netcdf/4.0.1/intel-2011.1.107
export WRFIO_NCD_LARGE_FILE_SUPPORT=1

Any help would be greatly appreciated! Kind regards, Jatin -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110331/28f73197/attachment-0001.html From davidstephenbryan at yahoo.com Fri Apr 1 12:07:11 2011 From: davidstephenbryan at yahoo.com (David Bryan) Date: Fri, 1 Apr 2011 11:07:11 -0700 (PDT) Subject: [Wrf-users] WRF Compilation Problems on Ubuntu 10.10 Message-ID: <413749.65936.qm@web56202.mail.re3.yahoo.com> I'm compiling WRFv3.2.1 (gfortran, smpar, basic nesting) on Ubuntu 10.10 with gfortran. I successfully compiled NetCDF, installing all the needed packages for this (build-essential, m4, gfortran and g++) with the same gfortran, and adding all environment variables in the .bashrc. But the WRF compilation fails with this first error: "Fatal Error: Can't open module file 'wrf_data.mod' for reading at (1): No such file or directory" Can you please suggest a way to address this? Thanks! David Bryan From jagan at tnau.ac.in Fri Apr 1 23:05:44 2011 From: jagan at tnau.ac.in (jagan TNAU) Date: Sat, 2 Apr 2011 10:35:44 +0530 Subject: [Wrf-users] solar radiation calculation Message-ID: Hello Users, I need to calculate the solar radiation received from the sun for the whole day. The SWDOWN output parameter is likely to provide the solution, but how do I calculate the accumulated solar radiation for the whole day? -- With regards Dr.R.Jagannathan Professor of Agronomy, Department of Agronomy Tamil Nadu Agricultural University, Coimbatore - 641 003 India PHONE: Mob: +91 94438 89891 DO NOT PRINT THIS E-MAIL UNLESS NECESSARY. THE ENVIRONMENT CONCERNS US ALL. 
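[Editor's note] One common way to answer this is to integrate SWDOWN, which is an instantaneous downward shortwave flux in W/m^2, over the model output intervals. The sketch below is not from the thread; it assumes hourly output (dt = 3600 s) and uses a made-up diurnal cycle purely for illustration:

```python
# Minimal sketch: accumulate daily shortwave energy from SWDOWN.
# Assumptions (not from the thread): output every hour, SWDOWN in W/m^2,
# each instantaneous value treated as representative of its interval.

def accumulate_daily(swdown_wm2, dt_seconds=3600.0):
    """Sum flux * interval over the day; result is energy in J/m^2."""
    return sum(v * dt_seconds for v in swdown_wm2)

# Hypothetical hourly SWDOWN values for one day (W/m^2):
hourly = [0.0] * 6 + [50, 150, 300, 450, 550, 600,
                      600, 550, 450, 300, 150, 50] + [0.0] * 6

daily_j_per_m2 = accumulate_daily(hourly)
daily_mj_per_m2 = daily_j_per_m2 / 1.0e6  # convert J/m^2 to MJ/m^2
```

With more frequent output the same sum simply uses a smaller dt; in a real script the values would come from the wrfout files rather than a hand-written list.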
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110402/a47c5759/attachment.html From ahsanshah01 at gmail.com Fri Apr 1 21:17:08 2011 From: ahsanshah01 at gmail.com (Ahsan Ali) Date: Sat, 2 Apr 2011 08:17:08 +0500 Subject: [Wrf-users] WRF running parallel in cluster Message-ID: Hello, I want to run WRF on multiple nodes in a Linux cluster using OpenMPI, but giving the command *mpirun -np 4 ./wrf.exe* just submits it to a single node. I don't know how to run it on the other nodes as well. Help needed. Regards, -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110402/bed44a5b/attachment.html From Jacob.Wimberley at noaa.gov Sat Apr 2 13:07:50 2011 From: Jacob.Wimberley at noaa.gov (Jake Wimberley) Date: Sat, 02 Apr 2011 14:07:50 -0500 Subject: [Wrf-users] WRF Compilation Problems on Ubuntu 10.10 Message-ID: <000301cbf169$4312d980$c9388c80$%wimberley@noaa.gov> David: This is likely a cascading error. You might check to make sure that there was not an error earlier in the build that prevented the wrf_data.mod module from being compiled. I have experienced lots of errors in the past where a module fails to build, but Make ignores the error and continues compilation; everything is OK in that case until something tries to use that module, and then you get another error similar to this one. Jake Wimberley Meteorologist NWS WFO Milwaukee/Sullivan, Wis. David Bryan said: "I'm compiling WRFv3.2.1 (gfortran, smpar, basic nesting) on Ubuntu 10.10 with gfortran. 
I successfully compiled NetCDF, installing all the needed packages for this (build-essential, m4, gfortran and g++) with the same gfortran, and adding all environment variables in the .bashrc. But the WRF compilation fails with this first error: "Fatal Error: Can't open module file 'wrf_data.mod' for reading at (1): No such file or directory"" -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110402/f5bfb098/attachment.html From moudipascal at yahoo.fr Mon Apr 4 02:26:05 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Mon, 4 Apr 2011 09:26:05 +0100 (BST) Subject: [Wrf-users] Need ncl gen_be script Message-ID: <85443.66952.qm@web29013.mail.ird.yahoo.com> Hello to all, Please, would someone send me the ncl gen_be scripts they have used? I would like to generate plots, but I got errors and am having some trouble. Pascal MOUDI IGRI Ph-D Student Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Department of Physics Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110404/c10c90c3/attachment.html From william.gustafson at pnl.gov Mon Apr 4 12:29:51 2011 From: william.gustafson at pnl.gov (Gustafson, William I) Date: Mon, 4 Apr 2011 11:29:51 -0700 Subject: [Wrf-users] Vacancy: Post-doc for Scale-Adaptive Parameterization Development for Climate Models Message-ID: POST-DOC OPENING AT PACIFIC NORTHWEST NATIONAL LABORATORY Job Title: Post Doctorate RA - Scale-Adaptive Parameterization Development for Climate Models Job ID: 300774 Location: Pacific Northwest National Laboratories - Richland, WA Directorate: Fundamental & Computational Sciences Group: Atmospheric Chemistry & Meteorology Job Description --------------- The Atmospheric Chemistry and Meteorology Group of the Atmospheric Sciences and Global Change Division seeks a Post Doctoral Researcher to actively participate in research involving the investigation and development of scale-aware parameterizations for climate models. The position will target the improvement of parameterizations for next-generation climate models with multi-resolution domains. Work will focus on evaluating scale dependency of current parameterization techniques and also on the development of new techniques that will enable parameterizations to function across a range of model grid spacings. Experience with regional and global models, along with cloud parameterizations and the interplay between parameterizations is desired. The PNNL group interacts with other leading climate science activities nationally and internationally and this position can lead to high visibility and leadership in the research community. 
The successful candidate will be expected to: * Conduct technical research on reducing scale dependence of model parameterizations * Publish research results in highly visible, peer-reviewed venues (conferences & journals) * Interact with internal and external research staff and domain scientists for collaboration purposes * Participate and make presentations on the work * Participate in team meetings, and potentially interact with funding clients Minimum Requirements -------------------- Candidates must have received a PhD within the past five years from an accredited college or university. All staff at the Pacific Northwest National Laboratory must be able to demonstrate the legal right to work in the United States. Qualifications -------------- * Familiarity with the WRF and/or CESM models, or equivalent, ideally for applications ranging across scales * Strong understanding of current climate model parameterization techniques * Demonstrated ability to analyze observations and use them to evaluate and understand model results * Fluency in modern Fortran required, and at least a working familiarity with an interpreted computer language (Perl, Python, IDL, etc.) * Peer-reviewed publication record in a relevant area highly desired * Ability to work independently and to efficiently deliver results * Excellent verbal and written English communication skills Equal Employment Opportunity ---------------------------- Pacific Northwest National Laboratory (PNNL) is an Affirmative Action / Equal Opportunity Employer and supports diversity in the workplace. All employment decisions are made without regard to race, color, religion, sex, national origin, age, disability, veteran status, marital or family status, sexual orientation, gender identity, or genetic information. All staff at the Pacific Northwest National Laboratory must be able to demonstrate the legal right to work in the United States. Apply at or . _______________________________________________ William I. 
Gustafson Jr., Ph.D. Scientist ATMOSPHERIC SCIENCES AND GLOBAL CHANGE DIVISION Pacific Northwest National Laboratory P.O. 999, MSIN K9-30 Richland, WA 99352 Tel: 509-372-6110 William.Gustafson at pnl.gov http://www.pnl.gov/atmospheric/staff/staff_info.asp?staff_num=5716 http://www.researcherid.com/rid/A-7732-2008 From bbrashers at Environcorp.com Mon Apr 4 09:48:29 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Mon, 4 Apr 2011 08:48:29 -0700 Subject: [Wrf-users] WRF running parallel in cluster In-Reply-To: References: Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF3085206CC4155@irvine01.irvine.environ.local> Are you using any queuing system like SGE, Torque, PBS, etc.? In OpenMPI, the mpirun (and mpiexec) are really just links to orterun. Orterun is smart enough to get the list of hostnames to use from the queuing system. If you're not using a queuing system, then you need to tell orterun which machines to use. There are several ways, see `man orterun`. You could do any of these:

# cat hosts.txt
machine1
machine1
machine2
machine2

# orterun -np 4 -hostfile hosts.txt wrf.exe
# mpirun -np 4 -machinefile hosts.txt wrf.exe
# orterun -np 4 -host machine1,machine1,machine2,machine2 wrf.exe

Bart Brashers From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Ahsan Ali Sent: Friday, April 01, 2011 8:17 PM To: wrfhelp Subject: [Wrf-users] WRF running parallel in cluster Hello, I want to run WRF on multiple nodes in a Linux cluster using OpenMPI, but giving the command mpirun -np 4 ./wrf.exe just submits it to a single node. I don't know how to run it on the other nodes as well. Help needed. Regards, -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. 
It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110404/8eda2452/attachment.html From satyaban_2001 at rediffmail.com Tue Apr 5 18:59:57 2011 From: satyaban_2001 at rediffmail.com (Satyaban Bishoyi Ratna) Date: 6 Apr 2011 00:59:57 -0000 Subject: [Wrf-users] Metgrid problem-SST in ERA-Interim data Message-ID: <20110406005957.10400.qmail@f5mail-236-220.rediffmail.com> Dear User, I would like to update the SST in my simulation with ERA-Interim data. I was able to complete the simulation successfully; however, the SST data in the model is being read wrongly. I verified the "met_em......" files in WPS and found that the SST values are not correct in these files. I post-processed a "met_em..." file for one time using ARWpost and saw in GrADS that the SST values are "Contouring: -1.1e+30 to 1e+29 interval 1e+29". The original SST grib data provided to WPS are correct and in degrees K, but I am surprised to see the wrong values in the files generated by "metgrid". Can you please suggest how to solve this error? Thanks in advance. Best regards Satyaban -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110406/8aebe1c6/attachment.html From Don.Morton at alaska.edu Wed Apr 6 18:59:12 2011 From: Don.Morton at alaska.edu (Don Morton) Date: Wed, 6 Apr 2011 16:59:12 -0800 Subject: [Wrf-users] WRF on bluefire Message-ID: Howdy, I'm getting ready to do some basic WRF testing on bluefire, and am wondering if there are some centralized WRF resources already installed (for example, all the geog files), so that I don't have to use up valuable disc space repeating what possibly many other users have already done. And, is there a pre-compiled WPS and WRF on bluefire that people tend to use (at least until they get into something that requires more fine-tuning)? If so, is there a resource out there that might describe all of this? Or, does everybody just install their own geog and WRF/WPS, etc. in their own user space? Thanks, Don Morton -- Voice: 907 450 8679 Arctic Region Supercomputing Center http://weather.arsc.edu/ http://www.arsc.edu/~morton/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110406/1ea9962f/attachment.html From bbrashers at Environcorp.com Wed Apr 6 09:46:07 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Wed, 6 Apr 2011 08:46:07 -0700 Subject: [Wrf-users] WRF Problem running in Parallel In-Reply-To: References: Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF3085206D3CAD4@irvine01.irvine.environ.local> Looks to me like OpenMPI is not installed on all the compute nodes within your cluster. Note the line "bash: orted: command not found", which says it can't run, because the executable daemon for OpenMPI doesn't exist (on the compute node). You need to install OpenMPI on all the nodes. It looks like you're using a RocksCluster.org cluster (based on the naming of the compute nodes). If so, you could install OpenMPI in /share/apps/openmpi (or something similar under /share/apps). 
Everything in /share/apps is shared via NFS to all the nodes in the cluster. Alternatively, you could create an RPM of the OpenMPI bits, and install the RPM on all the nodes in your cluster. When running OpenMPI's configure, you could use something like this:

# configure --with-tm=/opt/lsf --prefix=/share/apps/openmpi

where you'll have to adjust the path to LSF to be the real path. When you run WRF using a properly installed orterun, you won't have to specify the -np or -hostfile. Just "orterun real.exe" or "orterun wrf.exe". Bart Brashers From: Ahsan Ali [mailto:ahsanshah01 at gmail.com] Sent: Tuesday, April 05, 2011 11:57 PM To: Bart Brashers Subject: WRF Problem running in Parallel Dear Bart, It gives the following error for each command. We have LSF installed but I am not sure how to integrate WRF with LSF.

[root at pmd02 em_real]# orterun -np 4 -hostfile hosts.txt real.exe
bash: orted: command not found
--------------------------------------------------------------------------
A daemon (pid 13139) died unexpectedly with status 127 while attempting to launch so we are aborting. There may be more information reported by the environment (see above). This may be because the daemon was unable to find all the needed shared libraries on the remote node. You may set your LD_LIBRARY_PATH to have the location of the shared libraries on the remote nodes and this will automatically be forwarded to the remote nodes.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
orterun noticed that the job aborted, but has no info as to the process that caused that situation.
--------------------------------------------------------------------------
bash: orted: command not found
--------------------------------------------------------------------------
orterun was unable to cleanly terminate the daemons on the nodes shown below. 
Additional manual cleanup may be required - please refer to the "orte-clean" tool for assistance.
--------------------------------------------------------------------------
compute-02-01 - daemon did not report back when launched
compute-02-02 - daemon did not report back when launched

Are you using any queuing system like SGE, Torque, PBS, etc.? In OpenMPI, the mpirun (and mpiexec) are really just links to orterun. Orterun is smart enough to get the list of hostnames to use from the queuing system. If you're not using a queuing system, then you need to tell orterun which machines to use. There are several ways, see `man orterun`. You could do any of these:

# cat hosts.txt
machine1
machine1
machine2
machine2

# orterun -np 4 -hostfile hosts.txt wrf.exe
# mpirun -np 4 -machinefile hosts.txt wrf.exe
# orterun -np 4 -host machine1,machine1,machine2,machine2 wrf.exe

Bart Brashers -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110406/40ebfacf/attachment.html From maria.frediani at gmail.com Wed Apr 6 12:18:06 2011 From: maria.frediani at gmail.com (Maria Eugenia) Date: Wed, 6 Apr 2011 14:18:06 -0400 Subject: [Wrf-users] Wrf-users Digest, Vol 80, Issue 5 In-Reply-To: References: Message-ID: Metgrid problem-SST in ERA-Interim data (Satyaban Bishoyi Ratna) Dear Satyaban, I had the same problem when using the ECMWF global analysis. I believe this problem comes from a difference between WRF's and ECMWF's landmasks. To solve it, I interpolated the original SST field (from the ECMWF grib file) over the entire domain, i.e., expanding the SST field onto the land. I used the resulting file as a secondary data input in WPS. It worked just fine. Please contact me if you need more details on this procedure. Good luck Maria E. B. Frediani ------------------------------------------------------------------------------------- Visiting Scholar University of Connecticut School of Engineering 261 Glenbrook Rd Storrs, CT 06269 frediani at engr.uconn.edu On Wed, Apr 6, 2011 at 2:00 PM, wrote: > Send Wrf-users mailing list submissions to > wrf-users at ucar.edu > > To subscribe or unsubscribe via the World Wide Web, visit > http://mailman.ucar.edu/mailman/listinfo/wrf-users > or, via email, send a message with subject or body 'help' to > wrf-users-request at ucar.edu > > You can reach the person managing the list at > wrf-users-owner at ucar.edu > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of Wrf-users digest..." > > > Today's Topics: > > 1. 
Metgrid problem-SST in ERA-Interim data (Satyaban Bishoyi Ratna) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: 6 Apr 2011 00:59:57 -0000 > From: "Satyaban Bishoyi Ratna" > Subject: [Wrf-users] Metgrid problem-SST in ERA-Interim data > To: "Wrf Users " > Message-ID: <20110406005957.10400.qmail at f5mail-236-220.rediffmail.com> > Content-Type: text/plain; charset="utf-8" > > > Dear User, I would like to update SST in my simulation with ERAIN data. > I could able to complete the simulation successfully, however the SST data > in the model is reading wrongly. > I have tried to verify the "met_em......" files in WPS and came to know that > the SST values are not correct in these files. > I have post processed the "met_em..." file for one time using ARWpost and > seen using GrADS that the SST values are "Contouring: -1.1e+30 to 1e+29 interval 1e+29". > The original SST grib data provided to WPS are correct and in degree K but > I am surprise to see the wrong values in the files generated by "metgrid". > Can you please suggest how to solve this error. > Thanks in advance. > > Best regards > Satyaban > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110406/8aebe1c6/attachment-0001.html > > ------------------------------ > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > End of Wrf-users Digest, Vol 80, Issue 5 > **************************************** > From tilkara17 at gmail.com Wed Apr 6 15:10:30 2011 From: tilkara17 at gmail.com (Florencia Luraschi) Date: Wed, 6 Apr 2011 18:10:30 -0300 Subject: [Wrf-users] Wind on z (km) levels Message-ID: [translated from Spanish] Hello, I am working with version 3.2 of the model, configured with two nests over the South America region. 
My problem is in computing the wind on z levels, that is in km rather than on pressure levels. The ARWUsersGuideV3 says that to obtain this I should change the interp_levels option in namelist.ARWpost to increasing numbers, i.e. interp_levels = 0.01,0.02,0.03,0.04,0.05,0.06,0.07,0.08,0.09,0.10,0.12,0.13,/ (I need 10, 20, 50, 70 meters, for example) instead of interp_levels = 1000.,950.,900.,850.,800.,750.,700.,650.,600.,550.,500.,450.,400.,350.,300.,250.,200.,150.,100.,/ But when I plot the fields with GrADS, there are regions without data. Also, if I compare the 10 m wind computed this way with the 10 m wind that the ARW computes (i.e. a variable that can be chosen from the list of variables available with ARWpost), it is clear that they are not computing the same thing. Is this just how it is, with no way to improve it, or am I doing something wrong or missing some setting? Many thanks in advance, and I hope someone can help me. Florencia Luraschi Servicio Meteorológico Nacional, Argentina -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110406/db96680c/attachment-0001.html From nbon0004 at um.edu.mt Sat Apr 9 12:08:40 2011 From: nbon0004 at um.edu.mt (Norbert Bonnici) Date: Sat, 9 Apr 2011 20:08:40 +0200 Subject: [Wrf-users] WPS 3.3 compilation error Message-ID: Dear users, I downloaded the latest version of WPS (3.3) and when I tried to compile it I got this final error: make[1]: Leaving directory `/home/nbon0004/WRF/WPS/util/src' if [ -h height_ukmo.exe ] ; then \ /bin/rm -f height_ukmo.exe ; \ fi ; \ if [ -h ../height_ukmo.exe ] ; then \ /bin/rm -f ../height_ukmo.exe ; \ fi ; \ if [ -e src/height_ukmo.exe ] ; then \ ln -sf src/height_ukmo.exe . 
; \ fi I had previously compiled 3.2 and 3.2.1 successfully on the same system; I do not know why I'm getting this error -_- Regards -- Norbert Bonnici From yingli at mail.usf.edu Fri Apr 8 16:25:08 2011 From: yingli at mail.usf.edu (yingli zhu) Date: Fri, 8 Apr 2011 18:25:08 -0400 Subject: [Wrf-users] Where is the 4DVAR nonlinear model? Message-ID: Hi, everyone, The new version of WRFDA has been released. Does the new 4DVAR not need WRFNL now? How does the 4DVAR use the nonlinear model? Best, Yingli From pliu34 at gatech.edu Mon Apr 11 14:58:18 2011 From: pliu34 at gatech.edu (Liu, Peng) Date: Mon, 11 Apr 2011 16:58:18 -0400 (EDT) Subject: [Wrf-users] difference of landsea and landmask field Message-ID: <240319177.315606.1302555498385.JavaMail.root@mail5.gatech.edu> Dear wrf users, I am confused about the landmask field in the output file of geogrid.exe and the landsea field in the output file of metgrid.exe. As far as I understand, the landmask field comes from the static landuse data and the landsea field comes from the meteorological data (namely the input of WRF). In my case, my input data has a land-sea flag of its own, so it is written into the intermediate file and further into the metgrid.exe output file. So the metgrid.exe output has both the landsea and landmask fields. I wonder what happens if the landmask and landsea fields are not the same. I mean, when WPS interpolates the meteorological data horizontally, which field is used to control whether a certain grid cell is water or land: landmask or landsea? And when running wrf.exe, which field controls the land and sea? 
Thank you very much Peng From ahmed4kernel at gmail.com Tue Apr 12 15:09:22 2011 From: ahmed4kernel at gmail.com (ahmed lasheen) Date: Tue, 12 Apr 2011 23:09:22 +0200 Subject: [Wrf-users] WPS Compilation error Message-ID: Hello, I am using Fedora 14, PGI 10.0 and NetCDF 4.1.1. I am working on WPS 3.3; when I compiled it I found a lot of errors, which I have redirected to a compile.log file attached to this email. I chose the following option in configure: 10. PC Linux x86_64 (IA64 and Opteron), PGI compiler 5.2 or higher, serial. I do not know what causes these errors. Thanks in advance, Ahmed -- =============== Ahmed Lasheen Junior researcher at Cairo Numerical Weather Prediction Center (CNWPC) Egyptian Meteorological Authority(EMA) Cairo,Egypt =============== -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110412/8389333c/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: compile.log Type: text/x-log Size: 106653 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110412/8389333c/attachment-0001.bin From dbh409 at ku.edu Thu Apr 14 07:38:51 2011 From: dbh409 at ku.edu (Huber, David) Date: Thu, 14 Apr 2011 13:38:51 +0000 Subject: [Wrf-users] Noah fix in WRF 3.3 Message-ID: Hello, I noticed in the fixes listed for WRF 3.3 that the Noah LSM underground runoff was fixed. I was wondering if you know exactly what the problem was and how it was fixed. Thanks, Dave From Michael.Zulauf at iberdrolaren.com Tue Apr 12 14:44:03 2011 From: Michael.Zulauf at iberdrolaren.com (Zulauf, Michael) Date: Tue, 12 Apr 2011 13:44:03 -0700 Subject: [Wrf-users] cluster interconnect - Infiniband vs 10 gigabit ethernet Message-ID: Hi all - quick question. 
Does anyone have data or experience comparing the performance and scaling of WRF with cluster interconnects utilizing 10 gigabit ethernet (10GigE) vs Infiniband? We're looking to expand and update our computing resources, and we'd originally spec'd it with Infiniband. The nodes will have either dual quad- or hex-core Nehalem type processors. There's been some IT pushback, suggesting that we should go with 10GigE. I don't have any direct experience with 10GigE, but my experience with 1GigE shows that Infiniband scales far better. I've seen what I'd consider marketing material on the web that suggests that 10GigE is comparable to Infiniband, but they don't specifically mention WRF. On the other hand, I've seen other sites that suggest an Infiniband interconnect is far superior. Again, these don't specifically mention WRF. I know the particular application in use is critical when deciding these things, and that WRF is a pretty demanding application when it comes to the interconnect. My suspicion is that Infiniband is still significantly superior, but if I'm going to be able to make any headway with IT, then I'll probably need some type of numbers to back up my arguments. Can anyone help? Thanks, Mike -- PLEASE NOTE - NEW E-MAIL ADDRESS: michael.zulauf at iberdrolaren.com Mike Zulauf Meteorologist, Lead Senior Wind Asset Management Iberdrola Renewables 1125 NW Couch, Suite 700 Portland, OR 97209 Office: 503-478-6304 Cell: 503-913-0403 Please be advised that email addresses for Iberdrola Renewables personnel have changed to first.last at iberdrolaREN.com effective Aug. 16, 2010. Please make a note. Thank you. This message is intended for the exclusive attention of the recipient(s) indicated. Any information contained herein is strictly confidential and privileged. If you are not the intended recipient, please notify us by return e-mail and delete this message from your computer system. 
Any unauthorized use, reproduction, alteration, filing or sending of this message and/or any attached files may lead to legal action being taken against the party(ies) responsible for said unauthorized use. Any opinion expressed herein is solely that of the author(s) and does not necessarily represent the opinion of the Company. The sender does not guarantee the integrity, speed or safety of this message, and does not accept responsibility for any possible damage arising from the interception, incorporation of viruses, or any other damage as a result of manipulation. From apsims at ncsu.edu Thu Apr 14 12:32:47 2011 From: apsims at ncsu.edu (Aaron Sims) Date: Thu, 14 Apr 2011 14:32:47 -0400 Subject: [Wrf-users] cluster interconnect - Infiniband vs 10 gigabit ethernet In-Reply-To: References: Message-ID: <4DA73DCF.6060108@ncsu.edu> Mike, It's been my experience that latency is the critical issue when running WRF in parallel, not raw speed. I would spec out the latency of 10GigE, which I think is about the same as GigE, and compare it to InfiniBand. I think you'll find InfiniBand is better in this case. Aaron Zulauf, Michael wrote: > Hi all - quick question. Does anyone have data or experience comparing > the performance and scaling of WRF with cluster interconnects utilizing > 10 gigabit ethernet (10GigE) vs Infiniband? > > We're looking to expand and update our computing resources, and we'd > originally spec'd it with Infiniband. The nodes will have either dual > quad- or hex-core Nehalem type processors. There's been some IT > pushback, suggesting that we should go with 10GigE. I don't have any > direct experience with 10GigE, but my experience with 1GigE shows that > Infiniband scales far better. > > I've seen what I'd consider marketing material on the web that suggests > that 10GigE is comparable to Infiniband, but they don't specifically > mention WRF. On the other hand, I've seen other sites that suggest an > Infiniband interconnect is far superior. 
Again, these don't > specifically mention WRF. I know the particular application in use is > critical when deciding these things, and that WRF is a pretty demanding > application when it comes to the interconnect. > > My suspicion is that Infiniband is still significantly superior, but if > I'm going to be able to make any headway with IT, then I'll probably > need some type of numbers to back up my arguments. > > Can anyone help? > > Thanks, > Mike > > From mmkamal at sciborg.uwaterloo.ca Thu Apr 14 13:31:52 2011 From: mmkamal at sciborg.uwaterloo.ca (mmkamal at sciborg.uwaterloo.ca) Date: Thu, 14 Apr 2011 15:31:52 -0400 Subject: [Wrf-users] cluster interconnect - Infiniband vs 10 gigabit ethernet In-Reply-To: References: Message-ID: <20110414153152.208531h1uie5fyyo@www.nexusmail.uwaterloo.ca> Hi Mike, Yes, the interconnect is a big issue when running WRF on an HPC cluster. InfiniBand increases WRF performance by up to 115% for a cluster size of 24 nodes. I had used GigE for the last year, but when I got the opportunity to use InfiniBand it really surprised me. You can take a look at the following two resources for further details. http://www.hpcadvisorycouncil.com/pdf/081016a%20WRF%20Model%20analysis.pdf http://www.linuxclustersinstitute.org/conferences/archive/2009/PDF/Shainer_64557.pdf Thanks Kamal Quoting "Zulauf, Michael" : > Hi all - quick question. Does anyone have data or experience comparing > the performance and scaling of WRF with cluster interconnects utilizing > 10 gigabit ethernet (10GigE) vs Infiniband? > > We're looking to expand and update our computing resources, and we'd > originally spec'd it with Infiniband. The nodes will have either dual > quad- or hex-core Nehalem type processors. There's been some IT > pushback, suggesting that we should go with 10GigE. I don't have any > direct experience with 10GigE, but my experience with 1GigE shows that > Infiniband scales far better. 
> > I've seen what I'd consider marketing material on the web that suggests > that 10GigE is comparable to Infiniband, but they don't specifically > mention WRF. On the other hand, I've seen other sites that suggest an > Infiniband interconnect is far superior. Again, these don't > specifically mention WRF. I know the particular application in use is > critical when deciding these things, and that WRF is a pretty demanding > application when it comes to the interconnect. > > My suspicion is that Infiniband is still significantly superior, but if > I'm going to be able to make any headway with IT, then I'll probably > need some type of numbers to back up my arguments. > > Can anyone help? > > Thanks, > Mike > > -- > > PLEASE NOTE - NEW E-MAIL ADDRESS: > michael.zulauf at iberdrolaren.com > > Mike Zulauf > Meteorologist, Lead Senior > Wind Asset Management > Iberdrola Renewables > 1125 NW Couch, Suite 700 > Portland, OR 97209 > Office: 503-478-6304 Cell: 503-913-0403 > > > Please be advised that email addresses for Iberdrola Renewables > personnel have changed to first.last at iberdrolaREN.com effective Aug. > 16, 2010. Please make a note. Thank you. > > This message is intended for the exclusive attention of the > recipient(s) indicated. Any information contained herein is strictly > confidential and privileged. If you are not the intended recipient, > please notify us by return e-mail and delete this message from your > computer system. Any unauthorized use, reproduction, alteration, > filing or sending of this message and/or any attached files may lead > to legal action being taken against the party(ies) responsible for > said unauthorized use. Any opinion expressed herein is solely that > of the author(s) and does not necessarily represent the opinion of > the Company. 
The sender does not guarantee the integrity, speed or > safety of this message, and does not accept responsibility for any > possible damage arising from the interception, incorporation of > viruses, or any other damage as a result of manipulation. > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > From or at belgingur.is Thu Apr 14 15:00:45 2011 From: or at belgingur.is (Ólafur Rögnvaldsson) Date: Thu, 14 Apr 2011 21:00:45 +0000 Subject: [Wrf-users] cluster interconnect - Infiniband vs 10 gigabit ethernet In-Reply-To: <4DA73DCF.6060108@ncsu.edu> References: <4DA73DCF.6060108@ncsu.edu> Message-ID: <6B5E3FCB-5221-4ABD-B15A-425B1B9A146B@belgingur.is> I agree with Aaron, it's the low latency of the InfiniBand that makes it a more suitable inter-node fabric than 1/10G Ethernet. Kind regards, Ólafur. > Mike, > > Its been my experience that latency is the critical issue when running > wrf in parallel, not raw speed. I would spec out the latency of 10GigE, > which I think is about the same as GigE, and compare it to Infiniband. > I think youll find infiniband is better in this case. > > Aaron > > Zulauf, Michael wrote: >> Hi all - quick question. Does anyone have data or experience comparing >> the performance and scaling of WRF with cluster interconnects utilizing >> 10 gigabit ethernet (10GigE) vs Infiniband? >> >> We're looking to expand and update our computing resources, and we'd >> originally spec'd it with Infiniband. The nodes will have either dual >> quad- or hex-core Nehalem type processors. There's been some IT >> pushback, suggesting that we should go with 10GigE. I don't have any >> direct experience with 10GigE, but my experience with 1GigE shows that >> Infiniband scales far better. 
>> >> I've seen what I'd consider marketing material on the web that suggests >> that 10GigE is comparable to Infiniband, but they don't specifically >> mention WRF. On the other hand, I've seen other sites that suggest an >> Infiniband interconnect is far superior. Again, these don't >> specifically mention WRF. I know the particular application in use is >> critical when deciding these things, and that WRF is a pretty demanding >> application when it comes to the interconnect. >> >> My suspicion is that Infiniband is still significantly superior, but if >> I'm going to be able to make any headway with IT, then I'll probably >> need some type of numbers to back up my arguments. >> >> Can anyone help? >> >> Thanks, >> Mike >> >> > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users -- Kveðja/regards, Ólafur Rögnvaldsson Reiknistofa í veðurfræði - Belgingur www.riv.is - www.belgingur.is From or at belgingur.is Thu Apr 14 15:36:03 2011 From: or at belgingur.is (Ólafur Rögnvaldsson) Date: Thu, 14 Apr 2011 21:36:03 +0000 Subject: [Wrf-users] WPS Compilation error In-Reply-To: References: Message-ID: Hello Ahmed. A few things come to mind. 1) I believe that as of version 4 of the GCC compiler the "g2c" procedure is no longer supported. Consequently you cannot compile "plotfmt.exe" and "plotgrids.exe". 2) I've just installed the latest versions of WRF and WPS on an Ubuntu 10.10 system. Rather than compiling the JPEG compression libraries, I used the update manager of Ubuntu (called "aptitude") to install the necessary software. I believe Fedora has something similar (maybe "yum" or "apt"). I recommend you try installing "libjpeg" using Fedora's update manager. 3) Make sure you have the correct path to the PGI libraries. By default the configure.wps file points to version 5.2 of the PGI suite. 
Also, check if your libX11 library is located under /lib/X11R (as by default in configure.wps) or if it is under /usr/lib (if so, you have to modify the configure.wps file accordingly). Once you have checked this and installed the jpeg library, you should be able to compile everything (with the exception of "plotfmt.exe" and "plotgrids.exe"). Hope this helps. Best, Ólafur. > Hello, > I am using Fedora 14 ,PGI 10.0 and Netcdf 4.1.1 > I am working on the WPS 3.3 , when i compile it i found alot of errors > i have redirected to compile.log file and it is attached in the email > I choose the following option in the configure > 10. PC Linux x86_64 (IA64 and Opteron), PGI compiler 5.2 or higher, serial > i donot know what cause these errors > thanks in advance > Ahmed > -- > =============== > Ahmed Lasheen > Junior researcher at Cairo Numerical Weather Prediction Center (CNWPC) > Egyptian Meteorological Authority(EMA) > Cairo,Egypt > =============== > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users -- Kveðja/regards, Ólafur Rögnvaldsson Reiknistofa í veðurfræði - Belgingur www.riv.is - www.belgingur.is From wrf at nusculus.com Thu Apr 14 13:11:43 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Thu, 14 Apr 2011 13:11:43 -0600 Subject: [Wrf-users] Noah fix in WRF 3.3 In-Reply-To: References: Message-ID: Greetings, I reported that bug and they correctly, imho, fixed it. The runoff variable is meant to be a composite of two variables, runoff2 and runoff3, which are two different kinds of underground runoff values. But they added them together twice, once in module_sf_noahlsm.F and again in module_sf_noahdrv.F, which caused runoff3 to get counted twice. Runoff2 is the normal flow of water out the bottom of the soil. Runoff3 is the excess water if the bottom soil layer is somehow more than saturated - the excess is just taken out. 
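[Editorial note: the double-counting described above can be illustrated with a small sketch. This is plain Python with made-up values, purely for illustration; the variable names echo the Fortran ones but nothing below is taken from the WRF source.]

```python
# Illustrative sketch of the Noah LSM underground-runoff bug described above.
# runoff2: normal drainage out the bottom of the soil column
# runoff3: excess water removed when the bottom layer is over-saturated
runoff2, runoff3 = 1.5, 0.4  # made-up values, arbitrary units

# Pre-3.3 behavior: both module_sf_noahlsm and module_sf_noahdrv added
# runoff3 into the composite, so it was counted twice.
buggy_udrunoff = (runoff2 + runoff3) + runoff3

# Fixed behavior: the composite runoff2 + runoff3 is formed only once.
fixed_udrunoff = runoff2 + runoff3

print(buggy_udrunoff)  # 2.3
print(fixed_udrunoff)  # 1.9
```

The buggy total overshoots by exactly runoff3, which is why runs with significant over-saturation were the ones most affected.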
If you use such things, please be aware that the two accumulated values, SFCRUNOFF and UDRUNOFF, are not set to zero at the beginning of a run for Noah (they are for the RUC LSM). You may get a zero initial value by chance, but don't depend on that; I don't. I reported this to wrfhelp, but was not able to get the idea across that accumulated values should start with zero. So they didn't fix it. Here is what I did to set things to zero in V3.2.1: In module_sf_noahdrv.F, about line 652, which is within the loops going through every cell during just the first timestep, I added these lines: ! Initialize surface and under ground runoff values SFCRUNOFF(I,J) = 0.0 UDRUNOFF(I,J) = 0.0 Yes, I know that Fortran lets you set all values in an array without looping, but I try to make my code fit in with that to which I am adding. By the way, the variables are SFCRUNOFF and UDRUNOFF in the code, but SFCROFF and UDROFF in the output files. Hope that helps, Kevin On Thu, Apr 14, 2011 at 7:38 AM, Huber, David wrote: > Hello, > > I noticed in the fixes listed for WRF 3.3 that the Noah LSM underground > runoff was fixed. I was wondering if you know exactly what the problem was > and how it was fixed. > > Thanks, > > Dave > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110414/57ee4c38/attachment-0001.html From saji at u-aizu.ac.jp Thu Apr 14 16:51:48 2011 From: saji at u-aizu.ac.jp (Saji Hameed) Date: Fri, 15 Apr 2011 07:51:48 +0900 Subject: [Wrf-users] cluster interconnect - Infiniband vs 10 gigabit ethernet In-Reply-To: <6B5E3FCB-5221-4ABD-B15A-425B1B9A146B@belgingur.is> References: <4DA73DCF.6060108@ncsu.edu> <6B5E3FCB-5221-4ABD-B15A-425B1B9A146B@belgingur.is> Message-ID: Very good. 
I am considering InfiniBand too, based on the work that is mentioned by Kamal. Does anybody have any idea of the cost involved in InfiniBand infrastructure? saji 2011/4/15 Ólafur Rögnvaldsson > I agree with Aaron, it's the low latency of the InfiniBand that makes it a > more suitable inter-node fabric than 1/10G Ethernet. > > Kind regards, Ólafur. > > > Mike, > > > > Its been my experience that latency is the critical issue when running > > wrf in parallel, not raw speed. I would spec out the latency of 10GigE, > > which I think is about the same as GigE, and compare it to Infiniband. > > I think youll find infiniband is better in this case. > > > > Aaron > > > > Zulauf, Michael wrote: > >> Hi all - quick question. Does anyone have data or experience comparing > >> the performance and scaling of WRF with cluster interconnects utilizing > >> 10 gigabit ethernet (10GigE) vs Infiniband? > >> > >> We're looking to expand and update our computing resources, and we'd > >> originally spec'd it with Infiniband. The nodes will have either dual > >> quad- or hex-core Nehalem type processors. There's been some IT > >> pushback, suggesting that we should go with 10GigE. I don't have any > >> direct experience with 10GigE, but my experience with 1GigE shows that > >> Infiniband scales far better. > >> > >> I've seen what I'd consider marketing material on the web that suggests > >> that 10GigE is comparable to Infiniband, but they don't specifically > >> mention WRF. On the other hand, I've seen other sites that suggest an > >> Infiniband interconnect is far superior. Again, these don't > >> specifically mention WRF. I know the particular application in use is > >> critical when deciding these things, and that WRF is a pretty demanding > >> application when it comes to the interconnect. 
> >> > >> My suspicion is that Infiniband is still significantly superior, but if > >> I'm going to be able to make any headway with IT, then I'll probably > >> need some type of numbers to back up my arguments. > >> > >> Can anyone help? > >> > >> Thanks, > >> Mike > >> > >> > > _______________________________________________ > > Wrf-users mailing list > > Wrf-users at ucar.edu > > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- > Kveðja/regards, > Ólafur Rögnvaldsson > Reiknistofa í veðurfræði - Belgingur > www.riv.is - www.belgingur.is > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -- Saji N Hameed, ARC-ENV, Center for Advanced Information Science and Technology, University of Aizu, Tsuruga, Ikki-machi, Aizuwakamatsu-shi, Fukushima 965-8580, Japan Tel: +81242 37-2736 email: saji at u-aizu.ac.jp url: http://www.u-aizu.ac.jp bib: http://www.researcherid.com/rid/B-9188-2009 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110415/d3636de2/attachment.html From andrew.robbie at gmail.com Mon Apr 18 04:29:48 2011 From: andrew.robbie at gmail.com (Andrew Robbie (Gmail)) Date: Mon, 18 Apr 2011 20:29:48 +1000 Subject: [Wrf-users] cluster interconnect - Infiniband vs 10 gigabit ethernet In-Reply-To: References: Message-ID: On 13/04/2011, at 6:44 AM, Zulauf, Michael wrote: > Hi all - quick question. Does anyone have data or experience > comparing > the performance and scaling of WRF with cluster interconnects > utilizing > 10 gigabit ethernet (10GigE) vs Infiniband? As people have said, the lower latency of IB (especially with the latest generation Mellanox and QLogic HBAs) makes it a better choice. In our experience the added speed of QDR (quad data rate IB, 40Gbit/s) does not provide much improvement over DDR IB. 
However, the extra bandwidth is really useful if you plan to use the IB network for storage (NFS/Lustre/PVFS2 etc). > I don't have any > direct experience with 10GigE, but my experience with 1GigE shows that > Infiniband scales far better. Comparing IB with GigE is not really fair, but IB will scale better than 10GigE in the data center. GigE is easier to maintain as the drivers are prepackaged and just work, which makes it a reasonable choice for small clusters. > I've seen what I'd consider marketing material on the web that > suggests > that 10GigE is comparable to Infiniband, but they don't specifically > mention WRF. On the other hand, I've seen other sites that suggest an > Infiniband interconnect is far superior. Again, these don't > specifically mention WRF. I know the particular application in use is > critical when deciding these things, and that WRF is a pretty > demanding > application when it comes to the interconnect. Some 10GigE switches have impressively low latency, but not as good as IB. Also, in our experience 10GigE switches cost more, maybe because of the market segments they target, like 'Enterprise Switching' rather than HPC. The main reason to have 10GigE is to connect distributed clusters, as sending IB across a WAN is challenging. It is also good to have the storage nodes with both IB and 10GigE: IB for inside the cluster and 10GigE to serve results to external clients (eg desktop workstations). That is, 10GigE is the interface to the corporate IT network, via the head node and the file server nodes. > My suspicion is that Infiniband is still significantly superior, but > if > I'm going to be able to make any headway with IT, then I'll probably > need some type of numbers to back up my arguments. Are they going to set up and manage your cluster? Tune MPI to run on 10GigE? Will they provide money to pay for a more expensive 10GigE solution and make the cluster 30% bigger to make up for the slower 10GigE? I doubt it. 
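[Editorial note: the latency argument made by several posters in this thread can be sketched with a toy timing model. All figures below are rough illustrative assumptions, not benchmarks of any real fabric or of WRF itself.]

```python
# Toy model: time to exchange one halo message = latency + size / bandwidth.
# Latency/bandwidth numbers are illustrative assumptions, not measurements.
def exchange_time(msg_bytes, latency_s, bandwidth_Bps):
    return latency_s + msg_bytes / bandwidth_Bps

KB = 1024
fabrics = {
    "GigE":   (50e-6, 125e6),   # ~50 us latency, ~1 Gbit/s
    "10GigE": (12e-6, 1.25e9),  # ~12 us latency, ~10 Gbit/s
    "IB DDR": (2e-6, 2.0e9),    # ~2 us latency, ~16 Gbit/s usable

}

# For the small halo messages typical of domain decomposition (a few KB),
# the latency term dominates, which is why latency rather than raw
# bandwidth tends to drive WRF scaling.
for name, (lat, bw) in fabrics.items():
    t = exchange_time(4 * KB, lat, bw)
    print(f"{name}: {t * 1e6:.1f} us per 4 KB exchange")
```

With these assumed numbers the 4 KB exchange is several times faster on the low-latency fabric even though the bandwidth ratio is much smaller, matching the qualitative point made above.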
Offer them a 10GigE pipe to connect to their systems and they will probably be happy. Andrew From ebeigi3 at tigers.lsu.edu Sun Apr 17 22:52:26 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Mon, 18 Apr 2011 00:52:26 -0400 Subject: [Wrf-users] installing netcdf Message-ID: Thanks for your previous help. I am installing netcdf-4.1.2, and before that I installed the zlib, libpng and jasper libraries, which are needed for WPS. When I want to install netcdf, it asks for a curl directory, which made me install curl. My question is: would it be possible to use this command for sharing the other installed libraries with netcdf: CC=icc FC=ifort ./configure --prefix=/home/ehsan/netcdf --with-zlib=/home/ehsan/zlib --with-libpng=/home/ehsan/libpng --with-jasper=/home/ehsan/jasper --with-curl=/home/ehsan/curl Best Regards Ehsan -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110418/c68d2354/attachment.html From hamed319 at yahoo.com Sat Apr 16 09:45:40 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Sat, 16 Apr 2011 08:45:40 -0700 (PDT) Subject: [Wrf-users] NETCDF Installation error Message-ID: <931873.55276.qm@web161213.mail.bf1.yahoo.com> Dear All, I just installed both ifort and icc but in the installation of NETCDF I got this: Fatal Error: This program was not built to run on the processor in your system. The allowed processors are: Intel(R) Core(TM) Duo processors and compatible Intel processors with supplemental Streaming SIMD Extensions 3 (SSSE3) instruction support. configure:5275: $? = 1 configure:5282: error: in `/home/hamed/netcdf-4.1.1': configure:5286: error: cannot run C compiled programs. If you meant to cross compile, use `--host'. Any Idea? 
Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110416/b5449d86/attachment.html From hamed319 at yahoo.com Mon Apr 18 06:04:10 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Mon, 18 Apr 2011 05:04:10 -0700 (PDT) Subject: [Wrf-users] WPS,Grib2 making ungrib.exe error Message-ID: <641452.40406.qm@web161217.mail.bf1.yahoo.com> Dear all, I've been trying to compile WPS with the grib2 option. In my compile.log there is no error, but on the screen I got the following. Although it compiled geogrid.exe and metgrid.exe, it could not compile ungrib.exe. (BTW, I successfully compiled WPS with the "NO GRIB2" option.) Any suggestion would be appreciated. Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110418/7fce50cc/attachment.html From mmkamal at sciborg.uwaterloo.ca Fri Apr 15 18:58:18 2011 From: mmkamal at sciborg.uwaterloo.ca (mmkamal at sciborg.uwaterloo.ca) Date: Fri, 15 Apr 2011 20:58:18 -0400 Subject: [Wrf-users] Memory fault in WPP (run_wrfpostandgrads) Message-ID: <20110415205818.19816riy6f9hqz8c@www.nexusmail.uwaterloo.ca> Dear All, I have been trying to use WPP to convert my ARW NetCDF output into GRIB format but could not succeed. I have successfully finished compiling WPP, but while trying to run "run_wrfpostandgrads" I get only a single error message, "Memory fault", in "wrfpost_d01.000.out". Does anyone have any idea how to solve this? 
Thanks in advance Kamal From mmkamal at sciborg.uwaterloo.ca Fri Apr 15 19:03:28 2011 From: mmkamal at sciborg.uwaterloo.ca (mmkamal at sciborg.uwaterloo.ca) Date: Fri, 15 Apr 2011 21:03:28 -0400 Subject: [Wrf-users] Memory fault in WPP (run_wrfpostandgrads) Message-ID: <20110415210328.47749j02h7le75r4@www.nexusmail.uwaterloo.ca> Note : Dear Moderator, could you please skip the 1st mail with the same Subject and post the 2nd mail in which I have included the log file. Dear All, I have been trying to use WPP to convert my ARW NetCDF output into GRIB format but could not succeed. I have successfully finish compiling WPP but while trying to run "run_wrfpostandgrads" I get only a single error message "Memory fault" in "wrfpost_d01.000.out". Has anyone any idea on how to solve this. Thanks in advance Kamal -------------- next part -------------- [mmkamal at thor scripts]$ ./run_wrfpostandgrads + export TOP_DIR=/bluejay/kamal/WRF3 + export DOMAINPATH=/bluejay/kamal/WRF3/DOMAINS + dyncore=ARW + [ ARW = NMM ] + [ ARW = ARW ] + export tag=NCAR + export startdate=2005082800 + export fhr=00 + export lastfhr=00 + export incrementhr=03 + export WRF_POSTPROC_HOME=/bluejay/kamal/WRF3/WPPV3 + export POSTEXEC=/bluejay/kamal/WRF3/WPPV3/exec + export SCRIPTS=/bluejay/kamal/WRF3/WPPV3/scripts + export WRFPATH=/bluejay/kamal/WRF3/WRFV3 + cd /bluejay/kamal/WRF3/DOMAINS/postprd + ln -sf /bluejay/kamal/WRF3/WPPV3/scripts/cbar.gs . + ln -fs /bluejay/kamal/WRF3/WRFV3/run/ETAMPNEW_DATA eta_micro_lookup.dat + ln -fs /bluejay/kamal/WRF3/DOMAINS/parm/wrf_cntrl.parm . 
+ export tmmark=tm00 + export MP_SHARED_MEMORY=yes + export MP_LABELIO=yes + pwd /bluejay/kamal/WRF3/DOMAINS/postprd + ls -x cbar.gs eta_micro_lookup.dat fort.110 fort.14 itag run_wrfpostandgrads wrf_cntrl.parm wrfpost_d01.000.out + export NEWDATE=2005082800 + [ 00 -le 00 ] + typeset -Z3 fhr + /bluejay/kamal/WRF3/WPPV3/exec/ndate.exe +000 2005082800 + NEWDATE=2005082800 + echo 2005082800 + cut -c1-4 + YY=2005 + echo 2005082800 + cut -c5-6 + MM=08 + echo 2005082800 + cut -c7-8 + DD=28 + echo 2005082800 + cut -c9-10 + HH=00 + echo NEWDATE 2005082800 NEWDATE 2005082800 + echo YY 2005 YY 2005 + cat + > itag + << EOF + rm fort.110 fort.14 + ln -sf wrf_cntrl.parm fort.14 + ln -sf griddef.out fort.110 + /bluejay/kamal/WRF3/WPPV3/exec/wrfpost.exe + < itag + > wrfpost_d01.000.out + 2>&1 + mv WRFPRS000.tm00 WRFPRS_d01.000 mv: cannot stat `WRFPRS000.tm00': No such file or directory + ls -l WRFPRS_d01.000 ls: WRFPRS_d01.000: No such file or directory + err1=1 + test 1 -ne 0 + echo WRF POST FAILED, EXITTING WRF POST FAILED, EXITTING + exit From stuefer at gi.alaska.edu Mon Apr 18 17:30:18 2011 From: stuefer at gi.alaska.edu (Martin) Date: Mon, 18 Apr 2011 15:30:18 -0800 Subject: [Wrf-users] PhD studentship, Geophysical Institute, University of Alaska Fairbanks Message-ID: <4DACC98A.80309@gi.alaska.edu> PhD studentship, 'WRF/Chem modeling of volcanic ash and SO2', University of Alaska Fairbanks. Applicants are invited for a PhD position at the University of Alaska Fairbanks starting mid 2011 as part of our volcanic ash modeling efforts. The project aims at investigating the dispersion of volcanic ash in the atmosphere using WRF/Chem. A variety of volcanic eruption models are used to initialize WRF/Chem. The models will be verified using case studies worldwide with the specific purpose to provide improvements for the modeling basis of volcanic ash concentration calculations. 
Methods will include model inter-comparison and testing of satellite remote sensing algorithms; operational tools are developed to support volcanic ash advisory centers. The successful candidate will have a strong background in geophysics with emphasis in atmospheric sciences and fluid dynamics. Excellent programming skills are required. To apply, please contact professors Martin Stuefer, stuefer at gi.alaska.edu, or Peter Webley, pwebley at gi.alaska.edu. From Alexander.Knyazev at intel.com Mon Apr 18 11:10:23 2011 From: Alexander.Knyazev at intel.com (Knyazev, Alexander) Date: Mon, 18 Apr 2011 18:10:23 +0100 Subject: [Wrf-users] NETCDF Installation error In-Reply-To: <931873.55276.qm@web161213.mail.bf1.yahoo.com> References: <931873.55276.qm@web161213.mail.bf1.yahoo.com> Message-ID: Typically this occurs when you try to compile using a -x switch on a system that doesn't support the specified instruction set (e.g. -xSSE3 on a Pentium 4). What were the exact system configuration and switches used? (I believe both are available in config.log.) WBR, Alex. From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Hamed Sharifi Sent: Saturday, April 16, 2011 7:46 PM To: wrf-users at ucar.edu Subject: [Wrf-users] NETCDF Installation error Dear All, I just installed both ifort and icc but in the installation of NETCDF I got this: Fatal Error: This program was not built to run on the processor in your system. The allowed processors are: Intel(R) Core(TM) Duo processors and compatible Intel processors with supplemental Streaming SIMD Extensions 3 (SSSE3) instruction support. configure:5275: $? = 1 configure:5282: error: in `/home/hamed/netcdf-4.1.1': configure:5286: error: cannot run C compiled programs. If you meant to cross compile, use `--host'. Any Idea? 
Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------------------------------------------------------------- Closed Joint Stock Company Intel A/O Registered legal address: Krylatsky Hills Business Park, 17 Krylatskaya Str., Bldg 4, Moscow 121614, Russian Federation This e-mail and any attachments may contain confidential material for the sole use of the intended recipient(s). Any review or distribution by others is strictly prohibited. If you are not the intended recipient, please contact the sender and delete all copies. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110418/846e0dba/attachment.html From hamed319 at yahoo.com Wed Apr 20 09:49:07 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Wed, 20 Apr 2011 08:49:07 -0700 (PDT) Subject: [Wrf-users] MPI Message-ID: <889410.16749.qm@web161201.mail.bf1.yahoo.com> Dear all, when I just want to use the MPI option it gives me this error: $mpirun -np 4 ./wrf.exe mpiexec failed: gethostbyname_ex failed for localhost.localdomain Any suggestion would be appreciated. Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110420/9841be48/attachment.html From Michael.Zulauf at iberdrolaren.com Mon Apr 18 13:35:01 2011 From: Michael.Zulauf at iberdrolaren.com (Zulauf, Michael) Date: Mon, 18 Apr 2011 12:35:01 -0700 Subject: [Wrf-users] WRF3.3 - syntax error in module_fr_sfire_core? Message-ID: Hi there, WRF Help and WRF Users. . . 
I've found what on _my_ system is being interpreted as a syntax error, but I don't know if it actually is a syntax error, a compiler directive that our system can't interpret, a configuration error, a compiler bug, or what. . . When attempting to build WRF 3.3, I get the following error: --------------------- snip ------------------------------------------------------ /apps/user_apps/OpenMPI-1.2.4_PGI7.0-2/bin/mpif90 -o module_fr_sfire_core.o -c -fastsse -Mvect=noaltcode -Mprefetch=distance:8 -Mfprelaxed -Mipa=fast,inline,safe -w -Mfree -byteswapio -I../dyn_em -I../dyn_nmm -module /apps/user_apps/WRF_versions/WRFV3.3_rev264/main -I/apps/user_apps/WRF_versions/WRFV3.3_rev264/external/esmf_time_f90 -I/apps/user_apps/WRF_versions/WRFV3.3_rev264/main -I/apps/user_apps/WRF_versions/WRFV3.3_rev264/external/io_netcdf -I/apps/user_apps/WRF_versions/WRFV3.3_rev264/external/io_int -I/apps/user_apps/WRF_versions/WRFV3.3_rev264/frame -I/apps/user_apps/WRF_versions/WRFV3.3_rev264/share -I/apps/user_apps/WRF_versions/WRFV3.3_rev264/phys -I/apps/user_apps/WRF_versions/WRFV3.3_rev264/chem -I/apps/user_apps/WRF_versions/WRFV3.3_rev264/inc -I/apps/user_apps/netcdf-4.1.1_PGI7.0-2/install_dir/include -r4 -i4 module_fr_sfire_core.f90 ; \ fi PGF90-S-0034-Syntax error at or near end of line (module_fr_sfire_core.f90: 233) 0 inform, 0 warnings, 1 severes, 0 fatal for nearest make[3]: [module_fr_sfire_core.o] Error 2 (ignored) --------------------- snip ------------------------------------------------------ When I examine module_fr_sfire_core.f90, I find that line 233 contains the following: !DEC$ ATTRIBUTES FORCEINLINE That looks to me something like a compiler directive (?), but given that it starts with the exclamation point, I would think it would be ignored if the compiler didn't understand it. Checking with module_fr_sfire_core.F, it looks like that line is directly inherited from there. 
When I remove that line (or replace it with a standard comment line), then that file compiles fine using the same compilation command. I haven't yet tried to see if this allows me to completely build the executable properly. As you can see from the command above, I'm using PGI 7.0-2. We've got later versions installed on our system, but I'm currently having some licensing issues that I've got to resolve. It's a bit interesting, because I've found that similar lines didn't cause problems elsewhere in the code base. For example, the file module_big_step_utilities_em.f90 contains several instances of the following, and generated no errors: !DEC$ loop count(3) Any thoughts? Thanks, Mike -- PLEASE NOTE - NEW E-MAIL ADDRESS: michael.zulauf at iberdrolaren.com Mike Zulauf Meteorologist, Lead Senior Wind Asset Management Iberdrola Renewables 1125 NW Couch, Suite 700 Portland, OR 97209 Office: 503-478-6304 Cell: 503-913-0403 Please be advised that email addresses for Iberdrola Renewables personnel have changed to first.last at iberdrolaREN.com effective Aug. 16, 2010. Please make a note. Thank you. This message is intended for the exclusive attention of the recipient(s) indicated. Any information contained herein is strictly confidential and privileged. If you are not the intended recipient, please notify us by return e-mail and delete this message from your computer system. Any unauthorized use, reproduction, alteration, filing or sending of this message and/or any attached files may lead to legal action being taken against the party(ies) responsible for said unauthorized use. Any opinion expressed herein is solely that of the author(s) and does not necessarily represent the opinion of the Company. The sender does not guarantee the integrity, speed or safety of this message, and does not accept responsibility for any possible damage arising from the interception, incorporation of viruses, or any other damage as a result of manipulation. 
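[Editorial note: the manual removal Mike describes can be automated. The sketch below is purely hypothetical and not an official WRF fix; it comments out only the `!DEC$ ATTRIBUTES FORCEINLINE` directive while leaving other `!DEC$` directives, like the `loop count` hints that compiled fine, untouched. The sample source string is invented for illustration.]

```python
# Hypothetical workaround sketch: turn the FORCEINLINE directive into a plain
# comment so compilers that reject it will ignore the line.
import re

def neutralize_forceinline(text: str) -> str:
    # Replace a whole-line '!DEC$ ATTRIBUTES FORCEINLINE' with an inert
    # comment; other !DEC$ directives (e.g. loop count hints) are untouched.
    return re.sub(r"(?mi)^\s*!DEC\$\s+ATTRIBUTES\s+FORCEINLINE\s*$",
                  "! (FORCEINLINE directive removed for older PGI)", text)

# Invented sample input, standing in for module_fr_sfire_core.f90.
src = "subroutine s()\n!DEC$ ATTRIBUTES FORCEINLINE\nend subroutine\n"
print(neutralize_forceinline(src))
```

Applied to each generated .f90 before compilation (or folded into the build as a preprocessing step), this reproduces the manual edit without touching the .F sources.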
From andybpenny at hotmail.com Mon Apr 25 13:23:03 2011 From: andybpenny at hotmail.com (Andrew Penny) Date: Mon, 25 Apr 2011 12:23:03 -0700 Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 Message-ID: <284776D2-1EC0-4B4B-8ADA-0CF171EFF060@hotmail.com> We are having problems getting the latest version of WRF compiled with the latest version of NetCDF. Here are the compilation options we are using for NetCDF 4.1.2: --------------------------------------------------- export CC=mpicc export CXX=mpicxx export FC=mpif90 export F77=mpif77 export F90=mpif90 export CPP='icpc -E' export CXXCPP='icpc -E' export CFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' export CXXFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' export CPPFLAGS='-DpgiFortran -I/opt/hdf5/intel/include -I/opt/hdf4/intel/include -I/opt/udunits/intel/include -I/usr/include -I/opt/szip/intel/include -I/usr/mpi/intel/mvapich2-1.6/include' export FFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' export FCFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' export FCFLAGS_f90='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' export F90FLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' export LDFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' export LIBS="-L/opt/hdf5/intel/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -L/opt/hdf4/intel/lib -ldf -L/opt/udunits/intel/lib -ludunits2 -L/usr/lib64 -lz -lm -L/opt/szip/intel/lib -lsz" ./configure --disable-shared --enable-large-file-tests --enable-fortran --enable-f90 --enable-f77 --disable-netcdf-4 --enable-netcdf4 --enable-parallel --enable-cxx-4 --enable-cxx --enable-hdf4 --enable-utilities --with-libcf --with-zlib=/usr --with-szlib=/opt/szip/intel --with-hdf4=/opt/hdf4/intel --with-hdf5=/opt/hdf5/intel --with-curl-config=/usr/bin --prefix=/path/to/netcdf --------------------------------------------------- The 
mpif77/mpicc/mpif90 binaries are compiled using the Intel C++/FORTRAN compilers. Here is the "LIB_EXTERNAL" variable setting in the 'configure.wrf' file for compiling WRF version 3.3: --------------------------------------------------- LIB_EXTERNAL = \ -L$(WRF_SRC_ROOT_DIR)/external/io_netcdf -lwrfio_nf -L/path/to/netcdf/lib -lnetcdff -lnetcdf -L$(WRF_SRC_ROOT_DIR)/external/io_grib2 -lio_grib2 -L/opt/jasper/intel/lib -ljasper -L/opt/hdf4/intel/lib -ldf -lmfhdf -L/opt/hdf5/intel/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -L/opt/g2clib/intel/lib -lgrib2c -L/usr/lib64 -lcurl --------------------------------------------------- Here are the error messages that show up during the './compile em_real' process: --------------------------------------------------- /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_text_': fort-attio.c:(.text+0x7a): undefined reference to `nc_put_att_text' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_text_': fort-attio.c:(.text+0x18d): undefined reference to `nc_get_att_text' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int1_': fort-attio.c:(.text+0x2af): undefined reference to `nc_put_att_schar' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int1_': fort-attio.c:(.text+0x3bd): undefined reference to `nc_get_att_schar' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int2_': fort-attio.c:(.text+0x4df): undefined reference to `nc_put_att_short' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int2_': fort-attio.c:(.text+0x5ed): undefined reference to `nc_get_att_short' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int_': fort-attio.c:(.text+0x70f): undefined reference to `nc_put_att_int' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int_': fort-attio.c:(.text+0x81d): undefined reference to `nc_get_att_int' 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_real_': fort-attio.c:(.text+0x93f): undefined reference to `nc_put_att_float' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_real_': fort-attio.c:(.text+0xa4d): undefined reference to `nc_get_att_float' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_double_': fort-attio.c:(.text+0xb6f): undefined reference to `nc_put_att_double' /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_double_': fort-attio.c:(.text+0xc7d): undefined reference to `nc_get_att_double' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_create_': fort-control.c:(.text+0x64): undefined reference to `nc_create' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__create_': fort-control.c:(.text+0x1a0): undefined reference to `nc__create' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_open_': fort-control.c:(.text+0x2b4): undefined reference to `nc_open' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__open_': fort-control.c:(.text+0x3d8): undefined reference to `nc__open' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_set_fill_': fort-control.c:(.text+0x491): undefined reference to `nc_set_fill' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_redef_': fort-control.c:(.text+0x4a4): undefined reference to `nc_redef' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_enddef_': fort-control.c:(.text+0x4b4): undefined reference to `nc_enddef' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__enddef_': fort-control.c:(.text+0x4d0): undefined reference to `nc__enddef' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_sync_': fort-control.c:(.text+0x4e4): undefined reference to `nc_sync' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function 
`nf_close_': fort-control.c:(.text+0x4f4): undefined reference to `nc_close' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_delete_': fort-control.c:(.text+0x55a): undefined reference to `nc_delete' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__create_mp_': fort-control.c:(.text+0x68a): undefined reference to `nc__create_mp' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__open_mp_': fort-control.c:(.text+0x7d0): undefined reference to `nc__open_mp' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_delete_mp_': fort-control.c:(.text+0x8e3): undefined reference to `nc_delete_mp' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_set_base_pe_': fort-control.c:(.text+0x986): undefined reference to `nc_set_base_pe' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_inq_base_pe_': fort-control.c:(.text+0x99f): undefined reference to `nc_inq_base_pe' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_abort_': fort-control.c:(.text+0x9b4): undefined reference to `nc_abort' /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_set_default_format_': fort-control.c:(.text+0x9cf): undefined reference to `nc_set_default_format' /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_def_dim_': fort-dim.c:(.text+0x78): undefined reference to `nc_def_dim' /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimid_': fort-dim.c:(.text+0x19b): undefined reference to `nc_inq_dimid' /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dim_': fort-dim.c:(.text+0x2e5): undefined reference to `nc_inq_dim' /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimname_': fort-dim.c:(.text+0x4bb): undefined reference to `nc_inq_dimname' /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimlen_': fort-dim.c:(.text+0x614): undefined reference to 
`nc_inq_dimlen' /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_rename_dim_': fort-dim.c:(.text+0x695): undefined reference to `nc_rename_dim' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_att_': fort-genatt.c:(.text+0x87): undefined reference to `nc_inq_att' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attid_': fort-genatt.c:(.text+0x19e): undefined reference to `nc_inq_attid' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_atttype_': fort-genatt.c:(.text+0x2ae): undefined reference to `nc_inq_atttype' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attlen_': fort-genatt.c:(.text+0x3be): undefined reference to `nc_inq_attlen' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attname_': fort-genatt.c:(.text+0x4f7): undefined reference to `nc_inq_attname' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_copy_att_': fort-genatt.c:(.text+0x6c8): undefined reference to `nc_copy_att' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_rename_att_': fort-genatt.c:(.text+0x842): undefined reference to `nc_rename_att' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_del_att_': fort-genatt.c:(.text+0x9d5): undefined reference to `nc_del_att' /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_': fort-geninq.c:(.text+0x34): undefined reference to `nc_inq' /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_ndims_': fort-geninq.c:(.text+0x8f): undefined reference to `nc_inq_ndims' /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_nvars_': fort-geninq.c:(.text+0xaf): undefined reference to `nc_inq_nvars' /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_natts_': fort-geninq.c:(.text+0xcf): undefined reference to `nc_inq_natts' /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function 
`nf_inq_unlimdim_': fort-geninq.c:(.text+0xf7): undefined reference to `nc_inq_unlimdim' /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_format_': fort-geninq.c:(.text+0x12f): undefined reference to `nc_inq_format' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_def_var_': fort-genvar.c:(.text+0xa0): undefined reference to `nc_def_var' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_var_': fort-genvar.c:(.text+0x217): undefined reference to `nc_inq_var' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varid_': fort-genvar.c:(.text+0x424): undefined reference to `nc_inq_varid' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varname_': fort-genvar.c:(.text+0x55b): undefined reference to `nc_inq_varname' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_vartype_': fort-genvar.c:(.text+0x6b4): undefined reference to `nc_inq_vartype' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varndims_': fort-genvar.c:(.text+0x6e4): undefined reference to `nc_inq_varndims' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_vardimid_': fort-genvar.c:(.text+0x724): undefined reference to `nc_inq_vardimid' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varnatts_': fort-genvar.c:(.text+0x764): undefined reference to `nc_inq_varnatts' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_rename_var_': fort-genvar.c:(.text+0x7e5): undefined reference to `nc_rename_var' /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_copy_var_': fort-genvar.c:(.text+0x88a): undefined reference to `nc_copy_var' /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `c2f_dimids': fort-lib.c:(.text+0x10): undefined reference to `nc_inq_varndims' /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `c2f_chunksizes': fort-lib.c:(.text+0x130): 
undefined reference to `nc_inq_varndims' /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_chunksizes': fort-lib.c:(.text+0x1d0): undefined reference to `nc_inq_varndims' /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_coords': fort-lib.c:(.text+0x270): undefined reference to `nc_inq_varndims' /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_counts': fort-lib.c:(.text+0x320): undefined reference to `nc_inq_varndims' /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o):fort-lib.c:(.text+0x3d0): more undefined references to `nc_inq_varndims' follow /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_varids_f': fort-lib.c:(.text+0x47c): undefined reference to `nc_inq_varids' fort-lib.c:(.text+0x4b0): undefined reference to `nc_inq_varids' /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_dimids_f': fort-lib.c:(.text+0x5f6): undefined reference to `nc_inq_dimids' fort-lib.c:(.text+0x631): undefined reference to `nc_inq_dimids' /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_insert_array_compound_f': fort-lib.c:(.text+0x81f): undefined reference to `nc_insert_array_compound' /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_compound_field_f': fort-lib.c:(.text+0x8ac): undefined reference to `nc_inq_compound_field' fort-lib.c:(.text+0x8e0): undefined reference to `nc_inq_compound_field' /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function `nf_inq_libvers_': fort-misc.c:(.text+0xa): undefined reference to `nc_inq_libvers' /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function `nf_strerror_': fort-misc.c:(.text+0x16e): undefined reference to `nc_strerror' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_text_': fort-vario.c:(.text+0x8): undefined reference to `nc_put_var_text' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_text_': fort-vario.c:(.text+0x18): undefined reference 
to `nc_get_var_text' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int1_': fort-vario.c:(.text+0x28): undefined reference to `nc_put_var_schar' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int1_': fort-vario.c:(.text+0x38): undefined reference to `nc_get_var_schar' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int2_': fort-vario.c:(.text+0x48): undefined reference to `nc_put_var_short' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int2_': fort-vario.c:(.text+0x58): undefined reference to `nc_get_var_short' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int_': fort-vario.c:(.text+0x68): undefined reference to `nc_put_var_int' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int_': fort-vario.c:(.text+0x78): undefined reference to `nc_get_var_int' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_real_': fort-vario.c:(.text+0x88): undefined reference to `nc_put_var_float' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_real_': fort-vario.c:(.text+0x98): undefined reference to `nc_get_var_float' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_double_': fort-vario.c:(.text+0xa8): undefined reference to `nc_put_var_double' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_double_': fort-vario.c:(.text+0xb8): undefined reference to `nc_get_var_double' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_': fort-vario.c:(.text+0xc8): undefined reference to `nc_put_var' /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_': fort-vario.c:(.text+0xd8): undefined reference to `nc_get_var' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_text_': fort-varaio.c:(.text+0x5c): undefined reference to `nc_put_vara_text' 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_text_': fort-varaio.c:(.text+0xcc): undefined reference to `nc_get_vara_text' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int1_': fort-varaio.c:(.text+0x13c): undefined reference to `nc_put_vara_schar' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int1_': fort-varaio.c:(.text+0x1ac): undefined reference to `nc_get_vara_schar' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int2_': fort-varaio.c:(.text+0x21c): undefined reference to `nc_put_vara_short' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int2_': fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int2_': fort-varaio.c:(.text+0x21c): undefined reference to `nc_put_vara_short' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int2_': fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int_': fort-varaio.c:(.text+0x2fc): undefined reference to `nc_put_vara_int' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int_': fort-varaio.c:(.text+0x36c): undefined reference to `nc_get_vara_int' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_real_': fort-varaio.c:(.text+0x3dc): undefined reference to `nc_put_vara_float' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_real_': fort-varaio.c:(.text+0x44c): undefined reference to `nc_get_vara_float' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_double_': fort-varaio.c:(.text+0x4bc): undefined reference to `nc_put_vara_double' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_double_': 
fort-varaio.c:(.text+0x52c): undefined reference to `nc_get_vara_double' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_': fort-varaio.c:(.text+0x59c): undefined reference to `nc_put_vara' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_': fort-varaio.c:(.text+0x60c): undefined reference to `nc_get_vara' make[2]: [diffwrf] Error 1 (ignored) --------------------------------------------------- What are we doing wrong here? Thanks Andrew Penny Research Associate - Meteorology Department Naval Postgraduate School Monterey, CA 93943 Phone: (831) 656-3101 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110425/ace102f2/attachment-0001.html From Rebecca.Selin.Ctr at offutt.af.mil Mon Apr 25 13:29:56 2011 From: Rebecca.Selin.Ctr at offutt.af.mil (Selin, Rebecca D CTR USAF AFWA 16 WS/WXE) Date: Mon, 25 Apr 2011 14:29:56 -0500 Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 In-Reply-To: <284776D2-1EC0-4B4B-8ADA-0CF171EFF060@hotmail.com> References: <284776D2-1EC0-4B4B-8ADA-0CF171EFF060@hotmail.com> Message-ID: <201104251915.p3PJEvMr015213@sgbp-fwl-001.offutt.af.mil> Andrew - Looks like your netcdf library isn't linking in correctly. In your LIB_EXTERNAL setting, change '-L/path/to/netcdf/lib' to your netcdf path. Becky Adams Selin Atmospheric & Environmental Research, Inc. AFWA 16th Wx Sqdn (402) 294-5273 -----Original Message----- From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Andrew Penny Sent: Monday, April 25, 2011 2:23 PM To: wrf-users at ucar.edu Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 We are having problems getting the latest version of WRF compiled with the latest version of NetCDF. 
fort-varaio.c:(.text+0x52c): undefined reference to `nc_get_vara_double' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_': fort-varaio.c:(.text+0x59c): undefined reference to `nc_put_vara' /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_': fort-varaio.c:(.text+0x60c): undefined reference to `nc_get_vara' make[2]: [diffwrf] Error 1 (ignored) --------------------------------------------------- What are we doing wrong here? Thanks Andrew Penny Research Associate - Meteorology Department Naval Postgraduate School Monterey, CA 93943 Phone: (831) 656-3101 From Rebecca.Selin.Ctr at offutt.af.mil Mon Apr 25 15:11:23 2011 From: Rebecca.Selin.Ctr at offutt.af.mil (Selin, Rebecca D CTR USAF AFWA 16 WS/WXE) Date: Mon, 25 Apr 2011 16:11:23 -0500 Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 In-Reply-To: <5FD3BF59-B2D9-4D36-A84A-80C3929B8BF7@hotmail.com> References: <284776D2-1EC0-4B4B-8ADA-0CF171EFF060@hotmail.com> <201104251915.p3PJEvMr015213@sgbp-fwl-001.offutt.af.mil> <5FD3BF59-B2D9-4D36-A84A-80C3929B8BF7@hotmail.com> Message-ID: <201104252056.p3PKuRqb034355@sgbp-fwl-001.offutt.af.mil> Andrew, Excellent, glad that's not it! However, the problem still is that your netcdf library isn't linking in correctly. I recognize all those undefined references from when my netcdf library didn't link. I can't find in your compilation options below where you define your netcdf path. Could you possibly be linking to a 32-bit version instead of 64-bit? Becky Adams Selin Atmospheric & Environmental Research, Inc. 
AFWA 16th Wx Sqdn (402) 294-5273 -----Original Message----- From: Andrew Penny [mailto:andybpenny at hotmail.com] Sent: Monday, April 25, 2011 4:11 PM To: Selin, Rebecca D CTR USAF AFWA 16 WS/WXE Subject: Re: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 Hi Becky, The '/path/to/netcdf' is not an actual directory - it was meant to be simply a placeholder for the NetCDF install location. It is declared correctly for the LIB_EXTERNAL variable. Andrew Penny Research Associate - Meteorology Department Naval Postgraduate School Monterey, CA 93943 Phone: (831) 656-3101 On Apr 25, 2011, at 12:29 PM, Selin, Rebecca D CTR USAF AFWA 16 WS/WXE wrote: > Andrew - Looks like your netcdf library isn't linking in correctly. In your LIB_EXTERNAL setting, change '-L/path/to/netcdf/lib' to your netcdf path. > > Becky Adams Selin > Atmospheric & Environmental Research, Inc. > AFWA 16th Wx Sqdn (402) 294-5273 > > > -----Original Message----- > From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Andrew Penny > Sent: Monday, April 25, 2011 2:23 PM > To: wrf-users at ucar.edu > Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 > > We are having problems getting the latest version of WRF compiled with the latest version of NetCDF. 
> > Here are the compilation options we are using for NetCDF 4.1.2: > > --------------------------------------------------- > export CC=mpicc > export CXX=mpicxx > > export FC=mpif90 > export F77=mpif77 > export F90=mpif90 > > export CPP='icpc -E' > export CXXCPP='icpc -E' > > export CFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' > export CXXFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' > export CPPFLAGS='-DpgiFortran -I/opt/hdf5/intel/include -I/opt/hdf4/intel/include -I/opt/udunits/intel/include -I/usr/include -I/opt/szip/intel/include -I/usr/mpi/intel/mvapich2-1.6/include' > > export FFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' > export FCFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' > export FCFLAGS_f90='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' > export F90FLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' > export LDFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' > > export LIBS="-L/opt/hdf5/intel/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -L/opt/hdf4/intel/lib -ldf -L/opt/udunits/intel/lib -ludunits2 -L/usr/lib64 -lz -lm -L/opt/szip/intel/lib -lsz" > > ./configure --disable-shared --enable-large-file-tests --enable-fortran --enable-f90 --enable-f77 --disable-netcdf-4 --enable-netcdf4 --enable-parallel --enable-cxx-4 --enable-cxx --enable-hdf4 --enable-utilities --with-libcf --with-zlib=/usr --with-szlib=/opt/szip/intel --with-hdf4=/opt/hdf4/intel --with-hdf5=/opt/hdf5/intel --with-curl-config=/usr/bin --prefix=/path/to/netcdf > --------------------------------------------------- > > > The mpif77/mpicc/mpif90 binaries are compiled using the Intel C++/FORTRAN compilers. 
> > > Here is the "LIB_EXTERNAL" variable setting in the 'configure.wrf' file for compiling WRF version 3.3: > > --------------------------------------------------- > LIB_EXTERNAL = \ > -L$(WRF_SRC_ROOT_DIR)/external/io_netcdf -lwrfio_nf -L/path/to/netcdf/lib -lnetcdff -lnetcdf -L$(WRF_SRC_ROOT_DIR)/external/io_grib2 -lio_grib2 -L/opt/jasper/intel/lib -ljasper -L/opt/hdf4/intel/lib -ldf -lmfhdf -L/opt/hdf5/intel/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -L/opt/g2clib/intel/lib -lgrib2c -L/usr/lib64 -lcurl > --------------------------------------------------- > > Here are the error messages that show up during the './compile em_real' process: > > --------------------------------------------------- > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_text_': > fort-attio.c:(.text+0x7a): undefined reference to `nc_put_att_text' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_text_': > fort-attio.c:(.text+0x18d): undefined reference to `nc_get_att_text' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int1_': > fort-attio.c:(.text+0x2af): undefined reference to `nc_put_att_schar' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int1_': > fort-attio.c:(.text+0x3bd): undefined reference to `nc_get_att_schar' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int2_': > fort-attio.c:(.text+0x4df): undefined reference to `nc_put_att_short' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int2_': > fort-attio.c:(.text+0x5ed): undefined reference to `nc_get_att_short' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int_': > fort-attio.c:(.text+0x70f): undefined reference to `nc_put_att_int' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int_': > fort-attio.c:(.text+0x81d): undefined reference to `nc_get_att_int' > 
> [many further `undefined reference to nc_*' lines snipped here -- the identical linker log appears unquoted earlier in the thread]
/work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_real_': > fort-varaio.c:(.text+0x44c): undefined reference to `nc_get_vara_float' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_double_': > fort-varaio.c:(.text+0x4bc): undefined reference to `nc_put_vara_double' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_double_': > fort-varaio.c:(.text+0x52c): undefined reference to `nc_get_vara_double' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_': > fort-varaio.c:(.text+0x59c): undefined reference to `nc_put_vara' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_': > fort-varaio.c:(.text+0x60c): undefined reference to `nc_get_vara' > make[2]: [diffwrf] Error 1 (ignored) > > --------------------------------------------------- > > > What are we doing wrong here? > > Thanks > > > Andrew Penny > Research Associate - Meteorology Department Naval Postgraduate School Monterey, CA 93943 > Phone: (831) 656-3101 > > > > > From ktyle at atmos.albany.edu Mon Apr 25 19:55:07 2011 From: ktyle at atmos.albany.edu (Kevin R. Tyle) Date: Tue, 26 Apr 2011 01:55:07 +0000 (UTC) Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 In-Reply-To: <201104252056.p3PKuRqb034355@sgbp-fwl-001.offutt.af.mil> References: <284776D2-1EC0-4B4B-8ADA-0CF171EFF060@hotmail.com> <201104251915.p3PJEvMr015213@sgbp-fwl-001.offutt.af.mil> <5FD3BF59-B2D9-4D36-A84A-80C3929B8BF7@hotmail.com> <201104252056.p3PKuRqb034355@sgbp-fwl-001.offutt.af.mil> Message-ID: Hi Andrew and Rebecca, netCDF 4.1.2 now splits off the FORTRAN libraries into libnetcdff.a. I had to manually add "-lnetcdff" into my configure.wrf file in my WRFV3 build directory in order to eliminate the link errors you are seeing. Hope this helps . . . 
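[Editorial aside for archive readers: Kevin's fix can be applied mechanically. The helper below is an illustrative sketch -- the function name and the sample link lines are invented, not from WRF's build system. It ensures `-lnetcdff` appears before `-lnetcdf`, which matters because a single-pass static linker resolves archives left to right, and the nf_* wrappers in libnetcdff.a call nc_* functions defined in libnetcdf.a.]

```shell
# Sketch: make sure the Fortran netCDF library is on the link line, and
# linked before the C library it depends on.
# fix_link_order is a hypothetical helper, not part of WRF or netCDF.
fix_link_order() {
  case "$1" in
    *-lnetcdff*)
      # -lnetcdff already present; leave the link line untouched.
      printf '%s\n' "$1" ;;
    *)
      # Insert -lnetcdff immediately before the first -lnetcdf so the
      # linker can resolve the nc_* symbols libnetcdff.a references.
      printf '%s\n' "$1" | sed 's/-lnetcdf/-lnetcdff -lnetcdf/' ;;
  esac
}

fix_link_order "-L/opt/netcdf/lib -lnetcdf -lcurl"
# -> -L/opt/netcdf/lib -lnetcdff -lnetcdf -lcurl
```

[Applied to a configure.wrf LIB_EXTERNAL line this yields the ordering Andrew already has (`-lnetcdff -lnetcdf`), which is why the remaining suspects in the thread are the library path itself and a 32- vs 64-bit mismatch.]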
--Kevin
______________________________________________________________________
Kevin Tyle, Systems Administrator **********************
Dept. of Atmospheric & Environmental Sciences ktyle at atmos.albany.edu
University at Albany, ES-235 518-442-4578 (voice)
1400 Washington Avenue 518-442-5825 (fax)
Albany, NY 12222 **********************
______________________________________________________________________

On Mon, 25 Apr 2011, Selin, Rebecca D CTR USAF AFWA 16 WS/WXE wrote:

> [Becky's reply, Andrew's response, and Andrew's original message -- including the NetCDF 4.1.2 build options, the LIB_EXTERNAL setting, and the full `undefined reference to nc_*' linker log -- appear verbatim earlier in the thread and are snipped here]
function `nf_inq_dim_': >> fort-dim.c:(.text+0x2e5): undefined reference to `nc_inq_dim' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimname_': >> fort-dim.c:(.text+0x4bb): undefined reference to `nc_inq_dimname' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimlen_': >> fort-dim.c:(.text+0x614): undefined reference to `nc_inq_dimlen' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_rename_dim_': >> fort-dim.c:(.text+0x695): undefined reference to `nc_rename_dim' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_att_': >> fort-genatt.c:(.text+0x87): undefined reference to `nc_inq_att' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attid_': >> fort-genatt.c:(.text+0x19e): undefined reference to `nc_inq_attid' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_atttype_': >> fort-genatt.c:(.text+0x2ae): undefined reference to `nc_inq_atttype' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attlen_': >> fort-genatt.c:(.text+0x3be): undefined reference to `nc_inq_attlen' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attname_': >> fort-genatt.c:(.text+0x4f7): undefined reference to `nc_inq_attname' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_copy_att_': >> fort-genatt.c:(.text+0x6c8): undefined reference to `nc_copy_att' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_rename_att_': >> fort-genatt.c:(.text+0x842): undefined reference to `nc_rename_att' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_del_att_': >> fort-genatt.c:(.text+0x9d5): undefined reference to `nc_del_att' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_': >> fort-geninq.c:(.text+0x34): undefined reference to `nc_inq' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function 
`nf_inq_ndims_': >> fort-geninq.c:(.text+0x8f): undefined reference to `nc_inq_ndims' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_nvars_': >> fort-geninq.c:(.text+0xaf): undefined reference to `nc_inq_nvars' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_natts_': >> fort-geninq.c:(.text+0xcf): undefined reference to `nc_inq_natts' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_unlimdim_': >> fort-geninq.c:(.text+0xf7): undefined reference to `nc_inq_unlimdim' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_format_': >> fort-geninq.c:(.text+0x12f): undefined reference to `nc_inq_format' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_def_var_': >> fort-genvar.c:(.text+0xa0): undefined reference to `nc_def_var' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_var_': >> fort-genvar.c:(.text+0x217): undefined reference to `nc_inq_var' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varid_': >> fort-genvar.c:(.text+0x424): undefined reference to `nc_inq_varid' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varname_': >> fort-genvar.c:(.text+0x55b): undefined reference to `nc_inq_varname' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_vartype_': >> fort-genvar.c:(.text+0x6b4): undefined reference to `nc_inq_vartype' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varndims_': >> fort-genvar.c:(.text+0x6e4): undefined reference to `nc_inq_varndims' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_vardimid_': >> fort-genvar.c:(.text+0x724): undefined reference to `nc_inq_vardimid' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varnatts_': >> fort-genvar.c:(.text+0x764): undefined reference to `nc_inq_varnatts' >> 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_rename_var_': >> fort-genvar.c:(.text+0x7e5): undefined reference to `nc_rename_var' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_copy_var_': >> fort-genvar.c:(.text+0x88a): undefined reference to `nc_copy_var' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `c2f_dimids': >> fort-lib.c:(.text+0x10): undefined reference to `nc_inq_varndims' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `c2f_chunksizes': >> fort-lib.c:(.text+0x130): undefined reference to `nc_inq_varndims' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_chunksizes': >> fort-lib.c:(.text+0x1d0): undefined reference to `nc_inq_varndims' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_coords': >> fort-lib.c:(.text+0x270): undefined reference to `nc_inq_varndims' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_counts': >> fort-lib.c:(.text+0x320): undefined reference to `nc_inq_varndims' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o):fort-lib.c:(.text+0x3d0): more undefined references to `nc_inq_varndims' follow >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_varids_f': >> fort-lib.c:(.text+0x47c): undefined reference to `nc_inq_varids' >> fort-lib.c:(.text+0x4b0): undefined reference to `nc_inq_varids' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_dimids_f': >> fort-lib.c:(.text+0x5f6): undefined reference to `nc_inq_dimids' >> fort-lib.c:(.text+0x631): undefined reference to `nc_inq_dimids' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_insert_array_compound_f': >> fort-lib.c:(.text+0x81f): undefined reference to `nc_insert_array_compound' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_compound_field_f': >> fort-lib.c:(.text+0x8ac): undefined reference to `nc_inq_compound_field' >> 
fort-lib.c:(.text+0x8e0): undefined reference to `nc_inq_compound_field' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function `nf_inq_libvers_': >> fort-misc.c:(.text+0xa): undefined reference to `nc_inq_libvers' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function `nf_strerror_': >> fort-misc.c:(.text+0x16e): undefined reference to `nc_strerror' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_text_': >> fort-vario.c:(.text+0x8): undefined reference to `nc_put_var_text' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_text_': >> fort-vario.c:(.text+0x18): undefined reference to `nc_get_var_text' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int1_': >> fort-vario.c:(.text+0x28): undefined reference to `nc_put_var_schar' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int1_': >> fort-vario.c:(.text+0x38): undefined reference to `nc_get_var_schar' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int2_': >> fort-vario.c:(.text+0x48): undefined reference to `nc_put_var_short' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int2_': >> fort-vario.c:(.text+0x58): undefined reference to `nc_get_var_short' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int_': >> fort-vario.c:(.text+0x68): undefined reference to `nc_put_var_int' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int_': >> fort-vario.c:(.text+0x78): undefined reference to `nc_get_var_int' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_real_': >> fort-vario.c:(.text+0x88): undefined reference to `nc_put_var_float' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_real_': >> fort-vario.c:(.text+0x98): undefined reference to `nc_get_var_float' >> 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_double_': >> fort-vario.c:(.text+0xa8): undefined reference to `nc_put_var_double' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_double_': >> fort-vario.c:(.text+0xb8): undefined reference to `nc_get_var_double' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_': >> fort-vario.c:(.text+0xc8): undefined reference to `nc_put_var' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_': >> fort-vario.c:(.text+0xd8): undefined reference to `nc_get_var' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_text_': >> fort-varaio.c:(.text+0x5c): undefined reference to `nc_put_vara_text' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_text_': >> fort-varaio.c:(.text+0xcc): undefined reference to `nc_get_vara_text' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int1_': >> fort-varaio.c:(.text+0x13c): undefined reference to `nc_put_vara_schar' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int1_': >> fort-varaio.c:(.text+0x1ac): undefined reference to `nc_get_vara_schar' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int2_': >> fort-varaio.c:(.text+0x21c): undefined reference to `nc_put_vara_short' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int2_': >> fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function
`nf_put_vara_int_': >> fort-varaio.c:(.text+0x2fc): undefined reference to `nc_put_vara_int' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int_': >> fort-varaio.c:(.text+0x36c): undefined reference to `nc_get_vara_int' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_real_': >> fort-varaio.c:(.text+0x3dc): undefined reference to `nc_put_vara_float' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_real_': >> fort-varaio.c:(.text+0x44c): undefined reference to `nc_get_vara_float' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_double_': >> fort-varaio.c:(.text+0x4bc): undefined reference to `nc_put_vara_double' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_double_': >> fort-varaio.c:(.text+0x52c): undefined reference to `nc_get_vara_double' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_': >> fort-varaio.c:(.text+0x59c): undefined reference to `nc_put_vara' >> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_': >> fort-varaio.c:(.text+0x60c): undefined reference to `nc_get_vara' >> make[2]: [diffwrf] Error 1 (ignored) >> >> --------------------------------------------------- >> >> >> What are we doing wrong here? 
>> >> Thanks >> >> >> Andrew Penny >> Research Associate - Meteorology Department Naval Postgraduate School Monterey, CA 93943 >> Phone: (831) 656-3101 >> >> >> >> >> > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > From andybpenny at hotmail.com Mon Apr 25 21:19:19 2011 From: andybpenny at hotmail.com (Andrew Penny) Date: Mon, 25 Apr 2011 20:19:19 -0700 Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 In-Reply-To: References: <284776D2-1EC0-4B4B-8ADA-0CF171EFF060@hotmail.com> <201104251915.p3PJEvMr015213@sgbp-fwl-001.offutt.af.mil> <5FD3BF59-B2D9-4D36-A84A-80C3929B8BF7@hotmail.com> <201104252056.p3PKuRqb034355@sgbp-fwl-001.offutt.af.mil> Message-ID: Hi Kevin, Where does that addition need to be made? For WRF version 3.3 (at least), if the "libnetcdff.a" file is present, it gets added automatically to the paths in the 'LIB_EXTERNAL' variable. Does it need to be added to any other variable? Could you send us your NetCDF config script, or the flags/environment variables that you used? Thanks! Andrew On Apr 25, 2011, at 6:55 PM, Kevin R. Tyle wrote: > Hi Andrew and Rebecca, > > netCDF 4.1.2 now splits off the FORTRAN libraries into libnetcdff.a. I > had to manually add "-lnetcdff" into my configure.wrf file in my WRFV3 > build directory in order to eliminate the link errors you are seeing. > > Hope this helps . . . > > --Kevin > > ______________________________________________________________________ > Kevin Tyle, Systems Administrator ********************** > Dept. 
of Atmospheric & Environmental Sciences ktyle at atmos.albany.edu > University at Albany, ES-235 518-442-4578 (voice) > 1400 Washington Avenue 518-442-5825 (fax) > Albany, NY 12222 ********************** > ______________________________________________________________________ > > On Mon, 25 Apr 2011, Selin, Rebecca D CTR USAF AFWA 16 WS/WXE wrote: > >> Andrew, >> >> Excellent, glad that's not it! >> >> However, the problem still is that your netcdf library isn't linking in correctly. I recognize all those undefined references from when my netcdf library didn't link. I can't find in your compilation options below where you define your netcdf path. Could you possibly be linking to a 32-bit version instead of 64-bit? >> >> Becky Adams Selin >> Atmospheric & Environmental Research, Inc. >> AFWA 16th Wx Sqdn (402) 294-5273 >> >> >> -----Original Message----- >> From: Andrew Penny [mailto:andybpenny at hotmail.com] >> Sent: Monday, April 25, 2011 4:11 PM >> To: Selin, Rebecca D CTR USAF AFWA 16 WS/WXE >> Subject: Re: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 >> >> Hi Becky, >> >> The '/path/to/netcdf' is not an actual directory - it was meant to be >> simply a placeholder for the NetCDF install location. It is declared >> correctly for the LIB_EXTERNAL variable. >> >> Andrew Penny >> Research Associate - Meteorology Department >> Naval Postgraduate School >> Monterey, CA 93943 >> Phone: (831) 656-3101 >> >> >> >> >> On Apr 25, 2011, at 12:29 PM, Selin, Rebecca D CTR USAF AFWA 16 WS/WXE wrote: >> >>> Andrew - Looks like your netcdf library isn't linking in correctly. In your LIB_EXTERNAL setting, change '-L/path/to/netcdf/lib' to your netcdf path. >>> >>> Becky Adams Selin >>> Atmospheric & Environmental Research, Inc. 
>>> AFWA 16th Wx Sqdn (402) 294-5273
>>>
>>>
>>> -----Original Message-----
>>> From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Andrew Penny
>>> Sent: Monday, April 25, 2011 2:23 PM
>>> To: wrf-users at ucar.edu
>>> Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12
>>>
>>> [Quoted text trimmed: the rest of this quote repeats verbatim the NetCDF 4.1.2 build options, the configure.wrf LIB_EXTERNAL setting, and the undefined-reference linker errors from the original post earlier in this thread.]
fort-vario.c:(.text+0x88): undefined reference to `nc_put_var_float' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_real_': >>> fort-vario.c:(.text+0x98): undefined reference to `nc_get_var_float' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_double_': >>> fort-vario.c:(.text+0xa8): undefined reference to `nc_put_var_double' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_double_': >>> fort-vario.c:(.text+0xb8): undefined reference to `nc_get_var_double' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_': >>> fort-vario.c:(.text+0xc8): undefined reference to `nc_put_var' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_': >>> fort-vario.c:(.text+0xd8): undefined reference to `nc_get_var' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_text_': >>> fort-varaio.c:(.text+0x5c): undefined reference to `nc_put_vara_text' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_text_': >>> fort-varaio.c:(.text+0xcc): undefined reference to `nc_get_vara_text' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int1_': >>> fort-varaio.c:(.text+0x13c): undefined reference to `nc_put_vara_schar' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int1_': >>> fort-varaio.c:(.text+0x1ac): undefined reference to `nc_get_vara_schar' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int2_': >>> fort-varaio.c:(.text+0x21c): undefined reference to `nc_put_vara_short' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int2_': >>> fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int2_': >>> fort-varaio.c:(.text+0x21c): undefined reference to 
`nc_put_vara_short' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int2_': >>> fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int_': >>> fort-varaio.c:(.text+0x2fc): undefined reference to `nc_put_vara_int' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int_': >>> fort-varaio.c:(.text+0x36c): undefined reference to `nc_get_vara_int' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_real_': >>> fort-varaio.c:(.text+0x3dc): undefined reference to `nc_put_vara_float' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_real_': >>> fort-varaio.c:(.text+0x44c): undefined reference to `nc_get_vara_float' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_double_': >>> fort-varaio.c:(.text+0x4bc): undefined reference to `nc_put_vara_double' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_double_': >>> fort-varaio.c:(.text+0x52c): undefined reference to `nc_get_vara_double' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_': >>> fort-varaio.c:(.text+0x59c): undefined reference to `nc_put_vara' >>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_': >>> fort-varaio.c:(.text+0x60c): undefined reference to `nc_get_vara' >>> make[2]: [diffwrf] Error 1 (ignored) >>> >>> --------------------------------------------------- >>> >>> >>> What are we doing wrong here? 
>>>
>>> Thanks
>>>
>>>
>>> Andrew Penny
>>> Research Associate - Meteorology Department Naval Postgraduate School Monterey, CA 93943
>>> Phone: (831) 656-3101
>>>
>>
>> _______________________________________________
>> Wrf-users mailing list
>> Wrf-users at ucar.edu
>> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>
From davidstephenbryan at yahoo.com Tue Apr 26 10:11:39 2011
From: davidstephenbryan at yahoo.com (David Bryan)
Date: Tue, 26 Apr 2011 09:11:39 -0700 (PDT)
Subject: [Wrf-users] storing location info in geo files
Message-ID: <193486.62186.qm@web65901.mail.ac4.yahoo.com>

I'm trying to understand how WRF geo files store location information. For instance, I recently made a 49x49 geo file and opened it in Panoply. I looked at CLAT (and then CLONG) under the Array tab (numeric values, not visualized). What I found was a 48x48 array of lat-longs. My questions:

1. Do WRF's geo files store location information (lat-longs) at the midpoint between grid points?

2. This reminded me of the illustration of the coarse grid in the "Nesting in WRF" document (p. 13 here: http://www.mmm.ucar.edu/wrf/users/tutorial/201101/WRFNesting.pdf). In the illustration, the wind speed components, u & v, were stored on the grid points (ie, at the perimeter of each "cell" assuming a 5x5 grid), while temperature, T (or theta), was stored at the midpoint between grid points (ie, in the center of each "cell"). If that illustration is accurate (and my understanding of it correct), does that mean that each T value is associated with a CLAT-CLONG pair? Is there one less row and one less column of T data per grid (ie, in my case 48x48)?

Thanks!
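The arrangement described in that question is WRF's Arakawa C grid: u and v live on cell edges (the staggered points), while mass-point fields such as T, CLAT, and CLONG live at cell centers, so an axis declared with 49 points carries 48 mass points. A minimal Python sketch of that midpoint relationship, using made-up latitude values rather than data read from an actual geo file:

```python
# Sketch of Arakawa-C staggering along one axis (hypothetical values,
# not read from a geo_em file): 49 staggered (edge) coordinates
# produce 48 mass-point (center) coordinates.

def mass_point_coords(edge_coords):
    """Midpoints between consecutive staggered (edge) coordinates."""
    return [(a + b) / 2.0 for a, b in zip(edge_coords, edge_coords[1:])]

# 49 hypothetical staggered latitudes, spaced 0.1 degrees apart
lat_stag = [30.0 + 0.1 * i for i in range(49)]
clat = mass_point_coords(lat_stag)

print(len(lat_stag), len(clat))  # 49 edges give 48 mass points
print(clat[0])                   # midpoint of the first cell, ~30.05
```

Under this picture, each T value lines up with one CLAT/CLONG pair, which is consistent with Panoply showing 48x48 CLAT and CLONG arrays for a 49x49 domain.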
From J.Kala at murdoch.edu.au Mon Apr 25 19:16:01 2011
From: J.Kala at murdoch.edu.au (Jatin Kala)
Date: Tue, 26 Apr 2011 09:16:01 +0800
Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12
References: <284776D2-1EC0-4B4B-8ADA-0CF171EFF060@hotmail.com><201104251915.p3PJEvMr015213@sgbp-fwl-001.offutt.af.mil><5FD3BF59-B2D9-4D36-A84A-80C3929B8BF7@hotmail.com> <201104252056.p3PKuRqb034355@sgbp-fwl-001.offutt.af.mil>
Message-ID:

Hi, if you still have trouble once you have linked the netcdf lib properly, you may consider using the 4.0.1 version rather than the 4.1.x versions. We had some issues with 4.1.1 which were solved by using 4.0.1 instead. This may or may not apply to you, but it is worth bearing in mind.

cheers,
jatin

-----Original Message-----
From: wrf-users-bounces at ucar.edu on behalf of Selin, Rebecca D CTR USAF AFWA 16 WS/WXE
Sent: Tue 4/26/2011 5:11 AM
To: 'Andrew Penny'
Cc: 'wrf-users at ucar.edu'
Subject: Re: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12

Andrew,

Excellent, glad that's not it!

However, the problem still is that your netcdf library isn't linking in correctly. I recognize all those undefined references from when my netcdf library didn't link. I can't find in your compilation options below where you define your netcdf path. Could you possibly be linking to a 32-bit version instead of 64-bit?

Becky Adams Selin
Atmospheric & Environmental Research, Inc.
AFWA 16th Wx Sqdn (402) 294-5273

-----Original Message-----
From: Andrew Penny [mailto:andybpenny at hotmail.com]
Sent: Monday, April 25, 2011 4:11 PM
To: Selin, Rebecca D CTR USAF AFWA 16 WS/WXE
Subject: Re: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12

Hi Becky,

The '/path/to/netcdf' is not an actual directory - it was meant to be simply a placeholder for the NetCDF install location. It is declared correctly for the LIB_EXTERNAL variable.
Andrew Penny Research Associate - Meteorology Department Naval Postgraduate School Monterey, CA 93943 Phone: (831) 656-3101 On Apr 25, 2011, at 12:29 PM, Selin, Rebecca D CTR USAF AFWA 16 WS/WXE wrote: > Andrew - Looks like your netcdf library isn't linking in correctly. In your LIB_EXTERNAL setting, change '-L/path/to/netcdf/lib' to your netcdf path. > > Becky Adams Selin > Atmospheric & Environmental Research, Inc. > AFWA 16th Wx Sqdn (402) 294-5273 > > > -----Original Message----- > From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Andrew Penny > Sent: Monday, April 25, 2011 2:23 PM > To: wrf-users at ucar.edu > Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 > > We are having problems getting the latest version of WRF compiled with the latest version of NetCDF. > > Here are the compilation options we are using for NetCDF 4.1.2: > > --------------------------------------------------- > export CC=mpicc > export CXX=mpicxx > > export FC=mpif90 > export F77=mpif77 > export F90=mpif90 > > export CPP='icpc -E' > export CXXCPP='icpc -E' > > export CFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' > export CXXFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' > export CPPFLAGS='-DpgiFortran -I/opt/hdf5/intel/include -I/opt/hdf4/intel/include -I/opt/udunits/intel/include -I/usr/include -I/opt/szip/intel/include -I/usr/mpi/intel/mvapich2-1.6/include' > > export FFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' > export FCFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' > export FCFLAGS_f90='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' > export F90FLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' > export LDFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' > > export LIBS="-L/opt/hdf5/intel/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -L/opt/hdf4/intel/lib -ldf 
-L/opt/udunits/intel/lib -ludunits2 -L/usr/lib64 -lz -lm -L/opt/szip/intel/lib -lsz" > > ./configure --disable-shared --enable-large-file-tests --enable-fortran --enable-f90 --enable-f77 --disable-netcdf-4 --enable-netcdf4 --enable-parallel --enable-cxx-4 --enable-cxx --enable-hdf4 --enable-utilities --with-libcf --with-zlib=/usr --with-szlib=/opt/szip/intel --with-hdf4=/opt/hdf4/intel --with-hdf5=/opt/hdf5/intel --with-curl-config=/usr/bin --prefix=/path/to/netcdf > --------------------------------------------------- > > > The mpif77/mpicc/mpif90 binaries are compiled using the Intel C++/FORTRAN compilers. > > > Here is the "LIB_EXTERNAL" variable setting in the 'configure.wrf' file for compiling WRF version 3.3: > > --------------------------------------------------- > LIB_EXTERNAL = \ > -L$(WRF_SRC_ROOT_DIR)/external/io_netcdf -lwrfio_nf -L/path/to/netcdf/lib -lnetcdff -lnetcdf -L$(WRF_SRC_ROOT_DIR)/external/io_grib2 -lio_grib2 -L/opt/jasper/intel/lib -ljasper -L/opt/hdf4/intel/lib -ldf -lmfhdf -L/opt/hdf5/intel/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -L/opt/g2clib/intel/lib -lgrib2c -L/usr/lib64 -lcurl > --------------------------------------------------- > > Here are the error messages that show up during the './compile em_real' process: > > --------------------------------------------------- > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_text_': > fort-attio.c:(.text+0x7a): undefined reference to `nc_put_att_text' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_text_': > fort-attio.c:(.text+0x18d): undefined reference to `nc_get_att_text' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int1_': > fort-attio.c:(.text+0x2af): undefined reference to `nc_put_att_schar' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int1_': > fort-attio.c:(.text+0x3bd): undefined reference to `nc_get_att_schar' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): 
In function `nf_put_att_int2_': > fort-attio.c:(.text+0x4df): undefined reference to `nc_put_att_short' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int2_': > fort-attio.c:(.text+0x5ed): undefined reference to `nc_get_att_short' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int_': > fort-attio.c:(.text+0x70f): undefined reference to `nc_put_att_int' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int_': > fort-attio.c:(.text+0x81d): undefined reference to `nc_get_att_int' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_real_': > fort-attio.c:(.text+0x93f): undefined reference to `nc_put_att_float' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_real_': > fort-attio.c:(.text+0xa4d): undefined reference to `nc_get_att_float' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_double_': > fort-attio.c:(.text+0xb6f): undefined reference to `nc_put_att_double' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_double_': > fort-attio.c:(.text+0xc7d): undefined reference to `nc_get_att_double' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_create_': > fort-control.c:(.text+0x64): undefined reference to `nc_create' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__create_': > fort-control.c:(.text+0x1a0): undefined reference to `nc__create' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_open_': > fort-control.c:(.text+0x2b4): undefined reference to `nc_open' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__open_': > fort-control.c:(.text+0x3d8): undefined reference to `nc__open' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_set_fill_': > fort-control.c:(.text+0x491): undefined reference to `nc_set_fill' > 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_redef_': > fort-control.c:(.text+0x4a4): undefined reference to `nc_redef' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_enddef_': > fort-control.c:(.text+0x4b4): undefined reference to `nc_enddef' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__enddef_': > fort-control.c:(.text+0x4d0): undefined reference to `nc__enddef' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_sync_': > fort-control.c:(.text+0x4e4): undefined reference to `nc_sync' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_close_': > fort-control.c:(.text+0x4f4): undefined reference to `nc_close' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_delete_': > fort-control.c:(.text+0x55a): undefined reference to `nc_delete' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__create_mp_': > fort-control.c:(.text+0x68a): undefined reference to `nc__create_mp' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__open_mp_': > fort-control.c:(.text+0x7d0): undefined reference to `nc__open_mp' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_delete_mp_': > fort-control.c:(.text+0x8e3): undefined reference to `nc_delete_mp' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_set_base_pe_': > fort-control.c:(.text+0x986): undefined reference to `nc_set_base_pe' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_inq_base_pe_': > fort-control.c:(.text+0x99f): undefined reference to `nc_inq_base_pe' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_abort_': > fort-control.c:(.text+0x9b4): undefined reference to `nc_abort' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_set_default_format_': > fort-control.c:(.text+0x9cf): undefined reference to `nc_set_default_format' > 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_def_dim_': > fort-dim.c:(.text+0x78): undefined reference to `nc_def_dim' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimid_': > fort-dim.c:(.text+0x19b): undefined reference to `nc_inq_dimid' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dim_': > fort-dim.c:(.text+0x2e5): undefined reference to `nc_inq_dim' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimname_': > fort-dim.c:(.text+0x4bb): undefined reference to `nc_inq_dimname' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimlen_': > fort-dim.c:(.text+0x614): undefined reference to `nc_inq_dimlen' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_rename_dim_': > fort-dim.c:(.text+0x695): undefined reference to `nc_rename_dim' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_att_': > fort-genatt.c:(.text+0x87): undefined reference to `nc_inq_att' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attid_': > fort-genatt.c:(.text+0x19e): undefined reference to `nc_inq_attid' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_atttype_': > fort-genatt.c:(.text+0x2ae): undefined reference to `nc_inq_atttype' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attlen_': > fort-genatt.c:(.text+0x3be): undefined reference to `nc_inq_attlen' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attname_': > fort-genatt.c:(.text+0x4f7): undefined reference to `nc_inq_attname' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_copy_att_': > fort-genatt.c:(.text+0x6c8): undefined reference to `nc_copy_att' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_rename_att_': > fort-genatt.c:(.text+0x842): undefined reference to `nc_rename_att' > 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_del_att_': > fort-genatt.c:(.text+0x9d5): undefined reference to `nc_del_att' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_': > fort-geninq.c:(.text+0x34): undefined reference to `nc_inq' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_ndims_': > fort-geninq.c:(.text+0x8f): undefined reference to `nc_inq_ndims' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_nvars_': > fort-geninq.c:(.text+0xaf): undefined reference to `nc_inq_nvars' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_natts_': > fort-geninq.c:(.text+0xcf): undefined reference to `nc_inq_natts' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_unlimdim_': > fort-geninq.c:(.text+0xf7): undefined reference to `nc_inq_unlimdim' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_format_': > fort-geninq.c:(.text+0x12f): undefined reference to `nc_inq_format' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_def_var_': > fort-genvar.c:(.text+0xa0): undefined reference to `nc_def_var' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_var_': > fort-genvar.c:(.text+0x217): undefined reference to `nc_inq_var' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varid_': > fort-genvar.c:(.text+0x424): undefined reference to `nc_inq_varid' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varname_': > fort-genvar.c:(.text+0x55b): undefined reference to `nc_inq_varname' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_vartype_': > fort-genvar.c:(.text+0x6b4): undefined reference to `nc_inq_vartype' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varndims_': > fort-genvar.c:(.text+0x6e4): undefined reference to `nc_inq_varndims' > 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_vardimid_': > fort-genvar.c:(.text+0x724): undefined reference to `nc_inq_vardimid' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varnatts_': > fort-genvar.c:(.text+0x764): undefined reference to `nc_inq_varnatts' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_rename_var_': > fort-genvar.c:(.text+0x7e5): undefined reference to `nc_rename_var' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_copy_var_': > fort-genvar.c:(.text+0x88a): undefined reference to `nc_copy_var' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `c2f_dimids': > fort-lib.c:(.text+0x10): undefined reference to `nc_inq_varndims' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `c2f_chunksizes': > fort-lib.c:(.text+0x130): undefined reference to `nc_inq_varndims' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_chunksizes': > fort-lib.c:(.text+0x1d0): undefined reference to `nc_inq_varndims' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_coords': > fort-lib.c:(.text+0x270): undefined reference to `nc_inq_varndims' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_counts': > fort-lib.c:(.text+0x320): undefined reference to `nc_inq_varndims' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o):fort-lib.c:(.text+0x3d0): more undefined references to `nc_inq_varndims' follow > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_varids_f': > fort-lib.c:(.text+0x47c): undefined reference to `nc_inq_varids' > fort-lib.c:(.text+0x4b0): undefined reference to `nc_inq_varids' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_dimids_f': > fort-lib.c:(.text+0x5f6): undefined reference to `nc_inq_dimids' > fort-lib.c:(.text+0x631): undefined reference to `nc_inq_dimids' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function 
`nc_insert_array_compound_f': > fort-lib.c:(.text+0x81f): undefined reference to `nc_insert_array_compound' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_compound_field_f': > fort-lib.c:(.text+0x8ac): undefined reference to `nc_inq_compound_field' > fort-lib.c:(.text+0x8e0): undefined reference to `nc_inq_compound_field' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function `nf_inq_libvers_': > fort-misc.c:(.text+0xa): undefined reference to `nc_inq_libvers' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function `nf_strerror_': > fort-misc.c:(.text+0x16e): undefined reference to `nc_strerror' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_text_': > fort-vario.c:(.text+0x8): undefined reference to `nc_put_var_text' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_text_': > fort-vario.c:(.text+0x18): undefined reference to `nc_get_var_text' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int1_': > fort-vario.c:(.text+0x28): undefined reference to `nc_put_var_schar' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int1_': > fort-vario.c:(.text+0x38): undefined reference to `nc_get_var_schar' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int2_': > fort-vario.c:(.text+0x48): undefined reference to `nc_put_var_short' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int2_': > fort-vario.c:(.text+0x58): undefined reference to `nc_get_var_short' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int_': > fort-vario.c:(.text+0x68): undefined reference to `nc_put_var_int' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int_': > fort-vario.c:(.text+0x78): undefined reference to `nc_get_var_int' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_real_': > 
fort-vario.c:(.text+0x88): undefined reference to `nc_put_var_float' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_real_': > fort-vario.c:(.text+0x98): undefined reference to `nc_get_var_float' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_double_': > fort-vario.c:(.text+0xa8): undefined reference to `nc_put_var_double' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_double_': > fort-vario.c:(.text+0xb8): undefined reference to `nc_get_var_double' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_': > fort-vario.c:(.text+0xc8): undefined reference to `nc_put_var' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_': > fort-vario.c:(.text+0xd8): undefined reference to `nc_get_var' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_text_': > fort-varaio.c:(.text+0x5c): undefined reference to `nc_put_vara_text' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_text_': > fort-varaio.c:(.text+0xcc): undefined reference to `nc_get_vara_text' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int1_': > fort-varaio.c:(.text+0x13c): undefined reference to `nc_put_vara_schar' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int1_': > fort-varaio.c:(.text+0x1ac): undefined reference to `nc_get_vara_schar' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int2_': > fort-varaio.c:(.text+0x21c): undefined reference to `nc_put_vara_short' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int2_': > fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int2_': > fort-varaio.c:(.text+0x21c): undefined reference to `nc_put_vara_short' > 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int2_': > fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int_': > fort-varaio.c:(.text+0x2fc): undefined reference to `nc_put_vara_int' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int_': > fort-varaio.c:(.text+0x36c): undefined reference to `nc_get_vara_int' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_real_': > fort-varaio.c:(.text+0x3dc): undefined reference to `nc_put_vara_float' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_real_': > fort-varaio.c:(.text+0x44c): undefined reference to `nc_get_vara_float' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_double_': > fort-varaio.c:(.text+0x4bc): undefined reference to `nc_put_vara_double' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_double_': > fort-varaio.c:(.text+0x52c): undefined reference to `nc_get_vara_double' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_': > fort-varaio.c:(.text+0x59c): undefined reference to `nc_put_vara' > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_': > fort-varaio.c:(.text+0x60c): undefined reference to `nc_get_vara' > make[2]: [diffwrf] Error 1 (ignored) > > --------------------------------------------------- > > > What are we doing wrong here? > > Thanks > > > Andrew Penny > Research Associate - Meteorology Department Naval Postgraduate School Monterey, CA 93943 > Phone: (831) 656-3101 > > > > > _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110426/518e8d9f/attachment-0001.html
From saji at u-aizu.ac.jp Mon Apr 25 17:45:54 2011
From: saji at u-aizu.ac.jp (Saji Hameed)
Date: Tue, 26 Apr 2011 08:45:54 +0900
Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12
In-Reply-To: <201104252056.p3PKuRqb034355@sgbp-fwl-001.offutt.af.mil>
References: <284776D2-1EC0-4B4B-8ADA-0CF171EFF060@hotmail.com> <201104251915.p3PJEvMr015213@sgbp-fwl-001.offutt.af.mil> <5FD3BF59-B2D9-4D36-A84A-80C3929B8BF7@hotmail.com> <201104252056.p3PKuRqb034355@sgbp-fwl-001.offutt.af.mil>
Message-ID:

I think it is a problem peculiar to netcdf4. We also had the same problem and hastily switched back to netcdf 3 to solve the issue. If you google for this, you may find more info about this problem (there is some discussion on the pgfortran groups regarding this...)

saji

On Tue, Apr 26, 2011 at 6:11 AM, Selin, Rebecca D CTR USAF AFWA 16 WS/WXE <Rebecca.Selin.Ctr at offutt.af.mil> wrote:

> Andrew,
>
> Excellent, glad that's not it!
>
> However, the problem still is that your netcdf library isn't linking in
> correctly. I recognize all those undefined references from when my netcdf
> library didn't link. I can't find in your compilation options below where
> you define your netcdf path. Could you possibly be linking to a 32-bit
> version instead of 64-bit?
>
> Becky Adams Selin
> Atmospheric & Environmental Research, Inc.
> AFWA 16th Wx Sqdn (402) 294-5273
>
> -----Original Message-----
> From: Andrew Penny [mailto:andybpenny at hotmail.com]
> Sent: Monday, April 25, 2011 4:11 PM
> To: Selin, Rebecca D CTR USAF AFWA 16 WS/WXE
> Subject: Re: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF
> V4.12
>
> Hi Becky,
>
> The '/path/to/netcdf' is not an actual directory - it was meant to be
> simply a placeholder for the NetCDF install location. It is declared
> correctly for the LIB_EXTERNAL variable.
> > Andrew Penny > Research Associate - Meteorology Department > Naval Postgraduate School > Monterey, CA 93943 > Phone: (831) 656-3101 > > > > > On Apr 25, 2011, at 12:29 PM, Selin, Rebecca D CTR USAF AFWA 16 WS/WXE > wrote: > > > Andrew - Looks like your netcdf library isn't linking in correctly. In > your LIB_EXTERNAL setting, change '-L/path/to/netcdf/lib' to your netcdf > path. > > > > Becky Adams Selin > > Atmospheric & Environmental Research, Inc. > > AFWA 16th Wx Sqdn (402) 294-5273 > > > > > > -----Original Message----- > > From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On > Behalf Of Andrew Penny > > Sent: Monday, April 25, 2011 2:23 PM > > To: wrf-users at ucar.edu > > Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF > V4.12 > > > > We are having problems getting the latest version of WRF compiled with > the latest version of NetCDF. > > > > Here are the compilation options we are using for NetCDF 4.1.2: > > > > --------------------------------------------------- > > export CC=mpicc > > export CXX=mpicxx > > > > export FC=mpif90 > > export F77=mpif77 > > export F90=mpif90 > > > > export CPP='icpc -E' > > export CXXCPP='icpc -E' > > > > export CFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' > > export CXXFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' > > export CPPFLAGS='-DpgiFortran -I/opt/hdf5/intel/include > -I/opt/hdf4/intel/include -I/opt/udunits/intel/include -I/usr/include > -I/opt/szip/intel/include -I/usr/mpi/intel/mvapich2-1.6/include' > > > > export FFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC > -assume nounderscore' > > export FCFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC > -assume nounderscore' > > export FCFLAGS_f90='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC > -assume nounderscore' > > export F90FLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC > -assume nounderscore' > > export LDFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 
-fPIC' > > > > export LIBS="-L/opt/hdf5/intel/lib -lhdf5_fortran -lhdf5_hl -lhdf5 > -L/opt/hdf4/intel/lib -ldf -L/opt/udunits/intel/lib -ludunits2 -L/usr/lib64 > -lz -lm -L/opt/szip/intel/lib -lsz" > > > > ./configure --disable-shared --enable-large-file-tests --enable-fortran > --enable-f90 --enable-f77 --disable-netcdf-4 --enable-netcdf4 > --enable-parallel --enable-cxx-4 --enable-cxx --enable-hdf4 > --enable-utilities --with-libcf --with-zlib=/usr > --with-szlib=/opt/szip/intel --with-hdf4=/opt/hdf4/intel > --with-hdf5=/opt/hdf5/intel --with-curl-config=/usr/bin > --prefix=/path/to/netcdf > > --------------------------------------------------- > > > > > > The mpif77/mpicc/mpif90 binaries are compiled using the Intel C++/FORTRAN > compilers. > > > > > > Here is the "LIB_EXTERNAL" variable setting in the 'configure.wrf' file > for compiling WRF version 3.3: > > > > --------------------------------------------------- > > LIB_EXTERNAL = \ > > -L$(WRF_SRC_ROOT_DIR)/external/io_netcdf -lwrfio_nf > -L/path/to/netcdf/lib -lnetcdff -lnetcdf > -L$(WRF_SRC_ROOT_DIR)/external/io_grib2 -lio_grib2 -L/opt/jasper/intel/lib > -ljasper -L/opt/hdf4/intel/lib -ldf -lmfhdf -L/opt/hdf5/intel/lib > -lhdf5_fortran -lhdf5_hl -lhdf5 -L/opt/g2clib/intel/lib -lgrib2c > -L/usr/lib64 -lcurl > > --------------------------------------------------- > > > > Here are the error messages that show up during the './compile em_real' > process: > > > > --------------------------------------------------- > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_put_att_text_': > > fort-attio.c:(.text+0x7a): undefined reference to `nc_put_att_text' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_get_att_text_': > > fort-attio.c:(.text+0x18d): undefined reference to `nc_get_att_text' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_put_att_int1_': > > fort-attio.c:(.text+0x2af): undefined reference to `nc_put_att_schar' > > 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_get_att_int1_': > > fort-attio.c:(.text+0x3bd): undefined reference to `nc_get_att_schar' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_put_att_int2_': > > fort-attio.c:(.text+0x4df): undefined reference to `nc_put_att_short' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_get_att_int2_': > > fort-attio.c:(.text+0x5ed): undefined reference to `nc_get_att_short' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_put_att_int_': > > fort-attio.c:(.text+0x70f): undefined reference to `nc_put_att_int' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_get_att_int_': > > fort-attio.c:(.text+0x81d): undefined reference to `nc_get_att_int' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_put_att_real_': > > fort-attio.c:(.text+0x93f): undefined reference to `nc_put_att_float' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_get_att_real_': > > fort-attio.c:(.text+0xa4d): undefined reference to `nc_get_att_float' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_put_att_double_': > > fort-attio.c:(.text+0xb6f): undefined reference to `nc_put_att_double' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function > `nf_get_att_double_': > > fort-attio.c:(.text+0xc7d): undefined reference to `nc_get_att_double' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_create_': > > fort-control.c:(.text+0x64): undefined reference to `nc_create' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf__create_': > > fort-control.c:(.text+0x1a0): undefined reference to `nc__create' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_open_': > > fort-control.c:(.text+0x2b4): undefined reference to `nc_open' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In 
function > `nf__open_': > > fort-control.c:(.text+0x3d8): undefined reference to `nc__open' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_set_fill_': > > fort-control.c:(.text+0x491): undefined reference to `nc_set_fill' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_redef_': > > fort-control.c:(.text+0x4a4): undefined reference to `nc_redef' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_enddef_': > > fort-control.c:(.text+0x4b4): undefined reference to `nc_enddef' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf__enddef_': > > fort-control.c:(.text+0x4d0): undefined reference to `nc__enddef' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_sync_': > > fort-control.c:(.text+0x4e4): undefined reference to `nc_sync' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_close_': > > fort-control.c:(.text+0x4f4): undefined reference to `nc_close' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_delete_': > > fort-control.c:(.text+0x55a): undefined reference to `nc_delete' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf__create_mp_': > > fort-control.c:(.text+0x68a): undefined reference to `nc__create_mp' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf__open_mp_': > > fort-control.c:(.text+0x7d0): undefined reference to `nc__open_mp' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_delete_mp_': > > fort-control.c:(.text+0x8e3): undefined reference to `nc_delete_mp' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_set_base_pe_': > > fort-control.c:(.text+0x986): undefined reference to `nc_set_base_pe' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_inq_base_pe_': > > fort-control.c:(.text+0x99f): undefined reference to `nc_inq_base_pe' > > 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_abort_': > > fort-control.c:(.text+0x9b4): undefined reference to `nc_abort' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function > `nf_set_default_format_': > > fort-control.c:(.text+0x9cf): undefined reference to > `nc_set_default_format' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function > `nf_def_dim_': > > fort-dim.c:(.text+0x78): undefined reference to `nc_def_dim' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function > `nf_inq_dimid_': > > fort-dim.c:(.text+0x19b): undefined reference to `nc_inq_dimid' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function > `nf_inq_dim_': > > fort-dim.c:(.text+0x2e5): undefined reference to `nc_inq_dim' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function > `nf_inq_dimname_': > > fort-dim.c:(.text+0x4bb): undefined reference to `nc_inq_dimname' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function > `nf_inq_dimlen_': > > fort-dim.c:(.text+0x614): undefined reference to `nc_inq_dimlen' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function > `nf_rename_dim_': > > fort-dim.c:(.text+0x695): undefined reference to `nc_rename_dim' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function > `nf_inq_att_': > > fort-genatt.c:(.text+0x87): undefined reference to `nc_inq_att' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function > `nf_inq_attid_': > > fort-genatt.c:(.text+0x19e): undefined reference to `nc_inq_attid' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function > `nf_inq_atttype_': > > fort-genatt.c:(.text+0x2ae): undefined reference to `nc_inq_atttype' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function > `nf_inq_attlen_': > > fort-genatt.c:(.text+0x3be): undefined reference to `nc_inq_attlen' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function > `nf_inq_attname_': > > 
fort-genatt.c:(.text+0x4f7): undefined reference to `nc_inq_attname' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function > `nf_copy_att_': > > fort-genatt.c:(.text+0x6c8): undefined reference to `nc_copy_att' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function > `nf_rename_att_': > > fort-genatt.c:(.text+0x842): undefined reference to `nc_rename_att' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function > `nf_del_att_': > > fort-genatt.c:(.text+0x9d5): undefined reference to `nc_del_att' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function > `nf_inq_': > > fort-geninq.c:(.text+0x34): undefined reference to `nc_inq' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function > `nf_inq_ndims_': > > fort-geninq.c:(.text+0x8f): undefined reference to `nc_inq_ndims' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function > `nf_inq_nvars_': > > fort-geninq.c:(.text+0xaf): undefined reference to `nc_inq_nvars' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function > `nf_inq_natts_': > > fort-geninq.c:(.text+0xcf): undefined reference to `nc_inq_natts' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function > `nf_inq_unlimdim_': > > fort-geninq.c:(.text+0xf7): undefined reference to `nc_inq_unlimdim' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function > `nf_inq_format_': > > fort-geninq.c:(.text+0x12f): undefined reference to `nc_inq_format' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_def_var_': > > fort-genvar.c:(.text+0xa0): undefined reference to `nc_def_var' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_inq_var_': > > fort-genvar.c:(.text+0x217): undefined reference to `nc_inq_var' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_inq_varid_': > > fort-genvar.c:(.text+0x424): undefined reference to `nc_inq_varid' > > 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_inq_varname_': > > fort-genvar.c:(.text+0x55b): undefined reference to `nc_inq_varname' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_inq_vartype_': > > fort-genvar.c:(.text+0x6b4): undefined reference to `nc_inq_vartype' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_inq_varndims_': > > fort-genvar.c:(.text+0x6e4): undefined reference to `nc_inq_varndims' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_inq_vardimid_': > > fort-genvar.c:(.text+0x724): undefined reference to `nc_inq_vardimid' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_inq_varnatts_': > > fort-genvar.c:(.text+0x764): undefined reference to `nc_inq_varnatts' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_rename_var_': > > fort-genvar.c:(.text+0x7e5): undefined reference to `nc_rename_var' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function > `nf_copy_var_': > > fort-genvar.c:(.text+0x88a): undefined reference to `nc_copy_var' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function > `c2f_dimids': > > fort-lib.c:(.text+0x10): undefined reference to `nc_inq_varndims' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function > `c2f_chunksizes': > > fort-lib.c:(.text+0x130): undefined reference to `nc_inq_varndims' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function > `f2c_chunksizes': > > fort-lib.c:(.text+0x1d0): undefined reference to `nc_inq_varndims' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function > `f2c_coords': > > fort-lib.c:(.text+0x270): undefined reference to `nc_inq_varndims' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function > `f2c_counts': > > fort-lib.c:(.text+0x320): undefined reference to `nc_inq_varndims' > > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o):fort-lib.c:(.text+0x3d0): > more 
undefined references to `nc_inq_varndims' follow > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function > `nc_inq_varids_f': > > fort-lib.c:(.text+0x47c): undefined reference to `nc_inq_varids' > > fort-lib.c:(.text+0x4b0): undefined reference to `nc_inq_varids' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function > `nc_inq_dimids_f': > > fort-lib.c:(.text+0x5f6): undefined reference to `nc_inq_dimids' > > fort-lib.c:(.text+0x631): undefined reference to `nc_inq_dimids' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function > `nc_insert_array_compound_f': > > fort-lib.c:(.text+0x81f): undefined reference to > `nc_insert_array_compound' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function > `nc_inq_compound_field_f': > > fort-lib.c:(.text+0x8ac): undefined reference to `nc_inq_compound_field' > > fort-lib.c:(.text+0x8e0): undefined reference to `nc_inq_compound_field' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function > `nf_inq_libvers_': > > fort-misc.c:(.text+0xa): undefined reference to `nc_inq_libvers' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function > `nf_strerror_': > > fort-misc.c:(.text+0x16e): undefined reference to `nc_strerror' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_put_var_text_': > > fort-vario.c:(.text+0x8): undefined reference to `nc_put_var_text' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_get_var_text_': > > fort-vario.c:(.text+0x18): undefined reference to `nc_get_var_text' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_put_var_int1_': > > fort-vario.c:(.text+0x28): undefined reference to `nc_put_var_schar' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_get_var_int1_': > > fort-vario.c:(.text+0x38): undefined reference to `nc_get_var_schar' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_put_var_int2_': > > 
fort-vario.c:(.text+0x48): undefined reference to `nc_put_var_short' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_get_var_int2_': > > fort-vario.c:(.text+0x58): undefined reference to `nc_get_var_short' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_put_var_int_': > > fort-vario.c:(.text+0x68): undefined reference to `nc_put_var_int' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_get_var_int_': > > fort-vario.c:(.text+0x78): undefined reference to `nc_get_var_int' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_put_var_real_': > > fort-vario.c:(.text+0x88): undefined reference to `nc_put_var_float' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_get_var_real_': > > fort-vario.c:(.text+0x98): undefined reference to `nc_get_var_float' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_put_var_double_': > > fort-vario.c:(.text+0xa8): undefined reference to `nc_put_var_double' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_get_var_double_': > > fort-vario.c:(.text+0xb8): undefined reference to `nc_get_var_double' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_put_var_': > > fort-vario.c:(.text+0xc8): undefined reference to `nc_put_var' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function > `nf_get_var_': > > fort-vario.c:(.text+0xd8): undefined reference to `nc_get_var' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_put_vara_text_': > > fort-varaio.c:(.text+0x5c): undefined reference to `nc_put_vara_text' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_get_vara_text_': > > fort-varaio.c:(.text+0xcc): undefined reference to `nc_get_vara_text' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_put_vara_int1_': > > fort-varaio.c:(.text+0x13c): undefined reference to 
`nc_put_vara_schar' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_get_vara_int1_': > > fort-varaio.c:(.text+0x1ac): undefined reference to `nc_get_vara_schar' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_put_vara_int2_': > > fort-varaio.c:(.text+0x21c): undefined reference to `nc_put_vara_short' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_get_vara_int2_': > > fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_put_vara_int2_': > > fort-varaio.c:(.text+0x21c): undefined reference to `nc_put_vara_short' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_get_vara_int2_': > > fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_put_vara_int_': > > fort-varaio.c:(.text+0x2fc): undefined reference to `nc_put_vara_int' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_get_vara_int_': > > fort-varaio.c:(.text+0x36c): undefined reference to `nc_get_vara_int' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_put_vara_real_': > > fort-varaio.c:(.text+0x3dc): undefined reference to `nc_put_vara_float' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_get_vara_real_': > > fort-varaio.c:(.text+0x44c): undefined reference to `nc_get_vara_float' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_put_vara_double_': > > fort-varaio.c:(.text+0x4bc): undefined reference to `nc_put_vara_double' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_get_vara_double_': > > fort-varaio.c:(.text+0x52c): undefined reference to `nc_get_vara_double' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_put_vara_': > > fort-varaio.c:(.text+0x59c): undefined 
reference to `nc_put_vara' > > /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function > `nf_get_vara_': > > fort-varaio.c:(.text+0x60c): undefined reference to `nc_get_vara' > > make[2]: [diffwrf] Error 1 (ignored) > > > > --------------------------------------------------- > > > > > > What are we doing wrong here? > > > > Thanks > > > > > > Andrew Penny > > Research Associate - Meteorology Department Naval Postgraduate School > Monterey, CA 93943 > > Phone: (831) 656-3101 > > > > > > > > > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -- Saji N Hameed, ARC-ENV, Center for Advanced Information Science and Technology, University of Aizu, Tsuruga, Ikki-machi, Aizuwakamatsu-shi, Fukushima 965-8580, Japan Tel: +81242 37-2736 email: saji at u-aizu.ac.jp url: http://www.u-aizu.ac.jp bib: http://www.researcherid.com/rid/B-9188-2009 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110426/cb838968/attachment-0001.html From ktyle at atmos.albany.edu Tue Apr 26 16:30:41 2011 From: ktyle at atmos.albany.edu (Kevin R. Tyle) Date: Tue, 26 Apr 2011 22:30:41 +0000 Subject: [Wrf-users] error compiling (icc & ifort) WRFV3.3 with NetCDF V4.12 In-Reply-To: References: <284776D2-1EC0-4B4B-8ADA-0CF171EFF060@hotmail.com> <201104251915.p3PJEvMr015213@sgbp-fwl-001.offutt.af.mil> <5FD3BF59-B2D9-4D36-A84A-80C3929B8BF7@hotmail.com> <201104252056.p3PKuRqb034355@sgbp-fwl-001.offutt.af.mil> Message-ID: <4DB74791.90305@atmos.albany.edu> Hi Andrew, Here is the diff between the original configure.wrf and the one that worked for me. 
It is all in the LIB_EXTERNAL line in the makefile:

216c216
< -L$(WRF_SRC_ROOT_DIR)/external/io_netcdf -lwrfio_nf -L/usr/local/lib -lnetcdf -L$(WRF_SRC_ROOT_DIR)/external/io_grib2 -lio_grib2 -L/usr/local/lib64 -ljasper -lsz -lhdf5_hl -lhdf5 -lcurl
---
> -L$(WRF_SRC_ROOT_DIR)/external/io_netcdf -lwrfio_nf -L/usr/local/lib64 -lnetcdf -lnetcdff -L$(WRF_SRC_ROOT_DIR)/external/io_grib2 -lio_grib2 -L/usr/local/lib -ljasper -lsz -lhdf5_hl -lhdf5 -lcurl

Note that I build netcdf4 with support for hdf5 and szip, as well as for curl, so that's why you see those four extra libraries at the end. I first modify the $WRF/arch/postamble_new file so these libraries will be included when configure is run:

diff postamble_new postamble_new.orig
58c58
< #NOWIN CONFIGURE_NETCDF_LIB_PATH CONFIGURE_PNETCDF_LIB_PATH CONFIGURE_GRIB2_LIB CONFIGURE_ATMOCN_LIB -lsz -lhdf5_hl -lhdf5 -lcurl
---
> #NOWIN CONFIGURE_NETCDF_LIB_PATH CONFIGURE_PNETCDF_LIB_PATH CONFIGURE_GRIB2_LIB CONFIGURE_ATMOCN_LIB

My WRF build for dmpar on a single machine is preceded by the building of szip2.1, hdf5-1.8.5-patch1, netcdf-4.1.2, and jasper-1.900.1 with the intel compilers.
The configure lines I invoke are as follows:

szip:
configure --prefix=/usr/local --libdir=/usr/local/lib64

hdf5:
configure --enable-cxx --enable-shared --enable-fortran --with-szlib=/usr/local --libdir=/usr/local/lib64 --prefix=/usr/local

netcdf4:
configure --enable-netcdf-4 --disable-cxx --enable-benchmarks --enable-ncgen4 --with-hdf5=/usr/local --with-szlib=/usr/local --libdir=/usr/local/lib64

jasper:
configure --prefix=/usr/local --libdir=/usr/local/lib64

Meanwhile, for WPS, I modified arch/preamble:

diff preamble preamble.orig
51c51
< -L$(NETCDF)/lib CONFIGURE_NETCDFF_LIB -lnetcdff -lnetcdf
---
> -L$(NETCDF)/lib CONFIGURE_NETCDFF_LIB -lnetcdf

I also modified arch/configure.defaults:

diff configure.defaults configure.defaults.orig
352c352
< SCC = icc
---
> SCC = gcc

I'm not sure if this is really necessary or even foolhardy, but this way at least everything is built with the intel compilers. And, since my NCAR Graphics were built with the Gnu compilers, I had to manually change one line in configure.wps after running configure:

24c24
< -L/usr/X11R6/lib -lX11
---
> -L/usr/X11R6/lib -lX11 -lgfortran

This is on a 64bit CentOS 5.6 machine with Intel compilers 11.1. I hope this is helpful!

--Kevin

______________________________________________________________________ Kevin Tyle, Systems Administrator ********************** Dept. of Atmospheric & Environmental Sciences ktyle at atmos.albany.edu University at Albany, ES-235 518-442-4578 (voice) 1400 Washington Avenue 518-442-5825 (fax) Albany, NY 12222 ********************** ______________________________________________________________________

On 04/26/2011 03:19 AM, Andrew Penny wrote: > Hi Kevin, > > Where does that addition need to be made? For WRF version 3.3 (at least), > if the "libnetcdff.a" file is present, it gets added automatically to the > paths in the 'LIB_EXTERNAL' variable. Does it need to be added to any > other variable?
> > Could you send us your NetCDF config script, or the flags/environment > variables that you used? > > Thanks! > Andrew > > On Apr 25, 2011, at 6:55 PM, Kevin R. Tyle wrote: > >> Hi Andrew and Rebecca, >> >> netCDF 4.1.2 now splits off the FORTRAN libraries into libnetcdff.a. I >> had to manually add "-lnetcdff" into my configure.wrf file in my WRFV3 >> build directory in order to eliminate the link errors you are seeing. >> >> Hope this helps . . . >> >> --Kevin >> >> ______________________________________________________________________ >> Kevin Tyle, Systems Administrator ********************** >> Dept. of Atmospheric& Environmental Sciences ktyle at atmos.albany.edu >> University at Albany, ES-235 518-442-4578 (voice) >> 1400 Washington Avenue 518-442-5825 (fax) >> Albany, NY 12222 ********************** >> ______________________________________________________________________ >> >> On Mon, 25 Apr 2011, Selin, Rebecca D CTR USAF AFWA 16 WS/WXE wrote: >> >>> Andrew, >>> >>> Excellent, glad that's not it! >>> >>> However, the problem still is that your netcdf library isn't linking in correctly. I recognize all those undefined references from when my netcdf library didn't link. I can't find in your compilation options below where you define your netcdf path. Could you possibly be linking to a 32-bit version instead of 64-bit? >>> >>> Becky Adams Selin >>> Atmospheric& Environmental Research, Inc. >>> AFWA 16th Wx Sqdn (402) 294-5273 >>> >>> >>> -----Original Message----- >>> From: Andrew Penny [mailto:andybpenny at hotmail.com] >>> Sent: Monday, April 25, 2011 4:11 PM >>> To: Selin, Rebecca D CTR USAF AFWA 16 WS/WXE >>> Subject: Re: [Wrf-users] error compiling (icc& ifort) WRFV3.3 with NetCDF V4.12 >>> >>> Hi Becky, >>> >>> The '/path/to/netcdf' is not an actual directory - it was meant to be >>> simply a placeholder for the NetCDF install location. It is declared >>> correctly for the LIB_EXTERNAL variable. 
>>> >>> Andrew Penny >>> Research Associate - Meteorology Department >>> Naval Postgraduate School >>> Monterey, CA 93943 >>> Phone: (831) 656-3101 >>> >>> >>> >>> >>> On Apr 25, 2011, at 12:29 PM, Selin, Rebecca D CTR USAF AFWA 16 WS/WXE wrote: >>> >>>> Andrew - Looks like your netcdf library isn't linking in correctly. In your LIB_EXTERNAL setting, change '-L/path/to/netcdf/lib' to your netcdf path. >>>> >>>> Becky Adams Selin >>>> Atmospheric& Environmental Research, Inc. >>>> AFWA 16th Wx Sqdn (402) 294-5273 >>>> >>>> >>>> -----Original Message----- >>>> From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Andrew Penny >>>> Sent: Monday, April 25, 2011 2:23 PM >>>> To: wrf-users at ucar.edu >>>> Subject: [Wrf-users] error compiling (icc& ifort) WRFV3.3 with NetCDF V4.12 >>>> >>>> We are having problems getting the latest version of WRF compiled with the latest version of NetCDF. >>>> >>>> Here are the compilation options we are using for NetCDF 4.1.2: >>>> >>>> --------------------------------------------------- >>>> export CC=mpicc >>>> export CXX=mpicxx >>>> >>>> export FC=mpif90 >>>> export F77=mpif77 >>>> export F90=mpif90 >>>> >>>> export CPP='icpc -E' >>>> export CXXCPP='icpc -E' >>>> >>>> export CFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' >>>> export CXXFLAGS='-O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC' >>>> export CPPFLAGS='-DpgiFortran -I/opt/hdf5/intel/include -I/opt/hdf4/intel/include -I/opt/udunits/intel/include -I/usr/include -I/opt/szip/intel/include -I/usr/mpi/intel/mvapich2-1.6/include' >>>> >>>> export FFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' >>>> export FCFLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' >>>> export FCFLAGS_f90='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' >>>> export F90FLAGS='-DpgiFortran -O3 -xSSSE3 -ip -no-prec-div -m64 -fPIC -assume nounderscore' >>>> export LDFLAGS='-DpgiFortran -O3 
-xSSSE3 -ip -no-prec-div -m64 -fPIC' >>>> >>>> export LIBS="-L/opt/hdf5/intel/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -L/opt/hdf4/intel/lib -ldf -L/opt/udunits/intel/lib -ludunits2 -L/usr/lib64 -lz -lm -L/opt/szip/intel/lib -lsz" >>>> >>>> ./configure --disable-shared --enable-large-file-tests --enable-fortran --enable-f90 --enable-f77 --disable-netcdf-4 --enable-netcdf4 --enable-parallel --enable-cxx-4 --enable-cxx --enable-hdf4 --enable-utilities --with-libcf --with-zlib=/usr --with-szlib=/opt/szip/intel --with-hdf4=/opt/hdf4/intel --with-hdf5=/opt/hdf5/intel --with-curl-config=/usr/bin --prefix=/path/to/netcdf >>>> --------------------------------------------------- >>>> >>>> >>>> The mpif77/mpicc/mpif90 binaries are compiled using the Intel C++/FORTRAN compilers. >>>> >>>> >>>> Here is the "LIB_EXTERNAL" variable setting in the 'configure.wrf' file for compiling WRF version 3.3: >>>> >>>> --------------------------------------------------- >>>> LIB_EXTERNAL = \ >>>> -L$(WRF_SRC_ROOT_DIR)/external/io_netcdf -lwrfio_nf -L/path/to/netcdf/lib -lnetcdff -lnetcdf -L$(WRF_SRC_ROOT_DIR)/external/io_grib2 -lio_grib2 -L/opt/jasper/intel/lib -ljasper -L/opt/hdf4/intel/lib -ldf -lmfhdf -L/opt/hdf5/intel/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -L/opt/g2clib/intel/lib -lgrib2c -L/usr/lib64 -lcurl >>>> --------------------------------------------------- >>>> >>>> Here are the error messages that show up during the './compile em_real' process: >>>> >>>> --------------------------------------------------- >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_text_': >>>> fort-attio.c:(.text+0x7a): undefined reference to `nc_put_att_text' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_text_': >>>> fort-attio.c:(.text+0x18d): undefined reference to `nc_get_att_text' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int1_': >>>> fort-attio.c:(.text+0x2af): undefined reference to `nc_put_att_schar' 
>>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int1_': >>>> fort-attio.c:(.text+0x3bd): undefined reference to `nc_get_att_schar' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int2_': >>>> fort-attio.c:(.text+0x4df): undefined reference to `nc_put_att_short' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int2_': >>>> fort-attio.c:(.text+0x5ed): undefined reference to `nc_get_att_short' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_int_': >>>> fort-attio.c:(.text+0x70f): undefined reference to `nc_put_att_int' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_int_': >>>> fort-attio.c:(.text+0x81d): undefined reference to `nc_get_att_int' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_real_': >>>> fort-attio.c:(.text+0x93f): undefined reference to `nc_put_att_float' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_real_': >>>> fort-attio.c:(.text+0xa4d): undefined reference to `nc_get_att_float' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_put_att_double_': >>>> fort-attio.c:(.text+0xb6f): undefined reference to `nc_put_att_double' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-attio.o): In function `nf_get_att_double_': >>>> fort-attio.c:(.text+0xc7d): undefined reference to `nc_get_att_double' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_create_': >>>> fort-control.c:(.text+0x64): undefined reference to `nc_create' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__create_': >>>> fort-control.c:(.text+0x1a0): undefined reference to `nc__create' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_open_': >>>> fort-control.c:(.text+0x2b4): undefined reference to `nc_open' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In 
function `nf__open_': >>>> fort-control.c:(.text+0x3d8): undefined reference to `nc__open' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_set_fill_': >>>> fort-control.c:(.text+0x491): undefined reference to `nc_set_fill' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_redef_': >>>> fort-control.c:(.text+0x4a4): undefined reference to `nc_redef' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_enddef_': >>>> fort-control.c:(.text+0x4b4): undefined reference to `nc_enddef' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__enddef_': >>>> fort-control.c:(.text+0x4d0): undefined reference to `nc__enddef' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_sync_': >>>> fort-control.c:(.text+0x4e4): undefined reference to `nc_sync' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_close_': >>>> fort-control.c:(.text+0x4f4): undefined reference to `nc_close' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_delete_': >>>> fort-control.c:(.text+0x55a): undefined reference to `nc_delete' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__create_mp_': >>>> fort-control.c:(.text+0x68a): undefined reference to `nc__create_mp' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf__open_mp_': >>>> fort-control.c:(.text+0x7d0): undefined reference to `nc__open_mp' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_delete_mp_': >>>> fort-control.c:(.text+0x8e3): undefined reference to `nc_delete_mp' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_set_base_pe_': >>>> fort-control.c:(.text+0x986): undefined reference to `nc_set_base_pe' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_inq_base_pe_': >>>> fort-control.c:(.text+0x99f): undefined reference to `nc_inq_base_pe' >>>> 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_abort_': >>>> fort-control.c:(.text+0x9b4): undefined reference to `nc_abort' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-control.o): In function `nf_set_default_format_': >>>> fort-control.c:(.text+0x9cf): undefined reference to `nc_set_default_format' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_def_dim_': >>>> fort-dim.c:(.text+0x78): undefined reference to `nc_def_dim' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimid_': >>>> fort-dim.c:(.text+0x19b): undefined reference to `nc_inq_dimid' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dim_': >>>> fort-dim.c:(.text+0x2e5): undefined reference to `nc_inq_dim' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimname_': >>>> fort-dim.c:(.text+0x4bb): undefined reference to `nc_inq_dimname' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_inq_dimlen_': >>>> fort-dim.c:(.text+0x614): undefined reference to `nc_inq_dimlen' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-dim.o): In function `nf_rename_dim_': >>>> fort-dim.c:(.text+0x695): undefined reference to `nc_rename_dim' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_att_': >>>> fort-genatt.c:(.text+0x87): undefined reference to `nc_inq_att' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attid_': >>>> fort-genatt.c:(.text+0x19e): undefined reference to `nc_inq_attid' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_atttype_': >>>> fort-genatt.c:(.text+0x2ae): undefined reference to `nc_inq_atttype' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attlen_': >>>> fort-genatt.c:(.text+0x3be): undefined reference to `nc_inq_attlen' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_inq_attname_': >>>> 
fort-genatt.c:(.text+0x4f7): undefined reference to `nc_inq_attname' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_copy_att_': >>>> fort-genatt.c:(.text+0x6c8): undefined reference to `nc_copy_att' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_rename_att_': >>>> fort-genatt.c:(.text+0x842): undefined reference to `nc_rename_att' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genatt.o): In function `nf_del_att_': >>>> fort-genatt.c:(.text+0x9d5): undefined reference to `nc_del_att' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_': >>>> fort-geninq.c:(.text+0x34): undefined reference to `nc_inq' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_ndims_': >>>> fort-geninq.c:(.text+0x8f): undefined reference to `nc_inq_ndims' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_nvars_': >>>> fort-geninq.c:(.text+0xaf): undefined reference to `nc_inq_nvars' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_natts_': >>>> fort-geninq.c:(.text+0xcf): undefined reference to `nc_inq_natts' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_unlimdim_': >>>> fort-geninq.c:(.text+0xf7): undefined reference to `nc_inq_unlimdim' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-geninq.o): In function `nf_inq_format_': >>>> fort-geninq.c:(.text+0x12f): undefined reference to `nc_inq_format' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_def_var_': >>>> fort-genvar.c:(.text+0xa0): undefined reference to `nc_def_var' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_var_': >>>> fort-genvar.c:(.text+0x217): undefined reference to `nc_inq_var' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varid_': >>>> fort-genvar.c:(.text+0x424): undefined reference to `nc_inq_varid' >>>> 
/work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varname_': >>>> fort-genvar.c:(.text+0x55b): undefined reference to `nc_inq_varname' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_vartype_': >>>> fort-genvar.c:(.text+0x6b4): undefined reference to `nc_inq_vartype' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varndims_': >>>> fort-genvar.c:(.text+0x6e4): undefined reference to `nc_inq_varndims' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_vardimid_': >>>> fort-genvar.c:(.text+0x724): undefined reference to `nc_inq_vardimid' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_inq_varnatts_': >>>> fort-genvar.c:(.text+0x764): undefined reference to `nc_inq_varnatts' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_rename_var_': >>>> fort-genvar.c:(.text+0x7e5): undefined reference to `nc_rename_var' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-genvar.o): In function `nf_copy_var_': >>>> fort-genvar.c:(.text+0x88a): undefined reference to `nc_copy_var' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `c2f_dimids': >>>> fort-lib.c:(.text+0x10): undefined reference to `nc_inq_varndims' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `c2f_chunksizes': >>>> fort-lib.c:(.text+0x130): undefined reference to `nc_inq_varndims' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_chunksizes': >>>> fort-lib.c:(.text+0x1d0): undefined reference to `nc_inq_varndims' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_coords': >>>> fort-lib.c:(.text+0x270): undefined reference to `nc_inq_varndims' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `f2c_counts': >>>> fort-lib.c:(.text+0x320): undefined reference to `nc_inq_varndims' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o):fort-lib.c:(.text+0x3d0): more 
undefined references to `nc_inq_varndims' follow >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_varids_f': >>>> fort-lib.c:(.text+0x47c): undefined reference to `nc_inq_varids' >>>> fort-lib.c:(.text+0x4b0): undefined reference to `nc_inq_varids' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_dimids_f': >>>> fort-lib.c:(.text+0x5f6): undefined reference to `nc_inq_dimids' >>>> fort-lib.c:(.text+0x631): undefined reference to `nc_inq_dimids' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_insert_array_compound_f': >>>> fort-lib.c:(.text+0x81f): undefined reference to `nc_insert_array_compound' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-lib.o): In function `nc_inq_compound_field_f': >>>> fort-lib.c:(.text+0x8ac): undefined reference to `nc_inq_compound_field' >>>> fort-lib.c:(.text+0x8e0): undefined reference to `nc_inq_compound_field' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function `nf_inq_libvers_': >>>> fort-misc.c:(.text+0xa): undefined reference to `nc_inq_libvers' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-misc.o): In function `nf_strerror_': >>>> fort-misc.c:(.text+0x16e): undefined reference to `nc_strerror' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_text_': >>>> fort-vario.c:(.text+0x8): undefined reference to `nc_put_var_text' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_text_': >>>> fort-vario.c:(.text+0x18): undefined reference to `nc_get_var_text' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int1_': >>>> fort-vario.c:(.text+0x28): undefined reference to `nc_put_var_schar' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int1_': >>>> fort-vario.c:(.text+0x38): undefined reference to `nc_get_var_schar' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int2_': >>>> 
fort-vario.c:(.text+0x48): undefined reference to `nc_put_var_short' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int2_': >>>> fort-vario.c:(.text+0x58): undefined reference to `nc_get_var_short' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_int_': >>>> fort-vario.c:(.text+0x68): undefined reference to `nc_put_var_int' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_int_': >>>> fort-vario.c:(.text+0x78): undefined reference to `nc_get_var_int' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_real_': >>>> fort-vario.c:(.text+0x88): undefined reference to `nc_put_var_float' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_real_': >>>> fort-vario.c:(.text+0x98): undefined reference to `nc_get_var_float' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_double_': >>>> fort-vario.c:(.text+0xa8): undefined reference to `nc_put_var_double' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_double_': >>>> fort-vario.c:(.text+0xb8): undefined reference to `nc_get_var_double' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_put_var_': >>>> fort-vario.c:(.text+0xc8): undefined reference to `nc_put_var' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-vario.o): In function `nf_get_var_': >>>> fort-vario.c:(.text+0xd8): undefined reference to `nc_get_var' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_text_': >>>> fort-varaio.c:(.text+0x5c): undefined reference to `nc_put_vara_text' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_text_': >>>> fort-varaio.c:(.text+0xcc): undefined reference to `nc_get_vara_text' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int1_': >>>> fort-varaio.c:(.text+0x13c): undefined reference to 
`nc_put_vara_schar' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int1_': >>>> fort-varaio.c:(.text+0x1ac): undefined reference to `nc_get_vara_schar' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int2_': >>>> fort-varaio.c:(.text+0x21c): undefined reference to `nc_put_vara_short' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int2_': >>>> fort-varaio.c:(.text+0x28c): undefined reference to `nc_get_vara_short' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_int_': >>>> fort-varaio.c:(.text+0x2fc): undefined reference to `nc_put_vara_int' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_int_': >>>> fort-varaio.c:(.text+0x36c): undefined reference to `nc_get_vara_int' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_real_': >>>> fort-varaio.c:(.text+0x3dc): undefined reference to `nc_put_vara_float' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_real_': >>>> fort-varaio.c:(.text+0x44c): undefined reference to `nc_get_vara_float' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_double_': >>>> fort-varaio.c:(.text+0x4bc): undefined reference to `nc_put_vara_double' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_double_': >>>> fort-varaio.c:(.text+0x52c): undefined reference to `nc_get_vara_double' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_put_vara_': >>>> fort-varaio.c:(.text+0x59c): undefined
reference to `nc_put_vara' >>>> /work/sbarve/netcdf/lib/libnetcdff.a(fort-varaio.o): In function `nf_get_vara_': >>>> fort-varaio.c:(.text+0x60c): undefined reference to `nc_get_vara' >>>> make[2]: [diffwrf] Error 1 (ignored) >>>> >>>> --------------------------------------------------- >>>> >>>> >>>> What are we doing wrong here? >>>> >>>> Thanks >>>> >>>> >>>> Andrew Penny >>>> Research Associate - Meteorology Department Naval Postgraduate School Monterey, CA 93943 >>>> Phone: (831) 656-3101 >>>> >>>> >>>> >>>> >>>> >>> _______________________________________________ >>> Wrf-users mailing list >>> Wrf-users at ucar.edu >>> http://mailman.ucar.edu/mailman/listinfo/wrf-users >>> >> _______________________________________________ >> Wrf-users mailing list >> Wrf-users at ucar.edu >> http://mailman.ucar.edu/mailman/listinfo/wrf-users >> > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > From xingang.fan at wku.edu Wed Apr 27 15:07:11 2011 From: xingang.fan at wku.edu (Fan, Xingang) Date: Wed, 27 Apr 2011 16:07:11 -0500 Subject: [Wrf-users] RIP4 problem with GHT and SLP Message-ID: Hi, I hope someone can help with the RIP plots in the attached PDF file. I had problems with pressure-level plotting before. In the attached plots, SLP is shown within the range of 100~110 hPa, which is not right. GHT at the three pressure levels obviously has problems over land areas. These were generated by adopting the two example .in files within the RIP4 package. They are attached too. Model information is shown below the plots. Thanks for any help! Xingang Fan, Ph.D. Assistant Professor Dept. of Geography & Geology Western Kentucky University 1906 College Heights Blvd, #31066 Bowling Green, KY 42101-1066 Phone: (270)745-5980 Email: xingang.fan at wku.edu -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110427/9a637f79/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: rip_201004.pdf Type: application/pdf Size: 4078357 bytes Desc: rip_201004.pdf Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110427/9a637f79/attachment-0001.pdf -------------- next part -------------- A non-text attachment was scrubbed... Name: ripdp_wrfarw.in Type: application/octet-stream Size: 96 bytes Desc: ripdp_wrfarw.in Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110427/9a637f79/attachment-0002.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: rip_201004.in Type: application/octet-stream Size: 1587 bytes Desc: rip_201004.in Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110427/9a637f79/attachment-0003.obj From Matthew.Foster at noaa.gov Fri Apr 29 07:28:26 2011 From: Matthew.Foster at noaa.gov (Matt Foster) Date: Fri, 29 Apr 2011 08:28:26 -0500 Subject: [Wrf-users] Radar DFI: What should it be doing? Message-ID: <4DBABCFA.2010703@noaa.gov> I did a build of WRF 3.2.1 (DM, OpenMPI) with Radar DFI enabled. I turned on TDFI and dfi_radar in the namelist file. When I run the simulation (3km, 433x433 grid points), all CPUs are at 100% usage, and nothing is appearing to happen. All we see in the rsl.out files is... Ntasks in X 1, ntasks in Y 71 We are running on 72 CPUs with IO Quilting enabled. I let it run for nearly 30 minutes, during which time there was no additional output in the rsl.out files, and the CPUs were at 100% for the entire time. I also checked our Infiniband switch, and there was no activity. To me, that indicated that nothing productive was happening. Is there anyone on here with experience using Radar DFI that might know why this is happening? One suspicion I had was that the build was also done with -DRUC_CLOUD. I wonder if that and Radar DFI should not be combined? 
Matt -- Do not go where the path may lead; go instead where there is no path and leave a trail. -- Ralph Waldo Emerson -------------- next part -------------- A non-text attachment was scrubbed... Name: matthew_foster.vcf Type: text/x-vcard Size: 229 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110429/170bf038/attachment.vcf From ebeigi3 at tigers.lsu.edu Fri Apr 29 20:55:44 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Fri, 29 Apr 2011 22:55:44 -0400 Subject: [Wrf-users] installing netcdf In-Reply-To: <4DB605CE.5090208@ucar.edu> References: <4DAC95F2.6000807@ucar.edu> <4DB605CE.5090208@ucar.edu> Message-ID: Thanks for your previous reply. I compiled the GRIB2 version of WRF successfully, and then I installed the three needed libraries (libpng, zlib, and jasper), but after compiling WPS I cannot see ungrib.exe. I tried different paths to these libraries in my .bashrc, but it still doesn't work. How can I solve this problem? 1- Should I install all of the libraries as root? 2- Should I install all of them in the same directory? Best Regards Ehsan Beigi On Mon, Apr 25, 2011 at 7:37 PM, wrfhelp wrote: > You need to install these libraries before you build WPS. > > They can be in different places other than where you will build WRF/WPS. > You only need to set the correct path to make them accessible when you > build WPS. > > Please see > http://www.mmm.ucar.edu/wrf/users/docs/user_guide_V3/users_guide_chap3.htm#_How_to_Install > > > On 4/24/2011 2:00 PM, Ehsan Beigi wrote: > > Thanks for your previous help. I have a problem compiling WRF for GRIB2.
I > have netcdf 3.6.2 and I am using the icc and ifort compilers, and my linux is: > > Linux ehsan 2.6.32-71.24.1.el6.i686 #1 SMP Sat Mar 26 15:30:33 EDT 2011 > i686 i686 i386 GNU/Linux > > I would be grateful if you could answer the following questions: > > 1- Are jasper, libpng, and zlib necessary for WRF GRIB2 or for WPS > GRIB2? Can I link them with netcdf by using this command: CC=icc FC=ifort > ./configure --prefix=/home/ehsan/netcdf --with-zlib=/home/ehsan zlib > --with-libpng=/home/ehsan/ > libpng --with-jasper=/home/ehsan/jasper > ? > > 2- Should these libraries be installed in the same directory and same > place? > > > Thanks > > Ehsan Beigi > > > On Mon, Apr 18, 2011 at 5:05 PM, Ehsan > Beigi wrote: > >> Thanks >> >> Ehsan Beigi >> >> >> >> On Mon, Apr 18, 2011 at 2:50 PM, wrfhelp wrote: >> >>> Please install netCDF v3.6.x instead of V4 and above versions. This is >>> because WRF is not compatible with higher versions of netCDF. >>> >>> >>> On 4/17/2011 10:52 PM, Ehsan Beigi wrote: >>> >>> >>> Thanks for your previous help. I am installing netcdf.4.1.2, and before >>> that I installed the zlib, libpng and jasper libraries, which are needed for WPS. >>> When I want to install netcdf, it asks for the curl directory, which makes me >>> install curl. My question is: would it be possible to use this command >>> for sharing other installed libraries with netcdf: >>> >>> CC=icc FC=ifort ./configure --prefix=/home/ehsan/netcdf >>> --with-zlib=/home/ehsan zlib --with-libpng=/home/ehsan/libpng >>> --with-jasper=/home/ehsan/jasper --with curl=/home/ehsan/curl >>> >>> Best Regards >>> >>> Ehsan >>> -- >>> *Ehsan Beigi* >>> *PhD Student* >>> *Department of Civil and Environmental Engineering >>> 2408 Patrick F.
Taylor Hall >> Louisiana State University >> Baton Rouge, LA, 70803* >> >> >> > > > -- > *Ehsan Beigi* > *PhD Student* > *Department of Civil and Environmental Engineering > 2408 Patrick F. Taylor Hall > Louisiana State University > Baton Rouge, LA, 70803* > > > > -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110429/8ccd56fb/attachment.html From ahsanshah01 at gmail.com Tue May 3 02:04:25 2011 From: ahsanshah01 at gmail.com (Ahsan Ali) Date: Tue, 3 May 2011 13:04:25 +0500 Subject: [Wrf-users] WRF Problem running in Parallel on multiple nodes (cluster) Message-ID: Hello, I am able to run WRFV3.2.1 using mpirun on multiple cores of a single machine, but when I want to run it across multiple nodes in a cluster using a hostlist, I get an error. The compute nodes are mounted with the master node during boot using NFS. I get the following error. Please help. [root at pmd02 em_real]# mpirun -np 10 -hostfile /home/pmdtest/hostlist ./real.exe bash: orted: command not found bash: orted: command not found -------------------------------------------------------------------------- A daemon (pid 22006) died unexpectedly with status 127 while attempting to launch so we are aborting. There may be more information reported by the environment (see above). This may be because the daemon was unable to find all the needed shared libraries on the remote node. You may set your LD_LIBRARY_PATH to have the location of the shared libraries on the remote nodes and this will automatically be forwarded to the remote nodes.
-------------------------------------------------------------------------- -------------------------------------------------------------------------- mpirun noticed that the job aborted, but has no info as to the process that caused that situation. -------------------------------------------------------------------------- mpirun: clean termination accomplished -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110503/85e38e96/attachment.html From bbrashers at Environcorp.com Tue May 3 10:06:46 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Tue, 3 May 2011 09:06:46 -0700 Subject: [Wrf-users] WRF Problem running in Parallel on multiple nodes(cluster) In-Reply-To: References: Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF3085206FEB0C2@irvine01.irvine.environ.local> It looks like OpenMPI is not installed on all your execution machines. You need to install at least the libs on all machines, or on an NFS-shared location. Check the output of "which orted" on the machine that works. Bart From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Ahsan Ali Sent: Tuesday, May 03, 2011 1:04 AM To: users at open-mpi.org Subject: [Wrf-users] WRF Problem running in Parallel on multiple nodes(cluster) Hello, I am able to run WRFV3.2.1 using mpirun on multiple cores of single machine, but when I want to run it across multiple nodes in cluster using hostlist then I get error, The compute nodes are mounted with the master node during boot using NFS. I get following error. Please help. 
[root at pmd02 em_real]# mpirun -np 10 -hostfile /home/pmdtest/hostlist ./real.exe bash: orted: command not found bash: orted: command not found ------------------------------------------------------------------------ -- A daemon (pid 22006) died unexpectedly with status 127 while attempting to launch so we are aborting. There may be more information reported by the environment (see above). This may be because the daemon was unable to find all the needed shared libraries on the remote node. You may set your LD_LIBRARY_PATH to have the location of the shared libraries on the remote nodes and this will automatically be forwarded to the remote nodes. ------------------------------------------------------------------------ -- ------------------------------------------------------------------------ -- mpirun noticed that the job aborted, but has no info as to the process that caused that situation. ------------------------------------------------------------------------ -- mpirun: clean termination accomplished -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110503/85a78be1/attachment-0001.html From Don.Morton at alaska.edu Tue May 3 15:17:08 2011 From: Don.Morton at alaska.edu (Don Morton) Date: Tue, 3 May 2011 13:17:08 -0800 Subject: [Wrf-users] dmpar version of geogrid.exe fails on several architectures Message-ID: Howdy, I've been trying to track down problems with the dmpar-compiled geogrid.exe failing, as follows: ERROR: ERROR: In GEOGRID.TBL, for index file of data at entry 8, category_max is specified, but category_min is not. Both must be specified. ERROR: In GEOGRID.TBL, for index file of data at entry ERROR: In GEOGRID.TBL, for index file of data at entry 99, category_max is specified, but category_min is not. Both must be specified. In GEOGRID.TBL, for index file of data at entry 8, category_max is specified, but category_min is not. Both must be specified. This happens on a Cray XT5 (kraken), a Cray XE6 (chugach), a Penguin computing cluster, and a Sun cluster. In all cases, I am using Portland Group compilers. In all cases, the serial version runs just fine, from the same directory, with the same GEOGRID.TBL. This happens using WPS V3.2 and WPS V3.2.1 (I guess I should try WPS V3.3, too). When I use WPS V3.1, the dmpar-compiled geogrid.exe works fine, as expected. The only "hint" I've found so far from googling is a WRF Users Forum entry where someone found that with the Intel compiler the I_MPI_DEBUG flag, if set to a certain value, was causing the same problem. Of course, I'm not using the Intel compiler, but it's interesting that this case was related to an MPI implementation with exactly the same problem. I'm wondering if others have encountered this. I'm trying to create some domains that are too big for serial geogrid.exe. Thanks, Don Morton -- Voice: 907 450 8679 Arctic Region Supercomputing Center http://weather.arsc.edu/ http://www.arsc.edu/~morton/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110503/4827540f/attachment.html From Don.Morton at alaska.edu Tue May 3 18:29:55 2011 From: Don.Morton at alaska.edu (Don Morton) Date: Tue, 3 May 2011 16:29:55 -0800 Subject: [Wrf-users] WRF Workshop Abstract Submission shortchanges us on word count! :) Message-ID: Howdy, has anybody else had problems submitting abstracts in the 100-200 word range for the WRF Summer Workshop 2011? I've got a 173 word abstract I pasted in, and the page won't accept it until I cut it down closer to 100 words. Is this something workshop organizers are aware of? Thanks, Don Morton -- Voice: 907 450 8679 Arctic Region Supercomputing Center http://weather.arsc.edu/ http://www.arsc.edu/~morton/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110503/4fb992cc/attachment.html From Don.Morton at alaska.edu Tue May 3 18:42:52 2011 From: Don.Morton at alaska.edu (Don Morton) Date: Tue, 3 May 2011 16:42:52 -0800 Subject: [Wrf-users] WRF Workshop Abstract Submission shortchanges us on word count! :) In-Reply-To: References: Message-ID: It looks like what's happening is that it's max'ing out on 999 characters (and, although I have 1061 not counting spaces, and 1200+ counting spaces, I only have 173 words). Is there some way somebody can increase this max character value to allow for reasonable length abstracts? Thanks, Don On Tue, May 3, 2011 at 4:29 PM, Don Morton wrote: > Howdy, has anybody else had problems submitting abstracts in the 100-200 > word range for the WRF Summer Workshop 2011? > > I've got a 173 word abstract I pasted in, and the page won't accept it > until I cut it down closer to 100 words. Is this something workshop > organizers are aware of? 
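The mismatch Don measured (word count vs. character count) is easy to reproduce outside the submission form. A quick shell sketch, using hypothetical filler text and assuming the form counts raw characters including spaces:

```shell
# Build a hypothetical 173-word abstract. English words average roughly
# 6-7 characters, so with spaces 173 words lands well past a
# 999-character cap even though the stated limit is 200 words.
abstract=$(printf 'simulation %.0s' $(seq 173))
words=$(printf '%s' "$abstract" | wc -w)
chars=$(printf '%s' "$abstract" | wc -c)
echo "words=$words chars=$chars"
```

If the form's counter tracks `wc -c` rather than `wc -w`, an abstract near 100 words (very roughly 700-800 characters) would pass, which matches the behaviour described in this thread.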
> > Thanks, > > Don Morton > > -- > Voice: 907 450 8679 > Arctic Region Supercomputing Center > http://weather.arsc.edu/ > http://www.arsc.edu/~morton/ > > -- Voice: 907 450 8679 Arctic Region Supercomputing Center http://weather.arsc.edu/ http://www.arsc.edu/~morton/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110503/c4f9ccf2/attachment.html From J.Kala at murdoch.edu.au Wed May 4 03:12:21 2011 From: J.Kala at murdoch.edu.au (Jatin Kala) Date: Wed, 4 May 2011 17:12:21 +0800 Subject: [Wrf-users] WRF-CCSM Message-ID: Hi there, I am trying to run WRF with CCSM data, and am starting to write some code to do the conversion from netcdf to "intermediate" format. It turns out to be a fairly massive job to do from scratch, and I am just wondering if somebody has already done some of that and is willing to share their code? One thing I am yet to figure out is that the Noah LSM needs 6-hourly soil data (I am pretty sure), but 6-hourly soil data is seldom available from any GCM! The best I have found is some monthly means.... Does this mean I have to somehow interpolate monthly means to 6-hourly? Any suggestions? Cheers, Jatin -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110504/0a48e256/attachment.html From reenb at meteo.psu.edu Wed May 4 06:32:34 2011 From: reenb at meteo.psu.edu (Brian Reen) Date: Wed, 04 May 2011 08:32:34 -0400 Subject: [Wrf-users] dmpar version of geogrid.exe fails on several architectures In-Reply-To: References: Message-ID: <4DC14762.8060207@meteo.psu.edu> Don, I think you'll find that WPSV3.3 will fix this issue. I was getting similar errors in WPSV3.2 due to some uninitialized variables.
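The kind of uninitialized-flag bug described in this thread can be sketched as a one-line-per-flag fix. This is a hypothetical fragment, not the actual WPS code: the flag names come from this thread, and the exact placement in geogrid's source_data_module.F may differ between versions.

```fortran
! Hypothetical sketch: give the GEOGRID.TBL per-entry flags a defined
! starting value before the index files are parsed, so a dmpar build
! does not inherit whatever value happens to be in memory.
is_category_min = .false.
is_category_max = .false.
```

One plausible reason the serial build "works" without this is that it happens to see zeroed memory, while the dmpar build does not; that would match the symptoms reported for WPS V3.2/V3.2.1.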
If you want to use WPSV3.2, you can probably get it working by initializing some variables (e.g., set is_category_min and is_category_max to false) around line 264 in source_data_module.F. Looking at the WPSV3.3 version should let you know which variables need to be initialized. Thanks, Brian Reen On 5/3/2011 5:17 PM, Don Morton wrote: > Howdy, I've been trying to track down problems with the dmpar-compiled > geogrid.exe failing, as follows: > > ERROR: ERROR: In GEOGRID.TBL, for index file of data at entry 8, > category_max is specified, but category_min is not. Both must be specified. > ERROR: In GEOGRID.TBL, for index file of data at entry ERROR: In > GEOGRID.TBL, for index file of data at entry 99, category_max is specified, > but category_min is not. Both must be specified. > In GEOGRID.TBL, for index file of data at entry 8, category_max is > specified, but category_min is not. Both must be specified. > > > This happens on a Cray XT5 (kraken), a Cray XE6 (chugach), a Penguin > computing cluster, and a Sun cluster. > > In all cases, I am using Portland Group compilers. > > In all cases, the serial version runs just fine, from the same directory, > with the same GEOGRID.TBL > > This happens using WPS V3.2 and WPS V3.2.1 (I guess I should try WPS V3.3, > too). > > When I use WPS V3.1, the dmpar-compiled geogrid.exe works fine, as expected. > > The only "hint" I've found so far from googling is a WRF Users Forum entry > where someone found that with the Intel compiler the I_MPI_DEBUG flag, if > set to a certain value, was causing the same problem. Of course, I"m not > using the Intel compiler, but it's interesting that this case was related to > an MPI implementation with exactly the same problem. I'm wondering if > others have encountered this. 
I'm trying to create some domains that are > too big for serial geogrid.exe > > Thanks, > > Don Morton > > > > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users From yinjinfang88 at 163.com Thu May 5 01:17:21 2011 From: yinjinfang88 at 163.com (yinjinfang) Date: Thu, 5 May 2011 15:17:21 +0800 Subject: [Wrf-users] errors of compile all_wrfvar Message-ID: <201105051517215315473@163.com> Dear all, I'm building WRF-Var V3.2 and 3.3 on a IBM server (AIX 5.3.0.0) I've got an error when compiling WRF-Var using the commands: [f01n01:/gpfs1/home/yinjf/src/WRFDA]$ ./configure wrfda checking for perl5... no checking for perl... found /usr/bin/perl (perl) Will use NETCDF in dir: /gpfs1/home/yinjf/local/netcdf PHDF5 not set in environment. Will configure WRF for use without. Configuring to use jasper library to build Grib2 I/O... $JASPERLIB = /gpfs1/home/yinjf/local/jasper/lib $JASPERINC = /gpfs1/home/yinjf/local/jasper/include ------------------------------------------------------------------------ Please select from among the following supported platforms. 1. AIX xlf compiler with xlc (serial) 2. AIX xlf compiler with xlc (smpar) 3. AIX xlf compiler with xlc (dmpar) 4. 
AIX xlf compiler with xlc (dm+sm) Enter selection [1-4] : 3 Then, type the command $ compile all_wrfvar >& compile.log & Here is the error message: ( cd var/build; make depend; make -i -r all_wrfvar ) (cd /gpfs1/home/wangdh/yinjf/src/WRFDA; tools/registry -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=1 -DDFI_RADAR=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=8 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DNATIVE_MASSV -DBUFR -DFFTPACK -DNORESHAPE -DDM_PARALLEL -DNETCDF -DGRIB2 -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -DNEW_BDYS Registry/Registry ; cd / gpfs1/home/wangdh/yinjf/src/WRFDA/var/build ) opening Registry/registry.dimspec including Registry/registry.dimspec opening Registry/registry.io_boilerplate including Registry/registry.io_boilerplate opening Registry/io_boilerplate_temporary.inc including Registry/io_boilerplate_temporary.inc /bin/sh: 107418 Memory fault(coredump) make: The error code from the last command is 139. Stop. make: The error code from the last command is 2. Stop. It also generated a "core" file and a "Registry_tmp.107418 " file. The detail information of the Registry_tmp.107418 file is following. The file's size is 352286 kb. -rw-r--r-- 1 wangdh shiys 352286 May 5 15:05 Registry_tmp.107418 ~ 2011-05-05 yinjinfang -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110505/35406018/attachment.html From drostkier at yahoo.com Sun May 8 02:24:01 2011 From: drostkier at yahoo.com (Dorita Rostkier-Edelstein) Date: Sun, 8 May 2011 01:24:01 -0700 (PDT) Subject: [Wrf-users] auxinput11_interval_s and auxinput11_end_h Message-ID: <209260.67101.qm@web113111.mail.gq1.yahoo.com> Dear users, Could anybody explain to me what these parameters mean and what values should I give to them? Thanks. 
Dorita -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110508/d671ecdc/attachment.html From hamed319 at yahoo.com Tue May 10 01:35:18 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Tue, 10 May 2011 00:35:18 -0700 (PDT) Subject: [Wrf-users] Flerchinger USEd in NEW version. Iterations= 10 Message-ID: <639645.50426.qm@web161213.mail.bf1.yahoo.com> Dear All, I am trying to run a WRF simulation with 3 telescoping domains. At the beginning of the run, I started to see the same message being repeated again and again: Flerchinger USEd in NEW version. Iterations= 10 Flerchinger USEd in NEW version. Iterations= 10 Flerchinger USEd in NEW version. Iterations= 10 Flerchinger USEd in NEW version. Iterations= 10 ... Could you please help me figure out what the problem is here, and what is going wrong with my simulation? Thanks in advance. Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110510/a793d605/attachment.html From jinxiu58 at gmail.com Tue May 10 06:03:17 2011 From: jinxiu58 at gmail.com (qiancheng jinxiu) Date: Tue, 10 May 2011 20:03:17 +0800 Subject: [Wrf-users] how to run ideal TC test in WRFV3.3? Message-ID: Hi, everyone. I am trying to run the ideal_tc test in WRF V3.3 on a PC cluster. The system is Linux and the compiler is PGI. I compiled the WRF code with the "dmpar" option successfully and first ran irun_me_first.csh in the ./test/em_tropical_cyclone directory. I then submitted the script to run ideal.exe, but I cannot get the ideal input domain file and the ideal run failed. When I check the error information, some of the rsl.out files show the messages below. Can anyone tell me how to resolve this?
Iterate:
 th,qv,pi = 1 NaN NaN NaN
 th,qv,pi = 2 NaN NaN NaN
 th,qv,pi = 3 NaN NaN NaN
 th,qv,pi = 4 NaN NaN NaN
 th,qv,pi = 5 NaN NaN NaN
 th,qv,pi = 6 NaN NaN NaN
 th,qv,pi = 7 NaN NaN NaN
 th,qv,pi = 8 NaN NaN NaN
 th,qv,pi = 9 NaN NaN NaN
 th,qv,pi = 10 NaN NaN NaN
 th,qv,pi = 11 NaN NaN NaN
 th,qv,pi = 12 NaN NaN NaN
 th,qv,pi = 13 NaN NaN NaN
 th,qv,pi = 14 NaN NaN NaN
 th,qv,pi = 15 NaN NaN NaN
 th,qv,pi = 16 NaN NaN NaN
 th,qv,pi = 17 NaN NaN NaN
 th,qv,pi = 18 NaN NaN NaN
 th,qv,pi = 19 NaN NaN NaN
 th,qv,pi = 20 NaN NaN NaN
But some rsl.out files seem to be just interrupted:
vref:
 1 7500.000 3.098607
 2 22500.00 7.257306
 3 37500.00 9.722547
 4 52500.00 11.18232
 5 67500.00 12.01058
 6 82500.00 12.42429
 7 97500.00 12.55589
 8 112500.0 12.48963
 9 127500.0 12.28086
 10 142500.0 11.96697
 11 157500.0 11.57375
 12 172500.0 11.11931
 13 187500.0 10.61656
-------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110510/bbf59b3d/attachment.html From Michael.Zulauf at iberdrolaren.com Tue May 10 17:03:03 2011 From: Michael.Zulauf at iberdrolaren.com (Zulauf, Michael) Date: Tue, 10 May 2011 16:03:03 -0700 Subject: [Wrf-users] WRF/WPS 3.3 geogrid.exe segfault Message-ID: Hi all, I'm having trouble getting a working WRF/WPS 3.3 using PGI compilers. For my most recent attempt, I used:
WRF/WPS 3.3
PGI 10.5
OpenMPI 1.4.1
netcdf 3.6.3
I'm building WPS with GRIB2 support:
jasper 1.900
libpng 1.4.1
Everything appears to build correctly (unlike my previous attempts with an older version of PGI), but when I try to run geogrid, I get a segmentation fault when processing LANDUSEF for domain 1. There are no useful clues in the output or the geogrid.log. This is for a model setup that is working well with WRF/WPS 3.2.1. I realize there are a lot of ways that things can go wrong. Has anybody gotten WRF/WPS 3.3 working with PGI 10.3 or 10.5?
If so, I was wondering if somebody could maybe share their configure.wrf and configure.wps files with me. If it's not obvious from the configure files, I'd also like the details of the versions of libraries you're building against. I'd like to try and match up my set of libraries and compiler options (etc) in a similar way, to see if I can get this working. Thanks, Mike -- PLEASE NOTE - NEW E-MAIL ADDRESS: michael.zulauf at iberdrolaren.com Mike Zulauf Meteorologist, Lead Senior Wind Asset Management Iberdrola Renewables 1125 NW Couch, Suite 700 Portland, OR 97209 Office: 503-478-6304 Cell: 503-913-0403 Please be advised that email addresses for Iberdrola Renewables personnel have changed to first.last at iberdrolaREN.com effective Aug. 16, 2010. Please make a note. Thank you. This message is intended for the exclusive attention of the recipient(s) indicated. Any information contained herein is strictly confidential and privileged. If you are not the intended recipient, please notify us by return e-mail and delete this message from your computer system. Any unauthorized use, reproduction, alteration, filing or sending of this message and/or any attached files may lead to legal action being taken against the party(ies) responsible for said unauthorized use. Any opinion expressed herein is solely that of the author(s) and does not necessarily represent the opinion of the Company. The sender does not guarantee the integrity, speed or safety of this message, and does not accept responsibility for any possible damage arising from the interception, incorporation of viruses, or any other damage as a result of manipulation. From davidstephenbryan at yahoo.com Wed May 11 09:38:41 2011 From: davidstephenbryan at yahoo.com (David Bryan) Date: Wed, 11 May 2011 08:38:41 -0700 (PDT) Subject: [Wrf-users] How to specify additional vertical levels Message-ID: <650574.69313.qm@web65909.mail.ac4.yahoo.com> Is there a way to run WRF at customized vertical levels?? 
I don't just mean increasing the value of e_vert. And I'm talking about adding vertical levels not contained in the intermediate file format, so running mod_levs.exe would not be useful. Generally, based on what I've read in the user's guide, it seems like the only control one has is to increase e_vert. Thanks! From hamed319 at yahoo.com Wed May 11 09:53:36 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Wed, 11 May 2011 08:53:36 -0700 (PDT) Subject: [Wrf-users] Flerchinger Error Message-ID: <327630.1002.qm@web161208.mail.bf1.yahoo.com> Dear All, I am trying to run a WRF simulation with 3 telescoping domains. At the beginning of the run, I started to see the same message being repeated again and again: Flerchinger USEd in NEW version. Iterations= 10 Flerchinger USEd in NEW version. Iterations= 10 Flerchinger USEd in NEW version. Iterations= 10 Flerchinger USEd in NEW version. Iterations= 10 ... Could you please help me figure out what the problem is here, and what is going wrong with my simulation? Thanks in advance. Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110511/0e2329e4/attachment.html From agnes.mika at bmtargoss.com Thu May 12 01:05:44 2011 From: agnes.mika at bmtargoss.com (Agnes Mika) Date: Thu, 12 May 2011 09:05:44 +0200 Subject: [Wrf-users] How to specify additional vertical levels In-Reply-To: <650574.69313.qm@web65909.mail.ac4.yahoo.com> References: <650574.69313.qm@web65909.mail.ac4.yahoo.com> Message-ID: <20110512070544.GA13141@aggedor.argoss.nl> Hello David, You have to specify the eta_levels variable in the &domains section of your namelist.input. You need to specify explicitly which eta_levels you want the model to use.
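A minimal sketch of such a namelist fragment (level values purely illustrative, not a recommendation; the list must start at 1.0, end at 0.0, decrease monotonically, and contain exactly e_vert entries):

```
&domains
 e_vert     = 8,
 eta_levels = 1.000, 0.990, 0.970, 0.930, 0.850, 0.700, 0.450, 0.000,
/
```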
You also have to adjust e_vert so that it matches the number of eta_levels you have specified. Greetings, Agnes David Bryan wrote: > Is there a way to run WRF at customized vertical levels?? I don't just mean > increasing the value of e_vert.? And I'm talking about adding vertical levels > not contained intermediate file format, so running mod_levs.exe would not be > useful.? Generally, based on what I've read in the user's guide, it seems like > the only control one has is to increase e_vert. > > Thanks! > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users -- Dr. ?gnes Mika Advisor, Meteorology and Air Quality Tel: +31 (0)527-242299 Fax: +31 (0)527-242016 Web: www.bmtargoss.com BMT ARGOSS P.O. Box 61, 8325 ZH Vollenhove Voorsterweg 28, 8316 PT Marknesse The Netherlands Confidentiality Notice & Disclaimer The contents of this e-mail and any attachments are intended for the use of the mail addressee(s) shown. If you are not that person, you are not allowed to take any action based upon it or to copy it, forward, distribute or disclose its contents and you should delete it from your system. BMT ARGOSS does not accept liability for any errors or omissions in the context of this e-mail or its attachments which arise as a result of internet transmission, nor accept liability for statements which are those of the author and clearly not made on behalf of BMT ARGOSS. From ahsanshah01 at gmail.com Wed May 11 21:24:43 2011 From: ahsanshah01 at gmail.com (Ahsan Ali) Date: Thu, 12 May 2011 08:24:43 +0500 Subject: [Wrf-users] Flerchinger USEd in NEW version. Iterations= 10 (Hamed Sharifi) Message-ID: Dear Hamed, Try reducing your time step. > Date: Tue, 10 May 2011 00:35:18 -0700 (PDT) > From: Hamed Sharifi > Subject: [Wrf-users] Flerchinger USEd in NEW version. 
Iterations= 10 > To: "wrf-users at ucar.edu" > Message-ID: <639645.50426.qm at web161213.mail.bf1.yahoo.com> > Content-Type: text/plain; charset="iso-8859-1" > > Dear All, > > I am trying to run a WRF simulation with 3 telescoping domains. At the > beginning of the run, I started to see the same message been repeated > again and again: > > Flerchinger USEd in NEW version. Iterations= 10 > Flerchinger USEd in NEW version. Iterations= 10 > Flerchinger USEd in NEW version. Iterations= 10 > Flerchinger USEd in NEW version. Iterations= 10 > ... > > Could you please help me to > figure out what is the problem here, and what is going wrong with my > simulation? > > Thank in advance. > ? > Hamed Sharifi, > M.Sc Student, AUT Tehran/Iran ? ?? |hamed319 at yahoo.com? | > +98-9364024805 ? ? ? ? ? ? ? ? ? ? ? ?? ? | hamed_sharifi at aut.ac.ir | > -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110512/3fe15964/attachment.html From hamed319 at yahoo.com Thu May 12 00:24:28 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Wed, 11 May 2011 23:24:28 -0700 (PDT) Subject: [Wrf-users] Problem :Flerchinger USEd in NEW version & ncview SMOIS variable Message-ID: <843635.61330.qm@web161210.mail.bf1.yahoo.com> Dear Sir/Maam, I have a problem with running wrf.exe. I used 3 telescoping grid but it got me this error: Flerchinger USEd in NEW version. Iterations= 10 I reduced time_step to 5 second and 3 sec; but the error occurred again. I changed the? "sf_sfclay_physics" to "1,???? 1,???? 1," . Then it went right and WRF right properly. But I don't know why the problem is. Also, when I want to see the SMOIS&SH2O with ncview, it got me this: "min and max both 0 for variable SMOIS. 
I can check all data instead of subsampling if that's OK, or just cancel viewing this variable." when I click OK, it says:"min and max both 0 for variable SMOIS (checked all data) Setting range to (-1,1)" Also, this message appears for other variables such as (SH2O,CLDFRA,QCLOUD,QRAIN,SMCREL,ACGRDFLX,ACHFX,CANWAT,GLW,GRAUPELNC,GRDFLX,HAILNC,...). Why and when does this problem occur?Could you please give me some help? Thanks in advance, ? Hamed Sharifi, M.Sc Student, AUT Tehran/Iran ? ?? |hamed319 at yahoo.com? | +98-9364024805 ? ? ? ? ? ? ? ? ? ? ? ?? ? | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110511/62e372b1/attachment.html From michalg at edan.co.il Thu May 12 03:01:51 2011 From: michalg at edan.co.il (Michal Gerti) Date: Thu, 12 May 2011 12:01:51 +0300 Subject: [Wrf-users] Oholo Conference 2011 - Save the Date Message-ID: Dear Colleague, We are pleased to announce that the 48th Oholo Conference entitled "Emerging Remote Sensing Techniques and Associated Modeling for Air Pollution Applications" is scheduled to take place in Eilat, Israel on November 6-10, 2011. A website with further information, as well as registration and abstract submission forms will be activated soon. You are welcome to contact the Oholo Conference Secretariat at Oholo at iibr.gov.il. On behalf of the Organizing Committee -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110512/49de8665/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... 
Name: save the date - final120511.JPG Type: image/jpeg Size: 891049 bytes Desc: save the date - final120511.JPG Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110512/49de8665/attachment-0001.jpe From patrik.norman at chalmers.se Thu May 12 04:17:56 2011 From: patrik.norman at chalmers.se (Patrik Norman) Date: Thu, 12 May 2011 12:17:56 +0200 Subject: [Wrf-users] Flerchinger Error In-Reply-To: <327630.1002.qm@web161208.mail.bf1.yahoo.com> References: <327630.1002.qm@web161208.mail.bf1.yahoo.com> Message-ID: <7DB1A809-F209-40C1-99E7-27F88F2D6A03@chalmers.se> Hamed, I think that means that you get numerical instability in your model run. Try setting a lower value for timestep. //Patrik -------------------------------------------------- Patrik Norman Optical Remote Sensing Group Department of Earth and Space Sciences Chalmers University of Technology H?rsalsv?gen 11, Floor 4 SE-412 96 G?teborg, Sweden Phone: +46 31 772 5655 Fax: +46 31 772 1884 -------------------------------------------------- 11 maj 2011 kl. 17.53 skrev Hamed Sharifi: Dear All, I am trying to run a WRF simulation with 3 telescoping domains. At the beginning of the run, I started to see the same message been repeated again and again: Flerchinger USEd in NEW version. Iterations= 10 Flerchinger USEd in NEW version. Iterations= 10 Flerchinger USEd in NEW version. Iterations= 10 Flerchinger USEd in NEW version. Iterations= 10 ... Could you please help me to figure out what is the problem here, and what is going wrong with my simulation? Thank in advance. Hamed Sharifi, M.Sc Student, AUT Tehran/Iran |hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110512/e017866d/attachment.html From drostkier at yahoo.com Thu May 12 17:18:34 2011 From: drostkier at yahoo.com (Dorita Rostkier-Edelstein) Date: Thu, 12 May 2011 16:18:34 -0700 (PDT) Subject: [Wrf-users] 48th Oholo Conference: "Emerging Remote Sensing Techniques and Associated Modeling for Air Pollution Applications" Message-ID: <210917.78402.qm@web113103.mail.gq1.yahoo.com> Dear WRF users, Please, see attached an announcement to a conference that I am co-organizing. Hope to see you. Dorita -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110512/37281b1f/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: save the date - final120511.JPG Type: image/jpeg Size: 891049 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110512/37281b1f/attachment-0001.jpe From aruny at iitk.ac.in Wed May 18 03:31:12 2011 From: aruny at iitk.ac.in (Arun) Date: Wed, 18 May 2011 15:01:12 +0530 Subject: [Wrf-users] WRF 3.2.1 compilation errors In-Reply-To: <4D0A11E2.4040501@meteo.bg> References: <4D0A11E2.4040501@meteo.bg> Message-ID: <4DD391E0.1080709@iitk.ac.in> Hi everyone, I am trying to compile WRF on an intel machine but keep getting error related to module_initialize_real.f90 file. I'm attaching my log file along with this mail. Please see if someone can help me. Googling didn't help. Regards, Arun -------------- next part -------------- A non-text attachment was scrubbed... 
Name: compile.log Type: text/x-log Size: 432149 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110518/a1cf577d/attachment-0001.bin From ebeigi3 at tigers.lsu.edu Tue May 17 11:29:50 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Tue, 17 May 2011 13:29:50 -0400 Subject: [Wrf-users] Error when make plotgrids.exe and plotfmt.exe Message-ID: Dear Sir/Madam, I have linux redhat 6 (RHEL6) and ICC and IFORT. I am trying to compile WPS with NCL 5.2.1 , and I installed the NCL correctly, but i can not see plotgrid.exe and plotfmt.exe when i compile wps. i attached the compile_wps.log for you consideration i really appreciate your help. Best Regards Ehsan Beigi On Mon, May 16, 2011 at 3:06 PM, Ehsan Beigi wrote: > thanks for you help. i found the prblem, it was related to the time. is it > NCL necessary for WRF? > > Best Regards > > Ehsan Beigi > > > > On Mon, May 16, 2011 at 2:13 PM, wrfhelp wrote: > >> Did you see ungrib.exe . metgrid.exe, geogrid.exe in your WPS directory? >> I found some errors in your log file, but those are related to ncarg >> graphic. So I suppose they shouldn't affect your ungrib.exe. >> >> However, I am not sure whether you have built WPS successfully because >> your log file is incomplete. >> >> The error message you got seemed like that, you try to ungrib GRIB2 format >> data, but your WPS doesn't support GRIB2. I am suspicious that you didn't >> build WPS successfully ( at least with GRIB2 support). >> >> Please look at >> http://www.mmm.ucar.edu/wrf/users/docs/user_guide_V3/users_guide_chap3.htm#_How_to_Install >> >> This page showed how to install all necessary libraries for WPS and how to >> build WPS. >> >> >> >> On 5/13/2011 7:38 PM, Ehsan Beigi wrote: >> >> >> Thanks for your previous help. i have this error when i am trying to run >> the example of WPS file in your website: >> http://www.mmm.ucar.edu/wrf/users/download/get_source2.html , and on the WRF >> Preprocessing System test data. 
i have WRFV3.2.1, and icc and ifort, i installed zlib, libpng and >> jasper, but i can not run this example, i have this error : >> ERROR: Grib2 file or date problem, stopping in edition_num. >> >> i attached the compile_wps.log >> >> I really appreciate your help. >> >> Best Regards, >> >> -- >> *Ehsan Beigi* >> *PhD Student* >> *Department of Civil and and Environmental Engineering >> 2408 Patrick F. Taylor Hall >> Louisiana State University >> Baton Rouge, LA, 70803* >> >> >> >> > > > -- > *Ehsan Beigi* > *PhD Student* > *Department of Civil and and Environmental Engineering > 2408 Patrick F. Taylor Hall > Louisiana State University > Baton Rouge, LA, 70803* > > > -- *Ehsan Beigi* *PhD Student* *Department of Civil and and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110517/0196b6bf/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: compile_wps.log Type: application/octet-stream Size: 91894 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110517/0196b6bf/attachment-0001.obj From guillermo.fernandez at meteogalicia.es Fri May 13 12:17:31 2011 From: guillermo.fernandez at meteogalicia.es (=?ISO-8859-15?Q?Guillermo_Fern=E1ndez?=) Date: Fri, 13 May 2011 20:17:31 +0200 Subject: [Wrf-users] WRF with NetCDF4 and HDF5 compression enable Message-ID: <4DCD75BB.2070203@meteogalicia.es> Dear All, We are considering to use WRF with NetCDF4 and HDF5 compression output enabled. Unfortunately the only information we have found so far is this link from 2008: www.unidata.ucar.edu/software/netcdf/papers/AGU_2008_poster.pdf 1.- Has someone used NetCDF4 and HDF5 compression in WRF? 2.- Are utilities like postprocessor (WPP) ready for it? 
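One netCDF-generic fallback, independent of WRF's I/O layer and downstream tools: compress finished history files offline with nccopy (this assumes netCDF utilities 4.1 or later are installed; the filename below is illustrative):

```
# -k 4 = netCDF-4 classic model, -d 5 = deflate level 5, -s = shuffle filter
nccopy -k 4 -d 5 -s wrfout_d01_2011-05-13_00:00:00 wrfout_d01_compressed.nc
```

Note that anything reading the compressed file must itself be linked against the netCDF-4/HDF5 library.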
Thank you in advance, Guillermo -- MeteoGalicia Guillermo Fern?ndez Garc?a Dpto. Predici?n Num?rica e Investigaci?n R?a Roma, 6 15.707 Santiago de Compostela Tel: +34 981 957 462 Fax: +34 981 957 466 guillermo.fernandez at meteogalicia.es www.meteogalicia.es From hamed319 at yahoo.com Sat May 14 05:10:58 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Sat, 14 May 2011 04:10:58 -0700 (PDT) Subject: [Wrf-users] KF_ETA_PARA error Message-ID: <587815.45575.qm@web161202.mail.bf1.yahoo.com> Dear Sir/Ma'am, I have a problem with running wrf.exe. I used 3 telescoping grid but it got first me this error: Flerchinger USEd in NEW version. Iterations= 10 I reduced time_step to 5 second and 3 sec; but the error occurred again. I changed the "sf_sfclay_physics" to "1, 1, 1," . Then it went right and WRF right properly. But another error hit me: WOULD GO OFF TOP: KF_ETA_PARA I,J,DPTHMX,DPMIN 8 78 NaN 5000.000 Also, when I want to see the SMOIS&SH2O with ncview, it got me this: "min and max both 0 for variable SMOIS. I can check all data instead of subsampling if that's OK, or just cancel viewing this variable." when I click OK, it says:"min and max both 0 for variable SMOIS (checked all data) Setting range to (-1,1)" Also, this message appears for other variables such as (SH2O,CLDFRA,QCLOUD,QRAIN,SMCREL,ACGRDFLX,ACHFX,CANWAT,GLW,GRAUPELNC,GRDFLX,HAILNC,...). Why and when does this problem occur? Could you please give me some help? Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran |hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | ? Hamed Sharifi, M.Sc Student, AUT Tehran/Iran ? ?? |hamed319 at yahoo.com? | +98-9364024805 ? ? ? ? ? ? ? ? ? ? ? ?? ? | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110514/2214adf7/attachment.html From nilima.2002 at gmail.com Wed May 18 04:54:04 2011 From: nilima.2002 at gmail.com (nilima natoo) Date: Wed, 18 May 2011 12:54:04 +0200 Subject: [Wrf-users] segmentation fault with grib2 data Message-ID: Dear wrf_users, While using ./ungrib.exe I came across with following error message for grib2 type data:

[nma at k6 WPS]$ ./ungrib.exe
*** Starting program ungrib.exe ***
Start_date = 1979-01-01_00:00:00 , End_date = 1979-01-01_00:00:00
output format is WPS
Path to intermediate files is ./
ungrib - grib edition num 2
Segmentation fault

This kind of segmentation problem does not occur with grib1 type data. In addition to that there is no 'Error' message while compilation and all executables are created. This is how ungrib specific portion of my namelist.wps looks like:

&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '1979-01-01_00:00:00',
 end_date = '1979-01-01_00:00:00',
 interval_seconds = 21600
 io_form_geogrid = 2,
&ungrib
 out_format = 'WPS',
 prefix = 'FILE',

I am using NCEP-CFSR grib2 data and Vtable.CFSR looks like this:

GRIB1| Level| From | To | metgrid | metgrid | metgrid |GRIB2|GRIB2|GRIB2|GRIB2|
Param| Type |Level1|Level2| Name | Units | Description |Discp|Catgy|Param|Level|
-----+------+------+------+----------+---------+-----------------------------------------+-----+-----+-----+-----+
 11 | 100 | * | | TT | K | Temperature | 0 | 0 | 0 | 100 |
 33 | 100 | * | | UU | m s-1 | U | 0 | 2 | 2 | 100 |
 34 | 100 | * | | VV | m s-1 | V | 0 | 2 | 3 | 100 |
 52 | 100 | * | | RH | % | Relative Humidity | 0 | 1 | 1 | 100 |
 7 | 100 | * | | HGT | m | Height | 0 | 3 | 5 | 100 |
 11 | 105 | 2 | | TT | K | Temperature at 2 m | 0 | 0 | 0 | 103 |
 51 | 105 | 2 | | SPECHUMD | kg kg-1 | Specific Humidity at 2 m | 0 | 1 | 0 | 103 |
 33 | 105 | 10 | | UU | m s-1 | U at 10 m | 0 | 2 | 2 | 103 |
 34 | 105 | 10 | | VV | m s-1 | V at 10 m | 0 | 2 | 3 | 103 |
 1 | 1 | 0 | | PSFC | Pa | Surface Pressure | 0 | 3 | 0 | 1 |
 2 | 102 | 0 | | PMSL | Pa | Sea-level Pressure | 0 | 3 | 1 | 101 |
 144 | 112 | 0 | 10 | SM000010 | fraction | Soil Moist 0-10 cm below grn layer (Up) | 2 | 0 | 192 | 106 |
 144 | 112 | 10 | 40 | SM010040 | fraction | Soil Moist 10-40 cm below grn layer | 2 | 0 | 192 | 106 |
 144 | 112 | 40 | 100 | SM040100 | fraction | Soil Moist 40-100 cm below grn layer | 2 | 0 | 192 | 106 |
 144 | 112 | 100 | 200 | SM100200 | fraction | Soil Moist 100-200 cm below gr layer | 2 | 0 | 192 | 106 |
 11 | 112 | 0 | 10 | ST000010 | K | T 0-10 cm below ground layer (Upper) | 0 | 0 | 0 | 106 |
 11 | 112 | 10 | 40 | ST010040 | K | T 10-40 cm below ground layer (Upper) | 0 | 0 | 0 | 106 |
 11 | 112 | 40 | 100 | ST040100 | K | T 40-100 cm below ground layer (Upper) | 0 | 0 | 0 | 106 |
 11 | 112 | 100 | 200 | ST100200 | K | T 100-200 cm below ground layer (Bottom) | 0 | 0 | 0 | 106 |
 91 | 1 | 0 | | SEAICE | proprtn | Ice flag | 10 | 2 | 0 | 1 |
 81 | 1 | 0 | | LANDSEA | proprtn | Land/Sea flag (1=land, 0 or 2=sea) | 2 | 0 | 0 | 1 |
 7 | 1 | 0 | | SOILHGT | m | Terrain field of source analysis | 0 | 3 | 5 | 1 |
 11 | 1 | 0 | | SKINTEMP | K | Skin temperature (can use for SST also) | 0 | 0 | 0 | 1 |
 65 | 1 | 0 | | SNOW | kg m-2 | Water equivalent snow depth | 0 | 1 | 13 | 1 |
 224 | 1 | 0 | | SOILCAT | Tab4.213 | Dominant soil type cat. | 2 | 3 | 0 | 1 |
 225 | 1 | 0 | | VEGCAT | Tab4.212 | Dominant land use cat. | 2 | 0 | 198 | 1 |
-----+------+------+------+----------+---------+-----------------------------------------+-----+-----+-----+-----+

In case you have already faced this problem or have some idea to solve it, kindly help me with some hint. Thanks. with regards, nilima -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110518/4d302092/attachment.html From rdelgado at dim.uchile.cl Wed May 18 08:47:48 2011 From: rdelgado at dim.uchile.cl (=?ISO-8859-1?Q?Rodrigo_Delgado_Urz=FAa?=) Date: Wed, 18 May 2011 10:47:48 -0400 Subject: [Wrf-users] WRF NMM latlon mismatch error Message-ID: <4DD3DC14.7010000@dim.uchile.cl> Dear All, I am trying to run a WRF (version 3.2.1) NMM simulation with 3 telescoping domains. WPS stage (geogrid, ungrib and metgrid runs are OK; plotgrids gmeta file shows an apparently correct domain configuration) and real_nmm runs without problems (wrfinput* and wrfbdy* files are in place). When I try to run wrf.exe the following message appears: SOME MATCHING TEST i_parent_start, j_parent_start 36 84 WRFSI LAT COMPUTED LAT -41.19375 -35.02003 WRFSI LON COMPUTED LON -76.28421 -75.00367 CHECK WRFSI CONFIGURATION AND INPUT HIGH RESOLUTION TOPOGRAPHY AND/OR GRID RATIO -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: NMM_NEST_UTILS1.b LINE: 2169 LATLON MISMATCH: ERROR READING static FILE FOR THE NEST ------------------------------------------- The domain dimensions set in namelist.input file are exactly the same as configured in previous stages. I have met_nmm* and geo_nmm* files in place. What I am doing wrong? Thanks Rodrigo -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110518/64fbb25a/attachment-0001.html From s.hawkins at ed.ac.uk Wed May 18 04:38:11 2011 From: s.hawkins at ed.ac.uk (Sam Hawkins) Date: Wed, 18 May 2011 11:38:11 +0100 Subject: [Wrf-users] Understanding WRF surface layer schemes Message-ID: <4DD3A193.7050604@ed.ac.uk> Dear WRF users I'm looking for any documentation on how WRF calculates surface wind speeds and surface momentum flux over water, (in particular using the Eta scheme). 
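The circular u*-z0 dependence this question raises is usually closed by fixed-point iteration: guess z0, compute u* from the log law, update z0, and repeat. A minimal sketch (function name and defaults are illustrative; only the neutral log law plus the Charnock relation is shown — actual WRF surface-layer schemes add stability functions and a viscous term):

```python
import math

KAPPA = 0.4        # von Karman constant
GRAV = 9.81        # gravitational acceleration, m s-2
CHARNOCK = 0.0185  # Charnock constant; the exact value is scheme-dependent

def ustar_over_water(wind_speed, z_ref=10.0, n_iter=10, z0_first_guess=1e-4):
    """Fixed-point iteration between friction velocity u* and roughness
    length z0 over water (neutral stratification only)."""
    z0 = z0_first_guess
    ustar = 0.0
    for _ in range(n_iter):
        ustar = KAPPA * wind_speed / math.log(z_ref / z0)  # neutral log law
        z0 = CHARNOCK * ustar ** 2 / GRAV                  # Charnock relation
    return ustar, z0
```

Starting from any reasonable z0 guess, the loop converges in a handful of iterations because each update changes z0 only weakly; that is the answer to "where do you start?" — the starting point barely matters.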
If surface wind speed and friction velocity depend on the roughness length, but the roughness length itself depends on the friction velocity, where do you start? I've read the surface fluxes are computed 'by an iterative method', but haven't found any description of it. Thanks, Sam. -- The University of Edinburgh is a charitable body, registered in Scotland, with registration number SC005336. From eric.kemp at nasa.gov Wed May 18 09:58:25 2011 From: eric.kemp at nasa.gov (Kemp, Eric M. (GSFC-610.3)[NORTHROP GRUMMAN]) Date: Wed, 18 May 2011 10:58:25 -0500 Subject: [Wrf-users] WRF 3.2.1 compilation errors In-Reply-To: <4DD391E0.1080709@iitk.ac.in> Message-ID: The problem is the "intent(in)" and "intent(out)" attributes with the pointers in module_initialize_real.F. Technically it is illegal in Fortran 90 to use these attributes with pointer arguments (Fortran 2003 allows it). Most Fortran compilers accept it, but the gfortran compiler is not one of them. The fix is to modify module_initialize_real.F and remove the attributes, then recompile. -Eric On 5/18/11 5:31 AM, "Arun" wrote: > Hi everyone, > I am trying to compile WRF on an intel machine but keep getting error > related to module_initialize_real.f90 file. I'm attaching my log file > along with this mail. Please see if someone can help me. Googling didn't > help. > > Regards, > Arun -------------------------------------------------------------------- Eric M. 
Kemp Northrop Grumman Corporation Meteorologist Information Systems Civil Enterprise Solutions Civil Systems Division Goddard Space Flight Center Mailstop 610.3 Greenbelt, MD 20771 Telephone 301-286-9768 Fax 301-286-1775 E-mail: eric.kemp at nasa.gov E-mail: eric.kemp at ngc.com -------------------------------------------------------------------- From FLiu at azmag.gov Wed May 18 09:52:25 2011 From: FLiu at azmag.gov (Feng Liu) Date: Wed, 18 May 2011 15:52:25 +0000 Subject: [Wrf-users] Error when make plotgrids.exe and plotfmt.exe In-Reply-To: References: Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C256ACAC7@mag9006> Could you please post your configure.wps? I will then check it for you. Thanks. Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Ehsan Beigi Sent: Tuesday, May 17, 2011 10:30 AM To: wrfhelp Cc: wrf-users at ucar.edu Subject: [Wrf-users] Error when make plotgrids.exe and plotfmt.exe Dear Sir/Madam, I have Linux Red Hat 6 (RHEL6) with icc and ifort. I am trying to compile WPS with NCL 5.2.1, and I installed NCL correctly, but I cannot see plotgrids.exe and plotfmt.exe when I compile WPS. I have attached the compile_wps.log for your consideration; I really appreciate your help. Best Regards Ehsan Beigi On Mon, May 16, 2011 at 3:06 PM, Ehsan Beigi > wrote: Thanks for your help. I found the problem; it was related to the time. Is NCL necessary for WRF? Best Regards Ehsan Beigi On Mon, May 16, 2011 at 2:13 PM, wrfhelp > wrote: Did you see ungrib.exe, metgrid.exe and geogrid.exe in your WPS directory? I found some errors in your log file, but those are related to the ncarg graphics, so I suppose they shouldn't affect your ungrib.exe. However, I am not sure whether you have built WPS successfully, because your log file is incomplete. The error message you got suggests that you tried to ungrib GRIB2-format data, but your WPS doesn't support GRIB2.
I suspect that you didn't build WPS successfully (at least not with GRIB2 support). Please look at http://www.mmm.ucar.edu/wrf/users/docs/user_guide_V3/users_guide_chap3.htm#_How_to_Install This page shows how to install all the necessary libraries for WPS and how to build WPS. On 5/13/2011 7:38 PM, Ehsan Beigi wrote: Thanks for your previous help. I get this error when I try to run the WPS example from your website, http://www.mmm.ucar.edu/wrf/users/download/get_source2.html , with the WRF Preprocessing System test data. I have WRFV3.2.1 with icc and ifort, and I installed zlib, libpng and jasper, but I cannot run this example; I get this error: ERROR: Grib2 file or date problem, stopping in edition_num. I have attached the compile_wps.log. I really appreciate your help. Best Regards, -- Ehsan Beigi PhD Student Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110518/96e56e46/attachment.html From maemarcus at gmail.com Wed May 18 16:03:54 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Thu, 19 May 2011 02:03:54 +0400 Subject: [Wrf-users] WRF 3.2.1 compilation errors In-Reply-To: References: <4DD391E0.1080709@iitk.ac.in> Message-ID: Arun, Could you check how old your gfortran version is, and update it if it is too old? I've just checked that gfortran 4.5 handles pointers marked intent(in) without any problems. - D. 2011/5/18 Kemp, Eric M.
(GSFC-610.3)[NORTHROP GRUMMAN] : > > The problem is the "intent(in)" and "intent(out)" attributes with the > pointers in module_initialize_real.F. Technically it is illegal in Fortran > 90 to use these attributes with pointer arguments (Fortran 2003 allows it). > Most Fortran compilers accept it, but the gfortran compiler is not one of > them. > > The fix is to modify module_initialize_real.F and remove the attributes, > then recompile. > > -Eric > > > On 5/18/11 5:31 AM, "Arun" wrote: > >> Hi everyone, >> I am trying to compile WRF on an intel machine but keep getting error >> related to module_initialize_real.f90 file. I'm attaching my log file >> along with this mail. Please see if someone can help me. Googling didn't >> help. >> >> Regards, >> Arun > > -------------------------------------------------------------------- > Eric M. Kemp Northrop Grumman Corporation > Meteorologist Information Systems > Civil Enterprise Solutions Civil Systems Division > > Goddard Space Flight Center > Mailstop 610.3 > Greenbelt, MD 20771 > Telephone 301-286-9768 > Fax 301-286-1775 > E-mail: eric.kemp at nasa.gov > E-mail:
eric.kemp at ngc.com > -------------------------------------------------------------------- > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > From aruny at iitk.ac.in Thu May 19 03:40:46 2011 From: aruny at iitk.ac.in (Arun) Date: Thu, 19 May 2011 15:10:46 +0530 Subject: [Wrf-users] WRF 3.2.1 compilation errors In-Reply-To: References: <4DD391E0.1080709@iitk.ac.in> Message-ID: <4DD4E59E.5000501@iitk.ac.in> Hi, I'm using gfortran version 4.2.4 right now, but as I'm on a shared system, updating gfortran is not an option. As Eric suggested, I will edit my module_initialize_real.F to remove those pointer attributes and get back to you guys if anything goes wrong. Arun On Thursday 19 May 2011 03:33 AM, Dmitry N. Mikushin wrote: > Arun, > > Could you check how old is your gfortran version and update it if it > is too old? I've just checked gfortran 4.5 handle pointers marked > intent(in) without any problems. > > - D. > > 2011/5/18 Kemp, Eric M. (GSFC-610.3)[NORTHROP GRUMMAN]: >> The problem is the "intent(in)" and "intent(out)" attributes with the >> pointers in module_initialize_real.F. Technically it is illegal in Fortran >> 90 to use these attributes with pointer arguments (Fortran 2003 allows it). >> Most Fortran compilers accept it, but the gfortran compiler is not one of >> them. >> >> The fix is to modify module_initialize_real.F and remove the attributes, >> then recompile. >> >> -Eric >> >> >> On 5/18/11 5:31 AM, "Arun" wrote: >> >>> Hi everyone, >>> I am trying to compile WRF on an intel machine but keep getting error >>> related to module_initialize_real.f90 file. I'm attaching my log file >>> along with this mail. Please see if someone can help me. Googling didn't >>> help. >>> >>> Regards, >>> Arun >> -------------------------------------------------------------------- >> Eric M.
Kemp Northrop Grumman >> Meteorologist Information Systems >> Civil Enterprise Solutions Civil Systems Division >> >> Goddard Space Flight Center >> Mailstop 610.3 >> Greenbelt, MD 20771 >> Telephone 301-286-9768 >> Fax 301-286-1775 >> E-mail: eric.kemp at nasa.gov >> E-mail: eric.kemp at ngc.com >> -------------------------------------------------------------------- >> >> >> >> _______________________________________________ >> Wrf-users mailing list >> Wrf-users at ucar.edu >> http://mailman.ucar.edu/mailman/listinfo/wrf-users >> From maemarcus at gmail.com Thu May 19 04:23:20 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Thu, 19 May 2011 14:23:20 +0400 Subject: [Wrf-users] WRF 3.2.1 compilation errors In-Reply-To: <4DD4E59E.5000501@iitk.ac.in> References: <4DD391E0.1080709@iitk.ac.in> <4DD4E59E.5000501@iitk.ac.in> Message-ID: Ok, just in case: you can also install an alternative version of gfortran locally, in your personal account only, or switch between multiple versions at the system level using "module load", "update-alternatives", or links. - D. 2011/5/19 Arun : > Hi, > I'm using gfortran version 4.2.4 right now but as I'm using a shared system > updating gfortran is not an option. As Eric suggested, I will edit my > module_initialize_real.F to remove those pointer and get back to you guys if > anything goes wrong. > > Arun > > On Thursday 19 May 2011 03:33 AM, Dmitry N. Mikushin wrote: >> >> Arun, >> >> Could you check how old is your gfortran version and update it if it >> is too old? I've just checked gfortran 4.5 handle pointers marked >> intent(in) without any problems. >> >> - D. >> >> 2011/5/18 Kemp, Eric M. (GSFC-610.3)[NORTHROP >> GRUMMAN]: >>> >>> The problem is the "intent(in)" and "intent(out)" attributes with the >>> pointers in module_initialize_real.F. Technically it is illegal in >>> Fortran >>> 90 to use these attributes with pointer arguments (Fortran 2003 allows >>> it).
>>> Most Fortran compilers accept it, but the gfortran compiler is not one of >>> them. >>> >>> The fix is to modify module_initialize_real.F and remove the attributes, >>> then recompile. >>> >>> -Eric >>> >>> >>> On 5/18/11 5:31 AM, "Arun" wrote: >>> >>>> Hi everyone, >>>> I am trying to compile WRF on an intel machine but keep getting error >>>> related to module_initialize_real.f90 file. I'm attaching my log file >>>> along with this mail. Please see if someone can help me. Googling didn't >>>> help. >>>> >>>> Regards, >>>> Arun >>> >>> -------------------------------------------------------------------- >>> Eric M. Kemp Northrop Grumman Corporation >>> Meteorologist Information Systems >>> Civil Enterprise Solutions Civil Systems Division >>> >>> Goddard Space Flight Center >>> Mailstop 610.3 >>> Greenbelt, MD 20771 >>> Telephone 301-286-9768 >>> Fax 301-286-1775 >>> E-mail: eric.kemp at nasa.gov >>> E-mail: eric.kemp at ngc.com >>> -------------------------------------------------------------------- >>> >>> >>> >>> _______________________________________________ >>> Wrf-users mailing list >>> Wrf-users at ucar.edu >>> http://mailman.ucar.edu/mailman/listinfo/wrf-users >>> > > From nschiff2 at atmos.uiuc.edu Thu May 19 09:37:19 2011 From: nschiff2 at atmos.uiuc.edu (Nicole Schiffer) Date: Thu, 19 May 2011 10:37:19 -0500 Subject: [Wrf-users] Error in boundary condition specification Message-ID: <4DD5392F.1050804@earth.uiuc.edu> I am trying to run the tutorial case that is one-way nesting using ndown.
Running real.exe gives me the following error in three of the four rsl.out files: Domain 2: Current date being processed: 2005-08-28_00:00:00.0000, which is loop # 1 out of 5 configflags%julyr, %julday, %gmt: 2005 240 0.000000 metgrid input_wrf.F first_date_input = 2005-08-28_00:00:00 metgrid input_wrf.F first_date_nml = 2005-08-28_00:00:00 d02 2005-08-28_00:00:00 Timing for input 0 s. d02 2005-08-28_00:00:00 flag_soil_layers read from met_em file is 1 *** Error in boundary condition specification boundary conditions at xs 0 boundary conditions at xe 0 boundary conditions at ys 0 boundary conditions at ye 0 boundary conditions logicals are periodic_x F periodic_y F symmetric_xs F symmetric_xe F symmetric_ys F symmetric_ye F open_xs F open_xe F open_ys F open_ye F polar F nested F specified F -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 165 *** Error in boundary condition specification ------------------------------------------- Where should I look to solve this problem? Thanks, Nicole -- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ * Nicole Schiffer * Graduate Research Fellow (Dept. of Energy) * Department of Atmospheric Sciences * University of Illinois, Urbana-Champaign * Email: nschiff2 [at] illinois [dot] edu ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110519/49439aac/attachment.html From wrf at nusculus.com Wed May 18 20:45:05 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Wed, 18 May 2011 20:45:05 -0600 Subject: [Wrf-users] WRF-CCSM In-Reply-To: References: Message-ID: Hi Jatin, I noticed that nobody gave you an answer, so I thought I would give it a try: Within WRF, Noah LSM does not need continuous soil temperature and moisture data. real.exe creates data in the wrfinput_d0? 
file for initial conditions, but from there the LSMs model soil temperature and moisture themselves based on what the atmosphere does, which in turn affects the atmosphere, of course. Even if you use the sst update option, there is no soil moisture or temperature data in the wrflowinp_d0? files. The WPS process uses a lot of variables, but some only make it into the initial conditions, not the ongoing forcings. Regarding the intermediate format - the users guide gives a description of the format, and it also mentions that the source code metgrid/src/write_met_module.F90 can be used as a starting point. You probably knew that already. The netCDF folks at Unidata (Google netCDF) have library interfaces for several programming languages, so you might find one that fits your programming experience. I have not written the intermediate format myself, so I can't help with any source code. Sorry. Best Wishes, Kevin On Wed, May 4, 2011 at 3:12 AM, Jatin Kala wrote: > Hi there, > > > > I am trying to run WRF with CCSM data, and am starting to write some code > to do the conversion from netCDF to 'intermediate' format. Turns out it's a > fairly massive job to do from scratch, and I am just wondering if somebody > has already done some of that and is willing to share their code? > > > > One thing I am yet to figure out is that the Noah LSM needs 6-hourly soil > data (I am pretty sure), but 6-hourly soil data is seldom available from any > GCM! The best I have found is some monthly means. Does this mean I have to > somehow interpolate monthly means to 6-hourly? Any suggestions? > > > > Cheers, > > > > Jatin > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110518/04b851c8/attachment.html From ela at cola.iges.org Thu May 19 14:13:49 2011 From: ela at cola.iges.org (Eric Altshuler) Date: Thu, 19 May 2011 16:13:49 -0400 (EDT) Subject: [Wrf-users] Error in boundary condition specification In-Reply-To: <4DD5392F.1050804@earth.uiuc.edu> Message-ID: <155597868.39156.1305836029704.JavaMail.root@mail.iges.org> Hi Nicole, A domain has to get its lateral boundary conditions from either the forcing dataset (e.g. GFS, NAM, ECMWF etc.) or its parent domain. If the domain is the "coarse" domain (i.e. has the lowest resolution), it is numbered as domain 1 and gets its lateral boundary conditions from the forcing dataset. Your namelist.input should have specified = .true., .false., .false., [.false., ...] Your error message pertains to domain 2. If it's a nest (which it most likely is, since there is usually only one coarse domain), your namelist.input should indicate this as follows: nested = .false., .true., .true., [.true., ...] Your problem is that for domain 2, 'nested' is .false. in your namelist. Change the setting of 'nested' to .true. for all nest domains. Best regards, Eric L. Altshuler Assistant Research Scientist Center for Ocean-Land-Atmosphere Studies 4041 Powder Mill Road, Suite 302 Calverton, MD 20705-3106 USA E-mail: ela at cola.iges.org Phone: (301) 902-1257 Fax: (301) 595-9793 ----- Original Message ----- From: "Nicole Schiffer" To: wrf-users at ucar.edu Sent: Thursday, May 19, 2011 11:37:19 AM Subject: [Wrf-users] Error in boundary condition specification I am trying to run the tutorial case that is one-way nesting using ndown. 
Running real.exe gives me the following error in three of the four rsl.out files: Domain 2: Current date being processed: 2005-08-28_00:00:00.0000, which is loop # 1 out of 5 configflags%julyr, %julday, %gmt: 2005 240 0.000000 metgrid input_wrf.F first_date_input = 2005-08-28_00:00:00 metgrid input_wrf.F first_date_nml = 2005-08-28_00:00:00 d02 2005-08-28_00:00:00 Timing for input 0 s. d02 2005-08-28_00:00:00 flag_soil_layers read from met_em file is 1 *** Error in boundary condition specification boundary conditions at xs 0 boundary conditions at xe 0 boundary conditions at ys 0 boundary conditions at ye 0 boundary conditions logicals are periodic_x F periodic_y F symmetric_xs F symmetric_xe F symmetric_ys F symmetric_ye F open_xs F open_xe F open_ys F open_ye F polar F nested F specified F -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 165 *** Error in boundary condition specification ------------------------------------------- Where should I look to solve this problem? Thanks, Nicole -- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ * Nicole Schiffer * Graduate Research Fellow (Dept. of Energy) * Department of Atmospheric Sciences * University of Illinois, Urbana-Champaign * Email: nschiff2 [at] illinois [dot] edu ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users From Michael.Zulauf at iberdrolaren.com Fri May 20 13:39:37 2011 From: Michael.Zulauf at iberdrolaren.com (Zulauf, Michael) Date: Fri, 20 May 2011 12:39:37 -0700 Subject: [Wrf-users] adaptive time step problems continue in WRF 3.3? Message-ID: Hi all. . . While I (and others) had problems with adaptive time-stepping in WRF 3.2, I was hoping that this would be fixed in WRF 3.3. Unfortunately, I'm still having problems - although of a different sort. 
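[Editor's aside: the arithmetic of the time mismatch in the log excerpt further down is worth spelling out. Whatever step sequence the adaptive scheme chooses, the model clock only lands on an FDDA analysis time if the steps divide evenly into the interval. Using the domain-2 timestamps from the rsl.error output (48 s steps from 05:56:08), the clock steps straight past 06:00:00; a small sketch reproducing those numbers:

```python
def step_past(start_s, target_s, dt_s):
    """Advance a model clock in fixed steps until it reaches or passes target_s."""
    t = start_s
    while t < target_s:
        t += dt_s
    return t

start = 5 * 3600 + 56 * 60 + 8   # 05:56:08, from the rsl.error excerpt
analysis = 6 * 3600              # 06:00:00 FDDA analysis time
overshoot = step_past(start, analysis, 48) - analysis
# overshoot is 8 s: the domain clock reads 06:00:08, the same
# "Time on domain" that WRF then compares against wrffdda_d02
```

This suggests, as a plausible reading rather than a confirmed diagnosis, that the crash comes from the adaptive step failing to align with the gfdda_interval_m boundaries.]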
In WRF 3.2 I had problems with the data being output at times slightly "off" from when they were supposed to be output (despite using the namelist options that are supposed to fix that). In WRF 3.3, I've gotten a crash that appears to be related to Grid FDDA. This is for a test case that worked with adaptive time-stepping turned off. The basics of the run are that there are 4 domains (27km, 9km, 3km, 1km), initialized and forced with 0.5 degree GFS, with the nested grids being initialized at 3 hour intervals. The crash occurs right after domain 3 is initialized (at hour 6), but the message is that the time in the input file is not the time on the domain for the FDDA file for domain 2. I've included the bottom portion of rsl.error.0000 (below), as well as my namelist.input. Any thoughts on this? Is this something that can be handled by changing some of my namelist options? Is this a problem with adaptive time stepping and FDDA? Thanks, Mike --------------------------------- rsl.error.0000 ---------------------------------------------- d02 2011-05-10_05:56:08+**/** *** Initializing nest domain # 3 from an input file. *** d02 2011-05-10_05:56:08+**/** med_initialdata_input: calling input_auxinput2 Timing for processing wrfinput file (stream 0) for domain 3: 1.82700 elapsed seconds. INPUT LandUse = "USGS" D03 3-D analysis nudging for wind is applied and Guv= 0.1000E-03 D03 3-D analysis nudging for temperature is applied and Gt= 0.1000E-03 D03 3-D analysis nudging for water vapor mixing ratio is applied and Gq= 0.1000E-05 D03 3-D analysis nudging for wind is turned off within the PBL. D03 3-D analysis nudging for temperature is turned off within the PBL. D03 3-D analysis nudging for water vapor mixing ratio is turned off within the PBL. 
D03 3-D analysis nudging for wind is turned off below layer 19 D03 3-D analysis nudging for temperature is turned off below layer 10 D03 3-D analysis nudging for water vapor mixing ratio is turned off below layer 10 D03 analysis nudging is ramped down near the end of the nudging period, starting at 17.00h, ending at 18.00h. INPUT LandUse = "USGS" D03 3-D analysis nudging for wind is applied and Guv= 0.1000E-03 D03 3-D analysis nudging for temperature is applied and Gt= 0.1000E-03 D03 3-D analysis nudging for water vapor mixing ratio is applied and Gq= 0.1000E-05 D03 3-D analysis nudging for wind is turned off within the PBL. D03 3-D analysis nudging for temperature is turned off within the PBL. D03 3-D analysis nudging for water vapor mixing ratio is turned off within the PBL. D03 3-D analysis nudging for wind is turned off below layer 19 D03 3-D analysis nudging for temperature is turned off below layer 10 D03 3-D analysis nudging for water vapor mixing ratio is turned off below layer 10 D03 analysis nudging is ramped down near the end of the nudging period, starting at 17.00h, ending at 18.00h. Timing for main (dt= 48.00): time 2011-05-10_05:56:56 on domain 2: 11.03700 elapsed seconds. Timing for main (dt= 48.00): time 2011-05-10_05:57:44 on domain 2: 8.30000 elapsed seconds. Timing for main (dt= 48.00): time 2011-05-10_05:58:32 on domain 2: 4.54300 elapsed seconds. Timing for main (dt= 48.00): time 2011-05-10_05:59:20 on domain 2: 4.84000 elapsed seconds. Timing for Writing wrfout_d03_2011-05-10_06:00:00 for domain 3: 17.89600 elapsed seconds. d03 2011-05-10_06:00:00 Input data processed for aux input 10 for domain 3 WRF TILE 1 IS 1 IE 70 JS 1 JE 36 WRF NUMBER OF TILES = 1 D03 3-D analysis nudging reads new data at time = 360.000 min. D03 3-D analysis nudging bracketing times = 360.00 540.00 min. Timing for main (dt= 20.00): time 2011-05-10_06:00:16 on domain 3: 27.35300 elapsed seconds. 
Timing for main (dt= 48.00): time 2011-05-10_06:00:08 on domain 2: 32.13700 elapsed seconds. Timing for main (dt=240.00): time 2011-05-10_06:00:00 on domain 1: 64.06500 elapsed seconds. Timing for Writing wrfout_d01_2011-05-10_06:00:00 for domain 1: 13.37800 elapsed seconds. d01 2011-05-10_06:00:00 Input data processed for aux input 10 for domain 1 Timing for processing lateral boundary for domain 1: 2.06900 elapsed seconds. D01 3-D analysis nudging reads new data at time = 360.000 min. D01 3-D analysis nudging bracketing times = 360.00 540.00 min. Timing for Writing wrfout_d02_2011-05-10_06:00:00 for domain 2: 61.44100 elapsed seconds. Time in file: 2011-05-10_06:00:00 Time on domain: 2011-05-10_06:00:08 **WARNING** Time in input file not equal to time on domain **WARNING** **WARNING** Trying next time in file wrffdda_d02 ... Time in file: 2011-05-10_09:00:00 Time on domain: 2011-05-10_06:00:08 **WARNING** Time in input file not equal to time on domain **WARNING** **WARNING** Trying next time in file wrffdda_d02 ... Time in file: 2011-05-10_12:00:00 Time on domain: 2011-05-10_06:00:08 **WARNING** Time in input file not equal to time on domain **WARNING** **WARNING** Trying next time in file wrffdda_d02 ... Time in file: 2011-05-10_15:00:00 Time on domain: 2011-05-10_06:00:08 **WARNING** Time in input file not equal to time on domain **WARNING** **WARNING** Trying next time in file wrffdda_d02 ... 3 input_wrf: wrf_get_next_time current_date: 2011-05-10_15:00:00 Status = -4 -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 751 ... 
Could not find matching time in input file wrffdda_d02 ------------------------------------------------------------------------ ----------------------- ----------------------------------- namelist.input -------------------------------------------- &time_control run_days = 0, run_hours = 18, run_minutes = 0, run_seconds = 0, start_year = 2011,2011,2011,2011, start_month = 05,05,05,05, start_day = 10,10,10,10, start_hour = 00,03,06,09, start_minute = 00, 00, 00, 00, 00, 00, start_second = 00, 00, 00, 00, 00, 00, end_year = 2011,2011,2011,2011, end_month = 05,05,05,05, end_day = 10,10,10,10, end_hour = 18,18,18,18, end_minute = 00, 00, 00, 00, 00, 00, end_second = 00, 00, 00, 00, 00, 00, interval_seconds = 10800, input_from_file = .true.,.true.,.true.,.true.,.true., fine_input_stream = 0, 2, 2, 2, io_form_auxinput2 = 2 history_interval = 180,180,60,20, frames_per_outfile = 1, 1, 1, 1, 1, 1, restart = .false., restart_interval = 1440, io_form_history = 2 io_form_restart = 2 io_form_input = 2 io_form_boundary = 2 debug_level = 0 adjust_output_times = .true. / &domains time_step = 180, time_step_fract_num = 0, time_step_fract_den = 1, max_dom = 4, s_we = 1,1,1,1, e_we = 268,622,280,382, s_sn = 1,1,1,1, e_sn = 184,418,250,196, s_vert = 1, 1, 1, 1, 1, 1, e_vert = 31, 31, 31, 31, 31, 31, num_metgrid_levels = 27 , eta_levels = 1.000, 0.993, 0.980, 0.966, 0.950, 0.933, 0.913, 0.892, 0.869, 0.844, 0.816, 0.786, 0.753, 0.718, 0.680, 0.639, 0.596, 0.550, 0.501, 0.451, 0.398, 0.345, 0.290, 0.236, 0.188, 0.145, 0.108, 0.075, 0.046, 0.021, 0.000, p_top_requested = 5000, dx = 27000,9000,3000,1000, dy = 27000,9000,3000,1000, grid_id = 1, 2, 3, 4, 5, 6, parent_id = 1, 1, 2, 3, 4, 5, i_parent_start = 1, 31, 91, 92, j_parent_start = 1, 23,183, 93, parent_grid_ratio = 1, 3, 3, 3, 3, 3, parent_time_step_ratio = 1, 3, 3, 3, 3, 3, feedback = 0, smooth_option = 2 use_adaptive_time_step = .true. step_to_output_time = .true. 
target_cfl = 1.1,1.1,1.1,1.1, max_step_increase_pct = 5, 51, 51, 51, 51, 51 starting_time_step = 180, 60, 20, 6.66666667 max_time_step = 240, 80, 26.66666667, 8.88888889 min_time_step = 27, 9, 3, 1 adaptation_domain = 4 / &physics mp_physics = 5, 5, 5, 5, ra_lw_physics = 1, 1, 1, 1, ra_sw_physics = 1, 1, 1, 1, radt = 30, 30, 30, 30, 30, 30, sf_sfclay_physics = 1, 1, 1, 1, sf_surface_physics = 1, 1, 1, 1, bl_pbl_physics = 1, 1, 1, 1, bldt = 0, 0, 0, 0, 0, 0, cu_physics = 1, 1, 0, 0, 0, 0, cudt = 5, 5, 5, 0, 0, 0, cam_abs_freq_s = 21600, levsiz = 59, paerlev = 29, cam_abs_dim1 = 4, cam_abs_dim2 = 31, isfflx = 1, ifsnow = 0, icloud = 1, surface_input_source = 1, num_soil_layers = 5, sf_urban_physics = 0, 0, 0, 0, mp_zero_out = 0, maxiens = 1, maxens = 3, maxens2 = 3, maxens3 = 16, ensdim = 144, slope_rad = 0, topo_shading = 0, / &fdda grid_fdda = 1, 1, 1, 1, gfdda_inname = "wrffdda_d", gfdda_interval_m = 180, 180, 180, 180, gfdda_end_h = 18, 18, 18, 18, io_form_gfdda = 2, fgdt = 0, 0, 0, 0, if_no_pbl_nudging_uv = 1, 1, 1, 1, if_no_pbl_nudging_t = 1, 1, 1, 1, if_no_pbl_nudging_q = 1, 1, 1, 1, if_zfac_uv = 1, 1, 1, 1, k_zfac_uv = 19, 19, 19, 19, if_zfac_t = 1, 1, 1, 1, k_zfac_t = 10, 10, 10, 10, if_zfac_q = 1, 1, 1, 1, k_zfac_q = 10, 10, 10, 10, guv = 0.0001, 0.0001, 0.0001, 0.0001, gt = 0.0001, 0.0001, 0.0001, 0.0001, gq = 0.000001, 0.000001, 0.000001, 0.000001, if_ramping = 1, dtramp_min = -60.0, / &dynamics w_damping = 1, diff_opt = 1, km_opt = 4, diff_6th_opt = 0, diff_6th_factor = 0.12, base_temp = 290. damp_opt = 0, zdamp = 5000., 5000., 5000., dampcoef = 0.01, 0.01, 0.01 khdif = 0, 0, 0, kvdif = 0, 0, 0, non_hydrostatic = .true., .true., .true., moist_adv_opt = 1, 1, 1, 1 scalar_adv_opt = 1, 1, 1, 1 use_baseparam_fr_nml = .true. 
/ &bdy_control spec_bdy_width = 5, spec_zone = 1, relax_zone = 4, specified = .true., .false.,.false.,.false.,.false., .false., nested = .false., .true., .true.,.true., .true., .true., / &grib2 / &namelist_quilt nio_tasks_per_group = 0, nio_groups = 1, / ------------------------------------------------------------------------ ----------------------- -- PLEASE NOTE - NEW E-MAIL ADDRESS: michael.zulauf at iberdrolaren.com Mike Zulauf Meteorologist, Lead Senior Wind Asset Management Iberdrola Renewables 1125 NW Couch, Suite 700 Portland, OR 97209 Office: 503-478-6304 Cell: 503-913-0403 Please be advised that email addresses for Iberdrola Renewables personnel have changed to first.last at iberdrolaREN.com effective Aug. 16, 2010. Please make a note. Thank you. This message is intended for the exclusive attention of the recipient(s) indicated. Any information contained herein is strictly confidential and privileged. If you are not the intended recipient, please notify us by return e-mail and delete this message from your computer system. Any unauthorized use, reproduction, alteration, filing or sending of this message and/or any attached files may lead to legal action being taken against the party(ies) responsible for said unauthorized use. Any opinion expressed herein is solely that of the author(s) and does not necessarily represent the opinion of the Company. The sender does not guarantee the integrity, speed or safety of this message, and does not accept responsibility for any possible damage arising from the interception, incorporation of viruses, or any other damage as a result of manipulation. From ekimalab at gmail.com Wed May 25 01:07:11 2011 From: ekimalab at gmail.com (Michael Bala) Date: Wed, 25 May 2011 15:07:11 +0800 Subject: [Wrf-users] ungrib.exe make error Message-ID: I tried compiling WPS but it failed to make ungrib.exe. I have installed jasper, png, and zlib. And the NCL works too. 
I did a grep of the compile.log and found these errors: make[2]: [enc_png.o] Error 1 (ignored) > make[2]: [dec_png.o] Error 1 (ignored) > make[2]: [libg2_4.a] Error 1 (ignored) > make[1]: [ungrib.exe] Error 1 (ignored) > make[2]: [enc_png.o] Error 1 (ignored) > make[2]: [dec_png.o] Error 1 (ignored) > make[2]: [libg2_4.a] Error 1 (ignored) > make[2]: [enc_png.o] Error 1 (ignored) > make[2]: [dec_png.o] Error 1 (ignored) > make[2]: [libg2_4.a] Error 1 (ignored) > make[1]: [g2print.exe] Error 1 (ignored) > make[1]: [plotfmt.exe] Error 1 (ignored) > make[1]: [plotgrids.exe] Error 1 (ignored) > Also, ar: enc_png.o: No such file or directory > ar: enc_png.o: No such file or directory > ar: enc_png.o: No such file or directory > And, ld: cannot find -lg2_4 > ld: cannot find -lg2_4 > Are these related? I hope you can help me out. Thanks, Mike -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110525/841d2893/attachment.html From carlosmgd5 at hotmail.com Wed May 25 14:02:10 2011 From: carlosmgd5 at hotmail.com (Carlos Mario Gonzalez Duque) Date: Wed, 25 May 2011 15:02:10 -0500 Subject: [Wrf-users] Problem running ndown Message-ID: Hi, I am running the WRF model with 2 domains, using ndown for this purpose. However, I have a problem running ndown and I'm at a loss for a solution.
I can generate correctly the input files to run the program, but after running ndown the following error appears in rsl.error file: -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 355 program ndown: opening too many files ------------------------------------------- The namelist.input file that I have been using is: &time_control run_days = 0, run_hours = 12, run_minutes = 0, run_seconds = 0, start_year = 2010, 2010, start_month = 10, 10, start_day = 15, 15, start_hour = 00, 00, end_year = 2010, 2010, end_month = 10, 10, end_day = 16, 16, end_hour = 00, 00, interval_seconds = 21600, input_from_file = .true., .true., history_interval = 30, 30, frames_per_outfile = 1000, 1000, restart = .false., restart_interval = 720, io_form_history = 2, io_form_restart = 2, io_form_input = 2, io_form_boundary = 2, io_form_auxinput2 = 2, debug_level = 0, / &domains time_step = 90, time_step_fract_num = 0, time_step_fract_den = 1, max_dom = 2, s_we = 1, 1, e_we = 244, 292, s_sn = 1, 1, e_sn = 187, 235, s_vert = 1, 1, e_vert = 28, 28, p_top_requested = 5000, num_metgrid_levels = 27, num_metgrid_levels = 27, num_metgrid_soil_levels = 4, dx = 2766.667, 922.222, dy = 2766.667, 922.222, grid_id = 1, 2, parent_id = 1, 1, i_parent_start = 1, 57, j_parent_start = 1, 48, parent_grid_ratio = 1, 3, parent_time_step_ratio = 1, 3, feedback = 0, smooth_option = 0, use_adaptive_time_step = .true., step_to_output_time = .true., target_cfl = 1.2, max_step_increase_pct = 5, starting_time_step = -1, max_time_step = -1, min_time_step = -1, / &physics mp_physics = 3, 3, ra_lw_physics = 1, 1, ra_sw_physics = 1, 1, radt = 30, 30, sf_sfclay_physics = 1, 1, sf_surface_physics = 2, 2, bl_pbl_physics = 1, 1, bldt = 0, 0, cu_physics = 1, 1, cudt = 5, 5, isfflx = 1, ifsnow = 0, icloud = 1, surface_input_source = 1, num_soil_layers = 4, sf_urban_physics = 0, 0, maxiens = 1, maxens = 3, maxens2 = 3, maxens3 = 16, ensdim = 144, tmn_update = 1, sst_skin = 1, / &fdda / &dynamics w_damping = 
0, diff_opt = 1, km_opt = 4, diff_6th_opt = 0, 0, diff_6th_factor = 0.12, 0.12, base_temp = 290., damp_opt = 0, zdamp = 5000., 5000., dampcoef = 0.2, 0.2, khdif = 0, 0, kvdif = 0, 0, non_hydrostatic = .true., .true., moist_adv_opt = 1, 1, scalar_adv_opt = 1, 1, / &bdy_control spec_bdy_width = 5, spec_zone = 1, relax_zone = 4, spec_exp = 0.33, specified = .true., nested = .false., .true., / &grib2 / &namelist_quilt nio_tasks_per_group = 0, nio_groups = 1, / I'm new to the WRF model, so I would really appreciate it if anyone can help me with this problem. Thanks for your attention ----------------------------------- Carlos Mario Gonzalez Duque Ingeniero Químico Universidad Nacional de Colombia Sede Manizales -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110525/87375a63/attachment.html From hamed319 at yahoo.com Sun May 22 05:06:56 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Sun, 22 May 2011 04:06:56 -0700 (PDT) Subject: [Wrf-users] Changing land use Message-ID: <358161.27758.qm@web161215.mail.bf1.yahoo.com> Dear All, I want to change some values of my landuse in order to see how it affects the whole domain. For example, change part of the landuse to Forest or Lake. Does anyone know about this issue? Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110522/19d6ad3b/attachment-0001.html From jason.evans at unsw.edu.au Wed May 25 01:03:33 2011 From: jason.evans at unsw.edu.au (Jason Evans) Date: Wed, 25 May 2011 17:03:33 +1000 Subject: [Wrf-users] WRF modelling postdoctoral position at UNSW Sydney, Australia Message-ID: <4DDCA9C5.5000605@unsw.edu.au> Dear WRF-users, Below is a position available for a WRF modeller to work in the Climate Change Research Centre at the University of New South Wales, Sydney, Australia. Please forward to anyone you think may be interested. Cheers, Jason Research Associate Job Reference: 8012NET *Salary Level A/B:* A$72,675 - A$96,231 per year, plus 17% employer superannuation plus leave loading. The salary level will be commensurate with experience. Applications are invited from suitably qualified researchers to join the Climate Change Research Centre on a project led by Dr Jason Evans to produce an ensemble of regional climate projections for south-east Australia using the Weather Research and Forecasting (WRF) modeling system. This project is funded by the New South Wales Office of Environment and Heritage. The successful applicant will have relevant experience in atmospheric modelling, coupled climate modelling, regional climate modelling and climate variability/change. An individual with a PhD in quantitative applied mathematics or physics would also likely be very competitive without explicit experience in regional climate modelling. The ideal candidate will also possess demonstrated programming experience in a Unix/Linux environment (e.g. Fortran 9X, shell scripts, NCL, R, Matlab). This is a full-time, fixed-term position for two years with the possibility of extension to three years. For more information http://www.hr.unsw.edu.au/services/recruitment/jobs/20051104.html -- Dr. Jason P. 
Evans ARC Australian Research Fellow Senior Lecturer Climate Change Research Centre University of New South Wales Sydney, NSW, 2052 Australia email: jason.evans at unsw.edu.au ph: +61-2-9385 7066 fax: +61-2-9385 7123 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110525/84c07248/attachment.html From ahsanshah01 at gmail.com Wed May 25 22:00:13 2011 From: ahsanshah01 at gmail.com (Ahsan Ali) Date: Thu, 26 May 2011 09:00:13 +0500 Subject: [Wrf-users] ungrib.exe make error Message-ID: > > Please check the path to the WRF directory in configure.wps (WRF_DIR > = ../WRFV3) before compiling. > > > Message: 1 > Date: Wed, 25 May 2011 15:07:11 +0800 > From: Michael Bala > Subject: [Wrf-users] ungrib.exe make error > To: wrf-users at ucar.edu > Message-ID: > Content-Type: text/plain; charset="iso-8859-1" > > I tried compiling WPS but it failed to make ungrib.exe. > > I have installed jasper, png, and zlib. And the NCL works too. > > I did a grep of the compile.log and found these errors: > > make[2]: [enc_png.o] Error 1 (ignored) > > make[2]: [dec_png.o] Error 1 (ignored) > > make[2]: [libg2_4.a] Error 1 (ignored) > > make[1]: [ungrib.exe] Error 1 (ignored) > > make[2]: [enc_png.o] Error 1 (ignored) > > make[2]: [dec_png.o] Error 1 (ignored) > > make[2]: [libg2_4.a] Error 1 (ignored) > > make[2]: [enc_png.o] Error 1 (ignored) > > make[2]: [dec_png.o] Error 1 (ignored) > > make[2]: [libg2_4.a] Error 1 (ignored) > > make[1]: [g2print.exe] Error 1 (ignored) > > make[1]: [plotfmt.exe] Error 1 (ignored) > > make[1]: [plotgrids.exe] Error 1 (ignored) > > > > Also, > > ar: enc_png.o: No such file or directory > > ar: enc_png.o: No such file or directory > > ar: enc_png.o: No such file or directory > > > > And, > > ld: cannot find -lg2_4 > > ld: cannot find -lg2_4 > > > > Are these related? I hope you can help me out. 
> > > Thanks, > > Mike > -- Syed Ahsan Ali Bokhari Electronic Engineer (EE) Research & Development Division Pakistan Meteorological Department H-8/4, Islamabad. Phone # off +92518358714 Cell # +923155145014 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110526/030d1168/attachment.html From J.Kala at murdoch.edu.au Wed May 25 19:11:02 2011 From: J.Kala at murdoch.edu.au (Jatin Kala) Date: Thu, 26 May 2011 09:11:02 +0800 Subject: [Wrf-users] Changing land use In-Reply-To: <358161.27758.qm@web161215.mail.bf1.yahoo.com> References: <358161.27758.qm@web161215.mail.bf1.yahoo.com> Message-ID: One method would be to run geogrid.exe and modify LU_index in the geo_em* files, and re-write the geo_em* files. I have not tried that myself. Cheers, Jatin From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Hamed Sharifi Sent: Sunday, 22 May 2011 7:07 PM To: wrf-users at ucar.edu Subject: [Wrf-users] Changing land use Dear All, I want to change some values of my landuse in order to see how it affects the whole domain. For example, change part of the landuse to Forest or Lake. Does anyone know about this issue? Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran |hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110526/6744c04e/attachment.html From ybao2009 at gmail.com Wed May 25 17:08:51 2011 From: ybao2009 at gmail.com (Yan Bao) Date: Wed, 25 May 2011 17:08:51 -0600 Subject: [Wrf-users] wrf-les running Message-ID: Hi, Everyone, I'm trying to run WRF-LES with the default case (no changes from README.namelist and input_sounding in test/em_les/), but cannot get it to run successfully. 
Firstly, when running ideal.exe, I got NaN values in the rsl.out.0000 file, though rsl.error.0000 and rsl.out.0000 ended with 'wrf initialized successfully', and the wrfinput_d01 file was generated with all LU_INDEX values equal to 0. After changing ztop in namelist.input from 19000 to 10000 m, no NaN data appeared in the rsl.out.0000 file any more (see the following). I'm not sure whether this kind of change is reasonable or not. When running wrf.exe, I got the message below (whether or not I adjusted ztop in the namelist.input file). BTW, I have set 'ulimit -s unlimited' in the wrf running script, and the geo*nc file was provided in the WPS subdirectory. The namelist.input file is also posted below. Based on these problems, I have a couple of questions: 1. How do I determine ztop in the namelist.input file? Is it related to "p_top_requested" or the sounding data provided? 2. How does the WRF model initialize land surface variables (LU_INDEX, soil) in an ideal case? It seems wrf.exe cannot run because MMINLU was not defined. Do I need to set it up in the code or provide more surface information in the namelist.input file? But how am I supposed to do that, since as far as I know almost all the surface-related options are for the real case? 3. Is anyone willing to post his/her namelist.input file so I can take it as an example? Thanks, any help is appreciated. Yan If anyone has met this kind of problem, how did you handle it? Can you post your namelist.input file here? Here is rsl.out.0000 for running ideal.exe: returned from reading sounding, nl_in is 41 ptop is 4755.000 --------------- (is this reasonable ?) base state grid%mub(1,1), p_surf is 95246.47 100001.5 getting moist sounding for full state input sounding surface parameters surface pressure (mb) 1000.000 surface pot. 
temp (K) 305.0000 surface mixing ratio (g/kg) 14.00000 1 0.250E+02 0.300E+03 0.100E+02 0.000E+00 0.000E+00 2 0.750E+02 0.300E+03 0.100E+02 0.000E+00 0.000E+00 3 0.125E+03 0.300E+03 0.100E+02 0.000E+00 0.000E+00 4 0.175E+03 0.300E+03 0.100E+02 0.000E+00 0.000E+00 ... 41 0.205E+03 .... grid%ph_1 calc 0.000000 38.26545 95099.04 -147.4297 0.8720049 1.2673497E-02 -27.00002 nxc, nyc for perturbation 45 40 delt for perturbation 3.000000 grid%mu_1 from comp -147.4297 full state sounding from comp, ph, grid%p, grid%al, grid%t_1, qv 1 0.000E+00 9.852E+04 8.843E-01 3.000E+02 1.000E-02 2 3.115E+03 9.496E+04 9.079E-01 3.000E+02 1.000E-02 3 6.313E+03 9.140E+04 9.330E-01 3.000E+02 1.000E-02 4 9.599E+03 8.786E+04 9.769E-01 3.083E+02 4.000E-03 5 1.304E+04 8.432E+04 1.009E+00 3.094E+02 4.000E-03 6 1.659E+04 8.078E+04 1.044E+00 3.105E+02 4.000E-03 7 2.027E+04 7.725E+04 1.082E+00 3.116E+02 4.000E-03 8 2.409E+04 7.371E+04 1.123E+00 3.127E+02 4.000E-03 9 2.804E+04 7.018E+04 1.168E+00 3.139E+02 4.000E-03 10 3.215E+04 6.664E+04 1.216E+00 3.150E+02 4.000E-03 11 3.644E+04 6.310E+04 1.269E+00 3.162E+02 4.000E-03 12 4.091E+04 5.957E+04 1.327E+00 3.173E+02 4.000E-03 13 4.558E+04 5.603E+04 1.391E+00 3.185E+02 4.000E-03 14 5.048E+04 5.249E+04 1.463E+00 3.196E+02 4.000E-03 15 5.563E+04 4.896E+04 1.543E+00 3.207E+02 4.000E-03 16 6.107E+04 4.542E+04 1.634E+00 3.219E+02 4.000E-03 17 6.682E+04 4.189E+04 1.737E+00 3.230E+02 4.000E-03 18 7.294E+04 3.835E+04 1.857E+00 3.242E+02 4.000E-03 19 7.948E+04 3.481E+04 1.996E+00 3.253E+02 4.000E-03 20 8.651E+04 3.128E+04 2.163E+00 3.264E+02 4.000E-03 21 9.413E+04 2.774E+04 2.365E+00 3.276E+02 4.000E-03 22 1.025E+05 2.420E+04 2.616E+00 3.287E+02 4.000E-03 23 1.117E+05 2.067E+04 2.938E+00 3.299E+02 4.000E-03 24 1.220E+05 1.713E+04 3.371E+00 3.310E+02 4.000E-03 25 1.339E+05 1.360E+04 3.990E+00 3.321E+02 4.000E-03 26 1.479E+05 1.006E+04 4.965E+00 3.333E+02 4.000E-03 27 1.654E+05 6.523E+03 6.788E+00 3.344E+02 4.000E-03 pert state sounding from comp, grid%ph_1, pp, 
alp, grid%t_1, qv 1 0.000E+00 2.815E+02 1.234E-02 4.079E-02 1.000E-02 2 3.869E+01 2.517E+02 1.280E-02 4.079E-02 1.000E-02 3 7.887E+01 2.220E+02 1.330E-02 4.079E-02 1.000E-02 4 1.207E+02 2.028E+02 4.921E-03 8.330E+00 4.000E-03 5 1.327E+02 1.941E+02 4.975E-03 9.361E+00 4.000E-03 6 1.447E+02 1.855E+02 5.174E-03 1.047E+01 4.000E-03 7 1.573E+02 1.769E+02 5.385E-03 1.161E+01 4.000E-03 8 1.704E+02 1.683E+02 5.610E-03 1.275E+01 4.000E-03 9 1.840E+02 1.596E+02 5.855E-03 1.389E+01 4.000E-03 10 1.983E+02 1.510E+02 6.121E-03 1.503E+01 4.000E-03 11 2.133E+02 1.424E+02 6.415E-03 1.617E+01 4.000E-03 12 2.290E+02 1.337E+02 6.738E-03 1.731E+01 4.000E-03 13 2.455E+02 1.251E+02 7.096E-03 1.845E+01 4.000E-03 14 2.629E+02 1.165E+02 7.496E-03 1.959E+01 4.000E-03 15 2.814E+02 1.079E+02 7.942E-03 2.073E+01 4.000E-03 16 3.010E+02 9.923E+01 8.458E-03 2.187E+01 4.000E-03 17 3.219E+02 9.060E+01 9.039E-03 2.301E+01 4.000E-03 18 3.443E+02 8.197E+01 9.714E-03 2.415E+01 4.000E-03 19 3.684E+02 7.334E+01 1.052E-02 2.529E+01 4.000E-03 20 3.947E+02 6.471E+01 1.148E-02 2.643E+01 4.000E-03 21 4.233E+02 5.608E+01 1.267E-02 2.757E+01 4.000E-03 22 4.551E+02 4.746E+01 1.416E-02 2.871E+01 4.000E-03 23 4.908E+02 3.883E+01 1.610E-02 2.986E+01 4.000E-03 24 5.315E+02 3.020E+01 1.879E-02 3.099E+01 4.000E-03 25 5.794E+02 2.157E+01 2.283E-02 3.214E+01 4.000E-03 26 6.382E+02 1.294E+01 2.951E-02 3.328E+01 4.000E-03 27 7.152E+02 4.314E+00 4.352E-02 3.442E+01 4.000E-03 wrf: SUCCESS COMPLETE IDEAL INIT rsl.out.0000 for wrf.exe Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. 
Using registry defaults for variables in fire Ntasks in X 1 , ntasks in Y 1 --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 4 WRF V3.2.1 MODEL ************************************* Parent domain ids,ide,jds,jde 1 80 1 80 ims,ime,jms,jme -4 85 -4 85 ips,ipe,jps,jpe 1 80 1 80 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1 , 136632968 bytes allocated med_initialdata_input: calling input_input MMINLU error on input INITIALIZE THREE Noah LSM RELATED TABLES Skipping over LUTYPE = USGS Skipping over LUTYPE = MODIFIED_IGBP_MODIS_NOAH Skipping over LUTYPE = USGS-RUC -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 1547 Land Use Dataset '' not found in VEGPARM.TBL. 
namelist.input I used &time_control run_days = 1, run_hours = 12, run_minutes = 0, run_seconds = 0, start_year = 2008, 2008, 2008, start_month = 06, 01, 01, start_day = 06, 24, 24, start_hour = 00, 12, 12, start_minute = 00, 00, 00, start_second = 00, 00, 00, end_year = 2008, 2008, 2008, end_month = 06, 01, 01, end_day = 07, 25, 25, end_hour = 00, 12, 12, end_minute = 00, 00, 00, end_second = 00, 00, 00, interval_seconds = 10800 input_from_file = .false.,.true.,.true., history_interval = 60, 60, 60, frames_per_outfile = 1, 1000, 1000, restart = .false., restart_interval = 5000, io_form_history = 2 io_form_restart = 2 io_form_input = 2 io_form_boundary = 2 debug_level = 0 / &domains time_step = 5, time_step_fract_num = 0, time_step_fract_den = 1, max_dom = 1, e_we = 80, 112, 94, e_sn = 80, 97, 91, e_vert = 29, 28, 28, num_metgrid_levels = 30, num_metgrid_soil_levels = 4, dx = 100, 10000, 3333.33, dy = 100, 10000, 3333.33, grid_id = 1, 2, 3, parent_id = 0, 1, 2, i_parent_start = 1, 31, 30, j_parent_start = 1, 17, 30, parent_grid_ratio = 1, 3, 3, parent_time_step_ratio = 1, 3, 3, feedback = 1, smooth_option = 0 ztop = 10000. interp_type = 2 extrap_type = 2 t_extrap_type = 2 use_levels_below_ground = .true. use_surface = .true. lagrange_order = 1 zap_close_levels = 500 lowest_lev_from_sfc = .false. force_sfc_in_vinterp = 1 sfcp_to_sfcp = .false. smooth_cg_topo = .false. use_tavg_for_tsk = .false. vert_refine_fact = 1 / &physics mp_physics = 3, 3, 3, ra_lw_physics = 1, 1, 1, ra_sw_physics = 1, 1, 1, radt = 30, 30, 30, sf_sfclay_physics = 1, 1, 1, sf_surface_physics = 2, 2, 2, bl_pbl_physics = 0, 1, 1, bldt = 0, 0, 0, cu_physics = 1, 1, 0, cudt = 5, 5, 5, isfflx = 1, ifsnow = 0, icloud = 1, surface_input_source = 1, num_soil_layers = 4, sf_urban_physics = 0, 0, 0, maxiens = 1, maxens = 3, maxens2 = 3, maxens3 = 16, ensdim = 144, / &fdda / &dynamics w_damping = 0, diff_opt = 2, km_opt = 2, diff_6th_opt = 0, 0, 0, diff_6th_factor = 0.12, 0.12, 0.12, base_temp = 290. 
damp_opt = 0, zdamp = 5000., 5000., 5000., dampcoef = 0.2, 0.2, 0.2, khdif = 0, 0, 0, kvdif = 0, 0, 0, non_hydrostatic = .true., .true., .true., moist_adv_opt = 1, 1, 1, scalar_adv_opt = 1, 1, 1, pert_coriolis = .true., do_coriolis = .true., do_curvature = .true., do_gradp = .true., / &bdy_control spec_bdy_width = 5, spec_zone = 1, relax_zone = 4, specified = .false., .false.,.false., periodic_x = .true.,.false.,.false., symmetric_xs = .false.,.false.,.false., symmetric_xe = .false.,.false.,.false., open_xs = .false.,.false.,.false., open_xe = .false.,.false.,.false., periodic_y = .true.,.false.,.false., symmetric_ys = .false.,.false.,.false., symmetric_ye = .false.,.false.,.false., open_ys = .false.,.false.,.false., open_ye = .false.,.false.,.false., nested = .false., .true., .true., / &grib2 / &namelist_quilt nio_tasks_per_group = 0, nio_groups = 1, / -- Yan -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110525/397bc334/attachment-0001.html From maria.frediani at gmail.com Thu May 26 13:12:15 2011 From: maria.frediani at gmail.com (Maria Eugenia) Date: Thu, 26 May 2011 15:12:15 -0400 Subject: [Wrf-users] problem running real with GFS fnl* data for 2005 Message-ID: Dear wrf help, I'm trying to run a simulation in ARW V3.2 (also tried V3.2.1 in a different machine) using GFS fnl* grib1 data for a case in 2005 (http://dss.ucar.edu/datasets/ds083.2/matrix.html). I have been running wrf successfully for a long time with other sources of data. I'm using the Vtable.GFS, and I have also tried Vtable.GFS+TROP and Vtable.NCEP2, but I cannot run real.exe. In my rsl.error I get a list of NetCDF error: NetCDF: Variable not found I suspect there is something different with this dataset or the Vtable is not adequate. Could you please help me figure out what is going wrong? Attached you will find the rsl.error, namelist.wps, namelist.input, and the .sh file I use to submit the simulation. 
Thanks a lot Maria E. B. Frediani ------------------------------------------------------------------------------------- Visiting Scholar University of Connecticut School of Engineering 261 Glenbrook Rd Storrs, CT 06269 frediani at engr.uconn.edu -------------- next part -------------- A non-text attachment was scrubbed... Name: rsl.error.0000 Type: application/octet-stream Size: 2299811 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110526/b0f45eab/attachment-0002.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input Type: application/octet-stream Size: 4093 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110526/b0f45eab/attachment-0003.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.wps Type: application/vnd.ms-works Size: 1432 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110526/b0f45eab/attachment-0001.bin -------------- next part -------------- A non-text attachment was scrubbed... Name: run_wps_all-files.sh Type: application/x-sh Size: 3542 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110526/b0f45eab/attachment-0001.sh From ar.ragi at gmail.com Sun May 29 23:52:46 2011 From: ar.ragi at gmail.com (A.R Ragi) Date: Mon, 30 May 2011 11:22:46 +0530 Subject: [Wrf-users] WRF V3.3 error Message-ID: Dear WRF Users, I was running the latest WRF version (version 3.3) for 2004 Aug 01 (24 hr). The WPS part is running successfully, but when I run real.exe I get the following error. I'm attaching the namelist.wps and namelist.input. Can anyone help me sort this out? I had done the same with WRF 3.1.1 and it runs successfully. Is this a problem with Version 3.3? 
taskid: 0 hostname: ajaymeru.cas.iitd.ernet.in ------ ERROR while reading namelist time_control ------ Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc ------ ERROR while reading namelist physics ------ Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 9094 ERRORS while reading one or more namelists from namelist.input. ------------------------------------------- application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0[unset]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 Thanks in advance -- ************************************************************************* Regards, A.R.Ragi Research Scholar CAS, IIT Delhi ************************************************************************* "I want to know how God created this world. I am not interested in this or that phenomenon. I want to know, his thought, the rest are details." ---Albert Einstein ************************************************************************* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110530/98f1917e/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist3.3.input Type: application/octet-stream Size: 6005 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110530/98f1917e/attachment.obj -------------- next part -------------- A non-text attachment was scrubbed... 
Name: namelist.wps Type: application/octet-stream Size: 720 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110530/98f1917e/attachment-0001.obj From davidstephenbryan at yahoo.com Sun May 29 08:09:27 2011 From: davidstephenbryan at yahoo.com (David Bryan) Date: Sun, 29 May 2011 07:09:27 -0700 (PDT) Subject: [Wrf-users] real.exe fail from NAM input Message-ID: <200769.9734.qm@web65909.mail.ac4.yahoo.com> I'm running WRF 3.3 with NAM input data. The NAM data is NOMADS archived for 2010 and recently downloaded. The problem is that real.exe does not produce wrfbdy_d01 or wrfinput_d01. Of course, real.exe doesn't give much in terms of useful output to diagnose the problem. I know that the domain and settings work fine with current GFS input. I'd appreciate any insight into what the problem might be. My namelist.input follows. Thanks! &time_control run_days = 0, run_hours = 3, run_minutes = 0, run_seconds = 0, start_year = 2010,2010,2010,2010,2010, start_month = 7,7,7,7,7, start_day = 2,2,2,2,2, start_hour = 12,12,12,12,12, start_minute = 00,00,00,00,00, start_second = 00,00,00,00,00, end_year = 2010,2010,2010,2010,2010, end_month = 7,7,7,7,7, end_day = 2,2,2,2,2, end_hour = 15,15,15,15,15, end_minute = 00,00,00,00,00, end_second = 00,00,00,00,00, interval_seconds = 10800 input_from_file = .true.,.true.,.true.,.true.,.true., history_interval = 60,60,60,60,60, frames_per_outfile = 1,1,1,1,1, restart = .false., restart_interval = 100000, io_form_history = 2, io_form_restart = 2, io_form_input = 2, io_form_boundary = 2, io_form_auxinput2 = 2, debug_level = 0, / &domains eta_levels = 1.000, 0.963, 0.953351547, 0.949933093, 0.915704803, 0.881476513, 0.878268356, 0.850754286, 0.832777143, 0.8148, 0.7778, 0.7407, 0.7037, 0.6667, 0.6296, 0.5926, 0.5556, 0.5185, 0.4815, 0.4444, 0.4074, 0.3704, 0.3333, 0.2963, 0.2593, 0.2222, 0.1852, 0.1481, 0.1111, 0.0741, 0.037, 0.000, time_step = 18, time_step_fract_num = 0, time_step_fract_den = 1, 
max_dom = 5, s_we = 1,1,1,1,1, e_we = 213, 49, 49, 49,49, s_sn = 1,1,1,1,1, e_sn = 129, 49, 49, 49,49, s_vert = 1,1,1,1,1, e_vert = 32,32,32,32,32, num_metgrid_levels = 27, dx = 3000,1000,1000,1000,1000, dy = 3000,1000,1000,1000,1000, grid_id = 1,2,3,4,5, parent_id = 1, 1, 1,1,1, i_parent_start = 1, 29, 31, 78,160, j_parent_start = 1, 70, 4, 103,27, parent_grid_ratio = 1, 3, 3,3,3, parent_time_step_ratio = 1, 3, 3,3,3, feedback = 1, smooth_option = 2, use_adaptive_time_step= .false., target_cfl = 1.2, max_step_increase_pct = 100, starting_time_step = -1, max_time_step = 110, min_time_step = 5, / &physics mp_physics = 3, 3, 3, 3, 3, ra_lw_physics = 1, 1, 1, 1, 1, ra_sw_physics = 1, 1, 1, 1, 1, radt = 2, 1, 1, 1, 1, sf_sfclay_physics = 2, 2, 2, 2, 2, sf_surface_physics = 2, 2, 2, 2, 2, bl_pbl_physics = 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, bldt = 0, 0, 0, 0, 0, cu_physics = 0, 0, 0, 0, 0, cudt = 5, 5, 5, 5, 5, isfflx = 1, ifsnow = 1, icloud = 1, surface_input_source = 1, num_soil_layers = 4, maxiens = 1, maxens = 3, maxens2 = 3, maxens3 = 16, ensdim = 144, / &fdda / &dynamics rk_ord = 3, w_damping = 1, diff_opt = 2, km_opt = 2, damp_opt = 0, moist_adv_opt = .true.,.true.,.true.,.true.,.true., scalar_adv_opt = .true.,.true.,.true.,.true.,.true., tke_adv_opt = .true.,.true.,.true.,.true.,.true., zdamp = 5000.,5000.,5000.,5000.,5000., dampcoef = 0.2,0.2,0.2,0.2,0.2, khdif = 0,0, kvdif = 0,0, smdiv = 0.1,0.1,0.1,0.1,0.1, emdiv = 0.01,0.01,0.01,0.01,0.01, epssm = 0.1,0.1,0.1,0.1,0.1, non_hydrostatic = .true.,.true.,.true.,.true.,.true., time_step_sound = 4,4,4,4,4, h_mom_adv_order = 5,5,5,5,5, v_mom_adv_order = 3,3,3,3,3, h_sca_adv_order = 5,5,5,5,5, v_sca_adv_order = 3,3,3,3,3, diff_6th_opt = 0,0, diff_6th_factor = 0.06,0.06,0.06,0.06,0.06, / &bdy_control spec_bdy_width = 5, spec_zone = 1, relax_zone = 4, specified = .true.,.false.,.false.,.false.,.false., periodic_x = .false.,.false.,.false.,.false.,.false., symmetric_xs = .false.,.false.,.false.,.false.,.false., 
symmetric_xe = .false.,.false.,.false.,.false.,.false., open_xs = .false.,.false.,.false.,.false.,.false., open_xe = .false.,.false.,.false.,.false.,.false., periodic_y = .false.,.false.,.false.,.false.,.false., symmetric_ys = .false.,.false.,.false.,.false.,.false., symmetric_ye = .false.,.false.,.false.,.false.,.false., open_ys = .false.,.false.,.false.,.false.,.false., open_ye = .false.,.false.,.false.,.false.,.false., nested = .false.,.true.,.true.,.true.,.true.,.true., / &grib2 / &namelist_quilt nio_tasks_per_group = 0, nio_groups = 1, / background_maps= 10,10,10,10,10 From ebeigi3 at tigers.lsu.edu Mon May 30 18:34:21 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Mon, 30 May 2011 20:34:21 -0400 Subject: [Wrf-users] read_wrf_nc.f error Message-ID: Dear Sir/Madam, Thanks for your previous help. I am trying to use read_wrf_nc.f for extracting my desired variables from WRF output files. I can see data inside the met_em*.nc files, but I cannot see the output of wrf (wrfout_d01...), and this is my error message /read_wrf_nc.f: line 1: Special: command not found information: cannot open `information' (No such file or directory) on: cannot open `on' (No such file or directory) the: cannot open `the' (No such file or directory) screen: cannot open `screen' (No such file or directory) ./read_wrf_nc.f: line 3: Can: command not found ./read_wrf_nc.f: line 4: syntax error near unexpected token `(' ./read_wrf_nc.f: line 4: `! Can read double precision file (like WRF-Var)' How can I extract some desired variables from wrfout files? I don't care about maps, I just need numbers. I really appreciate your help On Fri, May 27, 2011 at 10:09 PM, wrfhelp wrote: > Yes, they are, unless you change them in the namelist.wps file. It should > not change when you change stand_lon. > But you might want to change them if latitude 30 or 60 are not in your > domain. > > wrfhelp > > > On May 27, 2011, at 10:00 AM, Ehsan Beigi wrote: > > Dear Sir/Madam, >> Thanks for your previous help. 
Are truelat1 = 30.0 and truelat2 = >> 60.0 in the Lambert projection always the same for every domain? How about >> stand_lon = -98.0? If they are different for each domain, how can I >> calculate them for my domain? >> >> Best Regards >> >> Ehsan Beigi >> > > wrfhelp > http://www.mmm.ucar.edu/wrf/users/supports/wrfhelp.html > > > > -- Ehsan Beigi PhD Student Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110530/3dcdbdc1/attachment-0001.html From hamed319 at yahoo.com Fri May 27 06:03:27 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Fri, 27 May 2011 05:03:27 -0700 (PDT) Subject: [Wrf-users] configre.wps request Message-ID: <864411.9813.qm@web161213.mail.bf1.yahoo.com> Dear All, I need an MPI configuration (configure.wps file) to compare with. My configuration is: $uname -a: Linux modeling-MS-7666 2.6.35-22-generic #33-Ubuntu SMP Sun Sep 19 20:34:50 UTC 2010 i686 GNU/Linux $ifort --version: ifort (IFORT) 12.0.4 20110427 Copyright (C) 1985-2011 Intel Corporation. All rights reserved. $icc --version: icc (ICC) 12.0.4 20110427 Copyright (C) 1985-2011 Intel Corporation. All rights reserved. Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110527/23947a27/attachment.html From hamed319 at yahoo.com Fri May 27 06:21:13 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Fri, 27 May 2011 05:21:13 -0700 (PDT) Subject: [Wrf-users] catastrophic error: could not open source file "asm/socket.h" Message-ID: <915227.6134.qm@web161203.mail.bf1.yahoo.com> Dear Sir/Ma'am, In compiling wrf at screen I got this error: /usr/include/bits/socket.h(381): catastrophic error: could not open source file "asm/socket.h" Any idea? Thanks in advance, ? Hamed Sharifi, M.Sc Student, AUT Tehran/Iran ? ?? |hamed319 at yahoo.com? | +98-9364024805 ? ? ? ? ? ? ? ? ? ? ? ?? ? | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110527/fc907699/attachment.html From maria.frediani at gmail.com Fri May 27 11:58:19 2011 From: maria.frediani at gmail.com (Maria Eugenia) Date: Fri, 27 May 2011 13:58:19 -0400 Subject: [Wrf-users] Re : problem running real with GFS fnl* data for 2005 In-Reply-To: <420548.98227.qm@web29018.mail.ird.yahoo.com> References: <420548.98227.qm@web29018.mail.ird.yahoo.com> Message-ID: Hi everyone, I guess the problem is in the computer, it's overloaded. It has been running real.exe for 24h and it's not done yet. Meanwhile I set up the same simulation in a different machine, real.exe ran in less than 5 min and wrf.exe is working fine. Regarding the error messages, I believe real.exe is complaining about variables I don't have in the dataset. I mean, none of these variables appear in my Vtable (default Vtable.GFS). Therefore I believe they are either not in the dataset, or they are not necessary for the simulation. I'm describing the errors I found in my rsl.out file in case my assumption is wrong, and somebody knows how to fix this. I have a list of (1) "variable not found" and a list of (2) "attribute not found", as shown below. 
d01 2005-10-07_18:00:00 NetCDF error: NetCDF: Variable not found d01 2005-10-07_18:00:00 NetCDF error in wrf_io.F90, line 2712 Varname TAVGSFC d01 2005-10-07_18:00:00 NetCDF error: NetCDF: Attribute not found d01 2005-10-07_18:00:00 NetCDF error in ext_ncd_get_dom_ti.code INTEGER, line 83 Element FLAG_TSK For (1) the variable names are: TAVGSFC QV SPECHUMD ICEFRAC MU0 T2 SOIL_LEVELS SW SOILT SOILM SM000007 SM007028 SM028100 SM100255 ST000007 ST007028 ST028100 ST100255 SM010200 SOILM000 SOILM005 SOILM020 SOILM040 SOILM160 SOILM300 SW000010 SW010040 SW040100 SW100200 SW010200 SOILW000 SOILW005 SOILW020 SOILW040 SOILW160 SOILW300 ST010200 SOILT000 SOILT005 SOILT020 SOILT040 SOILT160 SOILT300 TOPOSTDV TOPOSLPX TOPOSLPY SHDMAX SHDMIN SOILCAT VEGCAT SNOWH CANWAT SST LAI MF_VX_INV HGT TSK XLAT XLONG ALBBCK TMN And for (2) the element names are: P_TOP GMT JULYR JULDAY FLAG_TSK FLAG_TAVGSFC FLAG_QV FLAG_QC FLAG_QR FLAG_QI FLAG_QS FLAG_QG FLAG_QNI FLAG_SH FLAG_P_INTERP FLAG_SST FLAG_SNOWH FLAG_TOPOSOIL FLAG_ICEDEPTH FLAG_ICEFRAC FLAG_SOIL_LEVELS My netcdf version is 3.6.3, and I use gcc/gfortran 4.2.4. Please let me know if these errors will compromise the results of my simulation. Thanks ever Maria E. B. Frediani ------------------------------------------------------------------------------------- Visiting Scholar University of Connecticut School of Engineering 261 Glenbrook Rd Storrs, CT 06269 frediani at engr.uconn.edu On Fri, May 27, 2011 at 7:56 AM, moudi pascal wrote: > hello, > what version of netcdf do you use? > > > Pascal MOUDI IGRI > > Ph-D Student > Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) > Department of Physics > Faculty of Science > University of Yaounde I, Cameroon > National Advanced Training School for Technical Education, > Electricity Engineering, Douala > > Tel:+237 75 32 58 52 > > ________________________________ > De : Maria Eugenia > À : wrf-users at ucar.edu > Envoyé 
le : Jeu 26 mai 2011, 21h 12min 15s > Objet?: [Wrf-users] problem running real with GFS fnl* data for 2005 > > Dear wrf help, > > I'm trying to run a simulation in ARW V3.2 (also tried V3.2.1 in a > different machine) using GFS fnl* grib1 data for a case in 2005 > (http://dss.ucar.edu/datasets/ds083.2/matrix.html). I have been > running wrf successfully for a long time with other source of data. > I'm using the Vtable.GFS, and I have also tried Vtable.GFS+TROP and > Vtable.NCEP2, but I cannot run real.exe. In my rsl.error I get a list > of NetCDF error: NetCDF: Variable not found > > I suspect there is something different with this dataset or the Vtable > is not adequate. Could you pls help me figure out what is going wrong? > > Attached you find the rsl.error, namelist.wps and namelist.input and > .sh file I use to submit the simulation. > > Thanks a lot > Maria E. B. Frediani > ------------------------------------------------------------------------------------- > Visiting Scholar > University of Connecticut > School of Engineering > 261 Glenbrook Rd > Storrs, CT 06269 > frediani at engr.uconn.edu > From Matthew.Foster at noaa.gov Fri May 27 09:22:22 2011 From: Matthew.Foster at noaa.gov (Matt Foster) Date: Fri, 27 May 2011 10:22:22 -0500 Subject: [Wrf-users] Changing land use In-Reply-To: <358161.27758.qm@web161215.mail.bf1.yahoo.com> References: <358161.27758.qm@web161215.mail.bf1.yahoo.com> Message-ID: <4DDFC1AE.4030401@noaa.gov> Hamed, We have done experiments here in the past, where we modified the greeness fraction based on data from polar-orbiting satellites. We simply modified the geo_em NetCDF file, and it worked well. You should be able to take the same approach with land use. Matt On 5/22/2011 6:06 AM, Hamed Sharifi wrote: > Dear All, > I want to change some values of my landuse in order to see how it > affects the whole domain. For example, > change part of the landuse to Forest or Lack. > Does anyone know about this issue? 
> Thanks in advance, > Hamed Sharifi, > M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | > +98-9364024805 | hamed_sharifi at aut.ac.ir | > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users -- Do not go where the path may lead; go instead where there is no path and leave a trail. -- Ralph Waldo Emerson -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110527/96c65557/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: matthew_foster.vcf Type: text/x-vcard Size: 229 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110527/96c65557/attachment.vcf From jaareval at gmail.com Tue May 31 09:47:49 2011 From: jaareval at gmail.com (Jorge Alejandro Arevalo Borquez) Date: Tue, 31 May 2011 11:47:49 -0400 Subject: [Wrf-users] Changing land use In-Reply-To: <4DDFC1AE.4030401@noaa.gov> References: <358161.27758.qm@web161215.mail.bf1.yahoo.com> <4DDFC1AE.4030401@noaa.gov> Message-ID: Hi, another option is to modify the binary files of some existing land use (MODIS or USGS); MATLAB reads and writes those files with almost no problem. Regards, Jorge Arévalo Bórquez On Fri, May 27, 2011 at 11:22 AM, Matt Foster wrote: > Hamed, > > We have done experiments here in the past, where we modified the greenness > fraction based on data from polar-orbiting satellites. We simply modified > the geo_em NetCDF file, and it worked well. You should be able to take the > same approach with land use. > > Matt > > > > On 5/22/2011 6:06 AM, Hamed Sharifi wrote: > > Dear All, > I want to change some values of my land use in order to see how it affects > the whole domain. For example, > change part of the land use to Forest or Lake. > Does anyone know about this issue?
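[Editor's note] The geo_em editing that Matt and Jorge describe above reduces to: read the LU_INDEX grid, replace the category value over the cells of interest, and write it back. The netCDF I/O itself is left to whatever tool you prefer (netCDF4-python, MATLAB, R); this hedged sketch shows only the in-memory replacement step, with category numbers taken from the USGS 24-class table (16 = water, 15 = mixed forest) — verify them against the LANDUSE.TBL you actually run with:

```python
# Hedged sketch: the in-memory part of a geo_em land-use edit.
# Reading/writing LU_INDEX from geo_em.d01.nc is left to your netCDF
# tool of choice; the grid and category values below are invented.

def edit_landuse(lu_index, box, old_cat, new_cat):
    """Replace old_cat with new_cat inside box = (i0, i1, j0, j1)."""
    i0, i1, j0, j1 = box
    changed = 0
    for j in range(j0, j1):
        for i in range(i0, i1):
            if lu_index[j][i] == old_cat:
                lu_index[j][i] = new_cat
                changed += 1
    return changed

# Toy 4x4 domain: mostly cropland (2) with a small lake (16).
grid = [[2, 2, 2, 2],
        [2, 16, 16, 2],
        [2, 16, 2, 2],
        [2, 2, 2, 2]]
n = edit_landuse(grid, (0, 4, 0, 4), old_cat=16, new_cat=15)
print(n)  # 3 cells converted from water to forest
```

If you edit LU_INDEX this way, keep the related fields consistent: the LANDUSEF fraction array should agree with the new dominant category, and fields such as LANDMASK (and possibly the soil-category fields) may also need updating when converting cells to or from water.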
> Thanks in advance, > > Hamed Sharifi, > M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | > +98-9364024805 | hamed_sharifi at aut.ac.ir | > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > -- > Do not go where the path may lead; go instead where there is no path and leave a trail. > -- Ralph Waldo Emerson > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110531/a9074c86/attachment.html From TitovM at ap.aurecongroup.com Tue May 31 15:05:36 2011 From: TitovM at ap.aurecongroup.com (Mikhail Titov) Date: Wed, 1 Jun 2011 07:05:36 +1000 Subject: [Wrf-users] Changing land use In-Reply-To: References: <358161.27758.qm@web161215.mail.bf1.yahoo.com> <4DDFC1AE.4030401@noaa.gov> Message-ID: Hi, I have a program written in C that transforms any input terrestrial ASCII file (topo, land use and so on) into WPS binary format: 'rd_wr_binary.exe'. It is written in C because Fortran's unformatted I/O always leaves record markers between lines, which is not appropriate here. The program works in both directions, so it can also be used to check binary terrestrial input files by converting them back to ASCII. The program is very flexible and can easily be changed and re-compiled. Of course, after preparing our own terrestrial files I create a new subdirectory (with tiles) and an 'index' file in 'geog', and edit "GEOGRID.TBL" in the 'WPS/geogrid/' sub-directory to create several pointers to the new terrestrial files and to choose an appropriate 'interp_option'. We create our own terrain and land-use files (using special statistical methods and GIS) all the time for fine-resolution WRF runs (1000-500 m) to study wind resources for different on-shore and off-shore sites. The USGS 2-minute to 30-second terrestrial data are too coarse and often of poor quality. Regards, ________________________________________________________________________________ Dr. Mikhail Titov | Senior Prof. Officer, Energy | Aurecon Ph: +64 3 366 08 21 ext.9231 DDI +64 367 32 31 | Fax: +64 3 379 6955 | Mob: +64 21 106 5563 Email: TitovM at ap.aurecongroup.com Unit 1, 150 Cavendish Road | Christchurch 8051 | New Zealand PO Box 1061 http://www.aurecongroup.com http://www.aurecongroup.com/apac/groupentity/ _________________________________________________________________________________ Please consider your environment before printing this e-mail. From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jorge Alejandro Arevalo Borquez Sent: Wednesday, 1 June 2011 3:48 a.m. To: Matt Foster Cc: wrf-users at ucar.edu Subject: Re: [Wrf-users] Changing land use Hi, another option is to modify the binary files of some existing land use (MODIS or USGS); MATLAB reads and writes those files with almost no problem. Regards, Jorge Arévalo Bórquez On Fri, May 27, 2011 at 11:22 AM, Matt Foster > wrote: Hamed, We have done experiments here in the past, where we modified the greenness fraction based on data from polar-orbiting satellites. We simply modified the geo_em NetCDF file, and it worked well. You should be able to take the same approach with land use. Matt On 5/22/2011 6:06 AM, Hamed Sharifi wrote: Dear All, I want to change some values of my land use in order to see how it affects the whole domain. For example, change part of the land use to Forest or Lake. Does anyone know about this issue? Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran |hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users -- Do not go where the path may lead; go instead where there is no path and leave a trail.
-- Ralph Waldo Emerson _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users Disclaimer - http://www.aurecongroup.com/apac/disclaimer/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110601/adaae452/attachment.html From ebeigi3 at tigers.lsu.edu Tue May 31 20:59:39 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Tue, 31 May 2011 22:59:39 -0400 Subject: [Wrf-users] read_wrf_nc.f error In-Reply-To: <9BDE2A7F9712AF45A0C08451B3CD8E5C25771E73@mag9006> References: <9BDE2A7F9712AF45A0C08451B3CD8E5C25770BF7@mag9006> <9BDE2A7F9712AF45A0C08451B3CD8E5C25771E73@mag9006> Message-ID: Thanks for your previous help. I tried to run WRF for different days by changing namelist.input, and the WRF run was OK for all of the days except the last day; every time I encountered this error. I really appreciate your help in advance. Timing for main: time 2011-04-10_00:00:00 on domain 3: 0.79400 elapsed seconds. Timing for Writing wrfout_d03_2011-04-10_00:00:00 for domain 3: 0.06420 elapsed seconds. Timing for main: time 2011-04-10_00:00:00 on domain 2: 3.06700 elapsed seconds. Timing for Writing wrfout_d02_2011-04-10_00:00:00 for domain 2: 0.03080 elapsed seconds. Timing for main: time 2011-04-10_00:00:00 on domain 1: 9.21270 elapsed seconds. Timing for Writing wrfout_d01_2011-04-10_00:00:00 for domain 1: 0.02410 elapsed seconds. 2 input_wrf: wrf_get_next_time current_date: 2011-04-10_00:00:00 Status = -4 -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 705 ... May have run out of valid boundary conditions in file wrfbdy_d01 Best Regards Ehsan Beigi On Tue, May 31, 2011 at 6:41 PM, Feng Liu wrote: > It sounds good. If you need to extract one variable (VAR) over the entire > domain, you can use read_wrf_nc -w VAR; if you need a time series of VAR at > a specific location you may use > > read_wrf_nc -ts xy X Y VAR -lev 1 wrfout_d*.nc > > You can get basic usage information for the program with > read_wrf_nc -help. > > > > Use shell programming to process multiple-day runs. > > > > Hope it is helpful > > Feng > > > > > > *From:* Ehsan Beigi [mailto:ebeigi3 at tigers.lsu.edu] > *Sent:* Tuesday, May 31, 2011 3:18 PM > *To:* Feng Liu > *Subject:* Re: [Wrf-users] read_wrf_nc.f error > > > > Thanks very much for your help. It worked! > Do you recommend using other software for reading the wrfout files, for example > the MATLAB netCDF toolbox? > > I ran WRF for 30 days; now I don't know how I can extract the numbers for > some variables, for example precipitation and temperature, for all grid points for > the 30 days among all of the output. As you know there are many variables in the output, and I > need to make time series of precipitation and temperature. > > Best Regards > > > > > Ehsan Beigi > > On Tue, May 31, 2011 at 12:07 PM, Feng Liu wrote: > > Hi, > > I think you failed to compile the program successfully due to incomplete > flags. I wonder how you compiled this program. If you are using Portland > Group PGF90, for example, you may compile the program by: > > pgf90 read_wrf_nc.f -L/path_of_your_netCDF_lib -lnetcdf -lm > -I/path_of_your_netCDF_include -Mfree -o read_wrf_nc > > > > Hope this is helpful. > > Thanks. > > > > *From:* wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] *On > Behalf Of *Ehsan Beigi > *Sent:* Monday, May 30, 2011 5:34 PM > *To:* wrfhelp > *Cc:* wrf-users at ucar.edu > *Subject:* [Wrf-users] read_wrf_nc.f error > > > > Dear Sir/Madam, > > Thanks for your previous help. I am trying to use read_wrf_nc.f for > extracting my desired variables from the WRF output files.
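[Editor's note] On the time-series question above: however you read the wrfout files (read_wrf_nc, MATLAB, netCDF4-python), note that WRF's precipitation fields (RAINC, RAINNC) are accumulated from the start of the run, so a per-interval series requires differencing consecutive output times; 2-m temperature (T2) is instantaneous and needs no such step. A hedged, library-free sketch of the differencing (the values are invented; reading them from the wrfout files is omitted):

```python
# Hedged sketch: turning WRF's accumulated precipitation into a
# per-interval time series at one grid point. RAINC + RAINNC are
# cumulative since the simulation start, so successive differences
# give the rain that fell in each output interval. The numbers
# below are invented; pull the real ones from wrfout with whichever
# netCDF tool you use.

def interval_precip(rainc, rainnc):
    total = [c + nc for c, nc in zip(rainc, rainnc)]
    # Difference consecutive accumulations; the first interval is
    # total[0] minus the initial accumulation (zero for a fresh run).
    return [total[0]] + [b - a for a, b in zip(total, total[1:])]

rainc = [0.0, 1.5, 1.5, 2.0]    # convective rain, mm, accumulated
rainnc = [0.0, 0.5, 3.5, 5.0]   # grid-scale rain, mm, accumulated
print(interval_precip(rainc, rainnc))  # [0.0, 2.0, 3.0, 2.0]
```

The same differencing applies whether the outputs are hourly or daily; only the meaning of "per interval" changes.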
I can see data inside the met_em*.nc files, but I cannot read the WRF output (wrfout_d01...), and > this is my error message > > /read_wrf_nc.f: line 1: Special: command not found > information: cannot open `information' (No such file or directory) > on: cannot open `on' (No such file or directory) > the: cannot open `the' (No such file or directory) > screen: cannot open `screen' (No such file or directory) > ./read_wrf_nc.f: line 3: Can: command not found > ./read_wrf_nc.f: line 4: syntax error near unexpected token `(' > ./read_wrf_nc.f: line 4: `! Can read double precision file (like WRF-Var)' > > How can I extract some desired variables from the wrfout files? I don't care > about maps, I just need numbers. I really appreciate your help > > On Fri, May 27, 2011 at 10:09 PM, wrfhelp wrote: > > Yes, they are, unless you change them in the namelist.wps file. They should > not change when you change stand_lon. > But you might want to change them if latitude 30 or 60 is not in your > domain. > > wrfhelp > > > > On May 27, 2011, at 10:00 AM, Ehsan Beigi wrote: > > Dear Sir/Madam, > Thanks for your previous help. Are truelat1 = 30.0 and truelat2 = > 60.0 in the Lambert projection always the same for every domain? How about > stand_lon = -98.0? If they are different for each domain, how can I > calculate them for my domain? > > Best Regards > > Ehsan Beigi > > > > wrfhelp > http://www.mmm.ucar.edu/wrf/users/supports/wrfhelp.html > > > > > -- > *Ehsan Beigi* > *PhD Student* > *Department of Civil and Environmental Engineering > **2408 Patrick F. Taylor Hall > Louisiana State University > Baton Rouge, LA, 70803* > > > > > -- > *Ehsan Beigi* > *PhD Student* > *Department of Civil and Environmental Engineering > **2408 Patrick F. Taylor Hall > Louisiana State University > Baton Rouge, LA, 70803* > > -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F.
Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110531/ff401ce7/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input Type: application/octet-stream Size: 4681 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110531/ff401ce7/attachment-0001.obj From FLiu at azmag.gov Tue May 31 10:07:24 2011 From: FLiu at azmag.gov (Feng Liu) Date: Tue, 31 May 2011 16:07:24 +0000 Subject: [Wrf-users] read_wrf_nc.f error In-Reply-To: References: Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C25770BF7@mag9006> Hi, I think you failed to compile the program successfully due to incomplete flags. I wonder how you compiled this program. If you are using Portland Group PGF90, for example, you may compile the program by: pgf90 read_wrf_nc.f -L/path_of_your_netCDF_lib -lnetcdf -lm -I/path_of_your_netCDF_include -Mfree -o read_wrf_nc Hope this is helpful. Thanks. From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Ehsan Beigi Sent: Monday, May 30, 2011 5:34 PM To: wrfhelp Cc: wrf-users at ucar.edu Subject: [Wrf-users] read_wrf_nc.f error Dear Sir/Madam, Thanks for your previous help. I am trying to use read_wrf_nc.f for extracting my desired variables from the WRF output files. I can see data inside the met_em*.nc files, but I cannot read the WRF output (wrfout_d01...), and this is my error message /read_wrf_nc.f: line 1: Special: command not found information: cannot open `information' (No such file or directory) on: cannot open `on' (No such file or directory) the: cannot open `the' (No such file or directory) screen: cannot open `screen' (No such file or directory) ./read_wrf_nc.f: line 3: Can: command not found ./read_wrf_nc.f: line 4: syntax error near unexpected token `(' ./read_wrf_nc.f: line 4: `! Can read double precision file (like WRF-Var)' How can I extract some desired variables from the wrfout files? I don't care about maps, I just need numbers. I really appreciate your help On Fri, May 27, 2011 at 10:09 PM, wrfhelp > wrote: Yes, they are, unless you change them in the namelist.wps file. They should not change when you change stand_lon. But you might want to change them if latitude 30 or 60 is not in your domain. wrfhelp On May 27, 2011, at 10:00 AM, Ehsan Beigi wrote: Dear Sir/Madam, Thanks for your previous help. Are truelat1 = 30.0 and truelat2 = 60.0 in the Lambert projection always the same for every domain? How about stand_lon = -98.0? If they are different for each domain, how can I calculate them for my domain? Best Regards Ehsan Beigi wrfhelp http://www.mmm.ucar.edu/wrf/users/supports/wrfhelp.html -- Ehsan Beigi PhD Student Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110531/5efb4e4e/attachment.html From FLiu at azmag.gov Tue May 31 10:37:29 2011 From: FLiu at azmag.gov (Feng Liu) Date: Tue, 31 May 2011 16:37:29 +0000 Subject: [Wrf-users] WRF V3.3 error In-Reply-To: References: Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C25771C49@mag9006> Hi, The problem may be related to digital filter initialization (DFI), which can be used for multiple domains in V3.3. It must be set in a separate namelist record; for how to add an extra namelist record for DFI to your namelist.input, refer to the examples.namelist file attached to this version. Thanks. Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of A.R Ragi Sent: Sunday, May 29, 2011 10:53 PM To: wrf-users at ucar.edu Subject: [Wrf-users] WRF V3.3 error Dear WRF Users, I was running the latest WRF version (3.3) for 1 August 2004 (24 h).
The WPS part runs successfully, but when I run real.exe I get the following error. I'm attaching the namelist.wps and namelist.input. Can anyone help me sort this out? I had done the same with WRF 3.1.1 and it ran successfully. Is this a problem with Version 3.3? taskid: 0 hostname: ajaymeru.cas.iitd.ernet.in ------ ERROR while reading namelist time_control ------ Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc ------ ERROR while reading namelist physics ------ Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 9094 ERRORS while reading one or more namelists from namelist.input. ------------------------------------------- application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0[unset]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 Thanks in advance -- *********************************************************************** Regards, A.R.Ragi Research Scholar CAS, IIT Delhi *********************************************************************** "I want to know how God created this world. I am not interested in this or that phenomenon. I want to know his thought; the rest are details." ---Albert Einstein *********************************************************************** -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110531/44cde336/attachment.html From hamed319 at yahoo.com Tue May 31 23:09:36 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Tue, 31 May 2011 22:09:36 -0700 (PDT) Subject: [Wrf-users] Best method for PBL Message-ID: <71589.44026.qm@web161209.mail.bf1.yahoo.com> Dear All, Which PBL scheme works best for a megacity? Thanks, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110531/1230b598/attachment-0001.html From hamed319 at yahoo.com Wed Jun 1 04:46:45 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Wed, 1 Jun 2011 03:46:45 -0700 (PDT) Subject: [Wrf-users] metgrid.exe---forrtl: severe (173) Message-ID: <18438.66830.qm@web161215.mail.bf1.yahoo.com> Dear All, When I run metgrid.exe, I get this error: forrtl: severe (173): A pointer passed to DEALLOCATE points to an array that cannot be deallocated Any suggestion would be appreciated. Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110601/b89d5c52/attachment-0001.html From mmkamal at uwaterloo.ca Tue May 31 15:50:08 2011 From: mmkamal at uwaterloo.ca (mmkamal at uwaterloo.ca) Date: Tue, 31 May 2011 17:50:08 -0400 Subject: [Wrf-users] Changing land use In-Reply-To: References: <358161.27758.qm@web161215.mail.bf1.yahoo.com> <4DDFC1AE.4030401@noaa.gov> Message-ID: <20110531175008.75424udz5aeaasu8@www.nexusmail.uwaterloo.ca> Hi Hamed, I am currently working on some land-use sensitivity experiments, and I used an open-source statistical package called "R" to read the land-use binary data (inside the geog directory of the dataset). Using R I am able to read the dataset, modify some categories in my area of interest, and save the modification. Finally, I fed the model with the modified dataset and found that it works properly. You can get more detail about "R" at the following link, or simply do a Google search for "read/write binary data using R". Thanks Kamal Quoting Mikhail Titov : > Hi, > > I have a program written in C that transforms any input terrestrial > ASCII file (topo, land use and so on) into WPS binary format: > 'rd_wr_binary.exe'. > It is written in C because Fortran's unformatted I/O always leaves record markers between > lines, which is not appropriate here. The program works in both directions, so it > can also be used to check > binary terrestrial input files by converting them back to ASCII. The > program is very flexible and can easily be changed and re-compiled. > > Of course, after preparing our own terrestrial files I create a new > subdirectory (with tiles) and an 'index' file in 'geog', and edit > "GEOGRID.TBL" in > the 'WPS/geogrid/' sub-directory to create several pointers to the new > terrestrial files and to choose an appropriate 'interp_option'. > > We create our own terrain and land-use files (using special > statistical methods and GIS) all the time for fine-resolution WRF runs > (1000-500 m) to study wind resources for different on-shore and > off-shore sites. The USGS 2-minute to 30-second terrestrial data are too coarse and > often of poor quality. > > Regards, > ________________________________________________________________________________ > > Dr. Mikhail Titov | Senior Prof. Officer, Energy | Aurecon > Ph: +64 3 366 08 21 ext.9231 DDI +64 367 32 31 | Fax: +64 3 379 6955 > | Mob: +64 21 106 5563 > Email: TitovM at ap.aurecongroup.com > Unit 1, 150 Cavendish Road | Christchurch 8051 | New Zealand > PO Box 1061 > http://www.aurecongroup.com > http://www.aurecongroup.com/apac/groupentity/ > _________________________________________________________________________________ > > Please consider your environment before printing this e-mail. > > From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] > On Behalf Of Jorge Alejandro Arevalo Borquez > Sent: Wednesday, 1 June 2011 3:48 a.m. > To: Matt Foster > Cc: wrf-users at ucar.edu > Subject: Re: [Wrf-users] Changing land use > > Hi, > another option is to modify the binary files of some existing land use > (MODIS or USGS); MATLAB reads and writes those files with almost no > problem. > > Regards, > Jorge Arévalo Bórquez > > On Fri, May 27, 2011 at 11:22 AM, Matt Foster > > wrote: > Hamed, > > We have done experiments here in the past, where we modified the > greenness fraction based on data from polar-orbiting satellites. We > simply modified the geo_em NetCDF file, and it worked well. You > should be able to take the same approach with land use. > > Matt > > > > On 5/22/2011 6:06 AM, Hamed Sharifi wrote: > Dear All, > I want to change some values of my land use in order to see how it > affects the whole domain. For example, > change part of the land use to Forest or Lake. > Does anyone know about this issue?
> Thanks in advance, > > Hamed Sharifi, > M.Sc Student, AUT Tehran/Iran > |hamed319 at yahoo.com | > +98-9364024805 | > hamed_sharifi at aut.ac.ir | > > > > _______________________________________________ > > Wrf-users mailing list > > Wrf-users at ucar.edu > > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > > -- > > Do not go where the path may lead; go instead where there is no path > and leave a trail. > > -- Ralph Waldo Emerson > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > Disclaimer - http://www.aurecongroup.com/apac/disclaimer/ > From wrf at nusculus.com Tue May 31 10:02:44 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Tue, 31 May 2011 10:02:44 -0600 Subject: [Wrf-users] WRF V3.3 error In-Reply-To: References: Message-ID: Hi, Try removing the line with ##sst_update = 1 -- trying to comment out a namelist entry like that does not work for me. Kevin On Sun, May 29, 2011 at 11:52 PM, A.R Ragi wrote: > Dear WRF Users, > > I was running the latest WRF version (3.3) for 1 August 2004 (24 h). The > WPS part runs successfully, but when I run real.exe I get the following > error. I'm attaching the namelist.wps and namelist.input. Can > anyone help me sort this out? > I had done the same with WRF 3.1.1 and it ran successfully. Is this > a problem with Version 3.3? > > > > *taskid: 0 hostname: ajaymeru.cas.iitd.ernet.in > ------ ERROR while reading namelist time_control ------ > Namelist dfi_control not found in namelist.input. Using registry defaults > for variables in dfi_control > Namelist tc not found in namelist.input. Using registry defaults for > variables in tc > ------ ERROR while reading namelist physics ------ > Namelist scm not found in namelist.input. Using registry defaults for > variables in scm > Namelist fire not found in namelist.input. Using registry defaults for > variables in fire > -------------- FATAL CALLED --------------- > FATAL CALLED FROM FILE: LINE: 9094 > ERRORS while reading one or more namelists from namelist.input. > ------------------------------------------- > application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0[unset]: > aborting job: > application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 > > * > Thanks in advance* > * > > *-- * > ************************************************************************* > *Regards,* > *A.R.Ragi* > *Research Scholar > * > *CAS, IIT Delhi* > > ************************************************************************* > *"I want to know how God created this world. > I am not interested in this or that phenomenon. > I want to know his thought; the rest are details." > ---Albert Einstein* > > ************************************************************************* > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110531/93d28d23/attachment.html From FLiu at azmag.gov Wed Jun 1 09:38:51 2011 From: FLiu at azmag.gov (Feng Liu) Date: Wed, 1 Jun 2011 15:38:51 +0000 Subject: [Wrf-users] read_wrf_nc.f error In-Reply-To: References: <9BDE2A7F9712AF45A0C08451B3CD8E5C25770BF7@mag9006> <9BDE2A7F9712AF45A0C08451B3CD8E5C25771E73@mag9006> Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C25771FD1@mag9006> Check your input files 'met_em.d0*' from WPS and see if they are available for the last day. An easy check is whether the last day's met_em.d0* files have the same size as those of the previous days. I noticed that the interval of your input data is 24 hours, i.e. daily. Thanks.
Feng From: Ehsan Beigi [mailto:ebeigi3 at tigers.lsu.edu] Sent: Tuesday, May 31, 2011 8:00 PM To: Feng Liu Cc: wrfhelp; wrf-users at ucar.edu Subject: Re: [Wrf-users] read_wrf_nc.f error Thanks for your previous help. I tried to run WRF for different days by changing namelist.input, and the WRF run was OK for all of the days except the last day; every time I encountered this error. I really appreciate your help in advance. Timing for main: time 2011-04-10_00:00:00 on domain 3: 0.79400 elapsed seconds. Timing for Writing wrfout_d03_2011-04-10_00:00:00 for domain 3: 0.06420 elapsed seconds. Timing for main: time 2011-04-10_00:00:00 on domain 2: 3.06700 elapsed seconds. Timing for Writing wrfout_d02_2011-04-10_00:00:00 for domain 2: 0.03080 elapsed seconds. Timing for main: time 2011-04-10_00:00:00 on domain 1: 9.21270 elapsed seconds. Timing for Writing wrfout_d01_2011-04-10_00:00:00 for domain 1: 0.02410 elapsed seconds. 2 input_wrf: wrf_get_next_time current_date: 2011-04-10_00:00:00 Status = -4 -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 705 ... May have run out of valid boundary conditions in file wrfbdy_d01 Best Regards Ehsan Beigi On Tue, May 31, 2011 at 6:41 PM, Feng Liu > wrote: It sounds good. If you need to extract one variable (VAR) over the entire domain, you can use read_wrf_nc -w VAR; if you need a time series of VAR at a specific location you may use read_wrf_nc -ts xy X Y VAR -lev 1 wrfout_d*.nc You can get basic usage information for the program with read_wrf_nc -help. Use shell programming to process multiple-day runs. Hope it is helpful Feng From: Ehsan Beigi [mailto:ebeigi3 at tigers.lsu.edu] Sent: Tuesday, May 31, 2011 3:18 PM To: Feng Liu Subject: Re: [Wrf-users] read_wrf_nc.f error Thanks very much for your help. It worked! Do you recommend using other software for reading the wrfout files, for example the MATLAB netCDF toolbox? I ran WRF for 30 days; now I don't know how I can extract the numbers for some variables, for example precipitation and temperature, for all grid points for the 30 days among all of the output. As you know there are many variables in the output, and I need to make time series of precipitation and temperature. Best Regards Ehsan Beigi On Tue, May 31, 2011 at 12:07 PM, Feng Liu > wrote: Hi, I think you failed to compile the program successfully due to incomplete flags. I wonder how you compiled this program. If you are using Portland Group PGF90, for example, you may compile the program by: pgf90 read_wrf_nc.f -L/path_of_your_netCDF_lib -lnetcdf -lm -I/path_of_your_netCDF_include -Mfree -o read_wrf_nc Hope this is helpful. Thanks. From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Ehsan Beigi Sent: Monday, May 30, 2011 5:34 PM To: wrfhelp Cc: wrf-users at ucar.edu Subject: [Wrf-users] read_wrf_nc.f error Dear Sir/Madam, Thanks for your previous help. I am trying to use read_wrf_nc.f for extracting my desired variables from the WRF output files. I can see data inside the met_em*.nc files, but I cannot read the WRF output (wrfout_d01...), and this is my error message /read_wrf_nc.f: line 1: Special: command not found information: cannot open `information' (No such file or directory) on: cannot open `on' (No such file or directory) the: cannot open `the' (No such file or directory) screen: cannot open `screen' (No such file or directory) ./read_wrf_nc.f: line 3: Can: command not found ./read_wrf_nc.f: line 4: syntax error near unexpected token `(' ./read_wrf_nc.f: line 4: `! Can read double precision file (like WRF-Var)' How can I extract some desired variables from the wrfout files? I don't care about maps, I just need numbers. I really appreciate your help On Fri, May 27, 2011 at 10:09 PM, wrfhelp > wrote: Yes, they are, unless you change them in the namelist.wps file. They should not change when you change stand_lon.
But you might want to change them if latitudes 30 or 60 are not in your domain. wrfhelp On May 27, 2011, at 10:00 AM, Ehsan Beigi wrote: Dear Sir/Madam, Thanks for your previous help. Are truelat1 = 30.0 and truelat2 = 60.0 in a Lambert projection always the same for every domain? How about stand_lon = -98.0? If they are different for each domain, how can I calculate them for my domain? Best Regards Ehsan Beigi wrfhelp http://www.mmm.ucar.edu/wrf/users/supports/wrfhelp.html -- Ehsan Beigi PhD Student Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110601/25b45ec7/attachment.html From mmkamal at uwaterloo.ca Wed Jun 1 19:48:36 2011 From: mmkamal at uwaterloo.ca (mmkamal at uwaterloo.ca) Date: Wed, 01 Jun 2011 21:48:36 -0400 Subject: [Wrf-users] Changing land use In-Reply-To: References: <358161.27758.qm@web161215.mail.bf1.yahoo.com> <4DDFC1AE.4030401@noaa.gov> <20110531175008.75424udz5aeaasu8@www.nexusmail.uwaterloo.ca> Message-ID: <20110601214836.164529r7glp709no@www.nexusmail.uwaterloo.ca> Hi Hameed, I am sorry that I forgot to include the links for the R package. Here are the links: http://www.r-project.org/ How to read binary data using R: http://www.ats.ucla.edu/stat/r/faq/read_binary.htm To write binary data: http://www.ats.ucla.edu/stat/r/faq/write_binary_bycolumn.htm Hope the above links will help you.
Thanks Kamal Quoting Saji Hameed : > I guess NCL also can be used to modify both the netcdf files (geogrid*.nc) > or the binary source files that geogrid.exe uses... > > saji > > On Wed, Jun 1, 2011 at 6:50 AM, wrote: > >> Hi Hamed, >> >> I am currently working on some land use sensitivity experiment and I >> used an open source statistical package called "R" to read the land >> use binary data (inside the geog directory of the data set). Using R I >> am able to read the data set, then modified some category in my area >> of interest and save the modification. Finally, I fed the model with >> the modified data set and find that it does work properly. You can get >> more detail about "R" in the following link. Or simply make google >> search "read/write binary data using R". >> >> >> Thanks >> Kamal >> >> >> >> >> >> >> >> Quoting Mikhail Titov : >> >> > Hi, >> > >> > I have a program written on C that transforms any input terrestrial >> > ASCII code file (topo, land-use and so on) in WPS binary format: >> > 'rd_wr_binary.exe'. >> > It is written on C as Fortran always leaves empty bits between 2 >> > lines that is not appropriate. This program is 2-direction one and >> > can be used to check >> > binary input terrestrial files transforming them in ASCII code. The >> > program is very flexible and easily can be changed and re-compiled. >> > >> > Of cause after preparation of our own terrestrial files I create new >> > subdirectory (with tiles) and 'index' file in 'geog' and edit >> > "GEOGRID.TBL" in >> > ' WPS/geogrid/' sub-directory to create several pointers on new >> > terrestrial files and to choose an appropriate' interp_option'. >> > >> > We use to create our own terrain and land-use files (using special >> > statistical methods and GIS) all the time for fine resolution WRF runs >> > (1000 - 500m) to study wind resources for different on-shore and >> > off-shore sites. USGS 2m - 30s terrestrial data is too coarse and >> > very often is a crap. 
>> > >> > Regards, >> > >> ________________________________________________________________________________ >> > >> > Dr. Mikhail Titov | Senior Prof. Officer, Energy | Aurecon >> > Ph: +64 3 366 08 21 ext.9231 DDI +64 367 32 31 | Fax: +64 3 379 6955 >> > | Mob: +64 21 106 5563 >> > Email: TitovM at ap.aurecongroup.com >> > Unit 1, 150 Cavendish Road | Christchurch 8051 | New Zealand >> > PO Box 1061 >> > http://www.aurecongroup.com >> > http://www.aurecongroup.com/apac/groupentity/ >> > >> _________________________________________________________________________________ >> > >> > Please consider your environment before printing this e-mail. >> > >> > From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] >> > On Behalf Of Jorge Alejandro Arevalo Borquez >> > Sent: Wednesday, 1 June 2011 3:48 a.m. >> > To: Matt Foster >> > Cc: wrf-users at ucar.edu >> > Subject: Re: [Wrf-users] Changing land use >> > >> > Hi, >> > another option is to modify the binary files of some existing landuse >> > (MODIS or USGS); MATLAB reads and writes those files with almost no >> > problem. >> > >> > Regards, >> > Jorge Arévalo Bórquez >> > >> > On Fri, May 27, 2011 at 11:22 AM, Matt Foster >> > > wrote: >> > Hamed, >> > >> > We have done experiments here in the past, where we modified the >> > greenness fraction based on data from polar-orbiting satellites. We >> > simply modified the geo_em NetCDF file, and it worked well. You >> > should be able to take the same approach with land use. >> > >> > Matt >> > >> > >> > >> > On 5/22/2011 6:06 AM, Hamed Sharifi wrote: >> > Dear All, >> > I want to change some values of my landuse in order to see how it >> > affects the whole domain. For example, >> > change part of the landuse to Forest or Lake. >> > Does anyone know about this issue?
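Matt's geo_em approach quoted above boils down to editing the land-use index field in place. The sketch below shows only the indexing logic, on a plain 2-D list standing in for a geo_em LU_INDEX array; actually loading and saving the variable would need a NetCDF library, and the category numbers used (7 = grassland, 2 = dryland cropland, 16 = water in the 24-class USGS table) are an assumed example.

```python
def replace_landuse(grid, box, old_cat, new_cat):
    # Swap land-use category old_cat for new_cat inside the index box
    # (j0, j1, i0, i1). `grid` stands in for a geo_em LU_INDEX array;
    # with a NetCDF reader the same loop applies to the real variable.
    j0, j1, i0, i1 = box
    for j in range(j0, j1):
        for i in range(i0, i1):
            if grid[j][i] == old_cat:
                grid[j][i] = new_cat
    return grid

# Assumed example: convert grassland (7) to dryland cropland (2) in
# the lower-left 2x2 patch; the water cell (16) is left untouched.
field = [[7, 7, 1],
         [7, 16, 1],
         [1, 1, 1]]
replace_landuse(field, (0, 2, 0, 2), old_cat=7, new_cat=2)
```

Note that related geo_em fields (e.g. the LANDUSEF fractions and LANDMASK) should normally be kept consistent with any edited category values.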
>> > Thanks in advance, >> > >> > Hamed Sharifi, >> > M.Sc Student, AUT Tehran/Iran >> > |hamed319 at yahoo.com | >> > +98-9364024805 | >> > hamed_sharifi at aut.ac.ir | >> > >> > >> > >> > _______________________________________________ >> > >> > Wrf-users mailing list >> > >> > Wrf-users at ucar.edu >> > >> > http://mailman.ucar.edu/mailman/listinfo/wrf-users >> > >> > >> > >> > -- >> > >> > Do not go where the path may lead; go instead where there is no path >> > and leave a trail. >> > >> > -- Ralph Waldo Emerson >> > >> > _______________________________________________ >> > Wrf-users mailing list >> > Wrf-users at ucar.edu >> > http://mailman.ucar.edu/mailman/listinfo/wrf-users >> > >> > >> > Disclaimer - http://www.aurecongroup.com/apac/disclaimer/ >> > >> >> >> >> >> _______________________________________________ >> Wrf-users mailing list >> Wrf-users at ucar.edu >> http://mailman.ucar.edu/mailman/listinfo/wrf-users >> > > > > -- > > Saji N Hameed, > ARC-ENV, Center for Advanced Information Science and Technology, > University of Aizu, Tsuruga, Ikki-machi, > Aizuwakamatsu-shi, Fukushima 965-8580, > Japan > > Tel: +81242 37-2736 > email: saji at u-aizu.ac.jp > url: http://www.u-aizu.ac.jp > bib: http://www.researcherid.com/rid/B-9188-2009 > From rudra.shrestha at postgrad.manchester.ac.uk Thu Jun 2 06:37:31 2011 From: rudra.shrestha at postgrad.manchester.ac.uk (Rudra Shrestha) Date: Thu, 02 Jun 2011 13:37:31 +0100 Subject: [Wrf-users] Ensemble run Message-ID: <1307018251.8039.16.camel@rudrasimona03.seaes.manchester.ac.uk> Dear users, I want to make ensemble run to investigate effect of aerosol and temperature on rainfall. I can make change in the aerosol by changing variables in the microphysics scheme (e.g. Morrison scheme) but don't have very good idea how to make change in the temperature. I tried to change in the potential temperature ('TT' variable in met_em* file) using 'ncsave' function in Matlab. 
However there are significant number of files and it is really time consuming while changing in the individual met_em files. So I was wondering if there are any standard method to make ensemble run in the wrf or make change variables in the met_em* file. I am using NCEP2 data. Any help would be highly appreciated. Many Thanks, Rudra From saji at u-aizu.ac.jp Wed Jun 1 18:56:47 2011 From: saji at u-aizu.ac.jp (Saji Hameed) Date: Thu, 2 Jun 2011 09:56:47 +0900 Subject: [Wrf-users] Changing land use In-Reply-To: References: <358161.27758.qm@web161215.mail.bf1.yahoo.com> <4DDFC1AE.4030401@noaa.gov> Message-ID: Hi Mikhail, Can I have a copy of your C-program? Thanks, Saji On Wed, Jun 1, 2011 at 6:05 AM, Mikhail Titov wrote: > Hi, > > > > I have a program written on C that transforms any input terrestrial ASCII > code file (topo, land-use and so on) in WPS binary format: > ?rd_wr_binary.exe?. > > It is written on C as Fortran always leaves empty bits between 2 lines that > is not appropriate. This program is 2-direction one and can be used to check > > binary input terrestrial files transforming them in ASCII code. The program > is very flexible and easily can be changed and re-compiled. > > > > Of cause after preparation of our own terrestrial files I create new > subdirectory (with tiles) and ?index? file in ?geog? and edit ?GEOGRID.TBL? > in > > ? WPS/geogrid/? sub-directory to create several pointers on new terrestrial > files and to choose an appropriate? interp_option?. > > > > We use to create our own terrain and land-use files (using special > statistical methods and GIS) all the time for fine resolution WRF runs > > (1000 ? 500m) to study wind resources for different on-shore and off-shore > sites. USGS 2m ? 30s terrestrial data is too coarse and very often is a > crap. > > > > Regards, > > * > ________________________________________________________________________________ > * > > * * > > *Dr. Mikhail Titov **| Senior Prof. 
Officer, Energy* *|* Aurecon > > *Ph: *+64 3 366 08 21 ext.9231 DDI +64 367 32 31 *| **Fax:* +64 3 379 6955 > *|* *Mob:* +64 21 106 5563 > > *Email: TitovM at ap.aurecongroup.com * > > Unit 1, 150 Cavendish Road *|* Christchurch 8051 *|* New Zealand > > PO Box 1061 > > http://www.aurecongroup.com > > http://www.aurecongroup.com/apac/groupentity/ > > * > _________________________________________________________________________________ > * > > > > Please consider your environment before printing this e-mail. > > > > *From:* wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] *On > Behalf Of *Jorge Alejandro Arevalo Borquez > *Sent:* Wednesday, 1 June 2011 3:48 a.m. > *To:* Matt Foster > *Cc:* wrf-users at ucar.edu > *Subject:* Re: [Wrf-users] Changing land use > > > > Hi, > other option is modify the binary files of some existing landuse (modis or > usgs), matlab read and write those files whit almost no problem. > > Atentamente > Jorge Ar?valo B?rquez > > On Fri, May 27, 2011 at 11:22 AM, Matt Foster > wrote: > > Hamed, > > We have done experiments here in the past, where we modified the greeness > fraction based on data from polar-orbiting satellites. We simply modified > the geo_em NetCDF file, and it worked well. You should be able to take the > same approach with land use. > > Matt > > > > > On 5/22/2011 6:06 AM, Hamed Sharifi wrote: > > Dear All, > > I want to change some values of my landuse in order to see how it affects > the whole domain. For example, > > change part of the landuse to Forest or Lack. > > Does anyone know about this issue? 
> > Thanks in advance, > > > > Hamed Sharifi, > M.Sc Student, AUT Tehran/Iran |hamed319 at yahoo.com | > +98-9364024805 | hamed_sharifi at aut.ac.ir | > > > > _______________________________________________ > > Wrf-users mailing list > > Wrf-users at ucar.edu > > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > > -- > > Do not go where the path may lead; go instead where there is no path and leave a trail. > > -- Ralph Waldo Emerson > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > > Disclaimer - http://www.aurecongroup.com/apac/disclaimer/ > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- Saji N Hameed, ARC-ENV, Center for Advanced Information Science and Technology, University of Aizu, Tsuruga, Ikki-machi, Aizuwakamatsu-shi, Fukushima 965-8580, Japan Tel: +81242 37-2736 email: saji at u-aizu.ac.jp url: http://www.u-aizu.ac.jp bib: http://www.researcherid.com/rid/B-9188-2009 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110602/39111728/attachment.html From saji at u-aizu.ac.jp Wed Jun 1 18:58:05 2011 From: saji at u-aizu.ac.jp (Saji Hameed) Date: Thu, 2 Jun 2011 09:58:05 +0900 Subject: [Wrf-users] Changing land use In-Reply-To: <20110531175008.75424udz5aeaasu8@www.nexusmail.uwaterloo.ca> References: <358161.27758.qm@web161215.mail.bf1.yahoo.com> <4DDFC1AE.4030401@noaa.gov> <20110531175008.75424udz5aeaasu8@www.nexusmail.uwaterloo.ca> Message-ID: I guess NCL also can be used to modify both the netcdf files (geogrid*.nc) or the binary source files that geogrid.exe uses... 
saji On Wed, Jun 1, 2011 at 6:50 AM, wrote: > Hi Hamed, > > I am currently working on some land use sensitivity experiment and I > used an open source statistical package called "R" to read the land > use binary data (inside the geog directory of the data set). Using R I > am able to read the data set, then modified some category in my area > of interest and save the modification. Finally, I fed the model with > the modified data set and find that it does work properly. You can get > more detail about "R" in the following link. Or simply make google > search "read/write binary data using R". > > > Thanks > Kamal > > > > > > > > Quoting Mikhail Titov : > > > Hi, > > > > I have a program written on C that transforms any input terrestrial > > ASCII code file (topo, land-use and so on) in WPS binary format: > > 'rd_wr_binary.exe'. > > It is written on C as Fortran always leaves empty bits between 2 > > lines that is not appropriate. This program is 2-direction one and > > can be used to check > > binary input terrestrial files transforming them in ASCII code. The > > program is very flexible and easily can be changed and re-compiled. > > > > Of cause after preparation of our own terrestrial files I create new > > subdirectory (with tiles) and 'index' file in 'geog' and edit > > "GEOGRID.TBL" in > > ' WPS/geogrid/' sub-directory to create several pointers on new > > terrestrial files and to choose an appropriate' interp_option'. > > > > We use to create our own terrain and land-use files (using special > > statistical methods and GIS) all the time for fine resolution WRF runs > > (1000 - 500m) to study wind resources for different on-shore and > > off-shore sites. USGS 2m - 30s terrestrial data is too coarse and > > very often is a crap. > > > > Regards, > > > ________________________________________________________________________________ > > > > Dr. Mikhail Titov | Senior Prof. 
Officer, Energy | Aurecon > > Ph: +64 3 366 08 21 ext.9231 DDI +64 367 32 31 | Fax: +64 3 379 6955 > > | Mob: +64 21 106 5563 > > Email: TitovM at ap.aurecongroup.com > > Unit 1, 150 Cavendish Road | Christchurch 8051 | New Zealand > > PO Box 1061 > > http://www.aurecongroup.com > > http://www.aurecongroup.com/apac/groupentity/ > > > _________________________________________________________________________________ > > > > Please consider your environment before printing this e-mail. > > > > From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] > > On Behalf Of Jorge Alejandro Arevalo Borquez > > Sent: Wednesday, 1 June 2011 3:48 a.m. > > To: Matt Foster > > Cc: wrf-users at ucar.edu > > Subject: Re: [Wrf-users] Changing land use > > > > Hi, > > other option is modify the binary files of some existing landuse > > (modis or usgs), matlab read and write those files whit almost no > > problem. > > > > Atentamente > > Jorge Ar?valo B?rquez > > > > On Fri, May 27, 2011 at 11:22 AM, Matt Foster > > > wrote: > > Hamed, > > > > We have done experiments here in the past, where we modified the > > greeness fraction based on data from polar-orbiting satellites. We > > simply modified the geo_em NetCDF file, and it worked well. You > > should be able to take the same approach with land use. > > > > Matt > > > > > > > > On 5/22/2011 6:06 AM, Hamed Sharifi wrote: > > Dear All, > > I want to change some values of my landuse in order to see how it > > affects the whole domain. For example, > > change part of the landuse to Forest or Lack. > > Does anyone know about this issue? 
> > Thanks in advance, > > > > Hamed Sharifi, > > M.Sc Student, AUT Tehran/Iran > > |hamed319 at yahoo.com | > > +98-9364024805 | > > hamed_sharifi at aut.ac.ir | > > > > > > > > _______________________________________________ > > > > Wrf-users mailing list > > > > Wrf-users at ucar.edu > > > > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > > > > > > -- > > > > Do not go where the path may lead; go instead where there is no path > > and leave a trail. > > > > -- Ralph Waldo Emerson > > > > _______________________________________________ > > Wrf-users mailing list > > Wrf-users at ucar.edu > > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > > > > > Disclaimer - http://www.aurecongroup.com/apac/disclaimer/ > > > > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -- Saji N Hameed, ARC-ENV, Center for Advanced Information Science and Technology, University of Aizu, Tsuruga, Ikki-machi, Aizuwakamatsu-shi, Fukushima 965-8580, Japan Tel: +81242 37-2736 email: saji at u-aizu.ac.jp url: http://www.u-aizu.ac.jp bib: http://www.researcherid.com/rid/B-9188-2009 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110602/105f629c/attachment-0001.html From TitovM at ap.aurecongroup.com Thu Jun 2 16:37:52 2011 From: TitovM at ap.aurecongroup.com (Mikhail Titov) Date: Fri, 3 Jun 2011 08:37:52 +1000 Subject: [Wrf-users] Topo and land use ASCII - netCDF transformation program Message-ID: Hello All Users, As there was some interest regarding the program to transform ASCII terrain and land-use in WRF format I would like to send all 3 subroutines to everyone who will need it. Just follow the instructions below and in 'README'. I working on SGI cluster use as input ASCII file format F7.1 but this is flexible and can be changed in the procedure. 
The program is two-way, and you can check your binary files by applying the reverse transformation. Please let me know if you improve the transformation procedure. Good luck and kind regards, Mikhail Find attached the program itself, 'rd_wr_formatted' (executable and Fortran code), and two C subroutines: one to read ('read_geogrid.c') and one to write ('write_geogrid.c'). I also attached a README, as every computer has its niceties. All additional instructions you will find in 'rd_wr_formatted.f90' itself. The program works with formatted ASCII input files prepared using Statistica, TextPad or any piece of code. I work on SGI and use Linux Red Hat. Good luck, Mikhail ________________________________________________________________________________ Dr. Mikhail Titov | Senior Prof. Officer, Energy | Aurecon Ph: +64 3 366 08 21 ext.9231 DDI +64 367 32 31 | Fax: +64 3 379 6955 | Mob: +64 21 106 5563 Email: TitovM at ap.aurecongroup.com Unit 1, 150 Cavendish Road | Christchurch 8051 | New Zealand PO Box 1061 http://www.aurecongroup.com http://www.aurecongroup.com/apac/groupentity/ _________________________________________________________________________________ Please consider your environment before printing this e-mail. From: Feng Liu [mailto:FLiu at azmag.gov] Sent: Wednesday, 1 June 2011 10:01 a.m. To: Mikhail Titov Subject: RE: [Wrf-users] Changing land use Hi Mikhail, It would be highly appreciated if you could share your program with us. Thanks. Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Mikhail Titov Sent: Tuesday, May 31, 2011 2:06 PM To: Jorge Alejandro Arevalo Borquez; Matt Foster Cc: wrf-users at ucar.edu Subject: Re: [Wrf-users] Changing land use Hi, I have a program written in C that transforms any input terrestrial ASCII file (topo, land-use and so on) into WPS binary format: 'rd_wr_binary.exe'. It is written in C because Fortran unformatted I/O always leaves record markers between two lines, which is not appropriate here.
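For readers without a C compiler handy, the flat tile layout that a read_geogrid.c/write_geogrid.c pair deals with can be round-tripped with Python's standard library alone. This is a hedged sketch: it assumes unsigned big-endian integers with a word size of 2 bytes, as a geog `index` file might declare with `wordsize = 2` and `endian = big`; signedness and any `scale_factor` vary by dataset, and the demo file name is a placeholder rather than a real geogrid tile name.

```python
import struct

def write_tile(values, path, wordsize=2):
    # Row-major integers, big-endian, fixed width: the flat layout of
    # a WPS geogrid tile (unsigned assumed; adjust to your index file).
    fmt = {1: "B", 2: "H", 4: "I"}[wordsize]
    with open(path, "wb") as f:
        f.write(struct.pack(">%d%s" % (len(values), fmt), *values))

def read_tile(path, wordsize=2):
    fmt = {1: "B", 2: "H", 4: "I"}[wordsize]
    with open(path, "rb") as f:
        data = f.read()
    return list(struct.unpack(">%d%s" % (len(data) // wordsize, fmt), data))

# Round-trip a tiny 2x2 "tile" of land-use category values.
tile = [7, 7, 16, 2]
write_tile(tile, "demo_tile.bin")
restored = read_tile("demo_tile.bin")
```

Checking that `read_tile(write_tile(...))` reproduces the input is the same consistency test the two-way C program above enables, just at a smaller scale.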
The program works in both directions and can be used to check binary input terrestrial files by transforming them back into ASCII. The program is very flexible and can easily be changed and re-compiled. Of course, after preparation of our own terrestrial files I create a new subdirectory (with tiles) and an 'index' file in 'geog' and edit "GEOGRID.TBL" in the 'WPS/geogrid/' sub-directory to create several pointers to the new terrestrial files and to choose an appropriate 'interp_option'. We create our own terrain and land-use files (using special statistical methods and GIS) all the time for fine-resolution WRF runs (1000 - 500 m) to study wind resources for different on-shore and off-shore sites. USGS 2m - 30s terrestrial data is too coarse and very often of poor quality. Regards, ________________________________________________________________________________ Dr. Mikhail Titov | Senior Prof. Officer, Energy | Aurecon Ph: +64 3 366 08 21 ext.9231 DDI +64 367 32 31 | Fax: +64 3 379 6955 | Mob: +64 21 106 5563 Email: TitovM at ap.aurecongroup.com Unit 1, 150 Cavendish Road | Christchurch 8051 | New Zealand PO Box 1061 http://www.aurecongroup.com http://www.aurecongroup.com/apac/groupentity/ _________________________________________________________________________________ Please consider your environment before printing this e-mail. From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jorge Alejandro Arevalo Borquez Sent: Wednesday, 1 June 2011 3:48 a.m. To: Matt Foster Cc: wrf-users at ucar.edu Subject: Re: [Wrf-users] Changing land use Hi, another option is to modify the binary files of some existing landuse (MODIS or USGS); MATLAB reads and writes those files with almost no problem. Regards, Jorge Arévalo Bórquez On Fri, May 27, 2011 at 11:22 AM, Matt Foster > wrote: Hamed, We have done experiments here in the past, where we modified the greenness fraction based on data from polar-orbiting satellites.
We simply modified the geo_em NetCDF file, and it worked well. You should be able to take the same approach with land use. Matt On 5/22/2011 6:06 AM, Hamed Sharifi wrote: Dear All, I want to change some values of my landuse in order to see how it affects the whole domain. For example, change part of the landuse to Forest or Lake. Does anyone know about this issue? Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran |hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users -- Do not go where the path may lead; go instead where there is no path and leave a trail. -- Ralph Waldo Emerson _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users Disclaimer - http://www.aurecongroup.com/apac/disclaimer/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110603/e9ee1c5c/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: rd_wr_formatted.exe Type: application/octet-stream Size: 2419074 bytes Desc: rd_wr_formatted.exe Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110603/e9ee1c5c/attachment-0001.exe -------------- next part -------------- A non-text attachment was scrubbed... Name: rd_wr_formatted.f90 Type: application/octet-stream Size: 2429 bytes Desc: rd_wr_formatted.f90 Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110603/e9ee1c5c/attachment-0002.obj -------------- next part -------------- An embedded and charset-unspecified text was scrubbed...
Name: write_geogrid.c Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110603/e9ee1c5c/attachment-0002.c -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: read_geogrid.c Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110603/e9ee1c5c/attachment-0003.c -------------- next part -------------- A non-text attachment was scrubbed... Name: README Type: application/octet-stream Size: 793 bytes Desc: README Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110603/e9ee1c5c/attachment-0003.obj From aruny at iitk.ac.in Fri Jun 3 03:43:47 2011 From: aruny at iitk.ac.in (Arun) Date: Fri, 03 Jun 2011 15:13:47 +0530 Subject: [Wrf-users] Internal compiler error while compiling Wrf/chem Message-ID: <4DE8ACD3.5010903@iitk.ac.in> Hi, I'm trying to compile WRF/Chem with KPP turned off.
But I'm getting this internal compiler error while doing so. I'm attaching my log file along with this mail. Any help/suggestion will be highly appreciated. I have no clue about how to move forward from here. The first error is the following: module_ra_gsfcsw.f90:1090: internal compiler error: in emit_move_insn_1, at expr.c:3251 Please submit a full bug report, with preprocessed source if appropriate. See for instructions. make[3]: [module_ra_gsfcsw.o] Error 1 (ignored) The other errors that occur after this stem from this one, as far as I can tell. Regards, Arun -------------- next part -------------- A non-text attachment was scrubbed... Name: compile_chem1.log Type: text/x-log Size: 510077 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110603/b2841584/attachment-0001.bin From saji at u-aizu.ac.jp Thu Jun 2 20:13:43 2011 From: saji at u-aizu.ac.jp (Saji Hameed) Date: Fri, 3 Jun 2011 11:13:43 +0900 Subject: [Wrf-users] Ensemble run In-Reply-To: <1307018251.8039.16.camel@rudrasimona03.seaes.manchester.ac.uk> References: <1307018251.8039.16.camel@rudrasimona03.seaes.manchester.ac.uk> Message-ID: I am not very sure about it, but DART may contain utilities to help create ensembles (disclaimer: I have not yet used DART) http://www.image.ucar.edu/DAReS/DART/ saji On Thu, Jun 2, 2011 at 9:37 PM, Rudra Shrestha < rudra.shrestha at postgrad.manchester.ac.uk> wrote: > Dear users, > I want to make an ensemble run to investigate the effect of aerosol and > temperature on rainfall. I can change the aerosol by changing > variables in the microphysics scheme (e.g. the Morrison scheme) but don't > have a very good idea of how to change the temperature. I tried to > change the potential temperature (the 'TT' variable in the met_em* files) > using the 'ncsave' function in Matlab. However, there are a significant number > of files and it is really time consuming to change the > individual met_em files. 
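The batch edit described in the quoted message, perturbing TT in every met_em file, can be scripted rather than done file by file by hand. A hedged sketch, with the actual netCDF4 loop left commented; the file pattern and the 0.5 K offset are assumptions for illustration:

```python
import numpy as np

# Hedged sketch: apply one fixed potential-temperature perturbation to
# the TT field of every met_em file in a directory.

def perturb(tt, delta_k):
    """Return the TT array shifted by delta_k (Kelvin)."""
    return tt + delta_k

# The I/O would look like (illustrative, not tested here):
# import glob
# from netCDF4 import Dataset
# for path in sorted(glob.glob("met_em.d01.*.nc")):
#     with Dataset(path, "r+") as nc:
#         nc.variables["TT"][:] = perturb(nc.variables["TT"][:], 0.5)

tt = np.array([280.0, 285.0, 290.0])
print(perturb(tt, 0.5).tolist())   # prints [280.5, 285.5, 290.5]
```

Each ensemble member would rerun real.exe on its own perturbed copy of the met_em files.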
So I was wondering if there is any standard > method to make an ensemble run in WRF, or to change variables in the > met_em* files. I am using NCEP2 data. > > Any help would be highly appreciated. > > Many Thanks, > Rudra > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -- Saji N Hameed, ARC-ENV, Center for Advanced Information Science and Technology, University of Aizu, Tsuruga, Ikki-machi, Aizuwakamatsu-shi, Fukushima 965-8580, Japan Tel: +81242 37-2736 email: saji at u-aizu.ac.jp url: http://www.u-aizu.ac.jp bib: http://www.researcherid.com/rid/B-9188-2009 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110603/6651779d/attachment.html From TitovM at ap.aurecongroup.com Tue Jun 7 16:18:13 2011 From: TitovM at ap.aurecongroup.com (Mikhail Titov) Date: Wed, 8 Jun 2011 08:18:13 +1000 Subject: [Wrf-users] Topo and land use ASCII - netCDF transformation - Make file In-Reply-To: References: Message-ID: Hi Ehsan, Sorry I didn't send you, and the other users who were interested in local terrestrial file transformation, the 'Makefile' to compile 'rd_wr_formatted.f90'. Please don't remove the # in "write_geogrid.c" and "read_geogrid.c". Edit and use the 'Makefile' to compile the program; just correct FC, CC and the flags in the 'Makefile'. I use pgf90, and as I can see you use 'ifort'. If you want to swap real for integer input, you can easily do it by editing the 'Makefile' and using integer land-use categories as input. Good luck, Mikhail NB: I don't have any other software to transform ASCII input into binary files. ________________________________________________________________________________ Dr. Mikhail Titov | Senior Prof. 
Officer, Energy | Aurecon Ph: +64 3 366 08 21 ext.9231 DDI +64 367 32 31 | Fax: +64 3 379 6955 | Mob: +64 21 106 5563 Email: TitovM at ap.aurecongroup.com Unit 1, 150 Cavendish Road | Christchurch 8051 | New Zealand PO Box 1061 http://www.aurecongroup.com http://www.aurecongroup.com/apac/groupentity/ _________________________________________________________________________________ Please consider your environment before printing this e-mail. From: Ehsan Beigi [mailto:ebeigi3 at tigers.lsu.edu] Sent: Tuesday, 7 June 2011 8:35 a.m. To: Mikhail Titov Subject: Re: Topo and land use ASCII - netCDF transformation Thanks for your previous help. I removed the # in write_geogrid.c and read_geogrid.c at the start of these lines, to make them compatible with the compilers, by looking at configure.wps: ifdef _UNDERSCORE define read_geogrid read_geogrid_ endif and then I used this command to compile the rd_wr_formatted.f90 code; is it correct: CC=icc FC=ifort ifort rd_wr_formatted.f90 -L/home/ehsan/Documents/Netcdf/4.1/lib -lnetcdf -lm \-I/home/ehsan/Documents/Netcdf/4.1/include -free -o rd_wr_formatted.f90 After typing it on the command line, it gives me: rd_wr_formatted.f90(25): error #6423: This name has already been used as an external function name. [IARGC] if (iargc /= 1) then -------^ rd_wr_formatted.f90(25): error #8497: Illegal use of a procedure name in an expression, possibly a function call missing parenthesis. [IARGC] if (iargc /= 1) then -------^ compilation aborted for rd_wr_formatted.f90 (code 1) CC=icc FC=ifort ifort rd_wr_formatted.f90 -L/home/ehsan/Documents/Netcdf/4.1/lib -lnetcdf -lm \-I/home/ehsan/Documents/Netcdf/4.1/include -free -o rd_wr_formatted.f90 rd_wr_formatted.f90(25): error #6423: This name has already been used as an external function name. [IARGC] if (iargc /= 1) then -------^ rd_wr_formatted.f90(25): error #8497: Illegal use of a procedure name in an expression, possibly a function call missing parenthesis. 
[IARGC] if (iargc /= 1) then -------^ compilation aborted for rd_wr_formatted.f90 (code 1) What should I do? Best Regards, Ehsan Beigi Disclaimer - http://www.aurecongroup.com/apac/disclaimer/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110608/51307505/attachment.html From TitovM at ap.aurecongroup.com Tue Jun 7 16:22:25 2011 From: TitovM at ap.aurecongroup.com (Mikhail Titov) Date: Wed, 8 Jun 2011 08:22:25 +1000 Subject: [Wrf-users] Topo and land use ASCII - netCDF transformation - Make file In-Reply-To: References: Message-ID: It looks like I forgot to attach the file - sorry. Mikhail From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Mikhail Titov Sent: Wednesday, 8 June 2011 10:18 a.m. To: Ehsan Beigi Cc: wrf-users at ucar.edu Subject: Re: [Wrf-users] Topo and land use ASCII - netCDF transformation - Make file Hi Ehsan, Sorry I didn't send you, and the other users who were interested in local terrestrial file transformation, the 'Makefile' to compile 'rd_wr_formatted.f90'. Please don't remove the # in "write_geogrid.c" and "read_geogrid.c". Edit and use the 'Makefile' to compile the program; just correct FC, CC and the flags in the 'Makefile'. I use pgf90, and as I can see you use 'ifort'. If you want to swap real for integer input, you can easily do it by editing the 'Makefile' and using integer land-use categories as input. Good luck, Mikhail NB: I don't have any other software to transform ASCII input into binary files. ________________________________________________________________________________ Dr. Mikhail Titov | Senior Prof. 
Officer, Energy | Aurecon Ph: +64 3 366 08 21 ext.9231 DDI +64 367 32 31 | Fax: +64 3 379 6955 | Mob: +64 21 106 5563 Email: TitovM at ap.aurecongroup.com Unit 1, 150 Cavendish Road | Christchurch 8051 | New Zealand PO Box 1061 http://www.aurecongroup.com http://www.aurecongroup.com/apac/groupentity/ _________________________________________________________________________________ Please consider your environment before printing this e-mail. From: Ehsan Beigi [mailto:ebeigi3 at tigers.lsu.edu] Sent: Tuesday, 7 June 2011 8:35 a.m. To: Mikhail Titov Subject: Re: Topo and land use ASCII - netCDF transformation Thanks for your previous help. I removed the # in write_geogrid.c and read_geogrid.c at the start of these lines, to make them compatible with the compilers, by looking at configure.wps: ifdef _UNDERSCORE define read_geogrid read_geogrid_ endif and then I used this command to compile the rd_wr_formatted.f90 code; is it correct: CC=icc FC=ifort ifort rd_wr_formatted.f90 -L/home/ehsan/Documents/Netcdf/4.1/lib -lnetcdf -lm \-I/home/ehsan/Documents/Netcdf/4.1/include -free -o rd_wr_formatted.f90 After typing it on the command line, it gives me: rd_wr_formatted.f90(25): error #6423: This name has already been used as an external function name. [IARGC] if (iargc /= 1) then -------^ rd_wr_formatted.f90(25): error #8497: Illegal use of a procedure name in an expression, possibly a function call missing parenthesis. [IARGC] if (iargc /= 1) then -------^ compilation aborted for rd_wr_formatted.f90 (code 1) CC=icc FC=ifort ifort rd_wr_formatted.f90 -L/home/ehsan/Documents/Netcdf/4.1/lib -lnetcdf -lm \-I/home/ehsan/Documents/Netcdf/4.1/include -free -o rd_wr_formatted.f90 rd_wr_formatted.f90(25): error #6423: This name has already been used as an external function name. [IARGC] if (iargc /= 1) then -------^ rd_wr_formatted.f90(25): error #8497: Illegal use of a procedure name in an expression, possibly a function call missing parenthesis. 
[IARGC] if (iargc /= 1) then -------^ compilation aborted for rd_wr_formatted.f90 (code 1) What should I do? Best Regards, Ehsan Beigi Disclaimer - http://www.aurecongroup.com/apac/disclaimer/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110608/abe70103/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: Makefile Type: application/octet-stream Size: 521 bytes Desc: Makefile Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110608/abe70103/attachment-0001.obj From mkudsy at gmail.com Tue Jun 7 20:21:07 2011 From: mkudsy at gmail.com (M Kudsy) Date: Wed, 8 Jun 2011 09:21:07 +0700 Subject: [Wrf-users] Titan Radar to WRF Message-ID: Hi, We want to assimilate Titan Radar data into WRF for research purposes. I would very much appreciate any help converting the data to a WRF-readable format. Many thanks, -- Mahally Kudsy Weather Modification Scientist Agency for the Assessment and Application of Technology Jln MH Thamrin 8, Jakarta, Indonesia Telp:62-21-3168830 Fax:62-21-3906225 mkudsy at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110608/89a96070/attachment.html From cak10c at fsu.edu Wed Jun 8 14:27:49 2011 From: cak10c at fsu.edu (Chris Klich) Date: Wed, 8 Jun 2011 16:27:49 -0400 Subject: [Wrf-users] Problem with three domains in WRF/WRF-Chem Message-ID: Hi all, I am currently running WRF/WRF-Chem Version 3.2.1. I have been able to successfully run 2 domains, but running 3 does not seem to work. There is no error in the rsl.error.0000 file; it just stops. I have pasted the rsl.error.0000 file below: taskid: 0 hostname: sully.met.fsu.edu Namelist dfi_control not found in namelist.input. 
Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire Ntasks in X 2 , ntasks in Y 4 --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 2, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 3, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 3, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 3, setting obs nudging interval and ending time to 0 for that domain. 
--- NOTE: num_soil_layers has been set to 4 WRF V3.2.1 MODEL ************************************* Parent domain ids,ide,jds,jde 1 103 1 103 ims,ime,jms,jme -4 58 -4 33 ips,ipe,jps,jpe 1 51 1 26 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1 , 198885272 bytes allocated med_initialdata_input: calling input_input INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES ********************************************************************* * PROGRAM: WRF/CHEM VERSION 3.1.1 * * * * PLEASE REPORT ANY BUGS TO WRF/CHEM HELP at * * * * wrfchemhelp.gsd at noaa.gov * * * ********************************************************************* ************************************* Nesting domain ids,ide,jds,jde 1 211 1 211 ims,ime,jms,jme -4 117 -4 63 ips,ipe,jps,jpe 1 105 1 53 INTERMEDIATE domain ids,ide,jds,jde 13 88 13 88 ims,ime,jms,jme 8 59 8 42 ips,ipe,jps,jpe 11 49 11 32 ************************************* alloc_space_field: domain 2 , 496635684 bytes allocated *** Initializing nest domain # 2 from an input file. 
*** med_initialdata_input: calling input_input INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES ********************************************************************* * PROGRAM: WRF/CHEM VERSION 3.1.1 * * * * PLEASE REPORT ANY BUGS TO WRF/CHEM HELP at * * * * wrfchemhelp.gsd at noaa.gov * * * ********************************************************************* INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES ********************************************************************* * PROGRAM: WRF/CHEM VERSION 3.1.1 * * * * PLEASE REPORT ANY BUGS TO WRF/CHEM HELP at * * * * wrfchemhelp.gsd at noaa.gov * * * ********************************************************************* INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES ********************************************************************* * PROGRAM: WRF/CHEM VERSION 3.1.1 * * * * PLEASE REPORT ANY BUGS TO WRF/CHEM HELP at * * * * wrfchemhelp.gsd at noaa.gov * * * ********************************************************************* Timing for Writing wrfout_d01_2008-04-18_00:00:00 for domain 1: 8.69800 elapsed seconds. mediation_integrate: med_read_wrf_chem_emissions: Read emissions for time 2008-04-18_00:00:00 mediation_integrate: med_read_wrf_chem_emissions: Open file wrfchemi_d01_2008-04-18_00:00:00 Timing for processing lateral boundary for domain 1: 5.72800 elapsed seconds. 
WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 1 WRF TILE 1 IS 1 IE 51 JS 1 JE 26 WRF NUMBER OF TILES = 1 ************************************* Nesting domain ids,ide,jds,jde 1 151 1 151 ims,ime,jms,jme -4 87 -4 48 ips,ipe,jps,jpe 1 75 1 38 INTERMEDIATE domain ids,ide,jds,jde 73 128 38 93 ims,ime,jms,jme 68 109 33 62 ips,ipe,jps,jpe 71 99 36 52 ************************************* alloc_space_field: domain 3 , 304724904 bytes allocated *** Initializing nest domain # 3 from an input file. *** med_initialdata_input: calling input_input INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES You can see that it stops right after this step, and it looks like most of my namelist is right, but I may be missing something. I have also posted the namelist.input below: &time_control run_days = 1, run_hours = 0, run_minutes = 0, run_seconds = 0, start_year = 2008, 2008, 2008,2008,2008, start_month = 04,04,04,04,04, start_day = 18,18,18,18,18, start_hour = 00,00,00,00,00, start_minute = 00, 00, 00, 00,00, start_second = 00, 00, 00, 00,00, end_year = 2008, 2008, 2008, 2008, 2008, end_month = 04,04,04,04,04, end_day = 19,19,19,19,19, end_hour = 00,00,00,00,00, end_minute = 00, 00, 00, 00,00, end_second = 00, 00, 00, 00,00, interval_seconds = 21600, input_from_file = .true.,.true.,.true.,.true.,.true., fine_input_stream = 0, 0, 0, 0, 0, history_interval = 60,60,60,60,60, frames_per_outfile = 1,1,1,1,1, restart = .false., restart_interval = 0, io_form_history = 2, io_form_restart = 2, io_form_input = 2, io_form_boundary = 2, io_form_auxinput3 = 2, io_form_auxinput4 = 2, io_form_auxinput5 = 2, debug_level = 0, auxinput5_interval = 2592000, auxinput11_interval_s = 180, 180, 180, 180, 180, auxinput11_end_h = 6, 6, 6, 6, 6, / &domains time_step = 270, time_step_fract_num = 0, time_step_fract_den = 1, max_dom = 3, s_we = 1, 1, 1, 1, 1, e_we = 
103,211,151,502,, s_sn = 1, 1, 1, 1,1, e_sn = 103,211,151,502,, s_vert = 1, 1, 1, 1,1, e_vert = 35, 35, 35, 35,35, num_metgrid_levels = 27, num_metgrid_soil_levels = 4, eta_levels = 1.000, 0.995, 0.98, 0.963, 0.95,0.9372, 0.9247, 0.9122, 0.9001, 0.888,0.8761, 0.8642, 0.8524, 0.84, 0.8322,0.817, 0.79, 0.76, 0.72, 0.68,0.63, 0.59, 0.54, 0.49, 0.435, 0.38, 0.335, 0.2886, 0.2504, 0.2119,0.1722, 0.1309, 0.0875, 0.0418, 0.000, p_top_requested = 5000, dx = 45000,15000, 5000, , , dy = 45000,15000, 5000, , , grid_id = 1, 2, 3, 4,5, parent_id = 1, 1, 2, 3,4, i_parent_start = 1,15,75,200, j_parent_start = 1,15,40,130, parent_grid_ratio = 1,3,3,3,3, parent_time_step_ratio = 1,3,3,3,3, feedback = 1, smooth_option = 0, use_adaptive_time_step = .true., step_to_output_time = .true., target_cfl = 1.2, 1.2, 1.2, 1.2, 1.2, max_step_increase_pct = 10, 10, 10, 10, 10, starting_time_step = 270, max_time_step = -1, min_time_step = -1, / &physics mp_physics = 2, 2,2,2,2, ra_lw_physics = 1, 1, 1, 1, 1, ra_sw_physics = 2, 2, 2, 2, 2, radt = 10,10,10,10,10, sf_sfclay_physics = 2, 2, 2, 2, 2, sf_surface_physics = 2, 2, 2, 2, 2, bl_pbl_physics = 2, 2, 2, 2, 2, bldt = 0,0,0,0,0, cu_physics = 1, 1,1,1,1, cudt = 5,5,5,5,5, isfflx = 1, ifsnow = 1, icloud = 1, surface_input_source = 1, num_soil_layers = 4, pxlsm_smois_init = 0,0,0,0,0, mp_zero_out = 2, sf_urban_physics = 1, maxiens = 1, maxens = 3, maxens2 = 3, maxens3 = 16, ensdim = 144, sst_update = 0, cu_rad_feedback = .true., cugd_avedx = 1, / &fdda obs_nudge_opt = 0,0,0,0,0 max_obs = 150000, fdda_start = 0., 0., 0., 0., 0. fdda_end = 99999., 99999., 99999., 99999., 99999. 
obs_nudge_wind = 1,1,1,1,1 obs_coef_wind = 6.E-4,6.E-4,6.E-4,6.E-4,6.E-4 obs_nudge_temp = 1,1,1,1,1 obs_coef_temp = 6.E-4,6.E-4,6.E-4,6.E-4,6.E-4 obs_nudge_mois = 1,1,1,1,1 obs_coef_mois = 6.E-4,6.E-4,6.E-4,6.E-4,6.E-4 obs_rinxy = 240.,240.,180.,180,180 obs_rinsig = 0.1, obs_twindo = 0.6666667,0.6666667,0.6666667,0.6666667,0.6666667, obs_npfi = 10, obs_ionf = 2, 2, 2, 2, 2, obs_idynin = 0, obs_dtramp = 40., obs_prt_freq = 10, 10, 10, 10, 10, obs_prt_max = 10 obs_ipf_errob = .true. obs_ipf_nudob = .true. obs_ipf_in4dob = .true. obs_ipf_init = .true. / &dynamics rk_ord = 3, w_damping = 0, diff_opt = 1, km_opt = 4, diff_6th_opt = 0,0,0,0,0, diff_6th_factor = 0.12,0.12,0.12,0.12,0.12, damp_opt = 1, base_temp = 290., zdamp = 5000., 5000., 5000., 5000.,5000., dampcoef = 0.01, 0.01, 0.01, 0.01, 0.01, khdif = 0, 0, 0, 0,0, kvdif = 0, 0, 0,0,0, smdiv = 0.1, emdiv = 0.01, epssm = 0.1, time_step_sound = 4, h_mom_adv_order = 5, v_mom_adv_order = 3, h_sca_adv_order = 5, v_sca_adv_order = 3, non_hydrostatic = .true., .true., .true.,.true.,.true., moist_adv_opt = 2, 2, 2, scalar_adv_opt = 2, 2, 2, chem_adv_opt = 2, 2, 2, / &bdy_control spec_bdy_width = 5, spec_zone = 1, relax_zone = 4, specified = .true., .false.,.false.,.false.,.false., nested = .false., .true., .true.,.true.,.true., / &grib2 / &chem kemit = 1, chem_opt = 301, 301, 301, bioemdt = 30, 30, 30, photdt = 30, 30, 30, chemdt = 270,270,270, frames_per_emissfile = 1, io_style_emissions = 2, emiss_inpt_opt = 1, 1, 1, emiss_opt = 5, 5, 5, chem_in_opt = 0, 0, 0, phot_opt = 0, 0, 0, gas_drydep_opt = 1, 1, 1, bio_emiss_opt = 0, 0, 0, dust_opt = 0, dmsemis_opt = 0, seas_opt = 0, gas_bc_opt = 1, 1, 1, gas_ic_opt = 1, 1, 1, aer_bc_opt = 1, 1, 1, aer_ic_opt = 1, 1, 1, gaschem_onoff = 1, 1, 1, aerchem_onoff = 1, 1, 1, wetscav_onoff = 0, 0, 1, cldchem_onoff = 0, 0, 1, vertmix_onoff = 1, 1, 1, chem_conv_tr = 1, 1, 1, biomass_burn_opt = 0, 0, 0, plumerisefire_frq = 270, 270, 270, aer_ra_feedback = 0, 0, 0, have_bcs_chem = .false., 
.false., .false., / &namelist_quilt nio_tasks_per_group = 0, nio_groups = 1, / Any help would be greatly appreciated. I am using self-made scripts for most of the running process, but that hasn't caused any errors up until this point. Chris -- Christopher Klich Meteorology Graduate Student Florida State University (908) 208-9743 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110608/263ce568/attachment-0001.html From veiga.uea at gmail.com Thu Jun 9 15:33:24 2011 From: veiga.uea at gmail.com (Jose Augusto Paixão Veiga) Date: Thu, 9 Jun 2011 17:33:24 -0400 Subject: [Wrf-users] WRF Climate version Message-ID: Dear all, I usually do short runs with the WRF-ARW model, i.e., 4-5 day simulations of weather events, but now I would like to run WRF for climate experiments. Thus, I would like to know if anyone knows whether there is a climate version of the WRF model. Thanks in advance, José Augusto P. Veiga, ====================================== Universidade do Estado do Amazonas Departamento de Meteorologia Escola Superior de Tecnologia (EST) ----------------------------------------------------------------------------- Av. Darcy Vargas, 1200, Manaus-AM Brasil Work phone: (92) 3878 4333, Ramal 4333 Cell phone : (92) 8196 7122 Skype: veiga_j.a.p. ----------------------------------------------------------------------------- CV: http://lattes.cnpq.br/4027612512091565 URL:http://scientificmet.wordpress.com/ ----------------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110609/7c5c7155/attachment.html From hamed319 at yahoo.com Fri Jun 10 03:55:02 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Fri, 10 Jun 2011 02:55:02 -0700 (PDT) Subject: [Wrf-users] rd_wr_formatted.f90 compiling with ifort and icc Message-ID: <987173.33932.qm@web161204.mail.bf1.yahoo.com> Dear WRF Users, Has anyone compiled "rd_wr_formatted.f90" with the ifort & icc compilers? I used the FLAGS from configure.wps, but the makefile gave me this error: --------------------------------------------------------------------------------- ifort -c -FR -convert big_endian rd_wr_formatted.f90 rd_wr_formatted.f90(25): error #6423: This name has already been used as an external function name. [IARGC] if (iargc /= 1) then -------^ rd_wr_formatted.f90(25): error #8497: Illegal use of a procedure name in an expression, possibly a function call missing parenthesis. [IARGC] if (iargc /= 1) then -------^ compilation aborted for rd_wr_formatted.f90 (code 1) make: *** [rd_wr_formatted.o] Error 1 --------------------------------------------------------------------------------- I'm looking forward to an answer. Thanks in advance, Hamed Sharifi, M.Sc Student, AUT Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110610/78392a64/attachment.html From ebeigi3 at tigers.lsu.edu Fri Jun 10 12:09:13 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Fri, 10 Jun 2011 14:09:13 -0400 Subject: [Wrf-users] Changing ASCII or binary format meteorological data to intermediate format Message-ID: Dear Sir/Madam, I am going to ingest a new meteorological dataset in ASCII and binary format, and as you may know, this requires writing code to convert the ASCII or binary data to the WPS intermediate format. 
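A converter like this ultimately has to emit the WPS intermediate format, which is a stream of big-endian Fortran unformatted sequential records. A minimal hedged sketch of just the record framing in Python; the per-record field layout is documented in the WPS/ungrib chapter of the WRF User's Guide and is not reproduced here, and the version number 5 reflects the current format as described there:

```python
import struct

# Each Fortran unformatted sequential record is framed by 4-byte
# big-endian record-length words: [len][payload][len].

def fortran_record(payload: bytes) -> bytes:
    """Frame payload as one big-endian Fortran unformatted record."""
    n = struct.pack(">i", len(payload))
    return n + payload + n

# The first record of an intermediate file holds the integer format
# version (assumed 5 here), so its payload is a single 4-byte integer.
rec = fortran_record(struct.pack(">i", 5))
print(len(rec), rec[:4].hex())   # prints: 12 00000004
```

Subsequent records (header, projection parameters, and the 2-D data slab) are framed the same way, one `fortran_record` call each.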
Has anyone done it before? I would really appreciate your help. Kind Regards, -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110610/9f561686/attachment.html From FLiu at azmag.gov Fri Jun 10 16:10:22 2011 From: FLiu at azmag.gov (Feng Liu) Date: Fri, 10 Jun 2011 22:10:22 +0000 Subject: [Wrf-users] Problem with three domains in WRF/WRF-Chem In-Reply-To: References: Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C257777B1@mag9006> The resolution of the nested domain (5 km) is higher than when you ran 2 domains, so reduce your time step from 270 to 90 and try again. Please keep in mind that it will be computationally expensive if it works with this solution. Thanks. Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Chris Klich Sent: Wednesday, June 08, 2011 1:28 PM To: wrf-users at ucar.edu Subject: [Wrf-users] Problem with three domains in WRF/WRF-Chem Hi all, I am currently running WRF/WRF-Chem Version 3.2.1. I have been able to successfully run 2 domains, but running 3 has not seemed to work. There is no error in the rsl.error.0000 file, it just stops. I have pasted the rsl.error.0000 file below: taskid: 0 hostname: sully.met.fsu.edu Namelist dfi_control not found in namelist.input. 
Using registry defaults for variables in fire Ntasks in X 2 , ntasks in Y 4 --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 2, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 3, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 3, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 3, setting obs nudging interval and ending time to 0 for that domain. 
--- NOTE: num_soil_layers has been set to 4 WRF V3.2.1 MODEL ************************************* Parent domain ids,ide,jds,jde 1 103 1 103 ims,ime,jms,jme -4 58 -4 33 ips,ipe,jps,jpe 1 51 1 26 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1 , 198885272 bytes allocated med_initialdata_input: calling input_input INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES ********************************************************************* * PROGRAM: WRF/CHEM VERSION 3.1.1 * * * * PLEASE REPORT ANY BUGS TO WRF/CHEM HELP at * * * * wrfchemhelp.gsd at noaa.gov * * * ********************************************************************* ************************************* Nesting domain ids,ide,jds,jde 1 211 1 211 ims,ime,jms,jme -4 117 -4 63 ips,ipe,jps,jpe 1 105 1 53 INTERMEDIATE domain ids,ide,jds,jde 13 88 13 88 ims,ime,jms,jme 8 59 8 42 ips,ipe,jps,jpe 11 49 11 32 ************************************* alloc_space_field: domain 2 , 496635684 bytes allocated *** Initializing nest domain # 2 from an input file. 
*** med_initialdata_input: calling input_input INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES ********************************************************************* * PROGRAM: WRF/CHEM VERSION 3.1.1 * * * * PLEASE REPORT ANY BUGS TO WRF/CHEM HELP at * * * * wrfchemhelp.gsd at noaa.gov * * * ********************************************************************* INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES ********************************************************************* * PROGRAM: WRF/CHEM VERSION 3.1.1 * * * * PLEASE REPORT ANY BUGS TO WRF/CHEM HELP at * * * * wrfchemhelp.gsd at noaa.gov * * * ********************************************************************* INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES ********************************************************************* * PROGRAM: WRF/CHEM VERSION 3.1.1 * * * * PLEASE REPORT ANY BUGS TO WRF/CHEM HELP at * * * * wrfchemhelp.gsd at noaa.gov * * * ********************************************************************* Timing for Writing wrfout_d01_2008-04-18_00:00:00 for domain 1: 8.69800 elapsed seconds. mediation_integrate: med_read_wrf_chem_emissions: Read emissions for time 2008-04-18_00:00:00 mediation_integrate: med_read_wrf_chem_emissions: Open file wrfchemi_d01_2008-04-18_00:00:00 Timing for processing lateral boundary for domain 1: 5.72800 elapsed seconds. 
WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 1 WRF TILE 1 IS 1 IE 51 JS 1 JE 26 WRF NUMBER OF TILES = 1 ************************************* Nesting domain ids,ide,jds,jde 1 151 1 151 ims,ime,jms,jme -4 87 -4 48 ips,ipe,jps,jpe 1 75 1 38 INTERMEDIATE domain ids,ide,jds,jde 73 128 38 93 ims,ime,jms,jme 68 109 33 62 ips,ipe,jps,jpe 71 99 36 52 ************************************* alloc_space_field: domain 3 , 304724904 bytes allocated *** Initializing nest domain # 3 from an input file. *** med_initialdata_input: calling input_input INPUT LandUse = "USGS" INITIALIZE THREE Noah LSM RELATED TABLES LANDUSE TYPE = USGS FOUND 27 CATEGORIES INPUT SOIL TEXTURE CLASSIFICAION = STAS SOIL TEXTURE CLASSIFICATION = STAS FOUND 19 CATEGORIES You can see that it stops right after this step, and it looks like most of my namelist is right, but I may be missing something. I have also posted the namelist.input below: &time_control run_days = 1, run_hours = 0, run_minutes = 0, run_seconds = 0, start_year = 2008, 2008, 2008,2008,2008, start_month = 04,04,04,04,04, start_day = 18,18,18,18,18, start_hour = 00,00,00,00,00, start_minute = 00, 00, 00, 00,00, start_second = 00, 00, 00, 00,00, end_year = 2008, 2008, 2008, 2008, 2008, end_month = 04,04,04,04,04, end_day = 19,19,19,19,19, end_hour = 00,00,00,00,00, end_minute = 00, 00, 00, 00,00, end_second = 00, 00, 00, 00,00, interval_seconds = 21600, input_from_file = .true.,.true.,.true.,.true.,.true., fine_input_stream = 0, 0, 0, 0, 0, history_interval = 60,60,60,60,60, frames_per_outfile = 1,1,1,1,1, restart = .false., restart_interval = 0, io_form_history = 2, io_form_restart = 2, io_form_input = 2, io_form_boundary = 2, io_form_auxinput3 = 2, io_form_auxinput4 = 2, io_form_auxinput5 = 2, debug_level = 0, auxinput5_interval = 2592000, auxinput11_interval_s = 180, 180, 180, 180, 180, auxinput11_end_h = 6, 6, 6, 6, 6, / &domains time_step = 270, time_step_fract_num = 0, time_step_fract_den = 1, max_dom = 3, s_we = 1, 1, 1, 1, 1, e_we = 
103,211,151,502,, s_sn = 1, 1, 1, 1,1, e_sn = 103,211,151,502,, s_vert = 1, 1, 1, 1,1, e_vert = 35, 35, 35, 35,35, num_metgrid_levels = 27, num_metgrid_soil_levels = 4, eta_levels = 1.000, 0.995, 0.98, 0.963, 0.95,0.9372, 0.9247, 0.9122, 0.9001, 0.888,0.8761, 0.8642, 0.8524, 0.84, 0.8322,0.817, 0.79, 0.76, 0.72, 0.68,0.63, 0.59, 0.54, 0.49, 0.435, 0.38, 0.335, 0.2886, 0.2504, 0.2119,0.1722, 0.1309, 0.0875, 0.0418, 0.000, p_top_requested = 5000, dx = 45000,15000, 5000, , , dy = 45000,15000, 5000, , , grid_id = 1, 2, 3, 4,5, parent_id = 1, 1, 2, 3,4, i_parent_start = 1,15,75,200, j_parent_start = 1,15,40,130, parent_grid_ratio = 1,3,3,3,3, parent_time_step_ratio = 1,3,3,3,3, feedback = 1, smooth_option = 0, use_adaptive_time_step = .true., step_to_output_time = .true., target_cfl = 1.2, 1.2, 1.2, 1.2, 1.2, max_step_increase_pct = 10, 10, 10, 10, 10, starting_time_step = 270, max_time_step = -1, min_time_step = -1, / &physics mp_physics = 2, 2,2,2,2, ra_lw_physics = 1, 1, 1, 1, 1, ra_sw_physics = 2, 2, 2, 2, 2, radt = 10,10,10,10,10, sf_sfclay_physics = 2, 2, 2, 2, 2, sf_surface_physics = 2, 2, 2, 2, 2, bl_pbl_physics = 2, 2, 2, 2, 2, bldt = 0,0,0,0,0, cu_physics = 1, 1,1,1,1, cudt = 5,5,5,5,5, isfflx = 1, ifsnow = 1, icloud = 1, surface_input_source = 1, num_soil_layers = 4, pxlsm_smois_init = 0,0,0,0,0, mp_zero_out = 2, sf_urban_physics = 1, maxiens = 1, maxens = 3, maxens2 = 3, maxens3 = 16, ensdim = 144, sst_update = 0, cu_rad_feedback = .true., cugd_avedx = 1, / &fdda obs_nudge_opt = 0,0,0,0,0 max_obs = 150000, fdda_start = 0., 0., 0., 0., 0. fdda_end = 99999., 99999., 99999., 99999., 99999. 
obs_nudge_wind = 1,1,1,1,1 obs_coef_wind = 6.E-4,6.E-4,6.E-4,6.E-4,6.E-4 obs_nudge_temp = 1,1,1,1,1 obs_coef_temp = 6.E-4,6.E-4,6.E-4,6.E-4,6.E-4 obs_nudge_mois = 1,1,1,1,1 obs_coef_mois = 6.E-4,6.E-4,6.E-4,6.E-4,6.E-4 obs_rinxy = 240.,240.,180.,180,180 obs_rinsig = 0.1, obs_twindo = 0.6666667,0.6666667,0.6666667,0.6666667,0.6666667, obs_npfi = 10, obs_ionf = 2, 2, 2, 2, 2, obs_idynin = 0, obs_dtramp = 40., obs_prt_freq = 10, 10, 10, 10, 10, obs_prt_max = 10 obs_ipf_errob = .true. obs_ipf_nudob = .true. obs_ipf_in4dob = .true. obs_ipf_init = .true. / &dynamics rk_ord = 3, w_damping = 0, diff_opt = 1, km_opt = 4, diff_6th_opt = 0,0,0,0,0, diff_6th_factor = 0.12,0.12,0.12,0.12,0.12, damp_opt = 1, base_temp = 290., zdamp = 5000., 5000., 5000., 5000.,5000., dampcoef = 0.01, 0.01, 0.01, 0.01, 0.01, khdif = 0, 0, 0, 0,0, kvdif = 0, 0, 0,0,0, smdiv = 0.1, emdiv = 0.01, epssm = 0.1, time_step_sound = 4, h_mom_adv_order = 5, v_mom_adv_order = 3, h_sca_adv_order = 5, v_sca_adv_order = 3, non_hydrostatic = .true., .true., .true.,.true.,.true., moist_adv_opt = 2, 2, 2, scalar_adv_opt = 2, 2, 2, chem_adv_opt = 2, 2, 2, / &bdy_control spec_bdy_width = 5, spec_zone = 1, relax_zone = 4, specified = .true., .false.,.false.,.false.,.false., nested = .false., .true., .true.,.true.,.true., / &grib2 / &chem kemit = 1, chem_opt = 301, 301, 301, bioemdt = 30, 30, 30, photdt = 30, 30, 30, chemdt = 270,270,270, frames_per_emissfile = 1, io_style_emissions = 2, emiss_inpt_opt = 1, 1, 1, emiss_opt = 5, 5, 5, chem_in_opt = 0, 0, 0, phot_opt = 0, 0, 0, gas_drydep_opt = 1, 1, 1, bio_emiss_opt = 0, 0, 0, dust_opt = 0, dmsemis_opt = 0, seas_opt = 0, gas_bc_opt = 1, 1, 1, gas_ic_opt = 1, 1, 1, aer_bc_opt = 1, 1, 1, aer_ic_opt = 1, 1, 1, gaschem_onoff = 1, 1, 1, aerchem_onoff = 1, 1, 1, wetscav_onoff = 0, 0, 1, cldchem_onoff = 0, 0, 1, vertmix_onoff = 1, 1, 1, chem_conv_tr = 1, 1, 1, biomass_burn_opt = 0, 0, 0, plumerisefire_frq = 270, 270, 270, aer_ra_feedback = 0, 0, 0, have_bcs_chem = .false., 
.false., .false., / &namelist_quilt nio_tasks_per_group = 0, nio_groups = 1, / Any help would be greatly appreciated. I am using self-made scripts for most of the running process, but they have not caused any errors up to this point. Chris -- Christopher Klich Meteorology Graduate Student Florida State University (908) 208-9743 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110610/7559aefe/attachment-0001.html

From mmkamal at uwaterloo.ca Mon Jun 13 20:31:38 2011 From: mmkamal at uwaterloo.ca (mmkamal at uwaterloo.ca) Date: Mon, 13 Jun 2011 22:31:38 -0400 Subject: [Wrf-users] WPP3.2 compilation error (undefined reference..........) Message-ID: <20110613223138.87572jx6rskohgys@www.nexusmail.uwaterloo.ca> Hi All, This is the first time I am trying to compile WPP3.2, but I could not succeed. I have successfully compiled (dmpar) and tested WRFV3.2.1 with the ifort and icc compilers (11.1.072), and I am trying to compile WPP serially with the same compilers.
At the end of the compilation I am getting the following error: ================================================================================ ifort -free -O3 -xT -fp-model precise -assume byterecl -convert big_endian -fpe0 -g -traceback -Wl,-noinhibit-exec -o ../../exec/wrfpost.exe getVariable.o getIVariable.o getVariableB.o getIVariableN.o getVariableRSM.o kinds_mod.o MASKS_mod.o SOIL_mod.o VRBLS2D_mod.o VRBLS3D_mod.o BNDLYR.o BOUND.o CALCAPE.o CALDWP.o CALDRG.o CALHEL.o CALLCL.o CALMCVG.o CALMXW.o CALPBL.o CALPBLREGIME.o CALPOT.o CALPW.o CALRH.o CALRCH.o CALSTRM.o CALTAU.o CALTHTE.o CALVIS.o CALVOR.o CALWXT.o CALWXT_RAMER.o CALWXT_BOURG.o CALWXT_REVISED.o CALWXT_EXPLICIT.o CALWXT_DOMINANT.o CLDRAD.o CLMAX.o COLLECT.o COLLECT_LOC.o DEWPOINT.o FDLVL.o FIXED.o FRZLVL.o FRZLVL2.o GET_BITS.o GRIBIT.o INITPOST.o LFMFLD.o INITPOST_BIN.o MISCLN.o MIXLEN.o MDL2P.o MDLFLD.o MSFPS.o MPI_FIRST.o MPI_LAST.o NGMFLD.o NGMSLP.o OTLFT.o OTLIFT.o SLP_new.o SLP_NMM.o EXCH.o PARA_RANGE.o POSTDATA.o PROCESS.o INITPOST_NMM.o EXCH2.o READCNTRL.o SCLFLD.o SERVER.o SETUP_SERVERS.o SURFCE.o SPLINE.o TABLE.o TABLEQ.o TRANSPORTWIND.o TRPAUS.o TTBLEX.o WETBULB.o WRFPOST.o INITPOST_NMM_BIN.o CALMICT.o MICROINIT.o GPVS.o MDL2SIGMA.o ETCALC.o CANRES.o CALGUST.o WETFRZLVL.o SNFRAC.o SNFRAC_GFS.o MDL2AGL.o INITPOST_RSM.o AVIATION.o DEALLOCATE.o /scratch/mkamal/WRF/WPPV3/../WRFV3/frame/module_internal_header_util.o /scratch/mkamal/WRF/WPPV3/../WRFV3/frame/pack_utils.o /scratch/mkamal/WRF/WPPV3/../WRFV3/frame/wrf_debug.o /scratch/mkamal/WRF/WPPV3/../WRFV3/frame/module_wrf_error.o -L/scratch/mkamal/WRF/WPPV3/../WRFV3/external/io_int -lwrfio_int /scratch/mkamal/WRF/WPPV3/../WRFV3/main/libwrflib.a -L/scratch/mkamal/WRF/WPPV3/../WRFV3/external/fftpack/fftpack5 -lfftpack -L/scratch/mkamal/WRF/WPPV3/../WRFV3/external/io_netcdf -lwrfio_nf -L/scratch/mkamal/WRF/WPPV3/../WRFV3/external/RSL_LITE -lrsl_lite -L/scratch/mkamal/WRF/WPPV3/../WRFV3/external/io_grib1 -lio_grib1 
-L/scratch/mkamal/WRF/WPPV3/../WRFV3/external/io_grib_share -lio_grib_share -L/scratch/mkamal/WRF/WPPV3/../WRFV3/external/esmf_time_f90 -lesmf_time -L/scratch/mkamal/WRF/WPPV3/../WRFV3/external/io_netcdf -lwrfio_nf ../../lib/libw3.a ../../lib/libmpi.a -L/scratch/mkamal/WRF/WPPV3/netcdf_links/lib -lnetcdf /scratch/mkamal/WRF/WPPV3/../WRFV3/main/libwrflib.a(collect_on_comm.o): In function `collect_on_comm_': collect_on_comm.c:(.text+0x26): undefined reference to `MPI_Comm_f2c' collect_on_comm.c:(.text+0x79): undefined reference to `ompi_mpi_int' collect_on_comm.c:(.text+0x27a): undefined reference to `ompi_mpi_char' collect_on_comm.c:(.text+0x307): undefined reference to `ompi_mpi_comm_world' /scratch/mkamal/WRF/WPPV3/../WRFV3/main/libwrflib.a(collect_on_comm.o): In function `collect_on_comm0_': collect_on_comm.c:(.text+0x344): undefined reference to `MPI_Comm_f2c' collect_on_comm.c:(.text+0x391): undefined reference to `ompi_mpi_int' collect_on_comm.c:(.text+0x592): undefined reference to `ompi_mpi_char' collect_on_comm.c:(.text+0x622): undefined reference to `ompi_mpi_comm_world' /scratch/mkamal/WRF/WPPV3/../WRFV3/main/libwrflib.a(collect_on_comm.o): In function `col_on_comm': ' ' ' ' ' ' ' ' f_xpose.f:(.text+0x177d): undefined reference to `mpi_alltoallv_' /scratch/mkamal/WRF/WPPV3/../WRFV3/external/RSL_LITE/librsl_lite.a(f_xpose.o): In function `trans_x2y_': f_xpose.f:(.text+0x2be2): undefined reference to `mpi_alltoallv_' f_xpose.f:(.text+0x30e3): undefined reference to `mpi_alltoallv_' make[2]: Leaving directory `/scratch/mkamal/WRF/WPPV3/sorc/wrfpost' make[1]: Leaving directory `/scratch/mkamal/WRF/WPPV3/sorc' =============================================================== configure.wpp =============================================================== SHELL = /bin/sh # Listing of options that are usually independent of machine type. # When necessary, these are over-ridden by each architecture. 
ARFLAGS = #### Architecture specific settings #### # Settings for Linux x86_64, Intel compiler (serial)# LDFLAGS = -Wl,-noinhibit-exec SFC=ifort SF90=ifort -free SCC=icc FFLAGS=-O3 -xT -fp-model precise -assume byterecl -convert big_endian -fpe0 -g -traceback $(LDFLAGS) DM_FC=ifort DM_F90=ifort -free DM_CC=icc FC = $(SFC) F90 = $(SF90) CC = $(SCC) CFLAGS=-O0 -DLINUX ########################################################### # # Macros, these should be generic for all machines LN = ln -sf MAKE = make -i -r RM = /bin/rm -f CP = /bin/cp AR = ar ru WRF_INCLUDE = -I$(WRF_DIR)/frame WRF_LIB = $(WRF_DIR)/frame/module_internal_header_util.o \ $(WRF_DIR)/frame/pack_utils.o \ $(WRF_DIR)/frame/wrf_debug.o \ $(WRF_DIR)/frame/module_wrf_error.o \ -L$(WRF_DIR)/external/io_int -lwrfio_int \ $(WRF_DIR)/main/libwrflib.a \ -L$(WRF_DIR)/external/fftpack/fftpack5 -lfftpack \ -L$(WRF_DIR)/external/io_netcdf -lwrfio_nf \ -L$(WRF_DIR)/external/RSL_LITE -lrsl_lite \ -L$(WRF_DIR)/external/io_grib1 -lio_grib1 \ -L$(WRF_DIR)/external/io_grib_share -lio_grib_share \ -L$(WRF_DIR)/external/esmf_time_f90 -lesmf_time \ -L$(WRF_DIR)/external/io_netcdf -lwrfio_nf WRF_LIB2 = NETCDFPATH = /scratch/mkamal/WRF/WPPV3/netcdf_links NETCDFLIBS = -lnetcdf WRF_DIR = /scratch/mkamal/WRF/WPPV3/../WRFV3 COMMS_ADD_OBJ = COMMS_ADD_OBJST = COMMS_LIB = libmpi.a MAIN_OBJ = WRFPOST.o .IGNORE: .SUFFIXES: .c .f .F .o # There is probably no reason to modify these rules .c.o: $(RM) $@ $(CC) $(CPPFLAGS) $(CFLAGS) -c $< .f.o: $(RM) $@ $*.mod $(FC) $(F77FLAGS) -c $< $(WRF_INCLUDE) .F.o: $(RM) $@ $*.mod $(CPP) $(CPPFLAGS) $(FDEFS) $(WRF_INCLUDE) $< > $*.f90 $(FC) $(FFLAGS) -c $*.f90 $(WRF_INCLUDE) # $(RM) $*.f90 ====================================================================== confuguration.wrf ====================================================================== #### Architecture specific settings #### # Settings for Linux x86_64 i486 i586 i686, ifort compiler with icc (dmpar) DMPARALLEL = 1 OMPCPP = # -D_OPENMP OMP 
= # -openmp -fpp -auto OMPCC = # -openmp -fpp -auto SFC = ifort SCC = icc CCOMP = icc DM_FC = mpif90 # -f90=$(SFC) DM_CC = mpicc -DMPI2_SUPPORT # -cc=$(SCC) FC = $(DM_FC) CC = $(DM_CC) -DFSEEKO64_OK LD = $(FC) RWORDSIZE = $(NATIVE_RWORDSIZE) PROMOTION = -i4 ARCH_LOCAL = -DNONSTANDARD_SYSTEM_FUNC CFLAGS_LOCAL = -w -O3 -ip LDFLAGS_LOCAL = -ip CPLUSPLUSLIB = ESMF_LDFLAG = $(CPLUSPLUSLIB) FCOPTIM = -O3 FCREDUCEDOPT = $(FCOPTIM) FCNOOPT = -O0 -fno-inline -fno-ip FCDEBUG = # -g $(FCNOOPT) -traceback FORMAT_FIXED = -FI FORMAT_FREE = -FR FCSUFFIX = BYTESWAPIO = -convert big_endian FCBASEOPTS_NO_G = -w -ftz -align all -fno-alias -fp-model precise $(FORMAT_FREE) $(BYTESWAPIO) FCBASEOPTS = $(FCBASEOPTS_NO_G) $(FCDEBUG) MODULE_SRCH_FLAG = TRADFLAG = -traditional CPP = /lib/cpp -C -P AR = ar ARFLAGS = ru M4 = m4 RANLIB = ranlib CC_TOOLS = $(SCC) ########################################################### ###################### # POSTAMBLE FGREP = fgrep -iq ARCHFLAGS = $(COREDEFS) -DIWORDSIZE=$(IWORDSIZE) -DDWORDSIZE=$(DWORDSIZE) -DRWORDSIZE=$(RWORDSIZE) -DLWORDSIZE=$(LWORDSIZE) \ $(ARCH_LOCAL) \ $(DA_ARCHFLAGS) \ -DDM_PARALLEL \ \ -DNETCDF \ \ \ \ \ \ \ -DUSE_ALLOCATABLES \ -DGRIB1 \ -DINTIO \ -DLIMIT_ARGS \ -DCONFIG_BUF_LEN=$(CONFIG_BUF_LEN) \ -DMAX_DOMAINS_F=$(MAX_DOMAINS) \ -DMAX_HISTORY=$(MAX_HISTORY) \ -DNMM_NEST=$(WRF_NMM_NEST) CFLAGS = $(CFLAGS_LOCAL) -DDM_PARALLEL \ -DMAX_HISTORY=$(MAX_HISTORY) FCFLAGS = $(FCOPTIM) $(FCBASEOPTS) ESMF_LIB_FLAGS = ESMF_IO_LIB = $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a ESMF_IO_LIB_EXT = -L$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a INCLUDE_MODULES = $(MODULE_SRCH_FLAG) \ $(ESMF_MOD_INC) $(ESMF_LIB_FLAGS) \ -I$(WRF_SRC_ROOT_DIR)/main \ -I$(WRF_SRC_ROOT_DIR)/external/io_netcdf \ -I$(WRF_SRC_ROOT_DIR)/external/io_int \ -I$(WRF_SRC_ROOT_DIR)/frame \ -I$(WRF_SRC_ROOT_DIR)/share \ -I$(WRF_SRC_ROOT_DIR)/phys \ -I$(WRF_SRC_ROOT_DIR)/chem -I$(WRF_SRC_ROOT_DIR)/inc \ -I$(NETCDFPATH)/include \ \ REGISTRY = Registry 
LIB_BUNDLED = \ $(WRF_SRC_ROOT_DIR)/external/fftpack/fftpack5/libfftpack.a \ $(WRF_SRC_ROOT_DIR)/external/io_grib1/libio_grib1.a \ $(WRF_SRC_ROOT_DIR)/external/io_grib_share/libio_grib_share.a \ $(WRF_SRC_ROOT_DIR)/external/io_int/libwrfio_int.a \ $(ESMF_IO_LIB) \ $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a \ $(WRF_SRC_ROOT_DIR)/frame/module_internal_header_util.o \ $(WRF_SRC_ROOT_DIR)/frame/pack_utils.o LIB_EXTERNAL = \ $(WRF_SRC_ROOT_DIR)/external/io_netcdf/libwrfio_nf.a -L/scinet/gpc/Libraries/netcdf-4.0.1_nc3_intel/lib -lnetcdf LIB = $(LIB_BUNDLED) $(LIB_EXTERNAL) $(LIB_LOCAL) LDFLAGS = $(OMP) $(FCFLAGS) $(LDFLAGS_LOCAL) ENVCOMPDEFS = WRF_CHEM = 0 CPPFLAGS = $(ARCHFLAGS) $(ENVCOMPDEFS) -I$(LIBINCLUDE) $(TRADFLAG) NETCDFPATH = /scinet/gpc/Libraries/netcdf-4.0.1_nc3_intel PNETCDFPATH = bundled: wrf_ioapi_includes wrfio_grib_share wrfio_grib1 wrfio_int esmf_time fftpack external: wrfio_nf $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a gen_comms_rsllite module_dm_rsllite $(ESMF_TARGET) ========================================================================= Does anyone have any idea how to solve this? Thanks in advance, Kamal

From ekimalab at gmail.com Thu Jun 16 02:37:14 2011 From: ekimalab at gmail.com (Michael Bala) Date: Thu, 16 Jun 2011 16:37:14 +0800 Subject: [Wrf-users] Specifying lat/lon on idealized Sea Breeze Message-ID: Hi! I'm trying to run an idealized case in WRF where I can input different wind, temperature, and moisture values over a specific location in the Philippines. I'm fairly new to WRF and can't seem to work my way around this. The em_seabreeze2d_x idealized case doesn't have terrain/topography input, and I'm not sure whether the real-data cases allow artificial sounding input. Hope you can help me. Thank you so much! Mike -------------- next part -------------- An HTML attachment was scrubbed...
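For idealized cases such as em_seabreeze2d_x, the initial state is read from a plain-text input_sounding file in the run directory rather than produced by real-data preprocessing, so site-specific wind, temperature, and moisture profiles can be supplied by editing that file. A minimal sketch of creating one (the column layout follows the ARW idealized-case convention: a surface line of pressure, potential temperature, and vapor mixing ratio, then one line per level of height, theta, qv, u, v; every number below is an invented placeholder, not a real Philippine sounding):

```shell
# Sketch: write a minimal input_sounding for a WRF-ARW idealized case.
# Line 1: surface pressure (hPa), surface theta (K), surface qv (g/kg).
# Later lines: height AGL (m), theta (K), qv (g/kg), u (m/s), v (m/s).
# All values are illustrative placeholders only.
cat > input_sounding <<'EOF'
 1008.00  300.00  18.00
  250.00  301.00  17.00   2.00   0.00
 1000.00  304.00  14.00   4.00   0.00
 5000.00  320.00   4.00   8.00   0.00
12000.00  350.00   0.10  10.00   0.00
EOF
wc -l input_sounding
```

The case's ideal.exe then reads this file at initialization; as noted above, the em_seabreeze2d_x initialization itself still has no terrain input.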
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110616/a3c34db3/attachment.html

From yh29 at mail.gatech.edu Tue Jun 14 11:19:37 2011 From: yh29 at mail.gatech.edu (Hu, Yongtao) Date: Tue, 14 Jun 2011 13:19:37 -0400 (EDT) Subject: [Wrf-users] Postdoctoral Research opportunities at Georgia Institute of Technology In-Reply-To: <555957506.163335.1308071721999.JavaMail.root@mail5.gatech.edu> Message-ID: <1988987240.163550.1308071977659.JavaMail.root@mail5.gatech.edu> The School of Civil and Environmental Engineering at the Georgia Institute of Technology invites applications for Postdoctoral Research opportunities. Two Postdoctoral Research Fellow positions are open in the School of Civil and Environmental Engineering at the Georgia Institute of Technology, starting immediately. The successful candidates will have the opportunity to work on challenging projects with funding from EPA, NASA, DoD, and other sources. The projects include developing data assimilation techniques for air quality forecasting, integrated source apportionment using chemistry/transport models, and adaptive grid modeling of air quality impacts from various emission sources. The projects will involve working on model development, performance evaluation, emission strength analysis, and application of regional air quality models. Information on projects and the research team can be found at http://lamda.ce.gatech.edu/. The initial appointment is for one year with the possibility of renewal for additional years based on performance and continuation of funding. Salary is contingent on qualifications, with full benefits. The applicant must have a Ph.D. in atmospheric sciences, environmental engineering, or a related field, with experience in regional air quality modeling. Advanced scientific programming skills are essential, and previous experience with models like CMAQ, CAMx, MM5, and WRF is highly desirable.
To apply for these opportunities, please email a cover letter (Subject: Postdoc Application), a brief statement of research interests, a CV including a publications list, and contact information for three persons willing to serve as references, to Dr. M. T. Odman at odman at gatech.edu. Screening of applications will begin upon receipt of a full application and will continue until the positions are filled. The Georgia Institute of Technology is an Equal Opportunity/Affirmative Action employer, and applications from women and under-represented minorities are encouraged. -- Yongtao Hu, Ph.D., Research Scientist II School of Civil and Environmental Engineering Georgia Institute of Technology 311 Ferst Drive, ES&T building Atlanta, GA 30332-0340 tel: 404-385-4558 fax: 404-894-8266 e-mail: yh29 at mail.gatech.edu

From hamed319 at yahoo.com Sat Jun 18 01:34:29 2011 From: hamed319 at yahoo.com (Hamed Sharifi) Date: Sat, 18 Jun 2011 00:34:29 -0700 (PDT) Subject: [Wrf-users] Mexico City namelist.input require Message-ID: <149711.71048.qm@web161207.mail.bf1.yahoo.com> Hi All, Is there anyone here who uses WRF for simulating Mexico City? I need some help with the physics settings in namelist.input. Thanks in advance, Hamed Sharifi, M.Sc. Student, AUT, Tehran/Iran | hamed319 at yahoo.com | +98-9364024805 | hamed_sharifi at aut.ac.ir | -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110618/c09e73bb/attachment.html

From sstolaki at geo.auth.gr Mon Jun 20 04:08:37 2011 From: sstolaki at geo.auth.gr (Stavroula Stolaki) Date: Mon, 20 Jun 2011 13:08:37 +0300 Subject: [Wrf-users] SST data for WRF Message-ID: <20110620130836.12406h9yedxkxtlw@webmail.auth.gr> Hi everyone! I am quite a beginner in using WRF and I would like to ask you one thing about SST data. I am confused about which type of SST data to use.
On the NCEP Real-Time SST archives webpage (http://polar.ncep.noaa.gov/sst/oper/Welcome.html), as proposed in the WRF-ARW online tutorial, there are data that reside in the ftp://polar.ncep.noaa.gov/pub/history/sst/ophi directory. On the other hand, there are also data in ftp://ftp.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.YYYYMMDDHH which are of the type: sst2dvar_grb_0.5.grib2 sst2dvar.t12z.nam_grid sst.t12z.nam_grid rtgssthr_grb_0.5.grib2 and rtgssthr_grb_0.083.grib2 (for the 1/12-degree dataset). I am not sure which data I should use for my WRF runs (independent of their resolution). What about the sst.t12z.nam_grid data? Are they for the 1200 UTC time? In which format are they? Moreover, in case I would like to start my runs at 1200 UTC of a day, how could I use the 0000 UTC SST data, since as far as I know the available SST data concern the 0000 UTC time only. Any help would be appreciated. Thank you for your time! Stavroula Stolaki

From drostkier at yahoo.com Fri Jun 24 06:55:48 2011 From: drostkier at yahoo.com (Dorita Rostkier-Edelstein) Date: Fri, 24 Jun 2011 05:55:48 -0700 (PDT) Subject: [Wrf-users] Update - 48th Oholo Conference - Eilat, Israel - November 6-10, 2011 Message-ID: <1308920148.11817.YahooMailRC@web113103.mail.gq1.yahoo.com> Dear colleague, We kindly invite you to visit the 48th Oholo Conference website, which is now on-line: www.oholoconference.com We look forward to having you amongst the participants of the 48th Oholo Conference. Please do not hesitate to contact us for any further information or assistance you may need. Best wishes, Ariella Ariella Raz Secretary, 48th Oholo Conference Tel: 972-8-9381656 Fax: 972-8-9401404 Email: oholo at iibr.gov.il -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110624/cca6dcd1/attachment.html

From drostkier at yahoo.com Fri Jun 24 07:05:13 2011 From: drostkier at yahoo.com (Dorita Rostkier-Edelstein) Date: Fri, 24 Jun 2011 06:05:13 -0700 (PDT) Subject: [Wrf-users] Update - Emerging remote sensing techniques and associated modeling for air pollution applications - 48th Oholo Conference - Eilat, Israel - November 6-10, 2011 Message-ID: <1308920713.57983.YahooMailRC@web113107.mail.gq1.yahoo.com> Dear colleague, We kindly invite you to visit the 48th Oholo Conference website, which is now on-line: www.oholoconference.com We look forward to having you amongst the participants of the 48th Oholo Conference. Please do not hesitate to contact us for any further information or assistance you may need. Best wishes, Ariella Ariella Raz Secretary, 48th Oholo Conference Tel: 972-8-9381656 Fax: 972-8-9401404 Email: oholo at iibr.gov.il -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110624/fa97929b/attachment.html -------------- next part -------------- _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users

From jinxiu58 at gmail.com Mon Jun 20 20:57:27 2011 From: jinxiu58 at gmail.com (qiancheng jinxiu) Date: Tue, 21 Jun 2011 10:57:27 +0800 Subject: [Wrf-users] how to run the sensitive test about heat and moist flux? Message-ID: Hi, WRF group: I want to run a series of numerical tests with WRFv3.2. Some of them are related to the boundary-layer fluxes (QFX and HFX). How can I modify them (QFX and HFX) and run the WRF test? It seems I can modify module_sf_ruclsm.F and set the values of QFX and HFX there. Am I right? Or should I just modify variables such as isfflx in namelist.input?
In the WRF help document, isfflx=0 means *no flux* from the surface (not for sf_surface_sfclay = 2), but the same file also says: 0 = *constant fluxes* defined by tke_drag_coefficient and tke_heat_flux; 1 = use model-computed u*, and heat and moisture fluxes; 2 = use model-computed u*, and heat flux specified by tke_heat_flux; ifsnow 0 snow-cover effects. Can anyone tell me: when isfflx is 0, is the flux zero or constant? Thank you. :) -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110621/b50d9aa4/attachment.html

From kganbour at yahoo.com Thu Jun 23 05:03:37 2011 From: kganbour at yahoo.com (Khaled Ganbour) Date: Thu, 23 Jun 2011 04:03:37 -0700 (PDT) Subject: [Wrf-users] [wrf-users] Error when I run real.exe Message-ID: <1308827017.76739.YahooMailClassic@web46308.mail.sp1.yahoo.com> Dear all: I would like to ask for your help. I will try to post all my steps. I have run WRF for both ARW & NMM with the test data. But when I changed it to run the NMM case on my own domain with different data, I ran geogrid.exe, ungrib.exe, and metgrid.exe with no errors. But when I run real.exe, I get this error: --------------------------------------------------------------------------------------------- Namelist dfi_control not found in namelist.input. Using registry defaults for v ariables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variable s in fire --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending t ime to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain.
--- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval an d ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 4 REAL_EM V3.2.1 PREPROCESSOR ************************************* Parent domain ids,ide,jds,jde 1 33 1 20 ims,ime,jms,jme -4 38 -4 25 ips,ipe,jps,jpe 1 33 1 20 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 22248296 bytes allocated Time period # 1 to process = 2010-12-14_00:00:00. Time period # 2 to process = 2010-12-14_03:00:00. Time period # 3 to process = 2010-12-14_06:00:00. Time period # 4 to process = 2010-12-14_09:00:00. Time period # 5 to process = 2010-12-14_12:00:00. Time period # 6 to process = 2010-12-14_15:00:00. Time period # 7 to process = 2010-12-14_18:00:00. Time period # 8 to process = 2010-12-14_21:00:00. Time period # 9 to process = 2010-12-15_00:00:00. Time period # 10 to process = 2010-12-15_03:00:00. Total analysis times to input = 10. ----------------------------------------------------------------------------- Domain 1: Current date being processed: 2010-12-14_00:00:00.0000, which is loop # 1 out of 10 configflags%julyr, %julday, %gmt: 2010 348 0.0000000E+00 metgrid input_wrf.F first_date_input = 2010-12-14_00:00:00 metgrid input_wrf.F first_date_nml = 2010-12-14_00:00:00 d01 2010-12-14_00:00:00 Timing for input 0 s. Max map factor in domain 1 = 1.08. Scale the dt in the model accordingly. 
forrtl: severe (66): output statement overflows record, unit -5, file Internal Formatted Write Image PC Routine Line Source real.exe 00000000012A7CBD Unknown Unknown Unknown real.exe 00000000012A67C5 Unknown Unknown Unknown real.exe 00000000012485D9 Unknown Unknown Unknown real.exe 00000000011E779D Unknown Unknown Unknown real.exe 00000000011E6FEA Unknown Unknown Unknown real.exe 0000000001226880 Unknown Unknown Unknown real.exe 0000000000445A92 Unknown Unknown Unknown real.exe 000000000045055D Unknown Unknown Unknown real.exe 0000000000405966 Unknown Unknown Unknown real.exe 00000000004041CC Unknown Unknown Unknown libc.so.6 00000032BDB1C3FB Unknown Unknown Unknown real.exe 00000000004040FA Unknown Unknown Unknown -------------------------------------------------------------------------------------------------- I use Input data: GRIBFILE.AAA -> ../DATA/fh.000_tl.press_ar.octanti GRIBFILE.AAB -> ../DATA/fh.003_tl.press_ar.octanti GRIBFILE.AAC -> ../DATA/fh.006_tl.press_ar.octanti GRIBFILE.AAD -> ../DATA/fh.009_tl.press_ar.octanti GRIBFILE.AAE -> ../DATA/fh.012_tl.press_ar.octanti GRIBFILE.AAF -> ../DATA/fh.015_tl.press_ar.octanti GRIBFILE.AAG -> ../DATA/fh.018_tl.press_ar.octanti GRIBFILE.AAH -> ../DATA/fh.021_tl.press_ar.octanti GRIBFILE.AAI -> ../DATA/fh.024_tl.press_ar.octanti GRIBFILE.AAJ -> ../DATA/fh.027_tl.press_ar.octanti GRIBFILE.AAK -> ../DATA/fh.030_tl.press_ar.octanti ------------------------------------------------------------- and the namelist of WPS: &share wrf_core = 'ARW', max_dom = 1, start_date = '2010-12-14_00:00:00', end_date = '2010-12-15_03:00:00', interval_seconds = 10800 io_form_geogrid = 2, / &geogrid parent_id = 1, parent_grid_ratio = 1, i_parent_start = 1, j_parent_start = 1, e_we = 33, e_sn = 20, geog_data_res = '10m', dx = 30000, dy = 30000, map_proj = 'lambert', ref_lat = 20, ref_lon = 20, truelat1 = 30.0, truelat2 = 60.0, stand_lon = 2.0, geog_data_path = '/home/khaled/WRFI/WPS/geog' &ungrib out_format = 'WPS', prefix = 'FILE', / 
&metgrid fg_name = 'FILE' io_form_metgrid = 2, / --------------------------------------------- and the namelist.input is: &time_control run_days = 0, run_hours = 27, run_minutes = 0, run_seconds = 0, start_year = 2010, 2000, 2000, start_month = 12, 01, 01, start_day = 14, 24, 24, start_hour = 00, 12, 12, start_minute = 00, 00, 00, start_second = 00, 00, 00, end_year = 2010, 2000, 2000, end_month = 12, 01, 01, end_day = 15, 25, 25, end_hour = 03, 12, 12, end_minute = 00, 00, 00, end_second = 00, 00, 00, interval_seconds = 10800 input_from_file = .true.,.true.,.true., history_interval = 180, 60, 60, frames_per_outfile = 1000, 1000, 1000, restart = .false., restart_interval = 5000, io_form_history = 2 io_form_restart = 2 io_form_input = 2 io_form_boundary = 2 debug_level = 0 / &domains time_step = 180, time_step_fract_num = 0, time_step_fract_den = 1, max_dom = 1, e_we = 33, 112, 94, e_sn = 20, 97, 91, e_vert = 27, 28, 28, p_top_requested = 7000, num_metgrid_levels = 13, num_metgrid_soil_levels = 0 , dx = 30000, 10000, 3333.33, dy = 30000, 10000, 3333.33, grid_id = 1, 2, 3, parent_id = 0, 1, 2, i_parent_start = 1, 31, 30, j_parent_start = 1, 17, 30, parent_grid_ratio = 1, 3, 3, parent_time_step_ratio = 1, 3, 3, feedback = 1, smooth_option = 0 / &physics mp_physics = 3, 3, 3, ra_lw_physics = 1, 1, 1, ra_sw_physics = 1, 1, 1, radt = 30, 30, 30, sf_sfclay_physics = 1, 1, 1, sf_surface_physics = 2, 2, 2, bl_pbl_physics = 1, 1, 1, bldt = 0, 0, 0, cu_physics = 1, 1, 0, cudt = 5, 5, 5, isfflx = 1, ifsnow = 0, icloud = 1, surface_input_source = 1, num_soil_layers = 4, sf_urban_physics = 0, 0, 0, maxiens = 1, maxens = 3, maxens2 = 3, maxens3 = 16, ensdim = 144, / &fdda / &dynamics w_damping = 0, diff_opt = 1, km_opt = 4, diff_6th_opt = 0, 0, 0, diff_6th_factor = 0.12, 0.12, 0.12, base_temp = 290. 
damp_opt = 0, zdamp = 5000., 5000., 5000., dampcoef = 0.2, 0.2, 0.2 khdif = 0, 0, 0, kvdif = 0, 0, 0, non_hydrostatic = .true., .true., .true., moist_adv_opt = 1, 1, 1, scalar_adv_opt = 1, 1, 1, / &bdy_control spec_bdy_width = 5, spec_zone = 1, relax_zone = 4, specified = .true., .false.,.false., nested = .false., .true., .true., / &grib2 / &namelist_quilt nio_tasks_per_group = 0, nio_groups = 1, / -------------------------------------------------- I am looking forward to your reply. With best regards, Khaled -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110623/1e2da4a3/attachment-0001.html

From yagnesh at lowtem.hokudai.ac.jp Mon Jun 20 16:15:08 2011 From: yagnesh at lowtem.hokudai.ac.jp (yagnesh) Date: Tue, 21 Jun 2011 07:15:08 +0900 Subject: [Wrf-users] changing land points to sea.? Message-ID: <4DFFC66C.5050600@lowtem.hokudai.ac.jp> Hello wrf-users, I am trying to change land points to sea points. If I modify LANDMASK, it seems the model resets it back in real.exe. Could someone let me know what other variables I should look at? Thanks & Regards, yyr

From annawaxegard at hotmail.com Mon Jun 27 01:08:41 2011 From: annawaxegard at hotmail.com (anna waxegard) Date: Mon, 27 Jun 2011 07:08:41 +0000 Subject: [Wrf-users] WRF SST Message-ID: Hello, I'm working with a 3-domain nest in polar WRF 3.2. When I set sst_update=1 I get a segmentation fault after 30 minutes when running wrf.exe. I run the model with a 120 s time step. I tried to run the model with only one domain, but that doesn't seem to do the trick either. real.exe successfully creates wrflowinp_d01. I configured the model with Intel and dmpar. Any suggestions?
This is my namelist:
&time_control
 run_days = 1,
 run_hours = 0,
 run_minutes = 0,
 run_seconds = 0,
 start_year = 2007, 2007, 2007,
 start_month = 01, 01, 01,
 start_day = 01, 01, 01,
 start_hour = 00, 00, 00,
 start_minute = 00, 00, 00,
 start_second = 00, 00, 00,
 end_year = 2007, 2007, 2007,
 end_month = 01, 01, 01,
 end_day = 03, 02, 02,
 end_hour = 00, 00, 00,
 end_minute = 00, 00, 00,
 end_second = 00, 00, 00,
 interval_seconds = 21600
 input_from_file = .true.,.true.,.true.,
 history_interval = 180, 180, 180,
 frames_per_outfile = 240, 240, 240,
 restart = .false.,
 restart_interval = 43200,
 io_form_history = 2
 io_form_restart = 2
 io_form_input = 2
 io_form_boundary = 2
 debug_level = 0
 io_form_auxinput4 = 2
 auxinput4_inname = "wrflowinp_d"
 auxinput4_interval = 360, 360, 360,
/
&domains
 time_step = 120,
 time_step_fract_num = 0,
 time_step_fract_den = 1,
 max_dom = 1,
 e_we = 39, 64, 115,
 e_sn = 44, 76, 106,
 e_vert = 28, 28, 28,
 p_top_requested = 5000,
 num_metgrid_levels = 15,
 num_metgrid_soil_levels = 4,
 dx = 24000, 8000, 2666.67,
 dy = 24000, 8000, 2666.67,
 grid_id = 1, 2, 3,
 parent_id = 1, 1, 2,
 i_parent_start = 1, 10, 10,
 j_parent_start = 1, 11, 30,
 parent_grid_ratio = 1, 3, 3,
 parent_time_step_ratio = 1, 3, 3,
 feedback = 0,
 smooth_option = 0,
/
&physics
 mp_physics = 10, 10, 10,
 ra_lw_physics = 1, 1, 1,
 ra_sw_physics = 2, 2, 2,
 radt = 30, 30, 30,
 sf_sfclay_physics = 2, 2, 2,
 sf_surface_physics = 2, 2, 2,
 bl_pbl_physics = 2, 2, 2,
 bldt = 0, 0, 0,
 cu_physics = 3, 3, 3,
 cudt = 5, 5, 5,
 isfflx = 1,
 ifsnow = 0,
 icloud = 1,
 surface_input_source = 1,
 num_soil_layers = 4,
 sf_urban_physics = 0, 0, 0,
 maxiens = 1,
 maxens = 3,
 maxens2 = 3,
 maxens3 = 16,
 ensdim = 144,
 sst_update = 1,
 fractional_seaice = 0,

Grateful for an answer! Please tell me if there is anything else I can submit to help. /Anna -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110627/ca62db68/attachment.html

From kganbour at yahoo.com Sun Jun 26 01:48:08 2011
From: kganbour at yahoo.com (Khaled Ganbour)
Date: Sun, 26 Jun 2011 00:48:08 -0700 (PDT)
Subject: [Wrf-users] Error when I run real.exe for ARW case
Message-ID: <1309074488.35628.YahooMailClassic@web46311.mail.sp1.yahoo.com>

Dear all:

Sorry, in this case I compiled WRF with ARW. I am confused because I also have something wrong with the NMM case under another Linux user; I will try to post all my steps. I have run WRF for both ARW and NMM with the test data. But when I changed it to run the ARW case on my own domain with different input data, geogrid.exe, ungrib.exe, and metgrid.exe all ran with no errors. But when I run real.exe, I get this error:
---------------------------------------------------------------------------------------------
Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control
Namelist tc not found in namelist.input. Using registry defaults for variables in tc
Namelist scm not found in namelist.input. Using registry defaults for variables in scm
Namelist fire not found in namelist.input. Using registry defaults for variables in fire
--- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval = 0 for all domains
--- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain.
--- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain.
--- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain.
--- NOTE: num_soil_layers has been set to 4
REAL_EM V3.2.1 PREPROCESSOR
*************************************
Parent domain
ids,ide,jds,jde 1 33 1 20
ims,ime,jms,jme -4 38 -4 25
ips,ipe,jps,jpe 1 33 1 20
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1, 22248296 bytes allocated
Time period # 1 to process = 2010-12-14_00:00:00.
Time period # 2 to process = 2010-12-14_03:00:00.
Time period # 3 to process = 2010-12-14_06:00:00.
Time period # 4 to process = 2010-12-14_09:00:00.
Time period # 5 to process = 2010-12-14_12:00:00.
Time period # 6 to process = 2010-12-14_15:00:00.
Time period # 7 to process = 2010-12-14_18:00:00.
Time period # 8 to process = 2010-12-14_21:00:00.
Time period # 9 to process = 2010-12-15_00:00:00.
Time period # 10 to process = 2010-12-15_03:00:00.
Total analysis times to input = 10.
-----------------------------------------------------------------------------
Domain 1: Current date being processed: 2010-12-14_00:00:00.0000, which is loop # 1 out of 10
configflags%julyr, %julday, %gmt: 2010 348 0.0000000E+00
metgrid input_wrf.F first_date_input = 2010-12-14_00:00:00
metgrid input_wrf.F first_date_nml = 2010-12-14_00:00:00
d01 2010-12-14_00:00:00 Timing for input 0 s.
Max map factor in domain 1 = 1.08. Scale the dt in the model accordingly.
forrtl: severe (66): output statement overflows record, unit -5, file Internal Formatted Write
Image       PC                Routine   Line     Source
real.exe    00000000012A7CBD  Unknown   Unknown  Unknown
real.exe    00000000012A67C5  Unknown   Unknown  Unknown
real.exe    00000000012485D9  Unknown   Unknown  Unknown
real.exe    00000000011E779D  Unknown   Unknown  Unknown
real.exe    00000000011E6FEA  Unknown   Unknown  Unknown
real.exe    0000000001226880  Unknown   Unknown  Unknown
real.exe    0000000000445A92  Unknown   Unknown  Unknown
real.exe    000000000045055D  Unknown   Unknown  Unknown
real.exe    0000000000405966  Unknown   Unknown  Unknown
real.exe    00000000004041CC  Unknown   Unknown  Unknown
libc.so.6   00000032BDB1C3FB  Unknown   Unknown  Unknown
real.exe    00000000004040FA  Unknown   Unknown  Unknown
--------------------------------------------------------------------------------------------------
I use input data:
GRIBFILE.AAA -> ../DATA/fh.000_tl.press_ar.octanti
GRIBFILE.AAB -> ../DATA/fh.003_tl.press_ar.octanti
GRIBFILE.AAC -> ../DATA/fh.006_tl.press_ar.octanti
GRIBFILE.AAD -> ../DATA/fh.009_tl.press_ar.octanti
GRIBFILE.AAE -> ../DATA/fh.012_tl.press_ar.octanti
GRIBFILE.AAF -> ../DATA/fh.015_tl.press_ar.octanti
GRIBFILE.AAG -> ../DATA/fh.018_tl.press_ar.octanti
GRIBFILE.AAH -> ../DATA/fh.021_tl.press_ar.octanti
GRIBFILE.AAI -> ../DATA/fh.024_tl.press_ar.octanti
GRIBFILE.AAJ -> ../DATA/fh.027_tl.press_ar.octanti
GRIBFILE.AAK -> ../DATA/fh.030_tl.press_ar.octanti
-------------------------------------------------------------
and the namelist of WPS:
&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2010-12-14_00:00:00',
 end_date = '2010-12-15_03:00:00',
 interval_seconds = 10800
 io_form_geogrid = 2,
/
&geogrid
 parent_id = 1,
 parent_grid_ratio = 1,
 i_parent_start = 1,
 j_parent_start = 1,
 e_we = 33,
 e_sn = 20,
 geog_data_res = '10m',
 dx = 30000,
 dy = 30000,
 map_proj = 'lambert',
 ref_lat = 20,
 ref_lon = 20,
 truelat1 = 30.0,
 truelat2 = 60.0,
 stand_lon = 2.0,
 geog_data_path = '/home/khaled/WRFI/WPS/geog'
&ungrib
 out_format = 'WPS',
 prefix = 'FILE',
/
&metgrid
 fg_name = 'FILE'
 io_form_metgrid = 2,
/
---------------------------------------------
and the namelist.input is:
&time_control
 run_days = 0,
 run_hours = 27,
 run_minutes = 0,
 run_seconds = 0,
 start_year = 2010, 2000, 2000,
 start_month = 12, 01, 01,
 start_day = 14, 24, 24,
 start_hour = 00, 12, 12,
 start_minute = 00, 00, 00,
 start_second = 00, 00, 00,
 end_year = 2010, 2000, 2000,
 end_month = 12, 01, 01,
 end_day = 15, 25, 25,
 end_hour = 03, 12, 12,
 end_minute = 00, 00, 00,
 end_second = 00, 00, 00,
 interval_seconds = 10800
 input_from_file = .true.,.true.,.true.,
 history_interval = 180, 60, 60,
 frames_per_outfile = 1000, 1000, 1000,
 restart = .false.,
 restart_interval = 5000,
 io_form_history = 2
 io_form_restart = 2
 io_form_input = 2
 io_form_boundary = 2
 debug_level = 0
/
&domains
 time_step = 180,
 time_step_fract_num = 0,
 time_step_fract_den = 1,
 max_dom = 1,
 e_we = 33, 112, 94,
 e_sn = 20, 97, 91,
 e_vert = 27, 28, 28,
 p_top_requested = 7000,
 num_metgrid_levels = 13,
 num_metgrid_soil_levels = 0,
 dx = 30000, 10000, 3333.33,
 dy = 30000, 10000, 3333.33,
 grid_id = 1, 2, 3,
 parent_id = 0, 1, 2,
 i_parent_start = 1, 31, 30,
 j_parent_start = 1, 17, 30,
 parent_grid_ratio = 1, 3, 3,
 parent_time_step_ratio = 1, 3, 3,
 feedback = 1,
 smooth_option = 0
/
&physics
 mp_physics = 3, 3, 3,
 ra_lw_physics = 1, 1, 1,
 ra_sw_physics = 1, 1, 1,
 radt = 30, 30, 30,
 sf_sfclay_physics = 1, 1, 1,
 sf_surface_physics = 2, 2, 2,
 bl_pbl_physics = 1, 1, 1,
 bldt = 0, 0, 0,
 cu_physics = 1, 1, 0,
 cudt = 5, 5, 5,
 isfflx = 1,
 ifsnow = 0,
 icloud = 1,
 surface_input_source = 1,
 num_soil_layers = 4,
 sf_urban_physics = 0, 0, 0,
 maxiens = 1,
 maxens = 3,
 maxens2 = 3,
 maxens3 = 16,
 ensdim = 144,
/
&fdda
/
&dynamics
 w_damping = 0,
 diff_opt = 1,
 km_opt = 4,
 diff_6th_opt = 0, 0, 0,
 diff_6th_factor = 0.12, 0.12, 0.12,
 base_temp = 290.
 damp_opt = 0,
 zdamp = 5000., 5000., 5000.,
 dampcoef = 0.2, 0.2, 0.2
 khdif = 0, 0, 0,
 kvdif = 0, 0, 0,
 non_hydrostatic = .true., .true., .true.,
 moist_adv_opt = 1, 1, 1,
 scalar_adv_opt = 1, 1, 1,
/
&bdy_control
 spec_bdy_width = 5,
 spec_zone = 1,
 relax_zone = 4,
 specified = .true., .false.,.false.,
 nested = .false., .true., .true.,
/
&grib2
/
&namelist_quilt
 nio_tasks_per_group = 0,
 nio_groups = 1,
/
--------------------------------------------------
I am looking forward to your reply. With best regards,
Khaled
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110626/0e4cef92/attachment-0001.html

From kganbour at yahoo.com Sun Jun 26 03:33:31 2011
From: kganbour at yahoo.com (Khaled Ganbour)
Date: Sun, 26 Jun 2011 02:33:31 -0700 (PDT)
Subject: [Wrf-users] [wrf_users]
Message-ID: <1309080811.27479.YahooMailClassic@web46309.mail.sp1.yahoo.com>

Dear all:

I am sorry for the many emails, but I have tried a lot to run WRF, and whenever I hit an error I get stuck. I don't know which Vtable to use for my data "fh.000_tl.press_ar.octanti". Today I tried Vtable.ARW with the ARW case. I executed geogrid.exe, ungrib.exe, and metgrid.exe without errors, but when I run real.exe I get this error:

-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE:  LINE: 701
input_wrf.F: SIZE MISMATCH: namelist ide,jde,num_metgrid_levels= 33 20 13; input data ide,jde,num_metgrid_levels= 33 20 1
-------------------------------------------

Also, not all the parameters in the namelists are clear to me, especially those that determine the domain. Please help me set my domain lat/lon and all the other numbers. I attached an image of the domain I want, as well as the output of the real.exe run and the namelist.wps and namelist.input that I used. If there is an error in the namelists, please correct them according to my domain image.

I am looking forward to your reply.
Best regards,
Khaled
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110626/c797d740/attachment-0001.html
-------------- next part --------------
A non-text attachment was scrubbed...
Name: domain.gif
Type: image/gif
Size: 471204 bytes
Desc: not available
Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110626/c797d740/attachment-0001.gif
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: error.txt
Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110626/c797d740/attachment-0001.txt
-------------- next part --------------
A non-text attachment was scrubbed...
Name: namelist.input
Type: application/octet-stream
Size: 4586 bytes
Desc: not available
Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110626/c797d740/attachment-0002.obj
-------------- next part --------------
A non-text attachment was scrubbed...
Name: namelist.wps Type: application/octet-stream Size: 663 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110626/c797d740/attachment-0003.obj From agnes.mika at bmtargoss.com Tue Jun 28 01:38:41 2011 From: agnes.mika at bmtargoss.com (Agnes Mika) Date: Tue, 28 Jun 2011 09:38:41 +0200 Subject: [Wrf-users] [wrf_users] In-Reply-To: <1309080811.27479.YahooMailClassic@web46309.mail.sp1.yahoo.com> References: <1309080811.27479.YahooMailClassic@web46309.mail.sp1.yahoo.com> Message-ID: <20110628073841.GA2476@aggedor.argoss.nl> Hallo Khaled, I suggest you read carefully the WRF users' manual, it should answer your questions about how to set a domain up. As for the problem you encounter when running real: the error message says that the domain parameters (number of grid points in the x, y and z directions) are not the same in your namelist.input file and your input data (the output from metgrid). It seems that in your metgrid data there is only one vertical level while you say in your namelist.input that you have 13. Check the number of vertical levels in your global input (NCEP?) data. If that's all right then run the pre-processing steps again, checking after each step if the results are ok. I hope this helps. Regards, Agnes Khaled Ganbour wrote: > Dear all:I am sorry for many emails,but I tried a lot ?to run WRF and when I have some error I will stop.I don't know which Vtable for my data "fh.000_tl.press_ar.octanti".Today,I tried to use Vtable.ARW with ARW caseI executed geogrid.exe,ungrib.exe,and metgrid.exe without errors.But when I run real.exe,I have this error:?-------------- FATAL CALLED ---------------?FATAL CALLED FROM FILE: ? ?LINE: ? ? 701? input_wrf.F: SIZE MISMATCH: ?namelist ide,jde,num_metgrid_levels= ? ? ? ? ?33?? ? ? ? ? 20 ? ? ? ? ?13; input data ide,jde,num_metgrid_levels= ? ? ? ? ?33 ? ?? ? ? ?20 ? ? ? ? ? 1 ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ??? ? ? ? ? ? ? ? ? 
???------------------------------------------- > and also I it isn't clear for me all parameters in namelists especially to determine the domain,please help me to put my domain lat&long,and all other numbers.I attached my domain image which I want,and also I attached the result or real.exe execution,and the namelist.wps&namelist.input which I used.If there is error in namelists,please correct the according my domain image. > > I am looking forward for your replay > Best regards > Khaled > Namelist dfi_control not found in namelist.input. Using registry defaults for v > ariables in dfi_control > Namelist tc not found in namelist.input. Using registry defaults for variables > in tc > Namelist scm not found in namelist.input. Using registry defaults for variables > in scm > Namelist fire not found in namelist.input. Using registry defaults for variable > s in fire > --- NOTE: sst_update is 0, setting io_form_auxinput4 = 0 and auxinput4_interval > = 0 for all domains > --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending t > ime to 0 for that domain. > --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting > sgfdda interval and ending time to 0 for that domain. > --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval an > d ending time to 0 for that domain. > --- NOTE: num_soil_layers has been set to 4 > REAL_EM V3.2.1 PREPROCESSOR > ************************************* > Parent domain > ids,ide,jds,jde 1 33 1 20 > ims,ime,jms,jme -4 38 -4 25 > ips,ipe,jps,jpe 1 33 1 20 > ************************************* > DYNAMICS OPTION: Eulerian Mass Coordinate > alloc_space_field: domain 1, 22248296 bytes allocated > Time period # 1 to process = 2010-12-14_00:00:00. > Time period # 2 to process = 2010-12-14_03:00:00. > Time period # 3 to process = 2010-12-14_06:00:00. > Time period # 4 to process = 2010-12-14_09:00:00. > Time period # 5 to process = 2010-12-14_12:00:00. 
> Time period # 6 to process = 2010-12-14_15:00:00.
> Time period # 7 to process = 2010-12-14_18:00:00.
> Time period # 8 to process = 2010-12-14_21:00:00.
> Time period # 9 to process = 2010-12-15_00:00:00.
> Time period # 10 to process = 2010-12-15_03:00:00.
> Total analysis times to input = 10.
>
> -----------------------------------------------------------------------------
>
> Domain 1: Current date being processed: 2010-12-14_00:00:00.0000, which is loop # 1 out of 10
> configflags%julyr, %julday, %gmt: 2010 348 0.0000000E+00
> metgrid input_wrf.F first_date_input = 2010-12-14_00:00:00
> metgrid input_wrf.F first_date_nml = 2010-12-14_00:00:00
> -------------- FATAL CALLED ---------------
> FATAL CALLED FROM FILE:  LINE: 701
> input_wrf.F: SIZE MISMATCH: namelist ide,jde,num_metgrid_levels= 33 20 13; input data ide,jde,num_metgrid_levels= 33 20 1
>
> -------------------------------------------
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users

--
Dr. Ágnes Mika
Advisor, Meteorology and Air Quality

Tel: +31 (0)527-242299
Fax: +31 (0)527-242016
Web: www.bmtargoss.com

BMT ARGOSS
P.O. Box 61, 8325 ZH Vollenhove
Voorsterweg 28, 8316 PT Marknesse
The Netherlands

Confidentiality Notice & Disclaimer
The contents of this e-mail and any attachments are intended for the use of the mail addressee(s) shown. If you are not that person, you are not allowed to take any action based upon it or to copy it, forward, distribute or disclose its contents and you should delete it from your system. BMT ARGOSS does not accept liability for any errors or omissions in the context of this e-mail or its attachments which arise as a result of internet transmission, nor accept liability for statements which are those of the author and clearly not made on behalf of BMT ARGOSS. Please consider the environmental impacts of printing this e-mail, and only do so if really necessary.
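Agnes's size-mismatch diagnosis lends itself to a quick pre-flight check: read `num_metgrid_levels` from `namelist.input` and compare it with the dimension recorded in the met_em file headers before ever launching real.exe. A minimal sketch in Python; the two text fragments below are illustrative stand-ins (in practice the namelist text would come from `namelist.input` and the header text from `ncdump -h met_em.d01.<date>.nc`):

```python
import re

def namelist_int(text, key):
    """Return the first integer assigned to `key` in namelist-style text."""
    m = re.search(rf"{key}\s*=\s*(\d+)", text)
    return int(m.group(1)) if m else None

# Illustrative fragments standing in for the real files:
namelist_text = "&domains\n num_metgrid_levels = 13,\n/"
header_text   = "dimensions:\n\tnum_metgrid_levels = 1 ;"

nml = namelist_int(namelist_text, "num_metgrid_levels")
met = namelist_int(header_text, "num_metgrid_levels")
if nml != met:
    print(f"SIZE MISMATCH ahead: namelist says {nml}, met_em files say {met}")
```

When the two numbers disagree, real.exe stops with exactly the SIZE MISMATCH error quoted in this thread, so catching it at this stage saves a failed run.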
From aruny at iitk.ac.in Fri Jul 1 07:55:33 2011
From: aruny at iitk.ac.in (Arun)
Date: Fri, 01 Jul 2011 19:25:33 +0530
Subject: [Wrf-users] MMINLU error on input
Message-ID: <4E0DD1D5.9070404@iitk.ac.in>

Hi,

I'm trying to do a WRF/Chem simulation for Jan '11. After running real.exe, when I try to run wrf.exe it gives me the error "MMINLU error on input". I'm attaching my compiler's log file as well as the namelist.input file. Please tell me where I am going wrong. I have tried googling but no help so far.

Regards,
Arun
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: namelist.input
Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110701/dbd66b80/attachment.pl
-------------- next part --------------
A non-text attachment was scrubbed...
Name: rsl.error.0000.log
Type: text/x-log
Size: 3208 bytes
Desc: not available
Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110701/dbd66b80/attachment.bin

From veiga.uea at gmail.com Tue Jun 28 17:37:54 2011
From: veiga.uea at gmail.com (Jose Augusto Paixão Veiga)
Date: Tue, 28 Jun 2011 19:37:54 -0400
Subject: [Wrf-users] Compiling WPS (version 3.3 and 3.2)
Message-ID:

Dear all,

I am trying to compile WPS on a cluster of Linux machines using option 8 (PC Linux x86_64 (IA64 and Opteron), PGI compiler 5.2 or higher, DM parallel, NO GRIB2) in the configuration procedure. However, after an apparently successful compilation, only the ungrib.exe file was created in the WPS directory. My question: why were the other executables (geogrid.exe and metgrid.exe) not created? What must I do in this case?

Thanks in advance,
José Augusto P. Veiga,
======================================
Universidade do Estado do Amazonas
Departamento de Meteorologia
Escola Superior de Tecnologia (EST)
-----------------------------------------------------------------------------
Av.
Darcy Vargas, 1200, Manaus-AM Brasil Work phone: (92) 3878 4333, Ramal 4333 Cell phone : (92) 8196 7122 Skype: veiga_j.a.p. ----------------------------------------------------------------------------- CV: http://lattes.cnpq.br/4027612512091565 URL:http://scientificmet.wordpress.com/ ----------------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110628/2c9222b8/attachment.html From jasonpadovani at gmail.com Wed Jul 6 17:56:36 2011 From: jasonpadovani at gmail.com (Jason Padovani Ginies) Date: Thu, 7 Jul 2011 01:56:36 +0200 Subject: [Wrf-users] Fresh Installation + LES Message-ID: Hi all, I intend to do a fresh installation of WRF on the university Cluster. Was wondering if you could help me out with these questions: - Best version to use which runs smoothest with LES and possibly CLWRF - Any pre-requisite software needed before installing WRF - whether LES capability has to be installed at compilation time? Kind regards, Jason Padovani Ginies Department of Physics University of Malta -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110707/0f31b4ce/attachment.html From camilo.hernandez at gmail.com Wed Jul 6 23:50:20 2011 From: camilo.hernandez at gmail.com (Juan Camilo Hernandez D) Date: Thu, 7 Jul 2011 00:50:20 -0500 Subject: [Wrf-users] MET Obs ASCII Help. Message-ID: Hello everyone I built a database with information obtained from surface weather stations (Six months of hourly data); with the following variables, temperature (2m), relative humidity(2m), wind speed (10m) and wind direction (10m) percent. I want to use the MET tool to verify the results of my simulations, but do not quite understand how to prepare ASCII files for the tool ASCII2NC. 
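For orientation while preparing such files: ascii2nc ingests one observation per line. The sketch below builds a line in the 11-column point-observation layout that, to my understanding, MET's ascii2nc has used around version 3 (message type, station ID, valid time, lat, lon, elevation, GRIB code, level, height, QC string, value); the station, time, and values here are hypothetical, and the exact column list must be verified against the MET User's Guide for the installed version:

```python
# Hedged sketch of an 11-column ascii2nc point-observation line.
# Column order is an assumption based on MET v3-era documentation;
# verify against the MET User's Guide before use.

def obs_line(msg_type, sid, valid, lat, lon, elev, grib, level, height, qc, value):
    """Format one observation as a whitespace-separated record."""
    return (f"{msg_type} {sid} {valid} {lat:.4f} {lon:.4f} {elev:.1f} "
            f"{grib} {level} {height} {qc} {value:.2f}")

# 2 m temperature in K (GRIB1 code 11 = TMP) from a hypothetical station:
line = obs_line("ADPSFC", "ST001", "20110101_120000",
                6.25, -75.56, 1495.0, 11, 2, 2, "NA", 295.15)
print(line)
```

Writing six months of hourly records is then just a loop over the database rows, one `obs_line` call per observation.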
I tried to find an example that applies to my case but have not been successful (only the example included in the source code). Can anyone provide me with an example for my case? Thank you very much.

--
JUAN CAMILO HERNÁNDEZ DÍAZ
http://www.jkoyo.net
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110707/ca670655/attachment.html

From ngphquyan at gmail.com Thu Jul 7 03:13:03 2011
From: ngphquyan at gmail.com (Nguyen Phuoc Quy An)
Date: Thu, 7 Jul 2011 04:13:03 -0500
Subject: [Wrf-users] error - run MCIP with WRF
Message-ID:

Dear Sir,

I run MCIP with WRF and get this error. Can you explain this error to me? Thank you very much!

**********************************************************************
*** SUBROUTINE: SETUP
*** ERROR RETRIEVING VARIABLE FROM WRF FILE
*** VARIABLE = DYN_OPT
*** RCODE = -43
**********************************************************************
*** ERROR ABORT in subroutine SETUP
ABNORMAL TERMINATION IN SETUP
Date and time 0:00:00 Aug. 1, 2007 (2007213:000000)

Anna
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110707/91c887f0/attachment.html

From ngphquyan at gmail.com Thu Jul 7 03:17:57 2011
From: ngphquyan at gmail.com (Nguyen Phuoc Quy An)
Date: Thu, 7 Jul 2011 04:17:57 -0500
Subject: [Wrf-users] Error - run MCIP with WRF
Message-ID:

Dear Sir,

I run MCIP with WRF and get this error. Can you explain this error to me? Thank you very much!

**********************************************************************
*** SUBROUTINE: SETUP
*** ERROR RETRIEVING VARIABLE FROM WRF FILE
*** VARIABLE = DYN_OPT
*** RCODE = -43
**********************************************************************
*** ERROR ABORT in subroutine SETUP
ABNORMAL TERMINATION IN SETUP
Date and time 0:00:00 Aug.
1, 2007 (2007213:000000) Anna -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110707/a7a29550/attachment.html From sp4rk136 at hotmail.com Thu Jul 7 03:43:32 2011 From: sp4rk136 at hotmail.com (Jason Padovani Ginies) Date: Thu, 7 Jul 2011 11:43:32 +0200 Subject: [Wrf-users] Installation/LES/CLWRF Message-ID: Hi all, I intend to do a fresh installation of WRF on the university Cluster. Was wondering if you could help me out with these questions: - Best version to use which runs smoothest with LES and possibly CLWRF - Any pre-requisite software needed before installing WRF Kind regards, Jason Padovani Ginies Department of Physics University of Malta -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110707/5d13ff2e/attachment.html From andrew.robbie at gmail.com Sun Jul 10 09:02:48 2011 From: andrew.robbie at gmail.com (Andrew Robbie (GMail)) Date: Mon, 11 Jul 2011 01:02:48 +1000 Subject: [Wrf-users] Installation/LES/CLWRF In-Reply-To: References: Message-ID: Hi Jason, The WRF UCAR website describes the prerequisites quite well, but briefly: - gcc and/or other compilers (e.g. Intel and PGI compilers) - HDF5 - NetCDF 4 - Jasper - Proj4 - szip - udunits2 - zlib You will probably also need: - NCL - Vapor - ncview - nco When you say "runs smoothest with LES" I presume you mean the LES code in WRF? In which case I would say the latest version of WRF, v3.3. Make sure to apply all the patches or you will likely experience frustration. Re clWRF, no experience with it, but it says it is a patch against 3.1.1 so you would have to use that version unless they have validated their changes against a newer version. Regards, Andrew On Thu, Jul 7, 2011 at 7:43 PM, Jason Padovani Ginies wrote: > Hi all, > > I intend to do a fresh installation of WRF on the university Cluster. 
Was
> wondering if you could help me out with these questions:
>
> Best version to use which runs smoothest with LES and possibly CLWRF
> Any pre-requisite software needed before installing WRF
>
> Kind regards,
>
> Jason Padovani Ginies
> Department of Physics
> University of Malta
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users

From ar.ragi at gmail.com Tue Jul 12 05:01:44 2011
From: ar.ragi at gmail.com (Ragi A.R)
Date: Tue, 12 Jul 2011 16:31:44 +0530
Subject: [Wrf-users] Ideal.exe error
Message-ID:

Dear WRF Users,

I am running the WRF ideal case (the given data and input sounding). I generated ideal.exe and ran it successfully. While running wrf.exe I came across the following error. Can anyone suggest how to proceed further?

taskid: 0 hostname: ajaymeru.cas.iitd.ernet.in
Quilting with 1 groups of 0 I/O tasks.
Namelist fdda not found in namelist.input. Using registry defaults for variables in fdda
Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control
Namelist tc not found in namelist.input. Using registry defaults for variables in tc
Namelist grib2 not found in namelist.input. Using registry defaults for variables in grib2
Namelist fire not found in namelist.input. Using registry defaults for variables in fire
Ntasks in X 1, ntasks in Y 1
WRF V3.1.1 MODEL
*************************************
Parent domain
ids,ide,jds,jde 1 3 1 3
ims,ime,jms,jme -4 8 -4 8
ips,ipe,jps,jpe 1 3 1 3
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1, 6796220 bytes allocated
med_initialdata_input: calling input_model_input
INPUT LandUse = "USGS"
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE:  LINE: 7409
module_ra_rrtm: error opening RRTM_DATA on unit 10
-------------------------------------------
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0

--
*************************************************************************
Regards,
A.R.Ragi
Research Scholar
CAS, IIT Delhi
*************************************************************************
"I want to know how God created this world. I am not interested in this or that phenomenon. I want to know his thought; the rest are details." ---Albert Einstein
*************************************************************************
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110712/be1227b1/attachment.html

From yingli at mail.usf.edu Wed Jul 13 08:02:15 2011
From: yingli at mail.usf.edu (yingli zhu)
Date: Wed, 13 Jul 2011 10:02:15 -0400
Subject: [Wrf-users] wrf output variable OLR
Message-ID:

Hi all,

Is the wrfout variable for TOA outgoing longwave radiation equivalent to the satellite product (GOES-8) broadband longwave flux?
Have a good day,
Yingli

From selaya062 at yahoo.com Fri Jul 15 04:25:36 2011
From: selaya062 at yahoo.com (Arie maryadi)
Date: Fri, 15 Jul 2011 18:25:36 +0800 (SGT)
Subject: [Wrf-users] Please help
In-Reply-To: <1310725361.55706.YahooMailNeo@web78204.mail.sg1.yahoo.com>
References: <1310725361.55706.YahooMailNeo@web78204.mail.sg1.yahoo.com>
Message-ID: <1310725536.71830.YahooMailNeo@web78216.mail.sg1.yahoo.com>

Dear WRF users,

I have a problem while processing ARWpost.exe. The error notification is like this:
=======================================================
START PROCESSING DATA
Interpolating to PRESSURE levels
Processing time --- 2010-02-09_04:00:18
Found the right date - continue
Processing time --- 2010-02-09_04:10:18
Found the right date - continue
Troubles finding level 100 above ground.
Problems first occur at ( 1, 1)
Surface pressure = NaN hPa.
Error_in_finding_100_hPa_up
==============================================
Please help me, what should I do to solve this problem? Thank you.

Regards,
Ari Maryadi
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110715/ed3e5e19/attachment.html

From selaya062 at yahoo.com Fri Jul 15 04:27:16 2011
From: selaya062 at yahoo.com (Arie maryadi)
Date: Fri, 15 Jul 2011 18:27:16 +0800 (SGT)
Subject: [Wrf-users] Please help
In-Reply-To: <1310725536.71830.YahooMailNeo@web78216.mail.sg1.yahoo.com>
References: <1310725361.55706.YahooMailNeo@web78204.mail.sg1.yahoo.com> <1310725536.71830.YahooMailNeo@web78216.mail.sg1.yahoo.com>
Message-ID: <1310725636.41488.YahooMailNeo@web78214.mail.sg1.yahoo.com>

Dear WRF users,

I have a problem while processing ARWpost.exe. The error notification is like this:
=======================================================
START PROCESSING DATA
Interpolating to PRESSURE levels
Processing time --- 2010-02-09_04:00:18
Found the right date - continue
Processing time --- 2010-02-09_04:10:18
Found the right date - continue
Troubles finding level 100 above ground.
Problems first occur at ( 1, 1)
Surface pressure = NaN hPa.
Error_in_finding_100_hPa_up
==============================================
Please help me, what should I do to solve this problem? Thank you.

Regards,
Ari Maryadi
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110715/d4f902f5/attachment.html

From apattantyus2008 at my.fit.edu Fri Jul 15 10:52:16 2011
From: apattantyus2008 at my.fit.edu (Andre Pattantyus)
Date: Fri, 15 Jul 2011 10:52:16 -0600
Subject: [Wrf-users] WRF FDDA OBS_DOMAIN format
Message-ID:

Hi all,

I am having trouble finding out how to format observation data correctly for FDDA. I have my own example set to go off of, and I made a new OBS_DOMAIN file with only one sounding in it. When I run wrf, however, I get an error when the file is read in during initialization, when it looks for data in the file; it says something like "error reading end of file". The documentation I found by Cindy Bruyere from NCAR mentions that this is a necessary part of the file. This data is not visible in either the example file I am working off of or the example given by WRF, found here: http://www.mmm.ucar.edu/wrf/src/OBS_DOMAIN. So how do I correct this and fix my obs_domain file so it is read in? Thanks.

Here is what my OBS_DOMAINs look like:
20040119120000 31.87 -106.70 99001 Maybe more site info UA MET FROM KEPZ FM-35 TEMP 1252.
T F 32 878.000 0.000 1252.000 0.000 2.400 0.000 1.450 129.000 0.528 129.000 90.000 0.000 855.000 0.000 1466.000 0.000 6.000 0.000 4.700 129.000 2.092 129.000 71.000 0.000 850.000 0.000 1513.000 0.000 6.000 0.000 5.129 129.000 2.392 129.000 70.000 0.000 835.000 0.000 1659.000 0.000 5.800 0.000 5.440 129.000 1.560 129.000 61.000 0.000 726.000 0.000 2785.000 0.000 -3.300 0.000 3.535 129.000 0.687 129.000 76.000 0.000 700.000 0.000 3074.000 0.000 -3.100 0.000 2.417 129.000 0.880 129.000 54.000 0.000 693.000 0.000 3153.000 0.000 -3.700 0.000 2.862 129.000 1.156 129.000 46.000 0.000 681.000 0.000 3291.000 0.000 -3.500 0.000 3.264 129.000 1.522 129.000 16.000 0.000 669.000 0.000 3431.000 0.000 -3.700 0.000 3.600 129.000 1.995 129.000 25.000 0.000 660.000 0.000 3538.000 0.000 -3.300 0.000 3.926 129.000 2.454 129.000 10.000 0.000 641.000 0.000 3768.000 0.000 -4.300 0.000 4.109 129.000 3.096 129.000 36.000 0.000 629.000 0.000 3917.000 0.000 -4.700 0.000 3.598 129.000 2.914 129.000 23.000 0.000 603.000 0.000 4246.000 0.000 -7.700 0.000 2.960 129.000 2.859 129.000 41.000 0.000 566.000 0.000 4735.000 0.000 -10.900 0.000 5.144 129.000 0.000 129.000 31.000 0.000 562.000 0.000 4790.000 0.000 -10.500 0.000 5.125 129.000 -0.448 129.000 10.000 0.000 546.000 0.000 5011.000 0.000 -11.100 0.000 5.763 129.000 -2.212 129.000 14.000 0.000 500.000 0.000 5680.000 0.000 -15.900 0.000 7.951 129.000 -2.130 129.000 9.000 0.000 400.000 0.000 7320.000 0.000 -28.100 0.000 10.535 129.000 -7.377 129.000 13.000 0.000 390.000 0.000 7502.000 0.000 -29.100 0.000 10.535 129.000 -7.377 129.000 6.000 0.000 331.000 0.000 8652.000 0.000 -39.500 0.000 15.852 129.000 -7.392 129.000 27.000 0.000 300.000 0.000 9320.000 0.000 -44.300 0.000 18.853 129.000 -6.862 129.000 20.000 0.000 250.000 0.000 10520.000 0.000 -53.300 0.000 20.870 129.000 -5.592 129.000 22.000 0.000 229.000 0.000 11078.000 0.000 -56.300 0.000 21.012 129.000 -1.838 129.000 24.000 0.000 209.000 0.000 11659.000 0.000 -53.300 0.000 22.292 129.000 
3.931 129.000 13.000 0.000 200.000 0.000 11940.000 0.000 -55.300 0.000 21.785 129.000 3.841 129.000 10.000 0.000 188.000 0.000 12335.000 0.000 -56.700 0.000 25.457 129.000 6.347 129.000 9.000 0.000 167.000 0.000 13091.000 0.000 -55.500 0.000 23.094 129.000 -1.615 129.000 6.000 0.000 150.000 0.000 13770.000 0.000 -60.300 0.000 17.937 129.000 1.569 129.000 6.000 0.000 127.000 0.000 14802.000 0.000 -61.100 0.000 16.460 129.000 0.287 129.000 5.000 0.000 107.000 0.000 15850.000 0.000 -66.100 0.000 12.920 129.000 3.462 129.000 5.000 0.000 100.000 0.000 16260.000 0.000 -65.300 0.000 14.411 129.000 3.861 129.000 6.000 0.000 Andre -- Andre Pattantyus, Graduate Student Research Assistant Marine and Environmental Systems, Florida Institute of Technology 150 W. University Blvd, Melbourne, FL 32901 Phone: (321) 674-8330 | Fax: (321) 674-7212 | Email: apattantyus2008 at fit.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110715/58ffcd93/attachment.html From done at ucar.edu Fri Jul 15 14:00:33 2011 From: done at ucar.edu (James Done) Date: Fri, 15 Jul 2011 14:00:33 -0600 Subject: [Wrf-users] Regional Climate Prediction Session at AGU 2011 Message-ID: <4E209C61.1080404@ucar.edu> Dear All We invite you to submit abstracts to our session on Regional Climate Prediction at High Resolution at the American Geophysical Union Fall Meeting, 5-9 Dec 2011, San Francisco, CA. http://sites.agu.org/fallmeeting/ Abstract Deadline: 4th Aug. Session Details: GC09: Regional Climate Modeling 3. Regional Climate Prediction at High Resolution Sponsor: Global Environmental Change (GC) Co-Sponsor(s): Atmospheric Sciences (A), Earth and Space Science Informatics (IN), Public Affairs (PA) Convener(s): 1. Greg Holland, NCAR 2. Howard Kunreuther, Wharton, University of Pennsylvania 3. 
William Skamarock, NCAR Description: Regional climate predictions at high resolution and decadal time scales are needed by industry, government and society to enable sufficient understanding and mitigate future costs and disruptions. This exciting session will present the latest scientific results and applications in high resolution climate prediction. Presentations are invited on: predictions of regional climate and high-impact weather statistics on decadal time scales, including uncertainty; coupled data assimilation for regional coupled prediction systems; coupled regional Earth system processes; statistical downscaling; and societal decision support tools. This session will stimulate interaction between diverse areas of expertise and promote novel collaboration. Many thanks, James Done From apattantyus2008 at my.fit.edu Fri Jul 15 13:32:51 2011 From: apattantyus2008 at my.fit.edu (Andre Pattantyus) Date: Fri, 15 Jul 2011 13:32:51 -0600 Subject: [Wrf-users] WRF DFFA OBS_DOMAIN format Message-ID: I found my error - the header formatting was off a bit. -- Andre Pattantyus, Graduate Student Research Assistant Marine and Environmental Systems, Florida Institute of Technology 150 W. University Blvd, Melbourne, FL 32901 Phone: (321) 674-8330 | Fax: (321) 674-7212 | Email: apattantyus2008 at fit.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110715/4624b598/attachment.html From ar.ragi at gmail.com Sat Jul 23 22:13:48 2011 From: ar.ragi at gmail.com (Ragi A.R) Date: Sun, 24 Jul 2011 09:43:48 +0530 Subject: [Wrf-users] SCM Help Message-ID: Dear WRF Users, Has anyone tried generating input for the SCM using make_scm_forcing.ncl? The "GABLS_II_forcing.txt" is for the 1999 case, which generates forcing for 1999. I want to generate forcing input for 2000. From where can I get input for it? 
Thanks in advance *-- * ************************************************************************* *Regards,* *A.R.Ragi* *Research Scholar * *CAS, IIT Delhi* ************************************************************************* *"I want to know how God created this world. I am not interested in this or that phenomenon. I want to know his thoughts; the rest are details." ---Albert Einstein* ************************************************************************* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110724/d4327079/attachment.html From jiwen.fan at pnnl.gov Sun Jul 24 15:58:38 2011 From: jiwen.fan at pnnl.gov (Fan, Jiwen) Date: Sun, 24 Jul 2011 14:58:38 -0700 Subject: [Wrf-users] Young Scientist Forum in 2011 IYC O3 and Climate Change Symposium In-Reply-To: Message-ID: Apologies for cross-posting... Dear colleagues: We are organizing a Young Scientist Forum in a symposium titled "Protecting the stratospheric ozone layer - A successful collaboration of scientists, policy makers, industry, and intergovernmental agencies: Can lessons learned help us deal with climate changes?" to be held from Nov. 7-10, 2011, in Washington, DC, to celebrate the 2011 United Nations International Year of Chemistry (IYC). The symposium will include presentations and working sessions covering 1) a review and update of stratospheric ozone layer and climate-change research, 2) the development and assessment of policies to mitigate stratospheric ozone loss and climate change, 3) possible parallels and relevant lessons learned that might benefit climate change science and policy, and 4) communication of climate change to the public. A detailed description of the symposium can be found at the meeting's official web site (http://www.2011-iyc-o3.org/). The American Meteorological Society (AMS), American Geophysical Union (AGU), and American Chemical Society (ACS) are co-sponsors for the symposium. 
International co-sponsors for this event include the World Meteorological Organization (WMO) and United Nations Environment Programme (UNEP). Drs. Mario Molina (1995 Chemistry Nobel Prize Winner), Ralph Cicerone (President of the National Academy of Sciences), Susan Solomon (Winner of the National Medal of Science), and Bob Watson (former IPCC Chair) will deliver keynotes at the symposium. There will be participation from White House and Congressional staff, including the Department of State Deputy Assistant Secretary for Environment. The Young Scientist Forum will include both oral and poster presentations on the subjects of research in atmospheric sciences and climate change and integration between science and policy. NSF and DOE have provided funding to support young scientists to attend the symposium, and we anticipate an attendance of more than 50 young scientists at the Young Scientist Forum, including non-tenured faculty, post docs, and graduate students. The application form for the travel grants can be found at the meeting's website. We look forward to seeing you at the symposium. Jiwen Fan On behalf of the organizers of Young Scientist Forum in 2011 IYC O3 and Climate Change Symposium (Jiwen Fan, PNNL, Trude Storelvmo, Yale Univ., and Annmarie G. Carlton, Rutgers Univ.) ---- Jiwen Fan, Ph.D. Scientist Atmospheric Science & Global Change Division Pacific Northwest National Laboratory PO Box 999, MSIN K9-24 Richland, WA 99352 509/375-2116 (o) Jiwen.fan at pnl.gov -- From jiwen.fan at pnnl.gov Sun Jul 24 23:23:23 2011 From: jiwen.fan at pnnl.gov (Fan, Jiwen) Date: Sun, 24 Jul 2011 22:23:23 -0700 Subject: [Wrf-users] The Fourth Symposium on Aerosol-Cloud-Climate Interactions at 92nd AMS Annual Meeting In-Reply-To: Message-ID: Dear Colleagues, We would like to bring to your attention the Fourth Symposium on Aerosol-Cloud-Climate Interactions at the upcoming 92nd American Meteorological Society Annual Meeting (January 22-26, 2012). 
We invite papers on observational, modeling, instrumentation and/or laboratory studies related to the topic. In particular, we will organize sessions on Aerosol-Cloud Interactions from Cloud-Scale to Large-Scale Circulation. Please feel free to forward the invitation to your colleagues who might be interested in submitting an abstract to this symposium. The abstract submission deadline is coming up soon, on August 1st. Below is the link to submit your abstract: http://ams.confex.com/ams/92Annual/4aerocloud/papers/index.cgi?username=195364&password=825550 Sorry for cross-posting. Jiwen On behalf of the organizers of the Fourth Symposium on Aerosol-Cloud-Climate Interactions ---- Jiwen Fan, Ph.D. Scientist Atmospheric Science & Global Change Division Pacific Northwest National Laboratory PO Box 999, MSIN K9-24 Richland, WA 99352 509/375-2116 (o) Jiwen.fan at pnl.gov From mmkamal at uwaterloo.ca Wed Jul 20 13:05:37 2011 From: mmkamal at uwaterloo.ca (mmkamal at uwaterloo.ca) Date: Wed, 20 Jul 2011 15:05:37 -0400 Subject: [Wrf-users] Does anyone modified run_wrfpost/run_wrfpostandgrads (WPP) script for multiple time steps Message-ID: <20110720150537.472043s2uhrvb7s4@www.nexusmail.uwaterloo.ca> Hi All, I was wondering whether anyone has already modified the run_wrfpost/run_wrfpostandgrads (WPP) script to read a history file with multiple time steps. If so, could you please share it with me? Thanks Kamal From reenb at meteo.psu.edu Fri Jul 15 14:55:24 2011 From: reenb at meteo.psu.edu (Brian Reen) Date: Fri, 15 Jul 2011 16:55:24 -0400 Subject: [Wrf-users] WRF DFFA OBS_DOMAIN format In-Reply-To: References: Message-ID: <4E20A93C.8080403@meteo.psu.edu> Andre, It looks like you are specifying that there are 32 levels in the observation, but only providing 31 levels. Try changing 32 to 31 in the header (at the end of the line that starts with "FM-35 TEMP"). Also, it looks like you have your temperatures in the wrong units. 
I believe the temperature should be in K, so the lowest level in your ob has a temperature of 2.4 K. You may want to format the obs for input to OBSGRID (http://www.mmm.ucar.edu/wrf/users/docs/user_guide_V3/users_guide_chap7.htm) so that it can do QC, etc., and then it will output the OBS_DOMAIN file. For information on the data format to put into OBSGRID see section 6.12 in: http://www.mmm.ucar.edu/mm5/documents/MM5_tut_Web_notes/OA/OA.htm Thanks, Brian On 7/15/2011 12:52 PM, Andre Pattantyus wrote: > Hi all, > > I am having trouble finding out how to format observation data correctly in > fdda. I have my own example set to go off of and I made a new OBS_DOMAIN > with only one sounding in it. When I run wrf however I get an error when the > file gets read in during intialization when it looks for data in file. It > says something like error reading end of file. The documentation I found by > cindy bruyere from NCAR mentions that this is a necessary part of the file. > This data is not visible in either my example file I am working off of or > the example given by wrf, found here > http://www.mmm.ucar.edu/wrf/src/OBS_DOMAIN. So how do I correct this and fix > my obs_domain file so it is read in? Thanks. > > Here is what my OBS_DOMAINS look like: > > 20040119120000 > 31.87 -106.70 > 99001 Maybe more site info UA MET FROM KEPZ > FM-35 TEMP 1252. 
T F 32 > 878.000 0.000 1252.000 0.000 2.400 > 0.000 1.450 129.000 0.528 129.000 90.000 > 0.000 > 855.000 0.000 1466.000 0.000 6.000 > 0.000 4.700 129.000 2.092 129.000 71.000 > 0.000 > 850.000 0.000 1513.000 0.000 6.000 > 0.000 5.129 129.000 2.392 129.000 70.000 > 0.000 > 835.000 0.000 1659.000 0.000 5.800 > 0.000 5.440 129.000 1.560 129.000 61.000 > 0.000 > 726.000 0.000 2785.000 0.000 -3.300 > 0.000 3.535 129.000 0.687 129.000 76.000 > 0.000 > 700.000 0.000 3074.000 0.000 -3.100 > 0.000 2.417 129.000 0.880 129.000 54.000 > 0.000 > 693.000 0.000 3153.000 0.000 -3.700 > 0.000 2.862 129.000 1.156 129.000 46.000 > 0.000 > 681.000 0.000 3291.000 0.000 -3.500 > 0.000 3.264 129.000 1.522 129.000 16.000 > 0.000 > 669.000 0.000 3431.000 0.000 -3.700 > 0.000 3.600 129.000 1.995 129.000 25.000 > 0.000 > 660.000 0.000 3538.000 0.000 -3.300 > 0.000 3.926 129.000 2.454 129.000 10.000 > 0.000 > 641.000 0.000 3768.000 0.000 -4.300 > 0.000 4.109 129.000 3.096 129.000 36.000 > 0.000 > 629.000 0.000 3917.000 0.000 -4.700 > 0.000 3.598 129.000 2.914 129.000 23.000 > 0.000 > 603.000 0.000 4246.000 0.000 -7.700 > 0.000 2.960 129.000 2.859 129.000 41.000 > 0.000 > 566.000 0.000 4735.000 0.000 -10.900 > 0.000 5.144 129.000 0.000 129.000 31.000 > 0.000 > 562.000 0.000 4790.000 0.000 -10.500 > 0.000 5.125 129.000 -0.448 129.000 10.000 > 0.000 > 546.000 0.000 5011.000 0.000 -11.100 > 0.000 5.763 129.000 -2.212 129.000 14.000 > 0.000 > 500.000 0.000 5680.000 0.000 -15.900 > 0.000 7.951 129.000 -2.130 129.000 9.000 > 0.000 > 400.000 0.000 7320.000 0.000 -28.100 > 0.000 10.535 129.000 -7.377 129.000 13.000 > 0.000 > 390.000 0.000 7502.000 0.000 -29.100 > 0.000 10.535 129.000 -7.377 129.000 6.000 > 0.000 > 331.000 0.000 8652.000 0.000 -39.500 > 0.000 15.852 129.000 -7.392 129.000 27.000 > 0.000 > 300.000 0.000 9320.000 0.000 -44.300 > 0.000 18.853 129.000 -6.862 129.000 20.000 > 0.000 > 250.000 0.000 10520.000 0.000 -53.300 > 0.000 20.870 129.000 -5.592 129.000 22.000 > 0.000 > 229.000 0.000 
11078.000 0.000 -56.300 > 0.000 21.012 129.000 -1.838 129.000 24.000 > 0.000 > 209.000 0.000 11659.000 0.000 -53.300 > 0.000 22.292 129.000 3.931 129.000 13.000 > 0.000 > 200.000 0.000 11940.000 0.000 -55.300 > 0.000 21.785 129.000 3.841 129.000 10.000 > 0.000 > 188.000 0.000 12335.000 0.000 -56.700 > 0.000 25.457 129.000 6.347 129.000 9.000 > 0.000 > 167.000 0.000 13091.000 0.000 -55.500 > 0.000 23.094 129.000 -1.615 129.000 6.000 > 0.000 > 150.000 0.000 13770.000 0.000 -60.300 > 0.000 17.937 129.000 1.569 129.000 6.000 > 0.000 > 127.000 0.000 14802.000 0.000 -61.100 > 0.000 16.460 129.000 0.287 129.000 5.000 > 0.000 > 107.000 0.000 15850.000 0.000 -66.100 > 0.000 12.920 129.000 3.462 129.000 5.000 > 0.000 > 100.000 0.000 16260.000 0.000 -65.300 > 0.000 14.411 129.000 3.861 129.000 6.000 > 0.000 > > > Andre > > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users From ebeigi3 at tigers.lsu.edu Mon Jul 25 16:19:12 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Mon, 25 Jul 2011 17:19:12 -0500 Subject: [Wrf-users] Metgird.exe error Message-ID: Dear Sir/Madam, I am using ifort and icc version 11 on Linux Red Hat 6. I successfully installed WRF and WPS with the parallel processing (dm) option, but when I run metgrid.exe it takes so long that it never finishes; it stalls at: Processing domain 1 of 1: Any solution will be appreciated in advance. -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... 
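Picking up the OBS_DOMAIN thread above: the two problems Brian identifies, a declared level count that disagrees with the number of data records, and temperatures left in degrees Celsius rather than kelvin, can be caught mechanically before handing the file to WRF. The sketch below is written only against the simplified layout visible in the quoted sounding (a "T F <nlevels>" line followed by 12 whitespace-separated numbers per level: six variables, each paired with a QC field); the field positions and the 150 K threshold are illustrative assumptions, not part of the documented format.

```python
# Hedged sketch: sanity-check a simplified OBS_DOMAIN-style sounding.
# Assumes the layout shown in the quoted excerpt above; the real format
# has more structure, so treat field positions here as assumptions.

FIELDS_PER_LEVEL = 12  # p, z, T, u, v, RH -- each value followed by a QC field

def check_sounding(lines):
    """Return (declared_levels, found_levels, celsius_suspects)."""
    declared = None
    values = []
    for line in lines:
        tokens = line.split()
        if declared is None:
            # header line like "T F 32": two logical flags, then the level count
            if len(tokens) == 3 and tokens[0] in ("T", "F") and tokens[1] in ("T", "F"):
                declared = int(tokens[2])
            continue
        for tok in tokens:
            try:
                values.append(float(tok))
            except ValueError:
                pass  # ignore non-numeric trailing text in this sketch
    found = len(values) // FIELDS_PER_LEVEL
    # temperature is the third variable, i.e. field index 4 of each level record
    temps = [values[i * FIELDS_PER_LEVEL + 4] for i in range(found)]
    # WRF expects kelvin; values below ~150 were almost certainly left in Celsius
    celsius_suspects = [t for t in temps if t < 150.0]
    return declared, found, celsius_suspects
```

Run on the sounding quoted above, a check like this would report 32 declared levels against 31 records found, consistent with Brian's diagnosis, and would flag every temperature as a Celsius suspect.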
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110725/1e729489/attachment.html From johnsonp at ucar.edu Thu Jul 28 13:27:01 2011 From: johnsonp at ucar.edu (Pam Johnson) Date: Thu, 28 Jul 2011 13:27:01 -0600 Subject: [Wrf-users] CIRES job posting Message-ID: <4E31B805.3010807@ucar.edu> Hi, The University of Colorado is recruiting a CIRES Research Associate with expertise in statistical post-processing and background in ensemble forecasting and/or data assimilation. The job is in the Global Systems Division of NOAA's Earth System Research Lab in Boulder. If you are interested, the job announcement is posted on the CIRES Jobs site, http://cires.colorado.edu/jobs/ , under "Statistical Post-Processing" (GSD-1, position # 813447). Please pass this message to anyone who may be interested. Thank you, and sorry for multiple emails from different lists. Jon Rush Associate Director for Administration Cooperative Institute for Research in Environmental Sciences University of Colorado at Boulder From jiwen.fan at pnnl.gov Wed Aug 3 00:14:00 2011 From: jiwen.fan at pnnl.gov (Fan, Jiwen) Date: Tue, 2 Aug 2011 23:14:00 -0700 Subject: [Wrf-users] Last reminder- AMS meeting special session: Aerosol-Cloud Interactions from Cloud-Scale to Large-Scale Circulation Message-ID: Dear Colleagues, The AMS special session: Aerosol-Cloud Interactions from Cloud-Scale to Large-Scale Circulation at the Fourth Symposium on Aerosol-Cloud-Climate Interactions of the 2012 American Meteorological Society Meeting (22-26 January 2012, New Orleans, Louisiana) is now available for you to choose while submitting your abstract. For those who have submitted abstracts, you may want to contact us if your title and abstract are not obviously related to this topic but you want to have your presentation assigned to this session. We invite papers on observational, modeling, instrumentation and/or laboratory studies. The deadline to submit an abstract has been extended to 8 August 2011. 
Please consider joining us, and also please pass this email on to others who may be interested! Our confirmed invited speakers are V. Ramanathan, Bill Cotton, Joyce Penner, and Chien Wang. We look forward to seeing you in New Orleans. Jiwen Fan & Leo Donner -- Jiwen Fan Atmospheric Science & Global Change Division Pacific Northwest National Laboratory PO Box 999, MSIN K9-24 Richland, WA 99352 509/375-2116 (o) Jiwen.fan at pnl.gov -- Leo Donner Geophysical Fluid Dynamics Laboratory/NOAA Princeton University Forrestal Campus 201 Forrestal Rd. Princeton, NJ 08540 (609) 452-6562 Leo.J.Donner at noaa.gov ......................... From chenming at ucar.edu Wed Aug 3 11:53:13 2011 From: chenming at ucar.edu (Ming Chen) Date: Wed, 03 Aug 2011 11:53:13 -0600 Subject: [Wrf-users] fwd: Announcement for upcoming conference Message-ID: <4E398B09.7040105@ucar.edu> Dear Colleague, EPRI and the Air & Waste Management Association are organizing a workshop on Future Air Quality Model Development Needs, to be held on September 12 & 13, 2011 in Washington, DC. The agenda is as follows: Keynote Speaker: John Seinfeld, California Institute of Technology 1. Homogeneous-Phase Chemistry Ron Cohen, University of California Berkeley Dick Derwent, Rdsc 2. Heterogeneous-Phase Chemistry (including Inorganic and Organic Aqueous-Phase Chemistry) Hartmut Herrmann, Leibniz Institute for Tropospheric Research Barbara Turpin, Rutgers University 3. Organic Particulate Matter: Formation and Aging of Secondary Organic Aerosol Prakash Bhave, US Environmental Protection Agency Allen Robinson, Carnegie Mellon University 4. 
Meteorological Processes Affecting Air Quality Jerome Fast, Pacific Northwest National Laboratory Leiming Zhang, Environment Canada. This workshop is designed to bring together leading researchers from academia, government, and private institutions, industry, and other stakeholders to brainstorm on various model development needs and develop a comprehensive research agenda that can be used by the community to help guide research plans and promote collaboration among researchers. The focus of this two-day workshop is limited to the use of atmospheric models to support regional-scale air quality applications, such as forecasting or control strategy development. On behalf of the steering committee, we would like to invite you to participate and contribute to the success of the workshop. LEARN MORE OR REGISTER ONLINE HERE or follow this link: http://cts.vresp.com/c/?AirWasteManagementAs/b42c14fa4e/906c2f3e85/1fc55d7f0d/iKey=S136310 Stuart McKeen ESRL/NOAA, Boulder, CO -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110803/96885d96/attachment.html From antonio.parodi at cimafoundation.org Wed Aug 3 15:31:29 2011 From: antonio.parodi at cimafoundation.org (antonio.parodi at cimafoundation.org) Date: Wed, 3 Aug 2011 23:31:29 +0200 Subject: [Wrf-users] AGU 2011 (deadline august 4th): last minute reminder NG12: Scaling and Prediction of Climate Extremes and Regime Transitions In-Reply-To: <4E3924C2.6090206@cimafoundation.org> References: <4E3924C2.6090206@cimafoundation.org> Message-ID: <4bb7d134b5b925f8c05f903c002f2b85.squirrel@mail.cimafoundation.org> Dear Colleague, We would like to invite you to submit your contribution to the session (deadline august 4th, tomorrow): NG12: Scaling and Prediction of Climate Extremes and Regime Transitions Sponsor: Nonlinear Geophysics (NG) Co-Sponsor(s): Atmospheric Sciences (A), Global Environmental Change (GC), Hydrology (H), Natural Hazards (NH) The invited speakers are: Efi Foufoula-Georgiou (St. Anthony Falls Laboratory, University of Minnesota) Auroop R. Ganguly (GIST Group, CSED, Oak Ridge National Laboratory) Dimitris Giannakis (Center for Atmosphere Ocean Science (CAOS), Courant Institute of Mathematical Sciences) William K. Lau (NASA) You can find more session information at the web site: http://sites.agu.org/fallmeeting/scientific-program/session-search/573 Abstracts should be submitted through the AGU website at the following web address: http://sites.agu.org/fallmeeting/announcements/abstract-submission-open/ Sorry for cross-posting. Best regards Ana Barros Alin-Andrei Carsteanu Joshua Hacker Antonio Parodi NG12: Scaling and Prediction of Climate Extremes and Regime Transitions Description: This session seeks contributions concerning modeling and observational studies of multiscale Climate processes including Hydrometeorology and Hydrology defined broadly (e.g. 
from global to regional scale down to cloud formation and precipitation, runoff, groundwater and streamflow) which have a direct impact on predictability of extreme events (heat waves, floods, droughts) and regime transitions. Studies over a wide range of temporal and spatial scales including nonlinear models, scaling analysis, multifractals and cascades are encouraged. Papers focusing on the detection and fingerprinting of scaling behavior associated with model boundary conditions, threshold physics, and model or observing system structure are welcome. From jennie.thomas at latmos.ipsl.fr Fri Aug 5 08:05:39 2011 From: jennie.thomas at latmos.ipsl.fr (Jennie Thomas) Date: Fri, 5 Aug 2011 16:05:39 +0200 Subject: [Wrf-users] Post-doctoral research position on the 'Regional modelling of short-lived climate forcers' located in Paris, France Message-ID: Attached is the announcement for one of several postdoc positions available in my lab. I'm happy to answer informal questions about the position. Please send CVs to Kathy Law and Jean-Christophe Raut. ************************************************************************ Post-doctoral research position on the 'Regional modelling of short-lived climate forcers' at Laboratoire Atmosphères, Milieux, Observations Spatiales (LATMOS), Université Pierre et Marie Curie/CNRS, Paris, France ************************************************************************ A research position is available for 2-3 years to work on the ECLIPSE (Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants) project, which has recently been accepted for funding by the European Commission. 
The main goal of ECLIPSE, which involves several European groups, is to develop and assess effective emission abatement strategies for short-lived climate forcers, notably tropospheric ozone and black carbon, in order to provide sound scientific advice on measures that mitigate climate change and improve air quality at the same time. A key part of the work will be to evaluate the strengths and weaknesses of global chemistry-climate models used to assess the climate and air quality impacts of short-lived climate forcers. In particular, the work at LATMOS will be to carry out high-resolution case studies using the WRF-Chem model over major emission regions. Long-range transport between pollution source and receptor regions (such as the Arctic) will also be investigated. Process-based analysis of ozone and aerosols through comparison with observations will be used to evaluate model performance. Applications are invited from post-doctoral researchers with a PhD in environmental or atmospheric sciences and ideally experience in running chemical-aerosol models and data analysis. Candidates should have a good working knowledge of computer programming (Unix, Fortran, IDL/Matlab/NCL), be proficient in English and willing to work in an international framework attending project meetings and conferences. The position is available from late-autumn 2011. To apply send a detailed CV (including publication list), a letter of motivation (in English) and email addresses of 3 referees to Kathy.Law at latmos.ipsl.fr and Jean-Christophe.Raut at latmos.ipsl.fr. Review of the applications will start in mid-September 2011 and the position is open until filled. LATMOS (http://www.latmos.ipsl.fr/) is located in central Paris at the Jussieu campus of Universite Pierre et Marie Curie (http://www.upmc.fr/en/index.html). It is a joint CNRS/university laboratory which is also part of the Institut Pierre Simon Laplace (http://www.ipsl.fr/). 
It carries out research into a wide range of topics related to atmospheric science. Cheers, Jennie Thomas _________________ Laboratoire Atmosphères, Milieux, Observations Spatiales LATMOS Tour 45, Couloir 45-46, 3e et 4e étages (boîte 102) Université Pierre et Marie Curie 4 place Jussieu 75252 Paris Cedex 05 France -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110805/757d9fc9/attachment-0001.html From jennie.thomas at latmos.ipsl.fr Fri Aug 5 08:07:29 2011 From: jennie.thomas at latmos.ipsl.fr (Jennie Thomas) Date: Fri, 5 Aug 2011 16:07:29 +0200 Subject: [Wrf-users] Post-Doctoral Positions in the field of atmospheric composition and its evolution located in Paris, France Message-ID: Attached is the announcement for two postdoc positions available in my lab. I'm happy to answer informal questions about the positions. Please send CVs to Kathy Law and Claire Granier. Post-Doctoral Positions in the field of atmospheric composition and its evolution The LATMOS laboratory, located at the University Pierre and Marie Curie in the center of Paris, invites applications for 2 postdoctoral researcher positions funded by the European Commission. The contracts will start between autumn 2011 and early 2012. The proposed work will contribute to the following European projects: - ACCESS (Arctic Climate Change, Economy and Society) running from 2011 to 2015 (see http://www.upmc.fr/en/press_office/press_releases/arctic_ocean_under_close_surveillance_at_upmc.html). The goals of the ACCESS project are to evaluate climatic impacts in the Arctic on marine transportation, fisheries, marine mammals and the extraction of hydrocarbons for the next 20 years, with particular attention to environmental sensitivities and sustainability. The Arctic is a region undergoing unprecedented changes. 
The work at LATMOS will focus on quantification of the impact of local emissions (shipping, resource extraction) and remote mid-latitude pollution on present-day and future air composition in the Arctic. The LATMOS group will also participate in an aircraft campaign which will provide data on emissions and plume dispersion from shipping and oil/gas extraction. The post-doctoral work will focus on global and north-hemispheric scales, and links with regional modeling work. - MACC-II (Monitoring Atmospheric Composition and Climate): http://www.gmes-atmosphere.eu/gmes-atmosphere.eu, planned to start early 2012 for 3 years. The goal of MACC-II is to provide state-of-the-art simulations of atmospheric composition for recent years using models run on the operational system at ECMWF, as well as results for monitoring present conditions and provision of chemical forecasts, for example, for field campaigns. The available position will focus on the quantification, harmonization and evaluation of surface emissions at the global and regional scales, and participation in a pilot study of the use of inverse modeling techniques to improve emissions in the context of the MACC-II modeling work. The LATMOS group will also participate in the development of a new emissions database for atmospheric constituents, and will work with other European groups on interconnections between different databases for emissions, observations and modeling results. Qualifications and requirements for the positions: - PhD in atmospheric sciences or in a similar field - Experience in atmospheric chemistry modeling / analysis of atmospheric observations / surface emissions - Programming skills in Unix/Linux, Fortran, and scientific visualization programs (Matlab, IDL or NCL) - Good English language skills - Willingness to work in an international framework, and to attend the different meetings organized within each project. 
Review of the applications will start mid-September 2011, and will continue until the positions are filled. The positions are available for 2-3 years. To apply, send a detailed CV (including a list of publications), a letter of motivation in English, and the email addresses of two academic referees to: Claire Granier: claire.granier at latmos.ipsl.fr and Kathy Law: kathy.law at latmos.ipsl.fr Cheers, Jennie Thomas _________________ Laboratoire Atmosph?res, Milieux, Observations Spatiales LATMOS Tour 45, Couloir 45-46, 3e et 4e ?tages (bo?te 102) Universit? Pierre et Marie Curie 4 place Jussieu 75252 Paris Cedex 05 France -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110805/b943e388/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: clip_image002.png Type: image/png Size: 6601 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110805/b943e388/attachment-0001.png From mashfaq at ornl.gov Thu Aug 4 21:30:52 2011 From: mashfaq at ornl.gov (Ashfaq, Moetasim) Date: Thu, 04 Aug 2011 23:30:52 -0400 Subject: [Wrf-users] Postdoctoral Researcher in Regional Climate Modeling at ORNL Message-ID: <19BF3DE9435EFC4F8AD75ADB7F030A4E0AA45874FF@EXCHMB.ornl.gov> Postdoctoral Researcher in Regional Climate Modeling The Computational Earth Sciences group of the Computer Science and Mathematics Division at Oak Ridge National Laboratory seeks to hire a Post Doctoral Researcher to participate in research involving the development of approaches to investigate natural and anthropogenic hydroclimate variability at regional and local scales. 
Using a suite of global-to-regional scale climate models and observational datasets, this research will improve the quantitative understanding of the nature of interactions between fine-scale hydroclimate processes and large-scale climate forcing, and the role of such interactions in the occurrence of high-intensity low-frequency hydro-climate extremes at multi-decadal time scales. The successful candidate will be expected to: 1) Play a key role in the planning and execution of multi-model-driven ultra-high resolution global and regional climate modeling experiments 2) Analyze, present and publish research results in scientific conferences and peer-reviewed journals 3) Coordinate and collaborate with researchers at ORNL and other DOE National Labs. Minimum Qualification: Candidates must have received a PhD in Atmospheric and Oceanic Sciences or a related field within the past five years from an accredited college or university. Required Skills: 1) Strong understanding of current climate modeling techniques with demonstrated expertise in the use and application of a Regional Climate Model. 2) Demonstrated ability to perform comprehensive analysis of large climate datasets through advanced data analysis techniques and evaluation metrics. 3) Excellent oral and written communication skills. 4) Good publications record in peer-reviewed journals. 5) Expertise in one of the programming languages such as Fortran or C, and in analysis packages such as MATLAB, IDL, or NCL. Desired Skills: 1) Knowledge of North American climate and global monsoon systems 2) Experience in the use of Global Climate Model data 3) Experience in the use of climate model output in process-based hydrological applications and statistical hydrology We anticipate this to be a two-year position, dependent on continuing funding. Applications will be accepted until the position is filled. Technical Questions: For more information about this position please contact Dr. Moetasim Ashfaq (mashfaq at ornl.gov). 
Please reference this position title in your correspondence. Interested candidates should apply online: https://www3.orau.gov/ORNL_TOppS/Posting/Details/174 Please refer to the following link for the application requirements: http://www.orau.org/ornl/postdocs/ornl-pd-pm/application.htm This appointment is offered through the ORNL Postgraduate Research Participation Program and is administered by the Oak Ridge Institute for Science and Education (ORISE). The program is open to all qualified U.S. and non-U.S. citizens without regard to race, color, age, religion, sex, national origin, physical or mental disability, or status as a Vietnam-era veteran or disabled veteran. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110804/c8f0d13c/attachment.html From mmkamal at uwaterloo.ca Fri Aug 5 11:03:16 2011 From: mmkamal at uwaterloo.ca (mmkamal at uwaterloo.ca) Date: Fri, 05 Aug 2011 13:03:16 -0400 Subject: [Wrf-users] What are the CESM variable needed to run WRF Message-ID: <20110805130316.99091rq9dnvun800@www.nexusmail.uwaterloo.ca> Hi All, I would like to use CESM output as ILBC for WRF. Could anybody please tell me which CESM (both CAM & CLM) variables are needed to run WRF? Thanks in advance Kamal From ebeigi3 at tigers.lsu.edu Sun Aug 7 12:55:55 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Sun, 7 Aug 2011 14:55:55 -0400 Subject: [Wrf-users] ingesting CCSM3 to WRF? Message-ID: Dear Sir/Madam, I am using CAM2WRF to convert the output of CAM (CCSM3) for WRF. The CAM2WRF utility is written to convert both CAM and CLM data, but I have only CAM data. Has anyone modified the CAM2WRF code so that it considers only CAM data? Any help will be appreciated in advance. -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F.
Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110807/ffc71a22/attachment.html From sstolaki at geo.auth.gr Mon Aug 8 04:36:43 2011 From: sstolaki at geo.auth.gr (Stavroula Stolaki) Date: Mon, 08 Aug 2011 13:36:43 +0300 Subject: [Wrf-users] a question on RIP Message-ID: <20110808133643.24136yn06jcefj23@webmail.auth.gr> Hi all, I am trying to use RIP in order to plot values of precipitation and surface pressure for the domain of Europe. I would like to set two different colors for land and water in my maps and at the same time overlay on this map the values (in colored contours) of total precipitation for the past 3 hours. I am having difficulty doing this, though, since the land-water map covers the precipitation values in the output maps. I attach part of my .in script: feld=rtot3h; ptyp=hc; wdbr=0.05;smth=303;nsmm;cbeg=0.1;cmth=fill;> cosq=0.1,white,0.25,light.blue,0.5,light.purple,1,> light.cerulean,2,cyan,4,avocado,8,green,16,forest.green,> 32,dark.green,64,med.gray,128,dark.gray;> hvbr=0; nmin;nmsg=true feld=slp; ptyp=hc; cint=4;linw=1;pwlb=0;colr=red;pwlb=0;smth=303;nsmm feld=map; ptyp=hb; mfco=cyan,lemon;outy=Earth..3L5; Any help would be appreciated. Thank you in advance. Stavroula Stolaki, PhD Candidate. From havadurumu at gmail.com Mon Aug 8 12:24:41 2011 From: havadurumu at gmail.com (Abdullah Kahraman) Date: Mon, 8 Aug 2011 21:24:41 +0300 Subject: [Wrf-users] a question on RIP In-Reply-To: <20110808133643.24136yn06jcefj23@webmail.auth.gr> References: <20110808133643.24136yn06jcefj23@webmail.auth.gr> Message-ID: Hi Ms Stolaki, You should just change the order of the fields to be plotted, i.e.
shift the last line to the beginning so that the land/sea colors do not cover all the other fields, like this: feld=map; ptyp=hb; mfco=cyan,lemon;outy=Earth..3L5; feld=rtot3h; ptyp=hc; wdbr=0.05;smth=303;nsmm;cbeg=0.1;cmth=fill;> cosq=0.1,white,0.25,light.blue,0.5,light.purple,1,> light.cerulean,2,cyan,4,avocado,8,green,16,forest.green,> 32,dark.green,64,med.gray,128,dark.gray;> hvbr=0; nmin;nmsg=true feld=slp; ptyp=hc; cint=4;linw=1;pwlb=0;colr=red;pwlb=0;smth=303;nsmm I hope this helps. Best, Abdullah. *Abdullah KAHRAMAN* M.Sc. Meteorological Engineer Turkish State Meteorological Service Ph.D. Candidate Istanbul Technical University, Atmospheric Sciences Program On Mon, Aug 8, 2011 at 13:36, Stavroula Stolaki wrote: > Hi all, > > I am trying to use RIP in order to plot values of precipitation and > surface pressure for the domain of Europe. I would like to set two > different colors for land and water in my maps and at the same time > overlay on this map the values (in colored contours) of total > precipitation for the past 3 hours. I find difficulty in doing this > though, since the land-water map covers the precipitation values in > the output maps. > > I attach part of my .in script: > > feld=rtot3h; ptyp=hc; wdbr=0.05;smth=303;nsmm;cbeg=0.1;cmth=fill;> > cosq=0.1,white,0.25,light.blue,0.5,light.purple,1,> > light.cerulean,2,cyan,4,avocado,8,green,16,forest.green,> > 32,dark.green,64,med.gray,128,dark.gray;> > hvbr=0; nmin;nmsg=true > feld=slp; ptyp=hc; cint=4;linw=1;pwlb=0;colr=red;pwlb=0;smth=303;nsmm > feld=map; ptyp=hb; mfco=cyan,lemon;outy=Earth..3L5; > > > > Any help would be appreciated. > > Thank you in advance. > > Stavroula Stolaki, PhD Candidate. > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110808/6736f1cb/attachment.html From dbh409 at ku.edu Mon Aug 8 19:55:42 2011 From: dbh409 at ku.edu (Huber, David) Date: Tue, 9 Aug 2011 01:55:42 +0000 Subject: [Wrf-users] WRF Compilation Error Message-ID: Hello, I'm having an issue compiling WRF V3.1.1 with the Intel compilers using a multi-processor (dmpar) build. NCAR Graphics (6.0.0) and NetCDF (4.1.3) are also built with the Intel compilers. The configure.wrf file contents (minus comments) and a list of the environmental variables are shown at the end of this message. The following is the first of several warning/undefined reference messages: ifort -w -ftz -align all -fno-alias -fp-model precise -FR -convert big_endian -c -I/usr/local/include -I../ioapi_share diffwrf.f diffwrf.f(1629): (col. 7) remark: LOOP WAS VECTORIZED. diffwrf.f(1630): (col. 7) remark: LOOP WAS VECTORIZED. diffwrf.f(1709): (col. 7) remark: LOOP WAS VECTORIZED. diffwrf.f(1710): (col. 7) remark: LOOP WAS VECTORIZED. diffwrf.f(1711): (col. 7) remark: LOOP WAS VECTORIZED. diffwrf.f(1712): (col. 7) remark: LOOP WAS VECTORIZED. diffwrf io_netcdf is being built now. /usr/local/intel/11.0/084/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail wrf_io.o: In function `ext_ncd_get_var_info_': wrf_io.f:(.text+0x1ef): undefined reference to `nf_inq_varid_' wrf_io.f:(.text+0x740): undefined reference to `nf_inq_vartype_' wrf_io.f:(.text+0x883): undefined reference to `nf_get_att_int_' wrf_io.f:(.text+0x1158): undefined reference to `nf_get_att_text_' wrf_io.f:(.text+0x142e): undefined reference to `nf_inq_vardimid_' wrf_io.f:(.text+0x1595): undefined reference to `nf_inq_dimlen_' I'm not certain if this is an issue with NetCDF, the shared object, or what the deal with feupdateenv is about. I have successfully built WRF and WPS on this system before, but this is the first time I have tried it with everything compiled with icc and ifort. 
Any suggestions would be much appreciated. Thanks, Dave configure.wrf: SHELL = /bin/sh DEVTOP = `pwd` LIBINCLUDE = . .SUFFIXES: .F .i .o .f90 .c COREDEFS = -DEM_CORE=$(WRF_EM_CORE) \ -DNMM_CORE=$(WRF_NMM_CORE) -DNMM_MAX_DIM=2600 \ -DCOAMPS_CORE=$(WRF_COAMPS_CORE) \ -DDA_CORE=$(WRF_DA_CORE) \ -DEXP_CORE=$(WRF_EXP_CORE) MAX_DOMAINS = 21 CONFIG_BUF_LEN = 32768 NATIVE_RWORDSIZE = 4 SED_FTN = $(WRF_SRC_ROOT_DIR)/tools/standard.exe IO_GRIB_SHARE_DIR = ESMF_COUPLING = 0 ESMF_MOD_DEPENDENCE = $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/module_utility.o ESMF_IO_INC = -I$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90 ESMF_MOD_INC = $(ESMF_IO_INC) ESMF_IO_DEFS = ESMF_TARGET = esmf_time LIBWRFLIB = libwrflib.a DMPARALLEL = 1 OMPCPP = # -D_OPENMP OMP = # -openmp -fpp -auto SFC = ifort SCC = icc DM_FC = mpif90 -f90=$(SFC) DM_CC = mpicc -cc=$(SCC) -DMPI2_SUPPORT FC = $(DM_FC) CC = $(DM_CC) -DFSEEKO64_OK LD = $(FC) RWORDSIZE = $(NATIVE_RWORDSIZE) PROMOTION = -i4 ARCH_LOCAL = -DNONSTANDARD_SYSTEM_FUNC CFLAGS_LOCAL = -w -O3 -ip LDFLAGS_LOCAL = -ip CPLUSPLUSLIB = ESMF_LDFLAG = $(CPLUSPLUSLIB) FCOPTIM = -O3 FCREDUCEDOPT = $(FCOPTIM) FCNOOPT = -O0 -fno-inline -fno-ip FCDEBUG = # -g $(FCNOOPT) -traceback FORMAT_FIXED = -FI FORMAT_FREE = -FR FCSUFFIX = BYTESWAPIO = -convert big_endian FCBASEOPTS = -w -ftz -align all -fno-alias -fp-model precise $(FCDEBUG) $(FORMAT_FREE) $(BYTESWAPIO) MODULE_SRCH_FLAG = TRADFLAG = -traditional CPP = /lib/cpp -C -P AR = ar ARFLAGS = ru M4 = m4 RANLIB = ranlib CC_TOOLS = $(SCC) FGREP = fgrep -iq ARCHFLAGS = $(COREDEFS) -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=$(RWORDSIZE) -DLWORDSIZE=4 \ $(ARCH_LOCAL) \ $(DA_ARCHFLAGS) \ -DDM_PARALLEL \ \ -DNETCDF \ \ \ \ \ \ \ -DGRIB1 \ -DINTIO \ -DLIMIT_ARGS \ -DCONFIG_BUF_LEN=$(CONFIG_BUF_LEN) \ -DMAX_DOMAINS_F=$(MAX_DOMAINS) \ -DNMM_NEST=$(WRF_NMM_NEST) CFLAGS = $(CFLAGS_LOCAL) -DDM_PARALLEL FCFLAGS = $(FCOPTIM) $(FCBASEOPTS) ESMF_LIB_FLAGS = ESMF_IO_LIB = $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a 
ESMF_IO_LIB_EXT = -L$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a INCLUDE_MODULES = $(MODULE_SRCH_FLAG) \ $(ESMF_MOD_INC) $(ESMF_LIB_FLAGS) \ -I$(WRF_SRC_ROOT_DIR)/main \ -I$(WRF_SRC_ROOT_DIR)/external/io_netcdf \ -I$(WRF_SRC_ROOT_DIR)/external/io_int \ -I$(WRF_SRC_ROOT_DIR)/frame \ -I$(WRF_SRC_ROOT_DIR)/share \ -I$(WRF_SRC_ROOT_DIR)/phys \ -I$(WRF_SRC_ROOT_DIR)/chem -I$(WRF_SRC_ROOT_DIR)/inc \ \ REGISTRY = Registry LIB_BUNDLED = \ -L$(WRF_SRC_ROOT_DIR)/external/fftpack/fftpack5 -lfftpack \ -L$(WRF_SRC_ROOT_DIR)/external/io_grib1 -lio_grib1 \ -L$(WRF_SRC_ROOT_DIR)/external/io_grib_share -lio_grib_share \ -L$(WRF_SRC_ROOT_DIR)/external/io_int -lwrfio_int \ $(ESMF_IO_LIB) \ $(ESMF_IO_LIB) \ $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a \ $(WRF_SRC_ROOT_DIR)/frame/module_internal_header_util.o \ $(WRF_SRC_ROOT_DIR)/frame/pack_utils.o LIB_EXTERNAL = \ $(WRF_SRC_ROOT_DIR)/external/io_netcdf/libwrfio_nf.a -L/usr/local/lib -lnetcdff -lnetcdf LIB = $(LIB_BUNDLED) $(LIB_EXTERNAL) $(LIB_LOCAL) LDFLAGS = $(OMP) $(FCFLAGS) $(LDFLAGS_LOCAL) ENVCOMPDEFS = WRF_CHEM = 0 CPPFLAGS = $(ARCHFLAGS) $(ENVCOMPDEFS) -I$(LIBINCLUDE) $(TRADFLAG) NETCDFPATH = /usr/local PNETCDFPATH = bundled: wrf_ioapi_includes wrfio_grib_share wrfio_grib1 wrfio_int esmf_time fftpack external: wrfio_nf $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a gen_comms_rsllite module_dm_rsllite $(ESMF_TARGET) ###################### externals: bundled external gen_comms_serial : ( /bin/rm -f $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ) module_dm_serial : ( if [ ! -e module_dm.F ] ; then /bin/cp module_dm_warning module_dm.F ; cat module_dm_stubs.F >> module_dm.F ; fi ) gen_comms_rsllite : ( if [ ! -e $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ] ; then \ /bin/cp $(WRF_SRC_ROOT_DIR)/tools/gen_comms_warning $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ; \ cat $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/gen_comms.c >> $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ; fi ) module_dm_rsllite : ( if [ ! 
-e module_dm.F ] ; then /bin/cp module_dm_warning module_dm.F ; \ cat $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/module_dm.F >> module_dm.F ; fi ) wrfio_nf : ( cd $(WRF_SRC_ROOT_DIR)/external/io_netcdf ; \ make NETCDFPATH="$(NETCDFPATH)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ CC="$(SCC)" CFLAGS="$(CFLAGS)" \ FC="$(SFC) $(PROMOTION) $(FCFLAGS)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) wrfio_pnf : ( cd $(WRF_SRC_ROOT_DIR)/external/io_pnetcdf ; \ make NETCDFPATH="$(PNETCDFPATH)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ FC="$(FC) $(PROMOTION) $(FCFLAGS)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) wrfio_grib_share : ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib_share ; \ make CC="$(SCC)" CFLAGS="$(CFLAGS)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" archive) wrfio_grib1 : ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib1 ; \ make CC="$(SCC)" CFLAGS="$(CFLAGS)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" archive) wrfio_grib2 : ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib2 ; \ make CC="$(SCC)" CFLAGS="$(CFLAGS) " RM="$(RM)" RANLIB="$(RANLIB)" \ CPP="$(CPP)" \ FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="-traditional" AR="$(AR)" ARFLAGS="$(ARFLAGS)" \ FIXED="$(FORMAT_FIXED)" archive) wrfio_int : ( cd $(WRF_SRC_ROOT_DIR)/external/io_int ; \ make CC="$(CC)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ FC="$(SFC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" \ TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" all ) esmf_time : ( cd $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90 ; \ make FC="$(SFC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" RANLIB="$(RANLIB)" \ CPP="$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc -I. 
$(ARCHFLAGS) $(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) fftpack : ( cd $(WRF_SRC_ROOT_DIR)/external/fftpack/fftpack5 ; \ make FC="$(SFC)" FFLAGS="$(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" RANLIB="$(RANLIB)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a : ( cd $(WRF_SRC_ROOT_DIR)/external/RSL_LITE ; make CC="$(CC) $(CFLAGS)" \ FC="$(FC) $(FCFLAGS) $(PROMOTION) $(BYTESWAPIO)" \ CPP="$(CPP) -I. $(ARCHFLAGS) $(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ;\ $(RANLIB) $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a ) LN = ln -sf MAKE = make -i -r RM = rm -f # These sub-directory builds are identical across all architectures wrf_ioapi_includes : ( cd $(WRF_SRC_ROOT_DIR)/external/ioapi_share ; \ $(MAKE) NATIVE_RWORDSIZE="$(NATIVE_RWORDSIZE)" RWORDSIZE="$(RWORDSIZE)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) wrfio_esmf : ( cd $(WRF_SRC_ROOT_DIR)/external/io_esmf ; \ make FC="$(FC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS) $(ESMF_MOD_INC)" \ RANLIB="$(RANLIB)" CPP="$(CPP) $(POUND_DEF) " AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) # There is probably no reason to modify these rules .F.i: $(RM) $@ $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $*.F > $@ mv $*.i $(DEVTOP)/pick/$*.f90 cp $*.F $(DEVTOP)/pick .F.o: $(RM) $@ $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.F > $*.bb $(SED_FTN) $*.bb | $(CPP) > $*.f90 $(RM) $*.b $*.bb if $(FGREP) '!$$OMP' $*.f90 ; then \ if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \ $(FC) -o $@ -c $(FCFLAGS) $(OMP) $(MODULE_DIRS) $(PROMOTION) $(FCSUFFIX) $*.f90 ; \ else \ if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \ $(FC) -o $@ -c $(FCFLAGS) $(MODULE_DIRS) $(PROMOTION) $(FCSUFFIX) $*.f90 ; \ fi .F.f90: $(RM) $@ $(SED_FTN) $*.F > $*.b $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $*.b > $@ $(RM) $*.b .f90.o: $(RM) $@ $(FC) -o $@ -c $(FCFLAGS) $(PROMOTION) $(FCSUFFIX) $*.f90 .c.o: $(RM) $@ $(CC) -o $@ -c $(CFLAGS) $*.c # A little more adventurous. 
Allow full opt on # mediation_integrate.o \ # shift_domain_em.o \ # solve_em.o <-- gets a little kick from SOLVE_EM_SPECIAL too, if defined # mediation_feedback_domain.o : mediation_feedback_domain.F # mediation_force_domain.o : mediation_force_domain.F # mediation_interp_domain.o : mediation_interp_domain.F # compile these without high optimization to speed compile convert_nmm.o : convert_nmm.F init_modules_em.o : init_modules_em.F input_wrf.o : input_wrf.F module_io.o : module_io.F module_comm_dm.o : module_comm_dm.F module_configure.o : module_configure.F module_dm.o : module_dm.F module_domain.o : module_domain.F module_domain_type.o : module_domain_type.F module_alloc_space.o : module_alloc_space.F module_tiles.o : module_tiles.F module_fddaobs_rtfdda.o : module_fddaobs_rtfdda.F module_initialize.o : module_initialize.F module_physics_init.o : module_physics_init.F module_initialize_b_wave.o : module_initialize_b_wave.F module_initialize_hill2d_x.o : module_initialize_hill2d_x.F module_initialize_quarter_ss.o : module_initialize_quarter_ss.F module_initialize_real.o : module_initialize_real.F module_initialize_real.o: module_initialize_real.F module_initialize_squall2d_x.o : module_initialize_squall2d_x.F module_initialize_squall2d_y.o : module_initialize_squall2d_y.F module_integrate.o : module_integrate.F module_io_mm5.o : module_io_mm5.F module_io_wrf.o : module_io_wrf.F module_si_io.o : module_si_io.F module_state_description.o : module_state_description.F output_wrf.o : output_wrf.F NMM_NEST_UTILS1.o : NMM_NEST_UTILS1.F solve_interface.o : solve_interface.F start_domain.o : start_domain.F start_domain_nmm.o : start_domain_nmm.F start_em.o : start_em.F wrf_auxhist10in.o : wrf_auxhist10in.F wrf_auxhist10out.o : wrf_auxhist10out.F wrf_auxhist11in.o : wrf_auxhist11in.F wrf_auxhist11out.o : wrf_auxhist11out.F wrf_auxhist1in.o : wrf_auxhist1in.F wrf_auxhist1out.o : wrf_auxhist1out.F wrf_auxhist2in.o : wrf_auxhist2in.F wrf_auxhist2out.o : wrf_auxhist2out.F 
wrf_auxhist3in.o : wrf_auxhist3in.F wrf_auxhist3out.o : wrf_auxhist3out.F wrf_auxhist4in.o : wrf_auxhist4in.F wrf_auxhist4out.o : wrf_auxhist4out.F wrf_auxhist5in.o : wrf_auxhist5in.F wrf_auxhist5out.o : wrf_auxhist5out.F wrf_auxhist6in.o : wrf_auxhist6in.F wrf_auxhist6out.o : wrf_auxhist6out.F wrf_auxhist7in.o : wrf_auxhist7in.F wrf_auxhist7out.o : wrf_auxhist7out.F wrf_auxhist8in.o : wrf_auxhist8in.F wrf_auxhist8out.o : wrf_auxhist8out.F wrf_auxhist9in.o : wrf_auxhist9in.F wrf_auxhist9out.o : wrf_auxhist9out.F wrf_auxinput10in.o : wrf_auxinput10in.F wrf_auxinput10out.o : wrf_auxinput10out.F wrf_auxinput11in.o : wrf_auxinput11in.F wrf_auxinput11out.o : wrf_auxinput11out.F wrf_auxinput1in.o : wrf_auxinput1in.F wrf_auxinput1out.o : wrf_auxinput1out.F wrf_auxinput2in.o : wrf_auxinput2in.F wrf_auxinput2out.o : wrf_auxinput2out.F wrf_auxinput3in.o : wrf_auxinput3in.F wrf_auxinput3out.o : wrf_auxinput3out.F wrf_auxinput4in.o : wrf_auxinput4in.F wrf_auxinput4out.o : wrf_auxinput4out.F wrf_auxinput5in.o : wrf_auxinput5in.F wrf_auxinput5out.o : wrf_auxinput5out.F wrf_auxinput6in.o : wrf_auxinput6in.F wrf_auxinput6out.o : wrf_auxinput6out.F wrf_auxinput7in.o : wrf_auxinput7in.F wrf_auxinput7out.o : wrf_auxinput7out.F wrf_auxinput8in.o : wrf_auxinput8in.F wrf_auxinput8out.o : wrf_auxinput8out.F wrf_auxinput9in.o : wrf_auxinput9in.F wrf_auxinput9out.o : wrf_auxinput9out.F wrf_bdyin.o : wrf_bdyin.F wrf_bdyout.o : wrf_bdyout.F wrf_ext_read_field.o : wrf_ext_read_field.F wrf_ext_write_field.o : wrf_ext_write_field.F wrf_fddaobs_in.o : wrf_fddaobs_in.F wrf_histin.o : wrf_histin.F wrf_histout.o : wrf_histout.F wrf_inputin.o : wrf_inputin.F wrf_inputout.o : wrf_inputout.F wrf_restartin.o : wrf_restartin.F wrf_restartout.o : wrf_restartout.F wrf_tsin.o : wrf_tsin.F nl_get_0_routines.o : nl_get_0_routines.F nl_get_1_routines.o : nl_get_1_routines.F nl_set_0_routines.o : nl_set_0_routines.F nl_set_1_routines.o : nl_set_1_routines.F convert_nmm.o \ init_modules_em.o \ module_dm.o \ 
module_fddaobs_rtfdda.o \ module_initialize.o \ module_initialize_b_wave.o \ module_initialize_hill2d_x.o \ module_initialize_quarter_ss.o \ module_initialize_real.o \ module_initialize_squall2d_x.o \ module_initialize_squall2d_y.o \ module_integrate.o \ module_io_mm5.o \ module_io_wrf.o \ module_si_io.o \ module_tiles.o \ output_wrf.o \ NMM_NEST_UTILS1.o \ solve_interface.o \ start_domain.o \ start_domain_nmm.o \ shift_domain_nmm.o \ start_em.o \ wrf_fddaobs_in.o \ wrf_tsin.o : $(RM) $@ $(SED_FTN) $*.F > $*.b $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.b > $*.f90 $(RM) $*.b if $(FGREP) '!$$OMP' $*.f90 ; then \ if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \ $(FC) -c $(PROMOTION) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $(OMP) $*.f90 ; \ else \ if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \ $(FC) -c $(PROMOTION) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $*.f90 ; \ fi module_sf_ruclsm.o : module_sf_ruclsm.F module_sf_ruclsm.o : $(RM) $@ $(SED_FTN) $*.F > $*.b $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.b > $*.f90 $(RM) $*.b if $(FGREP) '!$$OMP' $*.f90 ; then \ echo COMPILING $*.F WITH OMP ; \ if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \ $(FC) -c $(PROMOTION) $(FCREDUCEDOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $(OMP) $*.f90 ; \ else \ if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \ $(FC) -c $(PROMOTION) $(FCREDUCEDOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $*.f90 ; \ fi input_wrf.o \ module_domain.o \ module_domain_type.o \ module_physics_init.o \ module_io.o \ wrf_auxhist10in.o \ wrf_auxhist10out.o \ wrf_auxhist11in.o \ wrf_auxhist11out.o \ wrf_auxhist1in.o \ wrf_auxhist1out.o \ wrf_auxhist2in.o \ wrf_auxhist2out.o \ wrf_auxhist3in.o \ wrf_auxhist3out.o \ wrf_auxhist4in.o \ wrf_auxhist4out.o \ wrf_auxhist5in.o \ wrf_auxhist5out.o \ wrf_auxhist6in.o \ wrf_auxhist6out.o \ wrf_auxhist7in.o \ wrf_auxhist7out.o \ wrf_auxhist8in.o \ 
wrf_auxhist8out.o \ wrf_auxhist9in.o \ wrf_auxhist9out.o \ wrf_auxinput10in.o \ wrf_auxinput10out.o \ wrf_auxinput11in.o \ wrf_auxinput11out.o \ wrf_auxinput1in.o \ wrf_auxinput1out.o \ wrf_auxinput2in.o \ wrf_auxinput2out.o \ wrf_auxinput3in.o \ wrf_auxinput3out.o \ wrf_auxinput4in.o \ wrf_auxinput4out.o \ wrf_auxinput5in.o \ wrf_auxinput5out.o \ wrf_auxinput6in.o \ wrf_auxinput6out.o \ wrf_auxinput7in.o \ wrf_auxinput7out.o \ wrf_auxinput8in.o \ wrf_auxinput8out.o \ wrf_auxinput9in.o \ wrf_auxinput9out.o \ wrf_bdyin.o \ wrf_bdyout.o \ wrf_ext_read_field.o \ wrf_ext_write_field.o \ wrf_histin.o \ wrf_histout.o \ wrf_inputin.o \ wrf_inputout.o \ wrf_restartin.o \ wrf_restartout.o \ module_state_description.o \ nl_set_0_routines.o \ nl_set_1_routines.o \ nl_get_0_routines.o \ nl_get_1_routines.o \ module_alloc_space.o \ module_comm_dm.o \ module_configure.o : $(RM) $@ $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.F > $*.bb $(SED_FTN) $*.bb | $(CPP) > $*.f90 $(RM) $*.b $*.bb $(FC) -c $(PROMOTION) $(FCSUFFIX) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $*.f90 Environmental Variables: USER=dbh409 LOGNAME=dbh409 HOME=/home/dbh409 PATH=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0/bin:/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/bin:/usr/local/intel/11.0/084/bin/intel64:/usr/local/intel/11.0/084/bin/intel64:/usr/local/maui/bin:/usr/local/maui/sbin:/usr/local/torque/bin:/usr/local/torque/sbin:/usr/local/Modules/3.2.6/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/home/software/netCDF/3.6.3/intel/mvapich/bin:/usr/local/ncarg_ifort/bin:. 
MAIL=/var/spool/mail/dbh409 SHELL=/bin/tcsh SSH_CLIENT=10.237.0.175 49359 22 SSH_CONNECTION=10.237.0.175 49359 129.237.228.235 22 SSH_TTY=/dev/pts/3 TERM=xterm HOSTTYPE=x86_64-linux VENDOR=unknown OSTYPE=linux MACHTYPE=x86_64 SHLVL=1 PWD=/home/dbh409/WRFV3 GROUP=dbh409 HOST=pequod REMOTEHOST=10.237.0.175 HOSTNAME=pequod INPUTRC=/etc/inputrc LS_COLORS=no=00:fi=00:di=01;34:ln=01;36:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.gz=01;31:*.bz2=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.mpg=01;35:*.mpeg=01;35:*.avi=01;35:*.fli=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.ogg=01;35:*.mp3=01;35:*.wav=01;35: G_BROKEN_FILENAMES=1 SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass KDE_IS_PRELINKED=1 KDEDIR=/usr LANG=en_US.UTF-8 LESSOPEN=|/usr/bin/lesspipe.sh %s MANPATH=/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/share/man:/usr/local/intel/11.0/084/man:/usr/local/intel/11.0/084/man:/usr/local/maui/man:/usr/local/torque/man:/usr/local/Modules/3.2.6/man:/usr/man:/usr/share/man:/usr/local/man:/usr/local/share/man:/usr/X11R6/man QTDIR=/usr/lib64/qt-3.3 QTINC=/usr/lib64/qt-3.3/include QTLIB=/usr/lib64/qt-3.3/lib MODULE_VERSION=3.2.6 MODULE_VERSION_STACK=3.2.6 MODULESHOME=/usr/local/Modules/3.2.6 MODULEPATH=/usr/local/Modules/versions:/usr/local/Modules/$MODULE_VERSION/modulefiles:/usr/local/Modules/modulefiles: LOADEDMODULES=null:modules:tools/torque-maui:compilers/64/intel-x86_64:openmpi/openmpi-1.3.2-ethernet-intel.64:tools/netcdf-4.1.3-intel:mpich2/mpich2-1.0.7-ethernet-intel.64 CC=/usr/local/intel/11.0/084/bin/intel64/icc CCHOME=/usr/local/intel/11.0/084 CXX=/usr/local/intel/11.0/084/bin/intel64/icpc F77=/usr/local/intel/11.0/084/bin/intel64/ifort 
F90=/usr/local/intel/11.0/084/bin/intel64/ifort FC=/usr/local/intel/11.0/084/bin/intel64/ifort FCHOME=/usr/local/intel/11.0/084 INTEL_LICENSE_FILE=/opt/intel/licenses/l_RH4JJF9H.lic LD_LIBRARY_PATH=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0/lib:/usr/local/lib:/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/lib:/usr/local/intel/11.0/084/lib/intel64:/usr/local/maui/lib:/usr/local/torque/lib MPIHOME=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0 NETCDF=/usr/local _LMFILES_=/usr/local/Modules/modulefiles/null:/usr/local/Modules/modulefiles/modules:/usr/local/Modules/modulefiles/tools/torque-maui:/usr/local/Modules/modulefiles/compilers/64/intel-x86_64:/usr/local/Modules/modulefiles/openmpi/openmpi-1.3.2-ethernet-intel.64:/usr/local/Modules/modulefiles/tools/netcdf-4.1.3-intel:/usr/local/Modules/modulefiles/mpich2/mpich2-1.0.7-ethernet-intel.64 NCARG_ROOT=/usr/local/ncarg_ifort RIP_ROOT=/home/dbh409/RIP4 ITT_DIR=/usr/local/itt IDL_DIR=/usr/local/itt/idl71 The NetCDF build location is /usr/local/, a slight accident, but I don't think it makes a difference that it's not /usr/local/netcdf-4.1.3. From raju_attadamsc at rediffmail.com Tue Aug 9 03:55:53 2011 From: raju_attadamsc at rediffmail.com (rajuattada) Date: 9 Aug 2011 09:55:53 -0000 Subject: [Wrf-users] Help Needed Message-ID: <1312567363.S.71264.32393.f4mail-235-193.rediffmail.com.old.1312883753.20938@webmail.rediffmail.com> Dear All, When I am plotting NCEP data for vorticity, it is not displayed near my boundaries (borders). I have enclosed the plot; please find the attachment. Please give me suggestions. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110809/8d30a739/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed...
Name: AS_vorticity_850.png Type: image/png Size: 23571 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110809/8d30a739/attachment-0001.png From agnes.mika at bmtargoss.com Tue Aug 9 09:29:57 2011 From: agnes.mika at bmtargoss.com (Agnes Mika) Date: Tue, 9 Aug 2011 17:29:57 +0200 Subject: [Wrf-users] Help Needed In-Reply-To: <1312567363.S.71264.32393.f4mail-235-193.rediffmail.com.old.1312883753.20938@webmail.rediffmail.com> References: <1312567363.S.71264.32393.f4mail-235-193.rediffmail.com.old.1312883753.20938@webmail.rediffmail.com> Message-ID: <20110809152956.GF20086@aggedor.argoss.nl> Hello, It does make sense to me that you don't see anything near the boundaries. Vorticity is defined as dv/dx - du/dy (with partial derivatives), so you need data from the surrounding grid points to be able to compute it for a given grid point. On or close to the boundary you don't have enough data, so your program probably does not compute it. Greetings, Agnes rajuattada wrote: > Dear All, > > When i am plotting NCEP data for vorticity, it is not displaying near the my boundaries(boarders). > > I enclosed plotted one. Just find the attachment. > > Please give me suggestions....... > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users -- Dr. Ágnes Mika Advisor, Meteorology and Air Quality Tel: +31 (0)527-242299 Fax: +31 (0)527-242016 Web: www.bmtargoss.com BMT ARGOSS P.O. Box 61, 8325 ZH Vollenhove Voorsterweg 28, 8316 PT Marknesse The Netherlands Confidentiality Notice & Disclaimer The contents of this e-mail and any attachments are intended for the use of the mail addressee(s) shown. If you are not that person, you are not allowed to take any action based upon it or to copy it, forward, distribute or disclose its contents and you should delete it from your system.
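[Archive editor's aside] The boundary behaviour described above can be sketched numerically: with centered differences, relative vorticity dv/dx - du/dy is only computable where a grid point has neighbours on all four sides, so the outermost ring of the grid stays undefined. A minimal illustration with a hypothetical grid spacing and synthetic wind fields (not the poster's data):

```python
import numpy as np

# Toy grid: 5 x 6 points, 100 km spacing (arbitrary values for illustration)
ny, nx = 5, 6
dx = dy = 1.0e5
y, x = np.mgrid[0:ny, 0:nx]
u = np.sin(y / 2.0)   # synthetic zonal wind
v = np.cos(x / 2.0)   # synthetic meridional wind

# Start with NaN everywhere, including the boundary ring
zeta = np.full((ny, nx), np.nan)

# Centered differences dv/dx - du/dy: assignable only on interior points,
# because each term needs a neighbour on both sides.
zeta[1:-1, 1:-1] = ((v[1:-1, 2:] - v[1:-1, :-2]) / (2 * dx)
                    - (u[2:, 1:-1] - u[:-2, 1:-1]) / (2 * dy))

print(np.isnan(zeta[0, :]).all())   # boundary row is never filled -> True
```

Plotting software typically leaves such NaN/missing cells blank, which matches the empty border in the attached figure.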
BMT ARGOSS does not accept liability for any errors or omissions in the context of this e-mail or its attachments which arise as a result of internet transmission, nor accept liability for statements which are those of the author and clearly not made on behalf of BMT ARGOSS. Please consider the environmental impacts of printing this e-mail, and only do so if really necessary. From andrew.porter at stfc.ac.uk Tue Aug 9 09:42:09 2011 From: andrew.porter at stfc.ac.uk (Andy Porter) Date: Tue, 09 Aug 2011 16:42:09 +0100 Subject: [Wrf-users] WRF Compilation Error In-Reply-To: References: Message-ID: <4E415551.1000304@stfc.ac.uk> Hello, > > I'm having an issue compiling WRF V3.1.1 with the Intel compilers using a multi-processor (dmpar) build. NCAR Graphics (6.0.0) and NetCDF (4.1.3) are also built with the Intel compilers. The configure.wrf file contents (minus comments) and a list of the environmental variables are shown at the end of this message. The following is the first of several warning/undefined reference messages: > > > ifort -w -ftz -align all -fno-alias -fp-model precise -FR -convert big_endian -c -I/usr/local/include -I../ioapi_share diffwrf.f > diffwrf.f(1629): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1630): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1709): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1710): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1711): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1712): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf io_netcdf is being built now. 
> /usr/local/intel/11.0/084/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail > wrf_io.o: In function `ext_ncd_get_var_info_': > wrf_io.f:(.text+0x1ef): undefined reference to `nf_inq_varid_' > wrf_io.f:(.text+0x740): undefined reference to `nf_inq_vartype_' > wrf_io.f:(.text+0x883): undefined reference to `nf_get_att_int_' > wrf_io.f:(.text+0x1158): undefined reference to `nf_get_att_text_' > wrf_io.f:(.text+0x142e): undefined reference to `nf_inq_vardimid_' > wrf_io.f:(.text+0x1595): undefined reference to `nf_inq_dimlen_' > > > I'm not certain if this is an issue with NetCDF, the shared object, or what the deal with feupdateenv is about. I have successfully built WRF and WPS on this system before, but this is the first time I have tried it with everything compiled with icc and ifort. Any suggestions would be much appreciated. > That looks like either netcdf has not been built with Fortran support or there's an issue with how the compiler has named its objects. I've not had trouble with the Intel compiler and the latter issue so I suspect the former - I think the configure flag you want for netcdf is --enable-f90. HTH, Andy. -- Dr. Andrew Porter Computational Scientist Advanced Research Computing Group Computational Science and Engineering Dept. STFC Daresbury Laboratory Keckwick Lane Daresbury WA4 4AD Tel. : +44 (0)1925 603607 From dbh409 at ku.edu Tue Aug 9 11:07:30 2011 From: dbh409 at ku.edu (Huber, David) Date: Tue, 9 Aug 2011 17:07:30 +0000 Subject: [Wrf-users] WRF Compilation Error In-Reply-To: References: , Message-ID: Hi Dmitry, Thanks for the reply. It took a little digging, but I found the makefile with this command in WRFV3/external/io_netcdf/makefile.
Here's the block for diffwrf.F90: diffwrf: diffwrf.F90 x=`echo "$(FC)" | awk '{print $$1}'` ; export x ; \ if [ $$x = "gfortran" ] ; then \ echo removing external declaration of iargc for gfortran ; \ $(CPP1) -I$(NETCDFPATH)/include -I../ioapi_share diffwrf.F90 | sed '/integer *, *external.*iargc/d' > diffwrf.f ;\ else \ $(CPP1) -I$(NETCDFPATH)/include -I../ioapi_share diffwrf.F90 > diffwrf.f ; \ fi $(FC) -c $(FFLAGS) diffwrf.f if [ \( -f ../../frame/wrf_debug.o \) -a \( -f ../../frame/module_wrf_error.o \) -a \( -f $(ESMF_MOD_DEPENDENCE) \) ] ; then \ echo "diffwrf io_netcdf is being built now. " ; \ if [ -f $(NETCDFPATH)/lib/libnetcdff.a ] ; then \ $(FC) $(FFLAGS) $(LDFLAGS) -o diffwrf diffwrf.o $(OBJSL) bitwise_operators.o ../../frame/wrf_debug.o ../../frame/module_wrf_error.o $(ESMF_IO_LIB_EXT) $(LIBS) -lnetcdff ;\ else \ $(FC) $(FFLAGS) $(LDFLAGS) -o diffwrf diffwrf.o $(OBJSL) bitwise_operators.o ../../frame/wrf_debug.o ../../frame/module_wrf_error.o $(ESMF_IO_LIB_EXT) $(LIBS) ;\ fi ; \ else \ echo "***************************************************************************** " ; \ echo "*** Rerun compile to make diffwrf in external/io_netcdf directory *** " ; \ echo "***************************************************************************** " ; \ fi There was an @ before the line if [ \( -f ../../frame/wrf_debug.o \) ... which I deleted, but the output did not change. I see the link for netcdff and the $LIBS, which consists of LIBS = -L$(NETCDFPATH)/lib -lnetcdf -Dave ________________________________________ From: Dmitry N. Mikushin [maemarcus at gmail.com] Sent: Tuesday, August 09, 2011 10:44 AM To: Huber, David Cc: wrf-users at ucar.edu Subject: Re: [Wrf-users] WRF Compilation Error Hi, > I'm not certain if this is an issue with NetCDF, the shared object, or what the deal with feupdateenv is about. - This is usually harmless and is always around with the Intel compiler. 
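[Editorial aside: the "@" David deleted is make's echo-suppression prefix, the same mechanism Dmitry describes. A minimal, self-contained sketch with two throwaway makefiles (not the WRF ones) shows the difference:]

```shell
# With "@", make runs a recipe line silently; without it, make prints
# the command itself before running it -- which is how a hidden link
# line is exposed in a build log.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Recipe lines must be tab-indented; printf emits the literal tabs.
printf 'diffwrf:\n\t@echo diffwrf io_netcdf is being built now.\n' > Makefile.hidden
printf 'diffwrf:\n\techo diffwrf io_netcdf is being built now.\n'  > Makefile.shown

echo '--- with @ (command hidden):'
make -f Makefile.hidden diffwrf
echo '--- without @ (command echoed first):'
make -f Makefile.shown diffwrf
```

The second invocation prints the `echo ...` command itself before its output; in the WRF makefile the same trick makes the full `$(FC) ... -o diffwrf ...` link command visible.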
> /usr/local/intel/11.0/084/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail > wrf_io.o: In function `ext_ncd_get_var_info_': > wrf_io.f:(.text+0x1ef): undefined reference to `nf_inq_varid_' > wrf_io.f:(.text+0x740): undefined reference to `nf_inq_vartype_' > wrf_io.f:(.text+0x883): undefined reference to `nf_get_att_int_' > wrf_io.f:(.text+0x1158): undefined reference to `nf_get_att_text_' > wrf_io.f:(.text+0x142e): undefined reference to `nf_inq_vardimid_' > wrf_io.f:(.text+0x1595): undefined reference to `nf_inq_dimlen_' These are unresolved externals, which normally occur when the proper library is not added to the link command. Could you print out the exact linking command it is using? > diffwrf io_netcdf is being built now. - I suppose the linking command appears in the Makefile right after the line echoing "diffwrf io_netcdf is being built now". If it starts with "@", then it's hidden. Remove "@", and you will see the command. Once the exact command is known, it should be much easier to understand what's wrong. - D. 2011/8/9 Huber, David : > Hello, > > I'm having an issue compiling WRF V3.1.1 with the Intel compilers using a multi-processor (dmpar) build. NCAR Graphics (6.0.0) and NetCDF (4.1.3) are also built with the Intel compilers. The configure.wrf file contents (minus comments) and a list of the environmental variables are shown at the end of this message. The following is the first of several warning/undefined reference messages: > > > ifort -w -ftz -align all -fno-alias -fp-model precise -FR -convert big_endian -c -I/usr/local/include -I../ioapi_share diffwrf.f > diffwrf.f(1629): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1630): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1709): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1710): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1711): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1712): (col. 7) remark: LOOP WAS VECTORIZED. 
> diffwrf io_netcdf is being built now. > /usr/local/intel/11.0/084/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail > wrf_io.o: In function `ext_ncd_get_var_info_': > wrf_io.f:(.text+0x1ef): undefined reference to `nf_inq_varid_' > wrf_io.f:(.text+0x740): undefined reference to `nf_inq_vartype_' > wrf_io.f:(.text+0x883): undefined reference to `nf_get_att_int_' > wrf_io.f:(.text+0x1158): undefined reference to `nf_get_att_text_' > wrf_io.f:(.text+0x142e): undefined reference to `nf_inq_vardimid_' > wrf_io.f:(.text+0x1595): undefined reference to `nf_inq_dimlen_' > > > I'm not certain if this is an issue with NetCDF, the shared object, or what the deal with feupdateenv is about. I have successfully built WRF and WPS on this system before, but this is the first time I have tried it with everything compiled with icc and ifort. Any suggestions would be much appreciated. > > Thanks, > > Dave > > > configure.wrf: > > > SHELL = /bin/sh > DEVTOP = `pwd` > LIBINCLUDE = . 
> .SUFFIXES: .F .i .o .f90 .c > > COREDEFS = -DEM_CORE=$(WRF_EM_CORE) \ > -DNMM_CORE=$(WRF_NMM_CORE) -DNMM_MAX_DIM=2600 \ > -DCOAMPS_CORE=$(WRF_COAMPS_CORE) \ > -DDA_CORE=$(WRF_DA_CORE) \ > > -DEXP_CORE=$(WRF_EXP_CORE) > > > MAX_DOMAINS = 21 > > CONFIG_BUF_LEN = 32768 > > > NATIVE_RWORDSIZE = 4 > > SED_FTN = $(WRF_SRC_ROOT_DIR)/tools/standard.exe > > IO_GRIB_SHARE_DIR = > > ESMF_COUPLING = 0 > ESMF_MOD_DEPENDENCE = $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/module_utility.o > > ESMF_IO_INC = -I$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90 > ESMF_MOD_INC = $(ESMF_IO_INC) > ESMF_IO_DEFS = > ESMF_TARGET = esmf_time > > > LIBWRFLIB = libwrflib.a > > > DMPARALLEL = 1 > OMPCPP = # -D_OPENMP > OMP = # -openmp -fpp -auto > SFC = ifort > SCC = icc > DM_FC = mpif90 -f90=$(SFC) > DM_CC = mpicc -cc=$(SCC) -DMPI2_SUPPORT > FC = $(DM_FC) > CC = $(DM_CC) -DFSEEKO64_OK > LD = $(FC) > RWORDSIZE = $(NATIVE_RWORDSIZE) > PROMOTION = -i4 > ARCH_LOCAL = -DNONSTANDARD_SYSTEM_FUNC > CFLAGS_LOCAL = -w -O3 -ip > LDFLAGS_LOCAL = -ip > CPLUSPLUSLIB = > ESMF_LDFLAG = $(CPLUSPLUSLIB) > FCOPTIM = -O3 > FCREDUCEDOPT = $(FCOPTIM) > FCNOOPT = -O0 -fno-inline -fno-ip > FCDEBUG = # -g $(FCNOOPT) -traceback > FORMAT_FIXED = -FI > FORMAT_FREE = -FR > FCSUFFIX = > BYTESWAPIO = -convert big_endian > FCBASEOPTS = -w -ftz -align all -fno-alias -fp-model precise $(FCDEBUG) $(FORMAT_FREE) $(BYTESWAPIO) > MODULE_SRCH_FLAG = > TRADFLAG = -traditional > CPP = /lib/cpp -C -P > AR = ar > ARFLAGS = ru > M4 = m4 > RANLIB = ranlib > CC_TOOLS = $(SCC) > > FGREP = fgrep -iq > > ARCHFLAGS = $(COREDEFS) -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=$(RWORDSIZE) -DLWORDSIZE=4 \ > $(ARCH_LOCAL) \ > $(DA_ARCHFLAGS) \ > -DDM_PARALLEL \ > > \ > -DNETCDF \ > \ > \ > \ > \ > \ > \ > -DGRIB1 \ > -DINTIO \ > -DLIMIT_ARGS \ > -DCONFIG_BUF_LEN=$(CONFIG_BUF_LEN) \ > -DMAX_DOMAINS_F=$(MAX_DOMAINS) \ > -DNMM_NEST=$(WRF_NMM_NEST) > CFLAGS = $(CFLAGS_LOCAL) -DDM_PARALLEL > FCFLAGS = $(FCOPTIM) $(FCBASEOPTS) > ESMF_LIB_FLAGS = > ESMF_IO_LIB = 
$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a > ESMF_IO_LIB_EXT = -L$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a > INCLUDE_MODULES = $(MODULE_SRCH_FLAG) \ > $(ESMF_MOD_INC) $(ESMF_LIB_FLAGS) \ > -I$(WRF_SRC_ROOT_DIR)/main \ > -I$(WRF_SRC_ROOT_DIR)/external/io_netcdf \ > -I$(WRF_SRC_ROOT_DIR)/external/io_int \ > -I$(WRF_SRC_ROOT_DIR)/frame \ > -I$(WRF_SRC_ROOT_DIR)/share \ > -I$(WRF_SRC_ROOT_DIR)/phys \ > -I$(WRF_SRC_ROOT_DIR)/chem -I$(WRF_SRC_ROOT_DIR)/inc \ > \ > > REGISTRY = Registry > > LIB_BUNDLED = \ > -L$(WRF_SRC_ROOT_DIR)/external/fftpack/fftpack5 -lfftpack \ > -L$(WRF_SRC_ROOT_DIR)/external/io_grib1 -lio_grib1 \ > -L$(WRF_SRC_ROOT_DIR)/external/io_grib_share -lio_grib_share \ > -L$(WRF_SRC_ROOT_DIR)/external/io_int -lwrfio_int \ > $(ESMF_IO_LIB) \ > $(ESMF_IO_LIB) \ > $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a \ > $(WRF_SRC_ROOT_DIR)/frame/module_internal_header_util.o \ > $(WRF_SRC_ROOT_DIR)/frame/pack_utils.o > > LIB_EXTERNAL = \ > > $(WRF_SRC_ROOT_DIR)/external/io_netcdf/libwrfio_nf.a -L/usr/local/lib -lnetcdff -lnetcdf > > LIB = $(LIB_BUNDLED) $(LIB_EXTERNAL) $(LIB_LOCAL) > LDFLAGS = $(OMP) $(FCFLAGS) $(LDFLAGS_LOCAL) > ENVCOMPDEFS = > WRF_CHEM = 0 > CPPFLAGS = $(ARCHFLAGS) $(ENVCOMPDEFS) -I$(LIBINCLUDE) $(TRADFLAG) > NETCDFPATH = /usr/local > PNETCDFPATH = > > bundled: wrf_ioapi_includes wrfio_grib_share wrfio_grib1 wrfio_int esmf_time fftpack > external: wrfio_nf $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a gen_comms_rsllite module_dm_rsllite $(ESMF_TARGET) > > ###################### > externals: bundled external > > gen_comms_serial : > ( /bin/rm -f $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ) > > module_dm_serial : > ( if [ ! -e module_dm.F ] ; then /bin/cp module_dm_warning module_dm.F ; cat module_dm_stubs.F >> module_dm.F ; fi ) > > gen_comms_rsllite : > ( if [ ! 
-e $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ] ; then \ > /bin/cp $(WRF_SRC_ROOT_DIR)/tools/gen_comms_warning $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ; \ > cat $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/gen_comms.c >> $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ; fi ) > > module_dm_rsllite : > ( if [ ! -e module_dm.F ] ; then /bin/cp module_dm_warning module_dm.F ; \ > cat $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/module_dm.F >> module_dm.F ; fi ) > > wrfio_nf : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_netcdf ; \ > make NETCDFPATH="$(NETCDFPATH)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > CC="$(SCC)" CFLAGS="$(CFLAGS)" \ > FC="$(SFC) $(PROMOTION) $(FCFLAGS)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > wrfio_pnf : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_pnetcdf ; \ > make NETCDFPATH="$(PNETCDFPATH)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > FC="$(FC) $(PROMOTION) $(FCFLAGS)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > > > wrfio_grib_share : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib_share ; \ > make CC="$(SCC)" CFLAGS="$(CFLAGS)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" archive) > > wrfio_grib1 : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib1 ; \ > make CC="$(SCC)" CFLAGS="$(CFLAGS)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" archive) > > wrfio_grib2 : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib2 ; \ > make CC="$(SCC)" CFLAGS="$(CFLAGS) " RM="$(RM)" RANLIB="$(RANLIB)" \ > CPP="$(CPP)" \ > FC="$(SFC) $(PROMOTION) -I. 
$(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="-traditional" AR="$(AR)" ARFLAGS="$(ARFLAGS)" \ > FIXED="$(FORMAT_FIXED)" archive) > > wrfio_int : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_int ; \ > make CC="$(CC)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > FC="$(SFC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" \ > TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" all ) > > esmf_time : > ( cd $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90 ; \ > make FC="$(SFC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" RANLIB="$(RANLIB)" \ > CPP="$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc -I. $(ARCHFLAGS) $(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > fftpack : > ( cd $(WRF_SRC_ROOT_DIR)/external/fftpack/fftpack5 ; \ > make FC="$(SFC)" FFLAGS="$(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" RANLIB="$(RANLIB)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a : > ( cd $(WRF_SRC_ROOT_DIR)/external/RSL_LITE ; make CC="$(CC) $(CFLAGS)" \ > FC="$(FC) $(FCFLAGS) $(PROMOTION) $(BYTESWAPIO)" \ > CPP="$(CPP) -I. 
$(ARCHFLAGS) $(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ;\ > $(RANLIB) $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a ) > > > LN = ln -sf > MAKE = make -i -r > RM = rm -f > > > # These sub-directory builds are identical across all architectures > > wrf_ioapi_includes : > ( cd $(WRF_SRC_ROOT_DIR)/external/ioapi_share ; \ > $(MAKE) NATIVE_RWORDSIZE="$(NATIVE_RWORDSIZE)" RWORDSIZE="$(RWORDSIZE)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > wrfio_esmf : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_esmf ; \ > make FC="$(FC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS) $(ESMF_MOD_INC)" \ > RANLIB="$(RANLIB)" CPP="$(CPP) $(POUND_DEF) " AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > # There is probably no reason to modify these rules > > .F.i: > $(RM) $@ > $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $*.F > $@ > mv $*.i $(DEVTOP)/pick/$*.f90 > cp $*.F $(DEVTOP)/pick > > .F.o: > $(RM) $@ > $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.F > $*.bb > $(SED_FTN) $*.bb | $(CPP) > $*.f90 > $(RM) $*.b $*.bb > if $(FGREP) '!$$OMP' $*.f90 ; then \ > if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \ > $(FC) -o $@ -c $(FCFLAGS) $(OMP) $(MODULE_DIRS) $(PROMOTION) $(FCSUFFIX) $*.f90 ; \ > else \ > if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \ > $(FC) -o $@ -c $(FCFLAGS) $(MODULE_DIRS) $(PROMOTION) $(FCSUFFIX) $*.f90 ; \ > fi > > > .F.f90: > $(RM) $@ > $(SED_FTN) $*.F > $*.b > $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $*.b > $@ > $(RM) $*.b > > > .f90.o: > $(RM) $@ > $(FC) -o $@ -c $(FCFLAGS) $(PROMOTION) $(FCSUFFIX) $*.f90 > > .c.o: > $(RM) $@ > $(CC) -o $@ -c $(CFLAGS) $*.c > > # A little more adventurous. 
Allow full opt on > # mediation_integrate.o \ > # shift_domain_em.o \ > # solve_em.o <-- gets a little kick from SOLVE_EM_SPECIAL too, if defined > # mediation_feedback_domain.o : mediation_feedback_domain.F > # mediation_force_domain.o : mediation_force_domain.F > # mediation_interp_domain.o : mediation_interp_domain.F > > # compile these without high optimization to speed compile > convert_nmm.o : convert_nmm.F > init_modules_em.o : init_modules_em.F > input_wrf.o : input_wrf.F > module_io.o : module_io.F > module_comm_dm.o : module_comm_dm.F > module_configure.o : module_configure.F > module_dm.o : module_dm.F > module_domain.o : module_domain.F > module_domain_type.o : module_domain_type.F > module_alloc_space.o : module_alloc_space.F > module_tiles.o : module_tiles.F > module_fddaobs_rtfdda.o : module_fddaobs_rtfdda.F > module_initialize.o : module_initialize.F > module_physics_init.o : module_physics_init.F > module_initialize_b_wave.o : module_initialize_b_wave.F > module_initialize_hill2d_x.o : module_initialize_hill2d_x.F > module_initialize_quarter_ss.o : module_initialize_quarter_ss.F > module_initialize_real.o : module_initialize_real.F > module_initialize_real.o: module_initialize_real.F > module_initialize_squall2d_x.o : module_initialize_squall2d_x.F > module_initialize_squall2d_y.o : module_initialize_squall2d_y.F > module_integrate.o : module_integrate.F > module_io_mm5.o : module_io_mm5.F > module_io_wrf.o : module_io_wrf.F > module_si_io.o : module_si_io.F > module_state_description.o : module_state_description.F > output_wrf.o : output_wrf.F > > > NMM_NEST_UTILS1.o : NMM_NEST_UTILS1.F > solve_interface.o : solve_interface.F > start_domain.o : start_domain.F > start_domain_nmm.o : start_domain_nmm.F > start_em.o : start_em.F > wrf_auxhist10in.o : wrf_auxhist10in.F > wrf_auxhist10out.o : wrf_auxhist10out.F > wrf_auxhist11in.o : wrf_auxhist11in.F > wrf_auxhist11out.o : wrf_auxhist11out.F > wrf_auxhist1in.o : wrf_auxhist1in.F > wrf_auxhist1out.o : 
wrf_auxhist1out.F > wrf_auxhist2in.o : wrf_auxhist2in.F > wrf_auxhist2out.o : wrf_auxhist2out.F > wrf_auxhist3in.o : wrf_auxhist3in.F > wrf_auxhist3out.o : wrf_auxhist3out.F > wrf_auxhist4in.o : wrf_auxhist4in.F > wrf_auxhist4out.o : wrf_auxhist4out.F > wrf_auxhist5in.o : wrf_auxhist5in.F > wrf_auxhist5out.o : wrf_auxhist5out.F > wrf_auxhist6in.o : wrf_auxhist6in.F > wrf_auxhist6out.o : wrf_auxhist6out.F > wrf_auxhist7in.o : wrf_auxhist7in.F > wrf_auxhist7out.o : wrf_auxhist7out.F > wrf_auxhist8in.o : wrf_auxhist8in.F > wrf_auxhist8out.o : wrf_auxhist8out.F > wrf_auxhist9in.o : wrf_auxhist9in.F > wrf_auxhist9out.o : wrf_auxhist9out.F > wrf_auxinput10in.o : wrf_auxinput10in.F > wrf_auxinput10out.o : wrf_auxinput10out.F > wrf_auxinput11in.o : wrf_auxinput11in.F > wrf_auxinput11out.o : wrf_auxinput11out.F > wrf_auxinput1in.o : wrf_auxinput1in.F > wrf_auxinput1out.o : wrf_auxinput1out.F > wrf_auxinput2in.o : wrf_auxinput2in.F > wrf_auxinput2out.o : wrf_auxinput2out.F > wrf_auxinput3in.o : wrf_auxinput3in.F > wrf_auxinput3out.o : wrf_auxinput3out.F > wrf_auxinput4in.o : wrf_auxinput4in.F > wrf_auxinput4out.o : wrf_auxinput4out.F > wrf_auxinput5in.o : wrf_auxinput5in.F > wrf_auxinput5out.o : wrf_auxinput5out.F > wrf_auxinput6in.o : wrf_auxinput6in.F > wrf_auxinput6out.o : wrf_auxinput6out.F > wrf_auxinput7in.o : wrf_auxinput7in.F > > wrf_auxinput7out.o : wrf_auxinput7out.F > wrf_auxinput8in.o : wrf_auxinput8in.F > wrf_auxinput8out.o : wrf_auxinput8out.F > wrf_auxinput9in.o : wrf_auxinput9in.F > wrf_auxinput9out.o : wrf_auxinput9out.F > wrf_bdyin.o : wrf_bdyin.F > wrf_bdyout.o : wrf_bdyout.F > wrf_ext_read_field.o : wrf_ext_read_field.F > wrf_ext_write_field.o : wrf_ext_write_field.F > wrf_fddaobs_in.o : wrf_fddaobs_in.F > wrf_histin.o : wrf_histin.F > wrf_histout.o : wrf_histout.F > wrf_inputin.o : wrf_inputin.F > wrf_inputout.o : wrf_inputout.F > wrf_restartin.o : wrf_restartin.F > wrf_restartout.o : wrf_restartout.F > wrf_tsin.o : wrf_tsin.F > nl_get_0_routines.o : 
nl_get_0_routines.F > nl_get_1_routines.o : nl_get_1_routines.F > nl_set_0_routines.o : nl_set_0_routines.F > nl_set_1_routines.o : nl_set_1_routines.F > > convert_nmm.o \ > init_modules_em.o \ > module_dm.o \ > module_fddaobs_rtfdda.o \ > module_initialize.o \ > module_initialize_b_wave.o \ > module_initialize_hill2d_x.o \ > module_initialize_quarter_ss.o \ > module_initialize_real.o \ > module_initialize_squall2d_x.o \ > module_initialize_squall2d_y.o \ > module_integrate.o \ > module_io_mm5.o \ > module_io_wrf.o \ > module_si_io.o \ > module_tiles.o \ > output_wrf.o \ > NMM_NEST_UTILS1.o \ > solve_interface.o \ > start_domain.o \ > start_domain_nmm.o \ > shift_domain_nmm.o \ > > start_em.o \ > wrf_fddaobs_in.o \ > wrf_tsin.o : > $(RM) $@ > $(SED_FTN) $*.F > $*.b > $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.b > $*.f90 > $(RM) $*.b > if $(FGREP) '!$$OMP' $*.f90 ; then \ > if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \ > $(FC) -c $(PROMOTION) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $(OMP) $*.f90 ; \ > else \ > if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \ > $(FC) -c $(PROMOTION) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $*.f90 ; \ > fi > > module_sf_ruclsm.o : module_sf_ruclsm.F > > module_sf_ruclsm.o : > $(RM) $@ > $(SED_FTN) $*.F > $*.b > $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.b > $*.f90 > $(RM) $*.b > if $(FGREP) '!$$OMP' $*.f90 ; then \ > echo COMPILING $*.F WITH OMP ; \ > if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \ > $(FC) -c $(PROMOTION) $(FCREDUCEDOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $(OMP) $*.f90 ; \ > else \ > if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \ > $(FC) -c $(PROMOTION) $(FCREDUCEDOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $*.f90 ; \ > fi > > input_wrf.o \ > module_domain.o \ > module_domain_type.o \ > module_physics_init.o \ > module_io.o \ > > wrf_auxhist10in.o \ > wrf_auxhist10out.o \ > 
wrf_auxhist11in.o \ > wrf_auxhist11out.o \ > wrf_auxhist1in.o \ > wrf_auxhist1out.o \ > wrf_auxhist2in.o \ > wrf_auxhist2out.o \ > wrf_auxhist3in.o \ > wrf_auxhist3out.o \ > wrf_auxhist4in.o \ > wrf_auxhist4out.o \ > wrf_auxhist5in.o \ > wrf_auxhist5out.o \ > wrf_auxhist6in.o \ > wrf_auxhist6out.o \ > wrf_auxhist7in.o \ > wrf_auxhist7out.o \ > wrf_auxhist8in.o \ > wrf_auxhist8out.o \ > wrf_auxhist9in.o \ > wrf_auxhist9out.o \ > wrf_auxinput10in.o \ > wrf_auxinput10out.o \ > wrf_auxinput11in.o \ > wrf_auxinput11out.o \ > wrf_auxinput1in.o \ > wrf_auxinput1out.o \ > wrf_auxinput2in.o \ > wrf_auxinput2out.o \ > wrf_auxinput3in.o \ > wrf_auxinput3out.o \ > wrf_auxinput4in.o \ > wrf_auxinput4out.o \ > wrf_auxinput5in.o \ > wrf_auxinput5out.o \ > wrf_auxinput6in.o \ > wrf_auxinput6out.o \ > > wrf_auxinput7in.o \ > wrf_auxinput7out.o \ > wrf_auxinput8in.o \ > wrf_auxinput8out.o \ > wrf_auxinput9in.o \ > wrf_auxinput9out.o \ > wrf_bdyin.o \ > wrf_bdyout.o \ > wrf_ext_read_field.o \ > wrf_ext_write_field.o \ > wrf_histin.o \ > wrf_histout.o \ > wrf_inputin.o \ > wrf_inputout.o \ > wrf_restartin.o \ > wrf_restartout.o \ > module_state_description.o \ > nl_set_0_routines.o \ > nl_set_1_routines.o \ > nl_get_0_routines.o \ > nl_get_1_routines.o \ > module_alloc_space.o \ > module_comm_dm.o \ > module_configure.o : > $(RM) $@ > $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.F > $*.bb > $(SED_FTN) $*.bb | $(CPP) > $*.f90 > $(RM) $*.b $*.bb > $(FC) -c $(PROMOTION) $(FCSUFFIX) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $*.f90 > > > Environmental Variables: > > > USER=dbh409 > LOGNAME=dbh409 > HOME=/home/dbh409 > 
PATH=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0/bin:/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/bin:/usr/local/intel/11.0/084/bin/intel64:/usr/local/intel/11.0/084/bin/intel64:/usr/local/maui/bin:/usr/local/maui/sbin:/usr/local/torque/bin:/usr/local/torque/sbin:/usr/local/Modules/3.2.6/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/home/software/netCDF/3.6.3/intel/mvapich/bin:/usr/local/ncarg_ifort/bin:. > MAIL=/var/spool/mail/dbh409 > SHELL=/bin/tcsh > SSH_CLIENT=10.237.0.175 49359 22 > SSH_CONNECTION=10.237.0.175 49359 129.237.228.235 22 > SSH_TTY=/dev/pts/3 > TERM=xterm > HOSTTYPE=x86_64-linux > VENDOR=unknown > OSTYPE=linux > MACHTYPE=x86_64 > SHLVL=1 > PWD=/home/dbh409/WRFV3 > GROUP=dbh409 > HOST=pequod > REMOTEHOST=10.237.0.175 > HOSTNAME=pequod > INPUTRC=/etc/inputrc > LS_COLORS=no=00:fi=00:di=01;34:ln=01;36:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.gz=01;31:*.bz2=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.mpg=01;35:*.mpeg=01;35:*.avi=01;35:*.fli=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.ogg=01;35:*.mp3=01;35:*.wav=01;35: > G_BROKEN_FILENAMES=1 > SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass > KDE_IS_PRELINKED=1 > KDEDIR=/usr > LANG=en_US.UTF-8 > LESSOPEN=|/usr/bin/lesspipe.sh %s > MANPATH=/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/share/man:/usr/local/intel/11.0/084/man:/usr/local/intel/11.0/084/man:/usr/local/maui/man:/usr/local/torque/man:/usr/local/Modules/3.2.6/man:/usr/man:/usr/share/man:/usr/local/man:/usr/local/share/man:/usr/X11R6/man > QTDIR=/usr/lib64/qt-3.3 > QTINC=/usr/lib64/qt-3.3/include > QTLIB=/usr/lib64/qt-3.3/lib > MODULE_VERSION=3.2.6 > MODULE_VERSION_STACK=3.2.6 > 
MODULESHOME=/usr/local/Modules/3.2.6 > MODULEPATH=/usr/local/Modules/versions:/usr/local/Modules/$MODULE_VERSION/modulefiles:/usr/local/Modules/modulefiles: > LOADEDMODULES=null:modules:tools/torque-maui:compilers/64/intel-x86_64:openmpi/openmpi-1.3.2-ethernet-intel.64:tools/netcdf-4.1.3-intel:mpich2/mpich2-1.0.7-ethernet-intel.64 > CC=/usr/local/intel/11.0/084/bin/intel64/icc > CCHOME=/usr/local/intel/11.0/084 > CXX=/usr/local/intel/11.0/084/bin/intel64/icpc > F77=/usr/local/intel/11.0/084/bin/intel64/ifort > F90=/usr/local/intel/11.0/084/bin/intel64/ifort > FC=/usr/local/intel/11.0/084/bin/intel64/ifort > FCHOME=/usr/local/intel/11.0/084 > INTEL_LICENSE_FILE=/opt/intel/licenses/l_RH4JJF9H.lic > LD_LIBRARY_PATH=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0/lib:/usr/local/lib:/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/lib:/usr/local/intel/11.0/084/lib/intel64:/usr/local/maui/lib:/usr/local/torque/lib > MPIHOME=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0 > NETCDF=/usr/local > _LMFILES_=/usr/local/Modules/modulefiles/null:/usr/local/Modules/modulefiles/modules:/usr/local/Modules/modulefiles/tools/torque-maui:/usr/local/Modules/modulefiles/compilers/64/intel-x86_64:/usr/local/Modules/modulefiles/openmpi/openmpi-1.3.2-ethernet-intel.64:/usr/local/Modules/modulefiles/tools/netcdf-4.1.3-intel:/usr/local/Modules/modulefiles/mpich2/mpich2-1.0.7-ethernet-intel.64 > NCARG_ROOT=/usr/local/ncarg_ifort > RIP_ROOT=/home/dbh409/RIP4 > ITT_DIR=/usr/local/itt > IDL_DIR=/usr/local/itt/idl71 > > > The NetCDF build location is /usr/local/ a slight accident, but I don't think it makes a difference that it's not /usr/local/netcdf-4.1.3. > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > From maemarcus at gmail.com Tue Aug 9 09:44:48 2011 From: maemarcus at gmail.com (Dmitry N. 
Mikushin) Date: Tue, 9 Aug 2011 19:44:48 +0400 Subject: [Wrf-users] WRF Compilation Error In-Reply-To: References: Message-ID: Hi, > I'm not certain if this is an issue with NetCDF, the shared object, or what the deal with feupdateenv is about. - This is usually harmless and is always around with the Intel compiler. > /usr/local/intel/11.0/084/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail > wrf_io.o: In function `ext_ncd_get_var_info_': > wrf_io.f:(.text+0x1ef): undefined reference to `nf_inq_varid_' > wrf_io.f:(.text+0x740): undefined reference to `nf_inq_vartype_' > wrf_io.f:(.text+0x883): undefined reference to `nf_get_att_int_' > wrf_io.f:(.text+0x1158): undefined reference to `nf_get_att_text_' > wrf_io.f:(.text+0x142e): undefined reference to `nf_inq_vardimid_' > wrf_io.f:(.text+0x1595): undefined reference to `nf_inq_dimlen_' These are unresolved externals, which normally occur when the proper library is not added to the link command. Could you print out the exact linking command it is using? > diffwrf io_netcdf is being built now. - I suppose the linking command appears in the Makefile right after the line echoing "diffwrf io_netcdf is being built now". If it starts with "@", then it's hidden. Remove "@", and you will see the command. Once the exact command is known, it should be much easier to understand what's wrong. - D. 2011/8/9 Huber, David : > Hello, > > I'm having an issue compiling WRF V3.1.1 with the Intel compilers using a multi-processor (dmpar) build. NCAR Graphics (6.0.0) and NetCDF (4.1.3) are also built with the Intel compilers. The configure.wrf file contents (minus comments) and a list of the environmental variables are shown at the end of this message. The following is the first of several warning/undefined reference messages: > > > ifort -w -ftz -align all -fno-alias -fp-model precise -FR -convert big_endian -c -I/usr/local/include -I../ioapi_share diffwrf.f > diffwrf.f(1629): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1630): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1709): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1710): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1711): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf.f(1712): (col. 7) remark: LOOP WAS VECTORIZED. > diffwrf io_netcdf is being built now. > /usr/local/intel/11.0/084/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail > wrf_io.o: In function `ext_ncd_get_var_info_': > wrf_io.f:(.text+0x1ef): undefined reference to `nf_inq_varid_' > wrf_io.f:(.text+0x740): undefined reference to `nf_inq_vartype_' > wrf_io.f:(.text+0x883): undefined reference to `nf_get_att_int_' > wrf_io.f:(.text+0x1158): undefined reference to `nf_get_att_text_' > wrf_io.f:(.text+0x142e): undefined reference to `nf_inq_vardimid_' > wrf_io.f:(.text+0x1595): undefined reference to `nf_inq_dimlen_' > > > I'm not certain if this is an issue with NetCDF, the shared object, or what the deal with feupdateenv is about. I have successfully built WRF and WPS on this system before, but this is the first time I have tried it with everything compiled with icc and ifort. Any suggestions would be much appreciated. > > Thanks, > > Dave > > > configure.wrf: > > > SHELL = /bin/sh > DEVTOP = `pwd` > LIBINCLUDE = . > .SUFFIXES: .F .i .o .f90 .c > > COREDEFS = -DEM_CORE=$(WRF_EM_CORE) \ > -DNMM_CORE=$(WRF_NMM_CORE) -DNMM_MAX_DIM=2600 \ > -DCOAMPS_CORE=$(WRF_COAMPS_CORE) \ > -DDA_CORE=$(WRF_DA_CORE) \ > > -DEXP_CORE=$(WRF_EXP_CORE) > > > MAX_DOMAINS = 21 > > CONFIG_BUF_LEN = 32768 > > > NATIVE_RWORDSIZE = 4 > > SED_FTN = $(WRF_SRC_ROOT_DIR)/tools/standard.exe > > IO_GRIB_SHARE_DIR = > > ESMF_COUPLING = 0 > ESMF_MOD_DEPENDENCE = $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/module_utility.o > > ESMF_IO_INC = -I$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90 > ESMF_MOD_INC = $(ESMF_IO_INC) > ESMF_IO_DEFS = > ESMF_TARGET = esmf_time > > > LIBWRFLIB = libwrflib.a > > > DMPARALLEL = 1 > OMPCPP = # -D_OPENMP > OMP = # -openmp -fpp -auto > SFC = ifort > SCC = icc > DM_FC = mpif90 -f90=$(SFC) > DM_CC = mpicc -cc=$(SCC) -DMPI2_SUPPORT > FC = $(DM_FC) > CC = $(DM_CC) -DFSEEKO64_OK > LD = $(FC) > RWORDSIZE = $(NATIVE_RWORDSIZE) > PROMOTION = -i4 > ARCH_LOCAL = -DNONSTANDARD_SYSTEM_FUNC > CFLAGS_LOCAL = -w -O3 -ip > LDFLAGS_LOCAL = -ip > CPLUSPLUSLIB = > ESMF_LDFLAG = $(CPLUSPLUSLIB) > FCOPTIM = -O3 > FCREDUCEDOPT = $(FCOPTIM) > FCNOOPT = -O0 -fno-inline -fno-ip > FCDEBUG = # -g $(FCNOOPT) -traceback > FORMAT_FIXED = -FI > FORMAT_FREE = -FR > FCSUFFIX = > BYTESWAPIO = -convert big_endian > FCBASEOPTS = -w -ftz -align all -fno-alias -fp-model precise $(FCDEBUG) $(FORMAT_FREE) $(BYTESWAPIO) > MODULE_SRCH_FLAG = > TRADFLAG = -traditional > CPP = /lib/cpp -C -P > AR = ar > ARFLAGS = ru > M4 = m4 > RANLIB = ranlib > CC_TOOLS = $(SCC) > > FGREP = fgrep -iq > > ARCHFLAGS = $(COREDEFS) -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=$(RWORDSIZE) -DLWORDSIZE=4 \ > $(ARCH_LOCAL) \ > $(DA_ARCHFLAGS) \ > -DDM_PARALLEL \ > > \ > -DNETCDF \ > \ > \ > \ > \ > \ > \ > -DGRIB1 \ > -DINTIO \ > -DLIMIT_ARGS \ > -DCONFIG_BUF_LEN=$(CONFIG_BUF_LEN) \ > -DMAX_DOMAINS_F=$(MAX_DOMAINS) \ > -DNMM_NEST=$(WRF_NMM_NEST) > CFLAGS = $(CFLAGS_LOCAL) -DDM_PARALLEL > FCFLAGS = $(FCOPTIM) $(FCBASEOPTS) > ESMF_LIB_FLAGS = > ESMF_IO_LIB = $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a > ESMF_IO_LIB_EXT = -L$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a > INCLUDE_MODULES = $(MODULE_SRCH_FLAG) \ > $(ESMF_MOD_INC) $(ESMF_LIB_FLAGS) \ > -I$(WRF_SRC_ROOT_DIR)/main \ > -I$(WRF_SRC_ROOT_DIR)/external/io_netcdf \ > -I$(WRF_SRC_ROOT_DIR)/external/io_int \ > -I$(WRF_SRC_ROOT_DIR)/frame \ > -I$(WRF_SRC_ROOT_DIR)/share \ > -I$(WRF_SRC_ROOT_DIR)/phys \ > -I$(WRF_SRC_ROOT_DIR)/chem -I$(WRF_SRC_ROOT_DIR)/inc \ > \ > > REGISTRY = Registry > > LIB_BUNDLED = \ > -L$(WRF_SRC_ROOT_DIR)/external/fftpack/fftpack5 -lfftpack \ > -L$(WRF_SRC_ROOT_DIR)/external/io_grib1 -lio_grib1 \ > -L$(WRF_SRC_ROOT_DIR)/external/io_grib_share -lio_grib_share \ > -L$(WRF_SRC_ROOT_DIR)/external/io_int -lwrfio_int \ > $(ESMF_IO_LIB) \ > $(ESMF_IO_LIB) \ > $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a \ > $(WRF_SRC_ROOT_DIR)/frame/module_internal_header_util.o \ > $(WRF_SRC_ROOT_DIR)/frame/pack_utils.o > > LIB_EXTERNAL = \ > > $(WRF_SRC_ROOT_DIR)/external/io_netcdf/libwrfio_nf.a -L/usr/local/lib -lnetcdff -lnetcdf > > LIB = $(LIB_BUNDLED) $(LIB_EXTERNAL) $(LIB_LOCAL) > LDFLAGS = $(OMP) $(FCFLAGS) $(LDFLAGS_LOCAL) > ENVCOMPDEFS = > WRF_CHEM = 0 > CPPFLAGS = $(ARCHFLAGS) $(ENVCOMPDEFS) -I$(LIBINCLUDE) $(TRADFLAG) > NETCDFPATH = /usr/local > PNETCDFPATH = > > bundled: wrf_ioapi_includes wrfio_grib_share wrfio_grib1 wrfio_int esmf_time fftpack > external: wrfio_nf $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a gen_comms_rsllite module_dm_rsllite $(ESMF_TARGET) > > ###################### > externals: bundled external > > gen_comms_serial : > ( /bin/rm -f $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ) > > module_dm_serial : > ( if [ ! -e module_dm.F ] ; then /bin/cp module_dm_warning module_dm.F ; cat module_dm_stubs.F >> module_dm.F ; fi ) > > gen_comms_rsllite : > ( if [ ! -e $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ] ; then \ > /bin/cp $(WRF_SRC_ROOT_DIR)/tools/gen_comms_warning $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ; \ > cat $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/gen_comms.c >> $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ; fi ) > > module_dm_rsllite : > ( if [ ! -e module_dm.F ] ; then /bin/cp module_dm_warning module_dm.F ; \ > cat $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/module_dm.F >> module_dm.F ; fi ) > > wrfio_nf : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_netcdf ; \ > make NETCDFPATH="$(NETCDFPATH)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > CC="$(SCC)" CFLAGS="$(CFLAGS)" \ > FC="$(SFC) $(PROMOTION) $(FCFLAGS)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > wrfio_pnf : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_pnetcdf ; \ > make NETCDFPATH="$(PNETCDFPATH)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > FC="$(FC) $(PROMOTION) $(FCFLAGS)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > > > wrfio_grib_share : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib_share ; \ > make CC="$(SCC)" CFLAGS="$(CFLAGS)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" archive) > > wrfio_grib1 : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib1 ; \ > make CC="$(SCC)" CFLAGS="$(CFLAGS)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" archive) > > wrfio_grib2 : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib2 ; \ > make CC="$(SCC)" CFLAGS="$(CFLAGS) " RM="$(RM)" RANLIB="$(RANLIB)" \ > CPP="$(CPP)" \ > FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="-traditional" AR="$(AR)" ARFLAGS="$(ARFLAGS)" \ > FIXED="$(FORMAT_FIXED)" archive) > > wrfio_int : > ( cd $(WRF_SRC_ROOT_DIR)/external/io_int ; \ > make CC="$(CC)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \ > FC="$(SFC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" \ > TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" all ) > > esmf_time : > ( cd $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90 ; \ > make FC="$(SFC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" RANLIB="$(RANLIB)" \ > CPP="$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc -I. $(ARCHFLAGS) $(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > fftpack : > ( cd $(WRF_SRC_ROOT_DIR)/external/fftpack/fftpack5 ; \ > make FC="$(SFC)" FFLAGS="$(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" RANLIB="$(RANLIB)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a : > ( cd $(WRF_SRC_ROOT_DIR)/external/RSL_LITE ; make CC="$(CC) $(CFLAGS)" \ > FC="$(FC) $(FCFLAGS) $(PROMOTION) $(BYTESWAPIO)" \ > CPP="$(CPP) -I. $(ARCHFLAGS) $(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ;\ > $(RANLIB) $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a ) > > > LN = ln -sf > MAKE = make -i -r > RM = 
rm -f > > > # These sub-directory builds are identical across all architectures > > wrf_ioapi_includes : > ? ? ? ?( cd $(WRF_SRC_ROOT_DIR)/external/ioapi_share ; \ > ? ? ? ? ?$(MAKE) NATIVE_RWORDSIZE="$(NATIVE_RWORDSIZE)" RWORDSIZE="$(RWORDSIZE)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > wrfio_esmf : > ? ? ? ?( cd $(WRF_SRC_ROOT_DIR)/external/io_esmf ; \ > ? ? ? ? ?make FC="$(FC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS) $(ESMF_MOD_INC)" \ > ? ? ? ? ?RANLIB="$(RANLIB)" CPP="$(CPP) $(POUND_DEF) " AR="$(AR)" ARFLAGS="$(ARFLAGS)" ) > > # ? ? ? There is probably no reason to modify these rules > > .F.i: > ? ? ? ?$(RM) $@ > ? ? ? ?$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $*.F > $@ > ? ? ? ?mv $*.i $(DEVTOP)/pick/$*.f90 > ? ? ? ?cp $*.F $(DEVTOP)/pick > > .F.o: > ? ? ? ?$(RM) $@ > ? ? ? ?$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.F ?> $*.bb > ? ? ? ?$(SED_FTN) $*.bb | $(CPP) > $*.f90 > ? ? ? ?$(RM) $*.b $*.bb > ? ? ? ?if $(FGREP) '!$$OMP' $*.f90 ; then \ > ? ? ? ? ?if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \ > ? ? ? ? ?$(FC) -o $@ -c $(FCFLAGS) $(OMP) $(MODULE_DIRS) $(PROMOTION) $(FCSUFFIX) $*.f90 ; \ > ? ? ? ?else \ > ? ? ? ? ?if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \ > ? ? ? ? ?$(FC) -o $@ -c $(FCFLAGS) $(MODULE_DIRS) $(PROMOTION) $(FCSUFFIX) $*.f90 ; \ > ? ? ? ?fi > > > .F.f90: > ? ? ? ?$(RM) $@ > ? ? ? ?$(SED_FTN) $*.F > $*.b > ? ? ? ?$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $*.b ?> $@ > ? ? ? ?$(RM) $*.b > > > .f90.o: > ? ? ? ?$(RM) $@ > ? ? ? ?$(FC) -o $@ -c $(FCFLAGS) $(PROMOTION) $(FCSUFFIX) $*.f90 > > .c.o: > ? ? ? ?$(RM) $@ > ? ? ? ?$(CC) -o $@ -c $(CFLAGS) $*.c > > # A little more adventurous. 
?Allow full opt on > # mediation_integrate.o \ > # shift_domain_em.o \ > # solve_em.o ?<-- gets a little kick from SOLVE_EM_SPECIAL too, if defined > # mediation_feedback_domain.o : mediation_feedback_domain.F > # mediation_force_domain.o : mediation_force_domain.F > # mediation_interp_domain.o : mediation_interp_domain.F > > # compile these without high optimization to speed compile > convert_nmm.o : convert_nmm.F > init_modules_em.o : init_modules_em.F > input_wrf.o : input_wrf.F > module_io.o : module_io.F > module_comm_dm.o : module_comm_dm.F > module_configure.o : module_configure.F > module_dm.o : module_dm.F > module_domain.o : module_domain.F > module_domain_type.o : module_domain_type.F > module_alloc_space.o : module_alloc_space.F > module_tiles.o : module_tiles.F > module_fddaobs_rtfdda.o : module_fddaobs_rtfdda.F > module_initialize.o : module_initialize.F > module_physics_init.o : module_physics_init.F > module_initialize_b_wave.o : module_initialize_b_wave.F > module_initialize_hill2d_x.o : module_initialize_hill2d_x.F > module_initialize_quarter_ss.o : module_initialize_quarter_ss.F > module_initialize_real.o : module_initialize_real.F > module_initialize_real.o: module_initialize_real.F > module_initialize_squall2d_x.o : module_initialize_squall2d_x.F > module_initialize_squall2d_y.o : module_initialize_squall2d_y.F > module_integrate.o : module_integrate.F > module_io_mm5.o : module_io_mm5.F > module_io_wrf.o : module_io_wrf.F > module_si_io.o : module_si_io.F > module_state_description.o : module_state_description.F > output_wrf.o : output_wrf.F > > > NMM_NEST_UTILS1.o : NMM_NEST_UTILS1.F > solve_interface.o : solve_interface.F > start_domain.o : start_domain.F > start_domain_nmm.o : start_domain_nmm.F > start_em.o : start_em.F > wrf_auxhist10in.o : wrf_auxhist10in.F > wrf_auxhist10out.o : wrf_auxhist10out.F > wrf_auxhist11in.o : wrf_auxhist11in.F > wrf_auxhist11out.o : wrf_auxhist11out.F > wrf_auxhist1in.o : wrf_auxhist1in.F > wrf_auxhist1out.o : 
wrf_auxhist1out.F > wrf_auxhist2in.o : wrf_auxhist2in.F > wrf_auxhist2out.o : wrf_auxhist2out.F > wrf_auxhist3in.o : wrf_auxhist3in.F > wrf_auxhist3out.o : wrf_auxhist3out.F > wrf_auxhist4in.o : wrf_auxhist4in.F > wrf_auxhist4out.o : wrf_auxhist4out.F > wrf_auxhist5in.o : wrf_auxhist5in.F > wrf_auxhist5out.o : wrf_auxhist5out.F > wrf_auxhist6in.o : wrf_auxhist6in.F > wrf_auxhist6out.o : wrf_auxhist6out.F > wrf_auxhist7in.o : wrf_auxhist7in.F > wrf_auxhist7out.o : wrf_auxhist7out.F > wrf_auxhist8in.o : wrf_auxhist8in.F > wrf_auxhist8out.o : wrf_auxhist8out.F > wrf_auxhist9in.o : wrf_auxhist9in.F > wrf_auxhist9out.o : wrf_auxhist9out.F > wrf_auxinput10in.o : wrf_auxinput10in.F > wrf_auxinput10out.o : wrf_auxinput10out.F > wrf_auxinput11in.o : wrf_auxinput11in.F > wrf_auxinput11out.o : wrf_auxinput11out.F > wrf_auxinput1in.o : wrf_auxinput1in.F > wrf_auxinput1out.o : wrf_auxinput1out.F > wrf_auxinput2in.o : wrf_auxinput2in.F > wrf_auxinput2out.o : wrf_auxinput2out.F > wrf_auxinput3in.o : wrf_auxinput3in.F > wrf_auxinput3out.o : wrf_auxinput3out.F > wrf_auxinput4in.o : wrf_auxinput4in.F > wrf_auxinput4out.o : wrf_auxinput4out.F > wrf_auxinput5in.o : wrf_auxinput5in.F > wrf_auxinput5out.o : wrf_auxinput5out.F > wrf_auxinput6in.o : wrf_auxinput6in.F > wrf_auxinput6out.o : wrf_auxinput6out.F > wrf_auxinput7in.o : wrf_auxinput7in.F > > wrf_auxinput7out.o : wrf_auxinput7out.F > wrf_auxinput8in.o : wrf_auxinput8in.F > wrf_auxinput8out.o : wrf_auxinput8out.F > wrf_auxinput9in.o : wrf_auxinput9in.F > wrf_auxinput9out.o : wrf_auxinput9out.F > wrf_bdyin.o : wrf_bdyin.F > wrf_bdyout.o : wrf_bdyout.F > wrf_ext_read_field.o : wrf_ext_read_field.F > wrf_ext_write_field.o : wrf_ext_write_field.F > wrf_fddaobs_in.o : wrf_fddaobs_in.F > wrf_histin.o : wrf_histin.F > wrf_histout.o : wrf_histout.F > wrf_inputin.o : wrf_inputin.F > wrf_inputout.o : wrf_inputout.F > wrf_restartin.o : wrf_restartin.F > wrf_restartout.o : wrf_restartout.F > wrf_tsin.o : wrf_tsin.F > nl_get_0_routines.o : 
nl_get_0_routines.F > nl_get_1_routines.o : nl_get_1_routines.F > nl_set_0_routines.o : nl_set_0_routines.F > nl_set_1_routines.o : nl_set_1_routines.F > > convert_nmm.o \ > init_modules_em.o \ > module_dm.o \ > module_fddaobs_rtfdda.o \ > module_initialize.o \ > module_initialize_b_wave.o \ > module_initialize_hill2d_x.o \ > module_initialize_quarter_ss.o \ > module_initialize_real.o \ > module_initialize_squall2d_x.o \ > module_initialize_squall2d_y.o \ > module_integrate.o \ > module_io_mm5.o \ > module_io_wrf.o \ > module_si_io.o \ > module_tiles.o \ > output_wrf.o \ > NMM_NEST_UTILS1.o \ > solve_interface.o \ > start_domain.o \ > start_domain_nmm.o \ > shift_domain_nmm.o \ > > start_em.o \ > wrf_fddaobs_in.o \ > wrf_tsin.o : > ? ? ? ?$(RM) $@ > ? ? ? ?$(SED_FTN) $*.F > $*.b > ? ? ? ?$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.b ?> $*.f90 > ? ? ? ?$(RM) $*.b > ? ? ? ?if $(FGREP) '!$$OMP' $*.f90 ; then \ > ? ? ? ? ?if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \ > ? ? ? ? ?$(FC) -c $(PROMOTION) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $(OMP) $*.f90 ; \ > ? ? ? ?else \ > ? ? ? ? ?if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \ > ? ? ? ? ?$(FC) -c $(PROMOTION) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $*.f90 ; \ > ? ? ? ?fi > > module_sf_ruclsm.o : module_sf_ruclsm.F > > module_sf_ruclsm.o : > ? ? ? ?$(RM) $@ > ? ? ? ?$(SED_FTN) $*.F > $*.b > ? ? ? ?$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.b ?> $*.f90 > ? ? ? ?$(RM) $*.b > ? ? ? ?if $(FGREP) '!$$OMP' $*.f90 ; then \ > ? ? ? ? ?echo COMPILING $*.F WITH OMP ; \ > ? ? ? ? ?if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \ > ? ? ? ? ?$(FC) -c $(PROMOTION) $(FCREDUCEDOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $(OMP) $*.f90 ; \ > ? ? ? ?else \ > ? ? ? ? ?if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \ > ? ? ? ? 
?$(FC) -c $(PROMOTION) $(FCREDUCEDOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $*.f90 ; \ > ? ? ? ?fi > > input_wrf.o \ > module_domain.o \ > module_domain_type.o \ > module_physics_init.o \ > module_io.o \ > > wrf_auxhist10in.o \ > wrf_auxhist10out.o \ > wrf_auxhist11in.o \ > wrf_auxhist11out.o \ > wrf_auxhist1in.o \ > wrf_auxhist1out.o \ > wrf_auxhist2in.o \ > wrf_auxhist2out.o \ > wrf_auxhist3in.o \ > wrf_auxhist3out.o \ > wrf_auxhist4in.o \ > wrf_auxhist4out.o \ > wrf_auxhist5in.o \ > wrf_auxhist5out.o \ > wrf_auxhist6in.o \ > wrf_auxhist6out.o \ > wrf_auxhist7in.o \ > wrf_auxhist7out.o \ > wrf_auxhist8in.o \ > wrf_auxhist8out.o \ > wrf_auxhist9in.o \ > wrf_auxhist9out.o \ > wrf_auxinput10in.o \ > wrf_auxinput10out.o \ > wrf_auxinput11in.o \ > wrf_auxinput11out.o \ > wrf_auxinput1in.o \ > wrf_auxinput1out.o \ > wrf_auxinput2in.o \ > wrf_auxinput2out.o \ > wrf_auxinput3in.o \ > wrf_auxinput3out.o \ > wrf_auxinput4in.o \ > wrf_auxinput4out.o \ > wrf_auxinput5in.o \ > wrf_auxinput5out.o \ > wrf_auxinput6in.o \ > wrf_auxinput6out.o \ > > wrf_auxinput7in.o \ > wrf_auxinput7out.o \ > wrf_auxinput8in.o \ > wrf_auxinput8out.o \ > wrf_auxinput9in.o \ > wrf_auxinput9out.o \ > wrf_bdyin.o \ > wrf_bdyout.o \ > wrf_ext_read_field.o \ > wrf_ext_write_field.o \ > wrf_histin.o \ > wrf_histout.o \ > wrf_inputin.o \ > wrf_inputout.o \ > wrf_restartin.o \ > wrf_restartout.o \ > module_state_description.o \ > nl_set_0_routines.o \ > nl_set_1_routines.o \ > nl_get_0_routines.o \ > nl_get_1_routines.o \ > module_alloc_space.o \ > module_comm_dm.o \ > module_configure.o : > ? ? ? ?$(RM) $@ > ? ? ? ?$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.F ?> $*.bb > ? ? ? ?$(SED_FTN) $*.bb | $(CPP) > $*.f90 > ? ? ? ?$(RM) $*.b $*.bb > ? ? ? 
?$(FC) -c $(PROMOTION) $(FCSUFFIX) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $*.f90 > > > Environmental Variables: > > > USER=dbh409 > LOGNAME=dbh409 > HOME=/home/dbh409 > PATH=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0/bin:/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/bin:/usr/local/intel/11.0/084/bin/intel64:/usr/local/intel/11.0/084/bin/intel64:/usr/local/maui/bin:/usr/local/maui/sbin:/usr/local/torque/bin:/usr/local/torque/sbin:/usr/local/Modules/3.2.6/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/home/software/netCDF/3.6.3/intel/mvapich/bin:/usr/local/ncarg_ifort/bin:. > MAIL=/var/spool/mail/dbh409 > SHELL=/bin/tcsh > SSH_CLIENT=10.237.0.175 49359 22 > SSH_CONNECTION=10.237.0.175 49359 129.237.228.235 22 > SSH_TTY=/dev/pts/3 > TERM=xterm > HOSTTYPE=x86_64-linux > VENDOR=unknown > OSTYPE=linux > MACHTYPE=x86_64 > SHLVL=1 > PWD=/home/dbh409/WRFV3 > GROUP=dbh409 > HOST=pequod > REMOTEHOST=10.237.0.175 > HOSTNAME=pequod > INPUTRC=/etc/inputrc > LS_COLORS=no=00:fi=00:di=01;34:ln=01;36:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.gz=01;31:*.bz2=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.mpg=01;35:*.mpeg=01;35:*.avi=01;35:*.fli=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.ogg=01;35:*.mp3=01;35:*.wav=01;35: > G_BROKEN_FILENAMES=1 > SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass > KDE_IS_PRELINKED=1 > KDEDIR=/usr > LANG=en_US.UTF-8 > LESSOPEN=|/usr/bin/lesspipe.sh %s > 
MANPATH=/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/share/man:/usr/local/intel/11.0/084/man:/usr/local/intel/11.0/084/man:/usr/local/maui/man:/usr/local/torque/man:/usr/local/Modules/3.2.6/man:/usr/man:/usr/share/man:/usr/local/man:/usr/local/share/man:/usr/X11R6/man > QTDIR=/usr/lib64/qt-3.3 > QTINC=/usr/lib64/qt-3.3/include > QTLIB=/usr/lib64/qt-3.3/lib > MODULE_VERSION=3.2.6 > MODULE_VERSION_STACK=3.2.6 > MODULESHOME=/usr/local/Modules/3.2.6 > MODULEPATH=/usr/local/Modules/versions:/usr/local/Modules/$MODULE_VERSION/modulefiles:/usr/local/Modules/modulefiles: > LOADEDMODULES=null:modules:tools/torque-maui:compilers/64/intel-x86_64:openmpi/openmpi-1.3.2-ethernet-intel.64:tools/netcdf-4.1.3-intel:mpich2/mpich2-1.0.7-ethernet-intel.64 > CC=/usr/local/intel/11.0/084/bin/intel64/icc > CCHOME=/usr/local/intel/11.0/084 > CXX=/usr/local/intel/11.0/084/bin/intel64/icpc > F77=/usr/local/intel/11.0/084/bin/intel64/ifort > F90=/usr/local/intel/11.0/084/bin/intel64/ifort > FC=/usr/local/intel/11.0/084/bin/intel64/ifort > FCHOME=/usr/local/intel/11.0/084 > INTEL_LICENSE_FILE=/opt/intel/licenses/l_RH4JJF9H.lic > LD_LIBRARY_PATH=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0/lib:/usr/local/lib:/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/lib:/usr/local/intel/11.0/084/lib/intel64:/usr/local/maui/lib:/usr/local/torque/lib > MPIHOME=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0 > NETCDF=/usr/local > _LMFILES_=/usr/local/Modules/modulefiles/null:/usr/local/Modules/modulefiles/modules:/usr/local/Modules/modulefiles/tools/torque-maui:/usr/local/Modules/modulefiles/compilers/64/intel-x86_64:/usr/local/Modules/modulefiles/openmpi/openmpi-1.3.2-ethernet-intel.64:/usr/local/Modules/modulefiles/tools/netcdf-4.1.3-intel:/usr/local/Modules/modulefiles/mpich2/mpich2-1.0.7-ethernet-intel.64 > NCARG_ROOT=/usr/local/ncarg_ifort > RIP_ROOT=/home/dbh409/RIP4 > ITT_DIR=/usr/local/itt > IDL_DIR=/usr/local/itt/idl71 > > > The NetCDF build location is /usr/local/ ?a slight accident, but 
I don't think it makes a difference that it's not /usr/local/netcdf-4.1.3.
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>

From maemarcus at gmail.com  Tue Aug  9 11:21:24 2011
From: maemarcus at gmail.com (Dmitry N. Mikushin)
Date: Tue, 9 Aug 2011 21:21:24 +0400
Subject: [Wrf-users] WRF Compilation Error
In-Reply-To:
References:
Message-ID:

Good, let's look here:

if [ -f $(NETCDFPATH)/lib/libnetcdff.a ] ; then \
  $(FC) $(FFLAGS) $(LDFLAGS) -o diffwrf diffwrf.o $(OBJSL) bitwise_operators.o ../../frame/wrf_debug.o ../../frame/module_wrf_error.o $(ESMF_IO_LIB_EXT) $(LIBS) -lnetcdff ;\
else \
  $(FC) $(FFLAGS) $(LDFLAGS) -o diffwrf diffwrf.o $(OBJSL) bitwise_operators.o ../../frame/wrf_debug.o ../../frame/module_wrf_error.o $(ESMF_IO_LIB_EXT) $(LIBS) ;\
fi ; \

<-- it handles two cases: either the file $(NETCDFPATH)/lib/libnetcdff.a exists, in which case it is added to the linker arguments, or it does not. I don't know what that means; maybe diffwrf can be built without netcdf support, or provides its own implementations? Anyway, if it falls into the "else" branch and those unresolved symbols are from libnetcdff, then that's the problem.

Could you check that $(NETCDFPATH)/lib/libnetcdff.a exists, and/or add a debugging echo "Check"; line to the beginning of the "else" branch and see whether it is reached?

- D.

2011/8/9 Huber, David :
> Hi Dmitry,
>
> Thanks for the reply. It took a little digging, but I found the makefile with this command in WRFV3/external/io_netcdf/makefile. Here's the block for diffwrf.F90:
>
> diffwrf:                diffwrf.F90
>        x=`echo "$(FC)" | awk '{print $$1}'` ; export x ; \
>        if [ $$x = "gfortran" ] ; then \
>           echo removing external declaration of iargc for gfortran ; \
>           $(CPP1) -I$(NETCDFPATH)/include -I../ioapi_share diffwrf.F90 | sed '/integer *, *external.*iargc/d' > diffwrf.f ;\
>        else \
>           
$(CPP1) -I$(NETCDFPATH)/include -I../ioapi_share diffwrf.F90 > diffwrf.f ; \
>        fi
>        $(FC) -c $(FFLAGS) diffwrf.f
>        if [ \( -f ../../frame/wrf_debug.o \) -a \( -f ../../frame/module_wrf_error.o \) -a \( -f $(ESMF_MOD_DEPENDENCE) \) ] ; then \
>          echo "diffwrf io_netcdf is being built now. " ; \
>          if [ -f $(NETCDFPATH)/lib/libnetcdff.a ] ; then \
>            $(FC) $(FFLAGS) $(LDFLAGS) -o diffwrf diffwrf.o $(OBJSL) bitwise_operators.o ../../frame/wrf_debug.o ../../frame/module_wrf_error.o $(ESMF_IO_LIB_EXT) $(LIBS) -lnetcdff ;\
>          else \
>            $(FC) $(FFLAGS) $(LDFLAGS) -o diffwrf diffwrf.o $(OBJSL) bitwise_operators.o ../../frame/wrf_debug.o ../../frame/module_wrf_error.o $(ESMF_IO_LIB_EXT) $(LIBS) ;\
>          fi ; \
>        else \
>           echo "***************************************************************************** " ; \
>           echo "*** Rerun compile to make diffwrf in external/io_netcdf directory        *** " ; \
>           echo "***************************************************************************** " ; \
>        fi
>
>
> There was an @ before the line
> if [ \( -f ../../frame/wrf_debug.o \) ...
> which I deleted, but the output did not change.
> I see the link for netcdff and the $LIBS, which consists of
> LIBS    = -L$(NETCDFPATH)/lib -lnetcdf
>
> -Dave
>
>
> ________________________________________
> From: Dmitry N. Mikushin [maemarcus at gmail.com]
> Sent: Tuesday, August 09, 2011 10:44 AM
> To: Huber, David
> Cc: wrf-users at ucar.edu
> Subject: Re: [Wrf-users] WRF Compilation Error
>
> Hi,
>
>> I'm not certain if this is an issue with NetCDF, the shared object, or what the deal with feupdateenv is about.
>
> - This is usually harmless and always appears with the Intel compiler.
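The two-way logic in the diffwrf link rule quoted above can be exercised in isolation. This is a minimal sketch, not the real build: the scratch directory, the `link_diffwrf` function name, and the echoed strings are all hypothetical; only the `[ -f .../libnetcdff.a ]` test mirrors the makefile.

```shell
# Hypothetical stand-in for the makefile's branch: -lnetcdff is added
# only when $NETCDFPATH/lib/libnetcdff.a exists.
NETCDFPATH=$(mktemp -d)          # scratch dir standing in for the real install
mkdir -p "$NETCDFPATH/lib"

link_diffwrf() {
    if [ -f "$NETCDFPATH/lib/libnetcdff.a" ]; then
        echo "would link: ... \$(LIBS) -lnetcdff"
    else
        echo "would link: ... \$(LIBS)"
    fi
}

link_diffwrf                               # library absent: the "else" branch fires
touch "$NETCDFPATH/lib/libnetcdff.a"       # now the separate Fortran library exists
link_diffwrf                               # the -lnetcdff branch fires
rm -rf "$NETCDFPATH"
```

In many netCDF 4.x builds the Fortran-77 interfaces (the nf_* names in the errors) live in a separate libnetcdff rather than in libnetcdf itself, so ending up in the branch without -lnetcdff would explain the undefined references.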
>
>> /usr/local/intel/11.0/084/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail
>> wrf_io.o: In function `ext_ncd_get_var_info_':
>> wrf_io.f:(.text+0x1ef): undefined reference to `nf_inq_varid_'
>> wrf_io.f:(.text+0x740): undefined reference to `nf_inq_vartype_'
>> wrf_io.f:(.text+0x883): undefined reference to `nf_get_att_int_'
>> wrf_io.f:(.text+0x1158): undefined reference to `nf_get_att_text_'
>> wrf_io.f:(.text+0x142e): undefined reference to `nf_inq_vardimid_'
>> wrf_io.f:(.text+0x1595): undefined reference to `nf_inq_dimlen_'
>
> These are unresolved externals, which normally occur when the proper
> library is not added to the link command.
>
> Could you print out the exact linking command it is using?
>
>> diffwrf io_netcdf is being built now.
>
> - I suppose the linking command goes in the Makefile right after
> the echo of "diffwrf io_netcdf is being built now". If it starts with "@",
> then it's hidden. Remove the "@", and you will see the command. Once the
> exact command is known, it should be much easier to understand what's
> wrong.
>
> - D.
>
>
> 2011/8/9 Huber, David :
>> Hello,
>>
>> I'm having an issue compiling WRF V3.1.1 with the Intel compilers using a multi-processor (dmpar) build. NCAR Graphics (6.0.0) and NetCDF (4.1.3) are also built with the Intel compilers. The configure.wrf file contents (minus comments) and a list of the environmental variables are shown at the end of this message. The following is the first of several warning/undefined reference messages:
>>
>>
>> ifort -w -ftz -align all -fno-alias -fp-model precise  -FR -convert big_endian -c  -I/usr/local/include -I../ioapi_share diffwrf.f
>> diffwrf.f(1629): (col. 7) remark: LOOP WAS VECTORIZED.
>> diffwrf.f(1630): (col. 7) remark: LOOP WAS VECTORIZED.
>> diffwrf.f(1709): (col. 7) remark: LOOP WAS VECTORIZED.
>> diffwrf.f(1710): (col. 7) remark: LOOP WAS VECTORIZED.
>> diffwrf.f(1711): (col. 7) remark: LOOP WAS VECTORIZED.
>> diffwrf.f(1712): (col. 7) remark: LOOP WAS VECTORIZED. >> diffwrf io_netcdf is being built now. >> /usr/local/intel/11.0/084/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail >> wrf_io.o: In function `ext_ncd_get_var_info_': >> wrf_io.f:(.text+0x1ef): undefined reference to `nf_inq_varid_' >> wrf_io.f:(.text+0x740): undefined reference to `nf_inq_vartype_' >> wrf_io.f:(.text+0x883): undefined reference to `nf_get_att_int_' >> wrf_io.f:(.text+0x1158): undefined reference to `nf_get_att_text_' >> wrf_io.f:(.text+0x142e): undefined reference to `nf_inq_vardimid_' >> wrf_io.f:(.text+0x1595): undefined reference to `nf_inq_dimlen_' >> >> >> I'm not certain if this is an issue with NetCDF, the shared object, or what the deal with feupdateenv is about. ?I have successfully built WRF and WPS on this system before, but this is the first time I have tried it with everything compiled with icc and ifort. ?Any suggestions would be much appreciated. >> >> Thanks, >> >> Dave >> >> >> configure.wrf: >> >> >> SHELL ? ? ? ? ? = ? ? ? /bin/sh >> DEVTOP ? ? ? ? ?= ? ? ? `pwd` >> LIBINCLUDE ? ? ?= ? ? ? . >> .SUFFIXES: .F .i .o .f90 .c >> >> COREDEFS = -DEM_CORE=$(WRF_EM_CORE) \ >> ? ? ? ? ? -DNMM_CORE=$(WRF_NMM_CORE) -DNMM_MAX_DIM=2600 \ >> ? ? ? ? ? -DCOAMPS_CORE=$(WRF_COAMPS_CORE) \ >> ? ? ? ? ? -DDA_CORE=$(WRF_DA_CORE) \ >> >> ? ? ? ? ? -DEXP_CORE=$(WRF_EXP_CORE) >> >> >> MAX_DOMAINS ? ? = ? ? ? 21 >> >> CONFIG_BUF_LEN ?= ? ? ? 32768 >> >> >> NATIVE_RWORDSIZE = 4 >> >> SED_FTN = $(WRF_SRC_ROOT_DIR)/tools/standard.exe >> >> IO_GRIB_SHARE_DIR = >> >> ESMF_COUPLING ? ? ? = 0 >> ESMF_MOD_DEPENDENCE = $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/module_utility.o >> >> ESMF_IO_INC ? ? ? ? = -I$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90 >> ESMF_MOD_INC ? ? ? ?= ?$(ESMF_IO_INC) >> ESMF_IO_DEFS ? ? ? ?= >> ESMF_TARGET ? ? ? ? = esmf_time >> >> >> LIBWRFLIB = libwrflib.a >> >> >> DMPARALLEL ? ? ?= ? ? ? ?1 >> OMPCPP ? ? ? ? ?= ? ? ? 
# -D_OPENMP >> OMP ? ? ? ? ? ? = ? ? ? # -openmp -fpp -auto >> SFC ? ? ? ? ? ? = ? ? ? ifort >> SCC ? ? ? ? ? ? = ? ? ? icc >> DM_FC ? ? ? ? ? = ? ? ? mpif90 -f90=$(SFC) >> DM_CC ? ? ? ? ? = ? ? ? mpicc -cc=$(SCC) -DMPI2_SUPPORT >> FC ? ? ? ? ? ? ?= ? ? ? ?$(DM_FC) >> CC ? ? ? ? ? ? ?= ? ? ? $(DM_CC) -DFSEEKO64_OK >> LD ? ? ? ? ? ? ?= ? ? ? $(FC) >> RWORDSIZE ? ? ? = ? ? ? $(NATIVE_RWORDSIZE) >> PROMOTION ? ? ? = ? ? ? -i4 >> ARCH_LOCAL ? ? ?= ? ? ? -DNONSTANDARD_SYSTEM_FUNC >> CFLAGS_LOCAL ? ?= ? ? ? -w -O3 -ip >> LDFLAGS_LOCAL ? = ? ? ? -ip >> CPLUSPLUSLIB ? ?= >> ESMF_LDFLAG ? ? = ? ? ? $(CPLUSPLUSLIB) >> FCOPTIM ? ? ? ? = ? ? ? -O3 >> FCREDUCEDOPT ? ?= ? ? ? $(FCOPTIM) >> FCNOOPT ? ? ? ? = ? ? ? -O0 -fno-inline -fno-ip >> FCDEBUG ? ? ? ? = ? ? ? # -g $(FCNOOPT) -traceback >> FORMAT_FIXED ? ?= ? ? ? -FI >> FORMAT_FREE ? ? = ? ? ? -FR >> FCSUFFIX ? ? ? ?= >> BYTESWAPIO ? ? ?= ? ? ? -convert big_endian >> FCBASEOPTS ? ? ?= ? ? ? -w -ftz -align all -fno-alias -fp-model precise $(FCDEBUG) $(FORMAT_FREE) $(BYTESWAPIO) >> MODULE_SRCH_FLAG = >> TRADFLAG ? ? ? ?= ? ? ?-traditional >> CPP ? ? ? ? ? ? = ? ? ?/lib/cpp -C -P >> AR ? ? ? ? ? ? ?= ? ? ?ar >> ARFLAGS ? ? ? ? = ? ? ?ru >> M4 ? ? ? ? ? ? ?= ? ? ?m4 >> RANLIB ? ? ? ? ?= ? ? ?ranlib >> CC_TOOLS ? ? ? ?= ? ? ?$(SCC) >> >> FGREP = fgrep -iq >> >> ARCHFLAGS ? ? ? = ? ?$(COREDEFS) -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=$(RWORDSIZE) -DLWORDSIZE=4 \ >> ? ? ? ? ? ? ? ? ? ? $(ARCH_LOCAL) \ >> ? ? ? ? ? ? ? ? ? ? $(DA_ARCHFLAGS) \ >> ? ? ? ? ? ? ? ? ? ? ?-DDM_PARALLEL \ >> >> ? ? ? ? ? ? ? ? ? ? ? \ >> ? ? ? ? ? ? ? ? ? ? ?-DNETCDF \ >> ? ? ? ? ? ? ? ? ? ? ? \ >> ? ? ? ? ? ? ? ? ? ? ? \ >> ? ? ? ? ? ? ? ? ? ? ? \ >> ? ? ? ? ? ? ? ? ? ? ? \ >> ? ? ? ? ? ? ? ? ? ? ? \ >> ? ? ? ? ? ? ? ? ? ? ? \ >> ? ? ? ? ? ? ? ? ? ? ?-DGRIB1 \ >> ? ? ? ? ? ? ? ? ? ? ?-DINTIO \ >> ? ? ? ? ? ? ? ? ? ? ?-DLIMIT_ARGS \ >> ? ? ? ? ? ? ? ? ? ? ?-DCONFIG_BUF_LEN=$(CONFIG_BUF_LEN) \ >> ? ? ? ? ? ? ? ? ? ? ?-DMAX_DOMAINS_F=$(MAX_DOMAINS) \ >> ? ? ? 
? ? ? ? ? ? ? ?-DNMM_NEST=$(WRF_NMM_NEST) >> CFLAGS ? ? ? ? ?= ? ?$(CFLAGS_LOCAL) -DDM_PARALLEL >> FCFLAGS ? ? ? ? = ? ?$(FCOPTIM) $(FCBASEOPTS) >> ESMF_LIB_FLAGS ?= >> ESMF_IO_LIB ? ? = ? ?$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a >> ESMF_IO_LIB_EXT = ? ?-L$(WRF_SRC_ROOT_DIR)/external/esmf_time_f90/libesmf_time.a >> INCLUDE_MODULES = ? ?$(MODULE_SRCH_FLAG) \ >> ? ? ? ? ? ? ? ? ? ? $(ESMF_MOD_INC) $(ESMF_LIB_FLAGS) \ >> ? ? ? ? ? ? ? ? ? ? ?-I$(WRF_SRC_ROOT_DIR)/main \ >> ? ? ? ? ? ? ? ? ? ? ?-I$(WRF_SRC_ROOT_DIR)/external/io_netcdf \ >> ? ? ? ? ? ? ? ? ? ? ?-I$(WRF_SRC_ROOT_DIR)/external/io_int \ >> ? ? ? ? ? ? ? ? ? ? ?-I$(WRF_SRC_ROOT_DIR)/frame \ >> ? ? ? ? ? ? ? ? ? ? ?-I$(WRF_SRC_ROOT_DIR)/share \ >> ? ? ? ? ? ? ? ? ? ? ?-I$(WRF_SRC_ROOT_DIR)/phys \ >> ? ? ? ? ? ? ? ? ? ? ?-I$(WRF_SRC_ROOT_DIR)/chem -I$(WRF_SRC_ROOT_DIR)/inc \ >> ? ? ? ? ? ? ? ? ? ? ? \ >> >> REGISTRY ? ? ? ?= ? ?Registry >> >> LIB_BUNDLED ? ? = \ >> ? ? ? ? ? ? ? ? ? ? -L$(WRF_SRC_ROOT_DIR)/external/fftpack/fftpack5 -lfftpack \ >> ? ? ? ? ? ? ? ? ? ? -L$(WRF_SRC_ROOT_DIR)/external/io_grib1 -lio_grib1 \ >> ? ? ? ? ? ? ? ? ? ? -L$(WRF_SRC_ROOT_DIR)/external/io_grib_share -lio_grib_share \ >> ? ? ? ? ? ? ? ? ? ? -L$(WRF_SRC_ROOT_DIR)/external/io_int -lwrfio_int \ >> ? ? ? ? ? ? ? ? ? ? $(ESMF_IO_LIB) \ >> ? ? ? ? ? ? ? ? ? ? $(ESMF_IO_LIB) \ >> ? ? ? ? ? ? ? ? ? ? $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a \ >> ? ? ? ? ? ? ? ? ? ? $(WRF_SRC_ROOT_DIR)/frame/module_internal_header_util.o \ >> ? ? ? ? ? ? ? ? ? ? $(WRF_SRC_ROOT_DIR)/frame/pack_utils.o >> >> LIB_EXTERNAL ? ?= \ >> >> ? ? ? ? ? ? ? ? ? ? $(WRF_SRC_ROOT_DIR)/external/io_netcdf/libwrfio_nf.a -L/usr/local/lib -lnetcdff -lnetcdf >> >> LIB ? ? ? ? ? ? = ? ?$(LIB_BUNDLED) $(LIB_EXTERNAL) $(LIB_LOCAL) >> LDFLAGS ? ? ? ? = ? ?$(OMP) $(FCFLAGS) $(LDFLAGS_LOCAL) >> ENVCOMPDEFS ? ? = >> WRF_CHEM ? ? ? ?= ? ? ? 0 >> CPPFLAGS ? ? ? ?= ? ?$(ARCHFLAGS) $(ENVCOMPDEFS) -I$(LIBINCLUDE) $(TRADFLAG) >> NETCDFPATH ? ? ?= ? 
/usr/local
>> PNETCDFPATH     =
>>
>> bundled:  wrf_ioapi_includes wrfio_grib_share wrfio_grib1 wrfio_int esmf_time fftpack
>> external: wrfio_nf  $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a gen_comms_rsllite module_dm_rsllite $(ESMF_TARGET)
>>
>> ######################
>> externals: bundled external
>>
>> gen_comms_serial :
>>         ( /bin/rm -f $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c )
>>
>> module_dm_serial :
>>         ( if [ ! -e module_dm.F ] ; then /bin/cp module_dm_warning module_dm.F ; cat module_dm_stubs.F >> module_dm.F ; fi )
>>
>> gen_comms_rsllite :
>>         ( if [ ! -e $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ] ; then \
>>           /bin/cp $(WRF_SRC_ROOT_DIR)/tools/gen_comms_warning $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ; \
>>           cat $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/gen_comms.c >> $(WRF_SRC_ROOT_DIR)/tools/gen_comms.c ; fi )
>>
>> module_dm_rsllite :
>>         ( if [ ! -e module_dm.F ] ; then /bin/cp module_dm_warning module_dm.F ; \
>>           cat $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/module_dm.F >> module_dm.F ; fi )
>>
>> wrfio_nf :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/io_netcdf ; \
>>           make NETCDFPATH="$(NETCDFPATH)" RANLIB="$(RANLIB)" CPP="$(CPP)" \
>>           CC="$(SCC)" CFLAGS="$(CFLAGS)" \
>>           FC="$(SFC) $(PROMOTION) $(FCFLAGS)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" )
>>
>> wrfio_pnf :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/io_pnetcdf ; \
>>           make NETCDFPATH="$(PNETCDFPATH)" RANLIB="$(RANLIB)" CPP="$(CPP)" \
>>           FC="$(FC) $(PROMOTION) $(FCFLAGS)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" )
>>
>>
>>
>> wrfio_grib_share :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib_share ; \
>>           make CC="$(SCC)" CFLAGS="$(CFLAGS)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \
>>           FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" archive)
>>
>> wrfio_grib1 :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib1 ; \
>>           make CC="$(SCC)" CFLAGS="$(CFLAGS)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \
>>           FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" archive)
>>
>> wrfio_grib2 :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/io_grib2 ; \
>>           make CC="$(SCC)" CFLAGS="$(CFLAGS) " RM="$(RM)" RANLIB="$(RANLIB)" \
>>           CPP="$(CPP)" \
>>           FC="$(SFC) $(PROMOTION) -I. $(FCDEBUG) $(FCBASEOPTS) $(FCSUFFIX)" TRADFLAG="-traditional" AR="$(AR)" ARFLAGS="$(ARFLAGS)" \
>>           FIXED="$(FORMAT_FIXED)" archive)
>>
>> wrfio_int :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/io_int ; \
>>           make CC="$(CC)" RM="$(RM)" RANLIB="$(RANLIB)" CPP="$(CPP)" \
>>           FC="$(SFC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" \
>>           TRADFLAG="$(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" all )
>>
>> esmf_time :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/esmf_time_f90 ; \
>>           make FC="$(SFC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" RANLIB="$(RANLIB)" \
>>           CPP="$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc -I. $(ARCHFLAGS) $(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" )
>>
>> fftpack :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/fftpack/fftpack5 ; \
>>           make FC="$(SFC)" FFLAGS="$(PROMOTION) $(FCDEBUG) $(FCBASEOPTS)" RANLIB="$(RANLIB)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" )
>>
>> $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/RSL_LITE ; make CC="$(CC) $(CFLAGS)" \
>>           FC="$(FC) $(FCFLAGS) $(PROMOTION) $(BYTESWAPIO)" \
>>           CPP="$(CPP) -I. $(ARCHFLAGS) $(TRADFLAG)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" ;\
>>           $(RANLIB) $(WRF_SRC_ROOT_DIR)/external/RSL_LITE/librsl_lite.a )
>>
>>
>> LN      =       ln -sf
>> MAKE    =       make -i -r
>> RM      =       rm -f
>>
>>
>> # These sub-directory builds are identical across all architectures
>>
>> wrf_ioapi_includes :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/ioapi_share ; \
>>           $(MAKE) NATIVE_RWORDSIZE="$(NATIVE_RWORDSIZE)" RWORDSIZE="$(RWORDSIZE)" AR="$(AR)" ARFLAGS="$(ARFLAGS)" )
>>
>> wrfio_esmf :
>>         ( cd $(WRF_SRC_ROOT_DIR)/external/io_esmf ; \
>>           make FC="$(FC) $(PROMOTION) $(FCDEBUG) $(FCBASEOPTS) $(ESMF_MOD_INC)" \
>>           RANLIB="$(RANLIB)" CPP="$(CPP) $(POUND_DEF) " AR="$(AR)" ARFLAGS="$(ARFLAGS)" )
>>
>> #       There is probably no reason to modify these rules
>>
>> .F.i:
>>         $(RM) $@
>>         $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $*.F > $@
>>         mv $*.i $(DEVTOP)/pick/$*.f90
>>         cp $*.F $(DEVTOP)/pick
>>
>> .F.o:
>>         $(RM) $@
>>         $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.F  > $*.bb
>>         $(SED_FTN) $*.bb | $(CPP) > $*.f90
>>         $(RM) $*.b $*.bb
>>         if $(FGREP) '!$$OMP' $*.f90 ; then \
>>           if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \
>>           $(FC) -o $@ -c $(FCFLAGS) $(OMP) $(MODULE_DIRS) $(PROMOTION) $(FCSUFFIX) $*.f90 ; \
>>         else \
>>           if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \
>>           $(FC) -o $@ -c $(FCFLAGS) $(MODULE_DIRS) $(PROMOTION) $(FCSUFFIX) $*.f90 ; \
>>         fi
>>
>>
>> .F.f90:
>>         $(RM) $@
>>         $(SED_FTN) $*.F > $*.b
>>         $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $*.b  > $@
>>         $(RM) $*.b
>>
>>
>> .f90.o:
>>         $(RM) $@
>>         $(FC) -o $@ -c $(FCFLAGS) $(PROMOTION) $(FCSUFFIX) $*.f90
>>
>> .c.o:
>>         $(RM) $@
>>         $(CC) -o $@ -c $(CFLAGS) $*.c
>>
>> # A little more adventurous.  Allow full opt on
>> # mediation_integrate.o \
>> # shift_domain_em.o \
>> # solve_em.o  <-- gets a little kick from SOLVE_EM_SPECIAL too, if defined
>> # mediation_feedback_domain.o : mediation_feedback_domain.F
>> # mediation_force_domain.o : mediation_force_domain.F
>> # mediation_interp_domain.o : mediation_interp_domain.F
>>
>> # compile these without high optimization to speed compile
>> convert_nmm.o : convert_nmm.F
>> init_modules_em.o : init_modules_em.F
>> input_wrf.o : input_wrf.F
>> module_io.o : module_io.F
>> module_comm_dm.o : module_comm_dm.F
>> module_configure.o : module_configure.F
>> module_dm.o : module_dm.F
>> module_domain.o : module_domain.F
>> module_domain_type.o : module_domain_type.F
>> module_alloc_space.o : module_alloc_space.F
>> module_tiles.o : module_tiles.F
>> module_fddaobs_rtfdda.o : module_fddaobs_rtfdda.F
>> module_initialize.o : module_initialize.F
>> module_physics_init.o : module_physics_init.F
>> module_initialize_b_wave.o : module_initialize_b_wave.F
>> module_initialize_hill2d_x.o : module_initialize_hill2d_x.F
>> module_initialize_quarter_ss.o : module_initialize_quarter_ss.F
>> module_initialize_real.o : module_initialize_real.F
>> module_initialize_real.o: module_initialize_real.F
>> module_initialize_squall2d_x.o : module_initialize_squall2d_x.F
>> module_initialize_squall2d_y.o : module_initialize_squall2d_y.F
>> module_integrate.o : module_integrate.F
>> module_io_mm5.o : module_io_mm5.F
>> module_io_wrf.o : module_io_wrf.F
>> module_si_io.o : module_si_io.F
>> module_state_description.o : module_state_description.F
>> output_wrf.o : output_wrf.F
>>
>>
>> NMM_NEST_UTILS1.o : NMM_NEST_UTILS1.F
>> solve_interface.o : solve_interface.F
>> start_domain.o : start_domain.F
>> start_domain_nmm.o : start_domain_nmm.F
>> start_em.o : start_em.F
>> wrf_auxhist10in.o : wrf_auxhist10in.F
>> wrf_auxhist10out.o : wrf_auxhist10out.F
>> wrf_auxhist11in.o : wrf_auxhist11in.F
>> wrf_auxhist11out.o : wrf_auxhist11out.F
>> wrf_auxhist1in.o : wrf_auxhist1in.F
>> wrf_auxhist1out.o : wrf_auxhist1out.F
>> wrf_auxhist2in.o : wrf_auxhist2in.F
>> wrf_auxhist2out.o : wrf_auxhist2out.F
>> wrf_auxhist3in.o : wrf_auxhist3in.F
>> wrf_auxhist3out.o : wrf_auxhist3out.F
>> wrf_auxhist4in.o : wrf_auxhist4in.F
>> wrf_auxhist4out.o : wrf_auxhist4out.F
>> wrf_auxhist5in.o : wrf_auxhist5in.F
>> wrf_auxhist5out.o : wrf_auxhist5out.F
>> wrf_auxhist6in.o : wrf_auxhist6in.F
>> wrf_auxhist6out.o : wrf_auxhist6out.F
>> wrf_auxhist7in.o : wrf_auxhist7in.F
>> wrf_auxhist7out.o : wrf_auxhist7out.F
>> wrf_auxhist8in.o : wrf_auxhist8in.F
>> wrf_auxhist8out.o : wrf_auxhist8out.F
>> wrf_auxhist9in.o : wrf_auxhist9in.F
>> wrf_auxhist9out.o : wrf_auxhist9out.F
>> wrf_auxinput10in.o : wrf_auxinput10in.F
>> wrf_auxinput10out.o : wrf_auxinput10out.F
>> wrf_auxinput11in.o : wrf_auxinput11in.F
>> wrf_auxinput11out.o : wrf_auxinput11out.F
>> wrf_auxinput1in.o : wrf_auxinput1in.F
>> wrf_auxinput1out.o : wrf_auxinput1out.F
>> wrf_auxinput2in.o : wrf_auxinput2in.F
>> wrf_auxinput2out.o : wrf_auxinput2out.F
>> wrf_auxinput3in.o : wrf_auxinput3in.F
>> wrf_auxinput3out.o : wrf_auxinput3out.F
>> wrf_auxinput4in.o : wrf_auxinput4in.F
>> wrf_auxinput4out.o : wrf_auxinput4out.F
>> wrf_auxinput5in.o : wrf_auxinput5in.F
>> wrf_auxinput5out.o : wrf_auxinput5out.F
>> wrf_auxinput6in.o : wrf_auxinput6in.F
>> wrf_auxinput6out.o : wrf_auxinput6out.F
>> wrf_auxinput7in.o : wrf_auxinput7in.F
>>
>> wrf_auxinput7out.o : wrf_auxinput7out.F
>> wrf_auxinput8in.o : wrf_auxinput8in.F
>> wrf_auxinput8out.o : wrf_auxinput8out.F
>> wrf_auxinput9in.o : wrf_auxinput9in.F
>> wrf_auxinput9out.o : wrf_auxinput9out.F
>> wrf_bdyin.o : wrf_bdyin.F
>> wrf_bdyout.o : wrf_bdyout.F
>> wrf_ext_read_field.o : wrf_ext_read_field.F
>> wrf_ext_write_field.o : wrf_ext_write_field.F
>> wrf_fddaobs_in.o : wrf_fddaobs_in.F
>> wrf_histin.o : wrf_histin.F
>> wrf_histout.o : wrf_histout.F
>> wrf_inputin.o : wrf_inputin.F
>> wrf_inputout.o : wrf_inputout.F
>> wrf_restartin.o : wrf_restartin.F
>> wrf_restartout.o : wrf_restartout.F
>> wrf_tsin.o : wrf_tsin.F
>> nl_get_0_routines.o : nl_get_0_routines.F
>> nl_get_1_routines.o : nl_get_1_routines.F
>> nl_set_0_routines.o : nl_set_0_routines.F
>> nl_set_1_routines.o : nl_set_1_routines.F
>>
>> convert_nmm.o \
>> init_modules_em.o \
>> module_dm.o \
>> module_fddaobs_rtfdda.o \
>> module_initialize.o \
>> module_initialize_b_wave.o \
>> module_initialize_hill2d_x.o \
>> module_initialize_quarter_ss.o \
>> module_initialize_real.o \
>> module_initialize_squall2d_x.o \
>> module_initialize_squall2d_y.o \
>> module_integrate.o \
>> module_io_mm5.o \
>> module_io_wrf.o \
>> module_si_io.o \
>> module_tiles.o \
>> output_wrf.o \
>> NMM_NEST_UTILS1.o \
>> solve_interface.o \
>> start_domain.o \
>> start_domain_nmm.o \
>> shift_domain_nmm.o \
>>
>> start_em.o \
>> wrf_fddaobs_in.o \
>> wrf_tsin.o :
>>         $(RM) $@
>>         $(SED_FTN) $*.F > $*.b
>>         $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.b  > $*.f90
>>         $(RM) $*.b
>>         if $(FGREP) '!$$OMP' $*.f90 ; then \
>>           if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \
>>           $(FC) -c $(PROMOTION) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $(OMP) $*.f90 ; \
>>         else \
>>           if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \
>>           $(FC) -c $(PROMOTION) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $*.f90 ; \
>>         fi
>>
>> module_sf_ruclsm.o : module_sf_ruclsm.F
>>
>> module_sf_ruclsm.o :
>>         $(RM) $@
>>         $(SED_FTN) $*.F > $*.b
>>         $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.b  > $*.f90
>>         $(RM) $*.b
>>         if $(FGREP) '!$$OMP' $*.f90 ; then \
>>           echo COMPILING $*.F WITH OMP ; \
>>           if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITH OMP ; fi ; \
>>           $(FC) -c $(PROMOTION) $(FCREDUCEDOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $(OMP) $*.f90 ; \
>>         else \
>>           if [ -n "$(OMP)" ] ; then echo COMPILING $*.F WITHOUT OMP ; fi ; \
>>           $(FC) -c $(PROMOTION) $(FCREDUCEDOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(FCSUFFIX) $*.f90 ; \
>>         fi
>>
>> input_wrf.o \
>> module_domain.o \
>> module_domain_type.o \
>> module_physics_init.o \
>> module_io.o \
>>
>> wrf_auxhist10in.o \
>> wrf_auxhist10out.o \
>> wrf_auxhist11in.o \
>> wrf_auxhist11out.o \
>> wrf_auxhist1in.o \
>> wrf_auxhist1out.o \
>> wrf_auxhist2in.o \
>> wrf_auxhist2out.o \
>> wrf_auxhist3in.o \
>> wrf_auxhist3out.o \
>> wrf_auxhist4in.o \
>> wrf_auxhist4out.o \
>> wrf_auxhist5in.o \
>> wrf_auxhist5out.o \
>> wrf_auxhist6in.o \
>> wrf_auxhist6out.o \
>> wrf_auxhist7in.o \
>> wrf_auxhist7out.o \
>> wrf_auxhist8in.o \
>> wrf_auxhist8out.o \
>> wrf_auxhist9in.o \
>> wrf_auxhist9out.o \
>> wrf_auxinput10in.o \
>> wrf_auxinput10out.o \
>> wrf_auxinput11in.o \
>> wrf_auxinput11out.o \
>> wrf_auxinput1in.o \
>> wrf_auxinput1out.o \
>> wrf_auxinput2in.o \
>> wrf_auxinput2out.o \
>> wrf_auxinput3in.o \
>> wrf_auxinput3out.o \
>> wrf_auxinput4in.o \
>> wrf_auxinput4out.o \
>> wrf_auxinput5in.o \
>> wrf_auxinput5out.o \
>> wrf_auxinput6in.o \
>> wrf_auxinput6out.o \
>>
>> wrf_auxinput7in.o \
>> wrf_auxinput7out.o \
>> wrf_auxinput8in.o \
>> wrf_auxinput8out.o \
>> wrf_auxinput9in.o \
>> wrf_auxinput9out.o \
>> wrf_bdyin.o \
>> wrf_bdyout.o \
>> wrf_ext_read_field.o \
>> wrf_ext_write_field.o \
>> wrf_histin.o \
>> wrf_histout.o \
>> wrf_inputin.o \
>> wrf_inputout.o \
>> wrf_restartin.o \
>> wrf_restartout.o \
>> module_state_description.o \
>> nl_set_0_routines.o \
>> nl_set_1_routines.o \
>> nl_get_0_routines.o \
>> nl_get_1_routines.o \
>> module_alloc_space.o \
>> module_comm_dm.o \
>> module_configure.o :
>>         $(RM) $@
>>         $(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.F  > $*.bb
>>         $(SED_FTN) $*.bb | $(CPP) > $*.f90
>>         $(RM) $*.b $*.bb
>>         $(FC) -c $(PROMOTION) $(FCSUFFIX) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $*.f90
>>
>>
>> Environmental Variables:
>>
>>
>> USER=dbh409
>> LOGNAME=dbh409
>> HOME=/home/dbh409
>> PATH=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0/bin:/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/bin:/usr/local/intel/11.0/084/bin/intel64:/usr/local/intel/11.0/084/bin/intel64:/usr/local/maui/bin:/usr/local/maui/sbin:/usr/local/torque/bin:/usr/local/torque/sbin:/usr/local/Modules/3.2.6/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/home/software/netCDF/3.6.3/intel/mvapich/bin:/usr/local/ncarg_ifort/bin:.
>> MAIL=/var/spool/mail/dbh409
>> SHELL=/bin/tcsh
>> SSH_CLIENT=10.237.0.175 49359 22
>> SSH_CONNECTION=10.237.0.175 49359 129.237.228.235 22
>> SSH_TTY=/dev/pts/3
>> TERM=xterm
>> HOSTTYPE=x86_64-linux
>> VENDOR=unknown
>> OSTYPE=linux
>> MACHTYPE=x86_64
>> SHLVL=1
>> PWD=/home/dbh409/WRFV3
>> GROUP=dbh409
>> HOST=pequod
>> REMOTEHOST=10.237.0.175
>> HOSTNAME=pequod
>> INPUTRC=/etc/inputrc
>> LS_COLORS=no=00:fi=00:di=01;34:ln=01;36:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.gz=01;31:*.bz2=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.mpg=01;35:*.mpeg=01;35:*.avi=01;35:*.fli=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.ogg=01;35:*.mp3=01;35:*.wav=01;35:
>> G_BROKEN_FILENAMES=1
>> SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
>> KDE_IS_PRELINKED=1
>> KDEDIR=/usr
>> LANG=en_US.UTF-8
>> LESSOPEN=|/usr/bin/lesspipe.sh %s
>> MANPATH=/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/share/man:/usr/local/intel/11.0/084/man:/usr/local/intel/11.0/084/man:/usr/local/maui/man:/usr/local/torque/man:/usr/local/Modules/3.2.6/man:/usr/man:/usr/share/man:/usr/local/man:/usr/local/share/man:/usr/X11R6/man
>> QTDIR=/usr/lib64/qt-3.3
>> QTINC=/usr/lib64/qt-3.3/include
>> QTLIB=/usr/lib64/qt-3.3/lib
>> MODULE_VERSION=3.2.6
>> MODULE_VERSION_STACK=3.2.6
>> MODULESHOME=/usr/local/Modules/3.2.6
>> MODULEPATH=/usr/local/Modules/versions:/usr/local/Modules/$MODULE_VERSION/modulefiles:/usr/local/Modules/modulefiles:
>> LOADEDMODULES=null:modules:tools/torque-maui:compilers/64/intel-x86_64:openmpi/openmpi-1.3.2-ethernet-intel.64:tools/netcdf-4.1.3-intel:mpich2/mpich2-1.0.7-ethernet-intel.64
>> CC=/usr/local/intel/11.0/084/bin/intel64/icc
>> CCHOME=/usr/local/intel/11.0/084
>> CXX=/usr/local/intel/11.0/084/bin/intel64/icpc
>> F77=/usr/local/intel/11.0/084/bin/intel64/ifort
>> F90=/usr/local/intel/11.0/084/bin/intel64/ifort
>> FC=/usr/local/intel/11.0/084/bin/intel64/ifort
>> FCHOME=/usr/local/intel/11.0/084
>> INTEL_LICENSE_FILE=/opt/intel/licenses/l_RH4JJF9H.lic
>> LD_LIBRARY_PATH=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0/lib:/usr/local/lib:/usr/local/openmpi/openmpi-1.3.2/x86_64/intel11.0/lib:/usr/local/intel/11.0/084/lib/intel64:/usr/local/maui/lib:/usr/local/torque/lib
>> MPIHOME=/usr/local/mpich2/mpich2-1.0.7/x86_64/intel11.0
>> NETCDF=/usr/local
>> _LMFILES_=/usr/local/Modules/modulefiles/null:/usr/local/Modules/modulefiles/modules:/usr/local/Modules/modulefiles/tools/torque-maui:/usr/local/Modules/modulefiles/compilers/64/intel-x86_64:/usr/local/Modules/modulefiles/openmpi/openmpi-1.3.2-ethernet-intel.64:/usr/local/Modules/modulefiles/tools/netcdf-4.1.3-intel:/usr/local/Modules/modulefiles/mpich2/mpich2-1.0.7-ethernet-intel.64
>> NCARG_ROOT=/usr/local/ncarg_ifort
>> RIP_ROOT=/home/dbh409/RIP4
>> ITT_DIR=/usr/local/itt
>> IDL_DIR=/usr/local/itt/idl71
>>
>>
>> The NetCDF build location is
/usr/local/, a slight accident, but I don't think it makes a difference that it's not /usr/local/netcdf-4.1.3. >> _______________________________________________ >> Wrf-users mailing list >> Wrf-users at ucar.edu >> http://mailman.ucar.edu/mailman/listinfo/wrf-users >> > > From J.Kala at murdoch.edu.au Wed Aug 10 00:08:32 2011 From: J.Kala at murdoch.edu.au (Jatin Kala) Date: Wed, 10 Aug 2011 14:08:32 +0800 Subject: [Wrf-users] troubles, could not find trapping x locations Message-ID: Hi, I am getting a "could not find trapping x locations" error when running real.exe. This is using data I have written myself in binary format, as the original data was in netCDF, not GRIB. I looked at the written pressure fields, and they look OK. Any suggestions on what else I should be looking at? Has anybody else had this "trapping x locations" issue? Here is the error msg: target pressure and value = NaN 0.0000000E+00 column of pressure and value = 11.53211 0.0000000E+00 column of pressure and value = 9.182385 15130.88 -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 4142 troubles, could not find trapping x locations ------------------------------------------- Cheers, Jatin -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110810/d75b6b2c/attachment.html From littaaj at gmail.com Thu Aug 11 23:14:08 2011 From: littaaj at gmail.com (Litta A J) Date: Fri, 12 Aug 2011 10:44:08 +0530 Subject: [Wrf-users] issue in 15 minute WRF-NMM output creation Message-ID: Dear sir, I am trying to generate 15-minute WRF-NMM model output. The WPPV3.2 module has properly created the output (i.e. at 15-minute intervals). I have concatenated the files and named the result *all*. Then I used the grib2ctl.pl command to generate a ctl file.
I have edited the tdef line of all.ctl as follows: *tdef 97 linear 00Z11jun2011 15mn* I have also tried to edit tdef as follows: *tdef 97 linear 00:00Z11jun2011 15mn* Then I used gribmap -i all.ctl. It properly plots the hourly data, but not the 15-minute intervals. Can you please tell me how to create 15-minute interval data from the WRF model output. regards, Litta -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110812/be8c778c/attachment.html From dbh409 at ku.edu Sun Aug 14 11:09:03 2011 From: dbh409 at ku.edu (Huber, David) Date: Sun, 14 Aug 2011 17:09:03 +0000 Subject: [Wrf-users] Writing Binary Files for WPS Message-ID: Hello, I am trying to ingest CCSM output data into WPS 3.3 for initial and boundary conditions of WRF. I have attached the NCL code, the stub file, and the Fortran code that read the CCSM output and generate the unformatted binary files that metgrid.exe needs in order to generate the NetCDF files necessary for WRF. The code runs properly, and the binary files are generated with the proper fields. After providing soft links to the binary files and running geogrid.exe, I ran metgrid.exe and received the following error message: pequod:dbh409> ./metgrid.exe Processing domain 1 of 1 Processing 1990-01-01_00 SF 3D 2D ERROR: The mandatory field TT was not found in any input data. I have double-checked the binary files and I've been able to find the TT field in the "3D" files, but for some reason metgrid.exe is unable to recognize it. My best guess is that metgrid.exe is expecting a different type of binary file than is being provided, which likely means there is an issue in the way NCL and WPS were compiled such that they do not generate the same types of binary files. NCL (6.0.0), WRF, WPS, and NCAR Graphics are all built using gfortran and gcc. NCL and NCAR Graphics used the precompiled binaries, while WRF and WPS are both built using the serial option.
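[A note on the 15-minute tdef question above: gribmap matches each ctl time step against the date codes stored in the GRIB records themselves, so if only the hourly records carry matching stamps, only the hourly slots get indexed. It may help to first verify the verification times actually encoded in the concatenated file (e.g. with "wgrib all -verf") before editing the ctl. A minimal sketch of the relevant ctl lines, assuming 97 records at 15-minute spacing from 00Z 11 Jun 2011 (count and start date are taken from the message above and should be checked against the file):

```
dset ^all
index ^all.idx
tdef 97 linear 00z11jun2011 15mn
```

If gribmap still only fills hourly slots with this tdef, the sub-hourly stamps are most likely missing from the GRIB headers themselves rather than from the ctl.]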
NetCDF was built using the Intel compilers, which I believe was responsible for some errors in the WPS compilations, but ungrib.exe, metgrid.exe, and geogrid.exe were all created without issue. Note that this build of WPS/WRF will not be used to actually run the simulations. Instead, it is only used to run metgrid.exe. Do you know if I should change any of the flags in the building of WPS and/or WRF so that metgrid.exe expects the correct type of binary file? Does NCL need to be custom built to accomplish what I need to do? Thanks, Dave From jiwen.fan at pnnl.gov Sat Aug 13 23:46:48 2011 From: jiwen.fan at pnnl.gov (Fan, Jiwen) Date: Sat, 13 Aug 2011 22:46:48 -0700 Subject: [Wrf-users] Reminder: travel grant application deadline Sep 1 for Young Scientist Forum in 2011 IYC O3 and Climate Change Symposium Message-ID: Dear young hip scientists, The deadline for applying for a travel fellowship to attend the 2011 IYC O3 and Climate Change Symposium (http://www.2011-iyc-o3.org/) is approaching (Sep 1, 2011). The benefits of the travel fellowship on the basis of the actual need include: (1) paid registration fee, (2) hotel accommodation (shared double occupancy for 5 nights), and (3) airfare reimbursement up to $500. Each applicant is required to complete * an online application form, containing the applicant's contact information, * a description of the applicant's current research (less than 300 words), * a statement (a few sentences) on the potential benefits of this travel grant to the applicant's future professional goals, * an abstract for presentation at the young scientist forum (less than 300 words), * and a two-page cv in PDF format. Here is the website to submit your application: http://www.2011-iyc-o3.org/studentsyoung-scientists/travel-grant-for-young-scientists Please consider joining us in this unique symposium and also please pass this email on to others who may be interested (especially postdocs and graduate students you know)! 
Please feel free to contact us with any questions. Apologies for cross-posting! Jiwen Fan (PNNL), Trude Storelvmo (Yale Univ.), and Annmarie G. Carlton (Rutgers Univ.) The organizers of the Young Scientist Forum in the 2011 IYC O3 and Climate Change Symposium ---- Jiwen Fan, Ph.D. Scientist Atmospheric Science & Global Change Division Pacific Northwest National Laboratory PO Box 999, MSIN K9-24 Richland, WA 99352 509/375-2116 (o) Jiwen.fan at pnl.gov From M.Yap at massey.ac.nz Thu Aug 18 19:45:34 2011 From: M.Yap at massey.ac.nz (Yap, Mike) Date: Fri, 19 Aug 2011 13:45:34 +1200 Subject: [Wrf-users] WRF3.3 - syntax error in module_fr_sfire_core? Message-ID: <92FDFD8B26EB6542B1E1BF017BB998D16C923549AF@TUR-EXCHMBX.massey.ac.nz> Hi, I got the same message when building WRF 3.3: PGF90-S-0034-Syntax error at or near end of line (module_fr_sfire_core.f90: 182) 0 inform, 0 warnings, 1 severes, 0 fatal for nearest make[3]: [module_fr_sfire_core.o] Error 2 (ignored) Does anyone have a solution for this? Thanks Mike -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110819/00371dde/attachment.html From mkudsy at gmail.com Thu Aug 18 22:31:26 2011 From: mkudsy at gmail.com (M Kudsy) Date: Fri, 19 Aug 2011 11:31:26 +0700 Subject: [Wrf-users] WRFVAR with radiance takes forever Message-ID: Dear all, I am attempting radiance assimilation using WRFVAR 3.3, using the testing data obtained from the internet. The assimilated data was obtained from WRF gdas1.t12z.1bamua.tm00.bufr_d.linux ---> amsua.bufr gdas1.t12z.1bamub.tm00.bufr_d.linux ---> amsub.bufr gdas1.t12z.prepbufr.nr.linux ---> ob.bufr for the date 2008020512, which I obtained from the Summer Tutorial in Korea. My friend said that the files on the internet are in big-endian Linux format, so I used ssrc.exe to convert them to little-endian and cwordsh to unblock and reblock, but the run did not proceed as expected.
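[A quick sanity check that is sometimes useful before and after the ssrc.exe conversion described above: inspect the first Fortran record marker of a blocked file and judge its byte order from the raw bytes. This is only a sketch; `amsua.bufr` is the file name from the message, and the interpretation assumes the file is still Fortran-blocked with 4-byte record markers, which is what cwordsh produces.

```shell
# The 4-byte Fortran record marker holds the record length. A sane BUFR
# record length is modest, so the marker's zero bytes reveal the order:
#   big-endian file    -> first bytes look like  00 00 NN NN
#   little-endian file -> first bytes look like  NN NN 00 00
file=amsua.bufr     # hypothetical name, taken from the message above
[ -e "$file" ] && head -c 4 "$file" | od -An -t x1
```

If both converted and unconverted files show the same pattern, the conversion step did not actually run on the file the executable is reading.]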
After I left the computer for one hour, even one night, it remained stuck as follows: [kudsy at cumulus 2008020512]$ ./da_wrfvar.exe *** VARIATIONAL ANALYSIS *** Namelist perturbation not found in namelist.input. Using registry defaults for variables in perturbation DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1 , 625600848 bytes allocated WRF TILE 1 IS 1 IE 89 JS 1 JE 59 WRF NUMBER OF TILES = 1 Set up observations (ob) Using BUFR format observation input PREPBUFR ob with quality marker <= 3 will be retained. I attached the namelist. I use the gfortran and gcc compilers for building WRFVAR and the libraries. Is there any clue, please? -- Mahally Kudsy Weather Modification Technology Center Agency for the Assessment and Application of Technology Jln MH Thamrin 8, Jakarta, Indonesia Telp:62-21-3168830 Fax:62-21-3906225 mkudsy at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110819/80d40eba/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.input Type: application/octet-stream Size: 2578 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110819/80d40eba/attachment.obj From moudipascal at yahoo.fr Fri Aug 19 06:52:08 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Fri, 19 Aug 2011 13:52:08 +0100 (BST) Subject: [Wrf-users] Problem paneling WRF outputs and TRMM data Message-ID: <1313758328.78204.YahooMailNeo@web29014.mail.ird.yahoo.com> Dear all, I am having trouble using TRMM data. In fact I want to panel WRF outputs and TRMM data in the same figure. I am getting a segmentation fault. Could someone provide a script that handles this kind of job? Attached is the script I used. What are the steps to be followed when one needs to panel data with different resolutions and subscripts? Please help.
Pascal MOUDI IGRI Ph-D Student Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Department of Physics Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110819/ec37b9a8/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: panel_1.ncl Type: application/octet-stream Size: 11959 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110819/ec37b9a8/attachment-0001.obj From bpolla02 at harris.com Mon Aug 22 07:48:35 2011 From: bpolla02 at harris.com (Pollack, Bryan) Date: Mon, 22 Aug 2011 13:48:35 +0000 Subject: [Wrf-users] WRF 3.0 NMM Segmentation fault in wrf.exe Message-ID: Hi, Does anybody know how to solve this issue: http://mailman.ucar.edu/pipermail/wrf-users/2009/001096.html? I'm getting the same issue as he was, but nobody responded. Thanks, Bryan Pollack Software Engineer, Harris Corporation 321-309-7646 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110822/7a97ad10/attachment.html From carlosromancascon at fis.ucm.es Sat Aug 20 10:52:07 2011 From: carlosromancascon at fis.ucm.es (CARLOS ROMAN CASCON) Date: Sat, 20 Aug 2011 18:52:07 +0200 Subject: [Wrf-users] choosing initial time conditions Message-ID: Hi, I am simulating fog events with ARW-WRF V3.3 and comparing with observations. The problem is that the model seems to be too sensitive to the choice of the initial day for starting the simulation. For example, if I am interested in day 4 and I run the simulation beginning on day 3, the results are very different from those I get if I choose day 2 as the initial day...
The differences can be of the order of 3 or 4 °C, and it's important for me because I would like to do statistics comparing with observations, and this choice of the initial day seems to be very important. Is this normal? Thank you in advance Carlos Román Cascón -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110820/25380711/attachment.html From mmkamal at uwaterloo.ca Mon Aug 22 11:08:28 2011 From: mmkamal at uwaterloo.ca (mmkamal at uwaterloo.ca) Date: Mon, 22 Aug 2011 13:08:28 -0400 Subject: [Wrf-users] Discrepancies between WPS 3.3 & WPS 3.2.1 Message-ID: <20110822130828.14855d568vpk5zwg@www.nexusmail.uwaterloo.ca> Hi, I was wondering whether any change took place in Vtable.NARR and Vtable.GFS when upgrading from WPS 3.2.1 to WPS 3.3. The difference is in the last two lines of the above two Vtables. I look forward to hearing from you. Thanks Kamal From moudipascal at yahoo.fr Mon Aug 22 07:16:36 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Mon, 22 Aug 2011 14:16:36 +0100 (BST) Subject: [Wrf-users] Fwd: Problem paneling WRF outputs and TRMM data In-Reply-To: <1313758328.78204.YahooMailNeo@web29014.mail.ird.yahoo.com> References: <1313758328.78204.YahooMailNeo@web29014.mail.ird.yahoo.com> Message-ID: <1314018996.65223.YahooMailNeo@web29003.mail.ird.yahoo.com> Pascal MOUDI IGRI Ph-D Student Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Department of Physics Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 ----- Forwarded Mail ----- From: moudi pascal To: WRF DA ; WRF User's ; "wrf_users at ucar.edu" Sent: Friday, 19 August 2011, 13:52 Subject: Problem paneling WRF outputs and TRMM data Dear all, I am having trouble using TRMM data. In fact I want to panel WRF outputs and TRMM data in the same figure.
I am getting a segmentation fault. Could someone provide a script that handles this kind of job? Attached is the script I used. What are the steps to be followed when one needs to panel data with different resolutions and subscripts? Please help. Pascal MOUDI IGRI Ph-D Student Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Department of Physics Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110822/6ecdb0c5/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: panel_1.ncl Type: application/octet-stream Size: 11959 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110822/6ecdb0c5/attachment-0001.obj From mashfaq at ornl.gov Wed Aug 24 08:28:17 2011 From: mashfaq at ornl.gov (Ashfaq, Moetasim) Date: Wed, 24 Aug 2011 10:28:17 -0400 Subject: [Wrf-users] Postdoctoral Researcher in hydro-climate predictions and impact assessments Message-ID: <695D6190-40FD-4D98-96D3-026C73FD83E1@ornl.gov> Postdoctoral Researcher in hydro-climate predictions and impact assessments at Oak Ridge National Lab The Computational Earth Sciences group of the Computer Science and Mathematics Division at Oak Ridge National Laboratory seeks to hire a Postdoctoral Researcher to participate in research on understanding the roles of natural and anthropogenic forcing in near-term decadal-scale regional hydro-climatic variability over the continental United States and South Asia. In addition, the research will also focus on the projection of potential impacts of decadal-scale regional hydro-climatic variability on energy, water resources, and associated critical infrastructure.
This research will use a suite of Earth system models and statistical techniques to downscale predictions from a multi-model ensemble of IPCC-AR5 GCMs to an ultra-high horizontal resolution of 4 km over the United States and South Asia. The successful candidate will be expected to 1) develop and perform experiments with regional and hydrological models on the Oak Ridge Leadership Computing Facility (OLCF), 2) present the research at national and international conferences, and 3) report results in peer-reviewed journals, technical manuals, and conference proceedings. This position requires a PhD in Atmospheric and Hydrological Sciences or a related field within the past five years from an accredited college or university. The candidate is expected to have a strong understanding of the North American climate and/or the South Asian monsoon system. Experience in the use and application of a regional climate model and/or a hydrological model, and the ability to perform advanced data analysis on large datasets, is required. Excellent interpersonal skills, oral and written communication skills, organizational skills, and strong personal motivation are necessary. The ability to work effectively and contribute to a dynamic team environment is required. The ability to assimilate new concepts and adapt to a rapidly evolving scientific and computational environment is necessary. Experience with numerical methods, parallel algorithms, MPI, FORTRAN, C, C++, and parallel software development on large-scale computational resources will be an advantage. We anticipate this to be a two-year position, dependent on continuing funding. Applications will be accepted until the position is filled. Technical Questions: For more information about this position please contact Dr. Moetasim Ashfaq (mashfaq at ornl.gov). Please reference this position title in your correspondence.
Interested candidates should apply online: https://www3.orau.gov/ORNL_TOppS/Posting/Details/185 Please refer to the following link for the application requirements: http://www.orau.org/ornl/postdocs/ornl-pd-pm/application.htm This appointment is offered through the ORNL Postgraduate Research Participation Program and is administered by the Oak Ridge Institute for Science and Education (ORISE). The program is open to all qualified U.S. and non-U.S. citizens without regard to race, color, age, religion, sex, national origin, physical or mental disability, or status as a Vietnam-era veteran or disabled veteran. From scott.rowe at globocean.fr Wed Aug 24 09:22:55 2011 From: scott.rowe at globocean.fr (Scott) Date: Wed, 24 Aug 2011 17:22:55 +0200 Subject: [Wrf-users] Unpredictable crashes - MPI/RSL/Nest related? Message-ID: <4E55174F.9030306@globocean.fr> Hello all, I would like to know if others have come across this problem. The best I can do is give a general description because it is quite unpredictable. In point form: - General Details - o I am performing a simulation with one parent domain (@25km) and three child domains (@12.5km). o I am able to run just the parent domain without problem on 2 CPUs with 4 cores each, i.e. 8 threads using MPI for communications, in a single computer. o I can run the parent domain on at least 30-odd cores without problem, using MPI over a network. --> no nests, no worries o When I increase maxdom to include from one to three child domains, the simulations work fine when run on a single core. --> no MPI, no worries o As soon as I increase the number of cores, simulation success becomes less likely. --> nests + MPI = worries o The strange thing is: when it performs correctly with, say, two cores, and I then increase this to three cores, WRF will crash. Upon returning to two cores, this simulation will no longer function, and this without touching any other configuration aspect! Success is highly unpredictable.
o When WRF crashes, it is most often in radiation routines, but sometimes in cumulus; this too is highly unpredictable.
o Successive runs always crash at the same timestep and in the same routine.
o Timestep values for the parent domain and child domains are very conservative, and are also shown to function well when run without MPI.

I will add:
o Many combinations of physics and dynamics options have been trialled to no avail. I note again that the options chosen, when run without MPI, run fine.
o I have tried several configurations for the widths of the relaxation zones for boundary conditions; a wider relaxation does seem to increase the chance of success, but this is hard to verify.
o No CFL warnings appear in the rsl log files; the crashes are brusque and take the form of a segmentation fault whilst treating a child domain, never in the parent domain.
o The only hint I have seen in output files is the TSK field becoming NaN over land inside the child domain. This does not occur 100% of the time however.

It would thus appear to be an MPI or compiler issue rather than WRF. This said, it is only the combination of nests AND MPI that causes problems, not one or the other alone. Could it be RSL? Does anyone have any debugging ideas, even just general approaches to try and find the culprit? Any MPI parameters that could be adjusted?

- Technical Details -
o Using OpenMPI 1.4.3
o Aiming for WRFV3.3 use but have tried v3.2.1 also
o EM/ARW core
o Compiler is ifort and icc v10.1
o Have tried compiling with -O0, -O2 and -O3 with thorough cleaning each time
o GFS boundary conditions, WPSV3.3. No obvious problems to report here. geo*.nc and met_em* appear fine.

Thank you for any help you may be able to give.
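One general way to chase the TSK-goes-NaN symptom described above (a hedged sketch, not from the original thread: it assumes TSK has already been read from a wrfout file into a numpy array shaped (time, south_north, west_east), e.g. with the netCDF4 module) is to locate the first output time at which the field contains a NaN, and compare that against the crash timestep reported in the rsl logs:

```python
import numpy as np

def first_nan_step(field):
    """Return the first time index at which the (time, y, x) array
    contains a NaN anywhere in the slab, or None if it never does."""
    bad = np.isnan(field).any(axis=(1, 2))   # one True/False flag per output time
    return int(np.argmax(bad)) if bad.any() else None

# Synthetic stand-in for TSK: 4 output times on a 3x3 grid,
# with a NaN appearing at time index 2.
tsk = np.zeros((4, 3, 3))
tsk[2, 1, 1] = np.nan
print(first_nan_step(tsk))  # 2
```

If the NaN appears one or two output intervals before the segmentation fault, that narrows the search to whatever physics touches TSK on the child domain in between.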
From ulasim at chemistry.uoc.gr Wed Aug 24 04:56:57 2011 From: ulasim at chemistry.uoc.gr (Ulas IM) Date: Wed, 24 Aug 2011 13:56:57 +0300 Subject: [Wrf-users] plotting sigma layers with rip Message-ID:

Dear users, I am trying to plot the terrain-following sigma layers on a cross section from a WRF output. Is there a default way to accomplish this? Thank you

Ulas IM, PhD University of Crete Department of Chemistry Environmental Chemical Processes Laboratory (ECPL) Voutes, Heraklion Crete, Greece E-mail: ulasim at chemistry.uoc.gr Web: http://ulas-im.tr.gg Phone: (+30) 2810 545162 Fax: (+30) 2810 545001

From bbrashers at Environcorp.com Wed Aug 24 18:09:27 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Wed, 24 Aug 2011 17:09:27 -0700 Subject: [Wrf-users] OBS nudging Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF308520752C51E@irvine01.irvine.environ.local>

I have a few questions about using OBS nudging in WRF v3.3. I'm using the 12km NAM data, which comes in 6 hour intervals. WPS ran fine, and I have met_em files every 6 hours. I have processed MADIS data to little_r format, and named the files OBS:YYYY-MM-DD_HH. Each file contains 6 hours of data, which I've confirmed via grep (e.g. "grep 200801 OBS:2008-01-29_18 | cut -c 327-336 | sort | uniq" shows the right time-stamps).

1. OBSGRID.EXE creates OBS_DOMAIN101, OBS_DOMAIN102, etc. But each file contains only ONE hour's data (the analysis hour +/- 30 minutes). The OBS data for the next 5 hours is not output. How can I get OBSGRID.EXE to create OBS_DOMAIN1* files that contain ALL the available data? I want to nudge every hour, not every 6 hours.

2. The most recent notes in the WRF User's Guide make no mention of having to concatenate the OBS_DOMAIN1* output from OBSGRID.EXE into OBS_DOMAIN101, but the tutorial notes do say so. Which is correct? Will WRF not read OBS_DOMAIN102, OBS_DOMAIN103, etc.?

3. If I want to nudge every hour, do I set auxinput11_interval_m = 60 in WRF's namelist?
Or is that supposed to match OBSGRID's record1 interval?

4. If I want to keep nudging till the very end of my simulation, can I set auxinput11_end_h = 99999?

The meaning of these two values is pretty unclear in the README.namelist and README.obs_fdda, as well as the WRF User's Guide and tutorial notes. If anyone can shed some light, I'd be most appreciative. Thanks, Bart

This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110824/32877a44/attachment.html

From Glenn.Creighton at offutt.af.mil Thu Aug 25 13:57:28 2011 From: Glenn.Creighton at offutt.af.mil (Creighton, Glenn A Civ USAF AFWA 16 WS/WXN) Date: Thu, 25 Aug 2011 14:57:28 -0500 Subject: [Wrf-users] Unpredictable crashes - MPI/RSL/Nest related? (Scott) In-Reply-To: References: Message-ID: <201108251937.p7PJaob3037049@sgbp-fwl-001.offutt.af.mil>

Scott, I have a similar problem with version 3.3, but not version 3.2. It may be a related issue to the one you are experiencing. WRF will either seg fault somewhere in a call to alloc_space_field or collect_on_comm; debugging shows me that in these cases it's dying in the MPI code (calling libmpi.so.0 -> libopen-pal.so.0 -> mca_btl_openib.so). It seems to die in a different place every time. Sometimes it will just hang while creating the first wrfout file for d02. It dies more frequently with nested runs. Running openmpi 1.4.2.
I can run it 5 times and it will die 4 different ways.

1. module_comm_dm.f90:812 -> c_code.c:627 -> libmpi.so.0:?? -> libopen-pal.so.0:?? -> mca_btl_openib.so:?? libmlx4-rdav2.so:??

2. module_comm_nesting_dm:11793 -> c_code.c:627 -> libmpi.so.0:?? -> libopen-pal.so.0:?? -> mca_btl_openib.so:?? libmlx4-rdav2.so:?? -> libpthread.so.0:??

3. Hung writing wrfout_d02

4. mediation_integrate.f90:234 -> wrf_ext_read_field.f90:130 -> module_io.f90:14873 -> module_io.f90:15043 -> module_io.f90:16177 ... -> ... -> libpthread.so.0:??

I'm trying to work with the folks at NCAR on this right now. It's a weird bug that seems very machine/compiler dependent (I'm running this on a Linux box with the ifort/icc compilers also). The same code works just fine on our AIX and another Linux box we have here. Very strange bug.

Glenn

From bbrashers at Environcorp.com Fri Aug 26 12:18:24 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Fri, 26 Aug 2011 11:18:24 -0700 Subject: [Wrf-users] Unpredictable crashes - MPI/RSL/Nest related?(Scott) Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF308520752C7B9@irvine01.irvine.environ.local>

I have been running _almost_ the same setup as you, just using
- GFS (and NAM 12km) GRIB files for inits
- WRFv3.3
- Openmpi-1.4.3
- PGI 10.6-0
- CentOS 5.x (2.6.18-53.1.14 kernel) on a Rocks 5.0 system
- gE interconnect.

I've not seen any similar problems. FWIW, here's how I compiled stuff:

# grep -A11 DMPARALLEL /usr/local/src/wrf/WRFV3.3-openmpi/configure.wrf
DMPARALLEL = 1
OMPCPP = # -D_OPENMP
OMP = # -mp -Minfo=mp -Mrecursive
OMPCC = # -mp
SFC = pgf90
SCC = gcc
CCOMP = pgcc
DM_FC = /usr/local/src/openmpi-1.4.3/bin/mpif90
DM_CC = /usr/local/src/openmpi-1.4.3/bin/mpicc -DMPI2_SUPPORT
FC = $(DM_FC)
CC = $(DM_CC) -DFSEEKO64_OK
LD = $(FC)

# cat /usr/local/src/openmpi-1.4.3/my.configure
#!/bin/tcsh -f
setenv CC pgcc
setenv CFLAGS ''
setenv CXX pgCC
setenv CXXFLAGS ''
setenv FC pgf90
setenv FCFLAGS '-fast'
setenv FFLAGS '-O2'
setenv F90 pgf90
./configure --prefix=/usr/local/src/openmpi-1.4.3 --with-tm=/opt/torque --disable-ipv6 >&! my.configure.out
make all >&! make.out
make install >&! make.install.out

Maybe there's something there that is different, that will help.
Bart

> -----Original Message-----
> From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Creighton, Glenn A Civ USAF AFWA 16 WS/WXN
> Sent: Thursday, August 25, 2011 12:57 PM
> To: wrf-users at ucar.edu
> Subject: Re: [Wrf-users] Unpredictable crashes - MPI/RSL/Nest related?(Scott)
From bbrashers at Environcorp.com Fri Aug 26 12:38:07 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Fri, 26 Aug 2011 11:38:07 -0700 Subject: [Wrf-users] OBS nudging Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF308520752C7C9@irvine01.irvine.environ.local>

I'm replying to my own question here, so others can find the solution. Solved. Tags for searching: hourly OBS nudging, OBSGRID, WRF 3.3. Solution first, statement of the problem below.

SOLUTION: In namelist.wps, set (among other settings):

&share
 interval_seconds = 21600

to match your input GRIB data (every 6 hours for my data). Make your obs_filename (OBS:*, little_r format) files each contain only 1 hour of data. This conflicts somewhat with the statement in the WRF v3.3 Users Guide, Chapter 7, under OBSGRID Namelist: "Ideally there should be an obs_filename for each time period for which an objective analysis is desired." I suppose I should interpret "objective analysis" as either the 3D objective analysis or the 2D surface analysis. I believe it should say that there should be an obs_filename for each intf4d (surface analysis) period. This is the key.

In namelist.obsgrid, set (among other settings):

&record1
 interval = 21600
&record2
 obs_filename = path/to/OBS
&record7
 f4d = .TRUE.
 intf4d = 3600

This will produce OBS_DOMAIN101 with 1 hour of data (+/- 30 minutes from the start time); OBS_DOMAIN102 with 5 hours of data; OBS_DOMAIN103 with 1 hour of data (+/- 30 minutes from start + interval [6 hours]); etc. You must concatenate all the OBS_DOMAIN1?? files into a single OBS_DOMAIN101 file in the WRF run directory. The Users Guide does not explicitly state this, though the latest tutorial does.

auxinput11_interval_m is the minimum time interval you would like to check for new observations. If you want to nudge every hour, set this to 3600. Note that this is listed in test/em_real/README.obs_fdda as auxinput11_interval_s = 180, (180 seconds) which I believe is a typo.
They probably meant 180 minutes (3 hours). auxinput11_end_h is the time at which you want WRF to stop reading OBS_DOMAIN101.

PROBLEM: Following the WRF v3.3 Users Guide, Chapter 7, my obs_filename (OBS:*, little_r format) files each contained 6 hours of data (to match interval = 21600). At each intf4d time interval, OBSGRID opens an obs_filename file with the current time-stamp. It DOES NOT check for the existence of the file before it opens the file. So if your obs_filename (little_r) files each contain 6 hours of data, then at hour +1 it opens (and thus creates) a file with that time-stamp. Of course it's empty (newly created), so no data are read, and no data are output to the OBS_DOMAIN* file for that hour. If you started with obs_filename files like this:

-rw-rw-rw- 1 username 650M Aug 25 10:02 OBS:2008-01-29_12
-rw-rw-rw- 1 username 630M Aug 25 10:10 OBS:2008-01-29_18
-rw-rw-rw- 1 username 650M Aug 25 10:16 OBS:2008-01-30_00

Then after OBSGRID.EXE had run, you'll have files like this:

-rw-rw-rw- 1 username 650M Aug 25 10:02 OBS:2008-01-29_12
-rw-rw-rw- 1 username    0 Aug 25 10:03 OBS:2008-01-29_13
-rw-rw-rw- 1 username    0 Aug 25 10:05 OBS:2008-01-29_14
-rw-rw-rw- 1 username    0 Aug 25 10:06 OBS:2008-01-29_15
-rw-rw-rw- 1 username    0 Aug 25 10:07 OBS:2008-01-29_16
-rw-rw-rw- 1 username    0 Aug 25 10:08 OBS:2008-01-29_17
-rw-rw-rw- 1 username 630M Aug 25 10:10 OBS:2008-01-29_18
-rw-rw-rw- 1 username    0 Aug 25 10:10 OBS:2008-01-29_19
-rw-rw-rw- 1 username    0 Aug 25 10:12 OBS:2008-01-29_20
-rw-rw-rw- 1 username    0 Aug 25 10:13 OBS:2008-01-29_21
-rw-rw-rw- 1 username    0 Aug 25 10:14 OBS:2008-01-29_22
-rw-rw-rw- 1 username    0 Aug 25 10:15 OBS:2008-01-29_23
-rw-rw-rw- 1 username 650M Aug 25 10:16 OBS:2008-01-30_00

and your OBS_DOMAIN1?? files will contain only data +/- 30 minutes from each interval (each 6 hours).
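The concatenation step described in the SOLUTION above can be sketched as follows (a hedged example, not part of OBSGRID or WRF; the OBS_DOMAIN1?? file names follow the post, and writing to a temporary file first avoids truncating OBS_DOMAIN101 while it is still one of the inputs being read):

```python
import glob
import os
import shutil

def merge_obs_domains(run_dir="."):
    """Concatenate OBS_DOMAIN101, OBS_DOMAIN102, ... into a single
    OBS_DOMAIN101 in the WRF run directory, in lexicographic order."""
    parts = sorted(glob.glob(os.path.join(run_dir, "OBS_DOMAIN1??")))
    tmp = os.path.join(run_dir, "OBS_DOMAIN101.merged")
    with open(tmp, "wb") as out:
        for part in parts:
            with open(part, "rb") as f:
                shutil.copyfileobj(f, out)   # append this period's observations
    os.replace(tmp, os.path.join(run_dir, "OBS_DOMAIN101"))
```

The same pitfall applies to a shell one-liner: "cat OBS_DOMAIN1?? > OBS_DOMAIN101" truncates OBS_DOMAIN101 before cat reads it, so go through a temporary file there as well.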
Bart

From jagan at tnau.ac.in Thu Sep 1 23:47:58 2011 From: jagan at tnau.ac.in (jagan TNAU) Date: Fri, 2 Sep 2011 11:17:58 +0530 Subject: [Wrf-users] climate run output Message-ID:

Dear Users, I am trying to use CAM3 data for climate runs starting from 2000 to 2099. I need only the following daily output for running the crop simulation model:

Maximum Temperature
Minimum Temperature
Solar Radiation (Total)
Rainfall
Relative Humidity (Maximum & Minimum)
Wind speed (km/day)

As there is no output like this on a daily basis, what is the possibility of getting this output?

-- With regards Dr.R.Jagannathan Professor of Agronomy, Department of Agronomy Tamil Nadu Agricultural University, Coimbatore - 641 003 India PHONE: Mob: +91 94438 89891 DO NOT PRINT THIS E-MAIL UNLESS NECESSARY. THE ENVIRONMENT CONCERNS US ALL. -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110902/c7580876/attachment.html From anottrot at ucsd.edu Tue Aug 30 11:59:41 2011 From: anottrot at ucsd.edu (Anders A Nottrott) Date: Tue, 30 Aug 2011 10:59:41 -0700 Subject: [Wrf-users] More information about WRF Ideal test cases Message-ID: <000201cc673e$97d73b00$c785b100$@ucsd.edu> Hello All, I wondered if anyone knows where I can obtain detailed information regarding the WRF ideal test simulations? I have read the information on pg. 4-3 of the WRF V3 users guide, but there are only a few basic points relating to each simulation. A more detailed description would be helpful/useful. Kind Regards, Anders Nottrott -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110830/9e406e22/attachment.html From anottrot at ucsd.edu Thu Sep 1 14:13:05 2011 From: anottrot at ucsd.edu (Anders A Nottrott) Date: Thu, 1 Sep 2011 13:13:05 -0700 Subject: [Wrf-users] TKE and velocity perturbation output? Message-ID: <00a901cc68e3$8f404300$adc0c900$@ucsd.edu> Hello All, I am running some idealized LES cases for a convective ABL over a flat surface. I was wondering if it is possible to get the resolved scale TKE as an output variable. It seems to me that using km_opt=2 in the 'namelist' should give the resolved TKE as an output, but I do not find it in my netcdf output file. I found the following in the ARW description document: "Optionally, turbulent kinetic energy and any number of scalars such as water vapor mixing ratio, rain/snow mixing ratio, cloud water/ice mixing ratio, and chemical species and tracers." Maybe this is only referring to the output obtained when using a PBL scheme? Also is it possible to get information about the velocity perturbation as an output (e.g. u', v', w' or maybe the second moment)? I believe that the U, V, W output variables are the full velocities. 
If I have to compute TKE and velocity fluctuations in post-processing, then I will have a lot of output data from the WRF run and post-processing will be memory intensive. Regards, Anders Nottrott -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110901/f76c5b3b/attachment.html

From dbh409 at ku.edu Thu Sep 1 16:22:47 2011 From: dbh409 at ku.edu (Huber, David) Date: Thu, 1 Sep 2011 22:22:47 +0000 Subject: [Wrf-users] WRF Seasonal Variations Message-ID:

Hello, If I were to perform a 1-year-long simulation in WRF, how often would it change the land cover parameters? I noticed in VEGPARM.TBL that there are only winter and summer values. Does this mean that the vegetation parameters change suddenly once or possibly twice in a year-long simulation, that they change regularly throughout the simulation, or that they stay the same as the initial vegetation parameters? Thanks, David

From agmunoz at cmc.org.ve Fri Sep 2 12:07:48 2011 From: agmunoz at cmc.org.ve ("Ángel G. Muñoz") Date: Fri, 2 Sep 2011 14:07:48 -0400 Subject: [Wrf-users] Wrf-users Digest, Vol 85, Issue 1 In-Reply-To: References: Message-ID:

Dear Dr Jagannathan, The Andean Observatory (http://journals.ametsoc.org/doi/abs/10.1175/2010BAMS2958.1) has been running WRF with CAM3 output as input for several years now. If you are interested, we can share the CAM3 output in the "intermediate format" needed by WPS so you can run some tests. If this is successful and you are still interested, we can give you access to all our data.
Best Ángel El 02/09/2011, a las 14:00, wrf-users-request at ucar.edu escribió: > > > Message: 1 > Date: Fri, 2 Sep 2011 11:17:58 +0530 > From: jagan TNAU > Subject: [Wrf-users] climate run output > To: wrf-users at ucar.edu > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > Dear Users, > > I am trying to use CAM3 data for climate runs starting from 2000 to 2099. I > need only the following daily output for running the crop simulation model. > > Maximum Temperature > Minimum Temperature > Solar Radiation (Total) > Rainfall > Relative Humidity (Maximum & Minimum) > Wind speed (km/day) > > As there is no output like this on a daily basis, what is the possibility of > getting this output? > > -- > With regards > > Dr. R. Jagannathan > Professor of Agronomy, > Department of Agronomy > Tamil Nadu Agricultural University, > Coimbatore - 641 003 India > > PHONE: Mob: +91 94438 89891 > > DO NOT PRINT THIS E-MAIL UNLESS NECESSARY. THE ENVIRONMENT CONCERNS US ALL. > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110902/c7580876/attachment-0001.html Prof. Ángel G. Muñoz S. Coordinador Eje Geociencias Centro de Modelado Científico (CMC) La Universidad del Zulia VENEZUELA http://cmc.org.ve Telf +58-261-412-6008
I might also need Q2, but I am unsure yet due to my lack of overall understanding of the correlation between the staggered grid for elevation ( ( PH + PHB ) / 9.81 ) and the non-staggered grid of QVAPOR. What I am trying to figure out is what the water vapor mixing ratio is at a given elevation. I am pulling the data straight out of the NetCDF files using C and Matlab, so I won't be using NCL at all for this. Can anyone explain with some detail on how I would determine which vertical layer of QVAPOR correlates to which vertical layer of "total geopotential height in meters"? Thanks very much for your time, Erik -- Erik S. Carlsten Cell: (406) 570-1547 EPS 116 Lab: (406) 994-6145 Montana State University Bozeman, MT 59717 --------------------------------------------------------------- There is a single light of science, and to brighten it anywhere is to brighten it everywhere. - Isaac Asimov -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110902/a3949380/attachment.html From eric.kemp at nasa.gov Fri Sep 2 11:17:50 2011 From: eric.kemp at nasa.gov (Kemp, Eric M. (GSFC-610.3)[NORTHROP GRUMMAN]) Date: Fri, 2 Sep 2011 12:17:50 -0500 Subject: [Wrf-users] Minor bug found in Grell-Devenyi cumulus scheme (WRF 3.2.1 and WRF 3.3) Message-ID: All: I've found a minor bug in the Grell-Devenyi cumulus scheme while running a test domain including the Himalayas and with run time checking turned on. The relevant code is in lines 261-276 of module_cu_gd.F:

   do k= kts+1,ktf-1
      DO I = its,itf
         if((p2d(i,1)-p2d(i,k)).gt.150.and.p2d(i,k).gt.300)then
            dp=-.5*(p2d(i,k+1)-p2d(i,k-1))
            umean(i)=umean(i)+us(i,k)*dp
            vmean(i)=vmean(i)+vs(i,k)*dp
            pmean(i)=pmean(i)+dp
         endif
      enddo
   enddo
   DO I = its,itf
      umean(i)=umean(i)/pmean(i)
      vmean(i)=vmean(i)/pmean(i)
      direction(i)=(atan2(umean(i),vmean(i))+3.1415926)*57.29578
      if(direction(i).gt.360.)direction(i)=direction(i)-360.
   ENDDO

p2d is the 2D slab of pressure in mb.
Pmean is an average pressure thickness of model levels below the 300 mb level and 150 mb above the model level closest to the ground. Pmean is initialized to zero earlier in the routine. In my test run, I encountered a situation in the Himalayas where the pmean value is not updated, triggering a division-by-zero error in the umean, vmean, and direction calculations. My suggested bug fix is:

   DO I = its,itf
      !EMK Bug fix
      if (pmean(i) > 0) then
         umean(i)=umean(i)/pmean(i)
         vmean(i)=vmean(i)/pmean(i)
         direction(i)=(atan2(umean(i),vmean(i))+3.1415926)*57.29578
         if(direction(i).gt.360.)direction(i)=direction(i)-360.
      end if
      !EMK END Bug fix
   ENDDO

Note that umean and vmean are only used to calculate direction, and direction is initialized as zero. It does not appear that this direction variable is actually used anywhere else in the code (code fragments that did use it are all commented out), so this is a minor bug and bug fix that shouldn't change simulation results. Nonetheless I suggest fixing it, since it can interfere with run-time checks for other errors (as it did in my test case!) and it could slow down the simulation by producing NaNs. -Eric -------------------------------------------------------------------- Eric M. Kemp Northrop Grumman Corporation Meteorologist Information Systems Civil Enterprise Solutions Civil Systems Division Goddard Space Flight Center Mailstop 610.3 Greenbelt, MD 20771 Telephone 301-286-9768 Fax 301-286-1775 E-mail: eric.kemp at nasa.gov E-mail: eric.kemp at ngc.com -------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110902/5784d085/attachment.html From FLiu at azmag.gov Fri Sep 2 12:31:59 2011 From: FLiu at azmag.gov (Feng Liu) Date: Fri, 2 Sep 2011 18:31:59 +0000 Subject: [Wrf-users] TKE and velocity perturbation output?
In-Reply-To: <00a901cc68e3$8f404300$adc0c900$@ucsd.edu> References: <00a901cc68e3$8f404300$adc0c900$@ucsd.edu> Message-ID: <9BDE2A7F9712AF45A0C08451B3CD8E5C2630E2E3@mag9006> Anders, The option of km_opt is about calculation of turbulent diffusivity. If you want to have TKE in your output, you need to (1) use PBL schemes of TKE-related closure such as Mellor-Yamada-Janjic TKE scheme, MYNN 2.5/3rd level schemes; (2) to modify entries in IO column for TKE output in Registry file. I hope that is helpful. Feng From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Anders A Nottrott Sent: Thursday, September 01, 2011 1:13 PM To: wrf-users at ucar.edu Subject: [Wrf-users] TKE and velocity perturbation output? Hello All, I am running some idealized LES cases for a convective ABL over a flat surface. I was wondering if it is possible to get the resolved scale TKE as an output variable. It seems to me that using km_opt=2 in the 'namelist' should give the resolved TKE as an output, but I do not find it in my netcdf output file. I found the following in the ARW description document: "Optionally, turbulent kinetic energy and any number of scalars such as water vapor mixing ratio, rain/snow mixing ratio, cloud water/ice mixing ratio, and chemical species and tracers." Maybe this is only referring to the output obtained when using a PBL scheme? Also is it possible to get information about the velocity perturbation as an output (e.g. u', v', w' or maybe the second moment)? I believe that the U, V, W output variables are the full velocities. If I have to compute TKE and velocity fluctuation in post processing then I will have a lot of output data from the WRF run and post processing will be memory intensive. Regards, Anders Nottrott -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110902/4a578754/attachment-0001.html From wrf at nusculus.com Fri Sep 2 12:21:40 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Fri, 2 Sep 2011 12:21:40 -0600 Subject: [Wrf-users] WRF Seasonal Variations In-Reply-To: References: Message-ID: Hi David, No, the VEGPARM.TBL parameters get changed frequently (daily I believe, or perhaps with every timestep). The VEGPARM.TBL values represent a range and get adjusted within that range, corresponding to the values the grid cell gets from the greenfrac database within WPS, which populates the GREENFRAC variable with average monthly values. During a run, WRF interpolates between the 12 monthly values for the current day. But to cause the updates to happen, you need to use the "sst_update" option in namelist.input. I believe most of the parameters go up, with the exception of albedo, which goes from max to min as less soil is exposed when greenfrac goes from min to max. Also, there is a "usemonalb" variable in the namelist.input file which you might consider using (I don't). If you turn on "sst_update" you have to re-run real.exe to produce another input file for wrf.exe. Read the WRF Users Guide about configuring for the new input files when "sst_update" is used. Hmm. That was not explained well, but I hope you get the drift of what I was trying to say. And there is a little more in the User Guide. Kevin On Thu, Sep 1, 2011 at 4:22 PM, Huber, David wrote: > Hello, > > If I were to perform a 1-year-long simulation in WRF, how often would > it change the land cover parameters? I noticed in VEGPARM.TBL that there > are only winter and summer values, so does this mean that there is a sudden > change in vegetation parameters once or possibly twice in a year-long > simulation, that the vegetation parameters change regularly throughout the > simulation, or that they stay the same as the initial vegetation parameters?
> > Thanks, > > David > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110902/79f3103c/attachment.html From drostkier at yahoo.com Mon Sep 5 06:43:09 2011 From: drostkier at yahoo.com (Dorita Rostkier-Edelstein) Date: Mon, 5 Sep 2011 05:43:09 -0700 (PDT) Subject: [Wrf-users] correcting landmask Message-ID: <1315226589.30015.YahooMailNeo@web113107.mail.gq1.yahoo.com> Hi folks, Does anybody have a user-friendly method to correct the landmask in geo_em files? I have been using read_wrf_nc.f to change values. To identify the cells, or their i,j indexes in fact, I have plotted the landmask using the high-resolution coastline of NCL and by trial and error found the i,j. Does anybody have a better, more systematic way to locate i,j's with a wrong landmask? Thanks. Dorita -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110905/a9c0a0e0/attachment.html From ahah at risoe.dtu.dk Sat Sep 3 02:42:35 2011 From: ahah at risoe.dtu.dk (Hahmann, Andrea N.) Date: Sat, 3 Sep 2011 10:42:35 +0200 Subject: [Wrf-users] WRF Seasonal Variations In-Reply-To: Message-ID: This is only true if using WRF V3.3. And always check your output to make sure that the important variables (i.e. albedo, surface roughness, etc.) are being changed and used in the simulations. In previous versions, the output field Z0 (background surface roughness length) changes with time, but this field was not actually used in the model calculations; ZNT was the field actually being used, and it had only two values, one for summer and one for winter. Andrea ---- Andrea N. Hahmann Senior Scientist Wind Energy Division Risø DTU Technical University of Denmark Risø
National Laboratory for Sustainable Energy Frederikborgvej 399, P.O. Box 49 4000 Roskilde, Denmark Direct +45 4677 5471 Mobil: +45 2133 0550 ahah at risoe.dtu.dk http://www.risoe.dtu.dk From: Kevin Matthew Nuss > Date: Fri, 2 Sep 2011 20:21:40 +0200 To: "Huber, David" > Cc: "wrf-users at ucar.edu" > Subject: Re: [Wrf-users] WRF Seasonal Variations Hi David, No, the VEGPARM.TBL parameters get changed frequently (daily I believe or perhaps with every timestep). The VEGPARM.TBL values represent a range and get adjust within that range corresponding to the range of values the grid cell gets from the greenfrac database within WPS, which populates the GREENFRAC variable with average monthly values. During a run, WRF interpolates between the 12 monthly values for the current day. But to cause the updates to happen, you need to use the "sst_update" option in namelist.input. I believe most of the parameters go up with the exception of albedo, which goes from max to min as less soil is exposed when greenfrac goes from min to max. Also, there is a "usemonalb" variable in the namelist.input file which you might consider using (I don't). If you turn on "sst_update" you have to re-run real.exe to produce another input file for wrf.exe. Read the WRF Users Guide about configuring for the new input files when "sst_update" is used. Hmm. That was not explained well, but I hope you get the drift of what I was trying to say. And there is a little more in the User Guide. Kevin On Thu, Sep 1, 2011 at 4:22 PM, Huber, David > wrote: Hello, If I were to perform a 1-year long simulation in WRF, would how often would it change the land cover parameters? I noticed in VEGPARM.TBL that there are only winter and summer values, so does this mean that there is a sudden change in vegetation parameters once or possibly twice in a year-long simulation, that the vegetation parameters change regularly throughout the simulation, or that they stay the same as the initial vegetation parameters? 
Thanks, David _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users From jrausch at ubimet.com Tue Sep 6 04:48:38 2011 From: jrausch at ubimet.com (Johannes Rausch) Date: Tue, 06 Sep 2011 12:48:38 +0200 Subject: [Wrf-users] WRFVAR-Background Error Problem Message-ID: <1315306118.16838.28.camel@pc43.it5.ubimet.at> Hi, I'm working on a setup for a 2-way nested WRFVAR with radar reflectivity data assimilation for my university diploma. I'm using cv_options=5 and generated BEs for 2 domains (12/4 km) over a month of WRF runs (GFS-driven). I managed to get nice analyses for cold starts and warm starts, but verification shows that the forecast quality is not consistently good. At the moment I run it with var_scaling1-5=0.1 and len_scaling1-5=0.1, following an NCAR paper I read which deals with Doppler radar DA. Could you explain how I can find the best 3DVAR configuration in a "not empirical" way (i.e. other than trying all possibilities of var and len scaling and taking the best one on average)? I would really appreciate an answer, because right now I cannot proceed in a "proper" way. Thanks a lot. Best Regards Johannes From wrf at nusculus.com Fri Sep 2 19:08:13 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Fri, 2 Sep 2011 19:08:13 -0600 Subject: [Wrf-users] Which QVAPOR values correspond to which Total Geopotential Height? In-Reply-To: References: Message-ID: Hi Erik, The basic concept is that the "staggered" vertical grid is at the top and bottom of each grid cell. Except for vertical wind, W, most variables are in the vertical middle of the grid cell.
So to get the height of the QVAPOR variable, take the average of the top and bottom heights: height of QVAPOR in vertical cell n = ((PH(n)+PHB(n)) + (PH(n+1)+PHB(n+1))) / (2 * 9.81) Of course, there are also the X and Y coordinates for the cell, but they are the same for QVAPOR as for PH and PHB. If you then want QVAPOR "at a given elevation" you will have to interpolate again from the two QVAPOR values above and below the given elevation. So you interpolate PH and PHB to get the vertical center of the cells and then interpolate to the "given elevation." There are bound to be some algebraic shortcuts. And maybe someone else will come up with a better approach. Q2 has already been interpolated (more likely extrapolated) to the 2 meter level for easy comparison to met station observations. You didn't ask, but for the horizontally staggered grids, the U and V winds are on the horizontal edges of the grid cell rather than in the middle, but they are centered up and down, so they are at the same heights as QVAPOR. Hope that helps, Kevin On Fri, Sep 2, 2011 at 2:02 PM, Erik Carlsten wrote: > Hi All, > > This is my first time using this mailing list so hopefully I give you the > information you need to answer the question I have, providing you have the > time to help. > > Currently I am running WRF and producing output just fine. The variables I > am primarily interested in looking at right now are PH, PHB, and QVAPOR. I > might also need Q2, but I am unsure yet due to my lack of overall > understanding of the correlation between the staggered grid for elevation ( > ( PH + PHB ) / 9.81 ) and the non-staggered grid of QVAPOR. > > What I am trying to figure out is what the water vapor mixing ratio is at a > given elevation. I am pulling the data straight out of the NetCDF files > using C and Matlab, so I won't be using NCL at all for this.
> > Can anyone explain with some detail on how I would determine which vertical > layer of QVAPOR correlates to which vertical layer of "total geopotential > height in meters"? > > Thanks very much for your time, > Erik > > -- > Erik S. Carlsten Cell: (406) 570-1547 > EPS 116 Lab: (406) 994-6145 > Montana State University > Bozeman, MT 59717 > > --------------------------------------------------------------- > There is a single light of science, and to > brighten it anywhere is to brighten it > everywhere. > - Isaac Asimov > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110902/9f63a851/attachment.html From james.himer at oracle.com Wed Sep 7 12:00:23 2011 From: james.himer at oracle.com (james.himer at oracle.com) Date: Wed, 7 Sep 2011 11:00:23 -0700 (PDT) Subject: [Wrf-users] Auto Reply: Wrf-users Digest, Vol 85, Issue 5 Message-ID: <0e4ebb54-6a9e-4d27-8739-4f062ce5590a@default> Sorry I missed your email but I am currently on vacation, returning Sep 19 (talk-like-a-pirate day). Should you need assistance with SAE activities, please contact my SAE Oil/gas colleague Melinda McDade or manager Daryl Madura. 
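[Editorial aside, not part of the original thread.] Kevin's recipe two messages up (average the two staggered interface heights to get the mass-level height, then interpolate QVAPOR between mass levels) can be sketched in a few lines of NumPy. The profile values below are invented for illustration; in practice PH, PHB, and QVAPOR would be read from the wrfout NetCDF file.

```python
import numpy as np

g = 9.81  # m/s^2, as in the (PH + PHB) / 9.81 expression from the thread

# Stand-in vertical profiles (values made up for illustration).
# PH/PHB live on the staggered grid (nz+1 interfaces), QVAPOR on nz mass levels.
ph = np.zeros(5)                                       # perturbation geopotential (m^2/s^2)
phb = g * np.array([0.0, 100.0, 250.0, 450.0, 700.0])  # base-state geopotential (m^2/s^2)
qv = np.array([12.0, 10.0, 7.0, 4.0])                  # mixing ratio at mass levels (g/kg)

z_stag = (ph + phb) / g                    # heights of the layer interfaces (m)
z_mass = 0.5 * (z_stag[:-1] + z_stag[1:])  # heights where QVAPOR is defined (m)

# Mixing ratio at an arbitrary elevation, by linear interpolation between mass levels
qv_300m = np.interp(300.0, z_mass, qv)
```

For a full 3D field the same two lines apply along the bottom_top dimension of each column; Q2 needs no such treatment since it is already diagnosed at 2 m.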
From reenb at meteo.psu.edu Wed Sep 7 12:19:20 2011 From: reenb at meteo.psu.edu (Brian Reen) Date: Wed, 07 Sep 2011 14:19:20 -0400 Subject: [Wrf-users] OBS nudging In-Reply-To: <1B8D1B9BF4DCDC4A90A42E312FF308520752C7C9@irvine01.irvine.environ.local> References: <1B8D1B9BF4DCDC4A90A42E312FF308520752C7C9@irvine01.irvine.environ.local> Message-ID: <4E67B5A8.3080802@meteo.psu.edu> Bart, Regarding the auxinput11_interval_* settings for obs nudging, I think the following email written by Al Bourgeois at NCAR will clear things up: ********************************************************************* There is some redundancy here with the auxinput11_interval and the obs-nudging obs_ionf switch. All the auxinput11_interval variable really does is give a frequency at which "wrf alarms" calls subroutine wrf_fddaobs_in, which in turn calls subroutine in4dob to do the reading of obs data based on the obs_ionf flag. So the safest thing to do is simply to force the wrf alarms to call wrf_fddaobs_in every model timestep. You can do this with the following namelist entry ... this is what I always use:

auxinput11_interval = 1, 1, 1, 1, 1,
auxinput11_end_h = 99999,99999,99999,99999,99999,

(Notice I set "auxinput11_interval" here, not "auxinput11_interval_s". Setting auxinput11_interval = 1 means wrf alarms will call wrf_fddaobs_in on every model iteration.) You can accomplish the same thing with auxinput11_interval_s, as long as you set it equal to or smaller than the model timestep in seconds. This will cause wrf alarms to check for input (that is, to call wrf_fddaobs_in) at a frequency of every model timestep. For example, with the above setting, if you used obs_ionf=2, the result would be that wrf_fddaobs_in is called on every model iteration, but input would only be read (i.e., sub in4dob would only be called) on every other model iteration.
Having wrf alarms check for input on every iteration is an insignificant amount of overhead, and it is the safest thing to do so that, regardless of the obs_ionf setting, a read step is never skipped. You can tell from the output if you are getting the fddaobs input at the correct frequency by searching for lines that contain the text "CALL IN4DOB AT". For example:

% grep "CALL IN4" rsl.out.0000
****** CALL IN4DOB AT KTAU = 0 AND XTIME = 0.00: NSTA = 10 ******
****** CALL IN4DOB AT KTAU = 2 AND XTIME = 5.00: NSTA = 10 ******
****** CALL IN4DOB AT KTAU = 4 AND XTIME = 10.00: NSTA = 10 ******

shows that in4dob is reading obs input every other model iteration (ktau), at a 5-minute interval. (In this example, obs_ionf = 2, time_step = 150 (seconds), and auxinput11_interval = 1.) ****************************************************** On 8/26/2011 2:38 PM, Bart Brashers wrote: > I'm replying to my own question here, so others can find the solution. > Solved. Tags for searching: hourly OBS nudging, OBSGRID, WRF 3.3. > Solution first, statement of the problem below. > > SOLUTION: > > In namelist.wps, set (among other settings): > > &share > interval_seconds = 21600 > > to match your input GRIB data (every 6 hours for my data). > > Make your obs_filename (OBS:*, little_r format) files each contain only 1 hour of data. This conflicts somewhat with the statement in the WRF v3.3 Users Guide htm#namelist> , Chapter 7, under OBSGRID Namelist: "Ideally there should be an obs_filename for each time period for which an objective analysis is desired." I suppose I should interpret "objective analysis" as either the 3D objective analysis or the 2D surface analysis. > > I believe it should say that there should be an obs_filename for each intf4d (surface analysis) period. This is the key.
> In namelist.obsgrid, set (among other settings):
>
> &record1
>   interval = 21600
> &record2
>   obs_filename = path/to/OBS
> &record7
>   f4d = .TRUE.
>   intf4d = 3600
>
> This will produce OBS_DOMAIN101 with 1 hour of data (+/- 30 minutes from the start time); OBS_DOMAIN102 with 5 hours of data; OBS_DOMAIN103 with 1 hour of data (+/- 30 minutes from start + interval [6 hours]); etc.
>
> You must concatenate all the OBS_DOMAIN1?? files into a single OBS_DOMAIN101 file in the WRF run directory. The Users Guide does not explicitly state this, though the latest tutorial does.
>
> auxinput11_interval_m is the minimum time interval you would like to check for new observations. If you want to nudge every hour, set this to 3600. Note that this is listed in test/em_real/README.obs_fdda as auxinput11_interval_s = 180, (180 seconds) which I believe is a typo. They probably meant 180 minutes (3 hours).
>
> auxinput11_end_h is the time at which you want WRF to stop reading OBS_DOMAIN101.
>
> PROBLEM:
>
> Following the WRF v3.3 Users Guide, Chapter 7, my obs_filename (OBS:*, little_r format) files each contained 6 hours of data (to match interval = 21600).
>
> At each intf4d time interval, OBSGRID opens an obs_filename file with the current time-stamp. It DOES NOT check for the existence of the file before it opens the file.
>
> So if your obs_filename (little_r) files each contain 6 hours of data, then at hour +1 it opens (and thus creates) a file with that time-stamp. Of course it's empty (newly created) so no data are read, and no data are output to the OBS_DOMAIN* file for that hour.
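[Editorial aside, not part of the original thread.] The concatenation step Bart mentions is a one-liner. The sketch below only illustrates the mechanics: the printf lines create dummy stand-ins for real OBSGRID output (real files contain little_r records, not one-line markers).

```shell
# Dummy stand-ins for per-analysis-window OBSGRID output (illustration only)
printf 'window-1 obs\n' > OBS_DOMAIN101
printf 'window-2 obs\n' > OBS_DOMAIN102
printf 'window-3 obs\n' > OBS_DOMAIN103

# Merge every OBS_DOMAIN1?? file (glob expansion sorts by name, keeping time order)
# into the single OBS_DOMAIN101 that wrf.exe reads from the run directory
cat OBS_DOMAIN1?? > OBS_DOMAIN101.merged
mv OBS_DOMAIN101.merged OBS_DOMAIN101
```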
If you started with > obs_filename files like this: > > > > -rw-rw-rw- 1 username 650M Aug 25 10:02 OBS:2008-01-29_12 > > -rw-rw-rw- 1 username 630M Aug 25 10:10 OBS:2008-01-29_18 > > -rw-rw-rw- 1 username 650M Aug 25 10:16 OBS:2008-01-30_00 > > > > Then after OBSGRID.EXE had run, you'll have files like this: > > > > -rw-rw-rw- 1 username 650M Aug 25 10:02 OBS:2008-01-29_12 > > -rw-rw-rw- 1 username 0 Aug 25 10:03 OBS:2008-01-29_13 > > -rw-rw-rw- 1 username 0 Aug 25 10:05 OBS:2008-01-29_14 > > -rw-rw-rw- 1 username 0 Aug 25 10:06 OBS:2008-01-29_15 > > -rw-rw-rw- 1 username 0 Aug 25 10:07 OBS:2008-01-29_16 > > -rw-rw-rw- 1 username 0 Aug 25 10:08 OBS:2008-01-29_17 > > -rw-rw-rw- 1 username 630M Aug 25 10:10 OBS:2008-01-29_18 > > -rw-rw-rw- 1 username 0 Aug 25 10:10 OBS:2008-01-29_19 > > -rw-rw-rw- 1 username 0 Aug 25 10:12 OBS:2008-01-29_20 > > -rw-rw-rw- 1 username 0 Aug 25 10:13 OBS:2008-01-29_21 > > -rw-rw-rw- 1 username 0 Aug 25 10:14 OBS:2008-01-29_22 > > -rw-rw-rw- 1 username 0 Aug 25 10:15 OBS:2008-01-29_23 > > -rw-rw-rw- 1 username 650M Aug 25 10:16 OBS:2008-01-30_00 > > > > and your OBS_DOMAIN1?? files will contain only data +/- 30 minutes from > each interval (each 6 hours). > > > > Bart > > > > From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On > Behalf Of Bart Brashers > Sent: Wednesday, August 24, 2011 5:09 PM > To: wrf-users at ucar.edu > Cc: wrfhelp at ucar.edu > Subject: [Wrf-users] OBS nudging > > > > I have a few questions about using OBS nudging in WRF v3.3. I'm using > the 12km NAM data, which comes in 6 hour intervals. WPS ran fine, and I > have met_em files every 6 hours. > > > > I have processed MADIS data to little_r format, and named the files > OBS:YYYY-MM-DD_HH. Each file contains 6 hours of data, which I've > confirmed via grep (e.g. "grep 200801 OBS:2008-01-29_18 | cut -c 327-336 > | sort | uniq" shows the right time-stamps). > > > > 1. OBSGRID.EXE creates OBS_DOMAIN101, OBSDOMAIN102, etc. 
But each file > contains only ONE hour's data (the analysis hour +/- 30 minutes). The > OBS data for the next 5 hours is not output. How can I get OBSGRID.EXE > to create OBS_DOMAIN1* files that contain ALL the available data? I > want to nudge every hour, not every 6 hours. > > > > 2. The most recent notes in the WRF User's Guide > > make no mention having to concatenate the OBS_DOMAIN1* output from > OBSGRID.EXE to OBS_DOMAIN101, but the tutorial notes > > do say so. Which is correct? Will WRF not read OBS_DOMAIN102, > OBS_DOMAIN103, etc.? > > > > 3. If I want to nudge every hour, do I set auxinput11_interval_m = 60 in > WRF's namelist? Or is that supposed to match OBSGRID's record1 interval > ? > > > > 4. If I want to keep nudging till the very end of my simulation, can I > set auxinput11_end_h = 99999? > > > > The meaning of these two values is pretty unclear in the > README.namelist, README.obs_fdda, as well as the WRF User's Guide and > tutorial notes. If anyone can shed some light, I'd be most > appreciative. > > > > Thanks, > > > > Bart > > > > ________________________________ > > This message contains information that may be confidential, privileged > or otherwise protected by law from disclosure. It is intended for the > exclusive use of the Addressee(s). Unless you are the addressee or > authorized agent of the addressee, you may not review, copy, distribute > or disclose to anyone the message or any information contained within. > If you have received this message in error, please contact the sender by > electronic reply to email at environcorp.com and immediately delete all > copies of the message. > > > > This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. 
If you have received this message in error, please contact the sender by > electronic reply to email at environcorp.com and immediately delete all > copies of the message. > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users From andrew.robbie at gmail.com Wed Sep 7 18:54:42 2011 From: andrew.robbie at gmail.com (Andrew Robbie (GMail)) Date: Thu, 8 Sep 2011 10:54:42 +1000 Subject: [Wrf-users] correcting landmask In-Reply-To: <1315226589.30015.YahooMailNeo@web113107.mail.gq1.yahoo.com> References: <1315226589.30015.YahooMailNeo@web113107.mail.gq1.yahoo.com> Message-ID: Check out these tools: http://www.nusculus.com/wtools. Andrew On 05/09/2011 10:43 PM, "Dorita Rostkier-Edelstein" wrote: > Hi folks, > > Does anybody have a user-friendly method to correct the landmask in geo_em files? I have been using read_wrf_nc.f to change values. To identify the cells, or their i,j indexes in fact, I have plotted the landmask using the high-resolution coastline of NCL and by trial and error found the i,j. Does anybody have a better, more systematic way to locate i,j's with a wrong landmask? > > Thanks. > > Dorita -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110908/18e4ef64/attachment.html From anottrot at ucsd.edu Wed Sep 7 15:44:54 2011 From: anottrot at ucsd.edu (Anders A Nottrott) Date: Wed, 7 Sep 2011 14:44:54 -0700 Subject: [Wrf-users] How to limit number of threads in 'smpar' OpenMP runs? Message-ID: <008301cc6da7$616cfb80$2446f280$@ucsd.edu> Hi All, I am trying to run an ideal LES simulation as a parallel process (i.e. using option 'smpar'). I have an 8 core machine but I would like to limit the number of threads to 4. I tried setting the environment variable 'OMP_NUM_THREADS=4' prior to configuration and compilation.
However, when I run wrf.exe I get the following just before the simulation begins:

WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 8
WRF TILE 1 IS 1 IE 100 JS 1 JE 13
WRF TILE 2 IS 1 IE 100 JS 14 JE 26
WRF TILE 3 IS 1 IE 100 JS 27 JE 38
WRF TILE 4 IS 1 IE 100 JS 39 JE 50
WRF TILE 5 IS 1 IE 100 JS 51 JE 62
WRF TILE 6 IS 1 IE 100 JS 63 JE 74
WRF TILE 7 IS 1 IE 100 JS 75 JE 87
WRF TILE 8 IS 1 IE 100 JS 88 JE 100
WRF NUMBER OF TILES = 8

I looked in 'module_tiles.F' and it appears that the variable 'num_tiles' is assigned by the OMP_GET_MAX_THREADS function. Perhaps this is overriding the environment setting??? Note that when I check my system performance the code is running on all 8 cores, but this is not efficient since I want to run other processes on my machine. I also messed with some of the parameters in the namelist, e.g. 'tile_sz_x', 'tile_sz_y', 'numtiles', 'nproc_x', 'nproc_y', but to no avail. Any suggestions will be most appreciated. Regards, Anders Anders Nottrott PhD Student, Mechanical and Environmental Engineering University of California, San Diego Department of Mechanical and Aerospace Engineering Website: -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110907/d75a588f/attachment.html From jaareval at gmail.com Fri Sep 9 09:30:09 2011 From: jaareval at gmail.com (Jorge Alejandro Arevalo Borquez) Date: Fri, 9 Sep 2011 11:30:09 -0400 Subject: [Wrf-users] Old GFS data for initialize WRF Message-ID: Dear all, I need to perform some simulations with WRF for past times, but they must be under the same conditions as a forecast. I mean I need to run WRF with GFS data; FNL does not work for my study. My problem is getting GFS 1-degree forecast data for WRF for these times. Does anyone know where I can get it, or does someone have it stored and can share it with me?
I need at least daily forecast data up to 72 hours for these dates:

from November 12 to November 22 of 2004
from November 11 to November 21 of 2005
from November 01 to November 13 of 2006

Any help will be very appreciated. Regards Jorge Arévalo Bórquez Coordinador Lab. Modelación Atmosférica Departamento de Meteorología Universidad de Valparaíso, Chile -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110909/e43c99aa/attachment.html From moudipascal at yahoo.fr Thu Sep 8 07:55:18 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Thu, 8 Sep 2011 14:55:18 +0100 (BST) Subject: [Wrf-users] (no subject) Message-ID: <1315490118.2316.YahooMailNeo@web29017.mail.ird.yahoo.com> Dear All, I would like to know if I can build WRF 3.2.1 with the gfortran compiler on Fedora 15. Which netcdf version will work? Best regards Pascal MOUDI IGRI Ph-D Student Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Department of Physics Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel: +237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110908/c8df1196/attachment.html From siemprelgas at gmail.com Fri Sep 9 00:39:53 2011 From: siemprelgas at gmail.com (Guillermo Perez) Date: Fri, 9 Sep 2011 08:39:53 +0200 Subject: [Wrf-users] OBSGRID: Surface observations for Surface FDDA Message-ID: Greetings, I'm a Spanish Chemical Engineering student. I'm writing because I'm trying to run OBSGRID and I have a doubt about the ingestion of observational data. As I understand from the manual, I have concatenated upper air observations and surface observations in the same file (specifically, NCEP ADP data for both). I want to run it with f4d=.TRUE., and I set intf4d=3600.
Then when I run OBSGRID, it looks like the program doesn't process periods 00 06 12 18 for surface FDDA, but does so only for 03 09 15 21. My question is: am I doing it right? Do I have to copy the 00 file and rename it to 03 so that the program can process it? In MM5 it was possible to keep surface observations apart from upper-air ones, and the program also didn't need a file for each time period; that's why I'm a bit confused. Awaiting your reply. Thank you very much -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110909/63a6e0f5/attachment.html From william.gustafson at pnnl.gov Fri Sep 9 13:17:48 2011 From: william.gustafson at pnnl.gov (Gustafson, William I) Date: Fri, 9 Sep 2011 12:17:48 -0700 Subject: [Wrf-users] Old GFS data for initialize WRF In-Reply-To: Message-ID: Jorge, I typically get GFS data from NCDC at http://nomads.ncdc.noaa.gov/data.php#hires_weather_datasets. I believe NCAR also has GFS, but I have not used it from that location. -Bill On 9/9/11 8:30 AM, "Jorge Alejandro Arevalo Borquez" wrote: Dear all, I need to perform some simulations with WRF for past times, but they must be under the same conditions as a forecast. I mean I need to run WRF with GFS data; FNL does not work for my study. My problem is getting GFS 1-degree forecast data for WRF for these times. Does anyone know where I can get it? Or does someone have it stored and can share it with me? I need at least daily forecast data up to 72 hours for the following dates: from November 12 to November 22 of 2004 from November 11 to November 21 of 2005 from November 01 to November 13 of 2006 Any help will be very appreciated Regards Jorge Arévalo Bórquez Coordinador Lab. Modelación Atmosférica Departamento de Meteorología Universidad de Valparaíso, Chile _______________________________________________ William I. Gustafson Jr., Ph.D.
Scientist ATMOSPHERIC SCIENCES AND GLOBAL CHANGE DIVISION Pacific Northwest National Laboratory P.O. 999, MSIN K9-30 Richland, WA 99352 Tel: 509-372-6110 William.Gustafson at pnnl.gov http://www.pnnl.gov/atmospheric/staff/staff_info.asp?staff_num=5716 http://www.researcherid.com/rid/A-7732-2008 From maemarcus at gmail.com Sat Sep 10 02:38:33 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Sat, 10 Sep 2011 12:38:33 +0400 Subject: [Wrf-users] (no subject) In-Reply-To: <1315490118.2316.YahooMailNeo@web29017.mail.ird.yahoo.com> References: <1315490118.2316.YahooMailNeo@web29017.mail.ird.yahoo.com> Message-ID: Hi Pascal, WRF 3.2 should use netcdf 4.1.1. On Fedora 15 I compiled 3.3 with 4.2.1, and it works fine. - D. 2011/9/8 moudi pascal : > Dear All, > I would like to know if i can build WRF3.2.1 with gfortran compiler on > fedora 15? > What should be the netcdf version which will work > Best regards > > Pascal MOUDI IGRI > > Ph-D Student > Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) > Department of Physics > Faculty of Science > University of Yaounde I, Cameroon > National Advanced Training School for Technical Education, > Electricity Engineering, Douala > > Tel:+237 75 32 58 52 > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > From maemarcus at gmail.com Sat Sep 10 02:54:27 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Sat, 10 Sep 2011 12:54:27 +0400 Subject: [Wrf-users] How to limit number of threads in 'smpar' OpenMP runs? In-Reply-To: <008301cc6da7$616cfb80$2446f280$@ucsd.edu> References: <008301cc6da7$616cfb80$2446f280$@ucsd.edu> Message-ID: Hi Anders, > I tried setting the environment variable > 'OMP_NUM_THREADS=4' prior to configuration and compilation. I think OMP_NUM_THREADS is a runtime setting, i.e. it affects only the resulting OpenMP application. 
So you should specify OMP_NUM_THREADS=4 on the wrf.exe invocation. Then OMP_GET_MAX_THREADS should be adjusted to this number. My guess is based on the following test case: [marcusmae at loveland omptest]$ cat omptest.c #include <stdio.h> #include <omp.h> int main() { printf("omp_get_max_threads = %d\n", omp_get_max_threads()); return 0; } [marcusmae at loveland omptest]$ make gcc -fopenmp omptest.c -o omptest [marcusmae at loveland omptest]$ ./omptest omp_get_max_threads = 2 [marcusmae at loveland omptest]$ OMP_NUM_THREADS=4 ./omptest omp_get_max_threads = 4 [marcusmae at loveland omptest]$ - D. 2011/9/8 Anders A Nottrott : > Hi All, > > > > I am trying to run an ideal LES simulation as a parallel process (i.e. using > option 'smpar'). I have an 8 core machine but I would like to limit the > number of threads to 4. I tried setting the environment variable > 'OMP_NUM_THREADS=4' prior to configuration and compilation. However, when I > run wrf.exe I get the following just before the simulation begins: > > WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 8 > WRF TILE 1 IS 1 IE 100 JS 1 JE 13 > WRF TILE 2 IS 1 IE 100 JS 14 JE 26 > WRF TILE 3 IS 1 IE 100 JS 27 JE 38 > WRF TILE 4 IS 1 IE 100 JS 39 JE 50 > WRF TILE 5 IS 1 IE 100 JS 51 JE 62 > WRF TILE 6 IS 1 IE 100 JS 63 JE 74 > WRF TILE 7 IS 1 IE 100 JS 75 JE 87 > WRF TILE 8 IS 1 IE 100 JS 88 JE 100 > WRF NUMBER OF TILES = 8 > > I looked in 'module_tiles.F' and it appears that the variable 'num_tiles' is > assigned by the OMP_GET_MAX_THREADS function. Perhaps this is overriding the > environment setting??? Note that when I check my system performance the code > is running on all 8 cores, but this is not efficient since I want to run > other processes on my machine. > > > > I also messed with some of the parameters in the namelist, e.g. 'tile_sz_x', > 'tile_sz_y', 'numtiles', 'nproc_x', 'nproc_y', but to no avail.
> > > > Regards, > > > > Anders > > > > > > Anders Nottrott > > PhD Student, Mechanical and Environmental Engineering > > University of California, San Diego > > Department of Mechanical and Aerospace Engineering > > Website: > > > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > From mkudsy at gmail.com Fri Sep 9 15:02:44 2011 From: mkudsy at gmail.com (M Kudsy) Date: Sat, 10 Sep 2011 04:02:44 +0700 Subject: [Wrf-users] (no subject) In-Reply-To: <1315490118.2316.YahooMailNeo@web29017.mail.ird.yahoo.com> References: <1315490118.2316.YahooMailNeo@web29017.mail.ird.yahoo.com> Message-ID: I built it cleanly on Fedora 14. There should not be a problem, I think, because my office friend has built it on 15. Gfortran is better than Intel ifort. On Thu, Sep 8, 2011 at 8:55 PM, moudi pascal wrote: > Dear All, > I would like to know if I can build WRF3.2.1 with the gfortran compiler on > Fedora 15? > Which netCDF version will work? > > Best regards > > Pascal MOUDI IGRI > > Ph-D Student > Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) > Department of Physics > Faculty of Science > University of Yaounde I, Cameroon > National Advanced Training School for Technical Education, > Electricity Engineering, Douala > > Tel:+237 75 32 58 52 > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- Dr.Mahally Kudsy Weather Modification Technology Center Agency for the Assessment and Application of Technology Jln MH Thamrin 8, Jakarta, Indonesia Telp:62-21-3168830 Fax:62-21-3906225 mkudsy at gmail.com -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110910/f43efad4/attachment.html From mmkamal at uwaterloo.ca Mon Sep 12 10:17:35 2011 From: mmkamal at uwaterloo.ca (mmkamal at uwaterloo.ca) Date: Mon, 12 Sep 2011 12:17:35 -0400 Subject: [Wrf-users] "NetCDF error: NetCDF: Attribute not found" error during real.exe Message-ID: <20110912121735.2769255i2tu06t1c@www.nexusmail.uwaterloo.ca> Hi! I have been trying to run WRF 3.3 on my machine (configuration is given below) using the IBM compiler, but I am confused about one point. Although I have successfully finished running WRF there, the following message confused me. I am getting an error message, "NetCDF error: NetCDF: Attribute not found", in the "rsl.error.0000" and "rsl.out.0000" log files but not in the rest of the log files. Fortunately, I am getting the message "real_em: SUCCESS COMPLETE REAL_EM INIT" at the end of each log file, and running wrf.exe generates history files. I have checked the output as well and it looks fine. Could anyone please tell me whether it is a problem or not? I look forward to hearing from you. Thanks Kamal PhD student University of Waterloo, Canada =================================================== Machine configuration =================================================== 3,328 cores of IBM Power 6 (4.7 GHz) Operating System AIX 5.3 Interconnect Infiniband Ram/Node 128 GB (256 on 2 nodes) Cores/Node 32 Vendor Compilers xlc (C) xlf (fortran) xlC (C++) ---------------------------------------------------------- Libraries used 1) netcdf/4.0.1_nc3 2) parallel-netcdf/1.1.1 =================================================== I get the following message from LoadLeveler ================================================== From: LoadLeveler LoadLeveler Job Step: tcs-f11n06.33773.0 Executable: /scratch/mkamal/WRF_tcs/WRFV3/test/em_real/real.exe Executable arguments: State for machine: tcs-f08n10 LoadL_starter: The program, real.exe, exited normally and returned an exit code of 0.
State for machine: tcs-f08n12 This job step was dispatched to run 1 time(s). This job step was rejected by Starter 0 time(s). Submitted at: Thu Sep 8 17:58:14 2011 Started at: Thu Sep 8 17:58:20 2011 Exited at: Thu Sep 8 18:00:56 2011 Real Time: 0 00:02:42 Job Step User Time: 0 02:24:16 Job Step System Time: 0 00:11:32 Total Job Step Time: 0 02:35:48 Starter User Time: 0 00:00:00 Starter System Time: 0 00:00:13 Total Starter Time: 0 00:00:13 --------------------------------------------------------------------- ############################################## gpc-f101n084-$ vi rsl.error.0000 ############################################## taskid: 0 hostname: tcs-f08n10 Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire Ntasks in X 8 , ntasks in Y 16 --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 2, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 3, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 3, setting sgfdda interval and ending time to 0 for that domain. 
--- NOTE: obs_nudge_opt is 0 for domain 3, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 4 REAL_EM V3.3 PREPROCESSOR real_em: calling alloc_and_configure_domain ************************************* Parent domain ids,ide,jds,jde 1 225 1 175 ims,ime,jms,jme -4 35 -4 18 ips,ipe,jps,jpe 1 28 1 11 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1 , 20923412 bytes allocated setup_timekeeping: set xtime to 0.0000000000E+00 setup_timekeeping: set julian to 182.0000000 setup_timekeeping: returning... d01 2002-07-02_00:00:00 real_em: calling set_scalar_indices_from_config d01 2002-07-02_00:00:00 real_em: calling model_to_grid_config_rec d01 2002-07-02_00:00:00 real_em: calling init_wrfio d01 2002-07-02_00:00:00 Entering ext_gr1_ioinit d01 2002-07-02_00:00:00 real_em: re-broadcast the configuration records d01 2002-07-02_00:00:00 calling med_sidata_input d01 2002-07-02_00:00:00 med_sidata_input: calling open_r_dataset for met_em.d. 
d01 2002-07-02_00:00:00 med_sidata_input: calling input_auxinput1 metgrid input_wrf.F first_date_input = 2002-07-02_00:00:00 metgrid input_wrf.F first_date_nml = 2002-07-02_00:00:00 d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Attribute not found d01 2002-07-02_00:00:00 NetCDF error in ext_ncd_get_dom_ti.code REAL, line 83 Element P_TOP d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Attribute not found d01 2002-07-02_00:00:00 NetCDF error in ext_ncd_get_dom_ti.code REAL, line 83 Element GMT d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Attribute not found d01 2002-07-02_00:00:00 NetCDF error in ext_ncd_get_dom_ti.code INTEGER, line 83 Element JULYR d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Attribute not found d01 2002-07-02_00:00:00 NetCDF error in ext_ncd_get_dom_ti.code INTEGER, line 83 Element JULDAY d01 2002-07-02_00:00:00 mminlu = 'USGS' ###################################### gpc-f101n084-$ vi rsl.out.0000 ###################################### taskid: 0 hostname: tcs-f08n10 Namelist dfi_control not found in namelist.input. Using registry defaults for variables in dfi_control Namelist tc not found in namelist.input. Using registry defaults for variables in tc Namelist scm not found in namelist.input. Using registry defaults for variables in scm Namelist fire not found in namelist.input. Using registry defaults for variables in fire Ntasks in X 8 , ntasks in Y 16 --- NOTE: grid_fdda is 0 for domain 1, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 1, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 1, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 2, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 2, setting sgfdda interval and ending time to 0 for that domain. 
--- NOTE: obs_nudge_opt is 0 for domain 2, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: grid_fdda is 0 for domain 3, setting gfdda interval and ending time to 0 for that domain. --- NOTE: both grid_sfdda and pxlsm_soil_nudge are 0 for domain 3, setting sgfdda interval and ending time to 0 for that domain. --- NOTE: obs_nudge_opt is 0 for domain 3, setting obs nudging interval and ending time to 0 for that domain. --- NOTE: num_soil_layers has been set to 4 REAL_EM V3.3 PREPROCESSOR real_em: calling alloc_and_configure_domain ************************************* Parent domain ids,ide,jds,jde 1 225 1 175 ims,ime,jms,jme -4 35 -4 18 ips,ipe,jps,jpe 1 28 1 11 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1 , 20923412 bytes allocated setup_timekeeping: set xtime to 0.0000000000E+00 setup_timekeeping: set julian to 182.0000000 setup_timekeeping: returning... d01 2002-07-02_00:00:00 real_em: calling set_scalar_indices_from_config d01 2002-07-02_00:00:00 real_em: calling model_to_grid_config_rec d01 2002-07-02_00:00:00 real_em: calling init_wrfio d01 2002-07-02_00:00:00 Entering ext_gr1_ioinit d01 2002-07-02_00:00:00 real_em: re-broadcast the configuration records d01 2002-07-02_00:00:00 calling med_sidata_input Time period # 1 to process = 2002-07-02_00:00:00. Time period # 2 to process = 2002-07-02_03:00:00. Time period # 3 to process = 2002-07-02_06:00:00. Time period # 4 to process = 2002-07-02_09:00:00. Time period # 5 to process = 2002-07-02_12:00:00. Time period # 6 to process = 2002-07-02_15:00:00. Time period # 7 to process = 2002-07-02_18:00:00. Time period # 8 to process = 2002-07-02_21:00:00. Time period # 9 to process = 2002-07-03_00:00:00. Total analysis times to input = 9. 
----------------------------------------------------------------------------- Domain 1: Current date being processed: 2002-07-02_00:00:00.0000, which is loop # 1 out of 9 configflags%julyr, %julday, %gmt: 2002 183 0.0000000000E+00 d01 2002-07-02_00:00:00 med_sidata_input: calling open_r_dataset for met_em.d. d01 2002-07-02_00:00:00 med_sidata_input: calling input_auxinput1 metgrid input_wrf.F first_date_input = 2002-07-02_00:00:00 metgrid input_wrf.F first_date_nml = 2002-07-02_00:00:00 d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Attribute not found d01 2002-07-02_00:00:00 NetCDF error in ext_ncd_get_dom_ti.code REAL, line 83 Element P_TOP d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Attribute not found d01 2002-07-02_00:00:00 NetCDF error in ext_ncd_get_dom_ti.code REAL, line 83 Element GMT d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Attribute not found d01 2002-07-02_00:00:00 NetCDF error in ext_ncd_get_dom_ti.code INTEGER, line 83 Element JULYR d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Attribute not found d01 2002-07-02_00:00:00 NetCDF error in ext_ncd_get_dom_ti.code INTEGER, line 83 Element JULDAY d01 2002-07-02_00:00:00 mminlu = 'USGS' d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Variable not found d01 2002-07-02_00:00:00 NetCDF error in wrf_io.F90, line 2712 Varname PTHETA d01 2002-07-02_00:00:00 NetCDF error: NetCDF: Variable not found d01 2002-07-02_00:00:00 NetCDF error in wrf_io.F90, line 2712 Varname QV ========================================================================== Configure.wrf ========================================================================== #### Architecture specific settings #### # Settings for AIX xlf compiler with xlc (dmpar) # DMPARALLEL = 1 OMPCPP = # -D_OPENMP OMP = # -qsmp=noauto OMPCC = # -qsmp=noauto SFC = xlf90_r SCC = cc_r SC99 = c99_r CCOMP = cc_r DM_FC = mpxlf90_r DM_CC = mpcc_r -DMPI2_SUPPORT FC = timex $(DM_FC) CC = $(DM_CC) -DFSEEKO64_OK LD = $(FC) RWORDSIZE = $(NATIVE_RWORDSIZE) PROMOTION = 
-qrealsize=$(RWORDSIZE) -qintsize=4 ARCH_LOCAL = -DNONSTANDARD_SYSTEM_SUBR -DNATIVE_MASSV CFLAGS_LOCAL = -DNOUNDERSCORE LDFLAGS_LOCAL = -lmass -lmassv CPLUSPLUSLIB = -lC ESMF_LDFLAG = $(CPLUSPLUSLIB) # -qhot commented out in 3.0.1.1 release because of reported problems with # model results under certain configurations. Use at your own risk. FCOPTIM = -O3 # -qhot FCREDUCEDOPT = -O2 FCNOOPT = -qnoopt FCDEBUG = # -g $(FCNOOPT) -qfullpath FORMAT_FIXED = -qfixed FORMAT_FREE = -qfree=f90 FCSUFFIX = -qsuffix=f=f90 BYTESWAPIO = FCBASEOPTS_NO_G = -w -qspill=20000 -qmaxmem=32767 $(FORMAT_FREE) $(BYTESWAPIO) #-qflttrap=zerodivide:invalid:enable -qsigtrap -C # -qinitauto=7FF7FFFF FCBASEOPTS = $(FCBASEOPTS_NO_G) $(FCDEBUG) MODULE_SRCH_FLAG = TRADFLAG = CPP = /lib/cpp -C -P AR = ar ARFLAGS = ru M4 = m4 -B 14000 RANLIB = ranlib CC_TOOLS = cc ========================================================================== LIB_EXTERNAL = \ -L$(WRF_SRC_ROOT_DIR)/external/io_netcdf -lwrfio_nf -L/scinet/tcs/Libraries/netcdf-4.0.1_nc3/lib -lnetcdf -L$(WRF_SRC_ROOT_DIR)/external/io_pnetcdf -lwrfio_pnf -L/scinet/tcs/Libraries/parallel-netcdf-1.1.1/lib -lpnetcdf ########################################################### =========================================================== Configure.wps =========================================================== #### Architecture specific settings #### # Settings for AIX DM parallel, NO GRIB2 # COMPRESSION_LIBS = COMPRESSION_INC = FDEFS = NCARG_LIBS = NCARG_LIBS2 = -L/usr/local/lib64/r4i4 -lncarg -lncarg_gks -lncarg_c \ -L/usr/X11R6/lib -lX11 -lpng_ncl -lz_ncl FC = mpxlf90_r SFC = xlf90_r FFLAGS = -qfree=f90 F77FLAGS = -qfixed FCSUFFIX = -qsuffix=f=f90 FNGFLAGS = $(FFLAGS) LDFLAGS = CC = mpcc_r SCC = cc CFLAGS = CPP = /usr/lib/cpp -C -P CPPFLAGS = -DAIX -DIBM4 -DIO_NETCDF -DIO_BINARY -DIO_GRIB1 -D_MPI -DBIT32 ARFLAGS = ============================================================================ From saji at u-aizu.ac.jp Sun Sep 11 22:14:42 2011 From: 
saji at u-aizu.ac.jp (Saji Hameed) Date: Mon, 12 Sep 2011 13:14:42 +0900 Subject: [Wrf-users] How to limit number of threads in 'smpar' OpenMP runs? In-Reply-To: <008301cc6da7$616cfb80$2446f280$@ucsd.edu> References: <008301cc6da7$616cfb80$2446f280$@ucsd.edu> Message-ID: Hi Anders, Setting OMP_NUM_THREADS should do fine. Exactly how do you set this? For example for a bash shell, one should 'export' the environment variable export OMP_NUM_THREADS=4 saji On Thu, Sep 8, 2011 at 6:44 AM, Anders A Nottrott wrote: > Hi All,**** > > ** ** > > I am trying to run an ideal LES simulation as a parallel process (i.e. > using option 'smpar'). I have an 8 core machine but I would like to limit > the number of threads to 4. I tried setting the environment variable > 'OMP_NUM_THREADS=4' prior to configuration and compilation. However, when I > run wrf.exe I get the following just before the simulation begins: > > WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 8 > WRF TILE 1 IS 1 IE 100 JS 1 JE 13 > WRF TILE 2 IS 1 IE 100 JS 14 JE 26 > WRF TILE 3 IS 1 IE 100 JS 27 JE 38 > WRF TILE 4 IS 1 IE 100 JS 39 JE 50 > WRF TILE 5 IS 1 IE 100 JS 51 JE 62 > WRF TILE 6 IS 1 IE 100 JS 63 JE 74 > WRF TILE 7 IS 1 IE 100 JS 75 JE 87 > WRF TILE 8 IS 1 IE 100 JS 88 JE 100 > WRF NUMBER OF TILES = 8 > > I looked in 'module_tiles.F' and it appears that the variable 'num_tiles' > is assigned by the OMP_GET_MAX_THREADS function. Perhaps this is overriding > the environment setting??? Note that when I check my system performance the > code is running on all 8 cores, but this is not efficient since I want to > run other process on my machine. **** > > ** ** > > I also messed with some of the parameters in the namelist, e.g. > ?tile_sz_x?, ?tile_sz_y?, ?numtiles?, ?nproc_x?, ?nproc_y?, but to no avail. 
> **** > > Any suggestions will be most appreciated.**** > > ** ** > > Regards,**** > > ** ** > > Anders**** > > ** ** > > ** ** > > Anders Nottrott**** > > *PhD Student, Mechanical and Environmental Engineering* > > University of California, San Diego**** > > Department of Mechanical and Aerospace Engineering**** > > Website: **** > > ** ** > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -- Saji N Hameed, ARC-ENV, Center for Advanced Information Science and Technology, University of Aizu, Tsuruga, Ikki-machi, Aizuwakamatsu-shi, Fukushima 965-8580, Japan Tel: +81242 37-2736 Fax:+81242 37-2760 email: saji at u-aizu.ac.jp url: http://www.u-aizu.ac.jp bib: http://www.researcherid.com/rid/B-9188-2009 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110912/d91d3b64/attachment-0001.html From wrf at nusculus.com Sat Sep 10 12:41:21 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Sat, 10 Sep 2011 12:41:21 -0600 Subject: [Wrf-users] How to limit number of threads in 'smpar' OpenMP runs? In-Reply-To: <008301cc6da7$616cfb80$2446f280$@ucsd.edu> References: <008301cc6da7$616cfb80$2446f280$@ucsd.edu> Message-ID: Hi Anders, The "OMP_NUM_THREADS=4" needs to be set prior to running the program rather than before compiling. That environment variable from the shell is checked implicitly at the beginning of each run. I have used it on my Linux workstation in the past. I imagine the newer releases will work just fine too, although I have not tried it with any "ideal" simulations. Kevin On Wed, Sep 7, 2011 at 3:44 PM, Anders A Nottrott wrote: > Hi All,**** > > ** ** > > I am trying to run an ideal LES simulation as a parallel process (i.e. > using option 'smpar'). I have an 8 core machine but I would like to limit > the number of threads to 4.
I tried setting the environment variable > 'OMP_NUM_THREADS=4' prior to configuration and compilation. However, when I > run wrf.exe I get the following just before the simulation begins: > > WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 8 > WRF TILE 1 IS 1 IE 100 JS 1 JE 13 > WRF TILE 2 IS 1 IE 100 JS 14 JE 26 > WRF TILE 3 IS 1 IE 100 JS 27 JE 38 > WRF TILE 4 IS 1 IE 100 JS 39 JE 50 > WRF TILE 5 IS 1 IE 100 JS 51 JE 62 > WRF TILE 6 IS 1 IE 100 JS 63 JE 74 > WRF TILE 7 IS 1 IE 100 JS 75 JE 87 > WRF TILE 8 IS 1 IE 100 JS 88 JE 100 > WRF NUMBER OF TILES = 8 > > I looked in 'module_tiles.F' and it appears that the variable 'num_tiles' > is assigned by the OMP_GET_MAX_THREADS function. Perhaps this is overriding > the environment setting??? Note that when I check my system performance the > code is running on all 8 cores, but this is not efficient since I want to > run other process on my machine. **** > > ** ** > > I also messed with some of the parameters in the namelist, e.g. > ?tile_sz_x?, ?tile_sz_y?, ?numtiles?, ?nproc_x?, ?nproc_y?, but to no avail. > **** > > ** ** > > Any suggestions will be most appreciated.**** > > ** ** > > Regards,**** > > ** ** > > Anders**** > > ** ** > > ** ** > > Anders Nottrott**** > > *PhD Student, Mechanical and Environmental Engineering* > > University of California, San Diego**** > > Department of Mechanical and Aerospace Engineering**** > > Website: **** > > ** ** > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110910/83a8cb00/attachment.html From moudipascal at yahoo.fr Tue Sep 13 03:50:43 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Tue, 13 Sep 2011 10:50:43 +0100 (BST) Subject: [Wrf-users] WRF compilation with gfortran Message-ID: <1315907443.47695.YahooMailNeo@web29015.mail.ird.yahoo.com> I am using Linux hpz800 2.6.40.3-0.fc15.x86_64 #1 SMP Tue Aug 16 04:10:59 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux. For configure, I used option 17 (gfortran with gcc, dmpar). The compilation is unsuccessful. Pascal MOUDI IGRI Ph-D Student Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Department of Physics Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110913/be078a69/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: compile Type: application/octet-stream Size: 61302 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110913/be078a69/attachment-0001.obj From maemarcus at gmail.com Tue Sep 13 09:57:52 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Tue, 13 Sep 2011 19:57:52 +0400 Subject: [Wrf-users] WRF compilation with gfortran In-Reply-To: <1315907443.47695.YahooMailNeo@web29015.mail.ird.yahoo.com> References: <1315907443.47695.YahooMailNeo@web29015.mail.ird.yahoo.com> Message-ID: Hi Pascal, call random_seed (PUT=seed) Error: Size of 'put' argument of 'random_seed' intrinsic at (1) too small (8/12) - I met this too. The workaround is to enlarge the seed variable as requested. 2011/9/13 moudi pascal : > I am using > Linux hpz800 2.6.40.3-0.fc15.x86_64 #1 SMP Tue Aug 16 04:10:59 UTC 2011 > x86_64 x86_64 x86_64 GNU/Linux.
> > For configure, I used option 17 (gfortran with gcc, dmpar) > The compilation is unsuccessful > > > Pascal MOUDI IGRI > > Ph-D Student > Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) > Department of Physics > Faculty of Science > University of Yaounde I, Cameroon > National Advanced Training School for Technical Education, > Electricity Engineering, Douala > > Tel:+237 75 32 58 52 > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > > From maemarcus at gmail.com Tue Sep 13 10:54:56 2011 From: maemarcus at gmail.com (Dmitry N. Mikushin) Date: Tue, 13 Sep 2011 20:54:56 +0400 Subject: [Wrf-users] Re : WRF compilation with gfortran In-Reply-To: <1315931605.90614.YahooMailNeo@web29006.mail.ird.yahoo.com> References: <1315907443.47695.YahooMailNeo@web29015.mail.ird.yahoo.com> <1315931605.90614.YahooMailNeo@web29006.mail.ird.yahoo.com> Message-ID: In file phys/module_cu_g3.F, replace the line integer, dimension (8) :: seed with the line integer, dimension (12) :: seed - D. 2011/9/13 moudi pascal : > What do I have to do, please? > > Pascal MOUDI IGRI > > Ph-D Student > Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) > Department of Physics > Faculty of Science > University of Yaounde I, Cameroon > National Advanced Training School for Technical Education, > Electricity Engineering, Douala > > Tel:+237 75 32 58 52 > ________________________________ > From: Dmitry N. Mikushin > To: moudi pascal > Cc: WRF DA ; WRF User's ; > "wrf_users at ucar.edu" > Sent: Tuesday, 13 September 2011, 16:57 > Subject: Re: [Wrf-users] WRF compilation with gfortran > > Hi Pascal, > > call random_seed (PUT=seed) > > Error: Size of 'put' argument of 'random_seed' intrinsic at (1) too small > (8/12) > > - I met this too. The workaround is to enlarge the seed variable as requested.
> > 2011/9/13 moudi pascal : >> I am using >> Linux hpz800 2.6.40.3-0.fc15.x86_64 #1 SMP Tue Aug 16 04:10:59 UTC 2011 >> x86_64 x86_64 x86_64 GNU/Linux. >> >> For configure, I used option 17 (gfortran with gcc, dmpar) >> The compilation is unsuccessful >> >> >> Pascal MOUDI IGRI >> >> Ph-D Student >> Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) >> Department of Physics >> Faculty of Science >> University of Yaounde I, Cameroon >> National Advanced Training School for Technical Education, >> Electricity Engineering, Douala >> >> Tel:+237 75 32 58 52 >> _______________________________________________ >> Wrf-users mailing list >> Wrf-users at ucar.edu >> http://mailman.ucar.edu/mailman/listinfo/wrf-users >> >> > > > From anottrot at ucsd.edu Fri Sep 16 13:18:32 2011 From: anottrot at ucsd.edu (Anders A Nottrott) Date: Fri, 16 Sep 2011 12:18:32 -0700 Subject: [Wrf-users] Explanation of array dimension in module_sf_sfclay? Message-ID: <002b01cc74a5$6f5f6390$4e1e2ab0$@ucsd.edu> Hi All, I am looking into the surface layer module for Monin-Obukhov similarity in LES (module_sf_sfclay.F). I wondered if someone could confirm the actual spatial variables associated with 3D domain variables. For example, there is a 3-dimensional array 'U3D' containing u-velocity interpolated to theta points, so U3D has indices (i, j, k). For now I am working on the assumption that these indices correspond to: i = x-dimension j = z-dimension k = y-dimension Can anyone confirm whether this is correct? Regards, Anders Anders Nottrott PhD Student, Mechanical and Environmental Engineering University of California, San Diego Department of Mechanical and Aerospace Engineering Website: -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110916/8b4324bd/attachment.html From ecarlste at gmail.com Wed Sep 14 16:29:10 2011 From: ecarlste at gmail.com (Erik Carlsten) Date: Wed, 14 Sep 2011 16:29:10 -0600 Subject: [Wrf-users] Which WRF input data set to use for Water Vapor Mixing Ratio @ 15 minute intervals? Message-ID: Hi everyone, Up until now I have been using AWIP input data as input for running WRF to determine Water Vapor Mixing Ratio at a given altitude above Bozeman, MT. We are trying to compare these numbers with LIDAR data we are collecting. The problems I am having currently is that the data set I am using is not available until 2-4 weeks after it has been collected and also that the lowest interval time I can specify when running WPS on my current data set is 3 hours. If at all possible I would like to find a data set that will allow me to determine the Water Vapor Mixing Ratio that meets the following criteria (or at least as close as possible): 1. Data which is sampled in 15-30 minute intervals 2. The data is available within 1 week of being sampled If anyone has any ideas on data sets which will allow us to do this which meet these requirements or are at least closer than my current solution or could point me in the right direction about who to talk to about this, the help would be greatly appreciated. Thanks all, Erik -- Erik S. Carlsten Cell: (406) 570-1547 EPS 116 Lab: (406) 994-6145 Montana State University Bozeman, MT 59717 --------------------------------------------------------------- There is a single light of science, and to brighten it anywhere is to brighten it everywhere. - Isaac Asimov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110914/342c3f0f/attachment.html From moudipascal at yahoo.fr Thu Sep 22 04:22:06 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Thu, 22 Sep 2011 11:22:06 +0100 (BST) Subject: [Wrf-users] Error compiling WPP Message-ID: <1316686926.90737.YahooMailNeo@web29010.mail.ird.yahoo.com> Dear all, I have compiled WRFV3.3 using intel 11.1 on fedora15. Now in want to compile WPP3.2.1 using the same compiler, but i am having this: module_io_quilt.f90:(.text+0x105): undefined reference to `ext_gr2_ioinit_' module_io_quilt.f90:(.text+0x1378): undefined reference to `ext_gr2_inquire_filename_' module_io_quilt.f90:(.text+0x145b): undefined reference to `ext_gr2_open_for_write_commit_' module_io_quilt.f90:(.text+0x1691): undefined reference to `ext_gr2_open_for_write_begin_' module_io_quilt.f90:(.text+0x18a5): undefined reference to `ext_gr2_inquire_filename_' module_io_quilt.f90:(.text+0x18dc): undefined reference to `ext_gr2_ioclose_' module_io_quilt.f90:(.text+0x1c05): undefined reference to `ext_gr2_put_var_ti_char_' module_io_quilt.f90:(.text+0x1f19): undefined reference to `ext_gr2_put_dom_ti_char_' module_io_quilt.f90:(.text+0x2343): undefined reference to `ext_gr2_put_dom_ti_integer_' module_io_quilt.f90:(.text+0x26d8): undefined reference to `ext_gr2_put_dom_td_integer_' module_io_quilt.f90:(.text+0x2a61): undefined reference to `ext_gr2_put_dom_ti_real_' module_io_quilt.f90:(.text+0x2df6): undefined reference to `ext_gr2_put_dom_td_real_' module_io_quilt.f90:(.text+0x30fe): undefined reference to `ext_gr2_ioexit_' /home/lemap/WRFV_3_3_INTEL/WRFV3/main/libwrflib.a(module_io_quilt.o): In function `module_wrf_quilt_mp_quilt_': module_io_quilt.f90:(.text+0x6f39): undefined reference to `ext_gr2_ioinit_' module_io_quilt.f90:(.text+0x85d9): undefined reference to `ext_gr2_iosync_' module_io_quilt.f90:(.text+0x8802): undefined reference to `ext_gr2_inquire_filename_' module_io_quilt.f90:(.text+0x88e6): 
undefined reference to `ext_gr2_open_for_write_commit_' module_io_quilt.f90:(.text+0x8b69): undefined reference to `ext_gr2_open_for_write_begin_' module_io_quilt.f90:(.text+0x8d7b): undefined reference to `ext_gr2_inquire_filename_' module_io_quilt.f90:(.text+0x8db2): undefined reference to `ext_gr2_ioclose_' module_io_quilt.f90:(.text+0x90ee): undefined reference to `ext_gr2_put_var_ti_char_' module_io_quilt.f90:(.text+0x9416): undefined reference to `ext_gr2_put_dom_ti_char_' module_io_quilt.f90:(.text+0x983d): undefined reference to `ext_gr2_put_dom_ti_integer_' module_io_quilt.f90:(.text+0x9bc5): undefined reference to `ext_gr2_put_dom_td_integer_' module_io_quilt.f90:(.text+0x9f41): undefined reference to `ext_gr2_put_dom_ti_real_' module_io_quilt.f90:(.text+0xa2d3): undefined reference to `ext_gr2_put_dom_td_real_' module_io_quilt.f90:(.text+0xa6ea): undefined reference to `ext_gr2_ioexit_' /home/lemap/WRFV_3_3_INTEL/WRFV3/main/libwrflib.a(module_quilt_outbuf_ops.o): In function `module_quilt_outbuf_ops_mp_write_outbuf_': module_quilt_outbuf_ops.f90:(.text+0xbc4): undefined reference to `ext_gr2_write_field_' module_quilt_outbuf_ops.f90:(.text+0x1485): undefined reference to `ext_gr2_write_field_' /home/lemap/WRFV_3_3_INTEL/WRFV3/external/RSL_LITE/librsl_lite.a(f_xpose.o): In function `trans_z2x_': f_xpose.f:(.text+0x127b): undefined reference to `mpi_alltoallv_' f_xpose.f:(.text+0x178d): undefined reference to `mpi_alltoallv_' /home/lemap/WRFV_3_3_INTEL/WRFV3/external/RSL_LITE/librsl_lite.a(f_xpose.o): In function `trans_x2y_': f_xpose.f:(.text+0x2bf4): undefined reference to `mpi_alltoallv_' f_xpose.f:(.text+0x3103): undefined reference to `mpi_alltoallv_' make[2]: Leaving directory `/home/lemap/WRFV_3_3_INTEL/WPPV3/sorc/wrfpost' make[1]: Leaving directory `/home/lemap/WRFV_3_3_INTEL/WPPV3/sorc' [lemap at hpz800 WPPV3]# pwd /home/lemap/WRFV_3_3_INTEL/WPPV3 [lemap at hpz800 WPPV3]# someone can help to fix the problem? Regards ? 
Pascal MOUDI IGRI PhD Student Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Department of Physics Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110922/d61ba7e0/attachment.html From sam.hawkins at vattenfall.com Thu Sep 22 05:02:28 2011 From: sam.hawkins at vattenfall.com (sam.hawkins at vattenfall.com) Date: Thu, 22 Sep 2011 13:02:28 +0200 Subject: [Wrf-users] sources of SST data Message-ID: <25EBE9C717C6244A97CC1E31BF688EFB0317CC3766@SMMABOX0744.eur.corp.vattenfall.com> Dear WRF Users, Can anyone give me some advice on the best source of high-resolution SST data for use in WRF in Northern Europe? Does anyone have any experience using different high-resolution SST data sets in WRF? Thanks, Sam Hawkins From ahah at risoe.dtu.dk Fri Sep 23 02:53:05 2011 From: ahah at risoe.dtu.dk (Hahmann, Andrea N.) Date: Fri, 23 Sep 2011 10:53:05 +0200 Subject: [Wrf-users] sources of SST data In-Reply-To: <25EBE9C717C6244A97CC1E31BF688EFB0317CC3766@SMMABOX0744.eur.corp.vattenfall.com> Message-ID: Dear Sam, Do you mean for real-time runs or retrospective simulations? For real-time runs, we have recently started using the 1/12 degree SSTs (called rtgssthr_grb_0.083.grib2). I was having strange issues in the northern Baltic when using the 0.5 degree (sst2dvar_grb_0.5.grib2) data. The data is one day behind, which is good enough. The data is available from their ftp site: ftp://ftpprd.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/sst. Hope this helps, Andrea ---- Andrea N. Hahmann Senior Scientist Wind Energy Division Risø DTU Technical University of Denmark Risø National Laboratory for Sustainable Energy Frederikborgvej 399, P.O.
Box 49 4000 Roskilde, Denmark Direct +45 4677 5471 Mobil: +45 2133 0550 ahah at risoe.dtu.dk http://www.risoe.dtu.dk On 9/22/11 1:02 PM, "sam.hawkins at vattenfall.com" wrote: >Dear WRF Users, > >Can anyone give me some advice on the best source of high-resolution SST >data for use in WRF in Northern Europe? Does anyone have any experience >using different high-resolution SST data sets in WRF? > >Thanks, > >Sam Hawkins > > > > >_______________________________________________ >Wrf-users mailing list >Wrf-users at ucar.edu >http://mailman.ucar.edu/mailman/listinfo/wrf-users From albin.ullmann at u-bourgogne.fr Fri Sep 23 02:42:14 2011 From: albin.ullmann at u-bourgogne.fr (Albin Ullmann) Date: Fri, 23 Sep 2011 10:42:14 +0200 Subject: [Wrf-users] post-doc position Message-ID: <20110923104214.93143dpzqp6wrzwg@webmail.u-bourgogne.fr> Dear members, Please find attached a post-doctoral position announcement (using WRF). Have a nice day, Regards -- Albin Ullmann Maître de conférences UMR 5210 CNRS-Centre de Recherches de Climatologie Université de Bourgogne, 6 Bd Gabriel, Bât. Science Gabriel Tel: (+33) 3.80.39.38.22 Fax: (+33) 3.80.39.57.41 -------------- next part -------------- A non-text attachment was scrubbed... Name: postdoc_proposal.pdf Type: video/x-flv Size: 105005 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110923/f05d3fdc/attachment-0001.flv From mm5user at rediffmail.com Wed Sep 28 00:14:31 2011 From: mm5user at rediffmail.com (mm5user) Date: 28 Sep 2011 06:14:31 -0000 Subject: [Wrf-users] WRF-chem compilation Error Message-ID: <20110928061431.31328.qmail@f6mail-145-156.rediffmail.com> Dear All, I am new to WRF-Chem. I could configure and compile WRFV3 and WPS on an IBM AIX machine.
(all WRF version 3.2) After downloading and unpacking WRF Chem (WRFV3-Chem-3.0.1.TAR) in WRFV3, and setting the environment variables (WRF_EM_CORE = 1, WRF_CHEM = 1), I recompiled WRFV3 em_real and obtained the following error: --------------------------------------------------------------------------------------------------------------------------------  ranlib libwrflib.a         timex mpxlf90_r -o wrf.exe  -O3  -w -qspill=20000 -qmaxmem=32767 -qfree=f90     -lmass -lmassv  wrf.o ../main/module_wrf_top.o libwrflib.a /--WRFdir--/WRFV3/external/fftpack/fftpack5/libfftpack.a  /--WRFdir--/WRFV3/external/io_grib1/libio_grib1.a  /--WRFdir--/WRFV3/external/io_grib_share/libio_grib_share.a  /--WRFdir--/WRFV3/external/io_int/libwrfio_int.a  /--WRFdir--/WRFV3/external/esmf_time_f90/libesmf_time.a  /--WRFdir--/WRFV3/external/RSL_LITE/librsl_lite.a  /--WRFdir--/WRFV3/frame/module_internal_header_util.o  /--WRFdir--/WRFV3/frame/pack_utils.o  /--WRFdir--/WRFV3/external/io_netcdf/libwrfio_nf.a -L/--NETCDFdir--/netcdf/lib  -lnetcdf ld: 0711-224 WARNING: Duplicate symbol: .logf ld: 0711-224 WARNING: Duplicate symbol: .log1pf ld: 0711-224 WARNING: Duplicate symbol: .log10f ld: 0711-224 WARNING: Duplicate symbol: .lgammaf ld: 0711-224 WARNING: Duplicate symbol: .hypotf ld: 0711-224 WARNING: Duplicate symbol: .expm1f ld: 0711-224 WARNING: Duplicate symbol: .expf ld: 0711-224 WARNING: Duplicate symbol: .erff ld: 0711-224 WARNING: Duplicate symbol: .erfcf ld: 0711-224 WARNING: Duplicate symbol: .coshf ld: 0711-224 WARNING: Duplicate symbol: .cosf ld: 0711-224 WARNING: Duplicate symbol: .copysignf ld: 0711-224 WARNING: Duplicate symbol: .cbrtf ld: 0711-224 WARNING: Duplicate symbol: .atanhf ld: 0711-224 WARNING: Duplicate symbol: .atanf ld: 0711-224 WARNING: Duplicate symbol: .atan2f ld: 0711-224 WARNING: Duplicate symbol: .asinhf ld: 0711-224 WARNING: Duplicate symbol: .asinf ld: 0711-224 WARNING: Duplicate symbol: .acoshf ld: 0711-224 WARNING: Duplicate symbol: .acosf ld: 0711-345 Use
the -bloadmap or -bnoquiet option to obtain more information. ld: 0711-317 ERROR: Undefined symbol: .start_domain_em ld: 0711-317 ERROR: Undefined symbol: .solve_em ld: 0711-317 ERROR: Undefined symbol: .chem_driver make: 1254-004 The error code from the last command is 8. make: 1254-005 Ignored error code 8 from last command.         ( cd run ; /bin/rm -f wrf.exe ; ln -s ../main/wrf.exe . ) ------------------------------------------------------------------------------------------------------------------------------ Only tc.exe is created in /main/. There is no problem when I compile WRFV3 without chem; all *.exe are created in main. But the problem comes when I compile WRFV3 with chem. How can I solve this issue? Please help me out in this regard. Thanks, mm5user -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110928/3f214519/attachment.html From mmkamal at uwaterloo.ca Sun Sep 25 17:16:57 2011 From: mmkamal at uwaterloo.ca (mmkamal at uwaterloo.ca) Date: Sun, 25 Sep 2011 19:16:57 -0400 Subject: [Wrf-users] run_wrfpost is not processing more than one day Message-ID: <20110925191657.53455e7uyle65ksg@www.nexusmail.uwaterloo.ca> Hi, I have been trying to convert my WRF history file into GRIB format using WPP, but "run_wrfpost" stops after converting one day although my history file contains one year of output. I have modified my "run_wrfpost" based on "run_wrfpost_frames". Could anyone please help me to overcome this problem? Thanks in advance Kamal ====================================================== run_wrfpost ====================================================== #!/bin/ksh # set -x # August 2005: Hui-Ya Chuang, NCEP: This script uses # NCEP's WRF-POSTPROC to post-process WRF native model # output, and uses copygb to horizontally interpolate posted # output from native A-E to a regular projection grid.
# # July 2006: Meral Demirtas, NCAR/DTC: Added new "copygb" # options and revised some parts for clarity. # #-------------------------------------------------------- # This script performs 2 jobs: # # 1. Run WRF-POSTPROC # 2. Run copygb to horizontally interpolate output from # native A-E to a regular projection grid #-------------------------------------------------------- # Set path to your top directory and your run directory # export TOP_DIR=/scratch/mkamal/WRF_new export DOMAINPATH=${TOP_DIR}/DOMAINS #Specify Dyn Core (ARW or NMM in upper case) dyncore="ARW" if [ $dyncore = "NMM" ]; then export tag=NMM elif [ $dyncore = "ARW" ]; then export tag=NCAR else echo "${dyncore} is not supported. Edit script to choose ARW or NMM dyncore." exit fi # Specify forecast start date # fhr is the first forecast hour to be post-processed # lastfhr is the last forecast hour to be post-processed # incrementhr is the increment (in hours) between forecast files export startdate=2004010200 export fhr=00 export lastfhr=21 export incrementhr=03 # Path names for WRF_POSTPROC and WRFV3 export WRF_POSTPROC_HOME=${TOP_DIR}/WPPV3 export POSTEXEC=${WRF_POSTPROC_HOME}/exec export SCRIPTS=${WRF_POSTPROC_HOME}/scripts export WRFPATH=${TOP_DIR}/WRFV3 # cd to working directory cd ${DOMAINPATH}/postprd # Link Ferrier's microphysics table and WRF-POSTPROC control file, ln -fs ${WRFPATH}/run/ETAMPNEW_DATA eta_micro_lookup.dat ln -fs ${DOMAINPATH}/parm/wrf_cntrl.parm . export tmmark=tm00 export MP_SHARED_MEMORY=yes export MP_LABELIO=yes ####################################################### # 1. Run WRF-POSTPROC # # The WRF-POSTPROC is used to read native WRF model # output and put out isobaric state fields and derived fields.
# ####################################################### pwd ls -x export NEWDATE=$startdate YYi=`echo $NEWDATE | cut -c1-4` MMi=`echo $NEWDATE | cut -c5-6` DDi=`echo $NEWDATE | cut -c7-8` HHi=`echo $NEWDATE | cut -c9-10` while [ $fhr -le $lastfhr ] ; do typeset -Z3 fhr NEWDATE=`${POSTEXEC}/ndate.exe +${fhr} $startdate` YY=`echo $NEWDATE | cut -c1-4` MM=`echo $NEWDATE | cut -c5-6` DD=`echo $NEWDATE | cut -c7-8` HH=`echo $NEWDATE | cut -c9-10` echo 'NEWDATE' $NEWDATE echo 'YY' $YY#for domain in d01 d02 d03 for domain in d01 do cat > itag < wrfpost_${domain}.$fhr.out 2>&1 mv WRFPRS$fhr.tm00 WRFPRS_${domain}.${fhr} # #---------------------------------------------------------------------- # End of wrf post job #---------------------------------------------------------------------- ls -l WRFPRS_${domain}.${fhr} err1=$? if test "$err1" -ne 0 then echo 'WRF POST FAILED, EXITTING' exit fi if [ $dyncore = "NMM" ]; then ####################################################################### # 2. Run copygb # # Copygb interpolates WRF-POSTPROC output from its native # grid to a regular projection grid. The package copygb # is used to horizontally interpolate from one domain # to another, it is necessary to run this step for wrf-nmm # (but not for wrf-arw) because wrf-nmm's computational # domain is on rotated Arakawa-E grid # # Copygb can be run in 3 ways as explained below. # Uncomment the preferable one. # #---------------------------------------------------------------------- # # Option 1: # Copygb is run with a pre-defined AWIPS grid # (variable $gridno, see below) Specify the grid to # interpolate the forecast onto. To use standard AWIPS grids # (list in http://wwwt.emc.ncep.noaa.gov/mmb/namgrids/ or # or http://www.nco.ncep.noaa.gov/pmb/docs/on388/tableb.html), # set the number of the grid in variable gridno below. # To use a user defined grid, see explanation above copygb.exe command. 
# # export gridno=212 # #${POSTEXEC}/copygb.exe -xg${gridno} WRFPRS_${domain}.${fhr} wrfprs_${domain}.${fhr} # #---------------------------------------------------------------------- # # Option 2: # Copygb ingests a kgds definition on the command line. #${POSTEXEC}/copygb.exe -xg"255 3 109 91 37748 -77613 8 -71000 10379 9900 0 64 42000 42000" WRFPRS_${domain}.${fhr} wrfprs_${domain}.${fhr} # #---------------------------------------------------------------------- # # Option 3: # Copygb can ingests contents of files too. For example: # copygb_gridnav.txt or copygb_hwrf.txt through variable $nav. # # Option -3.1: # To run for "Lambert Comformal map projection" uncomment the following line # read nav < 'copygb_gridnav.txt' # # Option -3.2: # To run for "lat-lon" uncomment the following line # #read nav < 'copygb_hwrf.txt' # export nav # ${POSTEXEC}/copygb.exe -xg"${nav}" WRFPRS_${domain}.${fhr} wrfprs_${domain}.${fhr} # # (For more info on "copygb" see WRF-NMM User's Guide, Chapter-7.) #---------------------------------------------------------------------- # Check to see whether "copygb" created the requested file. ls -l wrfprs_${domain}.${fhr} err1=$? 
if test "$err1" -ne 0 then echo 'copygb FAILED, EXITING' exit fi #---------------------------------------------------------------------- # End of copygb job #---------------------------------------------------------------------- elif [ $dyncore = "ARW" ]; then ln -s WRFPRS_${domain}.${fhr} wrfprs_${domain}.${fhr} fi done let "fhr=fhr+$incrementhr" NEWDATE=`${POSTEXEC}/ndate.exe +${fhr} $startdate` done date echo "End of Output Job" exit From oholo at iibr.gov.il Tue Sep 27 02:19:50 2011 From: oholo at iibr.gov.il (Oholo) Date: Tue, 27 Sep 2011 11:19:50 +0300 Subject: [Wrf-users] FW: 48th Oholo Conference - "Emerging Remote Sensing Techniques and Associated Modeling for Air Pollution Applications" - Eilat, Israel - November 6-10, 2011 In-Reply-To: References: Message-ID: Dear Colleague, Subject: 48th Oholo Conference - "Emerging Remote Sensing Techniques and Associated Modeling for Air Pollution Applications" - Eilat, Israel - November 6-10, 2011 We would like to take this opportunity to remind you that the deadline for the early registration rate is nearing - September 30, 2011. In order to register, please visit our website - www.oholoconference.com In the event of any difficulties please do not hesitate to contact me. Thank you for your kind attention. Sincerely yours, Ariella Raz Ariella Raz Secretariat, 48th Oholo Conference Tel: 972-8-9381656 Fax: 972-8-9401404 Email: oholo at iibr.gov.il ********************************************************************************************** IMPORTANT: The contents of this email and any attachments are confidential. They are intended for the named recipient(s) only. If you have received this email in error, please notify the system manager or the sender immediately and do not disclose the contents to anyone or make copies thereof.
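Looping back to the run_wrfpost script above: the forecast-hour loop only runs while `fhr <= lastfhr`, and the script sets `lastfhr=21` — so with a 3-hourly increment it posts exactly forecast hours 000 through 021 (the first day) and exits, which matches the "stops after one day" symptom; raising `lastfhr` to cover the whole history file should extend the output. A minimal Python sketch of the loop logic (this mirrors the ksh arithmetic, the `typeset -Z3` zero-padding, and what `ndate.exe +fhr startdate` computes — not the WPP tools themselves):

```python
from datetime import datetime, timedelta

def post_steps(startdate, fhr, lastfhr, incrementhr):
    """Mirror the run_wrfpost loop: for each forecast hour from fhr to
    lastfhr in steps of incrementhr, return the three-digit zero-padded
    hour string (ksh `typeset -Z3 fhr`) and the valid time
    startdate + fhr hours (the `ndate.exe` computation)."""
    t0 = datetime.strptime(startdate, "%Y%m%d%H")
    steps = []
    while fhr <= lastfhr:
        steps.append((f"{fhr:03d}", t0 + timedelta(hours=fhr)))
        fhr += incrementhr
    return steps
```

With the script's own settings, `post_steps("2004010200", 0, 21, 3)` yields eight steps ending at hour 021 — i.e. only the first day is ever requested.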
*** eSafe scanned this email for viruses, vandals, and malicious content. *** ********************************************************************************************** -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110927/79923742/attachment-0001.html From preeti at csa.iisc.ernet.in Wed Sep 28 02:45:27 2011 From: preeti at csa.iisc.ernet.in (Preeti) Date: Wed, 28 Sep 2011 14:15:27 +0530 Subject: [Wrf-users] Absolute vorticity Message-ID: Hello I want to get "Absolute vorticity" as one of the variables in the WRF output files. Is this possible? If yes, can someone please tell me what are the steps I need to follow? Thanks in advance Preeti -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110928/3035ee0f/attachment.html From anottrot at ucsd.edu Wed Sep 28 11:18:24 2011 From: anottrot at ucsd.edu (Anders A Nottrott) Date: Wed, 28 Sep 2011 10:18:24 -0700 Subject: [Wrf-users] Unusual Velocity Spectra in Ideal LES w/ Wind Message-ID: <003101cc7e02$a1a87d30$e4f97790$@ucsd.edu> Hello All, I'm running some simple ideal LES simulations of the neutral ABL forced with 10 m/s geostrophic wind. In validating vertical profiles the mean wind, velocity variance and momentum flux profiles agree well with other LES and experimental data. However, the velocity spectra begin to "roll-off" at unusually low wavenumbers, i.e. there is too much energy content at the highest wavenumbers (see the plots in the attached PDF). A comparison with Mirocha et al. (2010) indicates that this is not typical of WRF-LES. In an attempt to diagnose the problem I have tried increasing vertical resolution, 3 different SGS models (3D Smagorinsky, NBA-Stress, NBA-TKE) and simulations with and without a capping inversion, but the results are similar each time. 
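On the absolute vorticity question above: WRF does not write absolute vorticity by default; it is normally diagnosed in post-processing (for example via the post-processor's control file or an NCL/ARWpost-style diagnostic). The quantity itself is just relative vorticity plus the Coriolis parameter, eta = dv/dx - du/dy + f. A hand-rolled centered-difference sketch of that diagnostic — it ignores WRF's grid staggering and map-scale factors, so it is illustrative only:

```python
import math

OMEGA = 7.292115e-5  # Earth's rotation rate, s^-1

def absolute_vorticity(u, v, dx, dy, lat_deg):
    """Absolute vorticity at interior points of a small u/v grid:
    eta = dv/dx - du/dy + f, with centered differences and
    f = 2 * OMEGA * sin(lat). u[j][i], v[j][i] in m/s; dx, dy in m."""
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))
    nyy, nxx = len(u), len(u[0])
    eta = [[None] * nxx for _ in range(nyy)]
    for j in range(1, nyy - 1):
        for i in range(1, nxx - 1):
            dvdx = (v[j][i + 1] - v[j][i - 1]) / (2.0 * dx)
            dudy = (u[j + 1][i] - u[j - 1][i]) / (2.0 * dy)
            eta[j][i] = dvdx - dudy + f
    return eta
```

A quick sanity check: solid-body rotation u = -c*y, v = c*x should give relative vorticity 2c everywhere in the interior.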
This leads me to believe that the problem is related to one or more poorly selected options for numerics and as a result unphysical high frequency waves are generated in the domain. Note that I am using the 'em_les' test case as the basis for my runs so I have 5th order RK3 in the horizontal and 3rd order RK3 in the vertical. I have set the acoustic timestep to be calculated automatically (i.e. namelist variable 'time_step_sound=0'). It is worth noting that in purely convective simulations (no background wind) the velocity and temperature spectra look fine, so this is more evidence pointing to a problem with the advection scheme/numerics. Has anyone else experienced this problem? Any comments or suggestions will be very helpful and most appreciated. Regards, Anders Anders Nottrott PhD Student, Mechanical and Environmental Engineering University of California, San Diego Department of Mechanical and Aerospace Engineering -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110928/4fb41de8/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: WRFLESspec.pdf Type: application/pdf Size: 31909 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110928/4fb41de8/attachment-0001.pdf From jaareval at gmail.com Wed Sep 28 12:38:53 2011 From: jaareval at gmail.com (Jorge Alejandro Arevalo Borquez) Date: Wed, 28 Sep 2011 14:38:53 -0400 Subject: [Wrf-users] WRF Tutorial in Chile - January 2012 Message-ID: Dear WRF Users, The Laboratory for Atmospheric Modelling of the Meteorology Department of the Universidad de Valparaíso (Chile) would be really grateful if you could distribute the announcement of our third summer course "Introducción a la Modelación Atmosférica - Uso de WRF", which focuses on basic use and description of WRF-ARW for real cases.
This course will be offered in Spanish at the Universidad de Valparaíso, Valparaíso, Chile. More information is available in the attached document and at the course website: http://www.meteo.uv.cl/ima Sincerely > Jorge Arévalo Bórquez > Coordinador LMA-UV > Departamento de Meteorología > Universidad de Valparaíso 56-32-2508700 > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110928/f06deaa2/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: Curso_WRF_LMA-UV_2012.pdf Type: application/pdf Size: 135586 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20110928/f06deaa2/attachment-0001.pdf From dennisl at u.washington.edu Thu Sep 29 13:55:55 2011 From: dennisl at u.washington.edu (Dennis Lettenmaier) Date: Thu, 29 Sep 2011 12:55:55 -0700 Subject: [Wrf-users] Postdoc - Regional climate and land surface modeling Message-ID: <4E84CD4B.8060705@u.washington.edu> The University of Washington Department of Civil and Environmental Engineering seeks a postdoctoral researcher to participate in development of the land portion of the DOE-funded Regional Arctic System Model. The project involves coupling the Variable Infiltration Capacity (VIC) model with the WRF regional climate model (and ocean and sea ice models) through the CCSM/CESM flux coupler. Experience in land surface model development and regional climate modeling, including diagnosis of coupled model simulations, is desirable, as is experience in handling large data sets, e.g., remote sensing, gridded in situ observations, and multivariate spatial model output. Proficiency in the C programming language and Unix experience is essential, as are strong written and oral communications skills in English. A Ph.D. in hydrology or a closely related field is required. Salary DOE.
Email vitae and a short statement of relevant experience to Professor Dennis P. Lettenmaier, dennisl at u.washington.edu . AA/EOE. From moudipascal at yahoo.fr Sun Oct 9 02:00:23 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Sun, 9 Oct 2011 09:00:23 +0100 (BST) Subject: [Wrf-users] Explanation of Background Errors Statistics plots Message-ID: <1318147223.45954.YahooMailNeo@web29002.mail.ird.yahoo.com> Dear All, I have generated BES for my own domain using the NMC method. When I use gen_be_wrapper_plot.ksh to generate plots, I am unable to explain and understand what these plots mean. I would like to know: what do the control variables, eigenvectors and lengthscales physically mean? Please find attached the figures I produced, and please provide me some explanations. Pascal MOUDI IGRI PhD Student Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Department of Physics Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111009/74da5421/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: gen_be_plots.tar.gz Type: application/x-gzip Size: 322378 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111009/74da5421/attachment-0001.gz From ebeigi3 at tigers.lsu.edu Sun Oct 9 16:06:25 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Sun, 9 Oct 2011 18:06:25 -0400 Subject: [Wrf-users] cam2wrf Message-ID: Dear Friends, Does anyone have a code that converts CCSM output to the intermediate format of WPS (other than the cam2wrf provided by NCAR)? Thanks in advance. Best Regards, -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F.
Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111009/a5971366/attachment.html From Don.Morton at alaska.edu Mon Oct 10 18:13:05 2011 From: Don.Morton at alaska.edu (Don Morton) Date: Mon, 10 Oct 2011 16:13:05 -0800 Subject: [Wrf-users] 2012 Alaska Weather Symposium Message-ID: The 2012 Alaska Weather Symposium (AWS '12) 13-14 March 2012 University of Alaska Fairbanks Preliminary Announcement and Call for Abstracts Symposium Web Page: http://weather.arsc.edu/Events/AWS12/ The following sponsors (listed alphabetically) - Alaska Region, NOAA National Weather Service - Arctic Region Supercomputing Center (UAF) - Department of Atmospheric Sciences (College of Natural Science and Mathematics, UAF) - Geophysical Institute (UAF) - International Arctic Research Center (UAF) invite you to attend the 2012 Alaska Weather Symposium. The symposium provides a forum for the exchange of operational and research information related to weather in the Alaska environment. Participation from academic, research, government, military, and private sectors is encouraged. Anticipated areas of focus are - Air quality - Data assimilation - High resolution modeling in complex terrain - Observations / monitoring challenges However, as usual, all abstracts relating to weather in Alaska are welcome --Schedule/Venue-- This will be a two-day symposium held at the University of Alaska Fairbanks campus on Tuesday and Wednesday, 13-14 March 2012. Snacks will be provided and evening meals will be on-your-own, with an organized evening out (pay on your own) at a local establishment. --Abstract Submission-- The deadline for one-paragraph abstracts for a 15-20 minute presentation is Tuesday, 24 January 2012. See the symposium web page for abstract submission procedures. --Registration-- Registration is required by Tuesday, 06 March 2012.
See the symposium web page for registration instructions. There is no registration fee. Symposium Web Page: http://weather.arsc.edu/Events/AWS12/ -- Voice: 907 450 8679 Arctic Region Supercomputing Center http://weather.arsc.edu/ http://people.arsc.edu/~morton/ -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111010/d97b4e2b/attachment.html From aijun at email.unc.edu Sat Oct 15 22:24:13 2011 From: aijun at email.unc.edu (Xiu, Aijun) Date: Sun, 16 Oct 2011 04:24:13 +0000 Subject: [Wrf-users] cam2wrf code Message-ID: Hi, Does anyone have an updated CAM2WRF code that can deal with CCSM4 output files? If so, can you share the code with me? Thanks! Aijun -- Thanks! Aijun --------------------------------------------------------------------- Aijun Xiu, PhD, Research Associate Professor Phone: 919-966-2064 Institute for the Environment Fax: 919-843-3113 University of North Carolina at Chapel Hill Email: aijun at email.unc.edu Bank of America Plaza, CB# 6116 137 E. Franklin St., Room 657 Chapel Hill, NC 27599-6116 --------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111016/c94700c4/attachment.html From aruny at iitk.ac.in Tue Oct 11 04:18:14 2011 From: aruny at iitk.ac.in (Arun Yadav) Date: Tue, 11 Oct 2011 15:48:14 +0530 Subject: [Wrf-users] Edit WRF-Chem input file Message-ID: <4E9417E6.5030401@iitk.ac.in> Hi, I want to edit the input files prepared by prep_chem_sources for WRF-Chem, by either changing the emissions file itself or changing the code of prep_chem_sources to do this for me. I need it to run some hypothetical scenarios in WRF-Chem. Is it possible to do so?
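On the cam2wrf/CCSM questions above: any home-grown converter has to write the WPS intermediate format, which is a big-endian Fortran unformatted sequential file — every record (header or 2D data slab) is wrapped in 4-byte record-length markers. The exact field layout inside each record should be taken from the WPS documentation and ungrib's read/write routines; the sketch below shows only the record framing, which is the part converters most often get wrong:

```python
import struct

def fortran_record(payload: bytes) -> bytes:
    """Frame a payload as one big-endian Fortran unformatted sequential
    record: 4-byte byte count, payload, 4-byte byte count. A WPS
    intermediate file is a sequence of such records."""
    head = struct.pack(">i", len(payload))
    return head + payload + head

# Example: a record holding one big-endian float32 slab of 2x2 values
# (the values are made up; a real slab would be nx*ny floats).
slab = struct.pack(">4f", 280.0, 281.5, 279.25, 282.0)
rec = fortran_record(slab)
```

Reading the file back is the mirror image: read 4 bytes, read that many payload bytes, then read and check the trailing 4-byte count.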
Regards, Arun From shankhabanerjee at gmail.com Sun Oct 16 08:41:13 2011 From: shankhabanerjee at gmail.com (shankha) Date: Sun, 16 Oct 2011 10:41:13 -0400 Subject: [Wrf-users] wrf fails to run : build with OpenMPI Message-ID: Hi, Machine Information : Linux bertram 2.6.32-33-generic #72-Ubuntu SMP Fri Jul 29 21:07:13 UTC 2011 x86_64 GNU/Linux Compiler : 4.4.3. MPI : OpenMPI 1.4.3 I am unable to run wrf.exe with mpirun. I am able to run wrf.exe stand alone bertram : ~ ] mpirun -np 2 ./wrf.exe -------------------------------------------------------------------------- mpirun has exited due to process rank 0 with PID 7061 on node bertram exiting without calling "finalize". This may have caused other processes in the application to be terminated by signals sent by mpirun (as reported here). -------------------------------------------------------------------------- I have checked my MPI installation and it is good. ldd didn't report any missing libraries or errors. Thanks for your help. -- Thanks Shankha -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111016/37a8191e/attachment-0001.html -------------- next part -------------- None of WRF_EM_CORE, WRF_NMM_CORE, specified in shell environment.... copying Registry/Registry.EM to Registry/Registry Compiling: WRF_EM_CORE . 
setting parallel make -j 2
make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" ext
make[1]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3'
--------------------------------------
( cd frame ; make -i -r externals )
make[2]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame'
( cd /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/ioapi_share ; \
make -i -r NATIVE_RWORDSIZE="4" RWORDSIZE="4" AR="ar" ARFLAGS="ru" )
make[3]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/ioapi_share'
( /bin/rm -f ../../inc/wrf_io_flags.h foo_io_flags.h; \
/bin/cp wrf_io_flags.h foo_io_flags.h; \
if [ 4 -ne 4 ] ; then \
/bin/rm -f foo_io_flags.h; \
sed -e 's/104/105/' wrf_io_flags.h > foo_io_flags.h ;\
fi ; \
/bin/mv foo_io_flags.h ../../inc/wrf_io_flags.h )
/bin/rm -f ../../inc/wrf_status_codes.h
/bin/cp wrf_status_codes.h ../../inc
make[3]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/ioapi_share'
( cd /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib_share ; \
make CC="gcc" CFLAGS="-w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25" RM="rm -f" RANLIB="ranlib" CPP="/lib/cpp -C -P" \
FC="gfortran -I. -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " TRADFLAG="-traditional" AR="ar" ARFLAGS="ru" archive)
make[3]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib_share'
make[4]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib_share'
rm -f io_grib_share.o
/lib/cpp -C -P -traditional -I. io_grib_share.F > io_grib_share.f90
gfortran -I. -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I. -c io_grib_share.f90
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c get_region_center.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gridnav.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c open_file.c
ar ru ./libio_grib_share.a io_grib_share.o get_region_center.o gridnav.o open_file.o
ar: creating ./libio_grib_share.a
ranlib ./libio_grib_share.a
make[4]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib_share'
make[3]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib_share'
( cd /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1 ; \
make CC="gcc" CFLAGS="-w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25" RM="rm -f" RANLIB="ranlib" CPP="/lib/cpp -C -P" \
FC="gfortran -I. -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " TRADFLAG="-traditional" AR="ar" ARFLAGS="ru" archive)
make[3]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1'
Doing make archive on library subdirectory MEL_grib1
make[4]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/MEL_grib1'
make[5]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/MEL_grib1'
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c FTP_getfile.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c apply_bitmap.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c display_gribhdr.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gbyte.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c grib_dec.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c grib_enc.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c grib_seek.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gribgetbds.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gribgetbms.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gribgetgds.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gribgetpds.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gribhdr2file.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gribputbds.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gribputgds.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gribputpds.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c hdr_print.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c init_dec_struct.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c init_enc_struct.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c init_gribhdr.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c init_struct.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c ld_dec_lookup.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c ld_enc_input.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c ld_enc_lookup.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c ld_grib_origctrs.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c make_default_grbfn.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c make_grib_log.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c map_lvl.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c map_parm.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c pack_spatial.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c prt_inp_struct.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c upd_child_errmsg.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c prt_badmsg.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c swap.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c grib_uthin.c
gcc -I. -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c set_bytes.c
ar ru ../libio_grib1.a FTP_getfile.o apply_bitmap.o display_gribhdr.o gbyte.o grib_dec.o grib_enc.o grib_seek.o gribgetbds.o gribgetbms.o gribgetgds.o gribgetpds.o gribhdr2file.o gribputbds.o gribputgds.o gribputpds.o hdr_print.o init_dec_struct.o init_enc_struct.o init_gribhdr.o init_struct.o ld_dec_lookup.o ld_enc_input.o ld_enc_lookup.o ld_grib_origctrs.o make_default_grbfn.o make_grib_log.o map_lvl.o map_parm.o pack_spatial.o prt_inp_struct.o upd_child_errmsg.o prt_badmsg.o swap.o grib_uthin.o set_bytes.o
ar: creating ../libio_grib1.a
ranlib ../libio_grib1.a
make[5]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/MEL_grib1'
make[4]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/MEL_grib1'
Doing make archive on library subdirectory grib1_util
make[4]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/grib1_util'
make[5]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/grib1_util'
gcc -I. -I../MEL_grib1 -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c alloc_2d.c
gcc -I. -I../MEL_grib1 -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c read_grib.c
gcc -I. -I../MEL_grib1 -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c write_grib.c
ar ru ../libio_grib1.a alloc_2d.o read_grib.o write_grib.o
ranlib ../libio_grib1.a
make[5]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/grib1_util'
make[4]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/grib1_util'
Doing make archive on library subdirectory WGRIB
make[4]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/WGRIB'
gcc -c -O wgrib_main.c
gcc -c -O seekgrib.c
gcc -c -O ibm2flt.c
gcc -c -O readgrib.c
gcc -c -O intpower.c
gcc -c -O cnames.c
gcc -c -O BDSunpk.c
gcc -c -O flt2ieee.c
gcc -c -O wrtieee.c
gcc -c -O levels.c
gcc -c -O PDStimes.c
gcc -c -O missing.c
gcc -c -O nceptable_reanal.c
gcc -c -O nceptable_opn.c
gcc -c -O ensemble.c
gcc -c -O ombtable.c
gcc -c -O ec_ext.c
gcc -c -O gribtable.c
gcc -c -O gds_grid.c
gcc -c -O PDS_date.c
gcc -c -O ectable_128.c
gcc -c -O ectable_129.c
gcc -c -O ectable_130.c
gcc -c -O ectable_131.c
gcc -c -O ectable_140.c
gcc -c -O ectable_150.c
gcc -c -O ectable_151.c
gcc -c -O ectable_160.c
gcc -c -O ectable_170.c
gcc -c -O ectable_180.c
gcc -c -O nceptab_129.c
gcc -c -O dwdtable_002.c
gcc -c -O dwdtable_201.c
gcc -c -O dwdtable_202.c
gcc -c -O dwdtable_203.c
gcc -c -O cptectable_254.c
gcc -c -O nceptab_130.c
gcc -c -O nceptab_131.c
gcc -o wgrib.exe wgrib_main.o seekgrib.o ibm2flt.o readgrib.o intpower.o cnames.o BDSunpk.o flt2ieee.o wrtieee.o levels.o PDStimes.o missing.o nceptable_reanal.o nceptable_opn.o ensemble.o ombtable.o ec_ext.o gribtable.o gds_grid.o PDS_date.o ectable_128.o ectable_129.o ectable_130.o ectable_131.o ectable_140.o ectable_150.o ectable_151.o ectable_160.o ectable_170.o ectable_180.o nceptab_129.o dwdtable_002.o dwdtable_201.o dwdtable_202.o dwdtable_203.o cptectable_254.o nceptab_130.o nceptab_131.o #-lm
( cd .. ; \rm -f wgrib.exe ; \ln -sf WGRIB/wgrib.exe wgrib.exe ; cd WGRIB )
make[4]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/WGRIB'
make[4]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1'
gcc -I. -I./MEL_grib1 -Igrib1_util -I../io_grib_share -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c grib1_routines.c
gcc -I. -I./MEL_grib1 -Igrib1_util -I../io_grib_share -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c gribmap.c
rm -f io_grib1.o
/lib/cpp -C -P -traditional -I. -I./MEL_grib1 -Igrib1_util -I../io_grib_share io_grib1.F > io_grib1.f90
gfortran -I. -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I. -I./MEL_grib1 -Igrib1_util -I../io_grib_share -c io_grib1.f90
gcc -I. -I./MEL_grib1 -Igrib1_util -I../io_grib_share -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c trim.c
ar ru ./libio_grib1.a grib1_routines.o gribmap.o io_grib1.o trim.o
ranlib ./libio_grib1.a
make[4]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1'
make[3]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1'
( cd /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int ; \
make CC="mpicc -DMPI2_SUPPORT -DFSEEKO64_OK " RM="rm -f" RANLIB="ranlib" CPP="/lib/cpp -C -P" \
FC="gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " FGREP="fgrep -iq" \
TRADFLAG="-traditional" AR="ar" ARFLAGS="ru" ARCHFLAGS="-DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0" all )
make[3]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int'
cp ../../frame/module_internal_header_util.F module_internal_header_util.b
cp ../../inc/intio_tags.h intio_tags.h
/bin/rm -f module_internal_header_util.f
/lib/cpp -C -P -traditional -I../ioapi_share module_internal_header_util.b > module_internal_header_util.f
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I. -o module_internal_header_util.o -c module_internal_header_util.f
/bin/rm -f module_internal_header_util.b
/lib/cpp -C -P -traditional -I../ioapi_share io_int.F90 | m4 -Uinclude -Uindex -Ulen - > io_int.f
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I. -I../ioapi_share -o io_int.o -c io_int.f
/bin/rm -f intio_tags.h
/bin/rm -f libwrfio_int.a
ar cr libwrfio_int.a io_int.o
ranlib libwrfio_int.a
Diffwrf io_int will be built later on in this compile. No need to rerun compile.
Diffwrf io_int will be built later on in this compile. No need to rerun compile.
Diffwrf io_int will be built later on in this compile. No need to rerun compile.
Diffwrf io_int will be built later on in this compile. No need to rerun compile.
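[Editor's note] The diffwrf build step that follows in the log strips the external declaration of iargc before compiling with gfortran, since gfortran supplies iargc as an intrinsic. A minimal sketch of that sed filter, run on a hypothetical stand-in file (the real input is the preprocessed diffwrf.F90, not reproduced here):

```shell
# Stand-in source: one iargc declaration plus one ordinary line.
src=$(mktemp)
printf 'integer , external :: iargc\nprint *, "ok"\n' > "$src"
# Same filter as in the log: delete any external iargc declaration.
sed '/integer *, *external.*iargc/d' "$src"   # prints only: print *, "ok"
rm -f "$src"
```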
if [ -f ../../frame/pack_utils.o ] ; then \
mv diffwrf.F diffwrf.F90 ; \
x=`echo "gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " | awk '{print $1}'` ; export x ; \
if [ $x = "gfortran" ] ; then \
echo removing external declaration of iargc for gfortran ; \
/lib/cpp -C -P -traditional -I../ioapi_share diffwrf.F90 | sed '/integer *, *external.*iargc/d' > diffwrf.f ; \
else \
/lib/cpp -C -P -traditional -I../ioapi_share diffwrf.F90 > diffwrf.f ; \
fi ; \
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -c -I../ioapi_share diffwrf.f ; \
mv diffwrf.F90 diffwrf.F ; \
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o diffwrf diffwrf.o io_int.o \
../../frame/pack_utils.o ../../frame/module_internal_header_util.o \
../../frame/module_driver_constants.o \
../../frame/module_machine.o ../../frame/wrf_debug.o ../../frame/module_wrf_error.o \
; fi
make[3]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int'
( cd /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 ; \
make FC="gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " RANLIB="ranlib" \
CPP="/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional" AR="ar" ARFLAGS="ru" )
make[3]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90'
/bin/rm -f ESMF_Base.o
sed -e "/\!.*'/s/'//g" ESMF_Base.F90 > ESMF_Base.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_Base.b > ESMF_Base.f
/bin/rm -f ESMF_Base.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_Base.o -c ESMF_Base.f
/bin/rm -f ESMF_BaseTime.o
sed -e "/\!.*'/s/'//g" ESMF_BaseTime.F90 > ESMF_BaseTime.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_BaseTime.b > ESMF_BaseTime.f
/bin/rm -f ESMF_BaseTime.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_BaseTime.o -c ESMF_BaseTime.f
/bin/rm -f ESMF_Calendar.o
sed -e "/\!.*'/s/'//g" ESMF_Calendar.F90 > ESMF_Calendar.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_Calendar.b > ESMF_Calendar.f
/bin/rm -f ESMF_Calendar.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_Calendar.o -c ESMF_Calendar.f
/bin/rm -f ESMF_Fraction.o
sed -e "/\!.*'/s/'//g" ESMF_Fraction.F90 > ESMF_Fraction.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_Fraction.b > ESMF_Fraction.f
/bin/rm -f ESMF_Fraction.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_Fraction.o -c ESMF_Fraction.f
/bin/rm -f ESMF_TimeInterval.o
sed -e "/\!.*'/s/'//g" ESMF_TimeInterval.F90 > ESMF_TimeInterval.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_TimeInterval.b > ESMF_TimeInterval.f
/bin/rm -f ESMF_TimeInterval.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_TimeInterval.o -c ESMF_TimeInterval.f
/bin/rm -f ESMF_Stubs.o
sed -e "/\!.*'/s/'//g" ESMF_Stubs.F90 > ESMF_Stubs.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_Stubs.b > ESMF_Stubs.f
/bin/rm -f ESMF_Stubs.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_Stubs.o -c ESMF_Stubs.f
/bin/rm -f ESMF_Time.o
sed -e "/\!.*'/s/'//g" ESMF_Time.F90 > ESMF_Time.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_Time.b > ESMF_Time.f
/bin/rm -f ESMF_Time.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_Time.o -c ESMF_Time.f
/bin/rm -f ESMF_Alarm.o
sed -e "/\!.*'/s/'//g" ESMF_Alarm.F90 > ESMF_Alarm.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_Alarm.b > ESMF_Alarm.f
/bin/rm -f ESMF_Alarm.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_Alarm.o -c ESMF_Alarm.f
/bin/rm -f ESMF_Clock.o
sed -e "/\!.*'/s/'//g" ESMF_Clock.F90 > ESMF_Clock.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_Clock.b > ESMF_Clock.f
/bin/rm -f ESMF_Clock.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_Clock.o -c ESMF_Clock.f
/bin/rm -f Meat.o
sed -e "/\!.*'/s/'//g" Meat.F90 > Meat.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. Meat.b > Meat.f
/bin/rm -f Meat.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o Meat.o -c Meat.f
/bin/rm -f ESMF_AlarmClock.o
sed -e "/\!.*'/s/'//g" ESMF_AlarmClock.F90 > ESMF_AlarmClock.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_AlarmClock.b > ESMF_AlarmClock.f
/bin/rm -f ESMF_AlarmClock.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_AlarmClock.o -c ESMF_AlarmClock.f
/bin/rm -f ESMF_Mod.o
sed -e "/\!.*'/s/'//g" ESMF_Mod.F90 > ESMF_Mod.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. ESMF_Mod.b > ESMF_Mod.f
/bin/rm -f ESMF_Mod.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o ESMF_Mod.o -c ESMF_Mod.f
/bin/rm -f module_symbols_util.o
sed -e "/\!.*'/s/'//g" module_symbols_util.F90 > module_symbols_util.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. module_symbols_util.b > module_symbols_util.f
/bin/rm -f module_symbols_util.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o module_symbols_util.o -c module_symbols_util.f
/bin/rm -f module_utility.o
sed -e "/\!.*'/s/'//g" module_utility.F90 > module_utility.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional -C -P -I. module_utility.b > module_utility.f
/bin/rm -f module_utility.b
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o module_utility.o -c module_utility.f
/bin/rm -f libesmf_time.a
if [ "ar" != "lib.exe" ] ; then \
ar ru libesmf_time.a ESMF_Alarm.o ESMF_BaseTime.o ESMF_Clock.o ESMF_Time.o Meat.o ESMF_Base.o ESMF_Calendar.o ESMF_Fraction.o ESMF_TimeInterval.o ESMF_Stubs.o ESMF_Mod.o module_symbols_util.o module_utility.o ESMF_AlarmClock.o ; \
else \
ar /out:libesmf_time.a ESMF_Alarm.o ESMF_BaseTime.o ESMF_Clock.o ESMF_Time.o Meat.o ESMF_Base.o ESMF_Calendar.o ESMF_Fraction.o ESMF_TimeInterval.o ESMF_Stubs.o ESMF_Mod.o module_symbols_util.o module_utility.o ESMF_AlarmClock.o ; \
fi
ar: creating libesmf_time.a
ranlib libesmf_time.a
make[3]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90'
( cd /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/fftpack/fftpack5 ; \
make FC="gfortran" FFLAGS=" -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " RANLIB="ranlib" AR="ar" ARFLAGS="ru" )
make[3]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/fftpack/fftpack5'
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1f2kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1f2kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1f3kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1f3kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1f4kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1f4kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1f5kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1f5kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1fgkb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1fgkf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1fm1b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 c1fm1f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cfft1b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cfft1f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cfft1i.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cfft2b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cfft2f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cfft2i.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cfftmb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cfftmf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cfftmi.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmf2kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmf2kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmf3kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmf3kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmf4kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmf4kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmf5kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmf5kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmfgkb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmfgkf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmfm1b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cmfm1f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cosq1b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cosq1f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cosq1i.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cosqb1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cosqf1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cosqmb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cosqmf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cosqmi.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cost1b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cost1f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 cost1i.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 costb1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 costf1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 costmb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 costmf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 costmi.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 factor.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mcfti1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mcsqb1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mcsqf1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mcstb1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mcstf1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradb2.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradb3.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradb4.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradb5.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradbg.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradf2.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradf3.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradf4.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradf5.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mradfg.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mrftb1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mrftf1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 mrfti1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 msntb1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 msntf1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1f2kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1f2kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1f3kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1f3kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1f4kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1f4kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1f5kb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1f5kf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1fgkb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 r1fgkf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfft1b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfft1f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfft1i.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfft2b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfft2f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfft2i.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfftb1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfftf1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rffti1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfftmb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfftmf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 rfftmi.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sinq1b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sinq1f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sinq1i.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sinqmb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sinqmf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sinqmi.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sint1b.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sint1f.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sint1i.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sintb1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sintf1.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sintmb.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sintmf.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 sintmi.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 tables.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 xercon.F
gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 xerfft.F
ar ru libfftpack.a c1f2kb.o c1f2kf.o c1f3kb.o c1f3kf.o c1f4kb.o c1f4kf.o c1f5kb.o c1f5kf.o c1fgkb.o c1fgkf.o c1fm1b.o c1fm1f.o cfft1b.o cfft1f.o cfft1i.o cfft2b.o cfft2f.o cfft2i.o cfftmb.o cfftmf.o cfftmi.o cmf2kb.o cmf2kf.o cmf3kb.o cmf3kf.o cmf4kb.o cmf4kf.o cmf5kb.o cmf5kf.o cmfgkb.o cmfgkf.o cmfm1b.o cmfm1f.o cosq1b.o cosq1f.o cosq1i.o cosqb1.o cosqf1.o cosqmb.o cosqmf.o cosqmi.o cost1b.o cost1f.o cost1i.o costb1.o costf1.o costmb.o costmf.o costmi.o factor.o mcfti1.o mcsqb1.o mcsqf1.o mcstb1.o mcstf1.o mradb2.o mradb3.o mradb4.o mradb5.o mradbg.o mradf2.o mradf3.o mradf4.o mradf5.o mradfg.o mrftb1.o mrftf1.o mrfti1.o msntb1.o msntf1.o r1f2kb.o r1f2kf.o r1f3kb.o r1f3kf.o r1f4kb.o r1f4kf.o r1f5kb.o r1f5kf.o r1fgkb.o r1fgkf.o rfft1b.o rfft1f.o rfft1i.o rfft2b.o rfft2f.o rfft2i.o rfftb1.o rfftf1.o rffti1.o rfftmb.o rfftmf.o rfftmi.o sinq1b.o sinq1f.o sinq1i.o sinqmb.o sinqmf.o sinqmi.o sint1b.o sint1f.o sint1i.o sintb1.o sintf1.o sintmb.o sintmf.o sintmi.o tables.o xercon.o xerfft.o
ar: creating libfftpack.a
ranlib libfftpack.a
make[3]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/fftpack/fftpack5'
( cd /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf ; \
make NETCDFPATH="/home/shankha/work/packages/netcdf/install" RANLIB="ranlib" CPP="/lib/cpp -C -P" \
CC="gcc" CFLAGS="-w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25" \
FC="gfortran -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " TRADFLAG="-traditional" AR="ar" ARFLAGS="ru" )
make[3]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf'
grep nf_format_64bit /home/shankha/work/packages/netcdf/install/include/netcdf.inc ;\
a=$? ; export a ; \
if [ $a -a "$WRFIO_NCD_LARGE_FILE_SUPPORT" = "1" ] ; then \
/lib/cpp -C -P -C -P -traditional -DWRFIO_NCD_LARGE_FILE_SUPPORT -I../ioapi_share wrf_io.F90 | m4 -Uinclude -Uindex -Ulen - > wrf_io.f ; \
else \
/lib/cpp -C -P -C -P -traditional -I../ioapi_share wrf_io.F90 | m4 -Uinclude -Uindex -Ulen - > wrf_io.f ; \
fi
integer nf_format_64bit
parameter (nf_format_64bit = 2)
gfortran -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o wrf_io.o -I/home/shankha/work/packages/netcdf/install/include -I../ioapi_share -c wrf_io.f
/lib/cpp -C -P -C -P -traditional -I../ioapi_share field_routines.F90 > field_routines.f
gfortran -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o field_routines.o -I/home/shankha/work/packages/netcdf/install/include -I../ioapi_share -c field_routines.f
/lib/cpp -C -P -C -P -traditional -I../ioapi_share module_wrfsi_static.F90 > module_wrfsi_static.f
gfortran -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o module_wrfsi_static.o
-I/home/shankha/work/packages/netcdf/install/include -I../ioapi_share -c module_wrfsi_static.f /bin/rm -f libwrfio_nf.a if [ "ar" != "lib.exe" ] ; then \ ar cr libwrfio_nf.a wrf_io.o field_routines.o module_wrfsi_static.o ; \ else \ ar /out:libwrfio_nf.a wrf_io.o field_routines.o module_wrfsi_static.o ; \ fi ranlib libwrfio_nf.a make[3]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf' ( cd /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/RSL_LITE ; make CC="mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25" \ FC="mpif77 -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -fconvert=big-endian -frecord-marker=4" \ CPP="/lib/cpp -C -P -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional" AR="ar" ARFLAGS="ru" ;\ ranlib /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/RSL_LITE/librsl_lite.a ) make[3]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/RSL_LITE' mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c c_code.c mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c buf_for_proc.c mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c rsl_malloc.c mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c rsl_bcast.c mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c task_for_point.c mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -w -O3 -c 
-DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c period.c mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c swap.c mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -c cycle.c mpif77 -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -fconvert=big-endian -frecord-marker=4 -o f_pack.o -c f_pack.F90 /lib/cpp -C -P -I. -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -traditional f_xpose.F90 > f_xpose.f mpif77 -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -fconvert=big-endian -frecord-marker=4 -o f_xpose.o -c f_xpose.f /bin/rm -f librsl_lite.a ar cr librsl_lite.a c_code.o buf_for_proc.o rsl_malloc.o rsl_bcast.o task_for_point.o period.o swap.o cycle.o f_pack.o f_xpose.o make[3]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/RSL_LITE' ( if [ ! -e /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/gen_comms.c ] ; then \ /bin/cp /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/gen_comms_warning /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/gen_comms.c ; \ cat /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/RSL_LITE/gen_comms.c >> /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/gen_comms.c ; fi ) ( if [ ! 
-e module_dm.F ] ; then /bin/cp module_dm_warning module_dm.F ; \ cat /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/RSL_LITE/module_dm.F >> module_dm.F ; fi ) make[2]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame' make[1]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" toolsdir make[1]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' -------------------------------------- ( cd tools ; make -i -r CC_TOOLS="gcc -DIWORDSIZE=4 -DMAX_HISTORY=25" ) make[2]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools' gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g registry.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g my_strtok.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g reg_parse.c reg_parse.c: In function 'pre_parse': reg_parse.c:258: warning: format not a string literal and no format arguments gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g data.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g type.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g misc.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_defs.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_allocs.c gcc -DIWORDSIZE=4 
-DMAX_HISTORY=25 -c -g gen_mod_state_descr.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_scalar_indices.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_args.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_config.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g sym.c sym.c: In function 'sym_init': sym.c:77: warning: incompatible implicit declaration of built-in function 'exit' sym.c: In function 'sym_forget': sym.c:157: warning: incompatible implicit declaration of built-in function 'exit' gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g symtab_gen.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_model_data_ord.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_interp.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_comms.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_scalar_derefs.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g set_dim_strs.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_wrf_io.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g gen_streams.c gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -c -g standard.c standard.c: In function 'main': standard.c:43: warning: incompatible implicit declaration of built-in function 'strncpy' standard.c:78: warning: incompatible implicit declaration of built-in function 'strcpy' standard.c:88: warning: incompatible implicit declaration of built-in function 'strcat' 
gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -o standard.exe -g standard.o gcc -DIWORDSIZE=4 -DMAX_HISTORY=25 -o registry -g registry.o my_strtok.o reg_parse.o data.o type.o misc.o gen_defs.o gen_allocs.o gen_mod_state_descr.o gen_scalar_indices.o gen_args.o gen_config.o sym.o symtab_gen.o gen_model_data_ord.o gen_interp.o gen_comms.o gen_scalar_derefs.o set_dim_strs.o gen_wrf_io.o gen_streams.o make[2]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools' make[1]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' /bin/rm -f main/libwrflib.a main/libwrflib.lib make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" framework make[1]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' -------------------------------------- ( cd frame ; make -i -r -j 2 framework; \ cd ../external/io_netcdf ; \ make -i -r NETCDFPATH="/home/shankha/work/packages/netcdf/install" FC="gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " RANLIB="ranlib" \ CPP="/lib/cpp -C -P" LDFLAGS=" -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " TRADFLAG="-traditional" 
ESMF_IO_LIB_EXT="-L/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -lesmf_time" \ ESMF_MOD_DEPENDENCE="/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90/module_utility.o" AR="INTERNAL_BUILD_ERROR_SHOULD_NOT_NEED_AR" diffwrf; \ cd ../io_int ; \ make -i -r SFC="gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " FC="gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " RANLIB="ranlib" CPP="/lib/cpp -C -P" \ TRADFLAG="-traditional" ESMF_IO_LIB_EXT="-L/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -lesmf_time" \ ESMF_MOD_DEPENDENCE="/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90/module_utility.o" AR="INTERNAL_BUILD_ERROR_SHOULD_NOT_NEED_AR" diffwrf ; \ cd ../../frame ) make[2]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame' ( cd .. ; tools/registry -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -DNEW_BDYS Registry/Registry ) ; rm -f wrf_shutdown.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional wrf_shutdown.F > wrf_shutdown.bb opening Registry/registry.dimspec including Registry/registry.dimspec /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf_shutdown.bb | /lib/cpp -C -P > wrf_shutdown.f90 rm -f wrf_shutdown.b wrf_shutdown.bb if fgrep -iq '!$OMP' wrf_shutdown.f90 ; then \ if [ -n "" ] ; then echo COMPILING wrf_shutdown.F WITH OMP ; fi ; \ mpif77 -o wrf_shutdown.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_shutdown.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING wrf_shutdown.F WITHOUT OMP ; fi ; \ mpif77 -o wrf_shutdown.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_shutdown.f90 ; \ fi opening Registry/registry.les including Registry/registry.les opening Registry/registry.cam including Registry/registry.cam rm -f module_sm.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sm.F > module_sm.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sm.bb | /lib/cpp -C -P > module_sm.f90 rm -f module_sm.b module_sm.bb if fgrep -iq '!$OMP' module_sm.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sm.F WITH OMP ; fi ; \ mpif77 -o module_sm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sm.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sm.F WITHOUT OMP ; fi ; \ mpif77 -o module_sm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sm.f90 ; \ fi if [ "m4 -G" = NA ] ; then \ /bin/cp ../arch/md_calls.inc . ; \ else \ m4 -G md_calls.m4 > md_calls.inc ; \ fi /lib/cpp -C -P -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional -I../inc module_internal_header_util.F > module_internal_header_util.f90 gfortran -c -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 module_internal_header_util.f90 opening Registry/registry.io_boilerplate including Registry/registry.io_boilerplate opening Registry/io_boilerplate_temporary.inc including Registry/io_boilerplate_temporary.inc mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -c -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 wrf_num_bytes_between.c rm -f libmassv.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional libmassv.F > libmassv.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe libmassv.bb | /lib/cpp -C -P > libmassv.f90 rm -f libmassv.b libmassv.bb if fgrep -iq '!$OMP' libmassv.f90 ; then \ if [ -n "" ] ; then echo COMPILING libmassv.F WITH OMP ; fi ; \ mpif77 -o libmassv.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include libmassv.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING libmassv.F WITHOUT OMP ; fi ; \ mpif77 -o libmassv.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include libmassv.f90 ; \ fi opening Registry/registry.fire including Registry/registry.fire opening Registry/registry.avgflx including Registry/registry.avgflx opening Registry/registry.stoch including Registry/registry.stoch rm -f collect_on_comm.o mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -o collect_on_comm.o -c -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 collect_on_comm.c mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -c -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 -DIWORDSIZE=4 pack_utils.c Registry INFO variable counts: 0d 1920 1d 92 2d 595 3d 434 rm -f module_wrf_error.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 
-DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_wrf_error.F > module_wrf_error.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_wrf_error.bb | /lib/cpp -C -P > module_wrf_error.f90 rm -f module_wrf_error.b module_wrf_error.bb if fgrep -iq '!$OMP' module_wrf_error.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_wrf_error.F WITH OMP ; fi ; \ mpif77 -o module_wrf_error.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_wrf_error.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_wrf_error.F WITHOUT OMP ; fi ; \ mpif77 -o module_wrf_error.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_wrf_error.f90 ; \ fi rm -f wrf_debug.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional wrf_debug.F > wrf_debug.bb ADVISORY: RSL_LITE version of gen_comms is linked in with registry program. 
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf_debug.bb | /lib/cpp -C -P > wrf_debug.f90 rm -f wrf_debug.b wrf_debug.bb if fgrep -iq '!$OMP' wrf_debug.f90 ; then \ if [ -n "" ] ; then echo COMPILING wrf_debug.F WITH OMP ; fi ; \ mpif77 -o wrf_debug.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_debug.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING wrf_debug.F WITHOUT OMP ; fi ; \ mpif77 -o wrf_debug.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_debug.f90 ; \ fi rm -f module_state_description.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_state_description.F > module_state_description.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_state_description.bb | /lib/cpp -C -P > module_state_description.f90 rm -f module_state_description.b module_state_description.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_state_description.f90 rm -f module_driver_constants.o rm -f module_streams.o /lib/cpp -C -P 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_driver_constants.F > module_driver_constants.bb /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_streams.F > module_streams.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_driver_constants.bb | /lib/cpp -C -P > module_driver_constants.f90 /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_streams.bb | /lib/cpp -C -P > module_streams.f90 rm -f module_driver_constants.b module_driver_constants.bb rm -f module_streams.b module_streams.bb if fgrep -iq '!$OMP' module_driver_constants.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_driver_constants.F WITH OMP ; fi ; \ mpif77 -o module_driver_constants.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_driver_constants.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_driver_constants.F WITHOUT OMP ; fi ; \ mpif77 -o module_driver_constants.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_driver_constants.f90 ; \ fi if fgrep -iq '!$OMP' module_streams.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_streams.F WITH OMP ; fi ; \ mpif77 -o module_streams.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_streams.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_streams.F WITHOUT OMP ; fi ; \ mpif77 -o module_streams.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_streams.f90 ; \ fi rm -f module_timing.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS 
-DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_timing.F > module_timing.bb rm -f module_domain_type.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_domain_type.F > module_domain_type.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_timing.bb | /lib/cpp -C -P > module_timing.f90 rm -f module_timing.b module_timing.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_domain_type.bb | /lib/cpp -C -P > module_domain_type.f90 if fgrep -iq '!$OMP' module_timing.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_timing.F WITH OMP ; fi ; \ mpif77 -o module_timing.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_timing.f90 ; \ else \ if 
[ -n "" ] ; then echo COMPILING module_timing.F WITHOUT OMP ; fi ; \ mpif77 -o module_timing.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_timing.f90 ; \ fi rm -f module_domain_type.b module_domain_type.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_domain_type.f90 rm -f module_machine.o /lib/cpp -C -P 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_machine.F > module_machine.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_machine.bb | /lib/cpp -C -P > module_machine.f90 rm -f module_machine.b module_machine.bb if fgrep -iq '!$OMP' module_machine.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_machine.F WITH OMP ; fi ; \ mpif77 -o module_machine.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_machine.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_machine.F WITHOUT OMP ; fi ; \ mpif77 -o module_machine.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_machine.f90 ; \ fi rm -f module_quilt_outbuf_ops.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_quilt_outbuf_ops.F > module_quilt_outbuf_ops.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_quilt_outbuf_ops.bb | /lib/cpp -C -P > module_quilt_outbuf_ops.f90 rm -f module_quilt_outbuf_ops.b module_quilt_outbuf_ops.bb if fgrep -iq '!$OMP' module_quilt_outbuf_ops.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_quilt_outbuf_ops.F WITH OMP ; fi ; \ mpif77 -o module_quilt_outbuf_ops.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_quilt_outbuf_ops.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_quilt_outbuf_ops.F WITHOUT OMP ; fi ; \ mpif77 -o module_quilt_outbuf_ops.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_quilt_outbuf_ops.f90 ; \ fi rm -f module_configure.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_configure.F > module_configure.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_configure.bb | /lib/cpp -C -P > module_configure.f90 rm -f module_configure.b module_configure.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include 
module_configure.f90 rm -f module_alloc_space_0.o rm -f module_alloc_space_1.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_alloc_space_0.F > module_alloc_space_0.bb /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_alloc_space_1.F > module_alloc_space_1.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_0.bb | /lib/cpp -C -P > module_alloc_space_0.f90 /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_1.bb | /lib/cpp -C -P > module_alloc_space_1.f90 rm -f module_alloc_space_0.b module_alloc_space_0.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_0.f90 rm -f module_alloc_space_1.b module_alloc_space_1.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_1.f90 rm -f module_alloc_space_2.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_alloc_space_2.F > module_alloc_space_2.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_2.bb | /lib/cpp -C -P > module_alloc_space_2.f90 rm -f module_alloc_space_2.b module_alloc_space_2.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_2.f90 rm -f module_alloc_space_3.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_alloc_space_3.F > module_alloc_space_3.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_3.bb | /lib/cpp -C -P > module_alloc_space_3.f90 rm -f module_alloc_space_3.b module_alloc_space_3.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_3.f90 rm -f module_alloc_space_4.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_alloc_space_4.F > module_alloc_space_4.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_4.bb | /lib/cpp -C -P > module_alloc_space_4.f90 rm -f module_alloc_space_4.b module_alloc_space_4.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_4.f90 rm -f module_alloc_space_5.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_alloc_space_5.F > module_alloc_space_5.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_5.bb | /lib/cpp -C -P > module_alloc_space_5.f90 rm -f module_alloc_space_5.b module_alloc_space_5.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_5.f90 rm -f module_alloc_space_6.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_alloc_space_6.F > module_alloc_space_6.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_6.bb | /lib/cpp -C -P > module_alloc_space_6.f90 rm -f module_alloc_space_6.b module_alloc_space_6.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_6.f90 rm -f module_alloc_space_7.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_alloc_space_7.F > module_alloc_space_7.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_7.bb | /lib/cpp -C -P > module_alloc_space_7.f90 rm -f module_alloc_space_7.b module_alloc_space_7.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_7.f90 rm -f module_alloc_space_8.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_alloc_space_8.F > module_alloc_space_8.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_8.bb | /lib/cpp -C -P > module_alloc_space_8.f90 rm -f module_alloc_space_8.b module_alloc_space_8.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_8.f90 rm -f module_alloc_space_9.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_alloc_space_9.F > module_alloc_space_9.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_alloc_space_9.bb | /lib/cpp -C -P > module_alloc_space_9.f90 rm -f module_alloc_space_9.b module_alloc_space_9.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_alloc_space_9.f90 /lib/cpp -C -P -DNNN=0 -I../inc -DNL_get_ROUTINES nl_access_routines.F > yy0.f90 mpif77 -o nl_get_0_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc 
-I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include yy0.f90 /lib/cpp -C -P -DNNN=1 -I../inc -DNL_get_ROUTINES nl_access_routines.F > yy1.f90 mpif77 -o nl_get_1_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include yy1.f90 rm -f yy0.f90 /lib/cpp -C -P -DNNN=2 -I../inc -DNL_get_ROUTINES nl_access_routines.F > yy2.f90 mpif77 -o nl_get_2_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include yy2.f90 rm -f yy1.f90 /lib/cpp -C -P -DNNN=3 -I../inc -DNL_get_ROUTINES nl_access_routines.F > yy3.f90 mpif77 -o nl_get_3_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include yy3.f90 rm -f yy2.f90 /lib/cpp -C -P -DNNN=4 -I../inc -DNL_get_ROUTINES nl_access_routines.F > yy4.f90 mpif77 -o nl_get_4_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include yy4.f90 rm -f yy3.f90 /lib/cpp -C -P -DNNN=5 -I../inc -DNL_get_ROUTINES nl_access_routines.F > yy5.f90 mpif77 -o nl_get_5_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include yy5.f90 rm -f yy4.f90 /lib/cpp -C -P -DNNN=6 -I../inc -DNL_get_ROUTINES nl_access_routines.F > yy6.f90 mpif77 -o nl_get_6_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include yy6.f90 rm -f yy5.f90 /lib/cpp -C -P -DNNN=7 -I../inc -DNL_get_ROUTINES nl_access_routines.F > yy7.f90 mpif77 -o nl_get_7_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include yy7.f90 rm -f yy6.f90 /lib/cpp -C -P -DNNN=0 -I../inc -DNL_set_ROUTINES nl_access_routines.F > xx0.f90 mpif77 -o nl_set_0_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include xx0.f90 rm -f yy7.f90 /lib/cpp -C -P -DNNN=1 -I../inc -DNL_set_ROUTINES nl_access_routines.F > xx1.f90 mpif77 -o nl_set_1_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include xx1.f90 rm -f xx0.f90 /lib/cpp -C -P -DNNN=2 -I../inc -DNL_set_ROUTINES nl_access_routines.F > xx2.f90 mpif77 -o nl_set_2_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include xx2.f90 rm -f xx1.f90 /lib/cpp -C -P -DNNN=3 -I../inc -DNL_set_ROUTINES nl_access_routines.F > xx3.f90 mpif77 -o nl_set_3_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include xx3.f90 rm -f xx2.f90 /lib/cpp -C -P -DNNN=4 -I../inc -DNL_set_ROUTINES nl_access_routines.F > xx4.f90 mpif77 -o nl_set_4_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include xx4.f90 rm -f xx3.f90 /lib/cpp -C -P -DNNN=5 -I../inc -DNL_set_ROUTINES nl_access_routines.F > xx5.f90 mpif77 -o nl_set_5_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include xx5.f90 rm -f xx4.f90 /lib/cpp -C -P -DNNN=6 -I../inc -DNL_set_ROUTINES nl_access_routines.F > xx6.f90 mpif77 -o nl_set_6_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include xx6.f90 rm -f xx5.f90 /lib/cpp -C -P -DNNN=7 -I../inc -DNL_set_ROUTINES nl_access_routines.F > xx7.f90 mpif77 -o nl_set_7_routines.o -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include xx7.f90 rm -f xx6.f90 rm -f module_domain.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_domain.F > module_domain.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_domain.bb | /lib/cpp -C -P > module_domain.f90 rm -f xx7.f90 rm -f module_domain.b module_domain.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_domain.f90 rm -f module_nesting.o rm -f module_tiles.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_nesting.F > module_nesting.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_tiles.F > module_tiles.b /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_tiles.b > module_tiles.f90 /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_nesting.bb | /lib/cpp -C -P > module_nesting.f90 rm -f module_tiles.b rm -f module_nesting.b module_nesting.bb if fgrep -iq '!$OMP' module_tiles.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_tiles.F WITH OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_tiles.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_tiles.F WITHOUT OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_tiles.f90 ; \ fi if fgrep -iq '!$OMP' module_nesting.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_nesting.F WITH OMP ; fi ; \ mpif77 -o module_nesting.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_nesting.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_nesting.F WITHOUT OMP ; fi ; \ mpif77 -o module_nesting.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w 
-ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_nesting.f90 ; \ fi rm -f module_comm_nesting_dm.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_comm_nesting_dm.F > module_comm_nesting_dm.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_comm_nesting_dm.bb | /lib/cpp -C -P > module_comm_nesting_dm.f90 rm -f module_comm_nesting_dm.b module_comm_nesting_dm.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_comm_nesting_dm.f90 rm -f module_comm_dm_0.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_comm_dm_0.F > module_comm_dm_0.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_comm_dm_0.bb | /lib/cpp -C -P > module_comm_dm_0.f90 rm -f module_comm_dm_0.b module_comm_dm_0.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_comm_dm_0.f90 rm -f module_comm_dm_1.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_comm_dm_1.F > module_comm_dm_1.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_comm_dm_1.bb | /lib/cpp -C -P > module_comm_dm_1.f90 rm -f module_comm_dm_1.b module_comm_dm_1.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_comm_dm_1.f90 rm -f module_comm_dm_2.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_comm_dm_2.F > module_comm_dm_2.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_comm_dm_2.bb | /lib/cpp -C -P > module_comm_dm_2.f90 rm -f module_comm_dm_2.b module_comm_dm_2.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_comm_dm_2.f90 rm -f module_comm_dm_3.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_comm_dm_3.F > module_comm_dm_3.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_comm_dm_3.bb | /lib/cpp -C -P > module_comm_dm_3.f90 rm -f module_comm_dm_3.b module_comm_dm_3.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_comm_dm_3.f90 rm -f module_comm_dm_4.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_comm_dm_4.F > module_comm_dm_4.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_comm_dm_4.bb | /lib/cpp -C -P > module_comm_dm_4.f90 rm -f module_comm_dm_4.b module_comm_dm_4.bb if fgrep -iq '!$OMP' module_comm_dm_4.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_comm_dm_4.F WITH OMP ; fi ; \ mpif77 -o module_comm_dm_4.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_comm_dm_4.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_comm_dm_4.F WITHOUT OMP ; fi ; \ mpif77 -o module_comm_dm_4.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_comm_dm_4.f90 ; \ fi rm -f module_integrate.o /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_integrate.F > module_integrate.b /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_integrate.b > module_integrate.f90 rm -f module_integrate.b if fgrep -iq '!$OMP' module_integrate.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_integrate.F WITH OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_integrate.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_integrate.F WITHOUT OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_integrate.f90 ; \ fi rm -f module_comm_dm.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_comm_dm.F > module_comm_dm.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_comm_dm.bb | /lib/cpp -C -P > module_comm_dm.f90 rm -f module_comm_dm.b module_comm_dm.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_comm_dm.f90 rm -f module_dm.o /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_dm.F > module_dm.b /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_dm.b > module_dm.f90 rm -f module_dm.b if fgrep -iq '!$OMP' module_dm.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_dm.F WITH OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_dm.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_dm.F WITHOUT OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_dm.f90 ; \ fi rm -f 
module_io.o rm -f module_io_quilt.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_io.F > module_io.bb /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_io_quilt.F > module_io_quilt.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_io_quilt.bb | /lib/cpp -C -P > module_io_quilt.f90 rm -f module_io_quilt.b module_io_quilt.bb if fgrep -iq '!$OMP' module_io_quilt.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_io_quilt.F WITH OMP ; fi ; \ mpif77 -o module_io_quilt.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_io_quilt.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_io_quilt.F WITHOUT OMP ; fi ; \ mpif77 -o module_io_quilt.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_io_quilt.f90 ; \ fi /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_io.bb | /lib/cpp -C -P > module_io.f90 rm -f module_io.b module_io.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_io.f90
ar ru ../main/libwrflib.a module_driver_constants.o module_domain_type.o module_streams.o module_domain.o module_integrate.o module_timing.o module_configure.o module_tiles.o module_machine.o module_nesting.o module_wrf_error.o module_state_description.o module_sm.o module_io.o module_comm_dm.o module_comm_dm_0.o module_comm_dm_1.o module_comm_dm_2.o module_comm_dm_3.o module_comm_dm_4.o module_comm_nesting_dm.o module_dm.o module_quilt_outbuf_ops.o module_io_quilt.o wrf_num_bytes_between.o wrf_shutdown.o wrf_debug.o libmassv.o collect_on_comm.o nl_get_0_routines.o nl_get_1_routines.o nl_get_2_routines.o nl_get_3_routines.o nl_get_4_routines.o nl_get_5_routines.o nl_get_6_routines.o nl_get_7_routines.o nl_set_0_routines.o nl_set_1_routines.o nl_set_2_routines.o nl_set_3_routines.o nl_set_4_routines.o nl_set_5_routines.o nl_set_6_routines.o nl_set_7_routines.o module_alloc_space_0.o module_alloc_space_1.o module_alloc_space_2.o module_alloc_space_3.o module_alloc_space_4.o module_alloc_space_5.o module_alloc_space_6.o module_alloc_space_7.o module_alloc_space_8.o module_alloc_space_9.o
ar: creating ../main/libwrflib.a
ranlib ../main/libwrflib.a
make[2]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame'
make[2]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf'
x=`echo "gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " | awk '{print $1}'` ; export x ; \
if [ $x = "gfortran" ] ; then \
  echo removing external declaration of iargc for gfortran ; \
  /lib/cpp -C -P -C -P -traditional -I/home/shankha/work/packages/netcdf/install/include -I../ioapi_share diffwrf.F90 | sed '/integer *, *external.*iargc/d' > diffwrf.f ;\
else \
  /lib/cpp -C -P -C -P -traditional -I/home/shankha/work/packages/netcdf/install/include -I../ioapi_share diffwrf.F90 > diffwrf.f ; \
fi
removing external declaration of iargc for gfortran
gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -c -I/home/shankha/work/packages/netcdf/install/include -I../ioapi_share diffwrf.f
diffwrf io_netcdf is being built now.
make[2]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf'
make[2]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int'
if [ -f ../../frame/pack_utils.o ] ; then \
  mv diffwrf.F diffwrf.F90 ; \
  x=`echo "gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 " | awk '{print $1}'` ; export x ; \
  if [ $x = "gfortran" ] ; then \
    echo removing external declaration of iargc for gfortran ; \
    /lib/cpp -C -P -traditional -I../ioapi_share diffwrf.F90 | sed '/integer *, *external.*iargc/d' > diffwrf.f ; \
  else \
    /lib/cpp -C -P -traditional -I../ioapi_share diffwrf.F90 > diffwrf.f ; \
  fi ; \
  gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -c -I../ioapi_share diffwrf.f ; \
  mv diffwrf.F90 diffwrf.F ; \
  gfortran -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -o diffwrf diffwrf.o io_int.o \
    ../../frame/pack_utils.o ../../frame/module_internal_header_util.o \
    ../../frame/module_driver_constants.o \
    ../../frame/module_machine.o ../../frame/wrf_debug.o ../../frame/module_wrf_error.o \
    -L/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -lesmf_time ; fi
removing external declaration of iargc for gfortran
make[2]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int'
make[1]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3'
make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" shared make[1]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' -------------------------------------- ( cd share ; make -i -r -j 2 ) make[2]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share' rm -f module_bc.o rm -f module_bc_time_utilities.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_bc.F > module_bc.bb /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_bc_time_utilities.F > module_bc_time_utilities.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_bc_time_utilities.bb | /lib/cpp -C -P > module_bc_time_utilities.f90 /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_bc.bb | /lib/cpp -C -P > module_bc.f90 rm -f module_bc_time_utilities.b module_bc_time_utilities.bb if fgrep -iq '!$OMP' module_bc_time_utilities.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_bc_time_utilities.F WITH OMP ; fi ; \ mpif77 -o module_bc_time_utilities.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bc_time_utilities.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_bc_time_utilities.F WITHOUT OMP ; fi ; \ mpif77 -o module_bc_time_utilities.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bc_time_utilities.f90 ; \ fi rm -f module_bc.b module_bc.bb if fgrep -iq '!$OMP' module_bc.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_bc.F WITH OMP ; fi ; \ mpif77 -o module_bc.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bc.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_bc.F WITHOUT OMP ; fi ; \ mpif77 -o module_bc.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bc.f90 ; \ fi rm -f module_model_constants.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_model_constants.F > module_model_constants.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_model_constants.bb | /lib/cpp -C -P > module_model_constants.f90 rm -f module_model_constants.b module_model_constants.bb if fgrep -iq '!$OMP' module_model_constants.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_model_constants.F WITH OMP ; fi ; \ mpif77 -o module_model_constants.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_model_constants.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_model_constants.F WITHOUT OMP ; fi ; \ mpif77 -o module_model_constants.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_model_constants.f90 ; \ fi rm -f module_get_file_names.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_get_file_names.F > module_get_file_names.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_get_file_names.bb | /lib/cpp -C -P > module_get_file_names.f90 rm -f module_get_file_names.b module_get_file_names.bb if fgrep -iq '!$OMP' module_get_file_names.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_get_file_names.F WITH OMP ; fi ; \ mpif77 -o module_get_file_names.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_get_file_names.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_get_file_names.F WITHOUT OMP ; fi ; \ mpif77 -o module_get_file_names.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_get_file_names.f90 ; \ fi rm -f module_MPP.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_MPP.F > module_MPP.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_MPP.bb | /lib/cpp -C -P > module_MPP.f90 rm -f module_MPP.b module_MPP.bb if fgrep -iq '!$OMP' module_MPP.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_MPP.F WITH OMP ; fi ; \ mpif77 -o module_MPP.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_MPP.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_MPP.F WITHOUT OMP ; fi ; \ mpif77 -o module_MPP.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_MPP.f90 ; \ fi rm -f module_compute_geop.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_compute_geop.F > module_compute_geop.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_compute_geop.bb | /lib/cpp -C -P > module_compute_geop.f90 rm -f module_compute_geop.b module_compute_geop.bb if fgrep -iq '!$OMP' module_compute_geop.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_compute_geop.F WITH OMP ; fi ; \ mpif77 -o module_compute_geop.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc 
-I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_compute_geop.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_compute_geop.F WITHOUT OMP ; fi ; \ mpif77 -o module_compute_geop.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_compute_geop.f90 ; \ fi rm -f module_check_a_mundo.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_check_a_mundo.F > module_check_a_mundo.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_check_a_mundo.bb | /lib/cpp -C -P > module_check_a_mundo.f90 rm -f module_check_a_mundo.b module_check_a_mundo.bb if fgrep -iq '!$OMP' module_check_a_mundo.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_check_a_mundo.F WITH OMP ; fi ; \ mpif77 -o module_check_a_mundo.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_check_a_mundo.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_check_a_mundo.F WITHOUT OMP ; fi ; \ mpif77 -o module_check_a_mundo.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_check_a_mundo.f90 ; \ fi rm -f module_llxy.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_llxy.F > module_llxy.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_llxy.bb | /lib/cpp -C -P > module_llxy.f90 rm -f module_llxy.b module_llxy.bb if fgrep -iq '!$OMP' module_llxy.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_llxy.F WITH OMP ; fi ; \ mpif77 -o module_llxy.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc 
-I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_llxy.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_llxy.F WITHOUT OMP ; fi ; \ mpif77 -o module_llxy.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_llxy.f90 ; \ fi rm -f mediation_interp_domain.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional mediation_interp_domain.F > mediation_interp_domain.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe mediation_interp_domain.bb | /lib/cpp -C -P > mediation_interp_domain.f90 rm -f mediation_interp_domain.b mediation_interp_domain.bb if fgrep -iq '!$OMP' mediation_interp_domain.f90 ; then \ if [ -n "" ] ; then echo COMPILING mediation_interp_domain.F WITH OMP ; fi ; \ mpif77 -o mediation_interp_domain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_interp_domain.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING mediation_interp_domain.F WITHOUT OMP ; fi ; \ mpif77 -o mediation_interp_domain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_interp_domain.f90 ; \ fi rm -f mediation_force_domain.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional mediation_force_domain.F > mediation_force_domain.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe mediation_force_domain.bb | /lib/cpp -C -P > mediation_force_domain.f90 rm -f mediation_force_domain.b mediation_force_domain.bb if fgrep -iq '!$OMP' mediation_force_domain.f90 ; then \ if [ -n "" ] ; then echo COMPILING mediation_force_domain.F WITH OMP ; fi ; \ mpif77 -o mediation_force_domain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_force_domain.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING mediation_force_domain.F WITHOUT OMP ; fi ; \ mpif77 -o mediation_force_domain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_force_domain.f90 ; \ fi rm -f mediation_feedback_domain.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional mediation_feedback_domain.F > mediation_feedback_domain.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe mediation_feedback_domain.bb | /lib/cpp -C -P > mediation_feedback_domain.f90 rm -f mediation_feedback_domain.b mediation_feedback_domain.bb if fgrep -iq '!$OMP' mediation_feedback_domain.f90 ; then \ if [ -n "" ] ; then echo COMPILING mediation_feedback_domain.F WITH OMP ; fi ; \ mpif77 -o mediation_feedback_domain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_feedback_domain.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING mediation_feedback_domain.F WITHOUT OMP ; fi ; \ mpif77 -o mediation_feedback_domain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_feedback_domain.f90 ; \ fi rm -f solve_interface.o /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe solve_interface.F > solve_interface.b /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional solve_interface.b > solve_interface.f90 rm -f solve_interface.b if fgrep -iq '!$OMP' solve_interface.f90 ; then \ if [ -n "" ] ; then echo COMPILING solve_interface.F WITH OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include solve_interface.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING solve_interface.F WITHOUT OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include solve_interface.f90 ; \ fi rm -f wrf_tsin.o /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf_tsin.F > wrf_tsin.b /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional wrf_tsin.b > wrf_tsin.f90 rm -f wrf_tsin.b if fgrep -iq '!$OMP' wrf_tsin.f90 ; then \ if [ -n "" ] ; then echo COMPILING wrf_tsin.F WITH OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_tsin.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING wrf_tsin.F WITHOUT OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_tsin.f90 ; \ fi rm -f set_timekeeping.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional set_timekeeping.F > set_timekeeping.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe set_timekeeping.bb | /lib/cpp -C -P > set_timekeeping.f90 rm -f set_timekeeping.b set_timekeeping.bb if fgrep -iq '!$OMP' set_timekeeping.f90 ; then \ if [ -n "" ] ; then echo COMPILING set_timekeeping.F WITH OMP ; fi ; \ mpif77 -o set_timekeeping.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include set_timekeeping.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING set_timekeeping.F WITHOUT OMP ; fi ; \ mpif77 -o set_timekeeping.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include set_timekeeping.f90 ; \ fi rm -f interp_fcn.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional interp_fcn.F > interp_fcn.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe interp_fcn.bb | /lib/cpp -C -P > interp_fcn.f90 rm -f interp_fcn.b interp_fcn.bb if fgrep -iq '!$OMP' interp_fcn.f90 ; then \ if [ -n "" ] ; then echo COMPILING interp_fcn.F WITH OMP ; fi ; \ mpif77 -o interp_fcn.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include 
interp_fcn.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING interp_fcn.F WITHOUT OMP ; fi ; \ mpif77 -o interp_fcn.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include interp_fcn.f90 ; \ fi rm -f sint.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional sint.F > sint.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe sint.bb | /lib/cpp -C -P > sint.f90 rm -f sint.b sint.bb if fgrep -iq '!$OMP' sint.f90 ; then \ if [ -n "" ] ; then echo COMPILING sint.F WITH OMP ; fi ; \ mpif77 -o sint.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include sint.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING sint.F WITHOUT OMP ; fi ; \ mpif77 -o sint.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include sint.f90 ; \ fi rm -f wrf_ext_write_field.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional wrf_ext_write_field.F > wrf_ext_write_field.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf_ext_write_field.bb | /lib/cpp -C -P > wrf_ext_write_field.f90 rm -f wrf_ext_write_field.b wrf_ext_write_field.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_ext_write_field.f90 rm -f wrf_ext_read_field.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 
-DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional wrf_ext_read_field.F > wrf_ext_read_field.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf_ext_read_field.bb | /lib/cpp -C -P > wrf_ext_read_field.f90 rm -f wrf_ext_read_field.b wrf_ext_read_field.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_ext_read_field.f90 rm -f landread.o mpicc -DMPI2_SUPPORT -DFSEEKO64_OK -o landread.o -c -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 landread.c rm -f setfeenv.o gcc -o setfeenv.o -c -w -O3 -c -DLANDREAD_STUB -DDM_PARALLEL -DMAX_HISTORY=25 setfeenv.c rm -f module_date_time.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 
-DNMM_NEST=0 -I. -traditional module_date_time.F > module_date_time.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_date_time.bb | /lib/cpp -C -P > module_date_time.f90 rm -f module_date_time.b module_date_time.bb if fgrep -iq '!$OMP' module_date_time.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_date_time.F WITH OMP ; fi ; \ mpif77 -o module_date_time.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_date_time.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_date_time.F WITHOUT OMP ; fi ; \ mpif77 -o module_date_time.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_date_time.f90 ; \
fi
rm -f wrf_timeseries.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional wrf_timeseries.F > wrf_timeseries.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf_timeseries.bb | /lib/cpp -C -P > wrf_timeseries.f90
rm -f wrf_timeseries.b wrf_timeseries.bb
if fgrep -iq '!$OMP' wrf_timeseries.f90 ; then \
  if [ -n "" ] ; then echo COMPILING wrf_timeseries.F WITH OMP ; fi ; \
  mpif77 -o wrf_timeseries.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_timeseries.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING wrf_timeseries.F WITHOUT OMP ; fi ; \
  mpif77 -o wrf_timeseries.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_timeseries.f90 ; \
fi
rm -f module_io_wrf.o
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_io_wrf.F > module_io_wrf.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_io_wrf.b > module_io_wrf.f90
rm -f module_io_wrf.b
if fgrep -iq '!$OMP' module_io_wrf.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_io_wrf.F WITH OMP ; fi ; \
  mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_io_wrf.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING module_io_wrf.F WITHOUT OMP ; fi ; \
  mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_io_wrf.f90 ; \
fi
rm -f module_soil_pre.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_soil_pre.F > module_soil_pre.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_soil_pre.bb | /lib/cpp -C -P > module_soil_pre.f90
rm -f module_soil_pre.b module_soil_pre.bb
if fgrep -iq '!$OMP' module_soil_pre.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_soil_pre.F WITH OMP ; fi ; \
  mpif77 -o module_soil_pre.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_soil_pre.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING module_soil_pre.F WITHOUT OMP ; fi ; \
  mpif77 -o module_soil_pre.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_soil_pre.f90 ; \
fi
rm -f init_modules.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional init_modules.F > init_modules.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe init_modules.bb | /lib/cpp -C -P > init_modules.f90
rm -f init_modules.b init_modules.bb
if fgrep -iq '!$OMP' init_modules.f90 ; then \
  if [ -n "" ] ; then echo COMPILING init_modules.F WITH OMP ; fi ; \
  mpif77 -o init_modules.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include init_modules.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING init_modules.F WITHOUT OMP ; fi ; \
  mpif77 -o init_modules.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include init_modules.f90 ; \
fi
rm -f input_wrf.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional input_wrf.F > input_wrf.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe input_wrf.bb | /lib/cpp -C -P > input_wrf.f90
rm -f input_wrf.b input_wrf.bb
mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include input_wrf.f90
rm -f output_wrf.o
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe output_wrf.F > output_wrf.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional output_wrf.b > output_wrf.f90
rm -f output_wrf.b
if fgrep -iq '!$OMP' output_wrf.f90 ; then \
  if [ -n "" ] ; then echo COMPILING output_wrf.F WITH OMP ; fi ; \
  mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include output_wrf.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING output_wrf.F WITHOUT OMP ; fi ; \
  mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include output_wrf.f90 ; \
fi
rm -f wrf_bdyout.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional wrf_bdyout.F > wrf_bdyout.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf_bdyout.bb | /lib/cpp -C -P > wrf_bdyout.f90
rm -f wrf_bdyout.b wrf_bdyout.bb
mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_bdyout.f90
rm -f wrf_fddaobs_in.o
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf_fddaobs_in.F > wrf_fddaobs_in.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional wrf_fddaobs_in.b > wrf_fddaobs_in.f90
rm -f wrf_fddaobs_in.b
if fgrep -iq '!$OMP' wrf_fddaobs_in.f90 ; then \
  if [ -n "" ] ; then echo COMPILING wrf_fddaobs_in.F WITH OMP ; fi ; \
  mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_fddaobs_in.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING wrf_fddaobs_in.F WITHOUT OMP ; fi ; \
  mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_fddaobs_in.f90 ; \
fi
rm -f wrf_bdyin.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional wrf_bdyin.F > wrf_bdyin.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf_bdyin.bb | /lib/cpp -C -P > wrf_bdyin.f90
rm -f wrf_bdyin.b wrf_bdyin.bb
mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf_bdyin.f90
rm -f module_io_domain.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_io_domain.F > module_io_domain.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_io_domain.bb | /lib/cpp -C -P > module_io_domain.f90
rm -f module_io_domain.b module_io_domain.bb
if fgrep -iq '!$OMP' module_io_domain.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_io_domain.F WITH OMP ; fi ; \
  mpif77 -o module_io_domain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_io_domain.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING module_io_domain.F WITHOUT OMP ; fi ; \
  mpif77 -o module_io_domain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_io_domain.f90 ; \
fi
rm -f start_domain.o
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe start_domain.F > start_domain.b
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional start_domain.b > start_domain.f90
rm -f start_domain.b
if fgrep -iq '!$OMP' start_domain.f90 ; then \
  if [ -n "" ] ; then echo COMPILING start_domain.F WITH OMP ; fi ; \
  mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include start_domain.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING start_domain.F WITHOUT OMP ; fi ; \
  mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include start_domain.f90 ; \
fi
rm -f module_optional_input.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_optional_input.F > module_optional_input.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_optional_input.bb | /lib/cpp -C -P > module_optional_input.f90
rm -f module_optional_input.b module_optional_input.bb
if fgrep -iq '!$OMP' module_optional_input.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_optional_input.F WITH OMP ; fi ; \
  mpif77 -o module_optional_input.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_optional_input.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING module_optional_input.F WITHOUT OMP ; fi ; \
  mpif77 -o module_optional_input.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_optional_input.f90 ; \
fi
rm -f dfi.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional dfi.F > dfi.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe dfi.bb | /lib/cpp -C -P > dfi.f90
rm -f dfi.b dfi.bb
if fgrep -iq '!$OMP' dfi.f90 ; then \
  if [ -n "" ] ; then echo COMPILING dfi.F WITH OMP ; fi ; \
  mpif77 -o dfi.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include dfi.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING dfi.F WITHOUT OMP ; fi ; \
  mpif77 -o dfi.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include dfi.f90 ; \
fi
rm -f mediation_integrate.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional mediation_integrate.F > mediation_integrate.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe mediation_integrate.bb | /lib/cpp -C -P > mediation_integrate.f90
rm -f mediation_integrate.b mediation_integrate.bb
if fgrep -iq '!$OMP' mediation_integrate.f90 ; then \
  if [ -n "" ] ; then echo COMPILING mediation_integrate.F WITH OMP ; fi ; \
  mpif77 -o mediation_integrate.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_integrate.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING mediation_integrate.F WITHOUT OMP ; fi ; \
  mpif77 -o mediation_integrate.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_integrate.f90 ; \
fi
rm -f mediation_nest_move.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional mediation_nest_move.F > mediation_nest_move.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe mediation_nest_move.bb | /lib/cpp -C -P > mediation_nest_move.f90
rm -f mediation_nest_move.b mediation_nest_move.bb
if fgrep -iq '!$OMP' mediation_nest_move.f90 ; then \
  if [ -n "" ] ; then echo COMPILING mediation_nest_move.F WITH OMP ; fi ; \
  mpif77 -o mediation_nest_move.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_nest_move.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING mediation_nest_move.F WITHOUT OMP ; fi ; \
  mpif77 -o mediation_nest_move.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_nest_move.f90 ; \
fi
rm -f mediation_wrfmain.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional mediation_wrfmain.F > mediation_wrfmain.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe mediation_wrfmain.bb | /lib/cpp -C -P > mediation_wrfmain.f90
rm -f mediation_wrfmain.b mediation_wrfmain.bb
if fgrep -iq '!$OMP' mediation_wrfmain.f90 ; then \
  if [ -n "" ] ; then echo COMPILING mediation_wrfmain.F WITH OMP ; fi ; \
  mpif77 -o mediation_wrfmain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_wrfmain.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING mediation_wrfmain.F WITHOUT OMP ; fi ; \
  mpif77 -o mediation_wrfmain.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include mediation_wrfmain.f90 ; \
fi
if [ 0 -eq 1 ] ; then \
  make -i -r nmm_contrib ; \
  ar ru ../main/libwrflib.a module_bc.o module_bc_time_utilities.o module_io_wrf.o module_date_time.o module_get_file_names.o module_io_domain.o module_model_constants.o module_MPP.o module_optional_input.o module_compute_geop.o module_soil_pre.o module_check_a_mundo.o module_llxy.o dfi.o mediation_integrate.o mediation_interp_domain.o mediation_force_domain.o mediation_feedback_domain.o mediation_nest_move.o mediation_wrfmain.o solve_interface.o start_domain.o init_modules.o set_timekeeping.o interp_fcn.o sint.o input_wrf.o output_wrf.o wrf_timeseries.o wrf_ext_write_field.o wrf_ext_read_field.o wrf_bdyout.o wrf_fddaobs_in.o wrf_bdyin.o wrf_tsin.o landread.o setfeenv.o ; \
else \
  ar ru
../main/libwrflib.a module_bc.o module_bc_time_utilities.o module_io_wrf.o module_date_time.o module_get_file_names.o module_io_domain.o module_model_constants.o module_MPP.o module_optional_input.o module_compute_geop.o module_soil_pre.o module_check_a_mundo.o module_llxy.o dfi.o mediation_integrate.o mediation_interp_domain.o mediation_force_domain.o mediation_feedback_domain.o mediation_nest_move.o mediation_wrfmain.o solve_interface.o start_domain.o init_modules.o set_timekeeping.o interp_fcn.o sint.o input_wrf.o output_wrf.o wrf_timeseries.o wrf_ext_write_field.o wrf_ext_read_field.o wrf_bdyout.o wrf_fddaobs_in.o wrf_bdyin.o wrf_tsin.o landread.o setfeenv.o ; \ fi make[2]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share' make[1]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" physics make[1]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' -------------------------------------- ( cd phys ; make -i -r -j 2 ) make[2]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys' if [ 0 -eq 1 ] ; then \ make -i -r nmm_contrib ; \ ar ru ../main/libwrflib.a module_cam_shr_kind_mod.o 
module_cam_support.o module_cam_shr_const_mod.o module_cam_physconst.o module_cam_cldwat.o module_cam_wv_saturation.o module_cam_esinti.o module_cam_gffgch.o module_cam_error_function.o module_cam_constituents.o module_cam_trb_mtn_stress.o module_cam_molec_diff.o module_cam_upper_bc.o module_cam_bl_diffusion_solver.o module_cam_bl_eddy_diff.o module_bl_ysu.o module_bl_mrf.o module_bl_gfs.o module_bl_myjpbl.o module_bl_qnsepbl.o module_bl_acm.o module_bl_mynn.o module_bl_gwdo.o module_bl_myjurb.o module_bl_boulac.o module_bl_camuwpbl_driver.o module_cu_camuwshcu_driver.o module_cu_camuwshcu.o module_cu_camzm_driver.o module_cu_camzm.o module_bl_temf.o module_cu_g3.o module_cu_kf.o module_cu_bmj.o module_cu_kfeta.o module_cu_tiedtke.o module_cu_gd.o module_cu_nsas.o module_cu_sas.o module_cu_osas.o module_mp_kessler.o module_mp_lin.o module_mp_sbu_ylin.o module_mp_wsm3.o module_mp_wsm5.o module_mp_wsm6.o module_mp_etanew.o module_mp_HWRF.o module_mp_thompson.o module_mp_gsfcgce.o module_mp_morr_two_moment.o module_mp_milbrandt2mom.o module_mp_wdm5.o module_mp_wdm6.o module_ra_sw.o module_ra_gsfcsw.o module_ra_goddard.o module_ra_rrtm.o module_ra_rrtmg_lw.o module_ra_rrtmg_sw.o module_ra_cam_support.o module_ra_cam.o module_ra_gfdleta.o module_ra_HWRF.o module_ra_hs.o module_sf_sfclay.o module_sf_gfs.o module_sf_gfdl.o module_sf_slab.o module_sf_noahdrv.o module_sf_noahlsm.o module_sf_urban.o module_sf_bep.o module_sf_bep_bem.o module_sf_bem.o module_sf_pxlsm.o module_sf_ruclsm.o module_sf_sfcdiags.o module_sf_sfcdiags_ruclsm.o module_sf_sstskin.o module_sf_tmnupdate.o module_sf_oml.o module_sf_myjsfc.o module_sf_qnsesfc.o module_sf_mynn.o module_sf_pxsfclay.o module_sf_temfsfclay.o module_sf_idealscmsfclay.o module_physics_addtendc.o module_physics_init.o module_gfs_machine.o module_gfs_funcphys.o module_gfs_physcons.o module_progtm.o module_pbl_driver.o module_data_gocart_dust.o module_cumulus_driver.o module_shallowcu_driver.o module_microphysics_driver.o 
module_microphysics_zero_out.o module_mixactivate.o module_radiation_driver.o module_surface_driver.o module_diagnostics.o module_fdda_psufddagd.o module_fdda_spnudging.o module_fddagd_driver.o module_fddaobs_rtfdda.o module_fddaobs_driver.o module_wind_generic.o module_wind_fitch.o ; \ else \ make -i -r non_nmm ; \ ar ru ../main/libwrflib.a module_cam_shr_kind_mod.o module_cam_support.o module_cam_shr_const_mod.o module_cam_physconst.o module_cam_cldwat.o module_cam_wv_saturation.o module_cam_esinti.o module_cam_gffgch.o module_cam_error_function.o module_cam_constituents.o module_cam_trb_mtn_stress.o module_cam_molec_diff.o module_cam_upper_bc.o module_cam_bl_diffusion_solver.o module_cam_bl_eddy_diff.o module_bl_ysu.o module_bl_mrf.o module_bl_gfs.o module_bl_myjpbl.o module_bl_qnsepbl.o module_bl_acm.o module_bl_mynn.o module_bl_gwdo.o module_bl_myjurb.o module_bl_boulac.o module_bl_camuwpbl_driver.o module_cu_camuwshcu_driver.o module_cu_camuwshcu.o module_cu_camzm_driver.o module_cu_camzm.o module_bl_temf.o module_cu_g3.o module_cu_kf.o module_cu_bmj.o module_cu_kfeta.o module_cu_tiedtke.o module_cu_gd.o module_cu_nsas.o module_cu_sas.o module_cu_osas.o module_mp_kessler.o module_mp_lin.o module_mp_sbu_ylin.o module_mp_wsm3.o module_mp_wsm5.o module_mp_wsm6.o module_mp_etanew.o module_mp_HWRF.o module_mp_thompson.o module_mp_gsfcgce.o module_mp_morr_two_moment.o module_mp_milbrandt2mom.o module_mp_wdm5.o module_mp_wdm6.o module_ra_sw.o module_ra_gsfcsw.o module_ra_goddard.o module_ra_rrtm.o module_ra_rrtmg_lw.o module_ra_rrtmg_sw.o module_ra_cam_support.o module_ra_cam.o module_ra_gfdleta.o module_ra_HWRF.o module_ra_hs.o module_sf_sfclay.o module_sf_gfs.o module_sf_gfdl.o module_sf_slab.o module_sf_noahdrv.o module_sf_noahlsm.o module_sf_urban.o module_sf_bep.o module_sf_bep_bem.o module_sf_bem.o module_sf_pxlsm.o module_sf_ruclsm.o module_sf_sfcdiags.o module_sf_sfcdiags_ruclsm.o module_sf_sstskin.o module_sf_tmnupdate.o module_sf_oml.o module_sf_myjsfc.o 
module_sf_qnsesfc.o module_sf_mynn.o module_sf_pxsfclay.o module_sf_temfsfclay.o module_sf_idealscmsfclay.o module_physics_addtendc.o module_physics_init.o module_gfs_machine.o module_gfs_funcphys.o module_gfs_physcons.o module_progtm.o module_pbl_driver.o module_data_gocart_dust.o module_cumulus_driver.o module_shallowcu_driver.o module_microphysics_driver.o module_microphysics_zero_out.o module_mixactivate.o module_radiation_driver.o module_surface_driver.o module_diagnostics.o module_fdda_psufddagd.o module_fdda_spnudging.o module_fddagd_driver.o module_fddaobs_rtfdda.o module_fddaobs_driver.o module_wind_generic.o module_wind_fitch.o module_fr_sfire_driver.o module_fr_sfire_driver_wrf.o module_fr_sfire_atm.o module_fr_sfire_model.o module_fr_sfire_core.o module_fr_sfire_phys.o module_fr_sfire_util.o ; \ fi make[3]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys' rm -f module_cam_shr_kind_mod.o rm -f module_cam_error_function.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_cam_shr_kind_mod.F > module_cam_shr_kind_mod.bb /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_cam_error_function.F > module_cam_error_function.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_shr_kind_mod.bb | /lib/cpp -C -P > module_cam_shr_kind_mod.f90 /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_error_function.bb | /lib/cpp -C -P > module_cam_error_function.f90 rm -f module_cam_shr_kind_mod.b module_cam_shr_kind_mod.bb rm -f module_cam_error_function.b module_cam_error_function.bb if fgrep -iq '!$OMP' module_cam_shr_kind_mod.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_shr_kind_mod.F WITH OMP ; fi ; \ mpif77 -o module_cam_shr_kind_mod.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_shr_kind_mod.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_shr_kind_mod.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_shr_kind_mod.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_shr_kind_mod.f90 ; \ fi if fgrep -iq '!$OMP' module_cam_error_function.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_error_function.F WITH OMP ; fi ; \ mpif77 -o module_cam_error_function.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_error_function.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_error_function.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_error_function.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form 
-ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_error_function.f90 ; \ fi rm -f module_bl_ysu.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_bl_ysu.F > module_bl_ysu.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_bl_ysu.bb | /lib/cpp -C -P > module_bl_ysu.f90 rm -f module_bl_ysu.b module_bl_ysu.bb if fgrep -iq '!$OMP' module_bl_ysu.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_bl_ysu.F WITH OMP ; fi ; \ mpif77 -o module_bl_ysu.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_ysu.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_bl_ysu.F WITHOUT OMP ; fi ; \ mpif77 -o module_bl_ysu.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_ysu.f90 ; \ fi rm -f module_bl_mrf.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_bl_mrf.F > module_bl_mrf.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_bl_mrf.bb | /lib/cpp -C -P > module_bl_mrf.f90 rm -f module_bl_mrf.b module_bl_mrf.bb if fgrep -iq '!$OMP' module_bl_mrf.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_bl_mrf.F WITH OMP ; fi ; \ mpif77 -o module_bl_mrf.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_mrf.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_bl_mrf.F WITHOUT OMP ; fi ; \ mpif77 -o module_bl_mrf.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_mrf.f90 ; \ fi rm -f module_gfs_machine.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_gfs_machine.F > module_gfs_machine.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_gfs_machine.bb | /lib/cpp -C -P > module_gfs_machine.f90 rm -f module_gfs_machine.b module_gfs_machine.bb if fgrep -iq '!$OMP' module_gfs_machine.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_gfs_machine.F WITH OMP ; fi ; \ mpif77 -o module_gfs_machine.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_gfs_machine.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_gfs_machine.F WITHOUT OMP ; fi ; \ mpif77 -o module_gfs_machine.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_gfs_machine.f90 ; \ fi rm -f module_bl_myjpbl.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_bl_myjpbl.F > module_bl_myjpbl.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_bl_myjpbl.bb | /lib/cpp -C -P > module_bl_myjpbl.f90 rm -f module_bl_myjpbl.b module_bl_myjpbl.bb if fgrep -iq '!$OMP' module_bl_myjpbl.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_bl_myjpbl.F WITH OMP ; fi ; \ mpif77 -o module_bl_myjpbl.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_myjpbl.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_bl_myjpbl.F WITHOUT OMP ; fi ; \ mpif77 -o module_bl_myjpbl.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_myjpbl.f90 ; \ fi rm -f module_bl_qnsepbl.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_bl_qnsepbl.F > module_bl_qnsepbl.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_bl_qnsepbl.bb | /lib/cpp -C -P > module_bl_qnsepbl.f90
rm -f module_bl_qnsepbl.b module_bl_qnsepbl.bb
if fgrep -iq '!$OMP' module_bl_qnsepbl.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_bl_qnsepbl.F WITH OMP ; fi ; \
  mpif77 -o module_bl_qnsepbl.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_qnsepbl.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING module_bl_qnsepbl.F WITHOUT OMP ; fi ; \
  mpif77 -o module_bl_qnsepbl.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_qnsepbl.f90 ; \
fi
rm -f module_bl_acm.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_bl_acm.F > module_bl_acm.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_bl_acm.bb | /lib/cpp -C -P > module_bl_acm.f90
rm -f module_bl_acm.b module_bl_acm.bb
if fgrep -iq '!$OMP' module_bl_acm.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_bl_acm.F WITH OMP ; fi ; \
  mpif77 -o module_bl_acm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_acm.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING module_bl_acm.F WITHOUT OMP ; fi ; \
  mpif77 -o module_bl_acm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bl_acm.f90 ; \
fi
[The identical rm / cpp / standard.exe / mpif77 sequence then repeats, differing only in the module name, for module_bl_mynn, module_bl_gwdo, module_bl_myjurb, module_bl_boulac, module_bl_temf, module_cu_g3, module_cu_kf, module_cu_bmj, module_cu_kfeta, module_cu_gd, module_cu_nsas, module_mp_kessler and module_mp_lin; the module_mp_lin block is cut off at the end of this excerpt.]
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_lin.f90 ; \ fi rm -f module_mp_sbu_ylin.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_mp_sbu_ylin.F > module_mp_sbu_ylin.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_sbu_ylin.bb | /lib/cpp -C -P > module_mp_sbu_ylin.f90 rm -f module_mp_sbu_ylin.b module_mp_sbu_ylin.bb if fgrep -iq '!$OMP' module_mp_sbu_ylin.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_mp_sbu_ylin.F WITH OMP ; fi ; \ mpif77 -o module_mp_sbu_ylin.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc 
-I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_sbu_ylin.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_mp_sbu_ylin.F WITHOUT OMP ; fi ; \ mpif77 -o module_mp_sbu_ylin.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_sbu_ylin.f90 ; \ fi rm -f module_mp_wsm3.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_mp_wsm3.F > module_mp_wsm3.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_wsm3.bb | /lib/cpp -C -P > module_mp_wsm3.f90 rm -f module_mp_wsm3.b module_mp_wsm3.bb if fgrep -iq '!$OMP' module_mp_wsm3.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_mp_wsm3.F WITH OMP ; fi ; \ mpif77 -o module_mp_wsm3.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wsm3.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_mp_wsm3.F WITHOUT OMP ; fi ; \ mpif77 -o module_mp_wsm3.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wsm3.f90 ; \ fi rm -f module_mp_wsm5.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_mp_wsm5.F > module_mp_wsm5.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_wsm5.bb | /lib/cpp -C -P > module_mp_wsm5.f90 rm -f module_mp_wsm5.b module_mp_wsm5.bb if fgrep -iq '!$OMP' module_mp_wsm5.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_mp_wsm5.F WITH OMP ; fi ; \ mpif77 -o module_mp_wsm5.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wsm5.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_mp_wsm5.F WITHOUT OMP ; fi ; \ mpif77 -o module_mp_wsm5.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wsm5.f90 ; \ fi rm -f module_mp_wsm6.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_mp_wsm6.F > module_mp_wsm6.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_wsm6.bb | /lib/cpp -C -P > module_mp_wsm6.f90 rm -f module_mp_wsm6.b module_mp_wsm6.bb if fgrep -iq '!$OMP' module_mp_wsm6.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_mp_wsm6.F WITH OMP ; fi ; \ mpif77 -o module_mp_wsm6.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wsm6.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_mp_wsm6.F WITHOUT OMP ; fi ; \ mpif77 -o module_mp_wsm6.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wsm6.f90 ; \ fi rm -f module_mp_etanew.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_mp_etanew.F > module_mp_etanew.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_etanew.bb | /lib/cpp -C -P > module_mp_etanew.f90 rm -f module_mp_etanew.b module_mp_etanew.bb if fgrep -iq '!$OMP' module_mp_etanew.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_mp_etanew.F WITH OMP ; fi ; \ mpif77 -o module_mp_etanew.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_etanew.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_mp_etanew.F WITHOUT OMP ; fi ; \ mpif77 -o module_mp_etanew.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_etanew.f90 ; \ fi rm -f module_mp_HWRF.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_mp_HWRF.F > module_mp_HWRF.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_HWRF.bb | /lib/cpp -C -P > module_mp_HWRF.f90 rm -f module_mp_HWRF.b module_mp_HWRF.bb if fgrep -iq '!$OMP' module_mp_HWRF.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_mp_HWRF.F WITH OMP ; fi ; \ mpif77 -o module_mp_HWRF.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_HWRF.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_mp_HWRF.F WITHOUT OMP ; fi ; \ mpif77 -o module_mp_HWRF.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_HWRF.f90 ; \ fi rm -f module_mp_thompson.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_mp_thompson.F > module_mp_thompson.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_thompson.bb | /lib/cpp -C -P > module_mp_thompson.f90 rm -f module_mp_thompson.b module_mp_thompson.bb if fgrep -iq '!$OMP' module_mp_thompson.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_mp_thompson.F WITH OMP ; fi ; \ mpif77 -o module_mp_thompson.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc 
-I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_thompson.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_mp_thompson.F WITHOUT OMP ; fi ; \ mpif77 -o module_mp_thompson.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_thompson.f90 ; \ fi rm -f module_mp_gsfcgce.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_mp_gsfcgce.F > module_mp_gsfcgce.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_gsfcgce.bb | /lib/cpp -C -P > module_mp_gsfcgce.f90 rm -f module_mp_gsfcgce.b module_mp_gsfcgce.bb if fgrep -iq '!$OMP' module_mp_gsfcgce.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_mp_gsfcgce.F WITH OMP ; fi ; \ mpif77 -o module_mp_gsfcgce.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_gsfcgce.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_mp_gsfcgce.F WITHOUT OMP ; fi ; \ mpif77 -o module_mp_gsfcgce.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_gsfcgce.f90 ; \ fi rm -f module_mp_morr_two_moment.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_mp_morr_two_moment.F > module_mp_morr_two_moment.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_morr_two_moment.bb | /lib/cpp -C -P > module_mp_morr_two_moment.f90 rm -f module_mp_morr_two_moment.b module_mp_morr_two_moment.bb if fgrep -iq '!$OMP' module_mp_morr_two_moment.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_mp_morr_two_moment.F WITH OMP ; fi ; \ mpif77 -o module_mp_morr_two_moment.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_morr_two_moment.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_mp_morr_two_moment.F WITHOUT OMP ; fi ; \ mpif77 -o module_mp_morr_two_moment.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_morr_two_moment.f90 ; \ fi rm -f module_mp_milbrandt2mom.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_mp_milbrandt2mom.F > module_mp_milbrandt2mom.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_milbrandt2mom.bb | /lib/cpp -C -P > module_mp_milbrandt2mom.f90
rm -f module_mp_milbrandt2mom.b module_mp_milbrandt2mom.bb
if fgrep -iq '!$OMP' module_mp_milbrandt2mom.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_mp_milbrandt2mom.F WITH OMP ; fi ; \
mpif77 -o module_mp_milbrandt2mom.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_milbrandt2mom.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_mp_milbrandt2mom.F WITHOUT OMP ; fi ; \
mpif77 -o module_mp_milbrandt2mom.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_milbrandt2mom.f90 ; \
fi
rm -f module_mp_wdm5.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_mp_wdm5.F > module_mp_wdm5.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_wdm5.bb | /lib/cpp -C -P > module_mp_wdm5.f90
rm -f module_mp_wdm5.b module_mp_wdm5.bb
if fgrep -iq '!$OMP' module_mp_wdm5.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_mp_wdm5.F WITH OMP ; fi ; \
mpif77 -o module_mp_wdm5.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wdm5.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_mp_wdm5.F WITHOUT OMP ; fi ; \
mpif77 -o module_mp_wdm5.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wdm5.f90 ; \
fi
rm -f module_mp_wdm6.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_mp_wdm6.F > module_mp_wdm6.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_mp_wdm6.bb | /lib/cpp -C -P > module_mp_wdm6.f90
rm -f module_mp_wdm6.b module_mp_wdm6.bb
if fgrep -iq '!$OMP' module_mp_wdm6.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_mp_wdm6.F WITH OMP ; fi ; \
mpif77 -o module_mp_wdm6.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wdm6.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_mp_wdm6.F WITHOUT OMP ; fi ; \
mpif77 -o module_mp_wdm6.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_mp_wdm6.f90 ; \
fi
rm -f module_ra_sw.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_ra_sw.F > module_ra_sw.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_sw.bb | /lib/cpp -C -P > module_ra_sw.f90
rm -f module_ra_sw.b module_ra_sw.bb
if fgrep -iq '!$OMP' module_ra_sw.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_ra_sw.F WITH OMP ; fi ; \
mpif77 -o module_ra_sw.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_sw.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_ra_sw.F WITHOUT OMP ; fi ; \
mpif77 -o module_ra_sw.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_sw.f90 ; \
fi
rm -f module_ra_gsfcsw.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_ra_gsfcsw.F > module_ra_gsfcsw.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_gsfcsw.bb | /lib/cpp -C -P > module_ra_gsfcsw.f90
rm -f module_ra_gsfcsw.b module_ra_gsfcsw.bb
if fgrep -iq '!$OMP' module_ra_gsfcsw.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_ra_gsfcsw.F WITH OMP ; fi ; \
mpif77 -o module_ra_gsfcsw.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_gsfcsw.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_ra_gsfcsw.F WITHOUT OMP ; fi ; \
mpif77 -o module_ra_gsfcsw.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_gsfcsw.f90 ; \
fi
rm -f module_ra_goddard.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_ra_goddard.F > module_ra_goddard.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_goddard.bb | /lib/cpp -C -P > module_ra_goddard.f90
rm -f module_ra_goddard.b module_ra_goddard.bb
if fgrep -iq '!$OMP' module_ra_goddard.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_ra_goddard.F WITH OMP ; fi ; \
mpif77 -o module_ra_goddard.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_goddard.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_ra_goddard.F WITHOUT OMP ; fi ; \
mpif77 -o module_ra_goddard.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_goddard.f90 ; \
fi
rm -f module_ra_rrtm.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_ra_rrtm.F > module_ra_rrtm.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_rrtm.bb | /lib/cpp -C -P > module_ra_rrtm.f90
rm -f module_ra_rrtm.b module_ra_rrtm.bb
if fgrep -iq '!$OMP' module_ra_rrtm.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_ra_rrtm.F WITH OMP ; fi ; \
mpif77 -o module_ra_rrtm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_rrtm.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_ra_rrtm.F WITHOUT OMP ; fi ; \
mpif77 -o module_ra_rrtm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_rrtm.f90 ; \
fi
rm -f module_ra_rrtmg_lw.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_ra_rrtmg_lw.F > module_ra_rrtmg_lw.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_rrtmg_lw.bb | /lib/cpp -C -P > module_ra_rrtmg_lw.f90
rm -f module_ra_rrtmg_lw.b module_ra_rrtmg_lw.bb
if fgrep -iq '!$OMP' module_ra_rrtmg_lw.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_ra_rrtmg_lw.F WITH OMP ; fi ; \
mpif77 -o module_ra_rrtmg_lw.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_rrtmg_lw.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_ra_rrtmg_lw.F WITHOUT OMP ; fi ; \
mpif77 -o module_ra_rrtmg_lw.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_rrtmg_lw.f90 ; \
fi
rm -f module_ra_gfdleta.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_ra_gfdleta.F > module_ra_gfdleta.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_gfdleta.bb | /lib/cpp -C -P > module_ra_gfdleta.f90
rm -f module_ra_gfdleta.b module_ra_gfdleta.bb
if fgrep -iq '!$OMP' module_ra_gfdleta.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_ra_gfdleta.F WITH OMP ; fi ; \
mpif77 -o module_ra_gfdleta.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_gfdleta.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_ra_gfdleta.F WITHOUT OMP ; fi ; \
mpif77 -o module_ra_gfdleta.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_gfdleta.f90 ; \
fi
rm -f module_ra_HWRF.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_ra_HWRF.F > module_ra_HWRF.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_HWRF.bb | /lib/cpp -C -P > module_ra_HWRF.f90
rm -f module_ra_HWRF.b module_ra_HWRF.bb
if fgrep -iq '!$OMP' module_ra_HWRF.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_ra_HWRF.F WITH OMP ; fi ; \
mpif77 -o module_ra_HWRF.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_HWRF.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_ra_HWRF.F WITHOUT OMP ; fi ; \
mpif77 -o module_ra_HWRF.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_HWRF.f90 ; \
fi
rm -f module_ra_hs.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_ra_hs.F > module_ra_hs.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_hs.bb | /lib/cpp -C -P > module_ra_hs.f90
rm -f module_ra_hs.b module_ra_hs.bb
if fgrep -iq '!$OMP' module_ra_hs.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_ra_hs.F WITH OMP ; fi ; \
mpif77 -o module_ra_hs.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_hs.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_ra_hs.F WITHOUT OMP ; fi ; \
mpif77 -o module_ra_hs.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_hs.f90 ; \
fi
rm -f module_sf_sfclay.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sf_sfclay.F > module_sf_sfclay.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_sfclay.bb | /lib/cpp -C -P > module_sf_sfclay.f90
rm -f module_sf_sfclay.b module_sf_sfclay.bb
if fgrep -iq '!$OMP' module_sf_sfclay.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_sf_sfclay.F WITH OMP ; fi ; \
mpif77 -o module_sf_sfclay.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_sfclay.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_sf_sfclay.F WITHOUT OMP ; fi ; \
mpif77 -o module_sf_sfclay.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_sfclay.f90 ; \
fi
rm -f module_progtm.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_progtm.F > module_progtm.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_progtm.bb | /lib/cpp -C -P > module_progtm.f90
rm -f module_progtm.b module_progtm.bb
if fgrep -iq '!$OMP' module_progtm.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_progtm.F WITH OMP ; fi ; \
mpif77 -o module_progtm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_progtm.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_progtm.F WITHOUT OMP ; fi ; \
mpif77 -o module_progtm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_progtm.f90 ; \
fi
rm -f module_sf_slab.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sf_slab.F > module_sf_slab.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_slab.bb | /lib/cpp -C -P > module_sf_slab.f90
rm -f module_sf_slab.b module_sf_slab.bb
if fgrep -iq '!$OMP' module_sf_slab.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_sf_slab.F WITH OMP ; fi ; \
mpif77 -o module_sf_slab.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_slab.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_sf_slab.F WITHOUT OMP ; fi ; \
mpif77 -o module_sf_slab.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_slab.f90 ; \
fi
rm -f module_sf_noahlsm.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sf_noahlsm.F > module_sf_noahlsm.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_noahlsm.bb | /lib/cpp -C -P > module_sf_noahlsm.f90
rm -f module_sf_noahlsm.b module_sf_noahlsm.bb
if fgrep -iq '!$OMP' module_sf_noahlsm.f90 ; then \
if [ -n "" ] ; then echo COMPILING module_sf_noahlsm.F WITH OMP ; fi ; \
mpif77 -o module_sf_noahlsm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_noahlsm.f90 ; \
else \
if [ -n "" ] ; then echo COMPILING module_sf_noahlsm.F WITHOUT OMP ; fi ; \
mpif77 -o module_sf_noahlsm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_noahlsm.f90 ; \ fi rm -f module_data_gocart_dust.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_data_gocart_dust.F > module_data_gocart_dust.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_data_gocart_dust.bb | /lib/cpp -C -P > module_data_gocart_dust.f90 rm -f module_data_gocart_dust.b module_data_gocart_dust.bb if fgrep -iq '!$OMP' module_data_gocart_dust.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_data_gocart_dust.F WITH OMP ; fi ; \ mpif77 -o module_data_gocart_dust.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_data_gocart_dust.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_data_gocart_dust.F WITHOUT OMP ; fi ; \ mpif77 -o module_data_gocart_dust.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_data_gocart_dust.f90 ; \ fi rm -f module_sf_urban.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_sf_urban.F > module_sf_urban.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_urban.bb | /lib/cpp -C -P > module_sf_urban.f90 rm -f module_sf_urban.b module_sf_urban.bb if fgrep -iq '!$OMP' module_sf_urban.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sf_urban.F WITH OMP ; fi ; \ mpif77 -o module_sf_urban.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_urban.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sf_urban.F WITHOUT OMP ; fi ; \ mpif77 -o module_sf_urban.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_urban.f90 ; \ fi rm -f module_sf_bem.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sf_bem.F > module_sf_bem.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_bem.bb | /lib/cpp -C -P > module_sf_bem.f90 rm -f module_sf_bem.b module_sf_bem.bb if fgrep -iq '!$OMP' module_sf_bem.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sf_bem.F WITH OMP ; fi ; \ mpif77 -o module_sf_bem.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_bem.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sf_bem.F WITHOUT OMP ; fi ; \ mpif77 -o module_sf_bem.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_bem.f90 ; \ fi rm -f module_sf_pxlsm.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_sf_pxlsm.F > module_sf_pxlsm.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_pxlsm.bb | /lib/cpp -C -P > module_sf_pxlsm.f90 rm -f module_sf_pxlsm.b module_sf_pxlsm.bb if fgrep -iq '!$OMP' module_sf_pxlsm.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sf_pxlsm.F WITH OMP ; fi ; \ mpif77 -o module_sf_pxlsm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_pxlsm.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sf_pxlsm.F WITHOUT OMP ; fi ; \ mpif77 -o module_sf_pxlsm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_pxlsm.f90 ; \ fi rm -f module_sf_ruclsm.o /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_ruclsm.F > module_sf_ruclsm.b /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sf_ruclsm.b > module_sf_ruclsm.f90 rm -f module_sf_ruclsm.b if fgrep -iq '!$OMP' module_sf_ruclsm.f90 ; then \ echo COMPILING module_sf_ruclsm.F WITH OMP ; \ if [ -n "" ] ; then echo COMPILING module_sf_ruclsm.F WITH OMP ; fi ; \ mpif77 -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_ruclsm.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sf_ruclsm.F WITHOUT OMP ; fi ; \ mpif77 -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_ruclsm.f90 ; \ fi rm -f module_sf_sfcdiags.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_sf_sfcdiags.F > module_sf_sfcdiags.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_sfcdiags.bb | /lib/cpp -C -P > module_sf_sfcdiags.f90 rm -f module_sf_sfcdiags.b module_sf_sfcdiags.bb if fgrep -iq '!$OMP' module_sf_sfcdiags.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sf_sfcdiags.F WITH OMP ; fi ; \ mpif77 -o module_sf_sfcdiags.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_sfcdiags.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sf_sfcdiags.F WITHOUT OMP ; fi ; \ mpif77 -o module_sf_sfcdiags.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_sfcdiags.f90 ; \ fi rm -f module_sf_sfcdiags_ruclsm.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sf_sfcdiags_ruclsm.F > module_sf_sfcdiags_ruclsm.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_sfcdiags_ruclsm.bb | /lib/cpp -C -P > module_sf_sfcdiags_ruclsm.f90 rm -f module_sf_sfcdiags_ruclsm.b module_sf_sfcdiags_ruclsm.bb if fgrep -iq '!$OMP' module_sf_sfcdiags_ruclsm.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sf_sfcdiags_ruclsm.F WITH OMP ; fi ; \ mpif77 -o module_sf_sfcdiags_ruclsm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_sfcdiags_ruclsm.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sf_sfcdiags_ruclsm.F WITHOUT OMP ; fi ; \ mpif77 -o module_sf_sfcdiags_ruclsm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_sfcdiags_ruclsm.f90 ; \ fi rm -f module_sf_sstskin.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_sf_sstskin.F > module_sf_sstskin.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_sstskin.bb | /lib/cpp -C -P > module_sf_sstskin.f90 rm -f module_sf_sstskin.b module_sf_sstskin.bb if fgrep -iq '!$OMP' module_sf_sstskin.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sf_sstskin.F WITH OMP ; fi ; \ mpif77 -o module_sf_sstskin.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_sstskin.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sf_sstskin.F WITHOUT OMP ; fi ; \ mpif77 -o module_sf_sstskin.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_sstskin.f90 ; \ fi rm -f module_sf_tmnupdate.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sf_tmnupdate.F > module_sf_tmnupdate.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_tmnupdate.bb | /lib/cpp -C -P > module_sf_tmnupdate.f90 rm -f module_sf_tmnupdate.b module_sf_tmnupdate.bb if fgrep -iq '!$OMP' module_sf_tmnupdate.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sf_tmnupdate.F WITH OMP ; fi ; \ mpif77 -o module_sf_tmnupdate.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_tmnupdate.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sf_tmnupdate.F WITHOUT OMP ; fi ; \ mpif77 -o module_sf_tmnupdate.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_tmnupdate.f90 ; \ fi rm -f module_sf_oml.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_sf_oml.F > module_sf_oml.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sf_oml.bb | /lib/cpp -C -P > module_sf_oml.f90
rm -f module_sf_oml.b module_sf_oml.bb
if fgrep -iq '!$OMP' module_sf_oml.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_sf_oml.F WITH OMP ; fi ; \
  mpif77 -o module_sf_oml.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_oml.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING module_sf_oml.F WITHOUT OMP ; fi ; \
  mpif77 -o module_sf_oml.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sf_oml.f90 ; \
fi
rm -f module_sf_myjsfc.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sf_myjsfc.F > module_sf_myjsfc.bb

[The same sequence (rm of the stale .o, the cpp pass with the macros above, the standard.exe | cpp pass, and the two-branch mpif77 compile with the flags and include paths above) repeats with only the module name changed for: module_sf_myjsfc.F, module_sf_qnsesfc.F, module_sf_mynn.F, module_sf_pxsfclay.F, module_sf_temfsfclay.F, module_sf_idealscmsfclay.F, module_physics_addtendc.F, module_fdda_psufddagd.F, module_fdda_spnudging.F, module_wind_generic.F, module_microphysics_zero_out.F, module_diagnostics.F, and module_fddagd_driver.F.]

[One exception: module_fddaobs_rtfdda.F is passed through tools/standard.exe before a single cpp pass, and both branches compile it with "mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4" plus the same -I paths, i.e. with no -o flag and none of the -O3/vectorization options.]

[The log breaks off mid-way through the final mpif77 command for module_fddagd_driver.F.]
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_fddagd_driver.f90 ; \ fi rm -f module_fddaobs_driver.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_fddaobs_driver.F > module_fddaobs_driver.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_fddaobs_driver.bb | /lib/cpp -C -P > module_fddaobs_driver.f90 rm -f module_fddaobs_driver.b module_fddaobs_driver.bb if fgrep -iq '!$OMP' module_fddaobs_driver.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_fddaobs_driver.F WITH OMP ; fi ; \ mpif77 -o module_fddaobs_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_fddaobs_driver.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_fddaobs_driver.F WITHOUT OMP ; fi ; \ mpif77 -o module_fddaobs_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_fddaobs_driver.f90 ; \ fi rm -f module_fr_sfire_util.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_fr_sfire_util.F > module_fr_sfire_util.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_fr_sfire_util.bb | /lib/cpp -C -P > module_fr_sfire_util.f90 rm -f module_fr_sfire_util.b module_fr_sfire_util.bb if fgrep -iq '!$OMP' module_fr_sfire_util.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_fr_sfire_util.F WITH OMP ; fi ; \ mpif77 -o module_fr_sfire_util.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_fr_sfire_util.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_fr_sfire_util.F WITHOUT OMP ; fi ; \ mpif77 -o module_fr_sfire_util.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_fr_sfire_util.f90 ; \ fi rm -f module_cam_support.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_cam_support.F > module_cam_support.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_support.bb | /lib/cpp -C -P > module_cam_support.f90 rm -f module_cam_support.b module_cam_support.bb if fgrep -iq '!$OMP' module_cam_support.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_support.F WITH OMP ; fi ; \ mpif77 -o module_cam_support.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_support.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_support.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_support.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_support.f90 ; \ fi rm -f module_cam_shr_const_mod.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_cam_shr_const_mod.F > module_cam_shr_const_mod.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_shr_const_mod.bb | /lib/cpp -C -P > module_cam_shr_const_mod.f90 rm -f module_cam_shr_const_mod.b module_cam_shr_const_mod.bb if fgrep -iq '!$OMP' module_cam_shr_const_mod.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_shr_const_mod.F WITH OMP ; fi ; \ mpif77 -o module_cam_shr_const_mod.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_shr_const_mod.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_shr_const_mod.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_shr_const_mod.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_shr_const_mod.f90 ; \ fi rm -f module_cam_trb_mtn_stress.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_cam_trb_mtn_stress.F > module_cam_trb_mtn_stress.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_trb_mtn_stress.bb | /lib/cpp -C -P > module_cam_trb_mtn_stress.f90 rm -f module_cam_trb_mtn_stress.b module_cam_trb_mtn_stress.bb if fgrep -iq '!$OMP' module_cam_trb_mtn_stress.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_trb_mtn_stress.F WITH OMP ; fi ; \ mpif77 -o module_cam_trb_mtn_stress.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_trb_mtn_stress.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_trb_mtn_stress.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_trb_mtn_stress.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_trb_mtn_stress.f90 ; \ fi rm -f module_cam_upper_bc.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_cam_upper_bc.F > module_cam_upper_bc.bb rm -f module_cam_bl_diffusion_solver.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_cam_bl_diffusion_solver.F > module_cam_bl_diffusion_solver.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_upper_bc.bb | /lib/cpp -C -P > module_cam_upper_bc.f90 rm -f module_cam_upper_bc.b module_cam_upper_bc.bb if fgrep -iq '!$OMP' module_cam_upper_bc.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_upper_bc.F WITH OMP ; fi ; \ mpif77 -o module_cam_upper_bc.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_upper_bc.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_upper_bc.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_upper_bc.o -c -O3 
-ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_upper_bc.f90 ; \ fi /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_bl_diffusion_solver.bb | /lib/cpp -C -P > module_cam_bl_diffusion_solver.f90 rm -f module_gfs_physcons.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_gfs_physcons.F > module_gfs_physcons.bb rm -f module_cam_bl_diffusion_solver.b module_cam_bl_diffusion_solver.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_gfs_physcons.bb | /lib/cpp -C -P > module_gfs_physcons.f90 if fgrep -iq '!$OMP' module_cam_bl_diffusion_solver.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_bl_diffusion_solver.F WITH OMP ; fi ; \ mpif77 -o module_cam_bl_diffusion_solver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_bl_diffusion_solver.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_bl_diffusion_solver.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_bl_diffusion_solver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_bl_diffusion_solver.f90 ; \ fi rm -f module_gfs_physcons.b module_gfs_physcons.bb if fgrep -iq '!$OMP' module_gfs_physcons.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_gfs_physcons.F WITH OMP ; fi ; \ mpif77 -o module_gfs_physcons.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_gfs_physcons.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_gfs_physcons.F WITHOUT OMP ; fi ; \ mpif77 -o module_gfs_physcons.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_gfs_physcons.f90 ; \ fi rm -f module_ra_rrtmg_sw.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_ra_rrtmg_sw.F > module_ra_rrtmg_sw.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_rrtmg_sw.bb | /lib/cpp -C -P > module_ra_rrtmg_sw.f90 rm -f module_ra_rrtmg_sw.b module_ra_rrtmg_sw.bb if fgrep -iq '!$OMP' module_ra_rrtmg_sw.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_ra_rrtmg_sw.F WITH OMP ; fi ; \ mpif77 -o module_ra_rrtmg_sw.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_rrtmg_sw.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_ra_rrtmg_sw.F WITHOUT OMP ; fi ; \ mpif77 -o module_ra_rrtmg_sw.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_rrtmg_sw.f90 ; \
fi
rm -f module_ra_cam_support.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_ra_cam_support.F > module_ra_cam_support.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_ra_cam_support.bb | /lib/cpp -C -P > module_ra_cam_support.f90
rm -f module_ra_cam_support.b module_ra_cam_support.bb
if fgrep -iq '!$OMP' module_ra_cam_support.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_ra_cam_support.F WITH OMP ; fi ; \
  mpif77 -o module_ra_cam_support.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_ra_cam_support.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING module_ra_cam_support.F WITHOUT OMP ; fi ; \
  mpif77 -o module_ra_cam_support.o -c [identical flags and -I paths to the branch above] module_ra_cam_support.f90 ; \
fi
[The same rm / cpp / standard.exe / OMP-check / mpif77 sequence then repeats verbatim, with only the module name changed, for each of: module_sf_bep, module_sf_bep_bem, module_wind_fitch, module_fr_sfire_phys, module_fr_sfire_atm, module_cam_physconst, module_cam_bl_eddy_diff, module_bl_gfs, module_gfs_funcphys, module_ra_cam, module_sf_gfs, module_sf_gfdl, module_sf_noahdrv.]
rm -f module_fr_sfire_core.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_fr_sfire_core.F > module_fr_sfire_core.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_fr_sfire_core.bb | /lib/cpp -C -P > module_fr_sfire_core.f90
rm -f module_fr_sfire_core.b module_fr_sfire_core.bb
if fgrep -iq '!$OMP' module_fr_sfire_core.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_fr_sfire_core.F WITH OMP ; fi ; \
  mpif77 -o module_fr_sfire_core.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_fr_sfire_core.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_fr_sfire_core.F WITHOUT OMP ; fi ; \ mpif77 -o module_fr_sfire_core.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_fr_sfire_core.f90 ; \ fi rm -f module_cam_gffgch.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_cam_gffgch.F > module_cam_gffgch.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_gffgch.bb | /lib/cpp -C -P > module_cam_gffgch.f90 rm -f module_cam_gffgch.b module_cam_gffgch.bb if fgrep -iq '!$OMP' module_cam_gffgch.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_gffgch.F WITH OMP ; fi ; \ mpif77 -o module_cam_gffgch.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_gffgch.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_gffgch.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_gffgch.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_gffgch.f90 ; \ fi rm -f module_cam_constituents.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_cam_constituents.F > module_cam_constituents.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_constituents.bb | /lib/cpp -C -P > module_cam_constituents.f90 rm -f module_cam_constituents.b module_cam_constituents.bb if fgrep -iq '!$OMP' module_cam_constituents.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_constituents.F WITH OMP ; fi ; \ mpif77 -o module_cam_constituents.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_constituents.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_constituents.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_constituents.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_constituents.f90 ; \ fi rm -f module_cu_tiedtke.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_cu_tiedtke.F > module_cu_tiedtke.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cu_tiedtke.bb | /lib/cpp -C -P > module_cu_tiedtke.f90 rm -f module_cu_tiedtke.b module_cu_tiedtke.bb if fgrep -iq '!$OMP' module_cu_tiedtke.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cu_tiedtke.F WITH OMP ; fi ; \ mpif77 -o module_cu_tiedtke.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cu_tiedtke.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cu_tiedtke.F WITHOUT OMP ; fi ; \ mpif77 -o module_cu_tiedtke.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cu_tiedtke.f90 ; \ fi rm -f module_cu_sas.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_cu_sas.F > module_cu_sas.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cu_sas.bb | /lib/cpp -C -P > module_cu_sas.f90 rm -f module_cu_sas.b module_cu_sas.bb if fgrep -iq '!$OMP' module_cu_sas.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cu_sas.F WITH OMP ; fi ; \ mpif77 -o module_cu_sas.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc 
-I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cu_sas.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cu_sas.F WITHOUT OMP ; fi ; \ mpif77 -o module_cu_sas.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cu_sas.f90 ; \ fi rm -f module_cu_osas.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_cu_osas.F > module_cu_osas.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cu_osas.bb | /lib/cpp -C -P > module_cu_osas.f90 rm -f module_cu_osas.b module_cu_osas.bb if fgrep -iq '!$OMP' module_cu_osas.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cu_osas.F WITH OMP ; fi ; \ mpif77 -o module_cu_osas.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cu_osas.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cu_osas.F WITHOUT OMP ; fi ; \ mpif77 -o module_cu_osas.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cu_osas.f90 ; \ fi rm -f module_radiation_driver.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_radiation_driver.F > module_radiation_driver.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_radiation_driver.bb | /lib/cpp -C -P > module_radiation_driver.f90 rm -f module_radiation_driver.b module_radiation_driver.bb if fgrep -iq '!$OMP' module_radiation_driver.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_radiation_driver.F WITH OMP ; fi ; \ mpif77 -o module_radiation_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_radiation_driver.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_radiation_driver.F WITHOUT OMP ; fi ; \ mpif77 -o module_radiation_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_radiation_driver.f90 ; \ fi rm -f module_surface_driver.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_surface_driver.F > module_surface_driver.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_surface_driver.bb | /lib/cpp -C -P > module_surface_driver.f90 rm -f module_surface_driver.b module_surface_driver.bb if fgrep -iq '!$OMP' module_surface_driver.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_surface_driver.F WITH OMP ; fi ; \ mpif77 -o module_surface_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_surface_driver.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_surface_driver.F WITHOUT OMP ; fi ; \ mpif77 -o module_surface_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_surface_driver.f90 ; \ fi rm -f module_fr_sfire_model.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_fr_sfire_model.F > module_fr_sfire_model.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_fr_sfire_model.bb | /lib/cpp -C -P > module_fr_sfire_model.f90 rm -f module_fr_sfire_model.b module_fr_sfire_model.bb if fgrep -iq '!$OMP' module_fr_sfire_model.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_fr_sfire_model.F WITH OMP ; fi ; \ mpif77 -o module_fr_sfire_model.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_fr_sfire_model.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_fr_sfire_model.F WITHOUT OMP ; fi ; \ mpif77 -o module_fr_sfire_model.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_fr_sfire_model.f90 ; \ fi rm -f module_cam_wv_saturation.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_cam_wv_saturation.F > module_cam_wv_saturation.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_wv_saturation.bb | /lib/cpp -C -P > module_cam_wv_saturation.f90 rm -f module_cam_wv_saturation.b module_cam_wv_saturation.bb if fgrep -iq '!$OMP' module_cam_wv_saturation.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cam_wv_saturation.F WITH OMP ; fi ; \ mpif77 -o module_cam_wv_saturation.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_wv_saturation.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cam_wv_saturation.F WITHOUT OMP ; fi ; \ mpif77 -o module_cam_wv_saturation.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_wv_saturation.f90 ; \
fi
rm -f module_cam_molec_diff.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_cam_molec_diff.F > module_cam_molec_diff.bb
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cam_molec_diff.bb | /lib/cpp -C -P > module_cam_molec_diff.f90
rm -f module_cam_molec_diff.b module_cam_molec_diff.bb
if fgrep -iq '!$OMP' module_cam_molec_diff.f90 ; then \
  if [ -n "" ] ; then echo COMPILING module_cam_molec_diff.F WITH OMP ; fi ; \
  mpif77 -o module_cam_molec_diff.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cam_molec_diff.f90 ; \
else \
  if [ -n "" ] ; then echo COMPILING module_cam_molec_diff.F WITHOUT OMP ; fi ; \
  mpif77 -o module_cam_molec_diff.o -c [same flags and -I include paths as above] module_cam_molec_diff.f90 ; \
fi

[The identical preprocess-and-compile sequence then repeats, unchanged except for the module name, for each of: module_mixactivate, module_fr_sfire_driver, module_cam_cldwat, module_cam_esinti, module_cu_camzm, module_microphysics_driver, module_fr_sfire_driver_wrf, module_bl_camuwpbl_driver, module_cu_camzm_driver and module_cu_camuwshcu (these two interleaved by the parallel make), module_pbl_driver, and module_cumulus_driver.]

rm -f module_cu_camuwshcu_driver.o
rm -f module_physics_init.o
/lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I.
-traditional module_cu_camuwshcu_driver.F > module_cu_camuwshcu_driver.bb /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_physics_init.F > module_physics_init.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_cu_camuwshcu_driver.bb | /lib/cpp -C -P > module_cu_camuwshcu_driver.f90 rm -f module_cu_camuwshcu_driver.b module_cu_camuwshcu_driver.bb if fgrep -iq '!$OMP' module_cu_camuwshcu_driver.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_cu_camuwshcu_driver.F WITH OMP ; fi ; \ mpif77 -o module_cu_camuwshcu_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cu_camuwshcu_driver.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_cu_camuwshcu_driver.F WITHOUT OMP ; fi ; \ mpif77 -o 
module_cu_camuwshcu_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_cu_camuwshcu_driver.f90 ; \ fi /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_physics_init.bb | /lib/cpp -C -P > module_physics_init.f90 rm -f module_physics_init.b module_physics_init.bb mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_physics_init.f90 rm -f module_shallowcu_driver.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_shallowcu_driver.F > module_shallowcu_driver.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_shallowcu_driver.bb | /lib/cpp -C -P > module_shallowcu_driver.f90 rm -f module_shallowcu_driver.b module_shallowcu_driver.bb if fgrep -iq '!$OMP' module_shallowcu_driver.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_shallowcu_driver.F WITH OMP ; fi ; \ mpif77 -o module_shallowcu_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_shallowcu_driver.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_shallowcu_driver.F WITHOUT OMP ; fi ; \ 
mpif77 -o module_shallowcu_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_shallowcu_driver.f90 ; \ fi make[3]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys' make[2]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys' make[1]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' if [ 0 -eq 1 ] ; then make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" chemics ; fi if [ 1 -eq 1 ] ; then make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" em_core ; fi make[1]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' -------------------------------------- if [ 0 -eq 0 ] ; then \ CF= ; \ else \ CF=../chem/module_aerosols_sorgam.o ../chem/module_gocart_aerosols.o ../chem/module_mosaic_driver.o ../chem/module_input_tracer.o ; \ fi ( cd dyn_em ; make -i -r -j 2 CF="" ) make[2]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/dyn_em' rm -f module_advect_em.o rm -f module_big_step_utilities_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_advect_em.F > module_advect_em.bb /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_big_step_utilities_em.F > module_big_step_utilities_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_big_step_utilities_em.bb | /lib/cpp -C -P > module_big_step_utilities_em.f90 /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_advect_em.bb | /lib/cpp -C -P > module_advect_em.f90 rm -f module_big_step_utilities_em.b module_big_step_utilities_em.bb if fgrep -iq '!$OMP' module_big_step_utilities_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_big_step_utilities_em.F WITH OMP ; fi ; \ mpif77 -o module_big_step_utilities_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include 
module_big_step_utilities_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_big_step_utilities_em.F WITHOUT OMP ; fi ; \ mpif77 -o module_big_step_utilities_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_big_step_utilities_em.f90 ; \ fi rm -f module_advect_em.b module_advect_em.bb if fgrep -iq '!$OMP' module_advect_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_advect_em.F WITH OMP ; fi ; \ mpif77 -o module_advect_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_advect_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_advect_em.F WITHOUT OMP ; fi ; \ mpif77 -o module_advect_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_advect_em.f90 ; \ fi rm -f module_small_step_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_small_step_em.F > module_small_step_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_small_step_em.bb | /lib/cpp -C -P > module_small_step_em.f90 rm -f module_small_step_em.b module_small_step_em.bb if fgrep -iq '!$OMP' module_small_step_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_small_step_em.F WITH OMP ; fi ; \ mpif77 -o module_small_step_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_small_step_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_small_step_em.F WITHOUT OMP ; fi ; \ mpif77 -o module_small_step_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_small_step_em.f90 ; \ fi rm -f module_damping_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_damping_em.F > module_damping_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_damping_em.bb | /lib/cpp -C -P > module_damping_em.f90 rm -f module_damping_em.b module_damping_em.bb if fgrep -iq '!$OMP' module_damping_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_damping_em.F WITH OMP ; fi ; \ mpif77 -o module_damping_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_damping_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_damping_em.F WITHOUT OMP ; fi ; \ mpif77 -o module_damping_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_damping_em.f90 ; \ fi rm -f module_solvedebug_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_solvedebug_em.F > module_solvedebug_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_solvedebug_em.bb | /lib/cpp -C -P > module_solvedebug_em.f90 rm -f module_solvedebug_em.b module_solvedebug_em.bb if fgrep -iq '!$OMP' module_solvedebug_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_solvedebug_em.F WITH OMP ; fi ; \ mpif77 -o module_solvedebug_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_solvedebug_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_solvedebug_em.F WITHOUT OMP ; fi ; \ mpif77 -o module_solvedebug_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_solvedebug_em.f90 ; \ fi rm -f module_bc_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_bc_em.F > module_bc_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_bc_em.bb | /lib/cpp -C -P > module_bc_em.f90 rm -f module_bc_em.b module_bc_em.bb if fgrep -iq '!$OMP' module_bc_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_bc_em.F WITH OMP ; fi ; \ mpif77 -o module_bc_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc 
-I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bc_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_bc_em.F WITHOUT OMP ; fi ; \ mpif77 -o module_bc_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_bc_em.f90 ; \ fi rm -f module_init_utilities.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_init_utilities.F > module_init_utilities.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_init_utilities.bb | /lib/cpp -C -P > module_init_utilities.f90 rm -f module_init_utilities.b module_init_utilities.bb if fgrep -iq '!$OMP' module_init_utilities.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_init_utilities.F WITH OMP ; fi ; \ mpif77 -o module_init_utilities.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_init_utilities.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_init_utilities.F WITHOUT OMP ; fi ; \ mpif77 -o module_init_utilities.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_init_utilities.f90 ; \ fi rm -f module_polarfft.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_polarfft.F > module_polarfft.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_polarfft.bb | /lib/cpp -C -P > module_polarfft.f90 rm -f module_polarfft.b module_polarfft.bb if fgrep -iq '!$OMP' module_polarfft.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_polarfft.F WITH OMP ; fi ; \ mpif77 -o module_polarfft.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_polarfft.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_polarfft.F WITHOUT OMP ; fi ; \ mpif77 -o module_polarfft.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_polarfft.f90 ; \ fi rm -f module_force_scm.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_force_scm.F > module_force_scm.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_force_scm.bb | /lib/cpp -C -P > module_force_scm.f90 rm -f module_force_scm.b module_force_scm.bb if fgrep -iq '!$OMP' module_force_scm.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_force_scm.F WITH OMP ; fi ; \ mpif77 -o module_force_scm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_force_scm.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_force_scm.F WITHOUT OMP ; fi ; \ mpif77 -o module_force_scm.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_force_scm.f90 ; \ fi rm -f module_convtrans_prep.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_convtrans_prep.F > module_convtrans_prep.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_convtrans_prep.bb | /lib/cpp -C -P > module_convtrans_prep.f90 rm -f module_convtrans_prep.b module_convtrans_prep.bb if fgrep -iq '!$OMP' module_convtrans_prep.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_convtrans_prep.F WITH OMP ; fi ; \ mpif77 -o module_convtrans_prep.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_convtrans_prep.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_convtrans_prep.F WITHOUT OMP ; fi ; \ mpif77 -o module_convtrans_prep.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_convtrans_prep.f90 ; \ fi rm -f module_stoch.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_stoch.F > module_stoch.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_stoch.bb | /lib/cpp -C -P > module_stoch.f90 rm -f module_stoch.b module_stoch.bb if fgrep -iq '!$OMP' module_stoch.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_stoch.F WITH OMP ; fi ; \ mpif77 -o module_stoch.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_stoch.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_stoch.F WITHOUT OMP ; fi ; \ mpif77 -o module_stoch.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_stoch.f90 ; \ fi rm -f module_sfs_nba.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sfs_nba.F > module_sfs_nba.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sfs_nba.bb | /lib/cpp -C -P > module_sfs_nba.f90 rm -f module_sfs_nba.b module_sfs_nba.bb if fgrep -iq '!$OMP' module_sfs_nba.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sfs_nba.F WITH OMP ; fi ; \ mpif77 -o module_sfs_nba.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sfs_nba.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sfs_nba.F WITHOUT OMP ; fi ; \ mpif77 -o module_sfs_nba.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sfs_nba.f90 ; \ fi rm -f module_avgflx_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_avgflx_em.F > module_avgflx_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_avgflx_em.bb | /lib/cpp -C -P > module_avgflx_em.f90 rm -f module_avgflx_em.b module_avgflx_em.bb if fgrep -iq '!$OMP' module_avgflx_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_avgflx_em.F WITH OMP ; fi ; \ mpif77 -o module_avgflx_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_avgflx_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_avgflx_em.F WITHOUT OMP ; fi ; \ mpif77 -o module_avgflx_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_avgflx_em.f90 ; \ fi rm -f init_modules_em.o /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe init_modules_em.F > init_modules_em.b /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional init_modules_em.b > init_modules_em.f90 rm -f init_modules_em.b if fgrep -iq '!$OMP' init_modules_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING init_modules_em.F WITH OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include 
init_modules_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING init_modules_em.F WITHOUT OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include init_modules_em.f90 ; \ fi rm -f start_em.o /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe start_em.F > start_em.b /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional start_em.b > start_em.f90 rm -f start_em.b if fgrep -iq '!$OMP' start_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING start_em.F WITH OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include start_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING start_em.F WITHOUT OMP ; fi ; \ mpif77 -c -O0 -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include start_em.f90 ; \ fi rm -f 
shift_domain_em.o rm -f couple_or_uncouple_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional shift_domain_em.F > shift_domain_em.bb /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional couple_or_uncouple_em.F > couple_or_uncouple_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe shift_domain_em.bb | /lib/cpp -C -P > shift_domain_em.f90 rm -f shift_domain_em.b shift_domain_em.bb if fgrep -iq '!$OMP' shift_domain_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING shift_domain_em.F WITH OMP ; fi ; \ mpif77 -o shift_domain_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include shift_domain_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING shift_domain_em.F WITHOUT OMP ; fi ; \ mpif77 -o shift_domain_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include shift_domain_em.f90 ; \ fi /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe couple_or_uncouple_em.bb | /lib/cpp -C -P > couple_or_uncouple_em.f90 rm -f couple_or_uncouple_em.b couple_or_uncouple_em.bb if fgrep -iq '!$OMP' couple_or_uncouple_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING couple_or_uncouple_em.F WITH OMP ; fi ; \ mpif77 -o couple_or_uncouple_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include couple_or_uncouple_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING couple_or_uncouple_em.F WITHOUT OMP ; fi ; \ mpif77 -o couple_or_uncouple_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include couple_or_uncouple_em.f90 ; \ fi rm -f nest_init_utils.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES 
-DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional nest_init_utils.F > nest_init_utils.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe nest_init_utils.bb | /lib/cpp -C -P > nest_init_utils.f90 rm -f nest_init_utils.b nest_init_utils.bb if fgrep -iq '!$OMP' nest_init_utils.f90 ; then \ if [ -n "" ] ; then echo COMPILING nest_init_utils.F WITH OMP ; fi ; \ mpif77 -o nest_init_utils.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include nest_init_utils.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING nest_init_utils.F WITHOUT OMP ; fi ; \ mpif77 -o nest_init_utils.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include nest_init_utils.f90 ; \ fi rm -f adapt_timestep_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional adapt_timestep_em.F > adapt_timestep_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe adapt_timestep_em.bb | /lib/cpp -C -P > adapt_timestep_em.f90 rm -f adapt_timestep_em.b adapt_timestep_em.bb if fgrep -iq '!$OMP' adapt_timestep_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING adapt_timestep_em.F WITH OMP ; fi ; \ mpif77 -o adapt_timestep_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include adapt_timestep_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING adapt_timestep_em.F WITHOUT OMP ; fi ; \ mpif77 -o adapt_timestep_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include adapt_timestep_em.f90 ; \ fi rm -f interp_domain_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional interp_domain_em.F > interp_domain_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe interp_domain_em.bb | /lib/cpp -C -P > interp_domain_em.f90 rm -f interp_domain_em.b interp_domain_em.bb if fgrep -iq '!$OMP' interp_domain_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING interp_domain_em.F WITH OMP ; fi ; \ mpif77 -o interp_domain_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include interp_domain_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING interp_domain_em.F WITHOUT OMP ; fi ; \ mpif77 -o interp_domain_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include interp_domain_em.f90 ; \ fi rm -f module_diffusion_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_diffusion_em.F > module_diffusion_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_diffusion_em.bb | /lib/cpp -C -P > module_diffusion_em.f90 rm -f module_diffusion_em.b module_diffusion_em.bb if fgrep -iq '!$OMP' module_diffusion_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_diffusion_em.F WITH OMP ; fi ; \ mpif77 -o module_diffusion_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_diffusion_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_diffusion_em.F WITHOUT OMP ; fi ; \ mpif77 -o module_diffusion_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_diffusion_em.f90 ; \ fi rm -f module_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_em.F > module_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_em.bb | /lib/cpp -C -P > module_em.f90 rm -f module_em.b module_em.bb if fgrep -iq '!$OMP' module_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_em.F WITH OMP ; fi ; \ mpif77 -o module_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_em.F WITHOUT OMP ; fi ; \ mpif77 -o module_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_em.f90 ; \ fi rm -f module_sfs_driver.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_sfs_driver.F > module_sfs_driver.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_sfs_driver.bb | /lib/cpp -C -P > module_sfs_driver.f90 rm -f module_sfs_driver.b module_sfs_driver.bb if fgrep -iq '!$OMP' module_sfs_driver.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_sfs_driver.F WITH OMP ; fi ; \ mpif77 -o module_sfs_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include 
module_sfs_driver.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_sfs_driver.F WITHOUT OMP ; fi ; \ mpif77 -o module_sfs_driver.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_sfs_driver.f90 ; \ fi rm -f module_first_rk_step_part1.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional module_first_rk_step_part1.F > module_first_rk_step_part1.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_first_rk_step_part1.bb | /lib/cpp -C -P > module_first_rk_step_part1.f90 rm -f module_first_rk_step_part1.b module_first_rk_step_part1.bb if fgrep -iq '!$OMP' module_first_rk_step_part1.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_first_rk_step_part1.F WITH OMP ; fi ; \ mpif77 -o module_first_rk_step_part1.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_first_rk_step_part1.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_first_rk_step_part1.F WITHOUT OMP ; fi ; \ mpif77 -o module_first_rk_step_part1.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_first_rk_step_part1.f90 ; \ fi rm -f module_first_rk_step_part2.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional module_first_rk_step_part2.F > module_first_rk_step_part2.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe module_first_rk_step_part2.bb | /lib/cpp -C -P > module_first_rk_step_part2.f90 rm -f module_first_rk_step_part2.b module_first_rk_step_part2.bb if fgrep -iq '!$OMP' module_first_rk_step_part2.f90 ; then \ if [ -n "" ] ; then echo COMPILING module_first_rk_step_part2.F WITH OMP ; fi ; \ mpif77 -o module_first_rk_step_part2.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_first_rk_step_part2.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING module_first_rk_step_part2.F WITHOUT OMP ; fi ; \ mpif77 -o module_first_rk_step_part2.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include module_first_rk_step_part2.f90 ; \ fi rm -f solve_em.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. 
-traditional solve_em.F > solve_em.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe solve_em.bb | /lib/cpp -C -P > solve_em.f90 rm -f solve_em.b solve_em.bb if fgrep -iq '!$OMP' solve_em.f90 ; then \ if [ -n "" ] ; then echo COMPILING solve_em.F WITH OMP ; fi ; \ mpif77 -o solve_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include solve_em.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING solve_em.F WITHOUT OMP ; fi ; \ mpif77 -o solve_em.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include solve_em.f90 ; \ fi ar ru ../main/libwrflib.a module_advect_em.o module_diffusion_em.o module_small_step_em.o module_big_step_utilities_em.o module_em.o module_solvedebug_em.o module_bc_em.o module_init_utilities.o module_damping_em.o module_polarfft.o module_force_scm.o module_first_rk_step_part1.o module_first_rk_step_part2.o module_avgflx_em.o module_sfs_nba.o module_convtrans_prep.o module_sfs_driver.o module_stoch.o init_modules_em.o solve_em.o start_em.o shift_domain_em.o couple_or_uncouple_em.o nest_init_utils.o adapt_timestep_em.o interp_domain_em.o make[2]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/dyn_em' make[1]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3' if [ 0 -eq 1 ] ; then make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" nmm_core ; fi if [ 0 -eq 1 ] ; then make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" exp_core ; fi ( cd main ; make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" SOLVER=em em_wrf ) make[1]: Entering directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main' rm -f ../main/module_wrf_top.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 
-DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional ../main/module_wrf_top.F > ../main/module_wrf_top.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe ../main/module_wrf_top.bb | /lib/cpp -C -P > ../main/module_wrf_top.f90 rm -f ../main/module_wrf_top.b ../main/module_wrf_top.bb if fgrep -iq '!$OMP' ../main/module_wrf_top.f90 ; then \ if [ -n "" ] ; then echo COMPILING ../main/module_wrf_top.F WITH OMP ; fi ; \ mpif77 -o ../main/module_wrf_top.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include ../main/module_wrf_top.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING ../main/module_wrf_top.F WITHOUT OMP ; fi ; \ mpif77 -o ../main/module_wrf_top.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int 
-I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include ../main/module_wrf_top.f90 ; \ fi rm -f wrf.o /lib/cpp -C -P -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -DEM_CORE=1 -DNMM_CORE=0 -DNMM_MAX_DIM=2600 -DCOAMPS_CORE=0 -DDA_CORE=0 -DEXP_CORE=0 -DIWORDSIZE=4 -DDWORDSIZE=8 -DRWORDSIZE=4 -DLWORDSIZE=4 -DNONSTANDARD_SYSTEM_SUBR -DUSE_MPI_IN_PLACE -DDM_PARALLEL -DNETCDF -DUSE_ALLOCATABLES -DGRIB1 -DINTIO -DLIMIT_ARGS -DCONFIG_BUF_LEN=32768 -DMAX_DOMAINS_F=21 -DMAX_HISTORY=25 -DNMM_NEST=0 -I. -traditional wrf.F > wrf.bb /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/tools/standard.exe wrf.bb | /lib/cpp -C -P > wrf.f90 rm -f wrf.b wrf.bb if fgrep -iq '!$OMP' wrf.f90 ; then \ if [ -n "" ] ; then echo COMPILING wrf.F WITH OMP ; fi ; \ mpif77 -o wrf.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include 
-I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf.f90 ; \ else \ if [ -n "" ] ; then echo COMPILING wrf.F WITHOUT OMP ; fi ; \ mpif77 -o wrf.o -c -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 -I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include wrf.f90 ; \ fi ranlib libwrflib.a mpif77 -o wrf.exe -O3 -ftree-vectorize -ftree-loop-linear -funroll-loops -w -ffree-form -ffree-line-length-none -fconvert=big-endian -frecord-marker=4 wrf.o ../main/module_wrf_top.o libwrflib.a /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/fftpack/fftpack5/libfftpack.a /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib1/libio_grib1.a /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_grib_share/libio_grib_share.a /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int/libwrfio_int.a -L/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -lesmf_time /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/RSL_LITE/librsl_lite.a /home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame/module_internal_header_util.o 
/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame/pack_utils.o -L/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -lwrfio_nf -L/home/shankha/work/packages/netcdf/install/lib -lnetcdff -lnetcdf -Wl,-R/home/shankha/work/packages/netcdf/install/lib -lmpi_f77 -L/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/lib -Wl,-R/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/lib make[1]: Leaving directory `/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main' ( cd run ; /bin/rm -f wrf.exe ; ln -s ../main/wrf.exe . ) if [ 0 -eq 1 ] ; then \ ( cd main ; make -i -r MODULE_DIRS="-I../dyn_em -I../dyn_nmm -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/esmf_time_f90 -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/main -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_netcdf -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/external/io_int -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/frame -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/share -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/phys -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/chem -I/home/shankha/work/I_O_Traces/trace_apps/wrf/model/WRFV3/inc -I/home/shankha/work/packages/netcdf/install/include -I/home/shankha/work/OpenMPI/gcc-4.4/gfortran/install/include" SOLVER=em em_wrf_SST_ESMF ) ; \ fi build started: Sun Oct 16 09:59:00 EDT 2011 build completed: Sun Oct 16 10:12:36 EDT 2011 -------------- next part -------------- A non-text attachment was scrubbed... 
Name: configure.wrf Type: application/octet-stream Size: 20419 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111016/37a8191e/attachment-0001.obj From shankhabanerjee at gmail.com Tue Oct 18 03:43:22 2011 From: shankhabanerjee at gmail.com (shankha) Date: Tue, 18 Oct 2011 05:43:22 -0400 Subject: [Wrf-users] wrf fails to run : build with OpenMPI In-Reply-To: References: Message-ID: Hi, Please ignore my message below. After reading the on-line documentation I figured out a way to run wrf. I am sorry for not going through the manual carefully. On Sun, Oct 16, 2011 at 10:41 AM, shankha wrote: > Hi, > Machine Information : Linux bertram 2.6.32-33-generic #72-Ubuntu SMP Fri > Jul 29 21:07:13 UTC 2011 x86_64 GNU/Linux > Compiler : gcc 4.4.3 > MPI : OpenMPI 1.4.3 > > I am unable to run wrf.exe with mpirun. I am able to run wrf.exe > standalone. > > bertram : ~ ] mpirun -np 2 ./wrf.exe > -------------------------------------------------------------------------- > mpirun has exited due to process rank 0 with PID 7061 on > node bertram exiting without calling "finalize". This may > have caused other processes in the application to be > terminated by signals sent by mpirun (as reported here). > -------------------------------------------------------------------------- > > I have checked my MPI installation and it is good. ldd didn't report any > missing libraries or errors. > > Thanks for your help. > -- > Thanks > Shankha > > -- Thanks Shankha Banerjee -------------- next part -------------- An HTML attachment was scrubbed... From claudiomet at gmail.com Fri Oct 21 09:38:27 2011 From: claudiomet at gmail.com (claudiomet) Date: Fri, 21 Oct 2011 12:38:27 -0300 Subject: [Wrf-users] wrf 3.3.1 and pgi 11.7 Message-ID: Greetings users !
I'm trying to compile (smpar) WRFV3.3.1 with PGI 11.7 on an Intel x64 machine. NetCDF and WRF compilation OK, no problems, but WPS fails. The compilation log says in the first lines: **** Compiling WPS and all utilities **** make[1]: Entering directory `/home/user1/WPS/geogrid/src' /bin/rm -f cio.o mpicc -cc=pgcc -D_UNDERSCORE -DBYTESWAP -DLINUX -DIO_NETCDF -DIO_BINARY -DIO_GRIB1 -D_MPI -DBIT32 -D_GEOGRID -O -c cio.c pgcc-Error-Unknown switch: -Wall make[1]: [cio.o] Error 1 (ignored) What's the problem? Thanks!! -- Claudio Cortes (Skype: claudiomet78) claudiomet.blogspot.com twitter.com/claudiomet cl.linkedin.com/in/claudiomet Meteorólogo Jefe del Laboratorio de Innovación e Informática Ambiental (LIIA) Unidad de Modelacion y Gestion de la Calidad del Aire (UMGCA) Centro Nacional del Medio Ambiente (CENMA) www.cenma.cl -- Claudio Cortes (Skype: claudiomet78) claudiomet.blogspot.com twitter.com/claudiomet cl.linkedin.com/in/claudiomet Meteorologist, Chief of the Laboratory of Innovation and Environmental Informatics (LIIA), Modeling and Air Quality Management Unit (UMGCA), National Environment Center, Chile (CENMA) www.cenma.cl From ebeigi3 at tigers.lsu.edu Tue Oct 18 19:31:47 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Tue, 18 Oct 2011 21:31:47 -0400 Subject: [Wrf-users] Error Running metgrid.exe Message-ID: Dear Sir/Madam, Thanks for your previous help. I ran metgrid using CCSM data converted to intermediate format, and I got this error message. Could you please help me with that? Oops, something is not right with the Gaussian latitude computation. The input data gave the starting latitude as 23.113. This routine computed the starting latitude as +- 84.376. The difference is larger than 0.01 degrees, which is not expected. ERROR: Gaussian_latitude_computation application called MPI_Abort(MPI_COMM_WORLD, 137254296) - process 0 thanks in advance Best Regards, -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F.
Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111018/848d4229/attachment.html From michael.bane at manchester.ac.uk Fri Oct 21 06:25:42 2011 From: michael.bane at manchester.ac.uk (michael bane) Date: Fri, 21 Oct 2011 13:25:42 +0100 Subject: [Wrf-users] TAU profiler for WRF and WRF-Chem Message-ID: Hello everybody! I've just tried to compile WRF-Chem with the TAU profiler. Whilst many routines compile okay, many others fail, seemingly because PDT (a parser used internally by TAU) cannot fully parse the source code. Before I delve deeply into the errant routines I was wondering if anybody else had already solved this issue? I'm building using Intel compilers (v11.0) for OpenMPI (1.4.1), under Scientific Linux 5.2. Many thanks, M -- Dr. Michael Bane Snr Research Apps & Collab Consultant IT Services for Research The University of Manchester UK http://www.manchester.ac.uk/rac @mkbane_mcr From moudipascal at yahoo.fr Fri Oct 21 03:46:29 2011 From: moudipascal at yahoo.fr (moudi pascal) Date: Fri, 21 Oct 2011 10:46:29 +0100 (BST) Subject: [Wrf-users] Segmentation fault Message-ID: <1319190389.22851.YahooMailClassic@web29005.mail.ird.yahoo.com> Dear all, I tried to compile the attached scripts, but I got a segmentation fault. I don't understand why; would someone help me fix the problem? Regards Pascal MOUDI IGRI Ph-D Student Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) Department of Physics Faculty of Science University of Yaounde I, Cameroon National Advanced Training School for Technical Education, Electricity Engineering, Douala Tel:+237 75 32 58 52 -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111021/d07ed39d/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: pascal1.ncl Type: application/octet-stream Size: 7991 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111021/d07ed39d/attachment.obj From remi.montroty at mfi.fr Fri Oct 21 10:58:59 2011 From: remi.montroty at mfi.fr (Remi Montroty) Date: Fri, 21 Oct 2011 18:58:59 +0200 Subject: [Wrf-users] Nesting, Intermediate domain & JMAX value In-Reply-To: References: <1315907443.47695.YahooMailNeo@web29015.mail.ird.yahoo.com> Message-ID: <4EA1A4D3.2030908@mfi.fr> Dear all, I have had quite a few surprises trying to nest in version 3.3.0, core nmm. I am using a main grid of nx=238, ny=302, and an internal grid of nx=355 & ny=697. Those last two values seemed to match the rule that each value must equal 1 + 3n (with the fixed parent_ratio of 3 in the nmm core). Now when I get to the model run itself (wrf, after real_nmm.exe) I get the following error (cf. First Log): "NESTED DOMAIN: JMAX IS EVEN, INCREASE e_sn IN THE namelist.input BY 1", which seems to match dyn_nmm/NMM_NEST_UTILS1.F line 46. If I do increase JMAX by 1 (i.e. 698), wrf.exe crashes when trying to run find_ijstart_level ... (cf. Second Log) Any clue? And what is this intermediate domain in First Log? Does anyone have an example of a set of functioning namelists for a nested run in WRF_v3.3.0? Thanks !
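As an aside, the 1 + 3n rule cited above is easy to sanity-check. The small sketch below is a hypothetical helper, not WRF code: it tests only the divisibility rule for a parent ratio of 3, and it does not reproduce WRF's internal nest setup, so it cannot explain the JMAX-IS-EVEN message by itself.

```shell
# Check the NMM nest-dimension rule: (dim - 1) must be divisible by the
# fixed parent ratio of 3. "check" is a hypothetical helper name.
check() {
    if [ $((($1 - 1) % 3)) -eq 0 ]; then
        echo "$1 ok"
    else
        echo "$1 violates 1+3n"
    fi
}
check 355   # prints: 355 ok          (354 = 3 * 118)
check 697   # prints: 697 ok          (696 = 3 * 232)
check 698   # prints: 698 violates 1+3n  (697 is not divisible by 3)
```

So 355 x 697 satisfies the rule as stated, which suggests the even-JMAX complaint comes from a different internal check than plain divisibility.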
Remi *First Log :* WRF V3.3 MODEL ************************************* Parent domain ids,ide,jds,jde 1 238 1 302 ims,ime,jms,jme -4 66 234 307 ips,ipe,jps,jpe 1 59 241 302 ************************************* DYNAMICS OPTION: nmm dyncore alloc_space_field: domain 1 , 121094988 bytes allocated med_initialdata_input: calling input_input FORECAST BEGINS 0 GMT 0/ 0/ 0 zeroing grid%cwm appear to have grid%q2 values...do not zero INIT: INITIALIZED ARRAYS FOR CLEAN START restrt= F nest= F grid%pdtop= 35592.090 grid%pt= 5000.0000 INPUT LandUse = "USGS" Climatological albedo is used instead of table values SUN-EARTH DISTANCE CALCULATION FINISHED IN SOLARD YEAR= 2011 MONTH= 10 DAY= 21 HOUR= 0 R1= 0.9957 INITIALIZE THREE Noah LSM RELATED TABLES Initializng moist(:,:,:, Qv) from q summing moist(:,:,:,i_m) into cwm array summing moist(:,:,:,i_m) into cwm array summing moist(:,:,:,i_m) into cwm array computing grid%f_ice computing f_rain ************************************* Nesting domain ids,ide,jds,jde 1 355 1 697 ims,ime,jms,jme -4 99 547 702 ips,ipe,jps,jpe 1 89 557 697 INTERMEDIATE domain ids,ide,jds,jde 68 191 38 275 ims,ime,jms,jme 63 109 216 280 ips,ipe,jps,jpe 66 99 226 277 ************************************* d01 2011-10-21_00:00:00 alloc_space_field: domain 2 , 372043584 bytes allocated -------------- FATAL CALLED --------------- FATAL CALLED FROM FILE: LINE: 54 NESTED DOMAIN: JMAX IS EVEN, INCREASE e_sn IN THE namelist.input BY 1 ------------------------------------------- *Second Log :* wrf.exe:28373 terminated with signal 11 at PC=8a8fc9 SP=7fffe78b7300. 
Backtrace: wrf.exe(find_ijstart_level_+0x109)[0x8a8fc9] wrf.exe(nest_terrain_+0x7fb)[0x890fbb] wrf.exe(med_nest_initial_+0x772)[0x787c72] wrf.exe(__module_integrate_MOD_integrate+0x196)[0x4abf1e] wrf.exe(__module_wrf_top_MOD_wrf_run+0x24)[0x480a84] wrf.exe(MAIN__+0x3c)[0x4802dc] wrf.exe(main+0x2a)[0x16d2bba] /lib64/libc.so.6(__libc_start_main+0xf4)[0x3f9201d994] wrf.exe[0x4801d9] -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111021/2138e937/attachment-0001.html From William.Neff at noaa.gov Fri Oct 21 13:01:38 2011 From: William.Neff at noaa.gov (William Neff) Date: Fri, 21 Oct 2011 13:01:38 -0600 Subject: [Wrf-users] First Notice ISARS2012, 16th International Symposium for the Advancement of Boundary Layer Remote Sensing, Boulder Colorado, June 5-8, 2012 Message-ID: <4EA1C192.8060001@noaa.gov> Please see attached flyer for the 16th International Symposium for the Advancement of Boundary Layer Remote Sensing -- Dr. William Neff Director, Physical Sciences Division NOAA/OAR,Earth System Research Laboratory ATTN: R/PSD 325 Broadway Boulder Colorado 80305 Alternate email:william.neff.325 at gmail.com website: http://www.esrl.noaa.gov/psd/ Ph: 303-497-6265 Fx: 303-497-6020 -------------- next part -------------- A non-text attachment was scrubbed... Name: ISARS2012_1st Circular.pdf Type: video/x-flv Size: 338191 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111021/f24a986f/attachment-0001.flv From haley at ucar.edu Fri Oct 21 15:19:32 2011 From: haley at ucar.edu (Mary Haley) Date: Fri, 21 Oct 2011 15:19:32 -0600 Subject: [Wrf-users] [ncl-install] Segmentation fault In-Reply-To: <1319190389.22851.YahooMailClassic@web29005.mail.ird.yahoo.com> References: <1319190389.22851.YahooMailClassic@web29005.mail.ird.yahoo.com> Message-ID: <39287344-64E1-44D3-A3FB-974422D6D751@ucar.edu> Pascal, As Dennis stated, this script is too complicated. 
We can't run it ourselves, so we have no way to tell where it is getting the segmentation fault. One thing you can do to see if you can figure out where the code is failing is to run ncl with the "-x" option. This will echo every line as it is executed. For this to work properly, though, you'll need to comment out the "begin" and "end" statements. Otherwise, NCL will echo all the lines in your script, and then execute the program. --Mary On Oct 21, 2011, at 3:46 AM, moudi pascal wrote: > Dear all, > I tried to compile the attached scripts, but I got a segmentation fault. > I don't understand why; would someone help me fix the problem? > Regards > > Pascal MOUDI IGRI > > Ph-D Student > Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP) > Department of Physics > Faculty of Science > University of Yaounde I, Cameroon > National Advanced Training School for Technical Education, > Electricity Engineering, Douala > > Tel:+237 75 32 58 52 > _______________________________________________ > ncl-install mailing list > List instructions, subscriber options, unsubscribe: > http://mailman.ucar.edu/mailman/listinfo/ncl-install -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111021/76c63ee7/attachment.html From weilla at latmos.ipsl.fr Fri Oct 21 14:13:16 2011 From: weilla at latmos.ipsl.fr (weilla at latmos.ipsl.fr) Date: Fri, 21 Oct 2011 22:13:16 +0200 Subject: [Wrf-users] First Notice ISARS2012, 16th International Symposium for the Advancement of Boundary Layer Remote Sensing, Boulder Colorado, June 5-8, 2012 In-Reply-To: <4EA1C192.8060001@noaa.gov> References: <4EA1C192.8060001@noaa.gov> Message-ID: <4944013cb30d27339f52b42022dd1c2d.squirrel@webmail.latmos.ipsl.fr> Wonderful, Bill! Have a nice day or evening from the Spitsbergen city of Ny-Ålesund. Alain > Please see attached flyer for the 16th International Symposium for the > Advancement of Boundary Layer Remote Sensing > > -- > Dr. William Neff > Director, Physical Sciences Division > NOAA/OAR, Earth System Research Laboratory > ATTN: R/PSD > 325 Broadway > Boulder Colorado 80305 > Alternate email: william.neff.325 at gmail.com > website: http://www.esrl.noaa.gov/psd/ > Ph: 303-497-6265 > Fx: 303-497-6020 > > From terrib at ucar.edu Tue Oct 25 17:09:04 2011 From: terrib at ucar.edu (Terri Betancourt) Date: Tue, 25 Oct 2011 17:09:04 -0600 Subject: [Wrf-users] Thomas T. Warner Memorial Symposium Message-ID: <8E2777F4-0BCC-4650-91D6-2AF9A71D80D5@ucar.edu> Family, friends and colleagues are invited to the National Center for Atmospheric Research on December 2, 2012 for the Thomas T. Warner Memorial Symposium. The purpose of the Symposium is to pay tribute to the scientific career and to celebrate the rich life of Dr. Warner. The symposium will consist of invited oral presentations focused on topics related to the areas of atmospheric modeling that represent Dr. Warner's career, including desert meteorology, numerical weather and climate prediction, quality assurance in the atmospheric modeling process, and Tom's extraordinary role as a mentor in the lives of his colleagues.
Details on the Symposium are available at the following website: http://ral.ucar.edu/general/events/warner_symposium/ Contributions may be made to the Warner Visitor Scholarship to fund student visitor and outreach collaborations between PSU, CU, and NCAR. Details on the scholarship fund will be posted at the above web site and available at the Symposium. Hosted by NCAR's Research Applications Laboratory, Penn State Department of Meteorology and University of Colorado Atmospheric and Oceanic Sciences. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111025/4af7705c/attachment.html From caklich at gmail.com Sun Oct 30 13:30:56 2011 From: caklich at gmail.com (Chris Klich) Date: Sun, 30 Oct 2011 15:30:56 -0400 Subject: [Wrf-users] Help with Obs Nudging Crash Message-ID: Hi all, I have recently been trying to use obs-nudging with my WRF run. I am currently running WRF-Chem version 3.2.1 due to a long-term project and am unable to upgrade to 3.3. I currently have both upper air and surface data in little-r format (in separate files, however). I convert these files into the proper naming format for OBSGRID, run obsgrid and move the files to the /run directory, concatenate all the OBS_DOMAIN files into OBS_DOMAIN101, and then run real.exe and wrf.exe. However, when OBSGRID runs, the output files are every 3 hours, while my met_em files and observation files are every 6. These 3-hour intermediate times (3, 9, 15, 21) are empty when the OBS_DOMAIN and plotobs and qc files are written, while the normal intervals of 0, 6, 12, 18 are several megabytes. Even in this case, I had tried concatenating all files to OBS_DOMAIN101 and running WRF. However, it seems the entire run finishes only when I use just upper air data specifically. When I run using just surface obs or concatenating surface to upper air, the run crashes, and it seems at a very random time.
When using just surface it crashed after about 12:49 into the run, while running with the concatenated upper air/surface, it crashed after about 3 days, 12:49. I can't seem to figure out what is causing this. The end of my rsl.error.0000 file looks like: Timing for main (dt=135.00): time 2008-05-20_12:45:13 on domain 1: 1.08750 elapsed seconds. Timing for main (dt=135.00): time 2008-05-20_12:47:28 on domain 1: 1.05240 elapsed seconds. OBS NUDGING: Reading new obs for time window TBACK = 12.125 TFORWD = 13.458 for grid = 1 OBS NUDGING: 1 previously read obs are now too old for the current window and have been removed. ****** CALL IN4DOB AT KTAU = 326 AND XTIME = 767.48: NSTA = 43 ****** ++++++CALL ERROB AT KTAU = 326 AND INEST = 1: NSTA = 43 ++++++ OBS NUDGING FOR IN,J,KTAU,XTIME,IVAR,IPL: 1 10 326 767.48 3 3 rindx= 5.3 OBS NUDGING FOR IN,J,KTAU,XTIME,IVAR,IPL: 1 10 326 767.48 4 4 rindx= 5.3 OBS NUDGING FOR IN,J,KTAU,XTIME,IVAR,IPL: 1 10 326 767.48 1 1 rindx= 5.3 OBS NUDGING FOR IN,J,KTAU,XTIME,IVAR,IPL: 1 10 326 767.48 2 2 rindx= 5.3 Timing for main (dt=135.00): time 2008-05-20_12:49:43 on domain 1: 1.07960 elapsed seconds.
forrtl: error (78): process killed (SIGTERM) Image PC Routine Line Source libc.so.6 00000033C34DD1E3 Unknown Unknown Unknown libopen-pal.so.0 00002B47160653DD Unknown Unknown Unknown libopen-pal.so.0 00002B471606213D Unknown Unknown Unknown libopen-pal.so.0 00002B471605555C Unknown Unknown Unknown libmpi.so.0 00002B4715B43AA8 Unknown Unknown Unknown libmpi.so.0 00002B4715B721AC Unknown Unknown Unknown wrf.exe 0000000001C64282 Unknown Unknown Unknown wrf.exe 0000000000F373F4 Unknown Unknown Unknown wrf.exe 00000000014F73B5 Unknown Unknown Unknown wrf.exe 0000000000E8EC4F Unknown Unknown Unknown wrf.exe 0000000000D8DB6D Unknown Unknown Unknown wrf.exe 000000000052ECDF Unknown Unknown Unknown wrf.exe 00000000004BC733 Unknown Unknown Unknown wrf.exe 00000000004BC6E7 Unknown Unknown Unknown wrf.exe 00000000004BC67C Unknown Unknown Unknown libc.so.6 00000033C341EC9D Unknown Unknown Unknown wrf.exe 00000000004BC579 Unknown Unknown Unknown In addition, when I use upper air data, I get thousands and thousands of lines in my rsl.error.0000 file that read something like: n= 9061 unknown ob of type FM-88 SATOB Is this normal or is there an option I am missing where it will read these correctly? Any help for either of these problems would be greatly appreciated as I need to figure this out ASAP. I've also attached both namelist.input and namelist.oa for any help that may provide. Thank you. -- Christopher Klich Graduate Student Florida State University (908) 208-9743 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111030/d787b431/attachment.html -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.oa Type: application/octet-stream Size: 2707 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111030/d787b431/attachment.obj -------------- next part -------------- A non-text attachment was scrubbed... 
Name: namelist.input Type: application/octet-stream Size: 9258 bytes Desc: not available Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111030/d787b431/attachment-0001.obj From terrib at ucar.edu Tue Oct 25 20:21:17 2011 From: terrib at ucar.edu (Terri Betancourt) Date: Tue, 25 Oct 2011 20:21:17 -0600 Subject: [Wrf-users] Fwd: Thomas T. Warner Memorial Symposium References: <8E2777F4-0BCC-4650-91D6-2AF9A71D80D5@ucar.edu> Message-ID: <6F5E6E5E-EF2A-4166-B043-BF8735D7811C@ucar.edu> Dear Moderator, If possible I'd like to make a correction to the announcement below before it goes out. The year of the symposium should be 2011, not 2012. What would be the best method to make this correction? Submit a new, corrected announcement, or ask you, as the moderator, to make the modification? Many thanks in advance for your assistance! - Terri b. Begin forwarded message: > From: Terri Betancourt > Date: October 25, 2011 5:09:04 PM MDT > To: wrf-users at ucar.edu > Subject: Thomas T. Warner Memorial Symposium > > Family, friends and colleagues are invited to the National Center for Atmospheric Research on December 2, 2012 for the Thomas T. Warner Memorial Symposium. The purpose of the Symposium is to pay tribute to the scientific career and to celebrate the rich life of Dr. Warner. The symposium will consist of invited oral presentations focused on topics related to the areas of atmospheric modeling that represent Dr. Warner's career, including desert meteorology, numerical weather and climate prediction, quality assurance in the atmospheric modeling process, and Tom?s extraordinary role as a mentor in the lives of his colleagues. > Details on the Symposium are available at the following website: > > http://ral.ucar.edu/general/events/warner_symposium/ > > Contributions may be made to the Warner Visitor Scholarship to fund student visitor and outreach collaborations between PSU, CU, and NCAR. 
Details on the scholarship fund will be posted at the above web site and available at the Symposium. > > Hosted by NCAR?s Research Applications Laboratory, Penn State Department of Meteorology and University of Colorado Atmospheric and Oceanic Sciences. > > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111025/41fed5ea/attachment-0001.html From bbrashers at Environcorp.com Mon Oct 31 11:59:11 2011 From: bbrashers at Environcorp.com (Bart Brashers) Date: Mon, 31 Oct 2011 10:59:11 -0700 Subject: [Wrf-users] Help with Obs Nudging Crash In-Reply-To: References: Message-ID: <1B8D1B9BF4DCDC4A90A42E312FF308520776C23E@irvine01.irvine.environ.local> Sounds like you have 2 issues: 1. OBSGRID.EXE is not writing output at the times you want 2. WRF-Chem is bombing when nudging I posted a bit on problem 1 - I was misinterpreting the DOCs (or the DOCs are misleading, depending on your viewpoint). See http://mailman.ucar.edu/pipermail/wrf-users/2011/002424.html. Basically, set &share::interval_seconds to match your input GRIB data interval (must match the interval you ran METGRID.EXE with). This is 21600 if your GRIB files come every 6 hours. Then make sure your little_r format files contain data every &record7::intf4d (from your email, it looks sorta like you want to nudge every 3 hours). The main point is that OBSGRID.EXE will look for a new file every intf4d seconds, and will not look in a file that's already been opened (from a previous interval). I'm nearly certain that you must combine both your surface and upper-air data into the same file, but that's pretty easy - you can just "cat" them together. I suggest you fix problem 1 first, so you have non-empty OBS_DOMAIN1?? files for the in-between hours, before addressing the bombing of WRF. 
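To make the recipe above concrete, here is a minimal sketch. The file names are hypothetical placeholders (your little_r files will be named by your own converter and dates), and the namelist values are simply the ones discussed above: a 21600 s GRIB/met_em interval with 3-hourly nudging.

```shell
# Stand-in obs files -- in practice these come from your little_r converter;
# the names used here are hypothetical placeholders:
printf 'surface reports for 12Z\n'   > surface_obs.2008-05-20_12
printf 'upper-air reports for 12Z\n' > upperair_obs.2008-05-20_12

# OBSGRID reads one obs file per intf4d window, so surface and upper-air
# reports for the same window must be combined into a single file (then
# renamed to whatever file name OBSGRID expects for that window):
cat surface_obs.2008-05-20_12 upperair_obs.2008-05-20_12 > combined_obs.2008-05-20_12

# Related namelist settings discussed above (comments, not a full namelist):
#   &share    interval_seconds = 21600   # match your GRIB/met_em interval (6 h)
#   &record7  intf4d           = 10800   # OBSGRID opens a new obs file every 3 h
```

With obs files prepared per intf4d window like this, the in-between windows should no longer produce empty OBS_DOMAIN files.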
Bart Brashers From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Chris Klich Sent: Sunday, October 30, 2011 12:31 PM To: wrf-users at ucar.edu Subject: [Wrf-users] Help with Obs Nudging Crash Hi all, I have recently been trying to use obs-nudging with my WRF run. I am currently running WRF-Chem version 3.2.1 due to a long-term project and unable to upgrade to 3.3. I currently have both upper air and surface data in little-r format (separate however). I convert these files into the proper naming format for OBSGRID, run obsgrid and move the files to the /run directory, concatenate all the OBS_DOMAIN files into OBS_DOMAIN101, and then run real.exe and wrf.exe. However, when OBSGRID runs, the output files are every 3 hours, while my met_em files and observation files are every 6. These 3 hour intermediate times (3, 9, 15, 21) are empty when the OBS_DOMAIN and plotobs and qc files are outputted, while the normal intervals of 0, 6, 12, 18 are several megabytes. Even in this case, I had tried concatenating all files to OBS_DOMAIN101 and running WRF. However, it seems the entire run finishes only when I use just upper air data specifically. When I run using just surface obs or concatenating surface to upper air, the run crashes, and it seems at a very random time. When using just surface it crashed after about 12:49 into the run, while running with the concatenated upper air/surface, it crashed after about 3 days, 12:49. I can't seem to figure out what is causing this. The end of my rsl.error.0000 file looks like: Timing for main (dt=135.00): time 2008-05-20_12:45:13 on domain 1: 1.08750 elapsed seconds. Timing for main (dt=135.00): time 2008-05-20_12:47:28 on domain 1: 1.05240 elapsed seconds. OBS NUDGING: Reading new obs for time window TBACK = 12.125 TFORWD = 13.458 for grid = 1 OBS NUDGING: 1 previously read obs are now too old for the current window a nd have been removed. 
****** CALL IN4DOB AT KTAU = 326 AND XTIME = 767.48: NSTA = 43 **** ** ++++++CALL ERROB AT KTAU = 326 AND INEST = 1: NSTA = 43 ++++++ OBS NUDGING FOR IN,J,KTAU,XTIME,IVAR,IPL: 1 10 326 767.48 3 3 rindx= 5. 3 OBS NUDGING FOR IN,J,KTAU,XTIME,IVAR,IPL: 1 10 326 767.48 4 4 rindx= 5. 3 OBS NUDGING FOR IN,J,KTAU,XTIME,IVAR,IPL: 1 10 326 767.48 1 1 rindx= 5. 3 OBS NUDGING FOR IN,J,KTAU,XTIME,IVAR,IPL: 1 10 326 767.48 2 2 rindx= 5. 3 Timing for main (dt=135.00): time 2008-05-20_12:49:43 on domain 1: 1.07960 elapsed seconds. forrtl: error (78): process killed (SIGTERM) Image PC Routine Line Source libc.so.6 00000033C34DD1E3 Unknown Unknown Unknown libopen-pal.so.0 00002B47160653DD Unknown Unknown Unknown libopen-pal.so.0 00002B471606213D Unknown Unknown Unknown libopen-pal.so.0 00002B471605555C Unknown Unknown Unknown libmpi.so.0 00002B4715B43AA8 Unknown Unknown Unknown libmpi.so.0 00002B4715B721AC Unknown Unknown Unknown wrf.exe 0000000001C64282 Unknown Unknown Unknown wrf.exe 0000000000F373F4 Unknown Unknown Unknown wrf.exe 00000000014F73B5 Unknown Unknown Unknown wrf.exe 0000000000E8EC4F Unknown Unknown Unknown wrf.exe 0000000000D8DB6D Unknown Unknown Unknown wrf.exe 000000000052ECDF Unknown Unknown Unknown wrf.exe 00000000004BC733 Unknown Unknown Unknown wrf.exe 00000000004BC6E7 Unknown Unknown Unknown wrf.exe 00000000004BC67C Unknown Unknown Unknown libc.so.6 00000033C341EC9D Unknown Unknown Unknown wrf.exe 00000000004BC579 Unknown Unknown Unknown In addition, when I use upper air data, I get thousands and thousands of lines in my rsl.error.0000 file that read something like: n= 9061 unknown ob of type FM-88 SATOB Is this normal or is there an option I am missing where it will read these correctly? Any help for either of these problems would be greatly appreciated as I need to figure this out ASAP. I've also attached both namelist.input and namelist.oa for any help that may provide. Thank you. 
-- Christopher Klich Graduate Student Florida State University (908) 208-9743 This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message. -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111031/0dd9c343/attachment.html From chenming at ucar.edu Thu Nov 3 16:01:51 2011 From: chenming at ucar.edu (Ming Chen) Date: Thu, 03 Nov 2011 16:01:51 -0600 Subject: [Wrf-users] Fwd: job posting for project scientist I position Message-ID: <4EB30F4F.9050206@ucar.edu> -------- Original Message -------- Subject: job posting for project scientist I position Date: Thu, 03 Nov 2011 16:00:06 -0600 From: Ming Chen To: wrf-news-request at ucar.edu Dear WRF Users, We have activated the requisition to hire a Project Scientist I and have posted this position on the UCAR Career Opportunities page http://tinyurl.com/Proj-Sci-I-12025 Application deadline is Wednesday, November 16, 2011. Hans Xiang-Yu (Hans) Huang Phone: 303-497-8975 Email: huangx at ucar.edu -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111103/d07f93d2/attachment.html From stuefer at gi.alaska.edu Thu Nov 3 17:54:47 2011 From: stuefer at gi.alaska.edu (Martin) Date: Thu, 03 Nov 2011 15:54:47 -0800 Subject: [Wrf-users] Vacancy: Postdoc Position at the University of Alaska Fairbanks Message-ID: <4EB329C7.7030607@gi.alaska.edu> Postdoctoral Position at the Geophysical Institute - University of Alaska Fairbanks A postdoctoral position is available to join a growing research group working in the area of Arctic aerosol assessment using a variety of modern remote sensing and in-situ measurement techniques, and numerical modeling methods. The primary responsibility associated with this full-time funded position is to conduct research to assess temporal and spatial variations of anthropogenic water vapor emissions in the Arctic. Research tasks will include modeling of the respective microphysics and related visibility constraints due to ice fog. Candidates need to have a strong background in physics, and extensive IT and programming skills. The position is aimed at evaluating and improving the capabilities of the Weather Research and Forecast (WRF) model to simulate the microphysics of point sources of water vapor in an Arctic environment. Further details and instructions for applicants can be found at If you have questions regarding the position, please contact Dr. Martin Stuefer at the Geophysical Institute at the University of Alaska Fairbanks, (907) 474-6477 or email: stuefer at gi.alaska.edu. If you have questions about applying online, please contact Farra Smith, HR Consultant, Geophysical Institute, University of Alaska Fairbanks, (907) 474-5511 or email: fsmith32 at alaska.edu. The University of Alaska Fairbanks is an Equal Opportunity/Affirmative Action Employer.
From ebeigi3 at tigers.lsu.edu Mon Nov 7 21:11:09 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Mon, 7 Nov 2011 23:11:09 -0500 Subject: [Wrf-users] average of wrfout variables Message-ID: Dear Sir/Madam, I need to compute daily and monthly averages from 6-hourly wrf output files. Which method or which software is best for this? Thanks in advance. Bests, -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111107/ea7dfc0c/attachment.html From ebeigi3 at tigers.lsu.edu Wed Nov 9 21:33:55 2011 From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi) Date: Wed, 9 Nov 2011 23:33:55 -0500 Subject: [Wrf-users] STOP: LON outside domain Message-ID: Dear Sir/Madam, I am using read_wrf_nc to extract time series of variables from a wrfout file, but when I use this command: ./read_wrf_nc wrfout_d01_2000-01-25_12\:00\:00 -ts ll 30 -90 T2 RAINC RAINNC -lev 1 I get this error: "STOP: LON outside domain". I tested it with a range of lat and lon values, but every time I get the same error. Could anyone help me with that, please? Best Regards, Ehsan Beigi -- *Ehsan Beigi* *PhD Student* *Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803* -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111109/6dda5c74/attachment.html From jpad0001 at um.edu.mt Wed Nov 9 05:24:53 2011 From: jpad0001 at um.edu.mt (Jason Padovani Ginies) Date: Wed, 9 Nov 2011 13:24:53 +0100 Subject: [Wrf-users] Optimal schemes for LES Message-ID: Good afternoon, I am interested in running LES on a small domain in the Mediterranean and have come across some problems when deciding which physics schemes would be optimal to use. I understand that for the finer domain, diff_opt should be set to 2, with an accompanying setting for km_opt of 2 or 3, and as a consequence cumulus and pbl should be set to zero. When it comes to the other parameters such as microphysics, LSM, surface and radiation, I am unsure which would be best for such an application, as the differences among them sometimes seem minimal. Is there any setting which is crucial for such an application? Kind regards, Jason -------------- next part -------------- An HTML attachment was scrubbed...
http://nomads.ncdc.noaa.gov/data.php?name=access#hires_weather_datasets Cory Demko

_____ From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Ehsan Beigi Sent: Monday, November 07, 2011 11:11 PM To: wrfhelp; wrf-users at ucar.edu Subject: [Wrf-users] average of wrfout variables Dear Sir/Madam, I need to compute daily and monthly averages from 6-hourly WRF output files. Which method or software is best suited for this? Thanks in advance. Bests, -- Ehsan Beigi PhD Student Department of Civil and Environmental Engineering 2408 Patrick F. Taylor Hall Louisiana State University Baton Rouge, LA, 70803 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111110/12286021/attachment.html

From jpad0001 at um.edu.mt Thu Nov 10 12:08:19 2011 From: jpad0001 at um.edu.mt (Jason Padovani Ginies) Date: Thu, 10 Nov 2011 20:08:19 +0100 Subject: [Wrf-users] ARWpost for GrADS application Message-ID: Dear wrf-users, I am having trouble compiling ARWpost. When I try to compile I get lots of error messages looking like this: *input_module.f:(.text+0x40f): undefined reference to `ncvgt_'* and eventually am confronted with a final error message saying: *make: [ARWpost.exe] Error 1 (ignored)* Some research has led me to believe that the netCDF build on the system might be causing this problem and that I might have to recompile it. Is there any way around this, perhaps another method of converting WRF output into a format readable by GrADS? Kind regards, Jason Padovani Ginies Final Year B.Sc. student Department of Physics University of Malta -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111110/a4dcdce3/attachment.html

From giselezepka at bol.com.br Fri Nov 11 07:18:41 2011 From: giselezepka at bol.com.br (Gisele dos Santos Zepka) Date: Fri, 11 Nov 2011 12:18:41 -0200 Subject: [Wrf-users] ungrib.exe error Message-ID: <4EBD2EC1.5060109@bol.com.br> Hello all, I want to run a 25-hour simulation in WRF 3.3.1. I am getting an error from ungrib.exe. I set the following in namelist.input:

...
start_hour = 0
start_year = 2009
start_month = 12
start_day = 3
...
end_hour = 0
end_year = 2009
end_month = 12
end_day = 4

I add as input data:

gfs_4_2009-12-03_0000_000.grb2
gfs_4_2009-12-03_0000_003.grb2
...
gfs_4_2009-12-03_0000_024.grb2

The error is:

2011-11-11 11:29:58.568 --- *** StarError_ndate
GETH_IDTS: Hour of NDATE = 24
Screwy NDATE: 2009-12-03_24:00:00

What am I doing wrong? Thanks very much for any help. Gisele -- ____________________________________________ Dra. Gisele dos Santos Zepka Saraiva Meteorologist - Atmospheric Electricity National Institute for Space Research - INPE Av. dos Astronautas, 1758 São José dos Campos, SP Brazil, 12227-010 Phone: ++55 (12) 3208-6841

From jeremy.young868 at topper.wku.edu Thu Nov 10 16:42:01 2011 From: jeremy.young868 at topper.wku.edu (Young, Jeremy, K) Date: Thu, 10 Nov 2011 23:42:01 +0000 Subject: [Wrf-users] ARWpost for GrADS application In-Reply-To: References: Message-ID: <50C0D80B700AFB4597C07F946592F09727C906A0@SN2PRD0302MB123.namprd03.prod.outlook.com> Hi Jason, I've seen a similar error message before when attempting to compile ARWpost. I fixed this problem by adding the following to the configure.arwp file:

--LDFLAGS should now have -L/path/to/netcdf/lib -lnetcdf -lnetcdff
--CPPFLAGS should now have -I/path/to/netcdf/include and -R/path/to/netcdf/lib
--Change the "-O" flags to match that of netcdf during compile, probably "-O2".

Changing the optimization flags, LDFLAGS and CPPFLAGS will likely fix your problem.
The -R/path/to/netcdf/lib flag will eliminate another problem that I ran into when trying to run ARWpost.exe, in which an error message said that there was an "error opening shared libraries ...". If you see this message, you can use a command like "ln -s /path/to/netcdf/lib/libnetcdf* ." issued from the ARWpost directory to link the necessary libraries and allow them to be referenced. I hope this helps. Jeremy Young System Administrator Climate Research Lab Western Kentucky University

From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jason Padovani Ginies Sent: Thursday, November 10, 2011 1:08 PM To: wrf-users at ucar.edu Subject: [Wrf-users] ARWpost for GrADS application Dear wrf-users, I am having trouble compiling ARWpost. When I try to compile I get lots of error messages looking like this: input_module.f:(.text+0x40f): undefined reference to `ncvgt_' and eventually am confronted with a final error message saying: make: [ARWpost.exe] Error 1 (ignored) Some research has led me to believe that the netCDF build on the system might be causing this problem and that I might have to recompile it. Is there any way around this, perhaps another method of converting WRF output into a format readable by GrADS? Kind regards, Jason Padovani Ginies Final Year B.Sc. student Department of Physics University of Malta -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111110/a5a74aac/attachment.html

From bernardo at fisica.edu.uy Sat Nov 12 16:31:31 2011 From: bernardo at fisica.edu.uy (Bernardo de los Santos) Date: Sat, 12 Nov 2011 21:31:31 -0200 Subject: [Wrf-users] Wrf-users Digest, Vol 87, Issue 5 In-Reply-To: References: Message-ID: Hi Gisele, to get 25 hours (if I have not misunderstood), end_hour in the namelist should be at least 03 of the following day, instead of 0. Regards. 2011/11/12 :
> 2011-11-11 11:29:58.568 --- *** StarError_ndate
> GETH_IDTS: Hour of NDATE = 24
> Screwy NDATE: 2009-12-03_24:00:00
-- Bernardo de los Santos Simonelli +598 2 3135353 - 098927775 Montevideo, Uruguay

From rcalmeida at terra.com.br Sun Nov 13 12:30:10 2011 From: rcalmeida at terra.com.br (Ricardo Almeida) Date: Sun, 13 Nov 2011 17:30:10 -0200 Subject: [Wrf-users] Wrf-users Digest, Vol 87, Issue 6 In-Reply-To: References: Message-ID: <3CC383C5B8284B74960BCC3A80315432@ricardoPC> Dear Dr. Gisele, Try changing your simulation period to 24 hours (instead of 25 hours), since your input data spans only that period of time. That should work. Good luck. Ricardo C. de Almeida, D.Sc. Federal University of Paraná - Brazil rcalmeida at ufpr.br ----- Original Message ----- From: To: Sent: Sunday, November 13, 2011 5:00 PM Subject: Wrf-users Digest, Vol 87, Issue 6

From apattantyus2008 at my.fit.edu Sun Nov 13 17:30:30 2011 From: apattantyus2008 at my.fit.edu (Andre Pattantyus) Date: Sun, 13 Nov 2011 14:30:30 -1000 Subject: [Wrf-users] netcdf to littler Message-ID: Hi, I was wondering if anyone has written code to convert data from netCDF to LITTLE_R format for use in 3D-Var? Thanks Andre -- Andre Pattantyus, Graduate Student Research Assistant Marine and Environmental Systems, Florida Institute of Technology 150 W. University Blvd, Melbourne, FL 32901 Phone: (321) 674-8330 | Fax: (321) 674-7212 | Email: apattantyus2008 at fit.edu -------------- next part -------------- An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111113/e60901c0/attachment.html

From giselezepka at gmail.com Fri Nov 11 17:23:37 2011 From: giselezepka at gmail.com (Gisele dos Santos Zepka) Date: Fri, 11 Nov 2011 22:23:37 -0200 Subject: [Wrf-users] ungrib.exe error In-Reply-To: References: <4EBD2EC1.5060109@bol.com.br> Message-ID: <256ADEBD-ED6D-414D-8AD6-0D418BEDD1F2@gmail.com> Hi Kevin, In DomainWizard, after I configure the domains, I complete the time specifications in namelist.input and then add the input data to run geogrid, ungrib and metgrid. The time specifications shown in the previous email really are from namelist.input. So far so good. I believe the problem is the date and time of the last GFS file, because it is gfs_4_20091203_0000_024.grb2 instead of gfs_4_20091204_0000_000.grb2. There is a disagreement between the simulation end date that I have set in namelist.input and the date of the last GFS file. ungrib.exe looks for exactly that file and will not run, reporting an error. I have thought of simply renaming the last GFS file to gfs_4_20091204_0000_000.grb2. I would appreciate any suggestions. Gisele

On 11/11/2011, at 20:59, Kevin Matthew Nuss wrote:
> Hi Gisele,
> You mention that the error comes from ungrib.exe, but you give information from namelist.input rather than namelist.wps. Perhaps you changed the wrong namelist file. If not, then you did not give enough information for us to help you.
> Even if you meant "I set in namelist.wps the following:", the format of the time specifications shown is the format used for namelist.input, not namelist.wps.
> Trying to help,
> Kevin Matthew Nuss
> On Fri, Nov 11, 2011 at 7:18 AM, Gisele dos Santos Zepka wrote:
> Hello all,
> I want to run a 25-hour simulation in WRF 3.3.1. I am getting an error from ungrib.exe.
> 2011-11-11 11:29:58.568 --- *** StarError_ndate
> GETH_IDTS: Hour of NDATE = 24
> Screwy NDATE: 2009-12-03_24:00:00
> What am I doing wrong?
> Thanks very much for any help
> Gisele

____________________________________________ Dra. Gisele dos Santos Zepka Saraiva Meteorologist - Atmospheric Electricity National Institute for Space Research - INPE Av. dos Astronautas, 1758 São José dos Campos, SP Brazil, 12227-010 Phone: ++55 (12) 3208-6841 -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111111/656f1ffe/attachment.html

From jeremy.young868 at topper.wku.edu Sat Nov 12 05:31:47 2011 From: jeremy.young868 at topper.wku.edu (Young, Jeremy, K) Date: Sat, 12 Nov 2011 12:31:47 +0000 Subject: [Wrf-users] ARWpost for GrADS application In-Reply-To: References: <50C0D80B700AFB4597C07F946592F09727C906A0@SN2PRD0302MB123.namprd03.prod.outlook.com> Message-ID: <50C0D80B700AFB4597C07F946592F09727C98C14@CH1PRD0302MB132.namprd03.prod.outlook.com> Hi Jason, What exactly is your problem? Maybe the paths to your wrfout files are incorrect? I've attached the namelist file. The only things I had to change in the namelist when running ARWpost.exe were the start date, the end date, and the name of my wrfout file. I hope the namelist helps.
If you run into any additional problems I can try to help. My ARWpost directory is stored at the same level as my WRFV3 folder. That is, the path to my wrfout files in the namelist looks like "../WRFV3/run/wrfout....". Jeremy Young

From: jasonpadovani at gmail.com [mailto:jasonpadovani at gmail.com] On Behalf Of Jason Padovani Ginies Sent: Friday, November 11, 2011 8:22 PM To: Young, Jeremy, K Subject: Re: [Wrf-users] ARWpost for GrADS application Dear Jeremy, Thank you for your email. In fact, the first modification seems to have solved the problem already, and I now have ARWpost.exe. However, I still cannot make it work =( I cannot understand what I am doing wrong to get different error messages. Could I kindly ask you to explain the procedure if you have expertise in the matter? I've tried renaming/relocating and changing the input file and adjusting the namelist accordingly, but I still can't seem to get it right. Could you kindly send me a copy of your namelist.ARWpost and specify the following: * location and name of input (WRF output) files relative to ARWpost.exe Many thanks, Jason

On Fri, Nov 11, 2011 at 12:42 AM, Young, Jeremy, K > wrote: Hi Jason, I've seen a similar error message before when attempting to compile ARWpost. I fixed this problem by adding the following to the configure.arwp file: --LDFLAGS should now have -L/path/to/netcdf/lib -lnetcdf -lnetcdff --CPPFLAGS should now have -I/path/to/netcdf/include and -R/path/to/netcdf/lib --Change the "-O" flags to match that of netcdf during compile, probably "-O2". Changing the optimization flags, LDFLAGS and CPPFLAGS will likely fix your problem. The -R/path/to/netcdf/lib flag will eliminate another problem that I ran into when trying to run ARWpost.exe, in which an error message said that there was an "error opening shared libraries ...". If you see this message, you can use a command like "ln -s /path/to/netcdf/lib/libnetcdf* ." issued from the ARWpost directory to link the necessary libraries and allow them to be referenced. I hope this helps. Jeremy Young System Administrator Climate Research Lab Western Kentucky University

From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jason Padovani Ginies Sent: Thursday, November 10, 2011 1:08 PM To: wrf-users at ucar.edu Subject: [Wrf-users] ARWpost for GrADS application

_______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111112/79098bd4/attachment-0001.html

From jeremy.young868 at topper.wku.edu Sat Nov 12 05:34:46 2011 From: jeremy.young868 at topper.wku.edu (Young, Jeremy, K) Date: Sat, 12 Nov 2011 12:34:46 +0000 Subject: [Wrf-users] ARWpost for GrADS application References: <50C0D80B700AFB4597C07F946592F09727C906A0@SN2PRD0302MB123.namprd03.prod.outlook.com> Message-ID: <50C0D80B700AFB4597C07F946592F09727C98C37@CH1PRD0302MB132.namprd03.prod.outlook.com> I apologize for not attaching the namelist.ARWpost file. Here it is.
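For reference, Jeremy's configure.arwp changes quoted above amount to entries along the following lines. This is a sketch only: the netCDF paths are placeholders for the local installation, the exact variable names depend on the configure.arwp generated for a given compiler, and -R (a runtime library search path accepted by some compilers) is typically spelled -Wl,-rpath on GNU toolchains.

```make
# Sketch of the relevant configure.arwp entries after the changes
# (replace /path/to/netcdf with the local netCDF installation prefix):
LDFLAGS  = -L/path/to/netcdf/lib -lnetcdf -lnetcdff
CPPFLAGS = -I/path/to/netcdf/include -R/path/to/netcdf/lib
FFLAGS   = -O2   # match the optimization level netCDF was built with
```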
From: Young, Jeremy, K Sent: Saturday, November 12, 2011 6:32 AM To: wrf-users at ucar.edu Subject: RE: [Wrf-users] ARWpost for GrADS application -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111112/03f17da4/attachment.html -------------- next part -------------- A non-text attachment was scrubbed...
Name: namelist.ARWpost Type: application/octet-stream Size: 1200 bytes Desc: namelist.ARWpost Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111112/03f17da4/attachment.obj

From jeremy.young868 at topper.wku.edu Sun Nov 13 10:33:41 2011 From: jeremy.young868 at topper.wku.edu (Young, Jeremy, K) Date: Sun, 13 Nov 2011 17:33:41 +0000 Subject: Re: [Wrf-users] ARWpost for GrADS application In-Reply-To: References: <50C0D80B700AFB4597C07F946592F09727C906A0@SN2PRD0302MB123.namprd03.prod.outlook.com> <50C0D80B700AFB4597C07F946592F09727C98C06@CH1PRD0302MB132.namprd03.prod.outlook.com> Message-ID: <50C0D80B700AFB4597C07F946592F09727C98D4F@CH1PRD0302MB132.namprd03.prod.outlook.com> Hi Jason, If I'm understanding your procedure, you have renamed the wrfout files produced by wrf.exe to a different name (your example is t10). I have not used ARWpost very often, but I'm not sure ARWpost.exe can understand what it's reading if the wrfout files are renamed. Looking at the errors you are seeing, I think that this may be your problem. It's either that the renamed files are not being understood by ARWpost.exe or that there was an error in the compilation of ARWpost.

&datetime
 start_date = '1998-09-01_00:00:00',
 end_date = '1998-09-05_00:00:00',
 interval_seconds = 10800,
 tacc = 0,
 debug_level = 0,
/

&io
 input_root_name = '../WRFV3/run/wrfout_d01_1998-09-01_00:00:00'
 output_root_name = './test'
 plot = 'all_list'
 fields = 'height,pressure,tk,tc'
 mercator_defs = .true.
/
 split_output = .true.
 frames_per_outfile = 2
 plot = 'all'
 plot = 'list'
 plot = 'all_list'
! Below is a list of all available diagnostics
 fields = 'height,geopt,theta,tc,tk,td,td2,rh,rh2,umet,vmet,pressure,u10m,v10m,wdir,wspd,wd10,ws10,slp,mcape,mcin,lcl,lfc,cape,cin,dbz,max_dbz,clfr'

&interp
 interp_method = 0,
 interp_levels = 1000.,950.,900.,850.,800.,750.,700.,650.,600.,550.,500.,450.,400.,350.,300.,250.,200.,150.,100.,
/
 extrapolate = .true.
 interp_method = 0, ! 0 is model levels, -1 is nice height levels, 1 is user specified pressure/height
 interp_levels = 1000.,950.,900.,850.,800.,750.,700.,650.,600.,550.,500.,450.,400.,350.,300.,250.,200.,150.,100.,
 interp_levels = 0.25, 0.50, 0.75, 1.00, 2.00, 3.00, 4.00, 5.00, 6.00, 7.00, 8.00, 9.00, 10.0, 11.0, 12.0, 13.0, 14.0, 15.0, 16.0, 17.0, 18.0, 19.0, 20.0,

This is my namelist.ARWpost. The only lines I changed in this file before running the program were the start date, the end date and the input file name. I don't see why you couldn't name the output files (the .ctl and the .dat) whatever you wish, but it's possible that renaming the file has corrupted it somehow or that ARWpost cannot accept files with names like this. My suggestion would be to rename the t10 file (and others like it) back to their original names and try again. To keep the files organized, maybe you could create several directories that would contain each model run and its related wrfout files. If possible, complete another model run and leave the wrfout files in the ../WRFV3/run directory. Try to run ARWpost.exe while the wrfout files are still in this location and still have the wrfout_d01____ name. In doing so, your process would completely match mine and we might be able to see if the problem is related to your file names or if it is related to ARWpost not compiling correctly. Please respond with any additional questions or problems that you run into. The attached screenshots show my completion of ARWpost using the namelist above. Jeremy Young.

From: jasonpadovani at gmail.com [mailto:jasonpadovani at gmail.com] On Behalf Of Jason Padovani Ginies Sent: Sunday, November 13, 2011 2:17 AM To: Young, Jeremy, K Subject: Re: [Wrf-users] ARWpost for GrADS application Hi Jeremy, Thank you for your time. I did not get the attachment in the email, could you please send it again?
What I've tried so far and the errors I get: * I've made two directories WRFinput [in which I have a WRF output file named t10] and OUTPUT in the ARWpost directory. So in the input and output root name of the namelist I've put (respectively) './WRFinput/t10' and './OUTPUT/test'. I adjusted the dates and the error I get is: * !!!!!!!!!!!!!!!! ARWpost v3.1 !!!!!!!!!!!!!!!! ls: ./WRFinput/t10: Value too large for defined data type ls: ./WRFinput/t10: Value too large for defined data type Oops, we need at least ONE input file for the program to read. * I've tried to put 'WRFinput/' as the input root name so that ARWpost looks for the files automatically in the directory and the error I get is * !!!!!!!!!!!!!!!! ARWpost v3.1 !!!!!!!!!!!!!!!! FOUND the following input files: t10 START PROCESSING DATA WARNING --- I do not recognize this data. Will make an attempt to read it. Processing time --- 2011-01-24_12:00:00 forrtl: severe (174): SIGSEGV, segmentation fault occurred Image PC Routine Line Source ARWpost.exe 000000000040CC91 Unknown Unknown Unknown ARWpost.exe 0000000000419E58 Unknown Unknown Unknown ARWpost.exe 0000000000419395 Unknown Unknown Unknown ARWpost.exe 0000000000409E78 Unknown Unknown Unknown ARWpost.exe 0000000000408BEC Unknown Unknown Unknown libc.so.6 00000030D141D994 Unknown Unknown Unknown ARWpost.exe 0000000000408AE9 Unknown Unknown Unknown I can't see where I am going wrong! Perhaps you have come across such errors before? Thanks again Jason On Sat, Nov 12, 2011 at 1:28 PM, Young, Jeremy, K > wrote: Hi Jason, What exactly is your problem? Maybe the paths to your wrfout files are incorrect? I've attached the namelist file. The only things I had to change in the namelist when running ARWpost.exe were the start date, the end date and the name of my wrfout file. I hope the namelist helps. If you run into any additional problems I can try and help.
Jeremy Young From: jasonpadovani at gmail.com [mailto:jasonpadovani at gmail.com] On Behalf Of Jason Padovani Ginies Sent: Friday, November 11, 2011 8:22 PM To: Young, Jeremy, K Subject: Re: [Wrf-users] ARWpost for GrADS application Dear Jeremy, Thank you for your email. In fact the first modification seems to have solved the problem already and I now have ARWpost.exe. I still cannot make it work however =( I cannot understand what I am doing wrong to get different error messages. Could I kindly ask you to explain the procedure if you have expertise in the matter? I've tried renaming/relocating and changing the input file and adjusting the namelist accordingly but still I can't seem to get it right. Could you kindly send me a copy of your namelist.ARWpost and specify the following: * location and name of input (WRF output) files relative to ARWpost.exe Many thanks, Jason On Fri, Nov 11, 2011 at 12:42 AM, Young, Jeremy, K > wrote: Hi Jason, I've seen a similar error message before when attempting to compile ARWpost. I fixed this problem by adding the following to the configure.arwp file: --LDFLAGS should now have -L/path/to/netcdf/lib -lnetcdf -lnetcdff --CPPFLAGS should now have -I/path/to/netcdf/include and -R/path/to/netcdf/lib --Change the "-O" flags to match that of netcdf during compile, probably "-O2". Changing the optimization flags, LDFLAGS and CPPFLAGS will likely fix your problem. The -R/path/to/netcdf/lib flag will eliminate another problem that I ran into when trying to run ARWpost.exe, in which an error message reported an "error opening shared libraries ...". If you see this message, you can use a command like "ln -s /path/to/netcdf/lib/libnetcdf* ." issued from the ARWpost directory to link the necessary libraries and allow them to be referenced. I hope this helps.
Jeremy Young System Administrator Climate Research Lab Western Kentucky University From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jason Padovani Ginies Sent: Thursday, November 10, 2011 1:08 PM To: wrf-users at ucar.edu Subject: [Wrf-users] ARWpost for GrADS application Dear wrf-users, I am having trouble compiling ARWposts. When I try to compile I get lots of error messages looking like this: input_module.f:(.text+0x40f): undefined reference to `ncvgt_' and eventually are confronted with a final error message saying: make: [ARWpost.exe] Error 1 (ignored) Some research has led me to believe that the netCDF on the system might causing this problem and that I might have to re-compile that Is there any way to go around this perhaps another method of converting wrf output in a format readable by GrADS? Kind regards, Jason Padovani Ginies Final Year B.Sc. student Department of Physics University of Malta _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111113/a7fcbd21/attachment-0001.html -------------- next part -------------- A non-text attachment was scrubbed... Name: namelist.ARWpost Type: application/octet-stream Size: 1200 bytes Desc: namelist.ARWpost Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111113/a7fcbd21/attachment-0001.obj -------------- next part -------------- A non-text attachment was scrubbed... Name: screen1.jpg Type: image/jpeg Size: 211946 bytes Desc: screen1.jpg Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111113/a7fcbd21/attachment-0002.jpg -------------- next part -------------- A non-text attachment was scrubbed... 
Name: screen2.jpg Type: image/jpeg Size: 202910 bytes Desc: screen2.jpg Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111113/a7fcbd21/attachment-0003.jpg From wrf at nusculus.com Fri Nov 11 15:59:17 2011 From: wrf at nusculus.com (Kevin Matthew Nuss) Date: Fri, 11 Nov 2011 15:59:17 -0700 Subject: [Wrf-users] ungrib.exe error In-Reply-To: <4EBD2EC1.5060109@bol.com.br> References: <4EBD2EC1.5060109@bol.com.br> Message-ID: Hi Gisele, You mention that the error comes from ungrib.exe but you give information from namelist.input rather than namelist.wps. Perhaps you changed the wrong namelist file. If not, then you did not give enough information to provide you with help. Even if you meant "I set in namelist.wps the following:" the format of the time specifications shown is the format used for namelist.input, not namelist.wps. Trying to help, Kevin Matthew Nuss On Fri, Nov 11, 2011 at 7:18 AM, Gisele dos Santos Zepka < giselezepka at bol.com.br> wrote: > Hello all, > > I want to run a 25-hour simulation in WRF 3.3.1. I am getting an error > in ungrib.exe. > > I set in namelist.input the following: > > ... > start_hour = 0 > start_year = 2009 > start_month = 12 > start_day = 3 > ... > end_hour = 0 > end_year = 2009 > end_month = 12 > end_day = 4 > > > I add as input data: > > gfs_4_2009-12-03_0000_000.grb2 > gfs_4_2009-12-03_0000_003.grb2 > ... > gfs_4_2009-12-03_0000_024.grb2 > > > The error is: > > 2011-11-11 11:29:58.568 --- *** StarError_ndate > GETH_IDTS: Hour of NDATE = 24 > Screwy NDATE: 2009-12-03_24:00:00 > > What am I doing wrong? > > Thanks very much for any help > > Gisele > > -- > ____________________________________________ > > Dra. Gisele dos Santos Zepka Saraiva > Meteorologist - Atmospheric Electricity > National Institute for Space Research - INPE > Av. dos Astronautas, 1758 > São José
dos Campos, SP > Brazil, 12227-010 > > Phone:++55(12) 3208-6841 > > _______________________________________________ > Wrf-users mailing list > Wrf-users at ucar.edu > http://mailman.ucar.edu/mailman/listinfo/wrf-users > -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111111/9124ff94/attachment.html From jeremy.young868 at topper.wku.edu Mon Nov 14 09:53:39 2011 From: jeremy.young868 at topper.wku.edu (Young, Jeremy, K) Date: Mon, 14 Nov 2011 16:53:39 +0000 Subject: [Wrf-users] ARWpost for GrADS application In-Reply-To: References: <50C0D80B700AFB4597C07F946592F09727C906A0@SN2PRD0302MB123.namprd03.prod.outlook.com> <50C0D80B700AFB4597C07F946592F09727C98C06@CH1PRD0302MB132.namprd03.prod.outlook.com> <50C0D80B700AFB4597C07F946592F09727C98D4F@CH1PRD0302MB132.namprd03.prod.outlook.com>, Message-ID: <50C0D80B700AFB4597C07F946592F09727C98EBB@CH1PRD0302MB132.namprd03.prod.outlook.com> Hi Jason, Here is my configure.arwp. I did indeed have to make the previously mentioned modifications in order to compile ARWpost. The only thing different about this file compared to some of my suggested changes to the configure.arwp file is that I did not have to change the "-O" flag to "-O2". The first time I successfully got ARWpost to compile, I did make this change. After trying the "-R/.../netcdf/lib" flag, I did not need to make this change any longer. I've also pasted my .bashrc in case it may be of use. I hope these files help. # configure.arwp # # This file was automatically generated by the configure script in the # top level directory. You may make changes to the settings in this # file but be aware they will be overwritten each time you run configure. # Ordinarily, it is necessary to run configure once, when the code is # first installed. 
# # To permanently change options, change the settings for your platform # in the file arch/configure.defaults, the preamble, and the postamble - # then rerun configure. # .SUFFIXES: .F90 .f90 .F .f .c .o SHELL = /bin/sh # Listing of options that are usually independent of machine type. # When necessary, these are over-ridden by each architecture. ARFLAGS = PERL = perl RANLIB = echo #### Architecture specific settings #### # Settings for PC Linux i486 i586 i686 x86_64, PGI compiler # FC = pgf90 FFLAGS = -Mfree -byteswapio -O F77FLAGS = -byteswapio -O FNGFLAGS = $(FFLAGS) LDFLAGS = -L/home/jyoung/WRFmodel/netcdf/lib -lnetcdf -lnetcdff CC = gcc CFLAGS = -O CPP = /lib/cpp -C -P -traditional CPPFLAGS = -DIO_NETCDF -DIO_GRIB1 -DIO_BINARY -DRECL4 -Dbytesw -R/home/jyoung/WRFmodel/netcdf/lib ########################################################### # # Macros, these should be generic for all machines LN = ln -sf MAKE = make -i -r RM = /bin/rm -f CP = /bin/cp AR = ar ru .IGNORE: .SUFFIXES: .c .f90 .F90 .f .F .o # There is probably no reason to modify these rules .c.o: $(RM) $@ $(CC) $(CPPFLAGS) $(CFLAGS) -c $< .f90.o: $(RM) $@ $*.mod $(CP) $< $*.f $(FC) $(FFLAGS) -I${NETCDF}/include -c $*.f $(RM) $*.f .F90.o: $(RM) $@ $*.mod $(CPP) $(CPPFLAGS) $(FDEFS) $< > $*.f $(FC) $(FFLAGS) -I${NETCDF}/include -c $*.f $(RM) $*.f # .bashrc # Source global definitions if [ -f /etc/bashrc ]; then . 
/etc/bashrc fi PATH=$PATH:$HOME/bin export PATH # PGI SETTINGS PGI=/opt/pgi; export PGI PATH=/opt/pgi/linux86-64/10.8/bin:$PATH; export PATH MANPATH=$MANPATH:/opt/pgi/linux86-64/10.8/man; export MANPATH LM_LICENSE_FILE=$LM_LICENSE_FILE:/opt/pgi/license.dat; export LM_LICENSE_FILE # WRF SETTINGS NETCDF=/home/jyoung/WRFmodel/netcdf; export NETCDF PATH=/home/jyoung/WRFmodel/netcdf/bin:$PATH; export PATH JASPERLIB=/usr/lib64; export JASPERLIB JASPERINC=/usr/local/include/jasper; export JASPERINC WRF_EM_CORE=1; export WRF_EM_CORE WRF_NMM_CORE=0; export WRF_NMM_CORE WRFIO_NCD_LARGE_FILE_SUPPORT=1; export WRFIO_NCD_LARGE_FILE_SUPPORT LD_LIBRARY_PATH=/home/jyoung/WRFmodel/netcdf/lib:$LD_LIBRARY_PATH; export LD_LIBRARY_PATH LD_LIBRARY_PATH=/home/jyoung/WRFmodel/hdf5/lib:$LD_LIBRARY_PATH; export LD_LIBRARY_PATH #MPICH SETTINGS PATH=/home/jyoung/WRFmodel/mpich/bin:$PATH; export PATH #GRADS SETTINGS PATH=/home/jyoung/WRFmodel/grads/bin:$PATH; export PATH Jeremy Young ________________________________________ From: jasonpadovani at gmail.com [jasonpadovani at gmail.com] on behalf of Jason Padovani Ginies [jpad0001 at um.edu.mt] Sent: Monday, November 14, 2011 10:00 AM To: Young, Jeremy, K Subject: Re: [Wrf-users] ARWpost for GrADS application Hi there, Thanks for your last detailed email. I have a script which copies the run directory to an output location so not to mess up run/ itself if anything goes wrong. What's different is only the paths so things should work fine. My namelist now has start_date = '2011-01-24_12:00:00', end_date = '2011-01-27_12:00:00', and &io input_root_name = '/home/wrf/OUTPUT/LES-Tests/test11/wrfout_d03_2011-01-24_12:00:00' output_root_name = './OUTPUT/test' and is otherwise the same. Again the output I get is !!!!!!!!!!!!!!!! ARWpost v3.1 !!!!!!!!!!!!!!!! 
ls: /home/wrf/OUTPUT/LES-Tests/test11/wrfout_d03_2011-01-24_12:00:00: Value too large for defined data type ls: /home/wrf/OUTPUT/LES-Tests/test11/wrfout_d03_2011-01-24_12:00:00: Value too large for defined data type Oops, we need at least ONE input file for the program to read. I am now thinking it might be a problem in the compilation or it is the fact that since ls is not producing an actual output for ARWpost, then the process is coming to a halt. The last few lines of the compilation in fact read -L/home/wrf/ARWpost/netcdf_links/lib -I/home/wrf/ARWpost/netcdf_links/include -lnetcdf /cm/shared/apps/intel/Compiler/11.1/046/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail Did you have to go through the modifications you have listed in the first email in order to compile ARWpost? And if yes, could I kindly ask you to send me the configure.arwp file please? Once again thank you for your time, I know this must be getting frustrating. Jason On Sun, Nov 13, 2011 at 6:33 PM, Young, Jeremy, K > wrote: Hi Jason, If I'm understanding your procedure, you have renamed the wrfout files produced by wrf.exe to a different name (your example is t10). I have not used ARWpost very often, but I'm not sure if ARWpost.exe can understand what it's reading if the wrfout files are renamed. Looking at the errors you are seeing, I think that this may be your problem. It's either that the renamed files are not being understood by ARWpost.exe or that there was an error in the compilation of ARWpost. &datetime start_date = '1998-09-01_00:00:00', end_date = '1998-09-05_00:00:00', interval_seconds = 10800, tacc = 0, debug_level = 0, / &io input_root_name = '../WRFV3/run/wrfout_d01_1998-09-01_00:00:00' output_root_name = './test' plot = 'all_list' fields = 'height,pressure,tk,tc' mercator_defs = .true. / split_output = .true. frames_per_outfile = 2 plot = 'all' plot = 'list' plot = 'all_list' !
Below is a list of all available diagnostics fields = 'height,geopt,theta,tc,tk,td,td2,rh,rh2,umet,vmet,pressure,u10m,v10m,wdir,wspd,wd10,ws10,slp,mcape,mcin,lcl,lfc,cape,cin,dbz,max_dbz,clfr' &interp interp_method = 0, interp_levels = 1000.,950.,900.,850.,800.,750.,700.,650.,600.,550.,500.,450.,400.,350.,300.,250.,200.,150.,100., / extrapolate = .true. interp_method = 0, ! 0 is model levels, -1 is nice height levels, 1 is user specified pressure/height interp_levels = 1000.,950.,900.,850.,800.,750.,700.,650.,600.,550.,500.,450.,400.,350.,300.,250.,200.,150.,100., interp_levels = 0.25, 0.50, 0.75, 1.00, 2.00, 3.00, 4.00, 5.00, 6.00, 7.00, 8.00, 9.00, 10.0, 11.0, 12.0, 13.0, 14.0, 15.0, 16.0, 17.0, 18.0, 19.0, 20.0, This is my namelist.ARWpost. The only lines I changed in this file before running the program were the start date, the end date and the input file name. I don't see why you couldn't name the output files (the .ctl and the .dat) whatever you wish, but it's possible that renaming the file has corrupted it somehow or that ARWpost cannot accept files with names like this. My suggestion would be to try and rename the t10 file (and others like it) back to their original names and try again. To keep the files organized, maybe you could create several directories that would contain each model run and its related wrfout files. If possible, complete another model run and leave the wrfout files in the ../WRFV3/run directory. Try and run ARWpost.exe while the wrfout files are still in this location and still have the wrfout_d01____ name. In doing so, your process would completely match mine and we might be able to see if the problem is related to your file names or if it is related to ARWpost not compiling correctly. Please respond with any additional questions or problems that you run into. The attached screenshots show my completion of ARWpost using the namelist above. Jeremy Young.
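[Editor's note] The suggestion above, keeping each model run's wrfout files in their own directory while preserving their original names, can be sketched as a small helper. This is a hypothetical convenience script, not part of WRF or ARWpost; `archive_root` and `run_label` are placeholder names.

```python
import glob
import os
import shutil

def organize_wrfout(run_dir, archive_root, run_label):
    """Move every wrfout file from a WRF run directory into its own
    per-run folder (archive_root/run_label/), keeping the original
    wrfout_d0X_... names intact so ARWpost can still recognize them."""
    dest = os.path.join(archive_root, run_label)
    os.makedirs(dest, exist_ok=True)
    moved = []
    for f in sorted(glob.glob(os.path.join(run_dir, "wrfout_d0*"))):
        shutil.move(f, os.path.join(dest, os.path.basename(f)))
        moved.append(os.path.basename(f))
    return moved
```

Called as `organize_wrfout("../WRFV3/run", "/data/archive", "test11")`, it would file the run's output under `/data/archive/test11/` without renaming anything.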
From: jasonpadovani at gmail.com [mailto:jasonpadovani at gmail.com] On Behalf Of Jason Padovani Ginies Sent: Sunday, November 13, 2011 2:17 AM To: Young, Jeremy, K Subject: Re: [Wrf-users] ARWpost for GrADS application Hi Jeremy, Thank you for your time. I did not get the attachment in the email, could you please send it again? What I've tried so far and the errors I get: * I've made two directories WRFinput [in which I have a WRF output file named t10] and OUTPUT in the ARWpost directory. So in the input and output root name of the namelist I've put (respectively) './WRFinput/t10' and './OUTPUT/test'. I adjusted the dates and the error I get is: * !!!!!!!!!!!!!!!! ARWpost v3.1 !!!!!!!!!!!!!!!! ls: ./WRFinput/t10: Value too large for defined data type ls: ./WRFinput/t10: Value too large for defined data type Oops, we need at least ONE input file for the program to read. * I've tried to put 'WRFinput/' as the input root name so that ARWpost looks for the files automatically in the directory and the error I get is * !!!!!!!!!!!!!!!! ARWpost v3.1 !!!!!!!!!!!!!!!! FOUND the following input files: t10 START PROCESSING DATA WARNING --- I do not recognize this data. Will make an attempt to read it. Processing time --- 2011-01-24_12:00:00 forrtl: severe (174): SIGSEGV, segmentation fault occurred Image PC Routine Line Source ARWpost.exe 000000000040CC91 Unknown Unknown Unknown ARWpost.exe 0000000000419E58 Unknown Unknown Unknown ARWpost.exe 0000000000419395 Unknown Unknown Unknown ARWpost.exe 0000000000409E78 Unknown Unknown Unknown ARWpost.exe 0000000000408BEC Unknown Unknown Unknown libc.so.6 00000030D141D994 Unknown Unknown Unknown ARWpost.exe 0000000000408AE9 Unknown Unknown Unknown I can't see where I am going wrong! Perhaps you have come across such errors before? Thanks again Jason On Sat, Nov 12, 2011 at 1:28 PM, Young, Jeremy, K > wrote: Hi Jason, What exactly is your problem? Maybe the paths to your wrfout files are incorrect? I've attached the namelist file.
The only things I had to change in the namelist when running ARWpost.exe were the start date, the end date and the name of my wrfout file. I hope the namelist helps. If you run into any additional problems I can try and help. Jeremy Young From: jasonpadovani at gmail.com [mailto:jasonpadovani at gmail.com] On Behalf Of Jason Padovani Ginies Sent: Friday, November 11, 2011 8:22 PM To: Young, Jeremy, K Subject: Re: [Wrf-users] ARWpost for GrADS application Dear Jeremy, Thank you for your email. In fact the first modification seems to have solved the problem already and I now have ARWpost.exe. I still cannot make it work however =( I cannot understand what I am doing wrong to get different error messages. Could I kindly ask you to explain the procedure if you have expertise in the matter? I've tried renaming/relocating and changing the input file and adjusting the namelist accordingly but still I can't seem to get it right. Could you kindly send me a copy of your namelist.ARWpost and specify the following: * location and name of input (WRF output) files relative to ARWpost.exe Many thanks, Jason On Fri, Nov 11, 2011 at 12:42 AM, Young, Jeremy, K > wrote: Hi Jason, I've seen a similar error message before when attempting to compile ARWpost. I fixed this problem by adding the following to the configure.arwp file: --LDFLAGS should now have -L/path/to/netcdf/lib -lnetcdf -lnetcdff --CPPFLAGS should now have -I/path/to/netcdf/include and -R/path/to/netcdf/lib --Change the "-O" flags to match that of netcdf during compile, probably "-O2". Changing the optimization flags, LDFLAGS and CPPFLAGS will likely fix your problem. The -R/path/to/netcdf/lib flag will eliminate another problem that I ran into when trying to run ARWpost.exe, in which an error message reported an "error opening shared libraries ...". If you see this message, you can use a command like "ln -s /path/to/netcdf/lib/libnetcdf* ."
issued from the ARWpost directory to link the necessary libraries and allow them to be referenced. I hope this helps. Jeremy Young System Administrator Climate Research Lab Western Kentucky University From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jason Padovani Ginies Sent: Thursday, November 10, 2011 1:08 PM To: wrf-users at ucar.edu Subject: [Wrf-users] ARWpost for GrADS application Dear wrf-users, I am having trouble compiling ARWposts. When I try to compile I get lots of error messages looking like this: input_module.f:(.text+0x40f): undefined reference to `ncvgt_' and eventually are confronted with a final error message saying: make: [ARWpost.exe] Error 1 (ignored) Some research has led me to believe that the netCDF on the system might causing this problem and that I might have to re-compile that Is there any way to go around this perhaps another method of converting wrf output in a format readable by GrADS? Kind regards, Jason Padovani Ginies Final Year B.Sc. 
student Department of Physics University of Malta _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users _______________________________________________ Wrf-users mailing list Wrf-users at ucar.edu http://mailman.ucar.edu/mailman/listinfo/wrf-users From jeremy.young868 at topper.wku.edu Mon Nov 14 14:14:24 2011 From: jeremy.young868 at topper.wku.edu (Young, Jeremy, K) Date: Mon, 14 Nov 2011 21:14:24 +0000 Subject: [Wrf-users] ARWpost for GrADS application In-Reply-To: References: <50C0D80B700AFB4597C07F946592F09727C906A0@SN2PRD0302MB123.namprd03.prod.outlook.com> <50C0D80B700AFB4597C07F946592F09727C98C06@CH1PRD0302MB132.namprd03.prod.outlook.com> <50C0D80B700AFB4597C07F946592F09727C98D4F@CH1PRD0302MB132.namprd03.prod.outlook.com> , Message-ID: <50C0D80B700AFB4597C07F946592F09727C98EF6@CH1PRD0302MB132.namprd03.prod.outlook.com> Hi Jason, I'm glad that you figured out what your problem was. For my knowledge in helping other students that may run into this problem, what exactly did you do to get ARWpost to work correctly? Was it a compilation issue? Thanks for the information! Jeremy Young ________________________________________ From: jasonpadovani at gmail.com [jasonpadovani at gmail.com] on behalf of Jason Padovani Ginies [jpad0001 at um.edu.mt] Sent: Monday, November 14, 2011 11:26 AM To: Young, Jeremy, K Subject: Re: [Wrf-users] ARWpost for GrADS application Dear Jeremy, I think I found what the problem is. I tried running a quick simulation producing a file which is less than 2Gb in size. Guess what? !!!!!!!!!!!!!!!! ARWpost v3.1 !!!!!!!!!!!!!!!! 
FOUND the following input files: /home/wrf/OUTPUT/run_kernel/wrfout_d01_2011-01-24_12:00:00 START PROCESSING DATA Processing time --- 2011-01-24_12:00:00 Found the right date - continue Processing time --- 2011-01-24_15:00:00 Found the right date - continue Processing time --- 2011-01-24_18:00:00 Found the right date - continue Processing time --- 2011-01-24_21:00:00 Found the right date - continue Processing time --- 2011-01-25_00:00:00 Found the right date - continue Processing time --- 2011-01-25_03:00:00 Found the right date - continue Processing time --- 2011-01-25_06:00:00 Found the right date - continue Processing time --- 2011-01-25_09:00:00 Found the right date - continue Processing time --- 2011-01-25_12:00:00 Found the right date - continue DONE Processing Data CREATING .ctl file !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! ! Successful completion of ARWpost ! !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! On Mon, Nov 14, 2011 at 5:00 PM, Jason Padovani Ginies > wrote: Hi there, Thanks for your last detailed email. I have a script which copies the run directory to an output location so not to mess up run/ itself if anything goes wrong. What's different is only the paths so things should work fine. My namelist now has start_date = '2011-01-24_12:00:00', end_date = '2011-01-27_12:00:00', and &io input_root_name = '/home/wrf/OUTPUT/LES-Tests/test11/wrfout_d03_2011-01-24_12:00:00' output_root_name = './OUTPUT/test' and is otherwise the same. Again the output I get is !!!!!!!!!!!!!!!! ARWpost v3.1 !!!!!!!!!!!!!!!! ls: /home/wrf/OUTPUT/LES-Tests/test11/wrfout_d03_2011-01-24_12:00:00: Value too large for defined data type ls: /home/wrf/OUTPUT/LES-Tests/test11/wrfout_d03_2011-01-24_12:00:00: Value too large for defined data type Oops, we need at least ONE input file for the program to read. I am now thinking it might be a problem in the compilation or it is the fact that since ls is not producing an actual output for ARWpost, then the process is coming to a halt. 
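[Editor's note] Jason's "Value too large for defined data type" failure, together with his success on a file smaller than 2 GB, points at the classic 32-bit large-file limit (EOVERFLOW from `ls`/`stat` when a tool is built without large-file support). A quick check of that hypothesis is sketched below; the 2 GiB threshold is the assumption here, and rebuilding the post-processing tools with large-file support (e.g. the `WRFIO_NCD_LARGE_FILE_SUPPORT=1` setting used for WRF's own netCDF I/O) is the usual remedy.

```python
import os

# Signed 32-bit off_t limit: files at or above this size trigger
# "Value too large for defined data type" (EOVERFLOW) in 32-bit tools.
TWO_GIB = 2**31

def exceeds_large_file_limit(path):
    """Return True if a wrfout file is too big for utilities compiled
    without large-file support -- the suspected ARWpost failure mode."""
    return os.path.getsize(path) >= TWO_GIB
```

Running this over each wrfout file before post-processing would flag exactly the files that a non-large-file-aware ARWpost build refuses to open.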
The last few lines of the compilation in fact read -L/home/wrf/ARWpost/netcdf_links/lib -I/home/wrf/ARWpost/netcdf_links/include -lnetcdf /cm/shared/apps/intel/Compiler/11.1/046/lib/intel64/libimf.so: warning: warning: feupdateenv is not implemented and will always fail Did you have to go through the modifications you have listed in the first email in order to compile ARWpost? And if yes, could I kindly ask you to send me the configure.arwp file please? Once again thank you for your time, I know this must be getting frustrating. Jason On Sun, Nov 13, 2011 at 6:33 PM, Young, Jeremy, K > wrote: Hi Jason, If I'm understanding your procedure, you have renamed the wrfout files produced by wrf.exe to a different name (your example is t10). I have not used ARWpost very often, but I'm not sure if ARWpost.exe can understand what it's reading if the wrfout files are renamed. Looking at the errors you are seeing, I think that this may be your problem. It's either that the renamed files are not being understood by ARWpost.exe or that there was an error in the compilation of ARWpost. &datetime start_date = '1998-09-01_00:00:00', end_date = '1998-09-05_00:00:00', interval_seconds = 10800, tacc = 0, debug_level = 0, / &io input_root_name = '../WRFV3/run/wrfout_d01_1998-09-01_00:00:00' output_root_name = './test' plot = 'all_list' fields = 'height,pressure,tk,tc' mercator_defs = .true. / split_output = .true. frames_per_outfile = 2 plot = 'all' plot = 'list' plot = 'all_list' !
0 is model levels, -1 is nice height levels, 1 is user specified pressure/height interp_levels = 1000.,950.,900.,850.,800.,750.,700.,650.,600.,550.,500.,450.,400.,350.,300.,250.,200.,150.,100., interp_levels = 0.25, 0.50, 0.75, 1.00, 2.00, 3.00, 4.00, 5.00, 6.00, 7.00, 8.00, 9.00, 10.0, 11.0, 12.0, 13.0, 14.0, 15.0, 16.0, 17.0, 18.0, 19.0, 20.0, This is my namelist.ARWpost. The only lines I changed in this file before running the program were the start date, the end date and the input file name. I don't see why you couldn't name the output files (the .ctl and the .dat) whatever you wish, but it's possible that renaming the file has corrupted it somehow or that ARWpost cannot accept files with names like this. My suggestion would be to try and rename the t10 file (and others like it) back to their original names and try again. To keep the files organized, maybe you could create several directories that would contain each model run and its related wrfout files. If possible, complete another model run and leave the wrfout files in the ../WRFV3/run directory. Try and run ARWpost.exe while the wrfout files are still in this location and still have the wrfout_d01____ name. In doing so, your process would completely match mine and we might be able to see if the problem is related to your file names or if it is related to ARWpost not compiling correctly. Please respond with any additional questions or problems that you run into. The attached screenshots show my completion of ARWpost using the namelist above. Jeremy Young. From: jasonpadovani at gmail.com [mailto:jasonpadovani at gmail.com] On Behalf Of Jason Padovani Ginies Sent: Sunday, November 13, 2011 2:17 AM To: Young, Jeremy, K Subject: Re: [Wrf-users] ARWpost for GrADS application Hi Jeremy, Thank you for your time. I did not get the attachment in the email, could you please send it again?
What I've tried so far and the errors I get: * I've made two directories WRFinput [in which I have a WRF output file named t10] and OUTPUT in the ARWpost directory. So in the input and output root name of the namelist I've put (respectively) './WRFinput/t10' and './OUTPUT/test'. I adjusted the dates and the error I get is: * !!!!!!!!!!!!!!!! ARWpost v3.1 !!!!!!!!!!!!!!!! ls: ./WRFinput/t10: Value too large for defined data type ls: ./WRFinput/t10: Value too large for defined data type Oops, we need at least ONE input file for the program to read. * I've tried to put 'WRFinput/' as the input root name so that ARWpost looks for the files automatically in the directory and the error I get is * !!!!!!!!!!!!!!!! ARWpost v3.1 !!!!!!!!!!!!!!!! FOUND the following input files: t10 START PROCESSING DATA WARNING --- I do not recognize this data. Will make an attempt to read it. Processing time --- 2011-01-24_12:00:00 forrtl: severe (174): SIGSEGV, segmentation fault occurred Image PC Routine Line Source ARWpost.exe 000000000040CC91 Unknown Unknown Unknown ARWpost.exe 0000000000419E58 Unknown Unknown Unknown ARWpost.exe 0000000000419395 Unknown Unknown Unknown ARWpost.exe 0000000000409E78 Unknown Unknown Unknown ARWpost.exe 0000000000408BEC Unknown Unknown Unknown libc.so.6 00000030D141D994 Unknown Unknown Unknown ARWpost.exe 0000000000408AE9 Unknown Unknown Unknown I can't see where I am going wrong! Perhaps you have come across such errors before? Thanks again Jason On Sat, Nov 12, 2011 at 1:28 PM, Young, Jeremy, K > wrote: Hi Jason, What exactly is your problem? Maybe the paths to your wrfout files are incorrect? I've attached the namelist file. The only things I had to change in the namelist when running ARWpost.exe were the start date, the end date and the name of my wrfout file. I hope the namelist helps. If you run into any additional problems I can try and help.
Jeremy Young

From: jasonpadovani at gmail.com [mailto:jasonpadovani at gmail.com] On Behalf Of Jason Padovani Ginies
Sent: Friday, November 11, 2011 8:22 PM
To: Young, Jeremy, K
Subject: Re: [Wrf-users] ARWpost for GrADS application

Dear Jeremy,

Thank you for your email. In fact the first modification seems to have solved the problem already and I now have ARWpost.exe. I still cannot make it work, however =( I cannot understand what I am doing wrong to get different error messages. Could I kindly ask you to explain the procedure if you have expertise in the matter? I've tried renaming/relocating and changing the input file and adjusting the namelist accordingly, but still I can't seem to get it right. Could you kindly send me a copy of your namelist.ARWpost and specify the following:

* location and name of input (WRF output) files relative to ARWpost.exe

Many thanks,
Jason

On Fri, Nov 11, 2011 at 12:42 AM, Young, Jeremy, K > wrote:

Hi Jason,

I've seen a similar error message before when attempting to compile ARWpost. I fixed this problem by making the following changes to the configure.arwp file:

-- LDFLAGS should now have -L/path/to/netcdf/lib -lnetcdf -lnetcdff
-- CPPFLAGS should now have -I/path/to/netcdf/include and -R/path/to/netcdf/lib
-- Change the '-O' flags to match those of netCDF during compile, probably '-O2'.

Changing the optimization flags, LDFLAGS and CPPFLAGS will likely fix your problem. The -R/path/to/netcdf/lib will eliminate another problem that I ran into when trying to run ARWpost.exe, in which an error message said that there was an 'error opening shared libraries ...'. If you see this message, you can use a command like 'ln -s /path/to/netcdf/lib/libnetcdf* .' issued from the ARWpost directory to link the necessary libraries and allow them to be referenced. I hope this helps.
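The configure.arwp edits described above can be sketched as shell commands. This is a sketch only: the netCDF prefix is a hypothetical placeholder, GNU sed is assumed, and the commands are meant to be run from the ARWpost directory.

```shell
# Hypothetical netCDF prefix -- substitute your real installation path.
NETCDF=/path/to/netcdf

# Append the link and include flags to the existing lines in configure.arwp.
sed -i "s|^LDFLAGS\(.*\)|LDFLAGS\1 -L${NETCDF}/lib -lnetcdf -lnetcdff|" configure.arwp
sed -i "s|^CPPFLAGS\(.*\)|CPPFLAGS\1 -I${NETCDF}/include -R${NETCDF}/lib|" configure.arwp

# If ARWpost.exe later reports 'error opening shared libraries', link the
# netCDF libraries into the ARWpost directory so the loader can find them:
ln -s ${NETCDF}/lib/libnetcdf* .
```

Remember to also change any '-O' optimization flags in configure.arwp to match those used when netCDF was built.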
Jeremy Young
System Administrator
Climate Research Lab
Western Kentucky University

From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of Jason Padovani Ginies
Sent: Thursday, November 10, 2011 1:08 PM
To: wrf-users at ucar.edu
Subject: [Wrf-users] ARWpost for GrADS application

Dear wrf-users,

I am having trouble compiling ARWpost. When I try to compile I get lots of error messages looking like this:

input_module.f:(.text+0x40f): undefined reference to `ncvgt_'

and eventually I am confronted with a final error message saying:

make: [ARWpost.exe] Error 1 (ignored)

Some research has led me to believe that the netCDF on the system might be causing this problem and that I might have to re-compile it. Is there any way to get around this, perhaps another method of converting WRF output into a format readable by GrADS?

Kind regards,
Jason Padovani Ginies
Final Year B.Sc. student
Department of Physics
University of Malta

_______________________________________________
Wrf-users mailing list
Wrf-users at ucar.edu
http://mailman.ucar.edu/mailman/listinfo/wrf-users

From tanxiangshan2005 at 126.com Wed Nov 16 01:37:18 2011
From: tanxiangshan2005 at 126.com (tanxiangshan2005)
Date: Wed, 16 Nov 2011 16:37:18 +0800
Subject: [Wrf-users] about droplet effective radius
Message-ID: <201111161637180539261@126.com>

Dear Prof.,

Recently I have wanted to output the droplet effective radius using the WRF-Chem model. Please tell me how to do this, or, more generally, how to calculate the droplet effective radius from the output variables.

Thank you very much,
Jingjing Shang
2011-11-16
tanxiangshan2005
2011-11-16
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111116/4e11e4a4/attachment.html

From Mouhamad.Al-Sayed-Ali at u-bourgogne.fr Thu Nov 17 08:56:32 2011
From: Mouhamad.Al-Sayed-Ali at u-bourgogne.fr (Mouhamad Al-Sayed-Ali)
Date: Thu, 17 Nov 2011 16:56:32 +0100
Subject: [Wrf-users] Global WRF
Message-ID: <20111117165632.151874hsbvdv9zgo@webmail.u-bourgogne.fr>

Hello everyone,

I have been trying to run WRF as a global model. I have compiled it, but when I run wrf.exe I get some strange results (I only get zeros). If someone has already used global WRF, could you please send me the namelist.input and namelist.wps you have used?

Many thanks for your help,
Mouhamad

From jians at umn.edu Fri Nov 18 12:51:11 2011
From: jians at umn.edu (Jian Sun)
Date: Fri, 18 Nov 2011 13:51:11 -0600
Subject: [Wrf-users] problem of running WRF with 2+ threads on Mac
Message-ID:

Hi everyone,

I have a problem running WRF on my Mac Pro with Lion (Xcode 4.2). I am using Intel Fortran Composer XE and GCC (4.2). I am able to run WRF in serial mode or in smpar mode with a single thread, but a segmentation fault occurs (forrtl: severe (174): SIGSEGV, segmentation fault occurred) when I try to run it with 2 or more threads. After a lot of searching, the most likely reason seems to be the stack size. I did "unlimit" on the command line, which is supposed to free the stack size, but there is a hard upper limit on the Mac (64 MB). Then I adjusted the LDFLAGS_LOCAL option in configure.wrf, increasing -stack_size to 0x1F0000000 (about 8 GB). I still cannot make it run with 2+ threads and get the same error.

Any help or advice is truly appreciated,
Jian Sun

-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111118/774415af/attachment.html From mattocks at mac.com Tue Nov 22 19:05:37 2011 From: mattocks at mac.com (Craig Mattocks) Date: Tue, 22 Nov 2011 21:05:37 -0500 Subject: [Wrf-users] problem of running WRF with 2+ threads on Mac Message-ID: <0DB45B6B-A529-43B8-AABF-7FC8D463FC07@mac.com> Hi Jian, WRF runs fine in hybrid shared + distributed memory mode on my MacBook Pro laptop (Intel Core i7, Sandy Bridge, SSD) under Mac OS X Lion. I am getting about a 10% speed boost over pure MPI. Make sure you follow the instructions in the 'configure.wrf' file: # increase stack size; also note that for OpenMP, set environment OMP_STACKSIZE 4G or greater LDFLAGS_LOCAL = -ip -Wl,-stack_addr,0xF10000000 -Wl,-stack_size,0x64000000 by setting: setenv OMP_STACKSIZE 4G before you run your simulation. I am also using these settings in my .cshrc file: ############################################### # Allow unlimited memory for big executables: # ############################################### limit datasize unlimited limit descriptors 8912 limit memorylocked unlimited limit memoryuse unlimited limit stacksize unlimited mpich2 (http://www.mcs.anl.gov/research/projects/mpich2/) runs much faster than OpenMPI for me, but beware that you have to roll back to version 12.04 of the Intel Fortran compiler (instead of 12.1.x) to build the shared library version (I have reported this bug, but Intel warned me that it will not be fixed in the immediate future). 
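For readers running under bash rather than csh, the OMP_STACKSIZE and 'limit' settings Craig shows above translate roughly as follows. This is a sketch under the assumption of a reasonably recent bash; exact limit behavior varies by OS, and macOS still caps the hard stack limit.

```shell
# bash equivalents of the csh settings above (a sketch; behavior varies by OS).
export OMP_STACKSIZE=4G       # per-thread OpenMP stack, as configure.wrf notes
ulimit -s unlimited 2>/dev/null || ulimit -s hard   # stack: raise soft to hard limit
ulimit -d unlimited 2>/dev/null || true             # data segment size
ulimit -l unlimited 2>/dev/null || true             # locked-in memory (may need privileges)
```

Setting these in the same shell that launches wrf.exe matters, since OMP_STACKSIZE is read by the OpenMP runtime at startup.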
Also, remove/rename the default Apple-installed OpenMPI compilers in your /usr/bin directory: sudo \rm /usr/bin/opal_wrapper sudo \rm /usr/bin/orte-ps sudo \rm /usr/bin/orte-iof sudo \rm /usr/bin/orte-clean sudo \rm /usr/bin/orterun sudo \rm /usr/bin/orted sudo \rm /usr/bin/otfmerge sudo \rm /usr/bin/otfinfo sudo \rm /usr/bin/otfdump sudo \rm /usr/bin/otfdecompress -> otfcompress sudo \rm /usr/bin/otfconfig sudo \rm /usr/bin/otfcompress sudo \rm /usr/bin/otfaux sudo \rm /usr/bin/vtunify sudo \rm /usr/bin/vtfilter sudo \rm /usr/bin/vtf90 sudo \rm /usr/bin/vtf77 sudo \rm /usr/bin/vtcxx sudo \rm /usr/bin/vtcc sudo \rm /usr/bin/opari sudo \rm /usr/bin/ompi_info sudo \rm /usr/bin/mpif90-vt sudo \rm /usr/bin/mpif90 sudo \rm /usr/bin/mpif77-vt sudo \rm /usr/bin/mpif77 sudo \rm /usr/bin/mpicxx-vt sudo \rm /usr/bin/mpicxx sudo \rm /usr/bin/mpicc-vt sudo \rm /usr/bin/mpicc sudo \rm /usr/bin/mpic++-vt sudo \rm /usr/bin/mpic++ sudo \rm /usr/bin/mpiCC-vt sudo \rm /usr/bin/mpiCC sudo \rm /usr/bin/ompi-server sudo \rm /usr/bin/ompi-ps sudo \rm /usr/bin/ompi-iof sudo \rm /usr/bin/ompi-clean sudo \rm /usr/bin/mpirun sudo \rm /usr/bin/mpiexec Some good input data for testing: wget http://www.mmm.ucar.edu/wrf_tmp/friendly/VENDOR_small_arw_ic.tar Hope this helps, Craig On Friday, 18 November 2011 at 13:51:11 -0600 Jian Sun wrote: > Hi everyone, > > I have a problem of running WRF on my Mac pro with Lion (Xcode 4.2). I am > using intel fortran composer XE and GCC (4.2). > > I am able to run WRF in serial mode or in smpar mode with single thread. > But segmentation fault occurs (forrtl: severe (174): SIGSEGV, segmentation > fault occurred) when i try to run it with 2 or more threads. After a lot > searching, the most possible reason seems to be about the stack size. I did > "unlimit" in the command line, which is supposed to free the stack size, > but there is hard upper limit in mac (64MB). 
Then I adjusted LDFLAGS_LOCAL > option in configure.wrf, increasing -stack_size to 0x1F0000000 (about 8GB). I > still cannot make it run with 2+ threads and get the same error. > > Any help or advice is truly appreciated > > Jian Sun

From ebeigi3 at tigers.lsu.edu Mon Nov 21 16:46:17 2011
From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi)
Date: Mon, 21 Nov 2011 17:46:17 -0600
Subject: [Wrf-users] Calculating precipitation?
Message-ID:

Dear Sir/Madam,

I added RAINC to RAINNC to calculate precipitation for my domain, and then for each day I took the sum at every 6 hours (00, 06, 12, 18) to estimate the daily precipitation. Do you think I am calculating precipitation the correct way? I think it is wrong, because the simulated precipitation has an increasing trend while it should be stochastic.

Bests,
Ehsan Beigi

--
*Ehsan Beigi*
*PhD Student*
*Department of Civil and Environmental Engineering
2408 Patrick F. Taylor Hall
Louisiana State University
Baton Rouge, LA, 70803*

-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111121/71a7e582/attachment.html From kedx1kii at nottingham.edu.my Tue Nov 22 00:03:57 2011 From: kedx1kii at nottingham.edu.my (KENOBI ISIMA MORRIS) Date: Tue, 22 Nov 2011 07:03:57 +0000 Subject: [Wrf-users] Problem compiling WRF: Message-ID: <9DFC984430744E4E89E3FF2B58929FF801B2A3@MAILBOX2.nottingham.edu.my> Dear Sir/Madam, I am new to using WRF and would need help in solving this problem; during the compilation of the WRFV3 version 3.3.1, I encountered these problems at the respective lines; 2487: Fatal Error: Can't open module file 'module_cu_g3.mod' for reading at (1): No such file or directory 2520: Fatal Error: Can't open module file 'module_cu_g3.mod' for reading at (1): No such file or directory 2670: Fatal Error: Can't open module file 'module_cu_g3.mod' for reading at (1): No such file or directory 2687: Fatal Error: Can't open module file 'module_cumulus_driver.mod' for reading at (1): No such file or directory 2759: Fatal Error: Can't open module file 'module_physics_init.mod' for reading at (1): No such file or directory 2787: Fatal Error: Can't open module file 'module_physics_calls.mod' for reading at (1): No such file or directory. The attached is the compile log file. I would really appreciate if anyone could help me. Thanks, Kenobi Isima Morris PhD Research Student University of Nottingham Malaysia Campus<< This message and any attachment are intended solely for the addressee and may contain confidential information. If you have received this message in error, please send it back to me, and immediately delete it. Please do not use, copy or disclose the information contained in this message or in any attachment. Any views or opinions expressed by the author of this email do not necessarily reflect the views of the University of Nottingham. 
This message has been checked for viruses but the contents of an attachment may still contain software viruses which could damage your computer system: you are advised to perform your own checks. Email communications with the University of Nottingham may be monitored as permitted by UK & Malaysia legislation. >> -------------- next part -------------- A non-text attachment was scrubbed... Name: compile_nmm.log Type: text/x-log Size: 516672 bytes Desc: compile_nmm.log Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111122/0e852551/attachment-0001.bin From kenimor at yahoo.com Tue Nov 22 00:09:31 2011 From: kenimor at yahoo.com (MORRIS KENOBI ISIMA) Date: Mon, 21 Nov 2011 23:09:31 -0800 (PST) Subject: [Wrf-users] Problem compiling WRF: In-Reply-To: <9DFC984430744E4E89E3FF2B58929FF801D2B8@MAILBOX2.nottingham.edu.my> References: <9DFC984430744E4E89E3FF2B58929FF801B2A3@MAILBOX2.nottingham.edu.my> <9DFC984430744E4E89E3FF2B58929FF801D2B8@MAILBOX2.nottingham.edu.my> Message-ID: <1321945771.88876.YahooMailNeo@web126006.mail.ne1.yahoo.com> Dear Sir/Madam, I am new to using WRF and would need help in solving this problem; during the compilation of the WRFV3 version 3.3.1, I encountered these problems at the respective lines; 2487: Fatal Error: Can't open module file 'module_cu_g3.mod' for reading at (1): No such file or directory 2520: Fatal Error: Can't open module file 'module_cu_g3.mod' for reading at (1): No such file or directory 2670: Fatal Error: Can't open module file 'module_cu_g3.mod' for reading at (1): No such file or directory 2687: Fatal Error: Can't open module file 'module_cumulus_driver.mod' for reading at (1): No such file or directory 2759: Fatal Error: Can't open module file 'module_physics_init.mod' for reading at (1): No such file or directory 2787: Fatal Error: Can't open module file 'module_physics_calls.mod' for reading at (1): No such file or directory. The attached is the compile log file. 
I would really appreciate it if anyone could help me.

Thanks,
Kenobi Isima Morris
PhD Research Student
University of Nottingham Malaysia Campus
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111121/a5772554/attachment-0001.html
-------------- next part --------------
A non-text attachment was scrubbed...
Name: compile_nmm.log
Type: text/x-log
Size: 516672 bytes
Desc: compile_nmm.log
Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111121/a5772554/attachment-0001.bin

From Sean.Crowell at noaa.gov Wed Nov 23 10:51:42 2011
From: Sean.Crowell at noaa.gov (Sean Crowell)
Date: Wed, 23 Nov 2011 11:51:42 -0600
Subject: [Wrf-users] WRFDA: Negative Heights Error
Message-ID:

I've run WRF and generated a set of wrfout files, to use as a first guess for WRFDA in a 3D-Var setting. I've linked all the files as it says to do in the tutorial, but when I run da_wrfvar.exe, I get the message "Negative height found" in my rsl.error files. Indeed, when I look at the wrfout files, there are negative heights. I know that this is not unusual, but I'm not sure how to correct the problem to make wrfvar work correctly. Any ideas?

From tripathi at atmo.arizona.edu Wed Nov 23 13:33:09 2011
From: tripathi at atmo.arizona.edu (Om tripathi)
Date: Wed, 23 Nov 2011 13:33:09 -0700
Subject: [Wrf-users] Re-Calculating precipitation? (Ehsan Beigi)
In-Reply-To: References: Message-ID: <4ECD5885.7070404@atmo.arizona.edu>

Ehsan,

WRF accumulates precipitation from the start of the run. So to calculate daily precipitation you need to first add RAINC and RAINNC at 00 hours of day 1 and at 00 hours of day 2, then subtract the day-1 total from the day-2 total. Remember: WRF keeps accumulating rain as the run progresses.

Om

Message: 1
Date: Mon, 21 Nov 2011 17:46:17 -0600
From: Ehsan Beigi
Subject: [Wrf-users] Calculating precipitation?
To: wrfhelp , wrf-users at ucar.edu Message-ID: Content-Type: text/plain; charset="iso-8859-1" Dear Sir/Madam, I added RAINC to RAINNC to calculate precipitation for my domain , and then for each day , i took summation of every 6hour (00 06 12 18 )to estimate the daily precipitation. do you think i am doing the correct way to calculate precipitation? i think it is wrong because the simulated precipitation has increasing trend while it should be stochastic. Bests, Ehsan Beigi wrf-users-request at ucar.edu wrote: > Send Wrf-users mailing list submissions to > wrf-users at ucar.edu > > To subscribe or unsubscribe via the World Wide Web, visit > http://mailman.ucar.edu/mailman/listinfo/wrf-users > or, via email, send a message with subject or body 'help' to > wrf-users-request at ucar.edu > > You can reach the person managing the list at > wrf-users-owner at ucar.edu > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of Wrf-users digest..." > > > Today's Topics: > > 1. Calculating precipitation? (Ehsan Beigi) > 2. Problem compiling WRF: (KENOBI ISIMA MORRIS) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Mon, 21 Nov 2011 17:46:17 -0600 > From: Ehsan Beigi > Subject: [Wrf-users] Calculating precipitation? > To: wrfhelp , wrf-users at ucar.edu > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > Dear Sir/Madam, > > I added RAINC to RAINNC to calculate precipitation for my domain , and then > for each day , i took summation of every 6hour (00 06 12 18 )to estimate > the daily precipitation. do you think i am doing the correct way to > calculate precipitation? i think it is wrong because the simulated > precipitation has increasing trend while it should be stochastic. > > > Bests, > > > Ehsan Beigi > > > > -------------- next part -------------- A non-text attachment was scrubbed... 
Name: tripathi.vcf
Type: text/x-vcard
Size: 242 bytes
Desc: not available
Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111123/6185a02c/attachment.vcf

From yamasaki at fis.ua.pt Thu Nov 24 13:54:08 2011
From: yamasaki at fis.ua.pt (Y. Yamasaki)
Date: Thu, 24 Nov 2011 20:54:08 +0000
Subject: [Wrf-users] Wrf-users Digest, Vol 87, Issue 17
In-Reply-To: References: Message-ID:

Ref: Vol. 87, Issue 17. The negative height is probably related to the model top configuration if you are preparing your own first-guess (fc) files. Regarding WRFVAR: look at the log files and you will see that the tutorial example sets the model top to 10 hPa. So, if you are preparing your own fc files, you should either set the top to 10 hPa OR change the settings by including, for example: p_top_requested = 5000 (if the top is 50 hPa).

best regards,
yyamazaki

On Thu, 24 Nov 2011 12:00:02 -0700, wrf-users-request at ucar.edu wrote: > Send Wrf-users mailing list submissions to > wrf-users at ucar.edu > > To subscribe or unsubscribe via the World Wide Web, visit > http://mailman.ucar.edu/mailman/listinfo/wrf-users > or, via email, send a message with subject or body 'help' to > wrf-users-request at ucar.edu > > You can reach the person managing the list at > wrf-users-owner at ucar.edu > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of Wrf-users digest..." > > > Today's Topics: > > 1. WRFDA: Negative Heights Error (Sean Crowell) > 2. Re-Calculating precipitation? (Ehsan Beigi) (Om tripathi) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Wed, 23 Nov 2011 11:51:42 -0600 > From: Sean Crowell > Subject: [Wrf-users] WRFDA: Negative Heights Error > To: wrf-users at ucar.edu > Message-ID: > > Content-Type: text/plain; charset=US-ASCII; format=flowed; delsp=yes > > I've run WRF and generated a set of wrfout files, to use as a first > guess for WRFDA in a 3D-Var setting.
I've linked all >the files as it > says to do in the tutorial, but when I run >da_wrfvar.exe, I get the > message "Negative height found" in my rsl.error files. > Indeed, when I > look at the wrfout files, there are negative heights. I >know that > this is not unusual, but I'm not sure how to correct the >problem to > make wrfvar work correctly. Any ideas? > > > ------------------------------ > > Message: 2 > Date: Wed, 23 Nov 2011 13:33:09 -0700 >From: Om tripathi > Subject: [Wrf-users] Re-Calculating precipitation? >(Ehsan Beigi) > To: wrf-users at ucar.edu > Message-ID: <4ECD5885.7070404 at atmo.arizona.edu> > Content-Type: text/plain; charset="iso-8859-1" > > Ehsan, > > WRF accumulates precipitation from the day of the start >of run. So to calculate daily precipitation you need to >first add RAINC and RAINNC for 00 hours of day1 and 00 >hours of day2 and subtract the value for day1 from the >value for day2 or vice versa. Remember: WRF keeps >accumulating rains as the run prograsses. > > Om > > Message: 1 > Date: Mon, 21 Nov 2011 17:46:17 -0600 >From: Ehsan Beigi > Subject: [Wrf-users] Calculating precipitation? > To: wrfhelp , wrf-users at ucar.edu > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > Dear Sir/Madam, > > I added RAINC to RAINNC to calculate precipitation for >my domain , and then > for each day , i took summation of every 6hour (00 06 12 >18 )to estimate > the daily precipitation. do you think i am doing the >correct way to > calculate precipitation? i think it is wrong because the >simulated > precipitation has increasing trend while it should be >stochastic. 
> > > Bests, > > > Ehsan Beigi > > > > wrf-users-request at ucar.edu wrote: >> Send Wrf-users mailing list submissions to >> wrf-users at ucar.edu >> >> To subscribe or unsubscribe via the World Wide Web, >>visit >> http://mailman.ucar.edu/mailman/listinfo/wrf-users >> or, via email, send a message with subject or body >>'help' to >> wrf-users-request at ucar.edu >> >> You can reach the person managing the list at >> wrf-users-owner at ucar.edu >> >> When replying, please edit your Subject line so it is >>more specific >> than "Re: Contents of Wrf-users digest..." >> >> >> Today's Topics: >> >> 1. Calculating precipitation? (Ehsan Beigi) >> 2. Problem compiling WRF: (KENOBI ISIMA MORRIS) >> >> >> ---------------------------------------------------------------------- >> >> Message: 1 >> Date: Mon, 21 Nov 2011 17:46:17 -0600 >> From: Ehsan Beigi >> Subject: [Wrf-users] Calculating precipitation? >> To: wrfhelp , wrf-users at ucar.edu >> Message-ID: >> >> Content-Type: text/plain; charset="iso-8859-1" >> >> Dear Sir/Madam, >> >> I added RAINC to RAINNC to calculate precipitation for >>my domain , and then >> for each day , i took summation of every 6hour (00 06 12 >>18 )to estimate >> the daily precipitation. do you think i am doing the >>correct way to >> calculate precipitation? i think it is wrong because the >>simulated >> precipitation has increasing trend while it should be >>stochastic. >> >> >> Bests, >> >> >> Ehsan Beigi >> >> >> >> > -------------- next part -------------- > A non-text attachment was scrubbed... 
> Name: tripathi.vcf
> Type: text/x-vcard
> Size: 242 bytes
> Desc: not available
> Url : http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111123/6185a02c/attachment-0001.vcf
>
> ------------------------------
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users
>
>
> End of Wrf-users Digest, Vol 87, Issue 17
> *****************************************

From ebeigi3 at tigers.lsu.edu Fri Nov 25 15:41:19 2011
From: ebeigi3 at tigers.lsu.edu (Ehsan Beigi)
Date: Fri, 25 Nov 2011 16:41:19 -0600
Subject: [Wrf-users] Precipitation Zero?
Message-ID:

Dear Friends,

Thanks for your previous help. According to http://www.mmm.ucar.edu/wrf/OnLineTutorial/Basics/UNGRIB/ungrib_req_fields.htm, I didn't consider precipitation as an input to WPS. Now, after running WRF, I get values of zero for RAINNC and RAINC. Should I consider the Large-scale (stable) precipitation rate (PRECL) and the Convective precipitation rate (PRECC)? What may be the cause of this error? I appreciate your help in advance.

Best Regards,
Ehsan Beigi

--
*Ehsan Beigi*
*PhD Student*
*Department of Civil and Environmental Engineering
2408 Patrick F. Taylor Hall
Louisiana State University
Baton Rouge, LA, 70803*

-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111125/627a914d/attachment.html

From Sean.Crowell at noaa.gov Tue Nov 29 12:38:28 2011
From: Sean.Crowell at noaa.gov (Sean Crowell)
Date: Tue, 29 Nov 2011 13:38:28 -0600
Subject: [Wrf-users] WRFDA negative height problem
Message-ID: <2A271497-94B7-458F-9660-48479A2C70B1@noaa.gov>

I am attempting to run WRFDA, and I have generated a background error file, a set of observations, and a first guess from WRF using NAM data as initial and boundary conditions.
When I run da_wrfvar.exe, I get the following error message in my rsl.error.*: Ntasks in X 2, ntasks in Y 3 ************************************* Parent domain ids,ide,jds,jde 1 266 1 172 ims,ime,jms,jme 127 271 -4 64 ips,ipe,jps,jpe 134 266 1 57 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 852723024 bytes allocat ed WRF TILE 1 IS 134 IE 265 JS 1 JE 57 WRF NUMBER OF TILES = 1 ---------------------------- FATAL ERROR ----------------------- Fatal error in file: da_transfer_xatowrf.inc LINE: 442 Negative height found 222 22 ht = -1.19 terr = 7.11 ---------------------------------------------------------------- Can you help me to understand what might be going wrong here? I'm certain that I've set something wrong, but I don't know what. I copied my WRF namelist over to my WRFDA namelist, so I know the settings should match. For completeness, here's the rsl.out.* file: *** VARIATIONAL ANALYSIS *** Ntasks in X 2, ntasks in Y 3 ************************************* Parent domain ids,ide,jds,jde 1 266 1 172 ims,ime,jms,jme 127 271 -4 64 ips,ipe,jps,jpe 134 266 1 57 ************************************* DYNAMICS OPTION: Eulerian Mass Coordinate alloc_space_field: domain 1, 852723024 bytes allocat ed WRF TILE 1 IS 134 IE 265 JS 1 JE 57 WRF NUMBER OF TILES = 1 Set up observations (ob) Using ASCII format observation input scan obs ascii end scan obs ascii Observation summary ob time 1 sound 79 global, 13 local synop 961 global, 114 local pilot 0 global, 0 local satem 0 global, 0 local geoamv 0 global, 0 local polaramv 0 global, 0 local airep 1603 global, 369 local gpspw 0 global, 0 local gpsrf 0 global, 0 local metar 0 global, 0 local ships 0 global, 0 local ssmi_rv 0 global, 0 local ssmi_tb 0 global, 0 local ssmt1 0 global, 0 local ssmt2 0 global, 0 local qscat 0 global, 0 local profiler 0 global, 0 local buoy 0 global, 0 local bogus 0 global, 0 local pseudo 0 global, 0 local radar 0 global, 0 local radiance 0 
global, 0 local airs retrieval 0 global, 0 local sonde_sfc 79 global, 13 local mtgirs 0 global, 0 local tamdar 0 global, 0 local Set up background errors for regional application for cv_options = 5 Using the averaged regression coefficients for unbalanced part WRF-Var dry control variables are:psi, chi_u, t_u and ps_u Humidity control variable is rh Vertical truncation for psi = 13( 99.00%) Vertical truncation for chi_u = 16( 99.00%) Vertical truncation for t_u = 25( 99.00%) Vertical truncation for rh = 23( 99.00%) >>> Save the variances and scale-lengths in outer-loop 1 Scaling: var, len, ds: 0.100000E+01 0.100000E+01 0.180000E+05 Scaling: var, len, ds: 0.100000E+01 0.100000E+01 0.180000E+05 Scaling: var, len, ds: 0.100000E+01 0.100000E+01 0.180000E+05 Scaling: var, len, ds: 0.100000E+01 0.100000E+01 0.180000E+05 Scaling: var, len, ds: 0.100000E+01 0.100000E+01 0.180000E+05 Calculate innovation vector(iv) Minimize cost function using CG method jo_sound 0.262194959846E+04 jo%sound_u 0.149525238198E+04 jo%sound_v 0.756634251656E+03 jo%sound_t 0.253029644007E+03 jo%sound_q 0.117033320818E+03 jo_sonde_sfc 0.000000000000E+00 jo%sonde_sfc_u 0.000000000000E+00 jo%sonde_sfc_v 0.000000000000E+00 jo%sonde_sfc_t 0.000000000000E+00 jo%sonde_sfc_p 0.000000000000E+00 jo%sonde_sfc_q 0.000000000000E+00 jo_synop 0.636955314734E+08 jo%synop_u 0.491152971520E+03 jo%synop_v 0.304559122743E+04 jo%synop_t 0.829944173431E+07 jo%synop_p 0.553925320036E+08 jo%synop_q 0.209912537612E+02 jo%total 0.637064792883E+08 jo_sound 0.262194959846E+04 jo_sonde_sfc 0.000000000000E+00 jo_geoamv 0.000000000000E+00 jo_polaramv 0.000000000000E+00 jo_synop 0.636955314734E+08 jo_satem 0.000000000000E+00 jo_pilot 0.000000000000E+00 jo_airep 0.832586532811E+04 jo_metar 0.000000000000E+00 jo_ships 0.000000000000E+00 jo_gpspw 0.000000000000E+00 jo_ssmi_tb 0.000000000000E+00 jo_ssmi_rv 0.000000000000E+00 jo_ssmt1 0.000000000000E+00 jo_ssmt2 0.000000000000E+00 jo_pseudo 0.000000000000E+00 jo_qscat 
0.000000000000E+00 jo_profiler 0.000000000000E+00 jo_buoy 0.000000000000E+00 jo_radar 0.000000000000E+00 jo_gpsref 0.000000000000E+00 jo_bogus 0.000000000000E+00 jo_radiance 0.000000000000E+00 jo_airsr 0.000000000000E+00 jo_mtgirs 0.000000000000E+00 jo_tamdar 0.000000000000E+00 jo_tamdar_sfc 0.000000000000E+00 Starting outer iteration : 1 Starting cost function: 4.25254199D+08, Gradient= 1.98261367D+05 For this outer iteration gradient target is: 1.98261367D+03 ---------------------------------------------------------- Iter Cost Function Gradient Step 1 1.71179200D+08 8.41493011D+04 1.29275354D-02 2 9.54624377D+07 4.47172535D+04 2.13855787D-02 3 7.16577641D+07 3.08483006D+04 2.38090454D-02 4 5.95942218D+07 1.72014545D+04 2.53537591D-02 5 5.47628019D+07 1.56966639D+04 3.26568609D-02 6 5.16598630D+07 9.96024256D+03 2.51876992D-02 7 4.95550876D+07 8.22891171D+03 4.24322367D-02 8 4.82312735D+07 6.35948495D+03 3.90995895D-02 9 4.73919322D+07 5.09046293D+03 4.15073184D-02 10 4.69090645D+07 4.20757477D+03 3.72686462D-02 11 4.66196330D+07 3.79322360D+03 3.26973125D-02 12 4.64207492D+07 2.83758913D+03 2.76447402D-02 13 4.62677592D+07 2.03484654D+03 3.80009081D-02 14 4.61558173D+07 1.48366366D+03 5.40703801D-02 ---------------------------------------------------------- Inner iteration stopped after 14 iterations jo_sound 0.686012931808E+06 jo%sound_u 0.122287370327E+06 jo%sound_v 0.149678539580E+05 jo%sound_t 0.548725663831E+06 jo%sound_q 0.320436913736E+02 jo_sonde_sfc 0.000000000000E+00 jo%sonde_sfc_u 0.000000000000E+00 jo%sonde_sfc_v 0.000000000000E+00 jo%sonde_sfc_t 0.000000000000E+00 jo%sonde_sfc_p 0.000000000000E+00 jo%sonde_sfc_q 0.000000000000E+00 jo_synop 0.209394523833E+07 jo%synop_u 0.272392793587E+05 jo%synop_v 0.115704332054E+05 jo%synop_t 0.180881916304E+07 jo%synop_p 0.246310098594E+06 jo%synop_q 0.626412325805E+01 jo%total 0.329586223340E+07 jo_sound 0.686012931808E+06 jo_sonde_sfc 0.000000000000E+00 jo_geoamv 0.000000000000E+00 jo_polaramv 0.000000000000E+00 
jo_synop 0.209394523833E+07 jo_satem 0.000000000000E+00 jo_pilot 0.000000000000E+00 jo_airep 0.515904063267E+06 jo_metar 0.000000000000E+00 jo_ships 0.000000000000E+00 jo_gpspw 0.000000000000E+00 jo_ssmi_tb 0.000000000000E+00 jo_ssmi_rv 0.000000000000E+00 jo_ssmt1 0.000000000000E+00 jo_ssmt2 0.000000000000E+00 jo_pseudo 0.000000000000E+00 jo_qscat 0.000000000000E+00 jo_profiler 0.000000000000E+00 jo_buoy 0.000000000000E+00 jo_radar 0.000000000000E+00 jo_gpsref 0.000000000000E+00 jo_bogus 0.000000000000E+00 jo_radiance 0.000000000000E+00 jo_airsr 0.000000000000E+00 jo_mtgirs 0.000000000000E+00 jo_tamdar 0.000000000000E+00 jo_tamdar_sfc 0.000000000000E+00 Final: 14 iter, J= 4.61558173D+07, g= 1.48366366D+03 ---------------------------------------------------------- ---------------------------- FATAL ERROR ----------------------- Fatal error in file: da_transfer_xatowrf.inc LINE: 442 Negative height found 222 22 ht = -1.19 terr = 7.11 ---------------------------------------------------------------- taskid: 1 hostname: 4351-crowell.winstorm.nssl taskid: 1 hostname: 4351-crowell.winstorm.nssl -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111129/00f48b27/attachment-0001.html From ahmed4kernel at gmail.com Fri Dec 2 06:45:52 2011 From: ahmed4kernel at gmail.com (ahmed lasheen) Date: Fri, 2 Dec 2011 15:45:52 +0200 Subject: [Wrf-users] WRF-NMM compiled without nesting In-Reply-To: References: Message-ID: Helloi found a solution to my problem it was just the WRF_EM_NEST settting in the .cshrc waswrong . as i am working with NMM, it is now working well.thanks On Thu, Nov 24, 2011 at 1:33 PM, ahmed lasheen wrote: > Hello > i have configured WRF-NMM with basic option, and i make nest and when > i run the real_nmm.exe , then it works well with no problem but when i > run wrf.exe i got the following error. 
> -------------- FATAL CALLED ---------------
> FATAL CALLED FROM FILE:  LINE: 38
> WRF-NMM compiled without nesting; set max_dom to 1 in namelist.input
> -------------------------------------------
> Any suggestions?
>
> --
> ===============
> Ahmed Lasheen
> Egyptian Meteorological Authority (EMA)
> Cairo, Egypt
> ===============

--
===============
Ahmed Lasheen
Junior researcher at Cairo Numerical Weather Prediction Center (CNWPC)
Egyptian Meteorological Authority (EMA)
Cairo, Egypt
===============

From wxtofly at comcast.net Mon Dec 5 11:04:04 2011
From: wxtofly at comcast.net (wxtofly)
Date: Mon, 05 Dec 2011 10:04:04 -0800
Subject: [Wrf-users] WPS 3.3.1 compilation error
In-Reply-To: References: Message-ID: <4EDD0794.10209@comcast.net>

An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111205/00c517e2/attachment.html
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: configure.wrf.workeddec42011smp
Url: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111205/00c517e2/attachment.pl

From moudipascal at yahoo.fr Tue Dec 6 03:03:32 2011
From: moudipascal at yahoo.fr (moudi pascal)
Date: Tue, 6 Dec 2011 10:03:32 +0000 (GMT)
Subject: [Wrf-users] Array Division
Message-ID: <1323165812.31317.YahooMailClassic@web29006.mail.ird.yahoo.com>

Dear all,

I recently wanted to divide two WRF variables grid by grid, but I failed to do so. In fact, I want to compute the skill for forecast verification, but I am not sure how to divide two variables with the same sizes and dimensions in an NCL script.
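For reference, NCL's arithmetic operators, including `/`, act element-wise on arrays of conforming shape, so `skill = fcst / obs` already divides grid by grid. The same idea is sketched below with hypothetical Python/numpy arrays standing in for the two WRF fields (the variable names and values are illustrative only, not from the original post):

```python
import numpy as np

# Hypothetical stand-ins for two WRF fields of identical shape
# (e.g. a forecast and a reference field on the same grid).
fcst = np.array([[2.0, 4.0], [6.0, 0.0]])
obs  = np.array([[1.0, 2.0], [3.0, 0.0]])

# Grid-by-grid (element-wise) division. The where= mask skips points
# where obs is 0, leaving NaN there instead of a divide-by-zero warning.
skill = np.divide(fcst, obs, out=np.full_like(fcst, np.nan), where=obs != 0)
# skill is [[2., 2.], [2., nan]]
```

In NCL itself the equivalent zero guard would be something like `skill = where(obs.ne.0, fcst/obs, fcst@_FillValue)`, assuming both variables are floating point (integer-typed fields should be converted with `tofloat` first, since integer division truncates).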
Would someone help to fix this, please?

Regards

Pascal MOUDI IGRI
Ph.D. Student
Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP)
Department of Physics
Faculty of Science
University of Yaounde I, Cameroon
National Advanced Training School for Technical Education, Electricity Engineering, Douala
Tel: +237 75 32 58 52
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111206/616d7234/attachment.html

From saji at u-aizu.ac.jp Mon Dec 5 18:56:30 2011
From: saji at u-aizu.ac.jp (Saji Hameed)
Date: Tue, 6 Dec 2011 10:56:30 +0900
Subject: [Wrf-users] WPS 3.3.1 compilation error
In-Reply-To: <4EDD0794.10209@comcast.net>
References: <4EDD0794.10209@comcast.net>
Message-ID:

Maybe you have not used the configure script for WPS. You have to run ./configure first and choose compilers etc. It is better if the same compilers are used!

saji

On Tue, Dec 6, 2011 at 3:04 AM, wxtofly wrote:
> **
> Having finally managed to get WRFV3.3.1 to compile, I then expected an
> easy time with WPS. I used ifort and icc 12.0.3 for the NetCDF and WRF
> compiles. No success with WPS.
>
> 1) Do I have to use the same compilers for WPS as well?
> Configure defaults to gcc without icc, even when icc is defined in the
> environment.
>
> 2) Do I have to use the same compiler options for WPS that I used for
> netCDF and WRF? e.g. -static
>
> 3) Better yet, does someone have a working recipe for getting these three
> compiled using the Intel 12 compiler suite?
> (preferably under a bourne/bash shell)
>
> Thanks,
> TJ Olney
> attached is the configure.wrf that compiled successfully.
> _____________________________________________
> Environment that worked for WRF and netCDF
> source /opt/intel/composerxe-2011.3.174/bin/compilervars.sh ia32
> source /opt/intel/composerxe-2011.3.174/bin/compilervars_global.sh ia32
> export CC=icc
> export CXX=icpc
> export F77=ifort
> export FC=ifort
> export F90=ifort
> export CPP='icc -E'
> export CXXCPP='icpc -E'
> export JASPERLIB=/usr/lib
> export JASPERINC=/usr/include/jasper
> export CFLAGS='-O3 -xssse3 -ip -no-prec-div -shared'
> export CXXFLAGS='-O3 -xssse3 -ip -no-prec-div -shared'
> export FFLAGS='-O3 -xssse3 -ip -no-prec-div -shared'
> export CPPFLAGS='-O3 -xssse3 -ip -no-prec-div -shared'
> export FLIBS='-L/usr/local/lib -lnetcdff -lnetcdf -L/usr/lib/mpich2/lib -lfmpich -lmpichf90 -lmpichcxx'
> export LIBS='-L/usr/local/lib -lnetcdff -lnetcdf -L/usr/lib/mpich2/lib -lfmpich -lmpichf90 -lmpichcxx'
> export CLIBS='-L/usr/local/lib -lnetcdff -lnetcdf -L/usr/lib/mpich2/lib -lfmpich -lmpichf90 -lmpichcxx'
> export NETCDF=/usr/local
>
> _______________________________________________
> Wrf-users mailing list
> Wrf-users at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/wrf-users

--
Saji N Hameed,
ARC-ENV, Center for Advanced Information Science and Technology,
University of Aizu, Tsuruga, Ikki-machi,
Aizuwakamatsu-shi, Fukushima 965-8580, Japan

Tel: +81242 37-2736
Fax: +81242 37-2760
email: saji at u-aizu.ac.jp
url: http://www.u-aizu.ac.jp
bib: http://www.researcherid.com/rid/B-9188-2009
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111206/10bce341/attachment.html

From wxtofly at comcast.net Mon Dec 5 21:54:28 2011
From: wxtofly at comcast.net (wxtofly)
Date: Mon, 05 Dec 2011 20:54:28 -0800
Subject: [Wrf-users] WPS 3.3.1 compilation error
In-Reply-To: References: <4EDD0794.10209@comcast.net>
Message-ID: <4EDDA004.3090205@comcast.net>

An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111205/6701f568/attachment.html

From wxtofly at comcast.net Wed Dec 7 17:16:38 2011
From: wxtofly at comcast.net (wxtofly)
Date: Wed, 07 Dec 2011 16:16:38 -0800
Subject: [Wrf-users] WPS 3.3.1 OK, but no jasper no png
In-Reply-To: <1323165812.31317.YahooMailClassic@web29006.mail.ird.yahoo.com>
References: <1323165812.31317.YahooMailClassic@web29006.mail.ird.yahoo.com>
Message-ID: <4EE001E6.9020300@comcast.net>

An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111207/6ae68d05/attachment.html

From wxtofly at comcast.net Wed Dec 14 00:32:50 2011
From: wxtofly at comcast.net (wxtofly)
Date: Tue, 13 Dec 2011 23:32:50 -0800
Subject: [Wrf-users] Resolved: Re: WPS 3.3.1 OK, but no jasper no png
In-Reply-To: <20111212074214.GB22522@giotto.bmtargoss.org>
References: <1323165812.31317.YahooMailClassic@web29006.mail.ird.yahoo.com> <4EE001E6.9020300@comcast.net> <20111212074214.GB22522@giotto.bmtargoss.org>
Message-ID: <4EE85122.70302@comcast.net>

An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111213/4dd3b541/attachment.html

From bbrashers at environcorp.com Wed Dec 14 09:59:27 2011
From: bbrashers at environcorp.com (Bart Brashers)
Date: Wed, 14 Dec 2011 16:59:27 +0000
Subject: [Wrf-users] Resolved: Re: WPS 3.3.1 OK, but no jasper no png
In-Reply-To: <4EE85122.70302@comcast.net>
References: <1323165812.31317.YahooMailClassic@web29006.mail.ird.yahoo.com> <4EE001E6.9020300@comcast.net> <20111212074214.GB22522@giotto.bmtargoss.org> <4EE85122.70302@comcast.net>
Message-ID: <020B89C32F08C64FAACB3715A54952153ECFB594@wcexs1>

Have you tried the versions of jasper and libpng that are in the EPEL repository? It should be as simple as enabling EPEL and doing a

# yum install jasper-libs libpng libpng-devel

I believe I compiled WPS/WRF using these versions, on CentOS x86_64.
Bart

From: wrf-users-bounces at ucar.edu [mailto:wrf-users-bounces at ucar.edu] On Behalf Of wxtofly
Sent: Tuesday, December 13, 2011 11:33 PM
To: Hein Zelle; WRF Users; WRF DA; WRF Users
Subject: [Wrf-users] Resolved: Re: WPS 3.3.1 OK, but no jasper no png

I reinstalled all the Fedora zlib, libpng and jasper packages. I got this:

./ngl/libg2_4.a(dec_png.o): In function `dec_png_':
dec_png.c:(.text+0x148): undefined reference to `png_set_longjmp_fn'

I finally compiled and installed those three libraries (http://www.mmm.ucar.edu/wrf_tmp/WRF_OnLineTutorial/SOURCE_DATA/libs_for_wps.tar) in my WPS working directory and adjusted all the configure paths to them. I then succeeded in compiling ungrib with libpng and jasper.

It seems that the newer libpng libraries (> 1.4) are not compatible with WPS V3.3.1 as distributed. A patch by someone who knows what they are doing, along the lines of the one described at http://stackoverflow.com/questions/5190554/unresolved-external-png-set-longjmp-fn-in-libpng, would be a welcome thing.

TJ Olney

________________________________
This message contains information that may be confidential, privileged or otherwise protected by law from disclosure. It is intended for the exclusive use of the Addressee(s). Unless you are the addressee or authorized agent of the addressee, you may not review, copy, distribute or disclose to anyone the message or any information contained within. If you have received this message in error, please contact the sender by electronic reply to email at environcorp.com and immediately delete all copies of the message.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111214/03fd6934/attachment.html

From ardie_coal at yahoo.com Sun Dec 18 03:16:30 2011
From: ardie_coal at yahoo.com (Ahmad Wan)
Date: Sun, 18 Dec 2011 02:16:30 -0800 (PST)
Subject: [Wrf-users] wrf error could not find level above ground
Message-ID: <1324203390.14343.YahooMailNeo@web111208.mail.gq1.yahoo.com>

Hi,

Thank you for taking the time to read this, but I have been stuck with the same error for some time now:

....
warning: water vapor pressure exceeds total pressure
warning: water vapor pressure exceeds total pressure
warning: water vapor pressure exceeds total pressure
warning: water vapor pressure exceeds total pressure
....
ERROR: could not find level above ground

which is produced by ./real.exe. At first, I thought it was a problem with a lack of pressure levels in the input data. I have tried increasing it, to no success. Do unrealistic values produce this error? For your information, I created my own intermediate data, and successfully ran the geogrid and metgrid programs...

Thank you
(Wan Ahmad Ardie)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111218/d617ddf2/attachment.html

From sean.crowell at noaa.gov Mon Dec 19 09:51:35 2011
From: sean.crowell at noaa.gov (Sean Crowell)
Date: Mon, 19 Dec 2011 10:51:35 -0600
Subject: [Wrf-users] Pseudo-Single Obs Test in WRFDA
Message-ID: <896EB935-C4B8-4E8A-B3C9-282BCA236B96@noaa.gov>

I am using WRFDA and WRF-ARW version 3.3.1, and have generated a background error covariance file be.dat for my specific season and model grid. I am trying to determine if I need to tune, etc., because the answers that I'm getting are quite unreasonable. For this purpose, I'm running a single observation test, and seem to get reasonable values for wind observations. However, when I try to put in a temperature observation, I don't get any increment.
The difference between the first guess and the analysis is zero. Any ideas?

Sean

From scrowell at ou.edu Mon Dec 19 10:18:03 2011
From: scrowell at ou.edu (Sean Crowell)
Date: Mon, 19 Dec 2011 11:18:03 -0600
Subject: [Wrf-users] Pseudo-Single Obs Test in WRFDA
In-Reply-To: <896EB935-C4B8-4E8A-B3C9-282BCA236B96@noaa.gov>
References: <896EB935-C4B8-4E8A-B3C9-282BCA236B96@noaa.gov>
Message-ID:

I discovered the magnitude was too small, and now I am getting nonzero increments. However, the corresponding increment in U and V for a 4-8 K increment in T is huge (like 100 m/s). Not sure how to proceed.

On Dec 19, 2011, at 10:51 AM, Sean Crowell wrote:
> I am using WRFDA and WRF-ARW version 3.3.1, and have generated a
> background error covariance file be.dat for my specific season and
> model grid, and am trying to determine if I need to tune, etc,
> because the answers that I'm getting are quite unreasonable. For
> this purpose, I'm running a single observation test, and seem to get
> reasonable values for wind observations. However, when I try to put
> in a temperature observation, I don't get any increment. The
> difference between the first guess and the analysis is zero. Any
> ideas?
>
> Sean

From jpad0001 at um.edu.mt Thu Dec 29 05:02:41 2011
From: jpad0001 at um.edu.mt (Jason Padovani Ginies)
Date: Thu, 29 Dec 2011 13:02:41 +0100
Subject: [Wrf-users] Nests in WRF Portal
Message-ID:

Dear wrf-users,

I am having problems running a simulation with 4 nests. When I specify the nests in WRF Portal and move on to the namelist, the final column of values for the last nest is not filled out, and if I try filling it out myself, the simulation fails at initialization. Has anyone else encountered this problem before, or is it perhaps a limitation of WRF Portal?

Kind regards,
Jason
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111229/b3a8f6ee/attachment.html

From moudipascal at yahoo.fr Wed Dec 28 07:44:57 2011
From: moudipascal at yahoo.fr (moudi pascal)
Date: Wed, 28 Dec 2011 14:44:57 +0000 (GMT)
Subject: [Wrf-users] WRFVAR Verification packages
Message-ID: <1325083497.21353.YahooMailClassic@web132205.mail.ird.yahoo.com>

Dear sir,

I am Cameroonian and I work with WRFVAR for my ongoing Ph.D. I installed and ran WRF and WRFVAR successfully. I use GFS for initialization, with Prepbufr (ob.bufr and gpsro.bufr) and radiance data (airs.bufr, hirs.bufr ...) for the WRFVAR scheme (3D-Var). I would like to use the provided verification packages in ../var/scripts (da_run_suite_verif_obs.ksh and da_verif_obs_plot.ksh). I have read the document WRFDA_tools.pdf, but I am unable to run them successfully. Since I don't have the ob.ascii file in my directories, I would like you to help me, step by step, to fill in the files da_run_suite_verif_obs.ksh and da_verif_obs_plot.ksh. I want to run verification against observations.

Please, help me.

Pascal MOUDI IGRI
Ph.D. Student
Laboratory of Environmental Modeling and Atmospheric Physics (LEMAP)
Department of Physics
Faculty of Science
University of Yaounde I, Cameroon
National Advanced Training School for Technical Education, Electricity Engineering, Douala
Tel: +237 75 32 58 52
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111228/7950a3d8/attachment.html

From sam.hawkins at vattenfall.com Fri Dec 30 05:26:26 2011
From: sam.hawkins at vattenfall.com (sam.hawkins at vattenfall.com)
Date: Fri, 30 Dec 2011 13:26:26 +0100
Subject: [Wrf-users] Intermittent MPI problem
Message-ID: <25EBE9C717C6244A97CC1E31BF688EFB032FD1AB27@SMMABOX0744.eur.corp.vattenfall.com>

Dear WRF users,

I'm experiencing an intermittent problem running WRF V3.3.1, compiled with gcc and using mvapich2 V1.2.
Sometimes a simulation immediately aborts with an error message:

[14] Abort: Control shouldn't reach here in prototype, header 184
at line 276 in file ibv_recv.c

This error always appears in an rsl.error file other than rsl.error.0000. Sometimes, with the same settings, the simulation runs fine.

Any ideas?

Sam.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ucar.edu/pipermail/wrf-users/attachments/20111230/d4cb881a/attachment.html