[Wrf-users] Nesting, Intermediate domain & JMAX value

Remi Montroty remi.montroty at mfi.fr
Fri Oct 21 10:58:59 MDT 2011


Dear all,

I have had quite a few surprises trying to nest in version 3.3.0, NMM core.

I am using a main grid of nx=238, ny=302, and an inner (nest) grid of nx=355 
& ny=697.
Those last two values seem to satisfy the rule that each nest dimension 
must equal 1 + 3n (3 being the fixed parent_grid_ratio in the NMM core).
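For reference, the 1 + 3n rule can be checked with a quick helper of my own (a sketch, not the actual WRF source logic):

```python
def check_nest_dims(e_we, e_sn, ratio=3):
    """Check the WRF-NMM nest dimension rule: each nest dimension
    should equal 1 + n*ratio, where ratio is the fixed
    parent_grid_ratio (3 for the NMM core).
    Returns a list of violation messages (empty if all is well)."""
    problems = []
    for name, dim in (("e_we", e_we), ("e_sn", e_sn)):
        if (dim - 1) % ratio != 0:
            problems.append(f"{name}={dim} is not 1 + n*{ratio}")
    return problems

# Nest dimensions from my namelist: both satisfy 1 + 3n
print(check_nest_dims(355, 697))   # -> []
# Bumping e_sn to 698, as the error message suggests, breaks the rule:
print(check_nest_dims(355, 698))   # -> ['e_sn=698 is not 1 + n*3']
```

This is partly why I am confused: 697 obeys the 1 + 3n rule, while the 698 the error message asks for does not.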

Now when I get to the model run itself (wrf.exe, after real_nmm.exe) I get 
the following error (cf. First Log): "NESTED DOMAIN: JMAX IS EVEN, 
INCREASE e_sn IN THE namelist.input BY 1", which seems to match

dyn_nmm/NMM_NEST_UTILS1.F 46  <http://12characters.net/wrfbrowser/html_code/dyn_nmm/NMM_NEST_UTILS1.F.html#46>

If I do increase e_sn by 1 (i.e. to 698), wrf.exe crashes when trying to 
run find_ijstart_level (cf. Second Log).

Any clue? And what is this INTERMEDIATE domain mentioned in the First Log?

Does anyone have an example of a set of working namelists for a 
nested run in WRF v3.3.0?
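In case it helps to pin down my setup, the relevant &domains entries look roughly like this (an illustrative fragment built from the grid sizes above; the parent-start indices and other entries are elided, since they depend on nest placement):

```
 &domains
 max_dom           = 2,
 e_we              = 238, 355,
 e_sn              = 302, 697,
 parent_id         = 1,   1,
 parent_grid_ratio = 1,   3,
 i_parent_start    = 1,   ...,
 j_parent_start    = 1,   ...,
 /
```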

Thanks !

Remi

*First Log:*

   WRF V3.3 MODEL
   *************************************
   Parent domain
   ids,ide,jds,jde            1         238           1         302
   ims,ime,jms,jme           -4          66         234         307
   ips,ipe,jps,jpe            1          59         241         302
   *************************************
  DYNAMICS OPTION: nmm dyncore
     alloc_space_field: domain            1 ,             121094988  
bytes allocated
    med_initialdata_input: calling input_input
   FORECAST BEGINS  0 GMT  0/ 0/   0
  zeroing grid%cwm
  appear to have grid%q2 values...do not zero
  INIT:  INITIALIZED ARRAYS FOR CLEAN START
      restrt= F  nest= F
      grid%pdtop=   35592.090      grid%pt=   5000.0000
  INPUT LandUse = "USGS"
  Climatological albedo is used instead of table values
SUN-EARTH DISTANCE CALCULATION FINISHED IN SOLARD
YEAR= 2011  MONTH= 10  DAY= 21 HOUR=  0 R1=   0.9957
  INITIALIZE THREE Noah LSM RELATED TABLES
  Initializng moist(:,:,:, Qv) from q
   summing moist(:,:,:,i_m) into cwm array
   summing moist(:,:,:,i_m) into cwm array
   summing moist(:,:,:,i_m) into cwm array
   computing grid%f_ice
   computing f_rain
   *************************************
   Nesting domain
   ids,ide,jds,jde            1         355           1         697
   ims,ime,jms,jme           -4          99         547         702
   ips,ipe,jps,jpe            1          89         557         697
   INTERMEDIATE domain
   ids,ide,jds,jde           68         191          38         275
   ims,ime,jms,jme           63         109         216         280
   ips,ipe,jps,jpe           66          99         226         277
   *************************************
  d01 2011-10-21_00:00:00  alloc_space_field: domain            2 
,             372043584  bytes allocated
  -------------- FATAL CALLED ---------------
  FATAL CALLED FROM FILE: <stdin>  LINE:      54
  NESTED DOMAIN: JMAX IS EVEN, INCREASE e_sn IN THE namelist.input BY 1
  -------------------------------------------

*Second Log:*
wrf.exe:28373 terminated with signal 11 at PC=8a8fc9 SP=7fffe78b7300.  
Backtrace:
wrf.exe(find_ijstart_level_+0x109)[0x8a8fc9]
wrf.exe(nest_terrain_+0x7fb)[0x890fbb]
wrf.exe(med_nest_initial_+0x772)[0x787c72]
wrf.exe(__module_integrate_MOD_integrate+0x196)[0x4abf1e]
wrf.exe(__module_wrf_top_MOD_wrf_run+0x24)[0x480a84]
wrf.exe(MAIN__+0x3c)[0x4802dc]
wrf.exe(main+0x2a)[0x16d2bba]
/lib64/libc.so.6(__libc_start_main+0xf4)[0x3f9201d994]
wrf.exe[0x4801d9]