[mpas-developers] proposed fix in src/registry/gen_inc.c

Michael Duda duda at ucar.edu
Thu Jul 22 10:46:11 MDT 2010


Hi, Todd.

Thanks for the review. As you've seen, the proposed change only
initializes fields in the grid/mesh type to zero, while leaving
fields in the state type untouched. I had also thought it might be
good practice to initialize all of the fields allocated by the
software infrastructure, but decided that this might be a topic
for further discussion: For example, should we initialize state
arrays to zero or to NaN? Initializing to the latter should make
it obvious when we accidentally use memory locations that had not
been filled in by a ghost-cell update or that had used the garbage
cell (nCells+1) as a dependency in their computation. To give a
supporting example, we learned that we hadn't initialized the omega
field in the hydrostatic core only after we initialized memory to
NaNs and tried to make a run. Perhaps we could have both zero and
NaN as a compile-time option for the infrastructure code, though?
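To make the idea concrete, here is a rough sketch of what a
compile-time zero/NaN switch might look like in the generated
allocation code, using the Fortran 2003 ieee_arithmetic intrinsic
module. The macro name INIT_FIELDS_TO_NAN and the field names are
purely illustrative, not anything currently in gen_inc.c:

```fortran
! Hypothetical sketch only: choose the fill value for newly
! allocated fields at compile time. ieee_value with ieee_quiet_nan
! yields a quiet NaN that propagates through any arithmetic that
! touches an element we forgot to fill in.
use ieee_arithmetic, only : ieee_value, ieee_quiet_nan

real, dimension(:), pointer :: array

allocate(array(nCells + 1))
#ifdef INIT_FIELDS_TO_NAN
array = ieee_value(array, ieee_quiet_nan)  ! debug builds: poison memory
#else
array = 0.0                                ! production builds: zero-fill
#endif
```

With a scheme like this, a debug build would surface uninitialized
reads as NaNs in the output, while production builds keep the
cheaper (and reproducible) zero fill.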

For now, unless anyone else objects or would like to discuss the
initialization of grid type fields further, I'll go ahead and
commit the updated version of gen_inc.c to the repository trunk.

Cheers,
Michael


On Thu, Jul 22, 2010 at 12:19:48AM -0600, Todd D. Ringler wrote:
> 
> Hi Michael,
> 
> I think that the fix is appropriate.
> 
> We might want to consider making an init after every allocation a standard
> practice.
> 
> Cheers,
> Todd
> 
> 
> > Hi, All.
> >
> > Jeff Painter at LLNL has found a problem related to the use of
> > uninitialized grid metadata fields in the hydrostatic MPAS core, and
> > his fix involves the shared registry code src/registry/gen_inc.c.
> > Since the bug fix is to common code, and since other cores might
> > be affected by similar bugs, I'd like to propose the changes in
> > the attached gen_inc.c (search for "initialize field") for
> > inclusion in the trunk.
> >
> > An example of where we would be caught by uninitialized fields is
> > given in the loop below, where either cell1 or cell2 may equal
> > nCells+1 when iEdge is at the boundary of a block, and the
> > nCells+1 element of nEdgesOnCell could take on any value:
> >
> >    do iEdge=1,nEdges
> >       cell1 = cellsOnEdge(1,iEdge)
> >       cell2 = cellsOnEdge(2,iEdge)
> >
> >       do k=1,grid % nVertLevels
> >
> >          d2fdx2_cell1 = deriv_two(1,1,iEdge) * theta(k,cell1)
> >          d2fdx2_cell2 = deriv_two(1,2,iEdge) * theta(k,cell2)
> >          do i=1, grid % nEdgesOnCell % array (cell1)
> >             d2fdx2_cell1 = d2fdx2_cell1 + deriv_two(i+1,1,iEdge) * &
> >                            theta(k,grid % CellsOnCell % array (i,cell1))
> >          end do
> >
> >          ...
> >
> >       end do
> >    end do
> >
> > Although there are other ways that we could handle the
> > uninitialized nCells+1/nEdges+1/nVertices elements of grid
> > indexing arrays, Jeff's fix is simple and adds only a very small
> > overhead during model startup. The change to gen_inc.c adds lines
> > like those marked below in src/inc/grid_meta_allocs.inc:
> >
> >       allocate(g % latCell)
> >       allocate(g % latCell % ioinfo)
> >       allocate(g % latCell % array(nCells + 1))
> >       g % latCell % array = 0                   <- additional code
> >       g % latCell % ioinfo % input = .true.
> >       g % latCell % ioinfo % restart = .true.
> >       g % latCell % ioinfo % output = .true.
> >
> >       allocate(g % lonCell)
> >       allocate(g % lonCell % ioinfo)
> >       allocate(g % lonCell % array(nCells + 1))
> >       g % lonCell % array = 0                   <- additional code
> >       g % lonCell % ioinfo % input = .true.
> >       g % lonCell % ioinfo % restart = .true.
> >       g % lonCell % ioinfo % output = .true.
> >
> >
> > If anyone has any thoughts or concerns, I'd be glad to hear them.
> > Otherwise, unless there are any objections, I'd like to commit the
> > attached version of gen_inc.c to the repository trunk.
> >
> > Cheers,
> > Michael
> > _______________________________________________
> > mpas-developers mailing list
> > mpas-developers at mailman.ucar.edu
> > http://mailman.ucar.edu/mailman/listinfo/mpas-developers
> >
> 

