[Go-essp-tech] repeated md5 chksum failures

Mike Berkley mike.berkley at gmail.com
Wed Apr 11 13:54:54 MDT 2012


I've been busy republishing the CCCma datasets to get the checksums to
appear. The piControl dataset listed here was finally republished
today, so you may have had trouble with it while republishing was in
progress.


On Wed, Apr 11, 2012 at 11:51, Jennifer Adams <jma at cola.iges.org> wrote:
> Hi, Everyone --
> I'm trying to download some fairly large files (~1 GB) from the piControl run
> (monthly ocean variables) and find that the checksum fails to match several
> times before finally succeeding. In some cases, it takes 10 or more retries
> before the checksum matches.
>
> The problem is not with a specific data node. Here are some of the dataset
> IDs for the troublesome downloads:
> cmip5.output1.CCCma.CanESM2.piControl.mon.ocean.Omon.r1i1p1.v20111028
> cmip5.output1.INM.inmcm4.piControl.mon.ocean.Omon.r1i1p1.v20110323
> cmip5.output1.MIROC.MIROC-ESM.piControl.mon.ocean.Omon.r1i1p1.v20110929
> cmip5.output1.MRI.MRI-CGCM3.piControl.mon.ocean.Omon.r1i1p1.v20110831
> cmip5.output1.NCAR.CCSM4.piControl.mon.ocean.Omon.r1i1p1.v20120220
> cmip5.output1.NCC.NorESM1-M.piControl.mon.ocean.Omon.r1i1p1.v20110901
> cmip5.output2.MRI.MRI-CGCM3.piControl.mon.ocean.Omon.r1i1p1.v20110831
> cmip5.output2.NCC.NorESM1-M.piControl.mon.ocean.Omon.r1i1p1.v20110901
> cmip5.output1.MPI-M.MPI-ESM-LR.piControl.mon.ocean.Omon.r1i1p1.v20120315
> cmip5.output1.MPI-M.MPI-ESM-P.piControl.mon.ocean.Omon.r1i1p1.v20120315
> cmip5.output2.MPI-M.MPI-ESM-P.piControl.mon.ocean.Omon.r1i1p1.v20111028
>
> For example, from the final two datasets in the list, here is an entry from
> the wget script:
> 'rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc'
> 'http://bmbf-ipcc-ar5.dkrz.de/thredds/fileServer/cmip5/output2/MPI-M/MPI-ESM-P/piControl/mon/ocean/Omon/r1i1p1/v20111028/rhopoto/rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc'
> 'MD5' '036aabfc10caa76a8943f967bc10ad4d'
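
[Editor's note: the 'MD5' field in a wget-script entry like the one above is what the script compares against after each download. As a minimal sketch of that check (the function name and chunked-read approach are illustrative, not the actual ESGF script's code), the verification amounts to hashing the file and comparing hex digests:]

```python
import hashlib

def md5_of_file(path, chunk_size=1 << 20):
    """Compute the MD5 of a file in 1 MiB chunks, so ~1 GB files
    never have to fit in memory at once."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage with the entry quoted above:
# expected = "036aabfc10caa76a8943f967bc10ad4d"
# path = "rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc"
# print("md5 ok" if md5_of_file(path) == expected else "md5 failed!")
```
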
>
> Here are the 21 download attempts so far today, spanning 5 hours; the "md5
> failed!" message appears in the log file after each one:
> 2012-04-11 09:19:18 (2.19 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 09:35:05 (1.13 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 09:53:26 (1009 KB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 10:05:52 (1.49 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 10:17:03 (1.61 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 10:31:14 (1.30 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 10:48:50 (1.04 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 11:01:09 (1.46 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 11:14:01 (1.40 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 11:29:46 (1.15 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 11:42:39 (1.40 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 12:01:05 (1011 KB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 12:18:25 (1.03 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 12:35:30 (1.04 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 12:49:44 (1.35 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 13:08:38 (960 KB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 13:26:11 (1.01 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 13:36:21 (1.78 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 13:50:53 (1.25 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 14:06:26 (1.15 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
> 2012-04-11 14:19:43 (1.39 MB/s) - `rhopoto_Omon_MPI-ESM-P_piControl_r1i1p1_185001-185912.nc' saved [1083611268/1083611268]
>
> This one failed 14 times before finally getting the "md5 ok" message -- it
> took 3 hours 45 minutes to retrieve this file:
> 'so_Omon_MPI-ESM-P_piControl_r1i1p1_189001-189912.nc'
> 'http://bmbf-ipcc-ar5.dkrz.de/thredds/fileServer/cmip5/output1/MPI-M/MPI-ESM-P/piControl/mon/ocean/Omon/r1i1p1/v20120315/so/so_Omon_MPI-ESM-P_piControl_r1i1p1_189001-189912.nc'
> 'MD5' '175d6c9dd3ffea30186e6bc9c7e3dee1'
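
[Editor's note: with failures this frequent, a common workaround is to wrap the download in a retry loop that stops as soon as the checksum matches and gives up after a fixed number of attempts, rather than looping indefinitely. A minimal sketch, assuming an injectable fetch function (a wrapper around wget or urllib; the names and structure here are illustrative, not the actual ESGF wget script's logic):]

```python
import hashlib

def fetch_until_md5_ok(fetch, expected_md5, max_retries=10):
    """Call fetch() until the downloaded bytes match expected_md5.

    fetch: zero-argument callable returning the file contents as bytes.
    Returns (data, attempts) on success; raises RuntimeError once
    max_retries attempts have all failed the checksum.
    """
    for attempt in range(1, max_retries + 1):
        data = fetch()
        if hashlib.md5(data).hexdigest() == expected_md5:
            return data, attempt  # "md5 ok"
    raise RuntimeError("md5 failed! after %d attempts" % max_retries)
```

A cap like max_retries matters here: without one, a file that is being republished (or is corrupt at the source) would burn bandwidth forever.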
>
> This problem is sucking up my bandwidth and my time, which are not
> unlimited. Is there any remedy?
> --Jennifer
>
>
>
>
> _______________________________________________
> GO-ESSP-TECH mailing list
> GO-ESSP-TECH at ucar.edu
> http://mailman.ucar.edu/mailman/listinfo/go-essp-tech
>
