TimeSeriesHDF5() fails for more than 8185 steps
When writing out data using TimeSeriesHDF5(), I inexplicably get error messages from HDF5 and the file ends up corrupted. On my machine, the code
from dolfin import *
mesh = UnitSquareMesh(10, 10, 'crossed')
V = FunctionSpace(mesh, 'CG', 1)
f = Expression('sin(pi*x[0]) * sin(pi*x[1])')
u = project(f, V)
u_file = TimeSeriesHDF5('/tmp/lossless_u')
t = 0.0
dt = 1.0e-2
u_file.store(mesh, t)
for k in range(8185):
    u_file.store(u.vector(), t)
    t += dt
gives
HDF5-DIAG: Error detected in HDF5 (1.8.11) MPI-process 0:
  #000: ../../../src/H5Dio.c line 234 in H5Dwrite(): can't prepare for writing data
    major: Dataset
    minor: Write failed
  #001: ../../../src/H5Dio.c line 266 in H5D__pre_write(): not a dataset
    major: Invalid arguments to routine
    minor: Inappropriate type
[...]
Curiously, I found the magic number 8185 to be the limit on two different machines, and it is independent of the mesh size.
Another, related issue is that HDF5 errors merely get printed to stderr; no actual exception is raised.
Comments (10)
-
reporter - edited description
-
Apparently there is a 64k limit on attribute size in HDF5 by default. Since the timesteps are stored as a vector<double> attribute, I guess that is the limit you are hitting: 8185 doubles occupy 8185 * 8 = 65480 bytes, which together with the attribute's metadata presumably just exceeds the 65536-byte cap. It seems there is some way of enabling larger attributes... http://www.hdfgroup.org/HDF5/doc/UG/13_Attributes.html
-
reporter So it is just the vector of time steps that creates the problem here? It certainly smells like it. The link you mention contains information on how to get past this limit by using Dense Attribute Storage. Any chance of plugging this into DOLFIN? After all, 8185 isn't absurdly large.
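To make the limit concrete, here is a minimal sketch of it outside DOLFIN, using h5py (h5py and the file names are illustrative assumptions, not part of this report): with the default, backwards-compatible file format the attribute write fails at 64 KiB, while opting into the HDF5 1.8 file format, which stores large attributes densely, lets the same write through.

import numpy as np
import h5py

# Default file format: attributes live in the object header, which is
# capped at 64 KiB, so a vector of 8192 doubles (exactly 64 KiB) fails.
with h5py.File('/tmp/attr_limit.h5', 'w') as f:
    dset = f.create_dataset('u', data=np.zeros(10))
    try:
        dset.attrs['timesteps'] = np.zeros(8192)
    except (RuntimeError, OSError) as e:
        print('attribute write failed:', e)

# Requesting the 1.8 file format enables dense attribute storage, so the
# same write succeeds.
with h5py.File('/tmp/attr_dense.h5', 'w', libver='latest') as f:
    dset = f.create_dataset('u', data=np.zeros(10))
    dset.attrs['timesteps'] = np.zeros(8192)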
-
Yes, either that or make the timesteps into a dataset of their own.
-
The latter sounds better to me.
-
It probably requires an extension to the low-level HDF5 interface to allow datasets which can expand ("unlimited dimension"). Another approach would be to tag each individual dataset of the TimeSeries with a time value.
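For what the expandable-dataset variant could look like, a minimal sketch in h5py (again purely illustrative; the real fix has to live in DOLFIN's C++ HDF5 wrapper): a chunked dataset with an unlimited dimension can be grown by one entry per stored snapshot, with no 64 KiB cap.

import numpy as np
import h5py

# Illustrative sketch: keep the timesteps in a chunked dataset with an
# unlimited first dimension instead of an object-header attribute.
with h5py.File('/tmp/timeseries_sketch.h5', 'w') as f:
    times = f.create_dataset('vector_times', shape=(0,), maxshape=(None,),
                             dtype='f8', chunks=(1024,))
    t, dt = 0.0, 1.0e-2
    for k in range(10000):      # comfortably past the old 8185 limit
        times.resize((k + 1,))  # extend the unlimited dimension...
        times[k] = t            # ...and append the new timestep
        t += dt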
-
https://bitbucket.org/fenics-project/dolfin/branch/chris/fix-issue-181
should fix the problem.
-
- changed milestone to 1.4
-
- changed status to resolved
-
- removed milestone
Removing milestone: 1.4 (automated comment)