HDF5 output may fail if more than ~8k processes are involved.
Issue #238
resolved
Partitioning information is saved in HDF5 files as a vector attribute, and HDF5 attributes are normally limited to 64 KiB. Therefore, if more than ~8k processes write a file, the attribute is likely to overflow (depending on word size).
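The arithmetic behind the ~8k figure can be sketched as follows (assuming the usual 64 KiB HDF5 attribute cap and one 8-byte entry per process; the exact threshold depends on word size and attribute metadata):

```python
# Sketch: when does a per-process partition attribute hit HDF5's
# attribute size limit?  Assumes one 8-byte integer per process;
# actual overhead depends on the HDF5 datatype and metadata.

ATTRIBUTE_LIMIT = 64 * 1024   # 64 KiB, the usual HDF5 attribute cap
BYTES_PER_ENTRY = 8           # e.g. a 64-bit offset per process

def max_processes(limit=ATTRIBUTE_LIMIT, entry_size=BYTES_PER_ENTRY):
    """Largest process count whose partition vector still fits."""
    return limit // entry_size

print(max_processes())  # 8192 -- matches the ~8k in the report
```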
The fix is to store the partition data in a dataset instead of an attribute.
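A minimal sketch of the proposed change, using h5py for illustration (DOLFIN's actual writer is C++; the file path and the group/dataset names here are hypothetical):

```python
import h5py
import numpy as np

def write_partition(path, offsets):
    """Store partition offsets as a dataset rather than an attribute.

    Datasets are not subject to the 64 KiB attribute cap, so this
    scales past ~8k processes.  Names are illustrative, not DOLFIN's.
    """
    with h5py.File(path, "w") as f:
        f.create_dataset("/Mesh/partition",
                         data=np.asarray(offsets, dtype=np.int64))

# Usage sketch: one offset per process, well past the ~8k limit
# write_partition("mesh.h5", range(16384))
```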
Comments (5)
reporter @garth-wells It is an issue, because the partitioning data is stored as an "attribute", which is limited in size. It hasn't caused a problem so far, but it may.
- changed milestone to 1.6
reporter - changed status to resolved
Closing this until it becomes a problem.
- removed milestone
Removing milestone: 1.6 (automated comment)
@chris_richardson Is this still an issue?