HDF5 output may fail if more than ~8k processes are involved.

Issue #238 resolved
Chris Richardson created an issue

Partitioning information is saved in HDF5 files as a vector attribute, and HDF5 attributes are normally limited to 64 KiB. With one entry per process, the attribute is likely to overflow once more than ~8k processes write a file (64 KiB / 8-byte entries = 8192, depending on word size).

Should change to using a dataset to store the partition data.
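A minimal sketch of the problem and the proposed fix, using h5py (the file layout, group name, and dataset name here are hypothetical, not DOLFIN's actual output format):

```python
import numpy as np
import h5py

# Hypothetical partition vector: one 8-byte entry per MPI rank.
# 64 KiB / 8 bytes = 8192 entries, so > ~8k ranks overflows a
# compact attribute stored in the object header.
partition = np.arange(10000, dtype=np.int64)

with h5py.File("mesh.h5", "w") as f:
    grp = f.create_group("Mesh")

    # Writing the vector as an attribute hits the 64 KiB
    # object-header limit with the default file format:
    try:
        grp.attrs["partition"] = partition
    except (RuntimeError, OSError):
        pass  # "object header message is too large"

    # Writing it as a dataset has no such size limit:
    grp.create_dataset("partition", data=partition)

with h5py.File("mesh.h5", "r") as f:
    print(f["Mesh/partition"].shape)
```

Datasets are stored outside the object header (and can be chunked), so the per-process partition vector scales to any process count.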

Comments (5)

  1. Chris Richardson reporter

    @garth-wells It is an issue, because the partitioning data is stored as an "attribute", which is limited in size. It hasn't caused a problem so far, but it may.
