Loading a large HDF5 file consumes all available memory

Issue #10 resolved
Hananel Hazan created an issue

I am trying to load an HDF5 file that is about 800 MB in size.

Is it possible to implement a load function that could load a specific chunk of the data? See, for example, the Mathematica help on Import: Import["ExampleData/rose.gif", {"Data", 100, 100}]

Or simply import from position X to position Y?

It would be much appreciated, because I need to read from huge files (around 2 GB).

Thanks!

Comments (3)

  1. Barry Wardell

    I have added support for specifying a hyperslab of a dataset to read. The syntax for doing so mimics Mathematica's Part[], for example:

    ImportHDF5["file.h5", {"Datasets", {"dataset", 1;;100;;2, 4;;30;;3}}]
    

    It might be worth thinking a bit more about whether this is the best interface, or if there would be a better one.

  2. Hananel Hazan reporter

    Thanks, that worked. What do you mean by whether this is the best interface? Is there another interface?

  3. Barry Wardell

    Thanks for confirming it works.

    With regards to the interface, I meant that the syntax was just the simplest thing I could think of. It might be that there is something better. The current interface is quite similar to the example you pointed out from the help on Import, though, so maybe it is the best choice.

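For readers who hit the same memory problem outside of Mathematica: the same hyperslab-style partial read can be done in Python with h5py, which only pulls the selected slab from disk rather than the whole dataset. This is a minimal sketch with made-up file and dataset names; note that h5py slicing is 0-based and end-exclusive, while Mathematica's `1;;100;;2` spans are 1-based and inclusive.

```python
import numpy as np
import h5py

# Create a small demo file (a stand-in for the large file in the issue).
with h5py.File("file.h5", "w") as f:
    f.create_dataset("dataset", data=np.arange(200 * 40).reshape(200, 40))

# Read only a hyperslab, roughly analogous to
#   ImportHDF5["file.h5", {"Datasets", {"dataset", 1;;100;;2, 4;;30;;3}}]
# The 1-based inclusive span 1;;100;;2 maps to the 0-based slice 0:100:2,
# and 4;;30;;3 maps to 3:30:3.
with h5py.File("file.h5", "r") as f:
    chunk = f["dataset"][0:100:2, 3:30:3]  # only this slab is read from disk

print(chunk.shape)  # (50, 9)
```

Because the slicing happens on the h5py Dataset object (not on an in-memory array), memory usage is bounded by the size of the requested slab, not the size of the file.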