The h5py library is a very nice wrapper around the HDF5 data storage format/library. The authors of h5py have done a superb job of aligning HDF5 data types with numpy data types, including structured arrays, which means you can store variable-length strings and jagged arrays. One of the advantages of HDF5 for large datasets is that you can load slices of the data into memory very easily and transparently - h5py and HDF5 take care of everything - compression, chunking, buffering - for you.
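For instance, slicing a chunked, compressed dataset only touches the chunks that overlap the slice - a minimal sketch (the filename, dataset name, and sizes here are arbitrary, not from any particular project):

```python
import os
import numpy as np
import h5py

# Write a dataset in compressed chunks; h5py/HDF5 handle the on-disk layout.
with h5py.File('demo.h5', 'w') as fp:
    fp.create_dataset('big', data=np.arange(1_000_000),
                      chunks=(10_000,), compression='gzip')

# Reading a slice only decompresses the chunks that overlap it,
# so the full million-element array never has to be loaded.
with h5py.File('demo.h5', 'r') as fp:
    part = fp['big'][5000:5010]

print(part.tolist())
os.remove('demo.h5')
```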
As I was playing around with h5py, one thing tripped me up. h5py has an "in memory" mode where you can create HDF5 files in memory (the driver='core' option), which is great when prototyping or writing tests, since you don't have to clean up files after you are done.
The documentation says that even an in-memory file needs a name. I found this requirement funny, because I assumed the fake file was being created in a throwaway memory buffer attached to the HDF5 object and would disappear once the object was closed or went out of scope.
So I did things like this:
import h5py

fp = h5py.File(name='f1', driver='core')  # driver='core' is the incantation for creating an in-memory HDF5 file
dset = fp.create_group('/grp1')
fp.keys()
-> [u'grp1']
Great, things work as expected
fp.close()
fp1 = h5py.File(name='f1', driver='core')
fp1.keys()
-> [u'grp1']
Whaaaa?!?!
Closing the file didn't get rid of it! I have the data still!
del fp
fp2 = h5py.File(name='f1', driver='core')
fp2.keys()
-> [u'grp1']
Whoah! Deleting the parent object doesn't get rid of it either!!!
fp3 = h5py.File(name='f2', driver='core')
fp3.keys()
-> []
This surprised me a great deal. I had assumed the name was a dummy item, perhaps there to keep some of the internal code consistent, but I never expected a persistent memory store.
Welp, it turns out this is a *memory mapped* file, and there are now actual files called f1 and f2 on the file system. To make a file truly stored in memory, you have to use an additional option:
backing_store
fp = h5py.File(name='f1', driver='core', backing_store=False)
dset = fp.create_group('/grp1')
fp.keys()
-> [u'grp1']
fp.close()
fp = h5py.File(name='f1', driver='core', backing_store=False)
fp.keys()
-> []
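You can verify the difference directly on the file system - a minimal sketch contrasting the two behaviors (it passes an explicit mode argument, which newer h5py versions require; the filenames are arbitrary):

```python
import os
import h5py

# With the default backing_store=True, closing a driver='core' file
# flushes the in-memory image to a real file on disk.
fp = h5py.File('backed.h5', mode='w', driver='core')
fp.create_group('/grp1')
fp.close()
backed_exists = os.path.exists('backed.h5')
print(backed_exists)   # the "in-memory" file is now on disk
os.remove('backed.h5')

# With backing_store=False, nothing ever touches the file system,
# and the contents vanish when the file is closed.
fp = h5py.File('unbacked.h5', mode='w', driver='core', backing_store=False)
fp.create_group('/grp1')
fp.close()
unbacked_exists = os.path.exists('unbacked.h5')
print(unbacked_exists)
```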