I have a NumPy array (arr) of shape (3997, 29) that I am using to create a dataset. The array contains both integer and float values, so I chose a reference dtype. But when I execute the code below I get this error:
"ValueError: Not a location id (Invalid object id)"
    with h5py.File("test1.h5", 'w') as f:
        grp = f.create_group('Nodes')

    with h5py.File("test1.h5", 'r+') as f:
        grp = f.require_group('Nodes')

    ref_dtype = h5py.special_dtype(ref=h5py.Reference)
    arrshape = np.shape(arr)
    dset = grp.create_dataset('Init', arrshape, dtype=ref_dtype, data=arr)
The error occurs in the last line. Here is the traceback:
        dset = f.create_dataset('Init' ,arrshape, dtype = ref_dtype , data= arr)
      File "C:\Users\rupesh.n\AppData\Local\Continuum\anaconda3\lib\site-packages\h5py\_hl\group.py", line 108, in create_dataset
        dsid = dataset.make_new_dset(self, shape, dtype, data, **kwds)
      File "C:\Users\rupesh.n\AppData\Local\Continuum\anaconda3\lib\site-packages\h5py\_hl\dataset.py", line 137, in make_new_dset
        dset_id = h5d.create(parent.id, None, tid, sid, dcpl=dcpl)
      File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
      File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
      File "h5py\h5d.pyx", line 79, in h5py.h5d.create
    ValueError: Not a location id (Invalid object id)
This error frequently occurs when someone tries to create a new dataset through a closed handle. If you are iterating, make sure you are not closing the file inside the loop, as in the sketch below. I had the same problem as the OP.
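For example, a minimal sketch of what I mean (the filename, group name, and stand-in data here are made up for illustration): closing the file at the end of each iteration invalidates the group handle, while opening the file once around the loop keeps it valid.

    import h5py
    import numpy as np

    blocks = [np.random.rand(10, 4) for _ in range(3)]  # stand-in data

    # Problematic pattern: the file (and therefore grp) is closed at the end
    # of every iteration, so reusing grp afterwards raises
    # "ValueError: Not a location id (Invalid object id)".
    # for i, block in enumerate(blocks):
    #     with h5py.File("loops.h5", "a") as f:
    #         grp = f.require_group("Nodes")
    #     grp.create_dataset("block%d" % i, data=block)  # grp is closed here

    # Safer pattern: open the file once and do all the work while it is open.
    with h5py.File("loops.h5", "a") as f:
        grp = f.require_group("Nodes")
        for i, block in enumerate(blocks):
            grp.create_dataset("block%d" % i, data=block)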
This question is a bit old, but in case anyone else ends up here with the same problem, I'll clear up the answer a little. @WilderField is correct, but to be a bit more explicit:
In the last line:

    dset = grp.create_dataset('Init', arrshape, dtype=ref_dtype, data=arr)

grp is pointing to the closed h5py.Group that was opened in:

    with h5py.File("test1.h5", 'r+') as f:
        grp = f.require_group('Nodes')

Because grp was set to point to the group inside the with... context manager, grp is only an open group within that context manager. The HDF file and all groups/datasets associated with it are closed when the context manager exits. This behaviour prevents the file from being held open by lost pointers to HDF objects.
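To see this concretely, here is a minimal sketch of the failure mode (the printed representation is indicative and may vary slightly between h5py versions):

    import h5py

    with h5py.File("test1.h5", "w") as f:
        grp = f.create_group("Nodes")

    # The with block has exited, so the file and the group handle are closed.
    print(grp)  # e.g. <Closed HDF5 group>

    # Any attempt to use the closed handle fails with the error from the question:
    # grp.create_dataset("Init", data=[1, 2, 3])
    # -> ValueError: Not a location id (Invalid object id)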
The solution is to create the h5py.Dataset inside the context manager, i.e.:

    with h5py.File("test1.h5", 'r+') as f:
        grp = f.require_group('Nodes')
        dset = grp.create_dataset('Init', arrshape, dtype=ref_dtype, data=arr)

Again, as soon as the context manager exits, dset will point to a closed h5py.Dataset, so unless you actually want to do something more with it while the file is open, it is sufficient to call grp.create_dataset(...) without assigning the return value to dset.
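Putting it together, here is a minimal end-to-end sketch. It is not the OP's exact code: for simplicity it writes the numeric array with its native float dtype rather than a reference dtype, and uses a random stand-in for arr. Everything that touches the file happens while the file is open, and the data is read back in a second with block.

    import h5py
    import numpy as np

    arr = np.random.rand(3997, 29)  # stand-in for the array in the question

    with h5py.File("test1.h5", "w") as f:
        grp = f.require_group("Nodes")
        dset = grp.create_dataset("Init", shape=arr.shape, dtype=arr.dtype, data=arr)
        print(dset.shape, dset.dtype)  # use dset here, while the file is still open

    # dset is closed now; reopen the file to read the data back.
    with h5py.File("test1.h5", "r") as f:
        print(f["Nodes/Init"][:2, :3])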