Writing array of tuples fails #1076
Have you considered JLD or JLD2, which focus on serializing Julia types to HDF5? How should the tuple be represented in the HDF5 file?
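For reference, a minimal sketch of the JLD2 route suggested above; the file name and dataset name are illustrative:

```julia
using JLD2

tups = [(5, 6), (5, 6), (5, 6)]
jldopen("tuples.jld2", "w") do f
    f["tups"] = tups              # JLD2 serializes Julia tuples natively
end
loaded = jldopen("tuples.jld2", "r") do f
    f["tups"]                     # reads back a Vector{Tuple{Int64, Int64}}
end
```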
So, HDF5's type model doesn't support tuples. But for this particular case (a homogeneous ntuple) I'd expect it to encode ntuples as statically sized arrays.
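For illustration, a minimal sketch of what that fixed-size-array encoding looks like at the HDF5 type level, using the internal `HDF5.hdf5_type_id` helper that also appears in the overload further down; the manual type-id handling here is an assumption about usage, not an official API:

```julia
using HDF5
import HDF5.hdf5_type_id

# An NTuple{2,Int64} such as (5, 6) could be stored as a
# one-dimensional, fixed-size HDF5 array type of two Int64s.
arr_id = HDF5.API.h5t_array_create(hdf5_type_id(Int64), 1, [2])
HDF5.API.h5t_close(arr_id)  # manually created type ids must be released
```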
However, it throws errors both with StaticArrays and with arrays of arrays:

```julia
using StaticArrays, HDF5

ntup = (5, 6)        # homogeneous ntuple
vec = [5, 6]
svec = SVector(5, 6)
h5open("test.h5", "w") do h
    write_dataset(h, "ntup", [ntup])  # works fine
    write_dataset(h, "vec", [vec])    # errors
    write_dataset(h, "svec", [svec])  # errors
end
```
The clearest path for me would be the …

We might be able to support …
Thanks, now I can write ntuples. Is there a similar way to declare the target type when reading back from HDF5?

```julia
using HDF5
import HDF5.hdf5_type_id

# Encode a homogeneous NTuple as a fixed-size HDF5 array type
hdf5_type_id(::Type{NTuple{N,T}}) where {N,T} = HDF5.API.h5t_array_create(hdf5_type_id(T), 1, [N])

tup = (5, 6)
h5open("test.h5", "w") do h
    write_dataset(h, "tup", [tup, tup, tup])
end

# reads back a vector of vectors rather than a vector of tuples
d = h5open("test.h5", "r") do h
    read_dataset(h, "tup")
end
```
You can do this:

```julia
julia> out = Vector{typeof(tup)}(undef, 3)  # preallocate; contents are uninitialized garbage
3-element Vector{Tuple{Int64, Int64}}:
 (7277816999743324160, 7205759405420183552)
 (7205759405386629120, 8358680910027030528)
 (8286623315989102592, 8358680910094139392)

julia> # reads back a vector of tuples
       d = h5open("test.h5", "r") do h
           read_dataset(h["tup"], datatype(tup), out)
       end

julia> out
3-element Vector{Tuple{Int64, Int64}}:
 (5, 6)
 (5, 6)
 (5, 6)
```
This example works with named tuples but fails with ordinary tuples, throwing an error.
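For comparison, a minimal sketch of the named-tuple variant reported to work; the field names and file name are illustrative, and it assumes HDF5.jl's compound-datatype mapping for NamedTuples:

```julia
using HDF5

nt = (a = 5, b = 6)  # NamedTuple analogue of tup = (5, 6)
h5open("test_nt.h5", "w") do h
    write_dataset(h, "nt", [nt, nt, nt])  # assumed to map to an HDF5 compound type
end

out = Vector{typeof(nt)}(undef, 3)
h5open("test_nt.h5", "r") do h
    read_dataset(h["nt"], datatype(nt), out)
end
out  # expected: three copies of (a = 5, b = 6)
```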