Is there a way in julia to list fields (including structure, group, dimension) of a hdf5 file without loading the dataset? I did not find something similar to h5ls -r -f in the HDF5 package. Thanks.
Here is a very rough gist, h5_get_structure.jl, for HDF5 file structure mapping that gives the following output (a minimal sketch of such a recursive walker is shown after the output):
/Users/TM/Temp/test.h5: HDF5.HDF5File (length 10)
 AcquisitionLog: HDF5.HDF5Group (length 0)
 AddTraces: HDF5.HDF5Group (length 5)
    PTR-Instrument: HDF5.HDF5Group (length 0)
    PTR-Misc: HDF5.HDF5Group (length 0)
    PTR-Reaction: HDF5.HDF5Group (length 0)
    TOFSupply: HDF5.HDF5Group (length 0)
    TofSupply: HDF5.HDF5Group (length 2)
       TwData: HDF5Dataset () 
       TwInfo: HDF5Dataset (28, 2) 
 FullSpectra: HDF5.HDF5Group (length 3)
    MassAxis: HDF5Dataset (100239,) 
    SumSpectrum: HDF5Dataset (100239, 1) 
    TofData: HDF5Dataset (100239, 1, 1, 6000) 
 PTR-Concentration: HDF5.HDF5Group (length 0)
 PTR-Peaktable: HDF5.HDF5Group (length 2)
    Data: HDF5Dataset (317, 8) Float64
    Info: HDF5Dataset (317,) 
 PTR-PrimaryIonSettings: HDF5.HDF5Group (length 0)
 PTR-Transmission: HDF5.HDF5Group (length 0)
 PeakData: HDF5.HDF5Group (length 2)
    PeakData: HDF5Dataset (317, 1, 1, 6000) 
    PeakTable: HDF5Dataset (4, 317) Float64
 RawData: HDF5.HDF5Group (length 1)
    HPTDC: HDF5.HDF5Group (length 0)
 TimingData: HDF5.HDF5Group (length 1)
    BufTimes: HDF5Dataset (6000,) 
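The gist itself is not reproduced here, but the idea is a recursive walk over groups that only touches metadata. Below is a minimal sketch of such a walker (my own illustration, not the gist): it assumes a recent HDF5.jl where keys, HDF5.Group, and HDF5.Dataset are available (older releases use names, HDF5Group, and HDF5Dataset instead), and the function name print_structure and the file name test.h5 are placeholders.
using HDF5

# Recursively print groups and datasets; size/eltype read only metadata,
# so no dataset contents are loaded.
function print_structure(obj, indent::Int = 0)
    pad = " "^indent
    for name in keys(obj)
        child = obj[name]
        if child isa HDF5.Group
            println(pad, name, ": Group (length ", length(keys(child)), ")")
            print_structure(child, indent + 4)   # recurse into subgroup
        elseif child isa HDF5.Dataset
            println(pad, name, ": Dataset ", size(child), " ", eltype(child))
        end
        close(child)
    end
end

h5open("test.h5", "r") do f   # replace with your file path
    print_structure(f)
end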
Not 100% what you're asking for, but what information is missing when using print or names?
e.g.
# make an HDF5 file & add two datasets
using HDF5

A = Vector{Int}(1:10)
h5write("bar.h5", "foo", A)
h5write("bar.h5", "2foo", 2A)
i = h5open("bar.h5", "r")  # open it read-only
names(i)  # `keys(i)` in recent HDF5.jl versions
# 2-element Array{String,1}:
#  "2foo"
#  "foo"
[println(a) for a in i]
# HDF5 dataset: /2foo (file: bar.h5)
# HDF5 dataset: /foo (file: bar.h5)
## Run h5ls
;h5ls "bar.h5"
# 2foo                     Dataset {10}
# foo                      Dataset {10}
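As for the dimensions part of the question: calling size and eltype on a dataset handle only reads file metadata, so nothing is loaded into memory. A small sketch, reusing the bar.h5 file created above:
using HDF5

h5open("bar.h5", "r") do f
    dset = f["foo"]         # just a handle; the data itself is not read
    println(size(dset))     # dimensions from metadata: (10,)
    println(eltype(dset))   # stored element type
end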