Manage H5Pset_layout in R
micheledemeo ▴ 10
@micheledemeo-6898
Last seen 9.4 years ago

I'm working with a 4.6 Gb data.frame in R and I would like to save it in HDF5 format.
This is what I'm doing:

> print(object.size(A), units = "Gb")
4.6 Gb

> library(rhdf5)
> hd5_file <- "/home/usert/sim/sim.h5"
> h5createFile(hd5_file)
[1] TRUE

> h5write(A, hd5_file, "A")

I receive this error message:

HDF5-DIAG: Error detected in HDF5 (1.8.7) thread 0:
  #000: H5D.c line 170 in H5Dcreate2(): unable to create dataset
    major: Dataset
    minor: Unable to initialize object
  #001: H5Dint.c line 431 in H5D_create_named(): unable to create and link to dataset
    major: Dataset
    minor: Unable to initialize object
  #002: H5L.c line 1640 in H5L_link_object(): unable to create new link to object
    major: Links
    minor: Unable to initialize object
  #003: H5L.c line 1884 in H5L_create_real(): can't insert link
    major: Symbol table
    minor: Unable to insert object
  #004: H5Gtraverse.c line 905 in H5G_traverse(): internal path traversal failed
    major: Symbol table
    minor: Object not found
  #005: H5Gtraverse.c line 688 in H5G_traverse_real(): traversal operator failed
    major: Symbol table
    minor: Callback failed
  #006: H5L.c line 1687 in H5L_link_cb(): unable to create object
    major: Object header
    minor: Unable to initialize object
  #007: H5O.c line 3013 in H5O_obj_create(): unable to open object
    major: Object header
    minor: Can't open object
  #008: H5Doh.c line 295 in H5O_dset_create(): unable to create dataset
    major: Dataset
    minor: Unable to initialize object
  #009: H5Dint.c line 1035 in H5D_create(): unable to construct layout information
    major: Dataset
    minor: Unable to initialize object
  #010: H5Dchunk.c line 443 in H5D_chunk_construct(): chunk size must be < 4GB
    major: Dataset
    minor: Unable to initialize object
HDF5-DIAG: Error detected in HDF5 (1.8.7) thread 0:
  #000: H5Dio.c line 228 in H5Dwrite(): not a dataset
    major: Invalid arguments to routine
    minor: Inappropriate type
(the H5Dwrite() trace above is repeated several more times in the original output)
HDF5-DIAG: Error detected in HDF5 (1.8.7) thread 0:
  #000: H5D.c line 391 in H5Dclose(): not a dataset
    major: Invalid arguments to routine
    minor: Inappropriate type

From the "chunk size must be < 4GB" line, I suppose I should set a storage layout different from H5D_COMPACT, i.e. use a contiguous or chunked layout, as described in the official reference manual.

Is there a way to work with H5Pset_layout from R?

 

 

Bernd Fischer ▴ 550
@bernd-fischer-5348
Last seen 7.3 years ago
Germany / Heidelberg / DKFZ

You can create the dataset manually, after creating the file and before writing the data. That way you can set the chunk size of the dataset yourself and choose one that stays below HDF5's 4 GB-per-chunk limit.

 

> h5createDataset(hd5_file, "A", dims = dim(A), storage.mode = storage.mode(A), chunk = c(dim(A)[1], 1))
> h5write(A, hd5_file, "A")
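Note that chunk must have the same length as dims; with chunk = c(dim(A)[1], 1) every chunk holds a single column of dim(A)[1] values (about 8 bytes per value for doubles), which stays far below the 4 GB-per-chunk limit that produced the original error.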

 

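Regarding the H5Pset_layout part of the question: in the HDF5 C library, calling H5Pset_chunk() on a dataset-creation property list already switches the layout to H5D_CHUNKED, so an explicit H5Pset_layout() call is normally not needed. rhdf5 also exposes low-level wrappers around this interface; the following is only a minimal sketch for a purely numeric matrix, assuming the wrappers H5Fopen, H5Pcreate, H5Pset_chunk, H5Screate_simple and H5Dcreate accept the arguments shown (names may differ slightly between rhdf5 versions):

> library(rhdf5)
> fid <- H5Fopen(hd5_file)                 # file created earlier with h5createFile()
> pid <- H5Pcreate("H5P_DATASET_CREATE")   # dataset-creation property list
> H5Pset_chunk(pid, c(dim(A)[1], 1))       # one column per chunk => chunked layout
> sid <- H5Screate_simple(dim(A))          # dataspace with the dimensions of A
> did <- H5Dcreate(fid, "A", "H5T_NATIVE_DOUBLE", sid, dcpl = pid)
> H5Dclose(did); H5Sclose(sid); H5Pclose(pid); H5Fclose(fid)
> h5write(A, hd5_file, "A")                # write into the pre-created dataset

In practice h5createDataset() essentially performs these steps internally, so the single high-level call shown above is usually all that is needed.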

Hi Bernd,

thank you very much! I'm working with a data.frame (not a matrix) whose columns have different data types, so

> storage.mode(A)
[1] "list"

and "list" is not a supported storage type.

What would you suggest for managing big data.frames (with both numeric and character columns)?
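One possible way to handle a mixed-type data.frame, assuming each individual column comfortably fits under the 4 GB chunk limit, is to store it column by column inside an HDF5 group, since h5write() accepts plain numeric and character vectors. A minimal sketch (the group name "df" and the column name "price" are just placeholders):

> library(rhdf5)
> h5createGroup(hd5_file, "df")            # one group holding the data.frame
> for (col in names(A)) h5write(A[[col]], hd5_file, paste0("df/", col))
> price <- h5read(hd5_file, "df/price")    # read back a single column later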

 
