Concatenate two big pandas.HDFStore HDF5 files

This question is somewhat related to "Concatenate a large number of HDF5 files".

I have several huge HDF5 files (~20GB compressed) that do not fit in RAM. Each of them stores several pandas.DataFrames of identical format, with indexes that do not overlap.

I'd like to concatenate them into a single HDF5 file with all DataFrames properly concatenated. One way to do this is to read each file chunk-by-chunk and append the chunks to a single output file (a rough sketch of what I mean is below), but that would take quite a lot of time.
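For reference, this is roughly the chunked approach I have in mind. It is only a minimal sketch: the file names, the table key 'df', and the chunk size are placeholders for my actual data.

from pandas import HDFStore

source_paths = ['store_1.h5', 'store_2.h5']   # placeholder file names
out_path = 'combined.h5'
chunksize = 500_000                           # rows read per chunk

with HDFStore(out_path, mode='w', complib='blosc', complevel=9) as out:
    for path in source_paths:
        with HDFStore(path, mode='r') as src:
            # select() with chunksize returns an iterator, so the full
            # table never has to fit in memory at once
            for chunk in src.select('df', chunksize=chunksize):
                out.append('df', chunk)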

Are there any special tools or methods to do this without iterating through files?

asked Mar 07 '15 by Vladimir


1 Answer

See the docs here for the odo project (formerly into). Note that if you use the older into library, the argument order is reversed (that was the motivation for the rename: to avoid confusion!).

You can basically do:

from odo import odo
odo('hdfstore://path_store_1::table_name',
    'hdfstore://path_store_new_name::table_name')

Doing multiple operations like this will append to the right-hand-side (target) store.

This will automatically do the chunked reading and writing for you.
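So to merge several source files into one target, you can just call odo once per file in a loop. A minimal sketch, assuming each store holds a table under the key table_name (the paths here are placeholders):

from odo import odo

sources = ['store_1.h5', 'store_2.h5', 'store_3.h5']  # placeholder paths
for path in sources:
    # each call appends the source table to the target (right-hand-side) store
    odo('hdfstore://%s::table_name' % path,
        'hdfstore://combined.h5::table_name')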

answered Nov 06 '22 by Jeff