I'm a physicist who normally deals with large amounts of numerical data generated using C programs. Typically, I store everything as columns in ASCII files, but this has led to very large files. Given that I am limited in space, this is an issue and I'd like to be a little smarter about the whole thing. So ...
Is there a better format than ASCII? Should I be using binary files, or perhaps a custom format from some library?
Should I be compressing each file individually, or the entire directory? In either case, what format should I use?
Thanks a lot!
In your shoes, I would consider the standard scientific data formats, which are much less space- and time-consuming than ASCII, but (while maybe not quite as bit-efficient as pure, machine-dependent binary formats) still offer standard, documented, portable, and fast libraries that ease reading and writing the data.
If you store data in pure binary form, the metadata is crucial to make any sense out of the data again (are these numbers single or double precision, or integers and of what length, what are the arrays' dimensions, etc, etc), and issues with archiving and retrieving paired data and metadata can, and in practice do, occasionally make perfectly good datasets unusable -- a real pity and waste.
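To illustrate the point, here is a minimal sketch (the file name is just an example) of why a raw binary dump depends entirely on external metadata: the bytes on disk say nothing about type, element count, or byte order.

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    double data[3] = {1.0, 2.5, -3.75};

    /* Raw dump: 24 bytes on disk, and nothing in the file says
       "3 doubles, IEEE 754, this endianness" -- that knowledge
       lives only in the code that wrote it. */
    FILE *f = fopen("run01.bin", "wb");   /* example file name */
    if (!f) { perror("fopen"); return EXIT_FAILURE; }
    fwrite(data, sizeof(double), 3, f);
    fclose(f);

    /* Reading it back correctly later requires remembering (or
       separately documenting) the type, the count, and the byte
       order used at write time. */
    return 0;
}
```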
CDF, in particular, is "a self-describing data format for the storage and manipulation of scalar and multidimensional data in a platform- and discipline-independent fashion" with many libraries and utilities to go with it. As alternatives, you might also consider NetCDF and HDF -- I'm less familiar with those (and with their tradeoffs in flexibility, size, and speed) but, seeing how widely they're used by scientists in many fields, I suspect any of the three formats could give you very acceptable results.
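As a rough sketch of what the self-describing approach looks like from C, here is one way to write a column of doubles with the NetCDF C library; the file name, variable name, and units attribute are just examples, not anything prescribed by the format:

```c
#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>

#define CHECK(e) do { int _s = (e); if (_s != NC_NOERR) { \
    fprintf(stderr, "NetCDF error: %s\n", nc_strerror(_s)); \
    exit(EXIT_FAILURE); } } while (0)

int main(void)
{
    enum { N = 1000 };                  /* example column length */
    double energy[N];
    for (int i = 0; i < N; i++)
        energy[i] = 0.5 * i;            /* placeholder data */

    int ncid, dimid, varid;

    /* Create the file and describe the data: one dimension, one variable,
       plus a units attribute -- the metadata travels with the data. */
    CHECK(nc_create("results.nc", NC_CLOBBER, &ncid));     /* example file name */
    CHECK(nc_def_dim(ncid, "sample", N, &dimid));
    CHECK(nc_def_var(ncid, "energy", NC_DOUBLE, 1, &dimid, &varid));
    CHECK(nc_put_att_text(ncid, varid, "units", 2, "eV"));
    CHECK(nc_enddef(ncid));

    /* Write the values in the library's portable binary representation. */
    CHECK(nc_put_var_double(ncid, varid, energy));
    CHECK(nc_close(ncid));

    return 0;
}
```

The resulting file is compact binary, yet any NetCDF-aware tool (ncdump, Python, MATLAB, etc.) can list its dimensions, variables, and attributes without you having to remember how it was written.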
If you need the files for a long time -- say, they are important experimental data that back up your results -- don't use raw binary formats. You may not be able to read them when your architecture changes, which is dangerous. Stick to text (yes, ASCII) files.
Choose a compression format that fits your needs. Is compression time an issue? Usually not, but check that for yourself. Is decompression time an issue? Usually yes, if you want to do data analysis on the files. Under these conditions I'd go for bzip2. It is quite common nowadays, well tested, and foolproof. I'd compress files individually, since the larger your file, the larger the probability of loss (bits flip, etc.).
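If you'd rather compress from inside your C code than run bzip2 on each file afterward, libbzip2's high-level interface can write the usual ASCII columns straight into a .bz2 file. A minimal sketch, with the file name and column layout as examples only:

```c
#include <stdio.h>
#include <stdlib.h>
#include <bzlib.h>

int main(void)
{
    /* One compressed file per data file, as suggested above. */
    BZFILE *bzf = BZ2_bzopen("run01.dat.bz2", "wb");   /* example file name */
    if (!bzf) { fprintf(stderr, "BZ2_bzopen failed\n"); return EXIT_FAILURE; }

    char line[64];
    for (int i = 0; i < 1000; i++) {
        /* Keep the familiar ASCII column layout, compress it on the way out. */
        int len = snprintf(line, sizeof line, "%d %.10e\n", i, i * 0.001);
        if (BZ2_bzwrite(bzf, line, len) != len) {
            fprintf(stderr, "BZ2_bzwrite failed\n");
            BZ2_bzclose(bzf);
            return EXIT_FAILURE;
        }
    }

    BZ2_bzclose(bzf);
    return 0;
}
```

The output can be decompressed with the ordinary bzip2 command-line tool, or read back from C with BZ2_bzread.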