Some of my programs consist of two parts. First, they read large data from files and transform it, producing Arrays, Hashes, Objects etc.; second, they analyse this data with (always different) user-defined conditions. The first part stays the same as long as the source data doesn't change, but it can take considerable time, and it runs again every time I start the program, which I usually do many times with the same source data. It would be much better to have two programs: one reads and transforms the data (once), while the other analyses it (many times).
My question is: what's the best way to store those Arrays, Hashes and Objects, so that the first program writes them to a file and the second one reads them from that file?
You can write the data structure out as a Perl 6 source file (a module) and "use" it; it will then be precompiled for you, which is potentially the fastest way to get a data structure into a Perl 6 program.
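A minimal sketch of that approach (the file and variable names are made up for this example): the first program dumps the transformed structure with .perl (called .raku in newer releases) into a small module, and the second program loads it with "use", so it gets precompiled on the first run and reused afterwards.

    # build-cache.p6 -- hypothetical first program: do the slow
    # transformation once, then write the result out as Perl 6 source.
    my %transformed = words => <alpha beta gamma>, counts => { alpha => 3 };

    spurt 'CachedData.pm6',
        'unit module CachedData;' ~ "\n"
        ~ 'my $data is export = ' ~ %transformed.perl ~ ";\n";

    # analyse.p6 -- hypothetical second program, run many times;
    # "use" precompiles CachedData.pm6 once and reuses the result.
    use lib '.';
    use CachedData;
    say $data<counts><alpha>;    # 3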
Other than that, JSON has some notable limitations compared to native Perl 6 data structures: it doesn't distinguish between integers and floating-point numbers, and it has no support for Inf, -Inf and NaN. There's also no support for "complex" keys in objects, i.e. no arrays as hash keys.
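A quick illustration using only core Perl 6 (the variable name is just for the example): all of these values have an exact native representation, while JSON offers a single number type, no Inf/NaN literals, and string-only object keys.

    my %native = whole => 1, floating => 1e0, infinite => Inf, notanumber => NaN;
    say %native.perl;    # the Perl 6 literal keeps every type intact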
One example of a serialization format that supports most of what Perl 6 can throw at it is MessagePack. There are already modules for it; I have only used Data::MessagePack so far, but here's the list: https://modules.perl6.org/search/?q=messagepack
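A rough sketch of the two programs with Data::MessagePack (assuming its Data::MessagePack::pack / Data::MessagePack::unpack interface; the file name data.msgpack is arbitrary):

    # first program (run once): transform, then serialize to a binary file
    use Data::MessagePack;
    my %transformed = words => <alpha beta gamma>, counts => { alpha => 3 };
    spurt 'data.msgpack', Data::MessagePack::pack(%transformed);

    # second program (run many times): slurp the bytes back and unpack them
    use Data::MessagePack;
    my $restored = Data::MessagePack::unpack(slurp('data.msgpack', :bin));
    say $restored<counts><alpha>;    # 3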