I have a large CSV file that contains independent items that take a fair bit of effort to process. I'd like to be able to process each line item in parallel. I found a sample piece of code for processing a CSV file on SO here:
Newbie transforming CSV files in Clojure
The code is:
(use '(clojure.contrib duck-streams str-utils)) ;;'
(with-out-writer "coords.txt"
  (doseq [line (read-lines "coords.csv")]
    (let [[x y z p] (re-split #"," line)]
      (println (str-join \space [p x y z])))))
This was able to print out data from my CSV file, which was great, but it only used one CPU. I've tried various different things, ending up with:
(pmap println (read-lines "foo"))
This works okay in interactive mode but does nothing when run from the command line. From a conversation on IRC, this is apparently because stdout isn't available by default to the threads that pmap uses.
Really what I'm looking for is a way to idiomatically apply a function to each line of the CSV file and do so in parallel. I'd also like to print some results to stdout during testing if at all possible.
Any ideas?
If you want the results in the output to be in the same order as in the input, then printing from pmap might not be a good idea. I would recommend creating a (lazy) sequence of the input lines, pmapping your processing function over that, and then printing the result of the pmap. Something like this should work:
(dorun (map println (pmap expensive-computation (read-lines "coords.csv"))))
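For example, a minimal sketch along those lines, where parse-and-process is a made-up name standing in for your expensive per-line work and the field splitting just mirrors the code from the question:

(use '(clojure.contrib duck-streams str-utils)) ;;'

(defn parse-and-process [line]
  ;; hypothetical per-line work: split the line on commas and reorder the fields
  (let [[x y z p] (re-split #"," line)]
    (str-join \space [p x y z])))

;; pmap does the per-line work on several threads; the outer map/println runs on
;; the calling thread, so output keeps the input order and printing to stdout
;; works from the command line as well.
(dorun (map println (pmap parse-and-process (read-lines "coords.csv"))))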
If you want to do this at speed, you might want to look at this article on how Alex Osborne solved the Widefinder 2 challenge posed by Tim Bray. Alex goes into all aspects of parsing, processing and collecting the results (in the Widefinder 2 case the file is a very large Apache log). The actual code used is here.
I would be extremely surprised if that code can be sped up by using more cores. I'm 99% certain that the actual speed limit here is file I/O, which should be a couple of orders of magnitude slower than any single core you can throw at the problem.
And that's aside from the overhead you'll introduce when splitting these very minimal tasks over multiple CPUs. pmap isn't exactly free.
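One common way to make those tasks less minimal (not from the original answer, just a sketch assuming the same hypothetical expensive-computation) is to batch lines with partition-all so each parallel task carries more work:

(defn process-batch [lines]
  ;; doall forces the work inside the pmap worker; a bare lazy map would defer
  ;; the computation until the printing thread realizes it
  (doall (map expensive-computation lines)))

(dorun
 (map println
      (apply concat
             (pmap process-batch
                   (partition-all 512 (read-lines "coords.csv"))))))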
If you're sure that disk IO isn't going to be a problem and you've got a lot of CSV parsing to do, simply parsing multiple files in their own threads is going to gain you a lot more for a lot less effort.
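A minimal sketch of that idea, assuming a hypothetical process-file that parses and processes one whole CSV file (the filenames below are placeholders):

(defn process-file [filename]
  ;; doall forces the whole file to be processed on the worker thread
  (doall (map expensive-computation (read-lines filename))))

;; one parallel task per file; results come back in the same order as the filenames
(def all-results
  (doall (pmap process-file ["coords-1.csv" "coords-2.csv" "coords-3.csv"])))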