We're working on a Rails project on Heroku that needs to scrape and process data each night for each user. This requires many Internet accesses per user, and we're hoping to be able to support tens of thousands of users. While there's a fair bit of parsing, calculating, and writing to databases involved, we expect that most of the task's time will be spent waiting on data from the network.
What's the best general approach to doing this task while minimizing both wallclock time and Heroku fees? Obviously either concurrency or async networking will be needed to take advantage of the time spent waiting for the network, but how should we go about it? We're thinking in terms of a database-backed queue with forked worker processes, but that may not be the best approach, and may not even be possible on Heroku.
Heroku supports Delayed Job, so I would start there: enqueue one job per user each night and let one or more worker dynos drain the queue.
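A minimal sketch of that starting point, assuming a `User` model and a hypothetical `NightlyScrapeJob` (Delayed Job serializes the struct into the `delayed_jobs` table and calls `#perform` in a worker process):

```ruby
# app/jobs/nightly_scrape_job.rb
class NightlyScrapeJob < Struct.new(:user_id)
  def perform
    user = User.find(user_id)
    # fetch this user's data from the network, parse it,
    # and write the results to the database here
  end
end

# Enqueued nightly, e.g. from a cron- or scheduler-driven rake task:
User.find_each do |user|
  Delayed::Job.enqueue NightlyScrapeJob.new(user.id)
end
```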
You'll need to play with your workers/jobs ratio to figure out the sweet spot for optimizing across db load, wallclock time, and Heroku costs.
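On the Cedar stack, that tuning amounts to declaring a worker process and scaling its dyno count up and down while you measure; a sketch:

```
# Procfile: declare a process that runs the standard Delayed Job worker task
worker: bundle exec rake jobs:work
```

Then adjust the worker count as you experiment, e.g. `heroku ps:scale worker=4`, and watch queue depth, database load, and your bill.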
If you're finding that each job spends too much time sitting around waiting on the network, take a look at EventMachine. Jobs are just Ruby code, so you can play whatever parallelization tricks you want here; Heroku shouldn't limit you in any way.
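For example, here's a rough sketch of overlapping the network waits inside a single job with EventMachine, assuming the em-http-request gem (the `fetch_concurrently` helper and URL list are hypothetical):

```ruby
require 'eventmachine'
require 'em-http-request'

# Fire all requests at once and collect the bodies when every one finishes.
def fetch_concurrently(urls)
  bodies = {}
  EventMachine.run do
    multi = EventMachine::MultiRequest.new
    urls.each_with_index do |url, i|
      multi.add(i, EventMachine::HttpRequest.new(url).get)
    end
    multi.callback do
      # successful requests land under :callback; failures under :errback
      multi.responses[:callback].each { |name, http| bodies[name] = http.response }
      EventMachine.stop
    end
  end
  bodies
end
```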
This setup would be a pretty good baseline to get to, as it shouldn't take very long to spin up, and you'll probably learn a bit about your workload from it.
You might find out that 1 job/user doesn't make sense, and that you need n jobs per user (one job per property or something). Without knowing your exact use case it's hard to say up front, which is why I'm assuming a 1-1 mapping.
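If you do end up splitting, the change is mostly at the enqueue site; a sketch assuming a hypothetical `PropertyScrapeJob` and a `properties` association:

```ruby
user.properties.each do |property|
  Delayed::Job.enqueue PropertyScrapeJob.new(user.id, property.id)
end
```

Finer-grained jobs also retry more cheaply, since one flaky request only fails one small job instead of the whole user's run.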
I should also point out that the newer Heroku stack supports queueing systems other than Delayed Job.