What tool or set of tools would you use for horizontally scaling Scrapyd: adding new machines to a Scrapyd cluster dynamically and running N instances per machine if required? It is not necessary for all the instances to share a common job queue, but that would be awesome.
Scrapy Cluster seems promising for the job, but I want a Scrapyd-based solution, so I'm open to other alternatives and suggestions.
I scripted my own load balancer for Scrapyd using its API and a wrapper (the python-scrapyd-api package).
from random import shuffle

from scrapyd_api.wrapper import ScrapydAPI

# `settings` is the project's settings module, expected to define
# SERVERS_URLS, DEFAULT_PROJECT and ACCEPTABLE_PENDING.


class JobLoadBalancer(object):

    @classmethod
    def get_less_occupied(
            cls,
            servers_urls=settings.SERVERS_URLS,
            project=settings.DEFAULT_PROJECT,
            acceptable=settings.ACCEPTABLE_PENDING):

        free_runner = {'num_jobs': 9999, 'client': None}
        # Shuffle the servers so ties are not always resolved in favour of the same one
        shuffle(servers_urls)

        for url in servers_urls:
            scrapyd = ScrapydAPI(target=url)
            jobs = scrapyd.list_jobs(project)
            num_jobs = len(jobs['pending'])

            if free_runner['num_jobs'] > num_jobs:
                free_runner['num_jobs'] = num_jobs
                free_runner['client'] = scrapyd

            # Optimization: stop looking as soon as a server with an acceptable
            # number of pending jobs has been found
            if free_runner['client'] and free_runner['num_jobs'] <= acceptable:
                break

        return free_runner['client']
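In practice, scheduling through the balancer just means asking it for the least-loaded client and calling schedule on it. A minimal sketch (the project and spider names below are placeholders, not part of the original code):

# Hypothetical usage: pick the least-loaded Scrapyd instance and schedule a job on it.
client = JobLoadBalancer.get_less_occupied()
if client is not None:
    # schedule() returns the job id assigned by that Scrapyd instance
    job_id = client.schedule('my_project', 'my_spider')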
Unit test:
def setUp(self):
    super(TestFactory, self).setUp()
    # Make sure these servers are actually running
    settings.SERVERS_URLS = [
        'http://localhost:6800',
        'http://localhost:6900'
    ]
    self.project = 'dummy'
    self.spider = 'dummy_spider'
    self.acceptable = 0

def test_get_less_occupied(self):
    # Add dummy jobs to the first server so the balancer picks the second one
    scrapyd = ScrapydAPI(target=settings.SERVERS_URLS[0])
    scrapyd.schedule(project=self.project, spider=self.spider)
    scrapyd.schedule(project=self.project, spider=self.spider)

    second_server_url = settings.SERVERS_URLS[1]

    scrapyd = JobLoadBalancer.get_less_occupied(
        servers_urls=settings.SERVERS_URLS,
        project=self.project,
        acceptable=self.acceptable)

    self.assertEqual(scrapyd.target, second_server_url)
This code targets an older version of Scrapyd, as it was written more than a year ago.
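As for the "N instances per machine" part of the question: the balancer only distinguishes servers by URL, so running several Scrapyd instances on one box (each listening on its own port) just means listing every instance separately in SERVERS_URLS. A sketch with hypothetical hosts and ports:

# Hypothetical layout: two Scrapyd instances on machine-a, one on machine-b.
# Each entry is treated as an independent server by JobLoadBalancer.
settings.SERVERS_URLS = [
    'http://machine-a:6800',
    'http://machine-a:6801',  # second instance on the same machine, different port
    'http://machine-b:6800',
]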