When I call
cmdline.execute("scrapy crawl website".split())
print "Hello World"
the script stops inside cmdline.execute and never runs the rest of the script, so "Hello World" is not printed. How do I fix this?
If you take a look at the execute function in Scrapy's cmdline.py, you'll see that its final line is:
sys.exit(cmd.exitcode)
There is really no way around this sys.exit call if you invoke the execute function directly, at least not without changing it. Monkey-patching is one option, albeit not a good one. A better option is to avoid calling execute entirely and instead use a custom function like the one below:
from twisted.internet import reactor
from scrapy import log, signals
from scrapy.crawler import Crawler as ScrapyCrawler
from scrapy.utils.project import get_project_settings
from scrapy.xlib.pydispatch import dispatcher

def scrapy_crawl(name):
    # Stop the Twisted reactor once the spider has finished,
    # so that reactor.run() below returns instead of blocking forever.
    def stop_reactor():
        reactor.stop()

    dispatcher.connect(stop_reactor, signal=signals.spider_closed)

    # Build a crawler from the project settings, look the spider
    # up by name, and start the crawl.
    scrapy_settings = get_project_settings()
    crawler = ScrapyCrawler(scrapy_settings)
    crawler.configure()
    spider = crawler.spiders.create(name)
    crawler.crawl(spider)
    crawler.start()
    log.start()
    reactor.run()  # blocks until stop_reactor fires
And you can call it like this:
scrapy_crawl("your_crawler_name")
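Note that the function above relies on APIs from older Scrapy releases; scrapy.log and scrapy.xlib.pydispatch have since been removed. On current Scrapy versions, a sketch along the same lines (it likewise avoids the sys.exit inside cmdline.execute) can use CrawlerProcess, which manages the reactor for you. This assumes the script runs inside your Scrapy project so that get_project_settings can locate your spiders:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

def scrapy_crawl(name):
    # CrawlerProcess starts and stops the Twisted reactor itself
    # and returns to the caller rather than calling sys.exit.
    process = CrawlerProcess(get_project_settings())
    process.crawl(name)  # look the spider up by its name
    process.start()      # blocks until the crawl finishes

scrapy_crawl("your_crawler_name")
print("Hello World")  # now reached once the crawl completes

One caveat: the Twisted reactor cannot be restarted, so process.start() can only be called once per Python process.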