
Scrapyd can't find the project name

Tags:

scrapy

scrapyd

I am getting an error when I try to run an existing scrapy project on scrapyd.

I have a working scrapy project (url_finder) with a working spider used for testing purposes (test_ip_spider_1x) that simply downloads whatismyip.com.

I successfully installed scrapyd (with apt-get) and now I would like to run the spider on scrapyd. So I execute:

curl http://localhost:6800/schedule.json -d project=url_finder -d spider=test_ip_spider_1x

This returns:

{"status": "error", "message": "'url_finder'"}

This seems to suggest there is a problem with the project. However, when I execute scrapy crawl test_ip_spider_1x, everything runs fine. When I check the scrapyd log in the web interface, this is what I get:

2014-04-01 11:40:22-0400 [HTTPChannel,0,127.0.0.1] 127.0.0.1 - - [01/Apr/2014:15:40:21 +0000] "POST /schedule.json HTTP/1.1" 200 47 "-" "curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3"
2014-04-01 11:40:58-0400 [HTTPChannel,1,127.0.0.1] 127.0.0.1 - - [01/Apr/2014:15:40:57 +0000] "GET / HTTP/1.1" 200 747 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/33.0.1750.152 Safari/537.36"
2014-04-01 11:41:01-0400 [HTTPChannel,1,127.0.0.1] 127.0.0.1 - - [01/Apr/2014:15:41:00 +0000] "GET /logs/ HTTP/1.1" 200 1203 "http://localhost:6800/" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/33.0.1750.152 Safari/537.36"
2014-04-01 11:41:03-0400 [HTTPChannel,1,127.0.0.1] 127.0.0.1 - - [01/Apr/2014:15:41:02 +0000] "GET /logs/scrapyd.log HTTP/1.1" 200 36938 "http://localhost:6800/logs/" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/33.0.1750.152 Safari/537.36"
2014-04-01 11:42:02-0400 [HTTPChannel,2,127.0.0.1] Unhandled Error
    Traceback (most recent call last):
      File "/usr/local/lib/python2.7/dist-packages/twisted/web/http.py", line 1730, in allContentReceived
        req.requestReceived(command, path, version)
      File "/usr/local/lib/python2.7/dist-packages/twisted/web/http.py", line 826, in requestReceived
        self.process()
      File "/usr/local/lib/python2.7/dist-packages/twisted/web/server.py", line 189, in process
        self.render(resrc)
      File "/usr/local/lib/python2.7/dist-packages/twisted/web/server.py", line 238, in render
        body = resrc.render(self)
    --- <exception caught here> ---
      File "/usr/lib/pymodules/python2.7/scrapyd/webservice.py", line 18, in render
        return JsonResource.render(self, txrequest)
      File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/txweb.py", line 10, in render
        r = resource.Resource.render(self, txrequest)
      File "/usr/local/lib/python2.7/dist-packages/twisted/web/resource.py", line 250, in render
        return m(request)
      File "/usr/lib/pymodules/python2.7/scrapyd/webservice.py", line 37, in render_POST
        self.root.scheduler.schedule(project, spider, **args)
      File "/usr/lib/pymodules/python2.7/scrapyd/scheduler.py", line 15, in schedule
        q = self.queues[project]
    exceptions.KeyError: 'url_finder'

2014-04-01 11:42:02-0400 [HTTPChannel,2,127.0.0.1] 127.0.0.1 - - [01/Apr/2014:15:42:01 +0000] "POST /schedule.json HTTP/1.1" 200 47 "-" "curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3"

Any ideas?

gpanterov asked Dec 19 '22 16:12

1 Answer

In order to run a project on scrapyd, you must first deploy it. This isn't well explained in the online documentation (especially for first-time users). Here is one solution that worked for me:

Install scrapyd-deploy. On Ubuntu or similar you can run:

apt-get install scrapyd-deploy

In your scrapy project folder, edit scrapy.cfg and uncomment the line:

 url = http://localhost:6800/

This is your deploy target: scrapyd-deploy will deploy projects to this location. Next, check that the deploy target is visible:

scrapyd-deploy -l

This should output something similar to:

default http://localhost:6800/
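Putting those pieces together, the [deploy] section of scrapy.cfg should look roughly like this (the project name here is taken from the question; adjust it for your own project):

[deploy]
url = http://localhost:6800/
project = url_finder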

Next you can deploy the project (url_finder):

scrapyd-deploy default -p url_finder
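To confirm the deployment worked, you can ask scrapyd which projects it knows about via its listprojects.json endpoint:

curl http://localhost:6800/listprojects.json

After a successful deploy, the returned project list should include url_finder, something along the lines of:

{"status": "ok", "projects": ["url_finder"]}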

And finally run the spider:

curl http://localhost:6800/schedule.json -d project=url_finder -d spider=test_ip_spider_1x
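This time, instead of the earlier KeyError-backed error message, scrapyd should respond with a job id for the scheduled run (the id shown is a placeholder):

{"status": "ok", "jobid": "..."}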
gpanterov answered Mar 25 '23 07:03