Scrapy cannot find spider

Tags: python, scrapy

I am following the Scrapy tutorial from the Scrapy documentation. This is what my current directory looks like:

.
├── scrapy.cfg
└── tutorial
    ├── __init__.py
    ├── __init__.pyc
    ├── items.py
    ├── pipelines.py
    ├── settings.py
    ├── settings.pyc
    └── spiders
        ├── __init__.py
        ├── __init__.pyc
        └── dmoz_spider

The dmoz_spider.py file is the same as the one described in the Scrapy tutorial page.

import scrapy

class DmozSpider(scrapy.Spider):
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/"
    ]

    def parse(self, response):
        filename = response.url.split("/")[-2] + '.html'
        with open(filename, 'wb') as f:
            f.write(response.body)

Then I run this command from the current directory:

scrapy crawl dmoz

But I get this error message:

2015-12-17 12:23:22 [scrapy] INFO: Scrapy 1.0.3 started (bot: tutorial)
2015-12-17 12:23:22 [scrapy] INFO: Optional features available: ssl, http11
2015-12-17 12:23:22 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
    ...
        raise KeyError("Spider not found: {}".format(spider_name))
    KeyError: 'Spider not found: dmoz'

Does anyone have suggestions as to which part I did wrong? I have checked a similar question on Stack Overflow and followed the solution there, but I still get the error.

asked Oct 19 '22 by endeavour90


1 Answer

You have to add a .py extension to your dmoz_spider file. The file name should be dmoz_spider.py.
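Scrapy only picks up spiders from the modules listed in SPIDER_MODULES (tutorial.spiders in the settings shown in your log), and Python only imports files ending in .py, so a file named dmoz_spider is never imported and the name "dmoz" never gets registered. As a quick sanity check, assuming you run the commands from the project root (the directory containing scrapy.cfg), something like the following should make the spider discoverable again:

mv tutorial/spiders/dmoz_spider tutorial/spiders/dmoz_spider.py
scrapy list    # should now print "dmoz"
scrapy crawl dmoz

If scrapy list still prints nothing, the spider module most likely failed to import, or the spider's name attribute does not match the name passed to scrapy crawl.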

answered Oct 21 '22 by dts