On the command line I'm executing the following simple attempt to invoke scrapy:
scrapy version
I get the following error:
$ scrapy version
Traceback (most recent call last):
  File "/Users/nathanielford/virtualenvironments/crawler/bin/scrapy", line 11, in <module>
    sys.exit(execute())
  File "/Users/nathanielford/virtualenvironments/crawler/lib/python3.5/site-packages/scrapy/cmdline.py", line 141, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/Users/nathanielford/virtualenvironments/crawler/lib/python3.5/site-packages/scrapy/crawler.py", line 238, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/Users/nathanielford/virtualenvironments/crawler/lib/python3.5/site-packages/scrapy/crawler.py", line 129, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/Users/nathanielford/virtualenvironments/crawler/lib/python3.5/site-packages/scrapy/crawler.py", line 325, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/Users/nathanielford/virtualenvironments/crawler/lib/python3.5/site-packages/scrapy/spiderloader.py", line 33, in from_settings
    return cls(settings)
  File "/Users/nathanielford/virtualenvironments/crawler/lib/python3.5/site-packages/scrapy/spiderloader.py", line 20, in __init__
    self._load_all_spiders()
  File "/Users/nathanielford/virtualenvironments/crawler/lib/python3.5/site-packages/scrapy/spiderloader.py", line 28, in _load_all_spiders
    for module in walk_modules(name):
  File "/Users/nathanielford/virtualenvironments/crawler/lib/python3.5/site-packages/scrapy/utils/misc.py", line 63, in walk_modules
    mod = import_module(path)
  File "/Users/nathanielford/virtualenvironments/crawler/lib/python3.5/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 986, in _gcd_import
  File "<frozen importlib._bootstrap>", line 969, in _find_and_load
  File "<frozen importlib._bootstrap>", line 956, in _find_and_load_unlocked
ImportError: No module named 'spiders'
If I leave my project directory, the error no longer occurs:
$ scrapy version
Scrapy 1.2.2
What is causing the ImportError: No module named 'spiders'?
This error was caused by my settings.py file, where I had the following:
SPIDER_MODULES = ['spiders']
While that value works in the default project layout generated by scrapy, I had moved that module. Values in that list need to be fully qualified module names: even trivial invocations of scrapy will load the settings file and try to import the modules listed there. If scrapy can find a settings file, that file needs to be correct.
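For example, in the standard layout where the spiders package sits inside the project package (using a hypothetical project package named crawler here), the setting would look roughly like this:

# settings.py (sketch, assuming a project package named "crawler"
# that contains a "spiders" subpackage with an __init__.py)
SPIDER_MODULES = ['crawler.spiders']
NEWSPIDER_MODULE = 'crawler.spiders'

The key point is that the string must be importable as a module path from wherever scrapy runs, so a bare 'spiders' only works if that package is directly on the Python path.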