I am a newbie to Python. I am running Python 2.7.3, 32-bit, on a 64-bit OS (I tried the 64-bit version but it didn't work out).
I followed the tutorial and installed Scrapy on my machine. I created one project, demoz, but when I enter scrapy crawl demoz
it shows an error. I also noticed that when I run the scrapy command under C:\python27\scripts it shows:
C:\Python27\Scripts>scrapy
Scrapy 0.14.2 - no active project
Usage:
scrapy <command> [options] [args]
Available commands:
fetch Fetch a URL using the Scrapy downloader
runspider Run a self-contained spider (without creating a project)
settings Get settings values
shell Interactive scraping console
startproject Create new project
version Print Scrapy version
view Open URL in browser, as seen by Scrapy
Use "scrapy <command> -h" to see more info about a command
C:\Python27\Scripts>
I guess there is something missing in the installation. Can anybody help, please? Thanks in advance.
You should run the scrapy crawl spider_name
command from inside a Scrapy project folder, where the scrapy.cfg
file resides.
From the docs:
Crawling
To put our spider to work, go to the project’s top level directory and run:
scrapy crawl dmoz
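If you are unsure whether you are in the right directory, the check Scrapy effectively performs is looking for scrapy.cfg. A minimal sketch of that check (the helper name in_scrapy_project is mine, not part of Scrapy's API):

```python
# Sketch: `scrapy crawl` only works where Scrapy can locate scrapy.cfg,
# so checking for that file tells you whether you are in a project folder.
from pathlib import Path

def in_scrapy_project(directory: str = ".") -> bool:
    """Return True if `directory` contains a scrapy.cfg file."""
    return (Path(directory) / "scrapy.cfg").is_file()

print(in_scrapy_project())  # False unless run from a Scrapy project root
```

If this prints False, cd into the folder created by scrapy startproject before running scrapy crawl.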
You can run the scrapy crawl demoz
command from the Scrapy project folder, which you created using the following command:
scrapy startproject tutorials
For example, if you have started a Scrapy project named tutorials,
go to the tutorials folder first and run the crawl
command from there:
scrapy crawl demoz