How do I pass parameters in a request to a URL like this:
site.com/search/?action=search&description=My Search here&e_author=
How do I put the arguments in the structure of a spider Request, something like this example:
req = Request(url="site.com/",parameters={x=1,y=2,z=3})
Scrapy crawls websites using Request and Response objects. A Request travels through the system to the downloader, which executes it and hands the resulting Response back to the spider that issued the request.
Making a request is a straightforward process in Scrapy. To generate a request, you need the URL of the webpage from which you want to extract useful data. You also need a callback function. The callback function is invoked when there is a response to the request.
You need to set the user agent, which Scrapy allows you to do directly:

import scrapy

class QuotesSpider(scrapy.Spider):
    # ...
    user_agent = 'Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.'
Pass your GET parameters inside the URL itself:
return Request(url="https://yoursite.com/search/?action=search&description=MySearchhere&e_author=")
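One caveat with embedding parameters directly: spaces and other special characters in values must be percent-encoded by hand. For a single value, the standard library's urllib.parse.quote_plus handles this (the URL below is a placeholder):

```python
from urllib.parse import quote_plus

# Encode the free-text value before splicing it into the query string.
desc = quote_plus("My Search here")  # spaces become '+'
url = "https://yoursite.com/search/?action=search&description=" + desc + "&e_author="
```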
You should probably define your parameters in a dictionary and then "urlencode" it:
from urllib.parse import urlencode

from scrapy import Request

params = {
    "action": "search",
    "description": "My search here",
    "e_author": ""
}
url = "https://yoursite.com/search/?" + urlencode(params)
return Request(url=url)
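To check that the encoded URL carries the intended values, you can round-trip it with the standard library alone (no Scrapy needed; keep_blank_values preserves the empty e_author field):

```python
from urllib.parse import urlencode, urlparse, parse_qs

params = {"action": "search", "description": "My search here", "e_author": ""}
url = "https://yoursite.com/search/?" + urlencode(params)

# parse_qs returns each parameter as a list of values.
query = parse_qs(urlparse(url).query, keep_blank_values=True)
```

Scrapy's FormRequest offers a related shortcut: if I recall its API correctly, passing method='GET' together with a formdata dict builds the query string for you, so you don't have to call urlencode yourself.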