
Scrapy : How to pass list of arguments through command prompt to spider?

I'm creating a scraper for a fantasy team. I'm looking for a way to pass a list of player names as arguments, and then, for each player_name in player_list, run the parsing code.

I currently have something like this:

class statsspider(BaseSpider):
    name = 'statsspider'

    def __init__(self, domain=None, player_list=""):
        self.allowed_domains = ['sports.yahoo.com']
        self.start_urls = [
            'http://sports.yahoo.com/nba/players',
        ]
        self.player_list = "%s" % player_list

    def parse(self, response):
        # example parsing code
        yield request

I'm assuming that entering a list of arguments works the same as entering a single argument on the command line, so I enter something like this:

scrapy crawl statsspider -a player_list=['xyz','abc']
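(Note: Scrapy's -a option passes each argument to the spider's __init__ as a string; the shell strips the single quotes first, so with the command above the spider likely receives the literal string "[xyz,abc]", not a Python list. A quick way to see this, with the received value hard-coded for illustration:)

```python
# Scrapy's -a passes every spider argument as a string.
# After the shell strips the quotes, __init__ receives "[xyz,abc]".
player_list = "[xyz,abc]"   # what -a player_list=['xyz','abc'] arrives as
print(type(player_list))    # <class 'str'>
```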

Problem 2!

I solved the first issue by inputting a comma-delimited list of arguments, like so:

scrapy crawl statsspider -a player_list="abc def,ghi jkl"

I now want to go through each "name" (i.e. 'abc def') to find the first initial of their last name (in this case 'd').

I use this code:

array = []
for player_name in self.player_list:
    array.append(player_name)
print array

And I end up with the result [["'",'a','b','c',... etc]]. Why does Python not assign player_name to each name (e.g. 'abc def' and 'ghi jkl')? Can someone explain this logic to me? I will probably understand the right way to do it afterwards!
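(What's happening can be reproduced outside the spider: the -a value is one string, and iterating over a string yields one character at a time, which is why the loop collects characters instead of names.)

```python
# player_list arrives as a single string, not a list of names.
player_list = "abc def,ghi jkl"

chars = []
for player_name in player_list:   # iterating a str yields characters
    chars.append(player_name)
print(chars[:4])                  # ['a', 'b', 'c', ' ']

# Splitting on the comma produces the list of names intended:
names = player_list.split(',')
print(names)                      # ['abc def', 'ghi jkl']
```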

asked Dec 09 '13 by Python Learner

1 Answer

Shell arguments are passed as strings. You need to parse the argument in your own code.

command line:

scrapy crawl statsspider -a player_list=xyz,abc

python code:

self.player_list = player_list.split(',')
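Putting it together, here is a sketch of the parsing plus the follow-up step of taking the first initial of each last name (the helper names below are illustrative, not part of Scrapy):

```python
def parse_player_list(raw):
    """Split the comma-delimited -a value into individual names."""
    return [name.strip() for name in raw.split(',') if name.strip()]

def last_name_initial(name):
    """First letter of the last word: 'abc def' -> 'd'."""
    return name.split()[-1][0]

names = parse_player_list("abc def,ghi jkl")
print(names)                                  # ['abc def', 'ghi jkl']
print([last_name_initial(n) for n in names])  # ['d', 'j']
```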
answered Nov 07 '22 by kev