Scrapy Logging Level Change

I'm trying to start a Scrapy spider from my script, as shown here:

import logging

from scrapy.crawler import CrawlerProcess
from scrapy.utils.log import configure_logging
from scrapy.utils.project import get_project_settings

logging.basicConfig(
    filename='log.txt',
    format='%(levelname)s: %(message)s',
    level=logging.CRITICAL
)
configure_logging(install_root_handler=False)
process = CrawlerProcess(get_project_settings())

process.crawl('1740')
process.start()  # the script will block here until the crawling is finished

I want to configure the logging level of my spider, but even though I don't install the root logger handler and I configure logging with logging.basicConfig, it does not obey the level I set.

INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
INFO: Enabled item pipelines:
['collector.pipelines.CollectorPipeline']
INFO: Spider opened
INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)

It follows the format and file name set in basicConfig, but it ignores the logging level. I do not set the logging level anywhere other than here.

NOTE: There is no other place where I import logging or change the logging level.
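For what it's worth, CrawlerProcess itself calls Scrapy's configure_logging when it is constructed, which (as far as I know) can install a root handler at the level given by the LOG_LEVEL setting and override what basicConfig set. A separate stdlib pitfall worth ruling out: logging.basicConfig is silently a no-op whenever the root logger already has a handler. A minimal sketch of that stdlib behaviour:

```python
import logging

root = logging.getLogger()
root.addHandler(logging.StreamHandler())  # simulate logging already being configured

# basicConfig does nothing if the root logger already has handlers
# (unless force=True is passed, Python 3.8+), so the level stays at
# the default WARNING instead of CRITICAL.
logging.basicConfig(level=logging.CRITICAL)
print(root.getEffectiveLevel())  # prints 30 (WARNING), not 50 (CRITICAL)
```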

asked Jul 12 '16 by guemues


1 Answer

For Scrapy itself you should define the logging settings in settings.py, as described in the docs.

So in settings.py you can set:

LOG_LEVEL = 'ERROR'  # to only display errors
LOG_FORMAT = '%(levelname)s: %(message)s'
LOG_FILE = 'log.txt'
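If you'd rather keep the override in the launching script instead of settings.py, the same keys can also be passed to CrawlerProcess as a settings dict (a sketch; '1740' is the spider name from the question, and this assumes it is run inside a Scrapy project that defines that spider):

```python
from scrapy.crawler import CrawlerProcess

# These settings take the same keys as settings.py and
# override the project-wide values for this run only.
process = CrawlerProcess(settings={
    'LOG_LEVEL': 'ERROR',
    'LOG_FORMAT': '%(levelname)s: %(message)s',
    'LOG_FILE': 'log.txt',
})
process.crawl('1740')
process.start()
```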
answered Sep 22 '22 by Granitosaurus