I'm trying to start a Scrapy spider from my script, as shown here:
import logging

from scrapy.crawler import CrawlerProcess
from scrapy.utils.log import configure_logging
from scrapy.utils.project import get_project_settings

logging.basicConfig(
    filename='log.txt',
    format='%(levelname)s: %(message)s',
    level=logging.CRITICAL
)
configure_logging(install_root_handler=False)
process = CrawlerProcess(get_project_settings())
process.crawl('1740')
process.start()  # the script will block here until the crawling is finished
I want to configure the logging level of my spider, but even when I do not install the root log handler and configure logging myself with logging.basicConfig, the output does not obey the level I set:
INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware']
INFO: Enabled item pipelines:
['collector.pipelines.CollectorPipeline']
INFO: Spider opened
INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
It follows the format and file name set in basicConfig, but it does not use the logging level. I do not set the logging level anywhere else.
NOTE: There is no other place where I import logging or change the logging level.
For Scrapy itself, you should define the logging settings in settings.py, as described in the docs.

So in settings.py you can set:
LOG_LEVEL = 'ERROR' # to only display errors
LOG_FORMAT = '%(levelname)s: %(message)s'
LOG_FILE = 'log.txt'
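
Alternatively, if you want to keep everything in the launcher script instead of settings.py, a minimal sketch is to override these settings on the project's Settings object before passing it to CrawlerProcess (the spider name '1740' is taken from your question):

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

settings = get_project_settings()
# Override the log settings programmatically; these take the place of the
# LOG_LEVEL / LOG_FORMAT / LOG_FILE entries in settings.py.
settings.set('LOG_LEVEL', 'ERROR')
settings.set('LOG_FORMAT', '%(levelname)s: %(message)s')
settings.set('LOG_FILE', 'log.txt')

process = CrawlerProcess(settings)
process.crawl('1740')
process.start()

This way Scrapy's own log handler is configured with the level you want, so you don't need logging.basicConfig at all.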