I'm executing a scrapy spider using:
scrapy runspider my_spider.py -o results.json
How do I run it silently, i.e., without all the spider log output?
You can use the command line option --nolog, which sets LOG_ENABLED to False.
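For example, applied to the command from the question:

scrapy runspider my_spider.py -o results.json --nolog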
Another option is to add the log settings to your settings.py, which is slightly different from what the original question asks for.
Info here: http://doc.scrapy.org/en/latest/topics/logging.html
The settings below write the log output to a logfile in the project folder. The spider runs silently in the terminal but records everything that is NOT DEBUG in the logfile, so you can still track what happened.
LOG_ENABLED = True
LOG_LEVEL = 'INFO' # Levels: CRITICAL, ERROR, WARNING, INFO, DEBUG
LOG_FILE = 'logfile.log'
This also means you don't have to pass --nolog when running the spider.
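If you prefer to keep these settings with the spider itself instead of in settings.py, Scrapy also supports a custom_settings class attribute. Here is a minimal sketch; the spider name, start URL, and parse logic are placeholders, and the settings values mirror the ones above:

import scrapy

class MySpider(scrapy.Spider):
    name = 'my_spider'                      # placeholder name
    start_urls = ['http://example.com']     # placeholder URL

    # Per-spider overrides, equivalent to the settings.py values above
    custom_settings = {
        'LOG_ENABLED': True,
        'LOG_LEVEL': 'INFO',
        'LOG_FILE': 'logfile.log',
    }

    def parse(self, response):
        # placeholder parse: yield the page title
        yield {'title': response.css('title::text').get()}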