 

Gunicorn logging from multiple workers

I have a Flask app that runs in multiple Gunicorn sync worker processes on a server and uses TimedRotatingFileHandler to log to a file from within the Flask application in each worker. In retrospect this seems unsafe. Is there a standard way to accomplish this in Python (at high volume) without writing my own socket-based logging server or similar? How do other people accomplish this? We already use syslog to aggregate logs across servers to a logging server, but I'd ideally like to persist the log on the app node first.

Thanks for your insights

Colin Kroll asked Jan 05 '13 14:01

People also ask

How many workers can gunicorn handle?

Each of the workers is a UNIX process that loads the Python application. There is no shared memory between the workers. The suggested number of workers is (2*CPU)+1 . For a dual-core (2 CPU) machine, 5 is the suggested workers value.

How many workers should I use in gunicorn?

Gunicorn should only need 4-12 worker processes to handle hundreds or thousands of requests per second. Gunicorn relies on the operating system to provide all of the load balancing when handling requests. Generally we recommend (2 x $num_cores) + 1 as the number of workers to start off with.
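The (2 × cores) + 1 rule above can be sketched as a small helper (a sketch, not Gunicorn's own code; `suggested_workers` is a hypothetical name):

```python
import multiprocessing

def suggested_workers() -> int:
    # Gunicorn's rule of thumb: (2 * number of CPU cores) + 1.
    # The idea is that while one worker per core handles a request,
    # another can be reading or writing on the socket.
    return (2 * multiprocessing.cpu_count()) + 1
```

On a dual-core machine this returns 5, matching the answer above.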

How many concurrent requests can gunicorn handle?

Yes, with 5 worker processes, each with 8 threads, 40 concurrent requests can be served.

How do I get gunicorn logs?

In version 19.0, Gunicorn doesn't log to the console by default. To watch the logs in the console you need to use the option --log-file=- . In version 19.2, Gunicorn logs to the console by default again.
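For file-based logs, Gunicorn itself can write its access and error logs directly; a typical invocation (assuming a Flask app object named `app` in `app.py`, and that the log directory exists) looks like:

```shell
# Write Gunicorn's own access and error logs to files:
gunicorn --workers 4 \
         --access-logfile /var/log/gunicorn/access.log \
         --error-logfile /var/log/gunicorn/error.log \
         app:app

# Or, on Gunicorn 19.0, force logging to the console:
gunicorn --log-file=- app:app
```

Note this covers Gunicorn's request/error logging, not application logging done inside each worker, which is what the question is about.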


1 Answer

I use ConcurrentRotatingFileHandler. It takes a file lock around writes and rotations, so multiple worker processes can safely share one log file.
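A minimal sketch of this approach, assuming the third-party concurrent-log-handler package is installed (`pip install concurrent-log-handler`); if it is missing, this falls back to the stdlib RotatingFileHandler, which is NOT safe across multiple Gunicorn workers:

```python
import logging
import os
import tempfile

try:
    # Third-party, multi-process-safe rotating handler.
    from concurrent_log_handler import ConcurrentRotatingFileHandler as SafeHandler
except ImportError:
    # Stdlib fallback so the sketch runs anywhere; not multi-process safe.
    from logging.handlers import RotatingFileHandler as SafeHandler

log_path = os.path.join(tempfile.gettempdir(), "app.log")

# Each Gunicorn worker builds its own handler pointing at the same file;
# the concurrent handler locks the file around writes and rotations so
# workers don't clobber each other's rotated files.
handler = SafeHandler(log_path, maxBytes=10 * 1024 * 1024, backupCount=5)
handler.setFormatter(logging.Formatter(
    "%(asctime)s pid=%(process)d %(levelname)s %(message)s"))

logger = logging.getLogger("myapp")
logger.setLevel(logging.INFO)
logger.addHandler(logger and handler)
logger.info("worker ready")
```

Configure this once per worker at startup (e.g. in the Flask app factory); because rotation is size-based rather than time-based, it avoids the race that makes TimedRotatingFileHandler unsafe here.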

archer answered Oct 19 '22 09:10