How does the uwsgi spooler work?

I need a task queue so that some heavy operations can be moved out of the uwsgi request context without affecting users. Since uWSGI is already being used, I thought the uWSGI spooler could be used. I wanted to know how exactly it works. Are the spooled tasks still executed in some other uWSGI worker? If so, the server would still be overloaded, since the other workers would be busy serving the spooled tasks. Are there better alternatives? I'm using Python.

asked Mar 08 '14 by dotgc

People also ask

How many processes does uWSGI have?

The threads option tells uWSGI to start the application in pre-threaded mode: each worker process runs the application across multiple threads, so four processes with two threads each effectively give you eight concurrent request handlers.

What is uWSGI protocol?

uwsgi (all lowercase) is the native binary protocol that uWSGI uses to communicate with other servers. uWSGI is often used in conjunction with web servers such as Cherokee and Nginx, which offer direct support for uWSGI's native uwsgi protocol, to serve Python web applications such as Django.

Is uWSGI multithreaded?

By default uWSGI does not enable threading support within the Python interpreter core. This means it is not possible to create background threads from Python code.

What is uWSGI queue?

In addition to the caching framework, uWSGI includes a shared queue. At the low level it is a simple block-based shared array, with two optional counters: one for stack-style (LIFO) usage, the other for FIFO.
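A rough sketch of the corresponding Python API, using the function names listed in the uWSGI queue framework docs (the instance must be started with the `queue` option, e.g. `queue = 100`; the item values here are illustrative):

```python
import uwsgi  # only importable when the app runs under uWSGI

# Enqueue items into the shared array (values are plain strings/bytes).
uwsgi.queue_push(b"job-1")
uwsgi.queue_push(b"job-2")

item = uwsgi.queue_pop()    # stack-style (LIFO) consumption
item = uwsgi.queue_pull()   # FIFO consumption
```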


1 Answer

Reference: http://uwsgi-docs.readthedocs.org/en/latest/Spooler.html

Each spooler is a separate process aimed at running tasks enqueued in the form of files in a directory (the spool directory). Multiple spooler processes (per uWSGI instance) can sit on the same spool directory to parallelize task groups, and multiple spool directories can be configured (to have different task groups).

The spooler approach is very low-level, but it requires zero maintenance (and removing a task, if needed, is just a matter of rm'ing its file) and it is really solid.
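As a rough illustration of the file-based enqueueing described above, here is a minimal sketch using the `spool` decorator from `uwsgidecorators`; it assumes the instance is started with a `spooler = /path/to/spooldir` option and that this module is imported in the spooler process as well (e.g. via the `import` option). `do_heavy_work` and `user_id` are made-up placeholders:

```python
# tasks.py -- minimal sketch, run under uWSGI with a spooler configured
import uwsgi
from uwsgidecorators import spool

@spool
def heavy_job(arguments):
    # `arguments` is the dict of string key/value pairs stored in the spool file.
    do_heavy_work(arguments["user_id"])   # do_heavy_work() is a placeholder
    return uwsgi.SPOOL_OK                 # success: the spool file is removed
                                          # (uwsgi.SPOOL_RETRY would requeue it)

# From a request handler, enqueueing is just:
#   heavy_job.spool(user_id="42")
```

Each call to `heavy_job.spool(...)` writes a file into the spool directory; a spooler process picks it up, runs the function, and deletes the file on success, which is why cancelling a pending task is just a matter of removing its file.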

The only alternative (and very probably the most widely used one) in the Python world I am aware of is Celery:

http://www.celeryproject.org/
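For comparison, a minimal Celery sketch (the Redis broker URL and the task body are illustrative; a worker would be started separately with `celery -A tasks worker`):

```python
# tasks.py -- minimal Celery setup; broker URL is illustrative
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def heavy_job(user_id):
    ...  # the heavy work goes here

# From the web process, enqueue without blocking the request:
#   heavy_job.delay(42)
```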

Otherwise you can rely on the venerable Redis + daemon thread approach, where a Python thread consumes tasks enqueued in Redis. Alternatively, you can use a uWSGI mule (it is like a worker, but without external access) instead of a thread to consume tasks.
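A rough sketch of that last approach using the redis-py client; if the consumer thread lives inside the uWSGI workers, the `enable-threads` option is needed (see the threading note above). The queue name and the `handle_task()` helper are made up:

```python
import json
import threading

import redis

r = redis.Redis()

def consumer():
    while True:
        _key, raw = r.blpop("tasks")      # blocks until a task is pushed
        handle_task(json.loads(raw))      # handle_task() is a placeholder

# Consumer side: a daemon thread inside the web process.
threading.Thread(target=consumer, daemon=True).start()

# Producer side, from a request handler:
#   r.rpush("tasks", json.dumps({"user_id": 42}))
```

The same loop can instead be dropped into a script started as a uWSGI mule (via the `mule` option), so task consumption does not compete with the threads serving requests.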

answered Sep 19 '22 by roberto