What's the best way to present a Flask interface to an ongoing background task?

I have a long-running process that continuously reads from a telnet port and may occasionally write to it. Sometimes I want to send an HTTP request to it to fetch the data it has read since the last time I asked. Sometimes I may send an HTTP request to write certain data to another telnet port.

Should I do this with two threads, and if so, should I use a mutex or an instruction queue? How do you do threading with Flask anyway? Should I use multiprocessing? Something else?

The reason I ask is that I ran into trouble with a similar problem (serial ports instead of telnet ports, and directly in the app instead of behind a local/remote HTTP service) and ended up with the non-data-reading thread almost never running, even when I inserted lots of sleep calls. I ended up rewriting it from mutexes to queues, and then to multiprocessing with queues.

Edit: The telnet ports are connections to an application that communicates with hardware (a printer), mainly reading debug data from it. The Flask HTTP service I want to write would be accessed by tests running against the printer (either on the same machine as the HTTP service or a different one); none of this involves a web browser!

Asked Apr 09 '15 by Roman A. Taycher




2 Answers

These kinds of long-polling jobs are best achieved using sockets; they don't really fit the Flask/WSGI model, which is not geared towards asynchronous operations. You may want to look at Twisted or Tornado.

That said, your back-end process that reads/writes to telnet could run in a separate thread, which may or may not be initiated from an HTTP request. Once you kick off a thread from the Flask app, it won't block the response.

You can then just read from the data store that thread writes to by occasionally polling the Flask app for new data. The polling could also be done client-side in a browser using JavaScript and timeouts, but that's a bit hacky.
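
A minimal sketch of that approach, assuming the telnet port can be read as a plain TCP stream (the host, port, and endpoint name here are invented for illustration): a daemon thread reads lines from the socket into a queue, and a polling endpoint drains the queue on each request.

```python
import queue
import socket
import threading

from flask import Flask, jsonify

app = Flask(__name__)
telnet_lines = queue.Queue()  # shared buffer between the reader thread and Flask


def telnet_reader(host="127.0.0.1", port=2323):
    """Read newline-terminated data from the telnet port and queue it."""
    sock = socket.create_connection((host, port))
    buf = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break  # connection closed
        buf += chunk
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            telnet_lines.put(line.decode(errors="replace"))


@app.route("/new-data")
def new_data():
    """Return everything read from the telnet port since the last poll."""
    lines = []
    while True:
        try:
            lines.append(telnet_lines.get_nowait())
        except queue.Empty:
            break
    return jsonify(lines=lines)


if __name__ == "__main__":
    # Start the reader once, before serving requests (not once per request).
    threading.Thread(target=telnet_reader, daemon=True).start()
    app.run()
```

Writing to the other telnet port could be handled the same way in reverse: an endpoint puts commands onto a second queue, which the background thread drains and writes to its socket.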

Answered by gonkan


The ideal solution for this is to have separate threads for your Flask app and the long-running process, and to use a message queue to broker messages between threads. Message queues are a great way to allow inter-thread communication while decoupling the components of your architecture.

You have to take the server you're using to run the Flask app into consideration when thinking about threading. The dev server you get when you run app.run() is a single synchronous process that can handle one request at a time. Deploying the Flask app to a multiprocess/multithreaded server like Gunicorn basically gives you a process per worker, so 4 workers can handle 4 concurrent requests (more if each worker also runs several threads).
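
As a rough example (the values are made up; tune them for your workload), a gunicorn.conf.py along these lines runs the app with 4 worker processes, each handling requests on 4 threads:

```python
# gunicorn.conf.py -- example values, tune for your workload
bind = "0.0.0.0:8000"   # address:port to listen on
workers = 4             # worker processes
threads = 4             # threads per worker (implies the "gthread" worker class)
```

You would start it with something like gunicorn -c gunicorn.conf.py app:app. Keep in mind that worker processes don't share in-process Python objects, which is one more reason an external message queue helps.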

Each request to Flask gets its own thread-local objects for request and session data, so keep that in mind when designing your code. A message queue is good not only for brokering messages between Flask and your long-running process, but also between your individual Flask threads.
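
To illustrate that distinction (the endpoint and attribute names are invented): flask.g is created fresh for each request, while a module-level object such as a queue is shared by every request-handling thread.

```python
import queue

from flask import Flask, g

app = Flask(__name__)
shared_messages = queue.Queue()  # one object, visible to every request thread


@app.route("/demo")
def demo():
    # g is created fresh for this request and thrown away afterwards...
    g.note = "only this request sees me"
    # ...whereas the module-level queue persists and is shared across requests.
    shared_messages.put(g.note)
    return f"queued, {shared_messages.qsize()} message(s) pending"
```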

Now that we've covered threading, let's talk about message queues. There are a few different patterns you can use here, but I'll focus on pub-sub. The publisher-subscriber model is great when data needs to flow from one place to many. Let's call the long-running process A, and the thread (or multiple threads) running the Flask app B (the nice part about pub-sub is that either case will work).

B can "subscribe" to messages coming from A through the queue, which A can send the the queue whenever it needs to. The queue can "publish" the messages to the subscribers either by pushing the messages to them, or waiting for them to pull them from the queue. A good solution with Flask might be to check the queue for new messages at the start of each request. You can also have B publish back to A in a similar way.

All that being said, this is pretty open-ended, so there's no single right way to do it. This is just what I'd recommend, as it follows some practices that have worked well for me. I also try to stay technology-agnostic in these answers, but there is a nice example of using Redis as a message queue with Flask that you can take a look at for more ideas.
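
If you do go the Redis route, a rough sketch with the redis-py client might look like the following (the channel name and connection details are assumptions, and a Redis server must be running locally):

```python
import redis

r = redis.Redis(host="localhost", port=6379)

# Subscriber side (a Flask thread, B) listens on a channel.
pubsub = r.pubsub(ignore_subscribe_messages=True)
pubsub.subscribe("telnet-data")

# Publisher side (the long-running process, A) pushes whatever it read.
r.publish("telnet-data", "a line read from the printer")

# Back on the subscriber, poll for anything that has arrived.
message = pubsub.get_message(timeout=1.0)  # None if nothing arrived in time
if message:
    print(message["data"])  # payloads arrive as bytes by default
```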

Answered by hert