Unable to use TensorFlow model under uWSGI + Nginx deployment

I'm trying to deploy a face detection service using MTCNN with TensorFlow + Flask + uWSGI. I based my deployment on this Docker image and added this custom uwsgi.ini:

[uwsgi]
module = main
callable = app
enable-threads = true
cheaper = 2
processes = 16
threads = 16
http-timeout = 60

But when I try to do face detection using the Docker image I just built, I always get a 504 Gateway Time-out. When I dug deeper, I noticed that the code runs fine up to this session.run line:

    for op_name in data_dict:
        with tf.variable_scope(op_name, reuse=True):
            for param_name, data in iteritems(data_dict[op_name]):
                try:
                    var = tf.get_variable(param_name)
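                    # under uWSGI, execution reaches this point and then the
                    # following session.run never returns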
                    session.run(var.assign(data))
                except ValueError:
                    if not ignore_missing:
                        raise
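
For context, a minimal sketch of what the deployed main.py might look like, matching module = main and callable = app from the uwsgi.ini above and assuming the facenet-style align.detect_face module that contains the loading loop quoted here; the route, parameters, and request handling are illustrative, not the asker's actual code:

# main.py -- minimal sketch for the uwsgi.ini above (module = main, callable = app).
# Assumes the facenet-style align.detect_face MTCNN implementation; names here
# are illustrative.
import numpy as np
import cv2
import tensorflow as tf
from flask import Flask, jsonify, request

import align.detect_face as detect_face  # assumed import path

app = Flask(__name__)

# The graph and session are created at import time, i.e. in whichever
# process uWSGI imports this module into.
graph = tf.Graph()
with graph.as_default():
    sess = tf.Session()
    with sess.as_default():
        pnet, rnet, onet = detect_face.create_mtcnn(sess, None)

@app.route("/detect", methods=["POST"])
def detect():
    img = cv2.imdecode(np.frombuffer(request.data, np.uint8), cv2.IMREAD_COLOR)
    img = img[:, :, ::-1]  # BGR -> RGB, as expected by the detector
    boxes, _ = detect_face.detect_face(
        img, 20, pnet, rnet, onet, [0.6, 0.7, 0.7], 0.709)
    return jsonify(boxes=boxes.tolist())

if __name__ == "__main__":
    # Running with the Flask development server works fine; the hang only
    # appears under uWSGI.
    app.run(host="0.0.0.0", port=5000, debug=True)

In a sketch like this the session is created at import time; if uWSGI loads the app before forking its workers, that session ends up being used from forked processes, which is the multiprocess situation the answers below point at.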

At first, I thought it was a problem related to threading in the uWSGI workers, so I increased the number of processes and threads, but without any success.

When I run the same code with the Flask development server in debug mode, it runs just fine and processes the image in less than a second. So it is not a problem with the code, but with the configuration or the combination of these tools.

asked Oct 23 '25 by Mehraban

2 Answers

You also need to set cheaper = 0. This is my uwsgi config and it is working:

[uwsgi]
module = main
callable = app
master = false 
processes = 1
cheaper = 0
answered Oct 24 '25 by Johnny Yin


Use master = false and processes = 1 in the uWSGI config. There is a known issue where TensorFlow hangs under uWSGI's multiprocess (forking) setup.
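
For reference, an untested sketch of how both answers' advice could be folded into the asker's original uwsgi.ini, keeping the original enable-threads and http-timeout settings:

[uwsgi]
module = main
callable = app
master = false
processes = 1
cheaper = 0
enable-threads = true
http-timeout = 60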

answered Oct 24 '25 by Darwin