I have a 350 MB scikit-learn pickle file that I want to load on Flask app startup. I use _pickle because the documentation mentions that it has a faster load time:
import timeit
import _pickle as pickle

start = timeit.default_timer()
with open("pickle/pipeline.pkl", 'rb') as f:
    # ~350MB file
    pipeline = pickle.load(f)
stop = timeit.default_timer()
print('Time: ', stop - start)
The pickle loads in 5-12 seconds locally, but on a Google App Engine F4 (1 GB RAM) instance the gunicorn worker times out.
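The load happens at module import time, so it runs while gunicorn is still booting the worker. A minimal sketch of the app structure, assuming the usual app.py layout implied by the app:app entrypoint (the /predict route is just a placeholder):

import _pickle as pickle
from flask import Flask

app = Flask(__name__)  # gunicorn's app:app imports this module

# Runs at import time, i.e. while the gunicorn worker is still booting
with open("pickle/pipeline.pkl", "rb") as f:
    pipeline = pickle.load(f)

@app.route("/predict")
def predict():
    # placeholder endpoint; the real one presumably calls pipeline.predict(...)
    return "ok"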
Google App Engine log:
A 2019-10-20T20:07:55Z [2019-10-20 20:07:55 +0000] [14] [INFO] Booting worker with pid: 14
A 2019-10-20T20:11:02Z [2019-10-20 20:04:14 +0000] [1] [CRITICAL] WORKER TIMEOUT (pid:14)
I tried increasing the worker timeout by adding the -t flag to the gunicorn entrypoint in app.yaml as follows, but it still doesn't work:
runtime: python
instance_class: F4
env: flex
entrypoint: gunicorn -t 120 -b :$PORT app:app
liveness_check:
  initial_delay_sec: 500
readiness_check:
  app_start_timeout_sec: 500
I even tried using eventlet and it didn't work:
gunicorn -t 120 -b :$PORT app:app --worker-class eventlet --workers 3
Worker timeouts: by default, Gunicorn gracefully restarts a worker if it hasn't completed any work within the last 30 seconds. If you expect your application to respond quickly to a constant incoming flow of requests, try experimenting with a lower timeout configuration.
Try with:
entrypoint: gunicorn -t 0 -b :$PORT app:app
It worked for me.
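In the app.yaml from the question, that would look roughly like this; setting -t 0 disables gunicorn's worker timeout entirely, so the slow pickle load can finish (everything else is unchanged):

runtime: python
instance_class: F4
env: flex
# -t 0 = no worker timeout, so the worker is not killed during the slow startup load
entrypoint: gunicorn -t 0 -b :$PORT app:app
liveness_check:
  initial_delay_sec: 500
readiness_check:
  app_start_timeout_sec: 500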
Loading a 350 MB pickle file takes so long that App Engine thinks your instance has failed; your container startup time is over three minutes.
This is a problem that cannot be easily solved on App Engine. The maximum size of a single deployed file is 32 MB (64 MB for Go applications), so the 350 MB pickle cannot be deployed with your app; it has to be downloaded into memory at startup, and that download is what takes too long.
Solution: I would use a different service such as Cloud Run where you can embed your pickle data file in the container image so that downloads are not required.
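A minimal sketch of what that could look like on Cloud Run; the base image, file names (requirements.txt, app.py), and port handling are assumptions, with the pickle copied into the image at build time:

# Dockerfile
FROM python:3.7-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
# Bake the ~350 MB pickle into the image so nothing has to be fetched at startup
COPY pickle/pipeline.pkl pickle/pipeline.pkl
COPY app.py .
# Cloud Run injects $PORT; -t 0 again avoids killing the worker during the slow load
CMD exec gunicorn -t 0 -b :$PORT app:app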