How can I properly utilize the asynchronous functionality in a FastAPI route?
The following code snippet takes 10 seconds to complete a call to my /home
route, while I expect it to only take 5 seconds.
from fastapi import FastAPI
import time

app = FastAPI()

async def my_func_1():
    """
    my func 1
    """
    print('Func1 started..!!')
    time.sleep(5)
    print('Func1 ended..!!')
    return 'a..!!'

async def my_func_2():
    """
    my func 2
    """
    print('Func2 started..!!')
    time.sleep(5)
    print('Func2 ended..!!')
    return 'b..!!'

@app.get("/home")
async def root():
    """
    my home route
    """
    start = time.time()
    a = await my_func_1()
    b = await my_func_2()
    end = time.time()
    print('It took {} seconds to finish execution.'.format(round(end - start)))
    return {
        'a': a,
        'b': b
    }
I am getting the following result, which looks non-asynchronous:
λ uvicorn fapi_test:app --reload
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Started reloader process [5116]
INFO: Started server process [7780]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: 127.0.0.1:51862 - "GET / HTTP/1.1" 404
Func1 started..!!
Func1 ended..!!
Func2 started..!!
Func2 ended..!!
It took 10 seconds to finish execution.
INFO: 127.0.0.1:51868 - "GET /home HTTP/1.1" 200
But, I am expecting FastAPI to print like below:
Func1 started..!!
Func2 started..!!
Func1 ended..!!
Func2 ended..!!
It took 5 seconds to finish execution.
Please correct me if I am doing anything wrong.
Perhaps a bit late, and elaborating on Hedde's answer above, here is how your code should look. Remember to await when sleeping, and to gather the awaitables; if you don't, the two tasks will not run concurrently no matter whether you use time.sleep() or asyncio.sleep().
from fastapi import FastAPI
import time
import asyncio

app = FastAPI()

async def my_func_1():
    """
    my func 1
    """
    print('Func1 started..!!')
    await asyncio.sleep(5)
    print('Func1 ended..!!')
    return 'a..!!'

async def my_func_2():
    """
    my func 2
    """
    print('Func2 started..!!')
    await asyncio.sleep(5)
    print('Func2 ended..!!')
    return 'b..!!'

@app.get("/home")
async def root():
    """
    my home route
    """
    start = time.time()
    futures = [my_func_1(), my_func_2()]
    a, b = await asyncio.gather(*futures)
    end = time.time()
    print('It took {} seconds to finish execution.'.format(round(end - start)))
    return {
        'a': a,
        'b': b
    }
time.sleep is blocking, so you should use asyncio.sleep; there are also asyncio.gather and asyncio.wait to aggregate jobs. This is well documented in both Python and FastAPI.
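If the blocking call cannot be rewritten as a coroutine (for example, a library with no async support), one option is to push it onto a worker thread so the event loop stays free. A sketch using asyncio.to_thread, available since Python 3.9 (on older versions, loop.run_in_executor plays the same role):

```python
import asyncio
import time

def blocking_work(name: str) -> str:
    # a plain blocking call; it cannot be awaited directly
    time.sleep(1)
    return name

async def main():
    start = time.time()
    # each call runs in its own worker thread, so the waits overlap
    a, b = await asyncio.gather(
        asyncio.to_thread(blocking_work, 'a'),
        asyncio.to_thread(blocking_work, 'b'),
    )
    elapsed = time.time() - start
    print(a, b, round(elapsed))  # ~1 second total, not 2
    return a, b, elapsed

asyncio.run(main())
```

Note this helps with blocking I/O; CPU-bound work is still limited by the GIL and may need a process pool instead.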