I implemented a tiny Django app (v4.0.4) exposing a REST API — a GET endpoint for retrieving some data. I then wanted to serve the project with gunicorn + uvicorn, since an article I read showed better benchmark results than a standard deployment. So I decided to run my own benchmark using the wrk tool.
Here's what I've got:
| Command | Webserver | Protocol | Result (Req/Sec) |
|---|---|---|---|
| `python manage.py runserver 0.0.0.0:8000` | Django default | wsgi | 13.06 |
| `gunicorn bitpin.wsgi:application --bind 0.0.0.0:8000 -w 2` | gunicorn | wsgi | 45.20 |
| `gunicorn bitpin.asgi:application --bind 0.0.0.0:8000 -w 2 -k uvicorn.workers.UvicornWorker` | uvicorn+gunicorn | asgi | 22.17 |
However, the results above show the opposite of what the article claimed!
Is the reason that, to benefit from ASGI, I have to make my API view async? If so, how can I convert a Django REST API view to an async one?
Or might I have missed some configuration?
[NOTE]:
I ran the benchmark using the following command:
wrk -t4 -c11 -d20s -H "Authorization: Token xxx" http://127.0.0.1:8000/api/v1/content/
It is worth mentioning that I used two gunicorn workers for this test; obviously, the more workers, the better the performance will be.
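The slowdown under ASGI can be reproduced outside Django: a synchronous handler executed directly on the event loop serializes all "concurrent" requests, while offloading it to a thread keeps the loop free. A minimal stdlib-only sketch (no Django involved; the 0.05 s sleep stands in for a blocking view body, and timings are illustrative):

```python
import asyncio
import time

def blocking_handler() -> None:
    """Stand-in for a synchronous view body (e.g. a blocking ORM call)."""
    time.sleep(0.05)

async def serve_blocking(n: int) -> float:
    """Run n 'requests' directly on the event loop: they block it and run serially."""
    start = time.perf_counter()
    for _ in range(n):
        blocking_handler()  # nothing else can run while this sleeps
    return time.perf_counter() - start

async def serve_threaded(n: int) -> float:
    """Offload each request to a worker thread, keeping the loop free."""
    start = time.perf_counter()
    await asyncio.gather(*(asyncio.to_thread(blocking_handler) for _ in range(n)))
    return time.perf_counter() - start

blocked = asyncio.run(serve_blocking(10))   # roughly 10 x 0.05 s: fully serialized
threaded = asyncio.run(serve_threaded(10))  # roughly 0.05 s: runs concurrently
print(f"blocking: {blocked:.2f}s, threaded: {threaded:.2f}s")
```

This mirrors what happens when a sync Django view runs under an ASGI worker without being adapted: each request ties up the loop, so throughput drops below the plain WSGI setup.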
If you want to create an async REST API, use one async entry-point function; from it you can call other synchronous functions.
For example:
```python
# urls.py
path('Create', views.data_export, name='export_database')
```

```python
# views.py
import asyncio

from asgiref.sync import sync_to_async
from django.http import HttpResponse

async def data_export(request):
    if request.method == 'POST':
        if db_connect:
            system_id = request.POST.get('system_id')
            export_id = request.POST.get('export_id')
            # Fire-and-forget: schedule the export and respond immediately.
            asyncio.create_task(start(system_id, export_id))
    return HttpResponse("Starting")

async def start(system_id, export_id):
    # Wrap the synchronous export routine so it can be awaited.
    export_task = sync_to_async(start_export, thread_sensitive=False)
    system_id = str(config['COLLECTION_NAME'] + '_' + system_id)
    await export_task(system_id, export_id)
```
Here `start_export` is a synchronous function.
This is how I made the REST API asynchronous.
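The fire-and-forget pattern above (respond immediately, continue the export in the background) can be sketched with the stdlib alone. Here `start_export`, the IDs, and the `results` list are made-up stand-ins, and `asyncio.to_thread` plays the role of `sync_to_async`:

```python
import asyncio

results = []  # stand-in for whatever the real export writes

def start_export(system_id: str, export_id: str) -> None:
    """Stand-in for the real synchronous export routine."""
    results.append((system_id, export_id))

async def start(system_id: str, export_id: str) -> None:
    # Run the blocking export off the event loop, as sync_to_async does.
    await asyncio.to_thread(start_export, system_id, export_id)

async def handle_request() -> str:
    # Schedule the export and build the response before it finishes —
    # this is what lets the view answer "Starting" right away.
    task = asyncio.create_task(start("db_sys1", "exp42"))
    response = "Starting"
    await task  # in a real server the running loop keeps the task alive instead
    return response

print(asyncio.run(handle_request()))  # → Starting
```

One caveat of `asyncio.create_task` fire-and-forget: keep a reference to the task (or await it eventually), otherwise it can be garbage-collected before it completes.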