How to handle time-consuming requests in Django?


Let's say I have a Django app that processes images. A user POSTs an image to a view, the view function processes the image, and then returns a modified one. Let's say the processing can take ~5 seconds.

Or let's say I have a view that needs to make a request to another web server. The connection and data download may take ~5 seconds.

What will happen if three users try to use my Django app at the same time? Will Django handle the requests one by one, causing the third user to wait ~15 seconds?

I have found four methods of handling requests simultaneously in Django (please correct me if I am wrong):

  1. The first is to use third-party software called Celery to handle tasks in the background. This method was proposed here: Can Django do multi-thread works?. It requires:

    • Django project reconfiguration

    • Celery "server/worker" process running

    • background task functions decorated with the @shared_task decorator

  2. The second is to configure the Apache/uWSGI/Gunicorn server to use multiple "workers". This requires only server configuration. It is partially mentioned here: Django, sleep() pauses all processes, but only if no GET parameter?

  3. The third is to simply use Python threads inside views, as in this example: Multithreading for Python Django

  4. The fourth is to use the django-background-tasks app (

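As a rough illustration of method 3, here is a minimal sketch of pushing slow work onto a thread so the view can return immediately. All names here (`process_image`, `start_background_job`) are illustrative placeholders, not part of Django:

```python
import threading

# Stand-in for the ~5 s image-processing step from the question.
def process_image(data):
    # ... slow processing would happen here ...
    return data.upper()

results = []

def start_background_job(data):
    # Hand the slow work to a separate thread; the caller (a view, in
    # the Django case) can return a response without waiting for it.
    t = threading.Thread(target=lambda: results.append(process_image(data)))
    t.start()
    return t  # a real view would return an HttpResponse here instead

t = start_background_job("image-bytes")
t.join()  # only for demonstration; a view would not block on join()
```

One caveat with this approach: such threads live inside the web-server worker process and are not tracked by Django, so a crashed thread fails silently and the work is lost if the process is restarted, which is one reason Celery is usually preferred for production background jobs.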
Can anyone elaborate on this topic please? What are the good and bad sides of each of these solutions?


Since your problem is handling multiple users resizing images at the same time, I recommend running uWSGI or Gunicorn with multiple workers, provided you have a multi-core server and your image processing does not already saturate all cores.

If not, you can use a load balancer such as nginx and run multiple instances of the app to handle simultaneous access.
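For example, a Gunicorn invocation along these lines starts several worker processes, so three simultaneous requests land on different workers instead of queueing behind one another (the module path `myproject.wsgi` is a placeholder for your actual WSGI module):

```shell
# 4 worker processes, each able to serve one request independently
gunicorn myproject.wsgi:application --workers 4 --bind 0.0.0.0:8000
```

A common starting point is a worker count on the order of the number of CPU cores; for workloads that mostly wait on the network (like the second scenario in the question), threaded or async workers can serve more concurrent requests per process.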

By : coder
