
Asynchronous tasks in Django using Celery

Greetings
I think most Django developers have heard of Celery, a system for asynchronous task execution, and many even use it actively.

About a year ago a pretty good article on using Celery appeared on Habr. However, as mentioned in its conclusion, Celery 2.0 has since been released (the current stable version is 2.2.7), in which the Django integration was moved into a separate package, among other changes.

This article will be useful primarily to beginners who are starting out with Django and need a way to run asynchronous and/or periodic tasks (for example, cleaning up expired sessions). I will show how to install and configure Celery to work with Django from start to finish, and also cover some other useful settings and pitfalls.

First of all, check whether the python-setuptools package is present on the system, and install it if it is missing:

aptitude install python-setuptools 

Celery installation

Celery itself is installed very simply:

 easy_install Celery 

Read more in the original: http://celeryq.org/docs/getting-started/introduction.html#installation

The article linked at the beginning used MongoDB as a backend; here I will show how to use the same database that your other Django applications store their data in as both the result backend and the message broker.

django-celery

Install the django-celery package:
 easy_install django-celery 

As already mentioned, django-celery provides convenient integration between Celery and Django. In particular, it uses the Django ORM as a backend for saving the results of Celery tasks, and it automatically finds and registers Celery tasks in the applications listed in INSTALLED_APPS.

After installing django-celery, you need to configure it:
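A minimal settings.py sketch, following the django-celery documentation (adjust it to your project):

# settings.py
import djcelery
djcelery.setup_loader()

INSTALLED_APPS = (
    # ...the rest of your applications...
    'djcelery',
)

Then create the database tables django-celery uses to store task state:

python manage.py syncdb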

When using mod_wsgi, add the following lines to the WSGI configuration file:
import os
os.environ["CELERY_LOADER"] = "django"


django-kombu

Now we need a suitable message broker for Celery. In this article I will use django-kombu, a package that allows Kombu (a Python implementation of AMQP) to use the Django database as a message store.
Install the package:
 easy_install django-kombu 

Configure it in settings.py:
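A minimal sketch, assuming django-kombu's database transport (the djkombu application name and the BROKER_BACKEND value come from the django-kombu documentation):

# settings.py
INSTALLED_APPS = (
    # ...the rest of your applications...
    'djkombu',
)

# Store Celery messages in the Django database.
BROKER_BACKEND = "djkombu.transport.DatabaseTransport"

Remember to run python manage.py syncdb again so that djkombu's queue tables are created.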

Run

We start the celeryd and celerybeat processes; the commands are sketched below.
(Without celerybeat you can still run ordinary one-off tasks. To execute periodic tasks on a schedule, celerybeat must be running.)
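With django-celery, the worker processes are started through manage.py; a minimal sketch (the --loglevel value is just an example):

python manage.py celeryd --loglevel=INFO
python manage.py celerybeat

During development, both can be combined into one process with the -B flag:

python manage.py celeryd -B --loglevel=INFO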

After launch, we can see what periodic tasks look like in the Django admin panel:

[Screenshot: the list of periodic tasks in the Django admin]

If you use something other than the Django ORM as the Celery backend (RabbitMQ, for example), you can also view the status of all other tasks in the Django admin; it looks like this:

[Screenshot: task states in the Django admin]
Learn more: http://stackoverflow.com/questions/5449163/django-celery-admin-interface-showing-zero-tasks-workers
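As the linked discussion explains, the task monitor in the admin is fed by snapshots of worker events; with django-celery, these snapshots are taken by the celerycam command. A sketch:

python manage.py celerycam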

UPDATE: I am adding a little about daemonization, since it may not work on the first try.

We start celery as a service

Download the Celery launch script from here: https://github.com/ask/celery/tree/master/contrib/generic-init.d/ and place it in the /etc/init.d directory with the appropriate permissions.
Create a celeryd file in the /etc/default directory; the script will read its launch settings from it:
# Where the Django project is.
CELERYD_CHDIR="/var/www/myproject"

# Path to celeryd
CELERYD_MULTI="$CELERYD_CHDIR/manage.py celeryd_multi"
CELERYD_OPTS="--time-limit=300 --concurrency=8 -B"
CELERYD_LOG_FILE=/var/log/celery/%n.log

# Path to celerybeat
CELERYBEAT="$CELERYD_CHDIR/manage.py celerybeat"
CELERYBEAT_LOG_FILE="/var/log/celery/beat.log"
CELERYBEAT_PID_FILE="/var/run/celery/beat.pid"

CELERY_CONFIG_MODULE="settings"
export DJANGO_SETTINGS_MODULE="settings"

The --concurrency option sets the number of Celery worker processes (by default it equals the number of CPUs).
After that, you can start Celery with the service command:
 service celeryd start 

Read more: docs.celeryproject.org/en/latest/tutorials/daemonizing.html#daemonizing

Working with celery

After installing django-celery, Celery tasks are automatically registered from the tasks.py modules of all applications listed in INSTALLED_APPS. In addition to the tasks modules, you can specify additional modules using the CELERY_IMPORTS setting:
 CELERY_IMPORTS=('myapp.my_task_module',) 

It is also useful to enable the CELERY_SEND_TASK_ERROR_EMAILS option, with which Celery will report all task errors by e-mail to the addresses listed in the ADMINS setting; a sketch follows below.
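A minimal settings.py sketch (the address below is a placeholder):

# settings.py
CELERY_SEND_TASK_ERROR_EMAILS = True

ADMINS = (
    ("Admin", "admin@example.com"),  # hypothetical address
)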

Writing tasks for Celery has not changed much since the previous article:
from datetime import datetime

from celery.task import periodic_task
from celery.schedules import crontab
from django.contrib.sessions.models import Session

@periodic_task(ignore_result=True, run_every=crontab(hour=0, minute=0))
def clean_sessions():
    # Runs every night at midnight and deletes expired sessions.
    Session.objects.filter(expire_date__lt=datetime.now()).delete()

The only difference is that decorators should now be imported from celery.task; the decorators module has been deprecated.
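For completeness, here is a minimal sketch of an ordinary (non-periodic) task and of queuing it asynchronously; the task name and its argument are hypothetical:

from celery.task import task

@task(ignore_result=True)
def send_report(user_id):
    # A hypothetical one-off task; the body is up to your application.
    print("Sending report to user %s" % user_id)

Somewhere in a view, queue the task instead of running it inline:

send_report.delay(42)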

A couple of performance notes, sketched below:
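Among other things, the linked guide recommends ignoring results you do not need and disabling rate limits if no task uses them. A hedged settings.py sketch of both:

# settings.py

# Do not store task results unless you actually read them.
CELERY_IGNORE_RESULT = True

# Skip rate-limit bookkeeping entirely if no task sets a rate limit.
CELERY_DISABLE_RATE_LIMITS = True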

Learn more about these and other Celery tips: http://celeryproject.org/docs/userguide/tasks.html#tips-and-best-practices

Source: https://habr.com/ru/post/123902/

