Grab Your Free Book with Django and Celery

By Idego Group

Developers often need to run code on a schedule, such as hourly or daily tasks. Celery simplifies building distributed systems by distributing units of work, exchanged as messages, among networked machines or local workers. A task is any job that needs to be distributed, and it must be encapsulated (typically as a decorated function) before it can be dispatched.

Celery supports synchronous, asynchronous, periodic, and scheduled tasks transparently across workers, whether local or spread over a network. Calling a task asynchronously returns an AsyncResult instance that lets you check the task's status and retrieve its result. The client component creates tasks and dispatches them to a broker using methods such as apply_async().

A result backend component stores task status and results so clients can retrieve them later. Celery supports multiple backends, including RabbitMQ (RPC-style), Redis, MongoDB, Memcached, and SQLAlchemy/Django ORM.
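In a Django project, the backend is usually chosen in settings. The fragment below is a hypothetical settings.py excerpt; the URLs and backend names are placeholders illustrating the options listed above:

```python
# settings.py (hypothetical); Celery picks these up when configured
# with the "CELERY_" namespace in celery.py
CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_RESULT_BACKEND = "redis://localhost:6379/1"

# Alternative result backends:
# CELERY_RESULT_BACKEND = "rpc://"      # RabbitMQ, RPC-style results
# CELERY_RESULT_BACKEND = "django-db"   # Django ORM, via django-celery-results
```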

Workers execute the tasks they receive, with behavior configurable through concurrency settings, remote control commands, and task revocation. Brokers handle message transmission between clients and workers, with RabbitMQ and Redis offering the most complete functionality.

In a typical Django project, a celery.py file lives in the project package and a tasks.py module in each application. The Celery application instance is created in celery.py, which loads configuration from Django settings and auto-discovers tasks across installed applications.
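A common shape for that celery.py file, assuming a project package named "proj" (the name is illustrative), is:

```python
# proj/celery.py (hypothetical project named "proj")
import os

from celery import Celery

# Make Django settings importable before the app is configured
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

app = Celery("proj")
# Read settings prefixed with CELERY_ from Django's settings module
app.config_from_object("django.conf:settings", namespace="CELERY")
# Look for tasks.py modules in every installed Django app
app.autodiscover_tasks()
```

The project's __init__.py usually imports this app so it is loaded whenever Django starts.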

Basic tasks are defined with the @shared_task decorator. Periodic tasks execute at regular intervals using cron-like scheduling and require the Celery beat scheduler running alongside the workers.

In production, the Celery worker and beat scheduler should run as background daemons; a process manager such as Supervisor keeps them running reliably and restarts them on failure.
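A minimal Supervisor configuration sketch, with hypothetical paths and project name, might look like:

```ini
; /etc/supervisor/conf.d/celery.conf (paths and project name are placeholders)
[program:celery-worker]
command=/opt/proj/venv/bin/celery -A proj worker --loglevel=INFO
directory=/opt/proj
autostart=true
autorestart=true

[program:celery-beat]
command=/opt/proj/venv/bin/celery -A proj beat --loglevel=INFO
directory=/opt/proj
autostart=true
autorestart=true
```

After installing the file, `supervisorctl reread` and `supervisorctl update` pick up the new programs.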
