Django-Cog
A django-celery-beat extension to build pipelines of chronological stages and parallel tasks.
About
Using the Django admin, this library allows you to create pipelines of multi-staged tasks. Each pipeline is launched at a specific time, utilizing the django-celery-beat CrontabSchedule object to define the time of launch. Once launched, the pipeline looks for the first stage(s) of parallel tasks. Each task is submitted to a celery worker for completion. Once all tasks of a stage complete, the stage is considered complete, and any subsequent stage whose required previous stages have all completed will launch.
Pipelines
A pipeline is a collection of stages. It has an optional launch schedule tied to a CrontabSchedule that defines when the pipeline should be launched. By default, a pipeline will not launch if it detects that a previous launch has not yet completed.
You can also leave the launch schedule blank if you do not want a particular pipeline to run on its own. Instead, clicking the Launch Now button on that pipeline's Django admin page will begin its execution.
Note: If a pipeline was previously scheduled using a CrontabSchedule object but needs to be taken off the schedule, simply set the Schedule field of the pipeline to the first option of the drop-down menu (---------). This will remove the generated PeriodicTask from the database scheduler, preventing further auto-execution of that pipeline.
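The same change can be made programmatically. Below is a minimal sketch from the Django shell; the import path, the nullable schedule field, and the pipeline name are all assumptions for illustration, not confirmed API:

from django_cog.models import Pipeline

# Clearing the schedule is the shell equivalent of picking '---------' in the admin.
pipeline = Pipeline.objects.get(name='nightly-etl')  # hypothetical pipeline name
pipeline.schedule = None
pipeline.save()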
Stages
A stage is a collection of tasks that can all be run independently of (and in parallel with) each other. A stage can depend on any number of previous stages, and it launches once all of its prerequisite stages have completed.
If a stage has no prerequisite stages, it launches as soon as the pipeline does. Multiple stages can run at the same time.
Upon launching a stage, each task that belongs to it will be sent to the celery broker for execution.
Tasks and Cogs
Cogs:
A Cog is a registered Python function that can be used in a task. To register a function, use the @cog function decorator:
from django_cog import cog

@cog
def my_task():
    pass
NOTE: These functions must be imported in your application's __init__.py file. Otherwise the auto-discovery process will not find them.
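For example, assuming your cogs live in a tasks.py module inside your app (module and function names here are illustrative):

# your_app/__init__.py
from .tasks import my_task  # importing here triggers @cog registration at startup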
Once Django starts, an auto-discovery process will find all functions with this decorator and create Cog records for them in the database. This allows them to be referenced in the Django admin.
Tasks:
Once you have cogs registered, you can create a task. Tasks are specific execution definitions for a cog, and are tied to a stage. This allows you to run the same function through multiple stages, if needed.
Parameters
If your function requires parameters, you can set these when creating the Task. See below for an example:
from django_cog import cog

@cog
def add(a, b):
    return a + b
Then, in the Task Django admin page, set these variables in the Arguments as JSON field:
{
    "a": 1,
    "b": 2
}
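The worker presumably passes these values to the cog as keyword arguments, so the call above is effectively:

add(a=1, b=2)  # returns 3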
Installation
IMPORTANT: It is required that the library is installed and migrations ran PRIOR to registering functions as cogs
.
First install the library with:
pip install django-cog
Then add it to your Django application's INSTALLED_APPS inside your settings.py file:
INSTALLED_APPS = [
    ...
    'django_cog.apps.DjangoCogConfig',
    'nested_inline',
    'django_celery_beat',
]
Lastly, run migrations:
python manage.py migrate django_cog
Cog Registration
Once migrations complete, it is safe to register your functions using the cog decorator:
from django_cog import cog

@cog
def my_task():
    pass
Manual Registration
If the library doesn't pick up your registered cogs and create Cog records in the admin automatically, you can call the Cog record creation function manually in the Django shell:
from django_cog.apps import create_cog_records
create_cog_records()
Note: This does require the functions to be registered cogs. You can list the functions django_cog has discovered with:
from django_cog import cog
print(cog.all)
This will output a dictionary where the key is the function name it discovered, and the value is the actual function itself.
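The output looks something like this (function names and memory addresses are illustrative):

{'my_task': <function my_task at 0x7f9b2c1d0040>, 'add': <function add at 0x7f9b2c1d00d0>}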
If your function is not listed here, the registration was not called. This is likely because the function is not imported in your application's __init__.py. Double-check that you are importing the function somewhere that runs on Django's startup (__init__.py is the recommended place for this).
Docker-Compose
Below is a sample docker-compose.yml segment to add the required services for Celery workers, Celery-Beat, and Redis:
version: '3'

volumes:
  volume_postgresdata:
  volume_django_media:
  volume_django_static:

services:
  postgresdb:
    image: postgres:9.6-alpine
    environment:
      - POSTGRES_DB
      - POSTGRES_USER
      - POSTGRES_PASSWORD
    restart: unless-stopped
    volumes:
      - volume_postgresdata:/var/lib/postgresql/data
    ports:
      - "127.0.0.1:5432:5432"
    networks:
      - backend

  djangoweb:
    build:
      context: .
      args:
        - DJANGO_ENVIRONMENT=${DJANGO_ENVIRONMENT:-production}
    image: djangoapp:latest
    networks:
      - backend
      - celery
      - frontend
    volumes:
      - .:/app/
      - volume_django_static:/var/staticfiles
      - volume_django_media:/var/mediafiles
    ports:
      - 127.0.0.1:8000:8000
    depends_on:
      - postgresdb
    environment:
      - POSTGRES_DB
      - POSTGRES_USER
      - POSTGRES_PASSWORD
      - POSTGRES_HOST=postgresdb
      - POSTGRES_PORT
      - DJANGO_SETTINGS_MODULE
      - FORCE_SCRIPT_NAME
      - DJANGO_ENVIRONMENT
      - SECRET_KEY
      - DJANGO_ALLOWED_HOSTS
    links:
      - "postgresdb"

  redis:
    image: "redis:alpine"
    networks:
      - celery

  celery:
    build:
      context: .
      dockerfile: Dockerfile.celery
      args:
        - DJANGO_ENVIRONMENT=${DJANGO_ENVIRONMENT:-production}
    command: celery -A django_cog worker -l info
    image: djangoapp_celery:latest
    volumes:
      - .:/app/
    environment:
      - POSTGRES_DB
      - POSTGRES_USER
      - POSTGRES_PASSWORD
      - POSTGRES_HOST=postgresdb
      - POSTGRES_PORT
      - FORCE_SCRIPT_NAME
      - DJANGO_ENVIRONMENT
      - DJANGO_SETTINGS_MODULE
      - SECRET_KEY
      - DJANGO_ALLOWED_HOSTS
    depends_on:
      - postgresdb
      - redis
    networks:
      - backend
      - celery
    links:
      - "postgresdb"

  celerybeat:
    build:
      context: .
      dockerfile: Dockerfile.celery
      args:
        - DJANGO_ENVIRONMENT=${DJANGO_ENVIRONMENT:-production}
    command: celery -A django_cog beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
    image: djangoapp_celerybeat:latest
    volumes:
      - .:/app/
    environment:
      - POSTGRES_DB
      - POSTGRES_USER
      - POSTGRES_PASSWORD
      - POSTGRES_HOST=postgresdb
      - POSTGRES_PORT
      - FORCE_SCRIPT_NAME
      - DJANGO_ENVIRONMENT
      - DJANGO_SETTINGS_MODULE
      - SECRET_KEY
      - DJANGO_ALLOWED_HOSTS
    depends_on:
      - celery
    networks:
      - backend
      - celery

networks:
  frontend:
    name: djangocog_frontend
  backend:
    name: djangocog_backend
  celery:
    name: djangocog_celery
And the matching Dockerfile.celery (which the celery and celerybeat services build from):
FROM python:3.8
ENV PYTHONUNBUFFERED 1
ARG DJANGO_ENVIRONMENT
# Make the static/media folders.
RUN mkdir /var/staticfiles
RUN mkdir /var/mediafiles
# Make a location for all of our stuff to go into
RUN mkdir /app
# Set the working directory to this new location
WORKDIR /app
# Add our Django code
ADD . /app/
RUN pip install --upgrade pip
# Install requirements for Django
RUN pip install -r requirements/base.txt
RUN pip install -r requirements/${DJANGO_ENVIRONMENT}.txt
RUN pip install -r requirements/custom.txt
# No need for an entry point as they are defined in the docker-compose.yml services
Queues
You can create specific queues within celery and assign tasks to these queues. Within the Task Django admin, you'll find a queue field that defaults to celery, the default queue that celery workers consume from. This default queue option is created automatically when you run the django-cog migrations.
Adding Queues
If you want to have a separate collection of workers dedicated to certain tasks, create a new CeleryQueue record in the Django admin and set your tasks to this queue.
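Programmatically, creating the queue might look like the sketch below; the import path and the queue_name field are assumptions, so check the django_cog models for the exact schema:

from django_cog.models import CeleryQueue

# 'queue_name' is an assumed field name, chosen to match the -Q flag below.
CeleryQueue.objects.create(queue_name='heavy_lifting')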
Additionally, you'll need workers to consume this queue. You can do this by adding a -Q queue_name parameter to the command call in the docker-compose.yml file's celery service:
services:
  celery:
    command: celery -A django_cog worker -l info -Q queue_name
Task Optimization and Weights
To achieve better runtimes, tasks are queued in order of greatest "weight" first. This works best when a stage has a few long-running tasks alongside extra short tasks whose summed runtimes do not exceed those of the long ones. The long tasks are launched first, so each stage's runtime tracks its longest individual task rather than an arbitrary pile of shorter tasks that happened to be queued ahead of it.
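For example, suppose a stage has tasks weighing 60s, 10s, 10s, and 10s, with two workers available. Heaviest-first, one worker takes the 60s task while the other clears the three 10s tasks, so the stage finishes in about 60 seconds. Lightest-first, the 60s task could be picked up last (at the 10-second mark), stretching the stage to roughly 70 seconds.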
The weight of a task is updated at the end of each completed run of its Pipeline. The weight field of a Task is literally the average number of seconds the task took to complete. (NOTE: This field is a decimal limited to 11 total digits, 5 of them after the decimal point, which means the maximum runtime supported for a single task is roughly 11.5 days.)
The sample size taken defaults to the past 10 runtimes of that task, but this can be adjusted by adding the following to your Django settings.py:
DJANGO_COG_TASK_WEIGHT_SAMPLE_SIZE = 10
If you do not want the weight auto-calculated, you can disable this calculation by adding the following to your Django settings.py:
DJANGO_COG_AUTO_WEIGHT = False
Note: Tasks will still be ordered by weight (descending) when queued, but they all default to a weight of 1 prior to any runs. You can also manually adjust this weight through the Task Django admin, which lets you determine the queuing order yourself (this, of course, requires disabling the auto-weight calculation as described above).
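For instance, pinning the queue order from the Django shell might look like this (the import path and task name are hypothetical; the weight field is described above):

from django_cog.models import Task

# Higher weight means queued earlier. Set DJANGO_COG_AUTO_WEIGHT = False first,
# or the next completed run will overwrite this value.
Task.objects.filter(name='rebuild_search_index').update(weight=100)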
Error Handling
Each task can be assigned as Critical (default True). This will cause the pipeline to halt should any critical task fail to execute.
NOTE: Currently running tasks will be allowed to complete, but no new tasks of the same stage will launch, and no further stages of that pipeline will begin.
This leaves the Pipeline in a Failed state, so if the pipeline has been marked as "Prevent overlapping runs", it cannot be launched again until the incomplete pipeline run object has been deleted (see below about failed pipelines). Once the last task of a failed pipeline completes, it will update the Pipeline's Completed On timestamp to reflect when that last task completed.
Error Handlers
Each task call is wrapped in a generic try/except clause. Within the except clause, an error handler function is called. By default, this is the defaultCogErrorHandler, which records the error in the CogErrors table in the Django admin. This stores information on the error type, the task run that failed, the date/time of the error, and the traceback of the error.
Custom Error Handlers
Each task can be assigned an error handler (if none is assigned, the defaultCogErrorHandler is used). You can write your own error handler function and register it like so:
from django_cog import cog_error_handler

@cog_error_handler
def myErrorHandler(error):
    """
    Do something with the error (Exception type).
    """
    print(error)
Any custom error handler needs to accept at least one positional parameter: the Exception that was raised.
OPTIONAL: You can also include a keyword argument task_run to have the failed task run object passed in:
from django_cog import cog_error_handler

@cog_error_handler
def myErrorHandler(error, task_run):
    """
    Do something with the error (Exception type)
    and know where it came from.
    """
    print("The following error came from the task:", task_run.task.name)
    print(error)
Failed Pipelines
Should a task marked critical fail, the task, its stage, and its pipeline will fall into a Failed state. By default, new pipeline runs cannot be started if the last run failed. This is intentional, so that the same error does not keep repeating itself.
However, you can override this safety feature by setting DJANGO_COG_OVERLAP_FAILED = True in your Django settings. Doing this will allow a pipeline to launch even if the last run of that pipeline failed.
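As with the weight settings above, this goes in your Django settings.py:

DJANGO_COG_OVERLAP_FAILED = True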
Canceling Pipeline Runs
When a pipeline is currently running, you can visit the Pipeline Runs Django admin page to see which ones are running (displayed at the top of the list). Opening the details of a running pipeline shows a red Cancel Run button at the top. Clicking it sends the pipeline into a Canceled state. Any currently running tasks will be allowed to complete. As soon as the last of the currently running tasks completes, it updates the stage's Completed On field to properly reflect the end execution timestamp of that last task. The stage then also falls into a Canceled state, and no further stages or tasks will be launched.