
This package provides a simplistic task queue and scheduler for Django.
If execution of your tasks is mission critical, do not use this library; turn instead to a more robust solution such as Celery, as this package guarantees neither task execution nor unique execution.
It is geared towards basic apps, where simplicity matters more than reliability. The package offers a simple decorator syntax, including cron-like schedules.
Install the library:
$ pip install django-toosimple-q
Enable the app in settings.py:
INSTALLED_APPS = [
...
'django_toosimple_q',
...
]
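Since the worker polls the database for tasks and schedules (see the --tick option below), the app presumably ships database models; if so, apply migrations as usual after enabling it:
$ python manage.py migrate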
Tasks need to be registered using the @register_task() decorator. Once registered, they can be added to the queue by calling the .queue() function.
from django_toosimple_q.decorators import register_task
# Register a task
@register_task()
def my_task(name):
return f"Hello {name} !"
# Enqueue tasks
my_task.queue("John")
my_task.queue("Peter")
Registered tasks can be scheduled from code using this cron-like syntax:
from django_toosimple_q.decorators import register_task, schedule
# Register and schedule tasks
@schedule(cron="30 8 * * *", args=['John'])
@register_task()
def morning_routine(name):
return f"Good morning {name} !"
To consume the tasks, you need to run at least one worker:
$ python manage.py worker
The workers will take care of adding scheduled tasks to the queue when needed, and will execute the tasks.
The package autoloads tasks.py from all installed apps. While this is the recommended place to define your tasks, you can do so from anywhere in your code.
You can optionally give a custom name to your tasks. This is required when your task is defined in a local scope.
@register_task("my_favourite_task")
def my_task(name):
return f"Good morning {name} !"
You can set task priorities.
@register_task(priority=0)
def my_favourite_task(name):
return f"Good bye {name} !"
@register_task(priority=1)
def my_other_task(name):
return f"Hello {name} !"
# Enqueue tasks
my_other_task.queue("John")
my_favourite_task.queue("Peter") # will be executed before the other one
You can define retries=N and retry_delay=S to retry the task in case of failure. The delay (in seconds) will double on each failure.
@register_task(retries=10, retry_delay=60)
def send_email():
...
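With the settings above, and assuming the doubling applies from the first retry onward, the task would be retried roughly 60, 120, 240, ... seconds after each successive failure, for up to 10 attempts.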
You can mark a task as unique=True if the task shouldn't be queued again while already queued with the same arguments. This is useful for tasks such as cleaning or refreshing.
@register_task(unique=True)
def cleanup():
...
cleanup.queue()
cleanup.queue() # this will be ignored as long as the first one is still queued
You can assign tasks to specific queues, and then have your worker only consume tasks from specific queues using --queue myqueue or --exclude_queue myqueue. By default, workers consume all tasks.
@register_task(queue='long_running')
def long_task():
...
@register_task()
def short_task():
...
# Then run those with these workers, so that long
# running tasks don't prevent short running tasks
# from being run :
# manage.py worker --exclude_queue long_running
# manage.py worker
By default, last_check is set to now() on schedule creation. This means the schedule will only run on the next cron occurrence. If you need your schedules to run as soon as possible after initialisation, you can specify last_check=None.
@schedule(cron="30 8 * * *", last_check=None)
@register_task()
def my_task(name):
return f"Good morning {name} !"
By default, if some crons were missed (e.g. after a server shutdown or if the workers can't keep up with all tasks), the missed tasks will be lost. If you need the tasks to catch up, set catch_up=True.
@schedule(cron="30 8 * * *", catch_up=True)
@register_task()
def my_task(name):
...
You may define multiple schedules for the same task. In this case, it is mandatory to specify a unique name:
@schedule(name="morning_routine", cron="30 16 * * *", args=['morning'])
@schedule(name="afternoon_routine", cron="30 8 * * *", args=['afternoon'])
@register_task()
def my_task(time):
return f"Good {time} John !"
You may get the schedule's cron datetime provided as a keyword argument to the task using the datetime_kwarg argument:
@schedule(cron="30 8 * * *", datetime_kwarg="scheduled_on")
@register_task()
def my_task(scheduled_on):
return f"This was scheduled for {scheduled_on.isoformat()}."
Besides the standard Django management command arguments, the worker command supports the following arguments.
usage: manage.py worker [--queue QUEUE | --exclude_queue EXCLUDE_QUEUE]
[--tick TICK]
[--once | --until_done]
[--no_recreate | --recreate_only]
optional arguments:
--queue QUEUE which queue to run (can be used several times, all
queues are run if not provided)
--exclude_queue EXCLUDE_QUEUE
which queue not to run (can be used several times, all
queues are run if not provided)
--tick TICK frequency in seconds at which the database is checked
for new tasks/schedules
--once run once then exit (useful for debugging)
--until_done run until no tasks are available then exit (useful for
debugging)
--no_recreate do not (re)populate the schedule table (useful for
debugging)
--recreate_only populates the schedule table then exit (useful for
debugging)
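For example, using only the flags documented above, you could dedicate one worker to a slow queue, keep another worker for everything else, and run a one-shot worker while debugging:
$ python manage.py worker --queue long_running
$ python manage.py worker --exclude_queue long_running
$ python manage.py worker --until_done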
A queued email backend to send emails asynchronously, preventing your website from failing completely in case the upstream backend is down.
Enable and configure the app in settings.py:
INSTALLED_APPS = [
...
'django_toosimple_q.contrib.mail',
...
]
EMAIL_BACKEND = 'django_toosimple_q.contrib.mail.backends.QueueBackend'
# Actual Django email backend used, defaults to django.core.mail.backends.smtp.EmailBackend, see https://docs.djangoproject.com/en/3.2/ref/settings/#email-backend
TOOSIMPLEQ_EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
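With the queue backend enabled, mail sent through Django's regular API should be queued and delivered later by a worker rather than sent synchronously. A minimal sketch using Django's standard send_mail (addresses are placeholders):
from django.core.mail import send_mail

# Returns immediately; a running worker presumably delivers the message
# later through TOOSIMPLEQ_EMAIL_BACKEND.
send_mail(
    "Subject",
    "Message body",
    "from@example.com",
    ["to@example.com"],
)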
To run tests locally (by default, tests run against an in-memory sqlite database):
$ pip install -r requirements-dev.txt
$ python manage.py test
To run tests against postgres, run the following commands first:
# Start a local postgres database
$ docker run -p 5432:5432 -e POSTGRES_PASSWORD=postgres -d postgres
# Set an env var
$ export TOOSIMPLEQ_TEST_DB=postgres # on Windows: `$Env:TOOSIMPLEQ_TEST_DB = "postgres"`
Tests are run automatically on GitHub.
Code style is enforced with pre-commit:
$ pip install -r requirements-dev.txt
$ pre-commit install
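Once installed, the hooks run on each commit; you can also run them manually across the whole repository:
$ pre-commit run --all-files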
Changelog:
2022-03-24 : v0.4.0
- last_check and last_run optional in the admin
- id fields
2021-07-15 : v0.3.0
- contrib.mail
- datetime_kwarg argument to schedules
2021-06-11 : v0.2.0
- retries, retry_delay options for tasks
2020-11-12 : v0.1.0