asyncjobs: Asynchronous job scheduler. Using asyncio to run jobs in worker threads/processes.
A job scheduler for running asynchronous (and synchronous) jobs with dependencies using asyncio. Jobs are coroutines (async def functions) with a name and, optionally, a set of dependencies (i.e. names of other jobs that must complete successfully before this job can start). The job coroutine may await the results from other jobs, schedule work to be done in a thread or subprocess, or use various other things provided by the particular Context object passed to the coroutine. The job coroutines are run by a Scheduler, which controls the execution of the jobs, as well as the number of concurrent threads and processes doing work. The Scheduler emits events which allow e.g. progress and statistics to be easily collected and monitored. A separate module is provided to turn Scheduler events into an interactive scheduling plot.
A job coroutine completes in one of three ways:
- it can return normally, in which case its (optional) return value becomes the job result;
- it can raise an exception, in which case the job is considered to have failed; or
- it can be cancelled, signalled by raising asyncio.CancelledError inside the coroutine and having it propagate out of the coroutine.

The Scheduler handles its own cancellation (e.g. Ctrl-C) by cancelling all ongoing and remaining tasks as quickly and cleanly as possible. (A sketch of how a failed job affects its dependents follows the first example below.)
import asyncio
import time

from asyncjobs import Scheduler


def sleep():  # Run in a worker thread by job #2 below
    print(f'{time.ctime()}: Sleep for a second')
    time.sleep(1)
    print(f'{time.ctime()}: Finished sleep')


s = Scheduler()

# Job #1 prints uptime
s.add_subprocess_job('#1', ['uptime'])

# Job #2 waits for #1 and then sleeps in a thread
s.add_thread_job('#2', sleep, deps={'#1'})

# Job #3 waits for #2 and then prints uptime (again)
s.add_subprocess_job('#3', ['uptime'], deps={'#2'})

asyncio.run(s.run())
Running this example (code also available here) should produce output like this:
16:35:58 up 9 days 3:29, 1 user, load average: 0.62, 0.55, 0.55
Tue Feb 25 16:35:58 2020: Sleep for a second
Tue Feb 25 16:35:59 2020: Finished sleep
16:35:59 up 9 days 3:29, 1 user, load average: 0.62, 0.55, 0.55
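The dependency semantics above also determine what happens when a job fails. The following is a rough sketch (not taken from the project's examples) built only from the calls shown above: job #2 raises in its worker thread, so job #3, which requires #2 to complete successfully, should never start. How run() surfaces the failure is an assumption here, hence the broad try/except.

import asyncio
import time

from asyncjobs import Scheduler


def fail():  # Run in a worker thread by job #2 below, then raises
    time.sleep(0.1)
    raise RuntimeError('job #2 failed on purpose')


s = Scheduler()

# Job #1 prints uptime
s.add_subprocess_job('#1', ['uptime'])

# Job #2 waits for #1, then fails
s.add_thread_job('#2', fail, deps={'#1'})

# Job #3 requires #2 to complete successfully, so it should never run
s.add_subprocess_job('#3', ['uptime'], deps={'#2'})

try:
    asyncio.run(s.run())
except Exception as e:
    # Whether (and how) run() re-raises job failures is an assumption here;
    # catching broadly keeps this sketch robust either way.
    print(f'scheduler finished with an error: {e!r}')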
The next example fetches a random Wikipedia article, and then follows links to other articles until 10 articles have been fetched (a rough sketch of the pattern follows the sample output below). Sample output:
fetching https://en.wikipedia.org/wiki/Special:Random...
* [Nauru national netball team] links to 3 articles
fetching https://en.wikipedia.org/wiki/Nauru...
fetching https://en.wikipedia.org/wiki/Netball...
fetching https://en.wikipedia.org/wiki/Netball_at_the_1985_South_Pacific_Mini_Games...
* [Netball at the 1985 South Pacific Mini Games] links to 4 articles
* [Netball] links to 114 articles
fetching https://en.wikipedia.org/wiki/1985_South_Pacific_Mini_Games...
fetching https://en.wikipedia.org/wiki/Rarotonga...
fetching https://en.wikipedia.org/wiki/Cook_Islands...
* [Nauru] links to 257 articles
fetching https://en.wikipedia.org/wiki/Ball_sport...
* [Ball game] links to 8 articles
fetching https://en.wikipedia.org/wiki/Commonwealth_of_Nations...
* [Rarotonga] links to 43 articles
fetching https://en.wikipedia.org/wiki/Netball_Superleague...
* [Cook Islands] links to 124 articles
* [Netball Superleague] links to 25 articles
* [Commonwealth of Nations] links to 434 articles
* [1985 South Pacific Mini Games] links to 5 articles
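The bundled example's code is not reproduced here, but the fetch-and-follow-links pattern it describes can be sketched with plain asyncio (not asyncjobs itself); every helper below (fetch_article, extract_links, the crude regex) is a simplified assumption rather than the project's implementation.

import asyncio
import re
import urllib.request

RANDOM_URL = 'https://en.wikipedia.org/wiki/Special:Random'
MAX_ARTICLES = 10  # stop after this many fetches, like the bundled example


def fetch_article(url):
    # Blocking fetch, meant to run in a worker thread via asyncio.to_thread().
    print(f'fetching {url}...')
    with urllib.request.urlopen(url) as resp:
        return resp.geturl(), resp.read().decode('utf-8', errors='replace')


def extract_links(html):
    # Crude link extraction: other /wiki/ article paths found in the page.
    paths = re.findall(r'href="(/wiki/[^":#]+)"', html)
    return ['https://en.wikipedia.org' + p for p in paths]


async def crawl():
    fetched = 0
    queue = [RANDOM_URL]
    while queue and fetched < MAX_ARTICLES:
        # Take the next batch of URLs and fetch them concurrently in threads.
        batch, queue = queue[:MAX_ARTICLES - fetched], queue[MAX_ARTICLES - fetched:]
        results = await asyncio.gather(
            *(asyncio.to_thread(fetch_article, url) for url in batch)
        )
        fetched += len(results)
        for final_url, html in results:
            links = extract_links(html)
            title = final_url.rsplit('/', 1)[-1].replace('_', ' ')
            print(f'* [{title}] links to {len(links)} articles')
            queue.extend(links)


asyncio.run(crawl())

The package's own example presumably expresses each fetch as a scheduler job, so that the Scheduler manages concurrency and emits progress events; the sketch above only mirrors the overall control flow.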
The final example (which was used to produce the interactive scheduling plot mentioned above) simulates a simple build system: it creates a number of jobs (default: 10), each job sleeps for some random time (default: <=100ms), and each job has some probability of depending on each preceding job (default: 0.5). After awaiting its dependencies, each job may also split portions of its work into one or more sub-jobs, and await their completion, before finishing its remaining work. Everything is scheduled across a fixed number of worker threads (default: 4).
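A minimal sketch of the core of that simulation, using only the add_thread_job API shown above; sub-jobs and the worker-thread limit are omitted because their exact API is not shown in this document.

import asyncio
import random
import time

from asyncjobs import Scheduler


def make_work(name, duration):
    def work():  # runs in one of the Scheduler's worker threads
        print(f'{name}: working for {duration * 1000:.0f}ms')
        time.sleep(duration)
    return work


s = Scheduler()
names = []
for i in range(10):  # default: 10 jobs
    name = f'job{i}'
    duration = random.uniform(0, 0.1)  # default: sleep <= 100ms
    # Depend on each preceding job with probability 0.5 (the default).
    deps = {d for d in names if random.random() < 0.5}
    s.add_thread_job(name, make_work(name, duration), deps=deps)
    names.append(name)

asyncio.run(s.run())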
Run the following to install:
$ pip install asyncjobs
To work on asyncjobs, clone this repo, and run the following (in a virtualenv) to get everything you need to develop and run tests:
$ pip install -e .[dev]
Additionally, if you want to generate scheduling plots (as mentioned above), you need a couple more dependencies (plotly and numpy):
$ pip install -e .[dev,plot]
Alternatively, if you are using Nix, use the included shell.nix to get a development environment with everything automatically installed:
$ nix-shell
Use nox to run all tests, formatters and linters:
$ nox
This will run the pytest-based test suite under all supported Python versions, format the code with black, and run the flake8 linter.
Main development happens at https://github.com/jherland/asyncjobs/. Post issues and PRs there.