
Dependency-free script to spool jobs into the SLURM scheduler without exceeding queue capacity limits.
You need to submit more slurm scripts than fit on the queue at once.
tree .
.
├── slurmscript0.slurm.sh
├── slurmscript1.slurm.sh
├── slurmscript2.slurm.sh
├── slurmscript3.slurm.sh
├── slurmscript4.slurm.sh
├── slurmscript5.slurm.sh
├── slurmscript6.slurm.sh
├── slurmscript7.slurm.sh
├── slurmscript8.slurm.sh
...
The qspool script will feed your job scripts onto the queue as space becomes available.
python3 -m qspool *.slurm.sh
You can also provide job names via stdin, which is useful for very large job batches.
find . -maxdepth 1 -name '*.slurm.sh' | python3 -m qspool
The qspool script creates a slurm job that submits your job scripts. When queue capacity fills, this qspool job will schedule a follow-up job to submit any remaining job scripts. This process continues until all job scripts have been submitted.
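The capacity check behind this process can be sketched in a few lines of Python. This is an illustration only; `plan_submission` and its parameters are hypothetical names, not part of qspool's actual API.

```python
# Illustrative sketch of the spooling decision described above;
# qspool's real implementation differs. All names here are hypothetical.

def plan_submission(script_paths, queue_capacity, jobs_on_queue):
    """Split scripts into those to submit now and those deferred
    to a follow-up spooler job."""
    free_slots = max(queue_capacity - jobs_on_queue, 0)
    # If work will remain, reserve one slot for the follow-up spooler job.
    if len(script_paths) > free_slots and free_slots > 0:
        free_slots -= 1
    return script_paths[:free_slots], script_paths[free_slots:]

scripts = [f"slurmscript{i}.slurm.sh" for i in range(8)]
submit_now, defer = plan_submission(scripts, queue_capacity=5, jobs_on_queue=2)
# submit_now holds 2 scripts; the remaining 6 wait for the follow-up job
```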
usage: qspool.py [-h] [--payload-job-script-paths-infile PAYLOAD_JOB_SCRIPT_PATHS_INFILE] [--job-log-path JOB_LOG_PATH] [--job-script-cc-path JOB_SCRIPT_CC_PATH]
[--queue-capacity QUEUE_CAPACITY] [--qspooler-job-title QSPOOLER_JOB_TITLE]
[payload_job_script_paths ...]
positional arguments:
payload_job_script_paths
What scripts to spool onto slurm queue? (default: None)
options:
-h, --help show this help message and exit
--payload-job-script-paths-infile PAYLOAD_JOB_SCRIPT_PATHS_INFILE
Where to read script paths to spool onto slurm queue? (default: <_io.TextIOWrapper name='<stdin>' mode='r' encoding='utf-8'>)
--job-log-path JOB_LOG_PATH
Where should logs for qspool jobs be written? (default: ~/joblog/)
--job-script-cc-path JOB_SCRIPT_CC_PATH
Where should copies of submitted job scripts be kept? (default: ~/jobscript/)
--queue-capacity QUEUE_CAPACITY
How many jobs can be running or waiting at once? (default: 1000)
--qspooler-job-title QSPOOLER_JOB_TITLE
What title should be included in qspooler job names? (default: none)
no installation:
python3 "$(tmpfile="$(mktemp)"; curl -s https://raw.githubusercontent.com/mmore500/qspool/v0.5.0/qspool.py > "${tmpfile}"; echo "${tmpfile}")" [ARGS]
pip installation:
python3 -m pip install qspool
python3 -m qspool [ARGS]
qspool has zero dependencies, so no setup or maintenance is required to use it.
Compatible all the way back to Python 3.6, so it will work on your cluster's ancient Python install.
qspool
* read contents of target slurm scripts
* instantiate qspooler job script w/ target slurm scripts embedded
* submit qspooler job script to slurm queue
⬇️ ⬇️ ⬇️
qspooler job 1
* submit embedded target slurm scripts one by one until queue is almost full
* instantiate qspooler job script w/ remaining target slurm scripts embedded
* submit qspooler job script to slurm queue
⬇️ ⬇️ ⬇️
qspooler job 2
* submit embedded target slurm scripts one by one until queue is almost full
* instantiate qspooler job script w/ remaining target slurm scripts embedded
* submit qspooler job script to slurm queue
...
qspooler job n
* submit embedded target slurm scripts one by one
* no embedded target slurm scripts remain
* exit
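The "instantiate a follow-up spooler with the remaining scripts embedded" step above can be sketched as follows. This is a hypothetical illustration of the pattern, not qspool's actual code generation; all names are made up.

```python
# Hypothetical sketch of generating a follow-up spooler script with the
# remaining payload scripts embedded in its body (names are illustrative).

def make_spooler_script(remaining_payloads):
    """Render a follow-up job script embedding payload (name, contents)
    pairs directly, so no external state file is needed."""
    header = "#!/bin/bash\npython3 - <<'EOF'\n"
    body = f"payloads = {remaining_payloads!r}\n"
    note = "# ...submit payloads one by one, then re-embed leftovers...\n"
    return header + body + note + "EOF\n"

script = make_spooler_script([("job7.slurm.sh", "#!/bin/bash\necho hi")])
```

Because each spooler job carries its remaining work inside its own script body, the chain of jobs needs no shared files to coordinate.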
roll_q uses a similar approach to solve this problem but differs in implementation strategy: roll_q tracks submission progress via an index variable in a file associated with a job batch, whereas qspool embeds jobs in the submission worker script itself.
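For contrast, the index-file bookkeeping attributed to roll_q might look like this sketch (hypothetical names; not roll_q's actual code):

```python
# Hypothetical sketch of index-file progress tracking in the style
# attributed to roll_q above; contrast with qspool's self-embedding.
from pathlib import Path

def next_batch(index_file, all_scripts, batch_size):
    """Read the saved index, take the next batch of scripts, and
    persist the advanced index for the follow-up submission job."""
    path = Path(index_file)
    start = int(path.read_text()) if path.exists() else 0
    batch = all_scripts[start:start + batch_size]
    path.write_text(str(start + len(batch)))
    return batch
```

Unlike embedding, this approach leaves mutable state on the filesystem between submission jobs, which is exactly the implementation difference noted above.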