
taskobject
We have implemented the Task class around two concepts :
A Task is a class inherited from the Readable Stream class, so it can contain data and operate on it. It can also use the pipe method, like mytask.pipe(writableStream), to transfer its data to a Writable Stream. The output data is always in JSON format.
A Task contains Slots (mytask.myslot). These are objects inherited from the Writable Stream class, so we can push data to them, like readableStream.pipe(mytask.myslot). Each input needed to run a Task calculation is associated with one unique Slot (each Slot is created to receive only one unique input). As a consequence, a Task must have at least one Slot. The input data must be pushed in JSON format.
With these two concepts, we can easily construct a pipeline :
task_a.pipe(task_c.slot_1) // a -> c.1
task_b.pipe(task_c.slot_2) // b -> c.2
task_c.pipe(task_d.slot_1) // c -> d.1
task_d.pipe(task_e.slot_1) // d -> e.1
Here task_c contains two Slots : slot_1 and slot_2. slot_1 takes data from task_a, and slot_2 takes data from task_b. Then task_c pushes its results into slot_1 of task_d. Finally, the results of task_d are pushed into slot_1 of task_e.
Note that even if task_d has only one input, the input is pushed to a Slot and not to the Task itself (same for task_e).
If you want to learn more about pipelines, see our pipelineobject project (GitHub repo, NPM package).
In your project repository :
npm install taskobject
This module can only be used directly through its test modes : taskobject is an abstract class, designed as a base from which bioinformatic tasks are implemented by inheritance.
Two test modes are available. Each one is based on a child class of the taskobject :
In your JS script, import the test file :
var tkTest = require('./node_modules/taskobject/test/test');
Then you have to start and set up a JM (Job Manager). We provide a method that takes care of that :
tkTest.JMsetup();
JMsetup returns an EventEmitter instance. It emits "ready" when the JM is ready to receive jobs, providing the JM object.
Then, you can run the simpleTest method (or the dualTest method) :
tkTest.JMsetup().on('ready', function (JMobject) {
tkTest.simpleTest(inputFile, management);
tkTest.dualTest(inputFile1, inputFile2, management);
});
The inputFile arguments are absolute paths to your input file(s); no specific format is needed. management is a literal like :
let management = {
    'jobManager' : JMobject // provided by the JMsetup method
}
The simpleTest method :
- instantiates a simpletask (more info in the SimpleTask section),
- creates a Readable stream with your inputFile content (in a JSON),
- pipes the stream into the simpleTask.input slot,
- pipes the simpletask object to process.stdout, so you can watch the results in your console.
The dualTest method :
- instantiates a dualtask (more info in the DualTask section),
- creates two Readable streams, each containing a JSON with an input content (inputFile1 and inputFile2),
- pipes the stream of inputFile1 into the dualTask.input1 slot,
- pipes the stream of inputFile2 into the dualTask.input2 slot,
- pipes the dualtask object to process.stdout, so you can watch the results in your console.
The previous tests are already implemented in the ./node_modules/taskobject/test/ directory. To use them :
node ./node_modules/taskobject/test/test.js
This script needs some command line options. You can use option -h to display the help.
To read before beginning :
In your project directory :
tsc --init # initialize a TS project (tsconfig.json)
npm init # say yes to all
npm install --save taskobject # we need the taskobject package
npm install --save-dev @types/node # in TS you need node types
Your directories must be organized like the following directory tree :
.
├── data
│ └── myCoreScript.sh
├── index.js
├── node_modules
│ ├── @types
│ │ └── node
│ └── taskobject
├── package.json
├── test
│ └── test.js
├── ts
│ └── src
│ ├── index.ts
│ ├── test
│ │ └── test.ts
│ └── types
│ └── index.ts
├── tsconfig.json
└── types
└── index.js
{
"compilerOptions": {
"allowJs" : true,
"baseUrl": ".",
"lib": [ "dom", "es7" ],
"listEmittedFiles" : true,
"listFiles" : false,
"maxNodeModuleJsDepth" : 10,
"module": "commonjs",
"moduleResolution" : "node",
"outDir" : "./",
"paths": {
"*": [ "node_modules/" ]
},
"preserveConstEnums" : true,
"removeComments" : false,
"target": "ES6"
},
"files": [ // path to the files to compile
"./ts/src/index.ts",
"./ts/src/test/test.ts"
]
}
Every Task must have a bash script which runs the calculations. We named it the core script.
In your core script, you can access :
- the input files, through bash variables named after your Slots (see the slotSymbols array in The constructor part),
- the variables exported through the options literal (see the Options Literal part),
- the modules loaded through the options literal (see the Options Literal part).
Warning : the standard output of the core script must be only a JSON containing the results. Otherwise, your Task will crash.
Example :
# Take the content of myInputA :
contentInputA=`cat $myInputA` # (1)
# Run myModule1 with myInputB as a parameter :
myModule1 $myInputB > /dev/null # (2)
# Run myModule2 with the options : ' -ncpu 16 -file /path/toto.txt ' :
myModule2 $myVar_module2 > /dev/null # (3)
# Create the JSON as output (it must be a single valid JSON on stdout) :
echo "{ \"pathOfCurrentDir\" : \"$(pwd)\" }" # the path of the current directory
Remark : the key used in the stdout JSON is important for the implementation of the prepareResults method (see The methods to implement section).
In the current directory (see the Directory tree section), you have to create a JavaScript file named index.js, where you will create your task class.
Remark : do not forget to export your class !
Your class must inherit from the taskobject (in TS you have to declare all the slots before writing the constructor) :
import tk = require('taskobject');
declare var __dirname; // mandatory
class my_custom_task extends tk.Task {
public readonly myInputA;
public readonly myInputB;
}
In your constructor, you must : (1) call the parent constructor with management and options, (2) set this.rootdir, (3) set this.coreScript (the path to your core script), (4) fill the slotSymbols array (one symbol per Slot), (5) call super.initSlots(). Example :
constructor(management, options) {
super(management, options); // (1)
this.rootdir = __dirname; // (2)
this.coreScript = this.rootdir + '/data/myCoreScript.sh'; // (3)
this.slotSymbols = ['myInputA', 'myInputB']; // (4)
super.initSlots(); // (5)
}
Note : management (see the Management Literal part) and options (see the Options Literal part) are literals.
You have to override two methods :
prepareJob (inputs) {
return super.configJob(inputs);
}
/* REMARK : 'pathOfCurrentDir' is the key you gave in your core script as JSON output */
prepareResults (chunkJson) {
return {
[this.outKey] : chunkJson.pathOfCurrentDir
}
}
These examples can simply be copy-pasted as-is for your usage, except that you must change "pathOfCurrentDir" : replace it with the key used in the stdout JSON of your core script (see the example in The core script part).
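The remapping that prepareResults performs can be illustrated in isolation. Here the outKey value ('myTaskOutput') and the stdout content are invented for the example :

```javascript
// Hypothetical illustration of the prepareResults remapping : the key
// produced by the core script ("pathOfCurrentDir") becomes the task's
// output key (this.outKey in a real task; a plain constant here).
const outKey = 'myTaskOutput'; // assumed value, stands in for this.outKey
const coreScriptStdout = '{ "pathOfCurrentDir": "/tmp/work/job_42" }';

function prepareResults(chunkJson) {
  return { [outKey]: chunkJson.pathOfCurrentDir };
}

const results = prepareResults(JSON.parse(coreScriptStdout));
console.log(results); // { myTaskOutput: '/tmp/work/job_42' }
```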
In a directory named ./test/ (see the Directory tree section), you have to create a JavaScript file to test your task :
import customTask = require('../index');
let aTaskInstance = new customTask.my_custom_task(myManagement, myOptions);
The management literal can contain 2 keys :
- jobManager (required module = "ms-jobmanager/build/nativeJS/job-manager-client") : an instance of a JM client (see the Job Manager section) [mandatory].
- jobProfile (string) : the profile to run the job [optional]. This profile will be passed to the JM and defines the running settings for the job (nodes, queues, users, groups, etc.).
Example :
let myManagement = {
'jobManager' : JMobject,
'jobProfile' : 'default'
}
The options literal can contain 3 keys :
- logLevel (string) : specify a verbosity level [optional]. Choose between debug, info, success, warning, error and critical.
- modules ([string]) : an array of modules to load before the run of the core script [optional].
- exportVar (literal) : a dictionary of variables to export before the run of the core script [optional]. Each key is the name of a variable and each value is its content.
Example :
let myOptions = {
'logLevel': 'debug',
'modules' : ['myModule1', 'myModule2'],
'exportVar' : { 'myVar1' : '/an/awesome/path/to/a/file.exe',
'myVar_module2' : ' -ncpu 16 -file /path/toto.txt ' }
};
Still in your test file, create a Readable Stream with your input (in JSON format), and pipe it into the task instance :
const stream = require('stream'); // needed for stream.Readable

let aFirstInput = 'hello world';
let rs = new stream.Readable();
rs.push('{ "myInputA" : "' + aFirstInput + '" }'); // JSON format : the key is the Slot name
rs.push(null);
rs.pipe(aTaskInstance.myInputA);
Warning : the key in the JSON must be the name of the Slot you push your data on.
Your task can emit events since it is a Readable Stream. When you listen to the following events, the callback gives you some arguments :
"processed" : when the task is successfully finished ; [arguments] : the results in JSON format."err" : when an error occured with the task or the JM ; [arguments] : the error."stderrContent" : when an error occured with the coreScript ; [arguments] : the error."lostJob" : when the JM has lost the job ; [arguments] : the message and the job id.As example :
aTaskInstance.on('processed', res => {
console.log("I have my results :");
console.log(res);
})
A Job Manager (JM) is a MicroService necessary to run a Task. In our case, we use the ms-jobmanager package (GitHub repo), adapted for SLURM, SGE and your own machine.
The simpletask has been implemented only for the tests. It contains only one Slot (input) :
- simpletask.input takes a JSON containing an "input" key (via a pipe, like x.pipe(simpletask.input)).
- The results can be read by piping the simpletask itself, like simpletask.pipe(y).
The dualtask has been implemented only to test a task with two Slots (input1 and input2) :
- dualtask.input1 takes a JSON containing an "input1" key (via a pipe, like x.pipe(dualtask.input1)). Same for "input2" (y.pipe(dualtask.input2)).
- The results can be read by piping the dualtask itself, like dualtask.pipe(z.slot).
dualtask.input1 takes a JSON containing an "input1" key (via a pipe, like x.pipe(dualtask.input1)). Same for "input2" (y.pipe(dualtask.input2)).dualtask.pipe(z.slot).FAQs
A class to define tasks
The npm package taskobject receives a total of 42 weekly downloads. As such, taskobject popularity was classified as not popular.
We found that taskobject demonstrated a not healthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.
Did you know?

Socket for GitHub automatically highlights issues in each pull request and monitors the health of all your open source dependencies. Discover the contents of your packages and block harmful activity before you install or update your dependencies.

Security News
Multiple high-impact npm maintainers confirm they have been targeted in the same social engineering campaign that compromised Axios.

Security News
Axios compromise traced to social engineering, showing how attacks on maintainers can bypass controls and expose the broader software supply chain.

Security News
Node.js has paused its bug bounty program after funding ended, removing payouts for vulnerability reports but keeping its security process unchanged.