All the stuff common to our Rethink projects
// app, passport, knex, and authRouter are assumed to be already set up
const googleAuthCallback = require("rxs-lib/googleAuthCallback");
app.get(
  "/auth/google/callback",
  passport.authenticate("google", { failureRedirect: "/login" }),
  googleAuthCallback(knex, authRouter.addToken)
);
This is used to convert arrays of JS objects into CSV-ready arrays (a header row followed by value rows).
const tabulize = require("rxs-lib/tabulize");
const input = [
{ name: "John", age: 33 },
{ name: "Peter", age: 50 }
];
const output = tabulize(input);
// [
// ["name", "age"],
// ["John", 33],
// ["Peter", 50]
// ]
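A minimal sketch of what tabulize does, assuming all objects share the same keys as the first element (illustrative, not the library's actual code):

```javascript
// Convert an array of uniform objects into a header row plus value rows.
function tabulize(items) {
  if (items.length === 0) return [];
  const headers = Object.keys(items[0]); // column names taken from the first object
  const rows = items.map(item => headers.map(header => item[header]));
  return [headers, ...rows];
}

tabulize([{ name: "John", age: 33 }, { name: "Peter", age: 50 }]);
// [["name", "age"], ["John", 33], ["Peter", 50]]
```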
The output of tabulize is used here as input to generate a CSV string.
const csvify = require("rxs-lib/csvify");
const tabularData = [
["name", "age"],
["John", 33],
["Peter", 50]
];
const csv = csvify(tabularData);
// name,age
// "John",33
// "Peter",50
This function takes an array of numbers and calculates the average, considering only the present (not null or undefined), positive (> 0) values.
const { positiveAvg } = require("rxs-lib/math");
positiveAvg([2, 4, 6]); // 4
positiveAvg([0, 4, 6]); // 5
positiveAvg([0, 0, 8, 2]); // 5
positiveAvg([0, null, null, 8, 2]); // 5
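The behavior shown above can be sketched like this (illustrative; the return value for an input with no positive numbers is my assumption, not documented by the library):

```javascript
// Average only the values that are present (not null/undefined) and strictly positive.
function positiveAvg(values) {
  const positives = values.filter(v => typeof v === "number" && v > 0);
  if (positives.length === 0) return 0; // assumption: 0 when nothing qualifies
  return positives.reduce((sum, v) => sum + v, 0) / positives.length;
}

positiveAvg([2, 4, 6]);             // 4
positiveAvg([0, null, null, 8, 2]); // 5
```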
Show / hide JSX components based on roles
import Auth from "rxs-lib/Auth";
<Auth authorizedRoles="role1,role2" userRoles={["role1"]}>
<p>Private content</p>
</Auth>
Note: authorizedRoles can also be an array of strings. If you don't provide authorizedRoles, the content is always shown.
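The role check behind such a component can be sketched as plain logic (hypothetical helper, not the library's actual code):

```javascript
// Decide whether content should be shown for the given roles.
// authorizedRoles may be a comma-separated string, an array of strings,
// or omitted entirely (no restriction: always shown).
function isAuthorized(authorizedRoles, userRoles = []) {
  if (authorizedRoles == null) return true;
  const allowed = Array.isArray(authorizedRoles)
    ? authorizedRoles
    : authorizedRoles.split(",").map(role => role.trim());
  return allowed.some(role => userRoles.includes(role));
}

isAuthorized("role1,role2", ["role1"]); // true
isAuthorized(["role1", "role2"], ["role3"]); // false
isAuthorized(undefined, []); // true
```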
const nlp = require("rxs-lib/nlp");
const tokens = nlp.tokenize("¡Y \ntambién acentos, María!");
// tokens = ["y", "tambien", "acentos", "maria"]
Emojis get stripped away, so they cannot be counted toward the index location. The function might also get confused by very short words appearing just before the searched token.
const nlp = require("rxs-lib/nlp");
const indexes1 = nlp.approximateIndexesOf("Hard is hard, but you know that hard is hard, yes?", "hard is hard");
// indexes1 = [0, 32]
const indexes2 = nlp.approximateIndexesOf("You cannot find only a piece of the sentence", "only the piece");
// indexes2 = []
const indexes3 = nlp.approximateIndexesOf("Hey, guys!!! Here we are 😁, enjoying the first words that came out 💤", "first words");
// indexes3 = [42] Should be 43, but emojis cannot be counted
const indexes4 = nlp.approximateIndexesOf("A more simple example", "SimplE");
// indexes4 = [7]
const indexes5 = nlp.approximateIndexesOf("And, sometimes, an a-word is harder", "a-word");
// indexes5 = [16] Should be 19, but the 'an' token gives a false positive
const bigquery = require("@google-cloud/bigquery");
const buildBigQueryUploader = require("rxs-lib/bqUploader");
const bq = new bigquery.BigQuery({ projectId: "your-project-id" });
const bqUploader = await buildBigQueryUploader({
bqClient: bq,
chunkSize: 5000, // Internal use; you shouldn't need to set it
datasetId: "your-dataset-id",
id: "igv2-scraper", // Use any text that uniquely identifies this kind of task
maxFileSizeInBytes: 1024 * 1024 * 500, // This would be 500 MB; the default is 2 GB
tableId: "your-table-id",
tempPath: "/home/user/temp", // Defaults to .
uniqueValidationFields: ["platform", "id"] // Duplicates validation **
});
await bqUploader.addItem({ name: "John", age: 33 });
await bqUploader.addItem({ name: "Susan", age: 32 });
// or...
await bqUploader.addItems([
{ name: "John", age: 33 },
{ name: "Susan", age: 32 }
]);
const totalItemsToUpload = await bqUploader.getTotalItemsToUpload();
// >> 2
const onTick = function onTick (totalItemsUploaded) {
console.log(`Uploaded ${totalItemsUploaded} of ${totalItemsToUpload}`);
};
const uploadJobs = await bqUploader.upload(onTick);
Creates a bqUploader-files directory in the temporary path provided (or . by default) and saves the added items to NEWLINE_DELIMITED_JSON text files, with at most chunkSize items per file.
When calling upload, it first concatenates the written files into n files (limited in size by the maxFileSizeInBytes parameter); then it reads those files one by one, deleting each after its successful upload.
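The chunked NEWLINE_DELIMITED_JSON batching described above can be sketched like this (illustrative only, not the library's code):

```javascript
// Split items into chunks of at most chunkSize and serialize each chunk
// as NEWLINE_DELIMITED_JSON (one JSON object per line), ready to write to a file.
function toNdjsonChunks(items, chunkSize) {
  const chunks = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize);
    chunks.push(chunk.map(item => JSON.stringify(item)).join("\n"));
  }
  return chunks;
}

toNdjsonChunks([{ a: 1 }, { a: 2 }, { a: 3 }], 2);
// ['{"a":1}\n{"a":2}', '{"a":3}']
```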
IMPORTANT: Always prefer addItems over addItem, as each of these operations writes to disk, which is a very costly operation.
If you have only one item and want to add and upload it immediately, without building a batch, you can use this feature:
// ...build uploader
const uploadJobs = await bqUploader.upload({ name: "John", age: 33 });
If you provide the uniqueValidationFields parameter, addItem won't add any item that would be a duplicate based on those fields. No error is thrown; the item is simply not added.
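The duplicate check can be sketched as a composite key built from the validation fields (hypothetical helper, not the library's code):

```javascript
// Track items already added, keyed by the values of the unique validation fields.
function makeDeduper(uniqueValidationFields) {
  const seen = new Set();
  return function isNewItem(item) {
    const key = uniqueValidationFields
      .map(field => JSON.stringify(item[field]))
      .join("|");
    if (seen.has(key)) return false; // duplicate: silently skipped
    seen.add(key);
    return true;
  };
}

const isNewItem = makeDeduper(["platform", "id"]);
isNewItem({ platform: "ig", id: 1, name: "a" }); // true
isNewItem({ platform: "ig", id: 1, name: "b" }); // false (same platform + id)
```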
If you call clear, all items written to files for the given uploader will be physically deleted.
// ...build uploader
await bqUploader.clear();
IMPORTANT: Use only if you know what you're doing, and with extreme care (you might be deleting items queued to be uploaded later)