Prep your environment
$ echo "APP_NAME=appName" >> .env
$ echo "DATABASE_URL=postgres://user:password@hostname:port/database" >> .env
$ echo "REDIS_URL=redis://hostname:port/" >> .env
$ echo "ADMIN_USER=foo" >> .env
$ echo "ADMIN_PASS=pa$$word" >> .env
$ echo "NODE_ENV=local" >> .env
$ echo "CALLBACK_URL=http://localhost:3000" >> .env
$ echo "SECRET=secretkey" >> .env
Ensure your database and user are created
$ createdb dbname
$ createuser dbuser
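If you want the role to own the database and to carry the password referenced in DATABASE_URL, the standard PostgreSQL client tools can do that as well (a sketch; swap in your own names):
$ createuser --pwprompt dbuser
$ createdb -O dbuser dbname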
Start it up
$ foreman start
Meanwhile, in your code
const Cher = require('cher');
const cherTwitter = require('cher-twitter');
const cherInstagram = require('cher-instagram');
const tools = require('cher-tools');
// filter dirty words
const dirtyFilter = tools.pipeline.profanityFilter;
// save data to the database
const save = tools.pipeline.save;
// config helper
const config = tools.utils.config;
// Cher streams get configured here.
// Cher streams get configured here.
const sourceStreams = [
  {
    name: 'twitter',
    module: cherTwitter
  },
  {
    name: 'instagram',
    module: cherInstagram
  }
];

config.fetch({
  // When data is received it is immediately sent through this pipeline.
  // Place your filters, maps, etc. here in the order in which you want them
  // executed.
  pipeline: [ dirtyFilter, save ],
  // The sources for our streams
  sourceStreams: sourceStreams
})
.then(Cher.construct)
.spread((streams, cher) => {
  const twitterSource = cher.getStream('twitter');

  twitterSource
    .start()
    .spread((dataChannel, twitterSettings) => {
      dataChannel.on('data', (data) => {
        console.log(data); // a tweet!
      });
    });

  // sources are also event emitters
  twitterSource.on('start', (dataChannel, twitterSettings) => {
    // ...
  });
});
Cher is a stream processing platform. The best way to process a stream of data is through a data pipeline, and Cher provides several pipeline steps out of the box, including save(), which simply writes data to the database. To create your own step, write a function that returns a through stream. That function receives a single parameter, deps, an object that provides several components you can use in your pipeline step:
{
  models,
  ctx,
  settings
}
models
An instance of the Cher model manager, which gives you access to your underlying data model. You can access a model like so:
const DataModel = models.get('DataModel');
ctx
An instance of a key/value store that Cher has access to, which you can use to store arbitrary data:
ctx.set('foo', { bar: 'beep' })
  .then(() => ctx.get('foo'))
  .then((foo) => {
    console.log(foo); // { bar: 'beep' }
  })
  .catch((err) => {
    // don't forget to catch errors!
    console.error(err);
  });
settings
The configured settings for the current sourceStream. If the data item coming through the pipeline is from Twitter, this will be the Twitter settings; if it is from S3, it will be the S3 settings.
So, what does a custom pipeline step look like? Simple:
const es = require('event-stream');
const tools = require('cher-tools');

const utils = tools.utils;

module.exports = (deps) => {
  // es.through needs a regular function (not an arrow) so that `this.emit` works
  return es.through(function (data) {
    // parse the raw stream data into a JS object
    const item = utils.parse(data);

    // transform the data somehow
    // ...

    // reserialize the data for more efficient processing
    const serialized = utils.serialize(item);

    // when you're done, emit the serialized data so that it can go to the next step
    // in the pipeline. You can also NOT emit data based on some condition. This effectively
    // turns the stream from a map into a filter.
    this.emit('data', serialized);
  });
};
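To round this out, here is a sketch of a filter-style step and how it would be registered in the pipeline alongside the built-in steps. The ./skip-non-tweets.js filename and the settings.name field are assumptions for illustration only; the registration itself mirrors the configuration example above.
// skip-non-tweets.js -- a filter-style step: items it does not emit are dropped from the pipeline
const es = require('event-stream');

module.exports = (deps) => {
  return es.through(function (data) {
    // deps.settings describes the source this item came from; the `name` field is assumed here
    if (deps.settings && deps.settings.name === 'twitter') {
      this.emit('data', data); // pass tweets through unchanged
    }
    // anything else is silently dropped
  });
};
Then register it wherever you call config.fetch, in the order you want it executed:
const skipNonTweets = require('./skip-non-tweets');

config.fetch({
  // custom steps are plain functions, so they sit next to the built-in ones
  pipeline: [ skipNonTweets, dirtyFilter, save ],
  sourceStreams: sourceStreams
})
.then(Cher.construct);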