mongo-to-s3
```javascript
var AWS = require("aws-sdk");
var MongoToS3 = require("mongo-to-s3");
// the transform signature used below (chunk, enc, cb) is the through2 API
var through = require("through2");

var s3 = new AWS.S3({
  accessKeyId: "myAccessKey",
  secretAccessKey: "mySecretAccessKey",
  region: "us-east-1"
});

var mongoToS3 = new MongoToS3(s3);

mongoToS3.createS3Sink({s3: {
  Bucket: "myBucket",
  Key: "myKey",
  ACL: "public-read"
}}, function(err, myS3Sink) {
  /*
   * myS3Sink is a writable stream that batch-uploads
   * data into S3 using the multipart upload API
   */
  mongoToS3.fromMongo([{
    // anything accepted by `mongoexport`
    exportOptions: "-h localhost:27017 -d database -c collection",
    workingDirectory: "/tmp" // some writable path on your machine
  }])
    .pipe(through(function(chunk, enc, cb) {
      // some processing step
      console.log("Processing:", chunk);
      this.push(chunk);
      cb();
    }))
    .pipe(myS3Sink);
});
```
If you want to process data from multiple `mongoexport` commands, just pass more configuration objects to `fromMongo`:

```javascript
mongoToS3.fromMongo([
  {
    exportOptions: "-h localhost:27017 -d database -c collection1",
    workingDirectory: "/tmp"
  },
  {
    exportOptions: "-h localhost:27017 -d database -c collection2",
    workingDirectory: "/tmp"
  }
])
  .pipe(through(function(chunk, enc, cb) {
    // both collection1 and collection2 are joined here
    this.push(chunk);
    cb();
  }))
  .pipe(someWritableStream); // any writable destination, e.g. an S3 sink
```
Sometimes you might want to process `mongoexport` results in separate processes to increase throughput:

```javascript
mongoToS3.fromMongo([
  {
    exportOptions: "-h localhost:27017 -d database -c collection1",
    workingDirectory: "/tmp"
  },
  {
    exportOptions: "-h localhost:27017 -d database -c collection2",
    workingDirectory: "/tmp"
  }
])
  .throughPipeline(__dirname + "/somePipeline.js")
  .pipe(someWritableStream);
```

`throughPipeline` takes the pathname of a file that exports a Duplex (or Transform) stream. Each `mongoexport` stream gets its own processing pipe that runs in an external process, and the results are then aggregated into the main process. In the example above, both collection1 and collection2 are uploaded to S3 after being processed by the stream exported by `somePipeline.js`.
Transfer mongo collections to s3