node_storage_manager
Node.js idiomatic client for Cloud Storage.

Node - Storage Pipe Manager allows worldwide storage and retrieval of any amount of data at any time. You can use Google Cloud Storage or an AWS S3 bucket for a range of scenarios, including serving website content, storing data for archival and disaster recovery, or distributing large data objects to users via direct download. Storage Pipe Manager is a pipe factory that lets you switch easily between Google Cloud, AWS S3, CLOUDINARY and FS without breaking anything or requiring any extra configuration.
Make sure to define your credentials as environment variables in your .zshrc or .bashrc file:
Google Bucket Declaration
export GOOGLE_APPLICATION_CREDENTIALS=/Users/nitrocode/comics-eagle-39849343-837483483.json
AWS S3 Declaration
export AWS_ACCESS_KEY_ID=284893748923yuwfhsdkfjshkfjh
export AWS_SECRET_ACCESS_KEY=982u289432u48jsdfkjsr3894
export AWS_SESSION_TOKEN (optional)
Cloudinary Declaration
export CLOUDINARY_URL=cloudinary://4737858435783453:3827489jksdhfjasfhjBB@nitrocode
DigitalOcean Spaces Declaration
export DG_ACCESS_KEY=284893748923yuwfhsdkfjshkfjh
export DG_SECRET_KEY=982u289432u48jsdfkjsr3894
Local NFS Declaration
export MOUNT_POINT=/Users/nitrocode/bucket/
We would advise declaring all of them at once so you can switch between clients easily; a small startup check along these lines is sketched below.
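As a sanity check, you could verify at startup that the variables for the backend you intend to use are actually set before requesting an instance. This is only a minimal sketch and is not part of node_storage_manager; the variable names match the declarations above.

// startup-check.js - hypothetical helper, not part of node_storage_manager
const required = {
  GCLOUD: ['GOOGLE_APPLICATION_CREDENTIALS'],
  AWS: ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'],
  CLOUDINARY: ['CLOUDINARY_URL'],
  DG: ['DG_ACCESS_KEY', 'DG_SECRET_KEY'],
  NFS: ['MOUNT_POINT'],
};

function assertCredentials(client) {
  // Collect any variables that are missing from the environment
  const missing = (required[client] || []).filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables for ${client}: ${missing.join(', ')}`);
  }
}

assertCredentials('AWS'); // throws if the AWS credentials were not exported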
npm i node_storage_manager
GCLOUD
node_storage_manager allows you to switch between clients easily without any reconfiguration.
// Imports the node_storage_manager library
const Storage = require('node_storage_manager');

// Select the storage instance: 'GCLOUD', 'AWS', 'CLOUDINARY', 'DG' or 'NFS'
let StorageInstance = Storage.getInstance('GCLOUD');

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// let bucketName = 'bucket-name';

async function download(bucketName) {
  // Downloads the file from the bucket to the given destination
  await StorageInstance.download(bucketName, 'file', 'destination');
  console.log(`file downloaded`);
}
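Because download is declared async, the caller has to handle the returned promise. A minimal, hypothetical invocation (the bucket name below is a placeholder) could look like this; the same pattern applies to the AWS, CLOUDINARY, DG and NFS samples that follow.

// Hypothetical call; replace 'my-bucket' with a bucket your credentials can access
download('my-bucket').catch(console.error);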
AWS
// Imports the node_storage_manager library
const Storage = require('node_storage_manager');

// Select the storage instance: 'GCLOUD', 'AWS', 'CLOUDINARY', 'DG' or 'NFS'
let StorageInstance = Storage.getInstance('AWS');

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// let bucketName = 'bucket-name';

async function download(bucketName) {
  // Downloads the file from the bucket to the given destination
  await StorageInstance.download(bucketName, 'file', 'destination');
  console.log(`file downloaded`);
}
CLOUDINARY
// Imports the node_storage_manager library
const Storage = require('node_storage_manager');

// Select the storage instance: 'GCLOUD', 'AWS', 'CLOUDINARY', 'DG' or 'NFS'
let StorageInstance = Storage.getInstance('CLOUDINARY');

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// let bucketName = 'bucket-name';

async function upload(bucketName) {
  // On a CLOUDINARY instance the third argument is the file type, e.g. 'image' or 'video'
  let result = await StorageInstance.upload(bucketName, 'filepath', 'image or video');
  console.log(result);
  // result contains everything returned by the Cloudinary client, e.g. result.url
}
DigitalOcean Spaces
// Imports the node_storage_manager library
const Storage = require('node_storage_manager');

// Select the storage instance and pass the region as the second argument, e.g. 'Asia'
let StorageInstance = Storage.getInstance('DG', 'Asia');

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// let bucketName = 'bucket-name';

async function upload(bucketName) {
  // Uploads the file to the Spaces bucket
  let result = await StorageInstance.upload(bucketName, 'filepath', 'image or video');
  console.log(result);
  // result contains everything returned by the underlying client
}
NFS
// Imports the node_storage_manager library
const Storage = require('node_storage_manager');

// Select the storage instance: 'GCLOUD', 'AWS', 'CLOUDINARY', 'DG' or 'NFS'
let StorageInstance = Storage.getInstance('NFS');

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// let bucketName = 'bucket-name';

async function download(bucketName) {
  // Downloads the file from the bucket to the given destination, e.g. /Users/nitrocode/tmp/
  await StorageInstance.download(bucketName, 'file to download', '/Users/nitrocode/tmp/');
  console.log(`file downloaded`);
}
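Since every backend exposes the same calls, switching providers mostly comes down to the name passed to getInstance. The sketch below is illustrative only; 'my-bucket', 'report.pdf' and './downloads/' are placeholder values.

// Imports the node_storage_manager library
const Storage = require('node_storage_manager');

// The backend is chosen by name; the calling code stays the same
async function fetchReport(client) {
  const StorageInstance = Storage.getInstance(client);
  await StorageInstance.download('my-bucket', 'report.pdf', './downloads/');
  console.log(`report downloaded via ${client}`);
}

// Switch between backends without any other configuration changes
fetchReport('GCLOUD').catch(console.error);
// fetchReport('AWS').catch(console.error);
// fetchReport('NFS').catch(console.error);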
## API Documentation
This section contains a reference to the storage-pipe module and all of its functions.

Note: to specify a region on S3 or DigitalOcean Spaces, pass it as the second parameter to getInstance.

// Imports the node_storage_manager library
const Storage = require('node_storage_manager');

// Pass the region as the second argument, e.g. 'Asia'
let StorageInstance = Storage.getInstance('AWS', 'Asia'); // or Storage.getInstance('DG', 'Asia')

// The methods documented below (upload, download, and so on) are then available on StorageInstance
StorageInstance.upload()
Download file from GCLOUD, AWS & NFS using storage pipe

Parameters:
- bucketName - required, bucket name to download files from
- filename - required, file to download from the bucket
- destination - required, where to put the file when the download is done

Upload file to GCLOUD, AWS & NFS using storage pipe

Parameters:
- bucketName - required, bucket name to upload files to
- filename - required, file to upload to the bucket
- destination - optional, for renaming the file during upload, i.e. if bob.jpg is being uploaded, setting destination on the upload method will store the file under the destination value

Parameters required if on a CLOUDINARY instance:
- bucketName - required, bucket name to upload files to
- filename - required, file to upload to the bucket
- fileType - required, type of file to upload, e.g. image, video

Create bucket in GCLOUD, AWS & NFS using storage pipe

Parameters required if on an S3 instance:
- bucketName - required, name of the bucket to create
- ACL - required, defines which AWS accounts or groups are granted access and the type of access, e.g. public-read

Parameters required if on a GCLOUD instance:
- bucketName - required, name of the bucket to create
- location - required, defines a specific region, e.g. ASIA
- storageClass - optional, e.g. coldline storage; leave the argument blank for default settings

Parameters required if on an NFS instance:
- bucketName - required, name of the bucket to create

Delete bucket in GCLOUD, AWS & NFS using storage pipe

Parameters:
- bucketName - required, name of the bucket to delete

List buckets in GCLOUD, AWS & NFS using storage pipe

Parameters:
- None

List files in a bucket on GCLOUD, AWS & NFS using storage pipe

Parameters:
- bucketName - required, name of the bucket to list files from

Delete file in a bucket on GCLOUD, AWS & NFS using storage pipe

Parameters:
- bucketName - required, name of the bucket to delete the file from
- filename - required, name of the file to delete

Get metadata from GCLOUD Storage (note: this is only applicable to the GCLOUD instance)

Parameters:
- bucketName - required, name of the bucket whose metadata to fetch
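For example, the optional destination argument to upload described above can be used to rename a file as it is stored. A minimal sketch, assuming an AWS instance and placeholder names:

// Imports the node_storage_manager library
const Storage = require('node_storage_manager');

let StorageInstance = Storage.getInstance('AWS');

// Uploads the local file 'bob.jpg' and stores it in the bucket under the name 'profile.jpg'
// 'my-bucket', 'bob.jpg' and 'profile.jpg' are placeholder values
StorageInstance.upload('my-bucket', 'bob.jpg', 'profile.jpg')
  .then(() => console.log('upload complete'))
  .catch(console.error);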
This library follows Semantic Versioning.

This library is considered to be General Availability (GA). This means it is stable; the code surface will not change in backwards-incompatible ways unless absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against GA libraries are addressed with the highest priority.
Contributions welcome! See the Contributing Guide.
Apache Version 2.0
See LICENSE
The npm package node_storage_manager receives a total of 3 weekly downloads. As such, node_storage_manager's popularity is classified as not popular. We found that node_storage_manager demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.