npm-storage-package · latest version 1.0.2 · source: npm · 1 maintainer
🌐 NPM Storage Package

A TypeScript-based npm package for uploading files to AWS S3, Google Cloud Storage (GCS), and Azure Blob Storage, with Express.js/Multer integration. One unified API for all your cloud storage needs!


🚀 Features

Unified API – One consistent interface for S3, GCS, and Azure
Express.js + Multer Integration – Drop-in replacement for standard multer
Multiple Upload Patterns – Single file, multiple files, multiple fields
TypeScript Support – Full type safety and IntelliSense
Flexible Configuration – Field-specific settings and validation
Production Ready – Error handling, file filtering, size limits

📦 Installation

npm install npm-storage-package multer express
# or
yarn add npm-storage-package multer express

🔧 Quick Start

1️⃣ Basic Setup

import express from 'express';
import { StorageFactory } from 'npm-storage-package';

const app = express();

// Configure once, use everywhere
const storage = new StorageFactory({
  azure: {
    bucket: 'my-container',
    credentials: { connectionString: process.env.AZURE_CONNECTION_STRING }
  },
  s3: {
    bucket: 'my-bucket',
    region: 'us-east-1',
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
    }
  },
  gcs: {
    bucket: 'my-gcs-bucket',
    projectId: 'my-project-id',
    keyFilename: './serviceaccountkey.json'
  }
});

2️⃣ Single File Upload

app.post('/upload/single', 
  storage.azureMulter({
    destination: 'uploads/',
    generateFileName: (req, file) => `${Date.now()}-${file.originalname}`
  }).single('file'), 
  (req, res) => {
    res.json({
      success: true,
      file: {
        originalName: req.file.originalname,
        cloudUrl: req.file.path,
        cloudKey: req.file.cloudKey,
        size: req.file.size
      }
    });
  }
);

3️⃣ Multiple Files Upload

app.post('/upload/multiple', 
  storage.s3Multer({
    destination: 'gallery/',
    fileFilter: (req, file) => file.mimetype.startsWith('image/'),
    limits: { fileSize: 5 * 1024 * 1024 } // 5MB
  }).array('photos', 10), 
  (req, res) => {
    const uploadedFiles = req.files.map(file => ({
      originalName: file.originalname,
      cloudUrl: file.path,
      size: file.size
    }));
    
    res.json({
      success: true,
      totalFiles: req.files.length,
      files: uploadedFiles
    });
  }
);

4️⃣ Multiple Fields Upload

app.post('/upload/profile', 
  storage.gcsMulter({
    fieldConfigs: {
      avatar: {
        destination: 'avatars/',
        allowedExtensions: ['.jpg', '.jpeg', '.png'],
        fileSizeLimit: 2 * 1024 * 1024, // 2MB
        generateFileName: (req, file) => `avatar-${req.user.id}-${Date.now()}.jpg`
      },
      document: {
        destination: 'documents/',
        allowedExtensions: ['.pdf', '.doc', '.docx'],
        fileSizeLimit: 10 * 1024 * 1024 // 10MB
      }
    }
  }).fields([
    { name: 'avatar', maxCount: 1 },
    { name: 'document', maxCount: 3 }
  ]), 
  (req, res) => {
    res.json({
      success: true,
      files: {
        avatar: req.files.avatar?.[0]?.path,
        documents: req.files.document?.map(f => f.path)
      }
    });
  }
);

📚 API Reference

StorageFactory Class

The main class for creating configured storage instances.

// All three provider configs are optional; supply only the ones you use.
const storage = new StorageFactory({
  azure: azureConfig, // AzureConfig, optional
  s3: s3Config,       // S3Config, optional
  gcs: gcsConfig      // GCSConfig, optional
});

Provider Methods

Each provider has a unified method that returns a configured multer instance:

storage.azureMulter(options)

storage.s3Multer(options)

storage.gcsMulter(options)

Upload Options

| Option | Type | Description |
| --- | --- | --- |
| destination | string | Upload destination path |
| generateFileName | Function | Custom filename generator |
| fileFilter | Function | File validation function |
| limits | Object | File size/count limits |
| fieldConfigs | Object | Field-specific configurations |
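Taken together, the options can be combined in a single object. A hedged sketch (option names from the table; all values are illustrative):

```javascript
// Hypothetical options object combining every documented upload option.
const uploadOptions = {
  destination: 'uploads/',
  // Custom filename generator: prefix the original name with a timestamp.
  generateFileName: (req, file) => `${Date.now()}-${file.originalname}`,
  // File validation: accept only image MIME types.
  fileFilter: (req, file) => file.mimetype.startsWith('image/'),
  // Size limit of 5 MB per file.
  limits: { fileSize: 5 * 1024 * 1024 }
};
```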

Field Configuration

For multiple fields upload, each field can have:

| Option | Type | Description |
| --- | --- | --- |
| destination | string | Field-specific upload path |
| allowedExtensions | string[] | Allowed file extensions |
| fileSizeLimit | number | Maximum file size in bytes |
| generateFileName | Function | Custom filename for this field |
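To make the per-field rules concrete, here is an illustrative helper (not part of the package's API) that applies `allowedExtensions` and `fileSizeLimit` from the table to a multer-style file object:

```javascript
// Illustrative helper: validate a file against a field configuration.
function validateAgainstFieldConfig(file, config) {
  const dot = file.originalname.lastIndexOf('.');
  const ext = dot === -1 ? '' : file.originalname.slice(dot).toLowerCase();

  if (config.allowedExtensions && !config.allowedExtensions.includes(ext)) {
    return { ok: false, reason: `extension ${ext} not allowed` };
  }
  if (config.fileSizeLimit && file.size > config.fileSizeLimit) {
    return { ok: false, reason: 'file exceeds size limit' };
  }
  return { ok: true };
}
```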

🔧 Configuration

Azure Blob Storage

azure: {
  bucket: 'container-name',
  credentials: {
    connectionString: 'DefaultEndpointsProtocol=https;AccountName=...'
  }
}

AWS S3

s3: {
  bucket: 'bucket-name',
  region: 'us-east-1',
  credentials: {
    accessKeyId: 'AKIA...',
    secretAccessKey: 'your-secret-key'
  }
}

Google Cloud Storage

// Option 1: Service Account Key File
gcs: {
  bucket: 'bucket-name',
  projectId: 'your-project-id',  // or auto-extracted from key file
  keyFilename: './path/to/serviceaccountkey.json'
}

// Option 2: JSON Credentials
gcs: {
  bucket: 'bucket-name',
  credentials: {
    // Service account JSON object
    type: 'service_account',
    project_id: 'your-project-id',
    private_key: '-----BEGIN PRIVATE KEY-----\n...',
    client_email: 'service-account@project.iam.gserviceaccount.com'
    // ... other fields
  }
}
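Since the two GCS credential styles are mutually exclusive in practice, a small normalizer can pick between them. A hedged sketch (not part of the package; preference order is an assumption):

```javascript
// Illustrative helper: normalize the two GCS credential styles shown
// above, preferring an explicit key file when both are supplied.
function buildGcsConfig({ bucket, projectId, keyFilename, credentials }) {
  if (keyFilename) return { bucket, projectId, keyFilename };
  if (credentials) return { bucket, credentials };
  throw new Error('GCS config requires keyFilename or credentials');
}
```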

💡 Advanced Examples

Custom File Validation

const imageUpload = storage.s3Multer({
  destination: 'images/',
  fileFilter: (req, file) => {
    // Only allow images
    if (!file.mimetype.startsWith('image/')) {
      throw new Error('Only image files are allowed');
    }
    
    // Custom validation logic
    if (file.originalname.length > 100) {
      throw new Error('Filename too long');
    }
    
    return true;
  },
  generateFileName: (req, file) => {
    const userId = req.user?.id || 'anonymous';
    const timestamp = Date.now();
    const extension = file.originalname.split('.').pop();
    return `${userId}-${timestamp}.${extension}`;
  }
});

app.post('/upload/image', imageUpload.single('image'), handler);
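The filename pattern used above can be extracted into a standalone function, which makes it easy to test in isolation (illustrative; the injectable `now` parameter is an addition so the result is deterministic):

```javascript
// The generateFileName pattern from the example above, as a pure function.
function buildFileName(userId, originalname, now = Date.now()) {
  const extension = originalname.split('.').pop();
  return `${userId || 'anonymous'}-${now}.${extension}`;
}
```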

Dynamic Configuration

function createDynamicUpload(provider) {
  const configs = {
    azure: storage.azureMulter({ destination: 'azure-uploads/' }),
    s3: storage.s3Multer({ destination: 's3-uploads/' }),
    gcs: storage.gcsMulter({ destination: 'gcs-uploads/' })
  };

  return configs[provider]; // undefined for unknown providers
}

app.post('/upload/:provider', (req, res, next) => {
  const upload = createDynamicUpload(req.params.provider);
  if (!upload) {
    return res.status(400).json({ success: false, error: 'Unknown storage provider' });
  }
  upload.single('file')(req, res, next);
}, handler);

Error Handling

app.use((error, req, res, next) => {
  if (error.code === 'LIMIT_FILE_SIZE') {
    return res.status(400).json({
      success: false,
      error: 'File too large'
    });
  }
  
  if (error.code === 'LIMIT_FILE_COUNT') {
    return res.status(400).json({
      success: false,
      error: 'Too many files'
    });
  }
  
  // Handle cloud storage errors (guard against errors without a message)
  if (error.message && error.message.includes('bucket')) {
    return res.status(500).json({
      success: false,
      error: 'Storage configuration error'
    });
  }
  
  res.status(500).json({
    success: false,
    error: error.message
  });
});
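The same mapping can be pulled out of the middleware into a pure function so it is unit-testable on its own. An illustrative refactor (not part of the package; the fallback message is an assumption):

```javascript
// Pure classifier mirroring the error-handling middleware above.
function classifyUploadError(error) {
  if (error.code === 'LIMIT_FILE_SIZE') return { status: 400, message: 'File too large' };
  if (error.code === 'LIMIT_FILE_COUNT') return { status: 400, message: 'Too many files' };
  if ((error.message || '').includes('bucket')) {
    return { status: 500, message: 'Storage configuration error' };
  }
  return { status: 500, message: error.message || 'Upload failed' };
}
```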

🛠️ File Object Properties

After successful upload, files are available in req.file (single) or req.files (multiple):

{
  fieldname: 'avatar',
  originalname: 'profile.jpg',
  encoding: '7bit',
  mimetype: 'image/jpeg',
  size: 15234,
  destination: 'avatars/',
  filename: 'avatar-123456789.jpg',
  path: 'https://storage.googleapis.com/bucket/avatars/avatar-123456789.jpg',
  cloudKey: 'avatars/avatar-123456789.jpg'
}
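A handler will typically return only a subset of these fields to the client. An illustrative sketch (field names taken from the file object above; the response shape is an assumption):

```javascript
// Shape a compact API response from an uploaded file record.
function toFileSummary(file) {
  return {
    name: file.originalname,
    url: file.path,       // public cloud URL
    key: file.cloudKey,   // object key in the bucket/container
    sizeKB: Math.round(file.size / 1024)
  };
}
```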

🧪 Testing Your Setup

Check out the complete test project in the /test-project directory:

cd test-project
npm install
cp .env.example .env
# Edit .env with your credentials
npm start
# Visit http://localhost:3000

The test project includes:

  • ✅ Single file upload example
  • ✅ Multiple files upload example
  • ✅ Multiple fields upload example
  • ✅ HTML form for easy testing
  • ✅ Error handling demonstrations

🔒 Security Best Practices

Environment Variables – Never hardcode credentials
File Validation – Always validate file types and sizes
Rate Limiting – Implement upload rate limits
Authentication – Protect upload endpoints
CORS Configuration – Set proper CORS for web uploads
Virus Scanning – Consider adding virus scanning

Example Environment Setup

# .env file
AZURE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=...
AZURE_CONTAINER_NAME=uploads

AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION=us-east-1
AWS_S3_BUCKET=my-bucket

GCS_PROJECT_ID=my-project
GCS_BUCKET_NAME=my-bucket
GCS_KEY_FILE=./serviceaccountkey.json
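Following the "Environment Variables" practice above, a startup check can fail fast when a credential is missing. An illustrative sketch (not part of the package; the variable list below covers only the S3 entries from the .env example):

```javascript
// Illustrative startup check: report required env vars that are unset.
const REQUIRED_VARS = ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'AWS_S3_BUCKET'];

function missingEnvVars(env, required = REQUIRED_VARS) {
  return required.filter((name) => !env[name]);
}

const missing = missingEnvVars(process.env);
if (missing.length > 0) {
  console.warn(`Missing environment variables: ${missing.join(', ')}`);
}
```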

📖 More Examples

For more detailed examples and advanced usage patterns, see the /test-project directory described above.

🤝 Contributing

We welcome contributions! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

⭐ Show Your Support

If this package helped you, please consider:

  • ⭐ Starring the repository
  • 🐛 Reporting issues
  • 💡 Suggesting improvements
  • 🤝 Contributing code

Made with ❤️ for the developer community

Keywords: storage

Package last updated on 12 Aug 2025
