nestjs-upload
NestJS library for uploading images to AWS S3 and Google Cloud Storage with advanced validation.
A comprehensive library for uploading images and files to AWS S3 or Google Cloud Storage with advanced validation, image optimization, rate limiting, and caching for NestJS applications.
npm install nestjs-upload
The library requires the following peer dependencies:
npm install @nestjs/common @nestjs/config @nestjs/platform-express
For Swagger documentation (optional):
npm install @nestjs/swagger
For image optimization (optional, but recommended):
npm install sharp
For Redis rate limiting (optional):
npm install ioredis
Using Default Credentials (Recommended):
AWS SDK automatically uses credentials from:
- Environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
- ~/.aws/credentials file
import { Module } from '@nestjs/common';
import { UploadModule, StorageProvider } from 'nestjs-upload';
@Module({
imports: [
UploadModule.forRoot({
provider: StorageProvider.AWS,
aws: {
bucket: 'my-bucket-name',
region: 'us-east-1',
// Credentials will be automatically loaded from ~/.aws/credentials
// or environment variables
},
validation: {
maxFileSize: 5 * 1024 * 1024, // 5MB
allowedMimeTypes: ['image/jpeg', 'image/png', 'image/webp'],
imageValidation: {
minWidth: 100,
maxWidth: 2000,
minHeight: 100,
maxHeight: 2000,
aspectRatio: {
width: 16,
height: 9,
tolerance: 0.1,
},
allowedFormats: ['jpeg', 'png', 'webp'],
},
},
imageOptimization: {
enabled: true,
maxWidth: 1920,
maxHeight: 1080,
quality: 85,
format: 'auto', // or ImageFormat.WEBP, ImageFormat.JPEG, ImageFormat.PNG
convertToWebP: false,
},
defaultPublic: true,
rateLimit: {
enabled: true,
type: 'memory', // or 'redis'
windowMs: 60000, // 1 minute
maxRequests: 10,
},
cache: {
enabled: true,
ttl: 3600, // 1 hour
maxSize: 1000,
},
}),
],
})
export class AppModule {}
Using Application Default Credentials (Recommended):
Google Cloud SDK automatically uses credentials from:
- GOOGLE_APPLICATION_CREDENTIALS environment variable
- ~/.config/gcloud/application_default_credentials.json (after gcloud auth application-default login)
import { Module } from '@nestjs/common';
import { UploadModule, StorageProvider } from 'nestjs-upload';
@Module({
imports: [
UploadModule.forRoot({
provider: StorageProvider.GCS,
gcs: {
bucket: 'my-bucket-name',
projectId: 'my-project-id',
// Credentials will be automatically loaded from Application Default Credentials
},
validation: {
maxFileSize: 10 * 1024 * 1024, // 10MB
allowedMimeTypes: ['image/jpeg', 'image/png'],
imageValidation: {
minWidth: 200,
maxWidth: 4000,
minHeight: 200,
maxHeight: 4000,
},
},
}),
],
})
export class AppModule {}
Option 1: Using ~/.aws/credentials file (Recommended for local development)
Create ~/.aws/credentials file:
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
Option 2: Using environment variables
export AWS_ACCESS_KEY_ID=your_access_key_id
export AWS_SECRET_ACCESS_KEY=your_secret_access_key
export AWS_REGION=us-east-1
Option 3: Using explicit credentials in config (not recommended)
aws: {
bucket: 'my-bucket',
region: 'us-east-1',
accessKey: 'your-access-key',
secretKey: 'your-secret-key',
}
Option 1: Using Application Default Credentials (Recommended)
gcloud auth application-default login
This creates credentials at ~/.config/gcloud/application_default_credentials.json
Option 2: Using GOOGLE_APPLICATION_CREDENTIALS environment variable
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
Option 3: Using explicit key file in config
gcs: {
bucket: 'my-bucket',
projectId: 'my-project-id',
keyFilename: './path/to/service-account-key.json',
}
import { UploadModule, StorageProvider, ImageFormat } from 'nestjs-upload';
UploadModule.forRoot({
// Required parameters
provider: StorageProvider.AWS, // or StorageProvider.GCS
// AWS configuration (if provider === StorageProvider.AWS)
aws: {
bucket: 'my-bucket',
region: 'us-east-1',
// Optional: Explicit credentials (if not provided, uses default credential chain)
// accessKey: process.env.AWS_ACCESS_KEY_ID,
// secretKey: process.env.AWS_SECRET_ACCESS_KEY,
},
// GCS configuration (if provider === StorageProvider.GCS)
gcs: {
bucket: 'my-bucket',
projectId: 'my-project-id',
// Optional: Explicit credentials (if not provided, uses Application Default Credentials)
// keyFilename: './path/to/service-account-key.json',
// or
// credentials: { client_email: '...', private_key: '...', project_id: '...' }
},
// File validation
validation: {
maxFileSize: 5 * 1024 * 1024, // Maximum size in bytes
allowedMimeTypes: ['image/jpeg', 'image/png', 'image/webp'],
imageValidation: {
minWidth: 100,
maxWidth: 2000,
minHeight: 100,
maxHeight: 2000,
aspectRatio: {
width: 16,
height: 9,
tolerance: 0.1, // 10% deviation
},
allowedFormats: ['jpeg', 'png', 'webp'],
},
},
// Image optimization
imageOptimization: {
enabled: true,
maxWidth: 1920,
maxHeight: 1080,
quality: 85, // 1-100
format: ImageFormat.AUTO, // 'auto' | ImageFormat.JPEG | ImageFormat.PNG | ImageFormat.WEBP
convertToWebP: false, // If true, all images are converted to WebP
},
// Files are public by default
defaultPublic: true,
// Rate Limiting
rateLimit: {
enabled: true,
type: 'memory', // 'memory' | 'redis'
windowMs: 60000, // Time window in milliseconds
maxRequests: 10, // Maximum number of requests in the window
// Redis configuration (if type === 'redis')
redis: {
host: 'localhost',
port: 6379,
// or
// client: redisClient, // Existing Redis client
},
},
// Presigned URL caching
cache: {
enabled: true,
ttl: 3600, // Time to live in seconds
maxSize: 1000, // Maximum number of entries in cache
},
})
Uploads a file to the selected storage with validation and optional optimization.
Request:
Content-Type: multipart/form-data
- file (required): File to upload
- folder (optional): Folder path in bucket (e.g., images/products)
- isPublic (optional): Whether the file is public (default: true)
- metadata (optional): Object with metadata:
  - contentType: File MIME type
  - cacheControl: Cache-Control header value
  - contentDisposition: Content-Disposition header value
  - contentEncoding: Content-Encoding header value
  - metadata: Custom metadata (key-value pairs)
Response:
{
"imgUrl": "https://bucket.s3.region.amazonaws.com/uuid.jpg",
"key": "images/products/uuid.jpg",
"bucket": "my-bucket-name",
"isPublic": true
}
Example with curl:
curl -X POST http://localhost:3000/upload \
-F "file=@image.jpg" \
-F "folder=images/products" \
-F "isPublic=true" \
-F "metadata[contentType]=image/jpeg" \
-F "metadata[cacheControl]=public, max-age=3600"
Example with FormData (JavaScript):
const formData = new FormData();
formData.append('file', fileInput.files[0]);
formData.append('folder', 'images/products');
formData.append('isPublic', 'true');
formData.append('metadata[contentType]', 'image/jpeg');
formData.append('metadata[cacheControl]', 'public, max-age=3600');
const response = await fetch('http://localhost:3000/upload', {
  method: 'POST',
  body: formData,
});
const result = await response.json();
console.log(result.imgUrl);
Deletes a file from storage by key.
Parameters:
key (path parameter): File key/path in storage
Response:
{
"success": true,
"message": "File deleted successfully"
}
Example:
curl -X DELETE http://localhost:3000/upload/images/products/uuid.jpg
Generates a presigned URL for accessing a private file.
Parameters:
key (path parameter): File key/path in storage
Query Parameters:
- expiresIn (optional): URL expiration time in seconds (default: 3600, max: 604800)
- contentType (optional): Content-Type for the presigned URL
Response:
{
"url": "https://bucket.s3.region.amazonaws.com/file.jpg?X-Amz-Algorithm=...",
"key": "images/products/uuid.jpg",
"expiresIn": 3600
}
Example:
curl "http://localhost:3000/upload/presigned-url/images/products/uuid.jpg?expiresIn=7200&contentType=image/jpeg"
import { Injectable } from '@nestjs/common';
import { UploadService } from 'nestjs-upload';
@Injectable()
export class MyService {
constructor(private readonly uploadService: UploadService) {}
}
uploadFile()
Uploads a file from a buffer with validation and optional optimization.
async uploadFile(
dataBuffer: Buffer,
filename: string,
folder?: string,
options?: UploadOptions,
): Promise<{ imgUrl: string; key: string; bucket: string; isPublic?: boolean }>
Parameters:
- dataBuffer: Buffer with file data
- filename: Original filename
- folder: Optional folder in bucket
- options: Upload options:
  - isPublic: Whether the file is public (default: true)
  - metadata: File metadata:
    - contentType: MIME type
    - cacheControl: Cache-Control header
    - contentDisposition: Content-Disposition header
    - contentEncoding: Content-Encoding header
    - metadata: Custom metadata (Record<string, string>)
Example:
const result = await this.uploadService.uploadFile(
buffer,
'image.jpg',
'images/products',
{
isPublic: true,
metadata: {
contentType: 'image/jpeg',
cacheControl: 'public, max-age=3600',
metadata: {
author: 'John Doe',
category: 'product',
},
},
},
);
console.log(result.imgUrl); // File URL
console.log(result.key); // File key
console.log(result.bucket); // Bucket name
uploadPublicFile()
Uploads a public file (backward compatibility method).
async uploadPublicFile(
dataBuffer: Buffer,
filename: string,
folder?: string,
): Promise<{ imgUrl: string; key: string; bucket: string }>
Example:
const result = await this.uploadService.uploadPublicFile(
buffer,
'image.jpg',
'images/products',
);
uploadFileStream()
Uploads a file from a stream (useful for large files).
async uploadFileStream(
stream: Readable,
filename: string,
folder?: string,
options?: UploadOptions,
): Promise<{ imgUrl: string; key: string; bucket: string; isPublic?: boolean }>
Example:
import { createReadStream } from 'fs';
const stream = createReadStream('./large-file.zip');
const result = await this.uploadService.uploadFileStream(
stream,
'large-file.zip',
'downloads',
{ isPublic: false },
);
uploadFiles()
Uploads multiple files simultaneously (batch upload).
async uploadFiles(
files: Array<{ buffer: Buffer; filename: string; folder?: string }>,
options?: UploadOptions,
): Promise<Array<{ imgUrl: string; key: string; bucket: string; isPublic?: boolean }>>
Example:
const files = [
{ buffer: buffer1, filename: 'image1.jpg', folder: 'images' },
{ buffer: buffer2, filename: 'image2.png', folder: 'images' },
{ buffer: buffer3, filename: 'document.pdf', folder: 'documents' },
];
const results = await this.uploadService.uploadFiles(files, {
isPublic: true,
});
results.forEach((result, index) => {
console.log(`File ${index + 1}: ${result.imgUrl}`);
});
deleteFile()
Deletes a file from storage.
async deleteFile(
key: string,
ignoreNotFound?: boolean,
): Promise<boolean>
Parameters:
- key: File key/path in storage
- ignoreNotFound: If true, doesn't throw an error if the file is not found (default: false)
Example:
// With error thrown if file not found
await this.uploadService.deleteFile('images/products/uuid.jpg');
// Without error thrown
const deleted = await this.uploadService.deleteFile(
'images/products/uuid.jpg',
true,
);
if (deleted) {
console.log('File deleted');
} else {
console.log('File not found');
}
deleteFiles()
Deletes multiple files simultaneously (batch delete).
async deleteFiles(
keys: string[],
ignoreNotFound?: boolean,
): Promise<boolean[]>
Example:
const keys = [
'images/products/uuid1.jpg',
'images/products/uuid2.png',
'images/products/uuid3.webp',
];
const results = await this.uploadService.deleteFiles(keys, true);
// results = [true, true, false] - last file not found
getPresignedUrl()
Generates a presigned URL for accessing a private file.
async getPresignedUrl(
key: string,
options?: PresignedUrlOptions,
): Promise<string>
Parameters:
- key: File key/path in storage
- options: Presigned URL options:
  - expiresIn: URL expiration time in seconds (default: 3600)
  - contentType: Content-Type for the URL
Example:
// Basic presigned URL (1 hour)
const url = await this.uploadService.getPresignedUrl(
'private/documents/file.pdf',
);
// With custom expiration and Content-Type
const customUrl = await this.uploadService.getPresignedUrl(
'private/images/photo.jpg',
{
expiresIn: 7200, // 2 hours
contentType: 'image/jpeg',
},
);
fileExists()
Checks if a file exists in storage.
async fileExists(key: string): Promise<boolean>
Example:
const exists = await this.uploadService.fileExists('images/products/uuid.jpg');
if (exists) {
console.log('File exists');
}
getFileMetadata()
Gets file metadata.
async getFileMetadata(key: string): Promise<FileMetadata | null>
Example:
const metadata = await this.uploadService.getFileMetadata(
'images/products/uuid.jpg',
);
if (metadata) {
console.log('Size:', metadata.size);
console.log('Content-Type:', metadata.contentType);
console.log('Last Modified:', metadata.lastModified);
console.log('ETag:', metadata.etag);
console.log('Custom metadata:', metadata.metadata);
}
validation: {
maxFileSize: 5 * 1024 * 1024, // Maximum size in bytes
allowedMimeTypes: ['image/jpeg', 'image/png', 'image/webp'], // Allowed MIME types
}
imageValidation: {
minWidth: 100, // Minimum width in pixels
maxWidth: 2000, // Maximum width in pixels
minHeight: 100, // Minimum height in pixels
maxHeight: 2000, // Maximum height in pixels
aspectRatio: { // Aspect ratio
width: 16,
height: 9,
tolerance: 0.1, // Allowed deviation (10%)
},
allowedFormats: ['jpeg', 'png', 'webp'], // Allowed formats
}
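As an illustration of what tolerance means here, a check along these lines (a sketch of the idea, not the library's actual implementation) would accept any image whose width/height ratio deviates from the target ratio by at most 10% when tolerance is 0.1:

```typescript
// Illustrative sketch: relative-deviation aspect ratio check.
// The function name and exact formula are assumptions for explanation.
function matchesAspectRatio(
  imgWidth: number,
  imgHeight: number,
  target: { width: number; height: number; tolerance: number },
): boolean {
  const actual = imgWidth / imgHeight; // e.g. 1920/1080 = 1.777...
  const expected = target.width / target.height; // e.g. 16/9 = 1.777...
  // Accept if the relative deviation is within the tolerance
  return Math.abs(actual - expected) / expected <= target.tolerance;
}

// 1920x1080 is exactly 16:9, so it passes; 1920x1280 (3:2) deviates
// by roughly 15.6% from 16:9 and fails with tolerance 0.1.
```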
Example validation error:
{
"statusCode": 400,
"message": "Image validation failed: Image width must be at least 100px, Image aspect ratio must be 16:9"
}
The library supports automatic image optimization using Sharp.
imageOptimization: {
enabled: true,
maxWidth: 1920, // Maximum width after optimization
maxHeight: 1080, // Maximum height after optimization
quality: 85, // Image quality (1-100)
format: 'auto', // Output format: 'auto' | ImageFormat.JPEG | ImageFormat.PNG | ImageFormat.WEBP
convertToWebP: false, // If true, all images are converted to WebP
}
import { ImageFormat } from 'nestjs-upload';
// Available formats
ImageFormat.JPEG // 'jpeg'
ImageFormat.JPG // 'jpg' (maps to JPEG)
ImageFormat.PNG // 'png'
ImageFormat.WEBP // 'webp'
Automatic format (preserves original format):
imageOptimization: {
enabled: true,
format: 'auto',
maxWidth: 1920,
quality: 85,
}
Force WebP conversion:
imageOptimization: {
enabled: true,
convertToWebP: true,
quality: 85,
}
Convert to JPEG:
imageOptimization: {
enabled: true,
format: ImageFormat.JPEG,
quality: 90,
}
Without resizing (compression only):
imageOptimization: {
enabled: true,
quality: 85,
// maxWidth and maxHeight not specified
}
import { ImageOptimizer } from 'nestjs-upload';
const optimizer = new ImageOptimizer({
enabled: true,
maxWidth: 1920,
quality: 85,
});
const result = await optimizer.optimize(buffer, 'image/jpeg');
console.log(`Original: ${buffer.length} bytes`);
console.log(`Optimized: ${result.size} bytes`);
console.log(`Reduction: ${((1 - result.size / buffer.length) * 100).toFixed(1)}%`);
The library supports rate limiting to protect against abuse.
rateLimit: {
enabled: true,
type: 'memory',
windowMs: 60000, // 1 minute
maxRequests: 10, // Maximum 10 requests per minute
}
rateLimit: {
enabled: true,
type: 'redis',
windowMs: 60000,
maxRequests: 10,
redis: {
host: 'localhost',
port: 6379,
// or use existing client
// client: redisClient,
},
}
{
"statusCode": 429,
"message": "Too Many Requests",
"retryAfter": 30
}
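A client can use the retryAfter field from the 429 response to back off before retrying. A minimal sketch (the uploadWithRetry helper and UploadResponse shape are hypothetical, not part of the library; the request function is injected so the logic stays testable):

```typescript
// Hypothetical shape of a response as the client sees it
type UploadResponse = { status: number; retryAfter?: number };

// Retry a request when the server responds 429, honoring the
// server-suggested retryAfter interval (in seconds) between attempts.
async function uploadWithRetry(
  doRequest: () => Promise<UploadResponse>,
  maxAttempts = 3,
): Promise<UploadResponse> {
  let last: UploadResponse = { status: 0 };
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    last = await doRequest();
    if (last.status !== 429) return last; // not rate limited: stop retrying
    if (attempt < maxAttempts) {
      // Wait the hinted interval before the next attempt (default 1s)
      const waitMs = (last.retryAfter ?? 1) * 1000;
      await new Promise((resolve) => setTimeout(resolve, waitMs));
    }
  }
  return last;
}
```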
The library supports caching of presigned URLs to improve performance.
cache: {
enabled: true,
ttl: 3600, // Time to live in seconds (1 hour)
maxSize: 1000, // Maximum number of entries in cache
}
import { PresignedUrlCacheService } from 'nestjs-upload';
const cache = new PresignedUrlCacheService(3600, 1000);
// Store URL
cache.set('key', 'https://...', 3600);
// Get URL
const url = cache.get('key');
// Delete from cache
cache.delete('key');
// Clear entire cache
cache.clear();
import {
// Enums
StorageProvider,
ImageFormat,
ImageMimeType,
// Interfaces
UploadConfig,
AwsConfig,
GcsConfig,
ValidationConfig,
ImageValidationConfig,
ImageOptimizationConfig,
CacheConfig,
RateLimitConfig,
// Services
UploadService,
ImageOptimizer,
PresignedUrlCacheService,
// Validators
FileValidator,
ImageValidator,
// Providers
AwsS3Provider,
GcsProvider,
IStorageProvider,
// Guards
RateLimitGuard,
// Utils
getMimeType,
isImageMimeType,
validateFileKey,
buildSafeKey,
} from 'nestjs-upload';
interface UploadOptions {
isPublic?: boolean;
metadata?: FileMetadataOptions;
}
interface FileMetadataOptions {
contentType?: string;
cacheControl?: string;
contentDisposition?: string;
contentEncoding?: string;
metadata?: Record<string, string>;
}
interface PresignedUrlOptions {
expiresIn?: number; // Expiration time in seconds
contentType?: string; // Content-Type
}
interface FileMetadata {
key: string;
size?: number;
contentType?: string;
lastModified?: Date;
etag?: string;
metadata?: Record<string, string>;
}
The library throws the following error types:
BadRequestException
Thrown when validation fails: for example, the file exceeds maxFileSize, the MIME type is not in allowedMimeTypes, or the image violates the configured dimension or aspect-ratio rules.
Example:
try {
await uploadService.uploadFile(buffer, 'file.jpg');
} catch (error) {
if (error instanceof BadRequestException) {
console.error('Validation error:', error.message);
}
}
InternalServerErrorException
Thrown when a storage operation fails: for example, an upload, delete, or presigned-URL request to the storage provider returns an error.
Example:
try {
await uploadService.uploadFile(buffer, 'file.jpg');
} catch (error) {
if (error instanceof InternalServerErrorException) {
console.error('Server error:', error.message);
}
}
import { Catch, ExceptionFilter, ArgumentsHost } from '@nestjs/common';
import { BadRequestException, InternalServerErrorException } from '@nestjs/common';
@Catch(BadRequestException, InternalServerErrorException)
export class UploadExceptionFilter implements ExceptionFilter {
catch(exception: BadRequestException | InternalServerErrorException, host: ArgumentsHost) {
const response = host.switchToHttp().getResponse();
const status = exception.getStatus();
response.status(status).json({
statusCode: status,
message: exception.message,
timestamp: new Date().toISOString(),
});
}
}
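To apply the filter to all routes, register it globally in your bootstrap file (standard NestJS wiring; the import path for UploadExceptionFilter is assumed):

```typescript
// main.ts - register the exception filter application-wide
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { UploadExceptionFilter } from './upload-exception.filter';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  // Every BadRequestException / InternalServerErrorException now
  // passes through UploadExceptionFilter before reaching the client
  app.useGlobalFilters(new UploadExceptionFilter());
  await app.listen(3000);
}
bootstrap();
```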
import { Injectable } from '@nestjs/common';
import { UploadService } from 'nestjs-upload';
@Injectable()
export class ImageService {
constructor(private readonly uploadService: UploadService) {}
async uploadProductImage(buffer: Buffer, filename: string) {
const result = await this.uploadService.uploadFile(
buffer,
filename,
'images/products',
{
isPublic: true,
metadata: {
contentType: 'image/jpeg',
cacheControl: 'public, max-age=31536000',
metadata: {
category: 'product',
},
},
},
);
return result.imgUrl;
}
}
async uploadPrivateDocument(buffer: Buffer, filename: string, userId: string) {
const result = await this.uploadService.uploadFile(
buffer,
filename,
`documents/${userId}`,
{
isPublic: false,
metadata: {
contentType: 'application/pdf',
metadata: {
userId,
type: 'document',
},
},
},
);
// Generate presigned URL for access
const presignedUrl = await this.uploadService.getPresignedUrl(
result.key,
{ expiresIn: 3600 },
);
return {
key: result.key,
url: presignedUrl,
};
}
async uploadMultipleImages(files: Array<{ buffer: Buffer; filename: string }>) {
const uploadPromises = files.map(file =>
this.uploadService.uploadFile(
file.buffer,
file.filename,
'images/gallery',
{ isPublic: true },
).catch(error => {
console.error(`Failed to upload ${file.filename}:`, error.message);
return null;
}),
);
const results = await Promise.all(uploadPromises);
const successful = results.filter(r => r !== null);
return {
total: files.length,
successful: successful.length,
failed: files.length - successful.length,
urls: successful.map(r => r!.imgUrl),
};
}
async deleteUserFiles(userId: string) {
// Get list of user files (requires implementation)
const userFiles = await this.getUserFiles(userId);
// Delete all files
const results = await this.uploadService.deleteFiles(
userFiles.map(f => f.key),
true, // ignoreNotFound
);
const deletedCount = results.filter(r => r === true).length;
return {
total: userFiles.length,
deleted: deletedCount,
};
}
async updateFileIfExists(key: string, newBuffer: Buffer) {
const exists = await this.uploadService.fileExists(key);
if (!exists) {
throw new NotFoundException('File not found');
}
// Delete old file
await this.uploadService.deleteFile(key);
// Upload new file
const result = await this.uploadService.uploadFile(
newBuffer,
key.split('/').pop()!,
key.split('/').slice(0, -1).join('/'),
);
return result;
}
import { BadRequestException, Controller, Post, UploadedFile, UseInterceptors } from '@nestjs/common';
import { FileInterceptor } from '@nestjs/platform-express';
import { UploadService } from 'nestjs-upload';
@Controller('api/files')
export class FilesController {
constructor(private readonly uploadService: UploadService) {}
@Post('upload')
@UseInterceptors(FileInterceptor('file'))
async upload(@UploadedFile() file: Express.Multer.File) {
if (!file) {
throw new BadRequestException('File is required');
}
const result = await this.uploadService.uploadFile(
file.buffer,
file.originalname,
'uploads',
);
return {
success: true,
url: result.imgUrl,
key: result.key,
};
}
}
The library uses the built-in NestJS Logger for logging.
Example logs:
[UploadService] File uploaded successfully: images/products/uuid.jpg (public)
[ImageOptimizer] Image optimized: 2048576 -> 512384 bytes (75.0% reduction)
[UploadService] Presigned URL generated for: private/documents/file.pdf
If you use @nestjs/swagger, all endpoints will be automatically documented in Swagger UI.
Example configuration:
import { DocumentBuilder, SwaggerModule } from '@nestjs/swagger';
const config = new DocumentBuilder()
.setTitle('My API')
.setDescription('API description')
.setVersion('1.0')
.addTag('upload')
.build();
const document = SwaggerModule.createDocument(app, config);
SwaggerModule.setup('api', app, document);
import { IStorageProvider, UploadResult, UploadOptions } from 'nestjs-upload';
import { Readable } from 'stream';
export class CustomStorageProvider implements IStorageProvider {
async upload(
dataBuffer: Buffer,
filename: string,
folder?: string,
options?: UploadOptions,
): Promise<UploadResult> {
// Your implementation
}
// Implement other methods...
}
// In your module
const customProvider = new CustomStorageProvider();
const uploadService = new UploadService(
customProvider,
fileValidator,
imageValidator,
imageOptimizer,
urlCache,
);
License: MIT
If you have questions or issues, please create an issue in the project repository.