
# @yofix/storage

Universal storage manager for handling file operations across multiple cloud providers. Upload and download files to GitHub Actions Artifacts, Firebase Storage, AWS S3, Google Cloud Storage, Cloudflare R2, DigitalOcean Spaces, or the local filesystem with a unified API.
The package also provides `quickUpload()` and `quickDownload()` for simple one-off operations.

```bash
npm install @yofix/storage
```
```typescript
import { uploadFiles, downloadFiles, quickUpload, quickDownload } from '@yofix/storage'

// Upload files
const result = await uploadFiles({
  storage: { provider: 's3', config: { region: 'us-east-1', bucket: 'my-bucket' } },
  files: ['dist/**/*', 'package.json']
})

// Download files
const downloaded = await downloadFiles({
  storage: { provider: 's3', config: { region: 'us-east-1', bucket: 'my-bucket' } },
  files: ['config.json', 'data/users.json']
})

// Quick upload (data directly)
const path = await quickUpload(
  { provider: 'gcs', config: { bucket: 'my-bucket' } },
  { path: 'data.json', data: Buffer.from(JSON.stringify(data)) }
)

// Quick download
const buffer = await quickDownload(
  { provider: 'gcs', config: { bucket: 'my-bucket' } },
  { path: 'data.json' }
)
```
```typescript
import { uploadFiles } from '@yofix/storage'

// Local filesystem
const result = await uploadFiles({
  storage: {
    provider: 'local',
    config: {
      directory: './uploads',
      createIfNotExists: true,
      basePath: 'files'
    }
  },
  files: ['package.json', 'dist/**/*'],
  verbose: true
})
```
```typescript
// AWS S3
const result = await uploadFiles({
  storage: {
    provider: 's3',
    config: {
      region: 'us-east-1',
      bucket: 'your-bucket',
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      basePath: 'uploads',
      acl: 'public-read'
    }
  },
  files: ['dist/**/*'],
  onProgress: (progress) => {
    console.log(`${progress.filesUploaded}/${progress.totalFiles}`)
  }
})
```
```typescript
// Google Cloud Storage: using a service account
const result = await uploadFiles({
  storage: {
    provider: 'gcs',
    config: {
      bucket: 'your-bucket',
      projectId: 'your-project',
      keyFilename: '/path/to/service-account.json'
    }
  },
  files: ['images/**/*.{jpg,png}']
})

// Using an OAuth refresh token (user-based auth)
const result = await uploadFiles({
  storage: {
    provider: 'gcs',
    config: {
      bucket: 'your-bucket',
      refreshToken: 'user-refresh-token',
      clientId: 'oauth-client-id',
      clientSecret: 'oauth-client-secret'
    }
  },
  files: ['uploads/**/*']
})
```
GCS supports several authentication methods; the credential fields it accepts are listed in the `GCSConfig` reference below.
```typescript
// Firebase Storage
const result = await uploadFiles({
  storage: {
    provider: 'firebase',
    config: {
      bucket: 'your-project.appspot.com',
      credentials: JSON.parse(process.env.FIREBASE_CREDENTIALS),
      basePath: 'uploads'
    }
  },
  files: ['images/**/*.{jpg,png}']
})
```
```typescript
// Cloudflare R2
const result = await uploadFiles({
  storage: {
    provider: 'r2',
    config: {
      accountId: 'your-account-id',
      bucket: 'your-bucket',
      accessKeyId: process.env.R2_ACCESS_KEY_ID,
      secretAccessKey: process.env.R2_SECRET_ACCESS_KEY,
      basePath: 'uploads',
      publicUrl: 'https://cdn.example.com' // Optional custom domain
    }
  },
  files: ['dist/**/*']
})
```
```typescript
// DigitalOcean Spaces
const result = await uploadFiles({
  storage: {
    provider: 'do-spaces',
    config: {
      region: 'nyc3', // nyc3, sfo3, ams3, sgp1, fra1, syd1
      bucket: 'your-space',
      accessKeyId: process.env.DO_SPACES_KEY,
      secretAccessKey: process.env.DO_SPACES_SECRET,
      basePath: 'uploads',
      acl: 'public-read'
    }
  },
  files: ['static/**/*']
})
```
```typescript
// Only works within a GitHub Actions environment
const result = await uploadFiles({
  storage: {
    provider: 'github',
    config: {
      artifactName: 'build-artifacts',
      retentionDays: 30,
      basePath: 'dist'
    }
  },
  files: ['dist/**/*', 'coverage/**/*']
})
```
```typescript
import { downloadFiles, quickDownload } from '@yofix/storage'

// Download multiple files
const result = await downloadFiles({
  storage: {
    provider: 's3',
    config: { region: 'us-east-1', bucket: 'my-bucket' }
  },
  files: ['config.json', 'data/users.json'],
  verbose: true
})

// Access downloaded content
result.files.forEach(file => {
  console.log(`Downloaded: ${file.remotePath}`)
  console.log(`Size: ${file.size} bytes`)
  console.log(`Content: ${file.buffer.toString()}`)
})

// Quick download of a single file
const buffer = await quickDownload(
  { provider: 's3', config: { region: 'us-east-1', bucket: 'my-bucket' } },
  { path: 'config.json' }
)
```
By default, providers use singleton mode with connection reuse for better performance:
```typescript
import { StorageManager } from '@yofix/storage'

// Multiple uploads reuse the same S3 client
await uploadFiles({ storage: s3Config, files: batch1 }) // Creates client
await uploadFiles({ storage: s3Config, files: batch2 }) // Reuses client!

// Configure registry behavior
StorageManager.configure({
  singleton: true,            // Enable/disable singleton mode (default: true)
  autoCleanup: true,          // Auto-cleanup idle connections
  idleTTL: 5 * 60 * 1000,     // 5 minutes idle timeout
  cleanupInterval: 60 * 1000  // Check every minute
})

// Disable singleton mode (original per-operation behavior)
StorageManager.configure({ singleton: false })

// Manual cleanup
await StorageManager.cleanup()                 // Cleanup all providers
await StorageManager.cleanupProvider(s3Config) // Cleanup specific provider
await StorageManager.cleanupIdle(60000)        // Cleanup idle > 1 minute

// Get connection statistics
const stats = StorageManager.getStats()
// { totalProviders: 2, activeProviders: 1, idleProviders: 1, providers: [...] }
```
```typescript
import { StorageManager } from '@yofix/storage'

const manager = new StorageManager({
  storage: { provider: 'local', config: { directory: './uploads' } },
  files: ['*.json'],
  verbose: true
})

// Upload
const uploadResult = await manager.upload()

// Download
const downloadResult = await manager.download(['config.json', 'data.json'])

// File operations
const exists = await manager.fileExists('package.json')
const files = await manager.listFiles('uploads/')
const objects = await manager.listObjects({ prefix: 'images/', maxKeys: 100 })
const size = await manager.getFileSize('large-file.zip')
const metadata = await manager.getFileMetadata('document.pdf')
await manager.deleteFile('old-file.txt')

// Direct data upload
const path = await manager.uploadData({
  path: 'data.json',
  data: Buffer.from(JSON.stringify({ key: 'value' })),
  contentType: 'application/json'
})

// Generate signed URLs (S3, GCS, Firebase, R2, DO Spaces)
const downloadUrl = await manager.getSignedUrl('documents/report.pdf', {
  action: 'read',
  expires: 3600 // 1 hour in seconds
})

const uploadUrl = await manager.getSignedUrl('uploads/new-file.pdf', {
  action: 'write',
  expires: new Date(Date.now() + 15 * 60 * 1000), // 15 minutes
  contentType: 'application/pdf'
})
```
```typescript
interface GitHubConfig {
  artifactName?: string  // Default: 'storage-artifacts'
  retentionDays?: number // 1-90, Default: 90
  basePath?: string
}

interface FirebaseConfig {
  projectId?: string
  credentials?: string | object // Service account JSON (supports base64)
  bucket: string                // Required
  basePath?: string
}

interface S3Config {
  region: string // Required
  bucket: string // Required
  accessKeyId?: string
  secretAccessKey?: string
  basePath?: string
  acl?: string      // 'public-read', 'private', etc.
  endpoint?: string // For S3-compatible services
}

interface GCSConfig {
  bucket: string       // Required
  projectId?: string
  keyFilename?: string // Service account key file path
  credentials?: string | object
  accessToken?: string  // OAuth access token
  refreshToken?: string // OAuth refresh token
  clientId?: string     // Required with refreshToken
  clientSecret?: string // Required with refreshToken
  basePath?: string
}

interface R2Config {
  accountId: string       // Required
  bucket: string          // Required
  accessKeyId: string     // Required
  secretAccessKey: string // Required
  basePath?: string
  publicUrl?: string      // Custom domain for public buckets
}

interface DOSpacesConfig {
  region: string          // Required: nyc3, sfo3, ams3, sgp1, fra1, syd1
  bucket: string          // Required
  accessKeyId: string     // Required
  secretAccessKey: string // Required
  basePath?: string
  acl?: string            // 'public-read', 'private', etc.
}

interface LocalConfig {
  directory: string           // Required
  createIfNotExists?: boolean // Default: true
  basePath?: string
}
```
Generate pre-signed URLs for temporary access to files without exposing credentials, for example letting a browser download a private file or upload directly to your bucket.
```typescript
import { StorageManager } from '@yofix/storage'

const manager = new StorageManager({
  storage: { provider: 's3', config: { region: 'us-east-1', bucket: 'my-bucket' } }
})

// Generate a download URL (expires in 1 hour)
const downloadUrl = await manager.getSignedUrl('reports/annual.pdf', {
  action: 'read',
  expires: 3600 // seconds
})

// Generate an upload URL (expires in 15 minutes)
const uploadUrl = await manager.getSignedUrl('uploads/user-file.pdf', {
  action: 'write',
  expires: new Date(Date.now() + 15 * 60 * 1000),
  contentType: 'application/pdf' // Required for write on some providers
})

// Use with fetch for direct upload
await fetch(uploadUrl, {
  method: 'PUT',
  headers: { 'Content-Type': 'application/pdf' },
  body: fileBuffer
})
```
Supported Providers: S3, GCS, Firebase, R2, DO Spaces
Options:
| Option | Type | Description |
|---|---|---|
| `action` | `'read' \| 'write'` | Download or upload access |
| `expires` | `Date \| number` | Expiration as Date or seconds from now |
| `contentType` | `string` | MIME type (required for write on some providers) |
| Feature | GitHub | Firebase | S3 | GCS | R2 | DO Spaces | Local |
|---|---|---|---|---|---|---|---|
| Upload File | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
| Upload Data | - | Yes | Yes | Yes | Yes | Yes | Yes |
| Download | - | Yes | Yes | Yes | Yes | Yes | Yes |
| File Exists | - | Yes | Yes | Yes | Yes | Yes | Yes |
| Delete File | - | Yes | Yes | Yes | Yes | Yes | Yes |
| List Files | - | Yes | Yes | Yes | Yes | Yes | Yes |
| Get Metadata | - | Yes | Yes | Yes | Yes | Yes | Yes |
| OAuth Support | - | - | - | Yes | - | - | - |
| Multipart Upload | - | - | Yes | - | Yes | Yes | - |
| Signed URLs | - | Yes | Yes | Yes | Yes | Yes | - |
| CDN Support | - | - | - | - | - | Yes | - |
| Base Path | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
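Because every provider shares the same `storage: { provider, config }` shape, switching backends can be reduced to a config lookup. A minimal sketch, assuming a hypothetical `storageFor` helper and illustrative bucket names (neither is part of the package):

```typescript
// Hypothetical helper: choose a storage config by environment.
// Only the { provider, config } shape comes from @yofix/storage;
// the environment names and bucket values here are illustrative.
type StorageChoice = { provider: string; config: Record<string, unknown> }

function storageFor(env: string): StorageChoice {
  if (env === 'production') {
    return {
      provider: 's3',
      config: { region: 'us-east-1', bucket: 'prod-assets' }
    }
  }
  // Everything else falls back to the local filesystem,
  // so dev runs and tests never touch cloud credentials.
  return { provider: 'local', config: { directory: './uploads' } }
}
```

The same `uploadFiles({ storage: storageFor(process.env.NODE_ENV ?? 'development'), files })` call then works unchanged across environments.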
```yaml
name: Upload Artifacts

on:
  push:
    branches: [main]

jobs:
  upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Build
        run: npm run build

      - name: Upload to GitHub Artifacts
        uses: yofix/storage@main
        with:
          provider: github
          files: 'dist/**/*,coverage/**/*'
          artifact-name: 'build-artifacts'
          retention-days: 30
          verbose: true

      - name: Upload to S3
        uses: yofix/storage@main
        with:
          provider: s3
          files: 'dist/**/*'
          s3-region: us-east-1
          s3-bucket: my-bucket
          s3-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          s3-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          s3-base-path: builds/${{ github.sha }}
          s3-acl: public-read
          verbose: true
```
Full TypeScript support with comprehensive type definitions:
```typescript
import type {
  // Core
  StorageProvider,
  StorageOptions,
  StorageResult,
  StorageError,
  ProviderConfig,
  // Files
  FileToUpload,
  DataToUpload,
  UploadedFile,
  UploadProgress,
  FileToDownload,
  DownloadedFile,
  DownloadOptions,
  DownloadResult,
  // Metadata
  FileMetadata,
  StorageObject,
  ListOptions,
  SignedUrlOptions,
  // Provider configs
  GitHubConfig,
  FirebaseConfig,
  S3Config,
  GCSConfig,
  R2Config,
  DOSpacesConfig,
  LocalConfig,
  // Provider interface
  IStorageProvider,
  // Registry
  RegistryOptions
} from '@yofix/storage'
```
```typescript
const result = await uploadFiles({...})

if (!result.success) {
  result.errors?.forEach(error => {
    console.error(`[${error.code}] ${error.message}`)
    if (error.file) {
      console.error(`File: ${error.file}`)
    }
  })
}
```
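The `success` flag also makes it easy to layer retries on top of an upload. A minimal sketch assuming only the result shape shown above; `withRetries` is a hypothetical helper, not a package export:

```typescript
// Hypothetical retry helper: wraps any operation that resolves to an
// object with a `success` flag (such as uploadFiles) and retries with
// exponential backoff until it succeeds or the attempts run out.
async function withRetries<T extends { success: boolean }>(
  op: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let last!: T
  for (let i = 0; i < attempts; i++) {
    last = await op()
    if (last.success) return last
    // Back off exponentially between failed attempts
    if (i < attempts - 1) {
      await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** i))
    }
  }
  return last
}

// Usage sketch with the call from above:
// const result = await withRetries(() => uploadFiles({ storage, files: ['dist/**/*'] }))
```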
```bash
# AWS S3
export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_ACCESS_KEY=your-secret-key

# Google Cloud Storage (OAuth)
export GOOGLE_CLIENT_ID=your-client-id
export GOOGLE_CLIENT_SECRET=your-client-secret

# Firebase
export FIREBASE_CREDENTIALS='{"type":"service_account",...}'

# Cloudflare R2
export R2_ACCESS_KEY_ID=your-r2-key
export R2_SECRET_ACCESS_KEY=your-r2-secret

# DigitalOcean Spaces
export DO_SPACES_KEY=your-spaces-key
export DO_SPACES_SECRET=your-spaces-secret
```
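These variables can be folded into a provider config with a small guard so missing credentials fail fast. A sketch; the `s3ConfigFromEnv` helper and the `S3_BUCKET`/`AWS_REGION` variable names are assumptions, not package conventions:

```typescript
// Hypothetical helper: assemble an S3 storage config from environment
// variables, throwing early when required credentials are absent.
// S3_BUCKET and AWS_REGION are illustrative names, not package conventions.
function s3ConfigFromEnv(env: Record<string, string | undefined>) {
  const accessKeyId = env.AWS_ACCESS_KEY_ID
  const secretAccessKey = env.AWS_SECRET_ACCESS_KEY
  if (!accessKeyId || !secretAccessKey) {
    throw new Error('AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY must be set')
  }
  return {
    provider: 's3' as const,
    config: {
      region: env.AWS_REGION ?? 'us-east-1',
      bucket: env.S3_BUCKET ?? 'my-bucket',
      accessKeyId,
      secretAccessKey
    }
  }
}

// const result = await uploadFiles({ storage: s3ConfigFromEnv(process.env), files: ['dist/**/*'] })
```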
License: MIT
Contributions are welcome! Please open an issue or submit a pull request on GitHub.