# cloudku-uploader

Powerful file uploader client for the CloudKu image hosting service, with automatic chunked uploads, stream support, and load balancing across multiple endpoints.
## ✨ Features
- 🚀 Smart Upload Strategy - Automatically switches between single and chunked upload based on file size (100MB threshold)
- 📦 Chunked Upload Support - Handles large files by splitting into 8MB chunks with UUID-based tracking
- 🌊 Stream Upload - Direct stream-to-upload capability for memory-efficient processing
- 🔄 Built-in Load Balancing - Random endpoint selection across `cloudkuimages.guru` and `cloudkuimages-guru.us.itpanel.app`
- 📤 Batch Upload - Upload multiple files concurrently with `Promise.allSettled` for resilient batch operations
- 🎯 Simple & Clean API - Minimal surface area with three core methods
- 📦 Dual Module Support - Ships with both ESM and CommonJS builds
- 🔒 Type-safe - Full TypeScript definitions included
- ⚡ Zero Dependencies - Pure JavaScript implementation using native Web APIs
## 📦 Installation

```bash
npm install cloudku-uploader
yarn add cloudku-uploader
pnpm add cloudku-uploader
```
## 🚀 Usage

### ESM (ES Modules)

```js
import cloudku from 'cloudku-uploader'

const buffer = await fetch('image.jpg').then(r => r.arrayBuffer())
const result = await cloudku.uploadFile(buffer, 'image.jpg')
console.log(result.url)
```
### CommonJS

```js
const cloudku = require('cloudku-uploader')
const fs = require('fs').promises

async function upload() {
  const buffer = await fs.readFile('image.jpg')
  // Convert the Node.js Buffer to a standalone ArrayBuffer.
  // Slicing avoids sending extra bytes from Node's shared Buffer pool.
  const arrayBuffer = buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength)
  const result = await cloudku.uploadFile(arrayBuffer, 'image.jpg')
  console.log(result.url)
}

upload()
```
### Browser - File Input

```js
import cloudku from 'cloudku-uploader'

document.querySelector('#fileInput').addEventListener('change', async (e) => {
  const file = e.target.files[0]
  const buffer = await file.arrayBuffer()
  const result = await cloudku.uploadFile(buffer, file.name)
  console.log('Uploaded to:', result.url)
})
```
### Node.js - File System

```js
import cloudku from 'cloudku-uploader'
import { readFile } from 'fs/promises'

const buffer = await readFile('./photo.jpg')
// Slice out this file's bytes: buffer.buffer may be a larger shared pool,
// so passing it directly could upload garbage bytes.
const arrayBuffer = buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength)
const result = await cloudku.uploadFile(arrayBuffer, 'photo.jpg')
console.log('File URL:', result.url)
```
### Stream Upload (Node.js)

```js
import cloudku from 'cloudku-uploader'
import { createReadStream } from 'fs'

// Read the stream chunk by chunk, then upload the collected bytes.
// Note: the full file is held in memory before the upload starts.
const stream = createReadStream('./large-video.mp4')
const chunks = []
for await (const chunk of stream) {
  chunks.push(chunk)
}
const buffer = Buffer.concat(chunks)
// Slice to get exactly this buffer's bytes as an ArrayBuffer.
const arrayBuffer = buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength)
const result = await cloudku.uploadFile(arrayBuffer, 'video.mp4')
console.log('Stream uploaded:', result.url)
```
### Stream Upload with Progress Tracking

```js
import cloudku from 'cloudku-uploader'
import { createReadStream } from 'fs'
import { stat } from 'fs/promises'

const filePath = './movie.mp4'
const fileStats = await stat(filePath)
const totalSize = fileStats.size

const stream = createReadStream(filePath)
let readSize = 0
const chunks = []
for await (const chunk of stream) {
  chunks.push(chunk)
  readSize += chunk.length
  // This reports read-from-disk progress; the network upload itself
  // begins only after the whole file has been read.
  const progress = (readSize / totalSize * 100).toFixed(2)
  console.log(`Progress: ${progress}%`)
}

const buffer = Buffer.concat(chunks)
const arrayBuffer = buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength)
const result = await cloudku.uploadFile(arrayBuffer, 'movie.mp4')
console.log('Complete:', result.url)
```
### Large File Upload with Custom Chunk Size

```js
import cloudku from 'cloudku-uploader'

const buffer = await fetch('4k-video.mp4').then(r => r.arrayBuffer())
const result = await cloudku.uploadLarge(
  buffer,
  '4k-video.mp4',
  16 * 1024 * 1024
)
console.log('Large file URL:', result.url)
```
### Batch Upload with Status Tracking

```js
import cloudku from 'cloudku-uploader'

const files = [
  { buffer: buffer1, name: 'photo1.jpg' },
  { buffer: buffer2, name: 'photo2.png' },
  { buffer: buffer3, name: 'document.pdf' }
]

const results = await cloudku.uploadBatch(files)

const successful = results.filter(r => r.status === 'fulfilled')
const failed = results.filter(r => r.status === 'rejected')
console.log(`✓ ${successful.length} uploaded successfully`)
console.log(`✗ ${failed.length} failed`)

results.forEach((result, index) => {
  if (result.status === 'fulfilled') {
    console.log(`[${index + 1}] ${files[index].name}: ${result.value.url}`)
  } else {
    console.error(`[${index + 1}] ${files[index].name}: ${result.reason.message}`)
  }
})
```
## 📚 API Reference
### `uploadFile(buffer, name?)`

Main upload method with automatic strategy selection based on file size.

**Parameters:**

- `buffer` {ArrayBuffer} - File content as an ArrayBuffer (required)
- `name` {string} - Filename with extension (optional, default: `'file.bin'`)

**Returns:** `Promise<UploadResult>`

**Behavior:**

- Files ≤ 100MB: uses a single POST request
- Files > 100MB: automatically switches to chunked upload

**Example:**

```js
const buffer = await file.arrayBuffer()
const result = await cloudku.uploadFile(buffer, 'photo.jpg')
```

**Response Schema:**

```js
{
  status: 'success',
  url: 'https://cloudkuimages.guru/files/abc123.jpg',
  filename: 'photo.jpg',
  size: 2048576
}
```
### `uploadLarge(buffer, name?, chunkSize?)`

Explicit chunked upload for large files with control over the chunk size.

**Parameters:**

- `buffer` {ArrayBuffer} - File content as an ArrayBuffer (required)
- `name` {string} - Filename with extension (optional, default: `'file.bin'`)
- `chunkSize` {number} - Chunk size in bytes (optional, default: 8388608 = 8MB)

**Returns:** `Promise<UploadResult>`

**Implementation Details:**

- Generates a UUID v4 as `fileId` for chunk tracking
- Splits the buffer into chunks of the specified size (see the sketch below)
- Uploads each chunk with metadata:
  - `chunk`: current chunk index (0-based)
  - `chunks`: total number of chunks
  - `filename`: original filename
  - `fileId`: UUID for tracking
  - `size`: total file size in bytes
- Sends a finalization request with `chunked=1&finalize=1` query params
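For intuition, here is a minimal sketch of that splitting step. `splitIntoChunks` is a hypothetical helper, not part of the library; the field names mirror the chunk request schema shown below.

```js
// Hypothetical helper illustrating the chunking arithmetic.
function splitIntoChunks(buffer, chunkSize = 8 * 1024 * 1024) {
  const chunks = Math.ceil(buffer.byteLength / chunkSize)
  const parts = []
  for (let i = 0; i < chunks; i++) {
    const start = i * chunkSize
    const end = Math.min(start + chunkSize, buffer.byteLength)
    // Each part carries its 0-based index plus the total count.
    parts.push({ chunk: i, chunks, blob: new Blob([buffer.slice(start, end)]) })
  }
  return parts
}

// e.g. a 96MB buffer with the default 8MB chunks → Math.ceil(96 / 8) = 12 parts
```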
**Example:**

```js
const result = await cloudku.uploadLarge(
  buffer,
  'movie.mkv',
  10 * 1024 * 1024
)
```
**Chunk Upload Request:**

```js
FormData {
  file: Blob(chunk),
  chunk: 0,
  chunks: 12,
  filename: 'movie.mkv',
  fileId: '550e8400-e29b-41d4-a716-446655440000',
  size: 104857600
}
```
**Finalization Request:**

```http
POST /upload.php?chunked=1&finalize=1
Content-Type: application/json

{
  "fileId": "550e8400-e29b-41d4-a716-446655440000",
  "filename": "movie.mkv",
  "chunks": 12
}
```
### `uploadBatch(files)`

Upload multiple files concurrently with individual error handling.

**Parameters:**

- `files` {Array} - Array of file objects (required)

**FileObject Schema:**

```ts
{
  buffer: ArrayBuffer,
  name: string
}
```

**Returns:** `Promise<Array<PromiseSettledResult<UploadResult>>>`
**Example:**

```js
const results = await cloudku.uploadBatch([
  { buffer: buffer1, name: 'image1.jpg' },
  { buffer: buffer2, name: 'image2.png' },
  { buffer: buffer3, name: 'video.mp4' }
])

results.forEach((result, index) => {
  if (result.status === 'fulfilled') {
    console.log(`Success: ${result.value.url}`)
  } else {
    console.error(`Failed: ${result.reason}`)
  }
})
```
## 🔧 Advanced Usage

### Error Handling

```js
try {
  const result = await cloudku.uploadFile(buffer, 'image.jpg')
  if (result.status === 'error') {
    throw new Error(result.message)
  }
  console.log('Uploaded:', result.url)
} catch (error) {
  console.error('Upload failed:', error.message)
}
```
### Retry Logic

```js
async function uploadWithRetry(buffer, name, maxRetries = 3) {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await cloudku.uploadFile(buffer, name)
    } catch (error) {
      if (i === maxRetries - 1) throw error
      // Linear backoff: wait 1s, then 2s, then 3s between attempts.
      await new Promise(resolve => setTimeout(resolve, 1000 * (i + 1)))
    }
  }
}
```
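Usage mirrors a plain `uploadFile` call; the helper only adds the retry loop:

```js
// Retries up to 3 times with growing delays before surfacing the last error.
const result = await uploadWithRetry(buffer, 'image.jpg')
console.log('Uploaded after retries:', result.url)
```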
### Custom Chunk Size Based on Connection

```js
function getOptimalChunkSize(connectionType) {
  const sizes = {
    '4g': 16 * 1024 * 1024,
    'wifi': 10 * 1024 * 1024,
    '3g': 4 * 1024 * 1024,
    'slow-2g': 1 * 1024 * 1024
  }
  return sizes[connectionType] || 8 * 1024 * 1024
}

// effectiveType reports 'slow-2g' | '2g' | '3g' | '4g'; 'wifi' is only
// the fallback key used when the Network Information API is unavailable.
const connection = navigator.connection?.effectiveType || 'wifi'
const chunkSize = getOptimalChunkSize(connection)
const result = await cloudku.uploadLarge(buffer, 'file.zip', chunkSize)
```
### Parallel Batch Upload with Concurrency Limit

```js
async function uploadBatchWithLimit(files, limit = 3) {
  const results = []
  // Process files in groups of `limit`: each group runs in parallel,
  // and the next group starts only after the current one settles.
  for (let i = 0; i < files.length; i += limit) {
    const batch = files.slice(i, i + limit)
    const batchResults = await cloudku.uploadBatch(batch)
    results.push(...batchResults)
  }
  return results
}
```
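Note the trade-off: each group must settle before the next begins, so one slow file can stall its whole group, but at most `limit` uploads are ever in flight. Usage:

```js
// At most three uploads run at once; results keep the input order.
const results = await uploadBatchWithLimit(files, 3)
const ok = results.filter(r => r.status === 'fulfilled').length
console.log(`${ok}/${files.length} uploaded`)
```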
## ⚙️ Technical Details

### Endpoint Selection

The uploader uses random selection between two endpoints:

- https://cloudkuimages.guru
- https://cloudkuimages-guru.us.itpanel.app

This provides basic load balancing and failover capability.
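The upload flow below calls this step `pickBase()`. Conceptually it is just a random pick, along these lines (a sketch, not the library's actual source):

```js
// Sketch of random endpoint selection; the real pickBase() may differ.
const BASES = [
  'https://cloudkuimages.guru',
  'https://cloudkuimages-guru.us.itpanel.app'
]

const pickBase = () => BASES[Math.floor(Math.random() * BASES.length)]
```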
### Upload Flow

**Small File (≤ 100MB):**

```text
Client → pickBase() → POST /upload.php → Response
```

**Large File (> 100MB):**

```text
Client → Generate UUID
       → Split into chunks
       → For each chunk:
           → POST /upload.php (with metadata)
       → POST /upload.php?chunked=1&finalize=1
       → Response
```

All requests include the following headers:

```js
{
  'User-Agent': 'cloudku-uploader/5.0',
  'Accept': 'application/json'
}
```
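Put together, the small-file path is roughly the request below. This is a sketch under assumptions: the form field name `file` is taken from the chunk upload schema above and is not confirmed for the non-chunked path.

```js
// Sketch of the small-file flow, assuming the form field is named
// 'file' as in the chunk upload schema. Not the library's source.
async function sketchUpload(buffer, name = 'file.bin') {
  const form = new FormData()
  form.append('file', new Blob([buffer]), name)

  const res = await fetch(`${pickBase()}/upload.php`, {
    method: 'POST',
    // Browsers silently drop a custom User-Agent; Node's fetch sends it.
    headers: { 'User-Agent': 'cloudku-uploader/5.0', Accept: 'application/json' },
    body: form // fetch adds the multipart boundary automatically
  })
  return res.json()
}
```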
### File Size Limits
- Single upload: Recommended up to 100MB
- Chunked upload: No hard limit (tested up to 5GB)
- Default chunk size: 8MB (8,388,608 bytes)
- Recommended chunk range: 4MB - 16MB
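If chunk size is user-configurable, one way to stay inside the recommended range is a small clamp (hypothetical helper):

```js
// Clamp a requested chunk size to the recommended 4MB–16MB range.
const clampChunkSize = (bytes) =>
  Math.min(16 * 1024 * 1024, Math.max(4 * 1024 * 1024, bytes))

clampChunkSize(32 * 1024 * 1024) // → 16777216 (capped at 16MB)
clampChunkSize(1 * 1024 * 1024)  // → 4194304 (raised to 4MB)
```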
## 🌐 Environment Support

### Browser
- ✅ Chrome 90+
- ✅ Firefox 88+
- ✅ Safari 14+
- ✅ Edge 90+
### Node.js
- ✅ Node.js 14.x
- ✅ Node.js 16.x
- ✅ Node.js 18.x
- ✅ Node.js 20.x
### Frameworks
- ✅ React / Next.js
- ✅ Vue / Nuxt
- ✅ Angular
- ✅ Svelte / SvelteKit
### Module Systems
- ✅ ESM (ES Modules)
- ✅ CommonJS
- ✅ UMD (via bundlers)
## 🛠️ Module Formats

### ESM Import

```js
import cloudku from 'cloudku-uploader'
```

### CommonJS Require

```js
const cloudku = require('cloudku-uploader')
```

### TypeScript

```ts
import cloudku from 'cloudku-uploader'
import type { UploadResult, FileObject } from 'cloudku-uploader'

const result: UploadResult = await cloudku.uploadFile(buffer, 'file.jpg')
```
## 📝 Type Definitions

```ts
interface UploadResult {
  status: 'success' | 'error'
  url?: string
  filename?: string
  size?: number
  message?: string
}

interface FileObject {
  buffer: ArrayBuffer
  name: string
}

interface CloudkuUploader {
  uploadFile(buffer: ArrayBuffer, name?: string): Promise<UploadResult>
  uploadLarge(buffer: ArrayBuffer, name?: string, chunkSize?: number): Promise<UploadResult>
  uploadBatch(files: FileObject[]): Promise<PromiseSettledResult<UploadResult>[]>
}
```
## 🤝 Contributing

Contributions are welcome! Please follow these guidelines:

1. Fork the repository
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Commit your changes: `git commit -m 'Add amazing feature'`
4. Push to the branch: `git push origin feature/amazing-feature`
5. Open a Pull Request

Please ensure:

- Code follows existing style
- All tests pass
- Documentation is updated
- Commit messages are clear
## 📄 License
MIT License - see LICENSE file for details
## 💬 Support
- 📫 Open an issue on GitHub
- 💡 Check existing issues for solutions
- 📖 Read the full documentation
## ⭐ Acknowledgments
Special thanks to the CloudKu team for providing the hosting infrastructure.
Made with ❤️ for the JavaScript community