@vercel/blob
The Vercel Blob JavaScript API client.
npm install @vercel/blob
import * as vercelBlob from '@vercel/blob';
// usage
async function someMethod() {
const blob = await vercelBlob.put(
'profilesv1/user-12345.txt', // pathname for the blob
'Hello World!', // body
{ access: 'public' }, // mandatory options
);
console.log(blob.url);
// https://public.blob.vercel-storage.com/n1g9m63etib6gkcjqjpspsiwe7ea/profilesv1/user-12345-NoOVGDVcqSPc7VYCUAGnTzLTG2qEM2.txt
}
put(pathname, body, options)
Uploads a blob to the Vercel Blob store and returns the blob's metadata, including its URL.
async function put(
pathname: string,
body: ReadableStream | string | ArrayBuffer | Blob | File, // All fetch body types are supported: https://developer.mozilla.org/en-US/docs/Web/API/fetch#body
options: {
access: 'public', // mandatory, as we will provide private blobs in the future
contentType?: string, // by default inferred from pathname
// `token` defaults to process.env.BLOB_READ_WRITE_TOKEN on Vercel
// and can be configured when you connect more stores to a project
// or using Vercel Blob outside of Vercel
// on the client `token` is mandatory and must be generated by "generateClientTokenFromReadWriteToken"
token?: string,
}): Promise<{
size: number;
uploadedAt: Date;
pathname: string;
contentType: string;
contentDisposition: string;
url: string;
}> {}
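The `contentType` option defaults to a value inferred from the pathname. A simplified sketch of that kind of extension-based inference (a hypothetical helper, not the library's actual implementation) could look like:

```typescript
// Hypothetical sketch: map a pathname's extension to a MIME type, the way
// `put` infers `contentType` when the option is omitted. The real library's
// mapping is more complete; this table is illustrative only.
const MIME_BY_EXTENSION: Record<string, string> = {
  txt: 'text/plain',
  json: 'application/json',
  png: 'image/png',
  jpg: 'image/jpeg',
};

function inferContentType(pathname: string): string {
  const extension = pathname.split('.').pop()?.toLowerCase() ?? '';
  return MIME_BY_EXTENSION[extension] ?? 'application/octet-stream';
}
```

Passing an explicit `contentType` to `put` sidesteps any inference.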
del(url, options)
Delete one or multiple blobs by their full URL. Returns the deleted blob(s), or null for any blob that was not found.
async function del(
url: string | string[],
options?: {
token?: string;
},
): Promise<
| {
size: number;
uploadedAt: Date;
pathname: string;
contentType: string;
contentDisposition: string;
url: string;
}
| null
| ({
size: number;
uploadedAt: Date;
pathname: string;
contentType: string;
contentDisposition: string;
url: string;
} | null)[]
> {}
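When deleting multiple blobs, some entries may come back as null. A small pure-logic helper for splitting the outcome (a hypothetical sketch, assuming the result array mirrors the order of the input URLs):

```typescript
// Hypothetical helper: pair each requested URL with its `del` result and
// partition them into deleted URLs and not-found URLs.
type DelResult = { url: string } | null;

function partitionDelResults(
  urls: string[],
  results: DelResult[],
): { deleted: string[]; notFound: string[] } {
  const deleted: string[] = [];
  const notFound: string[] = [];
  results.forEach((result, index) => {
    if (result === null) {
      notFound.push(urls[index]); // null means this blob did not exist
    } else {
      deleted.push(result.url);
    }
  });
  return { deleted, notFound };
}
```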
head(url, options)
Get the metadata of a blob by its full URL. Returns null when the blob does not exist.
async function head(
url: string,
options?: {
token?: string;
},
): Promise<{
size: number;
uploadedAt: Date;
pathname: string;
contentType: string;
contentDisposition: string;
url: string;
} | null> {}
list(options)
List blobs in the store along with their metadata, optionally filtered by a prefix and capped by a limit. Use the cursor to paginate through them.
async function list(options?: {
token?: string;
limit?: number; // defaults to 1,000
prefix?: string;
cursor?: string;
}): Promise<{
blobs: {
size: number;
uploadedAt: Date;
pathname: string;
contentType: string;
contentDisposition: string;
url: string;
}[];
cursor?: string;
hasMore: boolean;
}> {}
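Because pathnames can contain `/` separators (as in the `profilesv1/user-12345.txt` example above), the prefix option effectively behaves like a folder filter. As a pure-logic illustration, a hypothetical helper that groups listed pathnames by their top-level "folder":

```typescript
// Hypothetical helper: group blob pathnames by the segment before the first
// '/', treating pathnames without a '/' as belonging to the root ('').
function groupByTopFolder(pathnames: string[]): Map<string, string[]> {
  const groups = new Map<string, string[]>();
  for (const pathname of pathnames) {
    const slash = pathname.indexOf('/');
    const folder = slash === -1 ? '' : pathname.slice(0, slash);
    const existing = groups.get(folder) ?? [];
    existing.push(pathname);
    groups.set(folder, existing);
  }
  return groups;
}
```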
handleBlobUpload(options)
Handles requests to generate a client token and responds to the upload-completed event. This is useful when uploading directly from browsers to circumvent the 4 MB limit of going through a Vercel-hosted route.
async function handleBlobUpload(options: {
token?: string;
request?: IncomingMessage | Request;
onBeforeGenerateToken: (pathname: string) => Promise<{
allowedContentTypes?: string[];
maximumSizeInBytes?: number;
validUntil?: number; // timestamp in ms, defaults to 30 seconds from now
metadata?: string;
}>;
onUploadCompleted: (body: {
type: 'blob.upload-completed';
payload: {
blob: BlobResult;
metadata?: string;
};
}) => Promise<void>;
body:
| {
type: 'blob.upload-completed';
payload: {
blob: BlobResult;
metadata?: string;
};
}
| {
type: 'blob.generate-client-token';
payload: { pathname: string; callbackUrl: string };
};
}): string {}
Note: This method should be called server-side, not client-side.
generateClientTokenFromReadWriteToken(options)
Generates a single-use token that can be used from within the client. This method is called internally by handleBlobUpload.
Once created, a client token is valid for 30 seconds by default (this can be customized by configuring the validUntil field). This means you have 30 seconds to initiate an upload with this token.
async function generateClientTokenFromReadWriteToken(options?: {
token?: string;
pathname?: string;
onUploadCompleted?: {
callbackUrl: string;
metadata?: string;
};
maximumSizeInBytes?: number;
allowedContentTypes?: string[];
validUntil?: number; // timestamp in ms, defaults to 30 seconds from now
}): Promise<string> {}
Note: This method should be called server-side, not client-side.
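Since validUntil is an absolute timestamp in milliseconds, extending a token's lifetime beyond the 30-second default means computing it relative to the current time. A tiny hypothetical helper:

```typescript
// Hypothetical helper: compute a `validUntil` timestamp `seconds` from now.
// `now` is injectable for testing; it defaults to the current time.
function validUntilFromNow(seconds: number, now: number = Date.now()): number {
  return now + seconds * 1000;
}

// e.g. generateClientTokenFromReadWriteToken({ validUntil: validUntilFromNow(60) })
```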
This example shows a form uploading a file to the Vercel Blob API.
// /app/UploadForm.tsx
'use client';
import type { BlobResult } from '@vercel/blob';
import { useState } from 'react';
export default function UploadForm() {
const [blob, setBlob] = useState<BlobResult | null>(null);
return (
<>
<form
action="/api/upload"
method="POST"
encType="multipart/form-data"
onSubmit={async (event) => {
event.preventDefault();
const formData = new FormData(event.currentTarget);
const response = await fetch('/api/upload', {
method: 'POST',
body: formData,
});
const blob = (await response.json()) as BlobResult;
setBlob(blob);
}}
>
<input type="file" name="file" />
<button type="submit">Upload</button>
</form>
{blob && (
<div>
Blob url: <a href={blob.url}>{blob.url}</a>
</div>
)}
</>
);
}
// /app/api/upload/route.ts
import * as vercelBlob from '@vercel/blob';
import { NextResponse } from 'next/server';
export async function POST(request: Request) {
const form = await request.formData();
const file = form.get('file') as File;
if (!file) {
return NextResponse.json(
{ message: 'No file to upload.' },
{ status: 400 },
);
}
const blob = await vercelBlob.put(file.name, file, { access: 'public' });
return NextResponse.json(blob);
}
The above example uploads a file through a Vercel-hosted route, which is limited to a 4 MB request body. To bypass this limit, you can upload a file directly from the client after generating a single-use token.
// /app/UploadForm.tsx
'use client';
import { put, type BlobResult } from '@vercel/blob';
import { useRef, useState } from 'react';
export default function UploadForm() {
const inputFileRef = useRef<HTMLInputElement>(null);
const [blob, setBlob] = useState<BlobResult | null>(null);
return (
<>
<h1>App Router Client Upload</h1>
<form
onSubmit={async (event): Promise<void> => {
event.preventDefault();
const file = inputFileRef.current?.files?.[0];
if (!file) {
return;
}
// A secured client token will be obtained automatically by making a call to `/api/upload/avatars`
const blobResult = await put(file.name, file, {
access: 'public',
handleBlobUploadUrl: '/api/upload/avatars',
});
setBlob(blobResult);
}}
>
<input name="file" ref={inputFileRef} type="file" />
<button type="submit">Upload</button>
</form>
{blob && (
<div>
Blob url: <a href={blob.url}>{blob.url}</a>
</div>
)}
</>
);
}
// /app/api/upload/avatars/route.ts
import { handleBlobUpload, type HandleBlobUploadBody } from '@vercel/blob';
import { NextResponse } from 'next/server';
export async function POST(request: Request): Promise<NextResponse> {
const body = (await request.json()) as HandleBlobUploadBody;
// Uploading from browsers is a three-step process:
// - First, a specific client token is generated by this current route and sent to the browser. The request body will contain a `{ type: "blob.generate-client-token", ... }` object.
// - Second, the file is uploaded to Vercel Blob directly from the browser, using the client token. The file content doesn't go through this route.
// - Third, a webhook is called on this route (`onUploadCompleted`). The request body will contain a `{ type: "blob.upload-completed", ... }` object. This webhook will be retried five times in case your route doesn't reply with a 200.
try {
const jsonResponse = await handleBlobUpload({
body,
request,
onBeforeGenerateToken: async (pathname) => {
// In most cases, you should authenticate users before allowing upload tokens to be generated and sent to browsers. Otherwise, you're exposing your Blob store to be an anonymous upload platform.
// See https://nextjs.org/docs/pages/building-your-application/routing/authenticating for more information
const { user, userCanUpload } = await auth(request, pathname); // "auth" is your own authentication helper
if (!userCanUpload) {
throw new Error('not authenticated or bad pathname');
}
return {
maximumSizeInBytes: 10_000_000,
allowedContentTypes: ['image/jpeg', 'image/png', 'image/gif'],
metadata: JSON.stringify({
userId: user.id,
}),
};
},
onUploadCompleted: async ({ blob, metadata }) => {
console.log('Upload completed', blob, metadata);
try {
// Run any logic after the file upload completed ("db" below is your own database client)
const parsedMetadata = JSON.parse(metadata);
await db.update({ avatar: blob.url, userId: parsedMetadata.userId });
} catch (error) {
// In case of error, "onUploadCompleted" will be retried up to 5 times
throw new Error('Could not update user');
}
},
});
return NextResponse.json(jsonResponse);
} catch (error) {
return NextResponse.json(
{ error: (error as Error).message },
{ status: 400 },
);
}
}
This will paginate through all your blobs in chunks of 1,000 blobs. You can control the number of blobs in each call with the limit option.
let hasMore = true;
let cursor: string | undefined;
while (hasMore) {
const listResult = await vercelBlob.list({
cursor,
});
console.log(listResult);
hasMore = listResult.hasMore;
cursor = listResult.cursor;
}
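The loop above can be generalized into an async generator that yields one page of blobs at a time. The list function is injected here so the sketch can run without network access; in real use, you would pass vercelBlob.list:

```typescript
// A reusable sketch of the pagination loop: yields each page of blobs until
// `hasMore` is false, threading the cursor between calls.
type ListPage<T> = { blobs: T[]; cursor?: string; hasMore: boolean };

async function* listAllPages<T>(
  listFn: (options: { cursor?: string }) => Promise<ListPage<T>>,
): AsyncGenerator<T[]> {
  let cursor: string | undefined;
  let hasMore = true;
  while (hasMore) {
    const page = await listFn({ cursor });
    yield page.blobs;
    hasMore = page.hasMore;
    cursor = page.cursor;
  }
}
```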
All methods of this module will throw if the request fails. You should account for that in your code by wrapping our methods in a try/catch block:
try {
await vercelBlob.put('foo', 'bar');
} catch (error) {
if (error instanceof vercelBlob.BlobAccessError) {
// handle error
} else {
// rethrow
throw error;
}
}
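For transient failures (as opposed to access errors, which will not succeed on retry), you may want a small retry wrapper around these calls. A minimal sketch, with the retried function injected so no network access is assumed:

```typescript
// Minimal retry sketch: re-invoke `fn` up to `attempts` times, rethrowing the
// last error if every attempt fails. Deciding which errors are retryable
// (e.g. skipping BlobAccessError) is left to the caller.
async function withRetries<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
    }
  }
  throw lastError;
}
```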
pnpm changeset
git commit -am "New version"
Once such a commit is merged into main, GitHub will open a versioning PR that you can merge. The package will then be automatically published to npm.
When using Serverless or Edge Functions on Vercel, the request body size is limited to 4MB.
When you want to send files larger than that to Vercel Blob, you can do so by using @vercel/blob
from a regular Node.js script context (like at build time). This way the request body will be sent directly to Vercel Blob and not via an Edge or Serverless Function.
We plan to allow sending larger files to Vercel Blob from browser contexts soon.
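A build-time upload script along those lines might look like the following sketch. The uploads/ pathname prefix and the injectable putFn parameter are assumptions for illustration and testability; in a real script you would pass put from @vercel/blob:

```typescript
import { createReadStream } from 'node:fs';
import { basename } from 'node:path';

// The upload call is injected so this sketch can run without a Blob store;
// in a real Node.js script, pass `put` from '@vercel/blob'.
type PutFn = (
  pathname: string,
  body: NodeJS.ReadableStream,
  options: { access: 'public' },
) => Promise<{ url: string }>;

async function uploadLocalFile(filePath: string, putFn: PutFn): Promise<string> {
  // Streaming the file keeps memory usage flat even for large files.
  const pathname = `uploads/${basename(filePath)}`;
  const { url } = await putFn(pathname, createReadStream(filePath), {
    access: 'public',
  });
  return url;
}
```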
@vercel/blob reads the token from the environment variables on process.env. In general, process.env is automatically populated from your .env file during development, which is created when you run vc env pull. However, Vite does not expose the .env variables on process.env.
You can fix this in one of the following two ways:
1. Populate process.env yourself using something like dotenv-expand:
pnpm install --save-dev dotenv dotenv-expand
// vite.config.js
import dotenvExpand from 'dotenv-expand';
import { loadEnv, defineConfig } from 'vite';
export default defineConfig(({ mode }) => {
// This check is important!
if (mode === 'development') {
const env = loadEnv(mode, process.cwd(), '');
dotenvExpand.expand({ parsed: env });
}
return {
...
};
});
2. In SvelteKit, provide the token via $env/static/private:
import { head } from '@vercel/blob';
+ import { BLOB_TOKEN } from '$env/static/private';
const blob = await head('filepath', {
- token: '<token>',
+ token: BLOB_TOKEN,
});