@vercel/blob
The Vercel Blob JavaScript API client.
```sh
npm install @vercel/blob
```
Examples are available in the vercel.com documentation. There are two ways to upload files to Vercel Blob:
put(pathname, body, options)
Uploads a blob to the Vercel Blob store and returns the blob URL along with some metadata.
```ts
async function put(
  pathname: string,
  // All fetch body types are supported: https://developer.mozilla.org/en-US/docs/Web/API/fetch#body
  body: ReadableStream | String | ArrayBuffer | Blob | File,
  options: {
    access: 'public', // mandatory, as we will provide private blobs in the future
    contentType?: string, // by default inferred from pathname
    // `token` defaults to process.env.BLOB_READ_WRITE_TOKEN on Vercel
    // and can be configured when you connect more stores to a project
    // or when using Vercel Blob outside of Vercel
    token?: string,
    addRandomSuffix?: boolean, // optional, enables or disables random suffixes (defaults to `true`)
    cacheControlMaxAge?: number, // optional, a duration in seconds to configure the edge and browser caches. Defaults to one year for browsers and 5 minutes for the edge cache (its maximum). Can only be configured server side (either on a server-side put or during client token generation).
  }
): Promise<{
  pathname: string;
  contentType: string;
  contentDisposition: string;
  url: string;
}> {}
```
del(url, options)
Deletes one or multiple blobs by their full URL. This method doesn't return a value: if a blob URL exists, the blob is deleted by the time del() returns.
```ts
async function del(
  url: string | string[],
  options?: {
    token?: string;
  }
): Promise<void> {}
```
head(url, options)
Gets the metadata of a blob by its full URL. Returns null when the blob does not exist.
```ts
async function head(
  url: string,
  options?: {
    token?: string;
  }
): Promise<{
  size: number;
  uploadedAt: Date;
  pathname: string;
  contentType: string;
  contentDisposition: string;
  url: string;
} | null> {}
```
list(options)
Lists the blobs in the store along with their metadata, with an optional prefix and limit. Use the cursor to paginate through them.
```ts
async function list(options?: {
  token?: string;
  limit?: number; // defaults to 1,000
  prefix?: string;
  cursor?: string;
}): Promise<{
  blobs: {
    size: number;
    uploadedAt: Date;
    pathname: string;
    url: string;
  }[];
  cursor?: string;
  hasMore: boolean;
}> {}
```
upload(pathname, body, options)
The upload method is dedicated to client uploads. It fetches a client token from the handleUploadUrl route before uploading the blob. Read the client uploads documentation to learn more.
```ts
async function upload(
  pathname: string,
  // All fetch body types are supported: https://developer.mozilla.org/en-US/docs/Web/API/fetch#body
  body: ReadableStream | String | ArrayBuffer | Blob | File,
  options: {
    access: 'public', // mandatory, as we will provide private blobs in the future
    contentType?: string, // by default inferred from pathname
    handleUploadUrl?: string, // a string specifying the route to call for generating client tokens for client uploads
    clientPayload?: string, // a string passed to the `onUploadCompleted` callback as `tokenPayload`; it can be used to attach data to the upload, like `JSON.stringify({ postId: 123 })`
  }
): Promise<{
  pathname: string;
  contentType: string;
  contentDisposition: string;
  url: string;
}> {}
```
handleUpload(options)
This is a server-side route helper for managing client uploads. It has two responsibilities: generating client tokens before an upload starts (via onBeforeGenerateToken) and handling the callback once an upload completes (via onUploadCompleted). Read the client uploads documentation to learn more.
```ts
async function handleUpload(options?: {
  token?: string; // defaults to process.env.BLOB_READ_WRITE_TOKEN
  request: IncomingMessage | Request;
  onBeforeGenerateToken: (
    pathname: string,
    clientPayload?: string
  ) => Promise<{
    allowedContentTypes?: string[]; // optional, defaults to no restriction
    maximumSizeInBytes?: number; // optional, default and maximum is 500MB (524,288,000 bytes)
    validUntil?: number; // optional, timestamp in ms, by default now + 30s (30,000)
    addRandomSuffix?: boolean; // see `put` options
    cacheControlMaxAge?: number; // see `put` options
    tokenPayload?: string; // optional, defaults to whatever the client sent as `clientPayload`
  }>;
  onUploadCompleted: (body: {
    type: 'blob.upload-completed';
    payload: {
      blob: PutBlobResult;
      tokenPayload?: string;
    };
  }) => Promise<void>;
  body:
    | {
        type: 'blob.upload-completed';
        payload: {
          blob: PutBlobResult;
          tokenPayload?: string;
        };
      }
    | {
        type: 'blob.generate-client-token';
        payload: {
          pathname: string;
          callbackUrl: string;
          clientPayload: string;
        };
      };
}): Promise<
  | { type: 'blob.generate-client-token'; clientToken: string }
  | { type: 'blob.upload-completed'; response: 'ok' }
> {}
```
The following will paginate through all your blobs in chunks of 1,000. You can control the number of blobs returned by each call with limit.
```ts
let hasMore = true;
let cursor: string | undefined;

while (hasMore) {
  const listResult = await list({
    cursor,
  });
  console.log(listResult);
  hasMore = listResult.hasMore;
  cursor = listResult.cursor;
}
```
All methods of this module throw if the request fails, for example because of an invalid or missing token or a network error. You should account for this in your code by wrapping our methods in a try/catch block:
```ts
import { put, BlobAccessError } from '@vercel/blob';

try {
  await put('foo', 'bar', { access: 'public' });
} catch (error) {
  if (error instanceof BlobAccessError) {
    // handle a known Vercel Blob error
  } else {
    // rethrow
    throw error;
  }
}
```
```sh
pnpm changeset
git commit -am "New version"
```
Once such a commit is merged into main, GitHub will open a versioning PR that you can merge; the package will then be automatically published to npm.
When transferring a file to a Serverless or Edge Function route on Vercel, the request body is limited to 4.5 MB. If you need to send larger files, use the client-upload method.
@vercel/blob reads the token from the environment variables on process.env. In general, process.env is automatically populated from your .env file during development, which is created when you run vc env pull. However, Vite does not expose .env variables on process.env.

You can fix this in one of the following two ways:

1. Expose the variables on process.env yourself, using something like dotenv-expand:

```sh
pnpm install --save-dev dotenv dotenv-expand
```
```js
// vite.config.js
import dotenvExpand from 'dotenv-expand';
import { loadEnv, defineConfig } from 'vite';

export default defineConfig(({ mode }) => {
  // This check is important!
  if (mode === 'development') {
    const env = loadEnv(mode, process.cwd(), '');
    dotenvExpand.expand({ parsed: env });
  }

  return {
    // ...
  };
});
```
2. In SvelteKit, read the token from $env/static/private instead:

```diff
  import { head } from '@vercel/blob';
+ import { BLOB_TOKEN } from '$env/static/private';

  const details = await head('filepath', {
-   token: '<token>',
+   token: BLOB_TOKEN,
  });
```