@azure/storage-blob
The @azure/storage-blob npm package is designed for working with Azure Blob Storage. It provides a comprehensive set of features to interact with Azure Blob Storage, including uploading and downloading blobs, managing containers, and handling blob metadata and properties. This package is part of the Azure SDK for JavaScript.
Uploading blobs
This code sample demonstrates how to upload data to a blob in Azure Blob Storage. It creates a BlobServiceClient, gets a reference to a container, and then uploads data to a blob within that container. The snippet assumes it runs inside an async function and that data holds the content to upload (for example, a string or a Buffer).
const { BlobServiceClient } = require('@azure/storage-blob');
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);
const containerClient = blobServiceClient.getContainerClient('my-container');
const blockBlobClient = containerClient.getBlockBlobClient('my-blob');
await blockBlobClient.upload(data, data.length);
Downloading blobs
This code sample shows how to download the content of a blob. It creates a BlobServiceClient, gets a container client, and then downloads the blob's content using the block blob client. Like the previous sample, it assumes an async context, and it relies on a streamToString helper (a sketch of that helper follows the snippet).
const { BlobServiceClient } = require('@azure/storage-blob');
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);
const containerClient = blobServiceClient.getContainerClient('my-container');
const blockBlobClient = containerClient.getBlockBlobClient('my-blob');
const downloadBlockBlobResponse = await blockBlobClient.download(0);
const downloadedContent = await streamToString(downloadBlockBlobResponse.readableStreamBody);
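streamToString is not part of the SDK; it is an application-side helper. A minimal Node.js sketch, matching the helper defined in the full download example later in this document:
// [Node.js only] Reads a Node.js readable stream into a string.
async function streamToString(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data.toString());
    });
    readableStream.on("end", () => {
      resolve(chunks.join(""));
    });
    readableStream.on("error", reject);
  });
}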
Listing blobs in a container
This example demonstrates how to list all blobs in a specific container. It involves creating a BlobServiceClient, obtaining a container client, and iterating over the blobs in the container, printing out their names.
const { BlobServiceClient } = require('@azure/storage-blob');
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);
const containerClient = blobServiceClient.getContainerClient('my-container');
for await (const blob of containerClient.listBlobsFlat()) {
console.log(`Blob name: ${blob.name}`);
}
The AWS SDK for JavaScript allows developers to interact with AWS services, including Amazon S3 for object storage, which is similar to Azure Blob Storage. Compared to @azure/storage-blob, aws-sdk supports a broader range of AWS services but is specific to AWS infrastructure.
Azure Storage Blob is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that does not adhere to a particular data model or definition, such as text or binary data.
This project provides a client library in JavaScript that makes it easy to consume Microsoft Azure Storage Blob service.
Use the client libraries in this package to create and manage containers and to upload, download, list, and delete blobs.
Source code | Package (npm) | API Reference Documentation | Product documentation | Samples | Azure Storage Blob REST APIs
Blob storage is designed for serving images or documents directly to a browser, streaming video and audio, writing to log files, and storing data for backup, archiving, and analysis.
Blob storage offers three types of resources, each with a corresponding client class in this library:
The storage account, accessed through a BlobServiceClient
A container in the storage account, accessed through a ContainerClient
A blob within a container, accessed through a BlobClient
Prerequisites: You must have an Azure subscription and a Storage Account to use this package. If you are using this package in a Node.js application, then Node.js version 8.0.0 or higher is required.
The preferred way to install the Azure Storage Blob client library for JavaScript is to use the npm package manager. Type the following into a terminal window:
npm install @azure/storage-blob
The Azure Blob Storage service supports the use of Azure Active Directory to authenticate requests to its APIs. The @azure/identity package provides a variety of credential types that your application can use to do this. The README for @azure/identity provides more details and samples to get you started.
This library is compatible with Node.js and browsers, and validated against LTS Node.js versions (>=8.16.0) and latest versions of Chrome, Firefox and Edge.
You need polyfills to make this library work with IE11. The easiest way is to use @babel/polyfill or a polyfill service.
You can also load separate polyfills for any missing ES features. This library depends on the following ES features, which need external polyfills loaded:
Promise
String.prototype.startsWith
String.prototype.endsWith
String.prototype.repeat
String.prototype.includes
Array.prototype.includes
Object.assign
Object.keys (override IE11's Object.keys with an ES6 polyfill to force ES6 behavior)
Symbol
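As a minimal sketch (assuming you install the @babel/polyfill package and have a single bundler entry file), loading the polyfills once before any other code is enough:
// Entry file of your application bundle (needed only for IE11 support).
// @babel/polyfill provides Promise, Symbol, Object.assign and the other
// prototype methods listed above.
require("@babel/polyfill");

const { BlobServiceClient } = require("@azure/storage-blob");
// ... rest of your application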
There are differences between Node.js and browsers runtime. When getting started with this library, pay attention to APIs or classes marked with "ONLY AVAILABLE IN NODE.JS RUNTIME" or "ONLY AVAILABLE IN BROWSERS".
The following are only available in the Node.js runtime:
StorageSharedKeyCredential
generateAccountSASQueryParameters()
generateBlobSASQueryParameters()
BlockBlobClient.uploadFile()
BlockBlobClient.uploadStream()
BlobClient.downloadToBuffer()
BlobClient.downloadToFile()
The following is only available in browsers:
BlockBlobClient.uploadBrowserData()
To use this client library in the browser, you need to use a bundler. For details on how to do this, please refer to our bundling documentation.
You need to set up Cross-Origin Resource Sharing (CORS) rules for your storage account if you develop for browsers. Go to the Azure portal or Azure Storage Explorer, find your storage account, and create new CORS rules for the blob/queue/file/table service(s).
For example, you can create permissive CORS settings for debugging, but please customize the settings carefully according to your requirements in a production environment.
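As an illustration, here is a hedged sketch of setting permissive, debug-only CORS rules programmatically with BlobServiceClient.setProperties() (assuming a Node.js script authenticated with a shared key credential; the rule values are examples, and you should verify the CorsRule field names against the SDK version you use):
const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");

async function setDebugCors() {
  const sharedKeyCredential = new StorageSharedKeyCredential("<account>", "<accountkey>");
  const blobServiceClient = new BlobServiceClient(
    "https://<account>.blob.core.windows.net",
    sharedKeyCredential
  );
  // Keep the existing service properties and replace only the CORS rules.
  const properties = await blobServiceClient.getProperties();
  properties.cors = [
    {
      allowedOrigins: "*", // debugging only; restrict in production
      allowedMethods: "GET,HEAD,OPTIONS,PUT,POST,DELETE,MERGE,PATCH",
      allowedHeaders: "*",
      exposedHeaders: "*",
      maxAgeInSeconds: 86400
    }
  ];
  await blobServiceClient.setProperties(properties);
}

setDebugCors();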
To use the clients, import the package into your file:
const AzureStorageBlob = require("@azure/storage-blob");
Alternatively, selectively import only the types you need:
const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");
Use the constructor to create an instance of BlobServiceClient. The recommended way to instantiate a BlobServiceClient is with DefaultAzureCredential from the @azure/identity package.
Setup: Reference - Authorize access to blobs and queues with Azure Active Directory from a client application - https://docs.microsoft.com/azure/storage/common/storage-auth-aad-app
Register a new AAD application and give it permissions to access Azure Storage on behalf of the signed-in user:
In the API permissions section, select Add a permission and choose Microsoft APIs.
Pick Azure Storage and select the checkbox next to user_impersonation, then click Add permissions. This allows the application to access Azure Storage on behalf of the signed-in user.
Grant access to Azure Blob data with RBAC in the Azure portal:
In the Access control (IAM) tab (in the left-side navbar of your storage account in the Azure portal), assign a blob data role (for example, Storage Blob Data Contributor) to the registered application.
Environment setup for the sample:
Note down the CLIENT ID and TENANT ID of the registered application. In the "Certificates & Secrets" tab, create a secret and note that down.
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
// Enter your storage account name
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
defaultAzureCredential
);
See the Azure AD Auth sample for a complete example using this method.
[Note: the above steps apply only to Node.js.]
Alternatively, you can instantiate a BlobServiceClient with a StorageSharedKeyCredential by passing the account name and account key as arguments (both can be obtained from the Azure portal).
[ONLY AVAILABLE IN NODE.JS RUNTIME]
const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");
// Enter your storage account name and shared key
const account = "<account>";
const accountKey = "<accountkey>";
// Use StorageSharedKeyCredential with storage account and account key
// StorageSharedKeyCredential is only available in Node.js runtime, not in browsers
const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
sharedKeyCredential
);
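With a StorageSharedKeyCredential in hand you can also generate Shared Access Signatures, one of the Node.js-only capabilities listed earlier. A hedged sketch, reusing the account and sharedKeyCredential from the snippet above (the container name, blob name, and one-hour expiry are illustrative values):
const {
  BlobSASPermissions,
  generateBlobSASQueryParameters
} = require("@azure/storage-blob");

// Build a read-only SAS token for a single blob, valid for one hour.
const sasToken = generateBlobSASQueryParameters(
  {
    containerName: "my-container", // illustrative container name
    blobName: "my-blob",           // illustrative blob name
    permissions: BlobSASPermissions.parse("r"),
    expiresOn: new Date(Date.now() + 60 * 60 * 1000)
  },
  sharedKeyCredential
).toString();

// Append the token to the blob URL to grant time-limited read access.
const sasUrl = `https://${account}.blob.core.windows.net/my-container/my-blob?${sasToken}`;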
Use BlobServiceClient.getContainerClient() to get a container client instance, then create a new container resource.
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
defaultAzureCredential
);
async function main() {
// Create a container
const containerName = `newcontainer${new Date().getTime()}`;
const containerClient = blobServiceClient.getContainerClient(containerName);
const createContainerResponse = await containerClient.create();
console.log(`Create container ${containerName} successfully`, createContainerResponse.requestId);
}
main();
Use the BlobServiceClient.listContainers() function to iterate the containers, with the new for-await-of syntax:
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
defaultAzureCredential
);
async function main() {
let i = 1;
let iter = blobServiceClient.listContainers();
for await (const container of iter) {
console.log(`Container ${i++}: ${container.name}`);
}
}
main();
Alternatively, without using for-await-of:
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
defaultAzureCredential
);
async function main() {
let i = 1;
let iter = blobServiceClient.listContainers();
let containerItem = await iter.next();
while (!containerItem.done) {
console.log(`Container ${i++}: ${containerItem.value.name}`);
containerItem = await iter.next();
}
}
main();
In addition, pagination is supported for listing via byPage():
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
defaultAzureCredential
);
async function main() {
let i = 1;
for await (const response of blobServiceClient.listContainers().byPage({ maxPageSize: 20 })) {
if (response.containerItems) {
for (const container of response.containerItems) {
console.log(`Container ${i++}: ${container.name}`);
}
}
}
}
main();
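byPage() also accepts a continuation token, which lets you resume listing where a previous page left off. A hedged sketch (the page sizes are arbitrary, and blobServiceClient is assumed to be the client created above):
async function listWithMarker() {
  // Fetch the first page of up to 2 containers.
  let iterator = blobServiceClient.listContainers().byPage({ maxPageSize: 2 });
  let response = (await iterator.next()).value;
  for (const container of response.containerItems || []) {
    console.log(`First page: ${container.name}`);
  }

  // Resume from the continuation token returned with the first page.
  // (In a real app, check that the marker is defined before requesting more.)
  const marker = response.continuationToken;
  iterator = blobServiceClient
    .listContainers()
    .byPage({ continuationToken: marker, maxPageSize: 10 });
  response = (await iterator.next()).value;
  for (const container of response.containerItems || []) {
    console.log(`Next page: ${container.name}`);
  }
}

listWithMarker();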
For a complete sample on iterating containers, please see samples/iterators-containers.ts.
Create a blob by uploading data:
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
defaultAzureCredential
);
const containerName = "<container name>";
async function main() {
const containerClient = blobServiceClient.getContainerClient(containerName);
const content = "Hello world!";
const blobName = "newblob" + new Date().getTime();
const blockBlobClient = containerClient.getBlockBlobClient(blobName);
const uploadBlobResponse = await blockBlobClient.upload(content, content.length);
console.log(`Upload block blob ${blobName} successfully`, uploadBlobResponse.requestId);
}
main();
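In Node.js you can also upload directly from a local file with BlockBlobClient.uploadFile(), one of the Node.js-only helpers listed earlier. A hedged sketch (the file path and tuning options are illustrative, and blobServiceClient and containerName are assumed to be defined as in the example above):
// [Node.js only] Upload a local file in parallel blocks.
async function uploadLocalFile() {
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const blockBlobClient = containerClient.getBlockBlobClient("my-local-file.txt");
  const uploadResponse = await blockBlobClient.uploadFile("./my-local-file.txt", {
    blockSize: 4 * 1024 * 1024, // 4 MiB blocks
    concurrency: 20             // up to 20 parallel transfers
  });
  console.log("Uploaded local file, request id:", uploadResponse.requestId);
}

uploadLocalFile();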
List the blobs inside a container. This is similar to listing containers:
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
defaultAzureCredential
);
const containerName = "<container name>";
async function main() {
const containerClient = blobServiceClient.getContainerClient(containerName);
let i = 1;
let iter = containerClient.listBlobsFlat();
for await (const blob of iter) {
console.log(`Blob ${i++}: ${blob.name}`);
}
}
main();
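Blobs can also be listed by virtual directory using ContainerClient.listBlobsByHierarchy(). A hedged sketch (the "/" delimiter and the prefix are illustrative, and blobServiceClient and containerName are assumed to be defined as in the example above):
async function listByHierarchy() {
  const containerClient = blobServiceClient.getContainerClient(containerName);
  // Items with kind "prefix" are virtual directories; items with kind "blob" are blobs.
  for await (const item of containerClient.listBlobsByHierarchy("/", { prefix: "folder1/" })) {
    if (item.kind === "prefix") {
      console.log(`Virtual directory: ${item.name}`);
    } else {
      console.log(`Blob: ${item.name}`);
    }
  }
}

listByHierarchy();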
For a complete sample on iterating blobs, please see samples/iterators-blobs.ts.
Download a blob and convert it to a string (Node.js):
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
defaultAzureCredential
);
const containerName = "<container name>";
const blobName = "<blob name>"
async function main() {
const containerClient = blobServiceClient.getContainerClient(containerName);
const blobClient = containerClient.getBlobClient(blobName);
// Get blob content from position 0 to the end
// In Node.js, get downloaded data by accessing downloadBlockBlobResponse.readableStreamBody
const downloadBlockBlobResponse = await blobClient.download();
const downloaded = await streamToString(downloadBlockBlobResponse.readableStreamBody);
console.log("Downloaded blob content:", downloaded);
// [Node.js only] A helper method used to read a Node.js readable stream into string
async function streamToString(readableStream) {
return new Promise((resolve, reject) => {
const chunks = [];
readableStream.on("data", (data) => {
chunks.push(data.toString());
});
readableStream.on("end", () => {
resolve(chunks.join(""));
});
readableStream.on("error", reject);
});
}
}
main();
Download a blob and convert it to a string (browsers):
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
defaultAzureCredential
);
const containerName = "<container name>";
const blobName = "<blob name>"
async function main() {
const containerClient = blobServiceClient.getContainerClient(containerName);
const blobClient = containerClient.getBlobClient(blobName);
// Get blob content from position 0 to the end
// In browsers, get downloaded data by accessing downloadBlockBlobResponse.blobBody
const downloadBlockBlobResponse = await blobClient.download();
const downloaded = await blobToString(await downloadBlockBlobResponse.blobBody);
console.log(
"Downloaded blob content",
downloaded
);
// [Browsers only] A helper method used to convert a browser Blob into string.
async function blobToString(blob) {
const fileReader = new FileReader();
return new Promise((resolve, reject) => {
fileReader.onloadend = (ev) => {
resolve(ev.target.result);
};
fileReader.onerror = reject;
fileReader.readAsText(blob);
});
}
}
main();
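For Node.js applications that just need the blob contents in memory, BlobClient.downloadToBuffer() (one of the Node.js-only helpers listed earlier) avoids the manual stream handling. A minimal sketch, assuming the same blobServiceClient, containerName, and blobName variables as the Node.js download example above:
// [Node.js only] Download the entire blob into a Buffer.
async function downloadToBufferExample() {
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const blobClient = containerClient.getBlobClient(blobName);
  const buffer = await blobClient.downloadToBuffer();
  console.log("Downloaded blob content:", buffer.toString());
}

downloadToBufferExample();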
A complete example of basic scenarios is at samples/basic.ts.
Enabling logging may help uncover useful information about failures. In order to see a log of HTTP requests and responses, set the AZURE_LOG_LEVEL environment variable to info. Alternatively, logging can be enabled at runtime by calling setLogLevel in @azure/logger:
import { setLogLevel } from "@azure/logger";
setLogLevel("info");
More code samples are available in the samples folder of the Azure SDK for JavaScript repository.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
FAQs
What is @azure/storage-blob? Microsoft Azure Storage SDK for JavaScript - Blob.
Is @azure/storage-blob well maintained? We found that @azure/storage-blob demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 5 open source maintainers collaborating on the project.