@azure/storage-file
Azure Files offers fully managed file shares in the cloud that are accessible via the industry standard Server Message Block (SMB) protocol. Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Additionally, Azure file shares can be cached on Windows Servers with Azure File Sync for fast access near where the data is being used.
This project provides a client library in JavaScript that makes it easy to consume the Microsoft Azure File Storage service.
Source code | Package (npm) | API Reference Documentation | Product documentation | Samples | Azure Storage File REST APIs
This library is compatible with Node.js and browsers, and is validated against LTS Node.js versions (>=8.16.0) and the latest versions of Chrome, Firefox, and Edge.
You need polyfills to make this library work with IE11. The easiest way is to use @babel/polyfill or a polyfill service.
You can also load individual polyfills for any missing ES features. This library depends on the following ES features, which require external polyfills:
Promise
String.prototype.startsWith
String.prototype.endsWith
String.prototype.repeat
String.prototype.includes
Array.prototype.includes
Object.keys (override IE11's native Object.keys with an ES6 polyfill to force ES6 behavior)
Symbol
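If you target IE11, one way to load the polyfills above is via core-js — an assumption on my part, since any equivalent polyfill set works; the import paths below are for core-js v3:

```
// Polyfills for the ES features listed above (core-js v3 paths; adjust if
// you use @babel/polyfill or a polyfill service instead)
import "core-js/features/promise";
import "core-js/features/string/starts-with";
import "core-js/features/string/ends-with";
import "core-js/features/string/repeat";
import "core-js/features/string/includes";
import "core-js/features/array/includes";
import "core-js/features/object/keys";
import "core-js/features/symbol";
```

Load these once at your application's entry point, before any code from this library runs.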
There are differences between the Node.js and browser runtimes. When getting started with this library, pay attention to APIs or classes marked "ONLY AVAILABLE IN NODE.JS RUNTIME" or "ONLY AVAILABLE IN BROWSERS".
The following features, interfaces, classes, or functions are only available in Node.js:
SharedKeyCredential
generateAccountSASQueryParameters()
generateFileSASQueryParameters()
FileClient.uploadFile()
FileClient.uploadStream()
FileClient.downloadToBuffer()
FileClient.downloadToFile()
The following feature is only available in browsers:
FileClient.uploadBrowserData()
The preferred way to install the Azure File Storage client library for JavaScript is to use the npm package manager. Simply type the following into a terminal window:
npm install @azure/storage-file@12.0.0-preview.3
In your TypeScript or JavaScript file, import the library as follows:
import * as Azure from "@azure/storage-file";
Or
const Azure = require("@azure/storage-file");
To use the library with a JS bundle in the browser, add a script tag to your HTML page pointing to the downloaded bundle file(s):
<script src="https://mydomain/azure-storage-file.min.js"></script>
The bundled JS file follows the UMD standard; if no module system is found, the following global variable is exported:
azfile
Download the latest released JS bundles from the links on the GitHub release page.
You need to set up Cross-Origin Resource Sharing (CORS) rules for your storage account if you develop for browsers. In the Azure portal or Azure Storage Explorer, find your storage account and create new CORS rules for the blob/queue/file/table service(s).
For example, you can create the following CORS settings for debugging. Please customize the settings carefully according to your requirements in a production environment.
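A permissive debug configuration might look like the following (the example table appears to have been lost in formatting; these values are typical for Azure SDK debugging and should be tightened for production):

```
AllowedOrigins:   *
AllowedMethods:   DELETE,GET,HEAD,MERGE,POST,OPTIONS,PUT
AllowedHeaders:   *
ExposedHeaders:   *
MaxAgeInSeconds:  86400
```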
Use the constructor to create an instance of FileServiceClient, passing in the credential.
// Enter your storage account name and shared key
const account = "";
const accountKey = "";
// Use SharedKeyCredential with storage account and account key
// SharedKeyCredential is only available in the Node.js runtime, not in browsers
const sharedKeyCredential = new SharedKeyCredential(account, accountKey);
const serviceClient = new FileServiceClient(
// When using AnonymousCredential, following url should include a valid SAS
`https://${account}.file.core.windows.net`,
sharedKeyCredential
);
Use FileServiceClient.listShares() to iterate over the shares in this account, with the new for-await-of syntax:
let shareIter1 = serviceClient.listShares();
let i = 1;
for await (const share of shareIter1) {
console.log(`Share${i}: ${share.name}`);
i++;
}
Alternatively, without for-await-of:
let shareIter2 = await serviceClient.listShares();
i = 1;
let shareItem = await shareIter2.next();
while (!shareItem.done) {
console.log(`Share${i++}: ${shareItem.value.name}`);
shareItem = await shareIter2.next();
}
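The two patterns above are equivalent. Here is a minimal, self-contained illustration with a plain async generator standing in for listShares() (the share names are made up; no Azure account is involved):

```javascript
// A plain async generator standing in for serviceClient.listShares()
async function* listSharesStub() {
  yield { name: "share-a" };
  yield { name: "share-b" };
}

// Pattern 1: for-await-of drives the async iterator for you
async function collectWithForAwait() {
  const names = [];
  for await (const share of listSharesStub()) {
    names.push(share.name);
  }
  return names;
}

// Pattern 2: call next() manually until done is true
async function collectWithNext() {
  const names = [];
  const iter = listSharesStub();
  let item = await iter.next();
  while (!item.done) {
    names.push(item.value.name);
    item = await iter.next();
  }
  return names;
}

(async () => {
  console.log(await collectWithForAwait()); // [ 'share-a', 'share-b' ]
  console.log(await collectWithNext()); // same result
})();
```

for-await-of is the more idiomatic choice on Node.js versions that support it; the manual next() loop is useful when you need finer control over when the next page is fetched.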
const shareName = `newshare${new Date().getTime()}`;
const shareClient = serviceClient.getShareClient(shareName);
await shareClient.create();
console.log(`Created share ${shareName} successfully`);
const directoryName = `newdirectory${new Date().getTime()}`;
const directoryClient = shareClient.getDirectoryClient(directoryName);
await directoryClient.create();
console.log(`Created directory ${directoryName} successfully`);
const content = "Hello World!";
const fileName = "newfile" + new Date().getTime();
const fileClient = directoryClient.getFileClient(fileName);
await fileClient.create(content.length);
console.log(`Created file ${fileName} successfully`);
// Upload file range
await fileClient.uploadRange(content, 0, content.length);
console.log(`Uploaded file range "${content}" to ${fileName} successfully`);
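uploadRange takes a starting offset and a byte count, so larger content must be split into ranges. The 4 MiB cap below matches the Azure Files "Put Range" REST limit, but treat the exact constant as an assumption and check the service documentation for your API version. A sketch of the range planning:

```javascript
// Maximum bytes per uploadRange call (assumed 4 MiB, per the Azure Files
// "Put Range" REST API limit -- verify for your API version)
const MAX_RANGE_SIZE = 4 * 1024 * 1024;

// Split a total length into { offset, count } ranges no larger than maxRangeSize
function planRanges(totalLength, maxRangeSize = MAX_RANGE_SIZE) {
  const ranges = [];
  for (let offset = 0; offset < totalLength; offset += maxRangeSize) {
    ranges.push({ offset, count: Math.min(maxRangeSize, totalLength - offset) });
  }
  return ranges;
}

// e.g. a 10 MiB upload becomes three ranges: 4 MiB + 4 MiB + 2 MiB
console.log(planRanges(10 * 1024 * 1024));
```

Each planned range would then be uploaded with something like `await fileClient.uploadRange(chunk, range.offset, range.count)`, slicing `chunk` out of the source buffer at the same offsets.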
Use DirectoryClient.listFilesAndDirectories() to iterate over files and directories, with the new for-await-of syntax. The kind property can be used to identify whether an item is a directory or a file.
let dirIter1 = directoryClient.listFilesAndDirectories();
i = 1;
for await (const item of dirIter1) {
if (item.kind === "directory") {
console.log(`${i} - directory\t: ${item.name}`);
} else {
console.log(`${i} - file\t: ${item.name}`);
}
i++;
}
Alternatively, without using for-await-of:
let dirIter2 = await directoryClient.listFilesAndDirectories();
i = 1;
let item = await dirIter2.next();
while (!item.done) {
if (item.value.kind === "directory") {
console.log(`${i} - directory\t: ${item.value.name}`);
} else {
console.log(`${i} - file\t: ${item.value.name}`);
}
item = await dirIter2.next();
}
For a complete sample on iterating files and directories, please see samples/iterators-files-and-directories.ts.
// Get file content from position 0 to the end
// In Node.js, get downloaded data by accessing downloadFileResponse.readableStreamBody
const downloadFileResponse = await fileClient.download(0);
console.log(
`Downloaded file content${await streamToString(downloadFileResponse.readableStreamBody)}`
);
// [Node.js only] A helper method used to read a Node.js readable stream into string
async function streamToString(readableStream) {
return new Promise((resolve, reject) => {
const chunks = [];
readableStream.on("data", (data) => {
chunks.push(data.toString());
});
readableStream.on("end", () => {
resolve(chunks.join(""));
});
readableStream.on("error", reject);
});
}
// Get file content from position 0 to the end
// In browsers, get downloaded data by accessing downloadFileResponse.blobBody
const downloadFileResponse = await fileClient.download(0);
console.log(
`Downloaded file content${await blobToString(
await downloadFileResponse.blobBody
)}`
);
// [Browser only] A helper method used to convert a browser Blob into string.
export async function blobToString(blob: Blob): Promise<string> {
const fileReader = new FileReader();
return new Promise<string>((resolve, reject) => {
fileReader.onloadend = (ev: any) => {
resolve(ev.target!.result);
};
fileReader.onerror = reject;
fileReader.readAsText(blob);
});
}
A complete example of basic scenarios is at samples/basic.ts.
Turning on console logging can help diagnose issues. Here's an example logger implementation. First, add a custom logger:
class ConsoleHttpPipelineLogger {
constructor(minimumLogLevel) {
this.minimumLogLevel = minimumLogLevel;
}
log(logLevel, message) {
const logMessage = `${new Date().toISOString()} ${HttpPipelineLogLevel[logLevel]}: ${message}`;
switch (logLevel) {
case HttpPipelineLogLevel.ERROR:
console.error(logMessage);
break;
case HttpPipelineLogLevel.WARNING:
console.warn(logMessage);
break;
case HttpPipelineLogLevel.INFO:
console.log(logMessage);
break;
}
}
}
When creating the FileServiceClient instance, pass the logger in the options:
const fileServiceClient = new FileServiceClient(
`https://${account}.file.core.windows.net`,
sharedKeyCredential,
{
logger: new ConsoleHttpPipelineLogger(HttpPipelineLogLevel.INFO)
}
);
More code samples
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.