@azure/storage-file-datalake - npm Package Compare versions

Comparing version 12.1.0-dev.20200810.1 to 12.1.0-dev.20200812.1


package.json
{
"name": "@azure/storage-file-datalake",
"version": "12.1.0-dev.20200812.1",
"description": "Microsoft Azure Storage SDK for JavaScript - DataLake",

@@ -5,0 +5,0 @@ "sdk-type": "client",

@@ -8,5 +8,6 @@ # Azure Storage File Data Lake client library for JavaScript

Use the client libraries in this package to:
- Create/List/Delete File Systems
- Create/Read/List/Update/Delete Paths, Directories and Files
[Source code](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/storage/storage-file-datalake) |

@@ -91,3 +92,3 @@ [Package (npm)](https://www.npmjs.com/package/@azure/storage-file-datalake) |

If `Parcel` is used, no further work is needed. If using Rollup, an additional step is needed to transform the bundled output into a format that IE11 supports.

@@ -131,2 +132,3 @@ Assuming `bundled-output.js` is the result from `Rollup`:
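The IE11 transform mentioned above is typically done with Babel. A minimal sketch of a `babel.config.json` targeting IE 11 (assuming `@babel/preset-env` and `@babel/cli` are installed; file names follow the `bundled-output.js` example, and exact options may differ):

```json
{
  "presets": [["@babel/preset-env", { "targets": { "ie": "11" } }]]
}
```

With this config in place, something like `npx babel bundled-output.js --out-file final-output.js` should emit output that IE11 can parse.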

In the past, cloud-based analytics had to compromise in areas of performance, management, and security. Data Lake Storage Gen2 addresses each of these aspects in the following ways:
- Performance is optimized because you do not need to copy or transform data as a prerequisite for analysis. The hierarchical namespace greatly improves the performance of directory management operations, which improves overall job performance.

@@ -143,6 +145,6 @@ - Management is easier because you can organize and manipulate files through directories and subdirectories.

| Azure DataLake Gen2      | Blob      |
| ------------------------ | --------- |
| Filesystem               | Container |
| Path (File or Directory) | Blob      |
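The renamed resources live behind different endpoints of the same storage account: Blob APIs use the `blob` endpoint while Data Lake APIs use the `dfs` endpoint shown in the samples below. A small sketch of the URL difference (the helpers `blobEndpoint`/`dfsEndpoint` are illustrative, not SDK APIs):

```javascript
// Build the two service URLs for one storage account.
// Blob service:      https://<account>.blob.core.windows.net
// Data Lake service: https://<account>.dfs.core.windows.net
function blobEndpoint(account) {
  return `https://${account}.blob.core.windows.net`;
}

function dfsEndpoint(account) {
  return `https://${account}.dfs.core.windows.net`;
}
```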

@@ -153,2 +155,12 @@ > Note: This client library only supports storage accounts with hierarchical namespace (HNS) enabled.

- [Import the package](#import-the-package)
- [Create the data lake service client](#create-the-data-lake-service-client)
- [Create a new file system](#create-a-new-file-system)
- [List the file systems](#list-the-file-systems)
- [Create and delete a directory](#create-and-delete-a-directory)
- [Create a file](#create-a-file)
- [List paths inside a file system](#list-paths-inside-a-file-system)
- [Download a file and convert it to a string (Node.js)](#download-a-file-and-convert-it-to-a-string-nodejs)
- [Download a file and convert it to a string (Browsers)](#download-a-file-and-convert-it-to-a-string-browsers)
### Import the package

@@ -165,3 +177,6 @@

```javascript
const {
  DataLakeServiceClient,
  StorageSharedKeyCredential
} = require("@azure/storage-file-datalake");
```

@@ -179,36 +194,36 @@

Setup: Reference - Authorize access to blobs (data lake) and queues with Azure Active Directory from a client application - https://docs.microsoft.com/azure/storage/common/storage-auth-aad-app

- Register a new AAD application and give it permissions to access Azure Storage on behalf of the signed-in user.
  - Register a new application in Azure Active Directory (in the Azure portal) - https://docs.microsoft.com/azure/active-directory/develop/quickstart-register-app
  - In the `API permissions` section, select `Add a permission` and choose `Microsoft APIs`.
  - Pick `Azure Storage` and select the checkbox next to `user_impersonation`, then click `Add permissions`. This allows the application to access Azure Storage on behalf of the signed-in user.
- Grant access to Azure Data Lake data with RBAC in the Azure portal.
  - RBAC roles for blobs (data lake) and queues - https://docs.microsoft.com/azure/storage/common/storage-auth-aad-rbac-portal
  - In the Azure portal, go to your storage account and assign the **Storage Blob Data Contributor** role to the registered AAD application from the `Access control (IAM)` tab (in the left-side navbar of your storage account).
- Environment setup for the sample
  - From the overview page of your AAD application, note down the `CLIENT ID` and `TENANT ID`. In the "Certificates & Secrets" tab, create a secret and note it down.
  - Make sure you have `AZURE_TENANT_ID`, `AZURE_CLIENT_ID`, and `AZURE_CLIENT_SECRET` set as environment variables to successfully execute the sample (you can leverage `process.env`).
```javascript
const { DefaultAzureCredential } = require("@azure/identity");
const { DataLakeServiceClient } = require("@azure/storage-file-datalake");

// Enter your storage account name
const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();

const datalakeServiceClient = new DataLakeServiceClient(
  `https://${account}.dfs.core.windows.net`,
  defaultAzureCredential
);
```

See the [Azure AD Auth sample](https://github.com/Azure/azure-sdk-for-js/blob/master/sdk/storage/storage-blob/samples/javascript/azureAdAuth.js) for a complete example using this method.

[Note - Above steps are only for Node.js]

@@ -218,19 +233,22 @@ #### with `StorageSharedKeyCredential`

Alternatively, you can instantiate a `DataLakeServiceClient` with a `StorageSharedKeyCredential` by passing the account name and account key as arguments. (The account name and account key can be obtained from the Azure portal.)

[ONLY AVAILABLE IN NODE.JS RUNTIME]
```javascript
const {
  DataLakeServiceClient,
  StorageSharedKeyCredential,
} = require("@azure/storage-file-datalake");

// Enter your storage account name and shared key
const account = "<account>";
const accountKey = "<accountkey>";

// Use StorageSharedKeyCredential with storage account and account key
// StorageSharedKeyCredential is only available in Node.js runtime, not in browsers
const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);
const datalakeServiceClient = new DataLakeServiceClient(
  `https://${account}.dfs.core.windows.net`,
  sharedKeyCredential
);
```

@@ -349,3 +367,5 @@ #### with SAS Token

let i = 1;
for await (const response of datalakeServiceClient
  .listFileSystems()
  .byPage({ maxPageSize: 20 })) {
if (response.fileSystemItems) {

@@ -439,3 +459,3 @@ for (const fileSystem of response.fileSystemItems) {

const fileSystemClient = datalakeServiceClient.getFileSystemClient(fileSystemName);
let i = 1;
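The `byPage` pattern in this hunk iterates file systems one page at a time via an async iterator. As an illustrative sketch of that pattern using a hypothetical stub in place of the real service client (`fakeClient` and `listAll` are not part of the SDK):

```javascript
// Stand-in mimicking the shape of DataLakeServiceClient.listFileSystems().byPage().
const fakeClient = {
  listFileSystems() {
    const pages = [
      { fileSystemItems: [{ name: "fs1" }, { name: "fs2" }] },
      { fileSystemItems: [{ name: "fs3" }] }
    ];
    return {
      // Calling an async generator returns an async iterator usable with for-await.
      byPage: async function* () {
        for (const page of pages) {
          yield page;
        }
      }
    };
  }
};

// Collect every file system name, page by page.
async function listAll(client) {
  const names = [];
  for await (const response of client.listFileSystems().byPage({ maxPageSize: 20 })) {
    if (response.fileSystemItems) {
      for (const fileSystem of response.fileSystemItems) {
        names.push(fileSystem.name);
      }
    }
  }
  return names;
}
```

The same loop body works unchanged against a real `DataLakeServiceClient`, since only the client object differs.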

@@ -466,3 +486,3 @@ let paths = fileSystemClient.listPaths();

const fileSystemName = "<file system name>";
const fileName = "<file name>";

@@ -469,0 +489,0 @@ async function main() {
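The download samples referenced in the table of contents read a file and convert the downloaded stream to a string. The stream-collection part is plain Node.js; a sketch of such a helper (the commented usage assumes a `DataLakeFileClient` instance named `fileClient`, which is not defined here):

```javascript
// Collect a Node.js readable stream into a single Buffer.
async function streamToBuffer(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => resolve(Buffer.concat(chunks)));
    readableStream.on("error", reject);
  });
}

// Hypothetical usage with a DataLakeFileClient:
// const downloadResponse = await fileClient.read();
// const content = (await streamToBuffer(downloadResponse.readableStreamBody)).toString();
```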
