@databricks/sql - npm Package Versions

1.10.0 (published by jackyhu-db)

  • Rename clientId parameter to userAgentEntry in connect call to standardize across sql drivers (databricks/databricks-sql-nodejs#281)

1.9.0 (published by jackyhu-db)

  • Support iterable interface for IOperation (databricks/databricks-sql-nodejs#252)
  • Allow any number type (number, bigint, Int64) for maxRows and queryTimeout (databricks/databricks-sql-nodejs#255)
  • Support streaming query results via Node.js streams (databricks/databricks-sql-nodejs#262)
  • Add custom auth headers into cloud fetch request (databricks/databricks-sql-nodejs#267)
  • Support OAuth on databricks.azure.cn (databricks/databricks-sql-nodejs#271)
  • Fix the type check in polyfills.ts (databricks/databricks-sql-nodejs#254)
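
The new iterable interface can be consumed with a plain for await loop. A minimal sketch: collectRows and the stand-in fakeOperation below are illustrative, not part of the driver's API — in real code the operation would come from executing a statement on a session.

```javascript
// Drain any async-iterable operation into an array.
// With @databricks/sql 1.9.0+, an IOperation can be iterated this way.
async function collectRows(operation) {
  const rows = [];
  for await (const row of operation) {
    rows.push(row);
  }
  return rows;
}

// Stand-in implementing the async-iterable contract, used here
// only to demonstrate the consumption pattern.
const fakeOperation = {
  async *[Symbol.asyncIterator]() {
    yield { id: 1, name: 'a' };
    yield { id: 2, name: 'b' };
  },
};

collectRows(fakeOperation).then((rows) => {
  console.log(rows.length); // logs 2
});
```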

1.8.4 (published by levko)

  • Fix: proxy agent unintentionally overwrites protocol in URL (databricks/databricks-sql-nodejs#241)
  • Improve Array.at/TypedArray.at polyfill (databricks/databricks-sql-nodejs#242 by @barelyhuman)
  • UC Volume ingestion: stream files instead of loading them into memory (databricks/databricks-sql-nodejs#247)
  • UC Volume ingestion: improve behavior on SQL REMOVE (databricks/databricks-sql-nodejs#249)
  • Expose session and query ID (databricks/databricks-sql-nodejs#250)
  • Make the lz4 module an optional dependency so package managers can skip it when it cannot be installed (databricks/databricks-sql-nodejs#246)

1.8.3 (published by levko)

  • Improved retry behavior (databricks/databricks-sql-nodejs#230)
  • Fix: in some cases library returned too many results (databricks/databricks-sql-nodejs#239)

1.8.2 (published by levko)

  • Improved results handling when running queries against older DBR versions (databricks/databricks-sql-nodejs#232)

1.8.1 (published by levko)

Security fixes:

An issue in all published versions of the npm package ip allows an attacker to execute arbitrary code and obtain sensitive information via the isPublic() function, potentially enabling Server-Side Request Forgery (SSRF) attacks. The core issue is the function's failure to accurately distinguish between public and private IP addresses.
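
The root cause class is easy to illustrate: an SSRF guard is only as strong as its private-range check. The sketch below is a simplified RFC 1918 check written for illustration; it is not the ip package's API, and it omits IPv6 and other special-use ranges.

```javascript
// Simplified check for private IPv4 ranges (RFC 1918 plus loopback).
// A guard that misclassifies these as public — as the vulnerable
// isPublic() did for some inputs — lets requests reach internal hosts.
function isPrivateIPv4(addr) {
  const parts = addr.split('.').map(Number);
  if (parts.length !== 4 || parts.some((p) => !Number.isInteger(p) || p < 0 || p > 255)) {
    return false; // not a well-formed dotted-quad address
  }
  const [a, b] = parts;
  return (
    a === 10 ||                          // 10.0.0.0/8
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) ||          // 192.168.0.0/16
    a === 127                            // 127.0.0.0/8 loopback
  );
}

console.log(isPrivateIPv4('10.1.2.3')); // true
console.log(isPrivateIPv4('8.8.8.8'));  // false
```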

1.8.0 (published by levko)

Highlights

  • Retry failed CloudFetch requests (databricks/databricks-sql-nodejs#211)
  • Fixed compatibility issues with Node@14 (databricks/databricks-sql-nodejs#219)
  • Support Databricks OAuth on Azure (databricks/databricks-sql-nodejs#223)
  • Support Databricks OAuth on GCP (databricks/databricks-sql-nodejs#224)
  • Support LZ4 compression for Arrow and CloudFetch results (databricks/databricks-sql-nodejs#216)
  • Fix OAuth M2M flow on Azure (databricks/databricks-sql-nodejs#228)

OAuth on Azure

Some Azure instances now support the Databricks native OAuth flow (in addition to AAD OAuth). For backward compatibility, the library will continue to use the AAD OAuth flow by default. To use Databricks native OAuth, pass useDatabricksOAuthInAzure: true to client.connect():

client.connect({
  // other options - host, port, etc.
  authType: 'databricks-oauth',
  useDatabricksOAuthInAzure: true,
  // other OAuth options if needed
});

We also fixed an issue with AAD OAuth where the wrong scopes were passed for the M2M flow.

OAuth on GCP

We enabled OAuth support on GCP instances. Since GCP uses Databricks native OAuth, the options are the same as for OAuth on AWS instances.
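
By analogy with the Azure snippet above, a GCP connection sketch might look like the following. The host value is a placeholder, and the option set is an assumption based on the statement that GCP uses the same Databricks native OAuth options as AWS.

```javascript
// Hypothetical connection options for a GCP workspace.
// The host is a placeholder; authType comes from the notes above.
const gcpConnectOptions = {
  host: 'my-workspace.gcp.databricks.com', // placeholder workspace host
  // other options - path, port, etc.
  authType: 'databricks-oauth',
  // other OAuth options if needed
};

// client.connect(gcpConnectOptions) would then open the connection.
console.log(gcpConnectOptions.authType); // logs 'databricks-oauth'
```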

CloudFetch improvements

The library now automatically attempts to retry failed CloudFetch requests. The retry strategy is currently quite basic, but it will be improved in the future.

We also implemented support for LZ4-compressed results (Arrow- and CloudFetch-based). It is enabled by default, and compression is used whenever the server supports it.

1.7.1 (published by levko)

  • Fix the "Premature close" error caused by the socket limit under intensive library use (databricks/databricks-sql-nodejs#217)

1.7.0 (published by levko)

  • Fixed the maxRows option of IOperation.fetchChunk(); it now returns chunks of the requested size (databricks/databricks-sql-nodejs#200)
  • Improved CloudFetch memory usage and overall performance (databricks/databricks-sql-nodejs#204, databricks/databricks-sql-nodejs#207, databricks/databricks-sql-nodejs#209)
  • Remove protocol version check when using query parameters (databricks/databricks-sql-nodejs#213)
  • Fix IOperation.hasMoreRows() to avoid fetching data beyond the end of the dataset; it also now works correctly before the first chunk is fetched (databricks/databricks-sql-nodejs#205)
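
The fixed pairing of hasMoreRows() and fetchChunk() supports the usual paging loop. A sketch: drainOperation and the in-memory stand-in are illustrative only, while fetchChunk({ maxRows }) and hasMoreRows() are the driver methods named in the notes above.

```javascript
// Page through an operation's results in fixed-size chunks.
// With the 1.7.0 fixes, fetchChunk honors maxRows and hasMoreRows
// is safe to call before the first chunk is fetched.
async function drainOperation(operation, maxRows) {
  const all = [];
  while (await operation.hasMoreRows()) {
    const chunk = await operation.fetchChunk({ maxRows });
    all.push(...chunk);
  }
  return all;
}

// In-memory stand-in mimicking the two methods, for illustration only.
function makeFakeOperation(rows) {
  let offset = 0;
  return {
    async hasMoreRows() {
      return offset < rows.length;
    },
    async fetchChunk({ maxRows }) {
      const chunk = rows.slice(offset, offset + maxRows);
      offset += chunk.length;
      return chunk;
    },
  };
}

drainOperation(makeFakeOperation([1, 2, 3, 4, 5]), 2).then((rows) => {
  console.log(rows); // logs [ 1, 2, 3, 4, 5 ]
});
```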

1.6.1 (published by levko)

  • Make default logger singleton (databricks/databricks-sql-nodejs#199)
  • Enable canUseMultipleCatalogs option when creating session (databricks/databricks-sql-nodejs#203)