DataStax Node.js Driver for Apache Cassandra®

A modern, feature-rich and highly tunable Node.js client library for Apache Cassandra and DSE using exclusively Cassandra's binary protocol and Cassandra Query Language.

Installation

$ npm install cassandra-driver


Features

Documentation

Getting Help

You can use the project mailing list or create a ticket on the Jira issue tracker.

Basic usage

const cassandra = require('cassandra-driver');

const client = new cassandra.Client({
  contactPoints: ['h1', 'h2'],
  localDataCenter: 'datacenter1',
  keyspace: 'ks1'
});

const query = 'SELECT name, email FROM users WHERE key = ?';

client.execute(query, [ 'someone' ])
  .then(result => console.log('User with email %s', result.rows[0].email));

The driver supports both promises and callbacks for its asynchronous methods, so you can choose the approach that suits your needs.

Note that in order to have concise code examples in this documentation, we will use the promise-based API of the driver along with the await keyword.
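
For instance, the same query from the basic usage example above could be run with the callback-based form of execute(). A minimal sketch, reusing the client and query defined earlier:

client.execute(query, [ 'someone' ], (err, result) => {
  if (err) {
    return console.error('Query failed', err);
  }
  console.log('User with email %s', result.rows[0].email);
});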

If you are using DataStax Astra you can configure your client by setting the secure bundle and the user credentials:

const client = new cassandra.Client({
  cloud: { secureConnectBundle: 'path/to/secure-connect-DATABASE_NAME.zip' },
  credentials: { username: 'user_name', password: 'p@ssword1' }
});

Prepare your queries

Using prepared statements provides multiple benefits.

Prepared statements are parsed and prepared on the Cassandra nodes and are ready for future execution. Also, when preparing, the driver retrieves information about the parameter types which allows an accurate mapping between a JavaScript type and a Cassandra type.

The driver will prepare the query once on each host and execute the statement with the bound parameters.

// Use query markers (?) and parameters
const query = 'UPDATE users SET birth = ? WHERE key=?'; 
const params = [ new Date(1942, 10, 1), 'jimi-hendrix' ];

// Set the prepare flag in the query options
await client.execute(query, params, { prepare: true });
console.log('Row updated on the cluster');

Row streaming and pipes

When using #eachRow() and #stream() methods, the driver parses each row as soon as it is received, yielding rows without buffering them.

// Reducing a large result
let minTemperature = Number.MAX_VALUE;

client.eachRow(
  'SELECT time, val FROM temperature WHERE station_id = ?',
  ['abc'],
  (n, row) => {
    // The callback will be invoked for each row as soon as it is received
    minTemperature = Math.min(row.val, minTemperature);
  },
  err => {
    // This function will be invoked when all rows were consumed or an error was encountered
  }
);

The #stream() method works in the same way, but instead of a callback it returns a Readable Streams2 object in objectMode that emits instances of Row.

It can be piped downstream and provides automatic pause/resume logic (it buffers when not read).

client.stream('SELECT time, val FROM temperature WHERE station_id = ?', [ 'abc' ])
  .on('readable', function () {
    // 'readable' is emitted as soon as a row is received and parsed
    let row;
    while (row = this.read()) {
      console.log('time %s and value %s', row.time, row.val);
    }
  })
  .on('end', function () {
    // Stream ended, there aren't any more rows
  })
  .on('error', function (err) {
    // Something went wrong: err is a response error from Cassandra
  });
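
The returned stream can also be piped into any writable stream in object mode, as mentioned above. A small sketch, assuming a hypothetical sink that writes each row as a line of JSON:

const { Writable } = require('stream');

// Hypothetical sink: writes each row as a line of JSON to standard output
const sink = new Writable({
  objectMode: true,
  write(row, _encoding, callback) {
    process.stdout.write(JSON.stringify({ time: row.time, val: row.val }) + '\n');
    callback();
  }
});

client.stream('SELECT time, val FROM temperature WHERE station_id = ?', [ 'abc' ])
  .pipe(sink)
  .on('finish', () => console.log('All rows written'));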

User defined types

User defined types (UDT) are represented as JavaScript objects.

For example, consider the following UDT and table:

CREATE TYPE address (
  street text,
  city text,
  state text,
  zip int,
  phones set<text>
);
CREATE TABLE users (
  name text PRIMARY KEY,
  email text,
  address frozen<address>
);

You can retrieve the user address details as a regular JavaScript object.

const query = 'SELECT name, address FROM users WHERE name = ?';
const result = await client.execute(query, [ key ], { prepare: true });
const row = result.first();
const address = row.address;
console.log('User lives in %s, %s - %s', address.street, address.city, address.state);
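
When the statement is prepared, a plain JavaScript object should also work as the bound value for a UDT column. A sketch, assuming the users table above and hypothetical sample values:

const insertQuery = 'INSERT INTO users (name, email, address) VALUES (?, ?, ?)';
const address = {
  street: '123 Example St',
  city: 'Santa Clara',
  state: 'CA',
  zip: 95050,
  phones: [ '650-555-0100' ]
};

await client.execute(insertQuery, [ 'jimi-hendrix', 'jimi@example.com', address ], { prepare: true });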

Read more information about using UDTs with the Node.js Driver.

Paging

All driver methods use a default fetchSize of 5000 rows, retrieving only the first page of results (up to a maximum of 5000 rows) to shield an application from accidentally retrieving large result sets in a single response.

The stream() method automatically fetches the following page once the current one has been read. You can also use the eachRow() method to retrieve the following pages by setting the autoPage flag. See the paging documentation for more information.
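
You can also page manually by passing the page state of a result back to the driver on the next execution. A minimal sketch, assuming query and params are already defined and the statement is prepared:

const options = { prepare: true, fetchSize: 1000 };

// First page
const result = await client.execute(query, params, options);
for (const row of result.rows) {
  // Process each row of the current page
}

// If there are more rows, fetch the next page using the page state
if (result.pageState) {
  const nextResult = await client.execute(query, params, { ...options, pageState: result.pageState });
}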

Batch multiple statements

You can execute multiple statements in a batch to update/insert several rows atomically even in different column families.

const queries = [
  {
    query: 'UPDATE user_profiles SET email=? WHERE key=?',
    params: [ emailAddress, 'hendrix' ]
  }, {
    query: 'INSERT INTO user_track (key, text, date) VALUES (?, ?, ?)',
    params: [ 'hendrix', 'Changed email', new Date() ]
  }
];

await client.batch(queries, { prepare: true });
console.log('Data updated on cluster');

Object Mapper

The driver provides a built-in object mapper that lets you interact with your data like you would interact with a set of documents.
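
The mapper has to be created from an existing client and configured with the mapping between models and tables. A minimal sketch, assuming a hypothetical Video model backed by a videos table:

const { Mapper } = require('cassandra-driver').mapping;

// Hypothetical model definition: a 'Video' model stored in the 'videos' table
const mapper = new Mapper(client, {
  models: { 'Video': { tables: ['videos'] } }
});

const videoMapper = mapper.forModel('Video');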

Retrieving objects from the database:

const videos = await videoMapper.find({ userId });
for (let video of videos) {
  console.log(video.name);
}

Updating an object from the database:

await videoMapper.update({ id, userId, name, addedDate, description });

You can read more information about getting started with the Mapper in our documentation.


Data types

There are only a few data types defined in the ECMAScript specification, which usually represents a problem when dealing with data types that come from other systems in JavaScript.

The driver supports all the CQL data types in Apache Cassandra (3.0 and below) even for types with no built-in JavaScript representation, like decimal, varint and bigint. Check the documentation on working with numerical values, uuids and collections.
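
For instance, the driver exposes its own types for values that have no native JavaScript representation. A small sketch, assuming a hypothetical products table:

const { types } = require('cassandra-driver');

const id = types.Uuid.random();                             // uuid
const stock = types.Long.fromString('9223372036854775807'); // bigint
const price = types.BigDecimal.fromString('19.99');         // decimal

// With prepared statements, the driver encodes these values using the column types
await client.execute(
  'INSERT INTO products (id, stock, price) VALUES (?, ?, ?)',
  [ id, stock, price ],
  { prepare: true }
);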

Logging

Instances of Client() are EventEmitters and emit log events:

client.on('log', (level, loggerName, message, furtherInfo) => {
  console.log(`${level} - ${loggerName}:  ${message}`);
});

The level being passed to the listener can be verbose, info, warning or error. Visit the logging documentation for more information.

Compatibility

The driver supports all versions of Node.js, Cassandra, and DSE that are not EOL at the time of release. Only LTS eligible branches (i.e. even numbered releases) are supported for Node.js. See the project documentation for more information about the Node.js release cycle.

The current version of the driver offers support consistent with this policy for the following:

  • Apache Cassandra versions 3.0 and above.
  • DataStax Enterprise versions 5.1 and 6.8.
  • Node.js versions 18.x and 20.x.

Note: DataStax products do not support big-endian systems.

Credits

This driver is based on the original work of Jorge Bay on node-cassandra-cql and adds a series of advanced features that are common across all other DataStax drivers for Apache Cassandra.

The development effort to provide an up-to-date, high-performance, fully featured Node.js Driver for Apache Cassandra will continue on this project, while node-cassandra-cql will be discontinued.

License

© DataStax, Inc.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
