google-cloud

Cloud APIs Client Library for Node.js

  • Version: 0.56.0
  • Weekly downloads: 59 (down 48.25%)
  • Maintainers: 12

Cloud Node.js Client

Node.js idiomatic client for Google Cloud Platform services.


This client supports the following Google Cloud Platform services at a General Availability (GA) quality level:

  • Cloud Datastore
  • Cloud Storage
  • Cloud Translation API
  • Stackdriver Logging

This client supports the following Google Cloud Platform services at a Beta quality level:

  • Cloud Natural Language
  • Cloud Vision
  • BigQuery

This client supports the following Google Cloud Platform services at an Alpha quality level:

  • Cloud Bigtable
  • Cloud DNS
  • Cloud Pub/Sub
  • Cloud Resource Manager
  • Cloud Spanner
  • Cloud Speech
  • Compute Engine
  • Prediction API
  • Stackdriver Debugger
  • Stackdriver Error Reporting
  • Stackdriver Monitoring
  • Stackdriver Trace

If you need support for other Google APIs, check out the Google Node.js API Client library.

Quick Start

We recommend installing the individual packages that you need, which are provided under the @google-cloud namespace. For example:

$ npm install --save @google-cloud/datastore
$ npm install --save @google-cloud/storage
var config = {
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
};

var datastore = require('@google-cloud/datastore')(config);
var storage = require('@google-cloud/storage')(config);
The google-cloud meta-package

We also provide a meta-package, google-cloud, which bundles all of the individual APIs. However, to keep file size and memory use low, using this package is not recommended.

If you want the kitchen sink anyway, install it with:

$ npm install --save google-cloud
var gcloud = require('google-cloud')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var datastore = gcloud.datastore();
var storage = gcloud.storage();

Example Applications

  • nodejs-getting-started - A sample and tutorial that demonstrates how to build a complete web application using Cloud Datastore, Cloud Storage, and Cloud Pub/Sub and deploy it to Google App Engine or Google Compute Engine.
  • gcloud-node-todos - A TodoMVC backend using google-cloud-node and Datastore.
  • gitnpm - Easily look up an npm package's GitHub repo using google-cloud-node and Google App Engine.
  • gcloud-kvstore - Use Datastore as a simple key-value store.
  • hya-wave - Cloud-based web sample editor. Part of the hya-io family of products.
  • gstore-node - Google Datastore Entities Modeling library.
  • gstore-api - REST API builder for Google Datastore Entities.

Authentication

With google-cloud it's easy to authenticate and start using Google's APIs. You can set your credentials globally or on a per-API basis. See each individual API section below for how to authenticate per API; this is useful if you want to use different accounts for different Cloud services.

On Google Cloud Platform

If you are running this client on Google Cloud Platform, we handle authentication for you with no configuration. You just need to make sure that when you set up the GCE instance, you add the correct scopes for the APIs you want to access.

var gcloud = require('google-cloud');
// ...you're good to go! See the next section to get started using the APIs.
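
For example, when creating the instance with the gcloud CLI, you can pass one scope per service you plan to use. A minimal sketch (the instance name and scope URLs below are only illustrative; substitute the scopes for the APIs you actually need):

$ gcloud compute instances create my-instance \
    --scopes https://www.googleapis.com/auth/devstorage.read_write,https://www.googleapis.com/auth/datastore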

Elsewhere

If you are not running this client on Google Cloud Platform, you need a Google Developers service account. To create a service account:

  1. Visit the Google Developers Console.
  2. Create a new project or click on an existing project.
  3. Navigate to APIs & auth > APIs section and turn on the following APIs (you may need to enable billing in order to use these services):
  • BigQuery API
  • Cloud Bigtable API
  • Cloud Bigtable Admin API
  • Cloud Bigtable Table Admin API
  • Cloud Spanner API
  • Google Cloud Datastore API
  • Google Cloud DNS API
  • Google Cloud Natural Language API
  • Google Cloud Pub/Sub API
  • Google Cloud Resource Manager API
  • Google Cloud Speech API
  • Google Cloud Storage
  • Google Cloud Storage JSON API
  • Google Cloud Translation API
  • Google Cloud Vision API
  • Google Compute Engine API
  • Prediction API
  • Stackdriver Logging API
  4. Navigate to APIs & auth > Credentials and then:
  • If you want to use a new service account key, click on Create credentials and select Service account key. After the account key is created, you will be prompted to download the JSON key file that the library uses to authenticate your requests.
  • If you want to generate a new service account key for an existing service account, click on Generate new JSON key and download the JSON key file.
// Authenticating on a global basis.
var projectId = process.env.GCLOUD_PROJECT; // E.g. 'grape-spaceship-123'

var gcloud = require('google-cloud')({
  projectId: projectId,

  // The path to your key file:
  keyFilename: '/path/to/keyfile.json'

  // Or, instead of `keyFilename`, the contents of the key file:
  // credentials: require('./path/to/keyfile.json')

  // Or, for any APIs that accept an API key:
  // key: '...'
});

// ...you're good to go! See the next section to get started using the APIs.

You can also set auth on a per-API-instance basis. The examples below show you how.

Cloud Datastore (GA)

Follow the activation instructions to use the Cloud Datastore API with your project.

Using the Cloud Datastore API module
$ npm install --save @google-cloud/datastore
var datastore = require('@google-cloud/datastore');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var datastoreClient = datastore({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var key = datastoreClient.key(['Product', 'Computer']);

datastoreClient.get(key, function(err, entity) {
  console.log(err || entity);
});

// Save data to Datastore.
var blogPostData = {
  title: 'How to make the perfect homemade pasta',
  author: 'Andrew Chilton',
  isDraft: true
};

var blogPostKey = datastoreClient.key('BlogPost');

datastoreClient.save({
  key: blogPostKey,
  data: blogPostData
}, function(err) {
  // `blogPostKey` has been updated with an ID so you can do more operations
  // with it, such as an update.
  blogPostData.isDraft = false;

  datastoreClient.save({
    key: blogPostKey,
    data: blogPostData
  }, function(err) {
    if (!err) {
      // The blog post is now published!
    }
  });
});

Cloud Storage (GA)

Using the Cloud Storage API module
$ npm install --save @google-cloud/storage
var storage = require('@google-cloud/storage');
Preview
var fs = require('fs');

// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var gcs = storage({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Create a new bucket.
gcs.createBucket('my-new-bucket', function(err, bucket) {
  if (!err) {
    // "my-new-bucket" was successfully created.
  }
});

// Reference an existing bucket.
var bucket = gcs.bucket('my-existing-bucket');

// Upload a local file to a new file to be created in your bucket.
bucket.upload('/photos/zoo/zebra.jpg', function(err, file) {
  if (!err) {
    // "zebra.jpg" is now in your bucket.
  }
});

// Download a file from your bucket.
bucket.file('giraffe.jpg').download({
  destination: '/photos/zoo/giraffe.jpg'
}, function(err) {});

// Streams are also supported for reading and writing files.
var remoteReadStream = bucket.file('giraffe.jpg').createReadStream();
var localWriteStream = fs.createWriteStream('/photos/zoo/giraffe.jpg');
remoteReadStream.pipe(localWriteStream);

var localReadStream = fs.createReadStream('/photos/zoo/zebra.jpg');
var remoteWriteStream = bucket.file('zebra.jpg').createWriteStream();
localReadStream.pipe(remoteWriteStream);

Cloud Translation API (GA)

Using the Google Translate API module
$ npm install --save @google-cloud/translate
var translate = require('@google-cloud/translate');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var translateClient = translate({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Translate a string of text.
translateClient.translate('Hello', 'es', function(err, translation) {
  if (!err) {
    // translation = 'Hola'
  }
});

// Detect a language from a string of text.
translateClient.detect('Hello', function(err, results) {
  if (!err) {
    // results = {
    //   language: 'en',
    //   confidence: 1,
    //   input: 'Hello'
    // }
  }
});

// Get a list of supported languages.
translateClient.getLanguages(function(err, languages) {
  if (!err) {
    // languages = [
    //   'af',
    //   'ar',
    //   'az',
    //   ...
    // ]
  }
});

Google Stackdriver Logging (GA)

Using the Google Stackdriver Logging API module
$ npm install --save @google-cloud/logging
var logging = require('@google-cloud/logging');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var loggingClient = logging({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Create a sink using a Bucket as a destination.
var storage = require('@google-cloud/storage');
var gcs = storage({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

loggingClient.createSink('my-new-sink', {
  destination: gcs.bucket('my-sink')
}, function(err, sink) {});

// Write a critical entry to a log.
var syslog = loggingClient.log('syslog');

var metadata = {
  resource: {
    type: 'gce_instance',
    labels: {
      zone: 'global',
      instance_id: '3'
    }
  }
};

var entry = syslog.entry(metadata, {
  delegate: process.env.user
});

syslog.critical(entry, function(err) {});

// Get all entries in your project.
loggingClient.getEntries(function(err, entries) {
  if (!err) {
    // `entries` contains all of the entries from the logs in your project.
  }
});

Cloud Natural Language (Beta)

Using the Natural Language API module
$ npm install --save @google-cloud/language
var language = require('@google-cloud/language');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var languageClient = language({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Get the entities from a sentence.
languageClient.detectEntities('Stephen of Michigan!', function(err, entities) {
  // entities = [
  //   {
  //     name: 'Stephen',
  //     type: 'PERSON',
  //     metadata: {
  //       mid: '/m/05d8y4q'
  //     },
  //     salience: 0.7309288382530212,
  //     mentions: [
  //       {
  //         text: {
  //           content: 'Stephen',
  //           beginOffset: -1
  //         },
  //         type: 'PROPER'
  //       }
  //     ]
  //   },
  //   // ...
  // ]
});

// Create a document if you plan to run multiple detections.
var document = languageClient.document('Contributions welcome!');

// Analyze the sentiment of the document.
document.detectSentiment(function(err, sentiment) {
  // sentiment = {
  //   magnitude: 0.5,
  //   score: 0.5
  // }
});

// Parse the syntax of the document.
document.annotate(function(err, annotation) {
  // annotation = {
  //   language: 'en',
  //   sentiment: {
  //     magnitude: 0.30000001192092896,
  //     score: 0.30000001192092896
  //   },
  //   entities: [
  //     {
  //       name: 'Contributions',
  //       type: 'OTHER',
  //       metadata: {},
  //       salience: 1,
  //       mentions: [
  //         {
  //           text: {
  //             content: 'Contributions',
  //             beginOffset: -1
  //           },
  //           type: 'COMMON'
  //         }
  //       ]
  //     }
  //   ],
  //   sentences: [
  //     {
  //       text: {
  //         content: 'Contributions welcome!',
  //         beginOffset: -1
  //       },
  //       sentiment: {
  //         magnitude: 0.30000001192092896,
  //         score: 0.30000001192092896
  //       }
  //     }
  //   ],
  //   tokens: [
  //     {
  //       text: {
  //         content: 'Contributions',
  //         beginOffset: -1
  //       },
  //       partOfSpeech: {
  //         tag: 'NOUN',
  //         aspect: 'ASPECT_UNKNOWN',
  //         case: 'CASE_UNKNOWN',
  //         form: 'FORM_UNKNOWN',
  //         gender: 'GENDER_UNKNOWN',
  //         mood: 'MOOD_UNKNOWN',
  //         number: 'PLURAL',
  //         person: 'PERSON_UNKNOWN',
  //         proper: 'PROPER_UNKNOWN',
  //         reciprocity: 'RECIPROCITY_UNKNOWN',
  //         tense: 'TENSE_UNKNOWN',
  //         voice: 'VOICE_UNKNOWN'
  //       },
  //       dependencyEdge: {
  //         headTokenIndex: 1,
  //         label: 'NSUBJ'
  //       },
  //       lemma: 'contribution'
  //     },
  //     // ...
  //   ]
  // }
});

Cloud Vision (Beta)

Using the Cloud Vision API module
$ npm install --save @google-cloud/vision
var vision = require('@google-cloud/vision');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var visionClient = vision({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Read the text from an image.
visionClient.detectText('./image.jpg', function(err, text) {
  // text = [
  //   'This was text found in the image',
  //   'This was more text found in the image'
  // ]
});

// Detect faces and the locations of their features in an image.
visionClient.detectFaces('./image.jpg', function(err, faces) {
  // faces = [
  //   {
  //     angles: {pan,tilt,roll},
  //     bounds: {
  //       head: [{x,y},{x,y},{x,y},{x,y}],
  //       face: [{x,y},{x,y},{x,y},{x,y}]
  //     },
  //     features: {
  //       confidence: 34.489909,
  //       chin: {
  //         center: {x,y,z},
  //         left: {x,y,z},
  //         right: {x,y,z}
  //       },
  //       ears: {
  //         left: {x,y,z},
  //         right: {x,y,z}
  //       },
  //       eyebrows: {
  //         left: {
  //           left: {x,y,z},
  //           right: {x,y,z},
  //           top: {x,y,z}
  //         },
  //         right: {
  //           left: {x,y,z},
  //           right: {x,y,z},
  //           top: {x,y,z}
  //         }
  //       },
  //       eyes: {
  //         left: {
  //           bottom: {x,y,z},
  //           center: {x,y,z},
  //           left: {x,y,z},
  //           pupil: {x,y,z},
  //           right: {x,y,z},
  //           top: {x,y,z}
  //         },
  //         right: {
  //           bottom: {x,y,z},
  //           center: {x,y,z},
  //           left: {x,y,z},
  //           pupil: {x,y,z},
  //           right: {x,y,z},
  //           top: {x,y,z}
  //         }
  //       },
  //       forehead: {x,y,z},
  //       lips: {
  //         bottom: {x,y,z},
  //         top: {x,y,z}
  //       },
  //       mouth: {
  //         center: {x,y,z},
  //         left: {x,y,z},
  //         right: {x,y,z}
  //       },
  //       nose: {
  //         bottom: {
  //           center: {x,y,z},
  //           left: {x,y,z},
  //           right: {x,y,z}
  //         },
  //         tip: {x,y,z},
  //         top: {x,y,z}
  //       }
  //     },
  //     confidence: 56.748849,
  //     blurry: false,
  //     dark: false,
  //     happy: false,
  //     hat: false,
  //     mad: false,
  //     sad: false,
  //     surprised: false
  //   }
  // ]
});

Google BigQuery (Beta)

Using the BigQuery API module
$ npm install --save @google-cloud/bigquery
var bigquery = require('@google-cloud/bigquery');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var bigqueryClient = bigquery({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Access an existing dataset and table.
var schoolsDataset = bigqueryClient.dataset('schools');
var schoolsTable = schoolsDataset.table('schoolsData');

// Import data into a table.
schoolsTable.import('/local/file.json', function(err, job) {});

// Get results from a query job.
var job = bigqueryClient.job('job-id');

// Use a callback.
job.getQueryResults(function(err, rows) {});

// Or get the same results as a readable stream.
job.getQueryResults().on('data', function(row) {});

Cloud Bigtable (Alpha)

You may need to create an instance to use the Cloud Bigtable API with your project.

Using the Cloud Bigtable API module
$ npm install --save @google-cloud/bigtable
var bigtable = require('@google-cloud/bigtable');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var bigtableClient = bigtable({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var instance = bigtableClient.instance('my-instance');
var table = instance.table('prezzy');

table.getRows(function(err, rows) {});

// Update a row in your table.
var row = table.row('alincoln');

row.save('follows:gwashington', 1, function(err) {
  if (err) {
    // Error handling omitted.
  }

  row.get('follows:gwashington', function(err, data) {
    if (err) {
      // Error handling omitted.
    }

    // data = {
    //   follows: {
    //     gwashington: [
    //       {
    //         value: 1
    //       }
    //     ]
    //   }
    // }
  });
});

Cloud DNS (Alpha)

Using the Cloud DNS API module
$ npm install --save @google-cloud/dns
var dns = require('@google-cloud/dns');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var dnsClient = dns({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Create a managed zone.
dnsClient.createZone('my-new-zone', {
  dnsName: 'my-domain.com.'
}, function(err, zone) {});

// Reference an existing zone.
var zone = dnsClient.zone('my-existing-zone');

// Create an NS record.
var nsRecord = zone.record('ns', {
  ttl: 86400,
  name: 'my-domain.com.',
  data: 'ns-cloud1.googledomains.com.'
});

zone.addRecords([nsRecord], function(err, change) {});

// Create a zonefile from the records in your zone.
zone.export('/zonefile.zone', function(err) {});

Cloud Pub/Sub (Alpha)

Using the Cloud Pub/Sub API module
$ npm install --save @google-cloud/pubsub
var pubsub = require('@google-cloud/pubsub');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you
// auth on a global basis (see Authentication section above).

var pubsubClient = pubsub({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Reference a topic that has been previously created.
var topic = pubsubClient.topic('my-topic');

// Publish a message to the topic.
topic.publish('New message!', function(err) {});

// Subscribe to the topic.
topic.subscribe('subscription-name', function(err, subscription) {
  // Register listeners to start pulling for messages.
  function onError(err) {}
  function onMessage(message) {}
  subscription.on('error', onError);
  subscription.on('message', onMessage);

  // Remove listeners to stop pulling for messages.
  subscription.removeListener('message', onMessage);
  subscription.removeListener('error', onError);
});

Cloud Resource Manager (Alpha)

Using the Cloud Resource Manager API module
$ npm install --save @google-cloud/resource
var resource = require('@google-cloud/resource');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var resourceClient = resource({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Get all of the projects you maintain.
resourceClient.getProjects(function(err, projects) {
  if (!err) {
    // `projects` contains all of your projects.
  }
});

// Get the metadata from your project. (defaults to `grape-spaceship-123`)
var project = resourceClient.project();

project.getMetadata(function(err, metadata) {
  // `metadata` describes your project.
});

Cloud Spanner (Alpha)

Using the Cloud Spanner API module
$ npm install --save @google-cloud/spanner
var spanner = require('@google-cloud/spanner');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var spannerClient = spanner({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var instance = spannerClient.instance('my-instance');
var database = instance.database('my-database');

// Create a table.
var schema =
  'CREATE TABLE Singers (' +
  '  SingerId INT64 NOT NULL,' +
  '  FirstName STRING(1024),' +
  '  LastName STRING(1024),' +
  '  SingerInfo BYTES(MAX),' +
  ') PRIMARY KEY(SingerId)';

database.createTable(schema, function(err, table, operation) {
  if (err) {
    // Error handling omitted.
  }

  operation
    .on('error', function(err) {})
    .on('complete', function() {
      // Table created successfully.
    });
});

// Insert data into the table.
var table = database.table('Singers');

table.insert({
  SingerId: 10,
  FirstName: 'Eddie',
  LastName: 'Wilson'
}, function(err) {
  if (!err) {
    // Row inserted successfully.
  }
});

// Run a query as a readable object stream.
database.runStream('SELECT * FROM Singers')
  .on('error', function(err) {})
  .on('data', function(row) {
    // row.toJSON() = {
    //   SingerId: 10,
    //   FirstName: 'Eddie',
    //   LastName: 'Wilson'
    // }
  })
  .on('end', function() {
    // All results retrieved.
  });

Cloud Speech (Alpha)

Using the Cloud Speech API module
$ npm install --save @google-cloud/speech
var speech = require('@google-cloud/speech');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var speechClient = speech({
  projectId: 'my-project',
  keyFilename: '/path/to/keyfile.json'
});

// Detect the speech in an audio file.
speechClient.recognize('./audio.raw', {
  encoding: 'LINEAR16',
  sampleRateHertz: 16000
}, function(err, transcript) {
  // transcript = 'how old is the Brooklyn Bridge'
});

// Detect the speech in an audio file stream.
var fs = require('fs');

fs.createReadStream('./audio.raw')
  .on('error', console.error)
  .pipe(speechClient.createRecognizeStream({
    config: {
      encoding: 'LINEAR16',
      sampleRateHertz: 16000
    },
    singleUtterance: false,
    interimResults: false
  }))
  .on('error', console.error)
  .on('data', function(data) {
    // data.results = "how old is the Brooklyn Bridge"
  });

Google Compute Engine (Alpha)

Using the Compute Engine API module
$ npm install --save @google-cloud/compute
var compute = require('@google-cloud/compute');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var gce = compute({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Create a new VM using the latest OS image of your choice.
var zone = gce.zone('us-central1-a');
var name = 'ubuntu-http';

zone.createVM(name, { os: 'ubuntu' }, function(err, vm, operation) {
  // `operation` lets you check the status of long-running tasks.

  operation
    .on('error', function(err) {})
    .on('running', function(metadata) {})
    .on('complete', function(metadata) {
      // Virtual machine created!
    });
});

Google Prediction API (Alpha)

Using the Prediction API module
$ npm install --save @google-cloud/prediction
var prediction = require('@google-cloud/prediction');
Preview
// Authenticating on a per-API-basis. You don't need to do this if you auth on a
// global basis (see Authentication section above).

var predictionClient = prediction({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Get all of the trained models in your project.
predictionClient.getModels(function(err, models) {
  if (!err) {
    // `models` is an array of Model objects.
  }
});

// Reference an existing trained model.
var model = predictionClient.model('my-existing-model');

// Train a model.
model.train('english', 'Hello from your friends at Google!', function(err) {});

// Query a model.
model.query('Hello', function(err, results) {
  if (!err) {
    // results.winner == 'english'
    // results.scores == [
    //   {
    //     label: 'english',
    //     score: 1
    //   },
    //   {
    //     label: 'spanish',
    //     score: 0
    //   }
    // ]
  }
});

Google Stackdriver Debugger (Alpha)

The source code for the Node.js Cloud Debugger Agent lives in a separate repo.

Using the Stackdriver Debug Agent module
$ npm install --save @google-cloud/debug-agent
require('@google-cloud/debug-agent').start({ allowExpressions: true });

For more details on API usage, please see the Stackdriver Debug Agent Github Repository.

Google Stackdriver Error Reporting (Alpha)

Using the Stackdriver Error Reporting module
$ npm install --save @google-cloud/error-reporting

The module provides automatic uncaught exception handling, manual error reporting, and integration with common frameworks like express and hapi.

var errors = require('@google-cloud/error-reporting')();
Preview
errors.report(new Error('Something broke!'));
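
The module also ships handlers for the frameworks mentioned above. A minimal sketch of the express integration, reusing the errors instance created above and assuming this version exposes the handler as errors.express (the route is only illustrative):

var express = require('express');
var app = express();

app.get('/error', function(req, res, next) {
  res.send('Something broke!');
  // Forward the error so the reporting middleware can pick it up.
  next(new Error('Custom error message'));
});

// Register the error-reporting handler after all other routes and middleware.
app.use(errors.express);

app.listen(3000);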

For more details on API usage, please see the documentation.

Google Stackdriver Monitoring (Alpha)

:warning: This is an auto-generated API

It does not follow the conventions you're familiar with from other parts of our library. A handwritten layer is not yet available.

The example below shows you how to instantiate the generated client. For further documentation, please browse the Monitoring .proto files on GitHub.

Using the Google Stackdriver Monitoring API module
$ npm install --save @google-cloud/monitoring
var monitoring = require('@google-cloud/monitoring');
Preview
var monitoringClient = monitoring.v3({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

Google Stackdriver Trace (Alpha)

The source code for the Node.js Cloud Trace Agent lives in a separate repo.

Using the Stackdriver Trace Agent module
$ npm install --save @google-cloud/trace-agent
var trace = require('@google-cloud/trace-agent').start();

For more details on API usage, please see the Stackdriver Trace Agent Github Repository.

Versioning

This library follows Semantic Versioning.

Please note it is currently under active development. Any release versioned 0.x.y is subject to backwards-incompatible changes at any time.
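
Because a 0.x.y release may introduce breaking changes even in a minor or patch bump, you may want to pin an exact version in your package.json, for example:

$ npm install --save --save-exact google-cloud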

GA: Libraries defined at the GA (general availability) quality level are stable. The code surface will not change in backwards-incompatible ways unless absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against GA libraries are addressed with the highest priority.

Please note that the auto-generated portions of the GA libraries (the ones in modules such as v1 or v2) are considered to be of Beta quality, even if the libraries that wrap them are GA.

Beta: Libraries defined at the Beta quality level are expected to be mostly stable, while we work towards their release candidate. We will address issues and requests with a higher priority.

Alpha: Libraries defined at the Alpha quality level are still a work-in-progress and are more likely to get backwards-incompatible updates.

Contributing

Contributions to this library are always welcome and highly encouraged.

See CONTRIBUTING for more information on how to get started.

License

Apache 2.0 - See COPYING for more information.

Package last updated on 05 Jul 2017
