@segment/analytics-node
> ### Warning: Until 1.x release, use this library at your own risk! While the API is very similar, the documentation for the legacy SDK (`analytics-node`) is here: https://segment.com/docs/connections/sources/catalog/libraries/server/node/
@segment/analytics-node is a Node.js client for Segment, a customer data platform that helps you collect, clean, and control your customer data. This package allows you to send data from your Node.js applications to Segment, which can then route it to various analytics and marketing tools.
Track
The `track` method allows you to record any actions your users perform. It is useful for tracking events like purchases, sign-ups, or any other user activity.
```js
const { Analytics } = require('@segment/analytics-node')

const analytics = new Analytics({ writeKey: 'YOUR_WRITE_KEY' })

analytics.track({
  userId: 'user123',
  event: 'Item Purchased',
  properties: {
    item: 'T-shirt',
    price: 19.99
  }
})
```
Identify
The `identify` method lets you tie a user to their actions and record traits about them. This is useful for associating user data like name, email, and other attributes.
```js
const { Analytics } = require('@segment/analytics-node')

const analytics = new Analytics({ writeKey: 'YOUR_WRITE_KEY' })

analytics.identify({
  userId: 'user123',
  traits: {
    name: 'John Doe',
    email: 'john.doe@example.com'
  }
})
```
Group
The `group` method allows you to associate an individual user with a group, such as a company or organization. This is useful for B2B applications where you need to track users within the context of their organization.
```js
const { Analytics } = require('@segment/analytics-node')

const analytics = new Analytics({ writeKey: 'YOUR_WRITE_KEY' })

analytics.group({
  userId: 'user123',
  groupId: 'group123',
  traits: {
    name: 'Company Inc.',
    industry: 'Technology'
  }
})
```
Page
The `page` method is used to record page views on your website. This is useful for tracking which pages your users are visiting.
```js
const { Analytics } = require('@segment/analytics-node')

const analytics = new Analytics({ writeKey: 'YOUR_WRITE_KEY' })

analytics.page({
  userId: 'user123',
  category: 'Docs',
  name: 'Node.js SDK',
  properties: {
    url: 'https://example.com/docs/nodejs-sdk'
  }
})
```
Alias
The `alias` method is used to merge two user identities, effectively linking an anonymous user with an identified user. This is useful for scenarios where a user initially interacts anonymously and later signs up or logs in.
```js
const { Analytics } = require('@segment/analytics-node')

const analytics = new Analytics({ writeKey: 'YOUR_WRITE_KEY' })

analytics.alias({
  previousId: 'temp_user123',
  userId: 'user123'
})
```
Mixpanel is an advanced analytics service that helps improve web and mobile applications by tracking how users interact and engage with them. It offers similar functionalities to @segment/analytics-node, such as tracking events, identifying users, and analyzing user behavior. However, Mixpanel is a standalone analytics platform, whereas Segment acts as a data hub that can route data to multiple analytics and marketing tools.
Amplitude is a product analytics platform that provides insights into user behavior and helps drive product strategy. Like @segment/analytics-node, it allows you to track events and user actions. Amplitude focuses more on in-depth product analytics and user behavior insights, while Segment provides a broader data routing and integration service.
Keen.io is a data analytics platform that allows you to collect, analyze, and visualize event data. It offers similar event tracking and user identification features as @segment/analytics-node. Keen.io is more focused on custom analytics and data visualization, whereas Segment provides a more comprehensive data integration and routing solution.
```sh
# npm
npm install @segment/analytics-node

# yarn
yarn add @segment/analytics-node

# pnpm
pnpm install @segment/analytics-node
```
```js
import { Analytics } from '@segment/analytics-node'
import express from 'express'

const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>' })

const app = express()
app.use(express.json())

app.post('/login', (req, res) => {
  analytics.identify({
    userId: req.body.userId,
    previousId: req.body.previousId
  })
  res.sendStatus(200)
})

app.post('/cart', (req, res) => {
  analytics.track({
    userId: req.body.userId,
    event: 'Add to cart',
    properties: { productId: '123456' }
  })
  res.sendStatus(200)
})
```
For Business plans with access to Regional Segment, you can use the `host` configuration parameter to send data to the desired region:

- Oregon (default): `api.segment.io/v1`
- Dublin: `events.eu1.segmentapis.com`

An example of setting the host to the EU endpoint using the Node library would be:
```js
const analytics = new Analytics({
  writeKey: 'YOUR_WRITE_KEY',
  host: 'https://events.eu1.segmentapis.com'
})
```
See the complete list of settings in the `AnalyticsSettings` interface.
```js
const analytics = new Analytics({
  writeKey: '<MY_WRITE_KEY>',
  plugins: [plugin1, plugin2],
  host: 'https://api.segment.io',
  path: '/v1/batch',
  maxRetries: 3,
  maxEventsInBatch: 15,
  flushInterval: 10000,
  // ... and more!
})
```
Our libraries are built to support high performance environments, so it is safe to use the Node library on a web server that's serving thousands of requests per second.

Calling a method does not result in an immediate HTTP request; instead, each message is queued in memory and flushed in batches in the background, which allows for much faster operation.
By default, our library will flush:

- every 15 messages (controlled by `settings.maxEventsInBatch`)
- if 10 seconds have passed since the last flush (controlled by `settings.flushInterval`)

There is a maximum of 500KB per batch request and 32KB per call.
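The flush-on-batch-size-or-interval behavior described above can be sketched as a tiny in-memory batcher. This is an illustrative model only; `MiniBatcher` and its `onFlush` callback are invented for this sketch and are not part of the SDK's API.

```javascript
// Illustrative sketch of the batching strategy described above (not the SDK's
// actual implementation): events accumulate in memory and are flushed either
// when the batch is full or when a timer fires.
class MiniBatcher {
  constructor({ maxEventsInBatch = 15, flushInterval = 10000, onFlush }) {
    this.maxEventsInBatch = maxEventsInBatch
    this.flushInterval = flushInterval
    this.onFlush = onFlush // a real client would send the HTTP batch request here
    this.queue = []
    this.timer = null
  }

  enqueue(event) {
    this.queue.push(event)
    if (this.queue.length >= this.maxEventsInBatch) {
      this.flush() // batch is full: flush immediately
    } else if (!this.timer) {
      // otherwise schedule a flush after flushInterval ms
      this.timer = setTimeout(() => this.flush(), this.flushInterval)
    }
  }

  flush() {
    if (this.timer) {
      clearTimeout(this.timer)
      this.timer = null
    }
    if (this.queue.length === 0) return
    this.onFlush(this.queue.splice(0)) // hand off the whole batch, emptying the queue
  }
}

// Usage: with maxEventsInBatch = 2, the third event waits in the queue.
const flushed = []
const batcher = new MiniBatcher({ maxEventsInBatch: 2, onFlush: (batch) => flushed.push(batch) })
batcher.enqueue({ event: 'A' })
batcher.enqueue({ event: 'B' }) // batch full: flushed synchronously here
batcher.enqueue({ event: 'C' }) // queued, waiting for the timer or more events
batcher.flush() // manual flush delivers the remaining event
```

The real client applies the same idea per HTTP batch, subject to the 500KB-per-batch and 32KB-per-call limits noted above.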
If you don't want to batch messages, you can turn batching off by setting `maxEventsInBatch` to 1, like so:

```js
const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>', maxEventsInBatch: 1 })
```
Batching means that your message might not get sent right away. But every method call takes an optional callback, which you can use to know when a particular message is flushed from the queue, like so:
```js
analytics.track({
  userId: '019mr8mf4r',
  event: 'Ultimate Played',
  callback: (ctx) => console.log(ctx)
})
```
Subscribe and log all event delivery errors.
```js
const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>' })

analytics.on('error', (err) => console.error(err))
```
Call `.closeAndFlush()` to stop collecting new events and flush all existing events.

```js
await analytics.closeAndFlush()
// or
await analytics.closeAndFlush({ timeout: 5000 }) // force resolve after 5000ms
```
```js
import { Analytics } from '@segment/analytics-node'
import express from 'express'

const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>' })

const app = express()
app.use(express.json())

app.post('/cart', (req, res) => {
  analytics.track({
    userId: req.body.userId,
    event: 'Add to cart',
    properties: { productId: '123456' }
  })
  res.sendStatus(200)
})

const server = app.listen(3000)

const onExit = async () => {
  await analytics.closeAndFlush() // flush all existing events
  server.close(() => {
    console.log('Gracefully closing server...')
    process.exit()
  })
};

['SIGINT', 'SIGTERM'].forEach((code) => process.on(code, onExit))
```
If you absolutely need to preserve all possible events in the event of a forced timeout, even ones that arrive after `analytics.closeAndFlush()` was called, you can collect those events:

```js
const unflushedEvents = []

analytics.on('call_after_close', (event) => unflushedEvents.push(event))

await analytics.closeAndFlush()
console.log(unflushedEvents) // all events that came in after closeAndFlush was called
```
```js
// subscribe to identify calls
analytics.on('identify', (ctx) => console.log(ctx))

// subscribe to track calls
analytics.on('track', (ctx) => console.log(ctx))
```
Different parts of your application may require different types of batching, or even sending to multiple Segment sources. In that case, you can initialize multiple instances of Analytics with different settings:
```js
import { Analytics } from '@segment/analytics-node'

const marketingAnalytics = new Analytics({ writeKey: 'MARKETING_WRITE_KEY' })
const appAnalytics = new Analytics({ writeKey: 'APP_WRITE_KEY' })
```
Double check that you’ve followed all the steps in the Quick Start.
Make sure that you’re calling a Segment API method once the library is successfully installed: identify, track, etc.
Log events and errors via the event emitter:

```js
['initialize', 'call_after_close',
 'screen', 'identify', 'group',
 'track', 'ready', 'alias',
 'page', 'error', 'register',
 'deregister'].forEach((event) => analytics.on(event, console.log))
```
```js
// old
import Analytics from 'analytics-node'
// new
import { Analytics } from '@segment/analytics-node'
```

```js
// old
var analytics = new Analytics('YOUR_WRITE_KEY');
// new
const analytics = new Analytics({ writeKey: 'YOUR_WRITE_KEY' });
```

```js
// old
await analytics.flush(function(err, batch) {
  console.log('Flushed, and now this program can exit!');
});
// new
await analytics.closeAndFlush()
```
Other Differences:

- The `enable` configuration option has been removed -- see the "Disabling Analytics" section.
- The `errorHandler` configuration option has been removed -- see the "Error Handling" section.
- The `flushAt` configuration option has been renamed to `maxEventsInBatch`.
- The `callback` option has moved into the method's configuration object:

```js
// old
analytics.track({
  userId: '019mr8mf4r',
  event: 'Ultimate Played'
}, function(err, batch) {
  if (err) {
    console.error(err)
  }
})

// new
analytics.track({
  userId: '019mr8mf4r',
  event: 'Ultimate Played',
  callback: (ctx) => {
    if (ctx.failedDelivery()) {
      console.error(ctx)
    }
  }
})
```
You should prefer mocking. However, if you need to intercept the actual request, you can use `nock`:

```js
// Note: nock will _not_ work if you polyfill fetch with something like undici,
// as nock intercepts the http module. Undici has its own interception method.
import nock from 'nock'

const mockApiHost = 'https://foo.bar'
const mockPath = '/foo'

nock(mockApiHost) // using regex matching in nock changes the perf profile quite a bit
  .post(mockPath, (body) => true)
  .reply(201)
  .persist()

const analytics = new Analytics({ writeKey: 'foo', host: mockApiHost, path: mockPath })
```
The npm package @segment/analytics-node receives a total of 368,918 weekly downloads, so its popularity is classified as popular. We found that @segment/analytics-node demonstrated a healthy version release cadence and project activity: the last version was released less than a year ago, and 286 open source maintainers collaborate on the project.