Connect to Gnip streaming API and manage rules. You must have a Gnip account with any data source available, like Twitter Power Track.
Currently, this module only supports the JSON activity stream format, so you must enable data normalization in your admin panel.
This class is an EventEmitter and allows you to connect to the stream and start receiving data.
As recommended in the Gnip docs (http://support.gnip.com/apis/powertrack/api_reference.html), this constructor option sets a read timeout on the client. The recommended value is >= 30 seconds, so the constructor will throw an error if a smaller timeout is provided. The default value for this option is 35 seconds.
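A minimal sketch of setting this option; the option name (`timeout`) and its unit (milliseconds) are assumptions based on the description above, so check the constructor before relying on them:

```javascript
var Gnip = require('gnip');

var stream = new Gnip.Stream({
    url : 'https://gnip-stream.twitter.com/stream/powertrack/accounts/xxx/publishers/twitter/prod.json',
    user : 'xxx',
    password : 'xxx',
    // Assumed option name and unit: values below 30 seconds would make
    // the constructor throw, per the recommendation above.
    timeout : 45000
});
```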
Number of minutes to backfill after connecting to the stream. Optional. Value should be 0 - 5.
Partition of the Firehose stream you want to connect to. Only required for Firehose streams.
Parser library for incoming JSON data. Optional, but defaults to the excellent json-bigint library.
Matching tag IDs are sent to us as big integers, which can't be reliably parsed by the native JSON library in Node.js. More info on this issue can be found on Stack Overflow.
Connect to the stream and start receiving data. At this point you should have registered at least one event listener for any of these events: 'data', 'object' or 'tweet'.
Terminates the connection.
Emitted when the connection has been successfully established
Emitted for each data chunk (decompressed)
Emitted when any type of error occurs. An error is emitted if the response status code is not 20x; response bodies containing {error: String} objects are also reported here.
Emitted for each JSON object.
Emitted for each tweet.
Emitted for each deleted tweet.
Emitted when the connection is terminated. This event is always emitted when an error occurs and the connection is closed.
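A sketch of the stream lifecycle tying these events together; the event names for the last two entries ('delete' and 'end') are inferred from the descriptions above and should be checked against the source:

```javascript
var Gnip = require('gnip');

var stream = new Gnip.Stream({
    url : 'https://gnip-stream.twitter.com/stream/powertrack/accounts/xxx/publishers/twitter/prod.json',
    user : 'xxx',
    password : 'xxx'
});

// Register listeners before calling start(); 'delete' and 'end' are assumed names.
stream.on('ready', function() { console.log('connected'); });
stream.on('tweet', function(tweet) { console.log(tweet); });
stream.on('delete', function(tweet) { console.log('deleted tweet', tweet); });
stream.on('end', function() { console.log('connection closed'); });

stream.start();
```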
This class allows you to manage an unlimited number of tracking rules.
GNIP account username.
GNIP account password.
GNIP Rules endpoint url e.g. https://gnip-api.twitter.com/rules/${streamType}/accounts/${account}/publishers/twitter/${label}.json
The batch size used when adding/deleting rules in bulk. (Defaults to 5000)
Much like the parser option in the Gnip.Stream constructor, you can pass a custom parser handler/library for incoming JSON data. Optional; defaults to the json-bigint library.
Get cached rules.
Creates or replaces the live tracking rules.
Rules are sent in batches of options.batchSize, so you can pass an unlimited number of rules.
The current tracking rules are stored in a local JSON file so you can update the existing rules efficiently without having to remove them all.
The callback receives an object as the 2nd argument, containing the number of added and deleted rules.
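For example, a sketch of reading that result object; the docs only say it reports the added and deleted counts, so the property names used here (`added`, `deleted`) are assumptions:

```javascript
var Gnip = require('gnip');

var rules = new Gnip.Rules({
    url : 'https://gnip-api.twitter.com/rules/powertrack/accounts/xxx/publishers/twitter/prod.json',
    user : 'xxx',
    password : 'xxx'
});

rules.update(['#hashtag', 'keyword'], function(err, result) {
    if (err) throw err;
    // The result reports how many rules were added and deleted;
    // the property names below are assumed, not documented.
    console.log('added:', result.added, 'deleted:', result.deleted);
});
```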
Clears cached rules.
This class is an EventEmitter and allows you to connect to either the 30 day or full archive search API and start receiving data.
GNIP account username.
GNIP account password.
GNIP Search endpoint url e.g. https://gnip-api.twitter.com/search/30day/accounts/{ACCOUNT_NAME}/{LABEL}.json
Rule to match tweets.
The oldest date from which tweets will be gathered. Date given in the format 'YYYYMMDDHHMM'. Optional.
The most recent date to which tweets will be gathered. Date given in the format 'YYYYMMDDHHMM'. Optional.
The maximum number of search results to be returned by a request. A number between 10 and 500. Optional.
Used to segregate rules and their matching data into different logical groups. Optional.
The unit of time for which count data will be provided. Options: "day", "hour", "minute". Optional, for /counts calls.
A limiter object, used to control the rate of collection. Optional. If unspecified, a rate limit of 30 requests a minute will be shared between Search streams. If you have a non-standard rate limit, you should pass this parameter.
var RateLimiter = require('limiter').RateLimiter;
// Allow 60 requests per minute
var limiter = new RateLimiter(60, 'minute');
var stream = new Gnip.Search({
rateLimiter : limiter,
...
});
Start receiving data. At this point you should have registered at least one event listener for 'object' or 'tweet'.
Terminates the connection.
Emitted when tweets have started to be collected.
Emitted when a recoverable (non-fatal) error occurs.
Emitted for each JSON object.
Emitted for each tweet.
Emitted when the connection is terminated. If the stream has ended due to a fatal error, the error object will be passed.
This class allows you to track activity consumption across Gnip products.
GNIP API url, e.g: https://gnip-api.twitter.com/metrics/usage/accounts/{ACCOUNT_NAME}.json
Username for authentication.
Password for authentication.
Example:
var usage = new Gnip.Usage({
url : 'https://gnip-api.twitter.com/metrics/usage/accounts/{ACCOUNT_NAME}.json',
user : 'xxx',
password : 'xxx'
});
The error is passed as the first parameter to the callback, and the result as the second. Information on request parameters can be found in the Gnip usage API documentation.
usage.get({ bucket: 'day', fromDate: '201612010000', toDate: '201612100000' }, function(err, body) {
	...
});
npm install gnip
var Gnip = require('gnip');
var stream = new Gnip.Stream({
url : 'https://gnip-stream.twitter.com/stream/powertrack/accounts/xxx/publishers/twitter/prod.json',
user : 'xxx',
password : 'xxx',
backfillMinutes: 5 // optional
});
stream.on('ready', function() {
console.log('Stream ready!');
});
stream.on('tweet', function(tweet) {
console.log(tweet);
});
stream.on('error', function(err) {
console.error(err);
});
var rules = new Gnip.Rules({
url : 'https://gnip-api.twitter.com/rules/powertrack/accounts/xxx/publishers/twitter/prod.json',
user : 'xxx',
password : 'xxx',
batchSize: 1234 // not required, defaults to 5000
});
var newRules = [
'#hashtag',
'keyword',
'@user',
{value: 'keyword as object'},
{value: '@demianr85', tag: 'rule tag'}
];
rules.update(newRules, function(err) {
if (err) throw err;
stream.start();
});
var search = new Gnip.Search({
	url : 'https://gnip-api.twitter.com/search/30day/accounts/xxx/prod.json',
user : 'xxx',
password : 'xxx',
query : '@user'
});
search.on('tweet', function(tweet) {
console.log(tweet);
});
search.on('error', function(err) {
console.error(err);
});
search.on('end', function(err) {
if( err )
console.error(err);
});
// search counts usage
var counts = new Gnip.Search({
	url : 'https://gnip-api.twitter.com/search/30day/accounts/xxx/prod/counts.json',
user : 'xxx',
password : 'xxx',
query : '@user',
bucket: 'day'
});
counts.on('object', function(object) {
console.log(object.results);
counts.end();
});
More details and tests soon...