
hive-stream
A layer for streaming actions on the Hive blockchain and reacting to them.
A Node.js layer for Hive that allows you to watch for specific actions on the Hive blockchain.
npm install hive-stream
const { Streamer } = require('hive-stream');
const ss = new Streamer();
// Watch for all custom JSON operations
ss.onCustomJson((op, { sender, isSignedWithActiveKey }, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
// React to custom JSON operations
});
new Streamer() is now side-effect free. The default SQLite adapter is created lazily on first use, and the built-in Express API is opt-in via apiEnabled: true on start() or an explicit startApiServer() call.
For external tooling (like visual builders), Hive Stream now exports a read-only metadata object:
const { HIVE_STREAM_METADATA, getHiveStreamMetadata } = require('hive-stream');
console.log(HIVE_STREAM_METADATA.subscriptions);
console.log(getHiveStreamMetadata().writeOperations);
This metadata is static runtime data (no network calls) and includes config defaults, event callback signatures, write operation signatures, adapter metadata, contract trigger info, and valid TimeAction values.
This repo now includes installable AI skills for both Claude Code and Codex, tailored to building on top of hive-stream instead of raw Hive RPC primitives.
Quick locations:
.claude/skills/hive-stream
codex-skills/hive-stream
For full install and usage instructions, see AI-SKILLS.md.
Both skill bundles include focused references for package surface, contracts and triggers, transfer flows and builder APIs, and the built-in contract catalog.
The Streamer constructor accepts a configuration object whose values are all optional. However, operations that are not read-only, such as transferring Hive Engine tokens or other write operations on the blockchain, require the active and/or posting key to be supplied, along with a username.
The blockCheckInterval value controls how often to poll for new blocks, including after errors or when the streamer falls behind. You should keep the default of 1000ms (one second); this gives the streamer room to catch up whenever it falls behind the head block.
The blocksBehindWarning value is the number of blocks the streamer may fall behind the head block before a warning is logged to the console.
To resume automatically from stored state, keep resumeFromState enabled (default). To force a specific start block, set resumeFromState to false and supply lastBlockNumber.
For faster catch-up, catchUpBatchSize controls how many blocks are processed per polling cycle, and catchUpDelayMs controls the delay between catch-up batches (set to 0 for fastest catch-up).
The apiNodes are the Hive API endpoints used for failover. Set apiEnabled to true if you want start() to boot the built-in API server, or call startApiServer() manually. If you want verbose logs, set debugMode to true. The configuration values and their defaults can be found in src/config.ts.
CamelCase config keys are recommended for readability. Legacy uppercase keys are still supported for backwards compatibility.
const options = {
env: true,
activeKey: '',
postingKey: '',
jsonId: 'hivestream',
hiveEngineApi: 'https://api.hive-engine.com/rpc',
hiveEngineId: 'ssc-mainnet-hive',
payloadIdentifier: 'hive_stream',
appName: 'hive-stream',
username: '',
lastBlockNumber: 0,
blockCheckInterval: 1000,
blocksBehindWarning: 25,
resumeFromState: true,
catchUpBatchSize: 50,
catchUpDelayMs: 0,
apiNodes: ['https://api.hive.blog', 'https://api.openhive.network', 'https://rpc.ausbit.dev'],
apiEnabled: false,
apiPort: 5001,
debugMode: false
}
const ss = new Streamer(options);
If you prefer loading credentials from environment variables, pass env: true. Hive Stream will read canonical keys like ACTIVE_KEY and USERNAME, plus Hive-friendly aliases like HIVE_ACCOUNT and HIVE_ACTIVE_KEY.
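As a rough sketch of that alias lookup (only the variable names come from the list above; the precedence order shown here and the helper itself are assumptions, not hive-stream's actual implementation):

```javascript
// Hypothetical helper illustrating env-based credential resolution.
// Canonical names (ACTIVE_KEY, USERNAME) are tried first, then the
// Hive-friendly aliases (HIVE_ACTIVE_KEY, HIVE_ACCOUNT); the exact
// precedence inside hive-stream is an assumption.
function resolveEnvCredentials(env = process.env) {
  return {
    activeKey: env.ACTIVE_KEY || env.HIVE_ACTIVE_KEY || '',
    username: env.USERNAME || env.HIVE_ACCOUNT || ''
  };
}
```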
If you want the built-in API without starting block streaming yet:
await ss.startApiServer();
The configuration can also be updated at runtime using the setConfig method, which accepts one or more of the configuration options above. This is useful in situations where multiple keys might be used for issuing.
ss.setConfig({
activeKey: 'newactivekey',
username: 'newusername'
});
The following subscription methods are read-only: they allow you to react to specific Hive and Hive Engine events on the blockchain. You do not need to pass in any keys to use them.
These event subscriptions and contract actions are separate paths: subscriptions fire for matching operations, while contracts only run when a payload wrapper exists under PAYLOAD_IDENTIFIER.
The following subscriptions DO require calling the start() method first to watch the blockchain.
ss.onTransfer('myaccount', (op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
// Fires only when op.to === 'myaccount'
// Parse op.amount yourself, for example: "1.000 HIVE"
});
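Since op.amount arrives as a string like "1.000 HIVE", a minimal parsing sketch (the helper name here is ours; the library also exposes ss.money.parseAssetAmount, covered later in these docs):

```javascript
// Split a Hive asset string like "1.000 HIVE" into a numeric
// amount and a symbol. Illustrative only, not the library's parser.
function parseAssetAmount(raw) {
  const [amount, symbol] = raw.trim().split(/\s+/);
  return { amount: parseFloat(amount), symbol };
}
```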
ss.onEscrowTransfer((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
ss.onEscrowApprove((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
ss.onEscrowDispute((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
ss.onEscrowRelease((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
ss.onCustomJson((op, { sender, isSignedWithActiveKey }, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
})
ss.onCustomJsonId((op, { sender, isSignedWithActiveKey }, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
}, 'your-custom-json-id');
ss.onHiveEngine((contractName, contractAction, contractPayload, sender, op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
ss.onPost((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
ss.onComment((op, blockNumber, blockId, prevBlockId, trxId, blockTime) => {
});
All of the methods below require an active key to have been supplied to the constructor (activeKey, or the legacy ACTIVE_KEY). They are all promise-based, so you can await them or use .then() to confirm a successful result.
The following actions do NOT require calling the start() method first to watch the blockchain.
const ss = new Streamer({
ACTIVE_KEY: 'youractivekey'
});
transferHiveTokens(from, to, amount, symbol, memo = '') {
}
burnHiveTokens(from, amount, symbol, memo = '') {
}
burnTransferPercentage(from, transferOrAmount, percentage, memo = '', allowedSymbols = ['HIVE', 'HBD']) {
}
transferHiveEngineTokens(from, to, symbol, quantity, memo = '') {
}
burnHiveEngineTokens(from, symbol, quantity, memo = '') {
}
transferHiveEngineTokensMultiple(from, accounts = [], symbol, memo = '', amount = '0') {
}
const { Streamer } = require('hive-stream');
const ss = new Streamer({ env: true });
ss.flows.autoBurnIncomingTransfers({
percentage: 67,
memo: ({ transaction }) => `Auto-burn 67% of ${transaction.id}`
});
ss.start();
const { Streamer } = require('hive-stream');
const ss = new Streamer({ env: true });
ss.flows.autoForwardIncomingTransfers({
to: 'treasury',
percentage: 100,
memo: ({ transaction }) => `Forwarded from ${transaction.id}`
});
ss.start();
const { Streamer } = require('hive-stream');
const ss = new Streamer({ env: true });
ss.flows.autoSplitIncomingTransfers({
recipients: [
{ account: 'null', percentage: 69, memo: 'Feel the burn' },
{ account: 'treasury' }
]
});
ss.start();
const { Streamer } = require('hive-stream');
const ss = new Streamer({ env: true });
ss.flows.autoRefundIncomingTransfers({
memo: ({ transfer }) => `Refunded ${transfer.rawAmount} to ${transfer.from}`
});
ss.start();
const { Streamer } = require('hive-stream');
const ss = new Streamer({ env: true });
ss.flows.autoRouteIncomingTransfers({
routes: [
{ type: 'burn', percentage: 67, memo: 'Auto-burn 67%' },
{ to: 'treasury', memo: 'Treasury remainder' }
]
});
ss.start();
const { Streamer } = require('hive-stream');
const ss = new Streamer({ env: true });
ss.flows.autoRouteIncomingTransfers({
account: 'tweet-backup',
routes: [
{ to: 'tweet-catcher', percentage: 20, memo: 'Tweet watcher share' },
{ group: [{ account: 'node-1' }, { account: 'node-2' }], percentage: 4, memo: 'Node operator share' },
{ group: [{ account: 'wit-1' }, { account: 'wit-2' }], percentage: 6, memo: 'Witness share' },
{ type: 'burn', percentage: 70, memo: 'Burn share' },
{ to: 'platform-op', mode: 'onTop', percentage: 8, memo: 'Optional platform donation' }
]
});
ss.start();
const { Streamer } = require('hive-stream');
const ss = new Streamer({ env: true });
ss.flows
.incomingTransfers()
.burn(69, 'Feel the burn')
.remainderTo('treasury', 'Treasury remainder')
.start();
ss.start();
const { Streamer } = require('hive-stream');
const ss = new Streamer({ env: true });
const plan = ss.flows
.incomingTransfers('tweet-backup')
.forwardTo('tweet-catcher', 20, 'Tweet watcher share')
.forwardGroup([{ account: 'node-1' }, { account: 'node-2' }], 4, { memo: 'Node operator share' })
.remainderToGroup([{ account: 'wit-1' }, { account: 'wit-2' }], { memo: 'Witness share' })
.burn(70, 'Burn share')
.donateOnTop('platform-op', 8, 'Optional platform donation')
.plan({ from: 'buyer', to: 'tweet-backup', amount: '1.080 HBD', memo: 'Archive this tweet' });
console.log(plan.baseAmount); // "1.000"
console.log(plan.onTopAmount); // "0.080"
console.log(plan.routes);
flows.autoBurnIncomingTransfers() is the quickest high-level option for the burn case. flows.autoForwardIncomingTransfers() covers treasury forwarding, flows.autoSplitIncomingTransfers() handles common revenue-sharing, and flows.autoRefundIncomingTransfers() is useful for rejecting unsupported payments. flows.autoRouteIncomingTransfers() is the general router for mixed burn/transfer/group routes, and flows.planIncomingTransferRoutes() previews the same math without broadcasting. In base routes, one destination can omit percentage/basisPoints and automatically receive the remainder. Routes with mode: 'onTop' are treated as a surcharge on the base payout amount, so a 1.000 HBD base payout with an 8% donation should arrive as 1.080 HBD.
flows.incomingTransfers() is the chainable version of the same idea. Single-step builders compile down to autoBurnIncomingTransfers(), autoForwardIncomingTransfers(), or autoRefundIncomingTransfers(). Multi-step builders compile down to autoRouteIncomingTransfers(), and .plan(...) gives you the exact rounded output before any transfer is sent.
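The routing math above can be illustrated with a self-contained sketch. This is not the library's implementation; the 3-decimal rounding policy and the remainder handling shown here are assumptions for illustration:

```javascript
// Sketch of base routes plus 'onTop' surcharges: percentage routes take
// a share of the base amount, one route may omit its percentage to
// receive the remainder, and mode: 'onTop' routes are added on top of
// the base payout rather than deducted from it.
function planRoutes(baseAmount, routes) {
  const to3 = (n) => (Math.round(n * 1000) / 1000).toFixed(3);
  let remainder = baseAmount;
  const planned = routes.map((route) => {
    if (route.mode === 'onTop') {
      // Surcharge: computed from the base amount, does not reduce it.
      return { ...route, amount: to3(baseAmount * (route.percentage / 100)) };
    }
    const share = route.percentage != null
      ? baseAmount * (route.percentage / 100)
      : null;
    if (share != null) remainder -= share;
    return { ...route, amount: share != null ? to3(share) : null };
  });
  for (const p of planned) {
    if (p.amount === null) p.amount = to3(remainder); // remainder route
  }
  return planned;
}
```

With a 1.000 base and routes of 67% burn, remainder to treasury, and an 8% onTop donation, this yields 0.670, 0.330, and 0.080, matching the 1.080 HBD arrival described above.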
const ss = new Streamer();
ss.money.parseAssetAmount('1.000 HIVE');
ss.money.formatAmount('1.2399'); // "1.239"
ss.money.calculatePercentageAmount('10.000', 12.5); // "1.250"
ss.money.splitAmountByBasisPoints('1.000', [6900, 3100]); // ["0.690", "0.310"]
ss.money.splitAmountByWeights('1.080', [10000, 800]); // ["1.000", "0.080"]
issueHiveEngineTokens(from, to, symbol, quantity, memo = '') {
}
issueHiveEngineTokensMultiple(from, accounts = [], symbol, memo = '', amount = '0') {
}
escrowTransfer({
from,
to,
agent,
escrow_id,
hive_amount = '0.000 HIVE',
hbd_amount = '0.000 HBD',
fee,
ratification_deadline,
escrow_expiration,
json_meta
}, signingKeys?)
escrowApprove({ from, to, agent, who, escrow_id, approve }, signingKeys?)
escrowDispute({ from, to, agent, who, escrow_id }, signingKeys?)
escrowRelease({ from, to, agent, who, receiver, escrow_id, hive_amount, hbd_amount }, signingKeys?)
broadcastOperations(operations, signingKeys?)
broadcastMultiSigOperations(operations, signingKeys)
createAuthority(keyAuths, accountAuths, weightThreshold)
updateAccountAuthorities(account, authorityUpdate, signingKeys?)
recurrentTransfer({ from, to, amount, memo, recurrence, executions }, signingKeys?)
createProposal({ creator, receiver, start_date, end_date, daily_pay, subject, permlink }, signingKeys?)
updateProposalVotes({ voter, proposal_ids, approve }, signingKeys?)
removeProposals({ proposal_owner, proposal_ids }, signingKeys?)
const { Streamer } = require('hive-stream');
const ss = new Streamer({ env: true });
await ss.ops
.transfer()
.from('alice')
.to('bob')
.hive(1.25)
.memo('Builder transfer example')
.send();
await ss.ops
.createProposal()
.creator('alice')
.receiver('treasury')
.startDate(new Date('2026-04-01T00:00:00.000Z'))
.endDate(new Date('2026-05-01T00:00:00.000Z'))
.dailyHbd(12.5)
.subject('Builder proposal example')
.permlink('builder-proposal-example')
.send();
Additional chainable write builders are available for Hive Engine token ops and governance/voting:
await ss.ops
.transferEngine()
.from('alice')
.to('bob')
.symbol('BEE')
.quantity('1.23456')
.memo('Engine transfer')
.send();
await ss.ops
.voteProposals()
.voter('alice')
.ids(1, 2, 3)
.approve()
.send();
await ss.ops
.upvote()
.author('bob')
.permlink('my-post')
.weight(25)
.send();
upvote(votePercentage = '100.0', username, permlink) {
}
downvote(votePercentage = '100.0', username, permlink) {
}
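For reference, vote weights on the Hive chain range from -10000 to 10000, where 10000 represents 100%. A hedged sketch of how a votePercentage string might map onto that range (whether hive-stream performs exactly this conversion internally is an assumption):

```javascript
// Convert a percentage string like '100.0' to an on-chain vote weight.
function votePercentageToWeight(votePercentage) {
  return Math.round(parseFloat(votePercentage) * 100);
}

// A downvote would simply use the negated weight.
function downvoteWeight(votePercentage) {
  return -votePercentageToWeight(votePercentage);
}
```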
Hive Stream allows you to register contract definitions that execute when a transfer memo or custom JSON operation includes a contract wrapper. The payload lives under the PAYLOAD_IDENTIFIER key (default: hive_stream).
Regular event handlers like onTransfer and onCustomJson still run for matching operations even when no contract wrapper is present.
The payload shape is:
contract: the name of the contract you registered
action: the action name defined in your contract
payload: data passed to the action
meta: optional metadata
Contracts are defined with defineContract + action. Each action can specify a trigger (custom_json, transfer, time, escrow_transfer, escrow_approve, escrow_dispute, escrow_release, or recurrent_transfer) and an optional Zod schema for payload validation.
For a full contract-building guide (payloads, context, triggers, validation, error handling, and exchange setup), see DOCUMENTATION.md.
Register a contract definition. Registration is async so hooks can initialize state.
import { defineContract, action } from 'hive-stream';
const MyContract = defineContract({
name: 'mycontract',
actions: {
hello: action(async (payload, ctx) => {
console.log('hello', payload, ctx.sender);
}, { trigger: 'custom_json' })
}
});
await streamer.registerContract(MyContract);
Unregister a contract that has been registered.
await streamer.unregisterContract('mycontract');
JSON.stringify({
hive_stream: {
contract: 'hivedice',
action: 'roll',
payload: { roll: 22 }
}
})
This will match a registered contract called hivedice, run the roll action, and pass the payload into your handler.
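Conceptually, the matching step works like the sketch below. This is an illustration of the dispatch just described, not hive-stream's internals; only the default payload key 'hive_stream' comes from the docs:

```javascript
// Hypothetical dispatcher: parse the JSON, look for the payload wrapper
// under the payload identifier, then route to the registered contract
// action. Without a wrapper, only regular event handlers would fire.
function dispatchContractPayload(json, registry, payloadIdentifier = 'hive_stream') {
  const wrapper = JSON.parse(json)[payloadIdentifier];
  if (!wrapper) return null; // no contract wrapper present
  const contract = registry[wrapper.contract];
  const handler = contract && contract[wrapper.action];
  return handler ? handler(wrapper.payload) : null;
}
```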
The library includes several built-in contract examples in the src/contracts folder:
createDiceContract - A dice rolling game contract
createCoinflipContract - A coin flip game contract
createLottoContract - A lottery-style game contract
createTokenContract - A contract for token operations
createNFTContract - A contract for NFT operations
createRpsContract - A rock-paper-scissors game contract
createPollContract - A poll/voting contract
createTipJarContract - A tip jar + message board contract
createExchangeContract - A basic exchange with deposits, withdrawals, balances, and order matching (SQL adapter required)
createAuctionHouseContract - Auctions with reserve prices, buy-now support, and timed settlement
createSubscriptionContract - Subscription plans with transfer and recurrent-transfer renewals
createCrowdfundContract - Crowdfunding campaigns with milestones, finalization, and refund tracking
createBountyBoardContract - Funded bounties, submissions, and award selection
createInvoiceContract - Invoices with partial payments, recurring payments, and overdue sweeps
createSavingsContract - Savings goals with recurring contributions and withdrawal requests
createBookingContract - Reservable listings with paid booking windows and confirmations
createGiftCardContract - Gift card issuance, redemption, and cancellation flows
createGroupBuyContract - Threshold-based pooled purchases and participant commitments
createSweepstakesContract - Paid-entry sweepstakes with deterministic winner draws
createDcaBotContract - Time-based DCA bot scheduling and execution request events
createMultisigTreasuryContract - Multisig vaults, proposal approvals, and execution readiness tracking
createRevenueSplitContract - Revenue share ledgers and withdrawal requests for collaborators
createPaywallContract - Paid access control for gated resources and memberships
createDomainRegistryContract - App-level namespaces with registrations, renewals, transfers, and expiries
createRentalContract - Escrow-backed rental agreements for items, passes, or assets
createLaunchpadContract - Launchpad sales with allocations, finalization, and claim flows
createPredictionMarketContract - Prediction markets with positions, resolution, and winner claims
createQuestPassContract - Seasonal passes with progress tracking and reward claims
createCharityMatchContract - Donation campaigns with matched totals and closing summaries
createReferralContract - Affiliate programs with codes, funded budgets, and payout balances
createInsurancePoolContract - Insurance pools with premium-backed policies, claims, and reserve management
createOracleBountyContract - Oracle bounty feeds with report rounds, median finalization, and reporter rewards
createGrantRoundsContract - Matching grant rounds with project submissions, donations, and post-close allocations
createPayrollContract - Recurring team payrolls with funded budgets, scheduled runs, and recipient withdrawals
createProposalTimelockContract - Timelocked governance queues with approvals, delays, and execution requests
createBundleMarketplaceContract - Fixed-price bundle storefronts with inventory tracking and fulfillment states
createTicketingContract - Event ticketing with purchases, check-ins, refunds, and capacity enforcement
createFanClubContract - Paid fan clubs with member renewals, engagement points, and perk redemptions
These can be imported and used as examples for building your own contracts:
import { createDiceContract, createCoinflipContract, createLottoContract } from 'hive-stream';
Most built-in contracts in src/contracts persist SQL tables internally, so they require a SQL-capable adapter such as SQLite or PostgreSQL. MongoDB remains supported for streamer persistence and custom contracts that do not depend on raw SQL queries.
Sample snippets for the newest contracts live in examples/contracts/:
examples/contracts/rps.ts
examples/contracts/poll.ts
examples/contracts/tipjar.ts
examples/contracts/exchange.ts
Higher-level flow examples live in examples/flows/:
examples/flows/auto-burn.ts
examples/flows/auto-forward.ts
examples/flows/auto-split.ts
examples/flows/auto-refund.ts
examples/flows/builder-burn-route.ts
examples/flows/grouped-route-on-top.ts
examples/flows/builder-payout-plan.ts
Chainable operation examples live in examples/ops/:
examples/ops/transfer-builder.ts
examples/ops/proposal-builder.ts
It's like a cron job for your contracts. Time-based actions allow you to execute contract functions on a wide variety of schedules. Want to call a function every block (roughly every 3 seconds), or once per day? Time-based actions are an easy way to run code on a schedule.
The following example will run a contract action every 30 seconds. All you do is register a new TimeAction instance.
import { TimeAction, Streamer } from 'hive-stream';
const streamer = new Streamer({
ACTIVE_KEY: ''
});
const testAction = new TimeAction('30s', 'test30s', 'hivedice', 'testauto');
streamer.registerAction(testAction);
streamer.start();
The TimeAction instance accepts the following values:
new TimeAction(timeValue, uniqueId, contractName, contractAction, date)
At the moment, the timeValue passed in as the first argument to TimeAction cannot accept arbitrary values. However, many intervals are available out of the box, with more flexibility to come in the future.
3s or block will run a task every block (3 seconds, approximately)
10s will run a task every 10 seconds
30s will run a task every 30 seconds
1m or minute will run a task every 60 seconds (1 minute)
5m will run a task every 5 minutes
15m or quarter will run a task every 15 minutes
30m or halfhour will run a task every 30 minutes
1h or hourly will run a task every 60 minutes (1 hour)
12h or halfday will run a task every 12 hours (half a day)
24h, day, or daily will run a task every 24 hours (1 day)
week or weekly will run a task every 7 days (1 week)
Values will be persisted if you are using one of the database adapters that ship with the library.
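The documented aliases map to intervals like this (the lookup helper itself is hypothetical; the durations follow directly from the list above):

```javascript
// Map documented TimeAction timeValue aliases to milliseconds.
function timeValueToMs(timeValue) {
  const table = {
    '3s': 3000, block: 3000,
    '10s': 10000,
    '30s': 30000,
    '1m': 60000, minute: 60000,
    '5m': 300000,
    '15m': 900000, quarter: 900000,
    '30m': 1800000, halfhour: 1800000,
    '1h': 3600000, hourly: 3600000,
    '12h': 43200000, halfday: 43200000,
    '24h': 86400000, day: 86400000, daily: 86400000,
    week: 604800000, weekly: 604800000
  };
  return table[timeValue]; // undefined for unsupported values
}
```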
The Hive Stream library supports custom adapters for various actions that take place in the library. When the library first loads, it asks the adapter for the last processed block number, and each time a block is processed it stores the new block number. The library ships with three adapters: SQLite, MongoDB, and PostgreSQL. These provide robust database storage for blockchain state and operations.
By default, Streamer uses the SQLite adapter. To use a different adapter, call the registerAdapter() method:
import { Streamer, SqliteAdapter } from 'hive-stream';
const streamer = new Streamer(config);
// SQLite is used by default, but you can explicitly register a custom SQLite database:
const adapter = new SqliteAdapter('./hive-stream.db');
await streamer.registerAdapter(adapter);
import { Streamer, MongodbAdapter } from 'hive-stream';
const streamer = new Streamer(config);
const adapter = new MongodbAdapter('mongodb://localhost:27017', 'hive_stream');
await streamer.registerAdapter(adapter);
MongoDB supports block state, transfers, custom JSON persistence, and custom contracts that manage their own state without SQL. Built-in SQL-backed contracts should use SQLite or PostgreSQL.
import { Streamer, PostgreSQLAdapter } from 'hive-stream';
const streamer = new Streamer(config);
const adapter = new PostgreSQLAdapter({
host: 'localhost',
port: 5432,
user: 'postgres',
password: 'your_password',
database: 'hive_stream'
});
// Or with connection string
const adapter = new PostgreSQLAdapter({
connectionString: 'postgresql://user:pass@localhost:5432/hive_stream'
});
await streamer.registerAdapter(adapter);
When creating an adapter, at a minimum your adapter requires two methods: loadState and saveState. It must also extend AdapterBase which is exported from the package.
You can see a few adapters that ship with Hive Stream in the src/adapters directory.
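A minimal sketch of that contract follows. A real adapter must extend AdapterBase exported from hive-stream; the standalone class below only illustrates the two required methods, and the state shape ({ lastBlockNumber }) is an assumption for illustration:

```javascript
// In-memory stand-in for a custom adapter. Real adapters extend
// AdapterBase from 'hive-stream' and persist to durable storage.
class InMemoryAdapter {
  constructor() {
    this.state = null;
  }

  // Called when the streamer boots; return null when no prior state exists.
  async loadState() {
    return this.state;
  }

  // Called as blocks are processed; persist the latest streamer state.
  async saveState(state) {
    this.state = state;
    return true;
  }
}
```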
Simply copy the ecosystem.config.js file from this repository into your application, globally install pm2 via npm install pm2 -g and change the script value below to reflect the main file of your application.
ecosystem.config.js
module.exports = {
apps: [
{
name: 'hive-stream',
script: 'index.js',
ignore_watch: ['node_modules'],
env: {
NODE_ENV: 'development'
},
env_production: {
NODE_ENV: 'production'
}
}
]
};