
panoptes-sdk
Panoptes SQL auditing SDK for Node.js with multi-database support, before/after data capture, and database transport
Enterprise-grade SQL Audit Trails for Node.js Applications
Automatically audit all SQL queries with rich context, before/after snapshots, and zero configuration. A modern alternative to database triggers.
npm install panoptes-sdk
# or
yarn add panoptes-sdk
# or
pnpm add panoptes-sdk
import { initAudit, createAuditedPostgresClient } from 'panoptes-sdk';
import { Pool } from 'pg';

// 1. Initialize audit configuration (do this once at app startup)
initAudit({
  appName: 'my-app',
  environment: 'production',
  transports: {
    enabled: ['console', 'database']
  }
});

// 2. Create your database connection
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// 3. Wrap your client with Panoptes audit
const auditedClient = createAuditedPostgresClient(pool, {
  database: 'mydb',
  engine: 'postgres'
});

// 4. Use it like a normal client - audits are automatic!
await auditedClient.query(
  'UPDATE users SET email = $1 WHERE id = $2',
  ['new@email.com', 123]
);
// ✅ This query is now automatically logged with full context
Panoptes SDK automatically tracks every database change in your application:
Instead of this (database triggers):

-- Complex trigger logic that's hard to maintain
CREATE TRIGGER audit_users_update
AFTER UPDATE ON users
FOR EACH ROW
BEGIN
  -- Manual JSON building, no user context, database-specific...
END;

Just do this:

const db = createAuditedPostgresClient(pool, { database: 'mydb', engine: 'postgres' });
// That's it! All queries are now automatically audited with full context
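For reference, a single captured audit entry looks roughly like the following. This is a hypothetical illustration, not the SDK's exact internal type; the field names mirror the audit_logs columns used in the querying examples later in this document.

```javascript
// Hypothetical shape of one audit entry (field names follow the
// audit_logs columns shown in the querying section; illustrative only).
const sampleAuditEntry = {
  operation: 'UPDATE',
  table_name: 'users',
  query: 'UPDATE users SET email = $1 WHERE id = $2',
  user_id: 123,
  username: 'john@example.com',
  ip_address: '192.168.1.1',
  before_state: { id: 123, email: 'old@email.com' },
  after_state: { id: 123, email: 'new@email.com' },
  created_at: new Date().toISOString()
};
```

The before_state/after_state snapshots are what make row-level diffs possible without database triggers.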
This is the Node.js implementation of Panoptes SDK.
Visit https://panoptes-sdk.pages.dev/ for complete guides.
Panoptes SDK works with all major SQL databases. Here's how to use it with each:
PostgreSQL:

import { initAudit, createAuditedPostgresClient } from 'panoptes-sdk';
import { Pool } from 'pg';

initAudit({ appName: 'my-app', environment: 'production' });

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const client = createAuditedPostgresClient(pool, {
  database: 'mydb',
  engine: 'postgres'
});

await client.query('INSERT INTO users (name, email) VALUES ($1, $2)', ['John', 'john@example.com']);
MySQL:

import { initAudit, createAuditedMySQLClient } from 'panoptes-sdk';
import mysql from 'mysql2/promise';

initAudit({ appName: 'my-app', environment: 'production' });

const pool = mysql.createPool({
  host: 'localhost',
  user: 'root',
  database: 'mydb'
});
const client = createAuditedMySQLClient(pool, {
  database: 'mydb',
  engine: 'mysql'
});

await client.query('INSERT INTO users (name, email) VALUES (?, ?)', ['John', 'john@example.com']);
SQL Server (MSSQL):

import { initAudit, createAuditedMSSQLClient } from 'panoptes-sdk';
import sql from 'mssql';

initAudit({ appName: 'my-app', environment: 'production' });

const pool = new sql.ConnectionPool({
  server: 'localhost',
  database: 'mydb',
  user: 'sa',
  password: 'password'
});
await pool.connect();

const client = createAuditedMSSQLClient(pool, {
  database: 'mydb',
  engine: 'mssql'
});

await client.query('INSERT INTO users (name, email) VALUES (@name, @email)', {
  name: 'John',
  email: 'john@example.com'
});
SQLite:

import { initAudit, createAuditedSQLiteClient } from 'panoptes-sdk';
import sqlite3 from 'sqlite3';

initAudit({ appName: 'my-app', environment: 'development' });

const db = new sqlite3.Database('./mydb.sqlite');
const client = createAuditedSQLiteClient(db, {
  database: 'mydb',
  engine: 'sqlite'
});

await client.query('INSERT INTO users (name, email) VALUES (?, ?)', ['John', 'john@example.com']);
Oracle:

import { initAudit, createAuditedOracleClient } from 'panoptes-sdk';
import oracledb from 'oracledb';

initAudit({ appName: 'my-app', environment: 'production' });

const pool = await oracledb.createPool({
  user: 'system',
  password: 'password',
  connectString: 'localhost/XEPDB1'
});
const client = createAuditedOracleClient(pool, {
  database: 'mydb',
  engine: 'oracle'
});

await client.query('INSERT INTO users (name, email) VALUES (:name, :email)', {
  name: 'John',
  email: 'john@example.com'
});
Express:

import express from 'express';
import { initAudit, createAuditedPostgresClient, setUserContext } from 'panoptes-sdk';
import { Pool } from 'pg';

const app = express();

// Initialize Panoptes at app startup
initAudit({
  appName: 'my-express-app',
  environment: process.env.NODE_ENV,
  transports: {
    enabled: ['console', 'database']
  }
});

// Create audited database client
const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const db = createAuditedPostgresClient(pool, {
  database: 'mydb',
  engine: 'postgres'
});

// Middleware to set user context for ALL requests
app.use((req, res, next) => {
  setUserContext({
    userId: req.user?.id,
    username: req.user?.email,
    ipAddress: req.ip,
    sessionId: req.sessionID,
    userAgent: req.get('user-agent')
  });
  next();
});

// Your routes now automatically log all database changes
app.post('/users', async (req, res) => {
  const { name, email } = req.body;
  // This query is automatically audited with user context!
  const result = await db.query(
    'INSERT INTO users (name, email) VALUES ($1, $2) RETURNING *',
    [name, email]
  );
  res.json(result.rows[0]);
});

app.put('/users/:id', async (req, res) => {
  const { id } = req.params;
  const { email } = req.body;
  // Update is audited with before/after snapshots
  await db.query(
    'UPDATE users SET email = $1 WHERE id = $2',
    [email, id]
  );
  res.json({ success: true });
});

app.listen(3000);
Fastify:

import Fastify from 'fastify';
import { initAudit, createAuditedMySQLClient, setUserContext } from 'panoptes-sdk';
import mysql from 'mysql2/promise';

const fastify = Fastify();

initAudit({
  appName: 'my-fastify-app',
  environment: 'production'
});

const pool = mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME
});
const db = createAuditedMySQLClient(pool, {
  database: 'mydb',
  engine: 'mysql'
});

// Hook to set user context
fastify.addHook('onRequest', async (request, reply) => {
  setUserContext({
    userId: request.user?.id,
    username: request.user?.username,
    ipAddress: request.ip
  });
});

fastify.post('/products', async (request, reply) => {
  const { name, price } = request.body;
  const [result] = await db.query(
    'INSERT INTO products (name, price) VALUES (?, ?)',
    [name, price]
  );
  return { id: result.insertId, name, price };
});

await fastify.listen({ port: 3000 });
NestJS:

// database.module.ts
import { Module } from '@nestjs/common';
import { initAudit, createAuditedPostgresClient } from 'panoptes-sdk';
import { Pool } from 'pg';

@Module({
  providers: [
    {
      provide: 'DATABASE_CLIENT',
      useFactory: () => {
        initAudit({
          appName: 'my-nestjs-app',
          environment: process.env.NODE_ENV
        });
        const pool = new Pool({
          connectionString: process.env.DATABASE_URL
        });
        return createAuditedPostgresClient(pool, {
          database: 'mydb',
          engine: 'postgres'
        });
      }
    }
  ],
  exports: ['DATABASE_CLIENT']
})
export class DatabaseModule {}

// user.service.ts
import { Injectable, Inject } from '@nestjs/common';

@Injectable()
export class UserService {
  constructor(@Inject('DATABASE_CLIENT') private db: any) {}

  async createUser(name: string, email: string) {
    const result = await this.db.query(
      'INSERT INTO users (name, email) VALUES ($1, $2) RETURNING *',
      [name, email]
    );
    return result.rows[0];
  }

  async updateUser(id: number, email: string) {
    await this.db.query(
      'UPDATE users SET email = $1 WHERE id = $2',
      [email, id]
    );
  }
}
Basic configuration:

import { initAudit } from 'panoptes-sdk';

initAudit({
  appName: 'my-app',
  environment: 'production'
});

Environment-based configuration:

initAudit({
  appName: 'my-app',
  environment: process.env.NODE_ENV,
  // Use different transports based on environment
  transports: {
    enabled: process.env.NODE_ENV === 'production'
      ? ['database', 'http'] // Production: save to DB and send to external service
      : ['console'] // Development: just log to console
  },
  // Enable async mode in production for better performance
  asyncMode: process.env.NODE_ENV === 'production',
  bufferSize: 500
});
Audit only specific tables:

initAudit({
  appName: 'my-app',
  // Only audit these tables
  rules: {
    includeTables: ['users', 'transactions', 'orders', 'payments'],
    operations: ['INSERT', 'UPDATE', 'DELETE'] // Don't audit SELECT
  }
});

Or exclude noisy tables:

initAudit({
  appName: 'my-app',
  rules: {
    excludeTables: ['sessions', 'cache', 'temp_data'], // Skip these tables
    operations: ['INSERT', 'UPDATE', 'DELETE']
  }
});
Transport configuration:

initAudit({
  appName: 'my-app',
  transports: {
    enabled: ['console', 'database', 'file', 'http'],
    // File transport configuration
    file: {
      path: './logs/audit.log',
      maxSize: '100MB',
      maxFiles: 10
    },
    // HTTP transport (send to external service)
    http: {
      url: 'https://your-logging-service.com/api/logs',
      headers: {
        'Authorization': `Bearer ${process.env.LOGGING_API_KEY}`
      }
    }
  }
});
User context with custom metadata:

import { setUserContext } from 'panoptes-sdk';

// Set context with custom fields
setUserContext({
  userId: user.id,
  username: user.email,
  ipAddress: req.ip,
  // Add custom metadata
  metadata: {
    organizationId: user.organizationId,
    department: user.department,
    role: user.role,
    requestId: req.headers['x-request-id']
  }
});
Async mode:

initAudit({
  appName: 'my-app',
  // Process audits asynchronously (recommended for production)
  asyncMode: true,
  bufferSize: 1000, // Buffer up to 1000 audit logs before flushing
  // Flush interval (in milliseconds)
  flushInterval: 5000 // Flush every 5 seconds
});
Once you start using Panoptes, all audits are stored in the audit_logs table. Here's how to query them:
-- 50 most recent audit entries
SELECT
  operation,
  table_name,
  user_id,
  username,
  ip_address,
  created_at,
  query
FROM audit_logs
ORDER BY created_at DESC
LIMIT 50;

-- All changes made by a specific user
SELECT *
FROM audit_logs
WHERE user_id = 123
ORDER BY created_at DESC;

-- Change history for a specific row in the users table
SELECT
  operation,
  before_state,
  after_state,
  username,
  created_at
FROM audit_logs
WHERE table_name = 'users'
  AND query LIKE '%WHERE id = 123%'
ORDER BY created_at DESC;

-- Find all deletions in the last 30 days
SELECT
  table_name,
  before_state,
  username,
  ip_address,
  created_at
FROM audit_logs
WHERE operation = 'DELETE'
  AND created_at > NOW() - INTERVAL '30 days'
ORDER BY created_at DESC;
➡️ More Examples in Documentation
initAudit(config): Initializes the Panoptes audit system. Call this once at application startup.

initAudit({
  appName: 'my-app', // Required: Your application name
  environment: 'production', // Required: Environment (dev, staging, prod, etc.)
  // Optional configurations
  asyncMode: true, // Process audits asynchronously
  bufferSize: 1000, // Buffer size for async mode
  flushInterval: 5000, // Flush interval in ms
  rules: {
    includeTables: ['users'], // Only audit these tables
    excludeTables: ['cache'], // Skip these tables
    operations: ['INSERT', 'UPDATE', 'DELETE'] // Which operations to audit
  },
  transports: {
    enabled: ['console', 'database', 'file', 'http'],
    file: { path: './logs/audit.log' },
    http: { url: 'https://api.example.com/logs' }
  }
});
createAuditedPostgresClient(pool, config): Creates an audited PostgreSQL client.

const client = createAuditedPostgresClient(pool, {
  database: 'mydb',
  engine: 'postgres'
});
createAuditedMySQLClient(pool, config): Creates an audited MySQL client.

const client = createAuditedMySQLClient(pool, {
  database: 'mydb',
  engine: 'mysql'
});
createAuditedMSSQLClient(pool, config): Creates an audited MSSQL client.

const client = createAuditedMSSQLClient(pool, {
  database: 'mydb',
  engine: 'mssql'
});
createAuditedSQLiteClient(db, config): Creates an audited SQLite client.

const client = createAuditedSQLiteClient(db, {
  database: 'mydb',
  engine: 'sqlite'
});
createAuditedOracleClient(pool, config): Creates an audited Oracle client.

const client = createAuditedOracleClient(pool, {
  database: 'mydb',
  engine: 'oracle'
});
setUserContext(context): Sets user context for the current request/operation. Call this in your middleware.

setUserContext({
  userId: 123, // User ID
  username: 'john@example.com', // Username or email
  ipAddress: '192.168.1.1', // IP address
  sessionId: 'abc123', // Session ID
  userAgent: 'Mozilla/5.0...', // User agent
  // Custom metadata
  metadata: {
    organizationId: 456,
    role: 'admin'
  }
});
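For per-request context like this to be safe under concurrent requests, each request's context must be isolated rather than stored in a shared global. In Node.js the standard technique for that is AsyncLocalStorage from node:async_hooks; the sketch below shows the general pattern. This is an assumption about the common technique, not a confirmed description of how panoptes-sdk implements setUserContext, and runWithUserContext/getUserContext are hypothetical helper names.

```javascript
import { AsyncLocalStorage } from 'node:async_hooks';

// General pattern for per-request context isolation in Node.js.
// (Assumption about the technique, not the SDK's confirmed internals.)
const contextStore = new AsyncLocalStorage();

function runWithUserContext(context, fn) {
  // Everything called inside fn — including awaited async work —
  // sees this context, without leaking into other requests.
  return contextStore.run(context, fn);
}

function getUserContext() {
  // Returns the context of the current async execution chain,
  // or undefined when called outside runWithUserContext.
  return contextStore.getStore();
}
```

This is why calling setUserContext in middleware works: each request's context travels with that request's async call chain down to the query layer.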
Version: 0.2.0
Node.js: >=18

FAQs

Do I need to create the audit table manually?
No! Panoptes automatically creates the audit_logs table on first run.

What is the performance impact?
In async mode (recommended for production), auditing happens in the background with minimal performance impact. Typical overhead is <5ms per query.

Does it work with ORMs?
Yes, but you'll need to wrap the underlying database connection. See the Integrations guide for examples.

How is the audit table schema managed?
The audit table is created automatically. For schema changes, see the Migration guide.

Can I use a custom audit table name?
Yes, you can configure this in initAudit():

initAudit({
  appName: 'my-app',
  tableName: 'my_custom_audit_table'
});
MIT License - Free for personal and commercial use.
Need help? Here's how to get support:
Made by malydev
If Panoptes SDK helps your project, consider giving it a ⭐ on GitHub!
📚 Read Full Documentation • 🐛 Report Bug • 💡 Request Feature