
supabase-test
supabase-test offers isolated, role-aware, and rollback-friendly PostgreSQL environments for integration tests with Supabase defaults baked in
supabase-test is a Supabase-optimized version of pgsql-test with Supabase defaults baked in. It provides instant, isolated PostgreSQL databases for testing with automatic transaction rollbacks, context switching, and clean seeding — configured for Supabase's local development environment. It's also great for GitHub Actions and CI/CD testing.
Explore a full working example (including GitHub Actions CI/CD) in the supabase-test-suite repo.
npm install supabase-test
Features at a glance:
- Context switching – simulate user roles and JWT claims with setContext()
- Flexible seeding – .sql files, programmatic seeds, or even load fixtures
- Test-runner agnostic – works with Jest, Mocha, etc.

📚 Learn how to test with Supabase →
getConnections() Overview

import { getConnections } from 'supabase-test';
let db, teardown;
beforeAll(async () => {
({ db, teardown } = await getConnections());
await db.query(`SELECT 1`); // ✅ Ready to run queries
});
afterAll(() => teardown());
import { getConnections } from 'supabase-test';
// Complete object destructuring
const { pg, db, admin, teardown, manager } = await getConnections();
// Most common pattern
const { db, teardown } = await getConnections();
The getConnections() helper sets up a fresh PostgreSQL test database and returns a structured object with:
- pg: a PgTestClient connected as the root or superuser — useful for administrative setup or introspection
- db: a PgTestClient connected as the app-level user — used for running tests with RLS and granted permissions
- admin: a DbAdmin utility for managing database state, extensions, roles, and templates
- teardown(): a function that shuts down the test environment and database pool
- manager: a shared connection pool manager (PgTestConnector) behind both clients

Together, these allow fast, isolated, role-aware test environments with per-test rollback and full control over setup and teardown.
The PgTestClient returned by getConnections() is a fully-featured wrapper around pg.Pool. It provides:
PgTestClient API Overview

let pg: PgTestClient;
let teardown: () => Promise<void>;
beforeAll(async () => {
({ pg, teardown } = await getConnections());
});
beforeEach(() => pg.beforeEach());
afterEach(() => pg.afterEach());
afterAll(() => teardown());
The PgTestClient returned by getConnections() wraps a pg.Pool and provides convenient helpers for query execution, test isolation, and context switching.
- query(sql, values?) – Run a raw SQL query and get the QueryResult
- beforeEach() – Begins a transaction and sets a savepoint (called at the start of each test)
- afterEach() – Rolls back to the savepoint and commits the outer transaction (cleans up test state)
- setContext({ key: value }) – Sets PostgreSQL config variables (like role) to simulate RLS contexts
- any, one, oneOrNone, many, manyOrNone, none, result – Typed query helpers for specific result expectations

These methods make it easier to build expressive and isolated integration tests with strong typing and error handling.
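To illustrate the row-count contracts behind the typed helpers, here is a standalone sketch. This is not the library's implementation — it simply mirrors the pg-promise-style semantics the helper names suggest:

```typescript
// Standalone sketch of the row-count contracts behind the typed helpers.
// Illustration only; NOT supabase-test's actual code.
function one<T>(rows: T[]): T {
  if (rows.length !== 1) throw new Error(`expected exactly 1 row, got ${rows.length}`);
  return rows[0];
}

function oneOrNone<T>(rows: T[]): T | null {
  if (rows.length > 1) throw new Error(`expected at most 1 row, got ${rows.length}`);
  return rows.length === 1 ? rows[0] : null;
}

function many<T>(rows: T[]): T[] {
  if (rows.length === 0) throw new Error('expected at least 1 row, got 0');
  return rows;
}

function none<T>(rows: T[]): void {
  if (rows.length !== 0) throw new Error(`expected 0 rows, got ${rows.length}`);
}
```

Under these semantics, a call like db.one('SELECT * FROM users WHERE id = 1') resolves to a single row object (not an array) or rejects, which keeps assertions in tests tight.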
Putting it all together, here is a complete example with schema setup and per-test rollback:
import { getConnections } from 'supabase-test';
let db; // A fully wrapped PgTestClient using pg.Pool with savepoint-based rollback per test
let teardown;
beforeAll(async () => {
({ db, teardown } = await getConnections());
await db.query(`
CREATE TABLE users (id SERIAL PRIMARY KEY, name TEXT);
CREATE TABLE posts (id SERIAL PRIMARY KEY, user_id INT REFERENCES users(id), content TEXT);
INSERT INTO users (name) VALUES ('Alice'), ('Bob');
INSERT INTO posts (user_id, content) VALUES (1, 'Hello world!'), (2, 'Graphile is cool!');
`);
});
afterAll(() => teardown());
beforeEach(() => db.beforeEach());
afterEach(() => db.afterEach());
test('user count starts at 2', async () => {
const res = await db.query('SELECT COUNT(*) FROM users');
expect(res.rows[0].count).toBe('2');
});
The supabase-test framework provides powerful tools to simulate authentication contexts during tests, which is particularly useful when testing Row-Level Security (RLS) policies.
Use setContext() to simulate different user roles and JWT claims:
db.setContext({
role: 'authenticated',
'jwt.claims.user_id': '123',
'jwt.claims.org_id': 'acme'
});
This applies the settings using SET LOCAL statements, ensuring they persist only for the current transaction and maintain proper isolation between tests.
describe('authenticated role', () => {
beforeEach(async () => {
db.setContext({ role: 'authenticated' });
await db.beforeEach();
});
afterEach(() => db.afterEach());
it('runs as authenticated', async () => {
const res = await db.query(`SELECT current_setting('role', true) AS role`);
expect(res.rows[0].role).toBe('authenticated');
});
});
For non-superuser testing, use the connection options described in the options section. The db.connection option lets you customize the non-privileged user account used in your tests.
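For instance, a sketch of such options (the user, password, and role values below are placeholders, not required defaults):

```typescript
// Hypothetical options object for getConnections(); values are placeholders.
// Pass it as the first argument: await getConnections(options)
const options = {
  db: {
    connection: {
      user: 'test_user',      // non-privileged user the tests connect as
      password: 'secret',
      role: 'authenticated',  // default role applied via setContext()
    },
  },
};
```

With an options object like this, db should connect as test_user and apply the authenticated role by default, so RLS policies are actually exercised rather than bypassed by a superuser.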
Use setContext() to simulate Role-Based Access Control (RBAC) during tests. This is useful when testing Row-Level Security (RLS) policies. Your actual server should manage role/user claims via secure tokens (e.g., setting current_setting('jwt.claims.user_id')), but this interface helps emulate those behaviors in test environments.
This approach enables testing a range of access patterns, such as anonymous vs. authenticated roles and per-user row visibility.
Note: While this interface helps simulate RBAC for testing, your production server should manage user/role claims via secure authentication tokens, typically by setting values like current_setting('jwt.claims.user_id') through proper authentication middleware.
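To make this concrete, here is a hypothetical RLS policy that consumes the jwt.claims.user_id setting that setContext() simulates. The table and policy names are illustrative; you would apply this during setup with the superuser pg client:

```typescript
// Hypothetical policy: users can only read their own posts.
// Apply during setup with the superuser client, e.g. `await pg.query(ownPostsPolicy)`.
const ownPostsPolicy = `
  ALTER TABLE posts ENABLE ROW LEVEL SECURITY;

  CREATE POLICY posts_owner ON posts
    FOR SELECT
    USING (user_id::text = current_setting('jwt.claims.user_id', true));
`;
```

After db.setContext({ role: 'authenticated', 'jwt.claims.user_id': '1' }), a SELECT on posts through db would be expected to return only rows where user_id = 1, assuming the connected role is subject to RLS.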
The second argument to getConnections() is an optional array of SeedAdapter objects:
const { db, teardown } = await getConnections(getConnectionOptions, seedAdapters);
This array lets you fully customize how your test database is seeded. You can compose multiple strategies:
- seed.sqlfile() – Execute raw .sql files from disk
- seed.fn() – Run JavaScript/TypeScript logic to programmatically insert data
- seed.csv() – Load tabular data from CSV files
- seed.json() – Use in-memory objects as seed data
- seed.pgpm() – Apply a PGPM project or set of packages (compatible with sqitch)

✨ Default Behavior: If no SeedAdapter[] is passed, pgpm seeding is assumed. This makes supabase-test zero-config for pgpm-based projects.
This composable system allows you to mix-and-match data setup strategies for flexible, realistic, and fast database tests.
You can seed data using either approach:
1. Adapter Pattern (setup phase via getConnections)
const { db, teardown } = await getConnections({}, [
seed.json({ 'users': [{ id: 1, name: 'Alice' }] })
]);
2. Direct Load Methods (runtime via PgTestClient)
await db.loadJson({ 'users': [{ id: 1, name: 'Alice' }] });
await db.loadCsv({ 'users': '/path/to/users.csv' });
await db.loadSql(['/path/to/schema.sql']);
Note: loadCsv() and loadPgpm() do not apply RLS context (PostgreSQL limitation). Use loadJson() or loadSql() for RLS-aware seeding.
Adapter Pattern:
const { db, teardown } = await getConnections({}, [
seed.sqlfile(['schema.sql', 'fixtures.sql'])
]);
Direct Load Method:
await db.loadSql(['schema.sql', 'fixtures.sql']);
import path from 'path';
import { getConnections, seed } from 'supabase-test';
const sql = (f: string) => path.join(__dirname, 'sql', f);
let db;
let teardown;
beforeAll(async () => {
({ db, teardown } = await getConnections({}, [
seed.sqlfile([
sql('schema.sql'),
sql('fixtures.sql')
])
]));
});
afterAll(async () => {
await teardown();
});
Adapter Pattern:
const { db, teardown } = await getConnections({}, [
seed.fn(async ({ pg }) => {
await pg.query(`INSERT INTO users (name) VALUES ('Seeded User')`);
})
]);
Direct Load Method:
// Use any PgTestClient method directly
await db.query(`INSERT INTO users (name) VALUES ('Seeded User')`);
import { getConnections, seed } from 'supabase-test';
let db;
let teardown;
beforeAll(async () => {
({ db, teardown } = await getConnections({}, [
seed.fn(async ({ pg }) => {
await pg.query(`
INSERT INTO users (name) VALUES ('Seeded User');
`);
})
]));
});
Adapter Pattern:
const { db, teardown } = await getConnections({}, [
seed.csv({
'users': '/path/to/users.csv',
'posts': '/path/to/posts.csv'
})
]);
Direct Load Method:
await db.loadCsv({
'users': '/path/to/users.csv',
'posts': '/path/to/posts.csv'
});
Note: CSV loading uses PostgreSQL COPY which does not support RLS context.
You can load tables from CSV files using seed.csv({ ... }). CSV headers must match the table column names exactly. This is useful for loading stable fixture data for integration tests or CI environments.
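For instance, a users.csv matching the users(id, name) table used in the example below might look like this (illustrative contents):

```csv
id,name
1,Alice
2,Bob
```

Note that the header row (id,name) must match the table's column names exactly, or the load will fail.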
import path from 'path';
import { getConnections, seed } from 'supabase-test';
const csv = (file: string) => path.resolve(__dirname, '../csv', file);
let db;
let teardown;
beforeAll(async () => {
({ db, teardown } = await getConnections({}, [
// Create schema
seed.fn(async ({ pg }) => {
await pg.query(`
CREATE TABLE users (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL
);
CREATE TABLE posts (
id SERIAL PRIMARY KEY,
user_id INT REFERENCES users(id),
content TEXT NOT NULL
);
`);
}),
// Load from CSV
seed.csv({
users: csv('users.csv'),
posts: csv('posts.csv')
}),
// Adjust SERIAL sequences to avoid conflicts
seed.fn(async ({ pg }) => {
await pg.query(`SELECT setval(pg_get_serial_sequence('users', 'id'), (SELECT MAX(id) FROM users));`);
await pg.query(`SELECT setval(pg_get_serial_sequence('posts', 'id'), (SELECT MAX(id) FROM posts));`);
})
]));
});
afterAll(() => teardown());
it('has loaded rows', async () => {
const res = await db.query('SELECT COUNT(*) FROM users');
expect(+res.rows[0].count).toBeGreaterThan(0);
});
Adapter Pattern:
const { db, teardown } = await getConnections({}, [
seed.json({
'custom.users': [
{ id: 1, name: 'Alice' },
{ id: 2, name: 'Bob' }
]
})
]);
Direct Load Method:
await db.loadJson({
'custom.users': [
{ id: 1, name: 'Alice' },
{ id: 2, name: 'Bob' }
]
});
You can seed tables using in-memory JSON objects. This is useful when you want fast, inline fixtures without managing external files.
import { getConnections, seed } from 'supabase-test';
let db;
let teardown;
beforeAll(async () => {
({ db, teardown } = await getConnections({}, [
// Create schema
seed.fn(async ({ pg }) => {
await pg.query(`
CREATE SCHEMA custom;
CREATE TABLE custom.users (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL
);
CREATE TABLE custom.posts (
id SERIAL PRIMARY KEY,
user_id INT REFERENCES custom.users(id),
content TEXT NOT NULL
);
`);
}),
// Seed with in-memory JSON
seed.json({
'custom.users': [
{ id: 1, name: 'Alice' },
{ id: 2, name: 'Bob' }
],
'custom.posts': [
{ id: 1, user_id: 1, content: 'Hello world!' },
{ id: 2, user_id: 2, content: 'Graphile is cool!' }
]
}),
// Fix SERIAL sequences
seed.fn(async ({ pg }) => {
await pg.query(`SELECT setval(pg_get_serial_sequence('custom.users', 'id'), (SELECT MAX(id) FROM custom.users));`);
await pg.query(`SELECT setval(pg_get_serial_sequence('custom.posts', 'id'), (SELECT MAX(id) FROM custom.posts));`);
})
]));
});
afterAll(() => teardown());
it('has loaded rows', async () => {
const res = await db.query('SELECT COUNT(*) FROM custom.users');
expect(+res.rows[0].count).toBeGreaterThan(0);
});
Zero Configuration (Default):
// pgpm migrate is used automatically
const { db, teardown } = await getConnections();
Adapter Pattern (Custom Path):
const { db, teardown } = await getConnections({}, [
seed.pgpm('/path/to/your-pgpm-workspace', true) // with cache
]);
Direct Load Method:
await db.loadPgpm('/path/to/your-pgpm-workspace', true); // with cache
Note: pgpm deployment has its own client handling and does not apply RLS context.
If your project uses pgpm modules with a precompiled pgpm.plan, you can use supabase-test with zero configuration. Just call getConnections() — and it just works:
import { getConnections } from 'supabase-test';
let db, teardown;
beforeAll(async () => {
({ db, teardown } = await getConnections()); // pgpm module is deployed automatically
});
pgpm uses Sqitch-compatible syntax with a TypeScript-based migration engine. By default, supabase-test automatically deploys any pgpm module found in the current working directory (process.cwd()).
To specify a custom path to your pgpm module, use seed.pgpm() explicitly:
import path from 'path';
import { getConnections, seed } from 'supabase-test';
const cwd = path.resolve(__dirname, '../path/to/pgpm-workspace');
beforeAll(async () => {
({ db, teardown } = await getConnections({}, [
seed.pgpm(cwd)
]));
});
pgpm provides the best of both worlds: by maintaining Sqitch compatibility while supercharging performance, it lets you keep your existing migration patterns while enjoying the speed benefits of the TypeScript engine.
getConnections Options

The tables below document the available options for the getConnections function. The options are passed as a combination of pg and db configuration objects.
db Options (PgTestConnectionOptions)

| Option | Type | Default | Description |
|---|---|---|---|
| db.extensions | string[] | [] | Array of PostgreSQL extensions to include in the test database |
| db.cwd | string | process.cwd() | Working directory used for PGPM or Sqitch projects |
| db.connection.user | string | 'app_user' | User for simulating RLS via setContext() |
| db.connection.password | string | 'app_password' | Password for RLS test user |
| db.connection.role | string | 'anonymous' | Default role used during setContext() |
| db.template | string | undefined | Template database used for faster test DB creation |
| db.rootDb | string | 'postgres' | Root database used for administrative operations (e.g., creating databases) |
| db.prefix | string | 'db-' | Prefix used when generating test database names |
pg Options (PgConfig)

Environment variables will override these options when available: PGHOST, PGPORT, PGUSER, PGPASSWORD, PGDATABASE.

| Option | Type | Default | Description |
|---|---|---|---|
| pg.user | string | 'postgres' | Superuser for PostgreSQL |
| pg.password | string | 'password' | Password for the PostgreSQL superuser |
| pg.host | string | 'localhost' | Hostname for PostgreSQL |
| pg.port | number | 5423 | Port for PostgreSQL |
| pg.database | string | 'postgres' | Default database used when connecting initially |
const { pg, db, teardown } = await getConnections({
pg: { user: 'postgres', password: 'secret' },
db: {
extensions: ['uuid-ossp'],
cwd: '/path/to/project',
connection: { user: 'test_user', password: 'secret', role: 'authenticated' },
template: 'test_template',
prefix: 'test_',
rootDb: 'postgres'
}
});
The supabase-test/utils module provides utilities for sanitizing query results for snapshot testing. These helpers replace dynamic values (IDs, UUIDs, dates, hashes) with stable placeholders, making snapshots deterministic.
import { snapshot } from 'supabase-test/utils';
const result = await db.any('SELECT * FROM users');
expect(snapshot(result)).toMatchSnapshot();
See pgsql-test Snapshot Utilities for the full API reference.
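To give a sense of what sanitization means, here is a standalone illustration of the idea. This is not the library's implementation, and the real helpers cover more value types; it only shows why snapshots become deterministic:

```typescript
// Illustration only: replace volatile values with stable placeholders
// so snapshots stay identical across test runs.
const UUID_RE = /^[0-9a-f]{8}-(?:[0-9a-f]{4}-){3}[0-9a-f]{12}$/i;
const ISO_DATE_RE = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}/;

function sanitizeRow(row: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(row)) {
    if (typeof value === 'string' && UUID_RE.test(value)) {
      out[key] = '[UUID]';            // generated identifier
    } else if (value instanceof Date || (typeof value === 'string' && ISO_DATE_RE.test(value))) {
      out[key] = '[DATE]';            // timestamp that changes every run
    } else {
      out[key] = value;               // stable value, keep as-is
    }
  }
  return out;
}
```

A row like { id: '123e4567-…', name: 'Alice', created_at: '2024-01-01T00:00:00Z' } sanitizes to { id: '[UUID]', name: 'Alice', created_at: '[DATE]' }, so only meaningful differences show up in snapshot diffs.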
🚀 Quickstart: Getting Up and Running Get started with modular databases in minutes. Install prerequisites and deploy your first module.
📦 Modular PostgreSQL Development with Database Packages Learn to organize PostgreSQL projects with pgpm workspaces and reusable database modules.
✏️ Authoring Database Changes Master the workflow for adding, organizing, and managing database changes with pgpm.
🧪 End-to-End PostgreSQL Testing with TypeScript Master end-to-end PostgreSQL testing with ephemeral databases, RLS testing, and CI/CD automation.
⚡ Supabase Testing Use TypeScript-first tools to test Supabase projects with realistic RLS, policies, and auth contexts.
💧 Drizzle ORM Testing Run full-stack tests with Drizzle ORM, including database setup, teardown, and RLS enforcement.
🔧 Troubleshooting Common issues and solutions for pgpm, PostgreSQL, and testing.
Related packages from the same ecosystem include a context-injection helper that wraps queries with SET LOCAL (ideal for setting role, jwt.claims, and other session settings) and a SQL parser built on libpg_query that converts SQL into parse trees.

🛠 Built by the Constructive team — creators of modular Postgres tooling for secure, composable backends. If you like our work, contribute on GitHub.
AS DESCRIBED IN THE LICENSES, THE SOFTWARE IS PROVIDED "AS IS", AT YOUR OWN RISK, AND WITHOUT WARRANTIES OF ANY KIND.
No developer or entity involved in creating this software will be liable for any claims or damages whatsoever associated with your use, inability to use, or your interaction with other users of the code, including any direct, indirect, incidental, special, exemplary, punitive or consequential damages, or loss of profits, cryptocurrencies, tokens, or anything else of value.