
A tiny wrapper around pg that makes PostgreSQL a lot
of fun to use. Written in TypeScript.
```js
import Pg from '@mojojs/pg';

// Connect with a standard PostgreSQL connection URI
const pg = new Pg('postgres://user:password@localhost:5432/database');

// Run a single query, interpolated values are passed to PostgreSQL as placeholders
const results = await pg.query`SELECT ${'One'} AS one`;
for (const row of results) {
  console.log(row.one);
}

// Check out a single connection from the pool and return it when done
const db = await pg.db();
const users = await db.query`SELECT * FROM users`;
const roles = await db.query`SELECT * FROM roles`;
await db.release();
```
Tagged template literals are used everywhere to protect from SQL injection attacks and to make syntax highlighting
easy.
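To make the protection concrete, here is a minimal sketch of what that means in practice, assuming interpolated values are sent as bound parameters rather than spliced into the SQL string (the `rawQuery()` form further below makes the placeholders explicit):

```js
// A hostile value cannot change the query structure, because PostgreSQL only
// ever sees it as a parameter value, never as SQL text
const name = "Robert'; DROP TABLE users;--";
const results = await pg.query`SELECT * FROM users WHERE name = ${name}`;

// Roughly equivalent to the explicit placeholder form
const same = await pg.rawQuery('SELECT * FROM users WHERE name = $1', name);
```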
## Examples
This distribution also contains a great example you can use for inspiration. The well-structured
blog application will show you how to apply the MVC
design pattern in practice.
## TypeScript

TypeScript is fully supported; just pass along a type with your query. This works for all query methods.
```ts
interface User {
  id: number;
  name: string;
}

const results = await pg.query<User>`SELECT * FROM users`;
for (const {id, name} of results) {
  console.log(`${id}: ${name}`);
}
```
All APIs are designed to be compatible with
explicit resource management. And with TypeScript you
can already use the feature in production.
```ts
// The connection, and the transaction if it has not been committed yet, are
// cleaned up automatically when these variables go out of scope
await using pg = new Pg('postgres://user:password@localhost:5432/database');
await using db = await pg.db();

try {
  await using tx = await db.begin();
  await db.query`INSERT INTO users (name) VALUES ('sri')`;
  await db.query`this_is_an_error`;
  await tx.commit();
} catch (e) {
  console.warn('Something went wrong with the transaction');
}
```
## SQL building

For easier SQL query building with partials, there are also `pg.sql` and `db.sql` tagged template literals (provided by @mojojs/sql). They can be used recursively to build complex queries securely.
```js
const role = 'admin';
const partialQuery = pg.sql`AND role = ${role}`;

const name = 'root';
const results = await pg.query`SELECT * FROM users WHERE name = ${name} ${partialQuery}`;
```
But if you need a little more control over the generated SQL query, you can of course also bypass the safety features with the tagged template literals `pg.sqlUnsafe` and `db.sqlUnsafe`. Just make sure to escape unsafe values yourself, with methods like `pg.escapeLiteral()`.
```js
const role = 'role = ' + pg.escapeLiteral('power user');
const partialQuery = pg.sqlUnsafe`AND ${role}`;

const name = 'root';
const results = await pg.query`SELECT * FROM users WHERE name = ${name} ${partialQuery}`;
```
And if you want to do complex things like reusing the same placeholder in multiple places, there are also `pg.rawQuery()` and `db.rawQuery()` available.
```js
const results = await pg.rawQuery('SELECT * FROM users WHERE name = $1 AND login = $1', 'Sara');
```
## Transactions

It's best to use `try`/`finally` blocks whenever you dequeue a connection with `pg.db()`, to ensure efficient resource management.
```js
const db = await pg.db();
try {
  const tx = await db.begin();
  try {
    for (const user of ['Daniel', 'Isabell']) {
      await db.query`INSERT INTO users (name) VALUES (${user})`;
    }
    await tx.commit();
  } finally {
    await tx.rollback();
  }
} finally {
  await db.release();
}
```
The `tx.rollback()` call does nothing if `tx.commit()` has been called first.
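If you use this pattern in several places, it can be factored into a small helper. This is only a sketch built on the calls shown above; the `withTransaction` name and the `work` callback are illustrative and not part of the @mojojs/pg API:

```js
// Check out a connection, run `work` inside a transaction, and always clean up
async function withTransaction(pg, work) {
  const db = await pg.db();
  try {
    const tx = await db.begin();
    try {
      const result = await work(db);
      await tx.commit();
      return result;
    } finally {
      // Safe to call unconditionally, does nothing after a successful commit
      await tx.rollback();
    }
  } finally {
    await db.release();
  }
}

// Usage
await withTransaction(pg, async db => {
  await db.query`INSERT INTO users (name) VALUES (${'Daniel'})`;
  await db.query`INSERT INTO users (name) VALUES (${'Isabell'})`;
});
```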
## Migrations

To manage your database schema, there is also a minimal SQL-based migration system built in. A migration file is just a collection of SQL blocks, with one or more statements, separated by comments of the form `-- VERSION UP/DOWN`.
```sql
-- 1 up
CREATE TABLE messages (message TEXT);
INSERT INTO messages VALUES ('I ♥ Mojolicious!');
-- 1 down
DROP TABLE messages;

-- 2 up
CREATE TABLE stuff (whatever INT);
-- 2 down
DROP TABLE stuff;
```
The idea is to let you migrate from any version, to any version, up and down. Migrations are very safe, because they are performed in transactions and only one can be performed at a time. If a single statement fails, the whole migration will fail and get rolled back. Every set of migrations has a name, which is stored together with the currently active version in an automatically created table named `mojo_migrations`.
```js
import Path from '@mojojs/path';

// Load migrations from a file and migrate up to the latest version
await pg.migrations.fromFile(Path.currentFile().sibling('migrations', 'myapp.sql'), {name: 'myapp'});
await pg.migrations.migrate();

// Migrate down to version 0, then back up to the latest version again
await pg.migrations.migrate(0);
await pg.migrations.migrate();

// Migrations can also be loaded from a string or a directory
pg.migrations.fromString('-- 1 up\n...', {name: 'my_other_app'});
await pg.migrations.fromDirectory(Path.currentFile().sibling('migrations'), {name: 'yet_another_app'});
```
To store your individual migration steps in separate SQL files you can use a directory structure like this. These files
do not require special comments, because the version and migration direction are contained in the file names.
```
migrations
|-- 1
|   |-- up.sql
|   `-- down.sql
|-- 2
|   `-- up.sql
|-- 4
|   |-- up.sql
|   `-- down.sql
`-- 5
    |-- up.sql
    `-- down.sql
```
Migrations are also compatible with Mojo::Pg, if you want to mix Perl and
JavaScript code.
## Notifications
You can use events as well as async iterators for notifications.
```js
const db = await pg.db();

// Send a notification
await db.notify('foo', 'just a message');

// Receive notifications with an async iterator
await db.listen('foo');
for await (const message of db) {
  console.log(`${message.channel}: ${message.payload}`);
  break;
}
await db.unlisten('foo');

// Or receive notifications with an event listener
await db.listen('bar');
db.on('notification', (message) => {
  console.log(`${message.channel}: ${message.payload}`);
});
await db.unlisten('bar');
```
## Errors
Since the default exceptions thrown by pg for query errors are often not very
helpful, we expand them with context information, like the position in the SQL query and the file/line the query
originated from.
```
$ node sql-error.js
/home/sri/pg.js/node_modules/pg-protocol/dist/parser.js:287
        const message = name === 'notice' ? new messages_1.NoticeMessage(length, messageValue) : new messages_1.DatabaseError(messageValue, length, name);
                                                                                                  ^

error: relation "users" does not exist
    Line 1: SELECT * FROM users
                          ^ at sql-error.js line 4
    at Parser.parseErrorMessage (/home/sri/pg.js/node_modules/pg-protocol/dist/parser.js:287:98)
    at Parser.handlePacket (/home/sri/pg.js/node_modules/pg-protocol/dist/parser.js:126:29)
    at Parser.parse (/home/sri/pg.js/node_modules/pg-protocol/dist/parser.js:39:38)
    at Socket.<anonymous> (/home/sri/pg.js/node_modules/pg-protocol/dist/index.js:11:42)
    at Socket.emit (node:events:537:28)
    at addChunk (node:internal/streams/readable:324:12)
    at readableAddChunk (node:internal/streams/readable:297:9)
    at Readable.push (node:internal/streams/readable:234:10)
    at TCP.onStreamRead (node:internal/stream_base_commons:190:23) {
  length: 104,
  severity: 'ERROR',
  code: '42P01',
  ...
```
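These errors are still ordinary exceptions, so you can also handle them in code. A minimal sketch, assuming the fields shown in the output above (`code`, `severity`) are present on the thrown error, as they are on the underlying pg `DatabaseError`:

```js
try {
  await pg.query`SELECT * FROM users`;
} catch (error) {
  // "42P01" is the PostgreSQL error code for an undefined table
  if (error.code === '42P01') {
    console.warn('Table is missing, did you forget to run the migrations?');
  } else {
    throw error;
  }
}
```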
## Introspection

You can set the `MOJO_PG_DEBUG` environment variable to get all SQL queries printed to `STDERR`.
```
$ MOJO_PG_DEBUG=1 node myapp.js
INSERT INTO users (name) VALUES ($1)
...
```
## Editor Support
## Installation
All you need is Node.js 16.0.0 (or newer).
```
$ npm install @mojojs/pg
```
## Support
If you have any questions the documentation might not yet answer, don't hesitate to ask in the
Forum, on Matrix, or
IRC.