The 'postgres' npm package is a modern, high-performance PostgreSQL client for Node.js. It provides a simple and efficient way to interact with PostgreSQL databases, supporting both basic and advanced database operations.
Connecting to a Database
This feature allows you to establish a connection to a PostgreSQL database using a connection string.
const postgres = require('postgres');
const sql = postgres('postgres://username:password@localhost:5432/mydatabase');
Executing Queries
You can execute SQL queries using tagged template literals, which helps in preventing SQL injection attacks.
const result = await sql`SELECT * FROM users WHERE id = ${userId}`;
Parameterized Queries
This feature allows you to use parameterized queries to safely pass variables into your SQL statements.
const result = await sql`SELECT * FROM users WHERE age > ${age}`;
Transactions
You can perform multiple database operations within a transaction to ensure atomicity.
await sql.begin(async sql => {
await sql`INSERT INTO users (name) VALUES (${name})`;
await sql`INSERT INTO profiles (user_id, bio) VALUES (${userId}, ${bio})`;
});
Connection Pooling
The package supports connection pooling to manage multiple database connections efficiently.
const sql = postgres({
host: 'localhost',
port: 5432,
username: 'username',
password: 'password',
database: 'mydatabase',
max: 10 // maximum number of connections in the pool
});
The 'pg' package is the official PostgreSQL client for Node.js. It is widely used and provides a comprehensive set of features for interacting with PostgreSQL databases. Compared to 'postgres', 'pg' is more mature and has a larger community, but 'postgres' offers a more modern API and better performance in some cases.
The 'knex' package is a SQL query builder for Node.js that supports multiple database engines, including PostgreSQL. It provides a flexible and powerful API for building and executing SQL queries. While 'knex' is more versatile due to its support for various databases, 'postgres' is more specialized and optimized for PostgreSQL.
The 'sequelize' package is a promise-based ORM for Node.js that supports PostgreSQL, MySQL, SQLite, and MSSQL. It provides a higher-level abstraction for database operations, including model definitions and associations. 'sequelize' is more feature-rich and suitable for complex applications, but 'postgres' offers a simpler and more direct approach to interacting with PostgreSQL.
Install
$ npm install postgres
Use
const postgres = require('postgres')
const sql = postgres({ ...options }) // will default to the same as psql
await sql`
select name, age from users
`
// > [{ name: 'Murray', age: 68 }, { name: 'Walter', age: 78 }]
postgres([url], [options])
You can use either a postgres:// url connection string or the options to define your database connection properties. Options in the object will override any present in the url.
const sql = postgres('postgres://username:password@host:port/database', {
host : '', // Postgres ip address or domain name
port : 5432, // Postgres server port
path : '', // unix socket path (usually '/tmp')
database : '', // Name of database to connect to
username : '', // Username of database user
password : '', // Password of database user
ssl : false, // True, or options for tls.connect
max : 10, // Max number of connections
timeout : 0, // Idle connection timeout in seconds
types : [], // Array of custom types, see more below
onnotice : fn, // Defaults to console.log
onparameter : fn, // Called with (key, value) when a server param changes
debug : fn, // Is called with (connection, query, parameters)
transform : {
column : fn, // Transforms incoming column names
value : fn, // Transforms incoming row values
row : fn // Transforms entire rows
},
connection : {
application_name : 'postgres.js', // Default application_name
... // Other connection parameters
}
})
More info for the ssl option can be found in the Node.js docs for tls connect options.
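Since ssl accepts the same options as tls.connect, a self-signed certificate could for instance be allowed like this (a sketch; rejectUnauthorized is a standard tls.connect option):
const sql = postgres('postgres://username:password@host:5432/database', {
  ssl: {
    rejectUnauthorized: false // tls.connect option: accept self-signed certificates
  }
})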
sql` ` -> Promise
A query will always return a Promise which resolves to a results array [...]{ rows, command }. Destructuring is great to immediately access the first element.
const [new_user] = await sql`
insert into users (
name, age
) values (
'Murray', 68
)
returning *
`
// new_user = { user_id: 1, name: 'Murray', age: 68 }
Parameters are automatically inferred and handled by Postgres so that SQL injection isn't possible. No special handling is necessary, simply use JS tagged template literals as usual.
let search = 'Mur'
const users = await sql`
select
name,
age
from users
where
name like ${ search + '%' }
`
// users = [{ name: 'Murray', age: 68 }]
sql` `.stream(fn) -> Promise
If you want to handle rows returned by a query one by one, you can use .stream, which returns a promise that resolves once there are no more rows.
await sql`
select created_at, name from events
`.stream(row => {
// row = { created_at: '2019-11-22T14:22:00Z', name: 'connected' }
})
// No more rows
When you call listen, a dedicated connection will automatically be made to ensure that you receive notifications in real time. This connection will be used for any further calls to listen. Listen returns a promise which resolves once the LISTEN query to Postgres completes, or if there is already a listener active.
await sql.listen('news', payload => {
const json = JSON.parse(payload)
console.log(json.this) // logs 'is'
})
Notify can be done as usual in sql, or by using the sql.notify method.
sql.notify('news', JSON.stringify({ no: 'this', is: 'news' }))
sql``
Tagged template functions are not just ordinary template literal strings. They allow the function to handle any parameters within before interpolation. This means that they can be used to enforce a safe way of writing queries, which is what Postgres.js does. Any generic value will be serialized according to an inferred type, replaced by PostgreSQL protocol placeholders ($1, $2, ...) and then sent to the database as a parameter to let it handle any need for escaping / casting.
This also means you cannot write dynamic queries or concatenate queries together by simple string manipulation. To enable dynamic queries in a safe way, the sql function doubles as a regular function which escapes any value properly. It also includes overloads for common cases of inserting, selecting, updating and querying.
sql() inside tagged template
Postgres.js has a safe, ergonomic way to aid you in writing queries. This makes it easier to write dynamic inserts, selects, updates and where queries.
const user = {
name: 'Murray',
age: 68
}
sql`
insert into users ${
sql(user, 'name', 'age')
}
`
// Is translated into this query:
insert into users (name, age) values ($1, $2)
You can leave out the column names and simply do sql(user) if you want to get all fields from the object as columns, but be careful not to allow users to supply columns you don't want.
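One way to guard against that is to always pass an explicit column whitelist (a sketch; the allowed array is illustrative, not part of the library):
// Only the whitelisted keys are read from the object, whatever else it contains
const allowed = ['name', 'age']

sql`
  insert into users ${
    sql(user, ...allowed)
  }
`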
If you need to insert multiple rows at the same time, it's also much faster to do it with a single insert. Simply pass an array of objects to sql().
const users = [{
name: 'Murray',
age: 68,
garbage: 'ignore'
}, {
name: 'Walter',
age: 78
}]
sql`
insert into users ${
sql(users, 'name', 'age')
}
`
This is also useful for update queries
const user = {
id: 1,
name: 'Murray'
}
sql`
update users set ${
sql(user, 'name')
} where
id = ${ user.id }
`
// Is translated into this query:
update users set name = $1 where id = $2
The same helper also works for dynamic select queries:
const columns = ['name', 'age']
sql`
select ${
sql(columns)
} from users
`
// Is translated into this query:
select name, age from users
sql.array(Array)
PostgreSQL has a native array type which is similar to js arrays, but only allows the same type and shape for nested items. This method automatically infers the item type and serializes js arrays into PostgreSQL arrays.
const types = sql`
insert into types (
integers,
strings,
dates,
buffers,
multi
) values (
${ sql.array([1,2,3,4,5]) },
${ sql.array(['Hello', 'Postgres']) },
${ sql.array([new Date(), new Date(), new Date()]) },
${ sql.array([Buffer.from('Hello'), Buffer.from('Postgres')]) },
${ sql.array([[[1,2],[3,4]],[[5,6],[7,8]]]) }
)
`
sql.json(object)
Serializes a value to JSON for insertion into json or jsonb columns.
const body = { hello: 'postgres' }
const [{ body: json }] = await sql`
insert into json (
body
) values (
${ sql.json(body) }
)
returning body
`
// json = { hello: 'postgres' }
sql.file(path, [args], [options]) -> Promise
Using an .sql file for a query. The contents will be cached in memory so that the file is only read once.
sql.file(path.join(__dirname, 'query.sql'), [], {
cache: true // Default true - disable for single shot queries or memory reasons
})
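A usage sketch, assuming a hypothetical query.sql next to the script that contains a parameterized statement such as select name, age from users where age > $1:
const path = require('path')

// Values in the args array fill the $1, $2, ... placeholders in the file
const users = await sql.file(path.join(__dirname, 'query.sql'), [50])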
sql.begin(fn) -> Promise
Calling begin with a function will return a Promise which resolves with the returned value from the function. The function provides a single argument which is sql with a context of the newly created transaction. BEGIN is automatically called, and if the Promise fails ROLLBACK will be called. If it succeeds COMMIT will be called.
const [user, account] = await sql.begin(async sql => {
const [user] = await sql`
insert into users (
name
) values (
'Alice'
)
`
const [account] = await sql`
insert into accounts (
user_id
) values (
${ user.user_id }
)
`
return [user, account]
})
sql.savepoint([name], fn) -> Promise
sql.begin(async sql => {
const [user] = await sql`
insert into users (
name
) values (
'Alice'
)
`
const [account] = (await sql.savepoint(sql =>
sql`
insert into accounts (
user_id
) values (
${ user.user_id }
)
`
).catch(err => {
// Account could not be created. ROLLBACK SAVEPOINT is called because we caught the rejection.
})) || []
return [user, account]
})
.then(([user, account]) => {
// great success - COMMIT succeeded
})
.catch(() => {
// not so good - ROLLBACK was called
})
Do note that you can often achieve the same result using WITH queries (Common Table Expressions) instead of using transactions.
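For example, the user-and-account insert from the transaction above could be written as one statement (a sketch using the same hypothetical tables):
const [account] = await sql`
  with new_user as (
    insert into users (name) values ('Alice')
    returning user_id
  )
  insert into accounts (user_id)
  select user_id from new_user
  returning *
`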
You can add ergonomic support for custom types, or simply pass an object with a { type, value } signature that contains the Postgres oid for the type and the correctly serialized value.
Adding Query helpers is the recommended approach, which can be done like this:
const sql = postgres({
types: {
rect: {
to : 1337,
from : [1337],
serialize : ({ x, y, width, height }) => [x, y, width, height],
parse : ([x, y, width, height]) => ({ x, y, width, height })
}
}
})
const [custom] = await sql`
insert into rectangles (
name,
rect
) values (
'wat',
${ sql.types.rect({ x: 13, y: 37, width: 42, height: 80 }) }
)
returning *
`
// custom = { name: 'wat', rect: { x: 13, y: 37, width: 42, height: 80 } }
To ensure proper teardown and cleanup on server restarts, use sql.end({ timeout: null }) before process.exit().
Calling sql.end() will reject new queries and return a Promise which resolves when all queries are finished and the underlying connections are closed. If a timeout is provided, any pending queries will be rejected once the timeout is reached and the connections will be destroyed.
import prexit from 'prexit'
prexit(async () => {
await sql.end({ timeout: 5 })
await new Promise(r => server.close(r))
})
Connections are created lazily once a query is made. This means that simply doing const sql = postgres(...) won't have any effect other than instantiating a new sql instance.
No connection will be made until a query is made.
This means that we get a much simpler story for error handling and reconnections. Queries will be sent over the wire immediately on the next available connection in the pool. Connections are automatically taken out of the pool if you start a transaction using sql.begin(), and automatically returned to the pool once your transaction is done.
Any query which was already sent over the wire will be rejected if the connection is lost. It'll automatically defer to the error handling you have for that query, and since connections are lazy it'll automatically try to reconnect the next time a query is made. The benefit of this is no weird generic "onerror" handler that tries to get things back to normal, and also simpler application code since you don't have to handle errors out of context.
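Since errors surface on the individual query's promise, plain try/catch (or .catch) is all you need; a minimal sketch:
try {
  await sql`select * from table_that_does_not_exist`
} catch (err) {
  // The failure lands here; other queries and the pool keep working,
  // and the next query will trigger a reconnect if the connection was lost
  console.error(err.message)
}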
There are no guarantees about queries executing in order unless using a transaction with sql.begin() or setting max: 1. Of course, doing a series of queries, one awaiting the other, will work as expected, but that's just due to the nature of js async/promise handling, so it's not necessary for this library to be concerned with ordering.
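To illustrate the difference (a sketch):
// Sequential: the second insert is only sent after the first resolves
await sql`insert into events (name) values ('first')`
await sql`insert into events (name) values ('second')`

// Concurrent: both go out on available pool connections; completion order is not guaranteed
await Promise.all([
  sql`insert into events (name) values ('a')`,
  sql`insert into events (name) values ('b')`
])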
sql.unsafe - Advanced unsafe use cases
sql.unsafe(query, [args], [options]) -> Promise
If you know what you're doing, you can use unsafe to pass any string you'd like to postgres. Please note that this can lead to sql injection if you're not careful.
sql.unsafe('select ' + danger + ' from users where id = ' + dragons)
Errors are all thrown to related queries and never globally. Errors coming from PostgreSQL itself are always in the native Postgres format, and the same goes for any Node.js errors, e.g. those coming from the underlying connection.
There are also the following errors specifically for this library.
X (X) is not supported
Whenever a message is received from Postgres which is not supported by this library. Feel free to file an issue if you think something is missing.
Max number of parameters (65534) exceeded
The postgres protocol doesn't allow more than 65534 (16bit) parameters. If you run into this issue, there are various workarounds, such as using sql([...]) to escape values instead of passing them as parameters.
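A sketch of that workaround, assuming (per the note above) that wrapping the array with sql() escapes the values inline rather than binding each one as a protocol parameter:
// Hypothetical example: a list far larger than the 65534-parameter limit
const ids = Array.from({ length: 100000 }, (_, i) => i + 1)
const users = await sql`select * from users where id in ${ sql(ids) }`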
Message type X not supported
When using SASL authentication, the server responds with a signature at the end of the authentication flow which needs to match the one on the client. This is to avoid man-in-the-middle attacks. If you receive this error, the connection was cancelled because the server did not reply with the expected signature.
Query not called as a tagged template literal
Making queries has to be done using the sql function as a tagged template. This is to ensure parameters are serialized and passed to Postgres as query parameters with correct types and to avoid SQL injection.
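For example (a minimal sketch):
// Wrong: calling sql as a plain function with a query string raises this error
// const users = await sql('select * from users')

// Right: call sql as a tagged template so values are passed as typed parameters
const users = await sql`select * from users where id = ${ 1 }`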
Auth type X not implemented
Postgres supports many different authentication types. This one is not supported.
write CONNECTION_CLOSED host:port
This error is thrown if the connection was closed without an error. This should not happen during normal operation, so please create an issue if this was unexpected.
write CONNECTION_ENDED host:port
This error is thrown if the user has called sql.end() and performed a query afterwards.
write CONNECTION_DESTROYED host:port
This error is thrown for any queries that were pending when the timeout to sql.end({ timeout: X }) was reached.
A really big thank you to @JAForbes who introduced me to Postgres and still holds my hand navigating all the great opportunities we have.
Thanks to @ACXgit for initial tests and dogfooding.
Also thanks to Ryan Dahl for letting me have the postgres npm package name.
v1.0.2 - 21 Jan 2020
FAQs
Fastest full featured PostgreSQL client for Node.js
The npm package postgres receives a total of 269,670 weekly downloads, which classifies it as a popular package. The project shows a healthy version release cadence and activity, with its last version released less than a year ago, and it has 1 open source maintainer collaborating on the project.