a lightweight SQL composer for Node.js and PostgreSQL
await db.select(
  db.libraries
    .join(db.holdings) // implicit join on foreign key holdings.library_id
    .join(db.books) // implicit join on foreign key holdings.book_id
    .join(db.authors, db.$join.left, {[db.authors.$id]: db.books.$author_id})
    .filter({
      [db.libraries.$postcode]: '12345',
      [`${db.authors.$name} ilike`]: 'Lauren%Ipsum'
    })
    .project({
      $key: db.libraries.$id,
      $columns: [...db.libraries],
      authors: [{
        $key: db.authors.$id,
        $columns: [
          db.authors.$name,
          db.expr(
            `extract(year from age(coalesce(${db.authors.$death}, now()), ${db.authors.$birth}))`
          ).as('age')
        ],
        // notice `books` is a collection on authors, even though we join authors to books!
        books: [{
          $key: db.books.$id,
          $columns: [...db.books]
        }]
      }]
    })
);
npm i monstrous
monstrous helps you build composable, reusable SQL functionality without models by deriving an API from your running database schema. It is a successor to MassiveJS that departs significantly from Massive's interface and syntax to afford a lighter and far more flexible translation layer for interacting with PostgreSQL from Node.
Then it gets out of the way and lets you write and run your own SQL. That part hasn't changed.
monstrous is additionally inspired by the Clojure library Penkala and the "Relations as First-Class Citizen" principle elucidated in Ruby by alf/bmg (although, like Penkala, it sticks to SQL instead of implementing a full relational algebra).
Unlike many database object- or model-oriented tools, data access with monstrous starts at the action: db.select() rather than Massive's db.libraries.select(), a Library.find_by(), or other variations on the subject-then-verb theme.
db.select() has one required argument: the bare minimum query is db.select(db.libraries). Inside those parentheses is a relation, with a db.libraries.primary_key and qualified columns, which can be transformed into (essentially) another relation, and another, and another, through fluent functions like join, filter, limit, and project. Each relation-builder function returns a new relation -- intermediary relations can be used and built off of repeatedly.
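As a minimal sketch of that reuse (the publisher and print_date columns are borrowed from the insert examples further down):

const domicile = db.books.filter({publisher: 'Aleatory Domicile'});
// `domicile` itself is untouched by further chaining; each call returns a new relation
const recent = domicile.filter({[`${db.books.$print_date} >`]: new Date(2000, 0)});

await db.select(domicile); // all Aleatory Domicile titles
await db.select(recent);   // only those printed since 2000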
Relations' columns are dollar-prefixed properties: db.libraries.$id, db.authors.$name. Use these properties instead of strings -- while you can get away with un- or partially-qualified names much of the time, you lose out on some of monstrous' automatic resolution, which operates on the fully-qualified and quoted name returned by db.libraries.$id, and you risk ambiguous column names appearing in the final query. Keys, as in criteria objects, can be specified with computed property syntax: {[db.libraries.$id]: 2}. Spreading a relation ([...db.libraries]) returns all of its columns, which is useful in projections.
Fluent functions come in two flavors: some accumulate their effects across calls, while others replace what came before. Every time these are invoked:

- join will declare a single new join relation and its join conditions
- filter will augment an existing or create a new where clause
- group will append its field(s) to the group by clause
- project will overwrite the output record shape definition
- order will replace the order by clause
- offset and limit will reset their respective values

Functions can be used in any order and combination as long as references are preserved: you can't filter a column in authors before you join(db.authors) into your statement.
await db.select(db.libraries
  .join(db.holdings)
  .join(db.books)
  .filter({[db.libraries.$postcode]: '12345'})
  .join(db.authors)
  .filter({[`${db.authors.$name} ilike`]: 'lauren%ipsum'})
  .limit(10)
  .limit(5)
  // both filters will be applied when the statement is executed, and the
  // results will be nested libraries -> holdings -> books -> authors. Up
  // to five results will be returned.
);
Because statement execution is independent from composition, a statement can be instantiated as an ordinary variable; used, expanded, and reused within that variable's scope; and even stored on the monstrous connection itself with db.attach(statement) (but be careful about when and where you do that).
const ipsum = db.libraries
  .join(db.holdings)
  .join(db.books)
  .join(db.authors)
  .filter({[`${db.authors.$name} ilike`]: 'lauren%ipsum'});

// don't do this on an application hot path!
db.attach(ipsum, ['saved_queries', 'ipsum']);

// elsewhere....
const result = await db.select(
  db.saved_queries.ipsum.filter({[db.libraries.$postcode]: '12345'})
);
Furthermore, other attached or ad-hoc queries can use ipsum as a base without altering its future executions. Every fluent function returns an updated clone of its base statement.
// connecting: monstrous introspects the running database's schema and returns the db
// instance (tables, columns, and the rest of the API) used throughout these examples
const db = await monstrous({
  host: 'localhost',
  port: 5432,
  database: 'librarysystem',
  user: 'postgres',
  password: 'supersecure',
  scripts: 'lib/sql'
});
Exprs interpolate SQL text into the final query. They're a thin layer over pg-promise's custom type formatting and accept the SQL, an array of indexed or an object of named parameters, and options; the latter two arguments are optional.
db.expr(
  `extract(year from justify_interval($1 - ${db.employees.$hired_at}))`,
  [new Date()]
).as('tenure') // alias only required in projection!
Exprs can be used in many places: as criteria values in join and filter, as output fields in project, or as change values in insert and update.
Composite types can be represented with exprs in persistence methods, with SQL formatted as a record ($1, $2) and appropriate parameters.
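For instance, a composite value could be written into an update like this (a sketch; the coordinates composite column on libraries is hypothetical):

// hypothetical: libraries.coordinates is a composite type (lat numeric, lng numeric)
await db.update(db.libraries.filter(1), {
  coordinates: db.expr('($1, $2)', [40.7128, -74.006])
});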
In projections, exprs must be aliased: db.expr(...).as('name').
Tuples are a subclass of Expr which represent composite types or records. These do not include an SQL snippet, instead transforming their arguments -- which can include other exprs in order to reference columns in the query -- into record values.
db.tuple(1, 2, 3, 'some text')
Tuples, and exprs provided as tuple arguments, are always interpolated directly into the final SQL statement; user input is not sanitized against SQL injection.
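The same hypothetical coordinates column written with a tuple instead, bearing the caveat above in mind:

await db.update(db.libraries.filter(1), {
  coordinates: db.tuple(40.7128, -74.006) // interpolated directly; never pass user input here
});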
statement.join() adds a single relation to the query under construction (this differs dramatically from Massive, where join() is used a single time to attach all desired relations at once). The full specification looks like this, with the target relation, the join type, and the on criteria:
const values = db.values('v', {author_id: 1, extra: 'text'});

const result = await db.select(
  db.libraries
    .join(db.holdings, db.$join.left, {[db.holdings.$library_id]: db.libraries.$id})
    .join(
      // left-joining the filtered subquery means some holdings will be empty!
      db.books.filter({[`${db.books.$id} <`]: 5}),
      db.$join.left,
      {[db.holdings.$book_id]: db.books.$id}
    )
    .join(
      values,
      db.$join.left,
      {[values.$author_id]: db.books.$author_id}
    )
);
The join type defaults to inner and may be omitted.
The on criteria map the columns of the newly joined relation to, usually, fully qualified columns of any already-joined relation. There is no "handedness" to criteria, so the division into criteria keys and values is a matter of taste. Literals and exprs are also valid, but only as on criteria values.
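For example, a literal can constrain a join alongside a column reference (a sketch; the editions table is hypothetical):

// hypothetical editions table: only join English-language editions
db.books.join(db.editions, db.$join.left, {
  [db.editions.$book_id]: db.books.$id,
  [db.editions.$language]: 'en' // literal as an on criteria value
})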
Results are automatically projected as new_relname: [record object, record object] onto the most recently joined relation referenced in the on criteria, but you can also override this with your own projection.
If on criteria are not specified, monstrous will search for a foreign key relationship between the target table and any other table involved in the query. If it finds one and only one such foreign key, it will generate the appropriate on clause automatically; otherwise, it will raise an error on execution telling you to specify on criteria.
statement.filter() restricts results with a where clause. It accepts either criteria objects mapping column names to values and exprs or, for tables, an integer or UUID primary key.
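A couple of sketches of each form (the expr-as-criteria-value usage follows from the expr section above):

// criteria object: values may be literals or exprs
await db.select(db.books.filter({
  [`${db.books.$print_date} <`]: db.expr("now() - interval '20 years'")
}));

// primary key shorthand for tables
await db.select(db.libraries.filter(1));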
statement.project() defines the shape of your query output, articulating nested collections and objects. It unifies Massive's resultset decomposition (aka join definitions) and the fields and exprs options. project() takes a single argument, which represents the shape of a single output record. This may be a flat array of column names:
[db.libraries.$id, db.libraries.$name] // restrict to a subset of columns
Spreading a relation with ... yields all columns:
[...db.employees, db.libraries.$postcode] // add library postcode to all employee fields
Alternatively, the projection may be a complex object.
const director = db.employees
  .filter({position: 'director'})
  .as('directors'); // set the default projection name

await db.select(
  db.libraries
    .join(
      director,
      {[director.$library_id]: db.libraries.$id}
    )
    .join(db.patrons)
    .project({
      $key: db.libraries.$id,
      $columns: [...db.libraries],
      director: { // override the subquery alias to singular
        $key: director.$id,
        name: director.$name,
        tenure: db.expr(`now() - ${director.$hired_on}`)
      },
      patrons: [{
        $key: db.patrons.primary_key,
        $columns: [...db.patrons]
      }]
    })
);
Here, director is a nested object (a library has only one director; the subquery filters employee records appropriately). It's also named arbitrarily, since the records are coming from the employees table. Meanwhile, patrons is a nested array named for its table.
All three quantities -- the root record, director, and patrons -- specify a $key uniquely identifying a library, an employee, or a patron respectively. This is usually the table primary key, and indeed each table has a primary_key property, as seen with patrons. Composite primary keys are supported with arrays.
libraries and patrons also use the $columns field to specify fields without renaming them. $columns is processed similarly to the array form of the projection. In the example, the director instead specifies only a subset of fields one by one, and tenure is a raw SQL expression which will be interpolated into the query. As always, raw SQL is a potential vector for SQL injection; use it carefully!
db.exprs may be included in a $columns array, but must be aliased. In other words, the director could be constructed thus:
await db.select(
  db.libraries.join(db.employees).project({
    $key: db.employees.$id,
    $columns: [
      db.employees.$name,
      db.expr(`now() - ${db.employees.$hired_on}`).as('tenure')
    ]
  })
);
db.libraries.as('bookplaces') lets you rename relations within the context of a query. This helps with generating correct names in the default projection of a join, but is also required for self-joins, since each instance of a relation in a join must have a unique name. Once introduced into a query, aliases must be used consistently: it is generally an error if you select records from libraries as bookplaces but filter where libraries.postcode = '12345' (Postgres wants bookplaces.postcode). Declaring your alias as a variable in JavaScript -- const bookplaces = db.libraries.as('bookplaces') -- can help ensure consistency.
Aliases carry through to projections: nested properties will be named for the alias rather than the original relation.
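A self-join sketch; the manager_id foreign key on employees is hypothetical, but it shows why each instance of the relation needs its own name:

// hypothetical: employees.manager_id references employees.id
const managers = db.employees.as('managers');

await db.select(
  db.employees.join(managers, db.$join.left, {
    [managers.$id]: db.employees.$manager_id
  })
);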
await db.select(
  db.libraries.order(
    db.$order.desc(db.libraries.$founded),
    db.libraries.$name
  )
);
Any SQL expression may be written in an expr and passed to db.$order.asc or db.$order.desc. Here, as in many other places, raw text is accepted but can increase your risk of SQL injection, and may generate SQL that Postgres cannot execute, for example because of an ambiguous column name.
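A sketch of ordering by an expr, reusing columns from the first example:

await db.select(
  db.authors.order(
    db.$order.desc(db.expr(`coalesce(${db.authors.$death}, now())`))
  )
);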
Postgres supports JSON traversal either through subscripting (json_field['alpha']['beta']) or custom operators (json_field ->> 'alpha' or json_field #>> '{alpha,beta}'). monstrous supports both methods in filter criteria keys. Returning elements of JSON fields can be done with exprs.
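A sketch of JSON operators in filter criteria keys, assuming a hypothetical jsonb column patrons.preferences:

// hypothetical: patrons.preferences is a jsonb column
await db.select(db.patrons.filter({
  [`${db.patrons.$preferences} ->> 'newsletter'`]: 'weekly',
  [`${db.patrons.$preferences} #>> '{contact,city}'`]: 'Springfield'
}));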
Simple inserts declare a table and pass one or more individual value objects. Arrays are not supported; use the ... spread operator if you have an array of value objects.
await db.insert(db.authors, {name: 'Lauren Ipsum'}, {name: 'Daler S. Ahmet'});
Inserts into a single table can also pass a subquery:
await db.insert(db.holdings,
  db.books
    .filter({author_id: 1})
    .project({
      book_id: db.books.$id,
      library_id: 1
    })
);
Add records to multiple related tables by inserting against a join. You may only insert a single parent record, or a single record at any level that itself has children; for example, the statement below could insert multiple books only if it did not also insert holdings.
await db.insert(db.authors.join(db.books), {
  name: 'Consuela Ctetur',
  books: [{
    title: 'The Placeholder',
    publisher: 'Aleatory Domicile',
    print_date: new Date(2001, 0),
    // only "leaf" tables can have multiple values!
    holdings: [{
      library_id: 1
    }, {
      library_id: 2
    }]
  }]
}, db.$target.one); // return Consuela as an object
Keys in the value object referencing dependent tables may use the name as above, the fully qualified path ([db.books.$title]: [{...}]), or the path from the table alias if applicable.
The join fields (e.g. the foreign key) of dependent records are automatically filled.
db.update expects the statement and an object mapping columns of the root relation to changes.
await db.update(db.authors.join(db.books).filter({publisher: 'Aleatory Domicile'}), {
  is_in_print: db.expr('case when print_date > $1 then true else false end', [check_date])
});
db.save() passes a single value to db.insert or db.update depending on whether or not it contains values for the primary key. It automatically applies db.$target.one to return an object.
await db.save(db.authors, {
  id: 123,
  name: 'Lauren Ipsum',
  birth: new Date(1920, 1, 1)
});
Deletes look a lot like selects except the resolved records aren't there afterwards.
await db.delete(
  db.holdings.join(db.books).filter({[db.books.$author_id]: 3})
);
db.task(callback, tag) provides a dedicated connection as the sole argument to callback. db.transaction(callback, mode, tag) does the same, except that the connection opens a transaction which commits at the end of callback or rolls back if an error is thrown.
monstrous' split between statement construction and execution, or between database structure and connection, means that you still work with db in the task/transaction callback! You use the dedicated task or tx connection to execute -- that is, to issue the verbs -- while referencing db to build statements, since the database structure is not cloned to each connection. Statements built outside the callback can be used within it.
let result;

await db.transaction(async tx => {
  await tx.insert(db.books,
    {author_id: 1, title: 'Makeshift Examples'},
    {author_id: 1, title: 'Inter/Inter'},
    {author_id: 1, title: 'Signifying Nothing'}
  );

  result = await tx.insert(db.holdings,
    db.books
      .filter({author_id: 1})
      .project({
        book_id: db.books.$id,
        library_id: 1
      })
  );
});
db.all(db.libraries, db.authors.filter(1)) selects each statement on a single dedicated connection. It's like a read-only task, except you can destructure the array of individual results into variables directly.
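For example, destructuring the results (each element is the result of the corresponding statement):

const [allLibraries, authorOne] = await db.all(
  db.libraries,
  db.authors.filter(1)
);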
Result-manipulating functionality like Massive's findOne or stream methods, and the build option, is now provided by "query targets" passed as the final argument to any operation such as db.select or db.insert. Available targets:
- db.$target.one: return a single record as an object, or error if more than one record would be returned
- db.$target.unit: return a single value (one row, one column) as a primitive or object
- db.$target.stream: not implemented yet
- db.$target.log: return the SQL that would be emitted to Postgres

In a task or transaction, use task.$target.one/tx.$target.one.
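A couple of sketches, reading db.$target.log as returning the generated SQL rather than executing it:

// exactly one record as an object; errors if more than one would match
const lauren = await db.select(db.authors.filter(1), db.$target.one);

// inspect the SQL that would be emitted
const sql = await db.select(db.libraries.join(db.holdings), db.$target.log);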
A non-exhaustive catalog.
Never interpolate user input into an expr:
const str = '; drop table patrons; --';
db.expr(`${db.libraries.$id} + 1 ${str}`).as('asking_for_trouble')
It will not always bite you; exprs are ordinarily compiled using pg-promise's custom type formatting, which sanitizes input. Sometimes, however, they must be directly interpolated, and the circumstances in which that's true are sufficiently arcane that it's smarter to avoid the practice entirely.
Tuples are always unsafe.
- db.values(alias, ...objects)
- group by
- exprs in update changes
- exprs in join and filter criteria
- nulls first/last in order by (exprs)
- on conflict
- having
- window
- exists and correlated subqueries?
- union [all], intersect, except
- merge?