co-postgres-queries
:warning: Deprecated
This library is deprecated as we progressively move to marmelab/postgres-queries. Use at your own risk and expect no updates unless a security issue threatens legacy applications.
Utility to generate and execute PostgreSQL queries with ease.
Requirements
npm install --save co-postgres-queries
The library can be divided into two parts: PgPool, which allows connecting to the postgres database and executing queries, and the query builders, which generate the query objects to execute.
import PgPool from "co-postgres-queries";
const clientOptions = {
user,
password,
database,
host,
port
};
const pool = new PgPool(clientOptions);
pool
.connect()
.then(client => {
return client.query("SELECT * from user WHERE firstname = $firstname", {
firstname: "john"
});
})
.then(rows => {
// do something with all the users named john
});
Each query builder takes the form:
query(config)(...parameters);
On the first call it receives its configuration, e.g. the table name, column names, etc. For example:
import insertOneQuery from "co-postgres-queries/queries/insertOne";
const insertOne = insertOneQuery({
table: "user",
writableCols: ["name", "firstname"],
returnCols: ["id", "name", "firstname"]
});
On the second call it takes the query parameters and returns an object of the form { sql, parameters }. For example:
insertOne({ name: 'doe', firstname: 'john', other: 'data' });
// would return
{
sql: 'INSERT INTO user (name, firstname) VALUES ($name, $firstname) RETURNING id, name, firstname',
parameters: { name: 'doe', firstname: 'john' }
}
The result can then be directly passed to client.query
to be executed.
client.query(insertOne({ name: "doe", firstname: "john", other: "data" }));
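As a rough sketch of how such a builder can work (an illustrative reimplementation, not the library's actual code), the configuration call closes over the table and column lists, and the second call filters the row down to the writable columns:

```javascript
// Illustrative sketch of a query(config)(...parameters) builder following
// the { sql, parameters } contract described above. Not the library's code.
const insertOneQuery = ({ table, writableCols, returnCols }) => row => {
  // sanitization step: keep only the configured writable columns
  const parameters = {};
  writableCols.forEach(col => {
    if (col in row) parameters[col] = row[col];
  });
  const cols = Object.keys(parameters);
  const sql =
    `INSERT INTO ${table} (${cols.join(", ")}) ` +
    `VALUES (${cols.map(c => `$${c}`).join(", ")}) ` +
    `RETURNING ${returnCols.join(", ")}`;
  return { sql, parameters };
};

const insertOneUser = insertOneQuery({
  table: "user",
  writableCols: ["name", "firstname"],
  returnCols: ["id", "name", "firstname"]
});
const query = insertOneUser({ name: "doe", firstname: "john", other: "data" });
// query.parameters only keeps name and firstname; "other" is dropped
```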
There is also a crud helper function to generate basic CRUD queries for a given table:
const userCrud = crud({
table: "user",
primaryKey: "id",
writableCols: ["name", "firstname", "mail"],
returnCols: ["id", "name", "firstname", "mail"]
});
This will configure query builders for selectOne, select, insert, updateOne, deleteOne, countAll and batchInsert in one object literal.
co-postgres-queries provides the saga command to execute the generated queries as effects in a generator.
function* subscribeUser(userId) {
try {
yield begin(); // begin transaction block
const user = yield userCrud.selectOne(userId);
if (!user) {
throw new Error('not found');
}
yield userCrud.updateOne(userId, { subscribed: true });
// ... some other queries
yield commit();
return user;
} catch (error) {
yield rollback();
throw error;
}
...
}
client.saga(subscribeUser(5)).then(...);
Since the generator yields only query objects, it is easily testable.
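A saga runner of this kind can be pictured as a loop that drives the generator, executing each yielded query object and feeding the result (or error) back in. A simplified sketch, assuming one query per yield and using executeQuery as a stand-in for client.query:

```javascript
// Simplified saga runner sketch: drives a generator that yields
// { sql, parameters } objects. executeQuery stands in for client.query.
async function runSaga(iterator, executeQuery) {
  let step = iterator.next();
  while (!step.done) {
    try {
      const result = await executeQuery(step.value); // run the yielded query
      step = iterator.next(result); // feed the result back in
    } catch (error) {
      step = iterator.throw(error); // let the generator catch it
    }
  }
  return step.value; // the generator's return value
}

// demo with a fake executor instead of a real database client
const fakeExecute = async query => (query.sql === "SELECT 1" ? [{ one: 1 }] : []);
function* demo() {
  const rows = yield { sql: "SELECT 1" };
  return rows[0].one;
}
runSaga(demo(), fakeExecute).then(value => console.log(value)); // logs 1
```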
Extends node-pg-pool; allows connecting to postgresql and executing queries. Among other things, it adds promise support and named parameters.
import PgPool from "co-postgres-queries";
const clientOptions = {
user,
password,
database,
host,
port
};
const poolingOptions = {
max, // Max number of clients to create (defaults to 10)
idleTimeoutMillis // how long a client is allowed to remain idle before being closed (defaults to 30 000 ms)
};
const pool = new PgPool(clientOptions, poolingOptions);
const pool = new PgPool();
pool.connect().then(client => {
// use the client
});
// async/await
(async () => {
const pool = new PgPool();
const client = await pool.connect();
})();
// co
co(function*() {
const pool = new PgPool();
const client = yield pool.connect();
});
Executes a query; it takes three parameters:
// queries use named parameters
client
.query("SELECT $name::text as name", { name: "world" }) // query returns a promise
.then(result => {
// the result contains the rows directly
console.log(`Hello ${result[0].name}`);
});
// It works with async/await
(async () => {
const pool = new PgPool();
const result = await pool.query("SELECT $name::text as name", {
name: "world"
});
console.log(`Hello ${result[0].name}`);
})();
// Or with co
co(function*() {
const pool = new PgPool();
const result = yield pool.query("SELECT $name::text as name", {
name: "world"
});
console.log(`Hello ${result[0].name}`);
});
client.query
can also be called with an object literal:
pool.query({
sql: "SELECT $name::text as name",
parameters: { name: "world" }
});
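Under the hood, named parameters have to be translated into the positional $1, $2, ... placeholders and the value array that node-postgres expects. A sketch of that translation step (an assumption about the mechanism, not the library's actual implementation):

```javascript
// Sketch: rewrite "$name"-style named parameters into positional
// placeholders and the matching value array used by node-postgres.
function toPositional(sql, parameters) {
  const values = [];
  const text = sql.replace(/\$(\w+)/g, (match, name) => {
    values.push(parameters[name]); // record the value in placeholder order
    return `$${values.length}`; // $name -> $1, $2, ...
  });
  return { text, values };
}

const converted = toPositional("SELECT $name::text as name", { name: "world" });
// converted.text:   'SELECT $1::text as name'
// converted.values: ['world']
```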
You can also execute a query directly from the pool. A client will then be retrieved automatically, and released once the query is done. Transactions are not possible this way, since the client would change on each query.
Takes a query or an object literal of queries and returns a function that takes the query parameters and executes it:
const query = insertOneQuery('table', ['col1', 'col2']);
const insertOne = client.link(query);
yield insertOne({ col1: 'val1', col2: 'val2' });
// or
const queries = crudQueries(table, ['col1', 'col2'], ['col1']);
const crud = client.link(queries);
yield crud.insertOne({ col1: 'val1', col2: 'val2' });
Returns the client to the pool, to be used again. Do not forget to call this when you are done.
Closes the client. It will not return to the pool.
Takes a generator yielding query objects ({ sql, parameters }), and returns an async function that runs the generator, executing the yielded queries.
function* getUserAndDoSomething(id) {
const user = yield {
sql: 'SELECT * FROM user WHERE id = $id',
parameters: { id },
returnOne: true,
};
...
}
client.saga(getUserAndDoSomething(5)).then(...);
The yielded query object will be passed internally to client.query, then the result will be passed back to the generator.
If an error occurs during the query, it will be thrown back into the generator, where it can be caught.
const executeQuery = client.saga(function* (id) {
try {
const user = yield {
sql: 'bad query',
};
...
} catch (error) {
// handle the error
}
});
Since the query functions return query objects, they can be yielded.
const selectOneByUserId = selectOne({ table: 'user' });
const getUserAndDoSomething = client.saga(function* (id) {
const user = yield selectOneByUserId({ id });
...
});
You can also yield an array of queries to be run in parallel:
const getUserAndCommands = client.saga(function* (id) {
const [user, commands] = yield [
selectOneByUserId({ id }),
selectCommandsByUserId({ id }),
];
...
});
Or even an object literal:
const getUserAndCommands = client.saga(function* (id) {
const { user, commands } = yield {
user: selectOneByUserId({ id }),
commands: selectCommandsByUserId({ id }),
};
...
});
Since the generator yields plain objects, they can be easily tested without needing any mocks:
const iterator = someQueryGenerator();
const {
value: { sql, parameters }
} = iterator.next();
// we get the generated sql and parameters.
iterator.next(queryResult); // we can pass what we want as result
iterator.throw(queryError); // or we can resume by throwing error
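For instance, a saga that reads a user's name can be unit-tested step by step, asserting the yielded query object and injecting a canned result (the query shape here is illustrative):

```javascript
// Testing a saga generator without mocks: inspect the yielded query
// object directly, then feed back an arbitrary result.
function* getUserName(id) {
  const rows = yield {
    sql: "SELECT * FROM user WHERE id = $id",
    parameters: { id }
  };
  return rows[0].name;
}

const it = getUserName(5);
const { value: queryObject } = it.next();
// queryObject.sql and queryObject.parameters can be asserted directly
const { value: userName, done } = it.next([{ name: "john" }]);
// userName is "john" and done is true: no database involved
```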
Each query helper takes the form:
query(config)(...parameters);
On the first call it receives its configuration, e.g. the table name, column names, etc. For example:
import insertOneQuery from "co-postgres-queries/queries/insertOne";
const insertOne = insertOneQuery({
table: "user",
writableCols: ["name", "firstname"],
returnCols: ["id", "name", "firstname"]
});
On the second call it takes the query parameters and returns an object of the form { sql, parameters }, with the sql containing named parameters and the parameters sanitized based on the configuration.
For example:
insertOne({ name: 'doe', firstname: 'john', other: 'data' });
// would return
{
sql: 'INSERT INTO user (name, firstname) VALUES ($name, $firstname) RETURNING id, name, firstname',
parameters: { name: 'doe', firstname: 'john' }
}
The result can then be directly passed to client.query
to be executed.
import insertOne from "co-postgres-queries/queries/insertOne";
insertOne({ table, writableCols, returnCols })(row);
Returns a query to insert one given row.
A literal object in the form of:
{
column: value,
...
}
import batchInsert from "co-postgres-queries/queries/batchInsert";
batchInsert(table, writableCols, returnCols)(rows);
Creates a query to insert an array of rows.
An array of literal objects in the form of:
[
{
column: value,
...
}, ...
]
import selectOne from "co-postgres-queries/queries/selectOne";
selectOne({ table, primaryKey, returnCols, permanentFilters })(row);
Creates a query to select one row.
primaryKey defaults to id. permanentFilters (e.g. { deleted_at: null }) are always applied.
A literal in the form of:
{
id1: value,
id2: value,
...
}
Any key not present in primaryKey will be ignored.
import select from "co-postgres-queries/queries/select";
select({
table,
primaryKey,
returnCols,
searchableCols,
specificSorts,
groupByCols,
withQuery,
permanentFilters,
returnOne
})({ limit, offset, filters, sort, sortDir });
Creates a query to select rows.
table: the table name; accepts JOIN statements. Example:
{
table: 'table1 JOIN table2 ON table1.table2_id = table2.id'
}
primaryKey: one or more columns representing the primary key. Accepts either an array or a single value. (default: id)
returnCols: list of columns retrieved by the query
searchableCols: list of columns that can be searched (usable in the filters parameter). Defaults to the return columns.
specificSorts: allows specifying a custom sort order for a given column. Useful to order strings other than alphabetically. Example:
{
level: ["master", "expert", "novice"];
}
will order the level column with all master rows first, then expert, and finally novice.
groupByCols: adds a GROUP BY clause to the query on the given columns.
withQuery: wraps the query in WITH result AS (<query>) SELECT * FROM result. This adds a temporary result table that allows sorting on computed and joined columns. If the table configuration contains a JOIN clause, this is automatically set to true.
permanentFilters: list of filters applied by default, e.g. { deleted_at: null } for a soft delete.
returnOne: Optional, if set to true, returns only the first result instead of an array.
A literal object with the limit, offset, filters, sort and sortDir keys.
For instance, specifying the following filters value:
{
first_name: "John",
last_name: "Doe",
last_paid_at: null,
}
Will produce the following WHERE
clause:
WHERE
first_name = 'John'
AND last_name = 'Doe'
AND last_paid_at IS NULL
Other SQL matching operators may be used by specifying some prefixes to the column names. For instance:
{
not_first_name: "John", // first_name != "John"
not_last_paid_at: null, // last_paid_at IS NOT NULL
from_last_paid_at: '2010-01-01', // last_paid_at >= '2010-01-01'
to_last_paid_at: '3010-01-01', // last_paid_at <= '3010-01-01'
like_position: 'Sales', // position ILIKE '%Sales%'
not_like_position: 'Manager' // position NOT ILIKE '%Manager%'
}
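A sketch of how such prefixes could map onto WHERE fragments, using named placeholders so values stay parameterized (illustrative only, not the library's implementation; for like_ filters the value itself would additionally be wrapped in % wildcards elsewhere):

```javascript
// Sketch: translate one prefixed filter key into a SQL comparison
// fragment with a named placeholder. Illustrative, not the library's code.
function filterToWhereFragment(key, value) {
  if (key.startsWith("not_like_")) return `${key.slice(9)} NOT ILIKE $${key}`;
  if (key.startsWith("like_")) return `${key.slice(5)} ILIKE $${key}`;
  if (key.startsWith("from_")) return `${key.slice(5)} >= $${key}`;
  if (key.startsWith("to_")) return `${key.slice(3)} <= $${key}`;
  if (key.startsWith("not_")) {
    const col = key.slice(4);
    return value === null ? `${col} IS NOT NULL` : `${col} != $${key}`;
  }
  return value === null ? `${key} IS NULL` : `${key} = $${key}`;
}

filterToWhereFragment("not_last_paid_at", null); // 'last_paid_at IS NOT NULL'
filterToWhereFragment("like_position", "Sales"); // 'position ILIKE $like_position'
```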
It is also possible to match against all searchable columns with match:
{
match: 'value',
}
will return only the rows for which any of the searchableCols matches the value (case-insensitive).
import countAll from "co-postgres-queries/queries/countAll";
countAll({ table, permanentFilters })({ filters: { enabled: true } });
Creates a query to count all rows. It also takes an optional plain-object parameter filters, applied to the query in addition to the permanentFilters (e.g. { deleted_at: null }).
import update from "co-postgres-queries/queries/update";
update({
table,
writableCols,
filterCols,
returnCols,
permanentFilters
})(filters, data);
Creates a query to update rows.
permanentFilters (e.g. { deleted_at: null }) are always applied.
Two arguments: the filters and the data to set. For example, the filters:
{
column: "value"
}
will update only the rows for which column equals 'value'.
import updateOne from "co-postgres-queries/queries/updateOne";
updateOne({
table,
writableCols,
primaryKey,
returnCols,
permanentFilters
})(identifier, data);
Creates a query to update one row.
primaryKey defaults to id. permanentFilters (e.g. { deleted_at: null }) are always applied.
Two arguments: the identifier and the data to set. The identifier is either a single value for a single primaryKey column, or a literal if there are several columns: { id1: value, id2: otherValue }. All configured primaryKey columns must be given a value.
import remove from "co-postgres-queries/queries/remove";
remove({ table, filterCols, returnCols, permanentFilters })(filters);
Creates a query to delete rows.
permanentFilters (e.g. { deleted_at: null }) are always applied.
The filters: a literal specifying the wanted value for a given column. Example:
{
column: "value"
}
will delete only the rows for which column equals 'value'.
import removeOne from "co-postgres-queries/queries/removeOne";
removeOne({ table, primaryKey, returnCols, permanentFilters })(identifier);
Creates a query to delete one row.
primaryKey defaults to id. permanentFilters (e.g. { deleted_at: null }) are always applied.
The identifier: either a single value for a single primaryKey column, or a literal if there are several columns: { id1: value, id2: otherValue }. All configured primaryKey columns must be given a value.
import batchRemove from "co-postgres-queries/queries/batchRemove";
batchRemove({ table, primaryKey, returnCols, permanentFilters })(
identifierList
);
Creates a query to delete several rows at once.
primaryKey defaults to id. permanentFilters (e.g. { deleted_at: null }) are always applied.
The list of identifiers: either an array of single values for a single primaryKey column, or an array of literals if there are several columns: [{ id1: value, id2: otherValue }, ...]. All configured primaryKey columns must be given a value.
import upsertOne from "co-postgres-queries/queries/upsertOne";
upsertOne({
table,
primaryKey,
writableCols,
returnCols,
permanentFilters
})(row);
Creates a query to update one row, or create it if it does not already exist.
primaryKey defaults to id. permanentFilters (e.g. { deleted_at: null }) are always applied.
The row to upsert.
import batchUpsert from "co-postgres-queries/queries/batchUpsert";
batchUpsert({
table,
primaryKey,
writableCols,
returnCols,
permanentFilters
})(rows);
Creates a query to upsert a batch of rows, creating those that do not already exist.
primaryKey defaults to id. permanentFilters (e.g. { deleted_at: null }) are always applied.
The array of rows to upsert.
import selectByOrderedIdentifiers from "co-postgres-queries/queries/selectByOrderedIdentifiers";
selectByOrderedIdentifiers({
table,
primaryKey,
returnCols
})(values);
Creates a query to select multiple rows given an array of identifiers. The result keeps the order of the identifiers. Due to the nature of the query, this only works for a primaryKey composed of a single column.
The array of identifiers to retrieve. The array order determines the result order.
import crud from "co-postgres-queries/queries/crud";
crud({
table,
writableCols,
primaryKey,
returnCols,
permanentFilters
});
Creates configured queries for insertOne, batchInsert, selectOne, select, updateOne, deleteOne and batchDelete.
The options are the same as for the individual queries: primaryKey defaults to id; specificSorts, withQuery and permanentFilters (e.g. { deleted_at: null }) behave as described above.
import {
begin,
commit,
savepoint,
rollback
} from "co-postgres-queries/queries/transaction";
Simple helpers to manage transactions.
You must retrieve a client with pool.connect()
to use those.
import begin from "co-postgres-queries/queries/transaction/begin";
begin();
// { sql: 'BEGIN' }
Creates a query to start a transaction.
import commit from "co-postgres-queries/queries/transaction/commit";
commit();
// { sql: 'COMMIT' }
Creates a query to commit a transaction.
import savepoint from "co-postgres-queries/queries/transaction/savepoint";
savepoint(name);
// { sql: 'SAVEPOINT name' }
Creates a query to add a savepoint during a transaction.
import rollback from "co-postgres-queries/queries/transaction/rollback";
rollback();
// { sql: 'ROLLBACK' }
// or
rollback(name);
// { sql: 'ROLLBACK TO name' }
Rollback the transaction to the given save point, or to its beginning if not specified.
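Put together, these helpers are just query-object factories, and a saga can combine them to roll back to a savepoint when an optional step fails. A sketch, restating the factories from the { sql } shapes shown above (the INSERT statements are illustrative):

```javascript
// The transaction helpers as plain query-object factories, matching the
// { sql } shapes shown above.
const begin = () => ({ sql: "BEGIN" });
const commit = () => ({ sql: "COMMIT" });
const savepoint = name => ({ sql: `SAVEPOINT ${name}` });
const rollback = name => (name ? { sql: `ROLLBACK TO ${name}` } : { sql: "ROLLBACK" });

// A saga using a savepoint: if the optional step fails, roll back to the
// savepoint and keep the rest of the transaction. Queries are illustrative.
function* importUser(user) {
  yield begin();
  yield { sql: "INSERT INTO user (name) VALUES ($name)", parameters: { name: user.name } };
  yield savepoint("before_audit");
  try {
    yield { sql: "INSERT INTO audit_log (msg) VALUES ($msg)", parameters: { msg: "imported" } };
  } catch (error) {
    yield rollback("before_audit"); // undo only the audit insert
  }
  yield commit();
}
```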