
@protoutil/celql
A set of utilities for validating and converting CEL expressions to SQL WHERE clauses to be used in queries. The goal of this library is to assist in authoring filters that can be parsed, type-checked, sanitized, and translated to a database query both in the browser and on the server. It is not meant to be a full-featured SQL replacement, nor does it provide an integration with any JavaScript framework (e.g. storing filter state in URL query parameters or providing UI components). Such utilities are an eventual goal, but they will be implemented in separate, framework-specific libraries.
Use your configured package manager to install the `@protoutil/celql` package, e.g. from npm using `npm install @protoutil/celql`.
First, you will need to let the CEL environment know about your columns. This can be done by defining a CEL `Env` instance. You can define each column with its CEL-equivalent type individually:

```typescript
import { StringType, variable } from '@protoutil/cel';
import { DefaultEnv } from '@protoutil/celql';

const env = new DefaultEnv(
  variable('name', StringType),
  ...
);
```
Or, if you have a protobuf representation of your database table, you can define them all in one go with `declareContextProto`:

```typescript
import { declareContextProto } from '@protoutil/cel';
import { DefaultEnv } from '@protoutil/celql';
import { MySchema } from './gen/myschema_pb.js';

const env = new DefaultEnv(
  declareContextProto(MySchema),
  ...
);
```
Now, you can convert your CEL expressions to SQL:

```typescript
import { translatePostgres } from '@protoutil/celql';

const whereClause = translatePostgres('my_column == "foo"', env);
// Will output { sql: 'my_column = $1', vars: ['foo'] }
```
SQL output is separated into a query clause and an array of variables so that user input can be sanitized, either manually or with a parameterized query (ideally both). The exception is `Timestamp` values, which are printed in 'yyyy-mm-dd hh:mi:ss.us' format with millisecond resolution. Parsing a `Timestamp` from a string will throw an error on invalid input, which should disallow any unsanitized malicious user input.
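The `{ sql, vars }` pair maps directly onto parameterized query APIs (e.g. node-postgres accepts the variables array as a second argument to `query`). As a rough illustration of how the `$1`, `$2`, ... placeholders line up with the variables array, here is a hypothetical `formatForLogging` helper; it is for display only, and real queries should always pass `vars` to the database driver rather than inlining them:

```typescript
// Illustrative only: inlines vars to show how placeholders map to values.
// Never build real queries this way -- hand { sql, vars } to your driver.
function formatForLogging(sql: string, vars: unknown[]): string {
  return sql.replace(/\$(\d+)/g, (_, n) => JSON.stringify(vars[Number(n) - 1]));
}

const translated = { sql: 'my_column = $1', vars: ['foo'] };
console.log(formatForLogging(translated.sql, translated.vars)); // my_column = "foo"
```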
If you want to validate your expression before sending it to the server to be converted, you can do that with the `compile` function:

```typescript
import { compile } from '@protoutil/celql';

try {
  // This will fail because it uses an invalid column and does not evaluate to a boolean expression
  compile('invalid_column + 1', env);
} catch (e) {
  // Handle your error
}
```
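A UI will often want a result object to render instead of a thrown exception. A minimal, framework-agnostic sketch; `validateFilter` and its `compileFn` parameter are hypothetical stand-ins (in real code, pass `(expr) => compile(expr, env)` as `compileFn`):

```typescript
// Sketch: adapt a throwing compile-style check into a renderable result.
type ValidationResult = { ok: true } | { ok: false; message: string };

function validateFilter(expr: string, compileFn: (expr: string) => void): ValidationResult {
  try {
    compileFn(expr);
    return { ok: true };
  } catch (e) {
    return { ok: false, message: e instanceof Error ? e.message : String(e) };
  }
}
```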
In addition to the defaults (`DefaultEnv`, `DefaultDialect`, `translateDefault`), this library also exports Postgres-specific functionality (`PostgresEnv`, `PostgresDialect`, `translatePostgres`). It is an eventual goal to provide environments, dialects, and translation functions for many different flavors of database.
`Timestamp` values will be formatted in 'yyyy-mm-dd hh:mi:ss.us' format with millisecond resolution. Successful querying will depend on how your flavor of SQL handles those inputs; you may need to specify a fractional seconds precision for `Timestamp` columns (e.g. `TIMESTAMP(3)`). You can override this behavior with a custom dialect.

The `DefaultEnv` supports most default CEL expressions. However, since the purpose of this library is to translate expressions to SQL clauses, some functionality is either not implemented or may have different signatures. There are also built-in SQL-specific functions.
TODO: Document functionality that is not implemented
The string `contains`, `endsWith`, and `startsWith` member functions can optionally take a boolean `ignoreCase` parameter to control case sensitivity. Passing `true` will make their searches case-insensitive. It is important to note that this parameter may cause the `LOWER` function to be called, so creating a lower-cased index of string columns may significantly improve performance for these queries.
```typescript
my_column.contains('foo', true); // Will output a case-insensitive query i.e. ILIKE for PostgreSQL
my_column.contains('foo', false); // Will output a case-sensitive query i.e. LIKE for PostgreSQL
```
`timestamp` functions optionally take a second parameter which corresponds to the time zone.

```typescript
timestamp('2023-01-01T12:34:56Z', 'America/New_York'); // Will output { sql: `TIMESTAMP '2023-01-01 12:34:56.000' AT TIME ZONE $1`, vars: ['America/New_York'] }
```
Signatures:

- `date(date) -> date` (identity)
- `date(string) -> date` converts a string to a `Date`
- `date(timestamp) -> date` converts a `Timestamp` to a `Date`

```typescript
date(my_column) == date('2023-10-01'); // Will output { sql: 'DATE(my_column) = DATE($1)', vars: ['2023-10-01'] }
```
Signatures:

- `time(time) -> time` (identity)
- `time(string) -> time` converts a string to a `Time`
- `time(timestamp) -> time` converts a `Timestamp` to a `Time`

```typescript
time(my_column) == time('12:34:56'); // Will output { sql: 'TIME(my_column) = TIME($1)', vars: ['12:34:56'] }
```
`atTimeZone` - Converts a `Timestamp` to the specified time zone.

```typescript
my_column.atTimeZone('America/New_York'); // Will output { sql: 'my_column AT TIME ZONE $1', vars: ['America/New_York'] }
```
`lower` - Converts a string to lower case.

```typescript
my_column.lower(); // Will output { sql: 'LOWER(my_column)', vars: [] }
```

`upper` - Converts a string to upper case.

```typescript
my_column.upper(); // Will output { sql: 'UPPER(my_column)', vars: [] }
```

`trim` - Trims whitespace from a string.

```typescript
my_column.trim(); // Will output { sql: 'TRIM(my_column)', vars: [] }
```
`like` - Tests whether the operand matches a pattern. Uses the `LIKE` logical operator and can optionally take a boolean `ignoreCase` parameter to control case sensitivity. It is important to note that this parameter may not have an effect on all databases; for example, MySQL `LIKE` queries are case-insensitive by default.

```typescript
my_column.like('foobar'); // Will output { sql: 'my_column LIKE $1', vars: ['foobar'] }
my_column.like('foobar', true); // Will output { sql: 'my_column ILIKE $1', vars: ['foobar'] } for PostgreSQL
!my_column.like('foobar'); // Will output { sql: 'NOT my_column LIKE $1', vars: ['foobar'] }
```
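Because `like` forwards the pattern as-is, user input that should match literally needs its `LIKE` wildcards escaped before being embedded in a pattern. A rough sketch, assuming the dialect treats backslash as the escape character (the PostgreSQL default); `escapeLikePattern` is a hypothetical helper, not part of this library:

```typescript
// Escape %, _ and the escape character itself so user-supplied text matches
// literally inside a LIKE pattern (assumes backslash is the escape character).
function escapeLikePattern(input: string): string {
  return input.replace(/[\\%_]/g, (ch) => '\\' + ch);
}

console.log(escapeLikePattern('50%_off')); // 50\%\_off
```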
You are able to define your own `Dialect` class and add functions by extending the CEL environment. (The `Unparser` and `Expr` import locations below are assumed; check the package exports for where these types actually live.)

```typescript
import { DefaultEnv, DefaultDialect, translate, Unparser } from '@protoutil/celql';
import { BoolType, Expr, func, overload, StringType } from '@protoutil/cel';

const myFuncOverload = 'myFunc';

class MyDialect extends DefaultDialect {
  override functionToSqlOverrides(unparser: Unparser, functionName: string, args: Expr[]): boolean {
    switch (functionName) {
      case myFuncOverload:
        unparser.visit(args[0]);
        unparser.writeString(' MY_CUSTOM_OPERATOR ');
        unparser.visit(args[1]);
        return true;
      default:
        return super.functionToSqlOverrides(unparser, functionName, args);
    }
  }
}

const env = new DefaultEnv(
  ...,
  func(myFuncOverload, overload(myFuncOverload, [StringType, StringType], BoolType))
);

translate(`myFunc('a', 'b')`, env, new MyDialect());
// Will output: { sql: '$1 MY_CUSTOM_OPERATOR $2', vars: ['a', 'b'] }
```
Run `nx build celql` to build the library.

Run `nx test celql` to execute the unit tests via Jest.
0.2.5 (2025-07-02)