
@effect/sql
A SQL toolkit for Effect.
import { Config, Effect, pipe } from "effect"
import * as Sql from "@effect/sql-pg"

const SqlLive = Sql.client.layer({
  database: Config.succeed("effect_pg_dev")
})

const program = Effect.gen(function* (_) {
  const sql = yield* _(Sql.client.PgClient)

  const people = yield* _(
    sql<{
      readonly id: number
      readonly name: string
    }>`SELECT id, name FROM people`
  )

  yield* _(Effect.log(`Got ${people.length} results!`))
})

pipe(program, Effect.provide(SqlLive), Effect.runPromise)
sqlfx

If you are coming from the sqlfx package, here are some differences that should be noted:
For example, to create the client Layer, instead of:
import { Config } from "effect"
import * as Sql from "@sqlfx/pg"

const SqlLive = Sql.makeLayer({
  database: Config.succeed("effect_pg_dev")
})
You now do:
import { Config } from "effect"
import * as Sql from "@effect/sql-pg"

const SqlLive = Sql.client.layer({
  database: Config.succeed("effect_pg_dev")
})
To continue using your sqlfx migrations table, you can setup your migrator Layer as below:
import { Layer } from "effect"
import { fileURLToPath } from "node:url"
import * as Sql from "@effect/sql-pg"

const MigratorLive = Layer.provide(
  Sql.migrator.layer({
    loader: Sql.migrator.fromFileSystem(
      fileURLToPath(new URL("migrations", import.meta.url))
    ),
    table: "sqlfx_migrations"
  }),
  SqlLive
)
Or you can rename the sqlfx_migrations table to effect_sql_migrations.
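If you go the rename route, a single standard SQL statement does it (run once against your database before starting the app):

```sql
ALTER TABLE sqlfx_migrations RENAME TO effect_sql_migrations;
```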
Some APIs have been renamed:

sql.resolver -> Sql.resolver.ordered
sql.resolverVoid -> Sql.resolver.void
sql.resolverId -> Sql.resolver.findById
sql.resolverIdMany -> Sql.resolver.grouped
sql.resolverSingle* -> removed in favour of using the effect/Cache module with the schema APIs
sql.schema -> Sql.schema.findAll
sql.schemaSingle -> Sql.schema.single
sql.schemaSingleOption -> Sql.schema.findOne
sql.schemaVoid -> Sql.schema.void

In sqlfx you could pass an array to the sql(array) function to pass a list of items to a SQL IN clause. Now you have to use sql.in(array).
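To see the shape of fragment such a helper produces, here is a minimal self-contained sketch. The `sqlIn` helper and `Fragment` type below are hypothetical illustrations, not @effect/sql's actual implementation:

```typescript
// Hypothetical sketch: an IN-clause helper that returns a SQL fragment
// (placeholder text plus the bound values), keeping values parameterized.
// NOT the @effect/sql implementation -- illustration only.
interface Fragment {
  readonly text: string
  readonly values: ReadonlyArray<unknown>
}

const sqlIn = (column: string, items: ReadonlyArray<unknown>): Fragment => ({
  // one "?" placeholder per item, with the identifier quoted
  text: `"${column}" IN (${items.map(() => "?").join(",")})`,
  values: items
})

const frag = sqlIn("name", ["Alice", "Bob", "Carol"])
console.log(frag.text)   // "name" IN (?,?,?)
console.log(frag.values) // [ 'Alice', 'Bob', 'Carol' ]
```

The point of the fragment shape is that user input only ever travels in `values`, never in the SQL text itself.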
import { Effect, Struct } from "effect"
import * as Schema from "@effect/schema/Schema"
import * as Sql from "@effect/sql-pg"

class Person extends Schema.Class<Person>("Person")({
  id: Schema.Number,
  name: Schema.String,
  createdAt: Schema.DateFromSelf,
  updatedAt: Schema.DateFromSelf
}) {}

const InsertPersonSchema = Schema.Struct(
  Struct.omit(Person.fields, "id", "createdAt", "updatedAt")
)

export const makePersonService = Effect.gen(function* (_) {
  const sql = yield* _(Sql.client.PgClient)

  const InsertPerson = yield* _(
    Sql.resolver.ordered("InsertPerson", {
      Request: InsertPersonSchema,
      Result: Person,
      execute: (requests) =>
        sql`
          INSERT INTO people
          ${sql.insert(requests)}
          RETURNING people.*
        `
    })
  )

  const insert = InsertPerson.execute

  return { insert }
})
import { Effect } from "effect"
import * as Schema from "@effect/schema/Schema"
import * as Sql from "@effect/sql-pg"

class Person extends Schema.Class<Person>("Person")({
  id: Schema.Number,
  name: Schema.String,
  createdAt: Schema.DateFromSelf,
  updatedAt: Schema.DateFromSelf
}) {}

export const makePersonService = Effect.gen(function* (_) {
  const sql = yield* _(Sql.client.PgClient)

  const GetById = yield* _(
    Sql.resolver.findById("GetPersonById", {
      Id: Schema.Number,
      Result: Person,
      ResultId: (_) => _.id,
      execute: (ids) => sql`SELECT * FROM people WHERE ${sql.in("id", ids)}`
    })
  )

  const getById = (id: number) =>
    Effect.withRequestCaching("on")(GetById.execute(id))

  return { getById }
})
import { Effect } from "effect"
import * as Sql from "@effect/sql-pg"

export const make = (limit: number) =>
  Effect.gen(function* (_) {
    const sql = yield* _(Sql.client.PgClient)
    const statement = sql`SELECT * FROM people LIMIT ${limit}`
    // e.g. SELECT * FROM people LIMIT ?
  })
import { Effect } from "effect"
import * as Sql from "@effect/sql-pg"

const table = "people"

export const make = (limit: number) =>
  Effect.gen(function* (_) {
    const sql = yield* _(Sql.client.PgClient)
    const statement = sql`SELECT * FROM ${sql(table)} LIMIT ${limit}`
    // e.g. SELECT * FROM "people" LIMIT ?
  })
import { Effect } from "effect"
import * as Sql from "@effect/sql-pg"

type OrderBy = "id" | "created_at" | "updated_at"
type SortOrder = "ASC" | "DESC"

export const make = (orderBy: OrderBy, sortOrder: SortOrder) =>
  Effect.gen(function* (_) {
    const sql = yield* _(Sql.client.PgClient)
    const statement = sql`SELECT * FROM people ORDER BY ${sql(orderBy)} ${sql.unsafe(sortOrder)}`
    // e.g. SELECT * FROM people ORDER BY "id" ASC
  })
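Note that `sql.unsafe` interpolates its argument into the statement verbatim: keywords such as ASC/DESC cannot be sent as bound parameters, so they must be validated before interpolation. A minimal allowlist check, in plain TypeScript independent of the library, might look like this:

```typescript
// ASC/DESC are SQL keywords, not values, so they cannot be bound as
// query parameters -- narrow untrusted input to an allowlist first.
type SortOrder = "ASC" | "DESC"

const parseSortOrder = (input: string): SortOrder => {
  const upper = input.toUpperCase()
  if (upper === "ASC" || upper === "DESC") return upper
  throw new Error(`Invalid sort order: ${input}`)
}

console.log(parseSortOrder("desc")) // DESC
```

The `SortOrder` union type in the example above performs the same job at compile time; the runtime check matters when the value originates from an HTTP request or other untrusted source.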
import { Effect } from "effect"
import * as Sql from "@effect/sql-pg"

export const make = (names: string[], cursor: string) =>
  Effect.gen(function* (_) {
    const sql = yield* _(Sql.client.PgClient)
    const statement = sql`SELECT * FROM people WHERE ${sql.and([
      sql.in("name", names),
      sql`created_at < ${cursor}`
    ])}`
    // SELECT * FROM people WHERE ("name" IN (?,?,?) AND created_at < ?)
  })
import { Effect } from "effect"
import * as Sql from "@effect/sql-pg"

export const make = (names: string[], cursor: Date) =>
  Effect.gen(function* (_) {
    const sql = yield* _(Sql.client.PgClient)
    const statement = sql`SELECT * FROM people WHERE ${sql.or([
      sql.in("name", names),
      sql`created_at < ${cursor}`
    ])}`
    // SELECT * FROM people WHERE ("name" IN (?,?,?) OR created_at < ?)
  })
import { Effect } from "effect"
import * as Sql from "@effect/sql-pg"

export const make = (names: string[], afterCursor: Date, beforeCursor: Date) =>
  Effect.gen(function* (_) {
    const sql = yield* _(Sql.client.PgClient)
    const statement = sql`SELECT * FROM people WHERE ${sql.or([
      sql.in("name", names),
      sql.and([
        sql`created_at > ${afterCursor}`,
        sql`created_at < ${beforeCursor}`
      ])
    ])}`
    // SELECT * FROM people WHERE ("name" IN (?,?,?) OR (created_at > ? AND created_at < ?))
  })
A Migrator module is provided for running migrations.
Migrations are forward-only and are written in TypeScript as Effects.
Here is an example migration:
// src/migrations/0001_add_users.ts
import { Effect } from "effect"
import * as Sql from "@effect/sql-pg"

export default Effect.flatMap(
  Sql.client.PgClient,
  (sql) => sql`
    CREATE TABLE users (
      id serial PRIMARY KEY,
      name varchar(255) NOT NULL,
      created_at TIMESTAMP NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMP NOT NULL DEFAULT NOW()
    )
  `
)
To run your migrations:
// src/main.ts
import { Config, Effect, Layer, pipe } from "effect"
import { NodeContext, NodeRuntime } from "@effect/platform-node"
import * as Sql from "@effect/sql-pg"
import { fileURLToPath } from "node:url"

const program = Effect.gen(function* (_) {
  // ...
})

const SqlLive = Sql.client.layer({
  database: Config.succeed("example_database")
})

const MigratorLive = Sql.migrator
  .layer({
    loader: Sql.migrator.fromFileSystem(
      fileURLToPath(new URL("migrations", import.meta.url))
    ),
    // Where to put the `_schema.sql` file
    schemaDirectory: "src/migrations"
  })
  .pipe(Layer.provide(SqlLive))

const EnvLive = Layer.mergeAll(SqlLive, MigratorLive).pipe(
  Layer.provide(NodeContext.layer)
)

pipe(program, Effect.provide(EnvLive), NodeRuntime.runMain)