> This is `mammoth@next`, which is not available on npm yet. Switch to [mammoth@master](https://github.com/Ff00ff/mammoth/tree/master) to see the current version.

Mammoth

Mammoth: A type-safe Postgres query builder for TypeScript.


npm i @ff00ff/mammoth

Mammoth is a type-safe query builder. It only supports Postgres, which we consider a feature. Its syntax is as close to SQL as possible, so you already know how to use it. Its autocomplete is excellent, and it helps you avoid mistakes so you can develop applications faster. It comes with all the features you need to build production-ready apps.


Features

  • Type-safe query builder
  • Supports Postgres only
  • Excellent autocomplete
  • Transactions
  • Connection pooling
  • Automatic camelCase to snake_case conversion
  • No build step or watch needed


Quick start

const rows = await select(list.id, list.createdAt)
  .from(list)
  .where(list.createdAt.gt(now().minus(days(2))).or(list.value.eq(0)))
  .limit(10);
SELECT list.id, list.created_at FROM list WHERE list.created_at > NOW() - $1::interval OR list.value = $2 LIMIT 10;

A select should not require declaring an additional interface explicitly.

The type of rows is automatically derived from the table. .notNull() columns are automatically required and the other columns are all optional.

const rows: {
  id: string;
  createdAt: Date;
}[];
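For example, selecting a column that is not marked .notNull(), such as list.value from the Schema section below, makes that field optional in the derived type. A minimal sketch (rowsWithValue is just an illustrative name):

// value has no .notNull(), so it is optional in the derived row type
const rowsWithValue = await select(list.id, list.value)
  .from(list)
  .limit(10);

// typeof rowsWithValue: { id: string; value?: number }[]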

Update

When executing an update query, by default, the return type is number which indicates the number of affected rows.

const numberOfUpdates = await update(list)
  .set({
    name: `New Name`,
  })
  .where(list.id.eq(`acb82ff3-3311-430e-9d1d-8ff600abee31`));
UPDATE list SET name = $1 WHERE list.id = $2

But when you use .returning(..) the return type is changed to an array of rows.

// { id: string }[]
const rows = await update(list)
  .set({
    name: `New Name`,
  })
  .where(list.id.eq(`acb82ff3-3311-430e-9d1d-8ff600abee31`))
  .returning(`id`);
UPDATE list SET name = $1 WHERE list.id = $2 RETURNING id

Insert

To insert a row you only have to specify the .notNull() columns that don't have a .default(). The other columns are optional.

const numberOfRows = await insertInto(list).values({
  name: `My List`,
});
INSERT INTO list (name) VALUES ($1)

In an earlier version of Mammoth you still had to pass undefined for nullable columns, but with some type magic this is now fixed!
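You can still provide the optional columns explicitly whenever you have a value for them. A minimal sketch using the list table from the Schema section below; it should produce SQL along these lines:

const numberOfRowsWithValue = await insertInto(list).values({
  name: `My List`,
  // value is nullable, so including it is optional
  value: 123,
});
INSERT INTO list (name, value) VALUES ($1, $2)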

Again, if you use .returning(..) the return type is changed automatically.

// { id: string, createdAt: Date, name: string }
const row = await insertInto(list)
  .values({
    name: `My List`,
  })
  .returning(`id`, `createdAt`, `name`);
INSERT INTO list (name) VALUES ($1) RETURNING id, created_at, name

If you insert an array of rows and use .returning(..), the return type changes to an array as well.

// { id: string, createdAt: Date, name: string }[]
const lists = await insertInto(list)
  .values([
    {
      name: `List #1`,
    },
    {
      name: `List #2`,
    },
  ])
  .returning(`id`, `createdAt`, `name`);
INSERT INTO list (name) VALUES ($1), ($2) RETURNING id, created_at, name

Transactions

You can call transaction(callback), which begins a transaction; depending on whether the promise you return from the callback resolves or rejects, the transaction is committed or rolled back.

It's important to use the db passed to the transaction's callback; if you don't, you're effectively executing statements outside the transaction.

const result = await transaction(async ({ insertInto }) => {
  const row = await insertInto(list)
    .values({
      name: `My List`,
    })
    .returning(`id`);

  await insertInto(listItem).values({
    listId: row.id,
    name: `My Item`,
  });

  return row;
});
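If the promise returned from the callback rejects, for example because you throw, the transaction is rolled back instead of committed. A minimal sketch, assuming the error propagates out of transaction(..):

try {
  await transaction(async ({ insertInto }) => {
    await insertInto(list).values({
      name: `My List`,
    });

    // Rejecting the returned promise rolls back the insert above.
    throw new Error(`abort this transaction`);
  });
} catch (e) {
  // The transaction was rolled back, so nothing was written.
}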

Schema

Before Mammoth can offer its type-safety features, you have to define your schema in Mammoth's syntax. The syntax is designed to be as close to SQL as possible.

defineTable({
  id: dataType(`UUID`).primary().notNull().default(`gen_random_uuid()`),
  createdAt: dataType<Date>(`TIMESTAMP WITH TIME ZONE`).notNull().default(`NOW()`),
  name: dataType(`TEXT`).notNull(),
  value: dataType<number>(`INTEGER`),
});

But to make things easier, there are data-type-specific functions. When using auto-import, this should be a breeze.

export const list = defineTable({
  id: uuid()
    .primary()
    .notNull()
    .default(`gen_random_uuid()`),
  createdAt: timestampWithTimeZone()
    .notNull()
    .default(`NOW()`),
  name: text().notNull(),
  value: integer(),
});

export const listItem = defineTable({
  id: uuid().primary().notNull().default(`gen_random_uuid()`),
  createdAt: timestampWithTimeZone().notNull().default(`now()`),
  listId: uuid().notNull().references(list, 'id'),
  name: text().notNull(),
});
CREATE TABLE list (
  id UUID PRIMARY KEY NOT NULL DEFAULT gen_random_uuid(),
  created_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(),
  name TEXT NOT NULL,
  value INTEGER
);

CREATE TABLE list_item (
  id UUID PRIMARY KEY NOT NULL DEFAULT gen_random_uuid(),
  created_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(),
  list_id UUID NOT NULL REFERENCES list (id),
  name TEXT NOT NULL
);


Migrations

The accompanying mammoth-cli helps you generate migrations based on your schema and existing migrations.

Raw queries

When Postgres introduces a new keyword that you really want to use but this library doesn't support yet, you can always fall back to raw SQL. You can mix the type-safe functions with raw SQL:

select(account.id).from(account).append`MAGIC NEW ORDER BY`;
SELECT account.id FROM account MAGIC NEW ORDER BY

You can also write a query completely in raw SQL. Obviously this is not advised, because it defeats the whole purpose of this library.

const result = await sql`SELECT * FROM account WHERE account.name = ${name}`;

Column data type

Class                             SQL data type
BinaryColumn                      BYTEA
BlobColumn                        BYTEA
ByteaColumn                       BYTEA
CaseInsensitiveTextColumn         CITEXT
CitextColumn                      CITEXT
DateColumn                        DATE
DecimalColumn                     DECIMAL
EnumColumn                        Creates an enum type
IntegerColumn                     INTEGER
IntervalColumn                    INTERVAL
JSONBColumn                       JSONB
JSONColumn                        JSON
MoneyColumn                       MONEY
NumberColumn                      Same as IntegerColumn
SerialColumn                      SERIAL
StringColumn                      Same as TextColumn
TextColumn                        TEXT
TimeColumn                        TIME
TimestampColumn                   TIMESTAMP
TimestampWithoutTimeZoneColumn    TIMESTAMP WITHOUT TIME ZONE
TimestampWithTimeZoneColumn       TIMESTAMP WITH TIME ZONE
TimeWithoutTimeZoneColumn         TIME WITHOUT TIME ZONE
TimeWithTimeZoneColumn            TIME WITH TIME ZONE
UuidColumn                        UUID

Enum alternative

Because you cannot remove values from an EnumColumn (only add or rename them), you can instead opt to use a TextColumn<T>, which lets you enforce a type in your application, e.g. TextColumn<'ONE' | 'TWO' | 'THREE'>.

export const item = defineTable({
  id: uuid()
    .primaryKey()
    .notNull()
    .default(`gen_random_uuid()`),
  value: text<'FOO' | 'BAR' | 'BAZ'>().notNull(),
});

This enforces type checking of the value column in TypeScript land.

// Allowed
await db.insertInto(item).values({ value: `FOO` });

// Not allowed
await db.insertInto(item).values({ value: `another string value` });

Of course this doesn't create any constraints at the database level like EnumColumn does. If that's something you need, you should pick EnumColumn instead.

Documents & JSON(B)

You can use JSONBColumn<T> to store JSON data. The T parameter lets you specify the type, e.g. JSONBColumn<{ foo: number }>, which makes it easier to work with JSON columns.

class MyTable {
  id = new UuidColumn()
    .primaryKey()
    .notNull()
    .default(new GenRandomUuid());
  value = new JSONBColumn<{ foo: string }>();
}

Support for the various JSON(B) functions and operators is currently limited; more is planned for a future release.

You do need to be careful when the type of your documents needs to evolve (change).
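One way to handle an evolving document type, sketched below with a hypothetical MyDocument type, is to add new fields as optional so documents written before the change still satisfy the type:

type MyDocument = {
  foo: string;
  // Added later: optional, because documents written before this change won't have it.
  bar?: number;
};

class MyTable {
  id = new UuidColumn()
    .primaryKey()
    .notNull()
    .default(new GenRandomUuid());
  value = new JSONBColumn<MyDocument>();
}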

Contribute / Set up locally

To contribute to this library, you first need to do a few things to get set up.

First, make sure you have a test Postgres database, for example mammoth_test:

$ createdb mammoth_test

If you installed Postgres using Homebrew, make sure you have a Postgres user named postgres. You can create one using this command: createuser -s postgres

Finally, make sure all the tests run and pass before making any changes. Create a .env file with the following contents.

DATABASE_URL=postgres://postgres@localhost/mammoth_test

Replace the connection string with one pointing to your local test database, then run the tests:

$ yarn test

Mammoth logo created by Eucalyp from the Noun Project.
