offshore

An ORM for Node.js


Offshore is a fork of the Waterline project.

It provides a uniform API for accessing data from different kinds of databases, protocols, and third-party APIs. That means you write the same code to get and store things like users, whether they live in Redis, MySQL, LDAP, MongoDB, or Postgres.

Offshore strives to inherit the best parts of ORMs like ActiveRecord, Hibernate, and Mongoose, but with a fresh perspective and emphasis on modularity, testability, and consistency across adapters.

For detailed documentation, see the Offshore Documentation repository.

Why a fork?

Because it's the only way to integrate new features.

Changelog

What's Next?

  • polymorphic associations

Installation

Install from NPM.

$ npm install offshore

Example

Usage
var Offshore = require('offshore');

// Define your collection (aka model)
var User = Offshore.Collection.extend({

  attributes: {

    firstName: {
      type: 'string',
      required: true
    },

    lastName: {
      type: 'string',
      required: true
    }
  }
});

Overview

Adapters Concept

Offshore uses the concept of an Adapter to translate a predefined set of methods into a query that can be understood by your data store. Adapters allow you to use various datastores such as MySQL, PostgreSQL, MongoDB, Redis, etc. and have a clear API for working with your model data.

It also allows an adapter to define its own methods that don't necessarily fit into the CRUD methods defined by default in Offshore. If an adapter defines a custom method, Offshore will simply pass the function arguments down to the adapter.

NOTE: When using custom adapter methods, the features of Offshore are bypassed. You no longer get the lifecycle callbacks and validations that you get when using a built-in Offshore method.

You may also supply an array of adapters and Offshore will map out the methods so they are both mixed in. It works similarly to Underscore's extend method, where the last item in the array overrides any methods defined by the adapters before it. This allows you to mix traditional CRUD adapters such as MySQL with specialized adapters such as Twilio and have both types of methods available.
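
As a hypothetical sketch only: assuming adapters have been registered under the names mysql and twilio, and that the collection-level adapter setting accepts an array, the mixin described above might look like this:

var Recipient = Offshore.Collection.extend({

  // Methods from 'twilio' override any that 'mysql' also defines
  adapter: ['mysql', 'twilio'],

  attributes: {
    phoneNumber: {
      type: 'string',
      required: true
    }
  }
});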

Community Adapters

Using Waterline compatibility mode:

offshore.initialize({ compat: "waterline", ...

Collection

A Collection is the main object used in Offshore. It defines the layout/schema of your data along with any validations and instance methods you create.

To create a new collection you extend Offshore.Collection and add in any properties you need.

options

Available options are:

  • tableName Define a custom table name to store the models
  • adapter The name of the adapter you would like to use for this collection
  • schema Set schema to true/false to only allow fields defined in attributes to be saved. Only applies to schemaless adapters.
  • attributes A hash of attributes to be defined for a model
  • autoCreatedAt and autoUpdatedAt Set to false to prevent the createdAt and updatedAt properties from being created on your model
  • autoPK Set to false to prevent creating id. By default, id will be created as an index with auto increment.
  • lifecycle callbacks
  • any other class method you define!
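
For instance, a collection that opts out of the automatic primary key and timestamp fields could be defined like this (a minimal sketch based on the options above):

var Log = Offshore.Collection.extend({

  tableName: 'log',
  adapter: 'postgresql',

  // Disable the automatic id, createdAt and updatedAt properties
  autoPK: false,
  autoCreatedAt: false,
  autoUpdatedAt: false,

  attributes: {
    message: {
      type: 'text'
    }
  }
});
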
Attributes

The following attribute types are currently available:

  • string
  • text
  • integer
  • float
  • date
  • time
  • datetime
  • boolean
  • binary
  • array
  • json
Example Collection
var User = Offshore.Collection.extend({

  // Define a custom table name
  tableName: 'user',

  // Set schema true/false for adapters that support schemaless
  schema: true,

  // Define an adapter to use
  adapter: 'postgresql',

  // Define attributes for this collection
  attributes: {

    firstName: {
      type: 'string',

      // also accepts any validations
      required: true
    },

    lastName: {
      type: 'string',
      required: true,
      maxLength: 20
    },

    email: {

      // Special types are allowed, they are used in validations and
      // set as a string when passed to an adapter
      type: 'email',

      required: true
    },

    age: {
      type: 'integer',
      min: 18
    },

    // You can also define instance methods here
    fullName: function() {
      return this.firstName + ' ' + this.lastName
    }
  },

  /**
   * Lifecycle Callbacks
   *
   * Run before and after various stages:
   *
   * beforeValidate
   * afterValidate
   * beforeUpdate
   * afterUpdate
   * beforeCreate
   * afterCreate
   * beforeDestroy
   * afterDestroy
   */

  beforeCreate: function(values, cb) {

    // An example encrypt function defined somewhere
    encrypt(values.password, function(err, password) {
      if(err) return cb(err);

      values.password = password;
      cb();
    });
  },

  // Class Method
  doSomething: function() {
    // Do something here
  }

});

Now that a collection is defined, we can instantiate it and begin executing queries against it. All Collections take options and callback arguments.

Options will be made up of:

  • tableName Used if a table name was not defined in the Collection definition
  • adapters An object that specifies each adapter, either a custom definition or a module from npm
var postgres = require('sails-postgresql');

new User({ tableName: 'foobar', adapters: { postgresql: postgres }}, function(err, Model) {

  // We now have an instantiated collection to execute queries against
  Model.find()
  .where({ age: 21 })
  .limit(10)
  .exec(function(err, users) {
    // Now we have an array of users
  });

});

Model

Each result that gets returned from an Offshore query will be an instance of Model. This will add in any instance methods defined in your collection along with some CRUD helper methods. View the Core Instance Methods to see how they are implemented.

Default CRUD instance methods:

  • save
  • destroy
  • toObject
  • toJSON
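
A brief sketch of the save and destroy instance methods in use (the attribute values are illustrative):

User.findOne({ id: 1 }).exec(function(err, user) {
  if (err) return console.error(err);

  // Mutate the instance and persist the change
  user.lastName = 'Smith';
  user.save(function(err) {
    if (err) return console.error(err);

    // Remove the record entirely
    user.destroy(function(err) {
      // The record is gone
    });
  });
});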

If you would like to filter records and remove certain attributes you can override the toJSON method like so:

var user = Offshore.Collection.extend({

  attributes: {
    name: 'string',
    password: 'string',

    // Override toJSON instance method
    toJSON: function() {
      var obj = this.toObject();
      delete obj.password;
      return obj;
    }
  }
});

// Then on an instantiated user:
user.find({ id: 1 }).exec(function(err, model) {
  return model.toJSON(); // Will return only the name
});

Query Methods

Queries can be run with either a callback interface or with a deferred object. For building complicated queries, the deferred object method is the best choice. For convenience, promises are supported by default.

Callback Method

User.findOne({ id: 1 }, function(err, user) {
  // Do stuff here
});

Deferred Object Method

User.find()
.where({ id: { '>': 100 }})
.where({ age: 21 })
.limit(100)
.sort('name')
.exec(function(err, users) {
  // Do stuff here
});

Promises

User.findOne()
.where({ id: 2 })
.then(function(user){
    var comments = Comment.find({userId: user.id}).then(function(comments){
        return comments;
    });
    return [user.id, user.friendsList, comments];
}).spread(function(userId, friendsList, comments){
    // Promises are awesome!
}).catch(function(err){
    // An error occurred
})

Promises use the Bluebird library, so anything you do after the first then call (or spread, or catch) will be a complete Bluebird promise object. Remember, you must end the query somehow (by calling then or one of the other functions) in order to complete the database request.

Each of the following basic methods are available by default on a Collection instance:

  • findOne
  • find
  • create
  • update
  • destroy
  • count
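
A short sketch of create, update, destroy, and count (the criteria and values shown are illustrative):

// Create a new record
User.create({ firstName: 'Jane', lastName: 'Doe' }).exec(function(err, user) {
  // user is the newly created record
});

// Update every record matching the criteria
User.update({ lastName: 'Doe' }, { lastName: 'Smith' }).exec(function(err, users) {
  // users is an array of the updated records
});

// Destroy every record matching the criteria
User.destroy({ lastName: 'Smith' }).exec(function(err) {
  // Matching records have been removed
});

// Count records matching the criteria
User.count({ lastName: 'Smith' }).exec(function(err, total) {
  // total is an integer
});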

In addition you also have the following helper methods:

  • createEach
  • findOrCreateEach (DEPRECATED)
  • findOrCreate
  • findOneLike
  • findLike
  • startsWith
  • endsWith
  • contains
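
For example, findOrCreate looks a record up by criteria and creates it when no match exists; a minimal sketch:

// Find a user named Jane, or create her if she doesn't exist yet
User.findOrCreate({ firstName: 'Jane' }, { firstName: 'Jane', lastName: 'Doe' })
.exec(function(err, user) {
  // user is either the existing record or the newly created one
});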

Based on your Collection attributes you also have dynamic finders. So given a name attribute the following queries will be available:

  • findOneByName
  • findOneByNameIn
  • findOneByNameLike
  • findByName
  • findByNameIn
  • findByNameLike
  • countByName
  • countByNameIn
  • countByNameLike
  • nameStartsWith
  • nameEndsWith
  • nameContains
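
A sketch of how these dynamic finders might be called for a name attribute (the values are illustrative):

// Find a single user by exact name
User.findOneByName('Jane', function(err, user) {
  // ...
});

// Find every user whose name contains the given string
User.findByNameLike('Ja').exec(function(err, users) {
  // ...
});

// Count users with a given name
User.countByName('Jane').exec(function(err, total) {
  // ...
});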

Pagination

In addition to the other find methods, there are a few helper methods to take care of pagination:

  • skip
  • limit
  • paginate

Skip takes an integer and can be used to skip records:

User.find().skip(20);

Limit takes an integer and limits the amount of records returned:

User.find().limit(10);

And put together they create the ability to paginate through records as you would pages. For example, if I want 'page 2' of a given record set and only want to see 10 records at a time, I know that I need to skip(10) and limit(10), like so:

User.find().skip(10).limit(10);

But while we are thinking in terms of pagination, or pages, it might be easier to use the final helper, paginate:

User.find().paginate({page: 2, limit: 10});

Paginate has several options:

  • paginate() defaults options to {page: 0, limit: 10}
  • paginate({page: 2}) uses {page: 2, limit: 10} as the options
  • paginate({limit: 20}) uses {page: 0, limit: 20} as the options
  • paginate({page: 1, limit: 20}) uses {page: 1, limit: 20} as the options

It returns a deferred object so that you can continue to chain your helpers.
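
Because paginate returns a deferred object, it can be chained with the other query helpers, for example:

User.find()
.where({ age: { '>': 18 } })
.sort('name')
.paginate({ page: 2, limit: 10 })
.exec(function(err, users) {
  // The second page of ten matching users, sorted by name
});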

Sorting

Sorting can be performed in the deferred object query method sort or by adding the sort key into the criteria object. Simply specify an attribute name for natural (ascending) sort, or specify an asc or desc flag for ascending or descending orders respectively.

User.find()
.sort('roleId asc')
.sort({ createdAt: 'desc' })
.exec(function(err, users) {
  // Do stuff here
});

Validations

Validations are handled by Offshore-validator, which is based on Node Validate and supports most of the properties in node-validate. For a full list of validations see: Offshore-validator Validations.

Validations are defined directly in your Collection attributes. In addition, you may set the attribute type to any supported Offshore-validator type and Offshore will build a validation and set the schema type as a string for that attribute.

Validation rules may be defined as simple values or functions (both sync and async) that return the value to test against.

var User = Offshore.Collection.extend({

  attributes: {

    firstName: {
      type: 'string',
      required: true,
      minLength: 5,
      maxLength: 15
    },

    lastName: {
      type: 'string',
      required: true,
      minLength: 5,
      maxLength: 100
    },

    age: {
      type: 'integer',
      after: '12/12/2001'
    },

    website: {
      type: 'string',
      // Validation rule may be defined as a function. Here, an async function is mimicked.
      contains: function(cb) {
        setTimeout(function() {
          cb('http://');
        }, 1);
      }
    }
  }
});

var Event = Offshore.Collection.extend({

  attributes: {

    startDate: {
      type: 'date',
      // Validation rule functions allow you to validate values against other attributes
      before: function() {
        return this.endDate;
      }
    },

    endDate: {
      type: 'date',
      after: function() {
        return this.startDate;
      }
    }

  }

});

Custom Types

You can define your own types and their validation with the types hash. It's possible to access and compare values to other attributes.

var User = Offshore.Collection.extend({
  types: {
    point: function(latlng) {
      return latlng.x && latlng.y;
    },

    password: function(password) {
      return password === this.passwordConfirmation;
    }
  },

  attributes: {
    firstName: {
      type: 'string',
      required: true,
      minLength: 5,
      maxLength: 15
    },

    location: {
      // Note that the base type (json) still has to be defined
      type: 'json',
      point: true
    },

    password: {
      type: 'string',
      password: true
    },

    passwordConfirmation: {
      type: 'string'
    }
  }
});

Indexing

You can add an index property to any attribute to create an index if your adapter supports it. This comes in handy when performing repeated queries against a key.

var User = Offshore.Collection.extend({

  attributes: {

    serviceID: {
      type: 'integer',
      index: true
    }
  }
});

Currently Offshore doesn't support multi-column indexes in the attributes definition. If you would like to build any sort of special index, you will still need to build it manually. Also note that when adding a unique property to an attribute, an index will automatically be created for that attribute.
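
For example (a minimal sketch), marking an attribute as unique also gives it an index automatically:

var Account = Offshore.Collection.extend({

  attributes: {

    accountNumber: {
      type: 'integer',
      // A unique constraint implies an index on this attribute
      unique: true
    }
  }
});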

There is currently an issue with adding indexes to string fields. Because Offshore performs its queries in a case-insensitive manner, we are unable to use the index on a string attribute. There are some workarounds being discussed, but nothing is implemented so far. This will be updated in the near future to fully support indexes on strings.

Lifecycle Callbacks

Lifecycle callbacks are functions you can define to run at certain times in a query. They are useful for mutating data before it is created, or for generating properties before they are validated.

Callbacks run on Create

  • beforeValidate / fn(values, cb)
  • afterValidate / fn(values, cb)
  • beforeCreate / fn(values, cb)
  • afterCreate / fn(newlyInsertedRecord, cb)

Callbacks run on Update

  • beforeValidate / fn(valuesToUpdate, cb)
  • afterValidate / fn(valuesToUpdate, cb)
  • beforeUpdate / fn(valuesToUpdate, cb)
  • afterUpdate / fn(updatedRecord, cb)

Callbacks run on Destroy

  • beforeDestroy / fn(criteria, cb)
  • afterDestroy / fn(cb)
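
As a brief sketch, a beforeValidate callback that normalizes a value before validation runs on create and update (the attribute is illustrative):

var User = Offshore.Collection.extend({

  attributes: {
    email: {
      type: 'email',
      required: true
    }
  },

  // Runs before validation on both create and update
  beforeValidate: function(values, cb) {
    if (values.email) values.email = values.email.toLowerCase();
    cb();
  }
});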

Associations

With Offshore you can associate models with other models across all data stores. This means that your users can live in PostgreSQL and their photos can live in MongoDB, and you can interact with the data as if they lived together in the same database. You can also have associations that live on separate connections or in different databases within the same adapter. Read more about associations here.
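
The association syntax itself is inherited from Waterline; a hypothetical one-to-many sketch, assuming the standard collection, via, and model attribute keys and identity-based references:

var User = Offshore.Collection.extend({
  identity: 'user',
  attributes: {
    // One user has many pets
    pets: {
      collection: 'pet',
      via: 'owner'
    }
  }
});

var Pet = Offshore.Collection.extend({
  identity: 'pet',
  attributes: {
    // Each pet belongs to one user
    owner: {
      model: 'user'
    }
  }
});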

Tests

All tests are written with mocha and should be run with npm:

  $ npm test

Coverage

To generate the code coverage report, run:

  $ npm run cover

And have a look at coverage/lcov-report/index.html.
