keystone-s3-upload-adapter

[![NPM](https://nodei.co/npm/keystone-s3-upload-adapter.png)](https://nodei.co/npm/keystone-s3-upload-adapter/)

Latest version: 1.0.7

S3-based storage adapter for KeystoneJS


This is an alternative to the official KeystoneJS adapter for S3File uploads. The main issue with that library is that it is built on knox, which is no longer up to date, so this package is an aws-sdk implementation of the same adapter.

It also adds support for non-US S3 buckets, which knox could not handle in full.

The usage is exactly the same as that of the official version; only the internals differ, so users do not have to change their existing code to use this package.

This adapter is designed to replace the existing S3File field in KeystoneJS using the new storage API.

Compatible with Node.js 0.12+

Usage

Install Package:

npm install --save keystone-s3-upload-adapter

Configure the storage adapter:

var s3Storage = new keystone.Storage({
    adapter: require('keystone-s3-upload-adapter'),
    s3: {
        key: 's3-key', // required; defaults to process.env.S3_KEY
        secret: 'secret', // required; defaults to process.env.S3_SECRET
        bucket: 'bucket', // required; defaults to process.env.S3_BUCKET
        region: 'region', // optional; defaults to process.env.S3_REGION, or if that's not specified, us-east-1
        path: 'images',
        headers: {
            'x-amz-acl': 'public-read', // add default headers; see below for details
        },
    },
    schema: {
        bucket: true, // optional; store the bucket the file was uploaded to in your db
        etag: true, // optional; store the etag for the resource
        path: true, // optional; store the path of the file in your db
        url: true, // optional; generate & store a public URL
    },
});
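Since key, secret, and bucket all fall back to environment variables, the credentials can stay out of source control entirely. A minimal sketch, assuming S3_KEY, S3_SECRET, and S3_BUCKET are set in the environment (the path and schema choices here are illustrative):

```javascript
// Assumes process.env.S3_KEY, S3_SECRET and S3_BUCKET are set, so no
// credentials need to appear in the source tree.
var keystone = require('keystone');

var s3Storage = new keystone.Storage({
    adapter: require('keystone-s3-upload-adapter'),
    s3: {
        path: 'uploads', // illustrative base path inside the bucket
    },
    schema: {
        url: true, // store a public URL for each uploaded file
    },
});
```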

Use it as the storage for a File field in your Keystone model, for example:

imageUpload: {
    type: Types.File,
    storage: s3Storage,
    filename: function (item, file) {
        return encodeURI(item._id + '-' + item.name);
    },
},
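Putting the configuration and the field together, a complete model might look like the following sketch. The Post list and its field names are hypothetical, not part of the adapter:

```javascript
var keystone = require('keystone');
var Types = keystone.Field.Types;

// Storage configured with credentials taken from environment variables
var s3Storage = new keystone.Storage({
    adapter: require('keystone-s3-upload-adapter'),
    s3: { path: 'uploads' },
});

// Hypothetical "Post" list using the S3-backed storage
var Post = new keystone.List('Post');

Post.add({
    name: { type: String, required: true },
    imageUpload: {
        type: Types.File,
        storage: s3Storage,
        filename: function (item, file) {
            // Prefix with the item id so each document gets a unique key
            return encodeURI(item._id + '-' + item.name);
        },
    },
});

Post.register();
```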

Options:

The adapter requires an additional s3 field added to the storage options. It accepts the following values:

  • key: (required) AWS access key. Configure your AWS credentials in the IAM console.

  • secret: (required) AWS access secret.

  • bucket: (required) S3 bucket to upload files to. The bucket must be created before it can be used; configure it in the AWS S3 console.

  • region: AWS region to connect to. AWS buckets are global, but a nearby region will let you upload and download files faster. Defaults to 'us-standard' (the legacy name for us-east-1). E.g. 'us-west-2'.

  • path: Storage path inside the bucket. By default uploaded files will be stored in the root of the bucket. You can override this by specifying a base path here. Path can be either absolute, for example '/images/profilepics', or relative, for example 'images/profilepics'.

  • headers: Default headers to add when uploading files to S3. You can use these headers to configure lots of additional properties and store (small) extra data about the files in S3 itself. See AWS documentation for options. Examples: {"x-amz-acl": "public-read"} to override the bucket ACL and make all uploaded files globally readable.
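As a concrete sketch, headers can set caching behaviour alongside the ACL. The values below are illustrative choices, not adapter defaults:

```javascript
// Illustrative headers: a public-read ACL plus a one-day cache lifetime
s3: {
    // ...key, secret, bucket as above...
    headers: {
        'x-amz-acl': 'public-read',
        'Cache-Control': 'max-age=86400',
    },
},
```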

Schema

The S3 adapter supports all the standard Keystone file schema fields. It also supports storing the following values per-file:

  • bucket, path: The bucket, and the path within the bucket, where the file is stored can be saved in the database. If these are present when reading or deleting files, they are used instead of the adapter configuration. The effect of this is that some (e.g. old) files in your collection can be stored in a different bucket, or under a different path inside your bucket.

The main use of this is to allow slow data migrations. If you don't store these values you can arguably migrate your data more easily - just move it all, then reconfigure and restart your server.

  • etag: The etag of the stored item. This is equal to the MD5 sum of the file content.


Package last updated on 14 Feb 2018
