postgres-large-object

Large object support for PostgreSQL clients using the postgres library.

The API of this library is exactly the same as node-postgres-large-object and resembles the JDBC library for PostgreSQL.

Installation

npm install --save postgres-large-object

You will also need to install the postgres library. This package is currently only tested with postgres@beta:

npm install --save postgres@beta

Some of the methods in this library require PostgreSQL 9.3 (server) and up:

  • LargeObject.seek()
  • LargeObject.tell()
  • LargeObject.size()
  • LargeObject.truncate()

All other methods should work on PostgreSQL 8.4 and up.
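
For example, random access within an existing large object (a minimal sketch; the LargeObject export, the openAsync/sizeAsync/seekAsync/readAsync/closeAsync methods, and the READWRITE/SEEK_SET constants are assumed to mirror node-postgres-large-object, whose API this library follows):

const postgres = require('postgres')
const { LargeObjectManager, LargeObject } = require('postgres-large-object')

const sql = postgres('postgres://postgres:1234@localhost/postgres')

// When working with Large Objects, always use a transaction
sql
  .begin((tx) => {
    // NOTE: method and constant names below are assumed to match node-postgres-large-object
    const man = new LargeObjectManager(tx)

    // A LargeObject oid, probably stored somewhere in one of your own tables.
    const oid = 123

    return man.openAsync(oid, LargeObjectManager.READWRITE).then((obj) =>
      obj
        .sizeAsync()
        .then((size) => {
          console.log('Total size:', size)
          // Jump to byte 1024 from the start of the object (SEEK_SET)...
          return obj.seekAsync(1024, LargeObject.SEEK_SET)
        })
        // ...and read the next 256 bytes into a Buffer
        .then(() => obj.readAsync(256))
        .then((buffer) => {
          console.log('Read', buffer.length, 'bytes')
          return obj.closeAsync()
        })
    )
  })
  .catch((error) => {
    console.log('Something went horribly wrong!', error)
  })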

Large Objects

Large Objects in PostgreSQL let you store files/objects up to 4 TiB in size. The main benefit of using Large Objects instead of a simple column is that the data can be read and written in chunks (e.g. as a stream), instead of having to load the entire column into memory.

Examples

The library exposes a callback-style interface (for backwards compatibility) and a promise-style interface (see the API documentation). All functions that end with "Async" return a promise.
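
As a minimal sketch of the two styles side by side (assuming the open/openAsync and close/closeAsync signatures and the READ constant match node-postgres-large-object, whose API this library follows; oid 123 is a placeholder):

const postgres = require('postgres')
const { LargeObjectManager } = require('postgres-large-object')

const sql = postgres('postgres://postgres:1234@localhost/postgres')

sql.begin((tx) => {
  // NOTE: method and constant names are assumed to match node-postgres-large-object
  const man = new LargeObjectManager(tx)
  const oid = 123

  // Promise style: methods ending in "Async" return a promise
  const viaPromise = man
    .openAsync(oid, LargeObjectManager.READ)
    .then((obj) => obj.closeAsync())

  // Callback style: the same call without the "Async" suffix
  const viaCallback = new Promise((resolve, reject) => {
    man.open(oid, LargeObjectManager.READ, (error, obj) => {
      if (error) return reject(error)
      obj.close((closeError) => (closeError ? reject(closeError) : resolve()))
    })
  })

  return Promise.all([viaPromise, viaCallback])
})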

Reading a large object using a stream:

const postgres = require('postgres')
const { LargeObjectManager } = require('postgres-large-object')
const { createWriteStream } = require('fs')

const sql = postgres('postgres://postgres:1234@localhost/postgres')

// When working with Large Objects, always use a transaction
sql
  .begin((tx) => {
    const man = new LargeObjectManager(tx)

    // A LargeObject oid, probably stored somewhere in one of your own tables.
    const oid = 123

    // If you are on a high latency connection and working with
    // large LargeObjects, you should increase the buffer size.
    // The buffer should be divisible by 2048 for best performance
    // (2048 is the default page size in PostgreSQL, see LOBLKSIZE)
    const bufferSize = 16384

    return man.openAndReadableStreamAsync(oid, bufferSize).then(([size, stream]) => {
      console.log('Streaming a large object with a total size of', size)

      // Store it as an image
      const fileStream = createWriteStream('my-file.png')
      stream.pipe(fileStream)

      return new Promise((resolve, reject) => {
        stream.on('end', resolve)
        stream.on('error', reject)
      })
    })
  })
  .then(() => {
    console.log('Done!')
  })
  .catch((error) => {
    console.log('Something went horribly wrong!', error)
  })

Creating a new large object using a stream:

const postgres = require('postgres')
const { LargeObjectManager } = require('postgres-large-object')
const { createReadStream } = require('fs')

const sql = postgres('postgres://postgres:1234@localhost/postgres')

// When working with Large Objects, always use a transaction
sql
  .begin((tx) => {
    const man = new LargeObjectManager(tx)

    // If you are on a high latency connection and working with
    // large LargeObjects, you should increase the buffer size.
    // The buffer should be divisible by 2048 for best performance
    // (2048 is the default page size in PostgreSQL, see LOBLKSIZE)
    const bufferSize = 16384

    return man.createAndWritableStreamAsync(bufferSize).then(([oid, stream]) => {
      // The server has generated an oid
      console.log('Creating a large object with the oid', oid)

      const fileStream = createReadStream('upload-my-file.png')
      fileStream.pipe(stream)

      return new Promise((resolve, reject) => {
        stream.on('finish', resolve)
        stream.on('error', reject)
      })
    })
  })
  .then(() => {
    console.log('Done!')
  })
  .catch((error) => {
    console.log('Something went horribly wrong!', error)
  })
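
Removing a large object once nothing in your tables references it anymore (a sketch assuming an unlinkAsync(oid) method as in node-postgres-large-object):

const postgres = require('postgres')
const { LargeObjectManager } = require('postgres-large-object')

const sql = postgres('postgres://postgres:1234@localhost/postgres')

// When working with Large Objects, always use a transaction
sql
  .begin((tx) => {
    const man = new LargeObjectManager(tx)

    // The oid you previously stored in one of your own tables
    const oid = 123

    // NOTE: unlinkAsync is assumed to match node-postgres-large-object;
    // it deletes the large object on the server, invalidating the oid
    return man.unlinkAsync(oid)
  })
  .then(() => {
    console.log('Done!')
  })
  .catch((error) => {
    console.log('Something went horribly wrong!', error)
  })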

Testing

You can test this library by running:

npm install postgres-large-object
npm test

The tests assume that postgres://nodetest:nodetest@localhost/nodetest is a valid database. Alternatively, specify a connection URL with the POSTGRES_URL environment variable.
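
For example, with a hypothetical local database:

POSTGRES_URL=postgres://user:password@localhost/mydb npm test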
