s3-readstream

AWS S3 Read Stream made easy

A simple wrapper around the AWS S3Client GetObjectCommand byte-range request, allowing intuitive and stable smart streaming.

  • ZERO Dependencies
  • Simple interface for streaming any size file from AWS S3
  • Easily speed up or slow down the streaming at any point
  • All of the functionality you love with NodeJS Readable streams
  • Drop-in replacement for (await S3Client.send(new GetObjectCommand())).Body

AWS v3 updates

Since AWS updated their SDK to be more modular, it introduced breaking changes into version 1 of this package, so we have updated as well! Going forward, version 2 of this package will only work with the new AWS SDK v3. However, if your project still uses the AWS SDK v2, you can use the npm tag sdk to install version 1 of this package. Check out the documentation on version 1.
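
For example, assuming the sdk dist-tag mentioned above is still published, pinning to version 1 looks like this:

npm install s3-readstream@sdk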

Installing the package

To install the package:

npm install s3-readstream

Using the package

You can integrate the S3ReadStream class with the @aws-sdk/client-s3 package easily:

import { S3Client, GetObjectCommand, HeadObjectCommand } from "@aws-sdk/client-s3";
import { S3ReadStream } from 's3-readstream';
// Pass in your AWS S3 credentials (s3Env here stands in for your own config values)
const s3 = new S3Client({
    region: "us-east-1",
    credentials: {
        accessKeyId: s3Env.accessKey,
        secretAccessKey: s3Env.secret
    }
});

const bucketParams = {
  Bucket: s3Env.bucket, // S3 Bucket Path
  Key: s3Env.key // S3 file
};

// Grab the head object as usual to get the length of the file
const headObjectCommand = new HeadObjectCommand(bucketParams);
const headObject = await s3.send(headObjectCommand);

// Because the AWS SDK is now modular, pass in the `GetObjectCommand` command object
const options = {
  s3,
  command: new GetObjectCommand(bucketParams),
  maxLength: headObject.ContentLength,
  byteRange: 1024 * 1024 // 1 MiB (optional - defaults to 64 KiB)
};

// Instantiate the S3ReadStream in place of (await S3Client.send(new GetObjectCommand())).Body
const stream = new S3ReadStream(options);
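
As a quick usage sketch, the stream can then be consumed like any other Readable, for example by piping it to a local file (the output path below is just a placeholder):

import { createWriteStream } from 'fs';

// Write the S3 object to disk as it streams down
const file = createWriteStream('./my-download.mp4'); // hypothetical output path
stream.pipe(file);
file.on('finish', () => console.log('download complete'));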

Adjusting the read stream

To adjust the range of bytes grabbed from S3:

// You can adjust the range at any point during the stream (adjusting the speed)
stream.adjustByteRange(1024 * 1024 * 5); // 5 MiB
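
As a purely illustrative sketch (the thresholds below are arbitrary), you could ramp up the range once a download has been running smoothly:

let downloaded = 0;
stream.on('data', (chunk) => {
  downloaded += chunk.length;
  // after 50 MiB have arrived, grab larger ranges per S3 request
  if (downloaded > 50 * 1024 * 1024) stream.adjustByteRange(8 * 1024 * 1024);
});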

To adjust cursor position:

// You can move the cursor forward to skip ahead (or back) in the file
// By default, the cursor moves by the current byte range
stream.moveCursorForward();
stream.moveCursorBack();

// Both of these methods also take in a `bytes` parameter for finer control
stream.moveCursorForward(10 * 1024); // Move cursor forward 10 KiB in file
stream.moveCursorBack(5 * 1024); // Move cursor back 5 KiB in file

Inherited features from NodeJS Readable class

You can also use this S3ReadStream like any other NodeJS Readable stream; setting an event listener is exactly the same:

stream.on('data', (chunk) => {
  console.log(`read: ${chunk.toString()}`);
});
stream.on('end', () => {
  console.log('end');
});

To work with gzipped files:

import { createGunzip } from 'zlib';

const gunzip = createGunzip();
// pipe through gunzip to decompress the file as you stream!
stream.pipe(gunzip);
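
As an alternative sketch, Node's stream.pipeline can wire the same chain together with error handling (the output path here is just a placeholder):

import { createGunzip } from 'zlib';
import { createWriteStream } from 'fs';
import { pipeline } from 'stream';

// pipeline propagates errors from any stage and cleans up all of the streams
pipeline(stream, createGunzip(), createWriteStream('./output.txt'), (err) => {
  if (err) console.error('stream failed', err);
  else console.log('stream finished');
});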

API

S3ReadStream(options: S3ReadStreamOptions, nodeReadableStreamOptions?: ReadableOptions)

Instantiates a new S3ReadStream object.

Parameters:

  • options (S3ReadStreamOptions) - Container object to hold options
    • options.command (GetObjectCommand) - S3 get object command object
    • options.s3 (S3) - Resolved S3 object
    • options.maxLength (number) - Total length of file in S3 bucket
    • options.byteRange (number) - (optional) Range of bytes to grab in S3 getObject call (defaults to 64 KiB)
  • nodeReadableStreamOptions (ReadableOptions) - (optional) NodeJs Readable options to pass to super call
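
For illustration, and assuming the optional second argument listed above, standard NodeJS Readable options such as highWaterMark can be passed alongside the S3 options:

// Sketch: tune the internal buffer via ReadableOptions (second constructor argument)
const tunedStream = new S3ReadStream(options, { highWaterMark: 1024 * 1024 });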

adjustByteRange(bytes: number)

Adjusts the S3ReadStream._s3DataRange property. Can be used to slow down or speed up the stream by grabbing a smaller or larger range of bytes.

Parameter:

  • bytes (number) - New range of bytes to set

moveCursorForward(bytes: number)

Drains the internal buffer and adjusts the S3ReadStream._currentCusorPosition property, moving the cursor forward by bytes. If the current cursor position plus the number of bytes to move forward is greater than the length of the file, the cursor is set to the end of the file.

Parameter:

  • bytes (number) - (optional) Number of bytes to move forward (defaults to current range)

moveCursorBack(bytes: number)

Drains the internal buffer and adjusts the S3ReadStream._currentCusorPosition property, moving the cursor back by bytes. If the current cursor position minus the number of bytes to move back is less than or equal to 0, the cursor is set to the beginning of the file.

Parameter:

  • bytes (number) - (optional) Number of bytes to move back (defaults to current range)

Example

See s3-readstream in action in an HD video streaming app example and read a blog on its origins.
