s3-readstream
*Updated for AWS-SDK v3* Zero dependency S3Client streaming solution
AWS S3 Read Stream made easy
Simple wrapper around the AWS S3Client GetObjectCommand grab-by-range call, allowing intuitive and stable smart streaming in place of (await S3Client.send(new GetObjectCommand())).Body.
Since AWS updated their SDK to be more modular, it introduced breaking changes into version 1 of this package, so we have updated as well. Going forward, version 2 of this package will only work with the new AWS v3 SDK. However, if your project still uses the AWS v2 SDK, you can use the npm tag sdk to install version 1 of this package. Check out the documentation on version 1.
To install the package:
npm install s3-readstream
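If your project still uses the AWS v2 SDK, version 1 can be installed via the tag mentioned above (a sketch, assuming the tag name is literally sdk as written; run npm dist-tag ls s3-readstream to confirm the available tags):
npm install s3-readstream@sdk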
You can integrate the S3ReadStream class with the @aws-sdk/client-s3 package easily:
import { S3Client, GetObjectCommand, HeadObjectCommand } from "@aws-sdk/client-s3";
import { S3ReadStream } from 's3-readstream';

// Pass in your AWS S3 credentials
const s3 = new S3Client({
  region: "us-east-1",
  credentials: {
    accessKeyId: s3Env.accessKey,
    secretAccessKey: s3Env.secret
  }
});

const bucketParams = {
  Bucket: s3Env.bucket, // S3 bucket path
  Key: s3Env.key // S3 file
};

// Grab the HeadObject as usual to get the length of the file
const headObjectCommand = new HeadObjectCommand(bucketParams);
const headObject = await s3.send(headObjectCommand);

// Because the AWS SDK is now modular, pass in the `GetObjectCommand` command object
const options = {
  s3,
  command: new GetObjectCommand(bucketParams),
  maxLength: headObject.ContentLength,
  byteRange: 1024 * 1024 // 1 MiB (optional - defaults to 64 KiB)
};

// Instantiate the S3ReadStream in place of (await S3Client.send(new GetObjectCommand())).Body
const stream = new S3ReadStream(options);
To adjust the range of bytes grabbed from S3:
// You can adjust the range at any point during the stream (adjusting the speed)
stream.adjustByteRange(1024 * 1024 * 5); // 5 MiB
To adjust cursor position:
// You can move the cursor forwards to skip ahead (or back) in the file
// By default, the stream will skip by the current byte range
stream.moveCursorForward();
stream.moveCursorBack();
// Both of these methods also take in a `bytes` parameter for finer control
stream.moveCursorForward(10 * 1024); // Move cursor forward 10 KiB in file
stream.moveCursorBack(5 * 1024); // Move cursor back 5 KiB in file
You can also use this S3ReadStream like any other Node.js Readable stream; setting an event listener is exactly the same:
stream.on('data', (chunk) => {
  console.log(`read: ${chunk.toString()}`);
});

stream.on('end', () => {
  console.log('end');
});
To work with gzipped files:

import { createGunzip } from 'zlib';

const gzip = createGunzip();

// pipe through gunzip to decompress the file as you stream!
stream.pipe(gzip);
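For example, building on the stream and gzip objects above, you can write the decompressed output to a local file as it streams (a minimal sketch; the decompressed.txt output path is a hypothetical example):

import { createWriteStream } from 'fs';

// Hypothetical local destination for the decompressed data
const out = createWriteStream('decompressed.txt');

// S3 -> gunzip -> local file
stream.pipe(gzip).pipe(out);

out.on('finish', () => {
  console.log('decompression complete');
});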
S3ReadStream(options: S3ReadStreamOptions)
Instantiates a new S3ReadStream object.
Parameters:
options (S3ReadStreamOptions) - Container object to hold options
options.command (GetObjectCommand) - S3 get object command object
options.s3 (S3) - Resolved S3 object
options.maxLength (number) - Total length of the file in the S3 bucket
options.byteRange (number) - (optional) Range of bytes to grab in the S3 getObject call (defaults to 64 KiB)
nodeReadableStreamOptions (ReadableOptions) - (optional) Node.js Readable options to pass to the super call
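For example, since options.byteRange is optional, you can rely on the 64 KiB default (a minimal sketch reusing the s3, bucketParams, and headObject values from the example above):

// byteRange omitted, so each S3 request uses the 64 KiB default range
const defaultStream = new S3ReadStream({
  s3,
  command: new GetObjectCommand(bucketParams),
  maxLength: headObject.ContentLength
});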
adjustByteRange(bytes: number)
Adjusts the S3ReadStream._s3DataRange property. Can be used to slow down or speed up the stream by grabbing a smaller or larger range of bytes.
Parameter:
bytes (number) - New range of bytes to set
moveCursorForward(bytes: number)
Drains the internal buffer and adjusts the S3ReadStream._currentCusorPosition property, moving the cursor forward by bytes.
If the current cursor position plus the number of bytes to move forward is greater than the length of the file, the cursor is set to the end of the file.
Parameter:
bytes (number) - (optional) Number of bytes to move forward (defaults to current range)
moveCursorBack(bytes: number)
Drains the internal buffer and adjusts the S3ReadStream._currentCusorPosition property, moving the cursor back by bytes.
If the current cursor position minus the number of bytes to move back is <= 0, the cursor is set to the beginning of the file.
Parameter:
bytes (number) - (optional) Number of bytes to move back (defaults to current range)

See s3-readstream in action in an HD video streaming app example and read a blog post on its origins.