s3-sync-client
AWS CLI s3 sync for Node.js is a modern TypeScript client to perform S3 sync operations between file systems and S3 buckets, in the spirit of the official AWS CLI command.
AWS CLI installation is NOT required by this module.
The following AWS CLI sync options are supported: --delete, --dryrun, --size-only, --include, --exclude, --follow-symlinks and --no-follow-symlinks (see the mapping sketch below).
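For orientation, here is a hedged sketch mapping each supported CLI flag to the corresponding client option (option names per the API reference below; the values are illustrative, and sync is assumed to be obtained as in the setup shown next):
// --delete -> del, --dryrun -> dryRun, --size-only -> sizeOnly,
// --follow-symlinks / --no-follow-symlinks -> followSymlinks,
// --include / --exclude -> filters
await sync('/path/to/local/dir', 's3://mybucket', {
  del: true,
  dryRun: true,
  sizeOnly: true,
  followSymlinks: false,
  filters: [{ exclude: () => true }, { include: (key) => key.endsWith('.txt') }],
});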
npm i s3-sync-client
S3SyncClient is a wrapper for the AWS SDK S3Client class.
import { S3Client } from '@aws-sdk/client-s3';
import { S3SyncClient } from 's3-sync-client';
const s3Client = new S3Client({ /* ... */ });
const { sync } = new S3SyncClient({ client: s3Client });
// aws s3 sync /path/to/local/dir s3://mybucket2
await sync('/path/to/local/dir', 's3://mybucket2');
await sync('/path/to/local/dir', 's3://mybucket2', { partSize: 100 * 1024 * 1024 }); // uses multipart uploads for files larger than 100 MB
// aws s3 sync /path/to/local/dir s3://mybucket2/zzz --delete
await sync('/path/to/local/dir', 's3://mybucket2/zzz', { del: true });
// aws s3 sync s3://mybucket /path/to/some/local --delete
await sync('s3://mybucket', '/path/to/some/local', { del: true });
// aws s3 sync s3://mybucket2 /path/to/local/dir --dryrun
const diff = await sync('s3://mybucket2', '/path/to/local/dir', { dryRun: true });
console.log(diff); // log operations to perform
// aws s3 sync s3://my-source-bucket s3://my-target-bucket --delete
await sync('s3://my-source-bucket', 's3://my-target-bucket', { del: true });
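Transfer concurrency can also be tuned via the maxConcurrentTransfers option documented in the API reference below; a minimal sketch (the value 200 is arbitrary):
// cap the number of transfer promises running at once
await sync('s3://mybucket', '/path/to/local/dir', { maxConcurrentTransfers: 200 });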
import { TransferMonitor } from 's3-sync-client';
const monitor = new TransferMonitor();
monitor.on('progress', (progress) => console.log(progress));
await sync('s3://mybucket', '/path/to/local/dir', { monitor });
/* output:
...
{
size: { current: 11925, total: 35688 },
count: { current: 3974, total: 10000 }
}
...
*/
// to pull status info only occasionally, use monitor.getStatus():
const timeout = setInterval(() => console.log(monitor.getStatus()), 2000);
try {
await sync('s3://mybucket', '/path/to/local/dir', { monitor });
} finally {
clearInterval(timeout);
}
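Building on the progress payload shown above, a small sketch that renders progress as a percentage (assuming size.total is non-zero once the transfer has started):
monitor.on('progress', ({ size }) => {
  const percent = ((100 * size.current) / size.total).toFixed(1); // e.g. 33.4
  console.log(`${percent}% (${size.current} of ${size.total} bytes)`);
});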
import { AbortController } from '@aws-sdk/abort-controller';
import { TransferMonitor } from 's3-sync-client';
const abortController = new AbortController();
setTimeout(() => abortController.abort(), 30000);
await sync('s3://mybucket', '/path/to/local/dir', { abortSignal: abortController.signal });
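On abort, the pending sync call is expected to reject; the sketch below assumes the SDK surfaces this as an error named AbortError (an assumption, not confirmed by this README):
try {
  await sync('s3://mybucket', '/path/to/local/dir', { abortSignal: abortController.signal });
} catch (err) {
  if (err.name === 'AbortError') { // assumption: aborts surface as AbortError
    console.log('sync aborted before completion');
  } else {
    throw err;
  }
}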
import mime from 'mime-types';
/*
commandInput properties can either be:
- a plain object
- a function returning a plain object
*/
// set ACL, fixed value
await sync('s3://mybucket', '/path/to/local/dir', {
commandInput: {
ACL: 'aws-exec-read',
},
});
// set content type, dynamic value (function)
await sync('s3://mybucket1', 's3://mybucket2', {
commandInput: (input) => ({
ContentType: mime.lookup(input.Key) || 'text/html',
}),
});
// aws s3 sync s3://my-source-bucket s3://my-target-bucket --exclude "*" --include "*.txt" --include "flowers/*"
await sync('s3://my-source-bucket', 's3://my-target-bucket', {
filters: [
{ exclude: () => true }, // exclude everything
{ include: (key) => key.endsWith('.txt') }, // include .txt files
{ include: (key) => key.startsWith('flowers/') }, // also include everything inside the flowers folder
],
});
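Filters combine with the del and deleteExcluded options documented in the API reference below; a hedged sketch (bucket names and keys are illustrative):
// delete excluded target objects even if they match a source object
await sync('s3://my-source-bucket', 's3://my-target-bucket', {
  del: true,
  deleteExcluded: true,
  filters: [{ exclude: (key) => key.startsWith('logs/') }],
});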
// move objects from source folder a/b/ to target folder zzz/
await sync('s3://my-source-bucket', 's3://my-target-bucket', {
relocations: [ // multiple relocations can be applied
(currentPath) =>
currentPath.startsWith('a/b/')
? currentPath.replace('a/b/', 'zzz/')
: currentPath
],
});
// aws s3 sync s3://mybucket/flowers/red /path/to/local/dir
// as in the CLI, the folder flowers/red will be flattened during sync
await sync('s3://mybucket/flowers/red', '/path/to/local/dir');
Note: relocations are applied after all other options, such as filters.
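Since filters run before relocations, filter callbacks receive the original object keys; a minimal sketch combining both (paths are illustrative):
await sync('s3://my-source-bucket', 's3://my-target-bucket', {
  filters: [
    { exclude: () => true },
    { include: (key) => key.startsWith('a/b/') }, // matches original keys
  ],
  relocations: [
    (currentPath) => currentPath.replace(/^a\/b\//, 'zzz/'), // applied after filtering
  ],
});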
Additional examples are available in the repo test directory.
A complete API reference is available in the repo docs directory.
new S3SyncClient(options)
- options <S3SyncClientConfig>
  - client <S3Client> Instance of the AWS SDK S3Client.

client.sync(localDir, bucketPrefix[, options])
Sync from file system to S3 bucket.
Similar to the AWS CLI aws s3 sync localDir bucketPrefix [options].
- localDir <string> Local directory
- bucketPrefix <string> Remote bucket name, which may contain a prefix appended with a / separator
- options <SyncBucketWithLocalOptions>
  - dryRun <boolean> Equivalent to the CLI --dryrun option
  - del <boolean> Equivalent to the CLI --delete option
  - deleteExcluded <boolean> Delete excluded target objects even if they match a source object. Ignored if del is false. See this CLI issue.
  - sizeOnly <boolean> Equivalent to the CLI --size-only option
  - followSymlinks <boolean> Equivalent to the CLI --follow-symlinks option (default true)
  - relocations <Relocation[]> Allows uploading objects to remote folders without mirroring the source directory structure. Each relocation is a callback taking a POSIX path string and returning a relocated POSIX path string.
  - filters <Filter[]> Almost equivalent to the CLI --exclude and --include options. Filters are plain objects with either an include or an exclude property, a function that takes an object key and returns a boolean.
  - abortSignal <AbortSignal> Allows aborting the sync
  - commandInput <CommandInput<PutObjectCommandInput>> | <CommandInput<CreateMultipartUploadCommandInput>> Set any of the SDK <PutObjectCommandInput> or <CreateMultipartUploadCommandInput> options on uploads
  - monitor <TransferMonitor>
    - attach to the progress event to receive upload progress notifications
    - call getStatus() to retrieve progress info on demand
  - maxConcurrentTransfers <number> Each upload generates a Promise which is resolved when a local object is written to the S3 bucket. This parameter sets the maximum number of upload promises that might be running concurrently.
  - partSize <number> Set the part size in bytes for multipart uploads. Defaults to 5 MB.

client.sync(bucketPrefix, localDir[, options])
Sync from S3 bucket to file system.
Similar to the AWS CLI aws s3 sync bucketPrefix localDir [options].
- bucketPrefix <string> Remote bucket name, which may contain a prefix appended with a / separator
- localDir <string> Local directory
- options <SyncLocalWithBucketOptions>
  - dryRun <boolean> Equivalent to the CLI --dryrun option
  - del <boolean> Equivalent to the CLI --delete option
  - deleteExcluded <boolean> Delete excluded target objects even if they match a source object. Ignored if del is false. See this CLI issue.
  - sizeOnly <boolean> Equivalent to the CLI --size-only option
  - followSymlinks <boolean> Equivalent to the CLI --follow-symlinks option (default true)
  - relocations <Relocation[]> Allows downloading objects to local directories without mirroring the source folder structure. Each relocation is a callback taking a POSIX path string and returning a relocated POSIX path string.
  - filters <Filter[]> Almost equivalent to the CLI --exclude and --include options. Filters are plain objects with either an include or an exclude property, a function that takes an object key and returns a boolean.
  - abortSignal <AbortSignal> Allows aborting the sync
  - commandInput <CommandInput<GetObjectCommandInput>> Set any of the SDK <GetObjectCommandInput> options on downloads
  - monitor <TransferMonitor>
    - attach to the progress event to receive download progress notifications
    - call getStatus() to retrieve progress info on demand
  - maxConcurrentTransfers <number> Each download generates a Promise which is resolved when a remote object is written to the local file system. This parameter sets the maximum number of download promises that might be running concurrently.

client.sync(sourceBucketPrefix, targetBucketPrefix[, options])
Sync from S3 bucket to S3 bucket.
Similar to the AWS CLI aws s3 sync sourceBucketPrefix targetBucketPrefix [options].
- sourceBucketPrefix <string> Remote reference bucket name, which may contain a prefix appended with a / separator
- targetBucketPrefix <string> Remote bucket name to sync, which may contain a prefix appended with a / separator
- options <SyncBucketWithBucketOptions>
  - dryRun <boolean> Equivalent to the CLI --dryrun option
  - del <boolean> Equivalent to the CLI --delete option
  - deleteExcluded <boolean> Delete excluded target objects even if they match a source object. Ignored if del is false. See this CLI issue.
  - sizeOnly <boolean> Equivalent to the CLI --size-only option
  - relocations <Relocation[]> Allows copying objects to remote folders without mirroring the source folder structure. Each relocation is a callback taking a POSIX path string and returning a relocated POSIX path string.
  - filters <Filter[]> Almost equivalent to the CLI --exclude and --include options. Filters are plain objects with either an include or an exclude property, a function that takes an object key and returns a boolean.
  - abortSignal <AbortSignal> Allows aborting the sync
  - commandInput <CommandInput<CopyObjectCommandInput>> Set any of the SDK <CopyObjectCommandInput> options on copy operations
  - monitor <TransferMonitor>
    - attach to the progress event to receive copy progress notifications
    - call getStatus() to retrieve progress info on demand
  - maxConcurrentTransfers <number> Each copy generates a Promise which is resolved after the object has been copied. This parameter sets the maximum number of copy promises that might be running concurrently.

See CHANGELOG.md.
AWS CLI s3 sync for Node.js was developed to overcome the S3 sync limitations found in existing GitHub repositories and npm modules.