# alamo

`alamo` is a wrapper around knox that provides a higher-level abstraction for s3, with handling of response status codes and automatic parsing of XML error bodies. It also provides a consistent full (writing and reading) streaming interface, including multipart upload for large artifacts. Alamo implements automatic retries on error with exponential back-off.
## Installation

```sh
npm install --save alamo
```
## createClient(options)

Returns an s3 client. It accepts the same options as `knox.createClient`, plus the following options that control how retries work (see the `retry` module):

- `retries`: max number of retries (default 10)
- `factor`: exponential back-off factor (default 2)
- `minTimeout`: minimum delay before the first retry, in milliseconds (default 1000)
- `maxTimeout`: maximum delay between retries, in milliseconds (default 60000)

```js
var client = require('alamo').createClient({
  key: process.env.AWS_ACCESS_KEY_ID,
  secret: process.env.AWS_SECRET_ACCESS_KEY,
  bucket: process.env.AWS_BUCKET,
  region: process.env.AWS_REGION
});
```
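As a sketch of the resulting back-off schedule, assuming the `retry` module's standard delay formula `min(minTimeout * factor^attempt, maxTimeout)` without randomization (the `backoffDelays` function below is illustrative, not part of alamo's API):

```js
// Illustrative only: computes the retry delays implied by the options above,
// using the `retry` module's delay formula without randomization.
function backoffDelays(opts) {
  var delays = [];
  for (var attempt = 0; attempt < opts.retries; attempt++) {
    delays.push(Math.min(opts.minTimeout * Math.pow(opts.factor, attempt), opts.maxTimeout));
  }
  return delays;
}

// With the defaults, delays grow 1s, 2s, 4s, ... and cap at 60s:
console.log(backoffDelays({ retries: 10, factor: 2, minTimeout: 1000, maxTimeout: 60000 }));
// [ 1000, 2000, 4000, 8000, 16000, 32000, 60000, 60000, 60000, 60000 ]
```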
Access to the lower-level knox client is also available.
## client.createReadStream(filename, [headers], [options])

Returns a readable stream.

```js
var fs = require('fs');

client.createReadStream('/somekey').pipe(fs.createWriteStream('somefile'));
```

- `filename`: the s3 file name to retrieve
- `headers`: optional headers
- `options`: retry options

Alias: `readStream`
## client.createWriteStream(filename, [headers], [content], [options])

Returns a writable upload stream. You can optionally pass a buffer to upload instead of piping to it.

```js
var fs = require('fs');

var ws = client.createWriteStream('/somekey');
fs.createReadStream('somefile')
  .pipe(ws)
  .on('error', console.error)
  .on('finish', console.log.bind(console, 'Upload complete'));
```

- `filename`: the s3 file name to upload to
- `headers`: optional headers
- `content`: optional content to upload. If content is passed, it is passed to the underlying `request.end`.
- `options`: retry options

Alias: `writeStream`
## client.stream(method, filename, [headers], [content], [options])

Generic stream implementation that accepts the HTTP method as 1st argument and the file name as 2nd argument.

```js
var fs = require('fs');

var ws = client.stream('PUT', '/somekey');
fs.createReadStream('somefile')
  .pipe(ws)
  .on('error', console.error)
  .on('finish', console.log.bind(console, 'Upload complete'));
```

- `method`: the http method, e.g. `GET`, `PUT`
- `filename`: the s3 file name to upload to
- `headers`: optional headers
- `content`: optional content to upload. If content is passed, it is passed to the underlying `request.end`.
- `options`: retry options

## client.get(filename, [headers], [options], cb)

Gets an object and retrieves the response with the body.
```js
client.get('/somekey', function (err, res) {
  if (err) console.error(err);
  else console.log(res.statusCode, res.body.toString());
});
```

- `filename`: the s3 file name to retrieve
- `headers`: optional headers
- `options`: retry options
- `cb`: callback that receives an error or `null` as 1st argument, and the response with the body as 2nd argument if there is no error

## client.del(filename, [headers], [options], cb)

Deletes an object from s3.
```js
client.del('/somekey', function (err, res) {
  if (err) console.error(err);
  else console.log('object deleted %d', res.statusCode);
});
```

- `filename`: the s3 file name to delete
- `headers`: optional headers
- `options`: retry options
- `cb`: callback that receives an error or `null` as 1st argument, and the response as 2nd argument if there is no error

## client.put(filename, content, [headers], [options], cb)

Puts an object.
```js
client.put('/somekey', 'somedata', function (err, res) {
  if (err) console.error(err);
  else console.log('object uploaded %d', res.statusCode);
});
```

- `filename`: the s3 file name to upload to
- `content`: content to upload
- `headers`: optional headers
- `options`: retry options
- `cb`: callback that receives an error or `null` as 1st argument, and the response as 2nd argument if there is no error

## client.post(filename, content, [headers], [options], cb)

Posts an object.
```js
client.post('/somekey', 'somedata', function (err, res) {
  if (err) console.error(err);
  else console.log('object uploaded %d with etag %s', res.statusCode, res.headers.etag);
});
```

- `filename`: the s3 file name to post to
- `content`: content to post
- `headers`: optional headers
- `options`: retry options
- `cb`: callback that receives an error or `null` as 1st argument, and the response as 2nd argument if there is no error

## client.request(method, filename, [content], [headers], cb)

Generic non-streaming interface.
```js
client.request('PUT', '/somekey', 'somedata', function (err, res) {
  if (err) console.error(err);
  else console.log('object uploaded %d with etag %s', res.statusCode, res.headers.etag);
});
```

- `method`: the http method, e.g. `GET`, `PUT`, `DELETE`, `POST`
- `filename`: the s3 file name
- `content`: content to post
- `headers`: optional headers
- `cb`: callback that receives an error or `null` as 1st argument, and the response with the body as 2nd argument if there is no error

## client.signedUrl(filename, expiration, [options])

Returns a signed url.
```js
var url = client.signedUrl('/somekey', 1000 * 60 * 15);
console.log(url);
```

- `filename`: the s3 file name to retrieve
- `expiration`: number of milliseconds that the signed url is valid for
- `options`: signed url options passed to knox; takes `verb`, `contentType`, and a `qs` object

## client.multipart(filename, [headers], [options])

Returns a writable stream that uploads using the s3 multipart API. The stream is uploaded in chunks of 5mb in parallel, with a maximum number of concurrent uploads and automatic retries.
```js
var fs = require('fs');

var ws = client.multipart('/somekey');
fs.createReadStream('somefile')
  .pipe(ws)
  .on('error', console.error)
  .on('finish', console.log.bind(console, 'Upload complete'));
```
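Since parts are 5mb each, the number of chunks for a given upload size can be estimated as follows. This is a sketch only: the exact chunking is internal to alamo, and `partCount` is not part of its API.

```js
// Illustrative only: estimates how many 5mb parts a multipart upload splits into.
var PART_SIZE = 5 * 1024 * 1024;

function partCount(totalBytes) {
  return Math.max(1, Math.ceil(totalBytes / PART_SIZE));
}

console.log(partCount(12 * 1024 * 1024)); // a 12mb file uploads as 3 parts
```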
- `filename`: the s3 file name to upload to
- `headers`: optional headers
- `options`: retry options

## Error handling

Without alamo, handling status codes and parsing XML error bodies yourself looks something like this:

```js
var fs = require('fs');
var XML = require('simple-xml');

var req = client.get('/filename');
req.on('response', function (res) {
  if (res.statusCode !== 200) {
    // Collect the error body and try to parse it as XML.
    var body = '';
    res.on('data', function (chunk) {
      body += chunk.toString();
    });
    res.on('end', function () {
      try {
        body = XML.parse(body);
        cb(new Error(body.message));
      } catch (err) {
        cb(new Error(body));
      }
    });
    res.on('error', cb);
    return;
  }
  res.pipe(fs.createWriteStream('filename'))
    .on('error', cb)
    .on('finish', cb.bind(null, null));
});
req.on('error', cb);
```
## Contributing

Please open issues for bugs and suggestions on GitHub. Pull requests with tests are welcome.

## Author

Jerome Touffe-Blin, @jtblin, About me

## License

alamo is copyright 2015 Jerome Touffe-Blin and contributors. It is licensed under the BSD license. See the included LICENSE file for details.