An unofficial package to easily deal with the Backblaze B2 API on Node.js:
import Bucket from "backblaze";
const bucket = Bucket("bucket-name", {
id: process.env.B2_ID,
key: process.env.B2_KEY
});
console.log(await bucket.list());
// [{ name: 'favicon.png', ...}, { name: 'hello.png', ...}, ...]
// Upload a local file under an auto-generated name
const file = await bucket.upload("./avatar.png");
// Let's download it now as a copy locally
await bucket.download(file, "./avatar-copy.png");
Please note that file paths are relative to the working directory, as in Node.js' fs. You can always provide absolute paths.
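For example, a relative path is resolved from wherever the Node.js process was started, while an absolute path is used as-is. A minimal sketch with hypothetical file names:
// Relative to the current working directory (process.cwd()):
await bucket.upload("./images/avatar.png");
// Absolute path, independent of the working directory:
await bucket.upload("/home/me/images/avatar.png");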
You can work with multiple buckets as well by creating them as expected:
const bucket1 = Bucket('bucket-name-1', { ... });
const bucket2 = Bucket('bucket-name-2', { ... });
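Since the buckets are independent from each other, the documented download() and upload() calls can be combined to copy a file between them through a local temporary copy. A sketch, assuming avatar.png exists in the first bucket:
// download() resolves to the local path, which can be fed straight into upload()
const localCopy = await bucket1.download("avatar.png", "./tmp-avatar.png");
await bucket2.upload(localCopy, "avatar.png");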
All of the methods are async, so they should be used with await:
File: a description of the remote file used in the API.
Bucket(name, { id, key }): initialize the API with the credentials.
bucket.info(): load some information related to the bucket itself.
bucket.list([prefix]): show a list with all of the files in your bucket.
bucket.count(): display the number of items inside a bucket.
bucket.upload(local, remote): upload a local file to the bucket.
bucket.download(remote, local): download a file from the bucket into the server.
bucket.read(remote): read a file straight away.
bucket.exists(remote): check whether a file exists on the bucket.
bucket.remove(remote): delete a file from the bucket.
A File is a plain object with these properties:
name: the filename, same as the one listed with bucket.list().
type: the MIME type of the file, e.g. image/png.
size: the size of the file in bytes.
url: the URL of the file, especially useful if the bucket is public so you can save this URL straight away. It has the shape https://fNNN.backblazeb2.com/file/BUCKET/FILE, where NNN depends on your account, BUCKET is the bucket name and FILE is the file name.
timestamp: the upload timestamp as an instance of a native Date() object.
This File structure is useful to define since it appears in different parts of the API, like the .list() and .upload() APIs:
const file = await bucket.upload(...);
console.log(file);
// {
// name: 'kwergvckwsdb.png',
// type: 'image/png',
// size: 11554,
// url: 'https://fNNN.backblazeb2.com/file/BUCKET/kwergvckwsdb.png'
// timestamp: new Date(...)
// }
Note that it is not a class or an instance of anything, just a shared plain object structure.
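Because a File is a plain object, it can be destructured or spread like any other object. A minimal sketch reusing the upload() call from above:
const file = await bucket.upload("./avatar.png");
// Plain destructuring; there are no methods or prototypes involved
const { name, url } = file;
console.log(name, url);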
Create an instance that can communicate with the specified bucket:
import Bucket from "backblaze";
const bucket = Bucket("bucket-name", { id, key });
// await bucket.upload();
// await bucket.download();
// ...
You should not use the new or await keywords when creating a bucket.
It receives the bucket name first, and then a config object, which should preferably come from environment variables:
const bucket = Bucket("bucket-demo", {
id: process.env.B2_ID,
key: process.env.B2_KEY
});
The id and key fields correspond to the ones shown in the Backblaze interface when creating a new application key.
The id and key, and the second parameter altogether, can be skipped if the environment variables B2_ID and B2_KEY have been set respectively:
const bucket = Bucket("bucket-demo");
The bucket starts loading as soon as it is initialized like this, and if it hasn't finished loading by the time it's used, the first operation will wait for it. That is why you don't need the await or new keywords.
If you really want to wait for it to finish auth and other loading before using it, you can force it like this:
const bucket = Bucket("bucket-demo", {
id: process.env.B2_ID,
key: process.env.B2_KEY
});
await bucket.info();
Load some information related to the bucket itself:
const info = await bucket.info();
console.log(info);
// {
// accountId: '...',
// bucketId: '...',
// bucketInfo: {},
// bucketName: '...',
// bucketType: '...',
// corsRules: [],
// lifecycleRules: [],
// options: [ 's3' ],
// revision: 2,
// baseURL: 'https://fNNN.backblazeb2.com/file/BUCKET/'
// }
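As a hypothetical use of this data, the baseURL field can be concatenated with a file name to build its public URL manually (assuming the bucket is public); note that the url property of each File already gives you this:
const { baseURL } = await bucket.info();
// Hypothetical: compose the public URL of a known file name
const faviconUrl = baseURL + "favicon.png";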
Show a list with all of the files in your bucket. Each one is a File object with a few properties related to the file. It includes files in subfolders:
const list = await bucket.list();
const list = await bucket.list("profile/"); // With a filter
console.log(list);
// [
// {
// name: 'kwergvckwsdb.png',
// type: 'image/png',
// size: 11554,
// url: 'https://fNNN.backblazeb2.com/file/BUCKET/kwergvckwsdb.png'
// timestamp: new Date(...)
// },
// ...
// ]
You can pass an optional prefix filter, and only those files starting with it will be returned. Use abc/ to return only the files in the folder abc.
You might want to read only the filenames, so you can .map() the list with plain JavaScript:
const list = await bucket.list();
console.log(list.map(file => file.name));
// ['avatar.png', 'kwergvckwsdb.png', ...]
Display the number of items inside a bucket, including sub-folder files:
await bucket.count();
// 27
Upload a local file to the bucket:
bucket.upload(localFilePath, [remoteFileName]) => File
The arguments are:
localFilePath (required): the path to the file to be uploaded. It will be relative to the working directory as specified on Node.js' fs. TODO: accept a byte sequence.
remoteFileName (optional): the name of the file in the bucket. Leave it empty to autogenerate the name. We are purposefully avoiding reusing the localFilePath name to avoid collisions and other issues.
It returns a File object with the properties name, type, size, url and timestamp as usual:
// Just upload a file and get the path in the response:
const file = await bucket.upload("./avatar.png");
console.log(file);
// {
// name: 'kwergvckwsdb.png',
// type: 'image/png',
// size: 11554,
// url: 'https://fNNN.backblazeb2.com/file/BUCKET/kwergvckwsdb.png'
// timestamp: new Date(...)
// }
// Upload a file inside a folder and specify the remote name:
await bucket.upload("./public/favicon.png", "favicon.png");
// Upload a file to a folder in the bucket:
await bucket.upload("./avatar.png", "public/favicon.png");
// Absolute paths:
await bucket.upload(__dirname + "/avatar.png", "favicon.png");
If you are using a modern Node.js version that doesn't define __dirname, you can create __dirname like this:
import { dirname } from "path";
import { fileURLToPath } from "url";
const __dirname = dirname(fileURLToPath(import.meta.url));
Downloads a file from the bucket into the server:
bucket.download(remoteFileName, [localFilePath]) => localFilePath
The arguments are:
remoteFileName (required): the name of the file in the bucket. It can be inside a folder as well. You can pass either a plain string with the name, or a full File reference.
localFilePath (optional): the path where the file will be located. It will be relative to the working directory as specified on Node.js' fs. Leave it empty to use the current working directory and the remote file name.
// Download the file with the same name as locally:
const path = await bucket.download("avatar.png");
console.log(path); // /users/me/projects/backblaze/avatar.png
// Download a file from the bucket root into a local folder:
await bucket.download("favicon.png", "./public/favicon.png");
// Download a file from a folder in the bucket:
await bucket.download("public/favicon.png", "./avatar.png");
// Absolute paths:
await bucket.download("favicon.png", __dirname + "/avatar.png");
If you are using a modern Node.js version that doesn't define __dirname, you can create __dirname like this:
import { dirname } from "path";
import { fileURLToPath } from "url";
const __dirname = dirname(fileURLToPath(import.meta.url));
Get the content of the given file into a variable:
bucket.read(remoteFileName) => file data
The arguments are:
remoteFileName (required): the name of the file in the bucket. It can be inside a folder as well. You can pass either a plain string with the name, or a full File reference.
const raw = await bucket.read("mydata.json");
const data = JSON.parse(raw);
Check whether a file exists on the bucket:
bucket.exists(remoteFileName) => Boolean
It accepts either a string name or a full File reference:
if (await bucket.exists("avatar.png")) {
console.log("Avatar already exists");
}
// Check inside a subfolder
if (await bucket.exists("users/abc.png")) {
console.log("User already has a profile picture");
}
Delete a file from the bucket:
bucket.remove(remoteFileName) => File
It accepts either a string name or a full File reference:
const file = await bucket.remove("avatar.png");
console.log(file);
// {
// name: 'kwergvckwsdb.png',
// type: 'image/png',
// size: 11554,
// url: 'https://fNNN.backblazeb2.com/file/BUCKET/kwergvckwsdb.png'
// timestamp: new Date(...)
// }
// Remove from inside a subfolder
await bucket.remove("users/abc.png");
It returns the description of the file that was removed.
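As a small combined sketch (not a separate API), exists() and remove() can be chained to delete a file only when it is present; the file name here is just an example:
// Remove a profile picture only if it exists
if (await bucket.exists("users/abc.png")) {
  const removed = await bucket.remove("users/abc.png");
  console.log(removed.name);
}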