💾 Simple, lightning fast, object pseudo-database for S3-compatible storages, strongly inspired by [lowdb](https://github.com/typicode/lowdb).
Since version 2.0.0, lowstorage has undergone significant changes: it is now built on the ultralight-s3 package. If you are migrating from version 1.x.x, please review the new constructor parameters and usage examples below.
Cloudflare R2 implements the S3 API so that users and their applications can migrate with ease. Compared to AWS S3, some API operations are missing or behave differently, and feature implementation is still in progress. The API is available via the https://<ACCOUNT_ID>.r2.cloudflarestorage.com endpoint; find your account ID in the Cloudflare dashboard.
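The R2 endpoint is derived directly from your account ID. A minimal sketch (the account ID here is a placeholder):

```javascript
// Build the R2 endpoint URL from a Cloudflare account ID (placeholder value).
function r2Endpoint(accountId) {
	return `https://${accountId}.r2.cloudflarestorage.com`;
}

const endpoint = r2Endpoint('abc123'); // → 'https://abc123.r2.cloudflarestorage.com'
```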
import { lowstorage, lowstorage_ERROR_CODES } from 'lowstorage';
// Initialize object and get users collection
const storage = new lowstorage({
accessKeyId: 'YOUR_ACCESS_KEY',
secretAccessKey: 'YOUR_SECRET_KEY',
endpoint: 'YOUR_ENDPOINT',
bucketName: 'YOUR_BUCKET_NAME',
region: 'YOUR_REGION', // fallback to auto
// optional params from here
logger: console, // logger object for your tough times
dirPrefix: 'lowstorage', // folder name prefix for collections
maxRequestSizeInBytes: 50 * 1024 * 1024, // request size in bytes for S3 operations (default: 5MB)
});
// example user schema
const userAvroSchema = {
type: 'record',
name: 'User',
fields: [
{ name: '_id', type: 'string', size: 16, logicalType: 'UUID' },
{ name: 'name', type: 'string' },
{ name: 'age', type: 'int' },
{ name: 'gender', type: 'string' },
{ name: 'posts', type: { type: 'array', items: 'string' } },
],
};
// Create a collection
const userCol = await storage.createCollection('users');
// or
const userCol = await storage.collection('users'); // get collection
// or
const userCol = await storage.collection('users', userAvroSchema); // get collection with specific schema
// Add new user - Avro schema is inferred from the data (_id is optional and will be autogenerated)
const newUser = await userCol.insert({
name: 'Kevin',
age: 32,
gender: 'whatever',
posts: [],
});
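As the comment above notes, when no schema is supplied the Avro schema is inferred from the inserted data. A rough sketch of how such inference could map JavaScript values to Avro types — this helper is illustrative only, not the library's actual implementation:

```javascript
// Illustrative only: derive an Avro-style record schema from a sample object.
function inferAvroSchema(name, sample) {
	const avroType = (v) => {
		if (typeof v === 'string') return 'string';
		if (typeof v === 'number') return Number.isInteger(v) ? 'int' : 'double';
		if (typeof v === 'boolean') return 'boolean';
		if (Array.isArray(v)) {
			// Fall back to string items for empty arrays in this sketch.
			return { type: 'array', items: v.length ? avroType(v[0]) : 'string' };
		}
		return 'string'; // simplistic fallback
	};
	return {
		type: 'record',
		name,
		fields: Object.entries(sample).map(([k, v]) => ({ name: k, type: avroType(v) })),
	};
}

const inferred = inferAvroSchema('User', { name: 'Kevin', age: 32, gender: 'whatever', posts: [] });
```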
// Show all users
const allUsers = await userCol.find({});
// Find users with pagination (e.g., page 2, 10 users per page)
const secondPageUsers = await userCol.find({}, { skip: 10, limit: 10 });
// Find user by ID and update name
const kevin = await userCol.findOne({ name: 'Kevin' });
await userCol.update({ _id: kevin._id }, { name: 'Carlos' });
// Delete user
await userCol.delete({ name: 'Carlos' });
// Delete all users
await userCol.deleteAll();
// Count users
const count = await userCol.count();
// Rename collection
await userCol.renameCollection('usersOld');
// Remove collection
await userCol.removeCollection();
// List all collections
const listCollections = await storage.listCollections();
// Get S3 instance and perform S3 operations (get, put, delete, etc.) Read more about ultralight-s3 here: https://github.com/sentienhq/ultralight-s3
const s3ops = await storage.s3();
// check the API section for more details or /examples and /dev folder for more samples
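The query objects above follow a MongoDB-like shape: a document matches when every key in the query equals the corresponding field, and `skip`/`limit` paginate the result. A minimal sketch of those semantics over a plain array (illustrative, not the library's code):

```javascript
// Illustrative matcher: shallow equality on every query key, then skip/limit.
function findDocs(docs, query = {}, { skip = 0, limit = Infinity } = {}) {
	return docs
		.filter((doc) => Object.entries(query).every(([k, v]) => doc[k] === v))
		.slice(skip, skip + limit);
}

const users = [
	{ name: 'Kevin', age: 32 },
	{ name: 'Carlos', age: 41 },
	{ name: 'Kevin', age: 27 },
];
const kevins = findDocs(users, { name: 'Kevin' }); // two matches
const page = findDocs(users, {}, { skip: 1, limit: 1 }); // second document only
```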
npm install lowstorage
yarn add lowstorage
pnpm add lowstorage
To set up and bind your storage, configure your storage client with the appropriate credentials and bucket information. Here is an example setup for AWS S3:
import { lowstorage, lowstorage_ERROR_CODES } from 'lowstorage';
const storage = new lowstorage({
endpoint: 's3.amazonaws.com',
region: 'YOUR-REGION',
accessKeyId: 'YOUR-ACCESSKEYID',
secretAccessKey: 'YOUR-SECRETACCESSKEY',
bucketName: 'your-bucket-name',
});
For Cloudflare R2, follow similar steps with your R2-specific endpoint and credentials.
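For example, an R2 configuration might look like this — account ID, keys, and bucket name are placeholders, and `auto` is used because R2 has no meaningful region:

```javascript
// Hypothetical Cloudflare R2 setup -- all values are placeholders.
const ACCOUNT_ID = 'YOUR_ACCOUNT_ID';
const r2Config = {
	endpoint: `https://${ACCOUNT_ID}.r2.cloudflarestorage.com`,
	region: 'auto', // R2 ignores regions; lowstorage falls back to 'auto'
	accessKeyId: 'YOUR_R2_ACCESS_KEY_ID',
	secretAccessKey: 'YOUR_R2_SECRET_ACCESS_KEY',
	bucketName: 'your-bucket-name',
};
// const storage = new lowstorage(r2Config);
```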
Behavior: Creates a new lowstorage instance.
Input: An object containing the following properties:
- `accessKeyId`: The access key ID for your S3 account.
- `secretAccessKey`: The secret access key for your S3 account.
- `endpoint`: The endpoint URL for your S3 account.
- `bucketName`: The name of the bucket to use.
- `region?`: The region for your S3 bucket. Default is `auto`.
- `logger?`: An optional logger object for your tough times.
- `dirPrefix?`: An optional directory prefix for your collections. Default is `lowstorage`.
- `maxRequestSizeInBytes?`: An optional maximum request size in bytes for S3 operations. Default is 5MB.

Returns: A new lowstorage instance.
Throws: A lowstorageError if there's an error.
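Thrown errors carry a code that can be matched against the exported `lowstorage_ERROR_CODES`. A sketch of that pattern with a simulated error — in real usage the error would come from a lowstorage call, and the codes object here only mirrors a subset of the real export:

```javascript
// Sketch: branch on error codes the way you would with lowstorage_ERROR_CODES.
const CODES = { COLLECTION_NOT_FOUND: 'COLLECTION_NOT_FOUND', UNKNOWN_ERROR: 'UNKNOWN_ERROR' };

function describeError(err) {
	switch (err.code) {
		case CODES.COLLECTION_NOT_FOUND:
			return 'collection does not exist';
		default:
			return 'unexpected error';
	}
}

// Simulated lowstorage-style error carrying a code property.
const simulated = Object.assign(new Error('users not found'), { code: 'COLLECTION_NOT_FOUND' });
```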
- `colName`: The name of the collection.
- `schema?`: An optional schema object for the collection.
- `autoCreate?`: An optional boolean indicating whether to automatically create the collection if it doesn't exist. Default is `true`.

- `colName`: The name of the collection.
- `schema?`: An optional schema object for the collection.
- `data?`: An optional array of data to initialize the collection with. If not provided, an empty array is used and an empty file is created.

- `colName`: The name of the collection.

- `colName`: The name of the collection.

- `colName`: The name of the collection.
- `schema`: The Avro schema for the collection.
- `s3`: The S3 instance.
- `dirPrefix?`: An optional directory prefix for the collection. Default is `lowstorage`.
- `safeWrite?`: An optional boolean indicating whether to perform safe writes. Default is `false`. Safe writes double-check the ETag of the object before writing: `false` overwrites the object, `true` only writes if the object has not been modified (one extra request per safe write, so it is slower).
- `chunkSize?`: An optional chunk size for reading and writing data. Default is 5MB.

- `colName`: The name of the collection.
- `s3`: The S3 instance.
- `schema`: The Avro schema for the collection.
- `avroParse`: The Avro parse instance.
- `avroType`: The Avro type instance.
- `dirPrefix`: The directory prefix for the collection.
- `safeWrite`: A boolean indicating whether to perform safe writes.
- `chunkSize`: The chunk size for reading and writing data.

- `props`: An object containing the following properties:
  - `colName`: The name of the collection.
  - `s3`: The S3 instance.
  - `schema`: The Avro schema for the collection.
  - `avroParse`: The Avro parse instance.
  - `avroType`: The Avro type instance.
  - `dirPrefix`: The directory prefix for the collection.
  - `safeWrite`: A boolean indicating whether to perform safe writes.
  - `chunkSize`: The chunk size for reading and writing data.

- `safeWrite`: A boolean indicating whether to perform safe writes.

- `schema`: An object representing the Avro schema.

- `data`: An object or an array of objects representing the data to infer the schema from.

- `doc`: An object or an array of objects to insert into the collection.
- `schema?`: An optional schema object for the collection.

- `query`: An object representing the query to filter documents.
- `options`: An object representing the options for pagination.

- `query`: An object representing the query to filter documents.

- `query`: An object representing the query to filter documents.
- `update`: An object representing the update operations.
- `options`: An object representing the options for pagination.

- `query`: An object representing the query to filter the document to update.
- `update`: An object representing the update operations.
- `options`: An object representing the options for pagination.

- `query`: An object representing the query to filter documents.

- `query`: An object representing the query to filter documents.

- `newColName`: The new name of the collection.
- `newSchema?`: An optional new schema object for the collection.

- `message?`: An optional string representing the error message.
- `code?`: An optional error code.

- `message?`: An optional string representing the error message.
- `code?`: An optional error code.

- `message?`: An optional string representing the error message.
- `code?`: An optional error code.

- `message?`: An optional string representing the error message.
- `code?`: An optional error code.

- `message?`: An optional string representing the error message.
- `code?`: An optional error code.

See `src/errors.ts` for more details.

- `MISSING_ARGUMENT`: A string representing the missing argument error code.
- `COLLECTION_EXISTS`: A string representing the collection exists error code.
- `CREATE_COLLECTION_ERROR`: A string representing the create collection error code.
- `RENAME_COLLECTION_ERROR`: A string representing the rename collection error code.
- `REMOVE_COLLECTION_ERROR`: A string representing the remove collection error code.
- `UPDATE_COLLECTION_SCHEMA_ERROR`: A string representing the update collection schema error code.
- `COLLECTION_NOT_FOUND`: A string representing the collection not found error code.
- `SCHEMA_VALIDATION_ERROR`: A string representing the schema validation error code.
- `DOCUMENT_VALIDATION_ERROR`: A string representing the document validation error code.
- `S3_OPERATION_ERROR`: A string representing the S3 operation error code.
- `FIND_ERROR`: A string representing the find error code.
- `FIND_ONE_ERROR`: A string representing the find one error code.
- `SAVE_DATA_ERROR`: A string representing the save data error code.
- `INSERT_ERROR`: A string representing the insert error code.
- `UPDATE_ERROR`: A string representing the update error code.
- `UPDATE_ONE_ERROR`: A string representing the update one error code.
- `DELETE_ERROR`: A string representing the delete error code.
- `COUNT_ERROR`: A string representing the count error code.
- `UNKNOWN_ERROR`: A string representing the unknown error code.

lowstorage is primarily designed for small, hobby, or personal projects. We advise extreme caution when using lowstorage for critical applications or production environments, as it may not offer the robustness or features required for such use cases.
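The `safeWrite` option described above amounts to a conditional write: compare the object's ETag before overwriting. A mock sketch of that check — this is not the library's actual S3 calls, just the guarding logic:

```javascript
// Mock store illustrating ETag-guarded ("safe") writes.
function makeStore() {
	let object = { body: '[]', etag: '1' };
	return {
		read: () => ({ ...object }),
		// Write succeeds only if the caller's ETag still matches (safeWrite = true),
		// or unconditionally when no ETag is given (safeWrite = false).
		write: (body, expectedEtag) => {
			if (expectedEtag !== undefined && expectedEtag !== object.etag) {
				return false; // someone else modified the object in between
			}
			object = { body, etag: String(Number(object.etag) + 1) };
			return true;
		},
	};
}

const store = makeStore();
const { etag } = store.read();
const first = store.write('[{"name":"Kevin"}]', etag); // ok: ETag still matches
const stale = store.write('[]', etag); // fails: ETag has moved on
```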
Feel free to dive in! Open an issue or submit PRs.
This project follows the Contributor Covenant Code of Conduct.