@alcalzone/jsonl-db
Simple JSONL-based key-value store that uses an append-only file to store its data, with support for database dumps and compressing the db file.
Load the module:
import { DB } from "@alcalzone/jsonl-db";
Open or create a database file and use it like a Map:
// Open
const db = new DB("/path/to/file");
await db.open();
// db.isOpen is now true
// and use it
db.set("key", value);
db.delete("key");
db.clear();
if (db.has("key")) {
  const result = db.get("key");
}
// ...forEach, keys(), entries(), values(), ...
If corrupt data is encountered while opening the DB, the call to open() will be rejected. If this is expected, use the options parameter on the constructor to turn on forgiving behavior:
const db = new DB("/path/to/file", { ignoreReadErrors: true });
await db.open();
Warning: This may result in inconsistent data since invalid lines are silently ignored.
You can optionally transform the parsed values by passing a reviver function. This allows storing non-primitive objects in the database if those can be transformed to JSON (e.g. by overriding the toJSON method).
function reviver(key: string, value: any) {
// MUST return a value. If you don't want to transform `value`, return it.
}
const db = new DB("/path/to/file", { reviver });
await db.open();
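As a concrete illustration, here is a sketch of a reviver that turns ISO 8601 date strings back into Date instances. The date pattern and the JSON.parse demonstration are assumptions for illustration; the reviver uses the same (key, value) signature as JSON.parse's reviver.

```typescript
// Sketch: revive ISO 8601 date strings (as produced by Date#toJSON)
// back into Date objects. Everything else passes through unchanged.
function reviver(key: string, value: any): any {
  if (
    typeof value === "string" &&
    /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}Z$/.test(value)
  ) {
    return new Date(value);
  }
  return value; // a reviver must always return a value
}

// The same function works with plain JSON.parse, which uses the same signature:
const obj = JSON.parse('{"created":"2024-01-02T03:04:05.000Z"}', reviver);
console.log(obj.created instanceof Date); // true
```

Pass this function as the `reviver` constructor option as shown above; it is then applied to every value parsed from the db file.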
Data written to the DB is persisted asynchronously. Be sure to call close()
when you no longer need the database in order to flush all pending writes and close all files:
await db.close();
Now, db.isOpen is false. While the db is not open, any calls that access the data will throw an error.
By default, the database immediately writes to the database file. You can throttle the write accesses using the throttleFS constructor option. Be aware that buffered data will be lost if the process crashes.
const db = new DB("/path/to/file", { throttleFS: { /* throttle options */ } });
The following options exist:
Option | Default | Description |
---|---|---|
intervalMs | 0 | Write to the database file no more than every intervalMs milliseconds. |
maxBufferedCommands | +Infinity | Force a write after maxBufferedCommands have been buffered. This reduces memory consumption and data loss in case of a crash. |
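For example, a throttling configuration could look like this (the values below are illustrative, not defaults): write to disk at most once per second, but force a write once 100 commands have accumulated.

```typescript
import { DB } from "@alcalzone/jsonl-db";

const db = new DB("/path/to/file", {
  throttleFS: {
    intervalMs: 1000,          // flush to the db file at most once per second
    maxBufferedCommands: 100,  // ...but never buffer more than 100 commands
  },
});
await db.open();
```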
To create a compressed copy of the database in /path/to/file.dump, use the dump() method. If any data is written to the db during the dump, it is appended to the dump, most likely in compressed form.
await db.dump();
After a while, the main db file may contain unnecessary entries. The raw number of entries can be read using the uncompressedSize property. To remove unnecessary entries, use the compress() method.
await db.compress();
Note: During this call, /path/to/file.dump is overwritten and then renamed, and /path/to/file.bak is overwritten and then deleted. Make sure you don't have any important data in these files.
The database can automatically compress the database file under some conditions. To do so, use the autoCompress parameter of the constructor options:
const db = new DB("/path/to/file", { autoCompress: { /* auto compress options */ }});
The following options exist (all optional) and can be combined:
Option | Default | Description |
---|---|---|
sizeFactor | +Infinity | Compress when uncompressedSize >= size * sizeFactor |
sizeFactorMinimumSize | 0 | Configure the minimum size necessary for auto-compression based on size |
intervalMs | +Infinity | Compress after a certain time has passed |
intervalMinChanges | 1 | Configure the minimum count of changes for auto-compression based on time |
onClose | false | Compress when closing the DB |
onOpen | false | Compress after opening the DB |
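As an example, a plausible auto-compression policy (the values below are illustrative, not defaults) combines a size-based trigger with compression on close:

```typescript
import { DB } from "@alcalzone/jsonl-db";

const db = new DB("/path/to/file", {
  autoCompress: {
    sizeFactor: 4,              // compress when uncompressedSize >= 4 * size
    sizeFactorMinimumSize: 100, // ...but only once the db holds at least 100 entries
    onClose: true,              // also compress when closing the DB
  },
});
await db.open();
```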
Importing JSON files can be done this way:
// pass a filename, the import will be asynchronous
await db.importJson(filename);
// pass the object directly, the import will be synchronous
db.importJson({key: "value"});
In both cases, existing entries in the DB are not deleted; entries whose keys appear in the imported JSON are overwritten.
Exporting JSON files is also possible:
await db.exportJson(filename[, options]);
The file will be overwritten if it exists. The second options argument can be used to control the file formatting. Since fs-extra's writeJson is used under the hood, see that method's documentation for details on the options object.
Changelog
- Leading directories are now created if they don't exist
- Added functionality to throttle write accesses
- Export JsonlDBOptions from the main entry point
- Added auto-compress functionality
- Fix: The main export no longer exports JsonlDB as DB
- Added an optional reviver function to transform non-primitive objects while loading the DB
- Renamed the DB class to JsonlDB
- open() now skips empty lines
- open() throws an error with the line number when it encounters an invalid line. These errors can be ignored using the new constructor options argument.
- Added importJson and exportJson methods
- Added isOpen property
- Files replaced by compress() are now persisted
- compress() no longer overwrites the main file while the DB is being closed
- First official release
FAQs
Simple JSONL-based key-value store
The npm package @alcalzone/jsonl-db receives a total of 8,075 weekly downloads. As such, @alcalzone/jsonl-db popularity was classified as popular.
We found that @alcalzone/jsonl-db demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 1 open source maintainer collaborating on the project.