lowdb
Tiny local JSON database for small projects 🦉
// Edit db.json content using plain JS
db.data.posts.push({ id: 1, title: 'lowdb is awesome' })

// Save to file
await db.write()

// db.json
{
  "posts": [
    { "id": 1, "title": "lowdb is awesome" }
  ]
}
Free for Open Source
To help with OSS funding, lowdb v2 is released under Parity license for a limited time. It'll be released under MIT license once the goal of 100 sponsors is reached (currently at 55) or in five months.
Meanwhile, lowdb v2 can be freely used in Open Source projects. Sponsors can use it in any type of project.
If you've installed this new version without knowing about the license change, you're covered by a 30-day grace period. There's also a 30-day trial. See the license files for more details.
Thank you for your support!
Note: if you're already sponsoring husky, you can use lowdb v2 today :)
Companies
Become a sponsor and have your company logo here.
Features
- Lightweight
- Minimalist and easy to learn API
- Query and modify data using plain JS
- Improved TypeScript support
- Atomic write
- Hackable:
- Change storage, file format (JSON, YAML, ...) or add encryption via adapters
- Add lodash, ramda, ... for super powers!
Install
npm install lowdb
Usage
import { join } from 'path'
import { Low, JSONFile } from 'lowdb'
const file = join(__dirname, 'db.json')
const adapter = new JSONFile(file)
const db = new Low(adapter)
// Read data from JSON file, this will set db.data content
await db.read()

// If the file doesn't exist, db.data will be null, so set defaults
db.data ||= { posts: [] }

// Create and query items using plain JS
db.data.posts.push('hello world')
db.data.posts[0] // 'hello world'

// You can also use destructuring if you prefer
const { posts } = db.data
posts.push('hello world')

// Finally write db.data content to file
await db.write()
// db.json
{
  "posts": [ "hello world" ]
}
TypeScript
Lowdb now comes with TypeScript support. You can even type db.data content.
type Data = {
posts: string[]
}
const db = new Low<Data>(adapter)
db.data
  .posts
  .push(1) // Type error: 'number' is not assignable to 'string'
Lodash
You can easily add lodash or other utility libraries to improve lowdb.
import lodash from 'lodash'

// Add a chain property to access lodash's chainable API on db.data
db.chain = lodash.chain(db.data)

const post = db.chain
  .get('posts')
  .find({ id: 1 })
  .value()
More examples
For CLI, server and browser usage, see the examples/ directory.
API
Classes
Lowdb has two classes (for asynchronous and synchronous adapters).
new Low(adapter)
import { Low, JSONFile } from 'lowdb'
const db = new Low(new JSONFile('file.json'))
await db.read()
await db.write()
new LowSync(adapterSync)
import { LowSync, JSONFileSync } from 'lowdb'
const db = new LowSync(new JSONFileSync('file.json'))
db.read()
db.write()
Methods
db.read()
Calls adapter.read() and sets db.data.

Note: JSONFile and JSONFileSync adapters will set db.data to null if the file doesn't exist.

db.data // null before the first read
db.read()
db.data // holds the file content, or null if the file didn't exist
db.write()
Calls adapter.write(db.data).

db.data = { posts: [] }
db.write() // file.json will be { posts: [] }

db.data = {}
db.write() // file.json will be {}
Properties
db.data
Holds your db content. If you're using the adapters coming with lowdb, it can be any type supported by JSON.stringify.
For example:
db.data = 'string'
db.data = [1, 2, 3]
db.data = { key: 'value' }
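As a consequence, values that JSON.stringify can't represent are silently converted or dropped on write. A quick stdlib-only illustration (independent of lowdb) of a JSON round trip:

```javascript
// JSON round trip, as performed by the bundled adapters on every
// write/read cycle: Dates become ISO strings, functions are dropped.
const data = {
  title: 'post',          // strings survive unchanged
  views: 42,              // numbers survive unchanged
  createdAt: new Date(0), // serialized to '1970-01-01T00:00:00.000Z'
  render: () => 'html',   // dropped: functions are not valid JSON
}

const roundTripped = JSON.parse(JSON.stringify(data))
console.log(roundTripped)
```

Keep db.data to plain objects, arrays, strings, numbers, booleans and null, and convert anything else (Dates, Maps, class instances) yourself before writing.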
Adapters
Lowdb adapters
JSONFile
JSONFileSync
Adapters for reading and writing JSON files.
new Low(new JSONFile(filename))
new LowSync(new JSONFileSync(filename))
Memory
MemorySync
In-memory adapters. Useful for speeding up unit tests.
new Low(new Memory())
new LowSync(new MemorySync())
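The speed-up comes from read and write never touching the disk. A minimal stdlib-only sketch of the idea behind a synchronous in-memory adapter (an illustration, not lowdb's actual Memory implementation):

```javascript
// A synchronous in-memory adapter: data lives in a field, so read/write
// are plain property accesses with no file I/O.
class MiniMemorySync {
  constructor() {
    this.data = null // same convention as JSONFileSync: null means "no data yet"
  }

  read() {
    return this.data
  }

  write(data) {
    this.data = data
  }
}

const adapter = new MiniMemorySync()
adapter.write({ posts: ['hello world'] })
console.log(adapter.read().posts[0]) // 'hello world'
```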
LocalStorage
Synchronous adapter for window.localStorage.
new LowSync(new LocalStorage(name))
Third-party adapters
If you've published an adapter for lowdb, feel free to create a PR to add it here.
Writing your own adapter
You may want to create an adapter to write db.data to YAML, XML, ... or to encrypt data.
An adapter is a simple class that just needs to expose two methods:
class AsyncAdapter {
  read() { /* ... */ }      // should return Promise<data>
  write(data) { /* ... */ } // should return Promise<void>
}

class SyncAdapter {
  read() { /* ... */ }      // should return data
  write(data) { /* ... */ } // should return void
}
For example, let's say you have some async storage and want to create an adapter for it:
import { api } from './AsyncStorage'

class CustomAsyncAdapter {
  // Optional: accept arguments here to configure the adapter
  constructor(args) {
    // ...
  }

  async read() {
    const data = await api.read()
    return data
  }

  async write(data) {
    await api.write(data)
  }
}
const adapter = new CustomAsyncAdapter()
const db = new Low(adapter)
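A synchronous adapter follows the same contract with plain return values. As a sketch (SyncFileAdapter is a hypothetical name, not part of lowdb), here is a file-backed sync adapter using Node's fs module; swapping the JSON.parse/JSON.stringify calls for another serializer (e.g. js-yaml's load/dump) would give a YAML adapter:

```javascript
import { readFileSync, writeFileSync, existsSync } from 'fs'

// Hypothetical synchronous file adapter; replace the JSON calls with
// another serializer to change the on-disk format.
class SyncFileAdapter {
  constructor(filename) {
    this.filename = filename
  }

  read() {
    // Mirror JSONFileSync: return null when the file doesn't exist yet
    if (!existsSync(this.filename)) return null
    return JSON.parse(readFileSync(this.filename, 'utf-8'))
  }

  write(data) {
    writeFileSync(this.filename, JSON.stringify(data, null, 2))
  }
}
```

new LowSync(new SyncFileAdapter('db.json')) would then behave like the built-in JSONFileSync adapter, minus lowdb's atomic-write guarantee.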
See src/adapters/ for more examples.
Limits
Lowdb doesn't support Node's cluster module.
If you have large JavaScript objects (~10-100MB) you may hit some performance issues. This is because whenever you call db.write, the whole db.data object is serialized and written to storage.

Depending on your use case, this can be fine or not. It can be mitigated by doing batch operations and calling db.write only when you need it.
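To make the batching advice concrete, here is a stdlib-only sketch with a hypothetical stub adapter that counts write() calls; each call pays for serializing all of the data, so one write after many in-memory mutations is far cheaper than one write per mutation:

```javascript
// Stub adapter that counts writes, to make the cost model visible:
// each write() serializes the entire data object.
class CountingAdapter {
  constructor() {
    this.writes = 0
    this.data = null
  }

  async read() {
    return this.data
  }

  async write(data) {
    this.writes++
    this.data = JSON.parse(JSON.stringify(data)) // full serialization, every time
  }
}

const adapter = new CountingAdapter()
const data = { posts: [] }

// Batch: mutate in memory first...
for (let i = 0; i < 1000; i++) {
  data.posts.push({ id: i })
}

// ...then persist once, paying for a single serialization
await adapter.write(data)
console.log(adapter.writes) // 1
```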
If you plan to scale, it's highly recommended to use databases like PostgreSQL, MongoDB, ...
License
License Zero Parity 7.0.0 and MIT (contributions) with exception License Zero Patron 1.0.0.