LevelUP
Fast & simple storage - a Node.js-style LevelDB wrapper
LevelDB is a simple key/value data store built by Google, inspired by BigTable. It's used in Google Chrome and many other products. LevelDB supports arbitrary byte arrays as both keys and values, singular get, put and delete operations, batched put and delete, forward and reverse iteration and simple compression using the Snappy algorithm which is optimised for speed over compression.
LevelUP aims to expose the features of LevelDB in a Node.js-friendly way. Both keys and values are treated as Buffer objects and are automatically converted using a specified 'encoding'. LevelDB's iterators are exposed as a Node.js-style object ReadStream, and writing can be performed via an object WriteStream.
Basic usage
All operations are asynchronous, although they don't require a callback if you don't need to know when the operation completes.
var levelup = require('levelup')

var options = { createIfMissing: true, errorIfExists: false }

levelup('./mydb', options, function (err, db) {
  if (err) return console.log('Ooops!', err)
  db.put('name', 'LevelUP', function (err) {
    if (err) return console.log('Ooops!', err)
    db.get('name', function (err, value) {
      if (err) return console.log('Ooops!', err)
      console.log('name=' + value)
    })
  })
})
Options
levelup() takes an optional options object as its second argument; the following properties are accepted:
- createIfMissing (boolean): If true, an empty database will be initialised at the specified location if one doesn't already exist. If false and a database doesn't exist, you will receive an error in your open() callback and your database won't open. Defaults to false.
- errorIfExists (boolean): If true, you will receive an error in your open() callback if the database already exists at the specified location. Defaults to false.
- encoding (string): The encoding of the keys and values passed through Node.js' Buffer implementation (see Buffer#toString()). 'utf8' is the default encoding for both keys and values, so you can simply pass in strings and expect strings from your get() operations. You can also pass Buffer objects as keys and/or values, and conversion will be performed. Supported encodings are: hex, utf8, ascii, binary, base64, ucs2, utf16le. 'json' encoding is also supported; see below.
- keyEncoding and valueEncoding (string): use these instead of encoding to specify the encodings of the keys and the values separately for this database.
Additionally, each of the main interface methods accepts an optional options object that can be used to override encoding (or keyEncoding & valueEncoding).
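A minimal sketch of a per-call override, assuming the options object is passed between the key and the callback:

db.put('name', 'LevelUP', function (err) {
  if (err) return console.log('Ooops!', err)
  // override the database-wide encoding for this read only
  db.get('name', { encoding: 'binary' }, function (err, value) {
    if (err) return console.log('Ooops!', err)
    console.log(value) // a raw Buffer rather than a utf8 string
  })
})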
Batch operations
For faster write operations, the batch() method can be used to submit an array of operations to be executed sequentially. Each operation is contained in an object having the following properties: type, key and value, where type is either 'put' or 'del'. In the case of 'del' the 'value' property is ignored.
var ops = [
    { type: 'del', key: 'father' }
  , { type: 'put', key: 'name', value: 'Yuri Irsenovich Kim' }
  , { type: 'put', key: 'dob', value: '16 February 1941' }
  , { type: 'put', key: 'spouse', value: 'Kim Young-sook' }
  , { type: 'put', key: 'occupation', value: 'Clown' }
]

db.batch(ops, function (err) {
  if (err) return console.log('Ooops!', err)
  console.log('Great success dear leader!')
})
Streams
ReadStream
You can obtain a ReadStream of the full database by calling the readStream() method. The resulting stream is a complete Node.js-style Readable Stream where 'data' events emit objects with 'key' and 'value' pairs.
db.readStream()
  .on('data', function (data) {
    console.log(data.key, '=', data.value)
  })
  .on('error', function (err) {
    console.log('Oh my!', err)
  })
  .on('close', function () {
    console.log('Stream closed')
  })
  .on('end', function () {
    console.log('Stream ended')
  })
The standard pause(), resume() and destroy() methods are implemented on the ReadStream, as is pipe() (see below). 'data', 'error', 'end' and 'close' events are emitted.
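As a sketch of how these methods might be used, a read can be paused while each entry is handled asynchronously and resumed afterwards (processEntry() here is a hypothetical function standing in for your own work):

var stream = db.readStream()

stream.on('data', function (data) {
  stream.pause() // stop further 'data' events while we work
  processEntry(data.key, data.value, function () { // hypothetical async handler
    stream.resume() // carry on once this entry has been dealt with
  })
})

stream.on('end', function () {
  console.log('All entries read')
})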
Additionally, you can supply an options object as the first parameter to readStream() with the following options (a short example follows the list):
- 'start': the key you wish to start the read at. By default it will start at the beginning of the store. Note that the start doesn't have to be an actual key that exists; LevelDB will simply find the next key greater than the key you provide.
- 'end': the key you wish to end the read on. By default it will continue until the end of the store. Again, the end doesn't have to be an actual key, as an (inclusive) <=-type operation is performed to detect the end. You can also use the destroy() method instead of supplying an 'end' parameter to achieve the same effect.
- 'reverse': a boolean, set to true if you want the stream to go in reverse order. Beware that due to the way LevelDB works, a reverse seek will be slower than a forward seek.
- 'keys': a boolean (defaults to true) to indicate whether the 'data' event should contain keys. If set to true and 'values' is set to false then 'data' events will simply be keys, rather than objects with a 'key' property. Used internally by the keyStream() method.
- 'values': a boolean (defaults to true) to indicate whether the 'data' event should contain values. If set to true and 'keys' is set to false then 'data' events will simply be values, rather than objects with a 'value' property. Used internally by the valueStream() method.
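A sketch of how the range options might be combined, reading only the entries with keys between 'dob' and 'spouse' (inclusive):

db.readStream({ start: 'dob', end: 'spouse' })
  .on('data', function (data) {
    console.log(data.key, '=', data.value)
  })
  .on('end', function () {
    console.log('Range read complete')
  })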
KeyStream
A KeyStream is a ReadStream where the 'data' events are simply the keys from the database, so it can be used like a traditional stream rather than an object stream.
You can obtain a KeyStream either by calling the keyStream() method on a LevelUP object or by passing an options object to readStream() with keys set to true and values set to false.
db.keyStream()
  .on('data', function (data) {
    console.log('key=', data)
  })

db.readStream({ keys: true, values: false })
  .on('data', function (data) {
    console.log('key=', data)
  })
ValueStream
A ValueStream is a ReadStream where the 'data' events are simply the values from the database, so it can be used like a traditional stream rather than an object stream.
You can obtain a ValueStream either by calling the valueStream() method on a LevelUP object or by passing an options object to readStream() with values set to true and keys set to false.
db.valueStream()
  .on('data', function (data) {
    console.log('value=', data)
  })

db.readStream({ keys: false, values: true })
  .on('data', function (data) {
    console.log('value=', data)
  })
WriteStream
A WriteStream can be obtained by calling the writeStream() method. The resulting stream is a complete Node.js-style Writable Stream which accepts objects with 'key' and 'value' pairs on its write() method. The WriteStream will buffer writes and submit them as a batch() operation where the writes occur on the same event loop tick; otherwise they are treated as simple put() operations.
db.writeStream()
  .on('error', function (err) {
    console.log('Oh my!', err)
  })
  .on('close', function () {
    console.log('Stream closed')
  })
  .write({ key: 'name', value: 'Yuri Irsenovich Kim' })
  .write({ key: 'dob', value: '16 February 1941' })
  .write({ key: 'spouse', value: 'Kim Young-sook' })
  .write({ key: 'occupation', value: 'Clown' })
  .end()
The standard write(), end(), destroy() and destroySoon() methods are implemented on the WriteStream. 'drain', 'error', 'close' and 'pipe' events are emitted.
Pipes and Node Stream compatibility
A ReadStream can be piped directly to a WriteStream, allowing for easy copying of an entire database. A simple copy() operation is included in LevelUP that performs exactly this on two open databases:
function copy (srcdb, dstdb, callback) {
  srcdb.readStream().pipe(dstdb.writeStream().on('close', callback))
}
The ReadStream is also fstream-compatible which means you should be able to pipe to and from fstreams. So you can serialize and deserialize an entire database to a directory where keys are filenames and values are their contents, or even into a tar file using node-tar. See the fstream functional test for an example. (Note: I'm not really sure there's a great use-case for this but it's a fun example and it helps to harden the stream implementations.)
KeyStreams and ValueStreams can be treated like standard streams of raw data. If 'encoding' is set to 'binary', the 'data' events will simply be standard Node Buffer objects straight out of the data store.
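For instance, a database opened with 'binary' encoding could have its values piped straight into a file as raw bytes. This is a sketch only; the output filename is arbitrary:

var fs = require('fs')
var levelup = require('levelup')

levelup('./mydb', { encoding: 'binary' }, function (err, db) {
  if (err) return console.log('Ooops!', err)
  // each 'data' event is a raw Buffer, so the ValueStream behaves like any byte stream
  db.valueStream().pipe(fs.createWriteStream('./values.dump'))
})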
JSON
If you specify 'json' encoding for keys and/or values, you can then supply JavaScript objects to LevelUP and receive them back from all fetch operations, including ReadStreams. LevelUP will automatically stringify your objects and store them as utf8, then parse the strings back into objects before passing them to you.
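A minimal sketch of round-tripping an object, assuming valueEncoding is set to 'json' when the database is opened:

var levelup = require('levelup')

levelup('./mydb', { createIfMissing: true, valueEncoding: 'json' }, function (err, db) {
  if (err) return console.log('Ooops!', err)
  db.put('dear-leader', { name: 'Yuri Irsenovich Kim', occupation: 'Clown' }, function (err) {
    if (err) return console.log('Ooops!', err)
    db.get('dear-leader', function (err, value) {
      if (err) return console.log('Ooops!', err)
      console.log(value.occupation) // the stored JSON is parsed back into an object for you
    })
  })
})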
Important considerations
- LevelDB is thread-safe but is not suitable for access from multiple processes. You should only ever have a LevelDB database open from a single Node.js process.
TODO
- Windows support (see issue #5 if you would like to help)
- Benchmarks
Licence & copyright
LevelUP is Copyright (c) 2012 Rod Vagg @rvagg and licenced under the MIT licence. All rights not explicitly granted in the MIT license are reserved. See the included LICENSE file for more details.
LevelUP builds on the excellent work of the LevelDB and Snappy teams from Google and additional contributors. LevelDB and Snappy are both issued under the New BSD Licence.