# DBFFile

## Summary
Read and write .dbf (dBase III and Visual FoxPro) files in Node.js:

- Supported field types:
  - `C` (string)
  - `N` (numeric)
  - `F` (float)
  - `I` (integer)
  - `L` (logical)
  - `D` (date)
  - `T` (datetime)
  - `B` (double)
  - `M` (memo) Note: memo support is experimental/partial, with the following limitations:
    - read-only (can't create/write DBF files with memo fields)
    - dBase III (version 0x83) and dBase IV (version 0x8b) `.dbt` memo files only
- 'Loose' read mode - tries to read any kind of .dbf file without complaining. Unsupported field types are simply skipped.
- Can open an existing .dbf file
  - Can access all field descriptors
  - Can access total record count
  - Can access date of last update
  - Can read records in arbitrary-sized batches
  - Can include deleted records in results (see the sketch after this list)
  - Supports very large files
- Can create a new .dbf file
  - Can use field descriptors from a user-specified object or from another instance
- Can append records to an existing .dbf file
  - Supports very large files
- Can specify character encodings either per-file or per-field
  - The default encoding is `'ISO-8859-1'` (also known as latin 1)
  - Example per-file encoding: `DBFFile.open(<path>, {encoding: 'EUC-JP'})`
  - Example per-field encoding: `DBFFile.open(<path>, {encoding: {default: 'latin1', FIELD_XYZ: 'EUC-JP'}})`
  - Supported encodings are listed here.
- All operations are asynchronous and return a promise
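
For example, the deleted-records and per-field encoding options above can be combined in a single call to `DBFFile.open`. The following is a minimal sketch; the file path and the `NOTES` field name are placeholders, and the encodings are just examples.

```typescript
import {DBFFile} from 'dbffile';

async function readWithOptions() {
    let dbf = await DBFFile.open('<full path to .dbf file>', {
        encoding: {default: 'latin1', NOTES: 'EUC-JP'}, // per-field encoding (NOTES is a placeholder)
        includeDeletedRecords: true,                    // deleted records are returned too
    });
    let records = await dbf.readRecords(50);
    console.log(`Read ${records.length} records (including deleted ones).`);
}
```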
## Installation

`npm install dbffile` or `yarn add dbffile`
## Example: reading a .dbf file

```typescript
import {DBFFile} from 'dbffile';

async function testRead() {
    let dbf = await DBFFile.open('<full path to .dbf file>');
    console.log(`DBF file contains ${dbf.recordCount} records.`);
    console.log(`Field names: ${dbf.fields.map(f => f.name).join(', ')}`);
    let records = await dbf.readRecords(100);
    for (let record of records) console.log(record);
}
```
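
The call above reads a single batch of up to 100 records. For very large files, the batched-read support mentioned in the summary suggests a loop like the sketch below. It assumes that each `readRecords(...)` call continues from where the previous one left off, and it stops once every record reported by `recordCount` has been seen (or when an empty batch comes back).

```typescript
import {DBFFile} from 'dbffile';

async function readInBatches() {
    let dbf = await DBFFile.open('<full path to .dbf file>');
    let totalRead = 0;
    while (totalRead < dbf.recordCount) {
        // Read the next batch of up to 1000 records.
        let batch = await dbf.readRecords(1000);
        if (batch.length === 0) break; // defensive stop if nothing more is returned
        totalRead += batch.length;
        // ...process the batch here...
    }
    console.log(`Processed ${totalRead} of ${dbf.recordCount} records.`);
}
```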
## Example: writing a .dbf file

```typescript
import {DBFFile} from 'dbffile';

async function testWrite() {
    let fieldDescriptors = [
        { name: 'fname', type: 'C', size: 255 },
        { name: 'lname', type: 'C', size: 255 }
    ];

    let records = [
        { fname: 'Joe', lname: 'Bloggs' },
        { fname: 'Mary', lname: 'Smith' }
    ];

    let dbf = await DBFFile.create('<full path to .dbf file>', fieldDescriptors);
    console.log('DBF file created.');
    await dbf.appendRecords(records);
    console.log(`${records.length} records added.`);
}
```
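
The summary also mentions that field descriptors can be taken from another `DBFFile` instance. A sketch of copying a file's structure (and a first batch of records) might look like this, assuming the source file uses only field types that can be written (i.e. no memo fields); both paths are placeholders.

```typescript
import {DBFFile} from 'dbffile';

async function copyStructure() {
    // Reuse the source file's field descriptors when creating the new file.
    let source = await DBFFile.open('<full path to source .dbf file>');
    let copy = await DBFFile.create('<full path to new .dbf file>', source.fields);

    // Copy the first batch of records across.
    let records = await source.readRecords(500);
    await copy.appendRecords(records);
    console.log(`Copied ${records.length} records.`);
}
```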
## Loose Read Mode

Not all versions and variants of .dbf file are supported by this library. Normally, when an unsupported file version or
field type is encountered, an error is reported and reading halts immediately. This has been a problem for users who
just want to recover data from old .dbf files, and would rather not write a PR or wait for one that adds the missing
file/field support.

A more forgiving approach to reading .dbf files is now provided by passing the option `{readMode: 'loose'}` to the
`DBFFile.open(...)` function. In this mode, unrecognised file versions, unsupported field types, and missing memo files
are all tolerated. Unsupported/missing field types still appear in the `fields` field descriptors, but are omitted from
the record data returned by the `readRecords(...)` method.
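
For example, recovering whatever can be read from an old or partially supported file might look like the following sketch (the path is a placeholder):

```typescript
import {DBFFile} from 'dbffile';

async function recoverData() {
    // 'loose' mode tolerates unrecognised file versions, unsupported field
    // types and missing memo files instead of throwing.
    let dbf = await DBFFile.open('<full path to .dbf file>', {readMode: 'loose'});

    // Unsupported fields still appear in the descriptors...
    console.log(`Fields: ${dbf.fields.map(f => `${f.name} (${f.type})`).join(', ')}`);

    // ...but their values are omitted from the records returned here.
    let records = await dbf.readRecords(100);
    console.log(`Recovered ${records.length} records.`);
}
```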
## API

The module exports the `DBFFile` class, which has the following shape:

```typescript
class DBFFile {
    static open(path: string, options?: OpenOptions): Promise<DBFFile>;
    static create(path: string, fields: FieldDescriptor[], options?: CreateOptions): Promise<DBFFile>;
    path: string;
    recordCount: number;
    dateOfLastUpdate: Date;
    fields: FieldDescriptor[];
    readRecords(maxCount?: number): Promise<object[]>;
    appendRecords(records: object[]): Promise<DBFFile>;
}

interface FieldDescriptor {
    name: string;
    type: 'C' | 'N' | 'F' | 'L' | 'D' | 'I' | 'M' | 'T' | 'B';
    size: number;
    decimalPlaces?: number;
}

interface OpenOptions {
    readMode?: 'strict' | 'loose';
    encoding?: Encoding;
    includeDeletedRecords?: boolean;
}

interface CreateOptions {
    fileVersion?: FileVersion;
    encoding?: Encoding;
}

type Encoding = string | {default: string, [fieldName: string]: string};
```
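
As an illustration of the shapes above, the sketch below creates a file with numeric, date, and logical fields, using `decimalPlaces` for the numeric field. The field names and sizes are made up for the example, and it assumes date values are supplied as JavaScript `Date` objects and logicals as booleans.

```typescript
import {DBFFile} from 'dbffile';

async function createInventory() {
    // Field descriptors passed inline so they are checked against FieldDescriptor.
    let dbf = await DBFFile.create('<full path to new .dbf file>', [
        {name: 'price', type: 'N', size: 10, decimalPlaces: 2}, // numeric, 2 decimal places
        {name: 'sold_on', type: 'D', size: 8},                  // date (stored as YYYYMMDD)
        {name: 'in_stock', type: 'L', size: 1},                 // logical
    ]);
    await dbf.appendRecords([
        {price: 19.95, sold_on: new Date(), in_stock: true},
    ]);
}
```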