zstd-napi - npm Package Compare versions

Comparing version 0.0.7 to 0.0.8

dist/simple.d.ts

binding.d.ts

@@ -0,6 +1,53 @@

/**
* This module (imported as `zstd-napi/binding`) exposes a thin (but safe!)
* wrapper around the native Zstandard API. If you aren't trying to do something
* weird, use the {@link "index" | high-level API} instead.
*
* Native API functions that are associated with a data structure are methods on
* the wrapper class corresponding to that data structure. For example, many
* compression functions take a `ZSTD_CCtx` and can therefore be found on the
* {@link CCtx} class.
*
* The upstream
* {@link https://facebook.github.io/zstd/zstd_manual.html | Zstandard manual}
* is the best source for understanding how to use this API. While this
* documentation tries to be helpful, it has only a small fraction of the
* information. Documentation for structures and functions within this module
* will mention the corresponding native names to enable cross-referencing.
* (Alas, it appears not to be possible to _link_ to functions on that page...)
*
* @module
*/
/**
* Magic number denoting the start of a Zstandard frame.
*
* Corresponds to `ZSTD_MAGICNUMBER`.
*/
export const MAGICNUMBER: number;
/**
* Magic number denoting the start of a Zstandard dictionary.
*
* Corresponds to `ZSTD_MAGIC_DICTIONARY`.
*/
export const MAGIC_DICTIONARY: number;
/**
* Corresponds to `ZSTD_MAGIC_SKIPPABLE_START`.
* @experimental
*/
export const MAGIC_SKIPPABLE_START: number;
/**
* Corresponds to `ZSTD_MAGIC_SKIPPABLE_MASK`.
* @experimental
*/
export const MAGIC_SKIPPABLE_MASK: number;
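These magic values mark frame boundaries in the byte stream. As an illustrative sketch (not part of this package), here is how a buffer can be checked for the frame magic, assuming the standard `ZSTD_MAGICNUMBER` value of `0xFD2FB528` stored little-endian:

```typescript
// Illustrative helper (not part of zstd-napi): checks whether a buffer starts
// with the Zstandard frame magic number. The constant is hard-coded here so
// the sketch is self-contained; real code should import MAGICNUMBER from
// 'zstd-napi/binding' instead.
const ZSTD_FRAME_MAGIC = 0xfd2fb528;

function looksLikeZstdFrame(buf: Uint8Array): boolean {
  if (buf.length < 4) return false;
  // The magic number is stored as a little-endian unsigned 32-bit integer.
  const magic =
    (buf[0] | (buf[1] << 8) | (buf[2] << 16) | (buf[3] << 24)) >>> 0;
  return magic === ZSTD_FRAME_MAGIC;
}
```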
/**
* Parameters for Zstandard compression.
*
* @category Advanced API
*/
export enum CParameter {

@@ -15,2 +62,3 @@ compressionLevel,

strategy,
targetCBlockSize,
enableLongDistanceMatching,

@@ -29,2 +77,9 @@ ldmHashLog,

/**
* Parameters for Zstandard decompression.
*
* Corresponds to `ZSTD_dParameter`.
*
* @category Advanced API
*/
export enum DParameter {

@@ -34,14 +89,43 @@ windowLogMax,

/**
* Identifies whether to flush or close the current frame.
*
* Corresponds to `ZSTD_EndDirective`.
*
* @category Streaming
*/
export enum EndDirective {
/** Don't flush or end the frame */
continue,
/** Flush all data written so far */
flush,
/** Flush all data written so far and end the frame */
end,
}
/**
* Identifies what parts of a (de)compression context to reset.
*
* Corresponds to `ZSTD_ResetDirective`.
*
* @category Advanced API
*/
export enum ResetDirective {
/** Abort the current frame, but keep dictionary/parameters */
sessionOnly,
/** Reset the dictionary/parameters (only works if not in a frame) */
parameters,
/** Reset both the frame and dictionary/parameters */
sessionAndParameters,
}
/**
* Compression strategies.
*
* Used as values for {@link CParameter.strategy}.
*
* Corresponds to `ZSTD_strategy`.
*
* @category Advanced API
*/
export enum Strategy {

@@ -58,9 +142,75 @@ fast,

// 0: Return value of compressStream2/decompressStream
// 1: Bytes produced to dstBuf (output.pos)
// 2: Bytes consumed from srcBuf (input.pos)
type StreamResult = [number, number, number];
/**
* Composite return value for streaming (de)compression functions.
*
* These functions effectively return three values:
* - `returnValue`: the function's normal return value
* - `dstProduced`: the number of bytes written to the output buffer
* - `srcConsumed`: the number of bytes read from the input buffer
*
* @remarks
* The latter two of these are out parameters in C, and the most efficient way
* to map that to a JavaScript API is to return a composite value instead. We
* use a tuple for performance reasons: Node-API (unlike V8) doesn't have an API
* to efficiently construct objects with a fixed set of properties.
*
* @category Streaming
*/
type StreamResult = [
returnValue: number,
dstProduced: number,
srcConsumed: number,
];
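To show how the tuple is consumed, here is a sketch of the kind of drive loop used with `compressStream2`/`decompressStream`. The step function below is a pass-through stand-in so the sketch runs without the native module; a real loop would call the context method and also pass an `EndDirective`:

```typescript
// Shape of this module's StreamResult: [returnValue, dstProduced, srcConsumed].
type StreamTuple = [number, number, number];

// Drives a compressStream2-style step until all input is consumed and the
// step reports nothing left to flush (returnValue === 0), collecting output.
function driveStream(
  step: (dst: Uint8Array, src: Uint8Array) => StreamTuple,
  input: Uint8Array,
  dstSize: number,
): Uint8Array {
  const chunks: Uint8Array[] = [];
  let offset = 0;
  let ret = -1; // "unknown" until the step has run at least once
  while (offset < input.length || ret !== 0) {
    const dst = new Uint8Array(dstSize);
    const [r, produced, consumed] = step(dst, input.subarray(offset));
    offset += consumed;
    ret = r;
    if (produced > 0) chunks.push(dst.subarray(0, produced));
  }
  // Concatenate the collected output chunks.
  const out = new Uint8Array(chunks.reduce((n, c) => n + c.length, 0));
  let pos = 0;
  for (const c of chunks) {
    out.set(c, pos);
    pos += c.length;
  }
  return out;
}

// Stand-in step: a pass-through "compressor" that copies as much as fits and
// reports how many input bytes it hasn't absorbed yet, mimicking the real
// return value's "bytes left to flush" role.
const passThroughStep = (dst: Uint8Array, src: Uint8Array): StreamTuple => {
  const n = Math.min(dst.length, src.length);
  dst.set(src.subarray(0, n));
  return [src.length - n, n, n];
};
```

The loop's termination condition (keep going until `returnValue` is 0) is the important part; output buffer sizing and error handling are elided.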
/**
* Compression context.
*
* Wraps `ZSTD_CCtx` (which is also `ZSTD_CStream`). The finalizer automatically
* calls `ZSTD_freeCCtx` when this object is garbage collected.
*/
export class CCtx {
/**
* Creates a new compression context.
*
* Wraps `ZSTD_createCCtx`.
*/
constructor();
/**
* Compresses `srcBuf` into `dstBuf` at compression level `level`.
*
* `dstBuf` must be large enough to fit the entire result. See
* {@link compressBound} for a way to compute an upper bound on that size.
*
* This ignores any parameters set by {@link setParameter} and compresses
* at the level specified by `level`. See {@link compress2} for a similar
* function that respects those parameters.
*
* Wraps `ZSTD_compressCCtx`.
*
* @param dstBuf - Output buffer for compressed bytes
* @param srcBuf - Data to compress
* @param level - Compression level
* @returns Number of compressed bytes written to `dstBuf`
*/
compress(dstBuf: Uint8Array, srcBuf: Uint8Array, level: number): number;
/**
* Compresses `srcBuf` into `dstBuf`, using `dictBuf` as a dictionary.
*
* Works like {@link CCtx.compress | compress}, except it uses a dictionary.
*
* Wraps `ZSTD_compress_usingDict`.
*
* @remarks
* Loading the dictionary from a buffer is expensive. If the dictionary will
* be used more than once, it's better to load it into a {@link CDict} once
* and use {@link compressUsingCDict} instead.
*
* @param dstBuf - Output buffer for compressed bytes
* @param srcBuf - Data to compress
* @param dictBuf - Compression dictionary
* @param level - Compression level
* @returns Number of compressed bytes written to `dstBuf`
*/
compressUsingDict(

@@ -72,2 +222,16 @@ dstBuf: Uint8Array,

): number;
/**
* Compresses `srcBuf` into `dstBuf` using the prepared dictionary `dict`.
*
* Works like {@link CCtx.compress | compress}, except it uses a dictionary.
* The compression level is selected at dictionary load time.
*
* Wraps `ZSTD_compress_usingCDict`.
*
* @param dstBuf - Output buffer for compressed bytes
* @param srcBuf - Data to compress
* @param dict - Prepared dictionary
* @returns Number of compressed bytes written to `dstBuf`
*/
compressUsingCDict(

@@ -78,6 +242,71 @@ dstBuf: Uint8Array,

): number;
/**
* Set a compression parameter.
*
* Note that these parameters are only respected by the {@link compress2}
* and {@link compressStream2} methods.
*
* Wraps `ZSTD_CCtx_setParameter`.
*
* @param param - Parameter to set
* @param value - New parameter value
*/
setParameter(param: CParameter, value: number): void;
/**
* Set the uncompressed length of the next frame.
*
* Allows populating the header with the uncompressed size when using the
* streaming compression interface. Compression will throw an error if this
* commitment is not respected.
*
* Wraps `ZSTD_CCtx_setPledgedSrcSize`.
*/
setPledgedSrcSize(size: number): void;
/**
* Resets this compression context.
*
* The `reset` parameter controls what exactly is reset.
*
* Wraps `ZSTD_CCtx_reset`.
*/
reset(reset: ResetDirective): void;
/**
* Compresses `srcBuf` into `dstBuf`.
*
* Works like {@link CCtx.compress | compress}, except it respects the
* configuration set on this object with other methods.
*
* Wraps `ZSTD_compress2`.
*
* @param dstBuf - Output buffer for compressed bytes
* @param srcBuf - Data to compress
* @returns Number of compressed bytes written to `dstBuf`
*/
compress2(dstBuf: Uint8Array, srcBuf: Uint8Array): number;
/**
* Compresses `srcBuf` into `dstBuf` with a streaming interface.
*
* The `endOp` parameter indicates whether to flush data or end the frame.
*
* May consume all or part of `srcBuf`, and may partially write `dstBuf`.
* Returns a tuple with a bound on how many bytes are left to flush, how many
* bytes were written, and how many bytes were consumed.
*
* Wraps `ZSTD_compressStream2`.
*
* @remarks
* This function requires some care to use correctly; consult the {@link
* https://facebook.github.io/zstd/zstd_manual.html | Zstandard manual} for
* full usage information.
*
* @param dstBuf - Output buffer for compressed bytes
* @param srcBuf - Data to compress
* @param endOp - Whether to flush or end the frame
* @returns Compression progress information
*/
compressStream2(

@@ -88,12 +317,118 @@ dstBuf: Uint8Array,

): StreamResult;
/**
* Load a compression dictionary from `dictBuf`.
*
* Note that this dictionary will only be used by the {@link compress2} and
* {@link compressStream2} methods.
*
* Wraps `ZSTD_CCtx_loadDictionary`.
*/
loadDictionary(dictBuf: Uint8Array): void;
private __brand: 'CCtx';
}
/**
* Prepared dictionary for compression.
*
* Wraps `ZSTD_CDict`. The finalizer automatically calls `ZSTD_freeCDict` when
* this object is garbage collected.
*
* @category Dictionary
*/
export class CDict {
/**
* Load a dictionary for compression from the bytes in `dictBuf`.
*
* The compression level must be given in `level` and will override any
* compression level set when this dictionary is used.
*
* Wraps `ZSTD_createCDict`.
*/
constructor(dictBuf: Uint8Array, level: number);
/**
* Returns the ID for this dictionary.
*
* Wraps `ZSTD_getDictID_fromCDict`.
*
* @returns The ID, or 0 if this is a non-standard/content-only dictionary
*/
getDictID(): number;
private __brand: 'CDict';
}
/**
* Decompression context.
*
* Wraps `ZSTD_DCtx` (which is also `ZSTD_DStream`). The finalizer automatically
* calls `ZSTD_freeDCtx` when this object is garbage collected.
*/
export class DCtx {
/**
* Creates a new decompression context.
*
* Wraps `ZSTD_createDCtx`.
*/
constructor();
/**
* Decompresses `srcBuf` into `dstBuf`.
*
* `dstBuf` must be large enough to fit the entire result. `srcBuf` must end
* on a frame boundary (no partial frames or other trailing data).
*
* Wraps `ZSTD_decompressDCtx`.
*
* @remarks
* If the frame has the uncompressed size in the header, you can use
* {@link getFrameContentSize} to determine how big the buffer needs to be.
* If it's too large, or unknown, use {@link decompressStream} instead.
*
* @param dstBuf - Output buffer for decompressed bytes
* @param srcBuf - Data to decompress
* @returns Number of decompressed bytes written to `dstBuf`
*/
decompress(dstBuf: Uint8Array, srcBuf: Uint8Array): number;
/**
* Decompresses `srcBuf` into `dstBuf` with a streaming interface.
*
* May consume all or part of `srcBuf`, and may partially write `dstBuf`.
* Returns a tuple with a bound on how many bytes are left to flush, how many
* bytes were written, and how many bytes were consumed.
*
* Wraps `ZSTD_decompressStream`.
*
* @remarks
* This function requires some care to use correctly; consult the {@link
* https://facebook.github.io/zstd/zstd_manual.html | Zstandard manual} for
* full usage information.
*
* @param dstBuf - Output buffer for decompressed bytes
* @param srcBuf - Data to decompress
* @returns Decompression progress information
*/
decompressStream(dstBuf: Uint8Array, srcBuf: Uint8Array): StreamResult;
/**
* Decompresses `srcBuf` into `dstBuf`, using `dictBuf` as a dictionary.
*
* Works like {@link DCtx.decompress | decompress}, except it uses the
* provided dictionary instead of any set on this context.
*
* Wraps `ZSTD_decompress_usingDict`.
*
* @remarks
* Loading the dictionary from a buffer is expensive. If the dictionary will
* be used more than once, it's better to load it into a {@link DDict} once
* and use {@link decompressUsingDDict} instead.
*
* @param dstBuf - Output buffer for decompressed bytes
* @param srcBuf - Data to decompress
* @param dictBuf - Decompression dictionary
* @returns Number of decompressed bytes written to `dstBuf`
*/
decompressUsingDict(

@@ -104,2 +439,16 @@ dstBuf: Uint8Array,

): number;
/**
* Decompresses `srcBuf` into `dstBuf` using the prepared dictionary `dict`.
*
* Works like {@link DCtx.decompress | decompress}, except it uses the
* provided dictionary instead of any set on this context.
*
* Wraps `ZSTD_decompress_usingDDict`.
*
* @param dstBuf - Output buffer for decompressed bytes
* @param srcBuf - Data to decompress
* @param dict - Prepared dictionary
* @returns Number of decompressed bytes written to `dstBuf`
*/
decompressUsingDDict(

@@ -110,19 +459,110 @@ dstBuf: Uint8Array,

): number;
/**
* Set a decompression parameter.
*
* Wraps `ZSTD_DCtx_setParameter`.
*
* @param param - Parameter to set
* @param value - New parameter value
*/
setParameter(param: DParameter, value: number): void;
/**
* Resets this decompression context.
*
* The `reset` parameter controls what exactly is reset.
*
* Wraps `ZSTD_DCtx_reset`.
*/
reset(reset: ResetDirective): void;
/**
* Load a compression dictionary from `dictBuf`.
*
* This dictionary will be used by {@link DCtx.decompress | decompress} and
* {@link decompressStream}.
*
* Wraps `ZSTD_DCtx_loadDictionary`.
*/
loadDictionary(dictBuf: Uint8Array): void;
private __brand: 'DCtx';
}
/**
* Prepared dictionary for decompression.
*
* Wraps `ZSTD_DDict`. The finalizer automatically calls `ZSTD_freeDDict` when
* this object is garbage collected.
*
* @category Dictionary
*/
export class DDict {
/**
* Load a dictionary for decompression from the bytes in `dictBuf`.
*
* Wraps `ZSTD_createDDict`.
*/
constructor(dictBuf: Uint8Array);
/**
* Returns the ID for this dictionary.
*
* Wraps `ZSTD_getDictID_fromDDict`.
*
* @returns The ID, or 0 if this is a non-standard/content-only dictionary
*/
getDictID(): number;
private __brand: 'DDict';
}
/**
* Inclusive lower and upper bounds for a parameter.
*
* Corresponds to `ZSTD_bounds`.
*
* @category Advanced API
*/
export interface Bounds {
/** Minimum allowed value */
lowerBound: number;
/** Maximum allowed value */
upperBound: number;
}
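A `Bounds` value is typically used to validate or clamp a candidate parameter value before calling `setParameter`; a minimal sketch:

```typescript
// Shape of the Bounds interface above.
interface ParamBounds {
  lowerBound: number;
  upperBound: number;
}

// Clamps a candidate value into the inclusive range reported by
// cParamGetBounds/dParamGetBounds before it is passed to setParameter.
function clampToBounds(value: number, bounds: ParamBounds): number {
  return Math.min(Math.max(value, bounds.lowerBound), bounds.upperBound);
}
```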
/**
* Returns the Zstandard library version as a number.
*
* Wraps `ZSTD_versionNumber`.
*/
export function versionNumber(): number;
/**
* Returns the Zstandard library version as a string.
*
* Wraps `ZSTD_versionString`.
*/
export function versionString(): string;
/**
* Compresses `srcBuf` into `dstBuf` at compression level `level`.
*
* `dstBuf` must be large enough to fit the entire result. See
* {@link compressBound} for a way to compute an upper bound on that size.
*
* Wraps `ZSTD_compress`.
*
* @remarks
* This function is here for completeness; creating a {@link CCtx} and reusing
* it will give better performance than calling this repeatedly. The high-level
* {@link index.compress | compress} function will take care of this for you.
*
* @param dstBuf - Output buffer for compressed bytes
* @param srcBuf - Data to compress
* @param level - Compression level
* @returns Number of compressed bytes written to `dstBuf`
* @category Simple API
*/
export function compress(

@@ -133,15 +573,145 @@ dstBuf: Uint8Array,

): number;
/**
* Decompresses `srcBuf` into `dstBuf`.
*
* `dstBuf` must be large enough to fit the entire result. `srcBuf` must end on
* a frame boundary (no partial frames or other trailing data).
*
* Wraps `ZSTD_decompress`.
*
* @remarks
* This function is here for completeness; creating a {@link DCtx} and reusing
* it will give better performance than calling this repeatedly. The high-level
* {@link index.decompress | decompress} function will take care of this for
* you.
*
* @param dstBuf - Output buffer for decompressed bytes
* @param srcBuf - Data to decompress
* @returns Number of decompressed bytes written to `dstBuf`
* @category Simple API
*/
export function decompress(dstBuf: Uint8Array, srcBuf: Uint8Array): number;
/**
* Returns the number of decompressed bytes in the provided frame.
*
* Wraps `ZSTD_getFrameContentSize`.
*
* @param frameBuf - Buffer with Zstandard frame (or frame header)
* @returns Number of decompressed bytes, or `null` if unknown
* @category Simple API
*/
export function getFrameContentSize(frameBuf: Uint8Array): number | null;
/**
* Returns the size of the first compressed frame in `frameBuf`.
*
* @param frameBuf - Buffer containing at least one complete Zstandard frame
* @returns Size of the first frame in `frameBuf`
* @category Simple API
*/
export function findFrameCompressedSize(frameBuf: Uint8Array): number;
/**
* Returns worst-case maximum compressed size for an input of `srcSize` bytes.
*
* @category Simple API
*/
export function compressBound(srcSize: number): number;
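For reference, the upstream bound is computed by the `ZSTD_COMPRESSBOUND` macro; a pure-JS mirror of it (based on the macro in recent zstd releases — the binding's `compressBound()` remains the authoritative implementation) looks like:

```typescript
// Pure-JS mirror of the ZSTD_COMPRESSBOUND macro from zstd.h (as of recent
// zstd releases); shown only to explain where the bound comes from. The
// binding's compressBound() is the authoritative implementation.
function compressBoundJs(srcSize: number): number {
  // Small inputs get extra headroom: one byte per 2 KiB of "missing" size
  // below the 128 KiB block threshold.
  const margin = srcSize < 128 * 1024 ? (128 * 1024 - srcSize) >> 11 : 0;
  return srcSize + (srcSize >> 8) + margin;
}
```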
/**
* Returns the minimum valid compression level.
*
* @category Simple API
*/
export function minCLevel(): number;
/**
* Returns the maximum valid compression level.
*
* @category Simple API
*/
export function maxCLevel(): number;
/**
* Returns the default compression level.
*
* @category Simple API
*/
export function defaultCLevel(): number;
/**
* Get upper and lower bounds for a compression parameter.
*
* Wraps `ZSTD_cParam_getBounds`.
*
* @category Advanced API
*/
export function cParamGetBounds(param: CParameter): Bounds;
/**
* Get upper and lower bounds for a decompression parameter.
*
* Wraps `ZSTD_dParam_getBounds`.
*
* @category Advanced API
*/
export function dParamGetBounds(param: DParameter): Bounds;
/**
* Returns the recommended size of a streaming compression input buffer.
*
* Wraps `ZSTD_CStreamInSize`.
*
* @category Streaming
*/
export function cStreamInSize(): number;
/**
* Returns the recommended size of a streaming compression output buffer.
*
* Wraps `ZSTD_CStreamOutSize`.
*
* @category Streaming
*/
export function cStreamOutSize(): number;
/**
* Returns the recommended size of a streaming decompression input buffer.
*
* Wraps `ZSTD_DStreamInSize`.
*
* @category Streaming
*/
export function dStreamInSize(): number;
/**
* Returns the recommended size of a streaming decompression output buffer.
*
* Wraps `ZSTD_DStreamOutSize`.
*
* @category Streaming
*/
export function dStreamOutSize(): number;
/**
* Returns the dictionary ID stored in the provided dictionary.
*
* Wraps `ZSTD_getDictID_fromDict`.
*
* @param dictBuf - Buffer containing the dictionary
* @returns The dictionary ID, or 0 if the buffer does not contain a dictionary
* @category Dictionary
*/
export function getDictIDFromDict(dictBuf: Uint8Array): number;
/**
* Returns the dictionary ID recorded in a Zstandard frame.
*
* @param frameBuf - Buffer containing a Zstandard frame
* @returns The dictionary ID, or 0 if the frame header doesn't include one (or
* if the buffer doesn't contain a valid frame header)
* @category Dictionary
*/
export function getDictIDFromFrame(frameBuf: Uint8Array): number;

@@ -91,3 +91,3 @@ Zstandard library files

- While invoking `make libzstd`, it's possible to define build macros
`ZSTD_LIB_COMPRESSION, ZSTD_LIB_DECOMPRESSION`, `ZSTD_LIB_DICTBUILDER`,
`ZSTD_LIB_COMPRESSION`, `ZSTD_LIB_DECOMPRESSION`, `ZSTD_LIB_DICTBUILDER`,
and `ZSTD_LIB_DEPRECATED` as `0` to forgo compilation of the

@@ -123,2 +123,11 @@ corresponding features. This will also disable compilation of all

On the compressor side, Zstd's compression levels map to several internal
strategies. In environments where the higher compression levels aren't used,
it is possible to exclude all but the fastest strategy with
`ZSTD_LIB_EXCLUDE_COMPRESSORS_DFAST_AND_UP=1`. (Note that this will change
the behavior of the default compression level.) Or if you want to retain the
default compressor as well, you can set
`ZSTD_LIB_EXCLUDE_COMPRESSORS_GREEDY_AND_UP=1`, at the cost of an additional
~20KB or so.
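As an illustration, a size-focused build combining these macros might be invoked like this (the `libzstd` target is the one referenced above; treat the flag combination as an example, not a recommendation):

```shell
# Illustrative size-reduced static-library build; macro names are from the
# text above. This particular combination keeps only the fastest compression
# strategy and drops deprecated and dictionary-builder code.
make libzstd \
  ZSTD_LIB_EXCLUDE_COMPRESSORS_DFAST_AND_UP=1 \
  ZSTD_LIB_DEPRECATED=0 \
  ZSTD_LIB_DICTBUILDER=0
```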
For squeezing the last ounce of size out, you can also define

@@ -174,2 +183,6 @@ `ZSTD_NO_INLINE`, which disables inlining, and `ZSTD_STRIP_ERROR_STRINGS`,

- The C compiler macro `HUF_DISABLE_FAST_DECODE` disables the newer Huffman fast C
and assembly decoding loops. You may want to use this macro if these loops are
slower on your platform.
#### Windows : using MinGW+MSYS to create DLL

@@ -176,0 +189,0 @@


deps/zstd/README.md

@@ -8,3 +8,3 @@ <p align="center"><img src="https://raw.githubusercontent.com/facebook/zstd/dev/doc/images/zstd_logo86.png" alt="Zstandard"></p>

Zstandard's format is stable and documented in [RFC8878](https://datatracker.ietf.org/doc/html/rfc8878). Multiple independent implementations are already available.
This repository represents the reference implementation, provided as an open-source dual [BSD](LICENSE) and [GPLv2](COPYING) licensed **C** library,
This repository represents the reference implementation, provided as an open-source dual [BSD](LICENSE) OR [GPLv2](COPYING) licensed **C** library,
and a command line utility producing and decoding `.zst`, `.gz`, `.xz` and `.lz4` files.

@@ -202,2 +202,6 @@ Should your project require another programming language,

### Bazel
You can easily integrate zstd into your Bazel project by using the module hosted on the [Bazel Central Repository](https://registry.bazel.build/modules/zstd).
## Testing

@@ -218,3 +222,3 @@

Zstandard is dual-licensed under [BSD](LICENSE) and [GPLv2](COPYING).
Zstandard is dual-licensed under [BSD](LICENSE) OR [GPLv2](COPYING).

@@ -221,0 +225,0 @@ ## Contributing

@@ -5,24 +5,76 @@ /// <reference types="node" />

import * as binding from '../binding';
export type StrategyName = keyof typeof binding.Strategy;
/**
* Zstandard compression parameters.
*
* Most applications will only need the {@link compressionLevel} parameter. See
* the {@link https://facebook.github.io/zstd/zstd_manual.html | Zstandard manual}
* for a full description.
*/
export interface CompressParameters {
compressionLevel: number;
windowLog: number;
hashLog: number;
chainLog: number;
searchLog: number;
minMatch: number;
targetLength: number;
strategy: StrategyName;
enableLongDistanceMatching: boolean;
ldmHashLog: number;
ldmMinMatch: number;
ldmBucketSizeLog: number;
ldmHashRateLog: number;
contentSizeFlag: boolean;
checksumFlag: boolean;
dictIDFlag: boolean;
nbWorkers: number;
jobSize: number;
overlapLog: number;
/**
* Compression level, where higher numbers compress better but are slower.
*
* Typical values range from 1 to 9, with a default of 3, but values up to 22
* are allowed, as are negative values (see {@link binding.minCLevel}). Zero
* is interpreted as "use the default".
*
* @category Basic parameters
*/
compressionLevel?: number | undefined;
/** @category Advanced compression options */
windowLog?: number | undefined;
/** @category Advanced compression options */
hashLog?: number | undefined;
/** @category Advanced compression options */
chainLog?: number | undefined;
/** @category Advanced compression options */
searchLog?: number | undefined;
/** @category Advanced compression options */
minMatch?: number | undefined;
/** @category Advanced compression options */
targetLength?: number | undefined;
/** @category Advanced compression options */
strategy?: keyof typeof binding.Strategy | undefined;
/** @category Advanced compression options */
targetCBlockSize?: number | undefined;
/** @category Long-distance matching */
enableLongDistanceMatching?: boolean | undefined;
/** @category Long-distance matching */
ldmHashLog?: number | undefined;
/** @category Long-distance matching */
ldmMinMatch?: number | undefined;
/** @category Long-distance matching */
ldmBucketSizeLog?: number | undefined;
/** @category Long-distance matching */
ldmHashRateLog?: number | undefined;
/** @category Frame parameters */
contentSizeFlag?: boolean | undefined;
/** @category Frame parameters */
checksumFlag?: boolean | undefined;
/** @category Frame parameters */
dictIDFlag?: boolean | undefined;
/** @category Multi-threading parameters */
nbWorkers?: number | undefined;
/** @category Multi-threading parameters */
jobSize?: number | undefined;
/** @category Multi-threading parameters */
overlapLog?: number | undefined;
}
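Since every field is optional, a typical parameter object only names what it changes. An illustrative value (the specific numbers are arbitrary examples, not recommendations):

```typescript
// Illustrative CompressParameters value. Field names come from the interface
// above; the numbers are example settings, not recommendations.
const params = {
  compressionLevel: 19,
  enableLongDistanceMatching: true,
  checksumFlag: true,
  nbWorkers: 4,
};
// Fields left out (windowLog, strategy, ...) keep their zstd defaults.
```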
/**
* High-level interface for customized single-pass Zstandard compression.
*
* @example Basic usage
* ```
* const cmp = new Compressor();
* const result = cmp.compress(Buffer.from('your data here'));
* ```
*
* @example Advanced usage
* ```
* const cmp = new Compressor();
* cmp.setParameters({compressionLevel: 9});
* cmp.loadDictionary(fs.readFileSync('path/to/dictionary.dct'));
* const result = cmp.compress(Buffer.from('your data here'));
* ```
*/
export declare class Compressor {

@@ -32,16 +84,87 @@ private cctx;

private scratchLen;
/**
* Compress the data in `buffer` with the configured dictionary/parameters.
*
* @param buffer - Data to compress
* @returns A new Buffer containing the compressed data
*/
compress(buffer: Uint8Array): Buffer;
loadDictionary(data: Buffer): void;
setParameters(parameters: Partial<CompressParameters>): void;
updateParameters(parameters: Partial<CompressParameters>): void;
/**
* Load a compression dictionary from the provided buffer.
*
* The loaded dictionary will be used for all future {@link compress} calls
* until removed or replaced. Passing an empty buffer to this function will
* remove a previously loaded dictionary.
*
* Set any parameters you want to set before loading a dictionary, since
* parameters can't be changed while a dictionary is loaded.
*/
loadDictionary(data: Uint8Array): void;
/**
* Reset the compressor state to only the provided parameters.
*
* Any loaded dictionary will be cleared, and any parameters not specified
* will be reset to their default values.
*/
setParameters(parameters: CompressParameters): void;
/**
* Modify compression parameters.
*
* Parameters not specified will be left at their current values. Changing
* parameters is not possible while a dictionary is loaded.
*/
updateParameters(parameters: CompressParameters): void;
}
/**
* High-level interface for streaming Zstandard compression.
*
* Implements the standard Node stream transformer interface, so can be used
* with `.pipe` or any other streaming interface.
*
* @example Basic usage
* ```
* import { pipeline } from 'stream/promises';
* const cmp = new CompressStream();
* await pipeline(
* fs.createReadStream('data.txt'),
* new CompressStream(),
* fs.createWriteStream('data.txt.zst'),
* );
* ```
*/
export declare class CompressStream extends Transform {
private cctx;
private buffer;
constructor(parameters?: Partial<CompressParameters>);
/**
* Create a new streaming compressor with the specified parameters.
*
* @param parameters - Compression parameters
*/
constructor(parameters?: CompressParameters);
/**
* End the current Zstandard frame without ending the stream.
*
* Frames are compressed independently, so this can be used to create a
* "seekable" archive, or to provide more resilience to data corruption by
* isolating parts of the file from each other.
*
* The optional `callback` is invoked with the same semantics as it is for a
* stream write.
*/
endFrame(callback?: (error?: Error | null) => void): void;
/**
* Flush internal compression buffers to the stream.
*
* Ensures that a receiver can decompress all bytes written so far, with less
* negative impact on compression than {@link endFrame}.
*
* The optional `callback` is invoked with the same semantics as it is for a
* stream write.
*/
flush(callback?: (error?: Error | null) => void): void;
private doCompress;
/** @internal */
_transform(chunk: unknown, _encoding: string, done: TransformCallback): void;
/** @internal */
_flush(done: TransformCallback): void;
}
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.CompressStream = exports.Compressor = void 0;
const assert_1 = require("assert");
const stream_1 = require("stream");
const binding = require("../binding");
const util_1 = require("./util");
const PARAM_MAPPERS = {
compressionLevel: util_1.mapNumber,
// Advanced compression options
windowLog: util_1.mapNumber,
hashLog: util_1.mapNumber,
chainLog: util_1.mapNumber,
searchLog: util_1.mapNumber,
minMatch: util_1.mapNumber,
targetLength: util_1.mapNumber,
strategy: (0, util_1.mapEnum)(binding.Strategy),
targetCBlockSize: util_1.mapNumber,
// Long-distance matching options
enableLongDistanceMatching: util_1.mapBoolean,
ldmHashLog: util_1.mapNumber,
ldmMinMatch: util_1.mapNumber,
ldmBucketSizeLog: util_1.mapNumber,
ldmHashRateLog: util_1.mapNumber,
// Frame parameters
contentSizeFlag: util_1.mapBoolean,
checksumFlag: util_1.mapBoolean,
dictIDFlag: util_1.mapBoolean,
// Multi-threading parameters
nbWorkers: util_1.mapNumber,
jobSize: util_1.mapNumber,
overlapLog: util_1.mapNumber,
};
function updateCCtxParameters(cctx, parameters) {
for (const [key, value] of Object.entries(parameters)) {
if (value === undefined)
continue;
const name = key;
let mapped;
switch (name) {
case 'compressionLevel':
case 'windowLog':
case 'hashLog':
case 'chainLog':
case 'searchLog':
case 'minMatch':
case 'targetLength':
case 'ldmHashLog':
case 'ldmMinMatch':
case 'ldmBucketSizeLog':
case 'ldmHashRateLog':
case 'nbWorkers':
case 'jobSize':
case 'overlapLog':
mapped = parameters[name];
break;
case 'enableLongDistanceMatching':
case 'contentSizeFlag':
case 'checksumFlag':
case 'dictIDFlag':
mapped = Number(parameters[name]);
break;
case 'strategy':
mapped = binding.Strategy[parameters[name]];
break;
default:
(0, util_1.assertInvalidParameter)(name);
}
cctx.setParameter(binding.CParameter[name], mapped);
const mapped = (0, util_1.mapParameters)(binding.CParameter, PARAM_MAPPERS, parameters);
for (const [param, value] of mapped) {
cctx.setParameter(param, value);
}
}
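The mapping above sends numbers through unchanged, converts booleans to 0/1, and resolves strategy names through the `Strategy` enum before calling `setParameter`. The same shape can be sketched with stand-in enum tables (hypothetical numeric values, for illustration only):

```typescript
// Stand-in enum tables with hypothetical numeric values; the real ones are
// binding.CParameter and binding.Strategy.
const FakeCParameter = { compressionLevel: 100, checksumFlag: 201, strategy: 107 } as const;
const FakeStrategy = { fast: 1, btultra2: 9 } as const;

interface FakeParams {
  compressionLevel?: number;
  checksumFlag?: boolean;
  strategy?: keyof typeof FakeStrategy;
}

// Maps a high-level parameter object to [parameterId, numericValue] pairs,
// the shape ultimately fed to cctx.setParameter: numbers pass through,
// booleans become 0/1, strategy names resolve through the enum table.
function mapParams(p: FakeParams): Array<[number, number]> {
  const out: Array<[number, number]> = [];
  if (p.compressionLevel !== undefined) {
    out.push([FakeCParameter.compressionLevel, p.compressionLevel]);
  }
  if (p.checksumFlag !== undefined) {
    out.push([FakeCParameter.checksumFlag, Number(p.checksumFlag)]);
  }
  if (p.strategy !== undefined) {
    out.push([FakeCParameter.strategy, FakeStrategy[p.strategy]]);
  }
  return out;
}
```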
/**
* High-level interface for customized single-pass Zstandard compression.
*
* @example Basic usage
* ```
* const cmp = new Compressor();
* const result = cmp.compress(Buffer.from('your data here'));
* ```
*
* @example Advanced usage
* ```
* const cmp = new Compressor();
* cmp.setParameters({compressionLevel: 9});
* cmp.loadDictionary(fs.readFileSync('path/to/dictionary.dct'));
* const result = cmp.compress(Buffer.from('your data here'));
* ```
*/
class Compressor {
constructor() {
this.cctx = new binding.CCtx();
this.scratchBuf = null;
this.scratchLen = -1;
}
cctx = new binding.CCtx();
scratchBuf = null;
scratchLen = -1;
/**
* Compress the data in `buffer` with the configured dictionary/parameters.
*
* @param buffer - Data to compress
* @returns A new Buffer containing the compressed data
*/
compress(buffer) {

@@ -63,3 +79,3 @@ let dest;

// Destination buffer is too wasteful, trim by copying
result = Buffer.from(dest.slice(0, length));
result = Buffer.from(dest.subarray(0, length));
// Save the old buffer for scratch if it's small enough

@@ -73,3 +89,3 @@ if (dest.length <= 128 * 1024 && buffer.length > this.scratchLen) {

// Destination buffer is about the right size, return it directly
result = dest.slice(0, length);
result = dest.subarray(0, length);
// Make sure we don't re-use the scratch buffer if we're returning it

@@ -83,2 +99,12 @@ if (Object.is(dest, this.scratchBuf)) {

}
/**
* Load a compression dictionary from the provided buffer.
*
* The loaded dictionary will be used for all future {@link compress} calls
* until removed or replaced. Passing an empty buffer to this function will
* remove a previously loaded dictionary.
*
* Set any parameters you want to set before loading a dictionary, since
* parameters can't be changed while a dictionary is loaded.
*/
loadDictionary(data) {

@@ -90,2 +116,8 @@ // TODO: Compression parameters get locked in on next compress operation,

}
/**
* Reset the compressor state to only the provided parameters.
*
* Any loaded dictionary will be cleared, and any parameters not specified
* will be reset to their default values.
*/
setParameters(parameters) {

@@ -95,2 +127,8 @@ this.cctx.reset(binding.ResetDirective.parameters);

}
/**
* Modify compression parameters.
*
* Parameters not specified will be left at their current values. Changing
* parameters is not possible while a dictionary is loaded.
*/
updateParameters(parameters) {

@@ -104,4 +142,28 @@ updateCCtxParameters(this.cctx, parameters);

const dummyEndBuffer = Buffer.alloc(0);
/**
* High-level interface for streaming Zstandard compression.
*
* Implements the standard Node stream transformer interface, so can be used
* with `.pipe` or any other streaming interface.
*
* @example Basic usage
* ```
* import { pipeline } from 'stream/promises';
* const cmp = new CompressStream();
* await pipeline(
* fs.createReadStream('data.txt'),
* new CompressStream(),
* fs.createWriteStream('data.txt.zst'),
* );
* ```
*/
class CompressStream extends stream_1.Transform {
cctx = new binding.CCtx();
buffer = Buffer.allocUnsafe(BUF_SIZE);
// TODO: Allow user to specify a dictionary
/**
* Create a new streaming compressor with the specified parameters.
*
* @param parameters - Compression parameters
*/
constructor(parameters = {}) {

@@ -111,4 +173,2 @@ // TODO: autoDestroy doesn't really work on Transform, we should consider

super({ autoDestroy: true });
-this.cctx = new binding.CCtx();
-this.buffer = Buffer.allocUnsafe(BUF_SIZE);
updateCCtxParameters(this.cctx, parameters);

@@ -118,5 +178,24 @@ }

// TODO: Provide API to allow changing parameters between frames
/**
* End the current Zstandard frame without ending the stream.
*
* Frames are compressed independently, so this can be used to create a
* "seekable" archive, or to provide more resilience to data corruption by
* isolating parts of the file from each other.
*
* The optional `callback` is invoked with the same semantics as for a
* stream write.
*/
endFrame(callback) {
this.write(dummyEndBuffer, undefined, callback);
}
/**
* Flush internal compression buffers to the stream.
*
* Ensures that a receiver can decompress all bytes written so far, with
* less impact on the compression ratio than {@link endFrame}.
*
* The optional `callback` is invoked with the same semantics as for a
* stream write.
*/
flush(callback) {

@@ -130,6 +209,6 @@ this.write(dummyFlushBuffer, undefined, callback);

if (produced > 0) {
-this.push(this.buffer.slice(0, produced));
+this.push(this.buffer.subarray(0, produced));
this.buffer = Buffer.allocUnsafe(Math.max(BUF_SIZE, ret));
}
-chunk = chunk.slice(consumed);
+chunk = chunk.subarray(consumed);
if (chunk.length == 0 && (!flushing || ret == 0))

@@ -139,6 +218,7 @@ return;

}
/** @internal */
_transform(chunk, _encoding, done) {
try {
// The Writable machinery is responsible for converting to a Buffer
-(0, util_1.tsAssert)(chunk instanceof Buffer);
+(0, assert_1.strict)(chunk instanceof Buffer);
// Handle flushes indicated by special dummy buffers

@@ -153,6 +233,9 @@ let endType = binding.EndDirective.continue;

catch (err) {
-return done(err);
+done(err);
+return;
}
-return done();
+done();
+return;
}
/** @internal */
_flush(done) {

@@ -163,7 +246,9 @@ try {

catch (err) {
-return done(err);
+done(err);
+return;
}
-return done();
+done();
+return;
}
}
exports.CompressStream = CompressStream;
/// <reference types="node" />
/// <reference types="node" />
import { Transform, TransformCallback } from 'stream';
/**
* Zstandard decompression parameters.
*
* Most applications will not need to adjust these. See the
* {@link https://facebook.github.io/zstd/zstd_manual.html | Zstandard manual}
* for a full description.
*/
export interface DecompressParameters {
-windowLogMax: number;
+windowLogMax?: number | undefined;
}
/**
* High-level interface for customized single-pass Zstandard decompression.
*
* @example Basic usage
* ```
* const dec = new Decompressor();
* const result = dec.decompress(compressedBuffer);
* ```
*
* @example Advanced usage
* ```
* const dec = new Decompressor();
* dec.setParameters({windowLogMax: 24});
* dec.loadDictionary(fs.readFileSync('path/to/dictionary.dct'));
* const result = dec.decompress(compressedBuffer);
* ```
*/
export declare class Decompressor {
private dctx;
/**
* Decompress the data in `buffer` with the configured dictionary/parameters.
*
* @param buffer - Compressed data
* @returns A new buffer with the uncompressed data
*/
decompress(buffer: Uint8Array): Buffer;
-loadDictionary(data: Buffer): void;
-setParameters(parameters: Partial<DecompressParameters>): void;
-updateParameters(parameters: Partial<DecompressParameters>): void;
/**
* Load a dictionary for decompression from the provided buffer.
*
* The loaded dictionary will be used for all future {@link decompress} calls
* until removed or replaced. Passing an empty buffer to this function will
* remove a previously loaded dictionary.
*/
loadDictionary(data: Uint8Array): void;
/**
* Reset the decompressor state to only the provided parameters.
*
* Any loaded dictionary will be cleared, and any parameters not specified
* will be reset to their default values.
*/
setParameters(parameters: DecompressParameters): void;
/**
* Modify decompression parameters.
*
* Parameters not specified will be left at their current values.
*/
updateParameters(parameters: DecompressParameters): void;
}
/**
* High-level interface for streaming Zstandard decompression.
*
* Implements the standard Node stream transformer interface, so can be used
* with `.pipe` or any other streaming interface.
*
* @example Basic usage
* ```
* import { pipeline } from 'stream/promises';
* await pipeline(
* fs.createReadStream('data.txt.zst'),
* new DecompressStream(),
* fs.createWriteStream('data.txt'),
* );
* ```
*/
export declare class DecompressStream extends Transform {
private dctx;
private inFrame;
-constructor(parameters?: Partial<DecompressParameters>);
/**
* Create a new streaming decompressor with the specified parameters.
*
* @param parameters - Decompression parameters
*/
constructor(parameters?: DecompressParameters);
/** @internal */
_transform(chunk: unknown, _encoding: string, done: TransformCallback): void;
/** @internal */
_flush(done: TransformCallback): void;
}

@@ -8,16 +8,9 @@ "use strict";

const util_1 = require("./util");
+const PARAM_MAPPERS = {
+    windowLogMax: util_1.mapNumber,
+};
function updateDCtxParameters(dctx, parameters) {
-for (const [key, value] of Object.entries(parameters)) {
-    if (value === undefined)
-        continue;
-    const name = key;
-    let mapped;
-    switch (name) {
-        case 'windowLogMax':
-            mapped = Number(parameters[name]);
-            break;
-        default:
-            (0, util_1.assertInvalidParameter)(name);
-    }
-    dctx.setParameter(binding.DParameter[name], mapped);
+const mapped = (0, util_1.mapParameters)(binding.DParameter, PARAM_MAPPERS, parameters);
+for (const [param, value] of mapped) {
+    dctx.setParameter(param, value);
}

@@ -38,6 +31,27 @@ }

const BUF_SIZE = binding.dStreamOutSize();
/**
* High-level interface for customized single-pass Zstandard decompression.
*
* @example Basic usage
* ```
* const dec = new Decompressor();
* const result = dec.decompress(compressedBuffer);
* ```
*
* @example Advanced usage
* ```
* const dec = new Decompressor();
* dec.setParameters({windowLogMax: 24});
* dec.loadDictionary(fs.readFileSync('path/to/dictionary.dct'));
* const result = dec.decompress(compressedBuffer);
* ```
*/
class Decompressor {
-constructor() {
-    this.dctx = new binding.DCtx();
-}
+dctx = new binding.DCtx();
/**
* Decompress the data in `buffer` with the configured dictionary/parameters.
*
* @param buffer - Compressed data
* @returns A new buffer with the uncompressed data
*/
decompress(buffer) {

@@ -70,3 +84,3 @@ // TODO: Default allocation limit, with option to override

const [, produced, consumed] = this.dctx.decompressStream(chunk, remainingInput);
-resultChunks.push(chunk.slice(0, produced));
+resultChunks.push(chunk.subarray(0, produced));
remainingInput = remainingInput.subarray(consumed);

@@ -77,5 +91,18 @@ }

}
/**
* Load a dictionary for decompression from the provided buffer.
*
* The loaded dictionary will be used for all future {@link decompress} calls
* until removed or replaced. Passing an empty buffer to this function will
* remove a previously loaded dictionary.
*/
loadDictionary(data) {
this.dctx.loadDictionary(data);
}
/**
* Reset the decompressor state to only the provided parameters.
*
* Any loaded dictionary will be cleared, and any parameters not specified
* will be reset to their default values.
*/
setParameters(parameters) {

@@ -85,2 +112,7 @@ this.dctx.reset(binding.ResetDirective.parameters);

}
/**
* Modify decompression parameters.
*
* Parameters not specified will be left at their current values.
*/
updateParameters(parameters) {

@@ -91,4 +123,27 @@ updateDCtxParameters(this.dctx, parameters);

exports.Decompressor = Decompressor;
/**
* High-level interface for streaming Zstandard decompression.
*
* Implements the standard Node stream transformer interface, so can be used
* with `.pipe` or any other streaming interface.
*
* @example Basic usage
* ```
* import { pipeline } from 'stream/promises';
* await pipeline(
* fs.createReadStream('data.txt.zst'),
* new DecompressStream(),
* fs.createWriteStream('data.txt'),
* );
* ```
*/
class DecompressStream extends stream_1.Transform {
dctx = new binding.DCtx();
inFrame = false;
// TODO: Allow user to specify a dictionary
/**
* Create a new streaming decompressor with the specified parameters.
*
* @param parameters - Decompression parameters
*/
constructor(parameters = {}) {

@@ -98,6 +153,5 @@ // TODO: autoDestroy doesn't really work on Transform, we should consider

super({ autoDestroy: true });
-this.dctx = new binding.DCtx();
-this.inFrame = false;
updateDCtxParameters(this.dctx, parameters);
}
/** @internal */
_transform(chunk, _encoding, done) {

@@ -107,3 +161,3 @@ // TODO: Optimize this by looking at the frame header

// The Writable machinery is responsible for converting to a Buffer
-(0, util_1.tsAssert)(chunk instanceof Buffer);
+(0, assert_1.strict)(chunk instanceof Buffer);
let srcBuf = chunk;

@@ -114,4 +168,4 @@ for (;;) {

if (produced > 0)
-this.push(dstBuf.slice(0, produced));
-srcBuf = srcBuf.slice(consumed);
+this.push(dstBuf.subarray(0, produced));
+srcBuf = srcBuf.subarray(consumed);
if (srcBuf.length === 0 && (produced < dstBuf.length || ret === 0)) {

@@ -124,12 +178,18 @@ this.inFrame = ret !== 0;

catch (err) {
-return done(err);
+done(err);
+return;
}
-return done();
+done();
+return;
}
/** @internal */
_flush(done) {
-if (this.inFrame)
-    return done(new Error('Stream ended in middle of compressed data frame'));
-return done();
+if (this.inFrame) {
+    done(new Error('Stream ended in middle of compressed data frame'));
+    return;
+}
+done();
+return;
}
}
exports.DecompressStream = DecompressStream;

@@ -1,2 +0,22 @@

-export { CompressParameters, CompressStream, Compressor, StrategyName, } from './compress';
-export { DecompressParameters, DecompressStream, Decompressor, } from './decompress';
/**
* This module (imported as `zstd-napi`) provides a high-level interface for
* Zstandard compression and decompression. If you aren't sure what you need,
* this is the right place to start!
*
* - The {@link compress} and {@link decompress} functions are the simplest,
* single-pass (in-memory) interface.
* - The {@link Compressor} and {@link Decompressor} classes provide a
* single-pass interface with dictionary support.
* - The {@link CompressStream} and {@link DecompressStream} classes provide
* a streaming interface.
*
* If you're looking for low-level bindings to the native Zstandard library,
* see the {@link "binding" | binding module}.
*
* @module index
*/
export { CompressStream, Compressor } from './compress';
export type { CompressParameters } from './compress';
export { DecompressStream, Decompressor } from './decompress';
export type { DecompressParameters } from './decompress';
export { compress, decompress } from './simple';
"use strict";
/**
* This module (imported as `zstd-napi`) provides a high-level interface for
* Zstandard compression and decompression. If you aren't sure what you need,
* this is the right place to start!
*
* - The {@link compress} and {@link decompress} functions are the simplest,
* single-pass (in-memory) interface.
* - The {@link Compressor} and {@link Decompressor} classes provide a
* single-pass interface with dictionary support.
* - The {@link CompressStream} and {@link DecompressStream} classes provide
* a streaming interface.
*
* If you're looking for low-level bindings to the native Zstandard library,
* see the {@link "binding" | binding module}.
*
* @module index
*/
Object.defineProperty(exports, "__esModule", { value: true });
-exports.Decompressor = exports.DecompressStream = exports.Compressor = exports.CompressStream = void 0;
+exports.decompress = exports.compress = exports.Decompressor = exports.DecompressStream = exports.Compressor = exports.CompressStream = void 0;
var compress_1 = require("./compress");

@@ -10,1 +27,4 @@ Object.defineProperty(exports, "CompressStream", { enumerable: true, get: function () { return compress_1.CompressStream; } });

Object.defineProperty(exports, "Decompressor", { enumerable: true, get: function () { return decompress_1.Decompressor; } });
var simple_1 = require("./simple");
Object.defineProperty(exports, "compress", { enumerable: true, get: function () { return simple_1.compress; } });
Object.defineProperty(exports, "decompress", { enumerable: true, get: function () { return simple_1.decompress; } });

@@ -1,2 +0,14 @@

-export declare function assertInvalidParameter(parameter: never): never;
-export declare function tsAssert(value: unknown, msg?: string | Error): asserts value;
interface ParamMapper<T> {
validateInput(value: unknown): value is T;
mapValue(value: T): number;
}
type ParamObject<M> = {
[key in keyof M]?: M[key] extends ParamMapper<infer T> ? T | undefined : never;
};
type StrKeys<O> = Extract<keyof O, string>;
type OnlyKeys<O, K> = O & Record<Exclude<keyof O, K>, never>;
export declare const mapNumber: ParamMapper<number>;
export declare function mapEnum<E extends Record<StrKeys<E>, number>>(enumObj: E): ParamMapper<StrKeys<E>>;
export declare const mapBoolean: ParamMapper<boolean>;
export declare function mapParameters<E, M extends Record<StrKeys<E>, ParamMapper<unknown>>, P extends ParamObject<M>>(paramEnum: E, mapper: OnlyKeys<M, StrKeys<E>>, params: OnlyKeys<P, StrKeys<E>>): Map<E[keyof E], number>;
export {};
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
-exports.tsAssert = exports.assertInvalidParameter = void 0;
-const assert_1 = require("assert");
-function assertInvalidParameter(parameter) {
-    throw new RangeError(`Invalid parameter name: ${parameter}`);
exports.mapParameters = exports.mapBoolean = exports.mapEnum = exports.mapNumber = void 0;
exports.mapNumber = {
validateInput: (value) => typeof value === 'number',
mapValue: (value) => value,
};
function mapEnum(enumObj) {
return {
validateInput: (value) => typeof value === 'string' && value in enumObj,
mapValue: (value) => enumObj[value],
};
}
-exports.assertInvalidParameter = assertInvalidParameter;
-function tsAssert(value, msg) {
-    (0, assert_1.strict)(value, msg);
exports.mapEnum = mapEnum;
exports.mapBoolean = {
validateInput: (value) => typeof value === 'boolean',
mapValue: (value) => Number(value),
};
function mapParameter(name, mapper, value) {
if (!mapper.validateInput(value)) {
throw new TypeError(`Invalid type for parameter: ${name}`);
}
return mapper.mapValue(value);
}
-exports.tsAssert = tsAssert;
function mapParameters(paramEnum, mapper, params) {
const result = new Map();
for (const [rawKey, value] of Object.entries(params)) {
if (value !== undefined) {
if (!(rawKey in mapper)) {
throw new RangeError(`Invalid parameter name: ${rawKey}`);
}
const key = rawKey;
result.set(paramEnum[key], mapParameter(key, mapper[key], value));
}
}
return result;
}
exports.mapParameters = mapParameters;
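The `mapParameters` helper above validates each user-supplied parameter with its `ParamMapper` and produces a `Map` from native enum values to numbers. A self-contained sketch of that flow (the `FakeParam` enum object is invented for illustration; real callers pass `binding.CParameter` or `binding.DParameter`):

```javascript
'use strict';

// Per-type mappers mirroring the ParamMapper interface above.
const mapNumber = {
  validateInput: (value) => typeof value === 'number',
  mapValue: (value) => value,
};
const mapBoolean = {
  validateInput: (value) => typeof value === 'boolean',
  mapValue: (value) => Number(value),
};

// Same logic as the mapParameters implementation above: skip undefined
// values, reject unknown names, validate types, then map to numbers.
function mapParameters(paramEnum, mappers, params) {
  const result = new Map();
  for (const [key, value] of Object.entries(params)) {
    if (value === undefined) continue;
    if (!(key in mappers)) {
      throw new RangeError(`Invalid parameter name: ${key}`);
    }
    if (!mappers[key].validateInput(value)) {
      throw new TypeError(`Invalid type for parameter: ${key}`);
    }
    result.set(paramEnum[key], mappers[key].mapValue(value));
  }
  return result;
}

// Hypothetical enum object standing in for binding.DParameter.
const FakeParam = { windowLogMax: 100, checksumFlag: 201 };
const mapped = mapParameters(
  FakeParam,
  { windowLogMax: mapNumber, checksumFlag: mapBoolean },
  { windowLogMax: 24, checksumFlag: true },
);
// mapped: Map { 100 => 24, 201 => 1 }
```

Keying the result by the native enum value means the caller's loop (`dctx.setParameter(param, value)`) can hand each pair straight to the binding without another name lookup.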
{
"name": "zstd-napi",
"version": "0.0.7",
"version": "0.0.8",
"description": "zstd bindings with N-API",

@@ -9,49 +9,63 @@ "repository": {

},
"license": "Apache-2.0",
"author": "Andrew Drake <adrake@adrake.org>",
"main": "dist/index.js",
"types": "dist/index.d.ts",
"files": [
"NOTICE",
"binding.d.ts",
"binding.gyp",
"binding.js",
"deps",
"dist",
"src"
],
"binary": {
"napi_versions": [
8
]
},
"scripts": {
"build": "node-gyp configure && node-gyp build",
"build-prebuild": "prebuild -r napi --all",
"ci-ignore-scripts": "npm ci --ignore-scripts",
"clang-format": "clang-format -i src/*",
"install": "prebuild-install -r napi || node-gyp rebuild",
"lint": "eslint --ext .js,.ts .",
"prettier": "prettier -l --write '**/*.{js,json,ts,yml}'",
"lint": "eslint .",
"prepare": "tsc -p tsconfig.emit.json",
"prettier": "prettier -l --write .",
"submodule-update": "git submodule update --init",
"test": "jest",
"test-coverage": "jest --coverage --coverageReporters=json"
"test-coverage": "jest --coverage --coverageReporters=json",
"typedoc": "typedoc"
},
"author": "Andrew Drake <adrake@adrake.org>",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node14": "^1.0.3",
"@types/jest": "^29.5.2",
"@typescript-eslint/eslint-plugin": "^5.59.11",
"@typescript-eslint/parser": "^5.59.11",
"eslint": "^8.42.0",
"eslint-config-prettier": "^8.8.0",
"eslint-plugin-jest": "^27.2.1",
"jest": "^29.5.0",
"prebuild": "^11.0.4",
"prettier": "^2.8.8",
"ts-jest": "^29.1.0",
"typescript": "^5.1.3"
},
"dependencies": {
"@types/node": "^16.11.39",
"node-addon-api": "^2.0.2",
"@types/node": "*",
"node-addon-api": "^7.0.0",
"prebuild-install": "^7.1.1"
},
"binary": {
"napi_versions": [
6,
8
]
"devDependencies": {
"@eslint/js": "8.57.0",
"@fast-check/jest": "1.8.0",
"@tsconfig/node18": "18.2.4",
"@tsconfig/strictest": "2.0.3",
"@types/eslint__js": "8.42.3",
"@types/eslint-config-prettier": "6.11.3",
"@types/jest": "29.5.12",
"eslint": "8.57.0",
"eslint-config-prettier": "9.1.0",
"eslint-plugin-jest": "27.9.0",
"eslint-plugin-tsdoc": "0.2.17",
"expect-type": "0.18.0",
"globals": "15.0.0",
"jest": "29.7.0",
"node-gyp": "10.0.1",
"prettier": "3.2.5",
"ts-jest": "29.1.2",
"typedoc": "0.25.12",
"typescript": "5.4.4",
"typescript-eslint": "7.5.0"
},
"files": [
"binding.d.ts",
"binding.gyp",
"binding.js",
"deps",
"dist",
"src"
]
"engines": {
"node": "^12.22.0 || ^14.17.0 || ^15.12.0 || >=16"
}
}
