rdf-canonize
An implementation of the RDF Dataset Canonicalization specification in JavaScript.
See the RDF Dataset Canonicalization specification for details on the specification and algorithm this library implements.
npm install rdf-canonize
const canonize = require('rdf-canonize');
This package has support for rdf-canonize-native. The native bindings can be useful if your application needs to run many canonize operations asynchronously, in parallel, or in the background. It is highly recommended that you understand your requirements and benchmark the JavaScript implementation against the native bindings: the native bindings add overhead, and the JavaScript implementation may be faster with modern runtimes.
The native bindings are not installed by default and must be explicitly installed.
npm install rdf-canonize
npm install rdf-canonize-native
Note that the native code is not automatically used. To use the native bindings you must have them installed and set the useNative option to true.
const canonize = require('rdf-canonize');
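A minimal sketch of how this might look, assuming the native bindings support the chosen algorithm (the dataset contents are placeholders):
// a sketch only: opt in to the native bindings via the useNative option
// (assumes the native bindings support the selected algorithm)
const dataset = [
  // ... quads to canonize ...
];
const canonical = await canonize.canonize(dataset, {
  algorithm: 'RDFC-1.0',
  useNative: true
});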
Install in your project with npm and use your favorite browser bundler tool.
// canonize a dataset with the default algorithm
const dataset = [
// ...
];
const canonical = await canonize.canonize(dataset, {algorithm: 'RDFC-1.0'});
// parse and canonize N-Quads with the default algorithm
const nquads = "...";
const canonical = await canonize.canonize(nquads, {
algorithm: 'RDFC-1.0',
inputFormat: 'application/n-quads'
});
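As a further illustration (the quad data below is made up for this sketch), canonization relabels blank nodes with canonical c14n-prefixed identifiers:
// a sketch only: canonize a small, made-up N-Quads document with blank nodes
const nquads =
  '_:b0 <http://example.org/p> _:b1 .\n' +
  '_:b1 <http://example.org/p> _:b0 .\n';
const canonical = await canonize.canonize(nquads, {
  algorithm: 'RDFC-1.0',
  inputFormat: 'application/n-quads'
});
// canonical now contains N-Quads with relabeled blank nodes (e.g. _:c14n0)
console.log(canonical);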
Using this library with React Native requires a polyfill such as data-integrity-rn to be imported before this library:
import '@digitalcredentials/data-integrity-rn'
import * as canonize from 'rdf-canonize'
The polyfill needs to provide the following globals:
crypto.subtle
TextEncoder
rejectURDNA2015: if truthy, it will cause an error to be thrown if "URDNA2015" is used.
RDF_CANONIZE_TRACE_URDNA2015: if truthy, it will cause console.trace() to be called when "URDNA2015" is used. This is designed for development use only to find where "URDNA2015" is being used. It could be very verbose.
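A sketch of how the first flag might be used, assuming rejectURDNA2015 is supplied as a canonize() option (that placement is an assumption, not confirmed here):
// a sketch only: rejectURDNA2015 is assumed to be a canonize() option
try {
  await canonize.canonize(dataset, {
    algorithm: 'URDNA2015',
    rejectURDNA2015: true
  });
} catch(e) {
  // an error is expected here because "URDNA2015" was requested
  console.error(e);
}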
Inputs may vary in complexity and some inputs may use more computational resources than desired. There also exists a class of inputs sometimes referred to as "poison" graphs: inputs structured or designed specifically to be difficult to process that often serve no useful purpose.
The canonize API accepts an AbortSignal as the signal parameter that can be used to control processing of computationally difficult inputs. signal is not set by default. It can be used in a number of ways:
Aborting processing manually with AbortController.abort().
Aborting processing after a timeout with AbortSignal.timeout().
Aborting processing with a custom AbortSignal. This could track memory pressure or system load.
Combining multiple signals, such as with AbortSignal.any().
For performance reasons this signal is only checked periodically during processing and is not immediate.
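For example, a minimal sketch using a timeout signal (the one-second limit is arbitrary):
// a sketch only: abort canonization if it runs longer than one second
try {
  const canonical = await canonize.canonize(dataset, {
    algorithm: 'RDFC-1.0',
    signal: AbortSignal.timeout(1000)
  });
} catch(e) {
  // processing was aborted or failed for another reason
  console.error(e);
}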
The canonize API has parameters to limit how many times the blank node deep comparison algorithm can be run to assign blank node labels before throwing an error. It is designed to control exponential growth related to the number of blank nodes. Graphs without blank nodes, and those with simple blank nodes, will not run the algorithms that use this parameter. Graphs with more complex, deeply connected blank nodes can result in significant time complexity, which these parameters can control.
The canonize API has the following parameters to control limits:
maxWorkFactor: Used to calculate a maximum number of deep iterations based on the number of non-unique blank nodes.
  0: Deep inspection disallowed.
  1: Limit deep iterations to O(n). (default)
  2: Limit deep iterations to O(n^2).
  3: Limit deep iterations to O(n^3). Values at this level or higher will allow processing of complex "poison" graphs but may take significant amounts of computational resources.
  Infinity: No limitation.
maxDeepIterations: The exact number of deep iterations. This parameter is for specialized use cases and use of maxWorkFactor is recommended. Defaults to Infinity and any other value will override maxWorkFactor.
In practice, callers must balance system load, concurrent processing, expected input size and complexity, and other factors to determine which complexity controls to use. This library defaults to a maxWorkFactor of 1 and no timeout signal. These can be adjusted as needed.
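For example, a minimal sketch that raises the limit for a known-complex input (the value 2 is illustrative):
// a sketch only: allow deep iterations up to O(n^2) for a complex input
const canonical = await canonize.canonize(dataset, {
  algorithm: 'RDFC-1.0',
  maxWorkFactor: 2
});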
This library includes a sample testing utility which may be used to verify that changes to the processor maintain the correct output.
The test suite is included in an external repository:
https://github.com/w3c/rdf-canon
This should be a sibling directory of the rdf-canonize directory or in a test-suites directory. To clone shallow copies into the test-suites directory you can use the following:
npm run fetch-test-suite
Node.js tests:
npm test
Browser tests via Karma:
npm run test-karma
If you installed the test suites elsewhere, or wish to run other tests, use the TEST_DIR environment variable:
TEST_DIR="/tmp/tests" npm test
To generate EARL reports:
# generate a JSON-LD EARL report with Node.js
EARL=earl-node.jsonld npm test
# generate a Turtle EARL report with Node.js
EARL=js-rdf-canonize-earl.ttl npm test
# generate official Turtle EARL report with Node.js
# turns ASYNC on and SYNC and WEBCRYPTO off
EARL_OFFICIAL=true EARL=js-rdf-canonize-earl.ttl npm test
See docs in the benchmark README.
The source code for this library is available at:
https://github.com/digitalbazaar/rdf-canonize
Commercial support for this library is available upon request from Digital Bazaar: support@digitalbazaar.com