rdfjs-di

rdfjs-di - npm Package Compare versions

Comparing version 0.0.95 to 0.0.96

dist/lib/debug.d.ts


dist/index.d.ts

@@ -17,9 +17,12 @@ /**

*
* A single previous proof reference may also be set, although that really only makes sense in the case of a single key
*
* @param dataset
* @param keyData
* @param previous - A previous proof ID, when applicable
* @throws - Error if there was an issue while signing.
* @returns
*/
export declare function generateProofGraph(dataset: rdf.DatasetCore, keyData: Iterable<KeyData>): Promise<rdf.DatasetCore[]>;
export declare function generateProofGraph(dataset: rdf.DatasetCore, keyData: KeyData): Promise<rdf.DatasetCore>;
export declare function generateProofGraph(dataset: rdf.DatasetCore, keyData: Iterable<KeyData>, previous?: rdf.Quad_Subject): Promise<rdf.DatasetCore[]>;
export declare function generateProofGraph(dataset: rdf.DatasetCore, keyData: KeyData, previous?: rdf.Quad_Subject): Promise<rdf.DatasetCore>;
/**

@@ -51,6 +54,14 @@ * Verify the separate proof graph.

* If the anchor is defined, then it will be the subject of the quads that add the `proof` property (one for each proof graph).
* In the case of a VC, the ID of the credential itself is naturally the anchor, but there is no such "natural" node for a general
* RDF dataset.
*
* If the `keyPair` argument is an Array, then the proof graphs are considered to be a Proof Chain. Otherwise,
* If the `keyPair` argument is an Array, then the proof graphs are considered to define a Proof Chain. Otherwise,
* (e.g., if it is a Set), it is a Proof Set.
* Proof chains are somewhat restricted compared to the specification: proof chains and sets are not mixed. In other words, either
* all proofs form a chain or they all form a set; the case when a previous proof reference points at a set of proofs is not possible.
*
* The anchor should exist to create a proper chain per spec, because the spec requires it to sign over the previous proof reference. The chain
* will be created in the absence of an anchor, but the result will not conform to the specification (which _requires_ the addition of a proof
* reference triple).
*
* @param dataset
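The chain-versus-set rule described above comes down to the container type of the key data. A minimal sketch of that decision (the helper name is illustrative, not part of the package):

```javascript
// Sketch: an Array of key data selects a Proof Chain, any other
// iterable (e.g., a Set) selects a Proof Set.
function proofMode(keyData) {
  return Array.isArray(keyData) ? "chain" : "set";
}
```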

@@ -66,3 +77,4 @@ * @param keyData

* If the anchor is present, the proof graphs are identified by the object terms of the corresponding [`proof`](https://www.w3.org/TR/vc-data-integrity/#proofs) quads.
* Otherwise, the type relationship to [`DataIntegrityProof`](https://www.w3.org/TR/vc-data-integrity/#dataintegrityproof) are considered. Note that if no anchor is provided, this second choice
* Otherwise, the type relationship to [`DataIntegrityProof`](https://www.w3.org/TR/vc-data-integrity/#dataintegrityproof) is considered.
* Note that if no anchor is provided, this second choice
* may lead to erroneous results because some of the embedded proof graphs are not meant to be a proof for the full dataset. (This may

@@ -69,0 +81,0 @@ * be the case in a ["Verifiable Presentation" style datasets](https://www.w3.org/TR/vc-data-model-2.0/#presentations-0).)


dist/index.js

@@ -12,2 +12,3 @@ "use strict";

const types = require("./lib/types");
const debug = require("./lib/debug");
const utils_1 = require("./lib/utils");

@@ -20,4 +21,4 @@ const proof_utils_1 = require("./lib/proof_utils");

// n3.DataFactory is a namespace with some functions...
const { quad } = n3.DataFactory;
async function generateProofGraph(dataset, keyData) {
const { quad, namedNode } = n3.DataFactory;
async function generateProofGraph(dataset, keyData, previous) {
// Start fresh with results

@@ -30,3 +31,3 @@ const report = { errors: [], warnings: [] };

// execute the proof graph generation concurrently
const promises = Array.from(keyPairs).map((keypair) => (0, proof_utils_1.generateAProofGraph)(report, toBeSigned, keypair));
const promises = Array.from(keyPairs).map((keypair) => (0, proof_utils_1.generateAProofGraph)(report, toBeSigned, keypair, previous));
const retval = await Promise.all(promises);

@@ -70,4 +71,4 @@ // return by taking care of overloading.

return {
dataset: (0, utils_1.convertToStore)(pr),
id: undefined,
proofQuads: (0, utils_1.convertToStore)(pr),
proofGraph: undefined,
};

@@ -89,6 +90,14 @@ });

* If the anchor is defined, then it will be the subject of the quads that add the `proof` property (one for each proof graph).
* In the case of a VC, the ID of the credential itself is naturally the anchor, but there is no such "natural" node for a general
* RDF dataset.
*
* If the `keyPair` argument is an Array, then the proof graphs are considered to be a Proof Chain. Otherwise,
* If the `keyPair` argument is an Array, then the proof graphs are considered to define a Proof Chain. Otherwise,
* (e.g., if it is a Set), it is a Proof Set.
* Proof chains are somewhat restricted compared to the specification: proof chains and sets are not mixed. In other words, either
* all proofs form a chain or they all form a set; the case when a previous proof reference points at a set of proofs is not possible.
*
* The anchor should exist to create a proper chain per spec, because the spec requires it to sign over the previous proof reference. The chain
* will be created in the absence of an anchor, but the result will not conform to the specification (which _requires_ the addition of a proof
* reference triple).
*
* @param dataset

@@ -100,35 +109,72 @@ * @param keyData

async function embedProofGraph(dataset, keyData, anchor) {
const retval = (0, utils_1.convertToStore)(dataset);
const output = (0, utils_1.convertToStore)(dataset);
const keyPairs = (0, utils_1.isKeyData)(keyData) ? [keyData] : Array.from(keyData);
const proofGraphs = await generateProofGraph(dataset, keyPairs);
// Essential: in this API, an array is automatically a key chain, otherwise a key set.
// The peculiarity of the key chain embedding is that it requires the anchor to follow the official algorithm...
const isKeyChain = keyPairs.length > 1 && Array.isArray(keyData);
const chain = [];
for (let i = 0; i < proofGraphs.length; i++) {
const proofTriples = proofGraphs[i];
const proofGraphID = retval.createBlankNode();
let allProofs = [];
// Convert a proof graph, generated by the appropriate method, into a proof chain entry;
// it extracts the data necessary to combine several proofs into a chain or a set.
const storeProofData = (proofTriples) => {
// Look for the type statement among the graph entries
for (const q of proofTriples) {
retval.add(quad(q.subject, q.predicate, q.object, proofGraphID));
if (isKeyChain && q.predicate.value === proof_utils_1.rdf_type.value && q.object.value === proof_utils_1.sec_di_proof.value) {
// Storing the values to create the proof chains in a subsequent step
// The subject is the ID of the proof
chain.push({
if (q.predicate.value === proof_utils_1.rdf_type.value && q.object.value === proof_utils_1.sec_di_proof.value) {
return {
proofId: q.subject,
graph: proofGraphID,
});
proofGraph: output.createBlankNode(),
// In fact, refactoring may not be necessary, because the proof graph generated by
// this package does not contain bnodes. But the user may decide to do it by hand and
// include extra stuff...
proofQuads: (0, utils_1.refactorBnodes)(output, proofTriples),
// This may be enough in most cases:
// proofQuads : proofTriples,
};
}
}
;
if (anchor) {
const q = quad(anchor, proof_utils_1.sec_proof, proofGraphID);
retval.add(q);
// This, in fact, does not happen. The proofTriples are generated by a function that does add a type triple...
// Returning null is just a way of making the TS compiler happy
return null;
};
// Unfortunately, the key chain and key set cases are fairly different
if (isKeyChain) {
for (let i = 0; i < keyPairs.length; i++) {
// Generate the intermediate quads that are temporarily added to
// the core dataset before signing. This is, in effect,
// the verbatim copy of the previous proof, which is therefore
// "signed over" by the current proof.
const extraQuads = (0, utils_1.extraChainQuads)(allProofs, i, anchor);
debug.log(extraQuads);
// The intermediate quads added to the dataset to secure the chain
// (This is an n3 specific API method!)
output.addQuads(extraQuads);
// We generate the relevant proof graph using the dedicated utility...
const proofTriples = await generateProofGraph(output, keyPairs[i], i !== 0 ? allProofs[i - 1].proofId : undefined /* Reference to the previous proof, if applicable */);
// Remove the intermediate quads
// (This is an n3 specific API method!)
output.removeQuads(extraQuads);
// Generate a complete proof structure for the new proof...
const newProof = storeProofData(proofTriples);
//... and store it on the list of proofs.
if (newProof !== null) {
allProofs.push(newProof);
}
}
}
// Adding the chain statements, if required
if (isKeyChain) {
for (let i = 1; i < chain.length; i++) {
const q = quad(chain[i].proofId, proof_utils_1.sec_previousProof, chain[i - 1].proofId, chain[i].graph);
retval.add(q);
else {
// This is the key set case
// All graphs can be generated in one step, making the processing way simpler...
const proofGraphs = await generateProofGraph(dataset, keyPairs);
allProofs = proofGraphs.map(storeProofData);
}
// Merge all generated proof datasets into the result
for (const proof of allProofs) {
if (anchor) {
output.add(quad(anchor, proof_utils_1.sec_proof, proof.proofGraph));
}
for (const q of proof.proofQuads) {
// No need for bnode reconciliation, because proof graphs never contain bnodes
output.add(quad(q.subject, q.predicate, q.object, proof.proofGraph));
}
}
return retval;
return output;
}
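The chain-embedding loop above follows a simple pattern: each proof after the first signs over its predecessor by receiving the predecessor's proof ID. A minimal sketch with the signing step mocked out (`signOver` and all other names here are illustrative, not the package's API):

```javascript
// Sketch of the chain-building loop: proof i (for i > 0) receives a
// reference to the proof ID of proof i-1, which it "signs over".
function buildChain(keys, signOver) {
  const allProofs = [];
  for (let i = 0; i < keys.length; i++) {
    const previousId = i !== 0 ? allProofs[i - 1].proofId : undefined;
    allProofs.push(signOver(keys[i], previousId));
  }
  return allProofs;
}
```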

@@ -140,3 +186,4 @@ exports.embedProofGraph = embedProofGraph;

* If the anchor is present, the proof graphs are identified by the object terms of the corresponding [`proof`](https://www.w3.org/TR/vc-data-integrity/#proofs) quads.
* Otherwise, the type relationship to [`DataIntegrityProof`](https://www.w3.org/TR/vc-data-integrity/#dataintegrityproof) are considered. Note that if no anchor is provided, this second choice
* Otherwise, the type relationship to [`DataIntegrityProof`](https://www.w3.org/TR/vc-data-integrity/#dataintegrityproof) is considered.
* Note that if no anchor is provided, this second choice
* may lead to erroneous results because some of the embedded proof graphs are not meant to be a proof for the full dataset. (This may

@@ -163,4 +210,6 @@ * be the case in a ["Verifiable Presentation" style datasets](https://www.w3.org/TR/vc-data-model-2.0/#presentations-0).)

async function verifyEmbeddedProofGraph(dataset, anchor) {
const report = { errors: [], warnings: [] };
const dataStore = new n3.Store();
const proofGraphs = new utils_1.DatasetMap();
let isProofChain = false;
// First, identify the possible dataset graph IDs

@@ -172,3 +221,3 @@ for (const q of dataset) {

if (q.object.termType !== "Literal") {
proofGraphs.item(q.object);
proofGraphs.set(q.object);
}

@@ -180,10 +229,15 @@ }

// This branch is the reason we have to use a DatasetMap for the
// storage of graph IDs; we should not have duplicate entries.
// storage of graph IDs: we should not have duplicate entries.
if (q.predicate.equals(proof_utils_1.rdf_type) && q.object.equals(proof_utils_1.sec_di_proof)) {
proofGraphs.item(q.graph);
if (q.graph.termType === "DefaultGraph") {
report.errors.push(new types.Proof_Verification_Error("Proof type cannot be the default graph"));
}
else {
proofGraphs.set(q.graph);
}
}
}
}
// By now, we got the identification of all the proof graphs, we can separate the quads among
// the data graph and the relevant proof graphs
// By now we have identified all the proof graphs, so we can separate the quads into
// the "real" data graph and the relevant proof graphs
for (const q of dataset) {

@@ -195,7 +249,2 @@ if (q.predicate.equals(proof_utils_1.sec_proof) && proofGraphs.has(q.graph)) {

}
else if (q.predicate.equals(proof_utils_1.sec_previousProof)) {
// Per the cryptosuite specifications, the "previous proof" statement is not part of the "proof options", i.e.,
// should not be used for the generation of the final proof. It was not used to generate the proof graph when signing.
continue;
}
else if (q.graph.termType === "DefaultGraph") {

@@ -205,6 +254,18 @@ dataStore.add(q);

else if (proofGraphs.has(q.graph)) {
// this quad belongs to a proof graph!
// Note that the separated proof graphs contain only triples, they become
// stand-alone RDF graphs now
proofGraphs.item(q.graph).add(quad(q.subject, q.predicate, q.object));
// this quad belongs to one of the proof graphs!
// Note that the separated proof graphs should contain only triples, as they become
// stand-alone RDF graphs now, not part of a dataset
const proofStore = proofGraphs?.get(q.graph);
// let us store the data itself first:
proofStore.proofQuads.add(quad(q.subject, q.predicate, q.object));
// see if this triple gives us the proof object ID;
if (q.predicate.equals(proof_utils_1.rdf_type) && q.object.equals(proof_utils_1.sec_di_proof)) {
proofStore.proofId = q.subject;
// see if this is a previous proof statement; if so, store the reference for a subsequent ordering
}
else if (q.predicate.equals(proof_utils_1.sec_previousProof) && q.object.termType !== "Literal") {
proofStore.previousProof = q.object;
// marking the whole thing a chain!
isProofChain = true;
}
}

@@ -216,13 +277,47 @@ else {

}
const report = { errors: [], warnings: [] };
const hash = await (0, utils_1.calculateDatasetHash)(dataStore);
const proofs = proofGraphs.data();
const verified = await (0, proof_utils_1.verifyProofGraphs)(report, hash, proofs);
return {
verified,
verifiedDocument: verified ? dataStore : null,
errors: report.errors,
warnings: report.warnings
};
if (isProofChain) {
// Get the proofs into a reference order, just like when it is submitted
const allProofs = proofGraphs.orderedData();
let verified;
if (allProofs.length === 0) {
report.errors.push(new types.Proof_Verification_Error("Proof Chain has no start."));
verified = false;
}
else {
const verified_list = [];
if (anchor === undefined) {
report.warnings.push(new types.Unclassified_Error("No anchor has been provided for a proof chain."));
}
for (let i = 0; i < allProofs.length; i++) {
const extraQuads = (0, utils_1.extraChainQuads)(allProofs, i, anchor);
// These are the intermediate quads added to the dataset to secure the chain
// (This is an n3 specific API method!)
dataStore.addQuads(extraQuads);
const hash = await (0, utils_1.calculateDatasetHash)(dataStore);
const verifiedChainLink = await (0, proof_utils_1.verifyProofGraphs)(report, hash, [allProofs[i]]);
verified_list.push(verifiedChainLink);
dataStore.removeQuads(extraQuads);
}
verified = !verified_list.includes(false);
}
return {
verified,
verifiedDocument: verified ? dataStore : null,
errors: report.errors,
warnings: report.warnings
};
}
else {
// This is the simple case...
const hash = await (0, utils_1.calculateDatasetHash)(dataStore);
const proofs = proofGraphs.data();
const verified = await (0, proof_utils_1.verifyProofGraphs)(report, hash, proofs);
return {
verified,
verifiedDocument: verified ? dataStore : null,
errors: report.errors,
warnings: report.warnings
};
}
}
exports.verifyEmbeddedProofGraph = verifyEmbeddedProofGraph;
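In the chain case above, the dataset verifies only if every individual link verifies; one failed link fails the whole chain. A sketch of that aggregation (mocked link verifier, illustrative names):

```javascript
// Sketch: aggregate per-link results the same way as
// `verified = !verified_list.includes(false)` above.
function verifyChain(links, verifyLink) {
  const verifiedList = links.map(verifyLink);
  return !verifiedList.includes(false);
}
```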

@@ -193,3 +193,3 @@ "use strict";

if (signature.length === 0 || signature[0] !== 'u') {
report.errors.push(new types.Malformed_Proof_Error(`Signature is of an incorrect format (${signature})`));
report.errors.push(new types.Proof_Verification_Error(`Signature is of an incorrect format (${signature})`));
return false;

@@ -207,2 +207,5 @@ }

const retval = await crypto.subtle.verify(algorithm, key, rawSignature, rawMessage);
if (retval === false) {
report.errors.push(new types.Proof_Verification_Error(`Signature ${signature} is invalid`));
}
return retval;
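The format check above relies on the proof value being a multibase string in base64url-no-pad encoding, whose multibase prefix is the character `u`. A sketch of that check (the helper name is illustrative):

```javascript
// Sketch: a base64url-no-pad multibase value always starts with 'u'.
function looksLikeMultibaseU(signature) {
  return signature.length !== 0 && signature[0] === 'u';
}
```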

@@ -209,0 +212,0 @@ }

@@ -13,3 +13,3 @@ /**

import { Errors, KeyData } from './types';
import { GraphWithID } from './utils';
import { ProofStore } from './utils';
/***************************************************************************************

@@ -44,5 +44,6 @@ * Namespaces and specific terms that are used several times

* @param keyData
* @param previousProof - reference to a previous proof, if applicable
* @returns
*/
export declare function generateAProofGraph(report: Errors, hashValue: string, keyData: KeyData): Promise<rdf.DatasetCore>;
export declare function generateAProofGraph(report: Errors, hashValue: string, keyData: KeyData, previousProof?: rdf.Quad_Subject): Promise<rdf.DatasetCore>;
/**

@@ -68,2 +69,2 @@ * Check a series of proof graphs, ie, check whether the included signature of a proof graph corresponds to the hash value.

*/
export declare function verifyProofGraphs(report: Errors, hash: string, proofs: GraphWithID[]): Promise<boolean>;
export declare function verifyProofGraphs(report: Errors, hash: string, proofs: ProofStore[]): Promise<boolean>;

@@ -21,2 +21,3 @@ "use strict";

const multikey_1 = require("./multikey");
const debug = require("./debug");
// n3.DataFactory is a namespace with some functions...

@@ -76,3 +77,5 @@ const { namedNode, literal, quad } = n3.DataFactory;

// The return value must be the hash of the proof option graph
return await (0, utils_1.calculateDatasetHash)(proofGraph);
// debug.log(`The proof graph to hash:`, proofOptions);
// debug.log('\n');
return await (0, utils_1.calculateDatasetHash)(proofOptions);
}

@@ -86,5 +89,6 @@ /**

* @param keyData
* @param previousProof - reference to a previous proof, if applicable
* @returns
*/
async function generateAProofGraph(report, hashValue, keyData) {
async function generateAProofGraph(report, hashValue, keyData, previousProof) {
const cryptosuite = keyData?.cryptosuite || (0, crypto_utils_1.cryptosuiteId)(report, keyData);

@@ -118,4 +122,3 @@ // Generate the key data to be stored in the proof graph; either multikey or jwk, depending on the cryptosuite

const proofGraphResource = namedNode(`urn:uuid:${(0, uuid_1.v4)()}`);
const verificationMethodId = `urn:uuid:${(0, uuid_1.v4)()}`;
const keyResource = namedNode(verificationMethodId);
const keyResource = namedNode(`urn:uuid:${(0, uuid_1.v4)()}`);
// Create the resource for the proof graph itself, referring to a separate key resource

@@ -129,2 +132,4 @@ proofGraph.addQuads([

]);
if (previousProof !== undefined)
proofGraph.add(quad(proofGraphResource, exports.sec_previousProof, previousProof));
// Create the separate key resource triples (within the same graph)

@@ -145,2 +150,3 @@ if (keyData.controller)

// concatenation of the original dataset's hash and the hash of the proof option graph.
/* @@@@@ */ debug.log(`Signing ${proofOptionHashValue} + ${hashValue}`);
const signature = await (0, crypto_utils_1.sign)(report, proofOptionHashValue + hashValue, keyData.private);

@@ -188,7 +194,7 @@ // Close up...

if (proof_values.length === 0) {
localErrors.push(new types.Malformed_Proof_Error("No proof value"));
localErrors.push(new types.Proof_Verification_Error("No proof value"));
return null;
}
else if (proof_values.length > 1) {
localErrors.push(new types.Malformed_Proof_Error("Several proof values"));
localErrors.push(new types.Proof_Verification_Error("Several proof values"));
}

@@ -201,7 +207,7 @@ return proof_values[0].object.value;

if (verificationMethod.length === 0) {
localErrors.push(new types.Malformed_Proof_Error("No verification method"));
localErrors.push(new types.Proof_Verification_Error("No verification method"));
return null;
}
else if (verificationMethod.length > 1) {
localErrors.push(new types.Malformed_Proof_Error("Several verification methods"));
localErrors.push(new types.Proof_Verification_Error("Several verification methods"));
}

@@ -237,3 +243,3 @@ const publicKey = verificationMethod[0].object;

if (keys_jwk.length > 0 && keys_multikey.length > 0) {
localWarnings.push(new types.Malformed_Proof_Error(`JWK or Multikey formats can be used, but not both.`));
localWarnings.push(new types.Proof_Verification_Error(`JWK or Multikey formats can be used, but not both.`));
return null;

@@ -253,3 +259,3 @@ }

catch (e) {
localWarnings.push(new types.Malformed_Proof_Error(`Parsing error for Multikey: ${e.message}`));
localWarnings.push(new types.Proof_Verification_Error(`Parsing error for Multikey: ${e.message}`));
return null;

@@ -270,3 +276,3 @@ }

// This happens if there is a JSON parse error with the key...
localWarnings.push(new types.Malformed_Proof_Error(`Parsing error for JWK: ${e.message}`));
localWarnings.push(new types.Proof_Verification_Error(`Parsing error for JWK: ${e.message}`));
return null;

@@ -294,3 +300,3 @@ }

if (wrongPurposes.length > 0) {
localErrors.push(new types.Mismatched_Proof_Purpose(`Invalid proof purpose value(s): ${wrongPurposes.join(", ")}`));
localErrors.push(new types.Proof_Transformation_Error(`Invalid proof purpose value(s): ${wrongPurposes.join(", ")}`));
}

@@ -318,5 +324,8 @@ }

const proofOptionGraphHash = await calculateProofOptionsHash(proof);
/* @@@@@ */ debug.log(`Verifying ${proofOptionGraphHash} + ${hash}`);
const check_results = await (0, crypto_utils_1.verify)(report, proofOptionGraphHash + hash, proofValue, publicKey);
// the return value should nevertheless be false if there have been errors
return check_results ? localErrors.length === 0 : true;
const output = check_results ? localErrors.length === 0 : false;
/* @@@@@ */ debug.log(`verification result: ${output}`);
return output;
}

@@ -353,3 +362,3 @@ else {

allErrors.push(singleReport);
return verifyAProofGraph(singleReport, hash, pr.dataset, pr.id);
return verifyAProofGraph(singleReport, hash, pr.proofQuads, pr.proofGraph);
};

@@ -356,0 +365,0 @@ const promises = proofs.map(singleVerification);

@@ -53,6 +53,6 @@ /**

}
export declare class Malformed_Proof_Error extends ProblemDetail {
export declare class Proof_Verification_Error extends ProblemDetail {
constructor(detail: string);
}
export declare class Mismatched_Proof_Purpose extends ProblemDetail {
export declare class Proof_Transformation_Error extends ProblemDetail {
constructor(detail: string);

@@ -59,0 +59,0 @@ }

@@ -8,3 +8,3 @@ "use strict";

Object.defineProperty(exports, "__esModule", { value: true });
exports.Unclassified_Error = exports.Invalid_Verification_Method = exports.Mismatched_Proof_Purpose = exports.Malformed_Proof_Error = exports.Proof_Generation_Error = exports.ProblemDetail = exports.Cryptosuites = void 0;
exports.Unclassified_Error = exports.Invalid_Verification_Method = exports.Proof_Transformation_Error = exports.Proof_Verification_Error = exports.Proof_Generation_Error = exports.ProblemDetail = exports.Cryptosuites = void 0;
var Cryptosuites;

@@ -47,14 +47,14 @@ (function (Cryptosuites) {

exports.Proof_Generation_Error = Proof_Generation_Error;
class Malformed_Proof_Error extends ProblemDetail {
class Proof_Verification_Error extends ProblemDetail {
constructor(detail) {
super(detail, 'Malformed proof error', -17);
super(detail, 'Proof verification error', -17);
}
}
exports.Malformed_Proof_Error = Malformed_Proof_Error;
class Mismatched_Proof_Purpose extends ProblemDetail {
exports.Proof_Verification_Error = Proof_Verification_Error;
class Proof_Transformation_Error extends ProblemDetail {
constructor(detail) {
super(detail, 'Mismatched proof purpose', -18);
super(detail, 'Proof transformation error', -18);
}
}
exports.Mismatched_Proof_Purpose = Mismatched_Proof_Purpose;
exports.Proof_Transformation_Error = Proof_Transformation_Error;
class Invalid_Verification_Method extends ProblemDetail {

@@ -61,0 +61,0 @@ constructor(detail) {

@@ -33,9 +33,27 @@ /**

/**
* Structure with a separate store and its ID as a graph
* The general structure for a Proof
*/
export interface GraphWithID {
id: rdf.Quad_Graph | undefined;
dataset: n3.Store;
export interface Proof {
/**
* A collection of statements for a proof is to be in its own graph, generally with a blank node.
*
* Note that the type restriction for this term is `Quad_Subject`, which stands for a named node or a blank node; this is
* more restrictive than a `Quad_Graph`, which may also take the value of the default graph. But proofs are always
* real graphs.
*/
proofGraph: rdf.Quad_Graph;
/** The proof ID, which, in this implementation, is never a blank node, usually a UUID */
proofId?: rdf.Quad_Subject;
/** The proof statements themselves, a set of triples (not quads) */
proofQuads: rdf.DatasetCore;
}
/**
* The general structure for a Proof using an n3.Store specifically; it also has a `previousProof` key.
* This extended interface is used when proof chains or sets are extracted from an embedded proof.
*/
export interface ProofStore extends Proof {
proofQuads: n3.Store;
previousProof?: rdf.Quad_Subject;
}
/**
* A shell around a Map, which is indexed by the *value* of rdf Terms.

@@ -53,2 +71,5 @@ *

*
* See the remark above for the graph value's type constraint: it is o.k. to use `Quad_Subject`, because it
* should never be a default graph.
*
* @param graph

@@ -58,5 +79,42 @@ * @returns

item(graph: rdf.Quad_Graph): n3.Store;
/**
* Get a proof, or `undefined` if it has not been stored yet
*
* @param graph
* @returns - the proof store data
*/
get(graph: rdf.Quad_Graph): ProofStore | undefined;
/**
* Set a proof
*
* @param graph
* @returns - the current dataset map
*/
set(graph: rdf.Quad_Graph): DatasetMap;
/**
* Has a proof been stored with this graph reference?
*
* @param graph
* @returns
*/
has(graph: rdf.Term): boolean;
/**
* Get the dataset references (in no particular order)
*
* @returns - the datasets
*/
datasets(): n3.Store[];
data(): GraphWithID[];
/**
* Get the proof entries (in no particular order)
* @returns - the proof entries
*/
data(): ProofStore[];
/**
* Get the proof entries, following the order imposed by the `previousProof` statements. The first element is the one that has no previous proof defined. If there are no nodes with a previous proof, an empty array is returned.
*
* This is equivalent to the way proof chains are passed on as arguments when embedded chains are created.
*
* @returns - the proofs entries
*/
orderedData(): ProofStore[];
}
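The `DatasetMap` declared above is, in essence, a Map keyed by the *value* of RDF terms, so two distinct term objects naming the same graph share one entry. A minimal sketch of that idea (an illustrative class, not the package's implementation):

```javascript
// Sketch: index entries by term.value rather than by object identity.
class TermKeyedMap {
  constructor() {
    this.index = new Map();
  }
  set(term, entry) {
    this.index.set(term.value, entry);
    return this;
  }
  get(term) {
    return this.index.get(term.value);
  }
  has(term) {
    return this.index.has(term.value);
  }
}
```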

@@ -88,2 +146,13 @@ /*****************************************************************************************

/**
* Create and store the values in a dataset in a new n3 Store. This may be
* necessary because the methods are not supposed to modify the original
* dataset.
*
* The n3.Store object includes functions to retrieve quads, which is a great plus
*
* @param dataset
* @returns
*/
export declare function copyToStore(dataset: rdf.DatasetCore): n3.Store;
/**
* Convert the dataset into an n3.Store, unless it is already a store.

@@ -96,1 +165,21 @@ * This is done to manage the various quads more efficiently.

export declare function convertToStore(dataset: rdf.DatasetCore): n3.Store;
/**
* "Refactor" BNodes in a dataset: bnodes are replaced by new ones to avoid a clash with the base dataset.
* Extremely inefficient, but is used for very small graphs only (proof graphs), so efficiency is not really an issue.
*
* The trick is to use the bnode generator of the base dataset, which should make them unique...
*
* @param base
* @param toTransform
*/
export declare function refactorBnodes(base: n3.Store, toTransform: rdf.DatasetCore): rdf.DatasetCore;
/**
* When handling proof chains, the dataset must be temporarily extended with a number of quads that
* constitute the "previous" proof. This function calculates those extra quads.
*
* @param allProofs - the array of Proofs in chain order
* @param index - current index into allProofs
* @param anchor - the possible anchor that includes the `proof` reference triple
* @returns
*/
export declare function extraChainQuads(allProofs: Proof[], index: number, anchor?: rdf.Quad_Subject): rdf.Quad[];

@@ -14,6 +14,6 @@ "use strict";

Object.defineProperty(exports, "__esModule", { value: true });
exports.convertToStore = exports.calculateDatasetHash = exports.isKeyData = exports.isDatasetCore = exports.DatasetMap = exports.createPrefix = void 0;
exports.extraChainQuads = exports.refactorBnodes = exports.convertToStore = exports.copyToStore = exports.calculateDatasetHash = exports.isKeyData = exports.isDatasetCore = exports.DatasetMap = exports.createPrefix = void 0;
const rdfjs_c14n_1 = require("rdfjs-c14n");
const n3 = require("n3");
const { namedNode } = n3.DataFactory;
const { namedNode, quad } = n3.DataFactory;
/***************************************************************************************

@@ -71,2 +71,5 @@ * Namespace handling

*
* See the remark above for the graph value's type constraint: it is o.k. to use `Quad_Subject`, because it
* should never be a default graph.
*
* @param graph

@@ -76,26 +79,98 @@ * @returns

item(graph) {
const proofStore = this.get(graph);
return proofStore?.proofQuads;
}
/**
* Get a proof, or `undefined` if it has not been stored yet
*
* @param graph
* @returns - the proof store data
*/
get(graph) {
if (this.index.has(graph.value)) {
// The '?' operator is to make deno happy. By virtue of the
// test we know that the value cannot be undefined, but
// the deno checker does not realize this...
return this.index.get(graph.value)?.dataset;
return this.index.get(graph.value);
}
else {
return undefined;
}
}
/**
* Set a proof
*
* @param graph
* @returns - the current dataset map
*/
set(graph) {
if (!this.index.has(graph.value)) {
const dataset = new n3.Store();
this.index.set(graph.value, {
id: graph,
dataset
});
return dataset;
const proofStore = {
proofGraph: graph,
proofQuads: dataset,
};
this.index.set(graph.value, proofStore);
}
return this;
}
/**
* Has a proof been stored with this graph reference?
*
* @param graph
* @returns
*/
has(graph) {
return this.index.has(graph.value);
}
/**
* Get the dataset references (in no particular order)
*
* @returns - the datasets
*/
datasets() {
return Array.from(this.index.values()).map((entry) => entry.dataset);
return Array.from(this.index.values()).map((entry) => entry.proofQuads);
}
/**
* Get the proof entries (in no particular order)
* @returns - the proof entries
*/
data() {
return Array.from(this.index.values());
}
/**
* Get the proof entries, following the order imposed by the `previousProof` statements. The first element is the one that has no previous proof defined. If there are no nodes with a previous proof, an empty array is returned.
*
* This is equivalent to the way proof chains are passed on as arguments when embedded chains are created.
*
* @returns - the proofs entries
*/
orderedData() {
const stores = this.data();
// Look for the start of the chain
const start = (() => {
for (const proof of stores) {
if (proof.previousProof === undefined) {
return proof;
}
}
return undefined;
})();
if (start === undefined) {
return [];
}
else {
const output = [start];
let current = start;
nextInChain: for (; true;) {
for (const q of stores) {
if (q.previousProof && q.previousProof.equals(current.proofId)) {
output.push(q);
current = q;
continue nextInChain;
}
}
// If we get here, we have reached a proof that is never referred to as 'previous',
// which should be the end of the chain...
return output;
}
}
}
}
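The `orderedData` method above can be summarized as: find the proof without a `previousProof`, then repeatedly follow the proof whose `previousProof` points at the current one. A self-contained sketch using plain string IDs instead of RDF/JS terms (and assuming, as the real code does, that the chain is acyclic):

```javascript
// Sketch of the chain-ordering walk in `orderedData` above.
function orderProofs(proofs) {
  const start = proofs.find((p) => p.previousProof === undefined);
  if (start === undefined) return []; // no chain start found
  const output = [start];
  let current = start;
  let found = true;
  while (found) {
    found = false;
    for (const p of proofs) {
      if (p.previousProof === current.proofId) {
        output.push(p);
        current = p;
        found = true;
        break;
      }
    }
  }
  // The loop ends at a proof never referred to as "previous": the chain's end
  return output;
}
```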

@@ -160,2 +235,3 @@ exports.DatasetMap = DatasetMap;

}
exports.copyToStore = copyToStore;
/**

@@ -172,1 +248,62 @@ * Convert the dataset into an n3.Store, unless it is already a store.

exports.convertToStore = convertToStore;
/**
* "Refactor" BNodes in a dataset: bnodes are replaced by new ones to avoid a clash with the base dataset.
* Extremely inefficient, but is used for very small graphs only (proof graphs), so efficiency is not really an issue.
*
* The trick is to use the bnode generator of the base dataset, which should make them unique...
*
* @param base
* @param toTransform
*/
function refactorBnodes(base, toTransform) {
const bNodeMapping = new Map();
const newTerm = (term) => {
if (term.termType === "BlankNode") {
if (bNodeMapping.has(term.value)) {
return bNodeMapping.get(term.value);
}
else {
const bnode = base.createBlankNode();
bNodeMapping.set(term.value, bnode);
return bnode;
}
}
else {
return term;
}
};
const retval = new n3.Store();
for (const q of toTransform) {
const subject = newTerm(q.subject);
const predicate = q.predicate;
const object = newTerm(q.object);
retval.add(quad(subject, predicate, object));
}
return retval;
}
exports.refactorBnodes = refactorBnodes;
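The core of `refactorBnodes` is the memoized bnode mapping: every occurrence of the same old bnode must map to the same new bnode, so that the graph structure is preserved. The sketch below illustrates just that mapping with plain objects (the `bnode`/`iri` helpers and the counter-based generator stand in for RDF/JS terms and `base.createBlankNode()`; all names are invented for illustration):

```javascript
// Counter-based stand-in for the base dataset's blank node generator.
let counter = 0;
const newBlankNode = () => ({ termType: 'BlankNode', value: `r${counter++}` });

function refactorBnodesSketch(quads) {
    const mapping = new Map();
    const newTerm = (term) => {
        if (term.termType !== 'BlankNode') return term;
        // Memoize: the same old bnode always maps to the same new bnode.
        if (!mapping.has(term.value)) {
            mapping.set(term.value, newBlankNode());
        }
        return mapping.get(term.value);
    };
    return quads.map((q) => ({
        subject: newTerm(q.subject),
        predicate: q.predicate,
        object: newTerm(q.object),
    }));
}

const bnode = (v) => ({ termType: 'BlankNode', value: v });
const iri = (v) => ({ termType: 'NamedNode', value: v });

const result = refactorBnodesSketch([
    { subject: bnode('a'), predicate: iri('ex:p'), object: bnode('b') },
    { subject: bnode('b'), predicate: iri('ex:p'), object: iri('ex:o') },
]);
// The two occurrences of _:b are mapped to the same fresh bnode:
console.log(result[0].object.value === result[1].subject.value); // true
```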
/**
* When handling proof chains, the dataset must be temporarily extended with a number of quads that
* constitute the "previous" proof. This function calculates those extra quads.
*
* @param allProofs - the array of Proofs in chain order
* @param index - current index into allProofs
* @param anchor - the possible anchor that includes the `proof` reference triple
* @returns
*/
function extraChainQuads(allProofs, index, anchor) {
if (index !== 0) {
// if there is an anchor, then the intermediate store gets an extra triple
const previousProof = allProofs[index - 1];
const output = Array.from(previousProof.proofQuads).map((q) => {
return quad(q.subject, q.predicate, q.object, previousProof.proofGraph);
});
if (anchor)
output.push(quad(anchor, namedNode('https://w3id.org/security#proof'), previousProof.proofGraph));
return output;
}
else {
return [];
}
}
exports.extraChainQuads = extraChainQuads;
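The effect of `extraChainQuads` can be illustrated with plain objects: for any proof except the first, the previous proof's quads are re-issued inside the previous proof's named graph, plus (if an anchor is given) one extra quad linking the anchor to that graph. Strings stand in for RDF/JS terms, and the sample data is invented for illustration:

```javascript
// Simplified quad constructor; strings stand in for RDF/JS terms.
const quad = (s, p, o, g) => ({ subject: s, predicate: p, object: o, graph: g });

function extraChainQuadsSketch(allProofs, index, anchor) {
    if (index === 0) return []; // the first proof in the chain has no predecessor
    const previous = allProofs[index - 1];
    // Re-issue the previous proof's quads inside its own named graph...
    const output = previous.proofQuads.map(
        (q) => quad(q.subject, q.predicate, q.object, previous.proofGraph),
    );
    // ...and, if there is an anchor, link the anchor to that proof graph.
    if (anchor) {
        output.push(quad(anchor, 'https://w3id.org/security#proof', previous.proofGraph));
    }
    return output;
}

const proofs = [
    { proofGraph: 'ex:g1', proofQuads: [{ subject: 'ex:p1', predicate: 'ex:type', object: 'ex:Proof' }] },
    { proofGraph: 'ex:g2', proofQuads: [{ subject: 'ex:p2', predicate: 'ex:type', object: 'ex:Proof' }] },
];
console.log(extraChainQuadsSketch(proofs, 0, 'ex:anchor').length); // 0
console.log(extraChainQuadsSketch(proofs, 1, 'ex:anchor').length); // 2
```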
{
"name": "rdfjs-di",
"version": "0.0.95",
"date": "2024-06-25",
"version": "0.0.96",
"date": "2024-07-24",
"description": "Secure an RDF Dataset through VC's Data Integrity",

@@ -6,0 +6,0 @@ "main": "dist/index.js",

@@ -15,10 +15,12 @@ --- NOT PRODUCTION READY ---

- Although it implements the [EdDSA](https://www.w3.org/TR/vc-di-eddsa/) and [ECDSA](https://www.w3.org/TR/vc-di-ecdsa/) cryptosuites, the Multikey encoding of the latter does not yet conform to the Multikey specification.
The difference is that the Multikey encoding is done on the uncompressed crypto key, as opposed to the compressed one required by the specification.
(I have not yet found a reliable package, that also works with TypeScript, to uncompress a compressed key.)
- The management of proof chains is a bit restricted compared to the specification: proof chains and sets are not mixed. In other words, either all proofs are part of a set or all form a chain; the case when a previous proof reference points at a set of proofs has not been implemented.
- It has not (yet) been cross-checked with other DI implementations and, in general, should be much more thoroughly tested.
There is also a missing feature in the DI specification regarding the usage for Datasets in general. For a Verifiable Credential there is a natural "anchor" Resource used to "connect" the input dataset with its proof.
This is generally not true (see, e.g. [separate discussion](https://github.com/w3c/vc-data-model/issues/1248)) and, in this implementation, it must be provided explicitly to embed the proof into the dataset.
What the implementation proves, however, is that the
_DI specification may indeed be used, with minor adjustment on the "anchor", to provide proofs for an RDF Dataset in the form of separate "Proof Graphs"_, i.e., RDF Graphs containing a signature and its metadata that can be separately verified.

@@ -34,2 +36,4 @@ ## Some details

An extra complication occurs for proof chains: the specification requires that the previous proof in the chain is also "signed over", i.e., the dataset is expanded to include, for the purpose of a signature, the previous proof graph in its entirety.
The package has separate API entries to generate, and validate, such proof graphs. It is also possible, following the DI spec, to provide "embedded" proofs, i.e., a new dataset, containing the original data, as well as the proof graph(s), each as a separate graph within an RDF dataset. If a separate "anchor" resource is provided, then this new dataset will also contain additional RDF triples connecting the anchor to the proof graphs.

@@ -44,5 +48,6 @@

Although not strictly necessary for this package, a separate method is available as part of the API to generate cryptography keys for one of these four algorithms.
The first two algorithms are specified by cryptosuites, identified as `eddsa-rdfc-2022` and `ecdsa-rdfc-2019`, respectively.
The other two are non-standard, and are identified with the temporary cryptosuite names of `rsa-pss-rdfc-ih` and `rsa-ssa-rdfc-ih`, respectively.
Note that there is no Multikey encoding for RSA keys, so the keys are stored in JWK format as a literal with an `rdf:JSON` datatype.

@@ -54,3 +59,3 @@ The user facing APIs use the JWK encoding of the keys only. This makes it easier for the user; Web Crypto provides JWK export "out of the box", and it becomes more complicated for Multikey. This may be changed in future.

- [Separate document for the API](https://iherman.github.io/rdfjs-di/modules/index.html)
- [A small RDF graph](https://github.com/iherman/rdfjs-di/blob/main/examples/small.ttl) and its ["verifiable" version with embedded proof graphs](https://github.com/iherman/rdfjs-di/blob/main/examples/small_with_proofs.ttl).

@@ -57,0 +62,0 @@ (Note that the API works on an RDF Data model level, and does not include a Turtle/TriG parser or serializer; that should be done separately.)
