@meteorjs/reify

Enable ECMAScript 2015 modules in Node today. No caveats. Full stop.

re·i·fy verb, transitive

re·i·fied past   re·i·fies present   re·i·fy·ing participle   re·i·fi·ca·tion noun   re·i·fi·er noun

  1. to make (something abstract) more concrete or real
    "these instincts are, in humans, reified as verbal constructs"
  2. to regard or treat (an idea, concept, etc.) as if having material existence
  3. to enable ECMAScript 2015 modules in any version of Node.js

Usage

  1. Run npm install --save @meteorjs/reify in your package or app directory. The --save is important because reification only applies to modules in packages that explicitly depend on the @meteorjs/reify package.
  2. Call require("@meteorjs/reify") before importing modules that contain import and export declarations.
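
For example, a minimal CommonJS entry point might look like this (index.js and main.js are hypothetical file names used only for illustration):

// index.js (CommonJS entry point)
// Enable reify before loading any module that contains import/export syntax.
require("@meteorjs/reify");

// main.js is now free to use import and export declarations.
require("./main.js");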

You can also easily reify the Node REPL:

% node
> require("@meteorjs/reify")
{}
> import { strictEqual } from "assert"
> strictEqual(2 + 2, 5)
AssertionError: 4 === 5
    at repl:1:1
    at REPLServer.defaultEval (repl.js:272:27)
  ...

How it works

Code generated by the reify compiler relies on a simple runtime API that can be explained through a series of examples. While you do not have to write this API by hand, it is designed to be easily human readable and writable, in part because that makes it easier to explain.

I will explain the Module.prototype.link method first, then the Module.prototype.export method. Note that this Module is the constructor of the CommonJS module object, and that the link and export methods are custom additions to Module.prototype.

module.link(id, setters)

Here we go:

import a, { b, c as d } from "./module";

becomes

// Local symbols are declared as ordinary variables.
let a, b, d;
module.link("./module", {
  // The keys of this object literal are the names of exported symbols.
  // The values are setter functions that take new values and update the
  // local variables.
  default(value) { a = value; },
  b(value) { b = value; },
  c(value) { d = value; },
});

All setter functions are called synchronously before module.link returns, with whatever values are immediately available. However, when there are import cycles, some setter functions may be called again, when the exported values change. Calling these setter functions one or more times is the key to implementing live bindings, as required by the ECMAScript 2015 specification.
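
As a rough illustration of what a live binding means in practice (a hypothetical pair of modules, not taken from this package), an importer observes later changes to an exported variable because its setter runs again:

// counter.js
export let count = 0;
export function increment() { count += 1; }

// main.js
import { count, increment } from "./counter";
console.log(count); // 0
increment();        // reassignment causes the setter for `count` to run again
console.log(count); // 1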

Importing a namespace object is no different from importing a named export. The name is simply "*" instead of a legal identifier:

import * as utils from "./utils";

becomes

let utils;
module.link("./utils", {
  "*"(ns) { utils = ns; }
});

Note that the ns object exposed here is not the same object as require("./utils"); it is a normalized view of that object. This approach ensures that the actual exports object is never exposed to the caller of module.link.
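
A minimal sketch of that distinction (the comparison below is added purely for illustration):

let utils;
module.link("./utils", {
  "*"(ns) { utils = ns; }
});

// The namespace object is a separate, normalized view of the exports:
console.log(utils === require("./utils")); // false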

Notice that this compilation strategy works equally well no matter where the import declaration appears:

if (condition) {
  import { a as b } from "./c";
  console.log(b);
}

becomes

if (condition) {
  let b;
  module.link("./c", {
    a(value) { b = value; }
  });
  console.log(b);
}

See WHY_NEST_IMPORTS.md for a much more detailed discussion of why nested import declarations are worthwhile.

module.export(getters)

What about export declarations? One option would be to transform them into CommonJS code that updates the exports object, since interoperability with Node and CommonJS is certainly a goal of this approach.

However, if Module.prototype.link takes an id string and a map of setter functions, then it seems natural for Module.prototype.export to be a method that registers getter functions. Given these getter functions, whenever module.link(id, ...) is called by a parent module, the getters for the id module will run, updating its module.exports object, so that the module.link method has access to the latest exported values.

The module.export method is called with a single object literal whose keys are exported symbol names and whose values are getter functions for those exported symbols. So, for example,

export const a = "a", b = "b", ...;

becomes

module.export({
  a: () => a,
  b: () => b,
  ...
});
const a = "a", b = "b", ...;

This code registers getter functions for the variables a, b, ..., so that module.link can retrieve the latest values of those variables at any time. It's important that we register getter functions rather than storing computed values, so that other modules can always import the newest values.
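
To make the contrast concrete (a hypothetical module, shown only for illustration): copying the value into exports at export time would freeze it, whereas a getter always reflects the current binding:

let counter = 0;

// Snapshot approach: importers would only ever see 0.
// exports.counter = counter;

// Getter approach: importers see whatever `counter` holds when their setters run.
module.export({ counter: () => counter });

counter = 42; // setters that run after this point observe 42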

Export remapping works, too:

let c = 123;
export { c as see }

becomes

module.export({ see: () => c });
let c = 123;

Note that the module.export call is "hoisted" to the top of the block where it appears. This is safe because the getter functions work equally well anywhere in the scope where the exported variable is declared, and a good idea because the hoisting ensures the getters are registered as early as possible.

What about export default <expression> declarations? It would be a mistake to defer evaluation of the default expression until later, so wrapping it in a hoisted getter function is not exactly what we want.

Instead,

export default computeDefault();

gets replaced where it is (without any hoisting) by

module.exportDefault(computeDefault());

The module.exportDefault method is just a convenient wrapper around module.export:

module.exportDefault = function (value) {
  return this.export({
    default: function () {
      return value;
    }
  }, true);
};

That true argument we're passing to module.export is a hint that the value returned by this getter function will never change, which enables some optimizations behind the scenes.

module.runSetters()

Now, suppose you change the value of an exported local variable after the module has finished loading. Then you need to let the module system know about the update, and that's where module.runSetters comes in. The module system calls this method on your behalf whenever a module finishes loading, but you can also call it manually, or simply let reify generate code that calls module.runSetters for you whenever you assign to an exported local variable.

Calling module.runSetters() with no arguments causes any setters that depend on the current module to be rerun, but only if the value a setter would receive is different from the last value passed to the setter.

If you pass an argument to module.runSetters, the value of that argument will be returned as-is, so that you can easily wrap assignment expressions with calls to module.runSetters:

export let value = 0;
export function increment(by) {
  return value += by;
};

should become

module.export({
  value: () => value,
  increment: () => increment,
});
let value = 0;
function increment(by) {
  return module.runSetters(value += by);
};

Note that module.runSetters(argument) does not actually use argument. However, by having module.runSetters(argument) return argument unmodified, we can run setters immediately after the assignment without interfering with evaluation of the larger expression.

Because module.runSetters runs any setters that have new values, it's also useful for potentially risky expressions that are difficult to analyze statically:

export let value = 0;

function runCommand(command) {
  // This picks up any new values of any exported local variables that may
  // have been modified by eval.
  return module.runSetters(eval(command));
}

runCommand("value = 1234");

exports that are really imports

What about export ... from "./module" declarations? The key insight here is that export declarations with a from "..." clause are really just import declarations that update the exports object instead of updating local variables:

export { a, b as c } from "./module";

becomes

module.link("./module", {
  a(value) { exports.a = value; },
  b(value) { exports.c = value; },
});

Since this pattern is so common, and no local variables need to be modified by these setter functions, the runtime API supports an alternative shorthand for re-exporting values:

module.link("./module", { a: "a", b: "c" });

This strategy cleanly generalizes to export * from "..." declarations:

export * from "./module";

becomes

module.link("./module", {
  "*"(ns) {
    Object.assign(exports, ns);
  }
});

Though the basic principle is the same, in reality the Reify compiler generates shorthand notation for this pattern as well:

module.link("./module", { "*": "*" });

This version is shorter, does not rely on Object.assign (or a polyfill), can be a little smarter about copying special properties such as getters, and reliably modifies module.exports instead of the exports variable (whatever it may be). Win!

Exporting named namespaces (proposal):

export * as ns from "./module";

becomes

module.link("./module", {
  "*"(ns) { exports.ns = ns; }
});

Shorthand:

module.link("./module", { "*": "ns" });

Re-exporting default exports (proposal):

export a, { b, c as d } from "./module";

becomes

module.link("./module", {
  default(value) { exports.a = value; },
  b(value) { exports.b = value; },
  c(value) { exports.d = value; }
});

Shorthand:

module.link("./module", {
  default: "a",
  b: "b",
  c: "d"
});

While these examples have not covered every possible syntax for import and export declarations, I hope they provide the intuition necessary to imagine how any declaration could be compiled.

When I have some time, I hope to implement a live-compiling text editor to enable experimentation.

Top Level Await

To enable top level await, set the topLevelAwait option to true when compiling files with reify (it is currently disabled by default). This wraps each module in a module.wrapAsync function that handles running the module and its dependencies in a spec-compliant way.
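
For example, a source module along these lines (reconstructed here for illustration from the compiled output that follows, not taken verbatim from the docs):

import * as utils from "./utils";

const language = utils.currentLanguage();
const message = await import(`./message/${language}.js`);

is compiled to roughly: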

!module.wrapAsync(async function (module, __reifyWaitForDeps__, __reifyAsyncResult__) {
  "use strict";
  try {
    let utils;
    module.link("./utils", {
      "*"(ns) { utils = ns; }
    });

    if (__reifyWaitForDeps__()) (await __reifyWaitForDeps__())();

    const language = utils.currentLanguage();
    const message = await import(`./message/${language}.js`);

    __reifyAsyncResult__();
  } catch (_reifyError) {
    __reifyAsyncResult__(_reifyError);
  }
}, { self: this, async: true });

This is more complicated than other parts of the runtime, but can be broken down into 3 parts:

  1. At the top of the function passed to wrapAsync, it links any dependencies
  2. If any of the dependencies are async, it waits for them to be fully evaluated
  3. Afterwards, it runs the module code

If you require an async module, require will return a promise that resolves to the module's exports, instead of returning the exports directly. This only works when require is called from a module that was itself compiled with reify.
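
A rough sketch of what that looks like from the consuming side (./asyncModule.js is a hypothetical module that uses top level await and exports a message value):

// In a module compiled with reify:
const exportsPromise = require("./asyncModule.js");

exportsPromise.then((moduleExports) => {
  console.log(moduleExports.message);
});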
